Adopting Inferable
Adopting Inferable into an existing service architecture
Adopting a new service into an existing architecture can be a daunting task, so one of the core principles of Inferable is to be as unobtrusive as possible.
We've designed the Inferable developer tools to be bolt-on: you can add Inferable to your existing services without making any significant changes to your codebase.
Inferable provides three ways to integrate with your existing services. Depending on your architecture, one of these methods may be more suitable than the others.
1. Direct Integration via SDK
The Inferable SDK is a library that allows you to interact with the Inferable API. It provides a simple interface for registering your functions inline in your codebase. The SDK is generally available for TypeScript and Go, with C# / .NET coming soon.
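As an illustration, a direct integration looks roughly like the sketch below. The exact names (`Inferable`, `service`, `register`, the Zod-based `schema` field, and the `getUserFromDatabase` helper) are assumptions made for this example; consult the SDK reference for the current API.

```typescript
import { Inferable } from "inferable";
import { z } from "zod";

// Sketch only: the client, `service`, `register`, and `schema` shapes below are
// assumptions for illustration, not the SDK's definitive API.
const client = new Inferable({
  apiSecret: process.env.INFERABLE_API_SECRET,
});

const service = client.service({ name: "userService" });

// Register an existing function from your codebase so Inferable can call it.
service.register({
  name: "getUserById",
  description: "Fetches a user record by its ID",
  schema: { input: z.object({ id: z.string() }) },
  func: async ({ id }: { id: string }) => {
    return await getUserFromDatabase(id); // delegate to your existing business logic
  },
});

// Start the in-process message consumer that receives work from Inferable.
await service.start();

// Hypothetical stand-in for code that already exists in your service.
async function getUserFromDatabase(id: string) {
  return { id, name: "Ada Lovelace" };
}
```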
Pros
- Simple to use: The SDK is designed to be simple to use and easy to integrate into your existing codebase.
- Compile-time Type Checking: In typed languages, you get the benefit of type-checking Inferable interfaces against your functions.
Cons
- Language Support: Currently, the Inferable SDK is only available for TypeScript and Go. We are working on SDKs for .NET, Ruby, and Python.
- In-service Message Consumer: The SDK requires a message consumer to be running inside your service to receive and process messages from Inferable. This may not suit your use case (e.g. serverless environments) or may require additional resources.
2. Proxy Integration
The Inferable Proxy is a standalone service that sits between your services and the Inferable API. It acts as a bridge, allowing you to interact with Inferable without making any changes to your existing codebase. The Proxy passes messages between your services and Inferable, leveraging the REST and GraphQL APIs you have already implemented.
Code generation for REST and GraphQL APIs is available out of the box. This allows you to generate Inferable functions from your existing OpenAPI specifications or GraphQL schemas.
The Proxy is open-source and available as a Docker container. It is designed to be run as a standalone service alongside your existing services.
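To make the "no code changes" idea concrete, the sketch below shows the kind of existing internal endpoint the Proxy works against. The Express stack, the route, and the `lookupOrderStatus` helper are illustrative assumptions; the only requirement is that the operation is described by an OpenAPI specification or GraphQL schema the Proxy can read.

```typescript
import express from "express";

// Nothing Inferable-specific is added to this service. The Proxy is pointed at the
// service's OpenAPI specification and generates an Inferable function per operation.
const app = express();

// An operation that already exists in your service and is described in its OpenAPI spec.
app.get("/orders/:id/status", async (req, res) => {
  const status = await lookupOrderStatus(req.params.id);
  res.json({ id: req.params.id, status });
});

// Hypothetical helper standing in for your existing business logic.
async function lookupOrderStatus(id: string): Promise<string> {
  return "shipped";
}

app.listen(3000);
```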
Pros
- No Code Changes: The Proxy allows you to interact with Inferable without having to make any changes to your existing codebase.
- Language Agnostic: While the Proxy itself is written in TypeScript, it communicates over HTTP, meaning you can use it with services written in any language.
- Isolated Runtime: The Proxy runs as a separate service, meaning it won’t impact the performance of your existing services.
Cons
- Additional Service: The Proxy is an additional service that you need to run alongside your existing services.
- Requires APIs: The Proxy requires your services to expose REST or GraphQL APIs that it can interact with internally (no public APIs are required).
3. Custom Integration
Custom Integration is only available for Inferable Enterprise customers. Contact us at [email protected] to learn more.
Both the SDK and the Proxy are designed to be flexible and extensible. If you have a specific use case that doesn't fit either of them, we provide direct access to Inferable via a REST API backed by an OpenAPI specification.
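As a rough sketch, a custom integration is plain HTTP against the Inferable REST API. The base URL, path, payload shape, and auth header below are placeholders rather than actual endpoints; the OpenAPI specification is the source of truth.

```typescript
// Sketch only: the base URL, path, payload, and auth scheme are placeholders,
// not the actual Inferable API; consult the OpenAPI specification.
const INFERABLE_API_URL = "https://api.inferable.example";

async function issueCommand(prompt: string): Promise<unknown> {
  const response = await fetch(`${INFERABLE_API_URL}/runs`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.INFERABLE_API_SECRET}`,
    },
    body: JSON.stringify({ prompt }),
  });

  // With a manual integration, error handling and retries are your responsibility.
  if (!response.ok) {
    throw new Error(`Inferable request failed: ${response.status}`);
  }
  return response.json();
}
```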
Pros
- Customization: The REST API allows you to interact with Inferable in any way you see fit.
- No SDK Dependency: If you don’t want to use the SDK, you can interact with Inferable directly via the REST API.
- Language Agnostic: The REST API is language agnostic, meaning you can use it with any service written in any language.
- Controlling the UX: The REST API gives you control over the end-user experience, letting you build your own ways of issuing commands to Inferable beyond Inferable notebooks.
Cons
- Manual Integration: The REST API requires manual integration, meaning you will need to write the code to interact with Inferable yourself and handle any errors that may occur.