Disclaimer: I’m the co-founder of Pontus.so
As AI is integrated into more and more software applications, it's essential to remember that AI is, at its core, still software. The same fundamental software principles apply, and one worth emphasizing is the Separation of Concerns (SoC). The primary goal of separation of concerns is to enhance the modularity, readability, maintainability, and reusability of software systems.
As defined by Wikipedia:
In computer science, the separation of concerns is a design principle for dividing a computer program into distinct sections, each addressing a specific concern.
Whether to invoke OpenAI's services directly from your business logic depends on your context. For personal projects and small startups, agility is your most trusted companion: creativity knows no bounds, and innovation often thrives on the enthusiasm of a small, dedicated team. If you are working on a personal side project or running a small startup, do whatever it takes to ship and deliver value to your users.
However, for more established companies with teams of over 20 developers, direct integration with AI services can lead to long-term technical debt and reduced future agility. By some estimates, technical debt can account for up to 40% of the overall value of your technology. This accumulation of technical debt increases coupling, making it hard to switch large language model (LLM) providers and to implement observability, caching, and other essential components.
These costs continue to mount as you postpone the creation of an abstraction layer within your enterprise, ultimately resulting in wasted time. Thus, it is advisable to avoid direct calls to OpenAI without utilizing an interface.
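One minimal sketch of such an interface in Python (the names `LLMClient`, `OpenAIClient`, and `summarize` are illustrative, not from any particular library):

```python
from abc import ABC, abstractmethod


class LLMClient(ABC):
    """Abstract interface that business logic depends on instead of a vendor SDK."""

    @abstractmethod
    def complete(self, prompt: str) -> str:
        ...


class OpenAIClient(LLMClient):
    """Adapter around the OpenAI SDK; the only place vendor-specific code lives."""

    def complete(self, prompt: str) -> str:
        # In a real application this would call the OpenAI API via its SDK.
        # Stubbed out here to keep the sketch self-contained.
        raise NotImplementedError("wire up the OpenAI SDK here")


class EchoClient(LLMClient):
    """A trivial stand-in provider, useful for tests and local development."""

    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"


def summarize(client: LLMClient, text: str) -> str:
    # Business logic only sees the interface, so providers can be swapped freely.
    return client.complete(f"Summarize: {text}")


print(summarize(EchoClient(), "hello"))  # -> echo: Summarize: hello
```

Because `summarize` depends only on `LLMClient`, replacing OpenAI with another provider, or with a mock in tests, means writing one new adapter class rather than touching business logic.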
An AI Orchestration Layer (AOL) is the key to efficiently managing the components required to transition from a prompt to a response. It serves as the architectural backbone of AI applications: a central hub that coordinates and manages the flow of information and actions within the AI system, providing reusable components that can be easily replaced as needed.
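The "central hub" idea can be sketched as a small pipeline: a prompt passes through a chain of middleware steps before reaching a swappable provider. All names here (`Orchestrator`, `Middleware`, `Provider`) are hypothetical, not from any specific framework:

```python
from dataclasses import dataclass, field
from typing import Callable, List

# Illustrative type aliases, not from any real library.
Middleware = Callable[[str], str]   # transforms a prompt before the model sees it
Provider = Callable[[str], str]     # turns a prompt into a completion


@dataclass
class Orchestrator:
    """Central hub: routes a prompt through middleware, then to the provider."""
    provider: Provider
    middleware: List[Middleware] = field(default_factory=list)

    def run(self, prompt: str) -> str:
        for step in self.middleware:
            prompt = step(prompt)    # e.g. templating, redaction, logging
        return self.provider(prompt)  # swappable: OpenAI, another vendor, local


# Example wiring with stand-in components.
def redact(prompt: str) -> str:
    return prompt.replace("secret", "[redacted]")

def fake_llm(prompt: str) -> str:
    return f"model saw: {prompt}"

aol = Orchestrator(provider=fake_llm, middleware=[redact])
print(aol.run("my secret plan"))  # -> model saw: my [redacted] plan
```

Because each concern lives in its own component, adding observability or swapping the provider changes one element of the pipeline rather than the application code around it.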
A survey of the AI landscape reveals several reusable components, including provider abstraction, observability, and caching.
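Caching is a good example of how such a component stays decoupled: it can wrap any provider without knowing which one. A minimal in-memory sketch (the `with_cache` helper is invented for illustration):

```python
import hashlib
from typing import Callable, Dict

Provider = Callable[[str], str]  # any function mapping a prompt to a response


def with_cache(provider: Provider) -> Provider:
    """Wrap any provider with a simple in-memory response cache."""
    cache: Dict[str, str] = {}

    def cached(prompt: str) -> str:
        key = hashlib.sha256(prompt.encode()).hexdigest()
        if key not in cache:
            cache[key] = provider(prompt)  # only hit the (paid) API on a miss
        return cache[key]

    return cached


calls = []

def counting_provider(prompt: str) -> str:
    calls.append(prompt)  # track how often the underlying provider is hit
    return prompt.upper()


llm = with_cache(counting_provider)
llm("hello")
llm("hello")
print(len(calls))  # -> 1: the second call was served from the cache
```

Because the cache is just a wrapper around the provider interface, swapping LLM vendors leaves the caching layer untouched; a production version would use a shared store such as Redis rather than a local dict.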
If you prefer a library-based approach and work with Python or JavaScript, consider using LangChain, which can assist you in implementing the necessary AI orchestration layer. However, if you require broader language support and a more opinionated approach to AI integration, you may need to develop your own solution or explore Pontus.so.