Some things to keep in mind before jumping to low-code
Low-code is presented as the be-all and end-all of the development world, bringing along clickbait titles such as “The era of coding is ending”. This always makes me chuckle a bit, because who do you think is coding those low-code platforms then?
Don’t get me wrong, low-code platforms are very promising and open up many new opportunities, but they also come with their own share of drawbacks.
When you look at the strengths and weaknesses of low-code, or even no-code platforms, it actually becomes clear that they are not a fit for AI use cases on their own.
In this article, I will run you through my reasoning for saying this. However, I don’t want to rain on your parade too much, so I will also offer some suggestions on how you can leverage the power of low-code platforms whilst still picking the right tools for your AI use case. But first, let’s dive into the history of low-code a bit and what allowed these platforms to rise to their current popularity.
One of the main reasons I am confident that low-code platforms are not a perfect solution that will render developers and data scientists unnecessary is the fact that the underlying concept is actually not that new.
The term low-code might have been first coined in 2014 by a Forrester analyst, but its roots go back to data flow programming in the 60s. In 1966 Bert Sutherland published his PhD thesis entitled “The On-line Graphical Specification of Computer Procedures”, in which he presented one of the first graphical data flow programming frameworks. Data flow programming focussed on representing code as a directed graph of data flowing between “operations”. At the time it was still quite complex, but the concept is actually fairly similar to the workings of present-day low-code platforms!
When we jump a couple of decades ahead to the 90s, we end up at the so-called fourth-generation programming languages (4GL). These are still programming languages, but ones aimed at non-technical users. Database languages like SQL were developed with the idea in mind that users could just type normal English sentences describing the data they needed, as opposed to having to instruct the computer exactly how to fetch that data. These 4GL languages lacked a proper visual interface, but they did take a huge leap forward in making programming more understandable for someone without an IT background.
The 90s also gave rise to another primitive form of low-code platforms: end-user development tools. End-user development focussed on providing the end-users of an application with the means to make that application. This might sound complex but Microsoft Excel is actually a great example of this. Excel offers non-programmers the opportunity to write “programs” that represent very complex data models.
With an increasing need for companies to deliver more apps at a faster rate, low-code platforms successfully entered the market in the early 2010s, building upon the foundations of the previously mentioned frameworks. With quality developers being scarce it was the perfect time and it’s no surprise that low-code platforms managed to rise quickly and create a lot of hype.
Forbes even called low-code platforms “extraordinarily disruptive”. In the last couple of years, they have certainly managed to make a lot of waves in the digital landscape and have made app development much more accessible to the general public. However, being a great tool for app development does not mean it will be a great tool for AI use cases.
So here we are: we’ve got these amazing low-code platforms like Mendix or OutSystems that help companies deliver new apps quickly, without having to hire a bunch of scarce developers. Yet low-code is not a fit for enhancing these apps with AI. Why? There are a multitude of reasons, but the following are what I consider the primary ones: latency, lack of flexibility, and the iterative nature of data science.
Let’s explore these a little more.
By abstracting all the code away into drag-and-drop blocks, a lot of overhead is introduced, and with it, increased latency.
Low-code platforms are made for ease of use and rapid development; they are simply not built for efficiency and speed. For simple applications, that’s perfectly fine, but for AI this can really be a blocker. Think about a recommendation model, for instance, that uses AI to recommend items or topics to app users. A model like that needs to be able to make a recommendation in the milliseconds range to be effective; otherwise you will lose customers. Amazon even noted that every 100 ms of latency cost them 1% in sales!
On top of that, AI models tend to be on the heavier side, so you often cannot afford much extra latency from the platform you’re running the model on.
Of course, not every AI model needs low latency; some can just run in the background for hours. But are those the ones you would even want to put behind a low-code application? Perhaps, but I think the majority of AI use cases behind mobile or web applications will require low latency to yield good results, especially with users expecting incredibly fast response times nowadays.
When you need something that is not already in a low-code template, chances are you’ll run into the lack of flexibility these platforms inherently have. AI use cases are generally quite specific and highly dependent on what type of data is available and how it is stored and processed, which makes AI difficult to capture in predefined templates. You will probably end up needing custom code to make the model work, and fitting that custom code into the rest of the template might end up costing more time than it would have taken to write the full application in code directly. And once you get to the custom code writing part, you will run into the issue that you need to work in the language the low-code platform uses. Often these platform languages are not your typical programming languages for AI.
Low-code is not really a straightforward fit for the iterative nature of data science. Models constantly need to be retrained or tweaked so they can keep up with the new data coming in over time. However, low-code apps force you into templates that can be hard to tweak; even outside of AI use cases, low-code apps are notoriously hard to update to match changing technical requirements. On top of that, if your models are maintained within your low-code environment, does every model update mean the entire application needs to be updated as well? This introduces a lot of unnecessary overhead and can actually lead to outdated models in production, because the threshold to update them properly is too high. If you’re interested in this, Ben Hosking wrote a great article about the maintenance nightmare of low-code apps.
Even though low-code platforms might not be the best fit for AI, that is not to say that AI and low-code cannot work together. Low-code platforms are amazing for developing scalable apps quickly, and it’s only a matter of time before the companies who decided to develop those apps will want to enhance them with AI. Perhaps you will not find the right tools to do that within the low-code platforms themselves, but there are tools out there specialized in hosting and serving models that could work great in combination with a low-code platform. The important thing is to find the best tool for each job: a good low-code platform for general app development, and a good tool for hosting and serving the AI models that need to run behind the apps.
The most important thing when putting AI behind a low-code app is that your AI code needs to be callable from your app. To achieve that, it’s good to have your AI code reside behind an API. That way you can simply send a request from your low-code app to the API, which will, in turn, run the AI code.
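To make this concrete, here is a minimal sketch of what that server side could look like, using only Python’s standard library. Everything here is illustrative: the predict function is a dummy stand-in for a real trained model, and the endpoint, port, and payload fields are assumptions, not a prescription.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(features):
    """Dummy stand-in for a real model: a simple weighted sum of two features."""
    weights = [0.4, 0.6]
    return sum(w * f for w, f in zip(weights, features))

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON body the (low-code) app sends in its request
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        # Run the AI code and send the result back as JSON
        body = json.dumps({"prediction": predict(payload["features"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

def run_server(port=8080):
    # The low-code app only ever sees this URL; the model stays behind it
    HTTPServer(("0.0.0.0", port), PredictHandler).serve_forever()
```

The low-code app then only needs to POST something like `{"features": [1.0, 2.0]}` to the endpoint; the model itself never has to live inside the low-code environment.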
Since APIs are so common most low-code platforms have ready-made elements you can use to make the API call, which makes it easy to set up the connection. This only leaves the problem of actually getting your code behind an API and hosting it somewhere.
You could do this yourself from scratch, but you can also use tools that put your code behind an API automatically, like AWS Lambda, Azure Functions, or UbiOps. If you decide to go for a tool, there are different options, and which one fits best depends on your use case. Cloud functions are very fast, but they are limited in what type of code they can run and they tend to require a bit of general IT knowledge.
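As a sketch of the cloud-function route, here is a handler written in the style of an AWS Lambda function sitting behind an HTTP endpoint. The handler signature follows Lambda’s convention, but the payload fields and the averaging “model” are purely illustrative assumptions; a real deployment would load an actual trained model artifact.

```python
import json

def lambda_handler(event, context):
    """Handler in the style of an AWS Lambda function behind an HTTP endpoint.

    The request body arrives as a JSON string in event["body"]; the
    feature names and the placeholder model below are illustrative only.
    """
    payload = json.loads(event["body"])
    features = payload["features"]
    # Placeholder "model": in practice you would load and call a trained model
    score = sum(features) / len(features)
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"prediction": score}),
    }
```

The appeal of this setup is that the low-code app still just makes one HTTP call, while the function scales and bills per invocation; the flip side, as noted above, is that you are constrained in runtime, package size, and execution time.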
UbiOps is a tool that is specifically designed for serving data science code and also sets up an API for you automatically. It is targeted more at actual data scientists. If you want to have a look at how it would actually work to use UbiOps for enhancing your app with AI you can have a look at this article.
All in all, low-code platforms are indeed great tools, just not that great (on their own) for AI purposes. AI happens to be something that is better taken care of through proper code, albeit with the help of some handy tools that help you get it live. Just remember to look for the tool that best fits your needs, and find a way to tie it all together!