
Generative AI Will Kill Old Stack Vendors. Let It.

by Theo Priestley, October 6th, 2024

Too Long; Didn't Read

Can generative AI and LLMs dismantle old stack and SaaS vendors?


Halloween. It’s spooky season, so let’s bludgeon a few old stack vendors to death — Appian, IBM, Salesforce, SAP, Pegasystems, IFS, Oracle, Software AG, TIBCO, UiPath…they’re The Walking Dead, and they know it.


Back in 2010–2012 I wrote a fair bit about business process automation over on my old blog, BPM Redux, and something I was fascinated with was the concept of self-assembling or dynamic processes, created on the fly to handle a task using artificial intelligence without the need for drawing a process map or dicking around designing another set of screens.


The problem I have with ‘business rules’ being an inherent part of some BPMS is that they start to constrain process decision-making and flexibility. But imagine a set of AI routines that not only adapt on the fly (or just in time) but also learn, and can potentially create a brand-new process instance that evolves out of previous ones…


Fast forward to today and it might finally become a reality.


I looked at what some of the larger stack and cloud vendors are doing today, and they all look so desperate — throwing Generative AI into their software stacks to appear relevant and forward-thinking. The trouble is, Gen-AI renders them pointless. Every. Single. One.

Let’s take Pegasystems, for example (disclosure: I used to work for them years ago). They have Generative AI as part of their software solutions now, but it’s still rooted in the past, forcing enterprise users not only to continue using Pega but also to continually think about building workflows with it. It doesn’t matter that it’s “low-code” pseudo-development or that you can instantly draw new process steps using the magic of AI; you’re still stuck with using Pega.



The same goes for all the rest — whether it be Salesforce, IFS, SAP; whether it’s ERP, CRM, RPA — what they all potentially understand and won’t tell customers is that you have all the data already, both structured and unstructured, and a Gen-AI layer is all you need on top.


Think about the following example — a customer complaint.


A customer emails in a complaint moaning about one of your products or services and expecting an action in response. Typically, the email will come from a contact form with all the required details to identify the customer or client, the product in question and what they expect done. If not, then it’s freeform unstructured data with a dollop of misery and anger.


Normally, it falls into a defined process that an agent has been trained to perform, with clear steps already mapped out. If not, then it’s treated as an exception and passed to someone higher up the chain to make a decision on. All of that takes time to create and develop into someone’s expensive SaaS solution that looks like a dog’s dinner on the screen. Invariably, a few years later, there will be a multi-million-dollar “digital transformation” program sold by a consultancy deck to eke a few tweaks and savings out of this godforsaken process and justify the cost.


But what if all this didn’t need to be done anymore?


What if training a Large Language Model on your existing processes, rules, customer data, product or service information was all you needed to do?


Going down this train of thought leads to a very different outcome.


The LLM reads and understands the content of the email, pulling out key information like customer details, the product or service mentioned, and the specific issue being raised. In the case of a telephone conversation, we’ve already seen some Gen-AI solutions transcribe the interaction in real time and then offer up advice, a response, or an answer to a question. So there is already no need to parse data into another system to fill out a screen for an agent to look at.
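To make that first step concrete, here’s a rough sketch of what the extraction could look like. Everything in it is illustrative: `llm()` is a stand-in for whichever model API you actually use, and the field names are just examples, not anyone’s shipping schema.

```python
import json

def llm(prompt: str) -> str:
    """Stand-in for whatever LLM API you actually use (hosted or local)."""
    raise NotImplementedError("wire up your model of choice here")

EXTRACTION_PROMPT = """Extract the following fields from the customer email below
and return them as a JSON object: customer_name, customer_email, product,
issue_summary, requested_action.

Email:
{email}
"""

def extract_complaint_fields(email_body: str) -> dict:
    """Pull structured details out of a freeform, angry email."""
    raw = llm(EXTRACTION_PROMPT.format(email=email_body))
    return json.loads(raw)  # assumes the model is instructed to return valid JSON
```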

Based on this information, the LLM figures out the priority of the complaint, sees if the issue is common or recurring, and decides if any specific business rules or past actions apply. In some cases, sentiment analysis may well be run on the language of the complaint to determine just how important and urgent it is, and how annoyed the customer is.
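Triage could simply be a second pass over the same data. Again, a hypothetical sketch: the labels, the rule IDs, and the `llm()` helper are assumptions for illustration, nothing more.

```python
import json

def llm(prompt: str) -> str:
    raise NotImplementedError  # placeholder for a real model call

TRIAGE_PROMPT = """Given the complaint and the customer's history below, return JSON
with: priority ("low"|"medium"|"high"), recurring (true|false),
sentiment ("calm"|"annoyed"|"furious"), applicable_rules (list of rule IDs).

Complaint: {issue}
History: {history}
"""

def triage(issue_summary: str, history: list[str]) -> dict:
    """Let the model score priority, recurrence, and sentiment in one shot."""
    return json.loads(llm(TRIAGE_PROMPT.format(issue=issue_summary, history=history)))
```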


Things can get funky from here.


Based on the type of complaint, the LLM creates a workflow to resolve the issue on the fly. This could involve updating customer data, processing a refund, or sending the complaint to a specialised team, but it’s all done without the front-end system we’ve all grown to loathe.
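Here’s what a “workflow created on the fly” might actually mean in code: the model returns a plan as data, and a thin dispatcher executes it against backend handlers you already have. The action names and handlers below are made up purely to illustrate the shape of it.

```python
import json

def llm(prompt: str) -> str:
    raise NotImplementedError  # placeholder for a real model call

PLAN_PROMPT = """Plan the steps needed to resolve this complaint. Return a JSON list
of steps, each shaped like {{"action": "...", "params": {{...}}}}.
Allowed actions: update_customer_record, process_refund, escalate_to_team.

Complaint: {complaint}
"""

# Hypothetical handlers; in reality these would call your existing backend APIs.
HANDLERS = {
    "update_customer_record": lambda p: print("updating record:", p),
    "process_refund": lambda p: print("processing refund:", p),
    "escalate_to_team": lambda p: print("escalating to:", p.get("team")),
}

def run_dynamic_workflow(complaint: dict) -> None:
    """Generate a plan for this one complaint and execute it; no process map drawn."""
    steps = json.loads(llm(PLAN_PROMPT.format(complaint=json.dumps(complaint))))
    for step in steps:
        HANDLERS[step["action"]](step.get("params", {}))
```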


With JIT (Just In Time) capabilities, the LLM can generate code on the spot to do necessary backend tasks. For example, if a refund is needed, the LLM creates the code to process it and updates the backend customer database with the details.
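The JIT part is the most speculative, so treat the following as a thought experiment rather than a pattern to copy: the model writes a small function, and you execute it, sandboxed, reviewed, or both. The prompt, the `refund` signature, and the `llm()` helper are all invented for the example.

```python
def llm(prompt: str) -> str:
    raise NotImplementedError  # placeholder for a real model call

CODEGEN_PROMPT = """Write a Python function refund(customer_id: str, amount: float) -> None
that records the refund against the customer's record. Return only the code.
"""

def jit_refund_handler():
    """Generate a one-off refund handler on demand.

    In any real system the generated code would run in a sandbox and/or pass
    human review before touching a production database.
    """
    source = llm(CODEGEN_PROMPT)
    namespace: dict = {}
    exec(source, namespace)      # execute the generated code in an isolated namespace
    return namespace["refund"]   # hand back the freshly generated function
```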

Since the LLM has been trained on business rules, it makes sure all actions follow company policies, checking, for example, that the customer is eligible for a refund before processing it. The results of each complaint-handling process are used to train the LLM further: if a complaint is resolved successfully, the LLM learns from the actions taken, improving future responses and workflows.
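The policy check is the part you’d want to keep deterministic rather than leave to the model’s judgement. Something like the guard below, with made-up policy numbers, plus a hook that logs the outcome so it can feed future fine-tuning or few-shot examples:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RefundPolicy:
    # Illustrative numbers only, not real company rules.
    max_days_since_purchase: int = 30
    max_amount: float = 500.0

def eligible_for_refund(days_since_purchase: int, amount: float,
                        policy: Optional[RefundPolicy] = None) -> bool:
    """Deterministic guard the LLM's plan must pass before any refund goes through."""
    policy = policy or RefundPolicy()
    return (days_since_purchase <= policy.max_days_since_purchase
            and amount <= policy.max_amount)

def record_outcome(complaint_id: str, resolved: bool, actions: list[str]) -> None:
    """Log how the complaint was handled so the result can train the model further."""
    print(f"complaint {complaint_id}: resolved={resolved}, actions={actions}")
```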

Over time, the LLM can find recurring issues, improve existing workflows, and even suggest changes to business rules or policies based on patterns in customer complaints.
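Spotting the recurring issues doesn’t even need the model; a simple tally over resolved complaints is enough to surface candidates, which you could then feed back to the LLM (or a human) to propose workflow or policy changes. A minimal sketch, with an assumed data shape:

```python
from collections import Counter

def recurring_issues(issue_summaries: list[str], threshold: int = 5) -> list[str]:
    """Return issue summaries that keep coming back above a chosen threshold."""
    counts = Counter(s.strip().lower() for s in issue_summaries)
    return [issue for issue, n in counts.items() if n >= threshold]

# The frequent offenders could then go into a prompt asking the model to suggest
# changes to the relevant workflow or policy.
```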


The point here is that (a) you no longer need to define what the business process is, any more than you need to define the task; nobody cares about static processes when you can dynamically generate and execute them. (b) The LLM generates code and a minimal screen UI to complete the task and keep the agent involved, but once that’s done, the code and the UI are deleted; they’re artefacts that are no longer required because they were created to handle an interaction personal to that customer. And (c) humans are still in the loop; the efficiency gains come not from reducing headcount but from removing the expensive excuse of paying for ancient software and unnecessary transformation programs based on old methods, always initiated by some incoming C-suite exec with a relationship with a favourite vendor.
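Point (b) is worth making concrete, because “delete the UI afterwards” sounds odd until you see how small the artefact really is. A hypothetical sketch: generate a throwaway approval page for the human in the loop, use it, then remove it.

```python
import os
import tempfile

def llm(prompt: str) -> str:
    raise NotImplementedError  # placeholder for a real model call

def ephemeral_review_page(complaint: dict) -> None:
    """Generate a one-off approval screen, let the agent use it, then delete it."""
    html = llm(f"Render a minimal HTML approval page for this complaint: {complaint}")
    with tempfile.NamedTemporaryFile("w", suffix=".html", delete=False) as f:
        f.write(html)
        path = f.name
    try:
        print(f"agent reviews {path} and approves or rejects the action")
    finally:
        os.remove(path)  # the interaction is done; the UI no longer needs to exist
```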


It’s not quite that simple, and current Gen-AI solutions aren’t geared up for this (yet), but it should be enough to make you start questioning why you need a huge ERP implementation, or why an expensive CRM is required at all, if you can just have a conversation with the customer data directly.

Remember Master Data Management? Enterprises spent millions, and years, making sure all their backend databases were fit for purpose, clean, and deduped; those databases are potentially in good shape and structured enough for a single Gen-AI layer on top and nothing else. Imagine not having to worry about integrating one behemoth frankenstack vendor with another, syncing data sources, transforming the data, orchestrating processes between them…it’s a pretty tantalising prospect, and one that none of the aforementioned vendors really wants you to know about.


Major vendors offering Gen-AI wallpaper on top of the cracked and creaking software they push are slowly dying; the benefits they offer will eventually run dry when stacked up against using artificial intelligence on its own as the alternative.


In a way, Altman was right: it is a new type of operating system. In this case, it’s a business operating system, and one that doesn’t need the bloatware.


I’ve maybe sat on these ideas for too long now; I might go and start building again. I mean, depending on who you read, the global workflow automation market was valued at $16.41 billion in 2021 and is projected to reach $34.4 billion by 2030, growing at a CAGR of 9.71%. Not small numbers.


Who’s with me?