
Moving AI to production requires all of those underlying layers up the stack, which must then be overlaid with the right data and security to form much of the basis underneath.
For Howe, AI is not a transactional tool, and it shouldn’t be bolted on to existing processes. Earlier iterations of the technology required deep science and research knowledge.
“For a lot of organisations there’s a lot more product-driven transformation taking place off the back of AI,” Howe said.
This means that if a business wants to use AI to get closer to customers and pursue simplification and revenue growth, it requires whole-of-business transformation, not simply transacting the way it did with software or software-as-a-service.
“If they want to reimagine process and workflows and listening to customers, they probably must deal more on product to make that happen, rather than bolting that on.
“But then the same sort of operational things come along as well on the other side of that, and that’s what makes it difficult.”
Howe said businesses still need to “prevent things, protect things and correct AI” through governance.
Where misalignment happens with AI, the challenge is stopping an agent from ingesting bad data or parameters.
“If you’re detecting hallucinations or drift off a model, and correcting or retraining the AI, then it has to be re-anchored,” Howe explained. “You have to roll it back, and if it’s an agent, it needs a really clear set of tasks that aren’t ambiguous.
“In governance, you’ve always got those areas of boundaries where you’ve got to do that catch.”
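The “prevent, protect and correct” loop Howe describes can be pictured as a monitoring check against a baseline, with a rollback to re-anchor a drifting model. The sketch below is purely illustrative: the registry, metric names and tolerance threshold are hypothetical, not part of any specific product.

```python
# Illustrative sketch (all names hypothetical) of a governance guardrail:
# detect drift against a baseline metric, then roll back to re-anchor.

from dataclasses import dataclass, field


@dataclass
class ModelRegistry:
    """Keeps prior model versions so a drifting model can be re-anchored."""
    versions: list = field(default_factory=list)

    def publish(self, version: str) -> None:
        self.versions.append(version)

    def rollback(self) -> str:
        # Drop the current (drifting) version and re-anchor on the previous one.
        self.versions.pop()
        return self.versions[-1]


def drift_detected(baseline_accuracy: float, live_accuracy: float,
                   tolerance: float = 0.05) -> bool:
    """Flag drift when live quality falls more than `tolerance` below baseline."""
    return (baseline_accuracy - live_accuracy) > tolerance


registry = ModelRegistry()
registry.publish("model-v1")
registry.publish("model-v2")

# Live monitoring shows v2 has slipped well below the v1 baseline...
if drift_detected(baseline_accuracy=0.92, live_accuracy=0.80):
    active = registry.rollback()   # ...so "correct" by rolling back
else:
    active = registry.versions[-1]

print(active)  # -> model-v1
```

In practice the drift check would compare live evaluation metrics, and the rollback would redeploy a previously validated model, but the shape of the catch is the same: a boundary, a detection, and a correction.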
Watching out for the pitfalls
According to Howe, there are five common pitfalls and failures that V2 AI has observed when the foundations aren’t in place.
“They will keep happening if you don’t employ a strategy of setting those metrics,” he noted. “For example, setting that measurable business outcome and working backwards, and having real use cases, tested within the business, that you want to run with and that can help incrementally build your capability along the way.
“Otherwise, you build an AI platform, and nobody shows up to use it because they don’t know how.
“It’s the same thing that has happened in software and cloud previously.”
Howe said one of the best ways to make sure the AI platform isn’t under-utilised is to upskill teams. Otherwise, one of them will fall short somewhere, and the business case for the AI along with it.
“You can’t have the team lacking the capability, and you can’t go without a solid platform or without your governance, risk and compliance connected,” he explained.
AI value framework
For consulting firm V2 AI, having an AI value framework ties all of this together, because it encompasses procurement, commercials, and trust and safety.
This type of framework can start to provide more coordination around some of the nuances of working with the likes of OpenAI and Anthropic. Those large language model (LLM) providers aren’t necessarily familiar with traditional enterprise systems and processes.
“For example, OpenAI is a research-led type organisation,” he noted. “When a business tries to mould ChatGPT into a service contract, service agreements and commercialisation, it doesn’t always fit in the way that a business wants.”
This is important because one of AI’s greatest values lies in its ability to scale and hyper-personalise. However, its content output is generic and lacks the genuineness of human insight, which means human content remains primary to trust.
“If there’s something that’s written by an agent or AI versus a human, the human one is going to be far more valued,” Howe said. “Especially as there is information and attention overload.
“I think anything that’s personalised to you, especially if it’s human-to-human, is going to be deemed so much more valuable.”
That’s why getting that responsible AI, ethics framework and governance in place is paramount.
“You hear me talk a lot about governance, risk, and compliance, but putting that value framework in place really pulls out the essence of where this really sits,” Howe said.