The large language model (LLM) wars have begun. As we quickly approach the peak of inflated expectations for ChatGPT and other generative AI technologies, it is a good time to explore the growing chorus of vendors releasing new and AI-washed messaging into the market.
LLMs are dominating the conversation about generative AI and the disruption it will cause across all sectors of the economy. What we lack is clarity on how quickly this capability will move through the trough of disillusionment and become ready for mainstream adoption. As part of this process, we are seeing vendors, including AWS and Databricks, tout the rapid pace of innovation they bring as they release implementations targeted at traditional enterprises.
As we move through the trough of disillusionment, the winning vendors will be those that think holistically about how generative AI capabilities will be deployed in complex enterprise environments and integrated with established business systems and mature business processes in a way that protects existing revenue streams while the organization transforms to adopt these new capabilities and accelerate innovation. To enable this adoption, enterprise vendors will need to ensure their implementations address several key considerations:
- Enterprise-Class Management Platforms – LLMs are complex. When deployed in an enterprise environment, they must integrate cleanly with data governance, vendor management, and infosec standards for logging, user approvals, data access, and vulnerability management. The vendors that move fastest to build enterprise management capabilities designed for complex and regulated environments will have a distinct advantage.
- Best-in-Class Privacy Protections – LLMs have the potential to make vast amounts of data accessible in new ways, but they also risk opening new doors within an organization and giving individuals access to data they are not authorized to see. LLMs must work seamlessly with enterprise governance systems so that existing data protections carry through to these new tools (a minimal illustrative sketch follows this list).
- Business Systems Integration – LLMs will not stand on their own, nor can they be allowed to add complexity to business workflows. Traditional business applications from vendors including SAP, Oracle, Salesforce, and others will continue to play large roles in enterprise environments. The LLM vendors that create seamless integrations with these systems, giving users additional context and detail without breaking workflows, will win out by increasing productivity and accuracy without introducing unnecessary complexity.
- High Industry-Specific Accuracy – As with any new technology, users lose trust when data is inaccurate, out of date, or incomplete. The LLM vendors that win will be those that deliver high levels of accuracy within the domain of the deployed LLM.
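To make the governance and privacy points above concrete, here is a minimal, purely illustrative sketch of how an organization might front an LLM with a gateway that checks data entitlements and writes an audit trail before any prompt reaches the model. The function names (`check_entitlement`, `call_llm`), the in-memory policy table, and the gateway itself are assumptions for illustration only, not any vendor's actual API.

```python
# Hypothetical sketch: an enterprise "LLM gateway" that enforces data-access
# governance and audit logging before any prompt reaches the model.
# All names here are illustrative placeholders, not a real vendor API.
import logging
from dataclasses import dataclass
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("llm_gateway.audit")


@dataclass
class Request:
    user_id: str
    dataset: str   # the governed data source the prompt will draw on
    prompt: str


def check_entitlement(user_id: str, dataset: str) -> bool:
    """Placeholder for a lookup against the enterprise governance system
    (e.g., an entitlement service or data catalog)."""
    allowed = {("analyst_42", "sales_pipeline")}  # stand-in policy table
    return (user_id, dataset) in allowed


def call_llm(prompt: str) -> str:
    """Placeholder for the actual model call (vendor API, private endpoint, etc.)."""
    return f"[model response to: {prompt[:40]}...]"


def handle(request: Request) -> str:
    now = datetime.now(timezone.utc).isoformat()

    # 1. Enforce the same data-access rules that apply elsewhere in the business.
    if not check_entitlement(request.user_id, request.dataset):
        audit_log.warning("DENIED %s -> %s at %s", request.user_id, request.dataset, now)
        raise PermissionError("User is not entitled to this dataset")

    # 2. Record the approved access for infosec and compliance review.
    audit_log.info("ALLOWED %s -> %s at %s", request.user_id, request.dataset, now)

    # 3. Only then forward the prompt to the model.
    return call_llm(request.prompt)


print(handle(Request("analyst_42", "sales_pipeline", "Summarize Q3 pipeline risk")))
```

The point of the sketch is simply that access decisions and audit records sit in front of the model call, so the controls an enterprise already trusts are not bypassed by the new tool.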
These capabilities will help vendors make the transition through the slope of enlightenment and onto the plateau of productivity. They also provide a guide for you to begin assessing AI capabilities and their readiness to be deployed in your enterprise, and to evaluate vendors so you can verify the true value of their AI capabilities and avoid those that are AI-washing legacy products.
Generative AI is at an inflection point similar to the one cloud computing reached roughly eight years ago. Everyone knows there is immense value and that it will fundamentally alter how we deploy digital capabilities, yet the technology is nowhere near ready to meet the expectations of general users, compliance teams, or regulators. As your company embarks on this journey, explore the technology and how it intersects with your organization's operating model, your team's skills, and your privacy and regulatory requirements. Use that baseline of operating model, governance needs, and team skills to begin assessing vendors on their enterprise-level management capabilities, integration capabilities, and operating model.
Finally, always be ready to pivot quickly. New vendors will come into focus every day, and existing ones will accelerate their roadmaps, creating considerable uncertainty for early adopters.