A Prompt by Any Other Name: IBM’s Watsonx Gets a Generative AI Enhancement
When I first began using the term “prompt engineering” last year, I thought the eye rolling would knock the planet off its axis. I got a similar reaction a dozen years earlier when I proposed writing a book on “social media” to an east coast publisher. And don’t get me started on the initial feedback on “the cloud.”
Technology nomenclature is a writhing beast, and prompt engineering hit the zeitgeist like a breaching humpback soaking eager whale watchers. This discipline, essentially undifferentiated before the precipitous rise of ChatGPT and other advanced large language models (LLMs) we’re calling “AI,” now commands salaries of between $250k and $375k USD, according to Forbes.
All of which is a slightly self-aggrandizing way of getting to the news that IBM is set to integrate a prompt tuner into a component of its watsonx enterprise AI and data platform.
Big Blue created the aptly named “Tuning Studio” to help users write better prompts for generative AI. It will be included in the watsonx.ai component of the platform. As the name implies, organizations will be able to use it to “tune” their foundation models with labeled data for better performance and accuracy.
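IBM hasn’t shared code-level details of Tuning Studio here, but the underlying idea, tuning a frozen foundation model against labeled examples, looks roughly like the sketch below, which uses the open-source Hugging Face PEFT library rather than anything watsonx-specific. The model name and the sentiment-classification task are placeholders, not features of IBM’s product.

```python
# A minimal prompt-tuning sketch with Hugging Face PEFT (not IBM's Tuning Studio API):
# a small set of trainable "virtual token" embeddings is learned from labeled data
# while the base foundation model stays frozen.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PromptTuningConfig, PromptTuningInit, TaskType, get_peft_model

base = "bigscience/bloomz-560m"  # stand-in for any causal LM
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

peft_config = PromptTuningConfig(
    task_type=TaskType.CAUSAL_LM,
    prompt_tuning_init=PromptTuningInit.TEXT,
    prompt_tuning_init_text="Classify the sentiment of this customer message:",
    num_virtual_tokens=8,            # only these soft-prompt embeddings are trained
    tokenizer_name_or_path=base,
)
model = get_peft_model(model, peft_config)
model.print_trainable_parameters()   # a tiny fraction of the full model's weights

# From here, a standard training loop over labeled (prompt, completion) pairs
# adjusts only the soft prompt; the frozen base model plus that learned prompt
# is what gets deployed.
```

The appeal of this approach for enterprises is that the expensive base model never changes; only a few thousand prompt parameters are stored per use case.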
According to the HR software provider Workable, a prompt engineer specializes in designing, developing and refining AI-generated text prompts, optimizing prompt performance, and improving the AI prompt generation process for a range of applications. (Exactly how “engineer” got tacked onto the job of creating input instructions for genAI engines is beyond me. Like I said, writhing beast.)
IBM’s watsonx is an enterprise-focused AI platform the company distinguishes from the generative AI used for “entertainment,” such as writing song lyrics or seeing how a version of your wedding vows would sound if written by Hunter S. Thompson. The company debuted the platform in July of this year with three components:
- watsonx.ai: This new studio for foundation models, generative AI and machine learning can help organizations train, validate, tune, and deploy foundation and machine learning models.
- watsonx.data: A fit-for-purpose data store built on an open lakehouse architecture, designed to scale AI workloads, for all data, anywhere.
- watsonx.governance: This enables responsibility, transparency and explainability in data and AI workflows, helping organizations direct, manage and monitor their AI activities.
The watsonx.ai component will get Tuning Studio in the third quarter of this year, the company says. The other two components of the platform will also receive some upgrades:
- watsonx.data: Planned generative AI capabilities in watsonx.data will help users discover, augment, visualize and refine data for AI through a self-service experience powered by a conversational, natural language interface. The company plans to issue a tech preview in the fourth quarter of this year. It also plans to integrate a vector database capability into watsonx.data to support retrieval augmented generation use cases in watsonx.ai (a bare-bones sketch of that retrieval pattern follows this list), again as a tech preview in the fourth quarter.
- watsonx.governance: Model risk governance for generative AI arrives as yet another tech preview, in which clients can explore capabilities for the automated collection and documentation of foundation model details, along with model risk governance capabilities. IBM said these help stakeholders view relevant metrics for their enterprise-wide AI workflows in dashboards, with approvals built in so humans are engaged at the right times.
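For context on why a vector database matters to generative AI, here is a deliberately simple retrieval augmented generation sketch in plain Python with NumPy: documents are embedded into vectors, a question is matched against them by cosine similarity, and the best matches are stuffed into the prompt before generation. The embed() function and the sample documents are placeholders, not anything from watsonx; a real deployment would use an embedding model and a vector store in their place.

```python
# Minimal retrieval augmented generation (RAG) pattern. The in-memory index
# stands in for a vector database; embed() is a placeholder for a real
# embedding model.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Placeholder embedding: hash words into a fixed-size unit vector."""
    vec = np.zeros(64)
    for word in text.lower().split():
        vec[hash(word) % 64] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# 1. Index documents as vectors (the job a vector database does at scale).
documents = [
    "Tuning Studio lets teams tune foundation models with labeled data.",
    "watsonx.data is built on an open lakehouse architecture.",
    "watsonx.governance tracks model risk across AI workflows.",
]
index = np.stack([embed(d) for d in documents])

# 2. Retrieve the most relevant documents for a user question.
question = "How do I tune a foundation model?"
scores = index @ embed(question)              # cosine similarity (unit vectors)
top_docs = [documents[i] for i in scores.argsort()[::-1][:2]]

# 3. Ground the generation prompt in the retrieved context before handing it
#    to an LLM (the generation call itself is omitted here).
prompt = "Answer using only this context:\n" + "\n".join(top_docs) + f"\n\nQuestion: {question}"
print(prompt)
```

The point of the pattern is that the model answers from retrieved, current enterprise data rather than from whatever it memorized in training, which is exactly the use case IBM is targeting with the vector database preview.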
IBM is also enhancing the watsonx platform with some AI assistants to help users with things like application modernization, customer care, and human resources. And the company plans to embed watsonx.ai tech across its hybrid cloud software and infrastructure products.
Is prompt engineering a “game-changing skill,” as some feverish tech reporters have suggested, or will it fizzle as more specialty tools like Tuning Studio emerge? I suspect that both are true… sort of. Generative AI is already changing the way developers work. GitHub Copilot and Amazon’s CodeWhisperer are just two examples of a type of AI-supported coding assistant that is certain to become ubiquitous. And the ability to develop and refine AI-generated text for modern applications and systems is likely to find its way into a lot of developer toolboxes.
Posted by John K. Waters on October 9, 2023 at 12:11 PM