OpenAI announced its intent to let customers build their own “GPTs” at its DevDay conference on November 6, 2023. Here’s what the company said in its blog post that day.
We’re rolling out custom versions of ChatGPT that you can create for a specific purpose — called GPTs. GPTs are a new way for anyone to create a tailored version of ChatGPT to be more helpful in their daily life, at specific tasks, at work, or at home — and then share that creation with others. For example, GPTs can help you learn the rules to any board game, help teach your kids math, or design stickers. — OpenAI
Creating custom versions of ChatGPT sounds excellent. But there is a caveat: you must have a ChatGPT Plus or Enterprise account to use the new GPTs. The cost starts at US$20 per month. However, if other ChatGPT Plus users interact with your custom GPT, OpenAI pays you a small royalty based on the number of user interactions.
I spent the last month experimenting with custom GPTs to understand the system’s benefits and limitations. I built a creative writing chatbot called the RobGon Dialog Assistant, which suggests new dialog inspired by literature in the public domain. I also created two versions of a chatbot to generate musical chord progressions based on songs in the public domain. The first version, RobGon Chord Composer, loads relevant song data from a simple text file. The second version, RobGon Chord Composer Presto, gets the song data via a custom service I wrote.
Retrieval Augmented Generation
Large Language Models (LLMs), like ChatGPT, often perform better if they can access external data before answering users’ questions. This technique is called Retrieval Augmented Generation (RAG) [1]. Instead of relying only on the LLM’s internal memory, RAG systems find and inject relevant text that may help the language model answer the users’ queries.
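To make the pattern concrete, here is a minimal sketch of a RAG loop in Python. Everything in it is illustrative rather than a description of how any particular GPT works: the tiny document list, the naive keyword-overlap retrieve() function, and the model name are all assumptions, and the call through the openai package assumes you have the library installed and an API key configured.

```python
# Minimal RAG sketch: retrieve relevant text, then inject it into the prompt.
# The retrieval step here is a naive keyword-overlap score; a real system would
# typically use embeddings and a vector index instead.

from openai import OpenAI  # assumes the openai package and an API key are set up

# Illustrative in-memory "knowledge base"
documents = [
    "Chess is played on an 8x8 board; each player starts with 16 pieces.",
    "In Monopoly, players buy properties and collect rent from opponents.",
    "Scrabble awards points based on letter values and premium board squares.",
]

def retrieve(query: str, docs: list[str], top_k: int = 1) -> list[str]:
    """Return the top_k documents sharing the most words with the query."""
    query_words = set(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(query_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def answer_with_rag(query: str) -> str:
    """Inject retrieved text into the prompt before asking the model."""
    context = "\n".join(retrieve(query, documents))
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4-turbo",  # model name is illustrative
        messages=[
            {"role": "system", "content": f"Answer using this context:\n{context}"},
            {"role": "user", "content": query},
        ],
    )
    return response.choices[0].message.content

print(answer_with_rag("How many pieces does each chess player start with?"))
```

A production retriever would replace the keyword overlap with an embedding-based semantic search over a larger corpus, which is the kind of approach described next.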
These retrieval systems can work in different ways. One method is to use a semantic text search based on the…