
Creating a Chatbot experience using Bubble & ChatGPT API

In this Bubble tutorial we demonstrate how to use the ChatGPT API to create an AI chatbot-like experience in your Bubble app. We dive right into the deep end of the Bubble API Connector; we already have getting-started videos for the OpenAI API on our channel. This video shows one way to approach the challenge of formatting the message history correctly in JSON so that your chatbot is aware of the conversation so far and can respond accordingly.

OpenAI has just released the ChatGPT API, and here is a tour of my Bubble app, which I've set up to make use of the conversational nature and abilities of ChatGPT.

Why ChatGPT is special

What I mean by that is the ability, if we look at the documentation here, to ask a question, get an answer back, and then ask a follow-up question: ChatGPT is aware of the text of the conversation that came before your current prompt.

Understanding the ChatGPT JSON

So we can see this is how ChatGPT wants you to send your request. Essentially, every time you send a message, you have to provide the full history of the conversation up to that point. We're jumping right into the deep end of the Bubble API Connector here, so do check out our earlier videos for the more basic setup you'll need. This took about half an hour to get working, and I'm going to show you how I've done it.

So if I go into the Bubble API Connector, I've got an OpenAI connection set up, and this is the body of my JSON object. Part of the challenge is that we need to provide this format, a role of system, user, or assistant with each piece of content, every time we send a request. We have to keep a history of it, and we have to do that in a way that ensures our JSON syntax is correct and isn't going to cause any errors.
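To make the shape of that body concrete outside of Bubble, here is a minimal Python sketch of the request format the Chat Completions documentation describes. The helper name and the model string are illustrative, not part of the Bubble setup:

```python
import json

def build_chat_body(history):
    """Build a Chat Completions request body.

    history: list of (role, content) tuples, oldest first, where role
    is "system", "user", or "assistant".
    """
    return json.dumps({
        "model": "gpt-3.5-turbo",  # illustrative model name
        "messages": [{"role": role, "content": content}
                     for role, content in history],
    })

body = build_chat_body([
    ("system", "You are a helpful assistant."),
    ("user", "How tall is the Shard in London?"),
])
```

The whole conversation so far travels in the `messages` array on every request; the API itself is stateless.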

My Bubble Workflow

I've simply set it up so that my API call takes everything in the messages section as a single value. Let's have a look at my page. I've got a multiline input, a button, and a repeating group, and I've created a content type called message. Now let's look at what happens when you click the button. I create a new message, and the message has two fields for storing text: one is the plain text exactly as the user enters it into the multiline input, and the second is my way of generating the necessary syntax and parameters to create a line just like the one in the documentation. I've also got an option set called GPT role with the different roles: user, system, and assistant.
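That second field stores one documentation-style line per message. A sketch of what it has to produce, assuming a hypothetical helper: the important part is that quotes and line breaks the user types into the multiline input get escaped, which `json.dumps` handles here.

```python
import json

def message_fragment(role, text):
    """Return one JSON object string in the documented format,
    e.g. {"role": "user", "content": "..."}.

    json.dumps escapes any quotes or newlines inside the user's text,
    so the fragment stays valid JSON.
    """
    return json.dumps({"role": role, "content": text})

fragment = message_fragment("user", 'He said "hello"\non two lines')
```

In Bubble you would build this string with the role option set's display and the multiline input's value; the escaping concern is the same either way.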

Using Join with to structure the JSON

So I'm simply printing user in that space there, and then my multiline input, what my user has entered, goes in after it. So that I can display it, and have a bit of control over how I display it in my repeating group, I'm also saving the role (is it system, user, or assistant?) as an option set field on my message. I then send the API request to OpenAI. I do this by searching through messages (you might want to group these into conversations, but I've kept it simple for this demo), then I say each item's JSON, which is where I've generated this line here, and then join with a comma and a space. If we look at the JSON syntax here, we can see that it needs to join every statement with a comma and a space. One of the errors I see people making all the time with JSON is putting a comma after the final item: you don't. You only have a comma between statements.
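The "join with" step above maps directly onto a string join, which by construction puts separators only between items and never after the last one. A small sketch, with the per-message fragments pre-built as in the Bubble workflow:

```python
import json

def messages_to_json_array(fragments):
    """Join pre-built JSON object strings into one JSON array.

    ", ".join() places a comma and a space between every pair of
    fragments, with no trailing comma after the final one.
    """
    return "[" + ", ".join(fragments) + "]"

history = [
    json.dumps({"role": "user",
                "content": "How tall is the Shard in London?"}),
    json.dumps({"role": "assistant",
                "content": "The Shard is about 310 metres tall."}),
]
array_text = messages_to_json_array(history)
```

That array is what gets substituted into the messages section of the API Connector's body.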

So that gets the format right, and by doing a search I get the history of all the messages sent previously. I then create a new message for the reply, with the GPT role set to assistant. I then have to go through the same formatting process, so I say the role is assistant, and the content is the response's choices: first item's message content, trimmed. I've added trimmed because sometimes the response you get back has a space or a line break at the start; trimming clears out any whitespace from the start and end of the text that's returned. I also store the reply as plain text. Lastly, I reset my input, ready to take another message.
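Bubble's trimmed operator corresponds to a plain whitespace strip. A sketch of extracting the reply, using a hypothetical sample response in the documented `choices` shape (a real one would come back from the API call):

```python
import json

# Hypothetical sample response; "\n\n" at the start of content is the
# kind of leading whitespace the API sometimes returns.
response_json = (
    '{"choices": [{"message": {"role": "assistant", '
    '"content": "\\n\\nYes, the Shard is currently the tallest '
    'building in London."}}]}'
)

def extract_reply(raw):
    """choices: first item's message's content, then trimmed."""
    data = json.loads(raw)
    return data["choices"][0]["message"]["content"].strip()

reply = extract_reply(response_json)
```

Without the strip, the leading line breaks would show up at the top of the repeating group cell.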

Testing conversation-aware AI

So let's check it. Let's see if this is a history-aware conversational AI. I can ask it, how tall is the Shard in London? And I get a factual statement back from the assistant. Then let's say, is it the tallest building in London?

I'm deliberately saying is it, because I'm checking whether my API request to OpenAI is aware of the previous history of the conversation, namely that I'm talking about the Shard. Let's see if this works. Yes, the Shard is currently the tallest building in London. So there we go. We've got it working.

The last bit I'm going to show, in case anyone feels I've missed anything, is my repeating group with its search for messages. All I'm doing is printing the current cell's message's GPT role, so that shows user or assistant, and then printing the plain text message beneath it.

So there you have it. That's one way, worked out over the last 48 hours, to format the JSON correctly so that you have a conversational, history-aware approach to using the OpenAI ChatGPT API.
