Okay, here we go. Creating a no-code AI chatbot powered by OpenAI's GPT-4 in Bubble. Let's build it. We have a blank slate, and here is every step you're going to need.
ChatGPT / GPT-4 is in beta
Just a quick note up front: GPT-4 is currently only accessible to beta users. You can follow this tutorial with GPT-3.5 Turbo instead, and the API calls will work just as well, though GPT-4 promises much better text generation. Anyway, here is my blank canvas in Bubble. In fact, I'm starting this with a brand new Bubble app altogether.
Designing our chat UI
First of all, I'm going to put in a multiline input, style my page into a column, align it center, and put a bit of padding up top. Then I'm going to add a button, because remember, we're making an AI chatbot, so we need somewhere to type a message. I'll label the button Send message.
Connecting Bubble to OpenAI
Let's dive right into where no code meets low code: interpreting OpenAI's API documentation into the Bubble API Connector. So I go to Plugins and add the API Connector. There we go. And this isn't just restricted to OpenAI. In fact, subscribe to our YouTube channel, where we're releasing multiple videos each week, and you'll see how to do all sorts of things from web scraping through to speech-to-text. It all starts here with the Bubble API Connector. So we'll call this send... No, in fact, the title is OpenAI. That's the API's name.
Authentication with OpenAI
We're going to authenticate with a private key in the header. How do I know that? Because in the documentation's header section, H for header, we have Authorization: Bearer followed by our API key. There's another required header too, so I add it as a shared header: Content-Type, application/json. Okay, and then I start building up my actual API call.
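For anyone curious what those two headers boil down to outside of Bubble, here's a minimal sketch in Python. The key value is a placeholder, not a real key:

```python
# The two headers Bubble's API Connector sends with every OpenAI call.
# OPENAI_API_KEY is a placeholder -- substitute your own secret key.
OPENAI_API_KEY = "sk-your-key-here"

headers = {
    # Shared header: the request body is JSON
    "Content-Type": "application/json",
    # Authorization header: the word "Bearer", a space, then the key
    "Authorization": f"Bearer {OPENAI_API_KEY}",
}
```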
So this is where I write 'send message', and I'm going to change it to an action, because I want to be able to run it during a workflow. And because I'm sending data, I change the method to POST. I then copy everything that is in the data field of the example API call and paste it into the body.
Understanding different ChatGPT models
So here we go. You can see there are a number of different models, and if I open that up, there they are: the models available as of recording, towards the end of March 2023. We're going to be using GPT-4 because I've got beta access to it. If you don't have beta access, your next best bet is GPT-3.5 Turbo; that's essentially the model that was powering ChatGPT, the one everyone justifiably got so excited about. But for now, we're using GPT-4, so I change the model to gpt-4. We also need to know where we're sending our call, which is this endpoint here, and we need our API key. The key value starts with the word Bearer, then I go into my OpenAI account and create a new key (this key will be deleted by the time this video is published) and paste it in.
Let's just see what happens if I send 'hello'. Let's try. Okay, I've got a response, and the reply from OpenAI is 'Hello! How can I help you today?' Quite a standard opening to a chat. So I click Save.
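To make the plumbing concrete, here's a hedged sketch of the same call outside Bubble, in Python using only the standard library. The endpoint and body shape come from OpenAI's chat completions documentation; `build_payload` and `send_message` are just illustrative names:

```python
import json
import urllib.request

API_URL = "https://api.openai.com/v1/chat/completions"

def build_payload(content: str, model: str = "gpt-4") -> dict:
    """The JSON body the API Connector posts: a model plus one user message."""
    return {"model": model,
            "messages": [{"role": "user", "content": content}]}

def send_message(api_key: str, content: str) -> dict:
    """POST the payload to OpenAI and return the parsed JSON response."""
    request = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(content)).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {api_key}"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())
```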
Adding ChatGPT to a webpage
And then let's get that working on our page here. The input placeholder will be 'Enter your message here'. And how do I connect the API call I've just set up to the UI I've created on this page? I go Start/Edit workflow. And because I made it an action and the call has been initialized successfully, it appears here: OpenAI - send message. You'll see that corresponds to the labels in the API Connector, OpenAI, send message. So I send the message.
And in this case, and we're going to expand upon this, I'm going to start off simple: sending one message at a time and getting one response back. But later in this video I'll cover how to get a conversation going. One of the magic things about GPT is its ability to hold a conversation: it's aware of the messages that have been sent previously and bears them in mind when composing replies. It's aware of the conversation's history.
But for now, we'll keep it simple. I put the multiline input into the call, and then I need a way of displaying the result. So I create a repeating group, change it to a column, and give it a little bit of spacing up top. I'm going to create a data type called Message, because I'm going to store every reply I get back from OpenAI as a message in my database. I'll give it a field called content, of type text, named that way to keep things clear. So when my button is clicked, I send whatever is in my text box to OpenAI, Bubble gets a response, and I want to save that response.
Saving ChatGPT responses to my Bubble database
So I create a new thing, a Message, and for the content I use the result of step 1. And just because I've been working a lot with the OpenAI API over the last few weeks, I know I go into choices. It's termed choices because your API request can include the option of OpenAI returning more than one candidate reply.
Because we haven't stipulated that, we're asking for one reply back, so it's the first item, then message, then content. To display it, I do a search for Messages and arrange them in reverse date order: sort by created date, descending, so that the latest one appears at the top. I remove the row limit and add a text element whose text is the current cell's Message's content, with width 100%. Then, to get this styled a bit more nicely, I remove the min height and put some padding in around the text; let's go for 16. Okay, we should be able to test that.
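That 'choices, first item, message, content' path can be sketched like this, using a trimmed-down example of the response body OpenAI sends back (other fields omitted for brevity):

```python
# Trimmed-down example of a chat completions response body.
sample_response = {
    "choices": [
        {"message": {"role": "assistant",
                     "content": "Hello! How can I help you today?"}}
    ]
}

def extract_reply(response: dict) -> str:
    """Walk choices -> first item -> message -> content, as in the workflow."""
    return response["choices"][0]["message"]["content"]
```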
So I should ask a question. I'm going to ask: what is the capital city of the UK? That's where we're based. Send message. Okay, I get an error. Right, I do like to keep these videos raw, so what have I done wrong here? Let's go back into the API Connector. And I've not put my message in. I can't believe I missed this.
I need to make this a dynamic value, as Bubble terms it. So I'll call it 'message content', and I untick 'private' because I want to be able to set it in my workflow. Back in my workflow (and apologies if you've been following along, that was an oversight on my part), I put my multiline input's value in. You'll see that rather than exposing the whole body of the call, it now just gives me space to replace that part of the message. Let's try that: what is the capital city of the UK? 'The capital city of the United Kingdom is London.' Perfect, it's correct. Amazing. Now, what if I wanted to ask a follow-up question? If I cleared the box and sent another question, I'd get an intelligent answer back from OpenAI, but it wouldn't be aware of the previous points of the conversation.
Conversation aware OpenAI/ChatGPT API
So this moves us on to the second part of this video. If I look into the documentation for Chat, you can see that the conversation needs to be sent to OpenAI. Basically, the whole history of the conversation needs to be sent each time, so your messages parameter will begin to look like this example here.
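As a sketch of what that growing messages parameter looks like, here's the array after a couple of turns of our example conversation; every request re-sends the whole list, oldest first:

```python
# The full conversation history goes out with every request,
# oldest message first.
messages = [
    {"role": "user", "content": "What is the capital city of the UK?"},
    {"role": "assistant",
     "content": "The capital city of the United Kingdom is London."},
    {"role": "user",
     "content": "Can you recommend an itinerary for a day trip there?"},
]
```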
So how do we do that in Bubble? Well, there's a way, and I'm going to show you. Instead of just the content being dynamic, I'm going to make this whole section dynamic, and I'll call the parameter 'messages' instead. Then I need a way of building up the correct JSON syntax so I don't get any errors, and I'm going to do that by adding a few extra steps. So if I clear that out, I create a Message, because I need the history: a record of what I send as well as what I get back. So before I send the message, I save it here, with the multiline input's value as the content, and then I create a JSON version of it. This has to follow the formatting in the documentation, everything expressed in a row like this. So I copy that row, paste it in, and the actual text part is the same input from the multiline input.
But this is where my earlier ChatGPT video tripped me up; I recorded a follow-up to it, because the thing I missed was making the text JSON-safe. That means that if there are any special characters or punctuation that could cause a JSON syntax error, Bubble escapes those characters: it puts a backslash in front of the punctuation so it's not confused as being part of the code. And something that's easy to overlook is that Bubble also puts speech marks around the value automatically, so I remove the speech marks from my template; otherwise my content would be double-quoted. I then send my messages, and this now needs to be a list of all of my messages. So I do a search for Messages, ordered by created date with the oldest first, so descending is set to no. And if I look at the correct syntax here, each row ends in a curly bracket and they're joined together with commas, so I add in the JSON expression for each one, joined with a comma and a space.
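The escaping-and-joining step is roughly equivalent to this Python sketch: `json.dumps` both escapes special characters and wraps the string in double quotes, which is exactly why the template's own speech marks have to come off. The helper names here are mine, not Bubble's:

```python
import json

def message_to_json(role: str, content: str) -> str:
    # json.dumps escapes quotes/backslashes AND adds surrounding quotes,
    # so the template must not supply its own quotes around the content.
    return '{"role": "%s", "content": %s}' % (role, json.dumps(content))

def messages_to_json(rows: list) -> str:
    """Join each (role, content) pair's JSON row with a comma and a space."""
    return ", ".join(message_to_json(role, content) for role, content in rows)
```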
Okay. I then need to create a new Message with the reply from OpenAI, so I'll say content equals result of step 2, choices, first item's message's content. I then need to create a JSON equivalent of it, and while I'm thinking of that, I go through and set the role on my outgoing messages to 'user'. It's really worth experimenting with the different roles in the documentation's example here: there are three types, system, user, and assistant. For the purposes of this video I'm just using user and assistant, because that gets it working, but I'd really experiment with a system message as the first entry, because it sets the tone, an identity for OpenAI to reply with. In the docs it's 'You are a helpful assistant', but you could say 'You are an SEO copywriter'; people have got so creative over the last few weeks with just what you can do with this. So that one is the user, and this one is the assistant: I paste in my line of JSON and set the role to assistant.
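As a sketch of that role experiment, here's what a history might look like with a system message setting the identity first. The copywriter wording and the sample turns are my own illustration, not from OpenAI's docs:

```python
# Leading with a "system" message sets the assistant's identity
# before any user turns. The wording here is an illustrative example.
messages = [
    {"role": "system", "content": "You are an SEO copywriter."},
    {"role": "user",
     "content": "Suggest a page title about no-code chatbots."},
    {"role": "assistant",
     "content": "Build an AI Chatbot Without Code"},
]
```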
And then the content is exactly what I'm saving up at the top. So that's my human-readable value, and this is my JSON version: I go choices, first item, message, content. And I think on this one too, and this might come back to bite me, I need to make it JSON-safe as well. Because OpenAI could return special characters, Bubble will unescape them (I don't know if that's the correct term), meaning Bubble formats what comes back from OpenAI, and I need to make that formatting safe again when I send it in a follow-up message.
Sending OpenAI the conversation history
So our workflow here creates a Message from my user's input, sends it to OpenAI along with any historical messages, and creates a new Message from the reply. That should make this quite a lean approach, because each time I send, OpenAI is getting the full history. Now, let's test it out. I've got a nasty feeling I've missed something, but I do like doing these longer videos, and I like to show when things don't work, because it's all part of the process.
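Those three workflow steps, save the user's message, call OpenAI with the full history, save the reply, can be sketched as a single function. `call_api` stands in for the actual API call, so the shape is visible without a network round trip:

```python
def chat_turn(history: list, user_text: str, call_api) -> str:
    """One conversational turn, mirroring the Bubble workflow:
    1. save the user's message, 2. send the whole history to OpenAI,
    3. save the assistant's reply. call_api(history) -> reply text."""
    history.append({"role": "user", "content": user_text})
    reply = call_api(history)
    history.append({"role": "assistant", "content": reply})
    return reply
```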
So in actual fact, so as not to cause any confusion, I'm going to delete that first message. Right, so let's go for: what is the capital city of the UK? Send message. Okay, so there's my user's input, and we're just waiting for OpenAI to send back the assistant's reply. And there we go. Right, so now let's check whether it really is aware of the conversation history. I'm going to say: can you recommend an itinerary for a day trip there? Notice I'm not referring to London by name, so will it be aware of the history of the conversation? We see the loading bar going across; it's taking its time. Hopefully we're not going to get an error.
Waiting for a response from GPT-4
Maybe OpenAI is returning quite a large chunk of text, but we'll see. I'm also aware that Bubble can time out waiting on GPT-4, because GPT-4 can take such a long time to reply. I'm really hoping this will work; otherwise, I'll revert to GPT-3.5 Turbo, or I'll try a shorter message. I'm getting the feeling that... oh, here we go. That was so close.
And having lived in London, I can testify that Buckingham Palace, Westminster Abbey, all of these locations are indeed in London. That's why it took so long: it's actually given us an hour-by-hour itinerary, with times and locations, all different things to do in London.
Now, one thing I'll add to clean this up: when the user submits the message, I reset the input and then set focus to it, so the cursor goes back to an empty field. So let's delete all our messages and try again. In fact, even better from a user-experience perspective, on page load I set focus to the multiline input, and then when I refresh the page my cursor is there automatically, blinking away. Let's try something different: what is the capital of France? Send message. 'The capital of France is Paris.' And then I'll say: name me three things I can eat there. By asking for three, I'm hoping it limits the reply to something quite short, so we won't have to wait so long.
There we go: croissants, escargot, and crème brûlée. The only thing I'd change is to move the reset step up, so it runs earlier. That still works, because my first step saves the input's value as a new message, and only then do I clear the input. Nothing that goes to OpenAI comes from the input directly; it all comes from my database search for messages. So it doesn't matter that I've cleared it there. I create a new Message from the response, reset the input, and set focus back to it.
So there you have it. I'm hoping this covers some of the questions I've had on earlier videos to do with OpenAI, because I've shown you the whole process needed, and some rather impressive things you can do with GPT-4. Remember, if you've not got access to GPT-4, you can put in GPT-3.5 Turbo here instead, and your call will work just as well until you can get access. Please do subscribe to this channel. We're so pleased with how well everything's going on YouTube.
We're putting up multiple videos each week. If you have any requests for videos or any comments, you know where to put them: in the comments section on YouTube. We read every single one of them, and when you suggest a video, more often than not we add it to our list.
So there you have it. That is what I hope to be a very comprehensive take on building a chatbot using Bubble, using OpenAI, and using GPT-4.