**Developing a Custom GPT System for Contextualized Answering**
In this article, we explore the development of a custom GPT system designed to provide contextualized answers to user queries. Rather than relying on the model's training data alone, the system grounds GPT-4 Turbo in fresh context retrieved from Google search results, orchestrated through the OpenAI Assistants API, so it can answer questions about events that happened only days ago.
**System Design and Architecture**
The proposed system consists of several components: a query-rewriting step that converts the user's question into a search-friendly query, a `get_organic_results` function that performs the Google search, a `scrape_website` function that extracts text from the returned result, and a simple command-line interface for entering user queries. The system also relies on a grounding context: the scraped text is passed to the model alongside the original question, so the answer reflects current information rather than the model's training data. A sketch of the two tool functions is shown below.
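The following is a minimal sketch of what those two tool functions might look like, assuming SerpAPI (via the `google-search-results` package) for the search and `requests` plus Beautiful Soup 4 for the scraping; the signatures, the result count, and the truncation limit are illustrative assumptions rather than the original implementation.

```python
# Sketch of the two tool functions: SerpAPI for search, BeautifulSoup for scraping.
# Names, parameters, and limits are assumptions for illustration.
import requests
from bs4 import BeautifulSoup
from serpapi import GoogleSearch  # from the google-search-results package

SERPAPI_KEY = "YOUR_SERPAPI_KEY"  # placeholder; keep real keys out of source code


def get_organic_results(query: str, num_results: int = 3) -> list[dict]:
    """Run a Google search via SerpAPI and return the top organic results."""
    search = GoogleSearch({"q": query, "api_key": SERPAPI_KEY, "num": num_results})
    results = search.get_dict().get("organic_results", [])
    return [{"title": r.get("title"), "link": r.get("link")} for r in results[:num_results]]


def scrape_website(url: str, max_chars: int = 8000) -> str:
    """Fetch a page and return its visible text, truncated to keep the prompt small."""
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    text = " ".join(soup.get_text(separator=" ").split())
    return text[:max_chars]
```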
To develop the system, we first wrote a set of instructions defining the assistant's desired behavior, then created an assistant on top of GPT-4 Turbo that exposes the search and scraping functions as tools via function calling. Because the scraper needs a URL before it can run, the two tools are invoked serially rather than in parallel. The command-line interface runs in a loop, so users can enter a query, receive a grounded response, and immediately follow up with another question.
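Below is a hedged sketch of how such an assistant might be defined with the OpenAI Python client (v1.x) and the beta Assistants API. The assistant name and instructions mirror the description above, while the tool schemas, parameter names, and model identifier are assumptions.

```python
# Sketch of the assistant definition with the two functions exposed as tools.
# Tool schemas and the model identifier are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

assistant = client.beta.assistants.create(
    name="Google GPT",
    instructions=(
        "You are an assistant capable of fetching and displaying news "
        "articles based on user queries."
    ),
    model="gpt-4-1106-preview",  # "GPT-4 Turbo" at the time of writing
    tools=[
        {
            "type": "function",
            "function": {
                "name": "get_organic_results",
                "description": "Run a Google search and return the top organic results.",
                "parameters": {
                    "type": "object",
                    "properties": {"query": {"type": "string"}},
                    "required": ["query"],
                },
            },
        },
        {
            "type": "function",
            "function": {
                "name": "scrape_website",
                "description": "Download a page and return its visible text.",
                "parameters": {
                    "type": "object",
                    "properties": {"url": {"type": "string"}},
                    "required": ["url"],
                },
            },
        },
    ],
)
```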
**Testing and Evaluation**
To test the effectiveness of the system, we ran a series of experiments with various user queries. One such query was "Is Sam Altman fired from OpenAI this weekend?" The system produced a contextualized answer, including three bullet points that backed up the response.
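As an illustration of the grounding step that produces such answers, the sketch below combines the original question with the scraped context and applies a system message asking for the essential answer plus three supporting bullet points; the function name and model identifier are illustrative.

```python
# Minimal sketch of the grounding call, reusing the OpenAI client from above.
def answer_with_grounding(client, user_query: str, website_context: str) -> str:
    """Answer the original question using only the freshly scraped context."""
    response = client.chat.completions.create(
        model="gpt-4-1106-preview",
        messages=[
            {
                "role": "system",
                "content": (
                    "Always return only the essential parts that answer the "
                    "user's original query, and add three bullet points to "
                    "back up your reasoning for the answer."
                ),
            },
            {
                "role": "user",
                "content": f"User query: {user_query}\n\nContext:\n{website_context}",
            },
        ],
    )
    return response.choices[0].message.content
```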
We also tested the system's ability to keep up with breaking news. For example, we queried "Who won the Las Vegas F1 Grand Prix?" and received a response indicating that Max Verstappen had taken the win.
**Output Format and Structure**
One of the key features of the system is its ability to return responses in different formats, including JSON, short concise summaries, and even short poems. By adjusting the format instruction in the grounding system message, users can tailor the responses to their specific needs.
For instance, we issued the query "Who won the Las Vegas F1 Grand Prix?" and received a response in JSON format containing the essential details of the event. We then switched the format instruction to request a short concise response, which conveyed the same information in a more compact form.
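One way to support these variants, shown as a sketch below, is to keep the alternative format instructions in a small lookup and swap them into the grounding system message before each run; the exact wording of the instructions is assumed from the formats described above.

```python
# Illustrative lookup of format instructions for the grounding system message.
OUTPUT_FORMATS = {
    "bullets": (
        "Always return only the essential parts that answer the user's "
        "original query, and add three bullet points to back up your reasoning."
    ),
    "json": "Always return a short, concise response in valid JSON format.",
    "poem": "Always return the response as a short poem.",
}


def build_system_message(style: str = "bullets") -> str:
    """Pick the system message that controls how the final answer is formatted."""
    return OUTPUT_FORMATS[style]
```

Passing the chosen message as the system role in the grounding call sketched earlier is enough to switch between the bullet-point, JSON, and poem outputs described in the experiments.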
**Conclusion**
In conclusion, our custom GPT system has demonstrated its ability to provide contextualized answers to user queries with high accuracy and relevance. By integrating various components and adjusting output formats, we can tailor the responses to suit specific needs and applications. We hope that this article provides valuable insights into the development of such a system and inspires further research and innovation in natural language processing and machine learning.
**Accessing the System**
For those interested in trying out our custom GPT system for themselves, we invite you to sign up for our channel and access the GitHub repository where we will be uploading the script. We also plan to create in-depth tutorial videos for members of our community, covering topics such as system development, data preprocessing, and testing.
By working together and sharing knowledge, we can continue to push the boundaries of what is possible with natural language processing and machine learning. Thank you for tuning in, and we look forward to seeing you again soon!
"WEBVTTKind: captionsLanguage: enso of course we have to start with the most obvious query uh this weekend is Sam Alman fired from open AI in open AI Saga a Twist we found Sam Alman was dismissed but might rebound the board had once ousted him it's true but talks of return now Brew unew fired suddenly a Shockwave sent yet on governance Sam might relent if terms are met and changes made a CEO's comeback might be staged okay that's great so what you just saw is the system I created where we can take context from a Google search to kind of ground the model and also get fresh data uh all of this is kind of powered by the assistance API so let's just take a look as always let's start by looking at the flow shart so you can see in the top left corner here the user puts in a query this could be a question anything they are wondering about right uh we have then a function like that is run by gp4 Turbo to rewrite this query into a more Google search friendly query this could be like shortening it just taking out the keywords and stuff this Google Search friendly query will be sent to our assistance API that has some function calling that calls up on the Google search API function and the Beautiful soup 4 scraping function so the Google search API returns a URL that um we go and scrape and we collect the context from that website right uh before we move on I want to show you like how this works because you can see here user original user query plus website context is collected so if we take a look here now you can see we have the user query here so who won the Las Vegas F1 Grand Prix right and the rephrase Google search query could be something like Las Vegas F1 grand prix winner and then we scrape this content from our website so max foren Emer on top of the end yeah that's good retrieve contacts from Google then we put it together so we have the user query the context so maybe like who won the Las Vegas F1 Grand Prix we feed the context under here and then we could get a gbd4 response like MOX for stopen was the Las Vegas F1 grand prix winner so the model did not have any information about this but we went to Google we retrieved this information we fed it back to GPD for Turbo and now we can answer this question right so this is kind of what happens here like it handles the question with the grounded text grounded context and user receives the Rewritten final answer so we can kind of structure this format I did some tweaks about this so we can adjust what kind of format we want the output that the user receives in right uh and that is basically the flow chart now let's take a bit of a look at the python code on functions before we move on to test this okay so the first function I wanted to take a look at is the generate Google search query so here we use gp4 turbo to convert a user input into a Google search query so basically the prompt here is convert the following user query into optimize Google search query and then we have the user input right uh but I have a system message here in this uh client chat completion here you can see we used the new gp4 turbo so you are a Google search expert your task is to convert unstructured user inputs to optimize Google search queries and then I gave like an example why was Sam ultman fired from open AI optimize Google search query Sam ult fired open AI so that is basically all I gave this uh so that is a very simple function and it works pretty good we also put in a print function here so we can take a look at it uh the next one is just get organic results 
we use Ser API to use the to do the go Google search so only we missing our API key here I'm not going to show that we return three results and here we have the scrape website function pretty straightforward just a beautiful soup uh straight up scrap web parser I don't know what I'm going to call it and here is where we create an assistant with a specific name this is the client bet assistance crate so this is the API from open AI uh we have called it Google GPT you're an assistant capable of fetching displaying news article based on user queries I don't know that instruction I just kept it as is I'm not going to change up that yeah we have a new gp4 uh 128k turbo and here we have the tools right so this is the function calling so you can see the type is function it's going to get organic result that is this function right and we have the function scrape website this one so that is our two tools that we are going to do so uh these tools are not using like the parallel function calling because uh they are kind of Serial based because we need the URL before we can scrape the website but uh we could have a lot of more function calls here I might do a video on Parallel function calling that seems to be working much better now that we can do multiple things like at the same times or in parallel so I'm probably going to do a video on that later but yeah pretty straightforward function calling here we just want to return our URL and we're going to scrape that URL that is basically it and I put in just some it's a true Loop here so when we enter our your query we can get it back after the answer so we can follow up with another query right and that is basically it uh yeah I want to show you this too so here is kind of where we uh chat completion request with the grounding context here is where we can kind of adjust the because we have the context news content plus the user query right and then we're going to print this grounding context so here is kind of where you set the structure you want for the final output so I just put in always return only the essential parts that answers the user's original user query but add three bullet points to back up your reasoning for the answer so this is the system message I set for this and it seems to be working quite well as you will soon see uh and yeah that is basically it I added some colors here so you can separate the user input from the GPT for response and yeah I think that's it it's pretty straightforward we can also go over to open AI playground and here you can kind of see our system so Google GPT here we have the instructions we have the model and here is kind of our let me zoom in a bit our functions right so we have the get organic result function we have the scrape website function uh unfortunate this does not work in this playground because we can't scrape websites from here at least I have not got this to work uh but that it works great in Like We R run it in the terminal so I think we're just going to do some queries test up some stuff maybe change up the output here and we only going to search on like recent news so yeah let's try it out so of course we have to start with the most obvious query uh this weekend is Sam al man fired from open AI so let's find this out so here you can see the converted U Google search queries that is Sam ultimate open ey fired status uh and here we can see we get the context from a website right that's the white part this is from the words I think and here you can see the response yes Sam ultman was fired from 
open Ai and here we get our three bullet points the text mentions Alman was suddenly fired from the board Friday with no notice discussions about him potentially returning okay I didn't know that his firing has led to uncertainty with open CA several sen seniors researchers to resign so I got to say this is working quite well remember this of course this has nothing to do with the training data this happened just a couple of days ago so yeah I'm pretty happy how this turned out I got to say so you can see this is the context we fed it and yeah it's looking very good to be honest uh let's try some other queries and let's try to change up the output structure here but let's try something else so let's try who won the Las Vegas F1 Grand Prix wow that was a long context uh so the go converted search query was Las Vegas F1 Grand Pre winner yeah that's good and this was from today perfect MOX firstopen won the Ground Prix despite a 5-second penalty and a coition with George Russell managed to take the win this Victory Mark 53rd Grand Pre win of career first up and became the first driver to ever win three times in the same country wow that's good nice but now let's change up the output prompt a bit and see if we can get some other structures from this I changed it up to always return a short concise response in valid Json format so let's just go what Sam ultimate fire from open AI uh and yeah we get Sam ultimate fired through even date Friday Source multiple people for me with the matter okay that's something I guess it is a Json format uh okay so let's try something else so let's do who won the Las Vegas Grand Prix again so you get the search qu great let's take a look at the Json format winner Max firstopen event Las Vegas Grand Prix I guess it works so yeah I guess it works let's do something else for the output format let's let's do uh always return the response in a short poem format so let's see if this works so let's just go with the same query is Sam ultman fired in open AI Saga a Twist we found Sam ultman was dismissed but might rebound the board had once ousted him it's true but talks of return now Brew un new fired suddenly a Shockwave sent yet on governance Sam might relent if terms are met and changes made a CEO's comeback might be agage okay that's great so yeah I think we kind of pro that this works pretty good I have done some more testing too but yeah you kind of get the gist of it by just by looking at this video uh if you're interested in trying this out for yourself uh you can sign up uh as a member to my channel and I will give you access to the GitHub where I will be uploading the script and I might do some more in-depth tutorial on the members video section we will see uh other than that thank you for tuning in have a great day and I'll see you again soonso of course we have to start with the most obvious query uh this weekend is Sam Alman fired from open AI in open AI Saga a Twist we found Sam Alman was dismissed but might rebound the board had once ousted him it's true but talks of return now Brew unew fired suddenly a Shockwave sent yet on governance Sam might relent if terms are met and changes made a CEO's comeback might be staged okay that's great so what you just saw is the system I created where we can take context from a Google search to kind of ground the model and also get fresh data uh all of this is kind of powered by the assistance API so let's just take a look as always let's start by looking at the flow shart so you can see in the top left corner here the 
user puts in a query this could be a question anything they are wondering about right uh we have then a function like that is run by gp4 Turbo to rewrite this query into a more Google search friendly query this could be like shortening it just taking out the keywords and stuff this Google Search friendly query will be sent to our assistance API that has some function calling that calls up on the Google search API function and the Beautiful soup 4 scraping function so the Google search API returns a URL that um we go and scrape and we collect the context from that website right uh before we move on I want to show you like how this works because you can see here user original user query plus website context is collected so if we take a look here now you can see we have the user query here so who won the Las Vegas F1 Grand Prix right and the rephrase Google search query could be something like Las Vegas F1 grand prix winner and then we scrape this content from our website so max foren Emer on top of the end yeah that's good retrieve contacts from Google then we put it together so we have the user query the context so maybe like who won the Las Vegas F1 Grand Prix we feed the context under here and then we could get a gbd4 response like MOX for stopen was the Las Vegas F1 grand prix winner so the model did not have any information about this but we went to Google we retrieved this information we fed it back to GPD for Turbo and now we can answer this question right so this is kind of what happens here like it handles the question with the grounded text grounded context and user receives the Rewritten final answer so we can kind of structure this format I did some tweaks about this so we can adjust what kind of format we want the output that the user receives in right uh and that is basically the flow chart now let's take a bit of a look at the python code on functions before we move on to test this okay so the first function I wanted to take a look at is the generate Google search query so here we use gp4 turbo to convert a user input into a Google search query so basically the prompt here is convert the following user query into optimize Google search query and then we have the user input right uh but I have a system message here in this uh client chat completion here you can see we used the new gp4 turbo so you are a Google search expert your task is to convert unstructured user inputs to optimize Google search queries and then I gave like an example why was Sam ultman fired from open AI optimize Google search query Sam ult fired open AI so that is basically all I gave this uh so that is a very simple function and it works pretty good we also put in a print function here so we can take a look at it uh the next one is just get organic results we use Ser API to use the to do the go Google search so only we missing our API key here I'm not going to show that we return three results and here we have the scrape website function pretty straightforward just a beautiful soup uh straight up scrap web parser I don't know what I'm going to call it and here is where we create an assistant with a specific name this is the client bet assistance crate so this is the API from open AI uh we have called it Google GPT you're an assistant capable of fetching displaying news article based on user queries I don't know that instruction I just kept it as is I'm not going to change up that yeah we have a new gp4 uh 128k turbo and here we have the tools right so this is the function calling so you can see the type is 
function it's going to get organic result that is this function right and we have the function scrape website this one so that is our two tools that we are going to do so uh these tools are not using like the parallel function calling because uh they are kind of Serial based because we need the URL before we can scrape the website but uh we could have a lot of more function calls here I might do a video on Parallel function calling that seems to be working much better now that we can do multiple things like at the same times or in parallel so I'm probably going to do a video on that later but yeah pretty straightforward function calling here we just want to return our URL and we're going to scrape that URL that is basically it and I put in just some it's a true Loop here so when we enter our your query we can get it back after the answer so we can follow up with another query right and that is basically it uh yeah I want to show you this too so here is kind of where we uh chat completion request with the grounding context here is where we can kind of adjust the because we have the context news content plus the user query right and then we're going to print this grounding context so here is kind of where you set the structure you want for the final output so I just put in always return only the essential parts that answers the user's original user query but add three bullet points to back up your reasoning for the answer so this is the system message I set for this and it seems to be working quite well as you will soon see uh and yeah that is basically it I added some colors here so you can separate the user input from the GPT for response and yeah I think that's it it's pretty straightforward we can also go over to open AI playground and here you can kind of see our system so Google GPT here we have the instructions we have the model and here is kind of our let me zoom in a bit our functions right so we have the get organic result function we have the scrape website function uh unfortunate this does not work in this playground because we can't scrape websites from here at least I have not got this to work uh but that it works great in Like We R run it in the terminal so I think we're just going to do some queries test up some stuff maybe change up the output here and we only going to search on like recent news so yeah let's try it out so of course we have to start with the most obvious query uh this weekend is Sam al man fired from open AI so let's find this out so here you can see the converted U Google search queries that is Sam ultimate open ey fired status uh and here we can see we get the context from a website right that's the white part this is from the words I think and here you can see the response yes Sam ultman was fired from open Ai and here we get our three bullet points the text mentions Alman was suddenly fired from the board Friday with no notice discussions about him potentially returning okay I didn't know that his firing has led to uncertainty with open CA several sen seniors researchers to resign so I got to say this is working quite well remember this of course this has nothing to do with the training data this happened just a couple of days ago so yeah I'm pretty happy how this turned out I got to say so you can see this is the context we fed it and yeah it's looking very good to be honest uh let's try some other queries and let's try to change up the output structure here but let's try something else so let's try who won the Las Vegas F1 Grand Prix wow that was a 
long context uh so the go converted search query was Las Vegas F1 Grand Pre winner yeah that's good and this was from today perfect MOX firstopen won the Ground Prix despite a 5-second penalty and a coition with George Russell managed to take the win this Victory Mark 53rd Grand Pre win of career first up and became the first driver to ever win three times in the same country wow that's good nice but now let's change up the output prompt a bit and see if we can get some other structures from this I changed it up to always return a short concise response in valid Json format so let's just go what Sam ultimate fire from open AI uh and yeah we get Sam ultimate fired through even date Friday Source multiple people for me with the matter okay that's something I guess it is a Json format uh okay so let's try something else so let's do who won the Las Vegas Grand Prix again so you get the search qu great let's take a look at the Json format winner Max firstopen event Las Vegas Grand Prix I guess it works so yeah I guess it works let's do something else for the output format let's let's do uh always return the response in a short poem format so let's see if this works so let's just go with the same query is Sam ultman fired in open AI Saga a Twist we found Sam ultman was dismissed but might rebound the board had once ousted him it's true but talks of return now Brew un new fired suddenly a Shockwave sent yet on governance Sam might relent if terms are met and changes made a CEO's comeback might be agage okay that's great so yeah I think we kind of pro that this works pretty good I have done some more testing too but yeah you kind of get the gist of it by just by looking at this video uh if you're interested in trying this out for yourself uh you can sign up uh as a member to my channel and I will give you access to the GitHub where I will be uploading the script and I might do some more in-depth tutorial on the members video section we will see uh other than that thank you for tuning in have a great day and I'll see you again soon\n"