
Building ChatGPT with React

Apr 8, 2023

Kacey Cleveland

Building a ChatGPT clone with OpenAI's API is a great way to familiarize yourself with both OpenAI and React. This post walks through a high-level example of implementing a ChatGPT clone, based on the implementation I use in my side project. My codebase is very much a work in progress, but you can follow along with my progress in the repo below!

https://github.com/kaceycleveland/help-me-out-here

Note: This example calls the OpenAI API directly from the client. It should not be used as-is unless the requests are proxied through a backend service that hides your API keys, or you are truly building a client-side-only application.

[Screenshot: example ChatGPT client]

Dependencies

To send messages to and receive responses from OpenAI, we can use OpenAI's official npm package:

https://www.npmjs.com/package/openai

In addition to this, we will be using TypeScript and TanStack Query. TanStack Query will serve as a wrapper that sends the requests and processes the returned data so it can be consumed by our React application. You can read more about TanStack Query here.
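This post assumes a Vite + React + TypeScript setup (the client below reads its keys from import.meta.env), and the code uses the v3-style openai client (Configuration / OpenAIApi). Both packages can be installed from npm:

npm install openai @tanstack/react-query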

1. Instantiate the OpenAI Client

We first need a way to send OpenAI chat completion requests and get the responses back using the OpenAI npm package:

import { Configuration, OpenAIApi } from "openai";

const createOpenAiClient = () => {
  // Read the organization and API key from environment variables
  const config = new Configuration({
    organization: import.meta.env.OPENAI_ORG,
    apiKey: import.meta.env.OPENAI_KEY,
  });

  return new OpenAIApi(config);
};

export const openAiClient = createOpenAiClient();

Now we can use the `openAiClient` to create chat completion requests as described here.
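For example, a one-off request outside of React might look like the following. This is a minimal sketch: the import path, model name, and prompt are placeholders, and the response shape follows the v3 client (an Axios response wrapping the completion).

import { openAiClient } from "./openai";

const askOnce = async () => {
  // createChatCompletion resolves to an AxiosResponse wrapping the completion
  const response = await openAiClient.createChatCompletion({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: "Say hello!" }],
  });

  // The assistant's reply lives on the first choice
  console.log(response.data.choices[0]?.message?.content);
};

askOnce();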

2. Create a Chat Mutation Hook

We can now create a React hook that wraps the OpenAI client to make calls to the OpenAI API.

import { useMutation, UseMutationOptions } from "@tanstack/react-query";
import { openAiClient } from "../openai";
import { CreateChatCompletionResponse, CreateChatCompletionRequest } from "openai";
import { AxiosResponse } from "axios";

// Mutation hook that wraps the OpenAI chat completion call with TanStack Query
export const useChatMutation = (
  options?: UseMutationOptions<
    AxiosResponse<CreateChatCompletionResponse>,
    unknown,
    CreateChatCompletionRequest
  >
) => {
  return useMutation<
    AxiosResponse<CreateChatCompletionResponse>,
    unknown,
    CreateChatCompletionRequest
  >({
    mutationFn: (request) => {
      return openAiClient.createChatCompletion(request);
    },
    ...options,
  });
};

3. Consume the useChatMutation Hook

import { ChatCompletionRequestMessage } from "openai";
import { useState, useCallback, useRef } from "react";
import { useChatMutation } from "./useChatMutation";

function App() {
  // Store the received messages and use them to continue the conversation with the OpenAI client
  const [messages, setMessages] = useState<ChatCompletionRequestMessage[]>([]);
  const inputRef = useRef<HTMLTextAreaElement>(null);

  /**
   * Use the chat mutation hook to submit the request to OpenAI.
   * This is a basic example, but using TanStack Query lets you easily
   * render loading, error, and success states.
   */
  const { mutateAsync: submitChat } = useChatMutation({
    onSuccess: (response) => {
      const foundMessage = response.data.choices.length
        ? response.data.choices[0].message
        : undefined;
      if (foundMessage) {
        const messageBody: ChatCompletionRequestMessage[] = [
          ...messages,
          foundMessage,
        ];
        setMessages(messageBody);
      }
    },
  });

  const handleSubmit = useCallback(() => {
    if (inputRef.current?.value) {
      const messageBody: ChatCompletionRequestMessage[] = [
        ...messages,
        { role: "user", content: inputRef.current.value },
      ];
      setMessages(messageBody);
      // For simplicity, the settings sent to OpenAI are hard-coded here.
      submitChat({
        model: "gpt-3.5-turbo",
        max_tokens: 100,
        presence_penalty: 1,
        frequency_penalty: 1,
        messages: messageBody,
      });
    }
  }, [messages, submitChat]);

  return (
    <div className="App">
      <div>
        {messages.map((message, index) => {
          return (
            <div key={index}>
              <div>{message.role}</div>
              <div>{message.content}</div>
            </div>
          );
        })}
      </div>
      <div className="card">
        <textarea ref={inputRef}></textarea>
        <button onClick={handleSubmit}>Submit</button>
      </div>
    </div>
  );
}

export default App;
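As the comment in the example above notes, the mutation also exposes status flags that make loading and error UI straightforward. Here is a minimal standalone sketch using them; it owns its own mutation instance purely for illustration, and it assumes TanStack Query v4, where the flag is named isLoading (in v5 it is isPending).

import { useChatMutation } from "./useChatMutation";

function ChatStatusExample() {
  // isLoading is the v4 flag name; in TanStack Query v5 it is isPending
  const { mutate, isLoading, error, data } = useChatMutation();

  return (
    <div>
      <button
        onClick={() =>
          mutate({
            model: "gpt-3.5-turbo",
            messages: [{ role: "user", content: "Say hello!" }],
          })
        }
      >
        Ask
      </button>
      {isLoading && <div>Waiting for a response...</div>}
      {error ? <div>Something went wrong. Please try again.</div> : null}
      {data && <div>{data.data.choices[0]?.message?.content}</div>}
    </div>
  );
}

export default ChatStatusExample;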

Fin

This example can be expanded upon in various ways such as:

  • Customizing the interface/settings being sent with the messages
  • Customizing old and future messages to "prime" the AI for future responses (see the sketch after this list)
  • Better UI rendering for different states
  • Better UI rendering for returned data (think rendering code blocks or markdown from the returned OpenAI data!)
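For the "priming" item, one simple approach is to prepend a system message to every request before the user's messages. A minimal sketch follows; the helper name and prompt text are my own, not from the project.

import { ChatCompletionRequestMessage } from "openai";

// A system message that "primes" the assistant for all later responses
const SYSTEM_PROMPT: ChatCompletionRequestMessage = {
  role: "system",
  content: "You are a concise, friendly assistant. Answer in plain language.",
};

// Prepend the system prompt to whatever conversation history exists so far
export const withSystemPrompt = (
  messages: ChatCompletionRequestMessage[]
): ChatCompletionRequestMessage[] => [SYSTEM_PROMPT, ...messages];

// Usage in handleSubmit:
// submitChat({ model: "gpt-3.5-turbo", messages: withSystemPrompt(messageBody) });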

Most of the above is what I am working on in my project:
https://github.com/kaceycleveland/help-me-out-here

If you want the full repo of this basic example, check it out here:
https://github.com/kaceycleveland/openai-example
