There is a lot of hype around ChatGPT these days. Many articles show us different examples of how it can help us; others claim that it can do the work of some specialists. Some articles provide examples of how we can integrate ChatGPT into our software systems. Even though it has a simple HTTP API, many developers prefer using a ready-to-use client rather than spending time developing their own ChatGPT client from scratch.
In this article, I'm going to review two Go language ChatGPT clients. One of them is listed on the official ChatGPT website; the other is mentioned in the awesome-go project repository.
Before you start any experiments, you need to obtain your personal API key. It is used in every query and should be included in the “Authorization” HTTP header. You can also use an organization ID if your account has multiple organizations and you want to specify which of them a particular request belongs to. It can be passed as the “OpenAI-Organization” header value, as mentioned in the official documentation.
To obtain an API key, visit this page and use the “Create new secret key” button (see image 1).
Additional information can be found here.
You can check the list of ChatGPT libraries on the official website. As you can see, only two libraries are provided initially: Python and Node.js. Nevertheless, some community libraries are also listed on the page, and a Go library (go-gpt3 by sashabaranov) is among them. If you check the awesome-go repository, you can find one more ChatGPT library: openaigo by otiai10. In this article, I am going to compare these two libraries and give some examples of their usage.
The Models methods of the ChatGPT API allow you to get a list of available models and information about them. There are only two related methods: List models and Retrieve model. More information can be found on the official website.
otiai10/openaigo supports both methods (see full code here):
```go
client := otiai10.NewClient(apiKey)

models, err := client.ListModels(ctx)
fmt.Printf("otiai10 ListModels: %v \n (err: %+v)\n\n",
	utils.SPrintStruct(models), err)

model, err := client.RetrieveModel(ctx, models.Data[0].ID)
fmt.Printf("otiai10 RetrieveModel: %v \n (err: %+v)\n\n",
	utils.SPrintStruct(model), err)
```
sashabaranov/go-openai supports only ListModels (but usually that's enough, because the same model info is included in the list response for each model) (see full code here):
```go
client := sashabaranov.NewClient(apiKey)

models, err := client.ListModels(ctx)
fmt.Printf("sashabaranov ListModels: %v \n (err: %+v)\n\n",
	utils.SPrintStruct(models), err)
```
Both Completions and Chats methods can stream back partial progress, but only sashabaranov/go-openai supports this feature.
Example of a Completions stream (see full code here):
```go
client := sashabaranov.NewClient(apiKey)

completionStream, err := client.CreateCompletionStream(ctx, sashabaranov.CompletionRequest{
	Model:       "text-davinci-003",
	Prompt:      "Say this is a test",
	MaxTokens:   7,
	Temperature: 0,
	TopP:        1,
	N:           1,
	Stream:      true,
	Stop:        []string{"\n"},
})
if err != nil {
	fmt.Printf("sashabaranov CreateCompletionStream error: %v\n", err)
	return
}
defer completionStream.Close()

resp := sashabaranov.CompletionResponse{}
for {
	resp, err = completionStream.Recv()
	if err == io.EOF {
		fmt.Printf("sashabaranov CreateCompletionStream streamReader.Recv(): EOF\n")
		break
	} else if err != nil {
		fmt.Printf("sashabaranov CreateCompletionStream streamReader.Recv() error: %v\n", err)
		break
	}
	fmt.Printf("sashabaranov CreateCompletionStream streamReader.Recv(): \n%v\n\n", utils.SPrintStruct(resp))
}
```
Example of a Chat stream (see full code here):
```go
client := sashabaranov.NewClient(apiKey)

completionStream, err := client.CreateChatCompletionStream(ctx, sashabaranov.ChatCompletionRequest{
	Model: "gpt-3.5-turbo",
	Messages: []sashabaranov.ChatCompletionMessage{
		{
			Role:    "user",
			Content: "Hello!",
		},
	},
})
if err != nil {
	fmt.Printf("sashabaranov CreateChatCompletionStream error: %v\n", err)
	return
}
defer completionStream.Close()

resp := sashabaranov.ChatCompletionStreamResponse{}
for {
	resp, err = completionStream.Recv()
	if err == io.EOF {
		fmt.Printf("sashabaranov ChatCompletionResponse streamReader.Recv(): EOF\n")
		break
	} else if err != nil {
		fmt.Printf("sashabaranov ChatCompletionResponse streamReader.Recv() error: %v\n", err)
		break
	}
	fmt.Printf("sashabaranov ChatCompletionResponse streamReader.Recv(): \n%v\n\n", utils.SPrintStruct(resp))
}
```
It’s worth mentioning that there is a pull request to the otiai10/openaigo repository that adds streaming support for Chats.
| API method | sashabaranov/go-openai | otiai10/openaigo |
|---|---|---|
| **Models** | | |
| List models | Supported | Supported |
| Retrieve model | Not supported | Supported |
| **Completions** | | |
| Completion | Supported | Supported |
| Completion stream | Supported | Not supported |
| **Chats** | | |
| Chat | Supported | Supported |
| Chat stream | Supported | Not supported |
| **Images** | | |
| Create image | Supported | Supported |
| Create image edit | Supported | Supported |
| Create image variation | Supported | Supported |
| **Audio** | | |
| Create transcription | Supported | Not supported |
| Create translation | Supported | Not supported |
| **Files** | | |
| List files | Supported | Supported |
| Upload file | Supported | Supported |
| Delete file | Supported | Supported |
| Retrieve file | Supported | Supported |
| Retrieve file content | Not supported | Supported |
| **Fine-tunes** | | |
| Create fine-tune | Supported | Supported |
| List fine-tunes | Supported | Supported |
| Retrieve fine-tune | Supported | Supported |
| Cancel fine-tune | Supported | Supported |
| List fine-tune events | Supported | Supported |
| Delete fine-tune model | Supported | Supported |
| **Engines (deprecated)** | | |
| List engines (deprecated) | Supported | Not supported |
| Retrieve engine (deprecated) | Supported | Not supported |
Both libraries support most of the API, but not all of it. If you are not sure which one is best for your software system, you can choose sashabaranov's go-gpt3 because it supports more features, such as streams. On the other hand, openaigo by otiai10 expects an io.Reader when you need to send a file to ChatGPT. In most cases this is a more flexible and convenient approach; for example, you can embed the file into the Go binary.
Lead image generated with Stable Diffusion 2.1 using the prompt: “Programming robot.”