There is a lot of hype around ChatGPT nowadays. Many articles show us different examples of how it can help us; others claim that it can do the work of certain specialists. Some articles provide examples of how we can integrate ChatGPT into our software systems. Even though ChatGPT has a simple HTTP API, many developers prefer to use a ready-made client rather than spend time developing their own ChatGPT client from scratch.

In this article I am going to review two Go ChatGPT clients. One of them is listed on the official ChatGPT website, the other is mentioned in the awesome-go project repository.

Obtaining an API key

Before you start any experiments, you need to obtain your personal API key. It is used in every request and should be included in the “Authorization” HTTP header. You can also use an organization ID if your account has multiple organizations and you want to specify which of them a particular request belongs to. It can be passed as an “OpenAI-Organization” header value, as mentioned in the official documentation.

To obtain an API key, visit this page and use the “Create new secret key” button (see image 1). Additional information can be found here.

Libraries

You can check the list of ChatGPT libraries on the official website. As you can see, only two libraries are provided initially: Python and Node.js. Nevertheless, some additional community libraries are also listed on that page, and a Go library (go-gpt3 by sashabaranov) is among them. If you check the awesome-go repository, you can find one more ChatGPT library: openaigo by otiai10. In this article, I am going to compare these two libraries and give some examples of their usage.

Simple Examples: Models

The Models methods of the ChatGPT API allow you to get a list of available models and information about them. There are only two related methods: List models and Retrieve model. More information can be found on the official website.

otiai10/openaigo supports both methods (see full code here):

```go
client := otiai10.NewClient(apiKey)

models, err := client.ListModels(ctx)
fmt.Printf("otiai10 ListModels: %v \n (err: %+v)\n\n", utils.SPrintStruct(models), err)

model, err := client.RetrieveModel(ctx, models.Data[0].ID)
fmt.Printf("otiai10 RetrieveModel: %v \n (err: %+v)\n\n", utils.SPrintStruct(model), err)
```

sashabaranov/go-openai supports only ListModels (which is usually enough, because the info for each model is already included in the List response) (see full code here):

```go
client := sashabaranov.NewClient(apiKey)

models, err := client.ListModels(ctx)
fmt.Printf("sashabaranov ListModels: %v \n (err: %+v)\n\n", utils.SPrintStruct(models), err)
```
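For comparison, and since the introduction notes that ChatGPT exposes a simple HTTP API, below is a minimal sketch of the same List models request made directly with net/http, without any client library. The endpoint and the “Authorization”/“OpenAI-Organization” headers are the ones described in the official documentation; error handling is reduced to panics for brevity.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"os"
)

func main() {
	apiKey := os.Getenv("OPENAI_API_KEY")

	req, err := http.NewRequest(http.MethodGet, "https://api.openai.com/v1/models", nil)
	if err != nil {
		panic(err)
	}
	// The API key is passed in the "Authorization" header as a Bearer token.
	req.Header.Set("Authorization", "Bearer "+apiKey)
	// Optional: specify the organization if your account has several of them.
	// req.Header.Set("OpenAI-Organization", "org-...")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(body)) // raw JSON with the list of available models
}
```

Both libraries reviewed below essentially wrap requests like this one in typed request and response structs.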
Streams: Completions and Chat

Both the Completions and Chat methods can stream back partial progress, but only sashabaranov/go-openai supports this feature.

Example of a Completions stream (see full code here):

```go
client := sashabaranov.NewClient(apiKey)

completionStream, err := client.CreateCompletionStream(ctx, sashabaranov.CompletionRequest{
	Model:       "text-davinci-003",
	Prompt:      "Say this is a test",
	MaxTokens:   7,
	Temperature: 0,
	TopP:        1,
	N:           1,
	Stream:      true,
	Stop:        []string{"\n"},
})

resp := sashabaranov.CompletionResponse{}
for {
	resp, err = completionStream.Recv()
	if err == io.EOF {
		fmt.Printf("sashabaranov CreateCompletionStream streamReader.Recv(): EOF\n")
		break
	} else if err != nil {
		fmt.Printf("sashabaranov CreateCompletionStream streamReader.Recv() error: %v\n", err)
		break
	}
	fmt.Printf("sashabaranov CreateCompletionStream streamReader.Recv(): \n%v\n\n", utils.SPrintStruct(resp))
}
```

Example of a Chat stream (see full code here):

```go
client := sashabaranov.NewClient(apiKey)

completionStream, err := client.CreateChatCompletionStream(ctx, sashabaranov.ChatCompletionRequest{
	Model: "gpt-3.5-turbo",
	Messages: []sashabaranov.ChatCompletionMessage{
		{
			Role:    "user",
			Content: "Hello!",
		},
	},
})

resp := sashabaranov.ChatCompletionStreamResponse{}
for {
	resp, err = completionStream.Recv()
	if err == io.EOF {
		fmt.Printf("sashabaranov ChatCompletionResponse streamReader.Recv(): EOF\n")
		break
	} else if err != nil {
		fmt.Printf("sashabaranov ChatCompletionResponse streamReader.Recv() error: %v\n", err)
		break
	}
	fmt.Printf("sashabaranov ChatCompletionResponse streamReader.Recv(): \n%v\n\n", utils.SPrintStruct(resp))
}
```

It's worth mentioning that there is a pull request to the otiai10/openaigo repository that adds streaming support for Chat.

Supported Methods and Examples

| Method | sashabaranov/go-openai | otiai10/openaigo | Examples |
|---|---|---|---|
| Models | | | |
| - List Models | Supported | Supported | example |
| - Retrieve Model | Not supported | Supported | example |
| Completions | | | |
| - Completion | Supported | Supported | example |
| - Completion stream | Supported | Not supported | example |
| Chat | | | |
| - Chat | Supported | Supported | example |
| - Chat stream | Supported | Not supported | example |
| Edits | Supported | Supported | example |
| Images | | | |
| - Create image | Supported | Supported | example |
| - Create image edit | Supported | Supported | example |
| - Create image variation | Supported | Supported | example |
| Embeddings | Supported | Supported | example |
| Audio | | | |
| - Create transcription | Supported | Not supported | example |
| - Create translation | Supported | Not supported | example |
| Files | | | |
| - List files | Supported | Supported | example |
| - Upload file | Supported | Supported | example |
| - Delete file | Supported | Supported | example |
| - Retrieve file | Supported | Supported | example |
| - Retrieve file content | Not supported | Supported | example |
| Fine-tunes | | | |
| - Create fine-tune | Supported | Supported | example |
| - List fine-tunes | Supported | Supported | example |
| - Retrieve fine-tune | Supported | Supported | example |
| - Cancel fine-tune | Supported | Supported | example |
| - List fine-tune events | Supported | Supported | example |
| - Delete fine-tune model | Supported | Supported | example |
| Moderations | Supported | Supported | example |
| Engines (deprecated) | | | |
| - List engines (deprecated) | Supported | Not supported | example |
| - Retrieve engine (deprecated) | Supported | Not supported | example |

Conclusion

Both libraries support most features, but not all. If you are not sure which one is best for your software system, you can choose sashabaranov's go-gpt3 because it supports more features, such as streams. On the other hand, openaigo by otiai10 expects an io.Reader when you need to send a file to ChatGPT. In most cases, this is a more flexible and convenient approach; for example, you can embed the file into a Go binary (see the sketch below).
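To illustrate that last point, here is a minimal sketch of exposing an embedded file as an io.Reader. The file name training_data.jsonl and the helper function are made up for this example, and the exact openaigo request type that consumes the reader should be checked against the library's documentation.

```go
package main

import (
	"bytes"
	_ "embed" // required for the go:embed directive below
	"fmt"
	"io"
)

// The file is compiled into the binary, so no file system access is needed at run time.
//
//go:embed training_data.jsonl
var trainingData []byte

// newTrainingDataReader returns the embedded file as an io.Reader,
// which is the type openaigo expects for file uploads.
func newTrainingDataReader() io.Reader {
	return bytes.NewReader(trainingData)
}

func main() {
	r := newTrainingDataReader()
	n, _ := io.Copy(io.Discard, r)
	fmt.Printf("embedded file is %d bytes\n", n)
}
```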
Lead image generated with Stable Diffusion 2.1 using the prompt: “Programming robot.”