
Return JSON from OpenAI to build AI-enhanced APIs

Jens Neuse

Editor's Note: While this post offers valuable insights and focuses on building AI-enhanced APIs with the WunderGraph SDK, we’d like to introduce you to WunderGraph Cosmo, our complete solution for API and GraphQL Federation. Cosmo seamlessly integrates AI-driven capabilities, enabling you to compose, extend, and scale your APIs effortlessly across complex microservices architectures. Whether you’re handling AI-powered data transformations or building federated GraphQL endpoints, Cosmo’s unique features empower you to elevate API performance, security, and developer productivity.

We're hiring!

We're looking for Golang (Go) Developers, DevOps Engineers and Solution Architects who want to help us shape the future of Microservices, distributed systems, and APIs.

By working at WunderGraph, you'll have the opportunity to build the next generation of API and Microservices infrastructure. Our customer base ranges from small startups to well-known enterprises, allowing you to not just have an impact at scale, but also to build a network of industry professionals.

When building APIs on top of OpenAI, you're usually getting plain text back. This is fine when a human interacts with the API, because they can easily "parse" the text even though it's not structured. But what about building APIs on top of OpenAI that should be consumed by other machines? How can we build APIs and document them using OpenAPI while still using OpenAI to generate the response?

The Problem: How to return structured data (JSON) from OpenAI?

Let's say we want to build an API that returns the weather of a given country. We don't want to manually write the integration code but rather use OpenAI to generate the response. However, LLMs like GPT-3 simply return plain text, not structured data (JSON). So how can we force OpenAI to return an answer that conforms to a JSON Schema, so that we can expose it as an API and document it using OpenAPI?

The Solution: With zod / JSON Schema and OpenAI Functions, you can return structured data (JSON) from OpenAI

OpenAI has a new feature called "Functions". Functions are a way to define Operations that can be called from within an LLM. Functions can be described using JSON Schema.

The LLM will not call the function directly, though. Instead, it generates inputs for the function and returns them to you. So you create a prompt and add the available functions as context, then call the OpenAI API, and the response "might" contain instructions to call a function with certain inputs.

This is a bit hard to understand, so let's look at an example.

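Here's a minimal sketch of the idea, using the openai (v3) Node SDK together with zod and zod-to-json-schema. The weather schema, the function name set_weather_result, and the prompt are illustrative placeholders:

```typescript
import { Configuration, OpenAIApi } from 'openai';
import { z } from 'zod';
import { zodToJsonSchema } from 'zod-to-json-schema';

// The structure we want the answer to conform to (illustrative shape)
const WeatherResult = z.object({
  country: z.string(),
  city: z.string(),
  temperatureCelsius: z.number(),
  summary: z.string(),
});

const openai = new OpenAIApi(new Configuration({ apiKey: process.env.OPENAI_API_KEY! }));

async function main() {
  const completions = await openai.createChatCompletion({
    model: 'gpt-3.5-turbo-0613',
    messages: [
      { role: 'user', content: "What's the weather like in the capital of Germany?" },
    ],
    // Describe a single "function" whose parameters are our zod schema (as JSON Schema)
    // and force the model to "call" it, i.e. to emit arguments matching that schema.
    functions: [
      {
        name: 'set_weather_result',
        description: 'Receives the structured weather result',
        parameters: zodToJsonSchema(WeatherResult),
      },
    ],
    function_call: { name: 'set_weather_result' },
  });

  // The LLM doesn't execute anything; it returns the arguments it would call the function with.
  const args = completions.data.choices[0].message!.function_call!.arguments!;

  // Parse the JSON string and validate it against the schema.
  const weather = WeatherResult.parse(JSON.parse(args));
  console.log(weather);
}

main();
```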

So, what happens here? We ask OpenAI to create a chat completion for a given prompt and instruct the LLM to "send" its result to a function. The parameters of that function are defined using JSON Schema, derived from a zod schema.

As a result of this prompt, we get a response from OpenAI that it wants to call the function with a JSON encoded string as input (completions.data.choices[0].message!.function_call!.arguments!).

This string can be parsed using JSON.parse and then validated using zod. After that, we can be sure that the response is valid and conforms to the schema we've defined.

What's left is to put all of the pieces together and add code generation on top, and we have a fully automated way to build APIs on top of OpenAI.

Final solution to expose an AI-enhanced API via OpenAPI

The WunderGraph Agent SDK does all of this for you out of the box. Define an Operation using TypeScript, add an agent to execute your prompt, and you're done. The framework infers the JSON Schema from the TypeScript types and generates the OpenAPI documentation for you.

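Below is a minimal sketch of what such an Operation could look like, loosely following the Agent SDK examples. The createAgent and execWithPrompt calls, the CountryByCode helper Operation, and the output schema are assumptions for illustration:

```typescript
// .wundergraph/operations/openai/weather.ts
// Sketch: assumes openAI is available in the handler context (Agent SDK configured)
// and that a "CountryByCode" Operation exists for the agent to call as a function.
import { createOperation, z } from '../../generated/wundergraph.factory';

export default createOperation.query({
  input: z.object({
    country: z.string(),
  }),
  description: 'Returns the current weather for the capital of the given country',
  handler: async ({ input, openAI }) => {
    const agent = openAI.createAgent({
      // Operations the agent is allowed to call while answering the prompt
      functions: [{ name: 'CountryByCode' }],
      // The shape the final answer must conform to; the SDK turns this zod schema
      // into JSON Schema and validates the LLM output against it.
      structuredOutputSchema: z.object({
        city: z.string(),
        temperatureCelsius: z.number(),
        summary: z.string(),
      }),
    });
    return agent.execWithPrompt({
      prompt: `What's the weather like in the capital of ${input.country}?`,
    });
  },
});
```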

We can now call this Operation with any OpenAPI client, like Postman, or even just curl.

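For example, assuming the Operation file lives at operations/openai/weather.ts and the WunderGraph node is running locally on its default port:

```bash
curl "http://localhost:9991/operations/openai/weather?country=Germany"
```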

The response will be a JSON object that conforms to the schema we've defined.

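For example, with illustrative values matching the structuredOutputSchema from the sketch above:

```json
{
  "city": "Berlin",
  "temperatureCelsius": 21,
  "summary": "Partly cloudy with a light breeze"
}
```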

Your OpenAPI documentation will be generated automatically as part of WunderGraph's code generation step.


Learn more about the WunderGraph Agent SDK

If you want to learn more about the Agent SDK in general, have a look at the announcement blog post here.

If you're looking for instructions on how to get started with the Agent SDK, have a look at the documentation.

Conclusion

OpenAI is a powerful tool for building APIs. With the new Functions feature, we can even return structured data (JSON) from OpenAI, which allows us to build APIs on top of it that can be consumed by other machines. We've also demonstrated how to use the WunderGraph Agent SDK to wire up the agents and generate OpenAPI documentation automatically.

You can check out the source code on GitHub and leave a star if you like it. Follow me on Twitter, or join the discussion on our Discord server.