Structured Outputs

Structured Output Schema

You can set the response_format parameter to your defined schema to ensure the model produces a JSON object that matches your specified structure.

import openai
import json

# Define the OpenAI client
client = openai.OpenAI(
    base_url="https://api.scx.ai/v1",
    api_key="your-scx-api-key"
)

MODEL = "DeepSeek-V3-0324"

# Define the schema
response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "data_extraction",
        "schema": {
            "type": "object",
            "properties": {
                "section": {
                    "type": "string"
                },
                "products": {
                    "type": "array",
                    "items": {
                        "type": "string"
                    }
                }
            },
            "required": ["section", "products"],
            "additionalProperties": False
        },
        "strict": False
    }
}

# Call the API
completion = client.chat.completions.create(
    model=MODEL,
    messages=[
        {
            "role": "system",
            "content": "You are an expert at structured data extraction. You will be given unstructured text and should convert it into the given structure."
        },
        {
            "role": "user",
            "content": "the section 24 has appliances, and videogames"
        }
    ],
    response_format=response_format
)

# Parse and print the structured result
print(json.loads(completion.choices[0].message.content))
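
Because strict is set to False in this example, the schema acts as guidance rather than a hard guarantee. If you want to verify the output locally, a minimal sketch using the third-party jsonschema package (an assumption here, not part of the SCX API) could look like this:

import json

import jsonschema  # third-party: pip install jsonschema (assumed available)

# Reuse the schema declared above to double-check the model output locally
schema = response_format["json_schema"]["schema"]

data = json.loads(completion.choices[0].message.content)
jsonschema.validate(instance=data, schema=schema)  # raises ValidationError on mismatch
print(data["section"], data["products"])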

JSON mode

You can set the response_format parameter to json_object in your request to ensure that the model outputs valid JSON. If the model is not able to generate valid JSON, an error is returned.

import openai

# Define the OpenAI client
client = openai.OpenAI(
    base_url="https://api.scx.ai/v1",
    api_key="your-scx-api-key"
)

MODEL = 'Meta-Llama-3.3-70B-Instruct'


def run_conversation(user_prompt):
    # Build the conversation from the system instructions and the user input
    messages = [
        {
            "role": "system",
            "content": "Always provide the response in this JSON format: {\"country\": \"name\", \"capital\": \"xx\"}"
        },
        {
            "role": "user",
            "content": user_prompt,
        }
    ]

    # Call the API in JSON mode
    response = client.chat.completions.create(
        model=MODEL,
        messages=messages,
        max_tokens=500,
        response_format={"type": "json_object"},
        # stream=True
    )

    response_message = response.choices[0].message
    print(response_message)


run_conversation('what is the capital of Austria')
ChatCompletionMessage(content='{"country": "Austria", "capital": "Vienna"}', role='assistant', function_call=None, tool_calls=None)
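
Since the API returns an error when valid JSON cannot be produced, it is often useful to guard both the request and the parsing step. Below is a minimal sketch that reuses the client and MODEL defined above; the ask_json helper name is illustrative, not part of the API:

import json

import openai


def ask_json(user_prompt):
    try:
        response = client.chat.completions.create(
            model=MODEL,
            messages=[
                {"role": "system", "content": "Reply only with a JSON object."},
                {"role": "user", "content": user_prompt},
            ],
            response_format={"type": "json_object"},
        )
        # Parse the JSON string returned by the model into a Python dict
        return json.loads(response.choices[0].message.content)
    except openai.OpenAIError as err:
        # Raised by the API, e.g. when it cannot fulfil the JSON-mode request
        print(f"API error: {err}")
    except json.JSONDecodeError as err:
        # Defensive check in case the returned content is not parseable
        print(f"Could not parse response as JSON: {err}")


result = ask_json('what is the capital of Austria')
print(result)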

Other methods of structured outputs

Beyond JSON mode, structured outputs can be generated using the Instructor library. Learn more on the Instructor integration page.
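
As a rough sketch of what an Instructor-based flow might look like (the exact usage is covered on the Instructor integration page; the SectionData model is illustrative, and instructor.from_openai follows the current Instructor API, which may differ by version):

import instructor
import openai
from pydantic import BaseModel

# Wrap the OpenAI-compatible client so responses are parsed into Pydantic models
client = instructor.from_openai(
    openai.OpenAI(base_url="https://api.scx.ai/v1", api_key="your-scx-api-key")
)


class SectionData(BaseModel):
    section: str
    products: list[str]


extraction = client.chat.completions.create(
    model="DeepSeek-V3-0324",
    response_model=SectionData,  # Instructor validates the output against this model
    messages=[
        {"role": "user", "content": "the section 24 has appliances, and videogames"}
    ],
)
print(extraction)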