Portkey exposes OpenAI's Batch API through one consistent endpoint, so you can run large, asynchronous evaluation jobs at 50% lower cost. Use batches when you need to run large jobs offline, such as nightly evals, A/B tests, or bulk embeddings.
Upload your `.jsonl` file first (see the Files API for more details), then use the following code to create a batch job.
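For reference, each line of the input file is a single JSON object describing one request, following OpenAI's batch input format. A minimal sketch that builds such a file (the model name and prompts are illustrative):

```python
import json

# Each line of the .jsonl file is one request: a unique custom_id, the HTTP
# method, the target endpoint, and the request body.
requests = [
    {
        "custom_id": f"request-{i}",
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": "gpt-4o-mini",  # illustrative; use a model your provider supports
            "messages": [{"role": "user", "content": prompt}],
        },
    }
    for i, prompt in enumerate(["Hello!", "Summarize the Batch API."])
]

# Write one JSON object per line
with open("batch_input.jsonl", "w") as f:
    for request in requests:
        f.write(json.dumps(request) + "\n")
```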
```python
from portkey_ai import Portkey

# Initialize the Portkey client
portkey = Portkey(
    api_key="PORTKEY_API_KEY",  # Replace with your Portkey API key
    provider="@PROVIDER"
)

start_batch_response = portkey.batches.create(
    input_file_id="file_id",  # file id of the input file
    endpoint="/v1/chat/completions",
    completion_window="24h",
    metadata={}  # metadata for the batch
)

print(start_batch_response)
```
```python
from portkey_ai import Portkey

# Initialize the Portkey client
portkey = Portkey(
    api_key="PORTKEY_API_KEY",  # Replace with your Portkey API key
    provider="@PROVIDER"
)

batches = portkey.batches.list()

print(batches)
```
```python
from portkey_ai import Portkey

# Initialize the Portkey client
portkey = Portkey(
    api_key="PORTKEY_API_KEY",  # Replace with your Portkey API key
    provider="@PROVIDER"
)

batch = portkey.batches.retrieve(batch_id="batch_id")

print(batch)
```
The status of a given Batch object can be any of the following:
| Status | Description |
| --- | --- |
| `validating` | The input file is being validated before the batch can begin |
| `failed` | The input file has failed the validation process |
| `file_upload_failed` | The batch job failed to upload the file to the provider after validation |
| `batch_start_failed` | The file upload succeeded but the batch job submission to the provider failed |
| `in_progress` | The input file was successfully validated and the batch is currently being run |
| `finalizing` | The batch has completed and the results are being prepared |
| `completed` | The batch has been completed and the results are ready |
| `expired` | The batch was not able to be completed within the 24-hour time window |
| `cancelling` | The batch is being cancelled (may take up to 10 minutes) |
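Because batches run asynchronously, a common pattern is to poll the retrieve endpoint until a terminal status is reached. A minimal polling sketch (the helper name, interval, and set of terminal statuses are illustrative; `retrieve` is injected as a callable so the helper works without network access in tests):

```python
import time

# Statuses after which a batch will no longer change (illustrative set,
# derived from the status table above plus the terminal "cancelled" state)
TERMINAL_STATUSES = {"completed", "failed", "file_upload_failed",
                     "batch_start_failed", "expired", "cancelled"}

def wait_for_batch(retrieve, batch_id, interval_s=30, timeout_s=24 * 3600):
    """Poll a batch until it reaches a terminal status.

    `retrieve` is a callable such as
    `lambda bid: portkey.batches.retrieve(batch_id=bid)`.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        batch = retrieve(batch_id)
        status = batch["status"] if isinstance(batch, dict) else batch.status
        if status in TERMINAL_STATUSES:
            return batch
        time.sleep(interval_s)
    raise TimeoutError(f"Batch {batch_id} did not finish within {timeout_s}s")
```

With a live client this would be called as `wait_for_batch(lambda bid: portkey.batches.retrieve(batch_id=bid), "batch_id")`.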
```python
from portkey_ai import Portkey

# Initialize the Portkey client
portkey = Portkey(
    api_key="PORTKEY_API_KEY",  # Replace with your Portkey API key
    provider="@PROVIDER"
)

cancel_batch_response = portkey.batches.cancel(batch_id="batch_id")

print(cancel_batch_response)
```