Provider Integration
For Providers
If you’d like to be a model provider and sell inference on OpenRouter, fill out our form to get started.
To be eligible to provide inference on OpenRouter, you must meet the following requirements:
1. List Models Endpoint
You must implement an endpoint that returns a list of all models available on your platform that should be served by OpenRouter. Below is an example of the response format:
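The original example is not reproduced here, so what follows is a minimal sketch of such a response written in TypeScript; the schema (`data`, `id`, `name`, `pricing`, `quantization`, `supported_parameters`, `features`) is an assumption for illustration, not OpenRouter's confirmed format:

```typescript
// Hypothetical response shape for the list-models endpoint.
// All field names are illustrative assumptions, not a confirmed schema.
interface ProviderModel {
  id: string;                     // your model's identifier
  name: string;                   // human-readable display name
  quantization: string;           // one of the valid quantization values below
  pricing: {
    prompt: string;               // USD per prompt token, as a string
    completion: string;           // USD per completion token, as a string
  };
  supported_parameters: string[]; // subset of the valid sampling parameters below
  features: string[];             // subset of the valid features below
}

const exampleResponse: { data: ProviderModel[] } = {
  data: [
    {
      id: "example-org/example-model",
      name: "Example Model 70B",
      quantization: "fp8",
      // String pricing avoids floating point precision issues (see note below)
      pricing: { prompt: "0.0000005", completion: "0.0000015" },
      supported_parameters: ["temperature", "top_p", "top_k", "stop", "seed"],
      features: ["tools", "json_mode"],
    },
  ],
};
```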
NOTE: pricing fields are in string format to avoid floating point precision issues, and must be in USD.
Valid quantization values are: int4, int8, fp4, fp6, fp8, fp16, bf16, fp32.
Valid sampling parameters are: temperature, top_p, top_k, repetition_penalty, frequency_penalty, presence_penalty, stop, seed.
Valid features are: tools, json_mode, structured_outputs, web_search, reasoning.
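As an optional convenience, the three value lists above can be encoded as TypeScript union types so a provider's endpoint payload is checked at compile time; the type names are assumptions, not part of any official OpenRouter package:

```typescript
// Union types encoding the valid values above; the type names are
// illustrative and not part of any official OpenRouter SDK.
type Quantization =
  | "int4" | "int8" | "fp4" | "fp6"
  | "fp8" | "fp16" | "bf16" | "fp32";

type SamplingParameter =
  | "temperature" | "top_p" | "top_k" | "repetition_penalty"
  | "frequency_penalty" | "presence_penalty" | "stop" | "seed";

type Feature =
  | "tools" | "json_mode" | "structured_outputs" | "web_search" | "reasoning";
```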
2. Auto Top Up or Invoicing
For OpenRouter to use a provider, we must be able to pay for inference automatically. This can be done via auto top up or invoicing.