Use this endpoint to stream individual inference data points to Openlayer. If you want to upload many inferences in one go, please use the batch upload method instead.

Authorizations

Authorization
string · header · required

Bearer authentication header of the form Bearer <token>, where <token> is your workspace API key. See "Find your API key" for more information.
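
For example, you might construct these headers in Python as in the sketch below; the environment variable name is purely illustrative:

```python
import os

# Authorization header of the form "Bearer <token>"; OPENLAYER_API_KEY is an
# illustrative environment variable holding your workspace API key.
headers = {
    "Authorization": f"Bearer {os.environ['OPENLAYER_API_KEY']}",
    "Content-Type": "application/json",
}
```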

Path Parameters

inferencePipelineId
string · required

The inference pipeline ID (a UUID).

Body

application/json
config
object · required

Configuration for the data stream. The expected fields depend on your Openlayer project's task type.

rows
object[] · required

A list of inference data points with their inputs and outputs.
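
For example, a request body for a hypothetical text-generation pipeline might look like the Python sketch below; every key inside config and every row column is illustrative, since the exact schema depends on your project's task type:

```python
# Illustrative request body; the config keys and row columns below are
# assumptions, not the definitive schema for every task type.
body = {
    "config": {
        "inputVariableNames": ["user_query"],  # columns holding model inputs
        "outputColumnName": "output",          # column holding the model output
    },
    "rows": [
        {
            "user_query": "What is the capital of France?",
            "output": "Paris",
        }
    ],
}
```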

Response

200 - application/json
success
boolean · required
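
Putting it all together, a minimal Python sketch of the request is shown below. The base URL, endpoint path, and config fields are assumptions for illustration; adapt them to your project:

```python
import os
import requests

# Assumed values for illustration only.
PIPELINE_ID = "11111111-2222-3333-4444-555555555555"  # your inference pipeline UUID
URL = f"https://api.openlayer.com/v1/inference-pipelines/{PIPELINE_ID}/data-stream"

response = requests.post(
    URL,
    headers={
        "Authorization": f"Bearer {os.environ['OPENLAYER_API_KEY']}",  # workspace API key
        "Content-Type": "application/json",
    },
    json={
        "config": {
            "inputVariableNames": ["user_query"],
            "outputColumnName": "output",
        },
        "rows": [
            {"user_query": "What is the capital of France?", "output": "Paris"},
        ],
    },
)
response.raise_for_status()
print(response.json())  # a successful call returns {"success": true}
```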