Stream your catalog data to your source

To send your catalog data to your catalog source, you must use the Stream API.


This process acts as a rebuild; therefore, any currently indexed items not contained in the payload will be deleted.

This process consists of three steps:

1. Open a stream.
2. Upload your catalog data into the stream.
3. Close the stream.

When your catalog requires an update to a subset of products, see How to update your catalog.


Stream API operations are also available in the C# Platform SDK.

Required parameters

Refer to the Push API (V1) reference for a comprehensive list of required parameters.

Step 1: Open a stream


Make the following call:

    POST https://api.cloud.coveo.com/push/v1/organizations/<MY_ORG_ID>/sources/<MY_SOURCE_ID>/stream/open HTTP/1.1

    Content-Type: application/json
    Accept: application/json
    Authorization: Bearer <MY_ACCESS_TOKEN>

You will get a response like this one:

    "streamId": "1234-5678-9101-1121",
    "uploadUri": "link:[...]",
    "fileId": "b5e8767e-8f0d-4a89-9095-1127915c89c7",
    "requiredHeaders": {
      "x-amz-server-side-encryption": "AES256",
      "Content-Type": "application/octet-stream"

Step 2: Upload your catalog data into the stream

Using the uploadUri you received, make the following call:

    PUT {uploadUri} HTTP/1.1

    x-amz-server-side-encryption: AES256
    Content-Type: application/octet-stream
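As a sketch, the same upload can be prepared with Python's standard library; the two headers are the `requiredHeaders` values returned when the stream was opened, and the function name is illustrative.

```python
import urllib.request

def build_upload_request(upload_uri, payload):
    """Build the PUT request that sends the catalog payload (bytes)
    to the pre-signed upload URI returned by the open-stream call."""
    return urllib.request.Request(
        upload_uri,
        data=payload,
        method="PUT",
        headers={
            "x-amz-server-side-encryption": "AES256",
            "Content-Type": "application/octet-stream",
        },
    )
```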

You can now upload your catalog data (JSON file). The following is an example of a content payload containing products, variants, and availabilities in the body of the request:

    "AddOrUpdate": [
     "DocumentId": "product://001-red",
     "FileExtension": ".html",
     "ec_name": "Coveo Soccer Shoes - Red",
     "model": "Authentic",
     "ec_brand": ["Coveo"],
     "ec_description": "<p>The astonishing, the original, and always relevant Coveo style.</p>",
     "color": ["Red"],
     "ec_item_group_id": "001",
     "productid": "001-red",
     "ec_images": ["https://myimagegallery?productid"],
     "gender": "Men",
     "ec_price": 28.00,
     "ec_category": "Soccer Shoes",
     "objecttype": "Product"
     "DocumentId": "variant://001-red-8_wide",
     "FileExtension": ".html",
     "ec_name": "Coveo Soccer Shoes - Red / Size 8 - Wide",
     "sku": "001-red-8_wide",
     "productsize": "8",
     "width": "wide",
     "productid": "001-red",
     "objecttype": "Variant"
      "DocumentId": "store://s000002",
      "title": "Montreal Store",
      "lat": 45.4975,
      "long": -73.5687,
      "availableskus": ["001-red-8_wide","001-red-9_wide","001-red-10_wide","001-red-11_wide", "001-blue-8_wide"],
      "availabilityid": "s000002",
      "objecttype": "Availability"
   // ...
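Assembling the payload programmatically avoids the nesting and comma mistakes that hand-edited JSON invites. A sketch using the field names from the example above (the helper name is illustrative):

```python
import json

def build_catalog_payload(*item_groups):
    """Flatten lists of product, variant, and availability dictionaries
    into a single AddOrUpdate payload."""
    items = [item for group in item_groups for item in group]
    return {"AddOrUpdate": items}

products = [{
    "DocumentId": "product://001-red",
    "objecttype": "Product",
    "ec_name": "Coveo Soccer Shoes - Red",
    "productid": "001-red",
}]
variants = [{
    "DocumentId": "variant://001-red-8_wide",
    "objecttype": "Variant",
    "sku": "001-red-8_wide",
    "productid": "001-red",
}]

payload = build_catalog_payload(products, variants)
body = json.dumps(payload).encode("utf-8")  # the bytes to PUT to uploadUri
```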

When your Catalog source is used in a catalog configuration, currently indexed items not contained in the payload are automatically deleted. To prevent users from inadvertently deleting a large number of items in a Catalog source, the delete operation is skipped during the stream (rebuild) process if it would delete all existing items. To delete all indexed items, perform a full document update instead.

When your Catalog source isn’t used with a catalog configuration, and you open and close a stream with no content payload, all content will be deleted from your source.

Catalog payload exceeds 256 MB

When a single catalog metadata payload exceeds 256 MB, you will need to send the additional payload in multiple document uploads or chunks.
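One way to handle this is to greedily pack items into payloads whose serialized size stays under a byte budget. A sketch (the 256 MB cap is documented above; the packing strategy and names are illustrative):

```python
import json

MAX_PAYLOAD_BYTES = 256 * 1024 * 1024  # documented per-payload cap

def split_into_payloads(items, max_bytes=MAX_PAYLOAD_BYTES):
    """Greedily group items into AddOrUpdate payloads whose JSON
    serialization stays at or below max_bytes."""
    wrapper = len(json.dumps({"AddOrUpdate": []}))  # bytes of the envelope
    payloads, current, used = [], [], 0
    for item in items:
        item_bytes = len(json.dumps(item)) + 2  # +2 for the ", " separator
        if current and used + item_bytes > max_bytes - wrapper:
            payloads.append({"AddOrUpdate": current})
            current, used = [], 0
        current.append(item)
        used += item_bytes
    if current:
        payloads.append({"AddOrUpdate": current})
    return payloads
```

Each resulting payload is then uploaded through its own uploadUri, following the chunking process described in this section.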

Leading practice

To validate that your file is parsed successfully, we recommend testing a subset of your catalog data before uploading the entire catalog.

When you initially open the stream, you get an uploadUri and a streamId. After you've sent the first payload, request an additional upload URI for the next chunk:

    POST https://api.cloud.coveo.com/push/v1/organizations/<MY_ORG_ID>/sources/<MY_SOURCE_ID>/stream/<MY_STREAM_ID>/chunk HTTP/1.1

    Content-Type: application/json
    Accept: application/json
    Authorization: Bearer <MY_ACCESS_TOKEN>

You then receive a second uploadUri. Repeat the process until you have uploaded all metadata to your catalog.

Step 3: Close the stream


Make the following call:

    POST https://api.cloud.coveo.com/push/v1/organizations/<MY_ORG_ID>/sources/<MY_SOURCE_ID>/stream/<MY_STREAM_ID>/close HTTP/1.1

    Authorization: Bearer <MY_ACCESS_TOKEN>

When you upload a catalog into a source, it will replace the previous content of the source completely. Expect a 15-minute delay for the removal of the old items from the index.

Stream API limits

The Stream API enforces certain limits on request size and frequency.

These limits differ depending on whether the organization to which data is pushed is a production or non-production organization.

The following table indicates the Stream API limits depending on your organization type:

| Organization type | Maximum API requests per day | Burst limit (requests per 5 minutes) | Maximum LOAD requests per day | Maximum batch size |
|---|---|---|---|---|
| Production | 15,000 | 250 | 96 | 256 MB |
| Non-production | 10,000 | 150 | 96 | 256 MB |


These limits may change at any time without prior notice. Contact us if you need these limits modified.

Stream API error codes

If a request to the Stream API fails because one of the limits has been exceeded, the API will trigger one of the following response status codes:

| Status code | Triggered when |
|---|---|
| 413 | The total Stream API request size exceeds 256 MB when pushing a large file container. See Catalog payload exceeds 256 MB. |
| 429 | The total number of Stream API (LOAD and UPDATE) requests exceeds 15,000 per day (10,000 for non-production organizations). The quota is reset at midnight UTC. |
| 429 | The total number of Stream API LOAD requests exceeds 96 per day (4 per hour). The quota is reset at midnight UTC. |
| 429 | The total number of Stream API requests exceeds 250 (150 for non-production organizations) in an interval of 5 minutes. The Retry-After header indicates how long the user agent should wait before making another request. |
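When the burst limit triggers a rate-limit response, clients should wait for the indicated Retry-After interval before retrying. A sketch with the request function injected so the backoff logic stands alone (429 is the standard HTTP rate-limit status; all names here are illustrative):

```python
import time

def send_with_retry(do_request, max_attempts=5, sleep=time.sleep):
    """Retry a rate-limited call.

    do_request returns (status_code, retry_after_seconds, body), where
    retry_after_seconds comes from the Retry-After response header
    (None when the header is absent)."""
    for attempt in range(max_attempts):
        status, retry_after, body = do_request()
        if status != 429:
            return status, body
        # Rate limited: honor Retry-After, else back off exponentially.
        sleep(retry_after if retry_after is not None else 2 ** attempt)
    raise RuntimeError("retry attempts exhausted while rate limited")
```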

What’s next?

Once your initial catalog data upload is complete, you can update your catalog content by performing a full document update, or make smaller adjustments to single products with a partial document update.