Using Pre-Push Extensions

This article applies to the old Crawling Module, which works with Docker. If you are using the new Crawling Module, see Using Pre-Push Extensions instead.

The old Crawling Module reached its end of life on December 31, 2020. We recommend switching to the new Crawling Module, which doesn’t require Docker.

To identify the Crawling Module you’re currently using, on the Crawling Modules page of the Coveo Administration Console, look at the Maestro reported version:

  • Versions > 1: new Crawling Module

  • Versions < 1: Crawling Module with Docker

After you create a content source, you may need to add extensions to your indexing pipeline to customize the way source items are indexed (see Creating a Crawling Module Source, Indexing Pipeline, and Indexing Pipeline Extension Overview). You can manage your indexing pipeline extensions from the Coveo Administration Console (see Extensions Page).

As detailed under Indexing Pipeline Extension Overview, indexing pipeline extensions are applied in the cloud, after the Push API has pushed your content into the Coveo Platform (see Workflow). An indexing pipeline extension can therefore only handle data that has been pushed into the platform, and can’t interact with on-premises resources to customize your content indexing process. So, if you need an extension to leverage data that isn’t indexed by your source, for example, you must rather use a pre-push extension.

A pre-push extension is a Python script that you write and save in your Crawling Module folder (see Installing Maestro). This script is applied to every item crawled by your source before it’s pushed into the cloud. Pre-push extensions are distinct and independent from indexing pipeline extensions. Consequently, you can apply a pre-push and an indexing pipeline extension to your content.

Applying a pre-push extension to a source may significantly slow down the content crawling process for this source, as the script is executed for every item crawled.
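Because the script runs once per crawled item, one way to limit this slowdown is to cache the results of any expensive per-item work. The sketch below is illustrative only: lookup_category is a hypothetical helper standing in for an expensive local lookup, and the DocumentId field name is an assumption about the pushed JSON body, not a documented contract.

```python
from functools import lru_cache

# Hypothetical expensive lookup (e.g., a query against a local
# database); lookup_category is NOT part of the Crawling Module API.
@lru_cache(maxsize=1024)
def lookup_category(uri):
    # ...query a local resource here; a constant stands in for the result...
    return "default-category"

def do_extension(body):
    # Reuse the cached result when the same key recurs, so the
    # expensive lookup doesn't run again for every crawled item.
    body['category'] = lookup_category(body.get('DocumentId', ''))
    return body
```

With this pattern, repeated items (or items sharing a lookup key) hit the in-memory cache instead of re-running the lookup.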

To apply a pre-push extension to a source

  1. Write a script to be executed by a Python 3 interpreter. The entry point method must be named do_extension.

    The following script adds a metadata named mynewmetadata to each item. You could replace the my new metadata value placeholder with logic associating mynewmetadata with data imported from a local database.

     # The name of the entry point method MUST be do_extension
     def do_extension(body):
         # Process the body (JSON representation of the item that will be pushed)
         # Example of adding a metadata
         body['mynewmetadata'] = "my new metadata value"
         # Return the body with the modifications
         return body

    Scripts importing an external dependency or Python library are supported.

  2. Save the script to C:\ProgramData\Coveo\data\Python3PrePushExtensions, along with any external dependency formatted as a python package or module.
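    A dependency shipped as a plain Python module is resolved from that same folder when the script imports it. The snippet below simulates this layout by writing a hypothetical helper module, mytagging.py, to a folder and importing it; in a real deployment you would simply save mytagging.py next to your script. The module name, its tag_for function, and the DocumentId field are all illustrative assumptions.

```python
import os, sys, tempfile

# Simulate saving a helper module next to the extension script;
# in practice, mytagging.py would simply sit in
# C:\ProgramData\Coveo\data\Python3PrePushExtensions.
folder = tempfile.mkdtemp()
with open(os.path.join(folder, "mytagging.py"), "w") as f:
    f.write("def tag_for(uri):\n"
            "    return 'pdf' if uri.lower().endswith('.pdf') else 'other'\n")
sys.path.insert(0, folder)

import mytagging  # resolved from the folder, like a co-located module

def do_extension(body):
    # Tag the item by file type using the local helper module.
    body['filetypetag'] = mytagging.tag_for(body.get('DocumentId', ''))
    return body
```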

  3. In the Coveo Administration Console, edit your source JSON to add the PrePushExtension parameter (see Edit a Source JSON Configuration). The value must be your script file name.

    The following excerpt of a source JSON configuration shows the PrePushExtension parameter added alongside other source parameters.

         "parameters": {
             "IndexSharePermissions": {
                 "sensitive": false,
                 "value": "false"
             },
             "PauseOnError": {
                 "sensitive": false,
                 "value": "true"
             },
             "OrganizationId": {
                 "sensitive": false,
                 "value": "myorganization"
             },
             "SourceId": {
                 "sensitive": false,
                 "value": "uxayrw42v6tkn2zz45tdcqsize-myorganization"
             },
             "PrePushExtension": {
                 "sensitive": false,
                 "value": ""
             }
         }

  4. Rebuild your source (see Refresh, Rescan, or Rebuild Sources).
