Using Pre-Push Extensions

This article applies to the new Crawling Module, which works without Docker. If you still use the Crawling Module with Docker, see Using Pre-Push Extensions (Docker Version) instead. You might also want to read about the advantages of the new Crawling Module.

To identify the Crawling Module you’re currently using, on the Crawling Modules page of the Coveo Administration Console, look at the version reported by Maestro:

  • Versions > 1: new Crawling Module

  • Versions < 1: Crawling Module with Docker

After you create a Crawling Module source, you may need to add extensions to your indexing pipeline to customize the way source items are indexed. You can manage your indexing pipeline extensions on the Extensions page of the Administration Console.

As detailed under Indexing Pipeline Extension Overview, indexing pipeline extensions are applied in the cloud, after the Push API has pushed your content into the Coveo Platform. An indexing pipeline extension can therefore only handle data that has been pushed into the Platform and can’t interact with on-premises resources to customize your content indexing process. So, if you need an extension to leverage data that isn’t indexed by your source, for instance, you must use a pre-push extension instead.

A pre-push extension is a Python script that you write and save on your server. This script is applied to every item crawled by your source before it’s pushed to the cloud. Pre-push extensions are distinct from and independent of indexing pipeline extensions. Consequently, you can apply both a pre-push extension and an indexing pipeline extension to your content.

Applying a pre-push extension to a source may significantly slow down the content crawling process for this source, as the script is executed for every item crawled.

To apply a pre-push extension to a source

  1. Write a script to be executed by a Python 3 interpreter. The script entry point must be a method named do_extension.

    The following script creates a new metadata named mynewmetadata. You could replace the "my new metadata value" string with code that associates mynewmetadata with data imported from a local database, as in the sketch further below.

     # Name of the entry point method MUST be do_extension
     def do_extension(body):
         # body is the JSON representation of the item that will be pushed
         # Example: add a metadata field
         body['mynewmetadata'] = "my new metadata value"
         # Return the modified body
         return body
    

    Scripts that import external dependencies or Python libraries are supported.
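
    For example, the following sketch associates mynewmetadata with a value read from a local SQLite database using the standard sqlite3 library. The database path, table and column names, and the documentId key used to look up the item are assumptions for illustration only; adapt them to your own data.

     # Hypothetical sketch: enrich items with data from a local SQLite database.
     # The database path, schema, and the 'documentId' body key are assumptions.
     import sqlite3

     DB_PATH = r"C:\ProgramData\Coveo\item_metadata.db"

     # Name of the entry point method MUST be do_extension
     def do_extension(body):
         connection = sqlite3.connect(DB_PATH)
         try:
             cursor = connection.execute(
                 "SELECT department FROM item_metadata WHERE uri = ?",
                 (body.get("documentId", ""),)
             )
             row = cursor.fetchone()
             if row is not None:
                 body['mynewmetadata'] = row[0]
         finally:
             connection.close()
         return body

    Because the script runs for every crawled item, opening a database connection per item as above adds overhead; loading the lookup data into memory once can reduce the impact on crawling speed.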

  2. Save the script to C:\ProgramData\Coveo\Maestro\Python3PrePushExtensions. In addition, if your script has external dependencies, add them to the requirements.txt file located in this folder. As you index content, the specified packages and modules will be installed or updated to ensure your extension script works properly.
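
    For example, if your script imported the requests package (a hypothetical external dependency used here only for illustration), requirements.txt would contain a line such as the following. Pinning a version is optional.

     # One dependency per line
     requests==2.31.0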

  3. On the Sources page of the Administration Console, edit your source JSON configuration to add the PrePushExtension parameter. The value must be your script file name.

    In this excerpt of a source JSON configuration, the pre-push extension script is MyPrePushExtension.py.

     ...
         "parameters": {
             "IndexSharePermissions": {
                 "sensitive": false,
                 "value": "false"
             },
             "PauseOnError": {
                 "sensitive": false,
                 "value": "true"
             },
     ...
             "OrganizationId": {
                 "sensitive": false,
                 "value": "myorganization"
             },
             "SourceId": {
                 "sensitive": false,
                 "value": "uxayrw42v6tkn2zz45tdcqsize-myorganization"
             },
             "PrePushExtension": {
                 "sensitive": false,
                 "value": "MyPrePushExtension.py"
             }
         }
     ...
    
  4. Rebuild your source.
