About Crawling Module Logs

This article applies to the new Crawling Module, which works without Docker. If you still use the Crawling Module with Docker, see About Logs (Docker Version) instead. You might also want to read about the advantages of the new Crawling Module.

To identify the Crawling Module you’re currently using, on the Crawling Modules page of the Coveo Administration Console, look at the reported Maestro version:

  • Versions > 1: new Crawling Module

  • Versions < 1: Crawling Module with Docker

When you request help from the Coveo Support team for an issue with the Coveo On-Premises Crawling Module, you might be asked to provide your instance logs. These logs are located under C:\ProgramData\Coveo\Maestro\Logs, unless you changed this location. Each log file contains traces of Crawling Module activity and events, which can be helpful when troubleshooting unusual issues.
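When gathering logs to send to the Support team, it's usually easiest to compress the entire Logs folder. The sketch below is one illustrative way to do this in Python, assuming the default log location mentioned above; the `archive_logs` helper name is hypothetical, and you should adjust the path if you changed it during installation.

```python
from pathlib import Path
import shutil

# Default Maestro log location from the documentation; adjust this if you
# chose a different location when installing the Crawling Module.
DEFAULT_LOGS_DIR = r"C:\ProgramData\Coveo\Maestro\Logs"

def archive_logs(logs_dir: str = DEFAULT_LOGS_DIR,
                 out_name: str = "crawling-module-logs") -> str:
    """Zip the Crawling Module Logs folder to share it with Coveo Support.

    Returns the path of the created .zip archive.
    """
    logs = Path(logs_dir)
    if not logs.is_dir():
        raise FileNotFoundError(f"Logs folder not found: {logs}")
    # shutil.make_archive appends the .zip extension itself.
    return shutil.make_archive(out_name, "zip", root_dir=logs)
```

You could run `archive_logs()` directly on the Crawling Module server, then attach the resulting .zip file to your support case.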

For each Crawling Module instance:

  • Crawlers logs contain information regarding content update requests that have been received from the Coveo Platform. Each of these log files is identified with a source ID that’s also displayed on the Sources page of the Coveo Administration Console.

  • Dumps are used by the Coveo Support team when investigating your issue, along with logs.

  • JobHandlers logs contain traces of the workers polling the Coveo Platform for update tasks. For example, these logs may be useful when you launch a refresh operation in the Coveo Administration Console, but the operation doesn’t appear to be processed by the Crawling Module.

  • Maestro logs are relevant when debugging issues affecting the Crawling Module as a whole, such as communication issues with the Coveo Platform, failure to start the workers, and inability to complete downloads.

  • Security-Provider logs contain information relative to security identity update requests that have been received from the Coveo Platform.

  • WorkerService logs, also known as NodeAgent logs, provide information on the worker launching process.

Component logs aren’t shared between Crawling Module instances. So, if you have more than one instance and encounter issues with a source, you must review the logs of the Crawling Module instance associated with this source. In other words, you must access the Logs folder on the server on which you deployed the Crawling Module instance with which this source is paired.
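Since crawler log files are identified with a source ID, one way to narrow down which files to review is to search the Logs folder for that ID. The following is a minimal sketch under the assumption that the source ID appears in the log file name, as described above; the `find_source_logs` helper is hypothetical, and it must be run on the server hosting the instance paired with the source.

```python
from pathlib import Path

# Default Maestro log location from the documentation; adjust if needed.
DEFAULT_LOGS_DIR = r"C:\ProgramData\Coveo\Maestro\Logs"

def find_source_logs(source_id: str, logs_dir: str = DEFAULT_LOGS_DIR):
    """Return log files whose name contains the given source ID.

    Assumes crawler log file names embed the source ID shown on the
    Sources page of the Coveo Administration Console.
    """
    logs = Path(logs_dir)
    # rglob also descends into per-component subfolders (Crawlers, etc.).
    return sorted(p for p in logs.rglob("*.log") if source_id in p.name)
```

For example, `find_source_logs("my-source-id")` would list only the log files relevant to that source, which you can then inspect or forward to Support.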
