Troubleshooting Implementation Issues

When you first deploy your search solution, or after you make major changes to it, you may introduce issues that are difficult to identify with traditional quality assurance techniques. Once the search solution is live, you can monitor usage analytics to ensure that people find what they are looking for.

The information in this topic refers to the out-of-the-box explorers, such as Content Gaps, and to the Summary dashboard, both of which can be modified or deleted by members with the required roles. Consequently, these explorers and the Summary dashboard may be missing or modified in your Coveo Cloud organization. You can recreate them (see Reset Templates and Creating Usage Analytics Dashboards).

Here are some typical issues you can find using usage analytics:

  • Non-indexed items

  • Non-indexed item fields containing relevant text

  • Queries that do not return results

  • Search boxes that apply special filters that do not work as expected

Usage analytics can help you make sure that everything is working as expected in a production environment.

To troubleshoot search implementation issues with usage analytics

Use one or more of the following methods:

  • Looking at the Content Gaps explorer:

    1. Access the administration console Content Gaps explorer (in the navigation bar on the left, under Analytics, select Reports, and then in the Reports page, in the Name column, click Content Gaps).

      This explorer provides a list of queries that did not return results (see Content Gaps).

    2. Starting with the queries that have the highest Query Count values, investigate why each query did not return results.

      • While it is normal for some queries to return no results because of typos or nonexistent content, make sure that no queries for which you would expect results appear in the list.

      • A query might return results only for certain users or in certain search interfaces, due to item permissions or hidden filters applied to the queries of some interfaces.

    3. Determine whether the query never returns results or returns results only under certain conditions.

      In the following screen capture, people find nothing when their query includes the keyword darkstar. The items might use dark star instead, in which case adding a thesaurus entry would solve the issue (see Managing Query Pipeline Thesaurus); the sketch after this method shows one way to confirm this before changing the thesaurus.

      [Screen capture: Content Gaps explorer query list]
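
    One way to validate such a hypothesis directly against the index is to compare the item counts returned for the failing query and for the suspected spelling. The following Python sketch illustrates this, assuming a Coveo Search API v2 endpoint; the API key, organization ID, and query variants are placeholders that you must replace with your own values.

      import requests

      SEARCH_URL = "https://platform.cloud.coveo.com/rest/search/v2"  # assumed endpoint; adjust to your environment
      API_KEY = "your-search-api-key"    # placeholder
      ORG_ID = "your-organization-id"    # placeholder

      def total_count(query: str) -> int:
          # Return the number of indexed items matching the query expression.
          response = requests.post(
              SEARCH_URL,
              params={"organizationId": ORG_ID},
              headers={"Authorization": f"Bearer {API_KEY}"},
              json={"q": query, "numberOfResults": 0},
              timeout=30,
          )
          response.raise_for_status()
          return response.json().get("totalCount", 0)

      # Compare the failing query with the spelling the items are suspected to use.
      for variant in ("darkstar", "dark star"):
          print(f"{variant!r}: {total_count(variant)} results")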

  • Looking at the Search Relevance explorer:

    1. Take a sample of the top queries that you expect to be most often followed by clicks from your users.

    2. Access the administration console Search Relevance explorer (in the navigation bar on the left, under Analytics, select Reports, and then in the Reports page, in the Name column, click Search Relevance).

    3. Add the Click Count metric to the table (see Add Dimensions or Metrics to the Data Table).

    4. Sort the table by descending Click Count values.

    5. Verify that your top queries are frequently followed by clicks from your users (the sketch after this method shows one way to flag those with a low click-through rate).
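
    If you export the resulting data table to a CSV file, you can also flag low-performing queries programmatically. The following Python sketch is a minimal example; the file name and the Query, Query Count, and Click Count column headers are assumptions to adapt to your actual export, and the click-through threshold is arbitrary.

      import csv

      LOW_CLICK_THROUGH = 0.10  # arbitrary threshold; tune to your own baseline

      with open("search_relevance_export.csv", newline="", encoding="utf-8") as f:
          rows = list(csv.DictReader(f))

      # Sort by query volume so the highest-impact queries come first.
      rows.sort(key=lambda r: int(r["Query Count"]), reverse=True)

      for row in rows[:50]:  # focus on the top 50 queries
          queries = int(row["Query Count"])
          clicks = int(row["Click Count"])
          click_through = clicks / queries if queries else 0.0
          if click_through < LOW_CLICK_THROUGH:
              print(f"{row['Query']!r}: {clicks} clicks for {queries} queries ({click_through:.0%})")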

  • Looking at the Summary dashboard after updating your search solution:

    1. Access the administration console Summary dashboard (in the navigation bar on the left, under Analytics, select Reports, and then in the Reports page, in the Name column, click Summary).

    2. Spot any abnormal change in the data. In particular, the Query Count (number of reported queries), Average Click Rank Over Time, and Query Click-Through (%) metrics (the latter two in the Relevancy tab) should remain stable.

      In the following example, further investigation is warranted if the decrease in Query Click-Through (%) around Apr 25 matches the date of an update to the search solution:

      [Screen capture: Summary dashboard relevance metrics over time]
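
    To cross-check dashboard trends against your release dates, you can also scan an export of the daily metrics for sudden drops. The following Python sketch assumes a CSV export with Date and Query Click-Through (%) columns, both hypothetical names to adapt to your own data, and uses an arbitrary drop threshold.

      import csv

      DROP_THRESHOLD = 10.0  # flag any day losing more than 10 percentage points

      with open("summary_daily_metrics.csv", newline="", encoding="utf-8") as f:
          rows = list(csv.DictReader(f))

      previous = None
      for row in rows:
          value = float(row["Query Click-Through (%)"])
          if previous is not None and previous - value > DROP_THRESHOLD:
              # A drop this large on the date of a search solution update deserves investigation.
              print(f"{row['Date']}: click-through dropped from {previous:.1f}% to {value:.1f}%")
          previous = value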

  • Looking at the Search Relevance explorer per search interface:

    1. Access the administration console Search Relevance explorer (in the navigation bar on the left, under Analytics, select Reports, and then in the Reports page, in the Name column, click Search Relevance).

    2. Remove the Query dimension and Visit Count metric (see Add Dimensions or Metrics to the Data Table).

    3. Add the Origin 1 (Page/Hub) and Origin 2 (Tab/Interface) dimensions to report the data by search interface (see Add Dimensions or Metrics to the Data Table and Usage Analytics Dimensions).

      As each search interface may have different configurations (filters, ranking, query extensions…), checking the relevance metrics for each interface can help you identify invalid configurations.

      If you notice a low query count (as is the case for the Portal Search hub below) or unusually low Query Click-Through (%) or Relevance Index values, there might be configuration issues with that search interface (see the sketch after this method for one way to flag such interfaces).

      [Screen capture: Search Relevance data by search hub and interface]
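
    When your deployment has many search hubs and interfaces, you can also scan an export of this data table to flag the suspicious ones. The following Python sketch assumes a CSV export with Origin 1 (Page/Hub), Origin 2 (Tab/Interface), Query Count, and Query Click-Through (%) columns; the file name, column headers, and thresholds are assumptions to adapt to your own data.

      import csv

      MIN_QUERIES = 100          # interfaces with less traffic than this deserve a closer look
      MIN_CLICK_THROUGH = 20.0   # arbitrary baseline, in percent

      with open("relevance_by_interface.csv", newline="", encoding="utf-8") as f:
          for row in csv.DictReader(f):
              queries = int(row["Query Count"])
              click_through = float(row["Query Click-Through (%)"])
              if queries < MIN_QUERIES or click_through < MIN_CLICK_THROUGH:
                  hub = row["Origin 1 (Page/Hub)"]
                  interface = row["Origin 2 (Tab/Interface)"]
                  print(f"{hub} / {interface}: {queries} queries, {click_through:.1f}% click-through")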