Evaluate Your Implementation Before Going Live

Once your implementation is complete and ready to go live, we recommend that you test its main features. This article provides a step-by-step guide to help you ensure that your solution is well implemented with respect to Coveo Usage Analytics (Coveo UA) and Coveo Machine Learning (Coveo ML).

The default Coveo-powered search pages are already properly configured to leverage Coveo UA and Coveo ML. However, during development, you may have changed the way the page behaves, and want to confirm that Coveo UA and Coveo ML are properly configured for your search interface.

You can use the implementation checklist below to track your progress throughout this article. Once you have verified the Coveo features for a given step, check the box next to it, and then proceed to the next step.

  1. Ensure that Coveo ML query suggestions are requested.

  2. Ensure that user queries are sent.

  3. Ensure that Coveo usage analytics events of all types are being recorded.

  4. Ensure that search hubs are implemented.

  5. Validate usage analytics metadata.

  6. Validate that usage analytics events are being recorded using the Administration Console.

  7. Inspect Coveo ML models.

  8. Test Coveo ML models.

Step 1: Access Your Browser Developer Tools

To follow the guidelines below, access the page where your search component is integrated, and then open the Network tab of your browser developer tools.

This article uses the Google Chrome developer tools for its examples.

In the Network tab, on the Filter bar, select XHR. Selecting the XHR data type filters out irrelevant requests.

Step 2: Ensure That Coveo Machine Learning Query Suggestions Are Requested

When users access your search interface, they likely type a query in the search box. To provide the best possible search experience, ensure that your implementation leverages Coveo ML Query Suggestions (QS) (see Providing Coveo Machine Learning Query Suggestions).

  1. Access a page in which you integrated a Coveo search box.

  2. Access the Network tab of your browser developer tools.

  3. Start typing a query in the search box.

    In the Network tab, under the Name column, you should see a coveo/rest/v2/querySuggest? request for every keystroke.


  4. Click on the latest coveo/rest/v2/querySuggest? request.

    In the Response tab, you should see the currently displayed query completions.


    See Providing Coveo Machine Learning Query Suggestions.
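If you want to reproduce this request outside the search box (for example, in a smoke test), its shape can be sketched as follows. This is a minimal sketch assuming the standard Coveo Search API Query Suggest endpoint; the organization ID, search hub, and partial query below are illustrative placeholders.

```javascript
// Sketch: the request the search box sends on each keystroke.
// All values passed in are illustrative placeholders.
function buildQuerySuggestRequest(platformUri, organizationId, partialQuery, searchHub) {
  return {
    url: `${platformUri}/rest/search/v2/querySuggest?organizationId=${encodeURIComponent(organizationId)}`,
    method: 'POST',
    body: {
      q: partialQuery,      // the partial query typed so far
      searchHub: searchHub, // should match the hub logged in usage analytics
      count: 5              // number of completions to request
    }
  };
}

const suggestRequest = buildQuerySuggestRequest(
  'https://platform.cloud.coveo.com', 'my-org-id', 'how to', 'CommunitySearch'
);
```

Sending this request with an HTTP client (and a valid access token) should return the same completions you see in the Response tab.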

Step 3: Ensure That Queries Are Sent and Search Events Are Recorded

When a query is triggered, you must ensure that the query is sent to the Search API, and that an appropriate search event is logged.

3.1: Ensure That Queries Are Sent

  1. In the page where your search component is integrated, type a query in the search box, and then press Enter or click the search button.

  2. In your browser developer tools, in the Network tab, under the Name column, you should see a request to the Search API containing /rest/search/v2.


    See Searching With Coveo Cloud.

If your implementation doesn’t rely on the Coveo JavaScript Search Framework, see Submitting a Query.
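As a sketch of what submitting a query involves, the request observed above can be reproduced as follows. The endpoint path matches the standard Search API; the organization ID, access token, and query values are placeholders.

```javascript
// Sketch: a minimal query to the Search API, outside the JavaScript Search
// Framework. All identifiers below are illustrative placeholders.
function buildSearchRequest(platformUri, organizationId, accessToken, queryText, searchHub, tab) {
  return {
    url: `${platformUri}/rest/search/v2?organizationId=${encodeURIComponent(organizationId)}`,
    method: 'POST',
    headers: {
      Authorization: `Bearer ${accessToken}`,
      'Content-Type': 'application/json'
    },
    body: {
      q: queryText,
      searchHub: searchHub, // reported as originLevel1 in usage analytics
      tab: tab              // reported as originLevel2 in usage analytics
    }
  };
}

const searchRequest = buildSearchRequest(
  'https://platform.cloud.coveo.com', 'my-org-id', 'my-token', 'test query', 'CommunitySearch', 'All'
);
```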

3.2: Ensure That Coveo UA Search Events Are Recorded

In your browser developer tools, in the Network tab, under the Name column, you should see a request containing /rest/v15/analytics/searches?.


If your search events aren’t sent, you need to add the CoveoAnalytics component on your page (see Coveo Analytics Component).
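For reference, a minimal search event payload, like the one sent to /rest/v15/analytics/searches?, can be sketched as follows. Field names follow the Coveo UA Write API; the values are placeholders.

```javascript
// Sketch: a minimal search event body. The searchQueryUid ties the event to
// the query that was sent to the Search API.
function buildSearchEvent(queryText, searchQueryUid, searchHub, tab) {
  return {
    actionCause: 'searchboxSubmit',  // what triggered the query
    queryText: queryText,
    searchQueryUid: searchQueryUid,  // should match the searchUid of the query response
    originLevel1: searchHub,         // should match the query's searchHub
    originLevel2: tab                // should match the query's tab
  };
}

const searchEvent = buildSearchEvent('test query', 'placeholder-uid', 'CommunitySearch', 'All');
```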

Step 4: Ensure That Coveo UA Click Events Are Recorded

When a click action is performed (e.g., clicking a ResultLink or a Quickview in the result list), you must ensure that an appropriate click event is logged.

In your browser developer tools, in the Network tab, under the Name column, you should see a request containing /rest/v15/analytics/click?.


If your implementation relies on JavaScript Search Framework version v2.6063 or later, you must deselect the XHR data type to see the click request in the developer tools (see JSUI-2460 Side Effect: Click Events Are No Longer Flagged as XHR Requests).

If click events aren’t logged, add the CoveoResultLink component back into all of your result templates (see Coveo ResultLink Component). This component handles click events for you, and shouldn’t be removed from result templates.

Since the September 2019 Release (v2.7023) of the JavaScript Search Framework, not including the CoveoResultLink component in your templates triggers an error in your browser console.

If your implementation doesn’t rely on the Coveo JavaScript Search Framework, see Opening Query Results and Logging Click Events.
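If you log click events yourself, a minimal payload for /rest/v15/analytics/click? can be sketched as follows. Field names follow the Coveo UA Write API; the values are placeholders.

```javascript
// Sketch: a minimal click event body for a result opened from a query.
function buildClickEvent(searchQueryUid, documentPosition, documentTitle, documentUri) {
  return {
    actionCause: 'documentOpen',
    searchQueryUid: searchQueryUid,     // ties the click to the originating query
    documentPosition: documentPosition, // 1-based rank of the clicked result
    documentTitle: documentTitle,
    documentUri: documentUri
  };
}

const clickEvent = buildClickEvent('placeholder-uid', 1, 'Example Result', 'https://example.com/item');
```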

Step 5: Ensure That Coveo UA View Events Are Recorded

If you want to leverage advanced Coveo ML algorithms and personalization such as Coveo ML Event Recommendations (ER), you must ensure that your search interface logs page view events every time a tracked page is opened.

In your browser developer tools, in the Network tab, under the Name column, you should see a request containing /rest/v15/analytics/view.


When creating your Coveo ML model, you may want to select specific contentType values to recommend to your users (see Coveo Machine Learning Recommendation Content Types).

See Sending Usage Analytics View Events.

If your implementation doesn’t rely on the Coveo JavaScript Search Framework, see Logging View Events.
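Similarly, a minimal view event payload for /rest/v15/analytics/view can be sketched as follows. The contentIdKey/contentIdValue pair identifies the viewed item in the index, and location is the URL of the tracked page; all values below are placeholders.

```javascript
// Sketch: a minimal view event body. contentType is optional and only needed
// when the page should also be considered a specific type of item.
function buildViewEvent(contentIdKey, contentIdValue, location, contentType) {
  const viewEvent = {
    contentIdKey: contentIdKey,     // e.g., '@clickableuri'
    contentIdValue: contentIdValue, // uniquely identifies the viewed item
    location: location              // URL of the tracked page
  };
  if (contentType) {
    viewEvent.contentType = contentType;
  }
  return viewEvent;
}

const viewEvent = buildViewEvent(
  '@clickableuri', 'https://docs.coveo.com/en/2640/', 'https://docs.coveo.com/en/2640/'
);
```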

Step 6: Ensure That Search Hubs Are Implemented

In your solution, each search interface intended for a specific audience should have its own unique identifier, or search hub. You can set the search hub directly in search tokens. Alternatively, in JavaScript Search Framework interfaces, you can do so by setting the searchHub option of the Analytics component.

For example, suppose you have two search interfaces targeting two distinct audiences (customers and service agents). You would then set the following search hubs to personalize the search experience for each audience:

CommunitySearch

<div id="search" class="CoveoSearchInterface">
  <!-- ... -->
  <div class="CoveoAnalytics" data-search-hub="CommunitySearch"></div>
  <!-- ... -->
</div>

AgentPanel

<div id="search" class="CoveoSearchInterface">
  <!-- ... -->
  <div class="CoveoAnalytics" data-search-hub="AgentPanel"></div>
  <!-- ... -->
</div>

Furthermore, setting search hubs according to your audiences allows Coveo ML to learn from the behavior of a particular segment of users. You can also obtain insightful information about the audience targeted by a certain search hub in Coveo UA.

In the Coveo Cloud Administration Console, the Consumption Dashboard allows you to analyze logged queries per search hub (see Using the Search Consumption Dashboard).
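If you generate search tokens server side, you can enforce the search hub there instead of in the Analytics component. A sketch of such a token request, assuming the standard search token endpoint; the user identity is a placeholder.

```javascript
// Sketch: embedding the search hub in a search token request, so that queries
// made with the resulting token are tied to that hub.
function buildTokenRequest(userEmail, searchHub) {
  return {
    url: 'https://platform.cloud.coveo.com/rest/search/v2/token',
    method: 'POST',
    body: {
      userIds: [{ name: userEmail, provider: 'Email Security Provider', type: 'User' }],
      searchHub: searchHub // locks queries made with this token to this hub
    }
  };
}

const tokenRequest = buildTokenRequest('user@example.com', 'AgentPanel');
```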

Step 7: Validate Usage Analytics Metadata

Validating usage analytics events metadata is particularly useful in the following scenarios:

  • Your implementation doesn’t rely on the Coveo JavaScript Search Framework.

  • Your implementation relies on the Coveo JavaScript Search Framework, but includes custom code that can modify standard usage analytics events.

  • Your implementation relies on the Coveo JavaScript Search Framework, and you’re pushing custom user context (typically using the PipelineContext component). See Sending Custom Context Information.

When logging usage analytics events, you should ensure that the data sent to Coveo UA and Coveo ML is accurate.

The following table indicates the properties sent to Coveo UA and Coveo ML depending on the logged event type.

You can validate these values in your implementation using your browser developer tools:

  1. Access your Coveo-powered search interface, and then, in the search box, perform a query.

  2. Enable the debug mode by either:

    • Adding &debug=true at the end of the current URL address.


    OR

    • Holding the Alt key (Option key for Mac) and double-clicking any search result box. At the top right of the Debug panel, make sure the Enable query debug option is selected. You can then close this panel by pressing the Escape key.


  3. Access the Network tab of your browser developer tools.

  4. Depending on the event category you want to inspect, click the corresponding request:

    • For a search event, the request contains /rest/v15/analytics/searches?.

    • For a click event, the request contains /rest/v15/analytics/click?.

    • For a view event, the request contains /rest/v15/analytics/view.

    • For a custom event, the request contains /rest/v15/analytics/custom.

  5. In the Headers tab, find the Request Payload section.

    If you’re inspecting search events, in the Request Payload section, expand the items in the payload array.

    You can now see the values of each logged property.


  • Some of the property values found in the usage analytics event request must match those passed in the corresponding request to the Search API. You can review these values in the Preview tab of your browser developer tools:

    1. Click the corresponding request to the Search API.

    2. Expand the executionReport property, and then expand the children property.

    3. Expand the last ResolveContext property, and then expand the result property.

    You can now compare the values of the properties displayed in the Request Payload of the usage analytics event request and those found in the request to the Search API.

    Notably, you should validate the searchHub/originLevel1 and tab/originLevel2 values.

    Coveo ML separates its models according to the search hubs and tabs. This means that if your search hub or tab values don’t correspond between your queries and your usage analytics events, Coveo ML won’t be able to suggest relevant content.

    The following table shows the equivalence of each relevant value between the Search API and the Usage Analytics API.

    Query Value | Usage Analytics Event Value
    ----------- | ---------------------------
    searchHub   | originLevel1
    tab         | originLevel2

    If the searchHub/originLevel1 or tab/originLevel2 values are different, Coveo ML considers the query and the usage analytics events unrelated, usually preventing Coveo ML from applying its suggestions.

  • The searchUid value is found at the bottom of the page generated by the Preview tab, under the results property.
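The consistency check described above can be automated with a small helper; the mock payloads below stand in for the real Search API request and usage analytics event.

```javascript
// Sketch: verifying that a query and its usage analytics event share the same
// searchHub/originLevel1 and tab/originLevel2 values.
function originsMatch(queryBody, analyticsEvent) {
  return queryBody.searchHub === analyticsEvent.originLevel1
    && queryBody.tab === analyticsEvent.originLevel2;
}

// Mock payloads for illustration.
const queryBody = { searchHub: 'CommunitySearch', tab: 'All' };
const matchingEvent = { originLevel1: 'CommunitySearch', originLevel2: 'All' };
const mismatchedEvent = { originLevel1: 'AgentPanel', originLevel2: 'All' };
```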

Step 8: Use the Administration Console to Validate That Usage Analytics Events Are Being Recorded

The Visit Browser and the Reports pages of the Administration Console offer various tools to help you monitor activities within your search interfaces.

You should thus ensure that the information displayed in usage analytics is accurate.

You must first make sure that the Analytics component is integrated into your search interfaces. This component is required to log search and click events leveraged by Coveo ML and usage analytics reports.

You can test your implementation to confirm that all the desired events are recorded:

  1. Access the search interface you want to test.

  2. Type a query in the search box, and then press Enter or click the search button.

  3. Perform search actions such as:

    • Clicking search results

    • Selecting facets and tabs

    • Sending new queries

  4. Access the Visit Browser page, and then select the date on which you performed the search actions (see Reviewing User Visits With the Visit Browser).


  5. Find your visit, and then expand it.

    You can now ensure that every single action that triggered a search, click, or custom event is listed under your visit.


    Events performed in the search interface don’t appear instantaneously in the Visit Browser. They may take a few minutes to appear.

  6. Access the Reports page, and then access one of the reports.

    The default Summary dashboard provides good insights on your implementation (see Reviewing the Search Usage Trends From the Summary Dashboard).

  7. In the upper-right corner of the page, select the date on which you performed the search actions (as you did in step 4).

  8. In the upper-left corner of the page, add a User Id dimension filter to only show information related to your visit (see Adding Global Dimension Filters).


  9. Ensure that the values displayed in the Summary dashboard match those displayed in the Visit Browser.

  10. Review the information displayed on the Incoherent Events page to see whether any usage analytics events were discarded by Coveo UA (see Reviewing Incoherent Usage Analytics Events).


Step 9: Inspect Coveo Machine Learning Models

Coveo ML features leverage AI-powered recommendations to provide the best possible search experience to your users according to their context. You must thus ensure that your Coveo ML models are properly implemented and well-trained.

9.1: Leveraging Coveo Machine Learning

If your implementation doesn’t leverage Coveo ML, the following article provides the guidelines to create and configure your models: Leveraging Coveo Machine Learning.

9.2: Configuring and Training Coveo Machine Learning Models

Depending on your context, your needs, and your available data, you may need to change the default Coveo ML model configuration and training settings (see Training and Retraining).

9.3: Reviewing Coveo Machine Learning Candidates Using the “Models” Page

You can use the Models page of the Administration Console to ensure that your Coveo ML models have relevant candidates to suggest to your end-users (see Reviewing Coveo Machine Learning Model Information).

Once you have met the prerequisites to create your model, you can review the candidates (if any) suggested by Coveo ML.

To review these candidates and other useful information about your model, access the Models page, and then select the model you want to inspect.

The information provided in this page allows you to ensure that all the prerequisites for the creation of a model are met. You can also review Coveo ML candidates that are automatically shown to your end-users (see Coveo Machine Learning Model Information Reference).

9.4: Testing ART Models

You can use the Model Testing page of the Administration Console to compare the ranking of two Coveo ML ART models. You can also use this tool to evaluate the efficiency of a given ART model compared to the default ranking behavior provided without ART (see Testing Coveo Machine Learning Models).

Once you have reviewed your Coveo ML ART model candidates on the Models page and tested the model’s ranking behavior on the Model Testing page, you may want to test it in your actual implementation.

Usually, ART boosts the ranking score of the five best search results for a given query (see Adding and Managing Coveo Machine Learning Models).

  1. Access the search interface you want to test.

  2. Send a query for which your model is able to recommend items (see Reviewing Coveo Machine Learning Model Information).

You can test whether these search result recommendations behave as expected using the following methods.

Using the Debug Panel

You can verify that an ART model recommends search results to your end-users by using the JavaScript Search Debug Panel.

  1. Open the debug panel by holding the Alt key (Option key for Mac) and double-clicking any JavaScript Search Framework component in the interface (see Access the JavaScript Search Debug Panel).

  2. In the debug panel header, select the Highlight recommendation checkbox (see Debug Panel Header).

Assuming search results are being sorted by relevance, the results whose ranking scores were modified through ART will now be highlighted in the result list.

Using Your Browser Developer Tools

  1. Enable the debug mode by either:

    • Adding &debug=true at the end of the current URL address.


    OR

    • Holding the Alt key (Option key for Mac) and double-clicking any search result box. At the top right of the panel you just opened, make sure the Enable query debug option is selected. You can then close this panel by pressing the Escape key.


  2. Access your browser developer tools.

  3. In the Network tab, under the Name column, select the last HTTP request to /rest/search/v2.


  4. Select the Preview tab. You should now see the query response body.

  5. Expand the executionReport property.

  6. Expand the children property. You should now be able to visualize the whole path your query took before the results were returned to the search interface.


  7. Expand the PreprocessQuery property, and then expand the children property.

  8. Expand the CallingRevealTopClicks property, and then expand the response property.

  9. Expand the predictions property.

Under predictions, you should see items suggested by ART and their ranking score (see Understanding Search Result Ranking).


Suggested items shown under predictions are identified with their contentIdValue field value.
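The traversal in steps 5 to 9 can also be sketched programmatically. The property names used below (name, children, response, predictions) are assumptions based on the structure visible in the Preview tab, and the response object is a simplified mock.

```javascript
// Sketch: extracting ART predictions from a query response's executionReport
// (executionReport > children > PreprocessQuery > children >
// CallingRevealTopClicks > response > predictions).
function findArtPredictions(queryResponse) {
  const steps = (queryResponse.executionReport || {}).children || [];
  const preprocess = steps.find((step) => step.name === 'PreprocessQuery');
  if (!preprocess) return [];
  const topClicks = (preprocess.children || []).find((step) => step.name === 'CallingRevealTopClicks');
  return (topClicks && topClicks.response && topClicks.response.predictions) || [];
}

// Simplified mock of a query response.
const mockResponse = {
  executionReport: {
    children: [{
      name: 'PreprocessQuery',
      children: [{
        name: 'CallingRevealTopClicks',
        response: { predictions: [{ contentIdValue: 'doc-1', score: 0.9 }] }
      }]
    }]
  }
};
```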

9.5: Testing Query Suggestions (QS)

You can use the Model Testing page of the Administration Console to compare the suggestions and the workability of two Coveo ML QS models (see Testing Coveo Machine Learning Models).

Once you have tested a model on the Model Testing page, you may want to test it in your actual implementation:

  1. Ensure that Coveo ML query suggestions are requested (see Step 2).

  2. In the search box, start typing queries for which your model can provide suggestions, and then ensure that the expected suggestions are displayed (see Reviewing Coveo Machine Learning Model Information).


See Providing Coveo Machine Learning Query Suggestions and Create a QS Model.

9.6: Testing Event Recommendations (ER)

Coveo ML ER uses view and search actions from all your users to predict and suggest the most relevant content for each user in their current session.

  1. Access a recommendation interface that leverages the ER model you want to test.

  2. Access Your Browser Developer Tools.

  3. Access one of the pages that should be considered as a recommendation by your ER model, and then ensure that loading this page properly logs a view event.

  4. Click the view event request.

  5. In the Headers tab, find the Request Payload section.

  6. In the Request Payload section, ensure that you minimally find the following fields:

    • contentIdKey

    • contentIdValue

    • location

    • If you want this specific page to be also considered as a specific type of item, ensure that the contentType field is populated as expected.

    • If you leverage custom user context, you should also validate that the customData object contains the expected key-values.

  7. Validate that recommended items match the view event request.

    1. Access the Content Browser page of the Administration Console.

    2. In the search box, type the contentIdKey and the contentIdValue returned in the Request Payload of the view event request, separated by the Contains operator (=).

      For example, if the contentIdKey of the view event is @clickableuri and the contentIdValue is https://docs.coveo.com/en/2640/, type @clickableuri=https://docs.coveo.com/en/2640/, and then press Enter or click the search button.

    3. Validate that the only item returned as a search result in the Content Browser is the one you inspected.

  8. Access another page containing a Recommendation interface whose output should come from your ER model.

  9. In your browser developer tools, under the Name column, click the request to the Search API that contains /rest/search/v2?organizationId=[organizationId value].

  10. In the Headers tab, under Form Data, you should see the actionsHistory parameter, which is essential for Coveo ML ER to properly work (see actionsHistory (array of ActionHistory)). When analyzing the value of the actionsHistory parameter, ensure that:

    • The actionsHistory contains an object matching a page view for the page from which you accessed the current page.

    • The latest item in actionsHistory contains an object matching a page view for the current page.


  11. Still under Form Data, ensure that the recommendation property contains a value.

  12. In the Preview tab, ensure that the request is routed to the query pipeline that contains the ER model.


To learn how to configure Recommendations, see Coveo Machine Learning Event Recommendations Deployment Overview.
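The actionsHistory checks in steps 10 and 11 can be sketched as follows; the entry shape ({name, value}) is an assumption based on the documented ActionHistory format, and the sample history is a mock.

```javascript
// Sketch: the latest actionsHistory entry should be a page view for the
// current page.
function latestViewMatches(actionsHistory, currentPageUri) {
  if (!Array.isArray(actionsHistory) || actionsHistory.length === 0) return false;
  const latest = actionsHistory[actionsHistory.length - 1];
  return latest.name === 'PageView' && latest.value === currentPageUri;
}

// Mock history: the previous page, then the current page.
const actionsHistory = [
  { name: 'PageView', value: 'https://example.com/previous-page' },
  { name: 'PageView', value: 'https://example.com/current-page' }
];
```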

9.7: Testing Dynamic Navigation Experience (DNE)

If you leverage Coveo ML DNE, you may want to ensure that the model properly orders facets and facet values according to the current context (see Deploying Dynamic Navigation Experience).

You must first ensure that the DynamicFacet and DynamicFacetManager components are included in the search interface.

You can test if your DNE model behaves as expected using the following methods:

Using the Debug Panel

  1. In the search interface that leverages your DNE model, send a query for which the model can recommend items (see Reviewing Coveo Machine Learning Model Information).

  2. Hold the Alt key (Option key for Mac) and double-click a facet name that should be ordered by DNE.

  3. Ensure that the following information is displayed:


Using Your Browser Developer Tools

  1. In the search interface where your DNE model is implemented, send a query for which the model can recommend items (see Reviewing Coveo Machine Learning Model Information).

  2. Enable the debug mode by either:

    • Adding &debug=true at the end of the current URL address.


    OR

    • Holding the Alt key (Option key for Mac) and double-clicking any search result box. At the top right of the Debug panel, make sure the Enable query debug option is selected. You can then close this panel by pressing the Escape key.


  3. Access Your Browser Developer Tools.

  4. In the Network tab, under the Name column, select the last HTTP request to /rest/search/v2.


  5. Select the Preview tab. You should now see the query response body.

  6. Expand the executionReport property.

  7. Expand the children property. You should now be able to visualize the whole path your query took before the results were returned to the search interface.


  8. Expand the PreprocessQuery property, and then expand the children property.

  9. Expand the CallingRevealFacetSense property, and then expand the response property.

  10. Under the response property, expand the facetSenseResults property.

You should now see the following four properties, which reveal the actions performed by DNE:

  • facetOrdering: the order in which the facetable fields are presented to the user according to their context.

  • rankingBoost: the boost granted to the facetable fields according to the user’s context.

  • valuesOrdering: the order in which the values for each facet are displayed.

  • userContext: the context on which DNE bases its suggestions (apart from the query).

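As a final check, you can verify that the facetSenseResults object exposes all four of these properties; the objects below are mocks standing in for what you would copy from the Preview tab.

```javascript
// Sketch: report which of the four DNE properties are missing from a
// facetSenseResults object.
function missingDneProperties(facetSenseResults) {
  const expected = ['facetOrdering', 'rankingBoost', 'valuesOrdering', 'userContext'];
  return expected.filter((key) => !(key in facetSenseResults));
}

// Mock objects for illustration.
const complete = { facetOrdering: [], rankingBoost: {}, valuesOrdering: {}, userContext: {} };
const incomplete = { facetOrdering: [], rankingBoost: {}, valuesOrdering: {} };
```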
