Evaluate Your Implementation Before Going Live
Once your implementation is finished and ready to go live, we recommend that you validate that the main features of your search solution work as expected. This article provides a step-by-step guide to help ensure that your solution is properly implemented with regard to Coveo Usage Analytics (Coveo UA) and Coveo Machine Learning (Coveo ML).
The default Coveo-powered search pages are already properly configured to leverage Coveo UA and Coveo ML. However, during development, you may have changed the way the page behaves, and want to confirm that Coveo UA and Coveo ML are properly configured for your search interface.
You can use the implementation checklist below to help you track your progress throughout this article. Once you have verified the workability of Coveo features for a given step, check the box next to it, and then proceed to the next step.
- Ensure that Coveo usage analytics events of all types are being recorded.
- Validate that usage analytics events are being recorded using the Administration Console.
- Inspect Coveo ML models.
- Test Coveo ML models.
Step 1: Access Your Browser Developer Tools
To properly follow the guidelines below, access the page where your search component is integrated, and then open the Network tab of your browser developer tools.
This article uses the Google Chrome developer tools for its examples.
In the Network tab, on the Filter bar, select XHR. Selecting the XHR data type filters out irrelevant requests.
Step 2: Ensure That Coveo Machine Learning Query Suggestions Are Requested
When users access your search interface, they likely type a query in the search box. To provide the best possible search experience, ensure that your implementation leverages Coveo ML Query Suggestions (QS) (see Providing Coveo Machine Learning Query Suggestions).
- Access a page in which you integrated a Coveo search box.
- Access the Network tab of your browser developer tools.
- Start typing a query in the search box.
  In the Network tab, under the Name column, you should see a coveo/rest/v2/querySuggest? request for every keystroke.
- Click the latest coveo/rest/v2/querySuggest? request.
  In the Response tab, you should see the currently displayed query completions.

Coveo ML QS must be leveraged in your search interface for coveo/rest/v2/querySuggest? requests to appear under the Name column of your developer tools.

If your implementation doesn’t rely on the Coveo JavaScript Search Framework, see Providing Coveo Machine Learning Query Suggestions.
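If you're building the suggestion request yourself rather than using the framework, the call can be sketched as follows. The platform URL, endpoint path, and parameter names here are assumptions based on the requests this article tells you to look for in the Network tab (your own requests may show a variant path such as coveo/rest/v2/querySuggest?); validate them against the Search API reference.

```javascript
// Sketch: requesting query suggestions outside the JS Search Framework.
// ASSUMPTION: endpoint path and parameter names inferred from the Network
// tab requests described in this article.
const PLATFORM = "https://platform.cloud.coveo.com"; // hypothetical endpoint

function buildQuerySuggestUrl(partialQuery, searchHub) {
  const params = new URLSearchParams({ q: partialQuery, searchHub });
  return `${PLATFORM}/rest/search/v2/querySuggest?${params.toString()}`;
}

async function fetchQuerySuggestions(partialQuery, searchHub, accessToken) {
  const response = await fetch(buildQuerySuggestUrl(partialQuery, searchHub), {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  const body = await response.json();
  // Each completion carries a suggested expression to display in the search box.
  return body.completions.map((completion) => completion.expression);
}
```

Passing the searchHub along with the partial query matters because, as Step 6 explains, Coveo ML serves suggestions per search hub.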
Step 3: Ensure That Queries Are Sent and Search Events Are Recorded
When a query is triggered, you must ensure that the query is sent to the Search API, and that an appropriate search event is logged.
3.1: Ensure That Queries Are Sent
- In the page where your search component is integrated, type a query in the search box, and then press Enter or click the search button.
- In your browser developer tools, in the Network tab, under the Name column, you should see a request to the Search API containing /rest/search/v2.

If your implementation doesn’t rely on the Coveo JavaScript Search Framework, see Submitting a Query.
3.2: Ensure That Coveo UA Search Events Are Recorded
In your browser developer tools, in the Network tab, under the Name column, you should see a request containing /rest/v15/analytics/searches?.
If your search events aren’t sent, you need to add the CoveoAnalytics component to your page (see Coveo Analytics Component).
Step 4: Ensure That Coveo UA Click Events Are Recorded
When a click action is performed (e.g., clicking a ResultLink or a Quickview in the result list), you must ensure that an appropriate click event is logged.
In your browser developer tools, in the Network tab, under the Name column, you should see a request containing /rest/v15/analytics/click?.
If your implementation relies on JavaScript Search Framework version v2.6063 or later, you must deselect the XHR data type to see the click request in the developer tools (see JSUI-2460 Side Effect: Click Events Are No Longer Flagged as XHR Requests).
If you don’t see click events, add the CoveoResultLink component back in all of your result templates (see Coveo ResultLink Component). This component handles click events for you, and shouldn’t be removed from result templates.
Since the September 2019 Release (v2.7023) of the JavaScript Search Framework, omitting the CoveoResultLink component from your templates triggers an error in your browser console.
If your implementation doesn’t rely on the Coveo JavaScript Search Framework, see Opening Query Results and Log Click Events.
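For implementations outside the framework, a click event body can be sketched as follows. The field names are assumptions based on the Coveo UA click event schema, and the endpoint is inferred from the request name shown above; verify both against the Log Click Events reference before relying on them.

```javascript
// Sketch: building and sending a click event to the Coveo UA Write API.
// ASSUMPTION: field names and endpoint inferred from this article and the
// UA click event schema; validate against the Log Click Events reference.
function buildClickEventBody(result, searchQueryUid) {
  return {
    actionCause: "documentOpen",
    searchQueryUid, // must match the searchUid of the query that returned the result
    documentUri: result.uri,
    documentUriHash: result.uriHash,
    documentPosition: result.position, // 1-based rank in the result list
    sourceName: result.source,
  };
}

async function sendClickEvent(body, accessToken) {
  return fetch("https://platform.cloud.coveo.com/rest/v15/analytics/click", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${accessToken}`,
    },
    body: JSON.stringify(body),
  });
}
```

Tying searchQueryUid back to the originating query is what lets Coveo UA attribute the click to the right search, which Step 7 asks you to validate.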
Step 5: Ensure That Coveo UA View Events Are Recorded
If you want to leverage advanced Coveo ML algorithms and personalization such as Coveo ML Event Recommendations (ER), you must ensure that your search interface logs page view events every time a tracked page is opened.
In your browser developer tools, in the Network tab, under the Name column, you should see a request containing /rest/v15/analytics/view.
When creating your Coveo ML model, you may want to select specific contentType values to recommend to your users (see Coveo Machine Learning Recommendation Content Types).
See Sending Usage Analytics View Events.
If your implementation doesn’t rely on the Coveo JavaScript Search Framework, see Log View Events.
Step 6: Ensure That Search Hubs Are Implemented
In your solution, each search interface intended for a specific audience should have its own unique identifier, or search hub. You can set the search hub directly in search tokens. Alternatively, in JavaScript Search Framework interfaces, you can do so by setting the searchHub option of the Analytics component.
For example, suppose you have two search interfaces that target two distinct audiences (customers and service agents). You would therefore set the following search hubs to personalize the search experience for each audience:
CommunitySearch
<div id="search" class="CoveoSearchInterface">
<!-- ... -->
<div class="CoveoAnalytics" data-search-hub="CommunitySearch"></div>
<!-- ... -->
</div>
AgentPanel
<div id="search" class="CoveoSearchInterface">
<!-- ... -->
<div class="CoveoAnalytics" data-search-hub="AgentPanel"></div>
<!-- ... -->
</div>
Furthermore, setting search hubs according to your audiences allows Coveo ML to learn from the behavior of a particular segment of users. You can also obtain insightful information about the audience targeted by a certain search hub in Coveo UA.
In the Coveo Cloud Administration Console, the Consumption Dashboard allows you to analyze logged queries per search hub (see Using the Search Consumption Dashboard).
Make sure to validate the searchHub/originLevel1 correspondence.
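When the search hub is set through search tokens rather than the Analytics component, the token request can be sketched as below. The payload shape is an assumption based on the Coveo search token API; validate it against the official reference before relying on it.

```javascript
// Sketch: setting the search hub when requesting a search token, so that
// every query authenticated with that token is tied to the right audience.
// ASSUMPTION: payload shape based on the Coveo search token API.
function buildTokenRequestBody(userEmail, searchHub) {
  return JSON.stringify({
    userIds: [{ name: userEmail, provider: "Email Security Provider" }],
    searchHub, // e.g., "CommunitySearch" or "AgentPanel"
  });
}
```

Setting the hub server-side in the token keeps users from tampering with it, which matters since Coveo ML segments its learning per search hub.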
Step 7: Validate Usage Analytics Metadata
Validating usage analytics events metadata is particularly useful in the following scenarios:
- Your implementation doesn’t rely on the Coveo JavaScript Search Framework.
- Your implementation relies on the Coveo JavaScript Search Framework, but includes custom code that can modify standard usage analytics events.
- Your implementation relies on the Coveo JavaScript Search Framework, and you’re pushing custom user context (typically using the PipelineContext component). See Send Custom Context Information.
When logging usage analytics events, you should ensure that the data sent to Coveo UA and Coveo ML is accurate.
The following table indicates the properties sent to Coveo UA and Coveo ML depending on the logged event type.
You can validate these values in your implementation using your browser developer tools:
- Access your Coveo-powered search interface, and then perform a query in the search box.
- Enable the debug mode by either:
  - Adding &debug=true at the end of the current URL address.
  OR
  - Pressing and holding the Alt key (Option key for Mac) and double-clicking any search result box. At the top right of the Debug panel, make sure the Enable query debug option is selected. You can then close this panel by pressing the Escape key.
- Access the Network tab of your browser developer tools.
- Depending on the event category you want to inspect, click the corresponding request:
  - For a search event, the request contains /rest/v15/analytics/searches?.
  - For a click event, the request contains /rest/v15/analytics/click?.
  - For a view event, the request contains /rest/v15/analytics/view.
  - For a custom event, the request contains /rest/v15/analytics/custom.
- In the Headers tab, find the Request Payload section.
  If you’re inspecting search events, in the Request Payload section, expand the items in the payload array.
  You can now see the values for each logged property.
- Some of the property values found in the usage analytics event request must match those passed in the corresponding request to the Search API. You can review these values in the Preview tab of your browser developer tools:
  - Click the corresponding request to the Search API.
  - Expand the executionReport property, and then expand the children property.
  - Expand the last ResolveContext property, and then expand the result property.
  You can now compare the values of the properties displayed in the Request Payload of the usage analytics event request with those found in the request to the Search API.
  Notably, you should validate the searchHub/originLevel1 and tab/originLevel2 values. Coveo ML separates its models according to search hubs and tabs. This means that if your search hub or tab values don’t correspond between your queries and your usage analytics events, Coveo ML won’t be able to suggest relevant content.
The following table shows the equivalence of each relevant value between the Search API and the Analytics API.

Query Value   Usage Analytics Event Value
searchHub     originLevel1
tab           originLevel2
If the searchHub/originLevel1 or tab/originLevel2 values differ, Coveo ML will consider the query and usage analytics events as unrelated, usually preventing Coveo ML rules from applying.
The searchUid value is found at the bottom of the page generated by the Preview tab, under the results property.
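The endpoint fragments and the searchHub/originLevel1 and tab/originLevel2 correspondence described above lend themselves to quick console checks. The helpers below are a sketch you can paste into the browser console; the payload shapes are assumed to match what you copy from the Request Payload sections of your developer tools.

```javascript
// Sketch: quick console helpers for validating usage analytics metadata.

// Classify a captured request URL by the endpoint fragments listed above.
function classifyAnalyticsRequest(url) {
  if (url.includes("/rest/v15/analytics/searches")) return "search";
  if (url.includes("/rest/v15/analytics/click")) return "click";
  if (url.includes("/rest/v15/analytics/view")) return "view";
  if (url.includes("/rest/v15/analytics/custom")) return "custom";
  return "other";
}

// Compare a query payload (Search API request) with a usage analytics event
// payload (UA request). An empty array means the values correspond.
function validateCorrespondence(queryPayload, analyticsPayload) {
  const mismatches = [];
  if (queryPayload.searchHub !== analyticsPayload.originLevel1) {
    mismatches.push("searchHub does not match originLevel1");
  }
  if (queryPayload.tab !== analyticsPayload.originLevel2) {
    mismatches.push("tab does not match originLevel2");
  }
  return mismatches;
}
```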
Step 8: Use the Administration Console to Validate That Usage Analytics Events Are Being Recorded
The Visit Browser and the Reports pages of the Administration Console offer various tools to help you monitor activities within your search interfaces.
Therefore, you should ensure that the information displayed in usage analytics is accurate.
You must first make sure that the Analytics component is integrated into your search interfaces. This component is required to log search and click events leveraged by Coveo ML and usage analytics reports.
You can test your implementation to confirm that all the desired events are recorded:
- Access the search interface you want to test.
- Type a query in the search box, and then press Enter or click the search button.
- Perform search actions in the interface, such as clicking results or selecting facet values.
- Access the Visit Browser page, and then select the date on which you performed the search actions (see Review User Visits With the Visit Browser).
- Find your visit, and then expand it.
  You can now ensure that every single action that triggered a search, click, or custom event is listed under your visit.
  Events performed in the search interface don’t appear instantaneously in the Visit Browser; it may take a few minutes.
- Access the Reports page, and then access one of the reports.
  The default Summary dashboard provides good insights on your implementation (see Review the Search Usage Trends From the Summary Dashboard).
- In the upper-right corner of the page, select the date on which you performed the search actions (as you did in step 4).
- In the upper-left corner of the page, add a User Id dimension filter to only show information related to your visit (see Add Global Dimension Filters).
- Ensure that the values displayed in the Summary dashboard match those displayed in the Visit Browser.
- Review the information displayed on the Incoherent Events page to see whether any usage analytics events were discarded by Coveo UA (see Review Incoherent Usage Analytics Events).
See Also
- Frequently asked questions about Coveo UA: Coveo Usage Analytics FAQ.
Step 9: Inspect Coveo Machine Learning Models
Coveo ML features leverage AI-powered recommendations to provide the best possible search experience to your users according to their context. Therefore, you must ensure that your Coveo ML models are properly implemented and well-trained.
9.1: Leveraging Coveo Machine Learning
If your implementation doesn’t leverage Coveo ML, the following article provides the guidelines to create and configure your models: Leveraging Coveo Machine Learning.
9.2: Configuring and Training Coveo Machine Learning Models
Depending on your context, your needs, and your available data, you may need to change the default Coveo ML models configuration and training settings (see Training and Retraining).
9.3: Reviewing Coveo Machine Learning Candidates Using the “Models” Page
You can use the Models page of the Administration Console to ensure that your Coveo ML models have relevant candidates to suggest to your end-users (see Reviewing Coveo Machine Learning Model Information).
- Coveo ML models leverage Coveo UA data to provide suggestions. You must therefore ensure that all the data that needs to be recorded within your search interfaces is properly tracked.
- Before reviewing the candidates suggested by Coveo ML, you should ensure that you have met the prerequisites to create your desired model.
Once you have met the prerequisites to create your model, you can review the candidates (if any) suggested by Coveo ML.
To review these candidates and other useful information about your model, access the Models page, and then select the model you want to inspect. The options are:
- Query Suggestions (QS)
- Event Recommendations (ER)
The information provided on this page allows you to ensure that all the prerequisites for the creation of a model are met. You can also review Coveo ML candidates that are automatically shown to your end-users (see Coveo Machine Learning Model Information Reference).
9.4: Testing ART Models
You can use the Model Testing page of the Administration Console to compare the ranking of two Coveo ML ART models. You can also use this tool to evaluate the efficiency of a given ART model compared to the default ranking behavior provided without ART (see Testing Coveo Machine Learning Models).
Once you have analyzed the Models page for your Coveo ML ART model candidates and tested its ranking behavior on the Model Testing page, you may want to test its workability in your actual implementation.
Usually, ART boosts the ranking score of the five best search results for a given query (see Automatic Relevance Tuning).
- Access the search interface you want to test.
- Send a query for which your model is able to recommend items (see Reviewing Coveo Machine Learning Model Information).
You can test whether these search result recommendations behave as expected using the following methods.
Using the Debug Panel
You can ensure that an ART model recommends search results to your end-users by using the JavaScript Search Debug Panel.
- Open the debug panel by holding the Alt key (Option key for Mac) and double-clicking any JavaScript Search Framework component in the interface (see Access the JavaScript Search Debug Panel).
- In the debug panel header, select the Highlight recommendation checkbox (see Debug Panel Header).
Assuming search results are being sorted by relevance, the results whose ranking scores were modified through ART will now be highlighted in the result list.
Using Your Browser Developer Tools
- Enable the debug mode by either:
  - Adding &debug=true at the end of the current URL address.
  OR
  - Pressing and holding the Alt key (Option key for Mac) and double-clicking any search result box. At the top right of the panel you just opened, make sure the Enable query debug option is selected. You can then close this panel by pressing the Escape key.
- In the Network tab, under the Name column, select the last HTTP request to /rest/search/v2.
- Select the Preview tab. You should now see the query response body.
- Expand the executionReport property.
- Expand the children property. You should now be able to visualize the whole path your query took before the results were returned to the search interface.
- Expand the PreprocessQuery property, and then expand the children property.
- Expand the CallingRevealTopClicks property, and then expand the response property.
- Expand the predictions property.
Under predictions, you should see the items suggested by ART and their ranking score (see Understanding Search Result Ranking).
Suggested items shown under predictions are identified with their contentIdValue field value.
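The manual expansion steps above can be sketched as a small helper you run against the query response body copied from the Preview tab. The report structure (children objects keyed by section name) is an assumption based on the path this article describes; adapt the section names if your report differs.

```javascript
// Sketch: pulling ART predictions out of a query response body.
// ASSUMPTION: executionReport structure inferred from the devtools path
// described in this article:
// executionReport > children > PreprocessQuery > children >
// CallingRevealTopClicks > response > predictions
function getArtPredictions(responseBody) {
  const { children } = responseBody.executionReport;
  const topClicks = children.PreprocessQuery.children.CallingRevealTopClicks;
  return topClicks.response.predictions; // items suggested by ART with their scores
}
```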
9.5: Testing Query Suggestions (QS)
You can use the Model Testing page of the Administration Console to compare the suggestions and the workability of two Coveo ML QS models (see Testing Coveo Machine Learning Models).
Once you have tested a model on the Model Testing page, you may want to test its workability in your actual implementation:
- In the search box, start typing queries for which your model can provide suggestions, and then ensure that the expected suggestions are displayed (see Reviewing Coveo Machine Learning Model Information).
See Providing Coveo Machine Learning Query Suggestions and Create a Query Suggestion Model.
9.6: Testing Event Recommendations (ER)
Coveo ML ER uses views and search actions from all your users to predict and suggest the most relevant content for the user in their current session.
- Access a recommendation interface that leverages the ER model you want to test.
- Access one of the pages that should be considered as a recommendation by your ER model, and then ensure that loading this page properly logs a view event.
  - Click the view event request.
  - In the Headers tab, find the Request Payload section.
  - In the Request Payload section, ensure that you minimally find the following fields:
    - contentIdKey
    - contentIdValue
    - location
  - If you want this specific page to also be considered as a specific type of item, ensure that the contentType field is populated as expected.
  - If you leverage custom user context, you should also validate that the customData object contains the expected key-value pairs.
- Validate that recommended items match the view event request.
  - Access the Content Browser page of the Administration Console.
  - In the search box, type the contentIdKey and the contentIdValue returned in the Request Payload of the view event request, separated by the Contains operator (=).
    If the contentIdKey of the view event is @clickableuri and the contentIdValue is https://docs.coveo.com/en/2640/, type @clickableuri=https://docs.coveo.com/en/2640/, and then press Enter or click the search button.
  - Validate that the only item returned as a search result in the Content Browser is the one you inspected.
- Access another page containing a Recommendation interface whose output should come from your ER model.
- In the Name section of your browser developer tools, click the request to the Search API that contains /rest/search/v2?organizationId=[organizationId value].
- In the Headers tab, under Form Data, you should see the actionsHistory parameter, which is essential for Coveo ML ER to work properly (see actionsHistory (array of ActionHistory)). When analyzing the value of the actionsHistory parameter, ensure that:
  - The actionsHistory contains an object matching a page view for the page from which you accessed the current page.
  - The latest item in actionsHistory is an object matching a page view for the current page.
- Still under Form Data, ensure that the recommendation property contains a value.
- In the Preview tab, ensure that the request is routed to the query pipeline that contains the ER model.
To learn how to configure Recommendations, see Coveo Machine Learning Event Recommendations Deployment Overview.
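The payload checks in this section can be sketched as small helpers you run against objects copied from your developer tools. The required view event fields come from this article; the actionsHistory entry shape ({ name, time, value }) is an assumption you should verify against your own requests.

```javascript
// Sketch: validating objects copied from the developer tools during ER testing.
// ASSUMPTION: actionsHistory entry shape ({ name, time, value }) inferred
// from the Form Data described in this article.

// A view event payload must minimally carry these fields.
function findMissingViewEventFields(payload) {
  const required = ["contentIdKey", "contentIdValue", "location"];
  return required.filter((field) => !(field in payload));
}

// actionsHistory should contain a page view for the page you came from,
// and its latest entry should be a page view for the current page.
function checkActionsHistory(actionsHistory, previousPage, currentPage) {
  const pageViews = actionsHistory.filter((entry) => entry.name === "PageView");
  const latest = actionsHistory[actionsHistory.length - 1];
  return {
    hasPreviousPageView: pageViews.some((entry) => entry.value === previousPage),
    latestIsCurrentPage:
      latest !== undefined &&
      latest.name === "PageView" &&
      latest.value === currentPage,
  };
}
```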
9.7: Testing Dynamic Navigation Experience (DNE)
If you leverage Coveo ML DNE, you may want to ensure that the model properly orders facets and facet values according to the current context (see Deploy Dynamic Navigation Experience).
You must first ensure that the DynamicFacet and DynamicFacetManager components are included in the search interface.
You can test whether your DNE model behaves as expected using the following methods:
Using the Debug Panel
- In the search interface that leverages your DNE model, send a query for which the model can recommend items (see Reviewing Coveo Machine Learning Model Information).
- Press and hold the Alt key (Option key for Mac) and double-click a facet name that should be ordered by DNE.
- Ensure that the expected DNE information is displayed.
Using Your Browser Developer Tools
- In the search interface where your DNE model is implemented, send a query for which the model can recommend items (see Reviewing Coveo Machine Learning Model Information).
- Enable the debug mode by either:
  - Adding &debug=true at the end of the current URL address.
  OR
  - Pressing and holding the Alt key (Option key for Mac) and double-clicking any search result box. At the top right of the Debug panel, make sure the Enable query debug option is selected. You can then close this panel by pressing the Escape key.
- In the Network tab, under the Name column, select the last HTTP request to /rest/search/v2.
- Select the Preview tab. You should now see the query response body.
- Expand the executionReport property.
- Expand the children property. You should now be able to visualize the whole path your query took before the results were returned to the search interface.
- Expand the PreprocessQuery property, and then expand the children property.
- Expand the CallingRevealFacetSense property, and then expand the response property.
- Under the response property, expand the facetSenseResults property.
You should now see the following four properties, which reveal actions from DNE:
- facetOrdering: the order in which the facetable fields are presented to the user according to their context.
- rankingBoost: the boost granted to the facetable fields according to the user’s context.
- valuesOrdering: the order in which the values of each facet are displayed.
- userContext: the context on which DNE bases its suggestions (apart from the query).
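As with the ART inspection earlier, the expansion path above can be sketched as a helper run against the query response body. The report structure is assumed from the devtools path this article describes.

```javascript
// Sketch: reading the DNE output from a query response body.
// ASSUMPTION: executionReport structure inferred from the devtools path
// described in this article:
// executionReport > children > PreprocessQuery > children >
// CallingRevealFacetSense > response > facetSenseResults
function getFacetSenseResults(responseBody) {
  const { children } = responseBody.executionReport;
  const facetSense = children.PreprocessQuery.children.CallingRevealFacetSense;
  const { facetOrdering, rankingBoost, valuesOrdering, userContext } =
    facetSense.response.facetSenseResults;
  return { facetOrdering, rankingBoost, valuesOrdering, userContext };
}
```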
What’s Next
- You will likely want to leverage the query correction feature to prevent potential content gaps.
- To ensure a smooth deployment, you may want to notify your Coveo point of contact (e.g., your customer success manager), if applicable.