Troubleshoot ART Models

Coveo Machine Learning (Coveo ML) Automatic Relevance Tuning (ART) models learn from click, search, and custom usage analytics events that occurred sequentially during the same visit. Based on those events, a given ART model extracts candidates, which are queries for which the model can recommend at least one item, and provides the most relevant items as top search results.

An ART model learns from the way users interact with your website. The more traffic you have, the better the model will get. However, an ART model needs a minimum of 100 click and search events to start boosting items among the top search results (see About Search Result Ranking).

With fewer than 100 events, an ART model doesn’t boost the ranking weight of any item. You can, however, change this threshold if needed (see Edit the Number of Events Required to Build a Model).

To help a new model learn faster, you can train it by linking queries to results.

ART requires real data from a specific community to ensure that its results respond to real user intentions. Therefore, providing artificially generated data is strongly discouraged.

Sending the Required Usage Analytics Event Data

Items can be used in ART model building only if they’re identifiable from the usage analytics events that pertain to them. To that end, the contentIdKey and contentIdValue parameters must be present in the customData of the click events on that item.

For technical details on those events, see Log Click Events.
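As an illustration of this requirement, the sketch below builds a minimal click event payload and checks that it identifies the clicked item. Only the contentIdKey and contentIdValue fields come from this article; every other field name and value is a hypothetical placeholder, not the actual Usage Analytics schema.

```python
# Illustrative click event payload (all values are hypothetical placeholders).
# The part that matters for ART is that customData identifies the clicked
# item via contentIdKey and contentIdValue.
click_event = {
    "actionCause": "documentOpen",  # placeholder action name
    "searchQueryUid": "00000000-0000-0000-0000-000000000000",  # placeholder
    "customData": {
        "contentIdKey": "permanentid",  # field used to identify the item
        "contentIdValue": "item-123",   # that field's value for this item
    },
}

def is_usable_for_art(event: dict) -> bool:
    """An item can be used in ART model building only if the click event
    identifies it through customData (illustrative check only)."""
    custom = event.get("customData", {})
    return bool(custom.get("contentIdKey")) and bool(custom.get("contentIdValue"))
```

A click event missing either field would fail this check, and the corresponding item could not be used when building the model.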

Reviewing Your Coveo ML ART Model Learning Dataset

When you have the required privileges, you can use the Analytics section of the Coveo Administration Console to browse user visits and evaluate whether your search interface produces enough data.

The following procedure assumes that you’re familiar with global dimension filters and Visit Browser features (see Review User Visits With the Visit Browser).

  1. On the Visit Browser page:

    Admin-VisitBrowserART

    1. Select a date interval of three months (see Review Search Usage Data by Date Interval).

    2. In the Show visits containing section, add the following filters (see Add Visit Filters):

      • a search event WHERE:

        • Query is not blank or n/a

        • Origin 1 (page/hub) is [Search page or search hub name]

        • Origin 2 (tab/interface) is [Tab or search interface name]

        • Language is [Language]

• and at least one click event

  2. At the bottom right of the screen, you can see the number of visits in the selected period from which an ART model could learn.

    Admin-VisitBrowserART2
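The filtering logic of the procedure above can be sketched locally as follows. This is an illustrative model only, assuming each visit is a list of event dictionaries; the field names (`type`, `query`, `originLevel1`, and so on) are assumptions for the sketch, not the Visit Browser or Usage Analytics API schema.

```python
def visit_qualifies(visit, search_hub, interface, language):
    """Return True if the visit contains a search event matching the
    filters above and at least one click event (illustrative logic)."""
    has_search = any(
        e.get("type") == "search"
        and e.get("query") not in (None, "", "n/a")  # Query is not blank or n/a
        and e.get("originLevel1") == search_hub      # Origin 1 (page/hub)
        and e.get("originLevel2") == interface       # Origin 2 (tab/interface)
        and e.get("language") == language
        for e in visit
    )
    has_click = any(e.get("type") == "click" for e in visit)
    return has_search and has_click

visits = [
    [  # qualifies: search with a non-blank query, plus a click
        {"type": "search", "query": "battery", "originLevel1": "CommunityHub",
         "originLevel2": "All", "language": "en"},
        {"type": "click"},
    ],
    [  # doesn't qualify: blank query and no click event
        {"type": "search", "query": "", "originLevel1": "CommunityHub",
         "originLevel2": "All", "language": "en"},
    ],
]

# Number of visits in the period from which an ART model could learn.
learnable = sum(visit_qualifies(v, "CommunityHub", "All", "en") for v in visits)
```

Here only the first visit counts toward the total shown at the bottom right of the Visit Browser page.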

Training ART Models by Linking Queries to Results

When you leverage Coveo ML, manually adjusting the relevance of search results (e.g., creating query pipeline rules) becomes less necessary. For example, you no longer have to create thesaurus rules: by analyzing the usage analytics data of a specific search interface, ART models learn to recommend the same results for different queries that contain synonyms. You can accelerate the learning process of an ART model by linking queries to results, such as by associating synonyms or expanding acronyms.

  • In the worst-case scenario, the help you provide lasts only as long as the data is used to train the model (see Data Period). In the best-case scenario, your search interface users repeat the actions you took to train the model.

    For example, suppose you teach the ART model that BO means business optimization, and your model is based on three months of data. After 90 days, you may have to teach the model the same thing again if no users searching for BO and business optimization clicked the same results.

  • ART models don’t know synonyms; they learn links from queries to clicks. In the example above, an ART model would have learned the link between BO, business optimization, and the clicked results, removing the need for a thesaurus rule.
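The idea that ART learns query-to-click links rather than synonyms can be illustrated with a toy sketch. This is not the actual ART algorithm, only a simple stand-in that records which items were clicked after each query.

```python
from collections import defaultdict

# Toy model: record which items were clicked after each query.
clicks_by_query = defaultdict(set)

def record(query, clicked_item):
    """Simulate a search event followed by a click on a result."""
    clicks_by_query[query.lower()].add(clicked_item)

# Users searching "BO" and "business optimization" click the same result.
record("BO", "doc-42")
record("business optimization", "doc-42")
record("business optimization", "doc-7")

def recommended(query):
    """Items such a model could boost for this query (toy logic only)."""
    return clicks_by_query.get(query.lower(), set())

# Both queries now lead to doc-42, with no thesaurus rule involved:
# the link comes entirely from the shared click.
shared = recommended("BO") & recommended("business optimization")
```

The model never learns that the two queries "mean the same thing"; it only learns that they lead to the same clicked item, which is enough to recommend that item for both.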

To help train Coveo ML ART models by linking queries to results

  1. Ensure that the Coveo ML ART feature is configured and enabled on your search interface (see Create an Automatic Relevance Tuning Model).

  2. On your search interface, perform a query, and then click the result you want to be recommended.

  3. Repeat the procedure for each desired query.

  4. Once the model is trained, on the Content Browser page, ensure that the model returns the expected result (see Inspect Items With the Content Browser). Repeat the procedure if needed.

    After the next (scheduled) model update, your model will recommend results based on your training.
