---
title: Query pipeline performance
slug: p47e0293
canonical_url: https://docs.coveo.com/en/p47e0293/
collection: tune-relevance
source_format: adoc
---

# Query pipeline performance

The **Overview** tab on the [**Query Pipelines**](https://platform.cloud.coveo.com/admin/#/orgid/search/pipelines/) ([platform-ca](https://platform-ca.cloud.coveo.com/admin/#/orgid/search/pipelines/) | [platform-eu](https://platform-eu.cloud.coveo.com/admin/#/orgid/search/pipelines/) | [platform-au](https://platform-au.cloud.coveo.com/admin/#/orgid/search/pipelines/)) page in the [Coveo Administration Console](https://docs.coveo.com/en/183/) provides a seven-day snapshot of your [query pipeline](https://docs.coveo.com/en/180/)’s performance. This snapshot is created using key [metrics](https://docs.coveo.com/en/263/) gathered from [Coveo Analytics](https://docs.coveo.com/en/182/).

The goal of a well-tuned query pipeline is to return relevant results for users' [queries](https://docs.coveo.com/en/231/). Analyzing these metrics on a regular basis helps you assess whether the pipeline is meeting its relevance and performance goals.

This tab includes the [**Relevance**](#relevance-subtab) and [**Performance**](#performance-subtab) subtabs. Each provides different metric charts that help you assess the pipeline's current effectiveness in terms of user engagement and efficiency. This is especially useful when you want to evaluate the impact of recent changes to the pipeline, such as adding or editing [rules](https://docs.coveo.com/en/236/) or [Coveo Machine Learning (Coveo ML)](https://docs.coveo.com/en/188/) [model](https://docs.coveo.com/en/1012/) associations.

**Example**

While reviewing the **Overview** tab of your query pipeline, you notice that the clickthrough rate is lower than expected. Given a recent influx of newly indexed [items](https://docs.coveo.com/en/210/), you suspect the pipeline isn't returning the most relevant results for certain queries.
To address this, you create a [result ranking rule](https://docs.coveo.com/en/3234/) that boosts items added after a specific date. Within a few days of adding this new rule, you notice that the clickthrough rate begins to rise, suggesting that your changes helped improve user engagement.

## "Relevance" subtab

The **Relevance** subtab contains the **Average click rank**, **Clickthrough rate**, **Total searches**, and **Searches with clicks** metric charts. It also contains the **How to optimize relevance?** section, which provides suggestions for improving the relevance of your results.

![Relevance subtab in Query pipeline overview | Coveo](https://docs.coveo.com/en/assets/images/tune-relevance/query-pipeline-relevance-metrics.png)

1. **Average click rank**: This chart displays the [Average Click Rank (ACR)](https://docs.coveo.com/en/2836/), which measures the average position of the clicked items in the search results. The lower the average click rank, the more relevant the results are, since it indicates that users click results that appear higher in the list.

2. **Clickthrough rate**: This chart displays the clickthrough rate, which measures the percentage of searches in which users clicked one or more results. This metric is calculated by dividing the total number of search events that were followed by one or more click events by the total number of search events. It's a good indicator of how engaging or relevant your results are.

3. **Total searches** and **Searches with clicks**: This combined chart compares the total number of searches to the number of searches that resulted in clicks. It helps you further interpret the **Clickthrough rate** metric by providing a visual breakdown of the searches that resulted in clicks versus those that didn't.

4. **How to optimize relevance?**: This section provides suggestions on how you can improve the relevance of your results with ML models.
It only appears if your query pipeline isn't associated with any [Automatic Relevance Tuning (ART)](https://docs.coveo.com/en/1013/) or [Query Suggestion (QS)](https://docs.coveo.com/en/1015/) models. This section shows you which existing models can be associated with the query pipeline. You can either [associate existing models with the query pipeline](https://docs.coveo.com/en/2816#associate-a-model-with-a-query-pipeline) or [create a new model](https://docs.coveo.com/en/1832/).

> **Tip**
>
> You can dismiss the model association alert for a given model type by clicking icon:minus-16px[alt=minus,width=16,role=red] next to the suggested model type.

## "Performance" subtab

The **Performance** subtab contains the **Total queries** and **Average response time** metric charts.

![Query pipeline overview tab - Performance subtab | Coveo](https://docs.coveo.com/en/assets/images/tune-relevance/query-pipeline-performance-metrics.png)

1. **Total queries**: The total number of queries received from both user searches and automated requests that trigger a query, regardless of whether they returned results. This metric is useful for understanding the volume of queries your query pipeline handles.

2. **Average response time**: The average time your search interface takes to return results. This metric is useful for understanding the speed and efficiency of your query pipeline. A lower average response time indicates a more efficient pipeline, while a higher response time could affect other metrics, such as total queries.

## What's next?

To evaluate the performance of your query pipeline following rule or model association changes over a longer period of time, you can use [the **A/B Test** feature to test different configurations](https://docs.coveo.com/en/3255/).
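To make the **Relevance** metric definitions above concrete, here is a minimal sketch of how the clickthrough rate and average click rank could be computed from raw search and click events. The event shape used here (a list of clicked result ranks per search) is a simplifying assumption for illustration, not the Coveo Analytics data model.

```python
def clickthrough_rate(searches):
    """Percentage of searches followed by at least one click event."""
    if not searches:
        return 0.0
    with_clicks = sum(1 for s in searches if s["clicks"])
    return 100.0 * with_clicks / len(searches)

def average_click_rank(searches):
    """Mean 1-based position of clicked results across all click events."""
    ranks = [rank for s in searches for rank in s["clicks"]]
    return sum(ranks) / len(ranks) if ranks else None

# Hypothetical events: each search records the ranks of the clicked results.
events = [
    {"clicks": [1]},     # user clicked the top result
    {"clicks": [3, 5]},  # user clicked the 3rd and 5th results
    {"clicks": []},      # search with no clicks
    {"clicks": [2]},
]

print(clickthrough_rate(events))   # 3 of 4 searches had clicks -> 75.0
print(average_click_rank(events))  # (1 + 3 + 5 + 2) / 4 -> 2.75
```

A lower average click rank and a higher clickthrough rate both indicate that relevant results are surfacing near the top of the list, which is why the **Overview** tab pairs these two charts.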