This week’s episode is the second part of a two-part series discussing CTR optimization. To learn more about marketing your product, be sure to check out Part I on YouTube!
The transcript has been edited for clarity.
Hello, everyone. This is Dan Petrovic from Dejan Marketing. I’ve got an interesting case of CTR optimization today, and I’ll walk you through the process.
Many people asked what tool I used after watching part one of this series, and the answer is you.algoroo.com. We used the same free tool for the CTR analysis in this case as well. The website we’re examining today is collectiveray.com. We’ll look at its CTR, check for anomalies, and come up with CTR experiment ideas.
CTR Analysis
After analyzing the data in Algoroo, we calculated site-specific CTR averages for collectiveray.com using non-branded queries only. Branded searches had unusually high CTRs, and we didn’t want them to skew our site averages, so we used non-branded queries instead. Why do the site averages matter? Because they’re the only baseline against which you can uncover anomalies. When studying click-through rates, you need to know whether something underperforms or outperforms expectations.
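For readers who want to reproduce this step, here is a minimal sketch of how filtering out branded queries and computing a site-average CTR might look. The rows, numbers, and branded term list are hypothetical illustrations, not the actual collectiveray.com data:

```python
# Sketch: compute a site-average CTR from non-branded queries only.
# All data below is made up for illustration.
rows = [
    {"query": "collectiveray", "clicks": 900, "impressions": 1200},  # branded
    {"query": "web design blog", "clicks": 40, "impressions": 2000},
    {"query": "avada", "clicks": 25, "impressions": 1500},
]
BRANDED_TERMS = {"collectiveray"}

def is_branded(query):
    # A query counts as branded if it contains any brand term.
    return any(term in query for term in BRANDED_TERMS)

non_branded = [r for r in rows if not is_branded(r["query"])]
site_ctr = sum(r["clicks"] for r in non_branded) / sum(
    r["impressions"] for r in non_branded
)
print(f"Non-branded site CTR: {site_ctr:.2%}")
```

The same idea scales to a full Search Console export; the only judgment call is the list of brand terms.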
The first exercise is to see whether anything deviates from the norm, either negatively or positively. Here’s a sampling of our data:
We have a high degree of confidence in these numbers; we used a large set of queries to calculate them, and here’s what we discovered:
In this case, the CTR-based traffic loss is almost 5,000 clicks. We expected well over 7,000 clicks, maybe even 8,000, but received only a little more than 2,500. For whatever reason, a substantial number of clicks were missing from the SERPs.
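The loss figure is simple arithmetic: expected clicks (impressions multiplied by the site-average CTR) minus the clicks actually received. A sketch with illustrative numbers rounded to match the figures above (the impression count and average CTR are assumptions):

```python
# Sketch of the CTR-based traffic-loss estimate. Figures are illustrative,
# rounded to match the transcript: ~7,500 expected clicks vs ~2,500 received.
impressions = 250_000        # hypothetical total impressions for the query set
site_avg_ctr = 0.03          # hypothetical non-branded site-average CTR
expected_clicks = impressions * site_avg_ctr  # what the averages predict
actual_clicks = 2_500
ctr_based_loss = expected_clicks - actual_clicks
print(
    f"Expected ~{expected_clicks:.0f} clicks, got {actual_clicks}, "
    f"lost ~{ctr_based_loss:.0f}"
)
```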
We discovered that over 1,300 queries were responsible for roughly 7% of total organic non-branded traffic loss, and 774 queries accounted for almost 80% of that loss. So the plan is to go through them and see whether anything stands out as a promising candidate. Needless to say, we used a lot of queries in our analysis, close to 20,000 in all.
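The observation that 774 queries cover almost 80% of the loss is a standard cumulative cut-off. A small sketch with made-up per-query losses shows the idea: sort queries by estimated loss and count how many are needed to reach 80% of the total:

```python
# Sketch of the 80% concentration check. Per-query loss figures are
# hypothetical; on real data this dict would hold thousands of queries.
losses = {
    "font squirrel": 400,
    "psd to wordpress": 300,
    "web design blog": 200,
    "bloom email": 60,
    "avada": 30,
    "hire app developer": 10,
}

total = sum(losses.values())
running, top_queries = 0, []
for query, loss in sorted(losses.items(), key=lambda kv: kv[1], reverse=True):
    running += loss
    top_queries.append(query)
    if running >= 0.8 * total:  # stop once 80% of the loss is covered
        break
print(f"{len(top_queries)} of {len(losses)} queries cover 80% of the loss")
```

The resulting shortlist is what you review by hand, exactly as described below.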
Here are a few of the worst offenders:
These are the queries that lost traffic because they are less appealing to click on than the rest of the site. “Font squirrel” is at the top, followed by “psd to wordpress conversion,” “web design blog,” “bloom email,” “avada,” “hire app developer,” and so on.
I went through a handful of them personally and looked into what was going on.
With the “font squirrel” query, I dug a little further back into its history and discovered an interesting peak at one point in time, which is perhaps worth investigating. Why did the CTR for this query suddenly increase at that point?
We looked at the query CTR before, and here is the data for the landing page:
It seems to fluctuate frequently; there’s nothing exceptional about it. When I look at the actual SERPs, I see that there’s an official site that the majority of people appear to be searching for.
In this case, I believe “font squirrel” is a false positive, and there isn’t much more you can do about it beyond what we did with the SERP snippet.
While researching “psd to wordpress,” I discovered an interesting dip in CTR that later rebounded, which suggests there is probably something worth investigating for this site.
I looked at the changes in rank as well, although the deviation was a bit counter-intuitive. Unfortunately, when I looked at the SERPs, I thought to myself, “Well, there isn’t much I can do about this, since this query triggers a wall of advertisements.”
Then there are the special search features, such as accordions and videos, which are ideal for anyone looking for tutorial material.
Moving on to “web design blog,” there is a significant decline in CTR that should be investigated.
One thing I did notice with this one is the odd little special SERP feature at the top, as well as a slew of advertisements:
“Well, if I had to prioritize my testing time, I probably wouldn’t touch this one, since I’m not sure I could improve CTR in such a situation,” I reasoned.
If we reverse the situation and look at the positives, one query stands out as an intriguing case. What caused the click-through rate for “hire app developer” to change so dramatically? That isn’t bad in and of itself, but what happened here, and what can we learn from it?
I looked at the SERPs and noticed the same wall of advertisements, the full ad pack getting in the way, which I believe is part of the reason this search snippet isn’t receiving the attention it deserves.
However, I believe I have a hypothesis for an experiment in this circumstance.
Finding an Opportunity
“Top 5 Places to Hire Freelance iOS/Android or App Developers” is the real snippet:
I took a closer look at this and thought about the user’s intent. Part of the difficulty, I suppose, is that Google users can just click on an accordion section of the SERP, get their answer, and leave. But I wasn’t persuaded, because when I got to the website, I had to go through multiple screens to reach the actual answer the page promises.
The page lists the top five places to find the best app developers, but the real answer is tucked away towards the bottom of the page.
CTR Experiment Idea
My actual CTR experiment idea is to connect the promise to the answer more directly. The top five places to find an app developer: those five are the answer, sitting at the end of all the fluff on the page. So here’s the test. What I propose is to take the description for this page and slightly alter it.
What we are essentially doing is giving the answer: “They are Toptal, Gun.io, Hired, X-Team, and Fiverr Pro. But then I’m going to ask you which one is best for you. We go through how each platform works, its pros and cons, and its pricing.” This, I believe, ties more directly to the user intent behind the query.
We also have another factor to consider:
What I’d like to see is an accordion feature that simply states, “Here are the top five destinations; click through to learn more.” And the only way to achieve that is to create a shortlist and place it at the very top of the page content, rather than burying it deep inside the material.
Measuring the Outcomes
We can return to measurement once we’ve decided to run an experiment like this. This could be done on paper, in an Excel spreadsheet, or in Google Docs, but I like to do it in AlgoYou, since it provides me with an interactive platform and does all of the calculations for me. So I go to the Experiments tab, select “Create new experiment,” retrieve the page’s data, and run the experiment for 30 days.
“Hire app developer” is the query. Case A is the current version, and Case B is the one I’m switching to.
I’m listing the top five; the question is, which one is right for you? We go through how each platform works, its pros and cons, and its pricing. Then I click “Create new experiment,” and it appears in my list of ongoing experiments. When the data comes in, I’ll be able to compare the page’s average CTR for that specific query before and after the change.
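The before/after comparison the tool performs boils down to comparing two click-through rates. A sketch with hypothetical click and impression counts for the two cases:

```python
# Sketch of a Case A vs Case B CTR comparison for one query.
# Click and impression counts are hypothetical, not real experiment data.
case_a = {"clicks": 120, "impressions": 6000}  # current snippet
case_b = {"clicks": 210, "impressions": 6000}  # answer-first snippet

ctr_a = case_a["clicks"] / case_a["impressions"]
ctr_b = case_b["clicks"] / case_b["impressions"]
uplift = (ctr_b - ctr_a) / ctr_a  # relative change in CTR
print(f"Case A CTR {ctr_a:.2%}, Case B CTR {ctr_b:.2%}, uplift {uplift:+.0%}")
```

In practice you would also want enough impressions in each window for the difference to be meaningful rather than noise.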
To implement Case B, you edit the meta description on the page, add the recommended content element to the top of the page, and then resubmit the page to Google via Search Console to ensure the experiment runs with the new parameters.
This is the first experiment idea for this website. It’ll be fascinating to see whether the webmaster puts it into practice and reports back on the outcome. We’d love to know whether the results are positive or negative; it’s always valuable to know either way. Thank you so much, everyone.