Predicting the ROI of SEO [PODCAST]

Given the dynamic nature of the search landscape, predicting the ROI of SEO can be a risky move. But with enough data at your disposal, it is possible to produce a realistic projection.

In this episode of the Search Engine Journal Show, I had the opportunity to interview Francois Goube, CEO and Founder of OnCrawl.

Goube offered tips on predicting the ROI of SEO, the key factors you need to consider, and how to get the data you need for your projections.

Predicting the ROI of SEO

How can people predict what kind of ROI they’re going to get from their SEO efforts?

Francois Goube (FG): We’re working with SEO geeks… and they focus too much on technique instead of focusing on their North Star metrics in terms of business.

For example, some people tend to check their rankings daily… They see a big drop in their rankings and in their organic traffic. But the main thing is not about how much traffic you lose – it’s about how much business you lose.

There have been some Google updates that we were focusing on that impacted ecommerce websites. And when you look at the data, effectively, there has been a drop in their rankings and in their organic traffic.

But when you’re looking at how much money they make, it’s still the same.

They just lost some traffic. And it doesn’t matter at all, because the business is still the same.

Keeping that in mind, at the end of the day, it’s all about what’s on your table and how much money you’re making.

This is something that SEOs are sometimes not thinking about because they are too deep in their SEO techniques.

Brent Csutoras (BC): We see that a lot, and it’s something you need to kind of wake up and realize: your business goals are separate from just random marketing metrics. Right?

How can marketers look at identifying a path to seeing ROI from SEO?

FG: I believe we need to step back a little bit to really understand what’s occurring there.

First thing is, not all websites are born equal. They are addressing different topics, have different sizes, and don’t have the same overall popularity on the web.

It means that if you’re an ecommerce niche player, Google won’t behave the same way on your website [compared to] a big online media website.

A lot of people think that Google has infinite money, but it is really picky regarding its expenses. When it comes to SEO, it’s running zillions of servers to crawl the web, and it costs money to do that.

So when you train yourself to think about how you might optimize [Google’s] own crawling resources, you can point yourself in the right direction.

You also need to understand which ranking factor matters most for your website because you know that all websites are not born equal and that Google is trying to optimize its expenses.

What other factors should people pay attention to?

FG: Something that has been very powerful, because we saw customers having terrific results optimizing that, is structured data.

Adding some schema.org tagging on your product pages, for instance. Let’s say you’re adding aggregate reviews so that you can get a rich snippet showing those stars on Google.

When you’re optimizing your product pages with structured data, chances are you’ll get better click-through rates.
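
To make that concrete, here is a minimal sketch in Python of the kind of schema.org markup Goube describes. The product name and rating figures are hypothetical, and the JSON-LD it prints would be embedded in a <script type="application/ld+json"> tag on the product page.

    import json

    # Minimal schema.org Product markup with an AggregateRating:
    # the kind of tagging that can earn review stars in a rich snippet.
    # The product name and rating figures below are hypothetical.
    product_markup = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Example Widget",
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": "4.6",
            "reviewCount": "128",
        },
    }

    # Embed the output on the product page inside:
    # <script type="application/ld+json"> ... </script>
    print(json.dumps(product_markup, indent=2))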

Our customers were able to optimize by adding structured data because they could check it with OnCrawl and see the direct results.

When they look at logs, for instance, at the number of products sold on pages with structured data compared with pages without that structured data, they see that Google appreciates that kind of schema.

And they gained way better click-through rates. So for the same ranking, they got a lot more traffic.

I believe that people should really pay attention to what’s occurring on the schema.org side.

BC: You need to have schema at this point or you’re just missing out: you’re not going to rank the same way, you’re not going to show up the same way, and you’re not going to get the same click-through rates.

Schema results in you being included in so many different sites other than just Google. It’s a universally accepted coding structure for information, so I highly agree that schema is an absolute must at this point.

How do you take this information and turn it into something strategic to be able to predict which efforts are going to benefit you and how much they’re going to benefit you?

FG: For SEOs, the main thing is about understanding what is your business and what’s your North Star metric.

If you’re an online media outlet, that might be the revenue per visit.

If you’re an ecommerce player, it might be the average order value, your daily revenues, or the monthly revenues per product page.

At OnCrawl, we unlocked a feature called segmentation a few years back. You can map your website with any metric you want (e.g., you can map your page depth levels against your monthly revenues per product page).

You can check if you have revenue-generating pages that are deep in your architecture or if they are closer to the homepage.

If you have those kinds of pages that are too deep in your website, then you might not be doing your job, because they are generating a lot of revenue, but you’re not sending enough internal popularity to them. There might be something wrong here.
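
As a rough illustration of that depth-versus-revenue mapping, here is a sketch assuming a hypothetical pages.csv export with url, depth, and monthly_revenue columns (not OnCrawl’s actual implementation):

    import pandas as pd

    # Hypothetical export: one row per page, with its click depth
    # (clicks from the homepage) and its monthly revenue.
    pages = pd.read_csv("pages.csv")  # columns: url, depth, monthly_revenue

    # Total and average revenue per depth level.
    print(pages.groupby("depth")["monthly_revenue"].agg(["sum", "mean"]))

    # Revenue-generating pages buried deep in the architecture are
    # candidates for stronger internal linking.
    buried = pages[(pages["depth"] >= 5) & (pages["monthly_revenue"] > 1000)]
    print(buried.sort_values("monthly_revenue", ascending=False).head(20))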

The main thing is about understanding your business and bringing value on top of your SEO metrics. When you know your business, you know your constraints and you know your business metrics.

So let’s merge your SEO data with your business metrics. This is the first step.

As soon as you have your business constraints in mind, have your SEO data at your fingertips, and have understood what type of website you’re working on, then you can add a lot of value by predicting your results.

We talked about identifying which ranking factors you need to work on first because not all websites are born equal. When you’re merging your log data with your crawl data, it is very easy for you to prioritize things. You can do a test-and-learn approach.
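
To give a flavor of that merge, here is a sketch under assumed file layouts (crawl.csv and googlebot_hits.csv are hypothetical exports, not a specific tool’s format):

    import pandas as pd

    # Hypothetical inputs: a crawl export (one row per URL with its SEO
    # attributes) and server logs aggregated to Googlebot hits per URL.
    crawl = pd.read_csv("crawl.csv")          # columns: url, depth, inlinks
    logs = pd.read_csv("googlebot_hits.csv")  # columns: url, bot_hits

    merged = crawl.merge(logs, on="url", how="left").fillna({"bot_hits": 0})

    # Pages that never receive a Googlebot hit despite sitting in the
    # site architecture are a natural place to start prioritizing.
    never_crawled = merged[merged["bot_hits"] == 0]
    print(f"{len(never_crawled)} pages received no Googlebot hits")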

Analyze how much traffic you’re getting and what you’re ranking for, so you know where you stand.

Aside from knowing what you’re ranking for, you need to detect which areas of your website you want to work on first.

Is there a business opportunity somewhere? To take the level of competition into account, you can use tools like SEMrush.

You plug in your keywords, then you’ll know the level of competition.

Perhaps you have 20 pages that are ranking on the first page of Google between, let’s say, rank 4 and rank 10. And for those pages, the competition is very low.

You can then plug your SEMrush data into OnCrawl to map your website with the level of competition, and you can see where you might have an opportunity to rank and get more traffic.
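
A sketch of that filtering step, assuming a hypothetical keywords.csv export with one row per page/keyword pair, its current position, and a 0-to-1 competition score:

    import pandas as pd

    # Hypothetical export: url, keyword, position, competition (0 to 1).
    kw = pd.read_csv("keywords.csv")

    # Opportunity set: already on page one (positions 4 to 10) but
    # facing low competition, so small efforts can move them up.
    opportunities = kw[kw["position"].between(4, 10) & (kw["competition"] < 0.2)]
    print(opportunities.sort_values("competition").head(20))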

Let’s take these 20 pages ranking from position 4 to 10. With a very low level of competition, you know that with a little effort you will gain some rankings. Your pages are already ranking well, but if you move up into the top three spots, you’ll get way more traffic.

By understanding the click-through rates and the volume of queries on those keywords, you can predict the gain in terms of traffic.
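
As a back-of-the-envelope sketch of that prediction (the CTR curve below is an illustrative assumption, not a published Google figure; substitute your own click-through data):

    # Illustrative average CTR by organic position; real curves vary
    # by industry and SERP layout.
    CTR_BY_POSITION = {1: 0.28, 2: 0.15, 3: 0.10, 4: 0.07, 7: 0.03}

    def predicted_gain(monthly_searches: int, current_pos: int, target_pos: int) -> float:
        """Estimated extra monthly visits from moving between positions."""
        current = monthly_searches * CTR_BY_POSITION[current_pos]
        target = monthly_searches * CTR_BY_POSITION[target_pos]
        return target - current

    # For example: a page ranking #7 for a 5,000-searches-per-month
    # keyword moving into position 3 gains roughly 350 visits per month.
    print(predicted_gain(5000, current_pos=7, target_pos=3))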

When you need to evangelize to your bosses or your customers, you can rely on that kind of method.

My colleague, Alice Roussel, also published an article on Search Engine Journal, “How to Predict Your Organic Traffic: Two Methods.”

To listen to this Search Engine Journal Show Podcast with Francois Goube:

  • Listen to the full episode at the top of this post
  • Subscribe via iTunes
  • Sign up on IFTTT to receive an email whenever the Search Engine Journal Show RSS feed has a new episode
  • Listen on Stitcher, Overcast, or Pocket Casts
