Conducting a quick site audit
– [Instructor] SEO is complicated. And it’s nearly impossible for me to give you everything you need to know to succeed in this foundations course. However, I want to walk you through a few important SEO audit steps in a very short timeframe. This way, you’ll know where to start and you’ll get a good idea of what’s involved. From here you can check out more courses on the topic and expand your SEO skills. Now we’re going to move fairly quickly, so if you feel lost, feel free to pause or rewind. And to speed things up, as you may have noticed, I’ve already opened up all of the tools that we’ll be using in the tabs here in my browser.
So let’s talk about what our plan is. The goal of our SEO audit is to identify three main elements: how we’re doing technically, how we’re doing with our content, and how we’re doing with our exposure on the web. For this example, we’re going to be using the site explorecalifornia.org to perform our audit. So the first thing that I do is I open up the site and I simply take a quick glance at what’s going on. I wanna make sure the site loads and get a sense of how you navigate the site, what the content is about, and what the key information on the site is.
In this case I’m going to assume that Explore California is primarily trying to drive SEO traffic to individual tourist pages. In that case, I can select any of these various options to learn more about each page. The other thing I wanna do is identify any content that isn’t text. So here I can see the headline is text, but this Find Your Tour button is actually an image. Therefore, Find Your Tour is not text that Google is seeing. If this was an important piece of SEO content I might want to resolve that.
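A check like this can be scripted as well. Here’s a minimal sketch, using Python’s standard-library HTML parser, that lists every image on a page and flags images with no alt text, the kind of image-based content Google can’t read as text. The HTML snippet is a hypothetical stand-in for the homepage, not Explore California’s actual markup.

```python
from html.parser import HTMLParser

class ImageAudit(HTMLParser):
    """Collects (src, alt) for every <img> tag encountered."""
    def __init__(self):
        super().__init__()
        self.images = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            self.images.append((a.get("src", ""), a.get("alt", "")))

# Hypothetical homepage snippet: the button text lives inside an image.
html = '<h1>Explore California</h1><img src="find-your-tour.png" alt="">'
audit = ImageAudit()
audit.feed(html)

# Images with empty alt text carry no words Google can index.
missing_alt = [src for src, alt in audit.images if not alt]
print(missing_alt)
```

If important keywords only exist inside an image, you’d either replace the image with styled text or at least give it descriptive alt text.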
I’ll also open this up on my mobile device and identify how it looks and works on my mobile phone. Because SEO is primarily mobile these days, that is an important step as well. The next thing I like to do is identify what Google sees about this site. So I’ll go to google.com and I’ll run a query for site colon and then my domain name. And there is no space between the colon and the domain name. Whether you include the www or not doesn’t matter for the query, but if you leave the www off, the results may reveal some interesting insights, like content indexed on subdomains.
So right away I can see that Google has indexed some content. There are about 12 results. And the first result looks good, it’s my homepage. But now you’ll notice that there are some strange things that Google has picked up. You’ll see that they have services.explorecalifornia.org, and the headline is “Index of.” And this suggests that this is content that I don’t want on Google. I may also see PDFs or private data that I didn’t intend for Google to find. So you always wanna start by exploring these results.
Now seeing things like this suggests to me that we aren’t doing a good job of telling Google what not to crawl. So from there I need to inspect the site. I’ve gone ahead and opened up a tourist page, and the first thing that I’ll do is look around the page. I want to identify if all the main elements are in place. I need to understand if the content that I’m seeing in the Google index is being misinterpreted from this page. The first thing I like to do is look at the headlines.
These tend to be our primary keywords. In this case I have one at the top, Our Tours, followed by another headline labeled Backpack California. Now using Chrome on a Mac, I right-click and then select Inspect. And here I can see that our heading one tag is Our Tours. If I right-click on Backpack Cali and choose Inspect, I see that that, too, is a heading one tag. Now there shouldn’t be two heading one tags. That’s a red flag. And I also need to understand if this heading one tag is really a good keyword.
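The duplicate heading check can also be automated instead of done tag by tag in the inspector. Here’s a minimal sketch with Python’s standard-library HTML parser; the HTML snippet mimics the two-h1 problem on the tours page and is a hypothetical example, not the page’s actual source.

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Collects the text of every <h1> tag on a page."""
    def __init__(self):
        super().__init__()
        self.h1s = []
        self._in_h1 = False

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self._in_h1 = True
            self.h1s.append("")

    def handle_endtag(self, tag):
        if tag == "h1":
            self._in_h1 = False

    def handle_data(self, data):
        if self._in_h1:
            self.h1s[-1] += data.strip()

# Hypothetical markup with the same problem we just saw: two h1 tags.
html = "<h1>Our Tours</h1><p>Adventure awaits.</p><h1>Backpack Cali</h1>"
parser = H1Counter()
parser.feed(html)

if len(parser.h1s) > 1:
    print("Red flag: multiple h1 tags:", parser.h1s)
```

Run across every page of a site, a check like this surfaces duplicate or missing h1 tags in one pass.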
Maybe there’s something better that I should be doing here. Maybe this keyword should really be Backpack California. If I’ve done my research I’d know what I need to put in my heading one. Now we also saw some content that was indexed that shouldn’t be there. So I’ll next check our robots.txt file. I simply put slash robots.txt at the end of the URL, and here I can see that we don’t have one of these files, and that’s problematic. A robots.txt file tells Google and other crawlers what to do.
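As a sketch, a minimal robots.txt that blocks the kind of content we saw in the index might look like the following. The Disallow paths here are hypothetical examples, not Explore California’s actual directories.

```
User-agent: *
Disallow: /services/
Disallow: /downloads/

Sitemap: https://explorecalifornia.org/sitemap.xml
```

Each Disallow line tells compliant crawlers to skip that path, and the Sitemap line points them at the site map.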
We don’t want the crawler wasting time browsing sections of our site that don’t need to be indexed. So we need to disallow those sections here. Next, I want to look for sitemap.xml. I do that in the same way that I look for robots.txt. Here we can see that we don’t have a site map either. Another red flag. A site map helps Google navigate all the sections of your site. It’s an important component, and the fact that we don’t have it here is not a good sign. So I’d make note that we need to create one. To show you what these look like I’ve gone ahead and pulled up lynda.com’s robots.txt file.
You can see that they don’t want Google crawling a ton of sections. I’ve also opened up their site map. Now in this case, they use a site map index. This tells Google that they have many site maps, and that those exist at the locations listed here. I’ll select into one of those site maps to show you what this should look like. Here we can see all the information that Lynda is telling Google about every page on its website. This includes video site maps, indicating that a particular page contains a particular video.
Here’s the thumbnail on the video, the title, and so on. Next we need to dive into the technical considerations. I start by running Google’s Mobile-Friendly Test, and I’ll provide the URLs to each of these tools on the screen for you. What we do is we enter our URL and Google’s going to tell us whether our page is mobile friendly or not. Now you’ll notice something interesting. Google says the page is mobile friendly, but the preview they show on the right-hand side doesn’t look that friendly to me.
Clearly there’s a problem here. You’ll notice that there is a warning at the top of the page. Google says it had some issues loading this page. If I select View Details, Google’s gonna tell me all about what’s broken. It says it couldn’t load these page resources, and these would be areas that we’d want to investigate. Next, I like to run PageSpeed Insights. You’ll find that this will help you understand what you need to optimize to score better in Google. Typically a score above 75 is going to be okay.
Anything below that and you really need to follow the instructions that Google’s provided. Here we can see that mobile has an exclamation point. And this means we have a problem. You can see the score is low. And Google is gonna tell you all of the things that need to be fixed. Here I can click Show How to Fix, and Google’s gonna tell me exactly what I need to do to score better. Google will also link you to their help docs, which will help you understand how to solve these problems. At the top I can toggle to desktop, and as we saw earlier we see a preview. We wanna make sure that that looks right to us, and we can also view the suggestions for desktop.
From here we pop into Google Search Console and review all the errors there. Now we took a look at Search Console earlier so I’ll skip over this. From there I like to look for structured data. Structured data helps Google understand what’s on your site. Now if you’re not using structured data, that’s not necessarily a problem, but there’s a lot of benefit in SEO to using it. Here we can see that Google has not detected anything. This indicates that I might want to add structured data to help improve my SEO.
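To make that concrete, structured data is most commonly added as JSON-LD inside a `<script type="application/ld+json">` tag in the page’s head. A hedged sketch for a tour page, using the schema.org TouristAttraction type with placeholder values that I’ve made up for illustration, might look like this:

```json
{
  "@context": "https://schema.org",
  "@type": "TouristAttraction",
  "name": "Backpack California",
  "description": "A guided backpacking tour of California.",
  "url": "https://explorecalifornia.org/tours/backpacking"
}
```

You’d pick the schema.org type that actually matches each page’s content, and validate the markup with Google’s structured data testing tools before shipping it.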
To show you what it looks like when there is information, I went ahead and ran this search on a New York Times article. Here on the right-hand side we can see that the New York Times has identified that there’s a news article on this page, and they’ve given Google all kinds of helpful insights. They’re telling Google what the headline is, the description, the genre, what the article section is, and so on. Anything in red indicates a problem that you’d wanna look at and resolve. Now finally we need to look at our actual crawl. And this is where I’ll use a tool called Screaming Frog SEO Spider.
This is a really powerful tool and you can download it for free, although the free version only has limited functions. They do have a premium option and it’s worth every penny. To get it you’ll simply select Download from the SEO Spider dropdown. I’ve already downloaded the tool, so I’ll go ahead and open it. What this tool does is crawl your website in the same manner that Google would, and it helps you identify problems. I’ve simply added our URL at the top and selected Start, and now you’ll see our results come in.
This is bringing in every page and image, and each one has been tested. It lets us know what the type of content is, in this case HTML, an image, CSS, a video. And here we can see the status. And you really want to see the number 200. That means everything’s okay. A 404 means there’s a problem. It means that I linked from somewhere on my site to somewhere else on my site, and the page that I linked to doesn’t exist. And that’s a big red flag to Google because it means that you’re not totally sure what’s happening on your own site.
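The core of this status check is simple enough to sketch in a few lines. Given the status code each crawled URL returned, flag anything that isn’t a healthy 200. The URLs and codes below are hypothetical examples standing in for a real crawl’s output.

```python
def flag_broken(crawl_results):
    """Return the URLs whose status code signals a problem (anything non-200)."""
    return {url: status for url, status in crawl_results.items() if status != 200}

# Hypothetical crawl output: URL -> HTTP status code returned.
crawl_results = {
    "https://explorecalifornia.org/": 200,
    "https://explorecalifornia.org/tours/": 200,
    "https://explorecalifornia.org/old-page": 404,  # broken internal link
}
broken = flag_broken(crawl_results)
print(broken)
```

A real crawler would also treat 3xx redirects and 5xx server errors as worth reviewing, but the non-200 filter is the first pass.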
So if I select into a 404, you’ll notice at the bottom we get helpful insights. We can choose the inlinks to see all the pages that this is being linked from, and the outlinks if we were linking out from this page. Now there are no outlinks on a 404 page because it doesn’t exist. But using the inlinks section I can identify all of the pages that are linking out to this broken link, and I could go ahead and remove that or resolve it. What’s also helpful is you can explore the content. Using the tabs at the top I can look at the page titles for every page.
And this helps us because we know that we don’t want duplicate page titles. So if we see the same page title over and over, that would help me understand which pages I needed to resolve. Same thing with the meta description: we can see if the meta description is custom for each individual page. You can also check the length as well as the pixel length to understand if it’s too long. You’ll notice here on the right-hand side, this tool helps you understand where things need help, so I can click Duplicate and it’s gonna show me all the duplicate meta descriptions.
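The duplicate check the tool is performing can be sketched in a few lines: group crawled pages by their title and keep only the titles used more than once. The page data here is hypothetical, standing in for a real crawl export.

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Map each title to the URLs that use it, keeping only titles used twice or more."""
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title].append(url)
    return {title: urls for title, urls in by_title.items() if len(urls) > 1}

# Hypothetical crawl export: (url, page title) pairs.
pages = [
    ("/tours/backpacking", "Explore California"),
    ("/tours/biking", "Explore California"),  # same title: needs a custom one
    ("/about", "About Us | Explore California"),
]
dupes = find_duplicate_titles(pages)
print(dupes)
```

The same grouping works for meta descriptions; you’d just swap in that field from the crawl export.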
If there are some that are too short I can select that option. If there are some that are too long it would let me know here. We also have the ability to select into our heading one tags, and again we can immediately identify that these aren’t the right heading one tags. Therefore, I have a problem. I need to fix those. As you can see, this crawler is incredibly powerful, and there’s a lot that you can do here. So take a few minutes and run through this quick SEO audit to see if you can find any issues that jump out at you. You’d be surprised at what a few tweaks can do to greatly improve your traffic.