SEOmoz Beginner's Guide to SEO – Ch. 8 Review
The first concept covered is sitemaps: what they are and why you want to use them. It's all about making your content easy to find.
Really Simple Syndication (RSS) lets you get your content out to interested users who opt in to your feed. The chapter gives the pros and cons of RSS technology.
Here you get an explanation of text-file sitemaps. The advantage of this format is that it is simple to use, but it does not offer metadata.
The robots.txt file has been covered before. It allows you to designate what you want crawled and what you want kept out of the search engine database.
This part of the chapter explains some of the commands you need to know to set up your robots.txt file properly.
Here you will get an explanation of the meta robots tags (index, noindex, follow, nofollow), what they do, and how to set them up.
Search Engine Tools
Under search engine tools you get to learn about Google and Bing webmaster tools and what they offer in the way of information about your website. You'll want to register your website with both of these tools to get data about who is visiting and which content is attracting the most attention on your site.
The last part of the chapter mentions SEOmoz tools that can help you manage your SEO and data to maximize performance.
Take a look at the video rundown. I think you'll find it useful and a good setup for your read of the material.
As always, if you find this post helpful, give it a "like", a "tweet", a "pin", or a "Google +1" to get the word out. Many thanks for the help.
Stay with it, stay well and talk soon.
For you readers, there is a transcript, so click Read More to get it.
Hey, Claude Pelanne, affiliate Starting Line. Welcome.
This is a continuation of the SEOmoz Beginner's Guide to SEO series, and this video is about Chapter 8, which covers search engine tools and services. It explains the concepts behind different SEO tools and how you can use them to work out the SEO for your site.
It starts with the concept of sitemaps. A sitemap is a file that lists your pages' URLs and tells the search engines how they can crawl your site. Each page of your website has its own URL, and each page can be fitted with tags that tell the search engine whether to index it or not, or follow it or not. The sitemap is the file that collects those URLs, and it helps the search engines crawl your site. A search engine will look for a sitemap when it visits your site. Some people set them up, some people don't, and the ones who don't are making a big mistake. The language used for sitemaps is XML, Extensible Markup Language, which simply describes the format of the file. It's a good format to use because there are a lot of sitemap generators that can create these files easily. The problem is it creates very large files. So the pro is that it's a widely accepted format for sitemaps; the con is large file sizes.
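As a concrete illustration (the URL and dates here are placeholders, not taken from the guide), a minimal XML sitemap with a single entry looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> block per page on the site -->
    <loc>http://www.example.com/</loc>
    <lastmod>2012-10-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

You can see why the files get large: every page on the site gets its own multi-line `<url>` block.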
Then it goes on to explain three other formats you're going to find on websites. RSS, Really Simple Syndication (also called Rich Site Summary), is a piece of code that lets people access the content of your website dynamically. What that means is you set up an RSS feed on your site, people sign up to that feed, and when you add content, a notification automatically goes out to their RSS readers so they can click through and access it. So RSS sitemaps are easy to maintain: they can be coded to update automatically. The con is that, as a variation of XML, they're harder to manage.
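For illustration only (the titles, URLs, and date are placeholders), an RSS 2.0 feed is itself a small XML file, which is why it can update automatically as you post:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Site</title>
    <link>http://www.example.com/</link>
    <description>New content notifications</description>
    <!-- One <item> is added per new post -->
    <item>
      <title>Latest Post</title>
      <link>http://www.example.com/latest-post</link>
      <pubDate>Mon, 01 Oct 2012 12:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```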
Text files. A text sitemap is unformatted text in a file with the .txt extension. They're easy to use: the format is one URL per line, with a limit of 50,000 lines per sitemap, after which you have to create a new file. I wouldn't worry about that. The con is it does not provide the ability to add metadata, so you're not going to be able to indicate index, noindex, nofollow, etc., but you don't have to worry about that either, because there are other means of doing so.
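A text sitemap really is just one URL per line and nothing else (placeholder URLs here):

```text
http://www.example.com/
http://www.example.com/about
http://www.example.com/contact
```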
Robots.txt. We already looked at this. This is a file that allows you to designate whether you want to allow a search engine to crawl your site. The chapter explains the different commands that occur in the robots.txt file, such as Disallow, Sitemap, and Crawl-delay. It shows you an example: the file lives at www.example.com/robots.txt, so the page we're talking about is robots.txt at the root of the site.
User-agent is where the name of the bot goes, the one you want to address, and Disallow is the command. In the first block the user agent is everybody and the Disallow line is empty, so that code allows crawling of any kind. Below it, you specify a user agent of spambot. You don't want this bot to index your site, because it could be a spammer's crawler, so you create a disallow by putting a parameter behind the command. Now that spambot will not be allowed to crawl your site. That's what that means, and it shows you the format for doing it, which you can repeat for every crawler you want to block.
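To sketch how a file like that behaves, here is how Python's standard `urllib.robotparser` reads it. The "spambot" name and the URLs are placeholders echoing the pattern described, not the guide's actual example:

```python
import urllib.robotparser

# Hypothetical robots.txt mirroring the pattern in the chapter:
# an empty Disallow under "*" permits everyone, while "Disallow: /"
# under a named user agent blocks that one crawler entirely.
robots_txt = """\
User-agent: *
Disallow:

User-agent: spambot
Disallow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# A well-behaved crawler checks before fetching a URL:
print(rp.can_fetch("Googlebot", "http://www.example.com/page.html"))  # allowed
print(rp.can_fetch("spambot", "http://www.example.com/page.html"))    # blocked
```

Note that this only works for crawlers that choose to obey the file, which is exactly the caveat the guide raises next.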
What the little paragraph here tells you is that generally this works very well, but sometimes these coders are pretty smart: they work their way around it, or they create a new spamming bot that you haven't designated yet.
Meta robots tags. Again, this goes back to what we talked about before. You can designate whether you want index or noindex, follow or nofollow, and it explains the code for invoking those commands and parameters on whatever page you want. And it shows you the code that's needed to do it.
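As a sketch of what that code looks like, the meta robots tag goes in the page's head; this hypothetical example tells engines not to index the page but still follow its links:

```html
<head>
  <!-- noindex: keep this page out of the index; follow: still crawl its links -->
  <meta name="robots" content="noindex, follow">
</head>
```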
We already covered some of this in another part of the book, but it shows you here, with an example of the code, how to code the nofollow, for instance. The tricky one is the canonical. This is where you may have several pages on your website; in this case they're showing four pages and pretending they all carry the same content. You want to avoid that. You want to designate one of these as the go-to page and tell the search engine to treat the other three as copies of it. You do that by going to the three pages you want ignored and adding the code you see here on the right. So it shows you the code designation for telling the search engine that default.asp, the page here in red, is the main page to go to for the content.
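Sketching that designation: on each duplicate page, a link tag in the head points at the page you want treated as the original. The default.asp name echoes the guide's example; the domain is a placeholder:

```html
<head>
  <!-- Placed on each duplicate page, pointing at the preferred version -->
  <link rel="canonical" href="http://www.example.com/default.asp">
</head>
```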
So there is an article, a very good article I might add, on the Daily SEO blog that explains canonical designations and how you should apply them. You want to go to the SEOmoz site and look for that.
Search Engine Tools. Google Webmaster Tools is one free tool; I just briefly showed you that. Let's go to the Webmaster Tools dashboard for a website so you can see what it does. This is a training site that has nothing in it, but the tool would give you information about the configuration of your site, the sitelinks, and the URL parameters. It'll show you the health of your site, give you crawl errors, and tell you whether you have any malware. It will give you information about traffic and the links coming to your site. So this tool will help you analyze what's going on with your website. It will give you geographic targeting information, preferred-domain information, URL parameters, crawl information, sitelinks, Google +1 metrics, all sorts of information about your site.
Then it goes on to say that Bing has exactly the same kind of tool available, and this is what the Bing site would look like. Again, it gives you information here on the right-hand side about sitemaps, crawl information, all the stuff you want to know from Bing about how your site is doing. And these are free tools. The only thing you have to learn is how to set them up; I'll have some videos on how it's done.
Then, down here at the very end, it talks about SEOmoz's Open Site Explorer. That's a great tool, and the guide gives you an indication of what it can do. It also pushes SEOmoz products, which it should. This is a great outfit to use if you have the need for it, and everybody does. It's a subscription service, and you can go there and find out what it costs. I think it's around $100 a month for individuals and can run up to $500 a month for businesses, but it's a universal tool used by all the professional SEOs I know of, so you want to know about it.
Eventually, if you end up working in the SEO world, this will probably be one of the tools you use for analysis. So that's it. That's Chapter 8 of the SEOmoz Beginner's Guide to SEO. I hope this video has been helpful. This is Claude Pelanne, Affiliate Starting Line. Stay with it. Stay well. I will talk to you soon.