Proofing for Problems: SEO and Web Development

You’ve read all the SEO blogs out there, including this one. You’ve tried, tested, tweaked, and torqued your site and search marketing efforts to the best possible calibrations. You’re confident that your site is as tight as it can be, and your marketing efforts will soon pay off.

But…

Life is a Beach

It’s hard for a business owner to just sit back and wait. As much as we dream about being paid for doing nothing, and as much as those pictures of people relaxing on the beach with a beer and a laptop appeal to us, we know that isn’t reality. No… reality is more like Murphy’s Law. Whatever can go wrong, will go wrong.

Instead of waiting and chewing your fingernails down to the quick, start proofing for problems in your SEO efforts. Here’s a brief list of things you can look for and do something about before they get in the way:

That Darn Robots.txt File

Many a website owner has blown the first month or so of an SEO campaign because of a simple mistake in the robots.txt file of their site. Two lines can really jack up a campaign:

User-agent: *
Disallow: /

The first line says, “Hey, all you robots, search engine spiders, crawlers, whatever…” The second line says, “Don’t visit any pages on this site.” Generally, these lines are placed in the robots.txt by a webmaster who forgets to take them out after a design/redesign.

Proofing this problem: The robots.txt file can be found in the main directory of your website. You can view it with any browser, simply by typing in the URL http://mysite.com/robots.txt (or http://www.mysite.com/robots.txt, depending on your settings). If your site is supposed to be visible to search engines, make sure the “disallow” line isn’t in your file.
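If you'd rather script the check than eyeball the file, here's a minimal Python sketch using the standard library's robots.txt parser. The sample rules are illustrative; point the function at your own site's robots.txt contents.

```python
# Sketch: warn if a robots.txt blocks every page for every crawler.
from urllib import robotparser

def site_blocks_all_crawlers(robots_txt: str) -> bool:
    """Return True if robots.txt disallows the whole site for all user agents."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    # If the generic "*" agent can't fetch the homepage, the site is invisible.
    return not parser.can_fetch("*", "/")

blocked = "User-agent: *\nDisallow: /\n"
open_site = "User-agent: *\nDisallow:\n"
print(site_blocks_all_crawlers(blocked))   # True  -> problem!
print(site_blocks_all_crawlers(open_site)) # False -> crawlers welcome
```

In practice you'd fetch http://mysite.com/robots.txt and pass its text to the function.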

Canonical URLs – To W3 or Not?

View your site structure as Google sees it by typing “site:mysite.com” or “site:www.mysite.com” into Google’s search box. Scroll down your personal SERPs, looking at the URLs. What you’re looking for is a www.mysite.com version and a mysite.com version. You don’t want both. If you’re seeing both, you have an issue with your canonical URLs. There’s a penalty for this, but it’s self-imposed.

You see, the links, rankings, and so on that your site accumulates are finite. You have a limited amount. And although you see the www and non-www versions as the same site, search engines don’t. That means your SEO efforts get split between two versions.

Proofing this problem: If you find both versions, you’ll need to update your .htaccess file with a 301 redirect, again found in the main directory of your site. Add the following lines to go from non-www to www:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [L,R=301]

Replace “mysite.com” with your actual domain. (Note the backslash before the dot: in the RewriteCond pattern, an unescaped dot matches any character.) You can also set up a Google Webmaster Tools account and specify which version you prefer Google to index.
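To see what that rewrite rule actually does, here's the same logic expressed as a small Python sketch. The domain is a placeholder; the function returns the 301 target for a non-www request, or None if the host is already canonical.

```python
# Sketch: mirror the .htaccess non-www -> www redirect in plain Python.
import re
from typing import Optional

def canonical_redirect(host: str, path: str) -> Optional[str]:
    """Return the 301 Location for a non-www request, else None."""
    # [NC] in the rewrite rule means case-insensitive, hence re.IGNORECASE.
    if re.match(r"^mysite\.com$", host, re.IGNORECASE):
        return "http://www.mysite.com" + path
    return None

print(canonical_redirect("mysite.com", "/about"))      # http://www.mysite.com/about
print(canonical_redirect("www.mysite.com", "/about"))  # None (already canonical)
```

A quick way to verify your live site: request both versions with a browser or curl and confirm the non-www one answers with a 301 pointing at the www version (or vice versa, whichever you chose).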

Proper Content Structure

Would you put a roof on your house without first building the walls? Of course not. A house is built in a certain order, and so is a piece of content. Proper structure not only gives your readers an easy-to-read piece; it also helps search engines digest your pages.

Proofing this problem: Look over your content and make sure each page has proper structure. For example, a #1 heading (<h1></h1>) should never come after a #2 heading or be left off entirely. Stick to normal document structure for stronger pages.
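The heading rule above is easy to check mechanically. This Python sketch scans a page's HTML for heading tags and flags a missing leading <h1> or a skipped level; the sample markup is made up for illustration.

```python
# Sketch: flag heading-structure issues (no leading <h1>, skipped levels).
import re

def heading_problems(html: str) -> list:
    """Return a list of human-readable heading-structure complaints."""
    levels = [int(m.group(1)) for m in re.finditer(r"<h([1-6])\b", html, re.IGNORECASE)]
    problems = []
    if not levels or levels[0] != 1:
        problems.append("page does not start with an <h1>")
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:  # e.g. <h1> followed directly by <h3>
            problems.append(f"<h{cur}> skips a level after <h{prev}>")
    return problems

good = "<h1>Title</h1><h2>Section</h2><h3>Sub</h3>"
bad = "<h2>Section</h2><h1>Title</h1><h4>Sub</h4>"
print(heading_problems(good))  # []
print(heading_problems(bad))   # two complaints
```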

Duplicate Duplication

Duplicated content doesn’t bring site penalties; after all, if it did, you’d be in trouble any time someone scraped your site. However, it does lessen the overall quality of your site, whether it’s duplicate content or copied meta information.

Proofing this problem: If you have a Google Webmaster Tools account, you can go in and see what suggestions it has for your meta information. Otherwise, look over your meta tags and titles to make sure they’re unique to the page they’re on. If not, rewrite them!
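For a site too large to review by hand, a sketch like this groups pages by title so duplicates jump out. The URL-to-title mapping below is illustrative data, not real pages; the same approach works for meta descriptions.

```python
# Sketch: find pages that share the same <title> text.
from collections import defaultdict

def find_duplicates(pages: dict) -> dict:
    """Map each duplicated (normalized) title to the list of URLs using it."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

pages = {
    "/": "Acme Widgets",
    "/about": "Acme Widgets",   # duplicate -> should be rewritten
    "/contact": "Contact Acme",
}
print(find_duplicates(pages))  # {'acme widgets': ['/', '/about']}
```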

Key Term Much?

Just because you can use a key term on your page doesn’t mean you should. Too many different key terms can be as bad as too many uses of the same term. If you’ve poured a lot of key terms into the same page, you might want to rethink that strategy.

Key terms should be considered topics. Main topics, sub topics, even tertiary topics, but no more. Really, how many topics can one page be about?


Proofing this problem: Go through your pages; after all, you have nothing better to do than chew your nails, right? Ask yourself what three things the page is about. What’s the main topic? How many topics can it reasonably be broken into? If you can find five or six topics, it may be better to tighten the page up to two or three and create another page with the rest. The more tightly focused a page is, the better it will be received!
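Counting how often each key term actually appears is a quick way to spot a page spread across too many topics. The page text and term list in this Python sketch are placeholders; run it against your own content and your own candidate terms.

```python
# Sketch: count whole-word occurrences of each key term on a page.
import re
from collections import Counter

def term_counts(text: str, terms: list) -> Counter:
    """Case-insensitive, whole-word counts for each key term."""
    counts = Counter()
    lowered = text.lower()
    for term in terms:
        pattern = r"\b" + re.escape(term.lower()) + r"\b"
        counts[term] = len(re.findall(pattern, lowered))
    return counts

page = "SEO tips for web development. Good SEO takes time; web development too."
print(term_counts(page, ["seo", "web development", "link building"]))
# Counter({'seo': 2, 'web development': 2, 'link building': 0})
```

Terms that barely appear (or a long tail of many distinct terms) are a hint that the page is trying to be about too much at once.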

Proofing for Problems Isn’t for Wusses

It takes a little time and a lot of attention to detail to proof your SEO campaign, especially once you’ve already started implementing things. In fact, the often-meticulous nature of optimization is one of the reasons businesses hire professional SEO experts in the first place. However, by taking the time to tighten up your site and campaign from point to point, you’ll also be increasing the possibility of high returns from any marketing campaign you set your mind and resources to.

 

About Level 343

This account is where everyone involved with Level343’s content marketing efforts shows up. You could say there’s no "I" in this team. Sometimes we’ll chat about a certain topic with a variety of ideas, suggestions, even opinions. Then one of us starts writing the post and hands it over to someone else, who continues the diatribe. Eventually it ends up on our editor’s desk, who either chops the hell out of it, or you’re reading it right now.

Comments

  1. These days, SEO is more difficult than before, so we have to keep it in mind when developing any theme for websites or blogs. Thanks for your post.

    Best Regards!

  2. Hi, Ron –
    Yes, you can just use Googlebot for Google’s user agent. However, the search engine has more than one user agent (example: Googlebot-image). You can find more in-depth, Google-specific information in Google’s webmaster support.

    Also, you can find a comprehensive list of crawlers at User Agent String. To block, you use the bot’s name, not the whole string.

    For example: If the string is Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0) AddSugarSpiderBot http://www.idealobserver.com
    The robots.txt “User-Agent:” designation would be AddSugarSpiderBot

    Hope this helps, and thanks for stopping by!

  3. Have a question here. When using a robots.txt file, does the user agent string have to be exactly as it appears in my server logs? For example when trying to match GoogleBot, can I just use “googlebot”?

  4. Well done! You just presented key tips for a high-level DIY audit. WIN!

  5. Very resourceful post. For SEO, it’s very important to follow the above tips, which help in gaining good rank in the search engines.

  6. Absolutely, Jack – Good point.

    One of the things we have to be careful of when writing posts is to walk the line between readers who have a lot of technical savvy and readers who have none. Sometimes we miss the obvious answers in an effort to walk that line.

    For instance, how many know to check the URL parameters in GWT, and know what to do with it when they do check? -And how do we write this so it’s easily understandable without “talking down” to the savvy readers?

    Ah, c’est la vie! lol

    Thanks again for your comment!

  7. You’re most welcome, Karen, and thanks for taking the time to comment!

  8. Thanks for taking the time to comment, Laura. Sometimes, that robots file can be a sneaky one. It’s better to have it on a list of “things to check” and have it be empty, than to leave it off and miss something.

    - And thank you for the compliment!

  9. A smart post. Those seem to be a rare commodity these days. I’ve never done anything with my robots.txt, but I haven’t ever checked it either.

  10. One cool feature of Google Webmaster Central is that it can help you keep from making mistakes when you generate your robots.txt file. If you’re preventing crawlers from accessing important parts of your website, Webmaster Central flashes a warning and tells you which pages you’re blocking.

  11. Just built my new site, so I’m now retracing all things related to SEO. Thanks for the timely article.

