A Few Tips For Climbing The Search Engines

After our recent post highlighting a duplicate content issue we faced, we decided it would be a good time to start giving out free advice for building your web rankings.

One of our most commonly requested services is web site optimization. Unfortunately (and fortunately), web developers are only now starting to adopt better practices for building search engine friendly sites. As web development continues to be outsourced, many developers just don’t feel like putting in that little bit of extra effort that truly does go a long way.

You may be familiar with some of the following issues, as I’m trying to keep this list simple, but hopefully there are a few gems that will help you on your way to the top. If you have any questions, don’t be shy about leaving a comment, and we’ll help in every way we can.

  1. 301 Redirects – With almost every customer of ours, the first thing we do is set up a redirect to or from the ‘www.’ sub-domain. If you fail to do this, the search engines may determine that you have two websites (http://www.grapethinking.com and http://grapethinking.com) and split your ‘google juice’ between them. The fix is really simple if you have an Apache server with a hosting company that provides .htaccess and mod_rewrite (see the first sketch after this list).
  2. Dynamic URLs – Which is easier to interpret: grapethinking.com/a-few-tips-for-climbing-the-search-engines or grapethinking.com/?p=4586? It shouldn’t be that difficult to rewrite your dynamic URLs into ‘pretty’ links (see the second sketch below). This will not only help your search rankings, but also make it easier for your visitors to browse your site. It is generally a good practice for your URLs to include your page title.
  3. Site Map – If you want to make sure the search engines crawl all of your site, you should place a site map .xml file on your server and point to it within the robots.txt file (see the third sketch below). The friendlier your site is to a search engine (and handing it a map the second it arrives is pretty friendly), the friendlier the search engines will be to you. If you don’t have a site map, there are plenty of sites that will create one for you. One I recently used was XML-Sitemaps, which was much simpler than downloading Google’s Python script, even though I am a fan of Google Webmaster Tools.
  4. Dynamic META info – This topic has been beaten to death across the internet, but with good reason… it IS important. Make sure you use dynamic information when creating your meta titles (see the fourth sketch below). If they are all the same, you’re just telling the search engine that every page on your site is the same, AKA: you are boring. The meta descriptions aren’t as important because the robots can generally determine the more important content of the page (see next). The meta keywords tag, while extremely valuable about five years ago, is now pretty much irrelevant.
  5. Content Placement – When you buy a magazine, book, or newspaper, the most relevant information is on the front cover. It’s the first thing you see. Search engines seem to think along those same lines: the closer content is to the top of your page, the more important and relevant it must be. If you haven’t specified meta information, and you haven’t effectively used CSS to lay out your content, you may even find Google using your navigation menu as your site’s description. You don’t necessarily have to display the relevant content at the top of your page; it just needs to come first in the HTML file (see the fifth sketch below).
  6. Your Hosting – That’s right… your hosting can play a large role in your search engine rankings. If you have shared hosting, you might be on the same server as one of those nutsos blowing viagra and cialis all over the place, or even worse, a porn site. Even though you may have a different domain name, your IP address will still be the same, and you can be punished for it. It’s worth the extra cash to get a dedicated IP. If you’re bootstrapping, you can always choose a hosting company that makes it tough for spammers to create accounts, whether by calling to verify or by requiring a year’s payment upfront instead of month-to-month.
  7. AJAX – AJAX is great for forms and images, but don’t try to “AJAXify” your whole site. This wonderful technology can provide a great UI, but search engines are blind when it comes to JavaScript, and the fact that your site sits under one never-changing URL does not help much either. Believe it or not, a relatively large percentage of users (~10%) have JavaScript disabled and would not be able to use your AJAX-heavy site. Most large companies have JS disabled by default for protection, and a lot of users turn it off to speed up their browsing and eliminate pop-ups (and pop-unders). JS and AJAX can greatly enhance your site, but they should not be required; build the plain-HTML version first and layer the AJAX on top (see the last sketch below).
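
A few quick sketches for the items above. First, the ‘www.’ redirect from item 1: a minimal .htaccess sketch, assuming an Apache host with mod_rewrite enabled (swap in your own domain):

    RewriteEngine On
    # Send the bare domain to the www. sub-domain with a permanent (301) redirect
    RewriteCond %{HTTP_HOST} ^grapethinking\.com$ [NC]
    RewriteRule ^(.*)$ http://www.grapethinking.com/$1 [R=301,L]

Redirecting the other direction (www. to the bare domain) works just as well; the point is to pick one and 301 the other.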
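
For the ‘pretty’ links in item 2, a hedged mod_rewrite sketch that quietly maps a title-based URL onto the real dynamic one (the slug parameter name here is just an example; use whatever your application expects):

    RewriteEngine On
    # Serve /a-few-tips-for-climbing-the-search-engines from the dynamic script
    RewriteRule ^([a-z0-9-]+)/?$ /index.php?slug=$1 [L]

Most blog and CMS platforms will generate rules like this for you, so check your permalink settings before writing them by hand.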
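
For item 3, robots.txt only needs one extra directive to hand crawlers the map:

    User-agent: *
    Disallow:

    Sitemap: http://www.grapethinking.com/sitemap.xml

And a minimal sitemap.xml, trimmed to a single entry just to show the shape of the format:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.grapethinking.com/</loc>
        <changefreq>weekly</changefreq>
      </url>
    </urlset>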
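
For item 4, the goal is simply that no two pages share the same head. An illustrative snippet (the title and description below are made up for this post):

    <head>
      <!-- A unique, descriptive title per page: page topic first, site name last -->
      <title>A Few Tips For Climbing The Search Engines | Grape Thinking</title>
      <meta name="description" content="Seven common on-site SEO mistakes web developers make, and how to fix them.">
    </head>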
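
For item 5, put the content first in the source and let CSS handle where things appear visually. A rough sketch:

    <body>
      <!-- The real content comes first in the source, so crawlers see it first -->
      <div id="content">
        <h1>A Few Tips For Climbing The Search Engines</h1>
        <p>...</p>
      </div>
      <!-- The navigation comes last in the source... -->
      <ul id="nav">...</ul>
    </body>

with a stylesheet rule along these lines:

    /* ...but CSS can still paint the menu at the top of the page */
    #nav { position: absolute; top: 0; left: 0; }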
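
Finally, for item 7, the usual pattern is a plain link that works everywhere, upgraded by script when JavaScript is available (loadCommentsViaAjax is a hypothetical stand-in for your own AJAX call):

    <!-- Works with JavaScript disabled: a normal link to a normal page -->
    <a href="/comments.php?p=4586" id="show-comments">Show comments</a>

    <script type="text/javascript">
    // When JS is available, fetch the same content in-page instead
    document.getElementById('show-comments').onclick = function () {
      loadCommentsViaAjax(); // hypothetical: your XMLHttpRequest code goes here
      return false;          // cancel the normal navigation
    };
    </script>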

So there you have it… 7 of the more common mistakes made by web developers. Granted, what you do on-site is only the foundation of optimizing your site; off-site factors are what really carry the most weight in search engine rankings.

Grape Thinking is a search engine optimization (SEO) firm. If you are interested in having a professional provide you with an analysis of your site (with detailed instructions on how to fix the problem areas), request a free quote.


Posted in Marketing, Technology | 11 Comments

  • Steve

    Thanks for the great list Jake. You’re right, I did find a few gems, as I had no idea about being penalized because of shared IPs, and I apparently missed the memo on the 301 for www. How do I know which is better, to or from the www?

  • Jake

    Hi Steve, I’m glad you found the post useful. Re: the www. sub-domain, it is mostly a matter of preference. Some would argue that pointing to the www. sub-domain is the better choice as we enter a new realm of technology and people use sub-domains (blog., apps., mail., rss.) more often. In the future, we may find that the engines view a URL as more relevant if it specifies the www., but for now, it really doesn’t matter.

  • rich

    The next step in SEO was announced today by Yahoo, with their decision to index semantic web and microformats markup from around the web, which they will use to display more structured search results. Google is soon to follow, if they haven’t already behind the scenes. Any smart marketing company will be all over this, as the opportunity for helping publishers implement semantic web markup (seo+) is about to explode.

  • Tim Moore

    Thanks for the helpful guide Jake — a lot of that info is around but is locked up in SEOmoz or other SEO books — thanks for sharing. What terms are you guys ranking high for?

    Also, the semantic markup that Rich noted seems like a great opportunity to differentiate yourself in a saturated SEO market and even get your site ranked higher.

  • Jake

    Hi Rich, thanks for stopping by and sharing the breaking news. A site you might be interested in checking out is powerset.com. They have been working very hard to build a search engine that focuses on natural language queries, which is the goal of any search engine trying to take advantage of the new semantic technologies. It will be very exciting to watch as the search engines apply varying weights based on the seo+, and I’m curious whether sites implementing the semantic markup will get the “wikipedia” boost due to their increased organizational structure. I don’t want to be too long-winded here, though; look for a post on this in the near future.

  • Jake

    Hey Tim, glad we could help. All of the information can be found scattered about here, there, and everywhere. I just wanted to highlight the problem areas that plague all of our clients, and seem to be the most commonly overlooked by developers. As Steve demonstrated, people rarely think about their shared hosting and the ramifications.

    I would tell you what our high-ranking keywords are, but then I would be telling our competitors where they need to start optimizing their sites.

  • nmuncer

    Got that site URL this morning.
    It can help you know what’s on your shared hosting.
    Saw that there were 268 other sites on my shared account… no serious threat, but quite a packed server…

    http://www.myipneighbors.com/
    PS: that site is not mine, but it seemed relevant enough to me to share the link.
