SEO Toolbox – Xinu

Just a quick post to let you know about a wonderful resource I’ve stumbled onto: Xinu is a brilliant mashup of valuable search-optimization insights such as backlinks, PageRank, social bookmark links, and pages indexed. The intuitive service links out to each platform so you can investigate your site’s details; the only thing missing is recommendations, and Xinu would be greatly improved by brief suggestions for raising each score.

Search Engine Friendly Design – Searchnomics Panel

Last week’s Searchnomics conference proved that WebGuild can deliver an exceptional event. Most impressive, and worth reiterating here, were the keynotes from Hitwise and comScore highlighting the power of search behavior as a window into your customer, a topic we discussed just a month ago. WebGuild’s approach to search marketing is a fresh blend of entry- and expert-level discussion of SEO, paid search, and social media optimization. I had the honor of sitting on a panel with Google’s Shashi Thakur and DoubleClick Performics’ Cam Balzar to discuss SEO priorities such as site accessibility, technical design, and submission to engines.

Here’s a recap of my priorities with links and comments to additional information:

  1. Make your site accessible – This is a popular topic on my blog, so stick around for more
  2. Push to the engines – Use Google Webmaster and Yahoo’s Search Submit tools as well as setting up sitemap autodiscovery
  3. Consider PageRank – Recall the wonderful insight from Shashi on popularity, relevance, and quality of content
  4. Optimize your page titles
  5. Improve page copy

  6. Then worry about:
     - Meta-descriptions
     - Use of image alt-text
     - Meta-keywords
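The on-page items in that list (titles, meta-descriptions, alt text) lend themselves to a quick automated check. Here's a minimal sketch using Python's standard-library HTML parser; the 65-character title guideline is an illustrative assumption, not an engine-published limit, and the helper names are my own.

```python
from html.parser import HTMLParser

class SEOAudit(HTMLParser):
    """Collects the page title, meta description, and count of images without alt text."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = None
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit(html):
    """Return a list of human-readable issues found in the given HTML."""
    parser = SEOAudit()
    parser.feed(html)
    issues = []
    if not parser.title.strip():
        issues.append("missing <title>")
    elif len(parser.title) > 65:  # illustrative length guideline only
        issues.append("title longer than ~65 characters")
    if parser.meta_description is None:
        issues.append("missing meta description")
    if parser.images_missing_alt:
        issues.append(f"{parser.images_missing_alt} image(s) without alt text")
    return issues
```

Run `audit()` over a page's source and you get back the gaps to fix, in roughly the priority order above.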

We also discussed how to go about obtaining corporate support and internal resources; I highlighted a process I used at HP, one IBM’s Mike Moran has encapsulated: train your web design team in search-friendly considerations, set standards for design, enforce compliance, and measure progress.

It was a tremendous pleasure meeting everyone, and I hope my presentation was of value to you. Let me know if you caught the conference, or drop me a line if you’d like to discuss the material.

Google’s Website Quality Guidelines

At the inaugural SMX Advanced conference in Seattle last week, Matt Cutts discussed Google’s webmaster guidelines but failed to address heavily sought detail on site-quality considerations and the violations that can result in your being punished by Google (whether your intentions are honorable or otherwise). He did, though, commit to delivering more insight, and Google added content yesterday to accomplish just that.

The guidelines themselves haven’t changed but are still worth your review, as they serve as an excellent SEO guide. What’s new is explanatory detail for each guideline.

If your site fails to meet these guidelines, it may be blocked and removed from Google. If that happens, modify your site to comply and resubmit it for inclusion.

Setting up Sitemap Autodiscovery

I realized, only shortly after posting the news of the new sitemap autodiscovery protocol, that there are few resources explaining the very simple steps needed to set up autodiscovery through your robots.txt file.

Assuming you’ve named the map “sitemap.xml,” simply add the following line to the robots.txt file located at the highest level of your web server:

Sitemap: https://www.YOURDOMAIN.com/sitemap.xml

It does not matter where in the file you place that code.
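For example (and this is an illustrative file, not a recommendation for your crawl rules), a robots.txt with the Sitemap directive added might look like:

```
User-agent: *
Disallow: /private/

Sitemap: https://www.YOURDOMAIN.com/sitemap.xml
```

The Sitemap line stands on its own; it isn't tied to any particular User-agent block.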
If you have more than one sitemap, you should have a sitemap index file and list its location instead. What is a sitemap index file? The sitemap protocol caps each file at 50,000 URLs, so if you have more pages than that to crawl, you break your map up into smaller lists; an index file points the bots to each sitemap.
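Generating the split sitemaps and their index is easy to script. Here's a minimal Python sketch; the 50,000-URL cap comes from the sitemap protocol, while the `sitemapN.xml` filenames and the function name are my own illustrative assumptions.

```python
from xml.sax.saxutils import escape

# The sitemap protocol limits each file to 50,000 URLs; bigger sites
# split their URL list across several files and publish a sitemap index.
MAX_URLS = 50000

def build_sitemaps(urls, base="https://www.YOURDOMAIN.com/", per_file=MAX_URLS):
    """Return (sitemap_files, index_xml): one XML document per chunk of
    URLs, plus the sitemap index that lists each file's location."""
    files = []
    for i in range(0, len(urls), per_file):
        chunk = urls[i:i + per_file]
        entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in chunk)
        files.append(
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>"
        )
    index_entries = "\n".join(
        f"  <sitemap><loc>{base}sitemap{n + 1}.xml</loc></sitemap>"
        for n in range(len(files))
    )
    index = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{index_entries}\n</sitemapindex>"
    )
    return files, index
```

Write each returned document to your web root as `sitemap1.xml`, `sitemap2.xml`, and so on, then point the robots.txt Sitemap line at the index file.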

Spiders will automatically find your sitemap when they hit your domain and read the robots.txt file. Technically, there is now no need to submit your sitemaps through Site Explorer or Webmaster Central, though there is no penalty for doing so. I keep my accounts with Yahoo! and Google live, as I appreciate the other data and shortcuts they make available, and let autodiscovery work in the background. Should you continue to submit, by the way, the little-known sitemap submission to Ask is through this URL:
https://submissions.ask.com/ping?sitemap=https://www.YOURDOMAIN.com/sitemap.xml
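The URL above shows the sitemap address pasted in as-is, which works for simple addresses; to be safe, the value of the `sitemap` query parameter should be URL-encoded. A small sketch (the function name is mine):

```python
from urllib.parse import quote

def ask_ping_url(sitemap_url):
    """Build the Ask.com ping URL with the sitemap address URL-encoded
    into the 'sitemap' query parameter."""
    return "https://submissions.ask.com/ping?sitemap=" + quote(sitemap_url, safe="")
```

Fetching the resulting URL (for example with `urllib.request.urlopen`) performs the ping.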

Sitemap Autodiscovery Streamlines Implementation

Ask’s blog today reiterated important news from SES New York that the major engines have agreed on a sitemap autodiscovery standard.

Sites can now specify the location of each sitemap from their robots.txt file, creating an open-format autodiscovery platform and eliminating the need to submit sitemaps to each search engine.

The discovery protocol has benefits for search engines as well as site publishers as sitemap standardization brings us one step closer to a comprehensive and accurate search experience.

With the now-standard protocols for development and improved distribution through autodiscovery, sitemaps just moved up the list of your SEO priorities, leaving you no excuse not to get one done.