Wednesday, September 27, 2006

11 Best Practices for URLs

 

I could have sworn that someone has already written a great post or forum thread on this topic, but I can't seem to find it (no matter how advanced my operators). I'm sure Mr. Malicoat has it in his bookmarks, but since blog posts are one of my personal systems for public bookmarking, here goes.

Eleven Guidelines to Successful URLs

  1. Describe Your Content
    An obvious URL is a great URL. If a user can look at the Address bar (or a pasted link) and make an accurate guess about the content of the page before ever reaching it, you've done your job. These URLs get pasted, shared, emailed, written down, and yes, even recognized by the engines.
  2. Keep it Short
    Remember always: brevity is a virtue. The shorter the URL, the easier it is to copy & paste, read over the phone, write on a business card, or use in a hundred other unorthodox fashions, all of which spell better usability & increased branding.
  3. Static is the Way & the Light
    Not to bring religion into this, but I can tell you with certainty that some of the engines absolutely DO treat static URLs differently than dynamic ones. And no human likes a URL where the big players are "?", "&", and "=".
  4. Descriptives are Better than Numbers
    If you're thinking of using 114/cat223/, go with /brand/adidas/ instead. Even if the descriptive isn't a keyword or particularly informative to an uninitiated user, it's far better to use words when possible. If nothing else, your team members will thank you for making it that much easier to ID problems in development and testing.
  5. Keywords Never Hurt
    If you know that you're going to be targeting a lot of competitive keyword phrases on your website for search traffic, you'll want every advantage you can get. Keywords are certainly one element of that strategy, so take the list from marketing, map it to the proper pages, and get to work. For pages created dynamically through a CMS, create the option of including keywords in the URL.
  6. Subdomains Aren't the Answer
    First off, never use multiple subdomains (i.e. siteexplorer.search.yahoo.com) - it's unnecessarily complex and lengthy. Secondly, consider that subdomains have the potential to be treated separately from the primary domain when it comes to passing link and trust value. In most cases where just a few subdomains are used and there's good interlinking, it won't hurt, but I wouldn't take the chance. To me, the benefits derived from reputation management (by flooding the SERPs with your subdomains) are minimal compared to the potential loss of link/trust juice. I also think that subdomain takeovers of SERPs are not something the search engines see as beneficial to their users and may shut down at any point. Luckily, if you're doing it now, you can always 301 to the main domain.
  7. Fewer Folders
    A URL should contain no unnecessary folders (or words or characters, for that matter) for the same reason that a man's pants should contain no unnecessary pleats. The extra fabric is useless and will reduce his likelihood of impressing potential mates.
  8. Hyphens Separate Best
    When creating URLs with multiple words in the format of a phrase, hyphens are best to separate the terms (i.e. /brands/dolce-and-gabbana/), followed (in order) by underscores (_), pluses (+), and nothing at all.
  9. Stick with Conventions
    If your site uses a single format throughout, don't consider making one section unique. Stick to your URL guidelines once established so users (and future developers) will have a clear idea of how content is organized into folders and pages. This can apply globally as well for sites that share platforms, brands, etc. Re-inventing the wheel in situations where reliance on convention makes everyone's tasks easier is folly.
  10. Don't be Case Sensitive
    Since URLs can accept both uppercase and lowercase characters, don't ever, ever allow any uppercase letters in your structure. If you have them now, 301 them to all-lowercase versions to help avoid confusion. If you have a lot of type-in traffic, you might even consider a 301 rule that sends any incorrect capitalization permutation to its rightful home.
  11. Don't Append Extraneous Data
    There's no point to having a URL exist in which removing characters generates the same content. You can be virtually assured that people on the web will figure it out, link to you in different fashions, confuse themselves, their readers, and the search engines (with duplicate content issues), and then complain about it.
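Guidelines 4, 5, 8, and 10 all come down to the same CMS task: turning a human-readable phrase into a clean slug. A minimal sketch in Python (the function name and regex are my own illustration, not any particular CMS's implementation):

```python
import re

def slugify(phrase):
    """Turn a product or category name into a URL slug: all lowercase,
    hyphen-separated, with no extraneous characters left behind."""
    slug = phrase.lower()                    # guideline 10: never uppercase
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # guideline 8: hyphens between words
    return slug.strip("-")                   # no leading or trailing junk

print(slugify("Marc Jacobs"))      # -> marc-jacobs
print(slugify("Dolce & Gabbana"))  # -> dolce-gabbana
```

Note that this sketch collapses "&" into a hyphen; a CMS might prefer to spell it out as "and" (producing dolce-and-gabbana, as in the example under guideline 8).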

Example Time
The following are some grievously heinous violators of the guidelines above:

  • http://www.target.com/gp/detail.html/602-9912342-3046240?_encoding=UTF8&frombrowse=1&asin=B000FN0KWA

    Target (which is powered by Amazon) doesn't describe its content, use keywords, or keep it short. That, and the horrifyingly useless data that can be removed from the URL without changing the content, make this URL downright ugly.
  • http://etsy.com/view_item.php?listing_id=477443&pic_id=2
    Despite being one of my favorite sites, Etsy has URLs that provide no descriptive information, use multiple dynamic parameters, and separate words with underscores.
  • http://maps.google.com/maps?f=q&hl=en&q=98115&ie=UTF8&z=12&om=1&iwloc=A
    Google should be ashamed - their guidelines for URLs practically set the tone for the recommendations above, but their maps feature is almost unusable due to inefficient, bloated URLs (when they must know that millions want to copy those URLs into emails).

These few below are doing a considerably better job but could still go the extra mile:

  • http://men.style.com/news/gadgets/092006
    It's almost there, and one could almost argue that the subdomain use here is justified for branding purposes. It's too bad they gave us so much data but then cut out keywords and descriptives right at the end.
  • http://www.nasa.gov/home/index.html?skipIntro=1
    NASA has uselessly appended dynamic parameters onto the page and added /home/index.html for no logical reason.
  • http://www.newyorkmetro.com/fashion/fashionshows/2007/spring/main/newyork/womenrunway/marcjacobs/
    They're trying to be descriptive, which is great, but not separating words and going 7 folders deep is really pushing it.
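Guidelines 10 and 11 can be enforced at the server with a single 301 rule. A hedged sketch of the canonicalization logic in Python - the EXTRANEOUS set is hypothetical; you'd fill it with the parameters you know don't change the content on your own site:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Hypothetical: parameters known not to affect the page's content.
EXTRANEOUS = {"skipIntro", "frombrowse", "_encoding"}

def canonicalize(url):
    """Lowercase the path (guideline 10) and drop parameters that don't
    change the content (guideline 11); 301 any other request to the result."""
    scheme, host, path, query, _ = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query) if k not in EXTRANEOUS]
    return urlunsplit((scheme, host, path.lower(), urlencode(kept), ""))

print(canonicalize("http://www.nasa.gov/Home/index.html?skipIntro=1"))
# -> http://www.nasa.gov/home/index.html
```

In production this mapping would live in a mod_rewrite rule or your CMS's request handling, issuing a 301 whenever the incoming URL differs from the canonical one; the sketch only shows the mapping itself.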

These last examples have done nearly everything right:

  • http://www.discoverohio.com/visitors/map.asp
    Brilliant - it's short, descriptive, static, and obvious.
  • http://web.mit.edu/is/usability/usability-guidelines.html
    Despite the subdomain, everything else is near perfect.
  • http://www.whitehouse.gov/history/presidents/jk35.html
    I'm letting the White House off the hook for not using "john-kennedy" as the page title because they've wisely also provided his number (the US' 35th President).

URLs seem like one of the simplest parts of SEO, but I find myself returning to this issue with nearly every client. Hopefully these guidelines can help a few folks make use of best practices before it becomes an issue down the road.

One last thing - do as I say, not as I do. SEOmoz herself is a ship sorely in need of righting. $10 says our search traffic jumps more than 20% once we switch to the new, friendlier URLs.


  posted by Smile Community @ 2:08 AM
