In Defense of URL Shortening Services

Joshua Schachter’s argument against URL shorteners like TinyURL is making the rounds. Schachter’s the brains behind Delicious and he’s clearly a very smart guy, but on this topic he seems to be wrong on pretty much every point, beginning with his claim that the original need for URL shortening services has disappeared:

Their original purpose was to prevent cumbersome URLs from getting fragmented by broken email clients that felt the need to wrap everything to an 80 column screen. But it’s 2009 now, and this problem no longer exists.

This is, of course, nonsense. I see fragmented and broken URLs all the time in pretty much every mail client I use. In Thunderbird, my primary client, broken URLs are especially common in forwarded e-mails.

The worst problem is that shortening services add another layer of indirection to an already creaky system. A regular hyperlink implicates a browser, its DNS resolver, the publisher’s DNS server, and the publisher’s website. With a shortening service, you’re adding something that acts like a third DNS resolver, except one that is assembled out of unvetted PHP and MySQL, without the benevolent oversight of luminaries like Dan Kaminsky and St. Postel.

Since URL shorteners run on top of this same browser/DNS resolver/DNS server/publisher’s website stack, this seems like little more than crankiness on Schachter’s part.

The transit’s main problem with these systems is that a link that used to be transparent is now opaque and requires a lookup operation. From my past experience with Delicious, I know that a huge proportion of shortened links are just a disguise for spam, so examining the expanded URL is a necessary step. The transit has to hit every shortened link to get at the underlying link and hope that it doesn’t get throttled. It also has to log and store every redirect it ever sees.

To most non-techies, URLs are far from transparent. The 100-200+ character URLs spit out by many web sites these days are just as opaque and difficult for users to decipher as anything TinyURL generates. This is, after all, why phishing attacks are so relatively easy to pull off in the first place — URLs have become so opaque that it is difficult to determine from a quick inspection whether one is legitimate or not.

In fact, the simple rule to remember is not to click on any URL unless you’re certain it is coming from a trusted source (and to make sure you’re running anti-virus, AdBlock, NoScript, etc.). Relying on URL inspection to determine whether a URL is safe to click on is neither a good use of time nor likely to prove a reliable method.
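That said, checking where a short link points doesn’t require visiting the target at all. Here is a minimal sketch in Python’s standard library, assuming the shortener answers a HEAD request with an ordinary HTTP redirect (the function names here are my own, invented for illustration):

```python
import http.client
from urllib.parse import urlparse

REDIRECT_STATUSES = {301, 302, 303, 307, 308}

def destination_from_response(status, headers):
    """Pure helper: pull the long URL out of a redirect response.

    Returns None if the response wasn't a redirect at all.
    Note: header lookup here is case-sensitive for simplicity.
    """
    if status in REDIRECT_STATUSES:
        return headers.get("Location")
    return None

def expand(short_url):
    """Ask the shortener where a link points without fetching the
    target page: one HEAD request, redirect deliberately not followed."""
    parts = urlparse(short_url)
    conn_cls = (http.client.HTTPSConnection
                if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=10)
    try:
        conn.request("HEAD", parts.path or "/")
        resp = conn.getresponse()
        return destination_from_response(resp.status, dict(resp.getheaders()))
    finally:
        conn.close()
```

Because the redirect is never followed, the target site is never contacted, so this is a reasonably safe way to peek behind a suspicious short link before deciding whether to click.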

The publisher’s problems are milder. It’s possible that the redirection steps steals search juice — I don’t know how search engines handle these kinds of redirects. It certainly makes it harder to track down links to the published site if the publisher ever needs to reach their authors. And the publisher may lose information about the source of its traffic.

The major URL shorteners all use 301 redirects, which tell search engines to credit the long URL at the publisher’s website rather than the short URL for purposes like determining “search juice.”
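The wire format involved is trivial. The sketch below (hypothetical, not any particular shortener’s code) builds the raw response a shortener sends back; because the status is 301 Moved Permanently rather than 302 Found, a crawler treats the Location header as the canonical address and indexes that URL instead of the short one:

```python
def permanent_redirect(location):
    """Build a minimal HTTP/1.1 301 response such as a URL
    shortener might emit. The Location header carries the
    publisher's long URL, which is the address search engines
    are told to index and credit."""
    return (
        "HTTP/1.1 301 Moved Permanently\r\n"
        f"Location: {location}\r\n"
        "Content-Length: 0\r\n"
        "\r\n"
    )
```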

But the biggest burden falls on the clicker, the person who follows the links. The extra layer of indirection slows down browsing with additional DNS lookups and server hits. A new and potentially unreliable middleman now sits between the link and its destination. And the long-term archivability of the hyperlink now depends on the health of a third party. The shortener may decide a link is a Terms Of Service violation and delete it. If the shortener accidentally erases a database, forgets to renew its domain, or just disappears, the link will break. If a top-level domain changes its policy on commercial use, the link will break. If the shortener gets hacked, every link becomes a potential phishing attack.

Schachter seems to assume shortened URLs will have long-term use, but I suspect most people use them as throwaway, ephemeral links for posting long URLs to Twitter or in e-mails where we want to be certain the recipient’s e-mail system hasn’t mangled the URL into an unclickable mess.

Ultimately, though, URL shortening services are an effective solution to a real problem — the difficulty of sharing long URLs across systems — and one that publishers could address themselves, either by shortening their URLs or by running their own automatic shortening service. There’s no reason each page on a website couldn’t have a “Link to this” button offering an automatically generated shortened form of the canonical URL.
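The core of such a self-hosted service is tiny. A minimal sketch, assuming the publisher already stores each page under an integer database ID: encode the ID in base 62 to get a short slug for the “Link to this” URL, and decode the slug to look the page back up on each request. (`ALPHABET`, `encode_id`, and `decode_slug` are names I’ve invented for illustration.)

```python
# Digits, then lowercase, then uppercase: 62 URL-safe characters.
ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"

def encode_id(n):
    """Turn a non-negative integer page ID into a short base-62 slug."""
    if n == 0:
        return ALPHABET[0]
    chars = []
    while n:
        n, remainder = divmod(n, 62)
        chars.append(ALPHABET[remainder])
    return "".join(reversed(chars))

def decode_slug(slug):
    """Turn a base-62 slug back into the integer page ID."""
    n = 0
    for ch in slug:
        n = n * 62 + ALPHABET.index(ch)
    return n
```

Four base-62 characters cover more than 14 million pages, so even a large site’s “Link to this” URLs stay shorter than anything TinyURL would hand back, with no third party in the loop.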
