Could there be an association of URL-shortening services?

The creator of a new URL-shortening service, urlborg, recently wrote to me to announce some new features. There are, at this point, quite a few of these URL-shortening services. I’m sure each has differentiating features, but before I explore the differences I’d like to see a new and important kind of commonality.

Each of these services invites you to invest in creating a set of short URLs that point to your own longer URLs. None of them provides any guarantees about the future availability of those short URLs. I’d love to see these services form an association that does make such guarantees.

There can never be a simple solution to the problem of linkrot. We don’t own domain names, we only rent them. As content management systems evolve, so often do the URLs they project onto the web. Even if an association of URL-shortening services guaranteed the continuity of short URLs, the long URLs behind them would remain as fragile as they are today.

Still, it would be an inspiring and forward-looking experiment to try. What if TinyURL, snurl, urlborg, and the others were members of an association that would inherit the URL mappings of any member that ceased to honor them? Given such a guarantee, I’d be much more willing to invest in the creation of URL mappings with any of the members, and to explore the features that differentiate them.
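To make the idea concrete, here is a minimal sketch — all names hypothetical, just one way to model the guarantee — of an association that inherits the URL mappings of any member that ceases to honor them:

```python
class Association:
    """Hypothetical registry in which member shorteners pool their
    short-code -> long-URL tables, and the association absorbs the
    table of any member that ceases to honor its mappings."""

    def __init__(self):
        self.members = {}    # member name -> live mapping table
        self.inherited = {}  # mappings absorbed from defunct members

    def register(self, name, table):
        self.members[name] = table

    def retire(self, name):
        # A member shuts down: fold its table into the shared store
        # so its short URLs keep resolving.
        self.inherited.update(self.members.pop(name, {}))

    def resolve(self, code):
        # Prefer a live member's answer, fall back to inherited mappings.
        for table in self.members.values():
            if code in table:
                return table[code]
        return self.inherited.get(code)
```

Under this sketch, retiring a member changes who answers for its codes, but not what the codes resolve to — which is exactly the continuity the association would promise.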


20 thoughts on “Could there be an association of URL-shortening services?”

  1. I’m not really sure that such guarantees are really a necessary precondition for using link-shortening services. As I see it, the main purpose of using such services is for posting/sending URLs in plaintext when you cannot control the HTML [a] tag. For instance, emailing URLs or posting them in comments. In most such cases, the persistent availability of these URLs is of diminishing benefit. I’m unlikely to save a tinyURL as a bookmark, and conversely, I’m unlikely to treat my emailed URLs as a searchable database of links.

    For content publishers who control their HTML, like bloggers, journalists, delicious, etc., it’s far easier to either create short descriptive text for longer hrefs within the [a] tag, or to save their own hash of the URL in a database, ensuring long-term control over external links. With the latter solution, they can even have an automaton which checks the validity of each link in the database at regular intervals.

    So I’m perfectly happy with tinyURL as it is today, and don’t assume or desire that the URLs it supplies are valid for much more than a month after I email them to friends.
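The publisher-side approach in the comment above — keeping your own hash of each URL in a database you control — could be sketched like this (all names hypothetical; a real system would persist the table and add the periodic link checker the comment mentions):

```python
import hashlib

def short_code(long_url, length=7):
    """Derive a stable short code from the URL's SHA-256 digest,
    so the same long URL always maps to the same code."""
    return hashlib.sha256(long_url.encode("utf-8")).hexdigest()[:length]

class LinkTable:
    """A publisher's own short-code -> long-URL mapping, kept under
    the publisher's control rather than a third party's."""

    def __init__(self):
        self.links = {}

    def add(self, long_url):
        code = short_code(long_url)
        self.links[code] = long_url
        return code

    def resolve(self, code):
        return self.links.get(code)
```

Because the codes are derived from the publisher's own database, the mappings survive even if every third-party shortener disappears.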


  2. Kirby, what about the thousands of short links in the domain? As a blogger and site owner, I want all of those links to my site to keep working for much longer than a month.

  3. There seem to be two issues related to link rot. First, there is the question of the endurance of short links. But there is also the endurance of the target of the short link.

    This project is about the second question and provides a fall-back permalink *and* cached content if a page vanishes:

    It is not all-purpose, but it is an interesting scheme for scholarly work involving web-based source materials.

    As an author of web pages, I don’t feel that I have anything to say about how others refer to my materials. However, I think the short-url folk will have a problem when they recycle short-urls to inappropriate material.

    This sounds like a problem that the constraints of short urls cannot overcome.

  4. “As an author of web pages, I don’t feel that I have anything to say about how others refer to my materials.”

    I should have said “anything to say about how people choose links to my material.” I do have something to say about citation of my work (i.e., when offering Creative Commons Attribution licenses), but I do not intend to police that in any vigorous way.

  5. I recently (yesterday) tweeted about short URLs in relation to Twitter.

    The problem isn’t link-rot, the problem is spam – the inability to *see what you are clicking*.

    A federation of short-URL services is a wonderful idea if it exists for reasons such as implementing blacklists to reduce spam and the hazards that come with “blind-clicking”.

    But link-rot? No.

    As a blogger, webmaster, site owner, URL renter, or whatever you want to call yourself, your primary focus should be getting people to visit *your site* regularly – via word of mouth, bookmarks, bookmarking services, or random search. Emphasis on *your site*, though, not short URLs. You don’t want people bookmarking short URLs – which is what would happen if you gained any kind of control over them, such as “short-URL mapping”.

    They should be exactly what they are – throw-away, disposable, and insignificant.

    My ultimate argument with this is that mapping would just give spammers even more control over their spam.

    Again, that’s not to dismiss the idea of a short-URL federation. I think it’s a wonderful idea, just for different reasons than yours.

    I’d like to see blacklists, and bookmarking tools which automatically translate an “accidentally” bookmarked short URL to its proper URL.

    And the sites and services that rely heavily on short URLs need some sort of accountability for the way they handle them – Twitter being the prime example. There’s so much blind-clicking going on on Twitter it’s ridiculous. They should either create their own short-URL service or fix the way they handle external short URLs.

    They would argue that it’s not their responsibility to do so… but when it affects nearly their entire user-base, I would suggest otherwise.

  6. Interesting.

    The idea I have had regarding this issue is simply to nudge Google and the other search-engine companies to implement new logic for short-URL handling. If a short URL no longer points properly to the real URL, I would like to enter the short URL into a search engine like Google and have the first ‘sticky’ result be the real URL.

    Using search-engine API services, this could also be integrated as an anti-spam measure – for example, surfacing the real URL and a web screenshot within an application.

    So: a way to leverage the monster search engines alongside the URL-shortening services so that most concerns are addressed. Together with the URL-shortening services’ own measures, this should suffice.
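That fallback could be sketched as follows; the in-memory mapping and the query construction are hypothetical stand-ins for a real search-engine API:

```python
from urllib.parse import quote

def resolve_or_search(short_url, mapping,
                      search_base="https://www.google.com/search?q="):
    """Return the long URL recorded for a short URL; when the mapping
    no longer knows it, fall back to a search-engine query for the
    short URL, where the first 'sticky' result would be the real URL."""
    long_url = mapping.get(short_url)
    if long_url is not None:
        return long_url
    # Percent-encode the whole short URL so it is safe as a query value.
    return search_base + quote(short_url, safe="")
```

The real integration would call a search API rather than build a query URL, but the control flow — try the mapping, then hand the dead short URL to the engine — is the same.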

  7. @Panayotis

    I agree that Twitter is an interesting case in point. While tweets are generally submitted in plaintext, they’re frequently viewed in HTML.

    I would like to see Twitter implement its own auto-linking service, which both provides shortlinks and shows proof of the target domain. So this URL in a tweet:

    Could become: []

    Using third-party shortlinks seems to invite both link-rot and link fraud.


  8. Tangentially, I notice some tinyurls whose URL indicates the topic. These are clearly not like the majority, which appear to be randomly generated.

    Here’s one:

    David Berlind has a bunch of them here:

    I’ve hunted around trying to find out how to generate the “prettier” and more intuitive ones but can’t find out how it’s done. Anybody know?

  9. urlBorg allows site owners to define their own short domain. It works like this.

    My domain is I also own my “short domain”. I set up and let urlBorg know about it.

    Now, anyone who goes to urlBorg and enters a link in my domain (ex ) will get a short link like

    1. I could make a backup of all the short links in my short domain, and if urlBorg disappears, I could easily recreate them (after all, I own the short domain)
    2. Users know that all short links like…/ point to my domain so “blind-clicking” is reduced.

    Regarding blind-clicking, any user who has an account with urlBorg can choose to see a “preview page” (a page showing the long URL) whenever they click a short one.
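The backup point (1. above) could be as simple as exporting the short-domain table to a file the owner controls — a sketch, since urlBorg's actual export format, if it has one, isn't described here:

```python
import json

def export_links(links, path):
    """Save a short-code -> long-URL table as JSON so the short-domain
    owner can recreate the mappings if the service disappears."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump(links, f, indent=2, sort_keys=True)

def import_links(path):
    """Reload a previously exported table, e.g. to seed a replacement
    redirector on the same short domain."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)
```

Because the owner also owns the short domain, pointing it at any redirector seeded from this file restores every short link intact.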

  10. I dunno. I think URL shorteners are yet another level of indirection applied to compensate for misfeatures of email clients. I think it makes the world more fragile for very little value.

  11. I agree with Joshua. Me, I never type URLs (maybe on mobile, but I hope that will change) – I just paste them, whether they’re 25 or 225 characters. I’m used to that, and it’s easier and safer against typos. I know I’m not an average user, but still…
    And for me – since we want “machine-readable” SW while link shortening is for humans – this is the wrong direction. Why build another scheme?
    I think “pretty permalinks” do the job.

  12. The problems of blind-clicking are real, as this guy so aptly demonstrates. URLs should be human readable, because it’s humans who are clicking on them.

    Webcite/Webcitation do provide caching, but they also provide a single point of failure. I’d rather have a heterogeneous rotting collection of links than to depend on one domain, no matter how dependable.
