I’m still intrigued by this subject and can’t stop coming up with more questions and thoughts, even after writing my previous post.
I took my blog URL and looked at the results from a few URL-shortening services.
Origin URL: https://usingit.wordpress.com/
Here are the results:
http://www.shortesturl.com/?go=e74c2e10f0 – not so useful when you get a longer URL than the original!
http://tinyurl.com/5j3t8k – most popular
http://www.url.gen.tr/kd – this one is interesting – does it give you some control over the hash key?
http://snurl.com/2glnl – used by Twhirl
http://is.gd/vX3 – the shortest available today
As you can see, the length of the domain name is key for a short URL. Most services use around 1–5 characters (A–Z and 0–9) as the hash key for the long URLs (that should be enough for a while).
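Just as a back-of-the-envelope check (assuming a case-insensitive alphabet of 36 symbols, and the common trick of encoding an incrementing numeric ID rather than a true hash), a few lines of Python show how far 1–5 characters go:

```python
# Back-of-the-envelope: how many URLs fit in 1-5 base-36 characters?
ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyz"  # 36 symbols

def keyspace(max_len):
    # Total number of keys of length 1 through max_len
    return sum(len(ALPHABET) ** n for n in range(1, max_len + 1))

def encode(n):
    # Map a numeric ID to a short key by writing it in base 36
    digits = []
    while True:
        n, r = divmod(n, len(ALPHABET))
        digits.append(ALPHABET[r])
        if n == 0:
            break
    return "".join(reversed(digits))

print(keyspace(5))      # 62193780 possible keys up to 5 characters
print(encode(1234567))  # 'qglj' - ID 1,234,567 still fits in 4 characters
```

So roughly 62 million links fit in five characters or fewer, which indeed should last these services a while.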
Google could use their recently acquired, super-short domain name for shortening URLs: www.g.cn
They are using it for redirecting to the Chinese localized version of their search page. I can only guess that it makes more money than tiny links service.
There is also the shortest domain that I know about: www.com, but it does not beat is.gd – how did they get this domain registered?
If you are looking for information about short domain names, look at this very interesting post.
Here is a list of Free Short URL Redirection Services.
Another point: if I’m right, then the hyphen (-) is also a valid character in URL names. I’m not sure, but I think that the URL cannot start or end with the dash. What is the reason that these services don’t use this extra character (37 is better than 36)? Is it too complicated because of the start/end constraints? Is it not worth it? I’m just curious.
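Out of curiosity, the gain is easy to estimate (assuming the rule really is "no leading or trailing hyphen"): only the inner positions get the 37th symbol, so an n-character key gives 36 · 37^(n−2) · 36 combinations instead of 36^n. A tiny sketch:

```python
# How much does allowing '-' (but never first or last) actually buy?
def plain(n):
    # A-Z and 0-9 only: 36 choices per position
    return 36 ** n

def with_hyphen(n):
    # First and last positions: 36 choices; inner positions: 37
    if n < 2:
        return 36
    return 36 * 36 * 37 ** (n - 2)

for n in range(2, 6):
    gain = with_hyphen(n) / plain(n) - 1
    print(n, f"+{gain:.1%}")  # roughly +2.8% per inner position
```

The keyspace grows by only about 2.8% for each inner position (and not at all for 2-character keys), so my guess is the edge-case handling simply isn’t worth a few percent.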
Final thoughts: do you think that whatever is inside each of these services’ databases is more valuable than what a crawler can come up with? Think about it:
- These URLs are picked by humans (lots of them in Twitter and Plurk).
- They can keep statistics for how many times each URL was requested.
- They can build a search engine using these links without the need for building a sophisticated crawler for discovering new URLs.
- They can see what is interlinked inside this tiny linksphere.
Am I making a big deal out of a tiny subject:)?