Website Re-launch

Posted on 22 August 2015


When I launched this site in 2010, I had no vision of what it was or what it would become. It started off very simple, but it was the first of its kind: a website to clearly explain to a non-technical user what their browsing setup was.

Sure, there were websites which simply echoed back the browser's user agent, but try getting a "non-technical" person to understand a user agent when they're already not savvy enough to even know which browser they're using!

Back then there was no site which did this, so the site was born out of a simple need that I had, and over the years since then it grew and evolved. I would come up with new ideas or people would make feature requests, so after a while the site had features like the iFrame embed and the email feature.

There's also quite a complex system under the hood which stores, aggregates and counts all the different user agents that the homepage sees; this is how I know which user agents to add detection for next.
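To give a rough sense of the idea (a minimal sketch using only the standard library; the function and data here are hypothetical illustrations, not the site's actual code):

```python
# Minimal sketch of aggregating and counting user agent strings.
# The real system stores these persistently; this just shows the counting idea.
from collections import Counter

def top_user_agents(raw_agents, n=3):
    """Aggregate raw user agent strings and return the n most common."""
    counts = Counter(raw_agents)
    return counts.most_common(n)

seen = [
    "Mozilla/5.0 (Windows NT 6.1) Chrome/44.0",
    "Mozilla/5.0 (Windows NT 6.1) Chrome/44.0",
    "Mozilla/5.0 (Macintosh) Safari/600.1",
]
# The most frequent agents - the ones most worth adding detection for - come first.
print(top_user_agents(seen, n=2))
```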

But up until now, all of these features, additions and changes had occurred organically, and as such the codebase - for everything except the User Agent Parser library itself - was admittedly a bit of a mess.

It was a weird hybrid of Symfony and SilverStripe and a bunch of other glue in between. I had changed the look and feel a number of times over those years, and the HTML structure by this point was pretty average. The management systems I had built worked, but were clunky.

After all those years of being built without a vision, evolving and changing rapidly, the site had become a bit of a Frankenstein's monster.

Looking forward

By early 2015 the site was getting millions of hits and was continuing to grow. As I looked at my big list of features I wanted to add, it became clear that if I wanted to continue to expand and make the site as good as it could be, I would have to spend serious time and rebuild it so that all the new features had a strong foundation.

Adding to this was the fact that I had learnt and fallen in love with Python in the previous year, and having to keep going back to PHP to maintain the site was killing me.

I now knew what the site was and I knew where I wanted to take it. It had gone from being an accidental success into something I cared about and was passionate about making excellent.


I started by rebuilding the API server in Python, using Bottle. This was easy and quick work. I also deployed it on its own dedicated server to increase redundancy and fault tolerance.

The next task was rebuilding this site in Python and this was going to be a huge job. I chose the Django framework because it made developing a large and complex site easy and I really liked the way that the framework is put together.

Bigger, better

When I rebuilt the site, I generally kept the same visual look I had at the time - everything's still flat and modern - but I added lots of new features and content. It was a rebuild, but it was also an expansion.

I added more detection to the homepage, including things like local IP address detection, and I made individual pages for each piece of detection functionality.

I rebuilt the JavaScript that I use to do all the frontend detection, making it much more modular and reusable. It's on the roadmap to offer it as part of the API eventually.

I also spent a lot of time writing detailed guides answering a lot of the commonly asked questions: how to enable JavaScript, cookies, Flash and so on.

The interactive user agent parser was overhauled and I added a catalog of user agents to browse through.

"Cool URIs don't change"

As Tim Berners-Lee said: "Cool URIs don't change", and fortunately I was able to keep more than 90% of the URLs exactly the same. The Developers section was split a little differently, into Tools and Guides, but with a few 301 redirects everything worked out fine.

Most importantly, I was able to keep the actual URL for the iFrame embed feature identical, so the hundreds of sites which have already embedded it didn't need to change anything on their end.


Without delving deeper and/or getting more rambly, that's about it for the public-facing rebuild.

Behind the scenes, the management system continues to grow and evolve. I'm using tools like statsd and Grafana to monitor everything from server vitals to API calls to the types of user agents seen. OSSEC, Trac and Klaus are all vital parts of the system.

I'm now continuing to build a bigger and better management system so that I can keep everything running, continue to provide the best user agent parser, and keep working toward my original goal: to make a tool that helps other web developers with their clients.

Thanks for coming on the journey with me.

