Website performance: maintaining the right standards every time

The human factor in website management is still a key consideration, carrying big risks that aren’t always predictable. But they’re easy to counteract: all that’s needed is an automatic layer that sits on top of the site without affecting its underlying structure, thereby guaranteeing the same standards every time.

If you’ve got a conversion problem with your advertising campaign or your marketing strategy is coming up short, you’re not alone. There’s a famous quote from American tycoon John Wanamaker, who died in the 1920s, that advertising and marketing experts still like to use today. In essence, he said: “I know that half of what we spend on advertising is wasted; the problem is knowing which half.”

And why is that pearl of wisdom still relevant more than half a century later? Because publicists, marketing experts, and people who place ads continue to chase that elusive missing half. That’s still largely true of billboard and TV advertising, where there are no fixed reference points; online advertising, by contrast, is in a much stronger position to discover what works and what doesn’t, because there are so many more tools and analytics available. Not only that: there’s also the possibility of fixing things that don’t work as you go along. There can be many reasons why a particular campaign isn’t delivering. People often point the finger at the creative content – which of course plays a part – but more often the problem lies in how efficiently content is delivered on websites. And there, the devil is in the detail.

HUMAN ERROR: A COMMON ENEMY

One example? A single photo that’s wrong – too heavy, say – is enough to significantly bog down even an optimised site that normally loads quickly, to the point that the hoped-for conversion rates aren’t achieved. The effect is not only a degraded user experience – what should be an enjoyable journey turns into a tedious wait for pages to load – but also a hit in terms of Google ranking, which is well known for rewarding speed as well as the quality of the actual content.
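If you want to check whether a single heavy image is the culprit, the browser’s own Resource Timing API can flag it. Below is a minimal sketch in TypeScript, meant to be run from the devtools console on a loaded page; the 500 KB budget is an arbitrary assumption for illustration, and cross-origin images only report a transfer size if the server sends a Timing-Allow-Origin header.

```ts
// Minimal sketch: list image resources whose transfer size exceeds a chosen
// budget. The 500 KB threshold is an illustrative assumption, not a standard.
const IMAGE_BUDGET_BYTES = 500 * 1024;

const heavyImages = performance
  .getEntriesByType("resource")
  .filter((entry): entry is PerformanceResourceTiming =>
    entry instanceof PerformanceResourceTiming && entry.initiatorType === "img"
  )
  .filter((img) => img.transferSize > IMAGE_BUDGET_BYTES);

for (const img of heavyImages) {
  console.warn(
    `Heavy image: ${img.name} – ${(img.transferSize / 1024).toFixed(0)} KB, ` +
      `loaded in ${img.duration.toFixed(0)} ms`
  );
}
```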

A number of studies, including some carried out by Google themselves, underline how just a single second of delay can lead to losses of up to 15% in conversions. That’s a real disaster for e-commerce sites or the multi-country websites of big multinationals – exactly the type of sites most commonly affected by these problems. It’s easy to see why. The more complex a website’s functionality and content, the more intervention and maintenance it needs, which in turn increases the scope for human error. That’s especially the case for multinational websites, which are often managed and standardised centrally, making them susceptible to human factors and all the potential oversights that come as a consequence. That’s how a single photograph that has perhaps been optimised for faster loading in the UK, for example, might be left at the wrong size in France – either crashing the site (in the worst case) or slowing it down dramatically. And that’s only one example of a small detail that can be easily overlooked, rendering the usual checks and controls insufficient.

THE RIGHT LAYER CAN SOLVE THE PROBLEM

Worse still, the human factor can undo the good work of all the systems and platforms that in recent years have really helped to optimise website performance. With the cloud, or servers on Amazon Web Services, for example, the connection between the pages of a website and their end users has become a lot closer, with concrete effects on final results as well as on the speed and quality of loading. But each page, even though it’s a lot closer to the user, still has to be assembled and rendered in every computer’s web browser once the server has sent it there. Content Delivery Networks – such as those from Amazon and Akamai, for example – have had a big impact in that respect, but they still rely on the server to configure the pages.
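To see that split in practice, the browser’s Navigation Timing API can give a rough breakdown of where the time goes: how long it takes to get the first byte back from the server (where a CDN helps) versus how long the browser itself then spends turning the response into a page. A minimal TypeScript sketch, again for the devtools console, and only a rough approximation:

```ts
// Minimal sketch: split page load into server/network time and browser-side
// time. All figures are rough and in milliseconds.
const [nav] = performance.getEntriesByType(
  "navigation"
) as PerformanceNavigationTiming[];

if (nav) {
  const timeToFirstByte = nav.responseStart - nav.requestStart; // server + network
  const htmlDownload = nav.responseEnd - nav.responseStart; // HTML transfer
  const browserWork = nav.domContentLoadedEventEnd - nav.responseEnd; // parsing, scripts

  console.table({
    "Time to first byte (ms)": timeToFirstByte.toFixed(0),
    "HTML download (ms)": htmlDownload.toFixed(0),
    "Browser-side work (ms)": browserWork.toFixed(0),
  });
}
```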

All these systems, for one reason or another, cannot compensate for an error in the loading of content further up the line. In those cases, only a layer sitting between the server and the user – such as iSmartFrame – can neutralise the mistakes and make them unnoticeable in terms of how the site functions and performs.

The job of iSmartFrame – a simple layer that can be installed without intervening in the structure of the actual website, and without affecting the use of Content Delivery Networks or Amazon Web Services – is to deliver to each web browser a page that is already perfectly reformatted and optimised. In the case of oversized photos, the layer takes care of their loading and optimisation, meaning that the browser brings up pages faster, every time and anywhere. And that obviously has positive effects in terms of user experience and Google ranking, as a result of eliminating human error.
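To make the idea of such a layer concrete, here is a deliberately simplified sketch of the general pattern: a thin reverse proxy that sits between the origin server and the browser and adjusts pages in flight, without touching the origin site. This is not iSmartFrame’s actual implementation – the origin URL, port, and the single rewrite rule (adding lazy loading to image tags) are illustrative assumptions only.

```ts
// A generic sketch of an in-flight optimisation layer (NOT iSmartFrame's
// implementation): a reverse proxy that rewrites HTML pages before they
// reach the browser. Requires Node 18+ for the built-in fetch.
import { createServer } from "node:http";

const ORIGIN = "https://example.com"; // hypothetical origin site
const PORT = 8080;

// Add lazy loading to <img> tags that don't already declare it, so one
// oversized photo is less likely to hold up the rest of the page.
function rewriteHtml(html: string): string {
  return html.replace(/<img\b(?![^>]*\bloading=)/gi, '<img loading="lazy" ');
}

createServer(async (req, res) => {
  try {
    // Fetch the requested page from the origin, untouched.
    const upstream = await fetch(ORIGIN + (req.url ?? "/"));
    const contentType = upstream.headers.get("content-type") ?? "";

    if (contentType.includes("text/html")) {
      // Rewrite HTML responses on the way through.
      const body = rewriteHtml(await upstream.text());
      res.writeHead(upstream.status, { "content-type": contentType });
      res.end(body);
    } else {
      // Pass everything else through as-is.
      const body = Buffer.from(await upstream.arrayBuffer());
      res.writeHead(upstream.status, { "content-type": contentType });
      res.end(body);
    }
  } catch {
    res.writeHead(502);
    res.end("Upstream error");
  }
}).listen(PORT, () => console.log(`Optimising layer listening on :${PORT}`));
```

A real layer of this kind would of course do far more – image resizing, caching, compression – but the architectural point is the same: the fix is applied between server and browser, so the origin site never has to change.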

Complex sites in particular are always looking for standardisation and automation to get the most out of major marketing and advertising investments. Which, as we all know, are half wasted anyway. But which half?