When you work for a rapidly growing web organization, you will usually have to maintain “cash cow” legacy systems while working on next generation systems in parallel.
While you develop the next generation system, the majority of revenue will still come from the cash cow legacy systems, and there will still be pressure to generate more revenue from them, even though the bulk of R&D expenditure is allotted to next-gen development.
This article offers some tips on how to keep increasing revenue from legacy systems by improving their performance.
- What’s in performance for Revenues?
Since bounce rate, average time on site, CTR and visit count all influence revenue, improving any of them can increase it.
Slow-loading pages are clearly reflected in the bounce rate graph, and in average time on site; improving your page speed can decrease the bounce rate and make users stay longer on your site.
Improving your page speed also affects your Google PageSpeed score, and the better you score, the more organic traffic you'll earn.
All of this leads to a simple fact: every extra second counts – in revenue!
- The fight for better performance
Performance issues can arise from the backend, the system architecture and more; today I will focus on the front-end side of the system.
So, how can you find out what is wrong?
Monitoring first – use GTmetrix, Google PageSpeed Insights, YSlow, New Relic or whatever monitoring system suits you best.
Compare several monitoring tools; the gap between their statistics will often teach you something new.
Page speed test tools will give you tips on what needs to be done and why, although not always in the most cost-effective order.
Extensive changes to a legacy system are often painful to execute, mainly due to the coupled nature of such systems.
- Untie the spaghetti incident
Untangling everything is not always possible, and in those cases it may be preferable to find workarounds instead. First, try to separate the piece (page, website) you are currently working on; if the dependencies are giving you a hard time, you can avoid fixing them by simply duplicating it.
Now that you can work on your code without fear of damaging existing logic, figure out which changes will be the most cost effective.
Consider the following cost effective changes:
- Remove all external resources from your page, refresh, then add back only what is still relevant – and get rid of all the resources that are there for “historical reasons”
- Re-evaluate your JS code for inline loading vs. onload events and change it accordingly.
- Search for synchronous JS code – sometimes you will find it on 3rd party services that might use async = false.
- Avoid CSS @import statements
- Look for inline JS and CSS snippets and merge them into an existing file.
- Merge multiple calls into single calls (Google Fonts, small JS and CSS files, etc.)
- Check how much of your CSS is unused. After many years of many developers working on the system, don’t be surprised to find ~90% of the code unused.
Removing unused CSS will not only give you faster download times but will also help your page render much faster.
- Tools like UnCSS, or the coverage view in Chrome’s developer tools, will help you find out which CSS rules are unused and get rid of them.
- Verify your images are as small as possible; optimize them when necessary.
- Limit base64 images in your CSS
- Enable GZIP compression
- Leverage browser caching – add cache headers to your images and other static resources.
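As a concrete example of merging multiple calls into one, the classic Google Fonts `css?family=` API accepts several families in a single request, separated by `|`. A minimal sketch (the helper name is mine, not from the original text):

```javascript
// Combine several Google Fonts request URLs into a single request.
// The classic fonts API accepts multiple families separated by "|",
// so N separate <link> tags can collapse into one HTTP request.
function mergeGoogleFontUrls(urls) {
  const families = [];
  for (const url of urls) {
    const family = new URL(url).searchParams.get('family');
    if (family) families.push(family);
  }
  // Re-encode spaces the way the fonts API expects ("Open Sans" -> "Open+Sans").
  return (
    'https://fonts.googleapis.com/css?family=' +
    families.map((f) => f.replace(/ /g, '+')).join('|')
  );
}

// Example: two separate font requests become one.
const merged = mergeGoogleFontUrls([
  'https://fonts.googleapis.com/css?family=Roboto:400,700',
  'https://fonts.googleapis.com/css?family=Open+Sans',
]);
console.log(merged);
// → https://fonts.googleapis.com/css?family=Roboto:400,700|Open+Sans
```

The same idea applies to small JS and CSS files: concatenate them at build or serve time so the browser pays for one round trip instead of many.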
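To make the unused-CSS hunt concrete, here is a rough sketch of the idea: diff the class selectors declared in a stylesheet against the classes actually referenced in the markup. This is a crude regex pass only – real tools (UnCSS, the Chrome coverage view) also handle dynamic classes, media queries and JS-added markup:

```javascript
// Rough sketch: list CSS class selectors that never appear in the HTML.
function findUnusedClasses(css, html) {
  // Class selectors declared in the stylesheet, e.g. ".promo-banner".
  const declared = new Set(
    (css.match(/\.[A-Za-z_-][\w-]*/g) || []).map((s) => s.slice(1))
  );
  // Classes actually referenced in class="..." attributes.
  const used = new Set();
  for (const m of html.matchAll(/class="([^"]*)"/g)) {
    m[1].split(/\s+/).forEach((c) => c && used.add(c));
  }
  return [...declared].filter((c) => !used.has(c));
}

const css = '.nav {color:red} .promo-banner {display:none} .footer {margin:0}';
const html = '<div class="nav"><span class="footer"></span></div>';
console.log(findUnusedClasses(css, html)); // → [ 'promo-banner' ]
```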
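The browser-caching bullet boils down to serving the right `Cache-Control` header per resource type. A minimal sketch of that decision (the extension list and max-age values are illustrative, not a recommendation):

```javascript
// Sketch: far-future cache headers for static assets (images, CSS, JS),
// no-cache for everything else, so browsers revisit pages but reuse
// heavy resources across visits.
const LONG_LIVED = new Set(['.png', '.jpg', '.gif', '.webp', '.css', '.js', '.woff2']);

function cacheHeadersFor(path) {
  const ext = path.slice(path.lastIndexOf('.'));
  if (LONG_LIVED.has(ext)) {
    // One year; safe when file names are versioned (e.g. app.3f9a.css).
    return { 'Cache-Control': 'public, max-age=31536000, immutable' };
  }
  return { 'Cache-Control': 'no-cache' };
}

console.log(cacheHeadersFor('/img/logo.png'));
// → { 'Cache-Control': 'public, max-age=31536000, immutable' }
```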
All these changes are easy to adopt on a single site or page and very cost effective.
Monitor the changes after releasing them back into production. Once you have proven your point, it will be much easier to request resources or approval for cross-system performance changes.
- What is the next step?
How do you easily add these improvements to each and every site?
Here I can recommend two different approaches:
- Extensive changes in the system – for example, building a resource manager that manages resources per page/site and centralizes all performance improvement processes (prioritizing resources, merging multiple files to reduce extra requests, etc.)
This will not decrease the number of requests, but it will reduce the weight of your requests to the minimum and will positively affect your page speed.
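A minimal sketch of the resource manager idea: each page registers the CSS/JS it needs, and the manager deduplicates and groups them so a page emits one merged stylesheet URL and one merged script URL instead of many tags. The class shape and the `/merge` endpoint are hypothetical, for illustration only:

```javascript
// Minimal resource manager: per-page registry of CSS/JS resources,
// with deduplication and one merged URL per resource type.
class ResourceManager {
  constructor() {
    this.pages = new Map();
  }
  register(page, type, file) {
    if (!this.pages.has(page)) this.pages.set(page, { css: [], js: [] });
    const bucket = this.pages.get(page)[type];
    if (!bucket.includes(file)) bucket.push(file); // ignore repeated registrations
  }
  // One URL per type: fewer requests, same total content.
  urlsFor(page) {
    const { css, js } = this.pages.get(page) || { css: [], js: [] };
    return {
      css: `/merge.css?files=${css.join(',')}`,
      js: `/merge.js?files=${js.join(',')}`,
    };
  }
}

const rm = new ResourceManager();
rm.register('home', 'css', 'base.css');
rm.register('home', 'css', 'home.css');
rm.register('home', 'css', 'base.css'); // duplicate, ignored
rm.register('home', 'js', 'app.js');
console.log(rm.urlsFor('home'));
// → { css: '/merge.css?files=base.css,home.css', js: '/merge.js?files=app.js' }
```

The server side of `/merge.css` would simply concatenate the listed files (with a cache in front), which is also a natural place to centralize minification and cache headers.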
We have implemented many of these strategies and are seeing excellent short-term performance gains. In the next post, I will summarize and quantify how the various optimizations have affected our business requirements and goals.