At the office we usually take some time to optimise our clients’ websites for Google PageSpeed. We’ve become quite proficient at this and know what the common pitfalls and best practices are. However, I noticed we usually only do this for the homepage 🤔.
That makes sense, as the homepage is usually the main entry point, but what about all the other pages? The order process pages or your contact form, for instance, are just as important, if not more so, to load quickly.
This, combined with my desire to launch a webservice at some point, led me to the idea of creating a webapp called Wulfdeck. It automatically crawls the domain you enter and finds all the (internal) pages it links to, up to 2 levels deep. Next, you select which pages you would like to monitor, and you’re good to go. Wulfdeck scans all the pages for their PageSpeed score and shows you your average, as well as per-page specifics.
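Conceptually, that crawl is a breadth-first traversal that stays on the same domain and stops two levels below the start URL. Here’s a minimal sketch of the idea — `fetch_links` is a hypothetical helper standing in for the real HTML fetching and link extraction, not Wulfdeck’s actual code:

```python
from collections import deque
from urllib.parse import urlparse

def crawl(start_url, fetch_links, max_depth=2):
    """Breadth-first crawl collecting internal pages up to max_depth
    levels below start_url. fetch_links(url) is a hypothetical helper
    that returns the absolute URLs linked from that page."""
    domain = urlparse(start_url).netloc
    seen = {start_url}
    queue = deque([(start_url, 0)])
    while queue:
        url, depth = queue.popleft()
        if depth == max_depth:
            continue  # don't follow links any deeper than this
        for link in fetch_links(url):
            # Only follow links on the same domain, and skip duplicates.
            if urlparse(link).netloc == domain and link not in seen:
                seen.add(link)
                queue.append((link, depth + 1))
    return seen
```

With a stubbed-out `fetch_links`, pages three levels down are never reached, which matches the 2-levels-deep behaviour described above.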
The nice thing, I find, is that it factors in the impact of the various PageSpeed rules, so it can tell you how many PageSpeed points you will gain by fixing a specific rule.
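Assuming the API reports a per-rule impact value (as older PageSpeed Insights responses did with `ruleImpact`), one simple way to turn those into a projected score is to distribute the missing points proportionally to each rule’s impact. This is just an illustrative model of the idea, not Google’s exact formula or Wulfdeck’s implementation:

```python
def projected_score(current_score, rule_impacts, fixed_rule):
    """Estimate the new PageSpeed score after fixing one rule.

    rule_impacts maps rule names to their reported impact values.
    Assumption: the points missing from a perfect 100 are spread across
    the rules in proportion to their impact.
    """
    total_impact = sum(rule_impacts.values())
    if total_impact == 0:
        return current_score  # nothing left to fix
    missing_points = 100 - current_score
    gain = missing_points * rule_impacts[fixed_rule] / total_impact
    return min(100.0, current_score + gain)
```

For example, with a score of 80 and two outstanding rules whose impacts are 2.0 and 6.0, fixing the higher-impact rule would recover three quarters of the missing 20 points.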
Once you’ve updated your page, you can re-check it to see whether you’ve reached the desired result.
Since launching it last week I’ve gotten some subscribers, and luckily the system works as expected, without unforeseen server issues or bugs, so that’s cool. I’ll see where it goes from here.
If you want to give it a spin, I’d love your feedback. Check it out here.
Happy coding (and optimising)!
Questions or comments?
As always, if you have any questions or comments, you can find me on Twitter.