{"id":49,"date":"2010-04-29T12:10:56","date_gmt":"2010-04-29T11:10:56","guid":{"rendered":"http:\/\/www.forestsoftware.co.uk\/blog\/?p=49"},"modified":"2024-05-03T20:39:50","modified_gmt":"2024-05-03T19:39:50","slug":"speeding-up-your-webpages-part-2","status":"publish","type":"post","link":"https:\/\/www.forestsoftware.co.uk\/blog\/2010\/04\/speeding-up-your-webpages-part-2\/","title":{"rendered":"Speeding up your webpages \u2013 Part 2"},"content":{"rendered":"<span class=\"span-reading-time rt-reading-time\" style=\"display: block;\"><span class=\"rt-label rt-prefix\">Reading Time: <\/span> <span class=\"rt-time\"> 3<\/span> <span class=\"rt-label rt-postfix\">minutes : <\/span><\/span><p>In our first article on this subject ( <a href=\"http:\/\/www.forestsoftware.co.uk\/blog\/2010\/04\/webpage-speed-how-to-squeeze-some-extra-speed-from-your-page\/\">how to squeeze some extra speed from your page<\/a> ) we looked at some of the things that anyone can do to reduce the size of your web pages and speed up the visitor's experience.\u00a0 This included removing unwanted spaces in the HTML and CSS files and optimising images.<!--more--><\/p>\n<p>This article shares how I achieved file-size savings of between 64% and 84% on my website pages and reduced the time taken to send CSS and JavaScript files.\u00a0 This improves the speed at which pages are displayed in visitors' browsers and reduces the bandwidth used by the server, all for very little extra work.<\/p>\n<p>First, you should know that this works on Linux servers running Apache (the .htaccess rules below are Apache-specific) and not on Microsoft IIS servers.<\/p>\n<p>The first thing that I did was to compress the CSS and JavaScript files.\u00a0 To do this I created a PHP file called csszip.php containing the following code :-<\/p>\n<p><code>&lt;?php<br \/>\nob_start(\"ob_gzhandler\");<br \/>\nif( isset($_REQUEST['file']) ){<br \/>\n$file = $_REQUEST['file'];<br \/>\nif( goodfile($file) ){<br \/>\n# end() needs a variable, not a function result<br \/>\n$parts = explode(\".\", $file);<br \/>\n$ext = end($parts);<br 
\/>\nswitch($ext){<br \/>\ncase 'css': $contenttype = 'css'; break;<br \/>\ncase 'js': $contenttype = 'javascript'; break;<br \/>\ndefault: die();<br \/>\n}<br \/>\nheader('Content-Type: text\/'.$contenttype.'; charset=UTF-8');<br \/>\nheader(\"Cache-Control: must-revalidate\");<br \/>\n$offset = 60 * 60;<br \/>\n$expire = \"Expires: \" . gmdate(\"D, d M Y H:i:s\", time() + $offset) . \" GMT\";<br \/>\nheader($expire);<br \/>\n$data = file_get_contents($file);<br \/>\n$data = compress($data);<br \/>\necho $data;<br \/>\n}<br \/>\n}<br \/>\nexit;<br \/>\nfunction goodfile($file){<br \/>\n$invalidChars = array(\"\\\\\", \"\\\"\", \";\", \"&gt;\", \"&lt;\", \".php\");<br \/>\n$file = str_replace($invalidChars, \"\", $file);<br \/>\nif( file_exists($file) ) return true;<br \/>\nreturn false;<br \/>\n}<br \/>\nfunction compress($buffer) {<br \/>\n$buffer = preg_replace('!\/\\*[^*]*\\*+([^\/][^*]*\\*+)*\/!', '', $buffer);<br \/>\n$buffer = str_replace(array(\"\\r\\n\", \"\\r\", \"\\n\", \"\\t\", '  ', '    '), '', $buffer);<br \/>\nreturn $buffer;<br \/>\n}<br \/>\n?&gt;<\/code><\/p>\n<p>Then, in the .htaccess file the following lines were added :-<\/p>\n<p><code># compress css and js files<br \/>\nRewriteRule ^(.*)\\.css$ \/csszip.php?file=$1.css [L]<br \/>\nRewriteRule ^(.*)\\.js$ \/csszip.php?file=$1.js [L]<\/code><\/p>\n<p>The next thing that I did was to put some PHP code at the top of every page (my server is already set up to allow PHP to be embedded in HTML pages) that compresses (gzips) the page contents if the visitor's browser can handle compression (all modern ones can).\u00a0 While I was at it, I realised that the server did not return the date and time of the last change to the page, so this was also added; when bots come and visit the page they can see that nothing has changed and do not need to download the page again &#8211; a 
further saving on time and bandwidth.<\/p>\n<p><code>&lt;?php<br \/>\n# if the browser accepts compression then gzip the output, if not proceed as normal<br \/>\nif (isset($_SERVER['HTTP_ACCEPT_ENCODING']) &amp;&amp; substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip')) ob_start(\"ob_gzhandler\"); else ob_start();<br \/>\n# The Last-Modified time is this page's file modification time<br \/>\n$mtime = filemtime($_SERVER['SCRIPT_FILENAME']);<br \/>\n$gmt_mtime = gmdate('D, d M Y H:i:s', $mtime) . ' GMT';<br \/>\nheader(\"Last-Modified: \" . $gmt_mtime);<br \/>\n?&gt;<\/code><\/p>\n<p>Finally the pages were checked in browsers and also using the gzip compression testing tool at <a href=\"http:\/\/www.whatsmyip.org\/http_compression\/\" target=\"_blank\" rel=\"noopener\">http:\/\/www.whatsmyip.org\/http_compression\/<\/a>.\u00a0 This tool shows the savings in size, for example :-<\/p>\n<ul>\n<li>The home page dropped by 64% to 5.72KB<\/li>\n<li>The HTML colour code page dropped by 83.96% in size<\/li>\n<li>The office types page dropped by 65.2%<\/li>\n<li>The starting page of our business directory dropped by 75.56%<\/li>\n<\/ul>\n<p>This is not something that you should attempt if you are not familiar with PHP code and changes to .htaccess files &#8211; and I would not recommend taking code from just any website and adding it to your own site unless you understand what it is doing.\u00a0 That said, Google Webmaster Tools is already reporting an improvement in page download times, and since page load time is now a small ranking factor on Google, any improvement is worth having if it helps keep the site in the rankings for our target phrases.<\/p>\n","protected":false},"excerpt":{"rendered":"<p><span class=\"span-reading-time rt-reading-time\" style=\"display: block;\"><span class=\"rt-label rt-prefix\">Reading Time: <\/span> <span class=\"rt-time\"> 3<\/span> <span class=\"rt-label rt-postfix\">minutes : <\/span><\/span>In our first 
article on this subject ( how to squeeze some extra speed from your page ) we looked at some of the things that anyone can do to reduce the size of your web pages and speed up the visitor's experience.\u00a0 This included removing unwanted spaces in the HTML and CSS files and optimising images [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[5,3],"tags":[22],"class_list":["post-49","post","type-post","status-publish","format-standard","hentry","category-computers","category-seo","tag-website-development"],"_links":{"self":[{"href":"https:\/\/www.forestsoftware.co.uk\/blog\/wp-json\/wp\/v2\/posts\/49","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.forestsoftware.co.uk\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.forestsoftware.co.uk\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.forestsoftware.co.uk\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.forestsoftware.co.uk\/blog\/wp-json\/wp\/v2\/comments?post=49"}],"version-history":[{"count":0,"href":"https:\/\/www.forestsoftware.co.uk\/blog\/wp-json\/wp\/v2\/posts\/49\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.forestsoftware.co.uk\/blog\/wp-json\/wp\/v2\/media?parent=49"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.forestsoftware.co.uk\/blog\/wp-json\/wp\/v2\/categories?post=49"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.forestsoftware.co.uk\/blog\/wp-json\/wp\/v2\/tags?post=49"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}