Last updated on May 3rd, 2024 at 08:39 pm
In our first article on this subject ( how to squeeze some extra speed from your page ) we looked at some of the things that anyone can do to reduce the size of their web pages and speed up the visitor's experience, such as removing unwanted spaces in the HTML and CSS files and optimising images.
This article shares how I achieved a minimum 68% saving in file size on my website pages and reduced the time taken to send CSS and JavaScript files. This speeds up the display of pages in visitors' browsers and reduces the bandwidth used by the server, all for very little extra work.
First, you should know that this approach relies on Apache's .htaccess files, so it works on typical Linux servers but not on Microsoft IIS servers.
The first thing that I did was to compress the CSS and JavaScript files. To do this I created a PHP file called csszip.php containing the following code :-
<?php
ob_start("ob_gzhandler"); // compress the output if the browser supports gzip
if( isset($_REQUEST['file']) ){
    $file = $_REQUEST['file'];
    if( goodfile($file) ){
        // pick the Content-Type from the file extension
        $ext = pathinfo($file, PATHINFO_EXTENSION);
        switch($ext){
            case 'css': $contenttype = 'css'; break;
            case 'js': $contenttype = 'javascript'; break;
            default: die();
        }
        header('Content-Type: text/'.$contenttype.'; charset=UTF-8');
        header('Cache-Control: must-revalidate');
        // allow the browser to cache the file for one hour
        $offset = 60 * 60;
        $expire = 'Expires: '.gmdate('D, d M Y H:i:s', time() + $offset).' GMT';
        header($expire);
        $data = file_get_contents($file);
        $data = compress($data);
        echo $data;
    }
}
exit;

// reject anything that is not an existing, plain css/js file name
function goodfile($file){
    $invalidChars = array("\\", "\"", ";", ">", "<", "..", ".php");
    // refuse the request outright if the name contains any of the above
    if( $file != str_replace($invalidChars, "", $file) ) return false;
    return file_exists($file);
}

// strip comments and surplus whitespace to shrink the file
function compress($buffer) {
    // remove /* ... */ comments
    $buffer = preg_replace('!/\*[^*]*\*+([^/][^*]*\*+)*/!', '', $buffer);
    // remove line breaks, tabs and runs of multiple spaces
    $buffer = str_replace(array("\r\n", "\r", "\n", "\t", '    ', '   ', '  '), '', $buffer);
    return $buffer;
}
?>
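To illustrate what the compress() function does, here is a small made-up example (the CSS sample is purely hypothetical) :-
<?php
// assumes the compress() function from csszip.php above is in scope
$css = "body {\n    color: red;    /* default text colour */\n}\n";
echo compress($css);
// prints: body {color: red;}
?>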
Then, in the .htaccess file the following lines were added :-
# send requests for css and js files through csszip.php
RewriteRule ^(.*)\.css$ /csszip.php?file=$1.css [L]
RewriteRule ^(.*)\.js$ /csszip.php?file=$1.js [L]
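As an aside, if your hosting has Apache's mod_deflate module enabled (an assumption on my part, so check with your host), the gzipping can be done by Apache itself with a couple of lines in the same .htaccess file, although this does not strip comments and whitespace as the compress() function above does :-
# alternative: have Apache gzip css and js itself (needs mod_deflate)
<IfModule mod_deflate.c>
AddOutputFilterByType DEFLATE text/css application/javascript
</IfModule>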
The next thing that I did was to put some PHP code at the top of every page (my server is already set up to allow PHP to be embedded in HTML pages) that compresses (gzips) the page contents if the visitor's browser can handle compression (all modern ones can). While I was at it, I realised that the server did not return the date and time of the last change to the page, so this was also added: when bots visit the page they can see that nothing has changed and do not need to download it again, a further saving in time and bandwidth.
<?php
# if the browser accepts compression then gzip the output, otherwise proceed as normal
if (isset($_SERVER['HTTP_ACCEPT_ENCODING']) && substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip')) {
    ob_start("ob_gzhandler");
} else {
    ob_start();
}
# send a Last-Modified header based on this page's file modification time
$mtime = filemtime($_SERVER['SCRIPT_FILENAME']);
$gmt_mtime = gmdate('D, d M Y H:i:s', $mtime) . ' GMT';
header("Last-Modified: " . $gmt_mtime);
?>
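One thing worth noting is that the snippet above only sends the Last-Modified header; for a returning visitor or bot to skip the download entirely, the server also has to answer the conditional request they send back with a 304 status. The original code does not do this, but here is a minimal sketch of how it could be added just below the header code :-
<?php
// sketch (my addition, not part of the original code): answer
// If-Modified-Since requests with 304 Not Modified and an empty body
$mtime = filemtime($_SERVER['SCRIPT_FILENAME']);
if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
    strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $mtime) {
    header('HTTP/1.1 304 Not Modified');
    exit; // the browser or bot uses its cached copy instead
}
?>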
Finally, the pages were checked in browsers and also with the gzip compression testing tool at http://www.whatsmyip.org/http_compression/, which reports the savings in size achieved.
As I mentioned above, this is not something that you should attempt if you are not familiar with PHP code and changes to .htaccess files, and I would not recommend taking code from just any website and adding it to your own site unless you understand what it is doing. That said, Google Webmaster Tools is already reporting an improvement in page download times, and since page load times are now a small factor in Google's results, I figured that any improvement is worth having if it helps keep the site in the rankings for our target phrases.