Handling Static Content
As of a few weeks ago my website used PHP to route static assets. That is, every time a request hit my server for css, js, or the like, a PHP process would spin up to process and stream the response back. This was done to handle versioning and compiling of different assets, which made cache busting of styles and scripts easy. It also let me fine-tune headers and customize caching parameters for things that rarely changed.
There were two big problems that I was trying to solve when I initially built this system. The first was compiling and versioning: I wanted to break my assets into manageable pieces but have only a single HTTP request to download them. Every time a change was made to one of the pieces my code would automatically compile them into a single entity, cache it, increment the version in the filename, and future requests would use that one. Sure, the first request, where everything compiled, would be slow, but the rest would be fine. The second problem was with my photos. I cross-reference photos across multiple domains and didn't want to save the files across multiple filesystems, nor did I want to throw them all on a separate domain.
While my system did solve these problems it raised new ones. Every single static asset request booted PHP, which meant a minimum memory footprint and a delay while my code figured things out. There were also things the stack didn't do, like minification and obfuscation. I could have eventually figured out how to do them, or found PHP packages to handle them, but that would have added more overhead to that first request. Some of the requests were much heftier than others, like sitemaps and RSS feeds (also handled dynamically). Finally, it was just more code to maintain, and code is horrible.
So, a few weeks ago, I came up with a different solution. I would use modern technologies during the build process to compile css and js. This doesn't quite handle the versioning problem, though over the last few years I've changed styles and scripts so infrequently that it isn't really a problem worth dealing with. Symlinks would deal with the image issue, hooking up a common directory to reroute those requests to a central place. This works as long as all of my sites are on the same server. And the rest of the static content, like robots.txt and sitemaps and RSS, will be generated by cron jobs, because anything else is silly.
The symlinks and cron jobs were painless to set up. I was concerned about updating some of the larger sitemaps mid-request, where my script would have an open handle to a file and be streaming content when a new HTTP request came in, so I opted to write to a temp file and then do a quick rename of it. Also, I wasn't able to find a good RSS/sitemap package that handles things in a streaming fashion, so I may look into building my own at some point. I don't like the idea of holding that much string in memory just to dump it into a local file when the script ends.
I still have css and js broken up into modular pieces, though instead of PHP compiling them at runtime I'm using Grunt, a task runner that runs on node. Grunt is specifically designed to prepare apps for deployment, and combined with Bower it can pull external dependencies down. This is still new technology to me, so I'm not entirely sure my build scripts are up to snuff, but here is what currently prepares the frontend of my website.
module.exports = function (grunt) {
  grunt.config('env', grunt.option('env') || 'dev');

  grunt.loadNpmTasks('grunt-bowercopy');
  grunt.loadNpmTasks('grunt-contrib-clean');
  grunt.loadNpmTasks('grunt-contrib-cssmin');
  grunt.loadNpmTasks('grunt-contrib-uglify');

  grunt.initConfig({
    bowercopy: {
      scripts: {
        options: {
          destPrefix: 'build/temp'
        },
        files: {
          'jquery.js': 'jquery/dist/jquery.js',
          'js.cookie.js': 'js-cookie/src/js.cookie.js',
          'normalize.css': 'normalize-css/normalize.css',
          'reset.css': 'HTML5-Reset/assets/css/reset.css'
        }
      }
    },
    clean: {
      build: [
        'bower_components',
        'build'
      ],
      refresh: [
        'public/css/build/*',
        'public/js/build/*'
      ]
    },
    cssmin: {
      app: {
        files: {
          'public/css/build/404.css': 'public/css/404.css',
          'public/css/build/503.css': 'public/css/503.css',
          'public/css/build/blog.css': [
            'public/css/blog.css',
            'public/css/markup.css'
          ],
          'public/css/build/home.css': 'public/css/home.css',
          'public/css/build/lifestream.css': 'public/css/lifestream.css',
          'public/css/build/portfolio.css': 'public/css/portfolio.css',
          'public/css/build/site.css': 'public/css/site.css',
          'public/css/build/waterfalls.css': 'public/css/waterfalls.css'
        },
        options: {
          sourceMap: grunt.config('env') === 'dev'
        }
      },
      vendor: {
        files: {
          'public/css/build/normalize.css': 'build/temp/normalize.css',
          'public/css/build/reset.css': 'build/temp/reset.css'
        }
      }
    },
    uglify: {
      app: {
        files: {
          'public/js/build/imagelightbox.min.js': 'public/js/imagelightbox.js',
          'public/js/build/portfolio.min.js': [
            'public/js/portfolio.js',
            'public/js/imageloader.js'
          ],
          'public/js/build/waterfalls.min.js': [
            'public/js/waterfall-overlay.js',
            'public/js/waterfall-map.js'
          ]
        },
        options: {
          sourceMap: grunt.config('env') === 'dev'
        }
      },
      vendor: {
        files: {
          'public/js/build/jquery.min.js': 'build/temp/jquery.js',
          'public/js/build/js.cookie.min.js': 'build/temp/js.cookie.js'
        }
      }
    }
  });

  grunt.registerTask('default', [
    'clean:refresh',
    'bowercopy',
    'uglify',
    'cssmin',
    'clean:build'
  ]);
};
So this sort of works, and it gives me a chance to play with shiny new technology and such, yet there's a big hurdle: I don't have a deployment system yet. Right now I need to manually run 'grunt' on my local machine and move the final files up to my remote webserver. Which is little better than how I'm handling composer installs, I guess, since I need to run those on the remote each time I update anything on the site.
The next step is getting a reliable deploy system, something that will listen to the master branch of my repo, run grunt and composer, and push everything to the remote server, which is roughly what most people call continuous integration. Once that is set up I can worry about the other features I lost during this transition, like versioning and fine-tuned headers. It was still worth it: I got to delete a large chunk of old code, speed up my static response times considerably, decrease the load on my servers, and play with new technologies. So yeah, I'm pretty happy with this change.