Office of Secret Intelligence

Who taught you to be a spy, fucking Gallagher!?!

Handling Increased Load

On top of being submitted to Postgres Weekly, my Rails 4 + Postgres trees tutorial was also submitted to Ruby Weekly, so there has been a pretty drastic increase in load lately, which is great!

I'm using Pingdom and Google Analytics/AdSense to gauge where people are coming from and what kind of load this site is dealing with. While the traffic isn't super substantial, I've tried to make some improvements to keep page load times down around a few seconds (in the past there have been spikes of a few minutes, probably due to an influx of readers).

What surprised me was a) how much caching even small database calls helps out in the long run, and b) that the bottleneck STILL isn't at the database level, but mostly in serving assets. I've compressed the hell out of my CSS and JavaScript, but I'm not yet using progressive JPEGs or setting appropriate caching headers (browser caching, etc.) for images.

I mention images because it's been a little more difficult to flat-out cache them the way I normally would other assets: they're served from S3 with an expiry time attached to the URL, and if an image URL is cached beyond that expiry, the image won't display properly.
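One way around that mismatch, sketched in plain Ruby below, is to make sure any cached copy of the URL expires before the URL itself does. The function name and the numbers here are illustrative, not what this site actually uses:

```ruby
# Pick a cache TTL safely shorter than the S3 URL's lifetime, so a cached
# fragment never holds an already-expired image URL. Values are illustrative.
def safe_cache_ttl(url_lifetime_s, safety_margin_s = 300)
  ttl = url_lifetime_s - safety_margin_s
  raise ArgumentError, 'margin must be smaller than the URL lifetime' if ttl <= 0
  ttl
end

safe_cache_ttl(3600)  # => 3300: cache for 55 minutes if the URL lives an hour
```

The same number would then be handed to whatever does the caching (a fragment cache's expires_in, for example), so the URL gets regenerated before readers can see a dead one.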

One of the things I've been looking at is progressive JPEG compression: convert everything that isn't a GIF to a JPEG, strip out the embedded profiles, and compress things down a bit (quality can be reduced significantly, especially for thumbnails, without a noticeable visual difference) to reduce image size and thus allow for better response times and lower bandwidth. Some of this is detailed in the documentation for CarrierWave, the image processor I use.
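As a sketch of what that pipeline could look like in a CarrierWave uploader (this is a hypothetical configuration, not my actual uploader: the class name, version size, and quality value are placeholders, it assumes the carrierwave and mini_magick gems, and it omits the conditional needed to leave GIFs alone):

```ruby
# app/uploaders/image_uploader.rb -- hypothetical sketch; names and values are placeholders
class ImageUploader < CarrierWave::Uploader::Base
  include CarrierWave::MiniMagick

  process convert: 'jpg'   # convert uploads to JPEG
  process :compress

  version :thumb do
    process resize_to_fit: [200, 200]
    process :compress      # thumbnails tolerate much lower quality
  end

  def compress
    manipulate! do |img|
      img.strip                  # drop EXIF data and embedded color profiles
      img.combine_options do |c|
        c.quality '80'           # compress down; tune per version
        c.interlace 'Plane'      # write a progressive JPEG
      end
      img
    end
  end
end
```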

Another option, after optimization, is to start using CloudFront. I think the cost would be negligible, and it would handle pretty much all of the static asset caching, leaving nothing for me to do beyond Rails fragment and action caching. It's fascinating to begin to see the "macro" level at which performance optimizations are needed in web development.
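If I go that route, pointing Rails 4 at CloudFront is mostly a one-line setting; a sketch, with a placeholder distribution domain:

```ruby
# config/environments/production.rb -- the CloudFront domain is a placeholder
config.action_controller.asset_host = 'https://d1234abcdefgh.cloudfront.net'

# Far-future expiry headers so CloudFront and browsers can cache aggressively
config.static_cache_control = 'public, max-age=31536000'
```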

I'm used to doing a lot of behind-the-scenes API work that involves optimizing data processing in one form or another, but not necessarily "let's make this entire page smaller so the request returns faster and only needs to be performed once every few minutes."

More on this soon!


Caching, Caching, Caching and More Caching

So I updated the site code to do some more caching.  Things are going quite a bit faster between the Varnish ESI caching and the Redis caching for fragments and such.
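For reference, the Redis fragment-cache side of this boils down to a cache-store setting; a sketch for a Rails 4 app, assuming the redis-rails gem (the URL and database number are placeholders):

```ruby
# config/environments/production.rb -- sketch; assumes the redis-rails gem
config.cache_store = :redis_store, 'redis://localhost:6379/0/cache'
```

With that in place, fragments written with cache blocks in the views land in Redis, so Varnish can serve whole pages while Redis backs the per-fragment pieces underneath.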

Still need to figure out how to get the gallery pages' images to cache properly, but as long as the main page and reply trees are cached, the Rails app itself bears less of the burden.


Treeify 0.03 Release

Treeify 0.03 has been released.  I'll put something together that generates a reasonable changelog in the future, but for now, the biggest changes are:

 

  1. An actual README, so you can sort of figure out how to use it.
  2. A new method, "descendent_tree", which returns an array of hashes in a nested format resembling a tree structure. That's mighty handy for passing to a Rails view, or for serializing to JSON and traversing with JavaScript.
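To give a feel for the shape (the key names below are my guess at the structure for illustration, not Treeify's documented output), an array of nested hashes like that is easy to walk in plain Ruby:

```ruby
# Hypothetical shape of a descendent_tree result: each node is a hash with
# an id and a (possibly empty) list of children. Key names are assumed.
tree = [
  { id: 1, children: [
    { id: 2, children: [] },
    { id: 3, children: [{ id: 4, children: [] }] }
  ] }
]

# Depth-first flatten: collect each node's id, then recurse into its children.
def flatten_tree(nodes)
  nodes.flat_map { |n| [n[:id]] + flatten_tree(n[:children]) }
end

flatten_tree(tree)  # => [1, 2, 3, 4]
```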

All tests pass, which is good enough for me for now.  I'll test this new version against the Kodiak build running this site soon enough, and hopefully it won't break too much.

 

Thanks to GitHub user espen for opening the issue to create a README and motivating me to clean things up and get a reasonable release out.


Application Configuration Using Rails 4 + Postgres 9.4 and JSON

fart

Look, I Got Featured!

A kind soul submitted Quick and Dirty How To - Trees in SQL + Postgres + Rails 4 to Postgres Weekly.  Whoever you are, thanks!

 

In that same vein, there's an issue open on Treeify to add a README and some examples, so I'll be writing up a post about how to use it within the next day or so.