Office of Secret Intelligence

Who taught you to be a spy, fucking Gallagher!?!

Golang Application Namespacing

I'm no guru when it comes to Go's intricacies, and this one actually put me off Go a few times, but I've finally figured out how to have your library and the application that uses it play together in the same directory for Go apps.


Here's a little clarification:


This is the basic Go app structure that you're told to use from day one. It is correct.

However, this doesn't really address the question: what if I want the library I wrote and the application using it to live in the same git repository?

The problem I run into is a namespace clash, where Go complains about "package main" and "package myapp" existing in the same place.

Long story short, my solution was to create an "app" directory underneath my library's root directory, like this:
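A rough sketch of that layout (directory and package names here are illustrative, not the actual ones from my project):

```
mylib/               # library root — package mylib lives here
├── mylib.go         # package mylib
└── app/             # the application gets its own subdirectory
    └── main.go      # package main — imports mylib by its full import path
```

Because `main.go` sits in its own directory, Go treats it as a separate package, so "package main" and the library package no longer collide.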

This seems to do the trick for my purposes, allowing me to maintain my library and instance code in the same git repository while allowing everything to compile properly.

Linux Snippet: Bring Networking Back Up

It's been a while since I've posted; I've just been buried with work and life.


Yesterday I was trying to recover some things from a personal VM that went kaput (for reasons completely unrelated to this post), and I found that networking was disabled, even though I had it enabled through VirtualBox.


After a little bit of digging, I found the following command did the trick:


sudo dhclient eth0

nixCraft (one of my favorite *nix knowledgebase sites) explains it best:


The dhclient command provides a means for configuring one or more network interfaces using the Dynamic Host Configuration Protocol, BOOTP protocol, or, if these protocols fail, by statically assigning an address.

Thus starts my journal (and what this blog was originally intended to be) of miscellaneous commands and algorithms that I need to keep handy.

Treeify 0.04 Released

Amidst all this being busy stuff, I have released version 0.04 of Treeify.


Here's a list of changes (noted in the file):


(My changelog generation needs some work but hey! It's there.)


I haven't yet updated this blog to use the newest Treeify code, but I'll do that soon, I hope.  The biggest changes involved adding the ability to specify what columns to use in a query and actually have them retrieved.  I created some benchmarks, though I don't know how accurate they are beyond saying one is faster than the other.  Although the previous method was faster, it did a lot less, and the new find_by_sql call can be fine-tuned in the future.

On a side note, I came across an awesome article about generating changelogs from git, along with a handy snippet.
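The snippet itself didn't survive the export, but as a generic sketch of the idea (not necessarily the one referenced), a changelog between two release tags can be generated with `git log` and a pretty format; the tag names below are illustrative:

```shell
# List each commit subject between two tags as a bulleted changelog entry.
git log --pretty=format:'- %s' v0.03..v0.04
```

Redirecting that output into a CHANGELOG file (or prepending it under a version header) is the usual next step.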

That's all for now; hopefully I'll get some more time tomorrow to write some more interesting articles.

And We're Back

I've obtained my database dump, and it looks like all is well now.  I need to get varnish set up again, but in the meantime, everything looks okay.


This week I'd like to write a little something on Go, since I just wrote a small tool for work using it.  I think I'm also going to finally package up Kodiak and do a proper release, since installing it on a new server turned out to be surprisingly painless.


Anyway, more later!

Handling Increased Load

On top of being submitted to Postgres Weekly, my Rails 4 + Postgres trees tutorial was submitted to Ruby Weekly, so there has been a pretty drastic increase in load as of late, which is great!

I'm using Pingdom and Google Analytics/AdSense to gauge where people are coming from and what kind of load this site is dealing with. While the traffic isn't super substantial, I've tried to make some improvements to keep page load times down to a few seconds (there have been minute-long spikes in the past, probably due to an influx of readers).

What surprised me was a) how much caching even small database calls helps in the long run, and b) that the bottleneck STILL isn't at the database level, but mostly in serving assets. I've compressed the hell out of my CSS and JavaScript, but I'm not yet using progressive JPEGs or setting appropriate caching levels (browser caching, etc.) for them.

I mention images because it's been a little more difficult to flat-out cache them the way I normally would other assets: they're served from S3 with an expire time attached, and if an image URL is cached beyond that expire time, the image has a really difficult time being displayed properly.

One of the things I've been looking at is progressive JPEG compression: converting everything that's not a GIF to a JPEG, stripping out most of the embedded profiles, and compressing things down a bit (quality can be reduced significantly, especially in thumbnails, without a noticeable loss in visual quality) to reduce image size and thus allow for better response times and lower bandwidth. Some of this is detailed in the documentation for CarrierWave, the file-upload library I use.
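As a sketch of that kind of pipeline — assuming ImageMagick, which is a tool choice of mine and not one the post names — the conversion might look like this (filenames and the quality setting are illustrative):

```shell
# Convert to JPEG, strip embedded profiles/metadata, write a progressive
# (interlaced) JPEG, and reduce quality to shrink the file further.
convert input.png -strip -interlace Plane -quality 80 output.jpg
```

`-interlace Plane` is what makes the output a progressive JPEG, so browsers can render a low-detail version of the image before the full file arrives.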

Another option, after optimization, is to start using CloudFront. I think the cost would be negligible, and it would pretty much handle all of the static asset caching, rendering anything on my side unnecessary beyond Rails fragment and action caching. It's fascinating to begin to see the "macro" level at which performance optimizations are needed in web development.

I'm used to doing a lot of under-the-hood behind-the-scenes API work that involves optimization of processing data in one form or another, but not necessarily "let's make this entire page smaller so the request returns faster and only needs to be performed once every few minutes."

More on this soon!