Thursday, November 26

Book Review: Managing Humans

Like many of the books I've read, Managing Humans is essentially a printed-out blog, and you can tell that pretty quickly by the terse, disjointed, and stream-of-consciousness hops from one chapter to the next. The first few chapters were rough enough to give me the impression (a) it was going to be painful to finish the book, and (b) I couldn't wait to rip it a new one when I reviewed it. However, it got better, so I'll have to save my ripping for another victim.

First, the bad. In the early chapters the author uses some hyperbolically fabricated scenarios involving over-the-top caricatures of personalities to assert that everybody you work with is an idiot and they can be categorized into these nice little buckets and since he's been doing this for so many years he knows all the angles and this is how it is and this is how you should deal with it. This preaching tone and one-sided view of things really left a bad taste in my mouth.

However, in later chapters the author seems to come back down to earth and realize there are other sides to every story and he speaks more in the tone of "this is how it happened in my particular situation and I think this was the best way to handle it." That sat a lot better with me. If I were a betting man, I would speculate that the early chapters, as blog posts, received some critical feedback and the author took it to heart in his future writings.

Having been around the block a few times, from the lowly sysadmin to the even lowlier CTO, I could relate to a lot of the author's parables and anecdotes. In some cases he's spot on and offers great advice, but in others he's rather short-sighted and preachy.

In summary I would recommend the book because it's always good to see your industry through somebody else's eyes, with the caveat that it's a bumpy road in the beginning, and expect that you'll be reading a blog on paper as opposed to a smooth-flowing book.

Wednesday, November 11

Book Review: Crush It!

I've never read a motivational self-help book, but I imagine this is what they are like. Gary compresses a lot of good content into a very small book, but there's a plethora of cheerleading. His core message can be boiled down to work really hard (crush it) and be honest with yourself and others (your DNA).

The chapters are peppered with humorous and inspiring anecdotes from Gary's childhood, which at times seem contradictory. For example, an early chapter tells of Gary's nearly poor and starving family splurging on two Star Wars action figures for his Christmas present, followed a couple chapters later by his father giving him $1,000 to set up a booth at a baseball card convention.

Gary tells a good story, and he has some great advice on leveraging Internet trends, especially the social networking aspect, to "build your brand." He's also adamant about breaking the mold, chasing after advertisers directly rather than hiding behind facades like Google's AdSense; go straight to the money source and cut out the middle man. I hear Gary speak at half of the conventions I attend (even had lunch with him at the last one) and I like him and his message. If you haven't seen him, head over to Google and find a couple of his videos. His Wine Library TV shows are great, but his keynotes are awe inspiring, especially FOWA Miami from a couple years back.

Unfortunately this book, like so many success stories I've read recently, subscribes to the theory of "this worked for me, so it's gotta work for you, right?" Gary says you should be working all the time, "until your eyes bleed," yet the jacket sports a quote from his bud Tim Ferriss, who only works four hours a week. Likewise Gary makes an off-the-cuff comment about real entrepreneurs not spending their time playing poker with their buds, an apparent jab at bazillionaire Jason Calacanis, who seems addicted to the game.

Despite the sometimes seemingly-mixed messages, I do recommend this book because it's short and to the point and full of great advice. My only caveat is we can all learn from his experiences, but we can't all be Gary V.; there are many ways to succeed and Gary's isn't the only one.

Sunday, September 20

Book Review: Free

I downloaded Free on my Kindle because, well, it was free. I hadn't heard of the author before, nor any of his prior books, but I saw a promotional message somewhere - I don't even recall where - noting that Amazon was letting you download the book for free, so I did. And, I read it. And, it's good.

What's it all about? Well, as the title might have hinted, it's about giving things away for free, and how that's a viable business. Sure, it's a little more complicated than that, but the book does a good job of explaining how Google gives away all its services for free but still makes a metric butt-load of money, how companies built around PostgreSQL give the database away for free and charge for premium, reliable on-call support, and how free is killing some industries (the recording industry, newspapers) but uplifting others (iTunes, Craigslist).

I'm jotting down this review from a slightly stale memory, so I'm only scratching the surface of the topics covered. Do yourself a favor and pick up a copy - a free one if you can find it, of course. I'll be considering the author's other books for my reading list now, which is yet another way free can make you (or him in this case) money.

Tuesday, August 25

Book Review: CA$HVERTISING

If you've read the book Predictably Irrational about the irrational decisions we humans make when it comes to buying stuff, CA$HVERTISING will show you how to exploit those quirks to maximize your profit. I hate to make a clichéd analogy (which I think I've used in a prior review) but it's really like The Art of War for advertising; it's a hundred or more little bite-sized chunks of tips and techniques for getting prospective customers to notice your ad, read it, and act on it. The author supports each nugget of wisdom with a plethora of studies and statistics. And, at the back of the book, it's all summed up in a nice concise checklist for your next propaganda endeavor. Even if you're not an advertiser or promoter (like me), it's worth a read just to open your eyes to the tricks and techniques being used to lure you into parting with your hard-earned money.

Tuesday, July 28

We value your money, not your satisfaction.

Warning: this is not a tech article. Today I'm going to rant about [stupid] business owners.

The backstory: A couple years ago, there was a quaint little mom-and-pop coffee shop near my office where I liked to spend my lunch breaks. I'd bring my bagged lunch, order a large coffee, enjoy the half-hour escape from my work day, and be on my way. It was such a pleasant experience that one day I decided to invite a few coworkers; four of them in fact. As we walked in the front door, the owner came over and informed me that I was no longer welcome to eat my bagged lunches there; in the future I would have to purchase and consume their food. She said this to me as I was bringing four new customers into her business! Do you think I ever went back after that day? Nope.

Flash forward to today. I work from home now, and decided to break up the routine and try working the morning at another coffee shop, not quite a mom-and-pop, but a small local chain. I like this particular shop because it's right next to a bagel joint. I can pick up a piping-hot double-toasted eight-grain bagel with hummus, then step next door and order a big ole coffee to wash it down. That was the plan, but after I purchased my bagel and walked over to the coffee shop, I noticed a new sign on the front door. In big bold letters it read, "DEAR VALUED CUSTOMERS, NO OUTSIDE FOOD!" I walked right on by.

On the drive home I got to thinking: exactly what is "valued" in that statement? Obviously it's not the satisfaction of the customers who enjoy a bagel and coffee in the morning. The only thing being valued there is taking money from customers who don't eat bagels, because all the rest were just alienated and driven away.

It pains me to say this, but I've taken bagged lunches from neighboring restaurants into several different Starbucks on hundreds of occasions and never once been harassed, and that's why the next time I have a craving for a tomato, mozzarella, and pesto sandwich with an iced latte, they'll be getting my business.

Sunday, July 26

Book Review: Who Moved My Cheese?

Who Moved My Cheese? is a very short book. The publisher went to great lengths to make it feel like a book, with nice thick hard covers and large type, but I think most readers would consider it merely a lengthy essay. It took me about an hour to read from cover to cover.

Who Moved My Cheese? is a parable about dealing with change, told through four caricatures who live in a maze, subsist on cheese, and must cope with the sudden loss of said cheese. And for those readers who may be too dense to grok the not-even-remotely-subtle lessons taught by the parable, it's bookended with a fictional discussion by other, more life-like characters who liken their own jobs and personal lives to the events in the story.

Of course the point of the book is to open the reader's eyes to the inevitability of change in their lives and workplaces and encourage them to accept it, embrace it, and deal with it rather than fear it. It's a little cartoony and preachy in places, and sometimes draws out the point much further than necessary, but for just a one-hour read I recommend checking it out, so that at the very least the next time it comes up in conversation you can participate in the discussion.

Thursday, May 14

Book Review: Envisioning Information

I had this book on my reading list because I'd heard DHH praise it a couple years back. Although I'm just a lowly software developer, I've always had a fascination with user interaction and interface design. I devoured the books by Norman, Tognazzini, Nielsen, Raskin, etc. I was hoping to glean similar insight from this tome.

The Bad

Unfortunately, it's not an easy book to read. There's something about the author's writing style that I found opaque; the way he weaves his words, I quite honestly couldn't tell most of the time whether he was praising or condemning a particular visualization technique. The book is peppered with a plethora of illustrations, maps, charts, diagrams, etc., several on each page, but half the time I couldn't figure out which image the author was referring to in the text.

The Good

Regardless of the struggle to consume the verbiage, there are some gems of wisdom to be mined. The author drills home the concept of "1 + 1 = 3", in the sense that combining two simple visualization techniques can add a third dimension of information; this is demonstrated with both good and bad consequences. The author also stresses avoiding decoration that distracts from the data, and information "prisons" such as thick dark grid lines or table borders that could be removed completely and simply implied by the layout of the information via white space or "negative shapes". The most impressive examples in the book are the train timetables - it's mind-boggling how much data can be cleanly and clearly expressed in a two-dimensional chart with the right finesse.

Sunday, May 10

Oh yeah, I produce a podcast now!

For some odd reason, it didn't dawn on me until now to pimp my podcast on my blog. It's called The Anachromystic Podcast and we've already got four episodes out. It's just me and my old friend Craig Walker talking about software development philosophies, techniques, technologies, and news. It's sorta half-way between Drunk & Retired (but not as salty) and StackOverflow (but not as much self-fellating). It's on iTunes and plain old raw RSS. Please give it a listen and let us know what you think over at podcast@anachromystic.com. Thanks!

Wednesday, May 6

A better progress meter for your (Rails) scripts

As a follow-up to my post a couple weeks back on putting a progress meter in your long-running migrations, I've whipped up a more helpful and re-usable tool.

I've called it simply "Progress" and here it is in its entirety:

class Progress
  require 'action_view/helpers/date_helper'
  include ActionView::Helpers::DateHelper

  def initialize(total, interval = 10)
    @total = total
    @interval = interval
    @count = 0
    @start = Time.now
  end

  def tick
    @count += 1
    if 0 == @count % @interval
      sofar = Time.now
      elapsed = sofar - @start
      puts "been running for #{distance_of_time_in_words(@start, sofar)}"
      rate = elapsed / @count
      puts "at a rate of #{distance_of_time_in_words(sofar, (sofar - rate), true)} per item"
      finish = sofar + ((@total * rate) - elapsed)
      puts "should finish around #{distance_of_time_in_words(sofar, finish)}"
    end
  end
end

Usage is pretty simple. Here's a (slightly abridged) example from a Rake task I wrote to clean out bad references to removed YouTube videos:

namespace :videos do
  desc "Remove videos from SpumCo if YouTube also removed them"
  task :purge => :environment do
    videos = Video.find(:all, :order => 'created_at asc')
    progress = Progress.new(videos.size)
    videos.each do |video|
      progress.tick
      video.destroy unless video.still_on_youtube?
    end
  end
end

And what you see in the console as the script runs is periodic updates like so:

been running for less than a minute
at a rate of less than 5 seconds per item
should finish around 14 minutes

Friday, May 1

A slightly-better podcast recording setup

A few weeks back I posted about recording Skype calls on OS X using only freely available tools, and it worked out pretty well, but it had one annoying flaw: I could hear my own voice coming through the headphones, which often tripped up my speech pattern. So I did some more Googling and I managed to cobble together enough scraps of information to work out a slightly improved version of my original configuration.

The two main differences from the original setup are the replacement of Audacity with GarageBand and the use of the 16ch Soundflower device instead of the 2ch.

1. LineIn: Choose the "Advanced" button, set the Output to the "Soundflower (16ch)", and set the Left and Right Channels both to "2". Why both on channel 2? Because channel 1 is what's going to be coming out of the headphones, and we don't want to listen to ourselves talk.

[Screenshot: LineIn]


[Screenshot: LineIn Advanced Device Options]


2. Skype: Set the input to your mic and the output to "Soundflower (16ch)" - it's going to use channels 1 and 2 automatically.

[Screenshot: Skype Audio]


3. Soundflower: Under (16ch), set Channel 1 to "Built-in Output[1]". That's going to be your left ear in the headphones. Set Channel 2 to "None" because, as we established above, your voice is going through there and you don't want to hear yourself talk.

[Screenshot: Soundflower Channel 1]


[Screenshot: Soundflower Channel 2]


4. GarageBand: First make sure you've set the Audio Input to "Soundflower (16ch)", then select the track you want to record to - if you're using the old version like me, you can only record to one track at a time - and set the Input to "Channel 2 (Mono)". Remember that's the aggregation of Skype and LineIn, so GarageBand is going to record both you and the caller.

[Screenshot: GarageBand Audio MIDI]


[Screenshot: GarageBand Input Channel]

Thursday, April 23

Contractor Math for Dumbheads (and Hiring Managers)

I've done my fair share of hiring and firing as a Chief This and Vice President of That over the last fifteen years. Since I've taken the dive into independent consulting recently - putting me on the other side of the interview desk - I've discovered that there are a lot of hiring managers out there who just don't seem to understand the simple math of contracting.

In short, the ever-repeating story goes something like this, "Thank you for your time, but we've decided that your rates are higher than what we're looking for, so we've decided to go with another candidate," followed two or three weeks later with, "Hello, do you have time to speak again? The other candidate didn't quite work out."

So what's the math? It's quite simple. Take our fictional contractors: Dudley, the hobbyist contractor, dabbling in many technologies, master of none, and Clark, the seasoned veteran of the dot-com bubble who spent the last decade specializing in the particular technology you're using. Dudley, for some inexplicable reason that doesn't seem to concern the hiring manager, has had several jobs over the last nine months, but he only charges $50 an hour. Clark, whose resume lists all of three companies over the last decade along with publications in trade journals, charges a whopping $90 an hour, nearly twice Dudley's rate.

Of course the logical choice is cheap Dudley, right? Bzzt. Project X is pitched to both candidates and they both come back with a one-week estimate. For the sake of our hypothetical scenario, both candidates tackle the project independently. Dudley was a little off on his estimate, and it actually took him two weeks. Cut him some slack; he just started reading the O'Reilly book before the first interview, and he had to start over once because he lost all his work to a hard-drive crash. What's version control? Clark, of course, delivers on time.

This brings us to the first part of the math. Dudley took 80 hours at $50 an hour for a cost of $4000. Clark took 40 hours at $90 an hour for a cost of $3600. Ooh boy look at those savings, right? Yeah, Clark's value is evident in the invoice cost, but the real value is even deeper.

Dudley's inexperience has led him to write some brittle code. How brittle? We don't really know, because he was pressed for time and didn't bother writing tests. Ah, and don't lose any sleep over the fact that his version works fine on his laptop but not on your servers. Something must be wrong with your hosting provider. If you extend his contract for another couple weeks, he'll figure it all out for you.

Clark, on the other hand, has provided a system with a comprehensive suite of tests, and an automated deployment script. How quaint. As an added bonus, he's also developed a clean object model which will make future enhancements and integration relatively painless; it's the gift that keeps on giving.

In the bigger picture, by going with Dudley you've paid more money for a far worse system. You'll be paying the technical debt for a long time to come. Don't make your hiring decision based solely on rates. Look at the candidate's job history. Can they hold a job? Do they have repeat/long-term clients? Look at their technology portfolio. Are they "experts" in what you need, or do they dabble in whatever happens to be shiny right now? And don't be afraid to pick up the phone and call references. Former employers will rarely say anything bad about a hire, for legal cover-your-ass reasons, but if they liked the candidate, they'll usually praise them and their work.

Monday, April 20

Put a progress meter in your long-running migrations

I'm working on a Rails project now that requires lots of database massaging and repair. This repair work needs to be tried and tested on a development workstation, reviewed on a staging server, then applied to the production system. Since the work needs to be repeatedly applied to several environments, I'm logically using migrations.

One nuisance I got sick of real quick is staring at a terminal running a long migration and wondering "is it doing anything?" and "how much longer is it going to take?" So I decided to add some progress indicators.

The simple yet effective method I've settled on is an on-screen countdown. Most of the migrations consist of the same pattern: 1) get a list of the records that need to be repaired, then 2) iterate over each record and repair it. I set the counter to the size of the record set I'll be iterating over, decrement it on each iteration, and print it to the screen. Seeing the numbers scroll across the screen lets me know the migration is working, and since the migrations count down to zero, I can gauge how long it's going to take to complete based on how fast the numbers are shrinking.

Here's an example:

class FixTheThingsWithTheStuff < ActiveRecord::Migration
  def self.up
    query = <<-SQL
      select name
      from things
      where stuff = 1972
      and deleted_at is null
      group by name
      having count(id) > 1
      order by name
    SQL
    broken_rows = select_all(query)
    count = broken_rows.size
    broken_rows.each do |row|
      printf "[#{count-=1}]"
      # ... fix it! ...
    end
  end

  def self.down
    # ... re-break it! ...
  end
end

Saturday, April 11

Recording Skype Calls on OS X with Audacity (for free)


UPDATE: I've posted a follow-up with a slightly better configuration.


Over the last decade, I've had many entertaining and enlightening "debates" with an old colleague of mine, so I asked if he'd be willing to record a few over Skype and see if they might be worthy of releasing as podcasts. Since it was my bright idea, he left it to me to figure out how to pull it off. I figured it couldn't be too hard since so many other people are doing it - boy was I wrong.

I wasted a few good hours today trying to figure out how to record Skype calls on my Macbook Pro using free tools, piecing together fragments of vague and out-dated blog posts and pleas for help on various support forums. I finally got it all working and thought it would be a good idea to put it all down on [virtual] paper for posterity in case I ever need to do it again, and by putting it here others might save themselves the trouble I went through.

All you need is to download and configure four pieces of free software:

1. Soundflower doesn't require any configuration - just a reboot after install.

[Screenshot: Soundflower]


2. LineIn (scroll down the page and look for the microphone icon). Set the input to the microphone (I'm using the built-in) and the output to Soundflower (2ch), then press the "pass thru" button to activate it.

[Screenshot: LineIn]


3. Skype. Set the input and output, both, to Soundflower (2ch). I worried this would cause feedback, or an echo of my voice, but apparently I don't know the first thing about audio.

[Screenshot: Skype]


4. Audacity. Leave the playback device set to built-in output and change the recording device to Soundflower (2ch). Also don't forget to check the "playthrough" box or you won't be able to hear what you're recording, which includes the Skype call.

[Screenshot: Audacity]


5. Leave your system settings alone. They should look like this:

[Screenshot: System Output]


[Screenshot: System Input]


6. Back in Audacity, hit the record button, then switch over to Skype and start the call. Now you're in business.

Friday, April 10

Book Review: Never Eat Alone

I think I can pretty much sum up the entire book in one sentence: Be a networking whore, but be sincere, and pay it forward.

Those are the three main points author Keith Ferrazzi drills into your skull, but he uses a few more words, and a couple hundred more pages. Everybody in your life is an important networking contact, from your garbage man to your employer's president. Never be a phony; always be yourself. Don't schmooze only the people you think are important or powerful and likely to advance your career. Don't piss on your underlings or step on your peers to climb higher up the ladder. And last but not least, always come to the party offering to give something first; don't start out by asking for something, and don't expect reciprocation.

Ferrazzi is a fine writer, and he peppers his words of wisdom with entertaining, self-deprecating stories of learning from his mistakes. The book makes it sound like the only thing he does all day is make phone calls, send e-mails, and throw dinner parties, non-stop. That sounds like a career in itself to me, and in his case (books, speaking, consulting, etc.) I believe it is. That doesn't leave much free time to focus on your main career, for those of us who have jobs other than full-time networking. But it's a good eye-opener to the means and methods of the well-connected. I'm sure I'll cherry-pick a few of his techniques (that I'm not already doing) and I'll be a better person for it. My only real gripe with the book is the same as with most books: too many words. He could have gotten his message across in a fraction of the verbiage.

Friday, April 3

Disabling third-party services when they stop performing (in Rails)

One of my clients uses the hosted version of CompanyX (not their real name) to serve ads on their site. A couple weeks back, CompanyX applied some "upgrades" and things didn't go as planned, so for nearly a week their service was up and down like a yo-yo. That resulted in me getting calls along the lines of, "Hey our site is loading slow because of the CompanyX ads, please take them all off," followed a few hours later with another call, "Hey CompanyX seems to be OK now please turn their ads back on," and a little while later the cycle repeats itself. That got real old, real quick.

So, I decided to whip up a little automated solution. I needed two core components:

1. A way to programmatically turn the ads on and off.

2. A way to periodically test the third-party service, and enable or disable the ads based on its response time.

For disabling and enabling the serving of ads, I created a new model called CompanyxStatus, which is essentially a toggle switch: it's either on or off, and for auditing purposes I have it store the date and time whenever it's flipped. The database table looks like this:

create_table "companyx_statuses" do |t|
  t.column "enabled", :boolean
  t.column "created_at", :datetime
end

And the app-facing API of the model looks like this:

class CompanyxStatus < ActiveRecord::Base

  class << self

    def enabled?
      latest.enabled
    end

    def disabled?
      !enabled?
    end

    def disable!
      CompanyxStatus.create!(:enabled => false) if enabled?
    end

    def enable!
      CompanyxStatus.create!(:enabled => true) if disabled?
    end

    private

    # I'm not using a named scope because this client is on an OLD version of Rails...
    def latest
      CompanyxStatus.find(:first, :order => 'created_at desc') || CompanyxStatus.new(:enabled => true)
    end

  end

end

So in the views, when I'm building a page, I just have to check CompanyxStatus.enabled? before rendering any ad tags.
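
Just to illustrate, the view check boils down to something like this - the partial name below is a made-up placeholder, not the real one from the project:

<% if CompanyxStatus.enabled? %>
  <%# hypothetical ad partial; only rendered while the third-party service is behaving %>
  <%= render :partial => 'ads/companyx_banner' %>
<% end %>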

Now, for the actual toggling logic, I'm using the Benchmark module to call out to the service and measure the response time. If it exceeds the threshold (2.5 seconds in this case) the service is disabled, otherwise it's enabled. Here's the rest of the model - these are class methods too, so they go inside the same class << self block shown above, with test kept public:

  def test
    if 2.5 > latency
      CompanyxStatus.enable!
    else
      CompanyxStatus.disable!
    end
  end

  private

  def latency
    Benchmark::measure{ connect }.real
  end

  # don't forget to require 'socket' and 'benchmark' at the top of the file
  def connect
    socket = Socket.new( Socket::AF_INET, Socket::SOCK_STREAM, 0 )
    sockaddr = Socket.pack_sockaddr_in( 80, 'blah.companyx.org' )
    socket.connect( sockaddr )
    socket.write( "GET /blah.php HTTP/1.0\r\n\r\n" )
    socket.read
  end

Finally, I need a way for the system to periodically make these checks and toggles so my client and I don't have to worry about babysitting the site. For this I wrote a simple rake task:

namespace :companyx do
  desc "Ping CompanyX and disable it if too slow"
  task :ping => :environment do
    CompanyxStatus.test
  end
end

And scheduled it as a cron job to run every minute:

* * * * * cd /home/client/apps/production/site/current && RAILS_ENV=production rake companyx:ping

That's it! Now I can get a good night's sleep knowing that the next time CompanyX has a burp in their service, my client's site is going to automatically shut them off until they get their act back together again.

Thursday, March 26

Scratching my own itch again leads to a new iPhone app

[Image: JustTweet]
Several weeks back I was down at Barcamp Miami and I wanted to send a quick tweet from my iPhone. I have several Twitter clients on my phone - there's a bazillion of them out there - but most of them are very slow to start, as they have to load all their fancy views and suck down all your friends' tweets and blah blah blah. I wanted a small, simple, fast app that just let me post a quick tweet and be done with it. I couldn't find one, so I decided to write one.

Later that afternoon I discovered there was already one out there, for free no less, but I'd decided that this was a great entry-level application for me to learn how the whole iTunes store process works, so I pressed on, determined to reinvent that wheel. I'd been dabbling with the iPhone SDK for months, but things really kicked into high gear when I picked up Beginning iPhone Development; it's the book to get if you're new to iPhone and Cocoa development.

I wrote the app one Saturday evening, then spent a few hours here and there afterwards rounding off the rough edges or applying things I'd picked up from the book (I'm still working my way through it). I asked an old designer friend and colleague if she'd create an icon for me and she agreed. I knew it would take her a bit to find the time to get the creative juices flowing, so I was in no hurry. It was time to start the red tape with Apple.

It's not a fun process. It's not a pretty process. But by gawd I have to give Apple credit for providing documentation that quite literally holds your hand and walks you through every last step of the grueling process. There were articles of incorporation to fax. Bank account routing numbers to supply. Certificates to be generated and signed. Firstborn children to be sacrificed. It was hairy. But one glorious morning I opened up my inbox and found the "you may start uploading apps and selling them in the iTunes store" e-mail, and it was good.

I'd heard the horror stories of developers waiting months for their applications to be approved, and the stories of applications being rejected and resubmitted ad infinitum, so I had prepared myself for a long period of trepidation capped off with disappointment. When the shiny new icon arrived from my designer friend, I compiled my "release" build and uploaded my baby, praying everything would be fine. To my shock and awe, exactly one week later I received the "OK your app is on sale" e-mail. I was giddy.

I announced it to my friends and followers on Twitter and bless their hearts a lot of them bought the darn thing. Unfortunately I've not yet deciphered the financial section of the app store back-end so I have no clue how many copies were sold or how little I've made off it. The morning it went on sale, I received a support e-mail from a gentleman in Denmark who wasn't able to get it working. I haven't yet resolved his issue, and thus far he's the only person to report the problem, but I'm not one to rest on my laurels - last night I hacked up a few improvements and uploaded version 1.1 for review.

This isn't the iPhone application that's going to make me a zillionaire, but now that I've gone through the process and I've cut through all the red tape, I'm ready to start working on the real killer app.

Oh yeah, if you're feeling curious or just generous, please grab a copy for yourself. They're cheap while supplies last!

Saturday, February 14

Laptop recovery with Twitter? Scratching another itch...

One morning last week a curious idea popped into my head:

...

... and that afternoon I had accomplished this:

...

The premise is simple:

1. I set up a private Twitter account just for my laptop status, and subscribe to it from my personal Twitter account.

2. I set up a cron job on my laptop to run every hour, tweeting the geographical location of my laptop to the private account.

The theory is that if my laptop were ever stolen and the crook for some reason didn't format it before connecting it to the 'net again, I might have a fart's chance in a whirlwind of recovering it. Yeah, probably not, but it was fun to build, and on the coolness factor I'll have an automated journal of my travels.

As an added bonus, Twitter automatically rejects duplicate tweets, so the account doesn't get spammed every hour if the laptop hasn't moved (since the updates would be identical).

So how did I do it? Well the first thing I did was realize that I couldn't get my public IP address from the laptop when it was behind a NAT router, so I had to reach out to find a service that could provide me that information. My initial Google attempts came up dry and I was about to resort to scraping the IP Chicken page (blech) when a helpful Twitter follower came to my rescue:

...

That service is exactly what the doctor ordered (thanks @nu2rails). I used HTTParty to parse the response and Jeweler to turn my little library into a gem and tossed it up on my GitHub account for the world to enjoy.

So please check it out, try it, fork it, improve it, and let me know how you like it. Oh, and extra brownie points if you get the name of the gem!

Monday, February 9

Twitter2RSS: Scratching my own itch at Acts As Conference 2009

The beauty of RSS is that I can aggregate all of my information sources, read them at my leisure without worry of expiration, and organize them however I desire. It's always bugged me that I couldn't have that luxury with Twitter. Sure Twitter has RSS feeds, but they don't include the avatars, and they don't include direct messages, and they commonly truncate the tweets (blech). There are some pretty slick clients out there, sure, but why would I want to run yet another application for just another information feed? Why can't I have my cake and eat it too?

Last week, while I was up in Orlando for Acts As Conference, I did the same thing I do at most conferences: I started another damn pet project. Except this time I also completed it during the conference. Thanks to my bud Bryce Kerley from the Miami Ruby Meet-up giving John Nunemaker's Twitter gem a little massaging, I had all the tools I needed to alleviate myself of the aforementioned headaches and get my Twitter goodness piped right into my RSS reader.

It's a proxy server, running on Rails (perhaps a bit overkill of a framework for such a simple application, but it's what I know best) that sucks up your Twitter business using their API and spits it back out in a consumable RSS feed, with avatars, and without truncation.

If you want to run it yourself, I've opened up the source over on GitHub at github.com/trak3r/twitter2rss

Or, if you're too lazy or don't have a cheap hosting provider, you can use mine over at twitter2rss.anachromystic.com

Enjoy!

Sunday, January 25

Book Review: Show Stopper!

My first book of 2009 was a crusty old out-of-print classic I'd given up on finding a copy of years ago, but I found it under the X-mas tree last month courtesy of my awesome girlfriend. The book was, of course, Show Stopper!: The Breakneck Race to Create Windows NT and the Next Generation at Microsoft.

I'm no fan of Microsoft, but I'm a sucker for software development stories, no matter how embellished. From the definitive The Soul of a New Machine to the almost farcical Microserfs, and everything in between like The Autodesk File and Dreaming in Code; I could list another dozen from my reading list of the last decade, but I'll save that for another time and another post. Essentially, I love reading about how other people suffered through pushing a product to market, because I've lived it myself a few times now, and I suspect I'll do a few more tours of duty before my retirement.

Show Stopper! makes for a good leisurely read, but unfortunately it's far more focused on the people and their relationships than on the technological and political hurdles they had to overcome. In other words, it's more fluff, less stuff. The book slowly and methodically introduces core people on the Windows NT team, giving their back stories, explaining their motivations, touching on their personal lives outside of the office, etc. Technology and politics are covered, but not in great detail, and in some cases so abstractly as to insult a reader who hungers for the nitty-gritty specifics.

Without spoiling the plot too much, a complex new operating system is conceived, an all-star team is assembled to produce it, deadlines are missed by years, tempers flare, nerves are shot, power plays are made, and in the end it ships. Of course, you knew how it ends. It saddens me to think some of you readers are young enough to have never even used Windows NT. Sometimes working with the old clunky stuff gives you a much better appreciation for what you have today. Perhaps I'd not be such a fanatical OS X pundit today if I'd not had to launch a product or two on top of Windows NT.

So in summary, Show Stopper! is worth a read. It's not in my top five of historical dramatizations of big software development launches, but maybe it's in the top ten. Find yourself a copy while you still can.

Wednesday, January 14

Tracking AJAX calls with Google Analytics (and Rails)

With my recent pet project, I used AJAX to provide the majority of the site's functionality on the main page without it ever having to reload itself or load another page. This resulted in a drastic drop in "page" views in my Google Analytics reports, because the only "page" being loaded on each visit was the sole main page; every click after that was an AJAX call that updated only a portion of the already-loaded page, hence never triggering any calls back to the Google Analytics tracking server. So I set out to see if I could remedy that, and I did.

I started out with a Google search for assistance. Surprisingly, the results were grim. The highest ranked result required a "donation" to read the answer, and I opted not to. Other results were vague or dated or focused on the same issue with Flash. I had to crack open the code and get my hands dirty.

When you set up a profile to track with Google Analytics, they give you a little snippet of JavaScript code to paste into your website pages. It looks something like this:

<script type="text/javascript">
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
<script type="text/javascript">
try {
var pageTracker = _gat._getTracker("UA-1377203-6");
pageTracker._trackPageview();
} catch(err) {}</script>

Seems cryptic enough, right? If you look closely you'll see it's actually two scripts, not one. The first script imports the Google Analytics tracking library:

<script type="text/javascript">
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>

And the second script calls the library to track the page.

<script type="text/javascript">
try {
var pageTracker = _gat._getTracker("UA-1377203-6");
pageTracker._trackPageview();
} catch(err) {}</script>

In order to track a page load, including an AJAX call, you merely need to call the tracking library, but you need to import the library only once. That's where things get a little tricky.

For starters, I separated the two scripts into their own respective partials. I render both partials on the main page, so the library is loaded and then called to track the page load. However, from the AJAX calls, I only render the second script, the one that calls the library, since it will have already been loaded by the main page.
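
To make that concrete, here's a rough sketch of the render calls; the partial names are hypothetical stand-ins, not the actual ones:

<%# main page: load the library first, then track the initial page view %>
<%= render :partial => 'analytics/loader' %>
<%= render :partial => 'analytics/tracker' %>

<%# any template returned from an AJAX call: render only the tracker %>
<%= render :partial => 'analytics/tracker' %>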

Here's the first curve ball. The first time I render and serve the main page, the AJAX-y portions of the page are rendered in-line. Since each of them in turn renders the call to the tracking library, the loading of the main page could erroneously track several hits. In order to prevent this, I wrapped the script in a global flag check-and-set, so no matter how many times it's rendered for a single page, it only injects the script once:

<% unless @already_tracked %>
  <% @already_tracked = true %>
  <script type="text/javascript">
    try {
      var pageTracker = _gat._getTracker("UA-1377203-6");
      pageTracker._trackPageview();
    } catch(err) {}
  </script>
<% end %>

The second curve ball is that you need to render the first partial - the one that loads the Google Analytics library - at the top of your page, rather than the bottom as Google recommends in its documentation. I put mine immediately inside the body tag. Why? Because the first partial with an embedded tracker call to render on the page is going to try to call the tracking library as soon as the browser processes it, and if the tracking library isn't loaded yet, the call will silently fail (thanks to the try/catch block).

Now that the main page loads the tracking library, and the HTML snippets returned from the AJAX calls in turn call the already-loaded tracking library, each AJAX call is tracked as a page view on your Google Analytics report. Note that this solution is specific to AJAX calls that inject pre-rendered HTML into the page. If you need to track AJAX calls that deal with behind-the-scenes processing you should be able to simply make the same JavaScript calls you see in the second script; wrap them up in a helper function for convenience.
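
If you go that route, one way to package it up is a small Rails view helper that emits the same script block on demand. This is just a sketch - the helper name is made up, and it assumes ga.js has already been loaded by the main page:

# app/helpers/application_helper.rb
module ApplicationHelper
  # Emits the Google Analytics tracking call; pass an optional virtual path
  # (e.g. "/ajax/widgets") to label the hit in your reports.
  def track_pageview(path = nil)
    javascript_tag <<-JS
      try {
        var pageTracker = _gat._getTracker("UA-1377203-6");
        pageTracker._trackPageview(#{path ? "'#{path}'" : ''});
      } catch(err) {}
    JS
  end
end

Then a <%= track_pageview('/ajax/whatever') %> in the relevant template records the hit.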

Saturday, January 10

How to Launch 2 Sites in 20 Hours

Since I "became independent" exactly two months ago to the day, I've launched two pet project web sites: Pocket Rails and Rate Marina's Outfits. I tracked my time on each, just like I would for a billable client, and oddly enough they each took about twenty hours from inception to launch. Turns out it's pretty damn easy, and cheap too. Here's how I did it.

Get organized

I'm a "to do" list guy. Everything I do on a daily basis centers around "to do" lists. Whenever I think to myself, "I need to..." it goes straight on to the list. For these two projects I used Ta-da Lists. I created a new list for each project, and started adding "to do" items as I thought of them, and checked them off as I completed them. It's a great way to ensure you don't forget anything, nothing falls through the cracks, and it gives you a decent visual representation of your progress and how much you have left to do.

Get a domain

The first thing you need is a domain name for your site so people have a way to surf to it. You don't necessarily need a separate domain for every site; you can host several sites on a single domain via sub-domains. For example, I registered anachromystic.com for my company, then hosted one of my projects at marina.anachromystic.com.

I get all my domain names through GoDaddy. It's usually the cheapest, and it's convenient to manage them all through a single central service. If you plan on sending/receiving e-mail through the domain, I strongly recommend Google Apps for Business. It's dirt simple to set up, their tutorials cover every major registrar, and it requires no maintenance.

Get a host

As the name "pet" project implies, these sites are hobbies. They are not generating any money, and it's not critical that they be up all the time and fast to respond. So I went with the cheapest host I could find, DreamHost. Pull up Google and search for DreamHost promo codes and try all the ones you find. I ended up getting an entire year of hosting for about $20 (that's for the entire year, not a monthly rate).

Choose a platform (Hint: Use Rails, dummy)

Not only is Ruby on Rails the best platform for getting a site up and running quickly, you can get a head start with a "base" application like Bort which comes with a plethora of pre-shaved yaks including registration, e-mail activation, log-in, password reset, pre-configured routes, deployment scripts, etc. It's not perfect - I had to tweak it a bit - but it saved me hours of laying the groundwork and let me get to the meat of the project quicker.

Use hosted source control

Why hosted? First of all, it's essentially a cheap back-up of your work. Secondly, it makes it a lot easier to collaborate if you're working with other developers. There's a billion to choose from, and if you're willing to let other people see your code, they're free. I decided to make the source for Pocket Rails open but keep the source for Marina private (for now). I'm a huge git fanboy so GitHub was the natural choice for me. My open-sourced projects are hosted for free and I pay a measly $7 per month for the privilege of keeping some of them private.

Test all the fscking time

If you aren't test-infected yet, it's time to wake up. Testing demonstrates that what you've written works, and testing ensures that when you modify or enhance it you don't break any of the old stuff. Don't let yourself fall into the quagmire of "I'll add testing later, after I get everything working." You'll waste endless hours debugging issues that could have been prevented with preemptive testing. But don't take my word for it; take the word of Bryan Liles.

Also, on a somewhat related note to testing, use one of the plug-ins and accompanying services like Hoptoad or Exceptional to track and alert you when something breaks on your site.

If you want to get hard core, and why not, install the New Relic plug-in which will track and report on the performance of your application, so you can find out where the bottlenecks are.

Automate deployment

Get automated deployment working from the get go. Don't save it for the end. You should be able to deploy your site from your hosted source control to your hosting provider with a single command. Capistrano is the tool for the job if you're using Rails. Open up a terminal window and type "cap deploy" and watch it all unfold. If that one command doesn't do everything you need it to, make it! For example, I deploy a lot, and Capistrano doesn't clean up after itself automatically, so I hacked it to run the "clean" command after every "deploy". It also doesn't run database migrations by default, so I added the "migrate" task to "deploy" as well. When you can update your live site with one command, you'll sleep better at night.
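
For reference, with Capistrano 2 those hooks boil down to a couple of lines in config/deploy.rb - take this as a sketch of the idea rather than my exact file:

# config/deploy.rb
after "deploy", "deploy:migrate"  # run pending database migrations after each deploy
after "deploy", "deploy:cleanup"  # prune old releases so the server doesn't fill up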

If you build it, they will come

Well, they won't actually come until you promote it, but that's a later section. Once I had all the aforementioned steps in place, I buckled down and coded. My Ruby and Rails skills were a little rusty, so I'd occasionally have to visit a documentation site like APIdock, search for an issue on Google, or as a last resort post my problem on gist and tweet the link on Twitter (thankfully I have quite a few smart and helpful Rails guys following me).

Package your dependencies

Don't expect your hosting provider, or the next developer to work on your project, to have all the necessary third-party dependencies installed. Package them up with your application if possible. In Rails this is pretty simple: just declare the gem dependencies in your environment.rb file, then run rake gems:unpack to extract them locally. Do it for Rails itself too, and don't forget to add them all to source control.
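
As a concrete example - the gem names here are just placeholders - the declarations and commands look roughly like this:

# config/environment.rb
Rails::Initializer.run do |config|
  config.gem "httparty"
  config.gem "will_paginate", :version => ">= 2.3"
end

# then, from the project root:
#   rake gems:unpack          # copies the declared gems into vendor/gems
#   rake rails:freeze:gems    # vendors the Rails framework into vendor/rails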

Track it

If you want to know how many people are visiting your site, how many pages your site is serving, which page is the most popular, etc., you need to track it all. The quickest, easiest, and cheapest way I know of accomplishing that is Google Analytics. Create a profile for your site and they will generate a little snippet of JavaScript code to paste into your pages. Assuming you have a template that's common to every page on the site (like a header or footer), that's the logical place to put it. Google will track everything for you and give you some super slick reports.

Tell people about it

Once you've got your new site coded, tested, and deployed, it's time to draw people to it. People aren't likely to find it on their own, so you need to announce it. With Pocket Rails I first started tweeting about it on Twitter. This attracted a few visitors and provided some initial feedback. Once I'd ironed out a few kinks I shot an e-mail over to the guys at the Rails Envy Podcast explaining what I'd built. They seemed to like it and mentioned it on their show. That drove a sizable burst of traffic which quickly died off. A week later I posted the link to reddit for ruby hackers and holy macaroni, the site was deluged with visitors, and I began to see other people talking about it and linking to it thanks to tools like Google Alerts and Twitter Search. I added their RSS feeds to my news reader so I can keep on top of the chatter.

Finally, blog about it

Hey, my blog post about launching your site is recommending you blog about launching your site. How meta is that? It makes my brain hurt a little. But seriously, share your story so others may learn from it, as I hope you've learned something from my story. I'd love to hear your comments and criticisms, perhaps you'd have done something differently.

Friday, January 9

Rate Marina's Outfits

Yesterday I soft-launched my latest pet project, which I wrote on New Year's Eve of all nights, with graphic design graciously contributed by Allan Branch of Less Everything.

What is it? Well, for those of you too lazy to click the link, it's essentially a mash-up of YouTube and the Ajaxful Rating plug-in for Ruby on Rails that lets you rate the outfits worn by Marina Orlova on her popular webisode sensation HotForWords.

Why did I write it? To get rich and famous, of course, right? Ha, no. I wrote it because I've been interviewing for new gigs the last couple months and I keep getting asked for sample code that I've written. All the projects I've worked on for the last decade have been private and proprietary code bases, so I can't share them. Now I have something to share. For now I'm only sharing the code with potential employers, but I do plan to release it to the public in the very near future.

If you like what you see and you're looking for a solid developer (remote only, sorry, no relocation) please give my resume a peek. Thanks.