
Cognitive Offloading and the Productivity of Go

Minimizing The Time from Idea to Production

There are many steps in the making of software. Conceptually, we can organize them in stages of a metaphoric pipeline as idea, architecture, prototype, and production-ready product.

I’ve been writing software for over three decades and Go is the best tool I’ve ever had for getting from idea to production.

There are many small reasons and two big reasons for this kind of efficiency and productivity.

Go Is Inherently Productive and Efficient

The small reasons are fairly well documented, but some highlights:

Go is a compiled, statically typed language that feels more like a dynamic language than its peers. The syntax has some convenience sugar sprinkled in, but the bulk of the credit is due to the compiler. Primarily via type inference, the compiler is smart enough to enforce static typing with minimal developer hand-holding. Also, the compiler is faster than an ADHD squirrel marinated in Red Bull.
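That dynamic-language feel comes largely from the `:=` short declaration. A minimal sketch (the `describe` function is hypothetical, just for illustration) shows that the inferred types are concrete and static, resolved entirely at compile time:

```go
package main

import "fmt"

// describe relies on := declarations; the compiler assigns concrete
// static types at compile time, with no runtime reflection involved.
func describe() string {
	count := 42      // inferred as int
	ratio := 3.14    // inferred as float64
	name := "gopher" // inferred as string
	return fmt.Sprintf("%T %T %T", count, ratio, name)
}

func main() {
	fmt.Println(describe()) // int float64 string
}
```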

The standard library covers a large surface area for such a young language, and the overall ecosystem is flourishing.

Error handling seems overwrought and full of boilerplate, but my experience is that the idiomatic style of inline error handling makes programs faster and easier to debug. The end result is being able to zero in on problematic lines of code quickly, which reduces the overall time to solution. (Russ Cox talks about the philosophy of Go and errors here.)
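The inline style looks like this in practice (a hypothetical `loadConfig` helper, invented for illustration): every failure is handled at the call site, so the failing step is never ambiguous.

```go
package main

import (
	"errors"
	"fmt"
)

var errMissing = errors.New("config not found")

// loadConfig is a hypothetical helper: each error is checked inline,
// immediately after the call that can produce it, so a log line or
// stack trace points at the exact failing step.
func loadConfig(path string) (string, error) {
	if path == "" {
		return "", fmt.Errorf("loadConfig: %w", errMissing)
	}
	return "ok", nil
}

func main() {
	cfg, err := loadConfig("")
	if err != nil {
		fmt.Println("error:", err) // handled right here, inline
		return
	}
	fmt.Println(cfg)
}
```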

There are too many other small reasons to continue, but assume for now the language was crafted with productivity in mind.

Cognitive Offloading

Go’s primary advantage in facilitating fast time-to-product is a high level of positive cognitive offloading.

Making software involves quite a bit of mental juggling. You have to keep many disparate thoughts, concepts, requirements and goals in working memory simultaneously. The reason that Paul Graham coined the term Maker’s Schedule and the concept of half-day chunks is that typically, in order to write software, you need to load your working memory with the context of the problem you are solving and the existing state of the solution. This “ramp-up” takes time, and an interruption can wipe out a good chunk of that working memory.

Positive Cognitive Offloading can be thought of as a juggling partner you can hand off items to. If you trust them not to drop things, it frees your working memory for other items or allows you to juggle fewer items faster. Since there’s less to load, you move into the productive state faster.

Language features such as static typing, interfaces, closures, composition over inheritance, the lack of implicit integer conversion, defer, fallthrough, etc. all result in a compiler that tells you when your code is likely to be buggy. The lack of warnings enforces discipline on the weak, squishy, analog life-forms who would otherwise allow ambiguity to deploy to production.
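The integer conversion rule is a good concrete case. In this sketch (the `widen` function is made up for illustration), mixing an int32 and an int64 without an explicit conversion is a compile error, not a warning:

```go
package main

import "fmt"

// widen must spell out the conversion: writing `big + small` with
// mixed widths does not compile, because Go performs no implicit
// integer conversions -- the ambiguity is rejected outright.
func widen(small int32, big int64) int64 {
	return big + int64(small)
}

func main() {
	fmt.Println(widen(1000, 5000)) // 6000
}
```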

defer is an absolutely brilliant pattern that clearly illustrates this:

    file, err := os.Open("some.file")
    if err != nil {
        // don't forget to handle this!
        return err
    }
    defer file.Close()

    if X {
        return handleX(file) // early return: file.Close() still runs
    } else if Y {
        return handleY(file)
    }
    // otherwise whatever; Close runs when the function returns
The explicit offloading that occurs is that you don’t have to worry about if-else chains or intermediate returns. In practice, you can write code that needs cleanup without having to be constantly on alert for exit points or wrapping the code in a closure. As a matter of practice, the odds of leaving a dangling file handle are much lower when Close is near Open.

The time and energy required to keep code bug-free is lower with defer, allowing you to progress faster.

This kind of mindset is even more apparent in the toolchain. One of the nicest features of Go is the gofmt tool. By outsourcing all code formatting standards to a command line tool, a surprising amount of weight is lifted from the task of writing code. Time-wasting aside, the reduction of social friction, or worse, check-in ping pong, over coding standards makes the whole world feel a bit more civilized.

Other parts of the toolchain which reduce cognitive weight include vet (a lint-like tool), the test runner, and even the GOPATH mechanism, which enforces a high-level folder structure across all Go projects.

It’s also important to contrast negative cognitive offloading. If your juggling partner drops things occasionally, this is arguably worse than having no juggling partner at all. If your ORM occasionally produces poor SQL that takes down your database, suddenly your cognitive overhead every time you use ORM methods skyrockets, because you have to ensure your ORM code doesn’t negatively impact the system.

In Go, the current state of the garbage collector can potentially cause negative cognitive offloading, but the existing profiler, improvements to the GC itself, and some library additions in the upcoming Go 1.2 offer some relief. Needless to say, the other benefits far outweigh the cost.

Pipeline reversal penalty

One of the most common tasks a developer does is rewriting code that already exists. Thinking again about our metaphoric pipeline (idea, architecture, prototype, and production-ready product), rewriting code is essentially backing up through the pipeline.

There are many good and bad reasons to go in reverse, but there is always a short-term efficiency penalty to doing so. Good developers tend to offset that penalty by achieving longer-term benefits in maintainability, correctness and/or business goals.

Go, via intentional design and compiler implementation, has the shortest pipeline reversal penalty of any development ecosystem I’ve ever used. In practice, this means you are able to refactor more often with fewer regressions.

If you change an interface, the compiler tells you every single place that needs to be modified. Change a type and you are notified by line number everywhere your round peg no longer fits in the old, square hole.
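A sketch of what that safety net looks like (the `Store` interface and `memStore` type are invented for illustration): the blank-identifier assignment below is a compile-time assertion, so growing the interface immediately flags every type that no longer satisfies it.

```go
package main

import "fmt"

// Store is a hypothetical interface for illustration.
type Store interface {
	Get(key string) (string, bool)
}

type memStore map[string]string

func (m memStore) Get(key string) (string, bool) {
	v, ok := m[key]
	return v, ok
}

// Compile-time check. Add a method to Store (say, Put) and this line
// becomes a compile error for every implementation that lags behind --
// the compiler hands you the complete list of places to fix.
var _ Store = memStore{}

func main() {
	s := memStore{"a": "1"}
	v, ok := s.Get("a")
	fmt.Println(v, ok) // 1 true
}
```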

If you take advantage of unit testing and benchmarking infrastructure, then you are residing near the magical Stuff Just Works Zone™. Even if you are not a developer, it should also be obvious that Go codebases are more easily adapted to changing business requirements.

Not Perfect

There is some tarnish on the generally shiny Go. Most crashes in Go are due to nil pointer dereferences. John Carmack very concisely explains why:

The dual use of a single value as both a flag and an address causes an incredible number of fatal issues.

Something like Haskell’s Maybe type would be nice, or possibly some kind of Guaranteed-Good-Pointer type. In the meantime, there is the cognitive overhead of nil checking.
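That overhead in miniature (the `Node` type and `tail` function are hypothetical): every pointer is simultaneously an address and a "maybe absent" flag, so each hop needs a guard the compiler cannot write for you.

```go
package main

import "fmt"

// Node's Next pointer is both an address and a "maybe absent" flag --
// exactly the dual use Carmack describes.
type Node struct {
	Value int
	Next  *Node
}

// tail walks to the last node. The explicit nil guards are the
// cognitive overhead: forget one, and the program crashes.
func tail(n *Node) *Node {
	if n == nil {
		return nil
	}
	for n.Next != nil {
		n = n.Next
	}
	return n
}

func main() {
	list := &Node{Value: 1, Next: &Node{Value: 2}}
	fmt.Println(tail(list).Value) // 2
	fmt.Println(tail(nil))        // <nil>
}
```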

The concurrency model is great, but has a bit of a learning curve. If you identify a performance bottleneck, you end up implementing and profiling both the traditional way with mutexes/locks and the idiomatic way with channels and goroutines.

Some things that seem like negatives aren’t. The lack of generics is unfortunate, but the language designers are not willing to give up any of the other good stuff in Go in order to shoehorn them into the language. So far I’m convinced it’s the right decision. If they manage to pull it off in the future, their track record suggests that they will have found the right tradeoffs.

Net Win

Nearly two years ago, I said the following and I still believe it to be true:

Go is a tremendous productivity multiplier. I wish my competitors to use other lesser means of craft.

Go may not be for everyone, but there is more and more evidence that others are coming to similar conclusions.

One last point of interest: many of the above posts discuss how much fun Go is, and my own experience upholds this. Not just in programming but in any domain, better tools that reduce friction nearly always make the process more entertaining. It turns out that when you help people avoid some of the irritations in their craft, they have a more enjoyable time with it.

Many thanks to @jkubicek, @yokimbo and Daniel Walton for reading drafts and providing feedback.

in lieu of comments, you should follow me on twitter at twitter/amattn and on twitch.tv at twitch.tv/amattn. I'm happy to chat about content here anytime.
