Have you ever reused code?
The term ‘Code Reuse’ feels like a software developer’s cliché that has since joined the list of unfashionable tech lexicon. Nevertheless, the terminology lingers on like a bad smell, never quite dying off. These days, code reuse feels more like a myth - a story everybody has heard, but nobody has witnessed.
If you are ever geeky enough to raise code reuse as a conversation piece, you’ll probably notice that almost everybody has something good to say about it, from a vague sense that it is a good thing, to how it may have profoundly changed a person’s life (ok, I exaggerated about this one). But ask anybody for five good examples, and I’m sure you’ll be hard pressed to find a sensible answer. How about we start with yourself: when was the last time you reused your own code in a meaningful, substantive way?
These days, the only visible code reuse I know of happens when I rely on code from a software library - often an external library written by somebody else. Be it a data structure, a fancy graphical widget, or a complex mathematical computation, there is probably a library out there that caters to your need. Writing from scratch is something you never seem to do anymore.
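To make this concrete, here’s a small illustrative sketch (in Python, purely as an example) of what this kind of reuse looks like in practice: rather than hand-rolling a binary heap, you reach for the standard library’s `heapq` module and move on with your day.

```python
import heapq

# Instead of writing a binary heap from scratch, reuse the
# battle-tested one shipped with the standard library.
tasks = []
heapq.heappush(tasks, (2, "write tests"))
heapq.heappush(tasks, (1, "fix the build"))
heapq.heappush(tasks, (3, "update docs"))

# Items come back in priority order - no bespoke data structure needed.
print(heapq.heappop(tasks))  # (1, 'fix the build')
```

The task names here are made up, of course; the point is that the "reused" code is somebody else’s, not yours.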
But relying on software libraries is just not my romanticised version of code reuse, the one the object-oriented programming paradigm promised so long ago. Remember the textbook claims about writing your own well-abstracted objects, and how you’d be rewarded by reusing them in perpetuity? Personally, that lofty promise has fallen short of my expectations, from the starry-eyed kid coding in OOP for the first time to the more experienced software developer I am today.
So what went wrong? Nothing, actually.
Code with well-defined purposes, inputs and outputs - the kind that is needed often - is easy to specify, and hence usually gets ‘factorised’ into code libraries. These libraries get battle-tested by many other developers over time, ironing out any residual kinks and lingering bugs. Over time, a well-used library makes more sense to adopt than to roll your own, since it minimises the risk and uncertainty of newly introduced code.
So whatever’s left for you to work on is likely a new and unique problem, which makes it naturally unfactorisable. And if certain portions of your code do become apparent enough for you to find a commonality, that’s perhaps when you’ll refactor your own code to reuse it - though I suspect such situations are becoming less likely. Maybe, like me, you’re feeling a little cheated as well.
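That kind of small-scale refactoring is about the only self-reuse most of us still do. A hypothetical sketch (the handler names and fields are invented for illustration): two functions share the same validation steps, so the common part gets extracted into a helper that both reuse.

```python
def _require_fields(payload, fields):
    """Shared validation, factored out once the duplication became apparent."""
    missing = [f for f in fields if f not in payload]
    if missing:
        raise ValueError(f"missing fields: {missing}")

def create_user(payload):
    _require_fields(payload, ["name", "email"])
    return {"action": "create", **payload}

def update_user(payload):
    _require_fields(payload, ["id", "email"])
    return {"action": "update", **payload}
```

Modest as it is, this is reuse within a single codebase - a far cry from the grand, cross-project object reuse the textbooks promised.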
Code reuse today is just a euphemism for relying on other people’s code
- well, it is still reuse, just not of your own code - unless you happen to be a software library writer. But chances are, you are not.
I might as well go one step further and declare that we never reuse our own code anymore - as a corollary to the famous Bikeshed Problem. Not all of us will gain sufficient experience building our own nuclear reactor (or more efficient data structures and algorithms), so all that remains is to focus on the colour of the bikeshed (or button placement within an HTML form), because that’s the only thing left to do when other people have done all the heavy lifting for you. And that’s how it should be - after all, didn’t they tell us not to reinvent the wheel?
It’s why any boy and his dog today can write an application with some knowledge of HTML, CSS and Javascript. Nobody needs to know how to code a rasteriser that transforms vectors into pixels, write their own graphics routines just to display a button or an input, or implement their own hash table in order to use a hashmap. They don’t have to - the first principles of software systems are all conveniently abstracted into libraries, frameworks, and easy APIs they can use.
It is not a bad thing, but it is also no wonder why any arts major can simply write a web application and proclaim themselves a software developer these days. While I wouldn’t mind them building a webpage for me, I won’t go so far as to trust a lay coder with anything of real algorithmic complexity.
On the flip side, it’s never been a better time to be a software developer; we are more productive thanks to the assortment of libraries at our disposal, from the myriad software frameworks to the numerous tools we utilise today - all of which make software systems that would have been difficult to build in the past a relative breeze today.
As software development goes these days, we are indeed standing on the shoulders of giants.