posted: May 11, 2019
tl;dr: The full stack concept has been around for decades, but the definition has changed considerably...
Although I don’t recall the term being used at the time, the concept of being a full stack developer has been around since I was in college in the 1980s. Back then it meant being able to design the hardware of a computer or computer-based system and also being able to write all the software that ran on that hardware. A desire to learn as much as possible about the entire stack is one of the key reasons I chose the major and courses that I did.
Early in my career I worked on several full stack projects, by that definition. One of them was the Inmar echoBOX, where I got to design the hardware and then write all the embedded software in assembly language. I enjoyed being able to work on both the hardware and software, but even then it was an exaggeration to say I was working at all layers of the stack. I hadn’t designed the integrated circuits, in particular the 8-bit microprocessor. Some other team of developers at Motorola was responsible for that key layer of the overall stack.
I got an inkling of what was happening to the hardware design profession back then: software was creeping into hardware design, to handle the increased complexity and component density driven by Moore’s Law. Hardware circuit design was moving from using pencils, paper, and templates to draw schematics of interconnected individual components, to using computer-aided design tools. As an intern at Teradyne, I was one of the guinea pigs who tested the first schematic capture system the company had purchased. My project was non-critical, so they didn’t care if it failed due to tool issues, but it succeeded, which opened the eyes of the more experienced designers. It was time to put down their pencils.
In my second stint at Teradyne I wrote code for Teradyne LASAR, a pure software-based logic simulation tool that simulated the timing of large custom digital circuits to see if they would work before being built. The number of transistors on integrated circuits was growing exponentially thanks to Moore’s Law, so chip design shifted to describing hardware circuits in software, using hardware description languages such as Verilog and VHDL. Even printed circuit board design shifted into the software realm with automated layout tools and auto-routers. Over time, in the hardware realm, it became impossible for any one person to design all the layers of anything but the simplest products.
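To give a flavor of what a logic timing simulator does, here is a minimal sketch in Python of the core idea: an event-driven, gate-level simulation where each gate has a propagation delay, and input changes schedule output changes on a time-ordered event queue. The two-gate circuit and all the names here are hypothetical illustrations; LASAR itself was a far more sophisticated tool.

```python
import heapq

# A hypothetical circuit: an AND gate feeding an inverter.
# Each entry maps an output signal to (logic function, input signals, delay in ns).
GATES = {
    "n1": (lambda a, b: a & b, ("a", "b"), 2),  # AND gate, 2 ns delay
    "y":  (lambda a: 1 - a, ("n1",), 1),        # inverter, 1 ns delay
}

def simulate(initial, stimuli):
    """Event-driven gate-level timing simulation.

    initial: dict of starting signal values.
    stimuli: list of (time, signal, value) input changes.
    Returns signal values after all events have settled.
    """
    values = dict(initial)
    queue = list(stimuli)          # min-heap ordered by event time
    heapq.heapify(queue)
    while queue:
        t, sig, val = heapq.heappop(queue)
        if values.get(sig) == val:
            continue               # no change, so no downstream events
        values[sig] = val
        # Propagate the change to every gate that reads this signal,
        # scheduling the gate's new output after its propagation delay.
        for out, (fn, ins, delay) in GATES.items():
            if sig in ins:
                new = fn(*(values[i] for i in ins))
                heapq.heappush(queue, (t + delay, out, new))
    return values

# Drive both inputs high at t=0: the AND output rises at t=2,
# and the inverter output falls at t=3.
final = simulate({"a": 0, "b": 0, "n1": 0, "y": 1},
                 [(0, "a", 1), (0, "b", 1)])
```

Real timing simulators of that era also had to model rise/fall asymmetry, minimum and maximum delays, and unknown states, but the event-queue core is the same basic shape.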
On the software side, complexity also grew dramatically over the years. When I started my career it was still possible to write your own operating system for an embedded system entirely from scratch. Even off-the-shelf real-time operating systems like VxWorks were initially pretty simple to use. There wasn’t much to MS-DOS, and early Windows application programming was straightforward, using the Microsoft C/C++ compiler and some operating system calls for managing the user interface and storing data on disk.
Then came networking and the Internet, which grew into a huge new layer of its own for anyone who really wanted to understand the routing protocols. Operating systems bulked up to add networking functionality. Different application frameworks appeared. The Web came along, and then web applications, with Hotmail as one of the first. This created two new specialties: developing the client side of web applications in the browser, and developing the server-side functionality back in the cloud.
Being a full stack developer has always been more of an ideal than a reality. That’s to be celebrated, actually: it is a sign of success of the industry. The body of knowledge has grown far beyond the capabilities of any one person to master it all.