Software developer education over the years

posted: July 26, 2020

tl;dr: The range of choices and paths grows as traditional higher ed retreats...

In at least one respect, life was simpler when I was in high school: if you wanted to get a professional job in the computer industry, you went to a traditional (usually residential) four-year college or university and majored in a field such as computer science, computer engineering, electrical engineering, mathematics, or perhaps a few other STEM subjects. Many of the larger technology companies offered student internships. During senior year they would recruit, interview, and make offers for entry-level professional positions to the graduating seniors. It was possible, inside these companies, for people to start out on the manufacturing floor or as a technician or as a customer support person, acquire skills, and become a professional software developer, but it was rare. The tech companies wanted to hire degreed engineers and computer scientists, and those from the premier universities were in the most demand.

That was one reason why it was a no-brainer decision for me to pursue an engineering degree at Cornell, but there were others. The price of Ivy League tuition was still affordable to middle-class families: it wasn’t until my senior year that annual tuition broke through the five-figure ($10,000) barrier. I had other interests besides computers, and wanted to learn more about the world in general. But the key reason for studying computers in college, as opposed to someplace else, was that the colleges actually had real, powerful computers.

I started college in 1982, a year after the IBM PC was first released. Personal computers at that time could accomplish some small-scale tasks, and I programmed one in high school for a business, but the computer industry was still very much dominated by minicomputers, mainframes, and even supercomputers. If you were a student interested in computers, you wanted to program on the most powerful systems in existence, which meant supercomputers. This was another reason I chose Cornell: it was a pioneer in supercomputer research, and was one of the National Science Foundation’s supercomputer centers. My high school had a grand total of four Apple IIe computers my senior year, a doubling of the number from the year before. So it was a thrill to go to Cornell and get the opportunity to study and use a DEC PDP-11, an IBM 370, and even more powerful specialized computers for graphics and numerical analysis. Even the Teraks were interesting.

Rhodes Hall on Cornell’s campus was built to house the supercomputer program, today known as the Center for Advanced Computing

Networking typically meant computer terminals attached to a minicomputer or mainframe, or dial-up access over a 300 or 1200 baud modem. The Internet was in its very early phases, served by several regional academic networks and by NSFNET, which grew into a major Internet backbone. NSFNET was originally built to connect the NSF's supercomputer centers, including Cornell's. So if you wanted to study, learn, and do computer networking, it helped to physically be at one of those centers. Marc Andreessen developed the NCSA Mosaic browser at one of those supercomputer centers, at the University of Illinois at Urbana-Champaign.

Online documentation of computer systems was minimal. If you wanted to study the guts of a computer, operating system, or programming language, your options were to take college classes, read a few textbooks and academic journals, and study the manuals racked in the computer centers. The only other ways to acquire technical knowledge about computers were to get hired into a computer company at a lower-level position, or to befriend someone with access to those systems at either a company or a university.

Today there are many alternative ways to study computers and software. The success of the computer industry itself has created many of these alternatives, thanks especially to the rise of the Internet and broadband wide-area networking, as well as the proliferation of powerful, inexpensive personal computing devices such as the MacBook I am using to write this blog post. Those alternatives include:

- Online courses and MOOCs
- Coding bootcamps
- Free documentation, tutorials, and video lectures on the web
- Contributing to open-source projects

Unlike fields of study such as biology, chemistry, and medicine, where much of the learning takes place in hands-on lab or clinical settings, computer education transfers quite nicely to a pure online learning environment. The alternatives above, combined with the far faster-than-inflation rise in college tuition, have completely changed the cost/benefit analysis that students perform today. I don't know what decision I would have made if I were graduating from high school today; graduating from college without having to borrow any money was definitely a goal of mine.

The campus of Green Mountain College, founded 1834 in Poultney, Vermont, is up for auction this summer

Times have changed. I’ve worked with some very productive people who acquired their technical skills through the alternative routes listed above. As college tuition becomes ever higher, more people will turn to the alternatives, either out of preference or necessity. Traditional residential colleges and universities were already feeling the squeeze before the COVID-19 pandemic; the pandemic will accelerate the rise of alternatives, especially since colleges themselves are having to turn to the alternatives. There will probably always be a top tier of expensive universities performing leading-edge research and educating the next generation of academics, plus large state universities that deliver a college education in a much more cost-effective way. I think that many of the colleges and universities that don’t fall into either of those two categories will suffer the fate of Green Mountain College and Marlboro College, to name just two colleges close to where I grew up whose campuses are for sale now.

Related post: Tech company hiring practices over the years