In Which The Author Conducts A Technical Interview On The Shoulders Of Giants

I was in the gig economy before the gig economy was cool, you know. In the past two decades I’ve been a consultant, a third-party service provider, a contributor, a self-employed technocrat, and just a plain old temp. Sometimes it’s been remarkably lucrative and often it’s been remarkably depressing. Sometimes both at once.

Last month I said goodbye to the last of my technical support customers and took a new full-time contract. I’m surrounded by very nice people there. The commute is short and the parking is free. It’s a little hard not to get depressed when I’m sitting in my cube and seeing all my various journo-friends and journo-foes traveling the world and enjoying all the perks that the business can provide. The best I can tell myself is that Hawthorne started his adult career in the Custom-House and Melville spent his final years there. At the end of every day I do not worry about whether I sold my integrity cheaply or whether I failed to fight for the truth on any given subject. Such things have no meaning or relevance in the eternal late-fall twilight of that seventy-four-degree fluorescent office building.

I make no decision on any subject beyond the technical. There are a thousand dreams and ambitions in that building and there are people who are compensated to a truly stunning degree and there are people who spend their lunch hours in worried dialogue about bills and childcare but it matters not to me. I show up in the morning and I do my work and I leave and by the time the oil is warm in my CB1100 on the road home all thoughts of the job have slipped from my mind with that same light viscosity.

This past Wednesday I assisted one of the fellows in the office in conducting a short technical interview. It was my intention to ask a couple of bland questions and shut up, but things got a bit out of hand.


My first thought as I read the applicant’s resume: I really am older than I realize. He had more than a decade’s worth of experience in the business and two degrees from our decidedly non-prestigious local quasi-universities but he was still fresh-faced and he was sufficiently excited about “neat tech stuff” to include a link to his personal tech blog in the header, right below his address but above the link to his programming repository on GitHub.

For those of you with no interest in modern computing beyond that of the consumer — and I know that’s most of you here — I should give a brief description of how it works. Once upon a time, every company had a “mainframe”. This was a massive computer that was “time-shared” across thousands of tasks. You accessed the mainframe through a terminal, which was like a typewriter that could also type back. Eventually video terminals became common. The last place to get rid of video terminals was probably the ticketing counter at your local airport, by the way, because speed and direct connectivity are important when you have thousands of reps making changes in the system at once.

Mainframes were specialized beasts that required round-the-clock care and feeding by a group of specialized operators. They could be remarkably difficult people — turn down the volume on your computer and check out The Bastard Operator From Hell to get a fictional, but reality-based, idea of how it used to be.

From mainframes we went to VAX and UNIX systems. From there we went to so-called personal computers which became bigger and more powerful as the years went on. Eventually we got “servers”, which are computers that share a design with your laptop or your desktop computer but which are vastly more capable. Then we got “enclosures”, which have sixteen or so servers plugged into a single network and storage backbone. When I worked for Honda I was tasked with keeping the enclosures alive, which was remarkably difficult because they were hugely flawed and massively fussy machines.

The current trend in computing is virtualization. Each server runs simulations of multiple servers inside itself. When you hear about “cloud computing” and stuff like that, we are talking about virtual computers that are actually programs running on a real computer. There are pros and cons to this method.

For a while, people were satisfied with virtual computing. But then somebody had the bright idea of adding another layer of abstraction, called “containers”. Each virtual computer can run many “containers” and each container is a partial computer with a single task. Maybe it’s a Web server. Maybe it’s a game host. Maybe it’s a database query engine. But the idea is that each container is separate from the others and cannot “contaminate” the other containers.
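
If you’re curious what spinning up a couple of these things actually looks like, here’s a minimal sketch in Python using the Docker SDK; the image names are just examples, and you’d need a Docker daemon running locally for any of it to work. The point is how little ceremony is involved: one call, one isolated container, one job.

    # A minimal sketch, assuming the docker-py SDK and a local Docker daemon.
    # Each call starts an isolated container with exactly one job to do.
    import docker

    client = docker.from_env()

    web = client.containers.run("nginx:latest", detach=True)   # a web server
    db = client.containers.run(
        "postgres:15",                                          # a database
        detach=True,
        environment={"POSTGRES_PASSWORD": "example"},
    )

    # Two containers, each walled off from the other's filesystem and processes.
    print(web.short_id, db.short_id)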

So. Each enclosure hosts many servers. Each server hosts many virtual servers. Each virtual server hosts many containers. You’ve seen this all before.

The question is: why? Why do all of this? What’s changed over the past thirty years to require the nesting-doll configuration? If you ask the average marketing-bot jerkoff from LinkedInLand he will tell you a bunch of crap about scalability and convenience and whatnot but the real reason is this:

There are no more skilled operators, and even if there were skilled operators, nobody would hire them.

The operators, and their descendants known as “professional sysadmins”, were unpleasant people with strong signs of autism spectrum disorder who rarely had any respect for any other aspect of the company, particularly anything having to do with sales, HR, and marketing types. I know because I’ve been one of them. We viewed our job as preserving the environment. Period. Point blank.

A good team of operators could accomplish miracles. They could run airlines and banks with computers that couldn’t hold a candle to the iPhone in your hand in terms of power or speed. But woe betide the company executive who made plans and expected that the operators would bend to his whim the way the people in the marketing department or the sales department or the cafeteria did. We did not work that way. We served the machine, and we knew what the machine needed. Simple as that.

This philosophy was efficient and it was cost-effective but it did not account for the American corporate ego. So it was replaced by virtualization. With virtualization, and with containers, nothing has to be done right. You just throw hundreds or thousands of identical containers at the problem. Is your website slow because the code is lousy and the database is junk? Add containers. Do your agile scrum programmers write spaghetti that destroys systems and sucks resources? Just press a button and a thousand extra containers will solve the problem. The containers are standardized images, knocked out with no thought as to the efficiency or elegance of their operation. They solve problems the way Grant won the Civil War — through attrition.
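
Here, in a hedged little sketch using the official Kubernetes Python client, is what “press a button and a thousand extra containers will solve the problem” boils down to. The Deployment name is hypothetical and you’d need a working kubeconfig; the point is that nobody has to understand the slow code in order to multiply it.

    # A sketch of the "just add containers" reflex, assuming the kubernetes
    # Python client and a valid kubeconfig. "my-slow-webapp" is a made-up
    # Deployment name, used purely for illustration.
    from kubernetes import client, config

    config.load_kube_config()
    apps = client.AppsV1Api()

    apps.patch_namespaced_deployment_scale(
        name="my-slow-webapp",
        namespace="default",
        body={"spec": {"replicas": 1000}},  # don't fix the code, multiply it
    )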

The young man sitting in front of me for this interview was a demonstrated master of containers. He had recent and significant experience with all the trends and fads in containers, all the tools to manipulate them, all the environments in which they could be deployed. He’d worked with every kind of container-related tool or enhancement you could imagine. When he spoke, it was with an effortless flow of concepts and buzzwords related to containers. I was pretty sure that I could learn a lot about containers just by hearing him respond to difficult interview questions, but it seemed only fair to start with the easy ones. He said that he “loved Linux”, so that’s the direction I took.

“What’s the difference between Open Source and Free Software?” This central distinction is the most critical aspect of any discussion regarding Linux. But he didn’t know. He didn’t know the canonical definition of Open Source, which is that the programming code is available. He thought it meant that there was a community for a given program. Regarding Free Software, he’d never even heard the term.

“Okay… What are the seven OSI layers?” Every sysadmin knows what they are. You need to understand them to perform even the simplest network troubleshooting. But he didn’t know.
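
For the record, and for the civilians reading along: the seven layers, from the bottom up, are physical, data link, network, transport, session, presentation, and application. Here’s a throwaway Python cheat sheet, nothing authoritative, just the sort of thing every old sysadmin carries around in his head:

    # The seven OSI layers, bottom to top, with a rough example of what lives
    # at each one. A reference card, not a rigorous taxonomy.
    OSI_LAYERS = {
        1: ("Physical",     "cables, radio, electrical signalling"),
        2: ("Data Link",    "Ethernet frames, MAC addresses, switches"),
        3: ("Network",      "IP addresses and routing"),
        4: ("Transport",    "TCP and UDP, ports"),
        5: ("Session",      "setting up and tearing down conversations"),
        6: ("Presentation", "encoding, compression, encryption (roughly)"),
        7: ("Application",  "HTTP, DNS, SSH and friends"),
    }

    for number, (name, example) in OSI_LAYERS.items():
        print(f"Layer {number}: {name:<12} e.g. {example}")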

“Okay… Can you give me a sense of what’s in a packet?” He smiled and said, “Gosh, someday I’d like to learn that.” Ten years ago, if you couldn’t at least describe a packet in generic terms you couldn’t get a job as a $10/hr PC tech — but this kid was about to earn seven or eight times that.
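
For the curious: a packet is just a small parcel of bytes with a few headers stapled to the front of the data. Here’s a rough Python sketch that pulls the interesting fields out of an IPv4 header; the hex string is a made-up example rather than a real capture, but the field layout is the real thing.

    # What's in a packet, roughly: the first 20 bytes of an IPv4 header carry
    # the version, the time-to-live, the protocol, and the source and
    # destination addresses. The bytes below are an illustrative example.
    import socket
    import struct

    raw = bytes.fromhex("45000054abcd40004001b1e6c0a80001c0a80002")

    (version_ihl, tos, total_len, ident, flags_frag,
     ttl, proto, checksum, src, dst) = struct.unpack("!BBHHHBBH4s4s", raw[:20])

    print("version:    ", version_ihl >> 4)
    print("ttl:        ", ttl)
    print("protocol:   ", proto)                  # 1 = ICMP, 6 = TCP, 17 = UDP
    print("source:     ", socket.inet_ntoa(src))
    print("destination:", socket.inet_ntoa(dst))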

“Can you explain the difference between SIGHUP and SIGKILL?” There was a time when you weren’t allowed to even use a video terminal without knowing those signals. But he gave me a genial response about how he’d always been confused by that.
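
The short version, since he couldn’t give it: SIGHUP is a polite tap on the shoulder that a process may catch and act on (daemons traditionally reread their configuration when they get one), while SIGKILL cannot be caught, blocked, or ignored; the kernel simply ends the process. A quick Python illustration, Unix only, using nothing beyond the standard library:

    # SIGHUP can be trapped; SIGKILL cannot. Run this, then try
    # "kill -HUP <pid>" versus "kill -9 <pid>" from another terminal.
    import os
    import signal
    import time

    def on_hup(signum, frame):
        print("got SIGHUP, rereading the config (or at least pretending to)")

    signal.signal(signal.SIGHUP, on_hup)       # legal: SIGHUP is catchable

    try:
        signal.signal(signal.SIGKILL, on_hup)  # not legal: the OS refuses
    except OSError as err:
        print("cannot trap SIGKILL:", err)

    print("my pid is", os.getpid())
    time.sleep(60)                             # wait around for the signals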

I decided to switch up a bit and ask him a question about “pods” in the container program known as “Kubernetes”. He was articulate, knowledgeable, and correct, offering an explanation that surpassed my own understanding of the products. A few more questions showed that when it came to modern computing, he was spot-on. What bothered me was that he had no, and I mean zero, understanding about how computers actually work.

No, scratch that.

What bothered me was that he had no understanding about how computers work — and it doesn’t matter. I recommended him for the job without hesitation. I think he’ll be better at this particular gig than I am.

When I started programming a TRS-80 in 1980, it was expected that any potential or aspiring computer user would acquaint himself with the innards of the machine, from the assembler language used for programming it to the individual microprocessors under the hood. As one of the early Linux sysadmins and advocates in the nineties, I had a working knowledge of the C programming language that was used to build UNIX. Without it, I would have been unable to understand and troubleshoot many of the problems I encountered. Even as recently as 2012, my mildly intimate knowledge of networking was absolutely crucial to solving some very thorny problems that were costing my client millions of dollars an hour.

In 2017, that no longer matters. Turns out that while the so-called geeks and nerds weren’t looking, there was a war being fought. The purpose of that war was to take control of the environment away from technical people and put it into the hands of non-technical people. Nowadays, you order computing resources the same way you order any other bulk product for your corporation. It costs more than it used to, and it doesn’t work quite as well, but what matters is that you’ve freed yourself from the tyranny of the operators in the basement.

Instead of having mysterious computers controlled by people who often literally referred to themselves as “wizards”, you have IaaS, Infrastructure As A Service. You turn a crank and the computing stuff comes out, like meat. It’s flavorless and characterless and it’s only just as good as it needs to be, but you can adjust the volume of the delivery and it’s all done through control panels on a webpage. Yes, there are still wizards out there, but they work at the infrastructure companies and they are sourced from overseas if possible so the balance of power is on the side of the contracting company that holds their visa.

Sitting at that interview table, I had a flash of self-understanding. I realized that the reason I’d always cared so much about hand-crafting in suits and shoes and bicycles and everything else was because I saw myself as an artisan of sorts, crafting delicate and masterful solutions in code or language. Most of it would go unnoticed by my users or my readers, but every once in a while a fellow sysadmin or classically-educated reader would recognize what I’d done and I would have that thrill of knowing that my efforts to provide the best work possible had been recognized.

Well, those days are mostly over. Nobody gives a shit about optimizations or elegance in computing anymore. Not when ten thousand dollars a month can get you more processing power via IaaS than the entire world had in 1980. We stand on the shoulders of giants, at the apex of the work done by a million brilliant men and women that, in the end, allows us to deploy an insurance-shopping app more quickly by pressing an infrastructure button a few more times.

The end is likely coming for excellence in writing in general, and autowriting in particular, as well. AutoWeek recently did a “30 under 30” issue that showcased some utterly brilliant photography and some mind-numbingly poor articles that wouldn’t have passed muster in a high-school composition class circa 1985. If this is the best of what the next generation has to offer then we might as well all just call up YouTube and let the various idiots there just defecate directly into our brains. I remain utterly gobsmacked by just how bad these Millennial writers are. The vast majority of them are functional illiterates whose internal library of learning, wit, and allusion contains nothing but an endless vista of dust-caked empty shelves.

There isn’t much I can do about the first of these two situations. Whatever chance I had to alter the direction of computing disappeared when Spindletop fell apart after September 11, 2001. I was affiliated with those fellows and I thought we were going to make a difference but in the years that followed I decided I’d rather make a buck, a decision by which I continue to stand.

With regards to my latter complaint, however, I believe I shall stop cursing the darkness long enough to light a candle. It’s my intention to conduct an online workshop for young would-be autowriters over the upcoming winter. There’s plenty of talent out there; it only requires a bit of direction, a touch of guidance. I do not yet know how thorough or extensive such a workshop should be, nor the precise form which it should take, but something must be done and I’m just the man to do it. Confined though I may be to the daily grind, there is no reason why I cannot grind exceeding fine, so to speak. Stay tuned.

78 Replies to “In Which The Author Conducts A Technical Interview On The Shoulders Of Giants”

  1. Ronnie Schreiber

    Corporate America has decided that B work is good enough. How much that has to do with political correctness and HR-driven diversity stuff is open to debate, at least where they still allow debate on such things; you can’t say that while working for Google.

    Of course, the schools are putting out such ignoramuses that lately, a couple of times when I’ve handed a cashier a big bill and some coins so I can get just bills back in change, they get confused.

    It doesn’t really bug me that autowriters can’t write. What Zappa said about rock journalism probably applies there too. The Setrights and Baruths are the exceptions, let’s remember that. Still, it annoys the hell out of me that idiots like Jim Acosta and Brian Stelter can make six figure salaries blathering on CNN, and don’t get me started on the $600K that NBC paid Chelsea Clinton during her stint as a journalist.

    My father, alav hashalom, taught me and my siblings that no matter what we did, we had to do the best we could. Does anyone still think like that?

    I was in Mike Kleeve’s shop, Automotive Metal Shaping, near Port Huron, watching him work on the body Stewart Reed and his students designed for Peter Mullin’s Bugatti 64 chassis. Can someone who thinks they can manage by “containers” appreciate Ettore and Jean Bugatti’s passionate engineering? I was there for the better part of a day and some day I’ll publish something with both the 90% completed body and the finished project at the Mullin Museum, but I’m still not sure that I can do Mike and his team justice. For most of the time that I was there, one of Mike’s techs was using an X-Acto knife to shave the recess for a piece of chrome trim on the headlight surround of a Zagato-bodied Aston Martin DB4. He spent hours making sure that the trim fit perfectly flat.

    It’s not just handcrafted and bespoke stuff that’s disappearing. Well made manufactured goods that last longer than a couple of years and can also be repaired are disappearing. Just try to get your new whizbang clothes washer fixed in 5 years. Thank you but I’ll keep using a 30 year old machine that can be repaired.

    • Disinterested-Observer

      I pilfered an original “Swing-a-way” can opener from a hotel 20 years ago. The rivet has only just started to loosen. Contrast that with the shitty, made in China, Oxo crap ubiquitous at China, China, and more China I mean Bed, Bath, and Beyond. Of course Swing-a-way isn’t even made in the US anymore.

    • hank chinaski

      They prefer B work because they don’t want the baggage that comes with A work, whether it’s the basement coding wizards, the old school chief surgeons or the Sully-class airline captains. You are to be compliant, replaceable and powerless, else branded ‘disruptive’.

      • Ronnie Schreiber

        I’ve told bosses and potential bosses in job interviews, “Don’t hire me because you think I’m smart and then expect me to be stupid when you’re wrong.”

        Do I have to say that I’ve been fired from just about every job I’ve ever had?

    • McG

      Corporate America has decided that B work is good enough.

      In non-STEM fields at least, they kind of had no choice, given what’s been coming out of the “legitimate” diploma mills the last two or three decades.

      (Full disclosure: my own non-STEM degree dates from 27 years ago. I still think I learned very little in college, but graduated with a B average based on what I picked up outside of class.)

  2. Kevin Jaeger

    Your reminiscing about the good old days of computing and its evolution is interesting but just a sign that you’re getting old. It’s really just the constant automation of routine tasks and every generation rails against the advancements as inefficient and a failure to respect their skills.

    When high level programming languages were developed the assembly language programmers railed against them as inefficient and swore they could write better code than any compiler could put out. But very few people need to know anything about assembly language any more. The same evolution has happened to filesystems, databases, networking, operating systems and clustering. It’s true that you could optimize the protocols and computing platform of a modern application like Ebay or Amazon and run it all on a fraction of the hardware and bandwidth but literally no one is interested in doing anything like that.

    Once upon a time we valued the skills of a mechanic who could adjust a carburetor and tune the timing of a distributor but sensors and an ECU have automated that process and made such skills obsolete. The same process is happening to your sysadmin, DBA and low level programming jobs and no one will miss them when they’re gone.

    • Sseigmund

      There is still a place for a “skilled mechanic” even if carburetors are a thing of the past. It’s called ECU calibration, and it’s something of a black art; virtually no car owner would dare to venture into the dark world of hooking up a laptop to the OBD port. Why fear it? I find it very frustrating that the traditional performance magazines are still living in the pre-fuel-injection, pre-ECU era and simply do not discuss calibration. They may figure the readers are not interested or not technically capable enough to understand the subject matter, but I suspect the writers are the ones with a fear of the new and different. I used to love Hot Rod and Popular Hot Rodding, and nothing seemed to be off limits in terms of tuning and modification. Yes, cars were simpler back then, but in the context of the technology of the day they were no different than cars of today, relatively speaking. The once-interesting Engine Masters Challenge has virtually eliminated the LS engine family and the Gen III HEMI following the assimilation by TEN: The Enthusiast Network in 2014. The lineup in the last EMC was mostly a bunch of dinosaur engines, many of which were out of production when I was in high school. Even NASCAR is having a hard time living in this automotive past, and has had to allow radical new technologies like, OH MY GOD, fuel injection!!! Even an old bastard like me can be pretty handy with a laptop and some information. Just show me how! I’m getting older but I’m not stupid. There is a big opportunity in this arena, and somebody needs to write about it.

      • Kevin Jaeger

        That’s true, calibrating an OEM ECU or replacing it with an aftermarket ECU like Megasquirt is certainly a valuable skill, but it’s limited to some pretty specific applications. It’s pretty rare a typical mechanic would be involved in stuff like that unless he’s dealing with enthusiast or classic car restomods.

        Sometimes the old skills become even more valuable in a much smaller niche.

    • Jack Baruth (Post author)

      Believe it or not, I was going to discuss precisely that but I didn’t want to bump up against the 3,000 word TLDR restriction that affects even the most committed reader on the Web nowadays.

      The problem with what you’re saying is that much of this inverted pyramid relies on really old stuff, like the GNU version of ‘ls’, and without anybody who knows how to work on it we are headed for an odd version of Idiocracy where people will keep sending API calls and saying “IT DON’T WORK.”

  3. digitaleopard

    There are still a few places where you get to be a craftsman in the industry, but it’s mostly in highly specialized areas, and they get commoditized as fast as possible. The writing workshop sounds interesting, keep us posted.

  4. phr3dly

    This post strikes a nerve for me. I am acutely aware that, as a mid-40s developer who still writes code (often in Perl) that is used internally and runs on Linux for years without a restart, I am living on borrowed time. Heck, the Synology NAS I use at home has container support via Docker.

    And yet I find that, at my age, I really don’t care to learn a significant new technology. I wish I did, but I’d rather do a track day or ride my mountain/road bikes.

    So instead I hope that I can continue to milk my legacy code for another 5 years or so before I can comfortably retire. Some days I imagine retiring now and trying my hand at mobile development or what not. But I don’t have the energy for that stuff that I did when I was 25.

    • Rod Jones

      Mid 40’s and you don’t have the energy? Hell, I’m about to turn 65 and I can run circles around most 30-somethings. One of my best friends is 67 and he still does top secret programming for satellite systems and other govt things as a contractor. That is something I could never do but he could never build tall commercial buildings like I do.

  5. Norman Yarvin

    Efficiency has indeed largely gone away as a reason for not building your project on top of a leaning tower of other people’s software, but security still applies. When someone is trying to break in, those details under the hood suddenly start to matter. Of course who needs security, these days? Certainly not the DNC — not when they can just run crying to the press and put all the blame on the big bad hackers.

  6. phr3dly

    Forgot to mention: in the last 6 months a really bright younger coder has been approaching us about implementing our code as microservices in containers…

  7. Sseigmund

    Jack, are we still going to see your name in R&T, TTAC, Hagerty and other publications in the future?

    FYI, I have no ambition to be an autowriter, but I would be onboard for your workshop to sharpen up my writing skills. The effort will be greatly appreciated.

    • Jack Baruth (Post author)

      I certainly hope so.

      I’ve always worked at least three jobs and 70-plus hours a week; that’s not going to significantly change until I run out of employability or my son no longer needs money. 🙂

  8. Nick D

    That reminds me I need to update the story I sent in as it’s a bit stale.

    While I’m not a programmer, my business school required everyone to take a 3-credit-hour Excel programming course – the most practical and valuable 3 hours I’ve ever spent. One project involved setting up a fairly complex set of nested logical operators. I spent about twice as long, but wrote about 1/3rd as much as everyone else. I didn’t get any extra credit for it, and my instructor said I was the first one to lean out the solution, but did send me a Gmail invitation when those were a hot commodity, which was nice.

    The brute-force-ization of everything and boiling away of common sense creates real downstream effects in my line of work. We’re heavily regulated, and our crack team of compliance (broadly speaking) experts always believe a new, poorly written, ambiguous, internally conflicted policy solves everything. ‘If the rule says X, then we can say X^10 to be 10x better!’ (see what I did there?) is the prevailing philosophy.

    Except when they have to deal with the fruits of their policy-spewing labors and realize they need 1,500 temps to do meaningless paper shuffling that neither enhances any sort of compliance nor adds any value to anyone and is solely necessary because of the lack of a few hours of critical thought on the front end.

    One of the things I really like about racing is learning how to understand, diagnose, and fix (not just replace parts, although that is sometimes the most expedient way to address a problem in the middle of an enduro) a semi-modern vehicle. OTJ learning and thousands of hours seem to be the only way to gain proficiency. I will never approach that, but with a few hundred hours I hope to be able to teach my boys how to twist a wrench, and most importantly, just ask “why.”

  9. Anthony S

    Good article. I gave up on computers after a semester of C++ and COBOL.

    Where does Moore’s law fit into all this?

    • Jack Baruth (Post author)

      Strictly speaking, Moore’s Law broke half a decade ago and the increase in processor speed has slowed to a crawl as attempts to further miniaturize processor architecture start to bump up against the laws of physics. That’s why I’m typing this on a quad-core i7 laptop — because the illusion of Moore’s law is being held up by using parallelism.

      • Anthony S

        I worded that badly; what I intended to posit concerned whether finesse and skill in programming (think of the thought and effort going into what they crammed onto the Apollo GC or something like Elite (yes, the juxtaposition is stupid)) has been usurped by “cheap” computing and relatively brute-force/simplistic methods like containers.
        Forgive any crappy misuse of nomenclature. Computer-wise, I am wilfully ignorant.

          • Ronnie Schreiber

            Meanwhile, a couple of times a day I have to use the task manager to tell Chrome to stop allocating all of my computer’s memory to Java or Flash or whatever memory sieve is slowing things down to less than a crawl.

          • silentsod

            I would disagree – people worry about resource consumption when it becomes an issue; it just isn’t always an issue. There are obvious performance pitfalls that should be avoided (quad nested loops, etc) or just techniques that are missed (in C/C++ you can change how 2D arrays are accessed to speed up loops by ensuring a higher cache hit rate) with this mindset. The tradeoff is “more productivity” in terms of SLOC even as overall code quality may suffer. Now that raw computing speed isn’t increasing as much we may see a return to fast code being emphasized, partly because writing properly parallelized operations is a more difficult problem than writing fast code in the first place.

          • Jack Baruth (Post author)

            I hope you’re right. Quality, in the Pirsig sense and in the other senses, should always be welcomed.

          • Peter Voyd

            Perhaps true in “infinitely scalable” domains like IaaS, but there are still plenty of applications where Wirth’s law still applies – software is getting slower more rapidly than hardware becomes faster.

          • Daniel J

            Resources are cheap right now and many compilers can optimize for speed better than most can write code for speed. The exception is video, especially high bandwidth video whether at the SMPTE SDI side or compressed h.264/h.265 side. FFMPEG is written entirely in ISO C99, and works. Why hasn’t there been a Java or Csharp equivalent? My guess is that many Interpreted languages and JIT compilers can’t compute that fast, regardless of resources.

  10. -Nate

    Sharing your writing skills is a very good thing Jack .

    Still waiting for you to write about painting a fence .

    I’m one of those Computer Illiterati (SP) .

    -Nate

  11. Aoletsgo

    I started out coding on a Trash 80 also! For me after a few years I realized that I could not see myself doing this the rest of my life and went off in another direction. The old guys I worked with way back then were truly master craftsmen. They would rewrite clean, elegant code that I could only dream of writing because it took up too much memory.
    Now I am just fine with being a user who is a little bit above average on the technical skills.

  12. jz78817

    I couldn’t code my way out of a soaked paper bag (if I try to write even a “Hello World” program I’ll blue screen every Windows box in the vicinity) but even I know the difference between “Open Source” and “Free Software.”

  13. VTNoah

    Sign me up for the writing workshop. I’ll be a willing participant. Not sure I’ll be an autowriter, but having solid writing skills is a necessity for me.

  14. silentsod

    There are two schools of thought around approaching programming, sysadmin, or other technical areas. One is the one you expound and endorse, which is a metal-up approach; the other is to know what you need to know to get the job done and nothing more. I believe the former leads to a richer understanding (I tell everyone who will listen that I think strong fundamentals are key to success in any field) and it is something I pursue provided time. However, for everything else I am willing to accept the abstraction that my mental models are good enough and I don’t need to know exactly how, for instance, to make ketchup at a mass scale and put it in tiny packets of which a single one will never, ever be enough for even a small side of fries.

    • silentsod

      Are good enough* in the sense that they don’t interfere with function or operation. Unlike people who think household HVAC is the same as automotive HVAC and crank things way up or way down thinking it’s “more” or “less.” It’s either all the way on or all the way off for 99%+ of houses.

    • Ronnie Schreiber

      Those tiny packets are perhaps the most wasteful method of food packaging there is. I took some weight measurements before and after squeezing them out, then I washed out the packets and weighed again. Between 10% and 15% of the ketchup in those packets is wasted. You have to squeegee out the container with a flat edge to get anything more than 90% out.

  15. galactagog

    Goddammit why doesn’t anyone use FORTRAN anymore??

    Get off my lawn….

    But seriously…I see a parallel with this, and how modern life is so far removed from our planet and the effects civilization has on it.

    Maybe it will all come back, to bite us in the ass, in the end. From the backend.

  16. Orenwolf

    Jack,

    Someone is still managing the infrastructure under that IaaS. As someone who was once responsible for managing the team behind Wikipedia and the Wikimedia foundation and now deals with, among other things, extremely time-sensitive packet delivery, I can state with some authority that “sysadmins” are still a dearly needed resource.

    Like journalism or driving I’d imagine, sooner or later you have to “get serious” and make a push out of contracts to a *career*, or accept you’ll never get past where you are. I’ve always had an amount of regret that some combination of your location, or chosen priorities, or timing, or whatever took that choice from you. Of course only you are able to speak to that. But you strike me as someone who would have been a real asset on a dedicated infrastructure team somewhere (whether or not that is at all relevant, of course, is entirely for you to determine!).

    • Jack Baruth (Post author)

      There’s a simple but embarrassing answer.

      I never wanted it badly enough. I never really wanted to be a “computer person”. I sent a few friends to Google from the Midwest, helped them with their prep and so on. I worked with teams at MIT. But at the end of the day I never wanted to wake up as somebody whose primary identity was technical. I have nothing against those people and I don’t feel superior to them. I just don’t want to count myself among their numbers. Don’t want to ride the Google Bus, don’t want to put in 100-hour weeks, don’t want to ruin my tendons the way jwz did to put Netscape out the door, don’t want to retire to Portland with fifty million bucks at the age of fifty and realize that every single thing I ever did with my life would, if described accurately, cause a twenty-five-year-old woman to roll her eyes and smirk.

      Isn’t that stupid? You don’t have to answer. It’s stupid.

      • Kevin Jaeger

        Well, I certainly don’t think it’s stupid.

        We make our choices in life based on our priorities and interests at the time. I passed on opportunities in Silicon Valley, too.

        I sometimes have second thoughts but those moments pass. It’s not like I’m poor, after all.

  17. WheeTwelve

    I’m amazed how often you write about things that are on my mind, and from a very similar perspective.

    Please keep us posted on the workshop. Hopefully you can make it generic enough for those of us not specifically interested in auto writing.

  18. Bill

    I always learn something reading your writing; even more so when I know nothing about the subject. Approaching retirement I must be the last generation that got through a “career” without any serious computer knowledge. In fact, I actively avoided learning. To gain that knowledge would condemn me to a cubicle. In the end the cubicle won.

    In general protest I am now going to set the timing on my Triumph with a matchbook cover.

  19. Shortest Circuit

    Ah, this nice new “goodenough” computing. Nothing needs to be optimized anymore. Just be agile and get it out of the door.

    You know, recently I read an article about viruses that can infect the BIOS and be resident forever (or until someone knowledgeable enough gets an EPROM burner) – a concept which, to be honest, equally impressed and infuriated me. A good virus by (admittedly, my) definition isn’t too big, so it does not take up much space, but the article described a fairly competent piece of programming that modifies the MBR to bootstrap a piece of code downloaded from an unregistered IP address. I’m thinking about network stacks, sophisticated I/O, and I can still recall my first PC XT (yuck) having a 256k BIOS (ROM of course). So I got a bit interested and checked the BIOS chip in my current motherboard… it is an 8-Mbit flash ROM. I imagine a lot will fit into that without arousing too much suspicion. Maybe I should catch up… too bad it makes me happy to see how much crap I can get into an 8-16-32k microcontroller 🙂 So I just dug up an old C64 from my stash, got the O-scope out, and a bad hex inverter and a color RAM later, I got a fully working 1MHz beast.

    Now imagine I have to do the same thing as you: interview newbies for our company. Sometimes my head spins at how much they don’t know, and how confident they are. No worries though… as long as we have enough CPU cycles and “IT specialists” to throw at the problem, we’ll be all right!

    …right?

    • hank chinaski

      Re. BIOS level virii:
      Not too surprising given how fat the current GUI BIOS seem. I was futzing with a system last year and found an option to make the damn thing phone home and flash itself, and it worked. IIRC some may even do this in Windows.

      aside: I’m sure the play by play on the brodozer v. MX5 would be appreciated here, if too long for the R&T piece, Jack.

    • jz78817

      BIOS is old & busted, you know. it’s all about EFI* now.

      and in many cases it’s hard to care about resource consumption when it costs relative pennies to load my desktop PC up with 32 GB of RAM.

      * Extensible Firmware Interface, not Electronic Fuel Injection

  20. Daniel J

    Yeah… I’m going to have to comment on this. As an embedded/Electrical/Computer/Software engineer (I’ve done a bit of it all), even I know that the trends and direction of software are changing.

    “What’s the difference between Open Source and Free Software?” This central distinction is the most critical aspect of any discussion regarding Linux. But he didn’t know. He didn’t know the canonical definition of Open Source, which is that the programming code is available. He thought it meant that there was a community for a given program. Regarding Free Software, he’d never even heard the term.

    I don’t really believe that knowing this is all that important. Open Source and Free Software have been debated for years, and their definitions have changed and evolved over the years. In many cases, Free Software and Open Source software overlap. GNU has driven the Free Software “ideology” and the Open Source Initiative has driven the open source “ideology”. I’ve used and modified software that was defined as free software and it was really open source, and the same goes for software stated as open source that was closer to free software. I can tell you that most engineers don’t even really know about the Open Source Initiative, and just “assume” open source means the source code is available.

    In regards to OSI layers, they’re barely touched on in Computer Science classes. Windows .NET pretty much abstracts all of the lower levels, from 4 down to 1. Most are only interested in the application layer and presentation layer. Part of the problem these days is that a) most software engineering is done on Linux or Windows and b) there is very little need for lower-level optimization for most engineers. I’ve only done this because I’ve been doing embedded engineering for years, and many of the products I’ve worked on had minimal OSes or the BSP software was absolute junk. Atmel/Freescale/Qualcomm come out with processors every day with new chipsets and new registers requiring new code or rewrites of the lower layers of the OSI stack.

    Packets. Now that’s funny. I doubt most new software engineers know what an Ethernet packet is. Unless they are doing low-level work, as long as the data is getting to the application layer, it’s not that important to know all the details. I do think it’s important for most engineers to know how to view the “data” in the packet as it comes across the wire using something like Wireshark.

    I doubt anyone with much experience with Linux is going to know SIGHUP. SIGKILL and SIGALRM are still used heavily. But these sorts of questions start begging the question about whether software engineers even know what POSIX is. Probably not, as most are using/learning Python, Ruby, Java, and Csharp. While many of these languages support POSIX, most of the features are abstracted out.

    What I’ve learned in the last year is that software engineering is moving farther and farther towards abstraction. The idea or philosophy is to solve the problems at the highest of levels, without knowing about low-level computer instructions, so that code is “easier to read”, “scalable”, “more maintainable”, and “portable”. The irony is that the more the language supports these abstractions, the more dependent the code becomes (especially in the case of .NET) on the framework and OS, which means less portability.

    So the bottom line is that you are sadly right. Software engineers really don’t have to know how a computer works anymore. As long as the patterns and containers and models run, who cares? Need it to be faster? Hardware is cheap. Race conditions? They rarely exist at the high level anymore.

    I fear as I get older and if I don’t get on this bandwagon, I’ll be out of a job.

    • Jack Baruth (Post author)

      You and me both.

      The only reason I think I even have a chance of working a decade from now is that software and systems have now become a complete ghetto for people not bright or motivated enough to work on AI or data science. So there might still be room for a fast troubleshooter in the decades to come.

      • Daniel J

        C and C++ programmers are going to be a commodity in 10 years. Very few new products are using these languages compared to other languages as a whole. Some of it is overseas (Texas Instruments and Qualcomm) and is pure crap.

        Hell, I was floored when I saw that comma.ai was written in Python.

    • WheeTwelve

      I’d like to nitpick, if I may.

      I’m all in favor of code that is “easier to read,” “more maintainable,” “portable,” and “scalable.” Where I disagree with your comment is that most of the code today is *NOT* written that way. Because to meet any of those targets, someone has to sit down, and think things through first. Yet no one is given enough time to do exactly that.

      This is why we have things like agile, sprints, epics, etc. So we can tack on last-minute features and capabilities that no one has thought about previously, because no one spent any time thinking about it in the first place.

      All the abstract features languages are being given these days are meant to make it easier to meet the agile deadlines. I would argue that about 80% of the C++ features are not necessary. A properly trained and educated engineer can implement those features as-necessary, and usually more efficiently than what the language currently provides. Unfortunately, properly trained and educated engineers are too expensive, and therefore no longer valued. Plus, they take too long, mumbling something about ENGINEERING things, instead of just patching stuff together.

      No, those language features exist so that keyboard monkeys (much cheaper) can slap on a few template classes at the last minute with insane performance penalties. But you can “ship it” when the “epic” said you would. The end result is less readable, maintainable, portable, and scalable than the worst-written PERL script. But hey, it was cheap, and it was fast.

      At a previous job of mine, they replaced the entire web/back-end team THREE TIMES, and ended up completely re-writing all of the code, because the new team couldn’t figure out what the previous team had done. There were no specifications written down, no design documents, nothing. It was amusing to watch, given that I was working on something else.

      So, I would argue that readability, portability, scalability, and ease of understanding are all good goals. I would also argue that almost no software today is written with those goals in mind.

      • Daniel J

        I’m not saying that all is written in such a way. I’m pointing out that managers are wanting software engineers who use patterns, abstractions, dependency injection, etc…

        Whether these are getting implemented? That’s up to the managers. You are correct in that most higher ups believe anyone out of college can write code and a deliverable product, regardless of how scalable or maintainable it is.

        Procedural/imperative programming will be a thing of the past because too many people think it can’t scale and is hard to read.

  21. Yamahog

    Insightful. I almost want to share this with my DBA team, they say “yamahog, write more efficient queries” and I say “give me more resources you jive turkeys”.

    They’re right – I can write more efficient queries and exert more effort into being a good user. But my compensation is more neatly aligned with my powerpoints than the queries that support them. And that’s the rub of economic activity (or at least value additive / value creative activity): there’s a transformation. I don’t think humanity will transmutate lead into gold any time soon, but if we could, I’d expect a corporation to be the first to do it and the first to try to cut corners (hey, we can transmutate rusted iron into 18k gold, that’s good enough!). My B- grade queries become A insights (or so I like to think) and then we execute against it with B+ work.

    Don’t hate the player, hate the game. Our status quo reflects our societal choices and values. For some reason, folks would rather play with A.I. than figure out how to implement ever more efficient sorting algorithms. Probably because knowing how to deploy neural nets has more interesting applications than sorting a 1-million-element list on a TI-84’s hardware. Is that so wrong? Right now, I’m trying to use my computer to generate more Black Sabbath-type music (using the first 4 albums as source material) – it seems to be the easiest path to creating value for myself and like-minded others.

    So what shakes us from the malaise? Creative destruction. How do we get that? Well, it’d be easier if we didn’t throw so much welfare at the status quo.

    • Jack Baruth (Post author)

      “But my compensation is more neatly aligned with my powerpoints than the queries that support them”

      Most truthful thing I’ve heard in tech since just about forever ago.

  22. Disinterested-Observer

    Jack, I hope you see this even though this particular horse/article has already taken a whipping. Since you mentioned the failure of Spindletop, I would love to know your perspective on the SCO/Novell lawsuits. I think at the time it played a big part in keeping large desktop installations away from Linux, but in the end, like the Kaiser paying for Lenin’s train ticket to Petrograd, it may have backfired on Microsoft. Microsoft may have held the desktop market, but from where I sit they are getting killed in the server/cloud/virtual space.

    • Jack Baruth (Post author)

      You’re not wrong. Microsoft was tirelessly fighting an irrelevant battle. Linux on the desktop was never going to get anywhere because most desktop users didn’t want any hassle whatsoever. They wanted Windows or Mac OS. And then… they wanted phones.

      Microsoft now has almost exclusive control over the desktop, both business and personal, but nobody cares. The future is phones and tablets connecting to containers that are hosted on Linux variants because it’s fractionally cheaper than Azure. 🙂

      In a world where smartphones didn’t happen I think we’d still be talking about the impact of the legacy legislator corporations but as you pointed out it’s irrelevant now.

      • Disinterested-Observer

        Well, in keeping with your wizards-versus-good-enough analogy, the end user was never responsible for his or her desktop anyway. If you get in the wayback machine, there was no reason to think that 95, 98, or ME was any more intuitive or reliable than whatever Redhat had at the time. Without giving up too much PII I will say that whatever our respective ages are, you have been in the business a little longer than I have, yet when I started we still had large, shitty VT100 environments to support and the user didn’t know any better. I never did understand why a decent wizard (not that I am one, but I can’t ride a skateboard either, that doesn’t mean it’s not possible) could not have had his users running StarOffice on some GUI flavor.

        • jz78817

          “there was no reason to think that 95, 98, or ME was any more intuitive or reliable than whatever Redhat had at the time.”

          Um, yes there was. Windows 95 would give you its GUI even if you didn’t have a graphics driver installed. On the other hand if you didn’t have one of the small handful of graphics cards supported “out of the box,” getting X up and running on Red Hat could be an arduous task. I had a 3dfx card and that was borderline pointless. and if you did manage to get X up and running its default window manager was fvwm95 which was a horrid clone of Windows 95. Installing software with rpm was an endless maze of chasing down dependencies.

          hell, simply running anything from CD was way easier in Windows. pop in the disc, it shows up in Explorer. On Red Hat 5, pop in the disc, figure out which /dev/ device it was, then su (you had to be root) and mount -t iso9660 /dev/<whatever> /mnt/cdrom.

          real intuitive.

          • Disinterested-Observer

            Should have clarified, I was referring to corporate installations, not consumers. I think the SCO lawsuits really scared off a lot of potential business users who, at least back in 2000, would likely have some on-staff wizards who could have made Linux run well in that kind of environment. Corporate users should not be allowed to use the CD drive or fuck with the graphics driver anyway.

  23. malcolm from south carolina

    Jack, Matt Shapiro argues that the glorification of the workaholic everyman programmer has only made business owners more wealthy. I wonder what your thoughts are as someone with more years in the tech sphere.

    Thanks

  24. the passenger

    You’ll forgive me (I hope) for not already being familiar with all the details of your personal biography, but as a fellow lit major I am curious about how you ended up in computing.

