As I See It: The Good, the Bad, and the Mistaken
January 30, 2023 Victor Rozek
Writing a comprehensive history of IT would prove daunting if not impossible. The milestones are well documented, but not the steps between. From the legions of unsung programmers who developed the systems we take for granted, to the effects of social media that are both private and collective, IT is a complex swirl of the good, the bad, and the mistaken.
Since hindsight lends itself to clarity (albeit flavored with arrogance), let’s start with the mistaken. The grandest of all IT prediction fails belongs to none other than IBM’s legendary Thomas Watson. Back in 1943 he opined, “There is a world market for maybe five computers.” In his defense it should be noted that the computers he knew were Winnebago-sized, operated by engineers in white lab coats, and cost somewhere on the order of $6.8 million in today’s cash. Then again, in 1927 H.M. Warner of Warner Brothers wondered, “Who the hell wants to hear actors talk?” I ask myself that same question every time I see Ice-T on an episode of Law & Order.
Of course, Watson wasn’t alone. Ken Olsen, founder of Digital Equipment Corporation, believed, “There is no reason anyone would want a computer in their home.” That was in 1977. More recently, in this century in fact, Bill Gates assured us, “Two years from now, spam will be solved.” If he wasn’t referring to the canned meat, he should have known better.
Reality has an annoying way of undermining predicted outcomes – just ask crypto investors. It’s probably safe to say that IT exceeded just about everybody’s wildest expectations, packed as it is with countless benefits and boatloads of unforeseen consequences.
What started modestly, according to the 1981 version of Bill Gates – “640K ought to be enough for anybody” – became aggressive and invasive, like silicon-based kudzu. It spread until it covered the planet. From smartphones to satellites, an atmospheric river of microchips flooded the Earth, and omnipresent Clouds now blanket the globe.
It’s curious that something as visible and universal as IT has a paradoxical relationship with anonymity. On one hand, Information Technology benefits from the anonymous creativity of generations of engineers and developers. We’ll never know the names of most of the people who fueled the Information Revolution: the teams of coders, the lab technicians, the chip designers, the inventors, even as we rely on and build upon their labors.
So I’d like to give a personal shout-out to two scientists who have made my life infinitely easier. They created a small but miraculous word processing feature that I have taken for granted for decades, but that makes the process of writing vastly more manageable – the ability to copy/cut and paste. Their names are Larry Tesler and Tim Mott, and they worked for Xerox back in the 1970s, where they first implemented the feature. Thank you, gentlemen. In the interest of full disclosure, I’m also exceedingly grateful to whoever pioneered spellcheck.
On the other hand, technology enables and amplifies the worst aspects of anonymity: from hackers and cyber criminals, to state surveillance and endless cyber war. Then there’s the pervasive rudeness, the vitriol, the ad hominem attacks, and the spread of false narratives poisoning social media – much of it hidden behind a curtain of anonymity.
Equally paradoxical is the way technology promotes both connection and isolation. It is miraculous that we can communicate so readily with friends and family all over the globe; to share sorrows and celebrate successes, knowing we are but a few keystrokes away from kindness and empathy.
And yet, for many, the connective tissue of social media leads only to profound isolation. The dutiful response of empty emojis, or the lack of any meaningful response to cries for connection, becomes a painful reminder that there’s no “there” there. And while the contact is virtual, the isolation is real.
A further contradiction is the way computer technology promotes resourcefulness while simultaneously creating dependency. The smartphone has become the new evolutionary appendage. Hardly anyone leaves home without it. Having much of the world’s knowledge at your fingertips is heady stuff. I may not know it all, but I can find most of it. Likewise, the ability to order one’s life with apps from Amazon to Zoom provides us with a level of resourcefulness unimagined just decades ago.
But use Google Maps for a time, and you’re likely to recycle your old paper maps. (There are, in fact, entire generations who have probably never used paper maps.) Get out of satellite range, and you’re lost. Lose cell signal, and you’re helpless. Lose power at home, and life as we know it stops. Note the outcry in response to the latest pastime of the maladjusted: shooting up power substations. People do not like being disconnected from their technology. Simply stated, our dependence on technology makes us vulnerable to its loss.
Our trove of online personal data, accumulated out of necessity and convenience, makes us vulnerable to exploitation, from Nigerian princes to Social Security scams. Why anyone would buy cryptocurrency or NFTs is beyond my ability to explain, but technology has the power to make us believe in things that don’t exist – like stolen elections.
Finally, there is the blessing of shiny, new, faster, more capable technology juxtaposed with the mountains of e-waste generated annually – an estimated 50 million metric tons globally. I recall seeing news clips of children atop giant mounds of discarded devices, sorting for precious metals without any protective gear.
Sadly, e-waste can be toxic. It is not biodegradable and accumulates in the environment – in soil, air, water and living things. Practices such as open-air burning and acid baths used to recover valuable materials from electronic components release toxins that leach into the environment.
It’s another of technology’s many unintended consequences, but in spite of numerous predictions of its limitations and impending demise, IT endures and flourishes. It remains flawed, stubborn, and indispensable.
“I have traveled the length and breadth of this country and talked with the best people,” the editor in charge of business books for Prentice Hall once decreed, “and I can assure you that data processing is a fad that won’t last out the year.” That was in 1957.
So as we enter another year of prognostication, creation, and consequence, it may be useful to embrace a degree of uncertainty. Whether our beliefs are predictive or dogmatic, it’s perhaps best to follow the advice of Joseph Nguyễn, who famously said: “Don’t believe everything you think.”
Paper maps. I was describing to my daughter what driving vacations were like prior to GPS. “We had to navigate using maps,” I said. She looked at me weirdly. “Paper maps? Like a pirate?”
HAHAHAHAHAHAHA!
Your daughter wins joke of the day. Just perfect!
I got paper maps for Christmas, in this case the set for the complete Appalachian Trail, and I still keep a set of laminated truck driver maps for big trips because I tend to wander out of cell range and follow my nose on backroads. Being “lost” is a kind of joy; not knowing what is coming next is what makes you an explorer, open to the next unexpected thing…