
Operator Overload

Photo by Oliver Sjostrom
Life has changed dramatically since the start of the personal computer revolution in the late 1970s. We have seen computing move from the realm of the military to industry, then to the home, and now to our pockets. Connected computers have followed the same path, as the Internet has forever changed the landscape of computing, including how people interact with it. Along the way, computing has gone from being rather anti-social to being a completely mainstream component of popular culture.

Let's pause for a moment and examine how technology became chic. In the late 1970s there was a lot of optimism around what computing technology could someday do for us, and while many people were eager to learn, that number was still small. Computers were essentially souped-up calculators, and most people weren't eager to learn an arcane programming language or spend their free time noodling around with calculations.

One pivotal use case exerted a mighty force on a particular group: children took to video games so magnetically that the future of the technology initially seemed assured. Then, after years of record sales and blockbuster growth, the market flooded with titles and bottomed out in the crash of 1983.

Technology was once again adrift, but not for long. In 1984, the Apple Macintosh was introduced. While woefully underpowered and exorbitantly priced, it was a pivotal moment for computing. The Macintosh made computers accessible to a far bigger audience because it simplified their operation down to point-and-click interactions within a graphical user interface (GUI). The GUI removed the need to memorize arcane commands and allowed computers to be used in a manner closer to how people naturally worked and thought. As the GUI increased adoption and public favor of computing technology, video games rebounded too. Today, video games are one of the largest forms of entertainment by revenue.

Once the GUI reached a point of relative maturity, computing technology had secured a permanent place in our day-to-day lives. Learning a programming language was no longer necessary for most people: word processing and spreadsheets were useful to adults, while children (and often adults, too) enjoyed the occasional video game. However, the role of computing technology as a social force was still very much in its infancy. It would take the revolution created by the internet for that to come about, and even that took many years.

Initially, the internet was not a commercial place. All content on the internet was created by people directly, not corporations. While it gained in popularity, it wasn't until the mid-1990s that the internet started becoming a cultural force, despite having its roots in ARPANET, the very first internet, created by U.S. military research efforts in the late 1960s. Commercial success did not come easily for the internet, and in the early days of the commerce-based internet much of the growth in e-commerce came from serving one of humanity's most base desires: pornography. Because people wanted to buy pornography without visiting dingy adult 'bookstores', and because people creating pornography wanted to be compensated for their efforts, many security, graphics, and payment technologies were created. Certainly there were other reasons for e-commerce even in the early days, but make no mistake, pornography was the spark that started the fire.

Having satiated our need for neatly typed letters, balanced budgets, blasting space aliens, and viewing every sex act imaginable, one would think that computing technology had not only cemented its place but had also fully matured. Yet technology never stands still. As of this writing, there were still not one but two more pivotal moments to come.

Social media definitely got off to a rocky start: initially there were many sites that were near clones of each other, and the situation looked like a repeat of the 1983 video game crash. Various sites had their successes, from Friendster, which was possibly the first social media site, to MySpace. However, in 2004, Facebook entered the fray and ultimately grew to be an internet powerhouse. Social media started innocently enough, as a way for people to connect and share their lives. As it matured, Facebook became a monster, and it currently stands accused of harming mental health, being far too overzealous in its use of its users' data, enabling discriminatory advertising, and spreading hate speech that ultimately disrupts political processes around the globe. With the advent of Facebook, computing technology had started to turn sour on us.

I've covered a lot already in this post. Technology would clearly have changed our lives dramatically if the story stopped here, but there's one more pivotal moment. The most powerful moment yet for computing technology came when everything I've described was made compact and portable by the smartphone, with Apple's introduction of the iPhone in 2007. True, there were other phones with enhanced features like email, but the iPhone immediately made the BlackBerry look like a primitive email terminal grafted onto a cell phone. Smartphones took all the power of computing technology and made it instantly available. Regrettably, their effect has been far more negative than we expected.

Writing for the New York Times, Kevin Roose recently described how limiting the use of his smartphone un-broke his brain. Roose acknowledges that he spent far too much time on his phone, and he discovered that he was missing life in favor of apps, apps that encouraged unhealthy habits. Living in New York City, it might be hard at times to resist the desire to escape into a smartphone, but it's striking that even someone in his location noticed a problem. Roose mentions that the problem is gaining so much traction that simplified phones are emerging.

Simplified phones offer a reduced set of functionality with the goal of getting people to spend less time on them. Of course, they still afford the necessary conveniences, just at the right dose. Roose also names email and social media as the apps where he spends the most time. Ironic that apps designed to connect people are actually the most isolating, isn't it?

Roose notes that being comfortable being still is something technology has pretty much obliterated. He also lists all the ways he used his phone as a distraction, including using it to "...pretend to meditate." Roose ultimately decided to delete social media apps, news apps, and games, pruning his phone down to the things he actually derived value from. In their place, he added non-screen activities that genuinely enrich his life. His pottery class has curbed his habit of what psychologists call "phubbing," or snubbing a person in favor of your phone. He reports that he has been more attentive at home and that his life has undeniably improved since limiting his phone time.

Sadly, we are not yet at the end of the road for the ways in which computing technology has been destructive. Social media ostensibly allows users to post any content they see fit. It didn't take long for that policy to need oversight, and the process of content moderation for social media was born. Casey Newton recently wrote about the deplorable conditions that Facebook moderators endure to moderate content, posted by users, that is simply vile. Users post violent, hateful, and obscene content so often that an army of contract employees is needed to fend it off. These workers often leave their jobs traumatized by the content they are paid to view. Their working conditions are bad enough, as Newton observes, but the job itself is psychologically damaging.

Did we really create all of this amazing computing technology only for it to be used in such unhealthy ways? I fail to see how creating or viewing the content that moderators must remove adds any value to our lives whatsoever. Clearly the creators of social media agree, hence the army of workers scouring their platforms, but the prevalence of this type of content is disturbing.

As someone who earns his living in technology, I have hope that the continual cycle of innovation will turn again and we will cure ourselves of these ills. However, it may not be technology that cures us. If we made better efforts to limit our screen time and see the amazing things all around us, we could still possess the awesome power of a supercomputer in our pocket without it dominating our entire existence. The solution is simple: become invested in things that aren't centered around a screen. Talk to people, go outside, take up a new hobby, explore a new area, pay attention. In other words, live the life that is right in front of you.

--Jay E. blogging for

