
The human cost of technology

Technological advances have bestowed huge benefits on our lives and our capacity to explore the unknown. But we have also paid a dear price for those gains, both as individuals and as a society. Indeed, the all-powerful influence of information and communications technologies in business and industry can fairly be considered a key cause of our current socioeconomic imbalance.

A broad look at the past 60 years suggests that technology has been at least as much foe as friend, especially in terms of the employer-employee relationship. For instance, prevailing management theory has long held that an organization prospers when its workers prosper too. In the past, that philosophy led companies to prize loyalty, longevity, teamwork, and commitment. In return for such contributions, employees received a decent living wage and a sense of security that stretched into the future. Over time, and with the noteworthy efforts of unions, employees also gained benefits such as health insurance and pension plans.


For the three decades preceding the ’80s, that mutually beneficial relationship yielded a relatively stable economy and society. Then came the influx of groundbreaking technologies, which launched the age of globalized everything and changed the organizational view of the employee from asset to commodity.

A cutthroat competitiveness settled in among employers, as they raced to reach and remain at the “leading edge.” Constantly seeking the best programmers and innovators, companies continually traded out “old” talent for “new.” Employees, in turn, followed companies’ lead and left employers for the highest bidders. Loyalty stopped being a key goal. Longevity became an albatross. Commitment to mutual success dwindled.

As tech stocks boomed, crashed, and boomed again, companies shifted their focus away from employees and toward shareholders. The bottom line became sacrosanct, and one way to inflate it was to have employees “do more with less.” Over time, that mantra translated into mass layoffs, salary cuts, and reduced benefits. Today, many of those who still have jobs are working grueling schedules without overtime pay, salary increases, bonuses, or compensation time. Indeed, many are expected to be on call for business 24/7.

Meanwhile, as communications technologies advanced, globalization intensified. Organizations increasingly branched into other countries, and, with access to a cheaper, worldwide labor pool, they increasingly outsourced American jobs. Thus, corporations have been able to pump up bottom lines even further, both by keeping wages and salaries low and by avoiding paying U.S. taxes.

Today, the average American worker, regardless of collar color, works harder and longer for less pay, benefits, and recognition. In-house opportunities for advancement and financial success have dwindled. Job security no longer exists. The likelihood of financial stability during retirement is all but gone. And, given the fall from grace of workers’ unions, the avenues for improving conditions are virtually nil.

Now, instead of being a society that values the individual—as democracies are meant to do—we are near to being a society where organizations and wealth rule. Is technology literally to blame for that? Well, to paraphrase an NRA slogan, “Technology doesn’t kill people. People kill people.” So the literal answer is no.


But the worship of technology is to blame. And of that, we are all guilty. Who hasn’t said, “I don’t know where I’d be without my [cell phone, tablet, laptop,…]”? Thus, in a case of gross irony, we clamor for the very technologies that are dividing us economically and socially. Author Ray Bradbury summed it up concisely when, in the story “The Pedestrian,” he described people in a technological world as being “alone, together.”

Other great minds of the past have warned us to be wary of technology—particularly of its ability to dehumanize us. Now, renowned thinkers, including Stephen Hawking and Elon Musk, are making dire predictions about artificial intelligence. They see AI as the gravest concern, posing no less than a threat to our very existence.

Given our institutions’ history of leveraging technologies to our detriment, it’s easy to believe they will wreak havoc when they get their hands on AI. And get it they will. But with so much on our plates just to keep afloat, and with our heads already locked in cyber worlds, who will be paying attention?

LindaK

A lifelong communicator, I'm pretty sure I came out of the womb talking. But with no siblings to chat and play with, I learned to express myself in writing. My subsequent birth as a politics junkie came while I watched my father, a career Marine, sob uncontrollably over Kennedy's assassination. Intuitively, I knew the world would never be the same, and I should pay attention. So I did.

Now, some 50 years later, I find myself dumbfounded by the trajectory of American politics and the prevalence of ignorance, bigotry, hate, and violence. I started Two Cents of Sense, hoping to help change that trajectory and to promote progressives' conversation, knowledge sharing, and actions.

3 thoughts on “The human cost of technology”

  1. Ahhh… a sore spot for me (not as high tech as AI, etc., but still). Here is my experience with computers.
    It all started around 1983. I was an art major at a community college, just a 21-year-old California beach girl taking it slow. Then this remarkable bartender job opened at a beautiful restaurant right on the sand! I was making good money and ended up dropping out of school to tend bar full time. I lived right across the street in a tiny apartment. Life was great. No cell phones, no computers really. I would get excited to go home and see the blinking light on my answering machine. I didn’t even have cable; who needed it? Movie theater down the block, grocery store on the corner. I don’t think I drove for 5 years. It was so great I kept it up for 16 years or so.

    Enter… Mom. “You gonna waste your life away at that bar?” … “Oh, I don’t know, Mom. I make 4 grand a month, I work 4 days, I live on the beach… hmmm… why?”
    “Well, I think you should get into computers. It’s the next big thing.”
    “Computers? I don’t even like using the phone. I am an artist, by God!”
    Well, guilt paved the way for my return to college. Graphic designers were making $80/hr. Sounded good.
    Here’s where I have the sore spot. I go to an actual art (design) school. They have 4 majors to choose from (for a bachelor’s):
    1. Graphic design
    2. Web design
    3. Fashion design
    4. Interior design
    Well, I figure the best use of my “art” skills would be in graphic design.
    Boom: skip ahead 2 years and $70,000 later, I am a graphic designer. And boom: 9/11 happens, like, a month after I graduate. Not only are people not hiring, they are laying off, unsure of what this new event might bring to the world.
    That’s OK, I understand; we’re all a little shaken up.
    But guess what happens in those situations? Yup, people still drink. Tending bar has great job security. People sad? They drink. People happy? They drink!
    So I’m back at the bar, only now I’m 70 grand in debt. It’s OK, I still understand.

    Skip again 6 years or so. Things cool off and graphic design jobs open up again, and I start applying. What I don’t realize is that “technology” moved so fast in the computer field during my 6-year hiatus that not only is my major almost obsolete, kids can literally train themselves over the web.
    I walk into a job interview with a graphic design degree and a 4.0 GPA, and the first thing they ask is, “How are your web design skills?”
    Web design? What? That was an entirely different major. (Apparently now you get a web-slash-graphic design degree.) I have no web training! None whatsoever. AND the worst part? These kids with crazy web/design skills are gladly working for $12/hr. $12???? C’mon.

    I have this aversion to technology, yet I have to use it. I do work as a graphic designer now, but I had to move from my beach apartment to Salt Lake City. It’s not so bad: I have a lot of freedom and a beautiful view of a snow-capped mountain, and no, I am not working for $12. My company treats me well. But I keep running across websites like “Learn Photoshop for free! Learn Illustrator! Become a designer, for free!” …and my $70,000 just floats before my eyes… in flames… mocking me.


  2. Fabulous read, Symea. You bring up another very valid point: the inability to keep up as technologies advance…and advance…and advance. Whether it’s having the latest skill or the newest cell phone, we just can’t keep pace. And even if we can keep up in the workplace, it rarely puts more money in our pockets. Rather, we are simply expected to evolve with the innovations and be grateful for having that opportunity. It’s another devaluation of the employee, another case of viewing workers as commodities instead of assets.

    One reason for the devaluation may be that as technologies evolve, using them gets simpler. Consequently, “any child can do it.” Of course, that’s when skill, talent, knowledge, and experience step in and take over. But while most employers want those qualities, they’re just not willing to pay for them. If you don’t literally add to an organization’s bottom line or its ultimate influence, then you don’t make the bucks. You’re a valuable cog, but still just a cog.

    As for the education costs associated with becoming part of the institutional hierarchy, well, that leads to a whole other topic. And you’ve got me thinking.


  3. While I appreciate the concerns Linda expresses, I want to clarify the meaning of “artificial intelligence” and, based on that clarification, place AI in the proper perspective of modern times.

    There is no shortage of positions and counter-positions on what artificial intelligence means. Most pop-culture views are fueled by sensational movies and books with little or no substance to them. As the AI pioneer John McCarthy, who coined the term in the 1950s, meant it, “artificial intelligence” is a kind of intellect based on things made by human beings. I spent a few years in graduate school studying the various meanings of AI, and I’ve come to believe that the science is filled with controversy and vast disagreement on almost everything.

    I believe the roots of the controversy rest in two notions that, once believed, lead to dead ends. First is the notion that we think with our brains. Second is the notion that we have to understand everything based on the analogy of math and, by extension, mathematical logic.

    The first notion, that we think with our brains, is false. We think with our minds. The Greek philosophers of old helped us resolve this notion, and more than 2,000 years of philosophy testify to its falsehood. The second notion, that we should understand everything by the analogy of math and logic, is also false. Starting with the pre-Socratic thinker Protagoras, this notion has been spread widely through the ages in academia.

    These two false notions echo in the halls of academia to this day. Just as the mind is the answer to the false notion that we think with our brains, so too a cognitional theory based on the way we actually think is the answer to the nonsense that reality can be reduced to mathematical logic. A mind clears up many things.

    The power of computers lies in the speed with which they can calculate things. Beyond that, they have no spirit in the sense of a life principle, nor do robots have a hylomorphic unity of body and spirit in the sense of being a living soul.

    Yes, globalization and the macro economy seem chaotic and volatile. Yes, the information revolution is leveling the industrial landscape of the good old days and destroying old-world jobs. But the automation created by software backed up by various types of AI, from machine learning to supervised learning, also shows the path to new ones.

    There is no stopping the juggernaut that is the World Wide Web, but by questioning our presuppositions and assumptions, we can do a lot to clarify our understanding of things, and that gives us hope for a positive future.

