Beginning with Blueberries: What Are the Ethical Bounds of Tech?

As technology becomes more capable, ensuring that humanity isn't replaced or lost along the way is an important part of moving forward.

(TNS) — Of all the life experiences helping to shape research in an age of data privacy threats, robotic workers and driverless cars, here's one you might not have considered.

It involves blueberries -- specifically, the not-yet-ripe, green ones.

Years before he joined The Robotics Institute at Carnegie Mellon University, Randy Sargent worked on an early-career project in Washington state to develop a system for sorting blueberries without human hands, using cameras to spot green ones and air jets to separate them from the others on a conveyor.
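The article gives only a one-line description of that system, but a minimal sketch of the sorting logic it describes might look something like the following. The Berry fields, the hue threshold and the sort_pass routine are illustrative assumptions for a color-based sorter in general, not details from Mr. Sargent's actual project.

```python
# Hypothetical sketch of a camera-and-air-jet berry sorter.
# The hue band and data structures below are illustrative
# assumptions, not details of the Washington-state system.

from dataclasses import dataclass

GREEN_HUE_MIN, GREEN_HUE_MAX = 60, 140  # assumed hue band for unripe berries


@dataclass
class Berry:
    position_mm: float  # offset along the conveyor where the camera saw it
    hue: float          # dominant hue measured by the camera, 0-360


def is_unripe(berry: Berry) -> bool:
    """Flag berries whose color falls in the assumed 'green' band."""
    return GREEN_HUE_MIN <= berry.hue <= GREEN_HUE_MAX


def sort_pass(berries: list[Berry]) -> tuple[list[Berry], list[Berry]]:
    """Split one camera frame's berries into keep and reject piles."""
    ripe = [b for b in berries if not is_unripe(b)]
    rejected = [b for b in berries if is_unripe(b)]
    # A real system would fire an air jet at each rejected berry's
    # position as it passes the nozzle; here we just return both piles.
    return ripe, rejected


if __name__ == "__main__":
    frame = [Berry(10.0, 30.0), Berry(42.5, 110.0), Berry(80.0, 15.0)]
    keep, reject = sort_pass(frame)
    print(f"keep {len(keep)}, reject {len(reject)}")
```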

As advancements go, it was a no-brainer, right? After all, who would want to do that monotonous work?

"I learned that it was a very naive view," he said.

Turns out, the employees making a living at it "didn't want to be replaced," he said. In fact, what he assumed was a joyless job actually enabled those workers to talk with each other and engage.

The project he worked on ultimately was not deployed, said Mr. Sargent, a senior systems scientist at Carnegie Mellon. But the example, he added, is less important than what it symbolizes.

Today, at Carnegie Mellon, he is among a swath of classroom professors, researchers and students probing the ethical implications of artificial intelligence, automation and other technology -- from the viewpoint of disciplines as disparate as computer science and philosophy. A sizable portion of his work involves uses of AI that can help level the playing field for those otherwise disadvantaged as data mining reshapes everyday life in areas from housing opportunities and creditworthiness to health care.

"When we're in a competitive marketplace, it's sometimes hard for us to pay as much attention as we should to broader impact of what we're doing," he said of those involved in the inventions.

So Mr. Sargent and his colleagues want to help a generation of scholars, inventors and others get outside their own "bubble" for the good of society.

The stakes would seem to be huge.

After all, intelligent machines able to "think," and data mining so rich it can reveal and even influence human behaviors, are driving an uneasy discussion globally about how far the technology should go.

Just look at the headlines.

"Ethics and automation: What to do when workers are displaced," reads a headline on the site of Sloan MIT, the management school at the Massachusetts Institute of Technology.

"After generations of increasing inequality, can we teach tech leaders to love their neighbors more than their algorithms and profits?" asks online tech industry publication Tech Crunch.

"Can we teach robots ethics?" asked BBC News in 2017, only to be followed by this:

"Why ethical robots may not be such a good idea after all," stated a piece posted to the web site of IEEE, the world's largest technical professional organization devoted to advancing technology for the benefit of humanity.

In the grand scheme of things, Carnegie Mellon is just one of many universities looking at these issues. It just happens to be a huge player, with a global reputation for computing, engineering and robotics that makes what it says on the subject resonate.

And it's not just those on the techy side of campus.

From his vantage in the philosophy department, Alex John London works with groups and individuals on and off campus to promote better understanding of various aspects of ethics, including the importance of recognizing the difference between useful innovations and hype.

He said he can't design an engine or build a brake or clutch. But he's a go-to person on human decision-making. As such, what he says is relevant to those trying to get an autonomous vehicle to safely do what humans behind the wheel have done for generations.

"Part of what I do as a humanist is educate people about what these system can and can't do," said Mr. London, the professor of ethics and philosophy.

He looks at such things as what's gained and lost in the quality of care and doctor-patient interaction as medical records are digitized. He helps others understand that it's not just the developers of the technology who have a responsibility for its ethical application.

"There's no one person who's responsible for that," he said.

Those who bankroll research, those who make policies and those who purchase and use the end product also have a say.

Robots able to do dangerous jobs offer a social benefit, he said. Another set of questions comes with use of robots in policing or the military. "I think people's reticence to use lethal force can change when they can use that force safely removed from the context," he said.

The fact that Carnegie Mellon for decades has championed research across disciplines makes it easier for him as a philosopher to bring ethics into the discussion on such things as computational science, AI and computing.

The earlier that conversation begins, the better.

In fact, one joint effort by the university and AlphaLab Gear -- a startup accelerator focused on entrepreneurs developing hardware, robots and other physical technologies -- has introduced a program dubbed Ethics MVP. Starting in April, it worked for six months with more than half a dozen startups.

The proposition, in essence, was: "What if we helped startup founders really think about ethics from day one and build ethical practices into their companies and products?" said Jessica Pachuta, a project co-director of Ethics MVP.

Participants reconvened last month to share their takeaways from the program, developed through the School of Computer Science's Create Lab.

It's hard to deny that ethics is good for business, said Ms. Pachuta, but it's a complicated topic with new questions arising as technology itself changes.

That can be hard on time-pressured entrepreneurs.

So regardless of whether the venture involves artificial intelligence, data collection or some other technology, founders are encouraged to explicitly weigh in on the subject, "so it's something normal to talk about."

©2020 Pittsburgh Post-Gazette. Distributed by Tribune Content Agency, LLC.
