
Hands off the Wheel

As I read through a discussion of how search and local law intersect in Noble’s Algorithms of Oppression, I was reminded of a conversation a friend and I had over dinner the other day.

He’d suggested that artificial intelligence is responsible for content moderation at places like Facebook. Citing the apparently instant start to streaming on services like Facebook Live, he posited there was no way his stream could be moderated in any way, at least not before he’d had the chance to broadcast. Were he to share something that violated policy, everyone would see what he broadcast before it was cut off.

Since I have done a lot of manual work in AI, and it’s well established that every company whose primary product is user-generated content relies on an army of manual content moderators working around the clock, I suggested it was entirely possible there was no AI in the system at all. This seemed impossible to him: at that scale, surely it has to be a numbers game.

Why it’s interesting

We were quickly able to agree that the numbers scale down considerably for his example of streaming video: on any given day, regardless of the number of daily active users, nearly none of them would be streaming live, and even fewer would be streaming at any given moment.

This is the important fact I wanted to discuss: even so, with billions of daily active users, any streaming media site would still need tens of thousands of moderators monitoring incoming streams. Those workers are rarely considered when we’re talking about something that “just works.”

(Caveat: I don’t have any idea how many users use streaming video or how many content moderators there are, just that the services can’t exist without them.)
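To make that intuition concrete, here’s a rough back-of-envelope sketch in Python. Every number in it (platform size, streaming rate, shift length, streams per moderator) is a made-up placeholder rather than a real figure for any service, but it shows how “nearly none of billions” still turns into tens of thousands of people.

# Back-of-envelope estimate of live-stream moderation load.
# All numbers are illustrative guesses, not real figures for any platform.

daily_active_users = 2_000_000_000    # hypothetical platform size
share_streaming_daily = 0.001         # guess: 1 in 1,000 users goes live on a given day
avg_stream_minutes = 30               # guess: average stream length
moderator_minutes_per_shift = 8 * 60  # one moderator work day
streams_watched_at_once = 4           # guess: streams one moderator can sample concurrently

streams_per_day = daily_active_users * share_streaming_daily
total_stream_minutes = streams_per_day * avg_stream_minutes
moderators_needed = total_stream_minutes / (moderator_minutes_per_shift * streams_watched_at_once)

print(f"{streams_per_day:,.0f} streams/day -> roughly {moderators_needed:,.0f} moderators")
# With these invented inputs: 2,000,000 streams/day -> roughly 31,250 moderators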

Why it matters

My friend was happy to attribute his beheading-free and revenge-porn-free experience of social media apps to AI. The very existence of these services, though, depends on high-quality human judgement making those decisions.

The people actually making the product feel safe and fast and amazing are real. Humans create the emotional power of what’s marketed as AI through manual labor, mostly in the Global South. That matters all the more now that the cultural narrative, most recently in The Social Dilemma, has shifted toward framing you as the product. This is an alarming and emotional message.

It isn’t new that the people doing the emotional labor, the toughest part of the job, are invisible. We can’t forget them, though.

Links

Algorithms of Oppression: http://algorithmsofoppression.com/

The Social Dilemma on Netflix: https://www.netflix.com/title/81254224

Let’s have a conversation

Let us know what you think! Write us at:

sixtyseconds@deevui.com


Kazuo Ishiguro’s Artificial Friend

It’s a little embarrassing to admit that I’ve never read Ishiguro. We all have gaps, right?

The truth is I’ve never picked up any of his books because, when I was a kid, I saw a bit of The Remains of the Day on TV, thought it was a war movie, and I hate war movies.

(Dumb, I know. You don’t have to tell me the war is just the setting for the beginning of the movie, not what it’s about.)

Anyway, he has a new book coming out called Klara and the Sun.

Why it’s interesting

The titular Klara is an “Artificial Friend”: some kind of AI-powered affective robot that you take home from a store and that learns about you.

Why we care

This is a bit of a leap since I’ve never read the guy’s books, but he has a reputation for emotional resonance. His best-known and best-loved work (arguably) is Never Let Me Go, a book about clones raised to “donate” their organs. From what I’ve heard, the power of the story comes from its uncomfortable subject and the marginal existence of the clones.

That makes me excited to read what he does with an embodied AI that’s a real-life companion.

Links

Klara and the Sun (Penguin Random House): https://www.penguinrandomhouse.com/books/653825/klara-and-the-sun-by-kazuo-ishiguro/

Let’s have a conversation

Let us know what you think! Write us at:

sixtyseconds@deevui.com


Can’t Have It Both Ways

Here’s a bit from a front-page article in the NYT’s Sunday Business section.

I look forward to working with a true believer in Robotic Process Automation so I can better understand the perspective that leads people to talk about people like this (emphasis mine):

Jason Kingdon, the chief executive of the R.P.A. firm Blue Prism, speaks in the softened vernacular of displacement too. He refers to his company’s bots as “digital workers,” and he explained that the economic shock of the pandemic had “massively raised awareness” among executives about the variety of work that no longer requires human involvement.

“We think any business process can be automated,” he said.

Mr. Kingdon tells business leaders that between half and two-thirds of all the tasks currently being done at their companies can be done by machines. Ultimately, he sees a future in which humans will collaborate side-by-side with teams of digital employees, with plenty of work for everyone, although he conceded that the robots have certain natural advantages.

“A digital worker,” he said, “can be scaled in a vastly more flexible way.”

Why it’s interesting

The very use of the word “work” implies a human is involved, but let’s leave the incorrect use of “worker” aside.

Bots do a single thing in a single way until they’re thrown away or refactored. They are purpose-built and very, very expensive to build, maintain, and update.

Add research, development, customization, and extras like encryption, data storage, security, bandwidth, and regular tuning to the process, all of it done by specialist experts (e.g., Dee VUI), and it gets even more expensive.
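For a sense of what that looks like in practice, here’s a minimal, hypothetical Python sketch of the kind of purpose-built script that gets marketed as a “digital worker.” The file name, column names, and API endpoint are invented for illustration; this isn’t any vendor’s actual product. It does exactly one thing, in exactly one way, until something upstream changes and a human has to rewrite it.

# Hypothetical "digital worker": a purpose-built script wired to one file
# format and one API. All names and endpoints below are placeholders.

import csv
import requests  # third-party HTTP library

INVOICE_FILE = "invoices_export.csv"              # assumes one fixed export format
API_URL = "https://erp.example.com/api/invoices"  # placeholder endpoint

def run_once():
    with open(INVOICE_FILE, newline="") as f:
        for row in csv.DictReader(f):
            # Hard-coded column names: if the export layout changes,
            # this "worker" breaks and a human has to fix it.
            payload = {
                "vendor": row["Vendor Name"],
                "amount": float(row["Total"].replace("$", "")),
                "due": row["Due Date"],
            }
            requests.post(API_URL, json=payload, timeout=10).raise_for_status()

if __name__ == "__main__":
    run_once()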

It’s a cool marketing pitch (if you’re into, you know, our robot overlords), and I’ll thank the guy in advance for keeping us in business when it doesn’t work and they don’t know how to fix it, but sheesh.

Why we care

I can see why you’d want to call a few hundred lines of code that call some proprietary software packages and send data to and from an API a “worker” instead of “software” if, like Kingdon, you were a primary investor in a company that sells that software and those services. That way you can imply that you sell something of equivalent value to abstract and emotional human work.

It doesn’t, though. The only reason you can call something “work” at all is that it requires physical, emotional, or mental labor, usually all of these in multiple ways. Comparing people to lines of code, and suggesting equivalence in value and kind, is creepy and dehumanizing.

Unpacking this a bit, it’s definitely a fact that many (most?) businesses rely on outdated machinery, processes, and practices because they invest as little as possible in maintenance, infrastructure, and upkeep. We live in a world that prioritizes short-term gains, so that’s not surprising. It’s a business decision. That’s fine. But if parts of a business are inefficient and costly as a result of business decisions, it’s ugly to characterize that as a labor problem.

Kingdon seems to want to have it both ways: he wants to call software “workers,” using the bot as a metaphor for a human in order to play up its value, but he also wants to pitch the bot’s value as inherently better than a human’s precisely because it’s not human.

Scapegoating human labor as a business problem to be solved isn’t the way forward. As AI practitioners, we should be looking to celebrate humans by maximizing the value of their contributions and diversity.

Links

The Robots Are Coming for Phil in Accounting (NYT): https://www.nytimes.com/2021/03/06/business/the-robots-are-coming-for-phil-in-accounting.html

Robotic Process Automation (Wikipedia): https://en.wikipedia.org/wiki/Robotic_process_automation

Let’s have a conversation

Let us know what you think! Write us at:

sixtyseconds@deevui.com