From Artificial Intelligence to Data Collection: The Perils and Promise of Tech
We are living in an age of dizzying technological change, especially in the domains of electronic communications, artificial intelligence, robotics, data collection and genetics. The admittedly remarkable new capacities provided by tech were initially heralded with utopian predictions of their transformative effects, but we have increasingly been seeing their much darker aspects: their critical role in the erosion of individual privacy and democracy, the heightening of social atomization and alienation, and the vulnerability of all our major systems to cyberattack have become ever more glaringly obvious. Few people are better equipped to help us make sense of these rapidly evolving crises than Elizabeth Dwoskin, the Washington Post's Silicon Valley correspondent and one of the nation's premier reporters on the world of tech.
Bioneers sat down with Dwoskin at a recent Bioneers Conference to discuss some of the best and worst implications of our increasingly plugged-in lives. Watch Dwoskin’s Bioneers presentation on this topic here.
“These companies right now, they just print money. But you can see a scenario where if Europe continues the way it’s going, and if the U.S. continues the way it’s going, companies won’t be able to buy their way out.”
BIONEERS: There’s a lot of fear in the marketplace about job losses and inequity around AI. What are your views on the real threats or advantages of AI and robotics?
ELIZABETH: The data show that there will be more jobs lost than gained. And of course that’s why you’re seeing the rise of movements toward basic income. How will people get money when there are fewer jobs?
This reminds me somewhat of a story in my past. I used to cover immigration before I covered tech. I remember there was this debate about whether NAFTA was going to cost American jobs. At the time there were a lot of economists saying, “Yes, a small portion of people are going to lose jobs to globalization, but overall, the consumers are going to benefit. The boat will be lifted.” But it turned out that those predictions were wrong. The number of people affected was much larger than anticipated, and those people became justifiably angry. And it had huge political consequences.
Back to AI and robotics: People talk a lot about truckers. Truck driving is one of the biggest job categories for men with only a high school degree. With the self-driving movement, people are talking about long-haul trucks as being first to make the switch.
Another one is picking and packing. One of the holy grails of robotics is what we call the “dexterity challenge.” If I pick up this water glass, my hand has sensors that tell me how not to break it. But a robot doesn’t know not to break the glass. Teaching a robot dexterity, to pick up the glass and not break it, is considered a holy grail. There have been some big innovations in that area recently, including in Amazon’s warehouses. So if you think about the next layer of automation and jobs being lost, it’s people who pick things up in warehouses.
BIONEERS: You’ve done quite a bit of exploration into “tech bro” culture. As a female investigative journalist, what’s your take on that culture, especially in the #MeToo era?
ELIZABETH: I think a lot of women in Silicon Valley are really angry. One thing that’s been sad for me as a reporter is when I go into interviews with people, I’m almost always interviewing white men or Indian men. Then there’s usually a woman in the room, because a lot of women work in PR and marketing in Silicon Valley; it’s like the feminized job of Silicon Valley. The women don’t speak, because I’m supposed to be interviewing the executives. It’s really uncomfortable for me.
Venture capital firms can be uncomfortable too. You have a very attractive woman who’s an admin in the front, and all the partners are men. So it’s like walking into the 1950s. She’s great at her job. It’s no critique of her. It’s about these kind of like 1950s social structures.
I do think that if the culture is going to change, it will happen now because there’s so much anger. People are looking more and more at the workplace. I think papers like The Washington Post are still struggling with how to cover people’s personal conduct. There hasn’t always been a willingness to go forward with these kinds of stories. As journalists, we still have a long way to go.
I recently wrote a piece about this whole new crop of apps and businesses founded by women since #MeToo to improve reporting in the workplace. There are new channels for reporting and understanding your rights.
BIONEERS: I was recently reminded of a Tweet that you posted during the Brett Kavanaugh hearings. Facebook’s top lobbyist was actually seated behind Kavanaugh in the middle of his hearings. You posed the question: “Who thought that was a good idea?” What do you see as the ideology of Silicon Valley?
ELIZABETH: Pro money. Stay off my back, government.
About that Tweet: That was really interesting. I was watching the hearings and looking at who was in the row of Kavanaugh supporters right behind him. On the bench were his wife, his former law clerk, people very close to him. And then I saw this one face; actually, a source pointed it out to me. I thought, could that be Joel Kaplan, Facebook’s head of policy? I confirmed it with a couple of sources. Those people were inside Facebook, and they were mortified.
Most of the people who work at Facebook are what you would consider liberal. Joel Kaplan is probably the top conservative at Facebook. And there aren’t that many. So you have this company that is largely run by liberals trying to make a non-partisan platform, and it’s being attacked particularly in the last few years by conservatives who think they’re being censored.
When I posted that Tweet, a lot of people started asking questions. And then The New York Times did a story about it. Kaplan went of his own accord, but apparently he got in trouble when there was so much attention. They had to do a company town hall for all the employees. But he didn’t apologize. People who work there flipped out. Some of them took mental health days. Literally, people took mental health days because this executive so publicly supported Kavanaugh.
These employees are so coddled. One of the biggest debates among management at these very rich tech companies is whether to serve dinner, and how many family members employees should be allowed to bring dinner home to. They already get breakfast and lunch, fully catered, the fanciest buffets, but they also want dinner. And if you change the organic bananas, people complain. These are very, very entitled people. It’s completely divorced from reality.
The companies always tell people to work within their principles. But then you have this one really high-up guy whose principles you don’t agree with. It raises a lot of interesting questions about how neutral these platforms should be.
BIONEERS: We’ve also seen workers in Google and Facebook pushing back against some of their companies’ contracts. Google took a contract with the Defense Department, for example. Do you see this as a maturation of these organizations?
ELIZABETH: The organizations are maturing, and they’re becoming much more like regular big companies, which in corporate America play both sides. They want to curry favor with both sides of the aisle. They’re playing the game in Washington. One of the ways to do that is by getting government contracts.
These companies also want to diversify their revenues. They want to have government contracts because that makes you politically safer if shit hits the fan.
BIONEERS: Many people who are attracted to the tech world, traditionally, feel that they are going into technology to help make the world better. Is the improve-the-world-through-tech philosophy still the reigning ethos?
ELIZABETH: No. I think that’s what a lot of people are reckoning with. That’s why you see the worker insurrections.
Forget the government contracts, let’s look at the other effects. I think there was always something really delusional about Silicon Valley’s dreams for the Internet, even though these tools have changed the world in profoundly positive ways. Overall, I think there’s a net benefit to these tools: the fact that they’re becoming utilities, the fact that they’re as necessary as electricity.
When I talk to some of the people doing healthcare startups who came from Google or Facebook, they’ll say it’s because they were looking for more meaning.
One of the quotes that made me come out to Silicon Valley was a guy who was a Facebook executive early on. He developed a cancer institute at Mt. Sinai. When he talked about this institute, he said, “The greatest minds of my generation were trying to make people click on ads.” He didn’t want to do that anymore. So he did the science thing.
BIONEERS: What do you think is the end game for the current U.S. administration in terms of the tech industry? A lot of what the Trump administration is doing seems to be making us less competitive players in terms of the global tech industry.
ELIZABETH: In this new Trump North American Free Trade Agreement, there’s actually an interesting clause. In the U.S., tech companies enjoy a very liability-free situation. If somebody puts something illegal on a tech platform, whether it’s selling a drug or anything illegal except for child pornography, they’re not legally responsible for it. It’s called Section 230, and it’s a 20-something-year-old law that frees internet companies from liability. They fight tooth-and-nail to keep it.
What you’re seeing in these new trade agreements, the whole global trade scheme that’s being renegotiated right now under Trump, is the tech companies are trying very hard to get those liability freedoms, those exemptions, baked into international law as well. So if the Mexican government comes in and says, “Wait, people are openly selling illegal opiates on Facebook,” which happens all the time, actually, the companies aren’t liable for it.
BIONEERS: Tell us about the European privacy law that passed in May 2018 and what it means for user data privacy.
ELIZABETH: The Europeans passed this sweeping new law that is changing the way tech companies do business. It requires tech companies to delete their records after a short period of time. But their business is mining and collecting data and making money off of it. Purging data is something that they’re not used to doing as part of their business model.
The Europeans are going to have to enforce this. They passed this sprawling new law that they themselves may not even understand the ramifications of. No one can fully understand what this European law means, but they’re going to have to audit everybody.
There’s also always a question about whether we should delete data or delete the inferences made based on that data. Which is the harm: the collection of the data, or the profiling built from it? That’s something that’s unresolved in European law and not even addressed in American law.
BIONEERS: And then there’s the argument in data forensics that data’s never fully deletable. Ultimately the way that the hardware is made, you can never fully remove data.
ELIZABETH: Yeah. There may be a lot of these cyber people sitting there smirking, “Ha, ha, deletion, yeah right.”
I do know that a lot of law enforcement types are very upset about the deletion requirements in Europe. If they go to tech companies and say, “We need you to help us find terrorists,” the companies are like, “You told us to delete the data. We don’t have the terrorist’s Gmail records anymore.”
I heard law enforcement in Europe was fighting for an exception.
BIONEERS: I’d like to hear your perspective on who is responsible for the consumer education side of this conversation. How can people start to understand what they’re getting themselves into in the world of tech?
ELIZABETH: What we need is not these long privacy policies. I’ve done stories where we’ve counted up the words in privacy policies, and it’s ridiculous. Nobody understands what they’re signing.
What happened in Europe was they passed this law. It went into effect in May 2018, but the companies had two years to prepare for it, because it actually passed in 2016. One of its central tenets was simplicity. Companies have to explain to people very clearly what they’re doing with the data. And in theory, they have to disclose to you every single time they come up with a new-fangled way to use your data.
That’s not how it’s actually played out. A lot of the new policies were as complicated as the previous ones, in my eyes. And did anyone police or care about that? No one has come down on a company for that.
Something worth thinking about, if the U.S. passes a privacy law, is default settings. Ninety percent of consumers keep the default setting, whatever it is. So if the default setting of a tech platform is data collection, that’s what you’re going to have. Very few people actually go in and change their defaults. What I thought would be interesting for a privacy law in the United States is to pass one in which the default is actually no collection.
The way the tech and publishing companies responded to that in Europe was by creating all these new sign-in screens that made it so hard for you to get to what you wanted. If you wanted to get to the news, you had to go through all these new screens. And the same if you were going to use Facebook in Europe. They made it so frustrating – this is my cynical read – for you to actually get to the product that you ended up just signing the thing away. Technically you gave affirmative consent.
But in the U.S., the default is just collection.
BIONEERS: Whose job is it right now to police these types of things?
ELIZABETH: Well, the tech companies are doing a lot more policing than they ever did. They used to be very hands-off. But now, especially with the Russian interference, society’s saying that’s not good enough. And lawmakers are saying, “If you let this stuff continue to happen, then you might get regulated.” Facebook and Google claim to have hired over 20,000 moderators around the world in the last two years.
Facebook let me sit in on a meeting where they debated their moderation practices. If a post goes up anywhere in the world, the first thing that might happen is somebody flags it as problematic—it was nudity, it was hate speech, it was violent, it was bullying. Let’s say the moderator can’t decide the answer. Then it goes to a manager in Austin, and then from Austin, the hardest questions go to this really high-level meeting that takes place in Menlo Park every week. There, Facebook’s top executives look at the three hardest issues that came up that week, and they decide whether they should make a new rule. That’s been happening for a few years, so you can imagine how many rules they have now and how complicated it is.
BIONEERS: Are there cross-company conversations happening about data, policing and ethics?
ELIZABETH: They all know each other. It’s a small world. But I think what actually happens is that a lot of the smaller companies in Silicon Valley follow what the big companies do.
These companies don’t fully publish their guidelines. So I know that Facebook prohibits hate speech. That’s what they say. I’ll ask them for a definition of hate speech, and they’ll give me a very simple definition. That’s not what the moderator gets. What the moderator working in a call center in the Philippines gets is a very, very long page of likely scenarios. “If you see this scenario, it’s hate speech. If you see this, it’s a beheading. If you see this, it’s news. If you see this, it’s ISIS. If you see this, it’s just a group expressing themselves.” I mean, every little thing. The guidelines that they get are very, very detailed, and those aren’t public.
I think that everyone follows the broader policies, but it’s a complete black box for how those policies are actually enforced. And that’s why we see a ton of mistakes.
BIONEERS: Have you learned much about facial recognition software and how it tends to have an inherent ethnic bias? Is there anybody who’s actually in charge of making sure those tools work ethically?
ELIZABETH: No. There’s no rule. We did a story a couple of months ago on how Amazon was selling facial recognition services to local law enforcement departments in Orlando and Washington state. There was no public auditing of whether that facial recognition was accurate. If it’s accurate, that’s a problem in and of itself. If it’s inaccurate, you’re sweeping all these innocent people into a facial database.
I don’t think there’s anything in our law that could prevent Facebook from building a facial recognition tool that doesn’t work as well on people with darker skin.
And then the question is: Well, do you want the tool to get good?
There are some civil-rights implications in other targeted data decision making. For example, there are lawsuits against Facebook over the issue of targeted, discriminatory ads, and the fact that you could target a job ad on Facebook to exclude women or to exclude older Americans. There’s a question over whether that should be allowed. Facebook has actually pulled back and said, “Okay, we’re going to remove that targeting category for certain types of ads — employment ads, lending ads, and housing ads.”
BIONEERS: Younger people are trending away from Facebook. What do you think the future of Facebook looks like?
ELIZABETH: I think Facebook is really scared of the generational shift. But they’ve amassed so much market power that even if all the kids stopped using Facebook, which many already have, they still have all their data. They have the parents’ contact lists. And they’re going to have the market power to buy whatever hot new thing comes along that all the kids are using.
They have unbelievable visibility into what you use, because even if you’re not on Facebook anymore, they collect so much data about the people in your social network that they can often find out about your behavior too.
I guess one question is: Can what’s going on in the European laws affect these huge tech companies’ revenues? These companies right now, they just print money. Right now they make so much money that they can almost do whatever they want. But you can see a scenario where if Europe continues the way it’s going, and if the U.S. continues the way it’s going, companies won’t be able to buy their way out of their problems.