Will Big Data Affect Government Policies? with MIT's Alex "Sandy" Pentland

In 2013, an app started harvesting data on
87 million Facebook users. In 2016, Cambridge Analytica used that big
data to target US voters in a presidential election. And two years later, a whistleblower revealed
the extent of the abuse of big data, and this man saw it all coming. In the age of big data, alternative facts,
and privacy abuse, Sandy Pentland wants access to everyone's cell phone. People have a right to know what's going on. A number of vested interests don't really want
you to see what's happening. Not just the monetary interest, but the political class doesn't really want the facts on the table. I'm Michael Hainsworth, and this is Futurithmic. My name's Alex Pentland, and I'm a professor
at MIT. He's more than just another professor. Forbes Magazine calls Sandy Pentland one of
the most powerful data scientists in the world today. In 1986, he co-created the MIT Media Lab.
In 1992, he became the wearable computing pioneer who taught the students who created Google
Glass, and in 2007 became an advisor to the World Economic Forum. Sandy Pentland is the scientist his colleagues
cite in their work. A successful scientist has a Hirsch Index
of 20. An outstanding scientist? 40. Pentland? 125.
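For context, the Hirsch index (h-index) is the largest number h such that a researcher has h papers with at least h citations each. A minimal sketch of the calculation, with made-up citation counts purely for illustration:

```python
# Minimal sketch: computing a Hirsch index (h-index) from per-paper
# citation counts. The counts below are made up for illustration.
def h_index(citations):
    # The h-index is the largest h such that the author has h papers
    # with at least h citations each.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([90, 60, 33, 20, 20, 4, 1]))  # -> 5 in this toy example
```

By that definition, an index of 125 means 125 papers that have each been cited at least 125 times.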
My job is to see the future, and then also to invent ways to deal with the rocks, as well as tacking towards the things
that are good. How do you feel about walking down these halls
every day? Well, it's like air to me. I just live here. Right, right.
You're accustomed to this. Yeah, of course.
We invent the future here. What else, right? Pentland believes information is power, and
he wants to put that power back in the hands of the people. That power is literally in our hands. Biggest source of data for you today has got
to be this. Yeah, because almost every adult in the entire
world has one of those. There would have been an assumption, I can
imagine, that the real power of the data in these little glowing rectangles comes from
social media. Not the case.
No. The stuff that's really important is the stuff
of where you went, what is your daily habit, who do you interact with … If I gave you this phone, what could you do
with that information? You can actually see across the whole world. You can see whether policies are working,
whether the government is doing what it's supposed to do … If we aggregated a bunch
of people together, what could we tell about them? The answer is we could tell how much
money they make, we could look at gender balance, we could look at how integrated you are into
society in terms of: Do you talk to other people that are very diverse people, or not? It turns out that that variable predicts,
for instance, infant mortality. That variable predicts crime. Unemployment? Unemployment is certainly one, yeah.
Absolutely. In 2016, Pentland did something remarkable:
His team convinced Saudi Arabia to hand over the cell phone metadata of 2.8 million cell
phone users, its latest batch of 4 million unemployment benefit applications, and census data showing where all the cell phone towers were in the capital city. It was anonymized from the household level
up to the neighborhood block resolution, and then they put it all together. It wasn't much of a surprise: Those who use
their smartphones in the evenings and at home were probably unemployed, and those who didn't
go straight home after work were generally more successful. But it also allowed Pentland and his team
to figure out where unemployment was going to be a problem, and that would allow governments
to act in real time.
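The kind of analysis described here boils down to aggregating anonymized records to the neighborhood level and computing simple indicators, such as how diverse an area's contacts are. Here is a minimal sketch under assumed record fields (neighborhood, contact_neighborhood); it illustrates the idea and is not Pentland's actual pipeline:

```python
# Minimal sketch of neighborhood-level aggregation of anonymized call
# records. The record fields (neighborhood, contact_neighborhood) are
# hypothetical placeholders, not the actual schema used in the study.
from collections import defaultdict
from math import log

def contact_diversity(records):
    """Shannon entropy of each neighborhood's outgoing contacts.

    Higher entropy means residents interact with a more diverse set of
    neighborhoods -- the kind of area-level variable described above as
    tracking outcomes such as unemployment.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for rec in records:
        counts[rec["neighborhood"]][rec["contact_neighborhood"]] += 1

    diversity = {}
    for hood, contacts in counts.items():
        total = sum(contacts.values())
        diversity[hood] = -sum(
            (n / total) * log(n / total) for n in contacts.values()
        )
    return diversity

calls = [
    {"neighborhood": "A", "contact_neighborhood": "B"},
    {"neighborhood": "A", "contact_neighborhood": "C"},
    {"neighborhood": "D", "contact_neighborhood": "D"},
]
print(contact_diversity(calls))  # {'A': 0.693..., 'D': 0.0}
```

Because everything is computed per area rather than per person, the output stays at the neighborhood resolution the study worked with.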
That real time power works both ways. Pentland believes we must take back control from the data monopolies of corporations and governments. What we need to do is aggregate citizens to
be able to have our say in how the data is used. Interestingly, co-ops and credit unions,
the things that 100 million Americans, for instance, are already members of,
are already chartered to do that. What do we do with that data? Well, imagine that you have a small town,
and half the people or more are members of credit unions. I'll call them data unions, all right? And they went to the local hospital system and
said, "Well, say three quarters or half of the people in the town really want you to
do this thing. We see from our data that you're not doing
as well as the norm on these sorts of diseases, and you're much above market prices for those
sorts of services. You can do better." The fact that groups of people know what's
going on means that they can take monopolies like the medical system, some of the other
big companies, and demand better service. But how do we demand that big data isn't weaponized
against us? It's gonna take a lot of kicking and screaming
and making it happen. Pentland's solution? At the World Economic Forum, he spearheaded
the creation of regulations that are now law in the European Union. It forces companies that aren't even European-based
to adopt their policies because they have operations in Europe, so you have the ability
to control the policies of other countries. Right. GDPR: It's a set of rules and regulations
for your privacy and data protection. It requires a company to follow all the rules,
and if the rules are broken, that's a 20 million Euro fine or 4% of revenue, whichever is greater. These are the six rules of GDPR: We have to
be told what our data will be legally used for, the data can only be used for that intended
purpose, only the requested data can be collected, the data kept on us must be kept up to date,
and it can't be kept for longer than it's needed, and our data must be kept secure. If GDPR had existed in 2016 when Facebook's leak
gave Cambridge Analytica the ability to manipulate voters, Mark Zuckerberg would have cut a check
for one billion Euros.
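For a rough sense of the arithmetic: GDPR's maximum fine is the greater of 20 million Euros or 4% of annual revenue. A minimal sketch, using an approximate figure for Facebook's 2016 revenue (about $27.6 billion, taken here as roughly 25 billion Euros) purely to show where the "one billion Euros" estimate comes from:

```python
# Back-of-the-envelope GDPR maximum fine: the greater of EUR 20 million
# or 4% of annual revenue. The revenue figure below is an approximation
# (Facebook's 2016 revenue of roughly $27.6B, taken here as ~EUR 25B),
# used only to show how the "one billion Euros" estimate arises.
def max_gdpr_fine(annual_revenue_eur: float) -> float:
    return max(20_000_000, 0.04 * annual_revenue_eur)

print(f"{max_gdpr_fine(25_000_000_000):,.0f} EUR")  # ~1,000,000,000 EUR
```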
While Pentland believes big data will set us free tomorrow, big data has a big problem today. Getting six billion people to hand over their
smartphone data is a tough sell. How do you do it? Well, first of all, we don't. That's rule one, right? The data belongs to the people who own
the cell phone. But to make the cell phone work, it has to
talk to a cell tower, okay? That means that any time you're anywhere with
your cell phone, the cell tower knows that you're in the area. The general approach that people are thinking
about mostly is not to have data about individual cell phones, but to have data about cell towers. Weren't you able, though, to take a cache of
credit card information, reverse engineer it, and have a really high degree of accuracy
as to whose credit card information it was? Well, that's actually the key danger with
data: it's possible to re-engineer it if you have
individual-level data. I would have thought you'd want a lot of granularity
to something like this. It turns out it doesn't really buy you that
much for most of the social functions, like how you design a city or electricity, making
sure there are no brownouts, making sure the buses are in the right place. It just doesn't do that much for you.
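That tower-level approach can be illustrated with a minimal aggregation step: device identifiers are dropped from the output and only per-tower counts are kept, with small counts suppressed. The field names (device_id, tower_id, hour) are placeholders for illustration, not a real carrier schema:

```python
# Minimal sketch of tower-level aggregation: per-device pings are reduced
# to counts of distinct devices per tower and hour, device identifiers are
# discarded from the output, and small counts are suppressed as a crude
# guard against re-identification. Field names are hypothetical.
from collections import defaultdict

def tower_hourly_counts(pings, min_count=10):
    devices = defaultdict(set)
    for p in pings:
        # Group devices by (tower, hour); the device_id is used only to
        # avoid double-counting and never appears in the result.
        devices[(p["tower_id"], p["hour"])].add(p["device_id"])
    return {
        key: len(ids)
        for key, ids in devices.items()
        if len(ids) >= min_count  # suppress sparse cells
    }
```

Counts like these are enough for the city-scale questions mentioned above, without keeping any trace of an individual phone's movements.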
How do we ensure that the data that we're using to make decisions down the road isn't racist, isn't sexist, isn't classist? What we've been doing is building tools that,
first of all, keep track of what's actually happening, not what they say is happening. Second of all, to be able to audit it to see,
well, is this actually biased despite claims to the contrary? Is this actually what we wanna have happening? I mean, it doesn't have to be biased or unfair,
it could just simply be obnoxious, right? We'd like to be able to detect that pretty
early and be able to go back and say, "Well, who is it that's doing this?" and fix it.
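The kind of audit Pentland describes can be approximated with a simple check of outcomes across groups. Below is a minimal, generic sketch, a demographic-parity style comparison over an assumed decision log with group and approved fields, not the MIT team's actual tooling:

```python
# Minimal sketch of an outcome audit: compare the rate of favorable
# decisions across groups in a decision log. This is a generic
# demographic-parity style check; the log format (group, approved)
# is assumed for illustration only.
from collections import defaultdict

def approval_rates(decisions):
    approved, total = defaultdict(int), defaultdict(int)
    for d in decisions:
        total[d["group"]] += 1
        approved[d["group"]] += 1 if d["approved"] else 0
    return {g: approved[g] / total[g] for g in total}

def disparity(decisions):
    # Ratio of the lowest to the highest group approval rate; values
    # well below 1.0 flag a possible bias worth investigating.
    rates = approval_rates(decisions)
    return min(rates.values()) / max(rates.values())
```

A low ratio does not prove unfairness on its own, but it marks where the auditing questions raised in this exchange should be asked.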
Well, then aren't we just taking the concerns of bias away from the machine and applying it to the auditor? Shouldn't we be worried about the auditor
of the data being biased? One of the side effects of figuring out the
rules for how to audit data is that we figure out the rules for how to audit humans, right? Regulators act like machines, in
a certain sense. They have rules they're supposed to follow. Are those good rules? Well, currently we don't know. I keep talking about big data and machine
learning, because I'm really reluctant to talk to you about artificial intelligence,
because it strikes me that we're nowhere near the stage where we can call it artificial
intelligence. Yeah. I mean, there's so much buzz about this, but
it's really mostly hype. It's gonna be a long road before you see things
that are general, and I don't think I worry about what it's going to do to society the
way some people do, because again, there are these little glowy screens that we call phones,
and we love them. The reason that we love them is because they
help us as humans live our social life. To live better. They're tools for helping us be human.
