Surya Mattu


Tell us a little about your role in journalism and creating algorithms for data narratives. For example, in the about section of ProPublica they highlight that 'many' articles "... are augmented with data-rich 'news applications' which, in turn, permit the localization of stories on the same subject by other news organizations. Almost all our stories are available for reprint under a Creative Commons license." Is this part of your mission? What other ways do you seek to enrich news and stories with your work? How is this continuing at the Special Projects Desk?
In the context of journalism the focus is on finding ways to understand existing algorithms rather than create them. Often, the focus isn't on understanding how an algorithm works (usually because it's proprietary information) but rather on inspecting the final results and seeing either where laws are being broken or how people are being affected. Usually, such stories have a data-driven element to them, and depending on the case you often have a widget that allows you to look at how your demographic was impacted. A big focus for me when doing this work is to find ways to talk about the systems we are describing at the scale of an individual.

For the piece "Last Seen," did you write the software/algorithms? What was your role in determining how the piece revealed data? It is a piece that builds on your thesis work "From the dark," correct? If so, can you talk a little about that?
Yes, Last Seen evolved from my thesis. At the time I was interested in understanding how our smart devices leak our personal information. What I learnt was that part of the Wi-Fi protocol was designed to make our devices broadcast the names of networks they had previously connected to. This was done to make it faster to reconnect, but a side effect was that our devices were announcing all the places they had previously connected to. In essence, your smartphone was shouting out the names of these networks in plain text. They include the conferences you have been to, coffee shops you visit, airports, friends' networks and the like. For my thesis I made Wi-Fi Portraits of the people whose information I sniffed, and for Last Seen we ran it live, showing the information audience members were broadcasting.
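The mechanism Mattu describes is the 802.11 probe request: a management frame whose SSID information element carries a network name in plain text. A minimal sketch of pulling that name out of a raw frame (the frame here is a hand-built toy, not a real capture; live sniffing would need a monitor-mode interface and a library such as Scapy):

```python
def parse_probe_ssid(frame: bytes) -> str:
    """Extract the SSID from a raw 802.11 management frame.

    Assumes the standard layout: a 24-byte MAC header followed by
    tagged parameters, each encoded as (element ID, length, value).
    Element ID 0 is the SSID, transmitted unencrypted.
    """
    body = frame[24:]                       # skip the 802.11 MAC header
    i = 0
    while i + 2 <= len(body):
        elem_id, length = body[i], body[i + 1]
        if elem_id == 0:                    # SSID element
            return body[i + 2 : i + 2 + length].decode("utf-8", errors="replace")
        i += 2 + length
    return ""

# Build a toy probe request: zeroed header plus an SSID element.
ssid = b"CoffeeShopWiFi"
frame = bytes(24) + bytes([0, len(ssid)]) + ssid
print(parse_probe_ssid(frame))  # CoffeeShopWiFi
```

The point of the sketch is simply that nothing in the frame hides the network name: anyone within radio range who can read the bytes can read the SSID.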

Regarding your collaborative piece "Big Data Pawn Shop": what was your role in this work? How do you feel creating a 'product' from leaked information affects the message of the information? In creating apparel/objects from leaked documents, is this a fetishization of conspiracy and information?
The idea behind Big Data Pawn Shop was to think of a way to talk about the Snowden leaks at a time when people were tired of hearing about them. Instead of trying to come up with direct commentary on the importance of the leaks, we decided to take advantage of the holiday season to give people a way to expose their family and friends to the subject. The intention was not really to fetishize the documents but instead to take them out of their very serious context and place them in a context of capitalism that we are all more familiar with.

Julia Angwin, Jeff Larson, Lauren Kirchner, Terry Parris Jr. and yourself of ProPublica were finalists for a Pulitzer for "Machine Bias," a series of stories that investigated the effects of algorithms on our society. Can you tell us a little about how that began? What shaped the series? Racial bias was a common theme: why race and not class or gender? Also, why Amazon and Facebook? How were you hoping to open up algorithm usage in consumption and social networks in terms of creating awareness?
The focus of the series was to understand the different ways in which algorithms influence our lives and how they tend to reflect the systemic biases that exist in society. There is a lot of discourse around the potential harm of algorithms but very few actual cases of reported harm. This is not a coincidence: one of the reasons algorithms can be so dangerous is precisely because it can be hard to measure how they behave for different groups of people. This is especially true for big companies, who go to great lengths to prevent users from understanding how their systems work behind the scenes.

This was the challenge we tried to tackle; our objective was to go broad and show the influence in many different areas, including the justice system, education, e-commerce, social media and biotech. One reason race came up so much in the series is that it perfectly highlights a classic problem with algorithmic systems: while algorithms are designed to ignore data related specifically to gender, race and class, they usually include other data that can be a proxy for that information. For example, an algorithm might ignore a person's race when making its decision but still use their zip code. There is extensive research showing that a zip code is a very strong predictor of a person's race. So a company can honestly say it does not use racial data in its algorithms but still have an outcome that adversely impacts minority groups. The story we did about the Princeton Review is an example of this. We tried the same tests for gender but didn't find much in our case studies.
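The proxy effect described above can be demonstrated with an entirely synthetic toy dataset (the group labels, zip codes, and prices below are invented for illustration): a pricing rule that never sees group membership, only zip code, still produces a price gap between groups whenever zip code correlates with group.

```python
import random

random.seed(42)

# Synthetic population: group membership skews which zip code a
# person lives in (a stand-in for real residential segregation).
people = []
for _ in range(10_000):
    group = random.choice(["A", "B"])
    p_zip1 = 0.9 if group == "A" else 0.1     # zip correlates with group
    zip_code = "10001" if random.random() < p_zip1 else "10002"
    people.append((group, zip_code))

def quote_price(zip_code: str) -> int:
    """A 'group-blind' rule: it only ever looks at the zip code."""
    return 40 if zip_code == "10002" else 30

# Average price each group ends up paying.
totals = {"A": 0, "B": 0}
counts = {"A": 0, "B": 0}
for group, zip_code in people:
    totals[group] += quote_price(zip_code)
    counts[group] += 1

avg = {g: totals[g] / counts[g] for g in totals}
print(avg)  # group B pays noticeably more on average
```

Group membership is never an input to `quote_price`, yet the averages diverge: the zip code carries the group information anyway, which is exactly why "we don't use racial data" is compatible with racially disparate outcomes.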

Your practice in general embraces hardware and software. Do you have an inclination towards one more than the other? How do you balance the two, and how do both areas work together for you in terms of being a creator?
I move between hardware and software depending on the project. The majority of my work as a journalist tends to be software driven, whereas personal work and research has elements of wireless communication and networking, which tend to involve more hardware.

Do you consider yourself a hacker?
Sort of. I am not a fan of that term, as the context in which it's used has changed a lot over the years, both in the cybersecurity and commercial tech spaces. I like to learn about how things work and spend time doing so; because it's in tech, I guess that makes me a hacker, but in other fields I think I'd just be called curious.

Do you feel you are both an engineer and an artist? If so, did both areas come to you easily growing up? Or was either area more of a challenge?
I have always been interested in understanding how technical systems work, and even though I enjoyed working on creative projects I didn't have the self-confidence to think I could be an artist. So engineering definitely came easier. After some work experience I recognized that I was always drawn to projects that had a creative and/or critical aspect to them, and I began to make such projects myself. I guess that's when I started identifying as an artist.

Of your own work, what was pivotal to you in terms of defining your practice? How are you changing and shifting now in your work?
To be honest, I still don't fully understand what my practice is. I would say I am interested in doing research on issues concerning technology and its influence on society, but that's like saying I enjoy making pictures with colors in them. Still, I like to do research on tech-related issues, and depending on the subject they take shape either as an article, an art project, or an educational tool/curriculum, so things are in constant flux.

Any cool projects coming up you want to talk about?
I have been working on a project called Herbivore with my collaborator Jen Kagan. Herbivore is a packet sniffing and inspection tool that hopes to do for networking and communications what Processing did for code.