Ethics in technology

Our CEO Jeff Glueck sat down with tech ethicist and founder of All Tech Is Human, David Ryan Polgar, to discuss privacy, the value of data and the future of ethics in technology.

From transparency and developing legislation, to GDPR compliance and governmental regulations, to consumer awareness and digital citizenry, issues around technology and ethics are complex — and constantly evolving.

Foursquare is taking an active role in the development of ethical practices, both internally and externally. But how can we as a society — and Foursquare specifically as a company — map a terrain that is continually emerging?

Here’s what we learned:

Some say that technology is hijacking our minds. What role do tech companies play in how people spend their time?

David Ryan Polgar: I run an initiative called All Tech Is Human that is about better aligning the business interests of tech companies with the human interests of society. What that organization, and others like Tristan Harris’s Center for Humane Technology and the Time Well Spent movement, are focused on is asking: is technology aligned with our connectivity, our happiness, with truthful information and the best interests of humanity?

There is this slot machine mentality: our phones are designed like a slot machine, with variable rewards, rather than as something that delivers increasing value. Notifications are a top contributor to this effect. Companies have strong incentives to constantly hit users with notifications. But that runs against the interests of users, who are saying, “Hey, I want my smartphone to get me places faster and help me out.”

Jeff Glueck: I think about Time Well Spent and the role that tech can play in encouraging us to explore the real world more meaningfully. As the CEO of Foursquare, and very much as a dad of three young kids, I care a lot about this issue.

At Foursquare, we believe passionately that tech can help steer people away from endless social media scrolling and toward noticing what’s around them. Notifications should be infrequent but extremely relevant to your context. That is part of the solution; the opposite is the constant, alienating stream of slot machine notifications. We need technology to understand human context in the real world and to connect digital spaces with real-world places. Context enables engagement with apps you don’t even have to open: an app can become like a friend that taps you on the shoulder when something relevant to the moment is worthy of your attention.

For us, that means working with partners like Tinder to better connect people who opt in to be matched with others who frequent their favorite places. Or working with AccuWeather to make weather notifications contextually aware, so that a rain alert at a stadium is treated as more pressing than one delivered mid-afternoon at the office. We power the contextual computing that makes those experiences more relevant.
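To make that principle concrete, here is a minimal sketch of the kind of context gate Jeff describes. The Context fields and the should_send_rain_alert helper are hypothetical illustrations, not Foursquare’s or AccuWeather’s actual APIs; the point is simply that a notification is suppressed unless the user’s real-world situation makes it pressing.

```python
from dataclasses import dataclass

@dataclass
class Context:
    """Snapshot of a user's real-world situation (hypothetical fields)."""
    place_category: str    # e.g. "stadium", "office", "home"
    outdoors: bool         # is the user exposed to the weather?
    rain_expected: bool    # does the forecast call for rain soon?

def should_send_rain_alert(ctx: Context) -> bool:
    """Send the alert only when rain is coming AND the user's context
    makes the warning pressing; otherwise stay silent."""
    if not ctx.rain_expected:
        return False
    # A rain alert at an outdoor stadium is pressing;
    # the same alert mid-afternoon at the office is noise.
    return ctx.outdoors or ctx.place_category == "stadium"

# The stadium case fires; the office case stays silent.
print(should_send_rain_alert(Context("stadium", True, True)))   # True
print(should_send_rain_alert(Context("office", False, True)))   # False
```

The design choice is the inversion of the slot machine model: the default is silence, and the burden is on the app to prove relevance before interrupting.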

Going back to a no-tech world is not an option. So how do we make technology humane?

DRP: The idea of context is so important. Recently I traveled to New York and didn’t realize it was going to rain when I got there, so I didn’t carry an umbrella. That was a prime opportunity for a weather alert. Even though acting on it reveals location data, it delivers real value. Consumers get nervous, however, if they feel their data could be exploited, if companies are trying to understand patterns of human behavior in order to exploit weakness rather than to enhance people’s lives.

Do people understand how their data is collected and used?

JG: We are always thoughtful about data use, and we want to set a really high bar for the industry. We keep as our core DNA the belief that data is a privilege. All the data that we collect in our apps, or in the apps that we power, is always based on transparent opt-in, with disclosure, notice, and choice. And should consumers wish to download their data or, in GDPR parlance, to be forgotten, we will fully comply. It’s simply responsible business practice.

Creating meaningful bridges between the digital and the physical world does require consumers to opt in so that we can be aware of their context and make the experience more personalized. Our apps need location in order to work. In return, we have to show value to consumers, and the same goes for partner apps: we require that there be a consumer-first reason for collecting location data. As for sharing user data, it always happens with consent.
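A minimal sketch of what “always with consent” can look like in code, assuming a hypothetical consent store keyed by purpose (none of these names are Foursquare’s actual systems): data leaves only when an explicit opt-in is on record, and everything else fails closed.

```python
class ConsentError(Exception):
    """Raised when a data use has no explicit opt-in on record."""

def share_location(user_consents: set, partner: str, payload: dict) -> dict:
    """Release location data to a partner only if the user opted in.

    `user_consents` holds the purposes the user explicitly agreed to.
    (Hypothetical scheme: a real consent record would also carry the
    disclosure text shown, a timestamp, and revocation state.)
    """
    purpose = "share:" + partner
    if purpose not in user_consents:
        raise ConsentError("no opt-in on record for " + repr(purpose))
    return {"partner": partner, "data": payload}

# A user who opted in to contextual weather alerts, but nothing else:
consents = {"share:weather_alerts"}
share_location(consents, "weather_alerts", {"lat": 40.7, "lng": -74.0})  # allowed
# share_location(consents, "ad_targeting", {...})  # would raise ConsentError
```

Failing closed matters here: an unknown or revoked purpose raises an error rather than quietly defaulting to sharing.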

DRP: That’s exactly what consumers want. We don’t want data to be stolen or used and abused. But we do need to understand that data is a currency you can exchange for value.

There was recently a New York magazine article about a coffee shop outside Brown University where students pay for their coffee with their data instead of cash. That’s transparent: you’re giving people upfront decision-making power over their data and its value.

I think the concern is that data could be used to exploit a vulnerability. Cathy O’Neil writes about a Hippocratic oath for data scientists. I give my data knowingly and I want value from it. I don’t mind it being used, translated, and thrown back at me as a recommendation, but companies have to have my best interest in mind.

So that hits right on the need to protect consumer privacy.

JG: I really like the idea of a Hippocratic oath. With the “do no harm” philosophy in mind, we started ethics training at Foursquare, and we invite people from across the company to join an ethics committee whose job is to question whether we are doing right by consumers. Because that is the ultimate test for us: are we adding value to the consumer experience to earn our keep?

But who, ultimately, is responsible for protecting privacy and regulating data use? Industry, government, the people?

DRP: We need all of the different stakeholders to take an active role in the development of tech ethics. I’d like to see industry and academia helping to inform the legislative branch, plus a responsive and active industry educating consumers. Then we need the media to convey these stories and lend a necessary form of oversight. That’s where I see this issue headed.

We need to better educate adults on issues around digital citizenship and web literacy. We need to help everyone understand data policies, privacy policies, and how data is being used. This is not a black-or-white issue. It is about knowing the issues, knowing the value your data holds, and then making an informed decision to accept the trade-off because you are receiving value in return.

My own philosophy, and the reason I founded All Tech Is Human, is the idea that everybody can come to the table together and realize that technology, when used correctly and better aligned with our interests, can elevate our happiness and our ability to have real-world experiences.

JG: I fully agree with that. At Foursquare, we have always believed that data is a privilege. That is why we’ve invested in all sorts of thoughtful policies and hold ourselves to a higher bar.

And we continue to work at it. We are proud that people here feel our work will not be done until more and more experiences are smartly relevant, based on a deep understanding of what people value in their lives. Ninety percent of our lives as consumers, measured by spending, plays out in the real world, not on a screen. Making technology serve our real lives, rather than the endless scroll, is very much part of our mission here at Foursquare.

A primer on Ethics in Technology, curated by David Ryan Polgar:

Investing to Train More Ethical Engineers (Techonomy)

Zuckerberg’s “move fast and break things” isn’t the way forward — a case for tech ethics training in education.

Materials for Ethics Training Workshops In Tech Companies (Santa Clara University)

A free, downloadable toolkit on tech ethics compiled by SCU’s Markkula Center for Applied Ethics.

Best Ethical Practices in Technology (Santa Clara University)

Ethics made easy: 16 guidelines for putting tech ethics into practice.

Computer Programmers Get New Tech Ethics Code (Scientific American)

The Association for Computing Machinery’s updated ethics code explained — and why it’s important.

Stanford to Step-Up Teaching of Ethics in Technology (Financial Times)

What the school that educated many Silicon Valley tech founders is doing to instill tech ethics in the classroom.

How do we regulate advanced technologies along social or ethical lines? (Phys.org)

Who is responsible for regulating emergent technologies? Arizona State University Regents Professor of Law Gary Marchant explains.

Will the workplace of the future have an in-house data science ethicist? (Silicon Republic)

One author on what’s at stake for the future of society — and why data science needs more ethicists.
