Data Hostage: What do we leave behind at the border?

 

Is there a surveillance crisis brewing along our borders? Every time you cross a border, technology makes it possible to violate your privacy: your face is photographed, your mobile phone tapped, and your social media analyzed, all in the name of “improving border security.” But is this ethical, and what happens when your data falls into the wrong hands?


Photo credits: unsplash.com


 
 

Technology and ethics researcher Dr. Stephanie Hare recently wrote about the chilling implications of facial recognition software, which has been deployed at airports, shopping centers and even conferences to tighten security. She explained that most people are unaware that their scanned faces are probably being matched against police watch lists or used to train the machine-learning algorithms that monitor them.

"It makes us vulnerable to tyrants, for whom a technology that tracks us without our knowledge offers new possibilities to persecute according to our ethnicity, religion, gender, sexuality, immigration status or political beliefs,” she writes.

This week, we reflect on growing surveillance across Europe with articles from our latest magazine.

Editor’s Picks

As Europe tries to strike a balance between freedom, justice, fairness, and individual rights, Nicole Chi writes that its borders are becoming the key place to understand the continent’s fragilities in the face of new technologies. As refugees attempt to cross into Europe, facial recognition software has been used extensively for border security. Nicole argues that the EU must implement clearer, more transparent processes for deciding where and for what purposes these technologies can be used.


At the Border of Europe's Surveillance State

Each time you cross a border, your privacy is violated.

Rupert Riddle examines the failures of facial recognition software when it comes to racial and ethnic profiling. “Black and brown faces are more likely to be subjected to police surveillance, and hence more susceptible to be misidentified as potential offenders in the case of false positives,” he writes.


Automated Assumptions: The Failures of Facial Recognition

How technology has increased racial discrimination.

Ahead of the GovTech Summit in Paris, Hannah Johnson, COO of Public, says: “As Europe builds its public service infrastructure, what’s required are innovators who will create technology with a European approach to citizen data and rights at the core.”


Can Europe Show the World a New Way for Tech?

Hannah Johnson explains the role Europe can play in harnessing the transformative power of technology for public services.

If Europe wants to innovate in a world governed by algorithms without sacrificing its values, it will have to find a way to reconcile that innovation with freedom, justice, and individual rights. A tall order, but a necessary one.

That’s it for now. You’re up to date!

With love,
Priyanka and the Are We Europe team