Ethical Data Science: An Oxymoron?
In a recent iOS update, Apple added App Tracking Transparency, a feature that lets users opt out of being tracked by third-party apps. Many have hailed this as a long-overdue step in the right direction, since the mining of personal data, particularly on social media, has become so widespread. A less sinister example would be targeted marketing, but Target has proven that even that can go wrong.
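For developers, that opt-out surfaces through the AppTrackingTransparency framework: an app has to request permission before it can read the device's advertising identifier, and a declined request leaves it with an all-zero ID. A minimal Swift sketch of that prompt, assuming an iOS 14.5+ target and an NSUserTrackingUsageDescription entry in Info.plist:

```swift
import AppTrackingTransparency
import AdSupport

// Ask the user whether this app may track them across other apps and websites.
// The system shows Apple's standard prompt; the choice is entirely the user's.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // Tracking allowed: the advertising identifier (IDFA) is available.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking authorized, IDFA: \(idfa)")
        case .denied, .restricted, .notDetermined:
            // Tracking refused or unavailable: the IDFA comes back as all zeros.
            print("Tracking not permitted: \(status)")
        @unknown default:
            print("Unknown tracking authorization status")
        }
    }
}
```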
As data scientists, our task is, on paper, to clean, organize, and visualize data. It just so happens that parsing personal information is part of that task. How do we decide what counts as an invasion of privacy, and what do we do when privacy is wrongfully breached?
At every step of the workflow, there is potential for bad-faith actors to prioritize profit. While Apple is letting users hold the reins on who or what has access to their data, it has really only alleviated a surface-level complaint about location services and ad tech. All the while, it still has mostly unbridled access to our data, since apps are designed to improve the user experience and, by extension, to learn about the user on an intimate level. Apple has no real control over how third-party apps operate, so the user has to bear the responsibility of researching whether an app is morally gray or worse.
FaceApp, a photo-editing and sharing app, seemed harmless on the surface. However, only a few months into its climb to the top of the App Store charts, critics raised concerns about how the app stores user photos on its own servers, since an image must be uploaded in order to be edited. Others further speculated that the app was being used to improve facial recognition algorithms. On top of that, the app integrates Google ad tracking, which is curious for a photo-editing app.
One positive take I’ve seen from cybersecurity researchers is that processing photos on the company’s own servers protects the app’s intellectual property (the photo-processing code) from piracy and keeps the experience pleasant, since on-device processing would likely mean longer processing times and more battery drain. Even if that is true, the developers should have given users the option to delete their photos, a process they have instead relegated to the app itself (photos are deleted after two days). While many users no longer use FaceApp in light of these questionable practices, the crux of the issue remains.
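FaceApp has not published how that two-day deletion works, but the policy is easy to picture as a scheduled cleanup job that purges any upload older than the retention window. A hypothetical sketch in Swift with Foundation, where the directory path and the 48-hour window are assumptions for illustration:

```swift
import Foundation

// Hypothetical retention job: remove uploaded photos older than 48 hours.
let uploadsDirectory = URL(fileURLWithPath: "/var/uploads/photos")  // assumed location
let retentionWindow: TimeInterval = 48 * 60 * 60                    // two days, in seconds

func purgeExpiredUploads() throws {
    let fileManager = FileManager.default
    let cutoff = Date().addingTimeInterval(-retentionWindow)
    let files = try fileManager.contentsOfDirectory(
        at: uploadsDirectory,
        includingPropertiesForKeys: [.creationDateKey],
        options: []
    )
    for file in files {
        let values = try file.resourceValues(forKeys: [.creationDateKey])
        if let created = values.creationDate, created < cutoff {
            try fileManager.removeItem(at: file)
        }
    }
}
```

Giving users a delete button would amount to exposing that same removal call on demand, rather than making them wait for the scheduled purge.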
Very little legislation is in place to combat invasions of privacy, and the legislation that currently exists favors the app developers since most users consent to the privacy policy without reading the fine print.
Even then, who has the time and resources to contest the terms and conditions of an app worth millions?
Until legislative bodies are pushed to intervene, users will continue to have very little say in how their personal data is handled, so it is up to the collective body of concerned data ethicists and cybersecurity experts to raise the alarm.