Some stuff went down on the internet around an op-ed that Cathy O’Neil wrote in the New York Times about algorithms and accountability. I’ve no interest in being drawn into the kick-back, but it did force me to think about the extent to which the response to this issue has to be one of civic engagement as well as academic engagement. There is plenty of academic engagement, as the kick-back evidenced in spades, but I see fewer examples of that activity translating into civic or legal activity (though maybe I’m just not looking hard enough).
This is going round my head. It's a societal issue. Requires civic, legal *and* academic response. How do we do our piece in a bigger frame?
— Anne-Marie Scott (@ammienoot) November 15, 2017
This will continue to go round in my brain for a while yet, but a few interesting things that I’ve read recently are worth gathering together.
“Our data is ours, but it also is not ours. We trade it away for so much of our experience on the internet. Money from a data tax could begin to counter this trade imbalance.
The money should go toward improving privacy of our information on the internet, countering identity theft, improving connectivity and internet literacy, all causes that would help create a more equitable internet for all.”
It’s Time to Tax Companies for Using Our Personal Data (New York Times)
“The biggest problem also is that privacy and security don’t seem to be the responsibility of the manufacturer, but rather that of the consumer.”
“Digital understanding is the root of fairness — it means people can know how the technology of the internet works, it makes them aware of its power structures, and it enables them to question what these mean for their choices, rights, and lives.”
This is Digital Understanding (Doteveryone – Medium)
“Research in queer theory, race and privilege, and gender studies is exactly what is needed to advance fairness in algorithms. But this work, and the many scholars from underrepresented groups who have brought attention to these problems, have a long history of marginalization both within the academy and without.”
We’re Awake — But We’re Not At the Wheel (PERVADE – Medium)
“It should finally be noted that the nature of the data is also becoming less and less static; rather, data increasingly goes through a lifecycle in which its nature might change constantly. While the current legal system is focused on relatively static stages of data, and linked to them specific forms of protection (e.g. for personal data, sensitive data, private data, statistical data, anonymous data, non-identifying information, metadata, etc.), in reality, data go through a circular process: data is linked, aggregated and anonymized and then again de-anonymized, enriched with other data and profiles, so that it becomes personally identifying information again, and potentially even sensitive data, and is then once again pseudonymised, used for statistical analysis and group profiles, etc.”
Ten Questions for Future Regulation of Big Data: A Comparative and Empirical Legal Study (JIPITEC – Journal of Intellectual Property, Information Technology and E-Commerce Law)
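That circular process, data being anonymised, re-linked, de-anonymised into something personally identifying again, then pseudonymised once more, can be sketched as a tiny state cycle. This is purely my own illustration of the quote's point that no legal category of data is a stable end state; the state names and their ordering are my reading of the passage, not anything taken from the study itself:

```python
from enum import Enum, auto

class DataState(Enum):
    """Legal categories named in the JIPITEC quote (names are my shorthand)."""
    PERSONAL = auto()
    ANONYMOUS = auto()
    DE_ANONYMIZED = auto()   # re-identified via linking and enrichment
    SENSITIVE = auto()
    PSEUDONYMIZED = auto()
    STATISTICAL = auto()

# One pass around the circular lifecycle the quote describes:
# anonymised -> linked/aggregated and de-anonymised -> personally
# identifying (potentially sensitive) -> pseudonymised -> statistics,
# and then the cycle can begin again.
LIFECYCLE = [
    DataState.PERSONAL,
    DataState.ANONYMOUS,
    DataState.DE_ANONYMIZED,
    DataState.SENSITIVE,
    DataState.PSEUDONYMIZED,
    DataState.STATISTICAL,
]

def next_state(state: DataState) -> DataState:
    """Step to the next stage; the cycle wraps, so no state is final."""
    i = LIFECYCLE.index(state)
    return LIFECYCLE[(i + 1) % len(LIFECYCLE)]
```

The wrap-around in `next_state` is the whole point: a regulatory regime that attaches protections to one static category misses that statistical or anonymous data can cycle back into personal, even sensitive, data.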