Consent does not protect privacy in the era of big data because clicking permission boxes on a screen is not meaningful consent, said Kate Crawford, a researcher at Microsoft Research and MIT, at the Social, Cultural & Ethical Dimensions of 'Big Data' event held last night.
Big data is being sliced and diced to create personalization and segmentation, Crawford said. But predictive analytics can create "predictive privacy harms" under the "rubric of personalization," she said.
Instead of relying on consent to cure potential harms, there should be a data due process framework "placing accountability at the very end of the chain," Crawford argued. When data about people is used to make decisions that affect their lives, disclosure should be mandated so that they have an opportunity to respond, she added.
There should be more protection when the decisions involve important matters like health and employment, and there could be weaker protection when the decisions involve less weighty matters like advertising, Crawford said.
Even the most sophisticated systems can leak private information, Crawford said. Private signals can be combined with public signals in ways that deeply violate people's privacy, she said.
"We need to be a little bit more skeptical when people tell us data is going to be secure," Crawford said.
Steven Hodas, a consultant who has worked on data projects for educational systems, said the backlash against InBloom, the company that tried to collect, store and share student data with the support of the Gates Foundation, arose because parents felt their kids were being reduced to algorithms and did not want teaching reimagined as educating a cohort.
Personalization does not mean more human interaction, but better data configuration, he said. We are "headed for dissonance with dissidence not too far behind," he said.
Parents want teachers to be "analog craftsmen, not maker bots," Hodas said.
The blowback against InBloom might have been averted if there had been portals for parents to access parent-oriented data, Hodas added.
Columbia University scholar Alondra Nelson said genetic data is a disproportionate issue for minorities because minorities are more likely to be arrested or convicted and have their DNA uploaded into criminal justice databases. Blacks make up 13 percent of the American population but account for 40 percent of felony convictions, she said. Even innocent people who are never ultimately convicted have their DNA included in the databases, she added.
In another example of how genetic data implicates privacy, sequencing the genome of the HeLa cell line and uploading it online meant that personal information about Henrietta Lacks, the woman from whose cervical cancer cells the cell line was developed, and about her family could be identified, Nelson said. That included genetic markers for physical appearance and predisposition to disease.
The event was cohosted by the Data & Society Research Institute, the White House Office of Science and Technology Policy, and New York University's Information Law Institute.
Nicole Wong, a former legal director at Twitter who is now a deputy U.S. chief technology officer working in the White House's big data working group, said we need to "lean into those hard questions" about technology, privacy and individual liberties.