Big Data: Parliament calls for better protection of fundamental rights and privacy
Posted by Fidest press agency on Thursday, 16 March 2017
Strengthened transparency of algorithms, special attention to data used for law enforcement and more investment in digital literacy needed to safeguard fundamental rights in the digital era, MEPs say in a non-legislative resolution passed on Tuesday.
The non-legislative resolution, drafted by Ana Gomes (S&D, PT), on the fundamental rights implications of Big data looks at how the increasing use of Big data affects fundamental rights, namely privacy and data protection.

Big data is growing by 40% per year and has the potential to bring undeniable benefits and opportunities for citizens, businesses and governments, but it also entails significant risks for the protection of fundamental rights as guaranteed by the EU Charter and Union law. The resolution stresses the need to avoid discrimination based on the use of such data, including in law enforcement, as well as the need to ensure the security of data.
MEPs want the Commission, the member states and the data protection authorities to take “any possible measures” to minimise algorithmic discrimination. This includes price discrimination, where consumers are quoted different prices for a product based on data collected from their previous internet behaviour, as well as unlawful discrimination against and targeting of certain groups or persons defined by their race, colour, ethnic or social origin, religion or political views, or their being refused social benefits.

“It is not just a question of data protection. These algorithms do have a real impact on people’s private lives because they can actually provoke what is happening and they can actually call into question and put at risk our fundamental rights through social media”, said Parliament’s rapporteur Ana Gomes in the debate ahead of the plenary vote.

MEPs also emphasise the need for greater accountability and transparency of algorithms with regard to data processing and analytics by both the private and public sectors, and warn that low-quality data or low-quality procedures could result in biased algorithms.
The increase in data flows implies further vulnerabilities and new security challenges, MEPs say. They call for the use of privacy by design and by default, anonymisation techniques, encryption, and mandatory privacy impact assessments. They also stress that special attention should be given to the security of e-government systems.

Special attention should also be paid to data used for law enforcement purposes, which should always be assessed by a human being, MEPs say. They call on the Commission, the European Data Protection Board and other independent supervisory authorities to issue guidelines and best practices for further specifying the criteria and conditions for decisions based on the use of big data for law enforcement purposes.

MEPs urge the EU institutions and member states to invest in raising awareness of digital rights, privacy and data protection among citizens, including children. This education should foster an understanding of how algorithms and automated decision-making work and how data is collected, for example from social networks, connected devices and internet searches.