Public Hearings of MP Nº869/18


Continuing our series of studies following the public hearings held within the scope of the analysis of Provisional Measure No. 869/2018, the third meeting, held on 04/16/2019, addressed the subject "Data processing in the private sector, automated treatment and the Right to Explanation." Participating in the debate were the Chargé d’Affaires of the European Union in Brazil and representatives of the Brazilian Federation of Banks, the Brazilian Association of Radio and Television Broadcasters – Abert, the Brazilian Radio and Television Association – Abratel, the Brazilian Institute of Consumer Defense – IDEC, the Department of Regulation of the Financial System of the Central Bank of Brazil, the Brazilian Association of Information and Communication Technology Companies – Brasscom, academia, represented by PUC-SP, the Assespro Federation, the Ministry of Justice and Public Security, SindTele Brasil, and the Alana Institute.

One of the discussion points of the hearing was the obligation, provided for in the LGPD, of human review when the right to explanation is exercised in relation to data processed in an automated manner. IDEC pointed out that, from a consumer perspective, data processing cannot override consumer rights and that, while automation can generate economic efficiency, it is also possible that decisions taken by algorithms end up being reviewed by the very algorithms that previously analyzed the question. If the review were performed not by a human applying critical judgment, but by the same algorithm, the right to explanation would be greatly impaired.

On the other hand, the Brazilian Association of Information and Communication Technology Companies stated that review by an individual adds nothing to the process except higher operational costs and does not guarantee that the resulting decision is better than one made by algorithms - which have even outperformed humans in recent technical research. Along the same lines, the Central Bank of Brazil stressed that maintaining the requirement of human review of automated decisions could inhibit the entry of new economic agents into the market, with impacts on the financial system. In addition, it pointed out that the Brazilian Consumer Defense Code already establishes several data transparency obligations and a right of review, making it unnecessary to repeat the matter in the LGPD.

Representatives of academia expressed the view that neither human review nor automated review is a guarantee per se. The solution, therefore, would be to create algorithm governance, ensuring the transparency and accountability of automated decisions and enabling an effective right to explanation through the issuance of impact reports aimed at eliminating mistaken or discriminatory biases, for example. In addition, they reinforced the importance of maintaining a dialogue about human trust in artificial intelligence.

Civil society, represented by the Alana Institute, brought an innovative perspective to the analysis of the subject, previously little debated in the Commission. Its contribution focused on the greater risk of discrimination against children and adolescents arising from the use of algorithms to process their data (such as school records, for example). Considering this "ultra-vulnerable" category of consumers and citizens, the Institute emphasized the importance of strengthening the protection of data flows and guaranteeing the right to explanation regarding the processing of children's and adolescents' information, since this information will impact their future. It also criticized the relaxation of rules on the use of data by the Government, without accountability, and its free sharing with the private sector, given the risks generated by information collected from children and adolescents; and it corroborated the indispensability of the autonomy of the ANPD, which is crucial for the processing of sensitive personal data of children and adolescents by the Public Authority.

Finally, other relevant topics were raised by the participants, such as the need to assign competence over the processing of personal data to the Union in order to resolve conflicts of competence, the defense by consumer organizations of refusing to extend the deadline for the entry into force of the LGPD, and the Ministry of Justice's suggestion of a structure for the ANPD similar to that of the Administrative Council for Economic Defense (CADE), guaranteeing more resources, effectiveness and strength to the agency. At the end of the public hearing, the Commission's rapporteur stated that he would work for an autonomous ANPD and for the restoration of the original LGPD provisions vetoed by former President Michel Temer. He also said that data privacy faces political problems and called on the current government to take an active stance in addressing the issue.

On 04/17/2019, the fourth and last public hearing of the Joint Commission on the analysis of MP 869/18 was held, discussing the theme "Data sharing and protection in health and scientific research". The debates brought together representatives of the National Supplementary Health Agency, the Oswaldo Cruz Foundation, the Center for Independent Research in Law and Technology – InternetLab, the Confederation of Santas Casas and Philanthropic Hospitals, the Brazilian Association of Diagnostic Medicine – ABMD, the Institute of Research in Law and Technology of Recife - IP.Rec, the National Confederation of General Insurance, Private Pension and Life, Supplementary Health and Capitalization Companies - CNSEG; and the Brazilian Society of Informatics in Health. There was also a brief participation by students and researchers from the University of Brasília - UNB, representing the Laboratory of Public Policies and the Internet.

The discussion emphasized the exception, foreseen in MP 869/18, to the prohibition on sharing sensitive personal health data in order to obtain an economic advantage, in the event of a need for communication for the adequate provision of supplementary health services. Opinions were divided: some participants considered that such a provision could violate users' fundamental rights, going against the protective logic of the LGPD and bringing harmful consequences such as health insurance pricing and even the denial of necessary procedures, while others believe that the sharing of these health data is extremely beneficial and understand that limiting the integration capacity of the entities in the productive chain would strongly affect patient care and citizens' rights.

It is interesting to note that InternetLab took the position that the sharing of sensitive health data needs to be regulated and that such data should be anonymized whenever possible, in order to avoid possible harms that could generate some kind of discrimination among users of the health service. The representatives of the Confederation of Santas Casas and Philanthropic Hospitals and of ABMD understood that the sharing of data for the adequate provision of supplementary health services should occur only in cases that represent some benefit to users, protecting the rights of the citizens involved while allowing technological innovation. In addition, CNSEG stressed the benefits of sharing this type of sensitive data with patients and health professionals - such as the possibility of discussing diagnoses with professionals from other specialties and implementing public policies, for example - arguing that the goal is not only to obtain an economic advantage, but also to protect the beneficiary and make the activity carried out viable.

Other important points raised were the inadequacy of the term "health entities" to describe those responsible for implementing health care procedures - probably the result of a translation error in the corresponding provision of the European General Data Protection Regulation (GDPR) - considering that it should cover the entire Brazilian health sector; and the defense by IP.Rec that scientific research bodies be non-profit in order to avoid abuse by private entities (such as the profiling of voters and the targeting of electoral content, for example), while ABMD took the position that it is important for private bodies (such as diagnostic medicine companies and private universities) to be considered research bodies in order to enable their activities. The need to grant a longer period of adaptation to the LGPD was also stressed.

Finally, it is important to emphasize that, on that occasion, the Confederation of Santas Casas and Philanthropic Hospitals reinforced the relevance of creating the ANPD with the greatest possible urgency, suggesting a provisional model in which the agency would be part of the Public Administration but allocated at the highest hierarchical level, enabling impartial treatment of the data of private and public entities, and would have an unpaid multi-sectoral council able to supervise the organization's performance and maintain a permanent dialogue. In the future, once legal uncertainty has been overcome and there is "budget slack", the agency would be removed from the Direct Administration in order to operate under a special and independent autarchic regime.

With the end of the public hearings, which provided balanced and diversified debates, the seventh meeting of the Joint Commission was scheduled for April 24, 2019 for analysis and discussion of the report and the final conclusions on the subject. However, due to the relevance of the topic and the need for adjustments to the final report, parliamentarians decided to suspend that session and resume it on April 25, 2019, which brought important conclusions on the issue. The session was then suspended again and will resume on May 7, 2019 to address any remaining topics.

The Telecommunications, Media and Technology team of Azevedo Sette will continue to follow the debates and developments on the subject.


São Paulo, April 23, 2019

TMT Team of Azevedo Sette Advogados

Ricardo Barretto Ferreira

Juliana Sene Ikeda

Lorena Pretti Serraglio

Vitor Koketu

Isabella Aragão