Experts from the Personal Data Protection Office speak at the Data Protection Day at Vizja University
On 12 March, as part of the event “Data Protection Day with Vision”, a series of lectures and a panel discussion addressed current issues in personal data protection in an era of rapid technological development, such as deepfakes, the dissemination of images of minors, and artificial intelligence. Presentations were delivered by Mirosław Wróblewski, President of the Personal Data Protection Office, and Paulina Dawidczyk, Director of the Complaints Department at the Office. The debate was moderated by Jakub Groszkowski, Legal Adviser to the President of the Personal Data Protection Office.
In the opening lecture, Mirosław Wróblewski discussed the problem of data protection in the context of new technologies, using the example of deepfakes. He pointed out that this is a form of improper use of personal data (such as image and voice), most often for fraudulent purposes, and that the growing scale of the phenomenon increases pressure on supervisory authorities to react swiftly. The President recalled the high-profile complaint concerning the use of deepfake advertisements featuring the likenesses of Polish citizens on Meta’s platforms. This case raised the question of whether digital platforms should be treated as joint controllers for all paid advertisements containing personal data.
Adults bear primary responsibility for protecting children’s images
Important guidance in this regard was provided by the Court of Justice of the European Union in the Russmedia judgment (C‑492/23), which concerned a false advertisement on an online platform. The Court held that a platform is obliged, even before publication, to apply appropriate technical and organisational measures enabling the identification of content containing special categories of data within the meaning of Article 9 GDPR, including the image of a natural person in a context that may violate their dignity. If the person posting the advertisement does not have the data subject’s consent, the platform must block the advertisement.
President Wróblewski also recalled a recent case in which a pupil’s image was altered by her classmates into a nude photograph. Unfortunately, law enforcement authorities found no grounds to initiate proceedings, and even after intervention by the Prosecutor General the case was discontinued. The President of the Personal Data Protection Office lodged a complaint in this matter in March of this year. The case illustrates that deepfakes involving minors pose a serious threat to their safety, dignity, and privacy, and that current legislation does not provide effective and rapid legal protection against such acts. For this reason, the President called for the adoption of new, effective regulations.
In her presentation “Consent to the processing of personal data and rules for sharing images – how to comply with the GDPR?”, Paulina Dawidczyk, Director of the Complaints Department at the Personal Data Protection Office, addressed the publication of images of children, including by schools and kindergartens. She emphasised that consent to process an image (whether of a child or an adult) must specify the conditions of its use (e.g. what caption will accompany the photograph), be granted to a specific entity for a defined period, and be given in advance. It may also be withdrawn at any time. The form of consent is flexible, but for evidentiary purposes it should be recorded in some manner.
In the case of children, their age and level of maturity must also be taken into account. Younger children may be unable to give informed consent and often act in trust towards a parent, guardian, or other adult. For older children, their views should be heard before making important decisions concerning them – including consent to publish their image. It must be remembered that such publication constitutes a serious interference with a child’s privacy, and information about them posted online may be used for criminal purposes, including sexual exploitation.
Privacy in the age of ubiquitous artificial intelligence
The final presentation, delivered by Maciej Gawroński, legal counsel and partner at GP Partners, concerned privacy in the age of ubiquitous artificial intelligence. He argued that we should instead speak of the death of privacy, as data placed online can now hardly be removed, even by invoking the so‑called right to be forgotten. He also listed various methods and forms of surveillance used by intelligence services, other state authorities, and private companies. In his view, the best way to protect privacy today is for civil courts to award damages under Article 82 GDPR more frequently.
Expert debate – people still do not associate AI with personal data protection
During the expert debate, Mr Gawroński noted that although EU legislation largely keeps pace with technological development, it is enforced only to a limited extent. While the GDPR has proven to be a good regulation, the AI Act appears more like a certification system containing a list of formal requirements. In his view, it neither prevents harm caused by AI nor ensures accountability, as providers can avoid responsibility by demonstrating compliance with formal criteria.
Paulina Dawidczyk also took part in the debate, pointing out that the average citizen still does not associate AI with personal data protection. There is no widespread awareness that not all information may be “fed” into such systems, and many companies lack rules governing the use of artificial intelligence.
Anna Maria Biała‑Malinowska from the Ministry of Digital Affairs presented the Ministry’s mOchrona application. It follows the principle of privacy by design and allows parents to control which applications their child downloads onto a smartphone. Moreover, once installed, the application recognises that the user is a minor and blocks many inappropriate types of content. Sławomir Jagieła, Director of the Postgraduate Education College at Vizja University, discussed current methods of using AI in education and the inclusion of data protection issues at various levels of the educational process.