"Grey areas" of legal regulation of AI and personal data processing

Authors

DOI:

https://doi.org/10.31617/3.2026(142)09

Keywords:

artificial intelligence, personal data protection, personal data processing, artificial intelligence and personal data, profiling, web scraping.

Abstract

In the modern period, artificial intelligence is often used to perform tasks that involve, in one way or another, the processing of personal data. Under current legislation, any operations with personal data must be carried out openly, transparently, and in proportion to defined purposes. At the same time, AI systems are characterized by the "black box" phenomenon: the way they function is poorly understood, which gives rise to situations that may have no clear legal solution and thus fall into a "grey area". The article considers why the issue of personal data protection is becoming particularly acute with the development of artificial intelligence and related technologies. The study is based on the hypothesis that the lack of transparency, understood as the impossibility of explaining to the data subject what specific information about them an AI system will use and in what way, is the only factor violating personal data protection. To test this hypothesis, common applications of AI were analyzed: profiling, biometric identification, and content generation, as well as the procedure of training AI systems, which involves web scraping as a method of collecting information from websites. Since these methods are mostly denoted by neologisms, their essence is described using the etymological method. The concept of personal data processing in the legislation of countries around the world has been studied, and it has been shown by which criteria, with respect to the object of processing, processing methods can be assessed for potential danger.
The results of the study lead to the conclusion that the processing of personal data by AI systems may pose an unacceptable risk, violate the requirement of personal data accuracy, threaten the enhanced protection of "sensitive data" and information about children, occur without the data subject's awareness, and create situations where personal data is stored significantly longer than necessary for the defined purpose and cannot be corrected (i.e., the right to rectification cannot be exercised). The factors that cause violations of personal data protection, aside from the lack of transparency, include the inability to ensure full control over AI operational processes, active interaction with other systems and networks, and insufficient protection against data leaks, unauthorized access, and other threats.

Author Biography

Andrii HACHKEVYCH, Institute of Law, Psychology, and Innovation Education, Lviv Polytechnic National University

PhD (Law), Associate Professor, Associate Professor at the Department of International and Criminal Law

References

Borak, M. (2025, June 11). Clearview AI faces more legal uncertainty in UK and US. Biometric Update. https://www.biometricupdate.com/202506/clearview-ai-faces-more-legal-uncertainty-in-uk-and-us

Brazilian Data Protection Law LGPD. (2018). ANPD. https://www.gov.br/anpd/pt-br/centrais-de-conteudo/outros-documentos-e-publicacoes-institucionais/lgpd-en-lei-no-13-709-capa.pdf

Bulgakova, D. (2022). Unique human identification under the GDPR Article 9(1) (2). Philosophy of Law and General Theory of Law, 1, 130-159. https://doi.org/10.21564/2707-7039.1.275645

California Legislative Information. (2018). California Consumer Privacy Act of 2018. https://leginfo.legislature.ca.gov/faces/codes_displayText.xhtml?division=3.&part=4.&lawCode=CIV&title=1.81.5

California Legislative Information. (2021). California Genetic Information Privacy Act of 2021. https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=202120220SB41

Cambridge Dictionary. (2025a). Biometric. https://dictionary.cambridge.org/dictionary/english/biometric

Cambridge Dictionary. (2025b). Web-scraping. https://dictionary.cambridge.org/dictionary/english/web-scraping

European Data Protection Board. (2024). Opinion 28/2024 on certain data protection aspects related to the processing of personal data in the context of AI models. https://www.edpb.europa.eu/system/files/2024-12/edpb_opinion_202428_ai-models_en.pdf

European Union. (2025). The EU Artificial Intelligence Act. https://artificialintelligenceact.eu/

Fraser, G. (2024, December 13). BBC complains to Apple over misleading shooting headline. BBC. https://www.bbc.com/news/articles/cd0elzk24dno

GDPR. (2016). Art. 4 - Definitions. In General Data Protection Regulation (GDPR). https://gdpr-info.eu/art-4-gdpr/

General Data Protection Regulation. (2025, April 9). Best 3 GDPR risk assessment examples. https://gdprinfo.eu/best-3-gdpr-risk-assessment-examples

Hachkevych, A. (2025a). "Synthetic Creativity" of Generative Artificial Intelligence Poses Challenges for Legal Protection of Copyright and Related Rights. Bulletin of Lviv Polytechnic National University. Series: Legal Sciences, 12(3), 43-50. https://doi.org/10.23939/law2025.47.043

Hachkevych, A. (2025b). The impact of generative artificial intelligence on the legal systems of contemporary states. Law and Innovative Society, 1(24), 37-46. https://doi.org/10.37772/2309-9275-2025-1(24)-3

Humerick, M. (2018). Taking AI Personally: How the E.U. Must Learn to Balance the Interests of Personal Data Privacy & Artificial Intelligence. Santa Clara High Technology Law Journal, 34(4), 393-418.

Kelsey-Sugg, A., & Carrick, D. (2024, November 3). AI hallucinations caused artificial intelligence to falsely describe these people as criminals. ABC News. https://www.abc.net.au/news/2024-11-04/ai-artificial-intelligence-hallucinations-defamation-chatgpt/104518612

Kleinman, Z. (2018, March 21). Cambridge Analytica: The story so far. BBC News. https://www.bbc.com/news/technology-43465968

Law relating to the protection of natural persons in the processing of personal data. (2018). Official Journal of the Peopleʼs Democratic Republic of Algeria, Conventions and International Agreements. https://www.joradp.dz/FTP/JO-FRANCAIS/2018/F2018034.pdf

Mayers, L., Martin, S., & Rybicki, D. (2023, April 6). Victorian mayor may sue OpenAI after ChatGPT "accuses" him in bribery case. ABC News. https://www.abc.net.au/news/2023-04-06/hepburn-mayor-flags-legal-action-over-false-chatgpt-claims/102195610

No. 4 of 2013: Protection of Personal Information Act, 2013. (2013). Republic of South Africa Government Gazette. https://www.gov.za/sites/default/files/gcis_document/201409/3706726-11act4of2013protectionofpersonalinforcorrect.pdf

Higher Regional Court of Cologne, 15 UKl 2/25. (2025). Justiz.nrw.de. https://nrwe.justiz.nrw.de/olgs/koeln/j2025/15_UKl_2_25_Urteil_20250523.html

Onik, M. M. H., Kim, C.-S., & Yang, J. (2019). Personal Data Privacy Challenges of the Fourth Industrial Revolution. In 2019 21st International Conference on Advanced Communication Technology (ICACT) (pp. 635-638). IEEE. https://doi.org/10.23919/ICACT.2019.8701932

Sartor, G., & Lagioia, F. (2020). The impact of the General Data Protection Regulation (GDPR) on artificial intelligence. European Parliament. https://doi.org/10.2861/293

Singapore Statutes Online. (2012). Personal Data Protection Act 2012 - Singapore Statutes Online. https://sso.agc.gov.sg/Act/PDPA2012

Dmytriiev, O. (2019a). Content - interpretation, spelling, new online orthography. https://slovnyk.ua/index.php?swrd=контент

Dmytriiev, O. (2019b). Profile - interpretation, spelling, new online orthography. https://slovnyk.ua/index.php?swrd=профіль

The Law of Ukraine "On the Protection of Personal Data" No. 2297-VI. (2010). Official Website of the Parliament of Ukraine. https://zakon.rada.gov.ua/laws/show/2297-17#Text

Published

2026-03-12

How to Cite

[1]
HACHKEVYCH, A. 2026. "Grey areas" of legal regulation of AI and personal data processing. Ius Modernum. 142, 1 (Mar. 2026), 121-135. DOI: https://doi.org/10.31617/3.2026(142)09.

Issue

Section

IUS NATIONALE