The issue of privacy, especially the protection of personal data linked to a person’s identity, has come to the fore this month with the coming into force of Thailand’s Personal Data Protection Act (PDPA). It applies to both public and private entities that keep or process personal data concerning other people, and it establishes safeguards to protect people’s privacy.
The PDPA draws heavily on developments in the European Union (EU), especially its General Data Protection Regulation (GDPR), which came into force in 2018 and directly affects Thais and Thai companies with activities in the EU. The EU has also agreed to enact a new law this year, the Digital Services Act, which will counter illegal goods online and help expose algorithms that may impinge on the right to privacy.
In reality, all regions of the globe face mutating digitalisation, which raises new challenges. The most obvious is Extended Reality (ER), which offers users simulated, immersive experiences, such as through special goggles and headsets, and a variety of services through avatars, but which simultaneously collects sensitive data, such as eye movements, nose twitches and facial expressions.
Several principles under these new laws merit attention. First, there is the need for a data controller or processor to obtain the consent of a data subject. This principle of consent is an essential prerequisite in the relationship between the parties concerned.
Yet, that consent should be “informed consent”, meaning that the data controller, in particular, should provide the essential information a person needs to decide whether to consent to having their personal data retained and/or disclosed.
Today, even the notion of informed consent is not necessarily adequate, as the rise of ER has opened a Pandora’s box: a vast amount of data is collected, with multiple possible uses, both positive and negative.
One critical danger is “psychography”, the psychological mapping of a data subject, which may lead to psychological or other forms of profiling that result in discrimination against a person. Consumer education is thus pivotal to enable data subjects to be wary of the consequences.
A better approach is thus to advocate the “consent plus” principle, which calls for consent to be coupled with other measures, such as consumer awareness and the easy readability of the contractual terms that shape that consent.
Second, the right to privacy is not absolute and some data can be revealed for legitimate purposes, even without the consent of a data subject. The acceptable reasons to limit the right to privacy include national security and public health issues.
There are also possible exceptions regarding the use of data for research, historical and statistical purposes. Yet, here, too, safeguards are needed against the over-zealous exposure of personal data for so-called legitimate purposes.
International human rights principles instruct that these purposes must not be arbitrary and those invoking them must prove that the use or exposure of data is genuinely necessary and proportionate to the circumstances at hand.
A key area of concern is that these purposes are often linked with the surveillance of those seen as dissidents or opponents of those in power. The political implications are all too obvious in non-democratic states, especially when coupled with single Internet Gateway laws and central cyber security laws.
Third, there is the principle of data minimisation: those who collect data should collect the minimum, not the maximum, and must demonstrate the functionality of the data they collect and use. Yet what counts as minimal in a world of mutating digitalisation is complex.
If the manufacturers of those goggles and related platform owners claim that data collection “is to enhance the entertainment” of those enjoying a game on screen, the public should not forget that addiction, and possibly neurological harm in the form of psycho-fixations, may ensue.
The targeting of vulnerable groups, such as children, needs to be addressed. One innovation of the EU’s new act is to curb such targeting and impose more controls on data collection.
Fourth, there is the issue of cumulative data and its impact. This concern is closely tied to the new digitalisation, which collects minute data that may appear innocuous when singled out for particular purposes but becomes dangerous when aggregated.
The latter might lead to covert inferences about the race, colour, gender, sexual orientation and social or political origins of a data subject. This scenario is also changing today because ER can collect not only the data of the person wearing the goggles but also data on bystanders, without the latter being aware of the implied surveillance.
Fifth, on a more encouraging note, the new legal developments bring not only an emphasis on various privacy-related rights but also a call for due diligence and accountability in the business sector.
The new Thai law, together with the advent of laws in other countries, embeds various rights to help the data subject.
These include the right to access data, the right to erase data (originally known in Europe as the “right to be forgotten”), the right to rectify data and the right to data portability, ie to transfer data to another service.
The door is open to the online platform industry and related industries to adopt due diligence measures to assess the potential impact of their operations and to prevent or mitigate harm.
The sanctions can be quite daunting for violators. In Thailand, there are both civil damages and criminal penalties. In Europe, fines for breaches of the GDPR can amount to some 4% of a company’s annual global turnover, and under the new EU Digital Services Act, mega-platforms reaching more than 45 million users will face fines of up to 6%.