The corporate world is flooded with data on consumers.
Much of this data is tied to you personally – how much you spend, what you spend your money on, your lines of credit, and how you are managing your debt – and the law requires your consent for such information to be gathered and used.
But there’s also a mountain of de-personalised data out there, which is used to assess market trends and financial risks among different demographic groups.
Banks, credit bureaus, insurance companies, large retailers, market research companies – all have access to this wealth of information to some extent.
So are they using this data responsibly and is your privacy being sufficiently recognised and protected?
A few weeks ago, I attended the Second Africa Credit and Risk Reporting Conference, which provided fascinating insights into the world of consumer-related data as used by the credit industry to assess risk, but also into how it is treated more generally in the corporate space.
The Protection of Personal Information (Popi) Act, enacted in 2013 and in full effect since July 2020, sets conditions for the lawful collection, processing, storage, and sharing of your personal information by public and private organisations.
It distinguishes between the responsible party, which determines why and how personal information is processed, and operators, which process it on the responsible party's behalf.
The Act places the obligation to protect personal information on the responsible party, including where it is processed by operators and passed on to other third parties.
In her presentation to the conference delegates, Marina Short, CEO of Consumer Profile Bureau, said data was the lifeblood of the credit economy, the “new oil”, driving credit access and inclusion, but she warned that “with great opportunity comes great responsibility”.
One must always remember that behind the data are people. “Consumers and communities place their trust in us to use their data and information responsibly.
If we mishandle that data, we risk eroding the trust on which crucial financial systems depend,” Short said.
She said the question of who owns the data is not clear-cut. At first glance, it seems simple – surely individuals own their data?
Or is it the organisations that collect, store and use it?
“The law tells us that no-one truly owns this data. Responsible parties, as defined in the Popi Act, are the custodians, or stewards of information that is entrusted to them. So the better question is not who owns the data, but who is accountable,” she said.
Short said consumers also need to play a role in protecting their information.
With the widening social media net and new technologies such as biometric data capture, artificial intelligence, and the internet of things, we are all increasingly vulnerable to endangering our privacy without being aware of it. “There’s a lot of data out there that we are unwittingly – and unwillingly – sharing,” she said.
You need to give consent for your personal data to be collected and used, but it should be for a specific purpose. Short said that, all too frequently, data is collected for one purpose and then used for others, such as marketing.
But questions remain. What is the life of that data – under what circumstances can it be stored or removed? And do organisations have the right to make money from your data without compensating you?
Short said the consequences of governance failure on the part of custodians can be devastating. They need to guard against cybercrime but also against third-party breaches.
An app may request access to your data, to which you consent, but the information may then be passed on to or accessed by third parties of whom you have no knowledge.
“In 2024, over 5.5 billion user accounts were compromised globally, of which 35% were through third-party vendors,” Short said.
However, technology can also work in the industry’s favour. Blockchain technology, privacy-enhancing technologies, and alternative methods to assess, for example, a consumer’s credit risk, are now available.
One alternative is for the credit industry to rely more heavily on anonymised, or de-personalised, data from a broader spectrum of sources to assess credit risk among specific demographics.
Anton Grutzmacher, co-founder and chief risk officer of fintech data provider Omnisient, told delegates that technology makes it possible to irreversibly anonymise or “tokenise” a consumer’s data at source – for example, from retailers, telecom providers and utilities providers – which can then be used to model risk behaviours without fear of transgressing privacy protections.
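For readers curious about the mechanics, the idea can be sketched in a few lines of Python. This is an illustrative example only – Omnisient's actual method is not public – and the key handling, field names and ID number below are invented for the illustration. It shows the distinction that matters under the Popi Act: a keyed token is re-identifiable by whoever holds the key, while a token whose key is destroyed at source cannot be mapped back to a person.

```python
import hashlib
import hmac
import secrets

def tokenise(identifier: str, key: bytes) -> str:
    """Replace a personal identifier with a deterministic token.

    Whoever holds `key` can regenerate tokens and so re-identify
    people; in their hands the data remains personal information.
    """
    return hmac.new(key, identifier.encode(), hashlib.sha256).hexdigest()

# At source (e.g. a retailer), a secret key is generated...
key = secrets.token_bytes(32)

# ...and a record is stripped of its identifier before being shared.
# The ID number here is fictitious.
record = {"id_number": "8001015009087", "monthly_spend": 4250.00}
shared = {"token": tokenise(record["id_number"], key),
          "monthly_spend": record["monthly_spend"]}

# The same person always yields the same token, so risk models can
# still link behaviour over time without ever seeing an identity.
assert shared["token"] == tokenise("8001015009087", key)

# If the source then destroys the key, no downstream party has any
# means of re-identification, and the dataset is de-personalised.
del key
```

The design choice mirrors the legal point made later by the ENS lawyers: it is retention or destruction of the re-identification key, not the absence of a name in the dataset, that determines whether the information is personal.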
There are a number of advantages to this method: it upholds transparency, fairness and predictive accuracy, while expanding access to credit to underserved population groups, empowering them without sacrificing oversight or trust.
In a recent blog for law firm ENS, Era Gunning, executive of banking and finance, and Rakhee Dullabh, executive of technology, media and telecommunications, said a court case in the European Union had prompted a reconsideration of what counts as anonymised information where the subject can be re-identified by some means – in other words, although the data doesn’t have your name on it, something in it, or some technique, can be used to re-identify you.
They say the Popi Act is clear that the “responsible party” remains responsible for ensuring that this doesn’t happen.
The judgment reinforces a core lesson, Gunning and Dullabh say: “Identifiability must be assessed from the perspective of the party in possession of the data at any given moment.
A responsible party that retains [the key to re-identifying the data subject] will continue to process ‘personal information’ and must therefore comply with the Popi Act conditions.
Conversely, once the data is disclosed – without the key – to a recipient that has no foreseeable means of re-identification, the dataset will generally be non-personal information in that recipient’s hands.”
Martin Hesse.
PERSONAL FINANCE