02 Nov 2016 | Jim Killock
Facebook is right to sink Admiral’s app
Late yesterday, on the eve of Admiral's launch of Firstcarquote, Facebook revoked the application's permission to use its data.
According to Admiral’s press release, their app would use “social data personality assessments, matched to real claims data, to better understand first time drivers and more accurately predict risk.” So young people could offer up their Facebook posts in the hope of getting a reduction in their car insurance.
However, the application was found to be in breach of section 3.15 of Facebook’s Platform Policy, which states:
Don’t use data obtained from Facebook to make decisions about eligibility, including whether to approve or reject an application or how much interest to charge on a loan.
Firstcarquote’s site says:
“We were really hoping to have our sparkling new product ready for you, but there’s a hitch: we still have to sort a few final details.”
Like persuading Facebook to change their Platform Policy.
There are significant risks in allowing the financial or insurance industry to base assessments on our social media activity. We might be penalised for our posts or denied benefits and discounts because we don’t share enough or have interests that mark us out as different and somehow unreliable. Whether intentional or not, algorithms could perpetuate social biases that are based on race, gender, religion or sexuality. Without knowing the criteria for such decisions, how can we appeal against them? Will we start self-censoring our social media out of fear that we will be judged a high risk at some point in the future?
These practices could not only change how we use platforms like Facebook but also have the potential to undermine our trust in them. It is sensible for Facebook to continue to restrict these activities, despite patents indicating that they may themselves wish to monetise Facebook data in this kind of way.
Insurers and financial companies who are beginning to use social media data need to engage in a public discussion about the ethics of these practices, which allow a very intense examination of factors that are entirely non-financial.
Companies like Admiral also need to think about how using such fluid personal information leaves their system vulnerable to being gamed. How hard would it be to work out what “likes” Admiral views as favourable, or unfavourable, and alter your profile accordingly? What we regard as a chilling effect could also turn out to be an incentive to cheat.
We must also recognise that these problems may confront us in the future, as a result of the changes introduced by the forthcoming General Data Protection Regulation (GDPR). The government is clear this will enter UK law regardless of Brexit, which is sensible.
The GDPR creates many new rights for individuals, including the well-known right to have your data deleted, and the right to obtain all of your information free of charge and in electronic format, known as “data portability”.
Data portability creates significant risks as well as benefits. It could be very hard to stop some industries from abusing the trust of individuals: asking them to share their data wholesale in return for discounts or favourable deals, while not being entirely upfront about the downsides for the consumer.
The GDPR also includes extra protections around profiling, and particularly important is the right to have information deleted if you find you have “overshared”.
Nevertheless, Admiral’s application shows a lack of understanding of the risks and responsibilities in parts of the financial industry. Indeed, Admiral appear not even to have done the basics: reading Facebook’s terms and conditions, or understanding the capacity for their product to be gamed. If this disregard is symptomatic, it may point to a need for sector-specific privacy legislation for the financial industry, to further protect consumers from abuse through the use of inappropriate or unreliable data.