Current students doing the Marketing Consulting Project might remember guest lecturer Jo Nash Clulow’s talk on ethics. She reminded us that it’s not only our integrity as marketers that’s at stake. From a legal perspective, although offenses such as fraud are clearly unethical, questionable situations can arise at every stage of the data life cycle, potentially exposing an organization to risk. This is why some companies follow ethical guidelines laid out by industry associations to ensure the ethical use of data.
From tracking lives to forecasting crime
When it comes to big data, the sky is the limit. Thanks to complex algorithms, our lives can now be tracked and turned into data that can be used for practically anything, from finding a cure for Alzheimer’s to fighting crime. With all these possible applications, many wonder: will this make life easier or more complicated?
We all remember Minority Report, set in a not-so-distant future where a special crime-fighting unit is able to arrest murderers before they commit their crimes. It all sounds well and good in theory, but what if the ‘predicted’ murderer were you? You might be thinking, yeah, like that’s going to happen. Well, guess what: it’s already happening. In 2011, police in Los Angeles and Manchester ran radical trials using a computer algorithm to try to predict where crime would occur before it was actually committed.
Nope, it’s not science fiction. By analysing large amounts of crime data, also known as 'big data', the police found they could identify the behavioural patterns of criminals and make better use of their resources by targeting the areas where the computer predicted crime would strike. The trials were performed across the UK, from Kent to Yorkshire, with results suggesting that predictive policing models actually work. In Trafford, Manchester, police noted a 26.6% fall in burglaries during 2011, compared to a 9.8% fall across Greater Manchester.
When data falls into the wrong hands
'Big Data' analysts Cambridge Analytica were at the centre of debate last year when they played a key role in Donald Trump's successful presidential campaign - using psychographic profiling, social media surveys and highly targeted demographic data to build detailed profiles of almost half the voters in the United States. This raises a big ethical question: when does a campaign become less about the message and more about demographics and targets?
Besides political biases and personal details, the data also contained citizens' religious beliefs and ethnicity, all stored in spreadsheets uploaded to a server owned by Deep Root Analytics. The cache of data, which was exposed last week, appears to have been collected from a wide range of sources, from Reddit to committees that raised funds for the Republican Party.
Well, that’s embarrassing. So whose fault was it? And who within a company is responsible for mishaps like these? In a blog post on the company’s website, Dan O'Sullivan from Upguard examined the responsibility of those in possession of such data. He stated, "The ability to collect such information and store it insecurely further calls into question the responsibilities owed by private corporations and political campaigns to those citizens targeted by increasingly high-powered data analytics operations."
The end of privacy
While people worry about their data being acquired by marketers, criminals could use this information for far more sinister purposes: identity fraud, harassment of people under protection orders, and intimidation of people with opposing political views. "The potential for this type of data being made available publicly and on the dark web is extremely high," Paul Fletcher, a cyber-security evangelist at security firm Alert Logic, told the BBC.
What does Big Data mean for the future of marketing?
What are the primal behaviours that the best campaigns evoke in us - and how do marketers harness them? Do they manipulate our subconscious instincts and emotions, or simply hold a mirror to them? While we can all agree that big data can be used for good or bad, the big question still remains.
Even if the line hasn’t yet been drawn in the sand, is it still ethical to cross it?