Published on May 13th, 2014 | by Ganna Pogrebna
Technology, Privacy and Governmental Regulation
Earlier this week, I attended a conference on Household Financial Decision-Making and Behaviour in Nottingham. Held on May 6-8, it was organised by the Network for Integrated Behavioural Science, a research initiative of the University of Warwick, the University of Nottingham and the University of East Anglia, funded by the Economic and Social Research Council (ESRC). The conference concentrated on financial and consumption decisions observed in markets, with many talks devoted to the extent to which modern theories of decision-making can capture these decisions. One of the keynote addresses was delivered by George Loewenstein of Carnegie Mellon University, who is renowned as a leading figure in behavioural economics.
Loewenstein spoke about Behavioural Economics and Privacy in the new digital economy. He referred to a series of decision-making experiments conducted by himself and his co-authors, which demonstrated how individuals recklessly endanger their own privacy by revealing sensitive information together with identifying data, without considering the consequences. For example, an individual may post compromising photos on Facebook without considering that her employer may be regularly monitoring employees' accounts on social media. Loewenstein concluded that individuals generally cannot be trusted with their own privacy and that the government should step in and regulate privacy issues relating to online behaviour. In particular, he suggested that a set of rules should be developed for social media providers such as Facebook, Twitter, YouTube and Google+ to ensure their customers' security.
While the issue of privacy and cyber security on social media is very topical, I am not convinced that governmental regulation can provide a cure. We already see interesting trends suggesting that consumers are becoming increasingly aware of such privacy issues and are choosing services with transparent protocols. For example, Facebook's user base is aging, with younger people switching to services like Twitter and Instagram.
In my opinion, this not only reflects the fact that kids do not want to be present on networks used by their parents, but also shows that consumers of the future prefer social media portals where they can show the world what they see (Instagram) rather than letting the world see them (Facebook). My sister, who is seven years my junior, explains, for example, that she would rather use Instagram, where everyone can see everyone else, than Facebook, where even after setting the strictest access restrictions your information can go viral should your closest Facebook friends decide to tag it.
There is no doubt that, given the business models that currently exist in markets for information, private companies may fail miserably in protecting customer privacy. The most striking example in recent weeks is the case of Snapchat, which deceived its users into believing that the information they exchanged via its services was almost instantly deleted.
Yet there is no guarantee that the government can do a better job! For example, the UK government has traded school data, NHS data and even taxpayer data with private companies, sparking debates about the effectiveness of any form of governmental regulation of information storage and transfer.
I believe that our understanding of privacy and cyber security will undergo a serious transformation in the next few years, with new business models being developed around markets for information. This is one of the issues we are also examining in the Hub-of-All-Things research project.
However (and I might be alone on this), if given a choice between deciding for myself what information to share in a free market or engaging in information exchange in a market where the government acts as some sort of Big Brother, I would choose self-regulated markets every time!