The power struggle behind big data: forfeiting emotional autonomy to the machine

I find myself in a lot of debates with those who defend the authority of big data. These debates have made me wonder what kind of perceptions we are generating, and what kinds of attitudes we are embedding into our practice, through this desire for a quantifiable truth. Logical positivism holds that only deduction or direct observation is meaningful. Big data provides us with direct observation: a tangible source to which we can point and which we can take as reality. But what are we really doing in this process, and what kinds of attitudes are being shaped, and are in turn shaping our approach to work?

Much of my own design work involves the direct engagement and documentation of people's thoughts, lives, reactions and emotions. I am a messenger and a representative of someone else's story. Ethnography is the methodology I often adopt in my work, and with it comes an attitude that aims either to accurately describe the reality and perceptions of the individuals I have studied or, sometimes, to emancipate those who are marginalised by a source of power. Yet, more and more, I am asked to introduce quantitative information into the accounts I produce on behalf of my participants, and I read about and witness others attempting the same.


The push for data-driven design

Surveys, sample sizes, statistics and predictive analytics are a few of the big data-inspired demands infiltrating human-centered design. Qualitative information is not enough. The growing perception is that qualitative, "small" data, such as the stories and insights that emerge from ethnographic practice, requires the "validity" and rigour of cold frequency and hard numbers. At its extreme, qualitative research is not even needed: numbers tell a story, and this story is edited by computer processing. Statisticians read the story they receive and report back. A kind of double hermeneutic is at play, except the co-creative conversation we are having about our reality is with a computer.

Reducing what is human into a code or a number to be quantified is, in essence, an attempt to stabilise and transform emotionally driven (irrational) humans into predictable (a la homo economicus) beings. This, of course, is only a superficial representation of what we think is true: the kind of boiled-down, reductionist approach used to make sense of that which cannot be controlled or predicted (much like the foundations of ecology). In the attempt to find a handle for control, big data and quantification are often used to assert authority and preserve hierarchy; data is used to leverage and exercise authority over a project, a client, a community and even society. Furthermore, this increasing preference for data not only reaffirms immediate power and authority but, to a greater extent, requires us to forfeit our emotional autonomy to a machine.

The craving for big data to serve as an 'authority' over our reality means we value and trust an algorithm over that which should understand and represent us best: a human. For anyone worried about AI replacing our jobs, this fervent and blind faith does nothing but accelerate that destiny.


Because the data said so

This insistence on data betrays an underlying desire for an objective authority: a highly rational and unemotional super-parent. We see the computer as greater than us, our own motherboard Messiah that we have crafted to liberate ourselves. I find this liberation often results in laziness, as people point to figures rather than figuring out problems, and in the process lose sight of the perspectives of their peers.

What kind of governance have we created and poured our trust into? We choose a machine to calculate us, and view it as an extension (and association) of the human brain (this makes me think of Herbert Simon). This mechanical organism is a better version of us: it is stronger, possesses a more powerful processing ability and has infinite memory. However, this figurehead for objective authority is rational, emotionless and lacking in empathy, much like the basic traits of a psychopath.

Emotions are what make us irrational beings. Our insatiable desire for data means we also wish to control ourselves, to contain this irrationality in order to become stable and predictable. This predictability is founded on a desire for control, and control is founded on a fear of risk and loss. Loss aversion is one of the strongest human traits driving our irrationality, and we have turned to computers to parent us away from bad decisions. But as any good parent knows, sometimes you need to let your child make a few bad decisions in order to learn and grow.


Not everything that can be counted, counts 

(and no, this is not another Einstein quote)

This quote actually belongs to a sociologist named William Bruce Cameron, who describes beautifully the inherent danger in big data. In a paper dating back to 1963, Cameron states:

“It would be nice if all of the data which sociologists require could be enumerated because then we could run them through IBM machines and draw charts as the economists do. However, not everything that can be counted counts, and not everything that counts can be counted.”


Am I saying we should abandon quantitative data altogether? No, not necessarily. Quantitative information has its purpose and place. What I am saying is that when we are dealing with what it means, in essence, to be human (and to design), we need to think twice before reaching for a number to justify our information. In human-centered work, such as design practice, emotions are just as valid a source of 'data' as numbers are. What we really need to be conscious of is that we do not allow ourselves to treat hard data as a source of information superior to qualitative reasoning. Qualitative and quantitative information should remain equal.

This is why I chose critical realism as the theoretical paradigm for analysing design practice: it provides an intellectual middle ground between our desire for control (quant) and our emotional autonomy (qual), by privileging neither and integrating both where appropriate. We must preserve our right to remain human-centered, and hold each other accountable when the temptation of big data-driven design takes hold.