The fallacy of automatic confidentiality statements in company email

It is a standard measure taken by organisations big and small around the world, but attaching confidentiality statements to outgoing emails by default may in fact be “dangerous”, according to Mark Cenite, Acting Division Head of Communication Research at the Wee Kim Wee School of Communication and Information (SCI) in Singapore.

Cenite, who holds a Juris Doctor degree in Law from Stanford University, said that “such blanket statements may be rendered redundant” on the grounds that they are also attached to emails containing non-confidential and trivial information.

“Legally, since the statement appears on every single email, it loses its function to indicate and classify which messages are truly confidential and is of little help,” he said.

According to Cenite, who teaches the fundamentals of confidentiality law to SCI undergraduates, maximum legal protection for sensitive information in emails can only be obtained when senders manually and explicitly indicate the confidential nature of the email.

This can be as simple as typing the word ‘CONFIDENTIAL’ at the beginning of the confidential message.

The lack of utility of the blanket statement, Cenite said, could also “give a false sense of security to people who are communicating confidential information”.

Despite the limited legal value of the measure, Nanyang Technological University (NTU) in Singapore decided to automatically attach confidentiality statements to all outbound staff and faculty emails from October 6 onwards.

Chew Kheng Chuan, Chief University Advancement Officer of NTU, said the “incorporation of confidentiality statements into emails will remind faculty, staff and recipients of the basic privacy expectations that are part of accepted best practices.”

To this, Cenite said: “(The school) is following a norm but that might not be very helpful.”

He proposed a way to better protect the confidentiality of emails.

Cenite said: “Rather than pursuing the blanket approach, I would suggest that they (the school) educate faculty and staff on the basics of confidentiality or give them tips on how to distinguish confidential information.

“It is not a very difficult thing for any organisation to do,” he said.

The trouble with 2012

On a 20-foot-wide video screen plastered on the side of a grey-walled building in the heart of Singapore’s shopping district, images of a crumbling Christ the Redeemer and other world landmarks being mangled beyond recognition play, over and over again.

The cities being swallowed up whole become numerous reflections in the irises of those caught in the assault of doomsday imagery.

Around the world, this scene repeats itself regularly. From neon-doused Tokyo to faded, nondescript suburban malls across America, people stop to take in the bombastic trailer of 2012, which often comes with full sound – hard to ignore even if you do not like it.

Millions of cinema-goers worldwide are exposed to that same trailer every day.

Online, there is a burgeoning discussion about the significance of the ancient Mayan calendar’s end-date of December 21, 2012, which a growing community quite firmly believes denotes the end of the world.

This is something that Roland Emmerich, director and co-writer of 2012, used as “fact” in his film.

The 53-year-old German, who helmed Independence Day and The Day After Tomorrow, said the year-2012 phenomenon on the Internet inspired his latest movie.

Referring to himself and his production crew, Emmerich said: “We thought ‘wow’, this is a really great thing because if so many people believe that the Earth ends (sic), it kind of somewhat correlates and connects (the audience to the film).”

He added: “Recently, there is a pessimism in the world like there is no glorious future, so (audiences) are kind of drawn to end of time scenarios.”

Herein lies my worry: with filmmakers like Emmerich pandering to pessimism and negativity and with audiences oblivious to their intentions, films like 2012, even without explicit sexual acts or acts of violence, contribute negatively to society.

While I understand that it is not the role of creative arts to maintain social good, it is important to note that such films cannot be considered art, nor is figuring out how to aesthetically make skyscrapers fall particularly creative.

Nor is entertainment about feeding people’s fears and making many feel uncomfortable.

In making 2012, Emmerich and his team displayed ambivalence toward audience well-being – a concern grounded in firm theories of social psychology – and plumped for big box-office gains.

The most elementary of these social psychology theories is perhaps the self-fulfilling prophecy. On a personal level, the notion that future events are biased towards how we view them is at once compelling and convincing. Collectively, individuals risk being influenced into committing to negative decisions to the detriment of their communities.

Owing to this, pessimism and negativity do not deserve confirmation, most definitely not on a mass medium. Those with strong beliefs, principles and faith systems will not be affected by 2012’s doomsday scenarios, but those who are down and out and wandering in need of help could certainly use some positive reinforcement rather than being fed constantly with negative, insidious messages.

More complex is how the portrayal of human life as inconsequential may impact audience psychological well-being. This is an interesting direction of study for communications scholars. I remember the disgust that is Transformers: two full-length movies that degrade human beings to the role of props.

The Day After Tomorrow was a fine cautionary tale about global warming, albeit blighted by an array of scientific inaccuracies. Independence Day was a well-executed science-fiction film whose plot served as a stirring metaphor for overcoming adversity. By the looks of it, I cannot see any redeeming message in 2012; I can only see it earning a hell of a lot of money at the box office.

Why not settling for what you have can be a good thing

Imagine: after spending some time thinking of ideas for a creative assignment (advertising/graphic design/business planning etc.), you are mentally exhausted.

You have a few of them on the table: scribbles, sketches, doodles. Inwardly, you quietly concede that none of those ideas is outstanding. (This requires critical evaluation of your own and/or your groupmates’ ideas.)

You proceed to do one of the following:

  1. Pick the one that you feel best about or can most readily live with (i.e. the one you feel least guilty choosing)
  2. Pick the one that you feel would get the best grade/feedback from the client (i.e. the least chance of getting stick)
  3. Pick the one that most group members can agree upon, like good ol’ democracy

Whichever of the above you or your group picks, it would probably be a bad choice. Not that the chosen idea necessarily sucks, but the process inhibits us from excelling in work we take pride in. This is called satisficing.

According to Matthew E. May, the author of In Pursuit of Elegance: Why The Best Ideas Have Something Missing, the word “satisfice” – a blend of “satisfy” and “suffice” that Nobel Laureate Herbert Simon coined over fifty years ago in his book Models of Man – describes the default decision-making process by which we generally accept the first option that offers an acceptable payoff and stop looking for a better way to solve the problem. While satisficing helps us make it through the day, it’s deadly when you’re trying to design a compelling solution.

So when a situation like the above happens, don’t just go back to the drawing board. Break out of the thinking processes and patterns you have been used to. Mark the existing ideas in your head and make a commitment, for the time being, to leave them as they are. It helps to stash them away and make it clear verbally that you will move on; depart your port of call, sail out to the ocean, and explore and fish again. Don’t look back.

How do you do this? May gives an interesting example in his article (the one I have linked to), which I will paraphrase a bit here.

Suppose your client is a kitchen appliance company, and the brief is to market a new convection oven in tropical South-east Asia.

  1. Go to the dictionary, open it to any page, and pick the first noun on the page – for example: cow.
  2. Now brainstorm as many characteristics, concepts, and ideas relating to “cow” as you can – for example: grass, milk, beef, black and white, brown, moo, field, herd, gentle, and docile.
  3. Pick one or two of those associations and relate them back to your problem. Use them to spark creativity and new ways of thinking about ovens. This will help you get off the normal path of ideas associated with appliances – for example, “black and white” might spur the marketing ‘big idea’ of ‘food is either good or bad, there’s no in-between’ or ‘what makes food great?’. Or “brown” together with “gentle” may conjure up some great graphic and semiotic ideas about food roasted to perfection, given most ovens’ tendency to run too hot and over-cook your favourite slab of meat!
  4. Now use this technique with the real problem that you’re trying to solve (a minimal sketch of the whole routine follows below).
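
The steps above are mechanical enough to script, if that helps you commit to the ritual. Here is a minimal Python sketch of the technique; the small word list and the “convection oven” brief are placeholder assumptions standing in for a real dictionary and your real problem:

```python
import random

# Placeholder standing in for "open the dictionary to any page"; in practice
# you might load a real word list such as /usr/share/dict/words.
WORD_LIST = ["cow", "anchor", "lantern", "glacier", "violin", "harvest"]

def random_stimulus():
    """Step 1: pick an arbitrary noun, as you would from a dictionary page."""
    return random.choice(WORD_LIST)

def collect_associations():
    """Step 2: type free associations one per line; blank line to stop."""
    associations = []
    while True:
        word = input("> ").strip()
        if not word:
            break
        associations.append(word)
    return associations

def relate_to_problem(associations, problem):
    """Steps 3-4: frame each association against the real brief, to force
    yourself off the beaten path of ideas associated with the product."""
    return [f"What does '{a}' suggest about {problem}?" for a in associations]

if __name__ == "__main__":
    print(f"Stimulus word: {random_stimulus()}")
    for prompt in relate_to_problem(collect_associations(),
                                    "marketing a convection oven"):
        print(prompt)
```

The script is only a prompt generator, of course; the creative leap of turning “brown” and “gentle” into a roasted-to-perfection campaign still happens in your head.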

Dr KC Yeoh, my mentor/lecturer in Graphic Communications class back in 2008, recognised the importance of not satisficing and helped lift the standard of many a student’s work. Somewhat regrettably, though, I noticed that not all students understood what he was doing, agreed with him, or adopted this very useful philosophy for themselves. I hope this article helps you take another look at how you approach creative work!

Deconstructing Knowledge

May I begin by stating that I do not have much knowledge of the Deconstructionism movement, despite what the title of this post suggests.

Or even whether I could call deconstructionism a movement at all.

The title presents quite succinctly the challenge that I’m faced with – when you expose yourself to the wonderful and wondrous world of knowledge in its many fields, you tend to subsequently construct your thoughts in thick, layered, complex, academic fluff.

Fluff is not jargon. Fluff is an elaboration of sorts, sometimes peppered with jargon.

I regard it as a human instinct to want to appear more sophisticated, unless some other trait takes over, like smugness. The former doesn’t always augur well either. It being an instinct, you don’t always realise it, until someone shoves your work back at you with a reply somewhere between bewilderment and bullshit.

This is where deconstructing – or rather, simplifying – knowledge comes in.

Just now, in the shower (too much info?), I reflected upon what I have read, learnt and gained these past few months. From tweets to books and more books, I could ramble on and on about social media, Web 2.0 and the civil society of Singapore, linking thoughts like a monkey swinging from tree to tree. Sure, covering ground is all good, but am I really getting anywhere? Did I just want to talk and learn as much as I could about the subject matter? Did I just want to play ‘sponge’?

These questions shake me. Even then, sooner is better than later, which is better than never. Truth be told, I had forsaken the true purpose of seeking information. It is perhaps a gross thing to admit, but I was only accumulating knowledge like a landfill, without sorting and compacting what I had gained.

Soon, the unprocessed knowledge will decompose and recede in my memory as vague impressions of something or other.

This is why as we read and read, we need to stop, once in a while. Just stop.

And make sense of what we’ve just read.

And from the masses of information, facts, nuggets, anecdotes – condense them and fashion them into something that is at once inspired yet original.

Something you can pass on to others without them shaking their head at you.

And this is why I’m blogging again.