Know Your Medium, Part 2

Previously, I introduced the topic of linguistic relativity—how the choice of “language” affects what concepts are easy to think about.

Another wrinkle of linguistic relativity is that a language affects what you are obliged to think about. For example, when talking about an event in English, we need to consider when it happened (past/present/future tense). Other languages include what’s called evidentiality1, where you must also convey how you know about the event: did you see it yourself, or did someone else tell you about it (first-hand, second-hand, etc.)?

These considerations (what am I forced to convey? what is going to be difficult to convey?) are important when you are trying to tell a story, as the answers are different for a novel than they are for a screenplay.

For example, a common “bad writing” complaint is a book starting with the main character examining themselves in the mirror2 (thus providing a description). The reason this keeps cropping up is that—with only text—it’s not easy to show what a character looks like. Typically, one or two salient features will be mentioned about a character, and the rest will be left to the reader’s imagination3.

In contrast, a scriptwriter would have to tie themselves in some very uncomfortable knots in order to not present a character’s appearance to the audience. It happens the instant the actor emerges. What is difficult is revealing a character’s name. If they are being introduced, that’s easy enough, but there are bound to be characters the protagonist already knows (but the audience doesn’t). Naming is comparatively trivial in a novel.

There’s a deeper significance to this movie-appearance/book-name difference, though, which becomes apparent when it comes to a certain type of twist ending: a character (especially the protagonist) is not who they have seemed to be. Sometimes, this is accomplished by having the character masked, in the shadows4, or otherwise hidden until a dramatic reveal at the end. This can be very effective if done well, like in the original Metroid game where the main character, bounty hunter Samus Aran, is unexpectedly revealed to be female, smashing players’ preconceptions.

This sort of twist crops up more often in sci-fi/fantasy settings, where hand-waves like “life-like androids”, “clones”, “plastic surgery”, or good old-fashioned “magic” allow for a character to not be who they appear to be. But its success is not dependent on the justification (if the rest is done well, the audience are more forgiving). There are a couple of ways in which the writer can trip themselves up with this trope, which require some detailed examples.

(Cue spoiler warnings for the films “The Tourist” and “Self/Less”.)

In The Tourist, Johnny Depp’s main character seems to be an everyman caught up in the hunt for a vanished criminal mastermind. Interpol want to catch the baddie. The other crooks want his loot stash. The femme fatale has made everyone think Johnny is the crook (post plastic surgery). After many hijinks, the cops shoot/arrest the other crooks, and Johnny is free to go. But wait! He knows how to get at the crook’s secret safe (he is the criminal mastermind after all), so he gets the money and the leading lady, and lives happily ever after.

Based on the presentation (i.e. the cinematic language), this is a happy ending. Emotionally, we go along with it, because the face we’ve been following/rooting for throughout has won. But when you pause a moment, you feel discomfited: the character you’re attached to is a cunning criminal, who changed his entire appearance to escape the police. This made for an awkward ending.

In Self/Less, old, rich, and ailing Ben Kingsley undergoes a black-market medical procedure to transfer his consciousness to a younger, healthier body (Ryan Reynolds). We follow Ryan as his initial carefree hedonism turns to concern over the weird dreams/flashbacks he starts having (especially when he forgets to take his meds). Eventually, he discovers his new body is not “vat grown”, but originally belonged to someone else, who sold it to pay his family’s debts. Ultimately, Ryan brings down the immoral scientist doing the mind transfers, stops taking the medication (so “Ben Kingsley” fades away), and reunites with his family in traditional Hollywood-happy-ending fashion. But wait! Though we’re attached to his face, we know basically nothing about this Ryan Reynolds. Again, there’s something slightly awkward about the ending.

Both movies kind of got away with it (though neither were especially critically successful), but it wouldn’t have worked at all in a novel. There we’re attached to a name, not a face, and it would be more obvious that we’re actually dealing with a different person, but in a movie we’re not obliged to think about that.

The point is to know the medium you’re working in. What is easy? What is hard? What do you need to think about? And most importantly, what do you not need to think about but might trip you up later?

1 Several languages are mentioned in the Wikipedia page; the impression I got (which may be inaccurate, I’m no expert on languages) was that a lot of them were from Eastern Europe, the Middle East, or the Americas.

2 As with all writing “rules”, there are exceptions: Divergent gets away with it, as the scene also reveals details about the world, i.e. that these people restrict the use of mirrors.

3 An interesting example of this cropped up with the casting of Harry Potter and the Cursed Child. Noma Dumezweni was cast as Hermione, sparking much debate. JK Rowling pointed out that the character’s race was never specified. (Did the author envision Hermione like that to begin with? Your guess is as good as mine.)

4 Easy in a novel, requires lots of tricky lighting to make it work in a movie.


Know Your Medium, Part 1

The concept behind Linguistic Relativity1 has been around for quite a long time (predictably, Greek philosophers had musings on the topic). Summarised, it is the idea that the language we speak shapes the way we think.

Now that sounds fairly reasonable. But it has caused controversy when it has been presented as linguistic determinism; that your language restricts what you are able to think. In this form, it is argued that if a language has no word for something, then people who speak that language cannot conceive of that thing. English itself is a fantastic counter to this—for example, we had no word for Schadenfreude, so we nabbed3 it from German.

The evidence does support, however, that particular concepts become easier/harder to consider/discuss in different languages. And again, this is fairly intuitive—it’s harder to express yourself to others if you lack the vocabulary4. Where I find it particularly interesting, though, is the way the concept applies to other forms of communication. For example, the same tune could be notated differently for different instruments (standard sheet music versus guitar chord diagrams, say).

One of my jobs has been (essentially) training problem-solving, and an important tool in solving any problem is notation. If you’re faced with a problem like:

My grandson is about as many days old as my son is weeks old, and my grandson is as many months old as I am years old. Together, my grandson, my son and I total 120 years. Can you tell me my age in years?

You may find it much easier to work with (and ultimately solve) once you translate it (where g, s, and i are the ages of the grandson, son, and “I” respectively)5:

g x 365 = s x 52
g x 12 = i
g + s + i = 120
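Once the words become equations, the system can be ground out mechanically. A minimal sketch in Python (assuming the same rough conversions as above: 365 days and 52 weeks to a year), substituting s and i in terms of g:

```python
from fractions import Fraction

# From g * 365 = s * 52:  s = g * 365/52
# From g * 12 = i:        i = g * 12
# Substituting into g + s + i = 120 gives g * (1 + 365/52 + 12) = 120.
coeff = 1 + Fraction(365, 52) + 12
g = Fraction(120) / coeff     # grandson's age in years (exact)
s = g * Fraction(365, 52)     # son's age in years
i = g * 12                    # "I"'s age in years

print(round(g), round(s), round(i))  # 6 42 72
```

The exact values fall slightly off whole numbers (hence the puzzle’s “about”), which is why the ages get rounded.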

Where am I going with this? The point is that any form of communication involves a vocabulary (in the more general sense), which will be more accommodating to some ideas than others. I plan to delve into some more specific examples (comparing books and movies, as I am wont to do), but this has gotten long enough (and I’m getting muddled with my footnote numbering), so that will have to wait for next time. Ciao6.

1 You may have heard of it as the “Sapir-Whorf Hypothesis”, an honorific title at best, as the two (Edward Sapir and Benjamin Lee Whorf2) never directly collaborated.

2 Whorf also helped popularise the famous not-really-a-fact about Inuit having many different words for snow.

3 I’ve always liked James Nicoll’s quote: “We don’t just borrow words; on occasion, English has pursued other languages down alleyways to beat them unconscious and rifle their pockets for new vocabulary.”

4 Case in point, 2-year-olds. Eternally frustrated that Mum and Dad just don’t seem to get them. Some would argue this phase lasts about 18 years. Others would say it never ends.

5 If you’re interested, their ages (in years, rounding to the nearest year) are 6 (grandson), 42 (son), and 72 (“I”).

6 In English it means “goodbye”, but it was purloined from (Venetian) Italian where it could be used as either greeting or farewell. A more literal translation might be “at your service”. Just thought you might like to know that.

Twin Medics

I’m generally a fan of thought experiments (see the blog title, for example), whether about the nature of reality, ethics1, language and meaning, technology, or anything else. They may be called by various names: experiments, paradoxes2, dilemmas, problems…

The advantage of a thought experiment is that it allows one (or many) to consider the nuances and implications of a situation before getting into it. This is especially handy if the situation is one that requires quick decisions, or has a high cost. Plus, it’s just interesting to consider what might be, and what are the implications and ramifications of a decision.

I find some of these scenarios a little frustrating, however. There may be a good point behind them, but the way they are presented means the immediate solution is a matter of grammar or semantics. For example, the “Omnipotence Paradox”, usually expressed as (something like) “if God can do anything, can he create a stone too heavy for him to lift?”. Whether the answer is yes or no, it establishes something that God cannot do, thus God cannot be omnipotent. It’s really about the logical inconsistencies of our concept of omnipotence, and the limitations of our language in expressing certain concepts. Which is fine, those are worthy topics of discussion, but we shouldn’t claim it tells us anything useful about the nature/existence of God.

Another famous one that doesn’t really hold up is the “Epimenides’ Paradox”, named after a Cretan philosopher who claimed that all Cretans were liars. But he was a Cretan, so he must have been lying. So Cretans are not liars, so he was telling the truth, so … 😕

But that's a false dichotomy. The statement “All Cretans are liars” is not the same as the (more specific) statement “All Cretans always lie”. In the real world no-one lies all the time (despite recent evidence). Of more relevance is the (somewhat blander and more formal) “Liar Paradox”, encapsulated in “This sentence is false”. This has been the basis of much discussion of the problems of self-referential language.

Speaking of lying, though, I saw an article purporting to list the 5 lies you are allowed to tell in a relationship. The morality of lying has been a hobbyhorse of mine, so I was intrigued. But ultimately disappointed. Their list of acceptable topics to lie about was:

  1. Whether you liked the meal they cooked
  2. Whether the hotel room is okay
  3. Whether it’s fine for their family to visit
  4. Whether those clothes look good on them
  5. Whether they’re right (in an argument)

In general, this seems to be mistaking lying for diplomacy. In all these situations, lying about your feelings to spare theirs is a bad idea. Again, it’s presenting a false dichotomy: you have more options than lying through your teeth, or giving it to them with both barrels. Telling the truth can (and should) be done gently, and with respect for the person you’re talking to. It’s a lack of that respect that makes the truth blunt and rude.

A specific note on outfits: they advise praising an outfit that works and saying nothing about an outfit that doesn’t (i.e. lying by omission). Again, the truth would be better, but this is a scenario where you have to show you deserve the right to tell the truth. The stereotypical girlfriend’s “test” (“Does this make my bum look big?”) isn’t about the clothes. It’s not a yes/no question. You pass by showing that you want her to look good, and can say something’s wrong without hurting her feelings.

Ultimately, don’t you want those close to you to respect and value your feelings and opinions? How can they do that if you’re not being honest?

1 A topical example is the Trolley Problem—first popularised in the late 1960s—which directly relates to the decision-making of automated vehicles in potential crash situations (do you drive into the other car, or the pedestrian?).

2 Yes, the heading is a dreadful pun. No, I’m not sorry. 😛

Potential retitle

One of the things I’ve been considering is renaming this blog to make it a bit clearer what it’s about. I’m not intending to change the overall purpose, that is, sharing thoughts on various topics.

I’ve noticed that I tend to (over)think about things differently to most people, and I make connections between seemingly disparate topics, so I’d like to inject some of that flavour (i.e. an unusual perspective/unexpected links) into the title and tagline. But of course, it should be short, memorable, and easy to understand, which I’m not so good at. 😉

I’ve thought of a few ideas, so I’m going to try out this “poll” thingy (hopefully it works). Any and all feedback is most welcome, especially if you have alternative suggestions.

EDIT: Manual poll – please leave your response in the comments

  • The Odd-ball and Chain
  • Ponderlust (as in “wanderlust”)
  • Mulled Lines
  • Exteriordinary Thoughts (portmanteau of exterior and ordinary)
  • Musing Alfresco
  • The Nut’s Case (as in “making a case”)

You Keep Using That Word…

“…I do not think it means what you think it means.”

The internet gives any pleb* the ability to communicate with a much wider audience than ever before, but this comes with a significant downside. And I mean besides the “Pro: anyone can publish. Con: anyone can publish” issue.

While there are photos, videos, and podcasts, they take time to prepare, so the vast majority of interaction is still text-based (forums, blogs, articles, etc). Anyone familiar with the old chestnut about “only 7% of communication is words, the rest is tone, facial expression, and body language”** will see the problem. When you add in the variety of cultures, and that many are not communicating in their native language, there is a huge potential for misunderstandings.

Communication Theory points out that in order to convey a concept from source to destination (e.g. my brain to yours) the information must be translated into some external message which is then received and interpreted at the other end (in this case, words in a blog post). This presents two obvious places where things can go wrong: I may struggle to put my thoughts into words (yes, frequently); you may misunderstand what I have written (e.g. due to ambiguity). And that is assuming the encoding is agreed upon (e.g. that we agree on the meaning of the words).
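That last failure mode (a disagreed-upon encoding) can be made quite literal in code. A minimal sketch in Python, using character encodings to stand in for the shared convention (the specific encodings are just illustrative):

```python
message = "naïve"                    # what the source means to say
encoded = message.encode("utf-8")    # translated into an external form (bytes)

ok = encoded.decode("utf-8")         # receiver shares the convention
garbled = encoded.decode("latin-1")  # receiver assumes a different one

print(ok)       # naïve
print(garbled)  # naÃ¯ve
```

The bytes on the wire are identical in both cases; only the receiver’s assumption about the code differs, and that alone is enough to mangle the message.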

A good example of this phenomenon is that sarcasm is easy to miss in plain text, so many people have tried to design special punctuation to indicate it. None have succeeded (which is kind of a shame): not being universally recognised, their use instead presents another avenue for misunderstanding and confusion. How ironic.

There must similarly be universally-agreed meanings in order for any words to be useful. Which is fine for words in common use, but if I were to hazard some atypical language, bystanders might find themselves at sixes and sevens.

Combine this with the dehumanising effect of conversing via a computer screen (with an ethereal person—or persons—of unknown nature and location, rather than concrete individuals right in front of you***), and it’s not surprising that arguments erupt online. I’m not sure how to solve this problem, but at some point I’ll probably post about how I try to avoid getting caught up in it myself (which may or may not work for anyone else).

* Well, anyone sufficiently fortunate to have access to the necessary infrastructure, sufficient income to pay for it, and enough time to present their works for perusal.

** Ironically, the original study behind this concept has been misunderstood. What it actually shows (if anything) is that when words and tone/body language are inconsistent, the audience is more likely to go with the non-verbal cues.

*** Hmmm, there’s a thought. I wonder if anyone’s looked into whether online discussions are more polite when people’s posts are accompanied by actual photos of them (as opposed to animals, celebrities, symbols, and so forth).

Are Lies Ever White?

For a long time I’ve been puzzled by the idea of lying. Various moral philosophies have fairly clear edicts on the matter (i.e. deceit = bad) and this seems to be the prevailing opinion. Sure, there are those who see absolutely no problem with lying, but they’re generally either fictional, psychopaths, or both.

But at the same time, this suggests that most people are hypocrites, in that they decry lying with one hand, and indulge in it with the other, justifying things as “social niceties” or “little white lies” (textbook denial-by-diminishing). But I tend to agree with the justification: there are plenty of times when telling the truth would be impolite, unwise, or in some way less good than not doing so.

So what’s the deal? Are we all filthy liars, or is lying not as bad as it’s portrayed?

Actually, I think there’s something else involved (from my perspective at least). Part of my confusion has been due to my perception of what a “lie” is, which has been influenced more than the average by “Knights and Knaves” logic puzzles (e.g. the two doors, two guards scene in the film Labyrinth).

The effect was that I had a very black-and-white view of lying: any false statement is a lie. More recently, harkening back to my thoughts on morality, I’ve realised that this neglects the intentions of the “liar”. Do they realise what they’re saying is false? Or are they mistaken? Misinformed? Deluded? Why are they making a false statement?

Another distinction that should be made is between lying by commission (stating something false) and lying by omission (leaving out something true). It took me a while to come to grips with the idea that someone saying “How are you?” is just making conversation, and you don’t need to feel uncomfortable about saying that you’re fine even though you’re actually traumatised over yesterday’s episode of Days Of Our Lives. There’s nothing wrong with keeping private things private.

Another situation that seems, at face value, to be lying is the whole area of fiction: literature, theatre, comedy, etc. Knowingly asserting things that are not true, but without malice. Indeed, there is an implicit assumption that both parties know the story is false. Similarly in various types of games, deliberate deceit is a component (e.g. bluffing at poker, dummy passes in football), but it’s under specific conditions.

I don’t want to suggest by this that lying is a good thing; merely clarify the definition. Much of the function of society depends on there being a level of trust between people. What got me (back) onto this topic was an episode of the TV show Perception (another in the “abrasive, mentally-ill, but brilliant layman* helps solve crimes” genre – this time with a schizophrenic neuroscience professor) which mentioned that we react to lies with the same part of our brains that processes pain: discovering you are being deceived literally feels uncomfortable. I also vaguely recollect reading (somewhere) that when we hear a statement, our brain processes it as though it were true (which I suspect is why rumours can hurt so much), and we have to actively refute it (literally have second thoughts about it).

To conclude with a metaphor (because I like metaphors**): tigers are dangerous, but we shouldn’t treat zebras the same way just because they’re stripey.

* Because they’re almost always men (which is a whole other kettle of worms).

** I’m rather fond of footnotes, too.

Holding out for a hero

What does it mean to be a hero?

The word apparently originated with the ancient Greeks, referring to demigods and other more-than-human characters in their mythology. The more recent (and marvellously circular) definition is someone who is admired for their heroic qualities (particularly courage or nobility) and/or great achievements.*

While this captures the strict definition of the word, I think it’s missing some nuance. I don’t fault dictionaries for this, they tend to be a little behind the dynamic frontier of language-as-she-is-spake, and it may vary slightly in different cultures. As far as I can see, people could intend one of two concepts (or possibly both) when they describe someone as a hero:

  • A protector, rescuer, or provider. Someone who came through when you needed them.
  • An idol, role model, or good example. Someone you want to be.

Both fit into the standard definition (someone admired for their qualities or achievements), but engender different feelings: gratitude or aspiration. This distinction may help explain why some are affected more by the halo effect than others; with a hero you are grateful to but don’t want to be, you already have a sense of their less-desirable traits (or at least that they are different from you).

The other issue may be the modern tendency towards polarised opinions. Thus a public figure (take your favourite sportsperson for example) cannot be merely “okay”; they are either considered the greatest thing since dimpled golf balls or the most overrated player in history.

I guess it’s another example of the phenomenon of people not having sensible opinions because that requires thought (read: effort). Hmmm. Maybe I should write a post on that sometime…

* The definition of “hero” also refers to the protagonist of a story, or a particular type of (especially swashbuckling) sandwich. I think we can accept that these are not relevant to the current discussion.

I do not think it means what you think it means

When we talk about people having the same cultural background, we mean they have a shared history. Not necessarily in terms of actual experiences (though it can be—witness the cohesion between people who have lived through the same event, for example), but that over their lifetimes they have accumulated various adages, ideas, and norms.

Mostly, these operate on a subconscious, background level; affecting your behaviour, but not something you explicitly think about. This is why culture-shock can be so significant—everyone involved is taking certain things for granted, so any differences aren’t recognised until it is too late.

Something I find interesting is that a lot of these ideas are based on metaphors—explaining how one aspect of the world works by equating it with another. Metaphors are incredibly valuable, but they are never perfect. It can be amusing (or, occasionally, worrying) to delve into these metaphors and explore their limits. How far can you stretch this comparison before it breaks?*

One concept that I’ve been pondering lately is that of “love”. (Huge topic, I know. 🙂 ) Specifically, a couple of the common metaphors that attempt to express some aspects of it: a battlefield, and a lottery.

You may know of the song Love is a Battlefield, but it’s not the only place the concept crops up. For example, couples going through difficult times may be encouraged to fight for the relationship. Both metaphors capture the unpredictability/lack of control experienced by those in the throes of passion, as well as the sense of there being something one hopes for. People talk about “winning” or being “lucky” in love.

As usual, however, these metaphors can be troublesome when over-applied. When love is viewed as a prize/goal, people start getting frustrated; they feel they’ve “bought enough tickets” or “fought hard enough”, as it were, that they “deserve” to win something. An unreasonable focus is placed on the end result, not on the process of actually getting to know another human being (which, you know, is the point).

Similarly, the danger of the battlefield metaphor is that it’s ambiguous who the enemy is. It should be the vagaries and difficulties of life that make it hard to find time to devote to each other, but it can all too easily be assumed to be the other person (with predictably disastrous consequences—you should be cooperating, not competing).

So, I guess my point is that it’s worth stopping occasionally to think about how you’re viewing something, and considering whether the metaphor you’re using actually applies.

Do you have your own examples of misapplied concepts?

* Some, not very far at all. For example, the quote “Love means never having to say you’re sorry.” I think I understand what is intended, but there’s more wrong than right with this concept.

What’s in a name?

Most of the time, especially when dealing with the spoken rather then the written word, someone can use the wrong variant and no-one will notice and/or care. But not when it comes to names (or your own at least).

(Apologies to the people whose hackles were raised by my use of “then” instead of “than” in the previous paragraph. 🙂 )

There are a lot of names that either have different spellings (e.g. Ann/Anne), or are similar to other names (e.g. Davis/Davies), which increases the likelihood of mistakes. This concerns me, because I’m terrible at remembering names at the best of times. I did well at one place I worked, but there I had lists I could refer back to.

People often use variations of my name, which is frustrating. Spelling or grammatical errors I can brush off, but this is not just a word, it’s a representation of who I am. In the same way that clothes and make-up can enable people to present themselves the way they want to be seen, the way someone is referred to is very personal, and it can feel disrespectful to be mis-referenced (you could always respond with “Segmentation Fault!”, but most people wouldn’t understand).

What got me thinking about the topic is seeing a movie trailer for the film Rio 2. The main characters (named Blue and Jewel) and their kids meet Jewel’s long-lost father, and the usual sort of tired tension-with-her-father jokes ensue. In particular, when the kids are introduced, politely expressing “Nice to meet you, sir”, their newly-discovered grandfather says to call him “Pop-Pop”*. Naturally, when Blue later (speaking to his kids) refers to him as “Pop-Pop”, he is given a stern eyeballing and told “You call me Sir.”

Regardless of the illogic of this particular example**, this is doing the opposite of what I described last time: continuing to use stereotypes that really should be retired. In this instance, the grandfather has no reason to be any less welcoming towards Blue except “He took away my little girl!”. Even though he hasn’t seen said “little girl” (now an adult woman… er… parrot) for a long time. It hearkens uncomfortably back to the archaic concept of a marriage as essentially a business transaction between two men. That said, I haven’t seen the whole film, and it could be the basis of an interesting character arc where the grandfather realises that the real focus of his resentment is whoever took Jewel into captivity, and that he’s just glad she’s alright. But it’ll probably be Blue-does-something-vaguely-heroic, thus earning (and I use the word “earning” deliberately) a “Well, I guess you’re alright. But you still call me Sir.”

* It’s good that there are a wide variety of nicknames for grandparents — you should be able to settle on something distinct for each. Otherwise, when you mention “Grandpa”, does that mean Mum’s dad, or Dad’s dad? One of my grandmothers (possibly due to not feeling old enough to be a “Nana” or “Granny”) preferred we call her by her given name. Not knowing anyone else by that name, I spent part of my early childhood thinking it was another word for grandmother. Families are weird. 😉

** Supposing you were visiting your friend Bob, and his daughter Carla answers the door. It makes more sense to say something like “Hi Carla, is your Dad at home?” rather than “…is Bob at home?”. You certainly don’t call Bob “Dad”, but Carla does, so there’s less potential for confusion.