What makes me?

Immigration is always a hot-button topic for politicians, and some seem unable to resist pushing that big, red, shiny button. Almost like they were being (or had been) given rewards on a variable schedule.

Exhibit A: Peters, W.; unable to let an issue pass without adding his two cents. He does raise an interesting theory1, however: is Australia strict on immigrants from New Zealand because NZ is lax on immigrants to NZ? (If you didn’t read the link, he claims immigrants have used NZ as a stepping stone: staying the minimum time to gain citizenship then moving to Australia.)

Even if what he says is true, it doesn’t mean the problem is only on one side. NZ could tighten up its immigration policies2 (and maybe already has). Australia could limit its sanctions (on NZ’ers living in Australia) to only the freshly-minted New Zealanders.


There has been other citizenship-related news recently, too. Several members of the Australian parliament have run into problems due to potentially holding dual citizenship. I can understand the reasoning: a dual citizen is not fully committed to the country, and so should not be an MP. However, it becomes ludicrous when someone can hold dual citizenship without their knowledge or initiative3.

And, of course, there’s the latest kerfuffle from the US4. *sigh* Is there anyone he hasn’t managed to alienate yet?

All these issues are different facets of a point5 that I’ve been wondering about: What does it mean to be a citizen?

When I was growing up, not being well-informed on political history, I had assumed countries existed because they were originally the homeland of a certain ethnic group, but that owing to people migrating around the world, a lot of countries now had a mixture of ethnicities. In a way, this was further reinforced as countries like Yugoslavia, Czechoslovakia, and the USSR separated into distinct states, seemingly along ethnic lines.

I did, however, find it confusing learning about various long-running conflicts that seemed to involve particular ethnic groups whose “homelands” straddled two or more countries (e.g. the Kurdish people). It was only later I realised the countries were often defined as “the bit claimed by [insert applicable expansionist European empire]”, and so you got locals being told to fight other locals from across the river because they’re being oppressed by the French rather than the British. Yeah. No wonder some parts of the world are messed up.

There is a definite distinction between what the powers-that-be have decided upon, and the way individual people feel. As an extended example, despite the ancient nation of Israel/Judah being ended by the Roman Empire (often symbolised by the destruction of the temple in Jerusalem in about 70 AD), the strength of Jewish identity passed down through the generations meant that even after nearly 1900 years there was enough impetus to re-establish Israel. Unfortunately, this was plenty of time for Palestine to become established and viewed as an ancestral homeland, hence neither side wanting to cede territory.

People can identify very strongly with “their” country, even though—much like their racial identity—it’s usually obtained through an accident of birth. Unlike race, however, people cannot wake up one morning to find that they are now part of a different country (again, outside of their initiative). But the impression I get is that—at the scale of a country—people are generally more attached to a physical place than a political state (it’s just that the two are often synonymous). We have a sense of “home”; a place that provides security. Some are willing to venture further from home than others, but most would feel a sense of loss if they could never go back.

This attachment can vary depending on the level of patriotism in a country. The United States, being formed of a vast array of different peoples, a lot of them immigrants, has focused on forming a strong bond with the flag and the nation. Children recite the Pledge of Allegiance in school. There is a strong implication of “if you do this, you are one of us, and we stand together”. If someone has participated in this, grown up in this environment, maybe never known any other, it is downright cruel to have the authorities essentially say “you are not one of us”. To send people away from the only home they’ve ever known to a place where they may not even speak the language. The world’s seen too much of that crap already.


1 Whether he is correct in his assertions, I don’t know, but the scenario he presented prompted Thoughts™.

2 Maybe not to the level of Australia’s, though? Just saying. Not an expert.

3 When this issue first appeared in the news, a friend of mine joked that we could therefore remove any Australian MP we disliked by granting them NZ citizenship. You might well argue that we would then have to let said former-Aussie-MP into NZ, but our immigration laws do bar criminals, and obtaining a governmental position when unqualified certainly sounds like a crime. Practically treason.
(Please note this entire footnote should not be taken seriously. I’m sure this won’t be too arduous for any of you.)

4 That is, the latest at the time I started writing this. By the time you read it, the Human Vuvuzela will no doubt have moved on to fresh outrages.

5 It can be helpful to step back, to generalise, to look at the common threads in several related situations. The trouble is, working at this more abstract, theoretical level, it’s not obvious how to apply your conclusions to the real world. Assuming you even manage to come to any conclusions.

Doctor Why

So, this morning I—like many others—woke to the announcement of who’s playing the next Doctor. Somewhat predictably, there has been online ranting from the extremes of the pro/anti spectrum1.

Am I the only one thinking “That’s interesting. I’ll have to see how they turn out.”?

But then, I tend to have that reaction to any new Doctor announcement2. Generally I haven’t heard of/encountered the new actor before, so I don’t have any particular preconceptions before I see them in action. It’s hard to judge how good a Doctor someone will be beforehand3, as it’s not just a matter of who the actor is, but what their “style” is going to be (both costume and manner), and where the writing team take them.

Doctor Who is at rather an advantage in this regard, having not just a built-in mechanism for cast changes, but an expectation of them. Companions come and go as their own stories are completed, and the main character regenerates to be the-same-but-different, allowing a new style, attitude, and perspective. Regardless of whether you like a particular Doctor and/or companion, the show will eventually move on, and even if you stop watching there’s always the incentive to come back to see what you think of the new one.

My theory is that with every new Doctor, the writers etc. (whether they’ve worked with the previous Doctor(s) or not) will have a sense of what aspect of the character they want to explore, and will cast the part with that in mind. The general theme so far with the new series has been the Doctor coming to terms with past trauma (the Time War and all that), so…

  • The 9th Doctor started out a bit hardened and cynical, and gradually learned to open up to people again, to have fun, to “dance”. Christopher Eccleston was a good fit for being able to be a bit scary but also a bit goofy.
  • The 10th was somewhat mellower, but now that he’d opened up, had to deal with the previously-bottled emotions. Hence, manic energy, but also the angst. And David Tennant does good sad-puppy eyes.
  • 11 had moved on, wanting a fresh start. New, happy memories to replace the bad, old ones (10 and 11 are described at one point as “the man who regrets, and the man who forgets”). Hence the young Matt Smith, who nevertheless can convey an alienness and sense of age.
  • 12 (bearing in mind I’ve only seen his first season) had finally resolved a lot of his issues around the Time War, but had been uncertain about who he was without that focal point, making it difficult for him to relate to other people. Peter Capaldi made this unintentional abrasiveness work, and—in an odd reversal of the previous version—was an old-looking face who behaved like a teenager; still finding themselves, desperate for approval, but prone to being prickly.

So where does that leave 13, to be played by Jodie Whittaker? For one thing, she wouldn’t have been cast if they didn’t think she was capable of playing the part. My hope is that the Doctor being a woman has a purpose behind it, in terms of exploring a different side of the character.

But, as with all her predecessors, we’ll have to wait, and watch, to find out. As usual, I’m quietly hopeful.


1 “Yes!!! Another kick in the plums for the patriarchy!!”, “Ow, my entitlement!”, “About time! James (Jane) Bond next!”, “The show is ruined forever!”, etc. etc. All the stuff you encounter if you ever dare to read the comment section.

2 Though I do admit a little extra emphasis on the ‘that’s interesting’ this time.

3 Full disclosure: Of the “modern” Doctors, I’d heard of Christopher Eccleston and thought him a weird choice, but he was fine. I knew nothing about David Tennant, enjoyed his early stuff, but got a bit over him by the end. Matt Smith was the first Doctor younger than me, which was a strange feeling, but I thought he did well4. Peter Capaldi I didn’t know what to expect, and personally haven’t enjoyed his Doctor, but I wouldn’t say he’s done a bad job of it.

4 Plus, bow ties are cool. I’m undecided about fezzes though.

Know Your Medium, Part 2

Previously, I introduced the topic of linguistic relativity—how the choice of “language” affects what concepts are easy to think about.

Another wrinkle of linguistic relativity is that a language affects what you are obliged to think about. For example, when talking about an event in English, we need to consider when it happened (past/present/future tense). Other languages include what’s called evidentiality1: you need to consider how you know about the event; did you see it yourself, or did someone else tell you about it (first/second/etc. hand).

These considerations (what am I forced to convey? what is going to be difficult to convey?) are important when you are trying to tell a story, as the answers are different for a novel than they are for a screenplay.

For example, a common “bad writing” complaint is a book starting with the main character examining themselves in the mirror2 (thus providing a description). The reason this keeps cropping up is that—with only text—it’s not easy to show what a character looks like. Typically, one or two salient features will be mentioned about a character, and the rest will be left to the reader’s imagination3.

In contrast, a scriptwriter would have to tie themselves in some very uncomfortable knots in order to not present a character’s appearance to the audience. It happens the instant the actor emerges. What is difficult is revealing a character’s name. If they are being introduced, that’s easy enough, but there are bound to be characters the protagonist already knows (but the audience doesn’t). Naming is comparatively trivial in a novel.


There’s a deeper significance to this movie-appearance/book-name difference, though, which becomes apparent when it comes to a certain type of twist ending: a character (especially the protagonist) is not who they have seemed to be. Sometimes, this is accomplished by having the character masked, in the shadows4, or otherwise hidden until a dramatic reveal at the end. This can be very effective if done well, like in the original Metroid game where the main character, bounty hunter Samus Aran, is unexpectedly revealed to be female, smashing players’ preconceptions.

This sort of twist crops up more often in sci-fi/fantasy settings, where hand-waves like “life-like androids”, “clones”, “plastic surgery”, or good old-fashioned “magic” allow for a character to not be who they appear to be. But its success is not dependent on the justification (if the rest is done well, the audience are more forgiving). There are a couple of ways in which writers can trip themselves up with this trope, which are best illustrated with some detailed examples.

(Cue spoiler warnings for the films “The Tourist” and “Self/Less”.)

In The Tourist, Johnny Depp’s main character seems to be an everyman caught up in the hunt for a vanished criminal mastermind. Interpol want to catch the baddie. The other crooks want his loot stash. The femme fatale has made everyone think Johnny is the crook (post plastic surgery). After many hijinks, the cops shoot/arrest the other crooks, and Johnny is free to go. But wait! He knows how to get at the crook’s secret safe (he is the criminal mastermind after all), so he gets the money and the leading lady, and lives happily ever after.

Based on the presentation (i.e. the cinematic language), this is a happy ending. Emotionally, we go along with it, because the face we’ve been following/rooting for throughout has won. But when you pause a moment, you feel discomfited: the character you’re attached to is a cunning criminal, who changed his entire appearance to escape the police. This made for an awkward ending.

In Self/Less, old, rich, and ailing Ben Kingsley undergoes a black-market medical procedure to transfer his consciousness to a younger, healthier body (Ryan Reynolds). We follow Ryan as his initial carefree hedonism turns to concern over the weird dreams/flashbacks he starts having (especially when he forgets to take his meds). Eventually, he discovers his new body is not “vat grown”, but originally belonged to someone else, who sold it to pay his family’s debts. Ultimately, Ryan brings down the immoral scientist doing the mind transfers, stops taking the medication (so “Ben Kingsley” fades away), and reunites with his family in traditional Hollywood-happy-ending fashion. But wait! Though we’re attached to his face, we know basically nothing about this Ryan Reynolds. Again, there’s something slightly awkward about the ending.

Both movies kind of got away with it (though neither were especially critically successful), but it wouldn’t have worked at all in a novel. There we’re attached to a name, not a face, and it would be more obvious that we’re actually dealing with a different person, but in a movie we’re not obliged to think about that.

The point is to know the medium you’re working in. What is easy? What is hard? What do you need to think about? And most importantly, what do you not need to think about but might trip you up later?


1 Several languages are mentioned in the Wikipedia page; the impression I got (which may be inaccurate, I’m no expert on languages) was that a lot of them were from eastern Europe, the Middle East, or the Americas.

2 As with all writing “rules”, there are exceptions: Divergent gets away with it, as the scene also reveals details about the world, i.e. that these people restrict the use of mirrors.

3 An interesting example of this cropped up with the casting of Harry Potter and the Cursed Child. Noma Dumezweni was cast as Hermione, sparking much debate. JK Rowling pointed out that the character’s race was never specified. (Did the author envision Hermione like that to begin with? Your guess is as good as mine.)

4 Easy in a novel, requires lots of tricky lighting to make it work in a movie.

Know Your Medium, Part 1

The concept behind Linguistic Relativity1 has been around for quite a long time (predictably, Greek philosophers had musings on the topic). Summarised, it is the idea that the language we speak shapes the way we think.

Now that sounds fairly reasonable. But it has caused controversy when it has been presented as linguistic determinism; that your language restricts what you are able to think. In this form, it is argued that if a language has no word for something, then people who speak that language cannot conceive of that thing. English itself is a fantastic counter to this—for example, we had no word for Schadenfreude, so we nabbed3 it from German.

The evidence does support, however, that particular concepts become easier/harder to consider/discuss in different languages. And again, this is fairly intuitive—it’s harder to express yourself to others if you lack the vocabulary4. Where I find it particularly interesting, though, is the ways the concept applies to other forms of communication. For example, the same tune could be written down differently for different instruments (standard notation versus guitar chord diagrams, say).

One of my jobs has been (essentially) training problem-solving, and an important tool in solving any problem is notation. If you’re faced with a problem like:

My grandson is about as many days as my son in weeks, and my grandson is as many months as I am in years. My grandson, my son and I together are 120 years. Can you tell me my age in years?

You may find it much easier to work with (and ultimately solve) once you translate it (where g, s, and i are the ages of the grandson, the son, and “I” respectively)5:

g x 365 = s x 52
g x 12 = i
g + s + i = 120
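
From there it’s just substitution: s = g x 365/52 and i = g x 12, so g x (1 + 365/52 + 12) = 120. If you’d rather not push the numbers around by hand, a few lines of Python (a rough sketch on my part, using the same loose conversions of 365 days and 52 weeks to a year) will do the arithmetic:

g = 120 / (1 + 365/52 + 12)   # substitute s and i into g + s + i = 120
s = g * 365 / 52              # son's age, from g x 365 = s x 52
i = g * 12                    # "I"'s age, from g x 12 = i
print(round(g), round(s), round(i))   # prints: 6 42 72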

Where am I going with this? The point is that any form of communication involves a vocabulary (in the more general sense), which will be more accommodating to some ideas than others. I plan to delve into some more specific examples (comparing books and movies, as I am wont to do), but this has gotten long enough (and I’m getting muddled with my footnote numbering), so that will have to wait for next time. Ciao6.


1 You may have heard of it as the “Sapir-Whorf Hypothesis”, an honorific title at best as the two (Edward Sapir and Benjamin Lee Whorf2) never directly collaborated.

2 Whorf also helped popularise the famous not-really-a-fact about Inuit having many different words for snow.

3 I’ve always liked James Nicoll’s quote: “We don’t just borrow words; on occasion, English has pursued other languages down alleyways to beat them unconscious and rifle their pockets for new vocabulary.”

4 Case in point, 2-year-olds. Eternally frustrated that Mum and Dad just don’t seem to get them. Some would argue this phase lasts about 18 years. Others would say it never ends.

5 If you’re interested, their ages (in years, rounding to the nearest year) are 6 (grandson), 42 (son), and 72 (“I”).

6 In English it means “goodbye”, but it was purloined from (Venetian) Italian where it could be used as either greeting or farewell. A more literal translation might be “at your service”. Just thought you might like to know that.

Twin Medics

I’m generally a fan of thought experiments (see the blog title, for example), whether about the nature of reality, ethics1, language and meaning, technology, or anything else. They may be called by various names: experiments, paradoxes2, dilemmas, problems…

The advantage of a thought experiment is that it allows one (or many) to consider the nuances and implications of a situation before getting into it. This is especially handy if the situation is one that requires quick decisions, or has a high cost. Plus, it’s just interesting to consider what might be, and what are the implications and ramifications of a decision.

I find some of these scenarios a little frustrating, however. There may be a good point behind them, but the way they are presented means the immediate solution is a matter of grammar or semantics. Take, for example, the “Omnipotence Paradox”, usually expressed as (something like) “if God can do anything, can he create a stone too heavy for him to lift?”. Whether the answer is yes or no, it establishes something that God cannot do, thus God cannot be omnipotent. It’s really about the logical inconsistencies of our concept of omnipotence, and the limitations of our language in expressing certain concepts. Which is fine; those are worthy topics of discussion, but we shouldn’t claim it tells us anything useful about the nature/existence of God.

Another famous one that doesn’t really hold up is the “Epimenides’ Paradox”, named after a Cretan philosopher who claimed that all Cretans were liars. But he was a Cretan, so he must have been lying. So Cretans are not liars, so he was telling the truth, so … 😕

But that’s a false dichotomy. The statement “All Cretans are liars” is not the same as the (more specific) statement “All Cretans always lie”. In the real world no-one lies all the time (despite recent evidence). Of more relevance is the (somewhat blander and more formal) “Liar Paradox”, encapsulated in “This sentence is false”. This has been the basis of much discussion of the problems of self-referential language.


Speaking of lying, though, I saw an article purporting to list the 5 lies you are allowed to tell in a relationship. The morality of lying has been a hobbyhorse of mine, so I was intrigued. But ultimately disappointed. Their list of acceptable topics to lie about was:

  1. Whether you liked the meal they cooked
  2. Whether the hotel room is okay
  3. Whether it’s fine for their family to visit
  4. Whether those clothes look good on them
  5. Whether they’re right (in an argument)

In general, this seems to be mistaking lying for diplomacy. In all these situations, lying about your feelings to spare theirs is a bad idea. Again, it’s presenting a false dichotomy: you have more options than lying through your teeth, or giving it to them with both barrels. Telling the truth can (and should) be done gently, and with respect for the person you’re talking to. It’s a lack of that respect that makes the truth blunt and rude.

A specific note on outfits: they advise praising an outfit that works and saying nothing about an outfit that doesn’t (i.e. lying by omission). Again, the truth would be better, but this is a scenario where you have to show you deserve the right to tell the truth. The stereotypical girlfriend’s “test” (“Does this make my bum look big?”) isn’t about the clothes. It’s not a yes/no question. You pass by showing that you want her to look good, and can say something’s wrong without hurting her feelings.

Ultimately, don’t you want those close to you to respect and value your feelings and opinions? How can they do that if you’re not being honest?


1 A topical example is the Trolley Problem—first popularised in the late 1960s—which directly relates to the decision-making of automated vehicles in potential crash situations (do you drive into the other car, or the pedestrian?).

2 Yes, the heading is a dreadful pun. No, I’m not sorry. 😛

Lost in the Crowd

I have a definite dose of the cynicisms at the moment.

The whole “post-truth”, “alternative facts” nonsense could very well have something to do with it. In some ways, I feel like this is a predictable result of swinging too far in the post-modernist “truth is relative”, “my interpretation is [just as/more] important than your intentions” school of thought.

The trouble is that reality is nuanced. I’d like to be able to make a blanket statement that this perspective is wrong, but that falls into the same trap of over-simplifying reality to save having to engage in critical thought.

When it comes to the “meaning” behind a piece of art—be it a novel, film, painting, sculpture, or whatever else—or a statement that someone has made (say, a political speech for example), it is important to recognise that different people will have different reactions to it. Everyone brings their own opinions, history, understanding, and perspectives to bear when they take in something. That’s why things like innuendoes or inside jokes work; some people will interpret them differently than others. And (assuming people of generally sound mind), multiple interpretations are valid.

As an example, a few years ago there was some debate about censoring the “n-word” in The Adventures of Tom Sawyer—what was considered acceptable parlance has changed, and a modern reader likely brings additional baggage. I don’t have a strong inclination one way or the other (as a relative outsider), but I do think it’s an important discussion to have. Values dissonance can have quite an impact on how one views a story/character. A Shakespearean character who spouts “zounds!” and the like comes across to modern readers as quaint, when for the time that may have been considered offensive language.

Not everything is this open to interpretation, however. You may be able to say anything with statistics, but you cannot change the underlying data. Some things are true, some things are false, and some things are ineffable.

I don’t know any politicians personally, so it would be extremely arrogant of me to make statements about their beliefs or attitudes. Recent experience has shown me that in any large group of people, there can be vast differences in attitude towards an issue, even among those who take the same “side”.

That said, I find the recent actions of certain newly-minted state leaders to be very worrying. They may be done with good intentions towards improving the lives of their citizens, but they seem to be giving entirely the wrong impression in terms of being confrontational, alienating, and divisive; emboldening to bigots both domestic and foreign.

Scott Adams (of Dilbert fame) has been posting interesting explanations of the negotiation and persuasion tactics behind certain decisions, and recently pointed out that—by pushing lots of controversial things through in a short space of time—one can undermine strong protests about any of them (as the opponents have too many things to complain about). As with John Key and his flag referendum, however, I’m left wondering what else has gone on that’s been overlooked in the rush.

We are in for interesting times. Kia kaha.

Real, True, or Plausible?

People sometimes make the distinction about whether aspects of fiction are “realistic” or not. Generally, I feel fiction doesn’t have to be (it is fiction after all), and that it’s more important that it be “true”.

What I mean is that some aspect of the scene has to be presented truthfully. The reader/audience’s reaction should be “that’s how that character would behave”, “people are like that”, or “that’s what would happen”. It’s about resonance, often on an emotional level. If you (the writer) have achieved that, then the audience will be following you, even if events are not realistic.

It does help, though, if events are also plausible. They don’t need to match how things behave in the “real world”, but they should fit with the way things work in the fictional world that is presented. If a fantasy novel establishes how magic works, then it’s cheating to have it suddenly do something different, and smacks of a writer who has painted themselves into a corner.

Alternatively, you can focus too much on making a scene work on an emotional level, so the audience/reader goes along with it, but later on thinks that something seemed not quite right1.

For an (extended) example, I was recently watching a scene described as a “spies goodbye”. A couple of agents had been captured, their covers were blown2, and their only option was to “retire” from the game and never make contact with their former allies/co-workers. The two are morosely drowning their sorrows in a dimly-lit bar. The waitress brings over a drink, saying it’s a gift from another table. They glance around, and spot one of their (former) colleagues at another table in the corner. This happens several more times, as they realise their whole team are lurking in various parts of the room3. One by one, they make (tearful) eye contact, raise their glasses, then quietly leave.

Emotionally, it hits the mark. It’s fiction, so it doesn’t matter that in the real world this would be a blatant violation of the “cannot make any contact” restriction, but on reflection, it still feels a bit implausible. My main issue is that it breaks one of the cardinal rules of subterfuge: have a reason for being there4.

Is this fixable? A similar effect could be achieved by having the team members nabbing a nearby table and loudly sharing a toast to absent friends—they’re all wearing black (or at least dark colours), anyway, so they’ll give other patrons the impression that they’ve come from a funeral. Far less likely to draw unwanted attention than several people buying drinks for and saluting an otherwise inconspicuous couple. You can still use largely the same camera angles, but without the sense that people are staring at those they supposedly don’t know.

The advantage of being a writer is that any painted corner is escapable. And, if you do it right, you can reinforce other aspects of character/world, without the audience ever realising you were in a pickle to start with.


1 TV Tropes refers to this sort of thing as “fridge” moments. As well as the “wait, how does that work?” they also note things that seem brilliant, or horrifying, when thought about later.

2 You know the drill—a mission goes wrong, agents have to improvise, “if you are captured, the Agency will disavow any knowledge of your actions (or even existence)”, etc.

3 And clearly want to get them really drunk.

4 It’s been frequently shown that people (only) remember things they focus on. Details that “fit” an expected pattern, that don’t stand out, will likely be forgotten.

Crime and Punishment

I was shown an interesting blog article talking about the Game of Thrones TV series, and the conflicting drives (in the audience) of empathy and vengeance. You can read it here, but in summary it was addressing the way characters do horrible things, so we want them to be punished, but then get penalised brutally, so we feel bad for them.

At this point, I should clarify that I’m not a fan of GoT—it’s too … intense … for my tastes1. Given the popularity of revenge-based stories throughout history (as in, they usually inspire catharsis, not ambivalence), I suspect the makers of the show are trying to portray the acts of retribution in such a way as to emphasise their brutality and engender empathy in the audience. It would certainly fit with the theme of “everyone is equally nasty (and those that aren’t tend to get killed off quickly)”.

It does raise an interesting thought, though. When we see another human suffering, we feel sympathy. If we see someone wronged, we feel anger: we want justice. But what do we mean by “justice”? Sure, revenge is viscerally satisfying, but only if we dissociate from the other party (usually either through seeing them as somehow inhuman—monstrously evil and unredeemable—or by otherwise distancing them—they are from a rival clan/group).

Many stories of vengeance also convey the idea of “‘an eye for an eye’ leaves the world blind”. Our desire for punishment can be defused by seeing the humanity of the perpetrator. Some political parties like to focus on “tougher sentences for crime” as though it would help, but evidence suggests it does not: likelihood of punishment (“Will I get caught?”) matters more than severity of punishment in deterring lawbreaking.

This is all focusing on the penalties of wrongdoing, however (whether via an individual avenger, or state sanctions). And while the presence of these can mitigate our sense of injustice, I do wonder if they are ambulance-at-the-bottom-of-the-cliff measures.

Perhaps the way to make the world a more just place would be to try and ensure there were no benefits to breaking the rules.

But we could go further. It’s also known that people are more likely to take risks to avoid a loss than to gain a bonus. So maybe the real problem (and the real injustice) is that following the rules doesn’t mean you’ll be successful.


1 I do know enough bits and pieces of history to recognise the reality of the political machinations; it’s been said2 that democracy doesn’t guarantee you the best ruler, but allows you to change them without bloodshed. It’s worth remembering. In the Ottoman Empire, for example, a new sultan would have his extended family killed off to prevent the possibility of civil war over heirship.

2 I seem to recall a specific quote along these lines, but I cannot remember the wording, or who said it. If anyone does know, please enlighten me!

Expectations Colour Reality

I tend to be a bit cynical about the self-help industry; it often seems geared around getting its clients to open their wallets and say “Help yourself”. Yet I cannot deny the positive impacts of motivational media. When you feel like your day has been nothing but wading through chest-high blancmange1, a cheery reminder that “You only fail when you stop trying!” can be just the tonic to help you reach dinner-time with your sanity, if not intact, at least not missing any pieces.


There’s a lot of it about.

And yet, at other times, the same statement can seem like the most tedious inanity that ever cloyed its way out of the primordial syrup. So what gives?

There’s a learning metaphor I like that suggests concepts are like Lego blocks, and we better assimilate new ones if there are sufficient others to connect them to2; a block on its lonesome is easily misplaced, but a firmly connected one is likely to stay where you put it. If we don’t have the appropriate framework, we won’t be able to connect with a new concept, so it will seem either impenetrable or silly3.

A similar metaphor can be applied to moods. If we’re in a particular mood (e.g. grouchy), our available connectors may be incompatible with the thing we’ve just encountered (e.g. a cutesy “it gets better!” quote), and so it will be easily brushed aside.

This pattern shows up all over the place. In our biases (any new information about someone or something has to connect to—and thus reinforce—our existing framework). In priming/anchoring (once we start thinking in a particular direction, it can be hard to change). Placebos work because we’re told they will heal us. Over-hyped experiences inevitably disappoint.

Changing our perspective will change the way we react to something, separate from the actual value of what we’re reacting to. Imagine you go to a restaurant and see a particular dish on the menu—the one you fondly remember your mother making when you were a child.

You eagerly order, only to find that they do it … differently. Not badly, just not like mother used to make. You leave the restaurant feeling unsatisfied with your meal (and maybe with the evening out in general). Whereas if you’d acknowledged beforehand that the dish was likely to be different, you would probably have been quite happy with it.

And this, I think, is what’s really behind the common motivational concept (which I’ve seen many variations of, attributed to all kinds of people): “If you can’t change your circumstances, change your reaction”. I found this idea irritating for a long time, because we can’t control (all of) our reactions; if we get a shock, for example, our body dumps adrenaline into our system before we’re even consciously aware of it. But we can control our expectations going into a situation, and that will impact how we react.

If we don’t expect a movie based on a favourite childhood book to be that great, we’ll still be disappointed when it’s turned into largely empty spectacle with an overdose of Legolas4, but we won’t be shocked and tempted to write angry letters to the director. Our expectations colour our reality. Which hopefully is more meaningful with the rest of the post to undergird it.


1 Please note, I’ve never actually tried this, it just seems like it would be difficult (it may actually be tremendous fun). And “blancmange” is a funny word. 😉

2 I might not connect my block in the same place as you—my pre-existing structures may be quite different. We may both be able to lock in the new idea, but because we connect it differently, we’ll have different associations with that idea. Hence one of the values of brainstorming, in that the same concept can send different people off in different directions.

3 When you’re trying to convey a concept to someone else (especially if it’s new to them), it’s easy to be so focused on the concept itself that you take for granted the framework around it. If you’re thoroughly familiar with a concept, a short statement can be deep and meaningful. If you’re not, the same statement can seem vague and airy-fairy.

4 I’m not angry, just disappointed given what might have been. And it makes for an amusing example.

Procrastination

I saw an amusing TED talk the other day explaining what goes on in the mind of a procrastinator. The only complaint I have with it is that it oversimplifies a little in assuming all procrastination is the “messing about unproductively, leaving important tasks to the last minute, followed by mad deadline panic” type.

I’m generally pretty good at not doing that, but I frequently suffer from the “finding other productive things to do to avoid dealing with particularly daunting/unpleasant task” type. And how does one overcome procrastination? Just read this handy-dandy self-help guide:

  1. Don’t waste energy trying to be someone else—be yourself!
  2. Only, be a more organised and productive yourself. Because winners get up at 5am to make to-do lists using quinoa and mason jars.

What brought this topic to mind? I’m procrastinating, natch1. I’ve been wanting to get some feedback on a project I’ve been tinkering with (especially as it could use a jump-start), but I’ve been reluctant to show it to anyone. It required a little introspection to realise that I was putting this off.

It’s kind of weird that despite being well aware that it’s at a first draft/prototype stage, knowing about several deficiencies, and wanting suggestions on what direction to proceed, the thought of revealing it has me curled in a corner, clutching it and wailing that “it’s not ready!”2, and making vague mutterings including frequent use of the word “precious”.

So, yeah. I’ll get over it. It just amused me once I realised what I was doing, and so I thought I’d share.


1 No, I have no idea how long it’s been since “natch” (short for “naturally”) was in the common vernacular, either. 😉

2 Or should that be “I’m not ready”?