Getting a head-start

I was thinking about saying something about privilege (e.g. racial), but the engendered discussion tends to be somewhat… acrimonious. So instead, I’m going to talk about iPhones.

Western capitalism is purported to have two fundamental principles: egalitarianism (equal opportunities) and Darwinism (survival of the “fittest”). What’s not so clearly stated is the way these principles conflict when you add momentum to the mix.

(All data sourced from Wikipedia – it doesn’t matter if it’s not precise; the general trends are what I’m interested in.)

Taking smartphone operating systems as an example, one can see both the success and the failure of the “free market” approach. Observe this graph of smartphone market share over the last few years.

[Graph: smartphone market share (%) by operating system over the last few years]

Initially (circa 2007), Symbian, Windows Mobile, and Blackberry ruled the roost; then the iPhone was released, closely followed by Android phones. They proved “fitter” than the incumbents, and so took over. You can argue about whether iOS or Android is better, but—as happened with Macs and PCs—fewer hardware restrictions have led to a bigger piece of the pie.

Other systems have tried, and generally failed. Even with the might of Microsoft* or Samsung behind them, they haven’t taken off anywhere near as well as the big two. You may argue that this shows they were “less fit”. It’s probably a fair assessment that Bada 1.0 was not as good as Android 2.2 Froyo (the most recent version when the first Bada phone was released), but then, Bada hadn’t been through eight versions/patches. It may well have been better than Android 1.0, but that didn’t matter, because the “ecosystem” had changed by then.

iOS and Android took over because they did some things significantly better (to lapse into business-speak, they were revolutionary rather than evolutionary). I don’t know exactly what those things were, but it shows that the market is not as egalitarian as some would have you think.

It’s the same for a lot of different businesses; anyone may have the same opportunities to enter a particular market, but the same amount of effort will not lead to the same amount of success. Suppose two companies start producing a new widget—something completely new, never seen before. Company A is a start-up, whereas company B is a well-known multinational who have made decent quality gizmos for years. Regardless of the relative quality of company A’s widgets and company B’s widgets, company B can use pre-existing infrastructure and branding to create, market, and distribute, whereas company A is starting from scratch. Company A’s per-widget costs will be greater, and they don’t have the safety net of a strong trade in gizmos to fall back on. Company A cannot compete, and soon folds (or, if they’re very fortunate, gets purchased by company B).
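
To put some entirely made-up numbers on that, here’s a toy sketch in Python of how one-off costs get spread over sales volume; the figures and names are invented purely for illustration, not real economics:

    # Toy illustration of the head-start: made-up numbers, not real economics.
    def per_widget_cost(fixed_costs, variable_cost, units_sold):
        """Average cost per widget once one-off costs are spread over sales."""
        return variable_cost + fixed_costs / units_sold

    # Company A (start-up): builds factories, branding, and distribution
    # from scratch, and initially sells to a small market.
    company_a = per_widget_cost(fixed_costs=1_000_000, variable_cost=5, units_sold=10_000)

    # Company B (multinational): reuses its existing gizmo infrastructure,
    # so the extra fixed costs are lower and its brand reaches far more buyers.
    company_b = per_widget_cost(fixed_costs=200_000, variable_cost=5, units_sold=100_000)

    print(f"Company A: ${company_a:.2f} per widget")  # $105.00
    print(f"Company B: ${company_b:.2f} per widget")  # $7.00

Change the numbers however you like; as long as B’s fixed costs are lower and its reach is bigger, A starts the race a long way behind, regardless of whose widget is better.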

Given such prospects, isn’t it reasonable for a government to invest in start-ups and other small businesses? The existing companies have such a head-start that to not do so would be to abandon small businesses to the Darwinian ferocity of market forces. And the world would miss out on a lot of new perspectives and innovations.

I leave readers to draw the intended parallels. 🙂


* Note that I’ve combined the figures for the Windows Mobile OS (which largely came from PDAs) and the more recent Windows Phone (as the latter superseded the former).


What is a chapter?

Something that’s been puzzling me of late is the above question.

I’ve been reading various blogs and suchlike about developing your writing skills, where there is plenty of good advice about how to construct a scene. The lower-level construction of sentences (grammar) and paragraphs (one idea per paragraph, start a new one for a new speaker, etc.) was pretty well drilled in at school. But nowhere have I come across a definitive concept of where a chapter ends*.

The approach seems to vary from writer to writer. With some, a chapter is essentially a scene, even if that makes their lengths wildly inconsistent. Others go for a (roughly) set length, which puts constraints on the development of the story. Still others eschew chapters altogether.

I’ve heard it said that “you just know” when to end a chapter. That’s fine, and I can accept it as a piece of advice, but I’m surprised that critical literary analysis hasn’t provided a more tangible measure.


* I’m mainly talking about fiction. Like paragraphs, chapters in non-fiction are fairly straightforward; a non-fiction chapter is essentially an essay on a particular sub-topic of the book.

Malicious technology (Part 2)

Okay, so last time I suggested that even expert users have problems, because computers aren’t always predictable.

So, if we accept that computers will throw us for a loop occasionally, the issue becomes how we react to this. Blaming ourselves, falling prostrate before the almighty PC (or Mac, if you’re that way inclined 🙂 ) and begging that we be permitted to complete our terribly-important email is not going to cut it.

What is the alternative? Well, some take completely the opposite tack and assume the computer is entirely at fault, since it’s not doing exactly what they want it to. I suspect that’s more about ego, though; such people probably aren’t reading this, or aren’t going to take any notice of it. No, I suggest you imagine you are interacting not with a machine, but with a person.

This may not be all that difficult. People anthropomorphise objects all the time. Naming their car. Holding conversations with their pets. But that’s not what I’m talking about—you’re already doing that if you’re assuming the computer is smarter than you. Instead, recognise that the computer is just a machine following instructions, so when you’re using Microsoft Word (just as an example), you are navigating a bureaucratic maze defined by the person that created the program*.

Now, it’s quite reasonable to assume that this generic programmer you’re effectively interacting with is smarter than you. They certainly know more about computers. But only because they’re the combination of several brains. So how does this help? Because while you may not know much about programming a computer, you do know a lot about what it is you’re trying to do.

You see, every program is essentially a tool (or rather, a collection of tools, but let’s keep it simple). A tool is designed to facilitate a particular task. If the person performing that task is finding the tool odd or frustrating, it’s probably because the way they think of the task is different to the way the tool designer thought of it.

For example, I have a ratchet screwdriver. It has a switch that enables it to be set to work clockwise (for tightening screws), counter-clockwise (for loosening screws), or both (acting like a standard screwdriver). One could conceivably add a fourth setting allowing the head to freely twist in both directions, but this isn’t of any use in practice. This is a silly example, but it’s the sort of thing that can happen with programs; a feature or setting is added (usually with the best of intentions) because it is easy to do, not because it is relevant or useful to the task.
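
In code terms, that fourth setting is exactly the sort of thing that appears because the structure makes it cheap to add, not because anyone needs it. Here’s a minimal Python sketch (the “driver” and its names are invented for this post, not taken from any real program):

    from enum import Enum

    # Hypothetical ratchet-screwdriver "driver", purely for illustration.
    class Mode(Enum):
        TIGHTEN = "clockwise drives the screw"
        LOOSEN = "counter-clockwise drives the screw"
        LOCKED = "both directions drive the screw"
        FREE_SPIN = "neither direction drives the screw"  # easy to add, useless in practice

    def drives_screw(mode: Mode, direction: str) -> bool:
        """Return True if turning the handle in `direction` actually moves the screw."""
        if mode is Mode.TIGHTEN:
            return direction == "clockwise"
        if mode is Mode.LOOSEN:
            return direction == "counter-clockwise"
        if mode is Mode.LOCKED:
            return True
        return False  # FREE_SPIN: a "feature" that exists only because the enum made it cheap

    print(drives_screw(Mode.FREE_SPIN, "clockwise"))  # False -- the setting achieves nothing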

If you don’t believe me, think of Clippy. And, when something goes wrong with your computer, ask yourself “does the programmer really understand what I’m trying to do?”.


* In general, it’s very unlikely a computer program was made entirely by one person. However, numerous decisions that went into its construction were made by people, and as you don’t know anything about them, you may as well assume a unified** Mr/Mrs Microsoft Word Programmer.

** If the team have been working together and communicating effectively, the program ought to feel consistent. If not, your generic programmer may come across as a bit… cuckoo.

Malicious technology (Part 1)

No, this is not related to the Heartbleed bug. There is already plenty of valuable information out there about what it is and what you should do, without me sticking my oar in.


I’ve noticed a certain unconscious assumption some people have (myself included) when dealing with computers*, that primarily affects how we react to something going wrong. I feel it would remove certain roadblocks if this were addressed and refuted.

It’s the belief that the computer is smarter than we are.

Now, it sounds silly to say it out loud—computers are just machines, of course they’re not smarter than people. And I don’t think it’s a position that people hold intellectually, but rather emotionally. What’s the difference? Just ask someone on a strict diet what they feel about butterscotch pudding (or whatever sugary/salty/fatty banned food it is they’re craving). Their brain says “no, it’s bad for me”, but their heart says “yummy!”.

So what effect does this have with regards to computers? When something goes wrong—a program crashes; your data mysteriously disappears; something changes and you have no idea how to get it back; you find yourself in a maze of twisty little dialog boxes, all alike; etc.—your reaction is “what have I done?!”. Tech support often reinforces this**, both overtly (“What did you do?”) and implicitly (think about the underlying tone of a lot of the available help, especially online).

There are two components to the assumption.

If I was more savvy, this wouldn’t have happened

It’s easy to see why this belief persists. It’s certainly true in a wide variety of other situations. However, even expert computer users still run into problems. Partly this is because almost no-one is an expert in all aspects of computers (electrical physics, electronics, CPU and memory architecture, BIOS, kernel, network, software, etc.). The difference tends to be that more experienced users are better able to recover from a problem (either through knowing where to look, or being able to understand the jargon). Plus, there’s the second component…

The computer doesn’t make mistakes

Again, an understandable belief. Almost every mechanism we encounter will perform the same way every time (provided it’s not faulty). Computers don’t necessarily. Or rather, they do, but chaos theory gets involved: the behaviour depends so sensitively on the machine’s exact state (every running program, every cached file, every flaky network connection) that, in practice, a correctly-behaving computer can be unpredictable. This makes it hard to get over the assumption that it did something unexpected because of something you did (wrong).
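
If you want a concrete (and deliberately artificial) picture of that sensitivity, here’s a short Python sketch using the logistic map, a standard chaos-theory toy rather than anything your computer actually runs: two starting states that differ by about one part in two hundred million end up nowhere near each other after a few dozen perfectly deterministic steps.

    # Sensitive dependence on initial conditions, via the logistic map.
    # A chaos-theory toy example, not a model of real computer internals.
    def logistic(x, steps, r=4.0):
        """Iterate x -> r * x * (1 - x) for `steps` steps and return the result."""
        for _ in range(steps):
            x = r * x * (1 - x)
        return x

    a = logistic(0.200000000, 50)  # one starting state
    b = logistic(0.200000001, 50)  # "the same" state, off by a billionth

    print(a, b)  # two deterministic runs, wildly different results

The rules never change, yet the outcome looks arbitrary, which is roughly how a misbehaving PC feels from the outside.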

There’s more to say on the topic, but this is getting long enough, so I’ll save it for next time.


* And indeed, many other types of technology. I think the key factor is whether or not the user understands it—it’s not generally a problem with a toaster, for example. Unless you’ve got a whizz-bang, multi-function, golden-brown-sensing, iPhone-compatible toaster. And if you’re rich enough to afford one, you probably don’t make your own toast anyway.

** To be fair, there are people who genuinely shouldn’t be put in front of a computer, and who cause no end of stress and frustration to the people who have to work with/clean up after them. The vast majority of computer users, however, are reasonably capable.

What does Christmas feel like?

The other day I was out shopping, and was surprised (bearing in mind this is April—a couple of weeks before Easter, no less) when a Christmas song came on as part of the store’s muzak. The shop assistant near me was evidently more disturbed, as they muttered something to the effect of “oh, that’s not right” and scurried off to rectify the situation.

Funnily enough, I wasn’t particularly bothered either way (mind you, I didn’t have to stay in the shop until my shift ended), whereas in six months’ time it would probably be cringe-inducing. I suspect Christmas (commercial) cheer is a straw-that-broke-the-camel’s-back sort of thing, so one song on its own isn’t a problem.

What did make me think was the strong association between Christmas and snow (I can’t remember what the song was, but snow was involved somewhere). I’ve spent several Christmases in the southern hemisphere, where Christmas is in mid-summer, yet there is still the same association—decorations including artificially-frosted windows and things like that. It’s probably a cultural hangover of some sort, given the amount of influence the USA and UK/Europe have on the rest of the western world.

A common criticism of Easter and/or Christmas traditions is that they’re not really Christian at all; they’re repackaged pagan seasonal festivals. Critics also complain that the dates are inaccurate (mainly for Christmas—the date for Easter is calculated similarly to the date for the Jewish Passover). Both criticisms miss the point somewhat. Firstly, new ideas are more easily accepted when they are presented in familiar terms—early Christians quite sensibly co-opted existing holidays but gave them a different title and meaning. Secondly, celebrating something on the “wrong” day doesn’t invalidate the “right” day, or render the celebration irrelevant. The Commonwealth don’t celebrate the Queen’s birthday on the actual day (or even the same day in different countries).

To me, both Easter and Christmas are times to celebrate significant historical events (if you believe them), and/or to enjoy a break (which goodness knows we could all use at times) and the chance to be with family and friends.

I still find the snow thing kind of silly, though. 🙂

What’s in a name?

Most of the time, especially when dealing with the spoken rather then the written word, someone can use the wrong variant and no-one will notice and/or care. But not when it comes to names (or your own at least).

(Apologies to the people whose hackles were raised by my use of “then” instead of “than” in the previous paragraph. 🙂 )

There are a lot of names that either have different spellings (e.g. Ann/Anne), or are similar to other names (e.g. Davis/Davies), which increases the likelihood of mistakes. This concerns me, because I’m terrible at remembering names at the best of times. I did well at one place I worked, but there I had lists I could refer back to.

People often use variations of my name, which is frustrating. Spelling or grammatical errors I can brush off, but this is not just a word, it’s a representation of who I am. In the same way that clothes and make-up can enable people to present themselves the way they want to be seen, the way someone is referred to is very personal, and it can feel disrespectful to be mis-referenced (you could always respond with “Segmentation Fault!”, but most people wouldn’t understand).

What got me thinking about the topic is seeing a movie trailer for the film Rio 2. The main characters (named Blu and Jewel) and their kids meet Jewel’s long-lost father, and the usual sort of tired tension-with-her-father jokes ensue. In particular, when the kids are introduced, politely expressing “Nice to meet you, sir”, their newly-discovered grandfather says to call him “Pop-Pop”*. Naturally, when Blu later (speaking to his kids) refers to him as “Pop-Pop”, he is given a stern eyeballing and told “You call me Sir.”

Regardless of the illogic of this particular example**, this is doing the opposite of what I described last time: continuing to use stereotypes that really should be retired. In this instance, the grandfather has no reason to be any less welcoming towards Blu except “He took away my little girl!”. Even though he hasn’t seen said “little girl” (now an adult woman… er… parrot) for a long time. It hearkens uncomfortably back to the archaic concept of a marriage as essentially a business transaction between two men. That said, I haven’t seen the whole film, and it could be the basis of an interesting character arc where the grandfather realises that the real focus of his resentment is whoever took Jewel into captivity, and that he’s just glad she’s alright. But it’ll probably be Blu-does-something-vaguely-heroic, thus earning (and I use the word “earning” deliberately) a “Well, I guess you’re alright. But you still call me Sir.”


* It’s good that there are a wide variety of nicknames for grandparents — you should be able to settle on something distinct for each. Otherwise, when you mention “Grandpa”, does that mean Mum’s dad, or Dad’s dad? One of my grandmothers (possibly due to not feeling old enough to be a “Nana” or “Granny”) preferred we call her by her given name. Not knowing anyone else by that name, I spent part of my early childhood thinking it was another word for grandmother. Families are weird. 😉

** Supposing you were visiting your friend Bob, and his daughter Carla answers the door. It makes more sense to say something like “Hi Carla, is your Dad at home?” rather than “…is Bob at home?”. You certainly don’t call Bob “Dad”, but Carla does, so there’s less potential for confusion.

Hello, World!


I should probably start with something relatively straightforward. Like many others, I have seen and enjoyed the movie “Frozen”, particularly its subverting of the usual clichés (especially when it comes to Disney princess movies).

(To avoid spoilery details, I’ll use the example that none of the sisters’ outfits involve The Only Colour That Girls Like™. The nearest is that both wear magenta/purple cloaks, but they’re a detail of the outfits, not the base colour.)

What I particularly appreciate, though, is that it doesn’t make a big deal of these subversions. Neither in the marketing nor in the film itself is there any sense of “Look at us, being all edgy and going against established tropes!”. The story just gets on with it, treating these differences as perfectly normal.

And really, isn’t that the way to make an actual change to society-as-a-whole’s attitudes? As opposed to having a “token” difference (because we can all see how well that’s worked…).