Francis’s news feed

This combines on one page various news websites and diaries which I like to read.

October 07, 2015

Mars Orbiter Mission update: A year at Mars by The Planetary Society

A couple of weeks ago, there was a flurry of rumor that ISRO was ready to announce some results from its Mars Orbiter Mission's methane sensor. The Indian space agency held a press event for the one-year-in-orbit anniversary of Mars Orbiter Mission and released a book containing mission photos, but did not unveil any new scientific results.

October 06, 2015

Can I just do Settlers and Town Planners as a Bimodal? by Simon Wardley

tl;dr : yes ... BUT.

Continuing on from the post on Bimodal, I was asked "Can I just do Settlers and Town Planners?"

Let us understand what that means. You're going to focus on product improvement and the industrialisation of a component but forgo the uncharted exploration of the novel and new. Well, there's no reason why you can't do this but it does mean that you're going to have to rely on the Settlers extending into the outside ecosystem to discover those novel and new components which at a later stage you will want to turn into products and finally industrialise (see figure 1). If you don't do this then you have no future pipeline.

Figure 1 - Focus on Settlers and Town Planners

For reference, I've given a fuller list of characteristics of the different roles in figure 2.

Figure 2 - Characteristics (you'll have to expand)

Ok, the problem with pushing all the novel and new to the outside market (i.e. taking an outside-in approach) is you'll be in a race with others to discover future patterns of success. This is not necessarily a bad thing unless :-

1) Everyone is trying to do this and there isn't some other system to exploit (see "How Much for a Fountain of Youth")

or more likely

2) Someone is playing an ILC ecosystem game against you (or a variant such as Tower & Moat).

I won't go through the ecosystem game (you'll have to read that post) but the net effect is they can use their ecosystem as a future sensing engine and the bigger their ecosystem, the more powerful the effect will be. This means they'll not only have the edge on you but this edge will become larger over time compared to simply trying to scan the market.

So, yes you can just use a Settler & Town Planner structure but you'll need really high levels of situational awareness, ecosystem play and weak signal detection, otherwise you'll be an easy target with nothing in the bag (i.e. novel surprises) and a simple pipeline to choke.

My Favourite Film of 1978. by Feeling Listless

Film Right, let's talk about Star Wars too.

But first:  1978.  Much of this list has worked from the rule of slotting films into the year of release as per the IMDb.  But for various reasons, see next week, that simply doesn't work for Star Wars and so I'm invoking the "Stories We Tell" rule and borrowing the year when the film received its national UK release, which is the first occasion I could have seen the film.  So here we are in 1978.  Sorry Grease.

Except unlike most people of a certain age, I don't rightly remember the first time I saw Star Wars.

My best guesstimate is at Woolton Picturehouse in 1980 during a matinee double bill with The Empire Strikes Back. I remember clearly walking to the back of the auditorium with my Dad to buy a carton of Kiora during the closing credits. I was about six years old, so I can't really tell you how I felt about it; indeed I was probably more excited about the concentrated fruit drink, which was too fruity for the crows. I loved that advert.

For years I thought I'd imagined or dreamt all this but then I discovered a quad poster at a film fair, this quad poster, which was reassuring. There are few things more disappointing than when the memories of life's milestones are revealed to be fraudulent.

Of all the films on this list, it's also the text which I've watched the most.  More than Adventures in Babysitting, more than Ferris, more than Star Trek VI, even before George Lucas began to fiddle about with its rusty innards.  More even than my actual favourite film, When Harry Met Sally.

Every rerelease at the cinema, every new format and I've been there to tut at the changes whilst marvelling at the restorations.  About the only version I haven't owned is the nutty non-anamorphic laserdisc transfer which appeared along with the rerelease of the "special" editions on dvd because I'd already bought the first boxed set version.

I learn as I write that this is a film which was even changed significantly during the original theatrical run.  Not surprised.

What draws me back to Star Wars?  There's the nostalgia bullshit, the stuff we all know about.  There's having something to talk about with friends because no matter which other franchise people are fans of, Star Wars exists as a kind of neutral territory we can all meet on.  Star Wars is Switzerland.  Plus habit.  Habit is important too.

Like plenty of films of that era and in general, Star Wars is just sort of always there.  Ubiquitous.

Even in childhood, even before I owned a video recorder of my own, I began the format journey with some second hand copies of the original panned and scanned CBS/FOX releases.

The occasions at primary school on strike days and near summer holidays when half the school would crowd around the institutional television to watch an off-air, or as has always been the case, off-ITV, recording of the film complete with adverts which would sometimes be fast forwarded through depending on whether the teacher was paying attention.

When visiting Germany where a relative was stationed in the RAF and hiring a video from the local newsagents. Other important hires during that holiday were many of the original 60s animated MARVEL series, notably The Mighty Thor: "Across the rainbow bridge of Asgard to where the booming heavens roar!"

At various friends' houses across the years when it was impossible not to watch all three together, usually friends who like me could quote the whole thing but would still shush each other just in case we missed some new nuance of what was being said.

I saw it twice during the epic 1997 release of the special editions.  Once on the giant screen one at the Odeon London Road with friends and then again on the tiny screen five upstairs at the front of an empty auditorium before returning to screen one for Empire which was released again just a few weeks later.

Imagine that now - a twenty year old film, albeit in a new version, in general release and projected on the most important screen in most cinemas and taking massive box office.

But was I ever a fan?  Not sure and when it comes to giant multimedia franchise properties, what is a fan anyway?

Essentially if you're of that certain age in the UK, science fiction fandom is dominated by this trinity: Star Wars, Star Trek and Doctor Who.  I'm aware that in the US, until recently, it's been more of a duopoly between the first two, but in general it seems to be you were either a fan of Wars or Trek or Who.

As we've discussed I've done Trek. I'm currently Who (just about).  Before both of those it was Transformers though arguably that only really stretched to the comics and comics in general were a huge influence.  There's Shakespeare too but let's not confuse everything for now.

I worked my way through a fair amount of Star Trek's spin-off novel back catalogue and Doctor Who's story runs across pretty much every artistic medium with a narrative.

Yet despite Star Wars having its own expanded universe, apart from some book and record sets (specifically Planet of the Hoojibs) (you can read along with me in your book) (you will know it is time to turn the page when you hear R2-D2 beep like this...), bits of the radio adaptation and having a run of the MARVEL reprint comics bought for me from Speke Market for a few years, I wasn't attracted to anything.

Which isn't to say I didn't try.  I even bought a copy of Timothy Zahn's Heir to the Empire (Star Wars: The Thrawn Trilogy, Vol. 1).  Still haven't read it (which is probably just as well now).

The only exception is some of the computer games, notably the original PC version of X-Wing which I played with friends into the night at university, someone on the joystick piloting a craft, the other at the keyboard playing R2 with laser and shield strength, which was about the only way to be successful with some levels.

Having considered this for a good ten minutes before sitting down at my laptop to type this into the Blogger box the only reason I can come up with is that I didn't see the point and that I didn't see the point because unlike Star Trek and Doctor Who, Star Wars tells its story pretty well across the three films.  Once you reach the celebration at the close of whatever version of Jedi you happen to be watching, the story's done.

Now, George didn't think so, and hopefully JJ's found a way out of that narrative quagmire, but ultimately, I must have subconsciously decided that however much I like these characters, unlike those other television properties which were chemically designed from the off to generate stories, by hitching himself to Joseph Campbell and Vladimir Propp in telling us the story of the farm boy who makes good, George essentially limited his options.

Leland Chee, keeper of the holocron (such as it is now) would obviously disagree with me.

And as the superb The Clone Wars demonstrated, it is possible to engineer something within the same universe to fit a television narrative structure and I loved every minute of that.  Even began to like Jar Jar a little bit especially in those episodes in which they clearly dared themselves to produce entertaining television that just featured that damned Gungan and the droids.

The anthology films are also intriguing but they're generally going to be telling known stories.  Rogue One is the Star Wars equivalent of The Dambusters, telling a familiar story albeit in a fictional universe.

But the further adventures of Luke, Han and Leia and the team?  Not so much, especially since the films end on such a positive note.

Does that stop me from being able to call myself a Star Wars fan?  Or is this like so many other things now, a matter of self-identification, not to mention the cultural monogamy I've always assumed?  As a Doctor Who fan I'm not really allowed to call myself a fan of any of these other franchises, or something like that?

This is probably getting a bit silly.

Perhaps it's enough for me simply to say that Star Wars is my favourite film of 1977.  Or 1978.  From a certain point of view...

Eels in the dock by Goatchurch

Not much bloggable recently. Some well-formed untested ideas on matters of servo motor control, and lots of bad psychology. Two discoveries of note that’ll take up a lot of time are jscut for CAM and chilipeppr for CNC downloads, both of which run in the browser. I have an interest in this approach, but now that I’ve found some people who are doing it effectively, it will save me a lot of time.

Meantime, here’s a couple of bad pictures from a cheeky kayak dive in the docks during Becka’s Thursday night canoe polo session (aka murderball).

Getting ready to go down:

One of the scary eels that made me shriek when it appeared in the dark.

Eels look and move a lot like snakes. We’ve got an instinctive fear when we encounter them in their own environment. It was interesting how they could swim backwards just as easily as forwards.

20 questions to ask children about ghosts by Dan Catt


Sometimes I find myself in a situation where I need to talk to a child. For some people talking to kids comes naturally; I am not one of those people. My usual gambit is to ask them what they're interested in and take it from there. But sometimes that just doesn't work: they're not interested in anything, or they're not interested in sharing it, or they're too intimidated by my beard to say anything.

When that happens I fall back on my ghost questions. The handy thing about these questions is that even if they don't believe in ghosts, or answer in the negative to a bunch of these, you can just keep going.

Best of all, whenever I've asked these questions the answers have always been charming.

  1. Do you believe in ghosts?
  2. If there's thunder and loads of rain, does a ghost get wet?
  3. Can it get hit by lightning?
  4. If the wind is strong, does it blow away?
  5. Can a ghost breathe underwater?
  6. Are there human ghosts underwater?
  7. Do cows have ghosts?
  8. Do birds have ghosts?
  9. Do bird ghosts get wet in rain or blow away in strong wind?
  10. Do cows have ghosts?
  11. Could you have a cow ghost underwater?
  12. Do fish have ghosts?
  13. Why don't we see fish ghosts on dry land?
  14. Can ghosts go on holiday?
  15. If a fish died in a fish bowl, and then the fish bowl is moved, does the fish ghost haunt the area where the fish bowl was even if there isn't any water?
  16. Can you spray paint a ghost?
  17. If you spray a ghost with stinky perfume will it smell?
  18. What happens if two ghosts try to run through each other?
  19. If you blow dry a ghost does it go fluffy?
  20. Do you believe in ghosts?

Dry Out Super-Earths via Giant Impacts by Astrobites

Title: Giant Impact: An Efficient Mechanism for Devolatilization of Super-Earths
Authors: Shang-Fei Liu (刘尚飞), Yasunori Hori (堀安範), D. N. C. Lin, Erik Asphaug
First author’s institution: Earth and Planetary Sciences, UC Santa Cruz, USA.
Status: Accepted for publication in Astrophysical Journal.

The early phase of planetary systems – a very brutal era

Forming planets is a violent business. After the first bigger bodies of several hundred kilometers have accumulated in the protoplanetary disk (see here for a short summary of the main growth steps), they further grow via giant collisions. Today's paper uses these giant impacts between two bodies of comparable size to explain the enigmatic observations of planets which do not seem to fit into the widely accepted theory of planetary assembly, namely the systems Kepler-36 and Kepler-11.


Figure 1: Stage in the simulation when the impactor hits the main body. Shown is  a 2d cut through the 3d structure of the model. The color coding gives the compositional structures of the bodies: red is the iron core, green the silicate mantle and blue the H/He atmosphere. When the impactor crashes into the planet the bodies merge and parts of the system will be expelled. Credit: Liu et al. (2015)

These two systems host planets on orbits very close to each other and to the central star. Surprisingly, these planets are very different from each other, despite their close neighbourhood. Both systems host a mini-Neptune (a planet of < 10 Earth masses with a massive atmosphere and very small mean density) close to a super-Earth (a terrestrial planet with < 10 Earth masses). The problem lies in their close-in configuration combined with their fundamental difference in structure. The formation paths of these two planet categories are thought to be dramatically different. Thus, we are left with the question of how they can be so different, yet so close to each other! Figure 1 shows the idea the authors propose to resolve this issue. Using computer simulations they smash a massive planetary embryo into an already existing planet with a massive hydrogen/helium atmosphere. Depending on the difference in velocities of the planet and its impactor, the outcome of the collision will be different.

Crashing smoothly vs. disruptively

The two variations the authors explore are shown in Figure 2 below.


Figure 2: Impact in the accretionary (left) and disruptive regime (right). For a description see the text. Credit: Liu et al. (2015)

One option (left) for the collision is to have a small velocity difference. This means the injected energy is relatively small: the impactor crashes into the planet and merges its structure with it. The second option (right) is to increase the velocity dispersion. Then the energy involved is greater and the collision is much more violent. Before we dig further into the outcome of the simulations, let's have a closer look at the time sequence in the images. The color coding shows the density, going from purple (low) to bright yellow/white (high). The snapshots go from (a) before the impact, (b) immediately after the impact, (c) 1.56 h after, and (d) 18 h after; (e) is an enlarged version of (d). The dashed circle around the merged body in panel (e) indicates the Hill radius, i.e. the sphere within which the gravity of the merged planet dominates.

Pulverised atmospheres

The difference between the figures above might seem only minor. However, if we take a more in-depth look at the composition of the material in the simulation, the deviations are drastic.


Figure 3: Compositional, radial mass fraction in the accretionary (left) and disruptive (right) simulation at 18 h after the impact. Iron, silicate rock and H/He gas are represented by gray, light coral and blue, respectively. The dot-dashed lines from top to bottom show the initial configurations of the simulations. The dashed lines, starting bottom left, indicate the cumulative mass (yellow), enclosed silicate rock (red), enclosed iron (black) and enclosed H/He (blue) in Earth radii. Comparing the mass fractions of H/He on the left and right, it becomes clear that the high-speed impact on the right drives out most of the atmosphere. Credit: Liu et al. (2015)

The most obvious feature of the figures above is the striking difference in the H/He atmospheres. The outcome of the accretionary simulation is much calmer and the atmosphere is retained by the target body. Therefore, the merged planet is able to hold on to most of its original envelope. In contrast, the more violent collision on the right leads to most of the atmosphere being ejected into space.

However, the authors do a quick post-processing check of whether the accretionary system could hold on to its atmosphere after the impact. There is one essential problem: young stars tend to send out very aggressive extreme ultra-violet (XUV) radiation, which has the potential to destroy the atmosphere. In fact, this poses a problem for the surviving parts of the atmosphere, as it is inflated in comparison with the initial state. Hence, many more XUV photons hit the atmosphere and tear it apart. Because of this, both simulations lose their atmospheres after the impact, and even the relatively modest impact with low velocity dispersion can't save the atmosphere.

In summary, the authors were able to demonstrate that such a violent collision of two planet-sized bodies is in principle enough to "dry out" volatile-rich planets, which offers an explanatory mechanism for seemingly weird systems such as Kepler-11 and Kepler-36. However, the principal formation pathways of planetary systems are still under much debate and maybe other reasoning can lead to similar architectures. Nevertheless, the study of Liu et al. demonstrates the fascinating advances in recent computational methods and emphasizes the potential of geophysically motivated modeling approaches in astrophysics.

Hayabusa2's target asteroid has a name! by The Planetary Society

JAXA announced today the results of the naming contest for Hayabusa2. The target of the sample-return mission, formerly known as 1999 JU3 and still numbered 162173, is now named 162173 Ryugu.

Finding new language for space missions that fly without humans by The Planetary Society

Historically, human spaceflight was described using the words "manned" and "unmanned," but NASA has shifted to using gender-neutral words to describe human space exploration, even though the Associated Press has not. A recent discussion on Twitter among science writers and scientists highlighted some alternatives.

October 05, 2015

Less Furious Road. by Feeling Listless

Film The Guardian's interview with George Miller has an interesting titbit about the preview process of the best action film of the year (sorry Joss):

"The studio released the film with an R rating in America, very rare for productions that cost as much as this one – somewhere in the vicinity of US$150m. From a financial point of view it is considered a given that releasing an R-rated movie, rather than editing it down to a more widely palatable PG, significantly impacts performance at the domestic box office. Potentially by tens of millions of dollars.

Says Miller: two versions of Fury Road were completed and screened to test audiences. “I’m happy to go on the record as saying we tested both versions and it was very clear that the bland version scored a lot less across all demographics than the version you see,” he says.

“To the great credit of the studio they realised if we decreased its intensity and took away a lot of its key imagery it would basically take the life out of the film. It was the studio that said if we compromise the film too much to get a PG, we won’t have a film at all. I thought that was very brave."
It would be an interesting academic exercise to see the PG version of the film and what exactly was lost (although I expect we can probably guess).  There have been plenty of examples of films having been compromised for ratings purposes, notably The Hunger Games: Catching Fire in the UK, which had plenty of blood shots removed for theatrical release.  Sometimes you just have to let films be films.

October 04, 2015

Now, See Hear, Doctor. by Feeling Listless

TV When writing about Doctor Who's Under The Lake yesterday, I suggested it might be a good idea for the BBC to make a signed edition available as quickly as possible on the iPlayer. It's not on there yet, but yesterday there was a signed screening of the episode with Q&A as part of the See Hear festival and this making of piece from the programme itself about Sophie Stone's part in the episode has been posted online.

It's predictably fascinating, especially in relation to how the signing sections were prepared, with Stone, Zaqi Ismail and a signing coach working through the next group of scenes to be filmed the evening before, and negotiations about which pieces of signing would be used. Stone and Ismail essentially have their own dialects of signing due to communicating with relatives, leading to their own signs for various words.

Here is a less complex Extradental version:

Which includes how they had to make up signs for words which don't have official equivalents yet.

If you really want bimodal then you'll need to give something up. by Simon Wardley

Back in 1993, Robert Cringely wrote the excellent book Accidental Empires (I have the 1996 reprint) which talked about three different types of companies using the metaphors of commando, infantry and police.

During 2004-2006, having discovered a process of technology evolution and a mechanism for understanding the competitive landscape (known as Wardley mapping), I implemented the equivalent three party structure (known as pioneers, settlers and town planners) to mimic evolution within a single company. A description of this three party system can be found here, along with the importance of the middle for sustainable competitive advantage. Over the last decade, I've written numerous articles and given lots of presentations around the world on the three party structure and the importance of the middle.

In 2014, I came across bimodal / dual operating system and twin speed IT. I can't tell you how much this caused me to howl with laughter. I'm no fan of these concepts. I view them as old ideas, poorly thought through and dressed up as new. I've seen two party systems in the past that have degenerated into 'them' vs 'us', and with no consideration of how things evolve they are not sustainable. There is no effective process for how the new (i.e. tomorrow's industrialised components) becomes industrialised. The idea that somehow the two groups will work together in a 'dance' is fanciful. Brawl would be more like it. Pioneers and Town Planners are polar opposites; they don't get on well.

I'm quite convinced that those undergoing re-organisation into bimodal will be facing a future re-organisation into a more three party or spectrum based structure in the near future. Certainly that means oodles of cash for consultants advising on these multiple re-organisations and, to be honest, I don't care if it's private companies that I'm not involved with. My concern is that this spills over into Government Departments.

There is however a way of creating a workable bimodal structure but it means you need to give something up. To remind readers, the three parts for which you need brilliant people are :-

Pioneers. Pioneers are brilliant people. They are able to explore never before discovered concepts, the uncharted land. They show you wonder but they fail a lot. Half the time the thing doesn't work properly. You wouldn't trust what they build. They create 'crazy' ideas. Their type of innovation is what we call core research. They make future success possible. Most of the time we look at them and go "what?", "I don't understand?" and "is that magic?". In the past, we often burnt them at the stake. They built the first ever electric source (the Parthian Battery, 400AD) and the first ever digital computer (Z3, 1943).

Settlers. Settlers are brilliant people. They can turn the half baked thing into something useful for a larger audience. They build trust. They build understanding. They learn and refine the concept. They make the possible future actually happen. They turn the prototype into a product, make it manufacturable, listen to customers and turn it profitable. Their innovation is what we tend to think of as applied research and differentiation. They built the first ever computer products (e.g. IBM 650 and onwards), the first generators (Hippolyte Pixii, Siemens Generators).

Town Planners. Town Planners are brilliant people. They are able to take something and industrialise it taking advantage of economies of scale. They build the platforms of the future and this requires immense skill. You trust what they build. They find ways to make things faster, better, smaller, more efficient, more economic and good enough. They build the services that pioneers build upon. Their type of innovation is industrial research. They take something that exists and turn it into a commodity or a utility (e.g. with Electricity, then Edison, Tesla and Westinghouse). They are the industrial giants we depend upon.

The process of evolution (see figure 1) causes a change of characteristics which is why you need multiple attitudes and why one size fits all methods don't work (i.e. why agile, lean and six sigma should die and be replaced by agile plus lean plus six sigma). 

Figure 1 - Evolution

If you want to create a bimodal / dual operating system structure out of this then you really have to give up on one part. For example :-

You could focus on Settlers and Town Planners alone i.e. your company could all be about product development and industrialisation. Simply drop the pioneering research function. The best way to do this is with the press release process i.e. force the writing of a press release before any project starts, because no-one can write a press release for something unexplored. Don't attempt to do any pioneering stuff and instead use ecosystem models such as ILC or some equivalent to do future sensing.

You could focus on Pioneers and Settlers alone i.e. your company could develop novel concepts and then create products out of this. Simply drop the town planning function and outsource all industrialised components (including dropping your own products) where appropriate.

You could decide to focus on just one aspect i.e. be Pioneers (develop novel concepts), be Settlers (create great products) or be Town Planners (create highly industrialised commodity and utility services).

However ... 

1) DON'T try and break into Pioneers and Town Planners. These two groups are far apart. You'll create a them vs us culture. None of the novel concepts will ever be industrialised because the Pioneers won't develop them enough and the Town Planners will refuse to accept them for being underdeveloped. Both groups feel they are the most important and each ridicules the other.

2) DON'T bury your Settlers in one of these groups. They won't feel welcome; they will be in conflict with the group, becoming second class citizens. Put them in the Pioneer group and they'll be relegated to documenting the "glorious" inventions of others and fighting a losing battle over user needs. Stick them with the Town Planners and they'll be seen as 'lightweights', the people whose job it is to deal with those annoying Pioneers and document what they've done etc.

Either drop one aspect (i.e. Pioneers or Town Planners), or focus solely on one aspect (i.e. Pioneer, Settler or Town Planner) or focus on ALL three. This means the Settlers (the missing middle) need to be recognised.

Create one mode to focus on "core system maintenance, stability, efficiency, traditional & slow moving development cycles" and another mode to focus on "innovation & differentiation with high degree of business involvement, fast turnaround & frequent update"? Well, Pioneers and Town Planners don't mix and creating two camps focused on this won't make for a happy environment.

Oh, but isn't Bimodal / Dual better than Unimodal / Single? No, I'm far from convinced that this is the case. In the past many ignored evolution and squashed all three attitudes, cultures and types of people into one group called IT, Finance, Marketing or whatever (yes, evolution impacts everything we do). Those groups would tend to yo-yo in methods between the extremes, hence in IT we had Agile vs Six Sigma (ITIL etc) battles. But every now and then the 'middle' would get its chance. Today you're seeing this with Lean, with focus on user needs etc.

If you organise by the extremes e.g. pioneers and town planners or mode 1 and mode 2 or whatever you wish to call it, then you're burying the middle. This is not good. This middle group are not mode 1 or mode 2 but they have a vital role to play. They deal with the transition between the extremes. However, in a dual structure you've given the other attitudes a flag to rally around, you've formalised a structure to support this and left the settlers twisting in the wind. You've gone from ignoring them (which you did to all attitudes under one group) to actively discouraging them and sending a message that only pioneers and town planners matter.

Well, you've been warned, tread this path carefully. This will be my last post on this subject (I hope) given it's such old hat but I'll come back in a decade and we shall see what has happened to these dual structures.

P.S. The mapping technique, the characteristics, the use of multiple methods, the pioneer / settler / town planner structure and a host of other stuff you'll find in this blog is all ... creative commons share alike. You can do this yourself. You don't need external consultants, license fees etc.

“Girly-girl” by Feeling Listless

Film I think I'm just going to keep posting these quotes because something has to change. Whilst it's true that Raimi's Spider-Man was filmed and released over fifteen years ago, the second paragraph explains why this is still relevant.

  Kirsten Dunst being interviewed by Elizabeth Day in The Observer:

"We start talking about whether the pressure to look a certain way is stronger for women than it is for men. Does she think the film industry is sexist? “God. These conversations are always so, like…” She pauses and I see her actively decide to say what she really feels. “I mean, yeah,” she concludes. She recalls that, when she was filming Spider-Man at the age of 18, the older men on set – including director Raimi – would call her “Girly-girl”.

“I didn’t like that at all. I mean, I think they meant it as endearing, but at my age I took it as dismissive.” At the time she was too intimidated to speak up for herself. But recently she found herself working with the same first assistant director on another film. “I told him how much that upset me,” she says. “And he treated me completely differently on this movie and we got along really well. He’s a great guy.”
And from The Guardian last Thur

Under The Lake. by Feeling Listless

TV  At the risk of sounding like the demented and tormented Brad Pitt detective at the end of the movie Seven, “What’s in the baaaaax????” Lesser stories and episodes than Toby Whithouse's Under The Lake, and we've all seen, read and heard them, would have made the idiotic choice of finding a mechanism for opening the suspended animation crate and making the reveal of what's inside the cliff-hanger moment. All the talk of the Sword of Orion led me, like Ted Rogers, to expect the contents to be a cyberman (partly because the writer also mentions a minuet and I assumed his script was a love-letter to McGann's first audio season) and it might still be, but the point is that going into the next episode we still don't know; the script's self-esteem is high enough that it's happy to keep some mysteries for its second instalment, knowing full well that there's enough other hullabaloo bounding around to keep us otherwise enthralled. It's a BBC Two launch night of a script if you will (including the odd black out).

We're at episode three, people, and everything is blazing. Within minutes the usual social media platform was buzzing with notices about this being the best episode ever and such (well, amongst the dozen of us not watching the rugby), which is probably going a bit far; this is just a superior base under siege story so far, not Human Nature. But having spent the day actually looking forward to watching Doctor Who, quite the change from last year when I almost stopped watching (true), the show didn't disappoint and was actively enjoyable, hitting all the right notes, in the right order, and actually made me laugh, a lot, my new second hand armchair shaking beneath me, and left me pretty scared on more than one occasion. Which used to be the baseline expectation for Doctor Who before the new series threw it into a vat of contemporary television bells and whistles, so it's actually comforting to find oneself watching something which isn't afraid to just stand alone, land the TARDIS somewhere and have at it.

Given Whithouse's previous scripts for the series, there's something pretty iconoclastic about Under The Lake's traditionalism. Not all of them have worked (GUNS!) (SOMETIMES!) but they've all been pretty rangy in how they've interpreted the show. Yet here we are on a base, under a lake, with a crew of scientists being menaced by ghosts obscuring some other mystery designed to intrigue the Doctor. As is always the case with two parters it's impossible to make a judgement about the whole thing yet, and the throw forward (spoiler alert until the end of this paragraph) certainly seems to bend expectations in that we'll soon be visiting the area of the base before it became the base, which I don't think has happened before, time travel becoming a component for the second story running, but so far Whithouse has pretty much followed the series five playbook to the Pemberton. Was Vector Petroleum a shout out to the Fury from the Deep writer? Let's hope so.

With all the necessary caveats in place, how are we orientated? Let's look at the elements, starting with the crew who, with the exception of the rather obvious Weyland-Yutani spy played by the usually-called-upon-to-be-more-subtle Steven Robertson, are that rare example of a group of professionals assured of their capabilities. How marvellous (as Russell might say) that ours is the show which has a leader whose abilities are just the sort of thing which are required at a crucial moment, whose capabilities are what drives the story forward and which no one else, even the Doctor it seems, is able to tap into. Sophie Stone was the first deaf person to be accepted at RADA and the effect seeing this heroism might have had on children with similar dreams of following in the footsteps of either the actor or her character is incalculable, and it's to be hoped that a signed repeat of the story turns up on television and the iPlayer as quickly as possible (it's currently listed in the signed section but none of the streams are up yet).

It's an eclectically cast group. The leader's translator Zaqi Ismail's only other screen credit is Indian Summer II, mostly working otherwise in regional theatre, his ability to read sign language the only skill listed on his CV. To that he should add "able to look petrified, terrified and generally shit scared." Arsher Ali, best known for the Chris Morris film Four Lions, is also in what's usually a fodder role but given enough scope for some genuinely funny emoting. Then right in the middle is the mighty Morven Christie, finally receiving her Doctor Who credit after having even avoided the gaze of Big Finish's various casting Saurons and utterly charming as she always is. She was literally the only reason I sat through all eight episodes of the Frank Spotnitz co-production misfire Hunted and she's still on my top five list of potential future Doctors, the one I keep as back-up just in case Romola is too busy as president of the known universe.

Structurally the base is about what it needs to be design wise, with lots of bulkheads and corridors and cargo bays, fulfilling the usual narrative requirements.  Has anyone written a piece for the sorority bulletin about how the aesthetics of these environments have changed across the series, both in broadcast order and in-verse chronologically?  Under The Lake offers a pretty classical example, mainly white but with lived in sections, rather like crashing the Liberator into Moonbase Alpha.  All these bases have a unique feature and so here's a Faraday Cage, for which the Radio Times handily asked a proper scientist for an explanation the other day, which includes the ability to block radio waves, making the wifi in the Doctor's sonic sunglasses all the more magical.  If nothing else it reminds me that I haven't watched Planet of the Dead lately.  Let's add Lady Christina De Souza to the list of characters who we hope Big Finish will grant an audio boxed set, shall we?

Like Vampires of Venice, the apparent supernatural elements sadly turn out not to be, being instead the implementation of some alien's plan.  Nonetheless these are perfectly spooky examples with their blank spaces and faces, with the irony that outside Christie, the episode's most prominent guest cast members find themselves being called upon to speak wordlessly (though perhaps they'll return next week in the earlier time period given their offhand demises here so early in the episode).  Again, Whithouse's script is careful to keep them intriguing.  How are they able to become randomly corporeal enough to carry metal and how come they can't exist within the artificial daylight of the cabin, embracing instead the darkness?  Image wise they're genuinely horrific and as ever I wonder how this will play with children, how far over the edge this is.  There's plenty of jump-scare horror on Netflix which has design work less potent than this.

After the emotional horrors of last season, Clara, and to an extent Jenna, seem happy to allow themselves to slip backwards into a more traditional companion role.  There's a supercut to be made of just how many times she says "Doctor" or asks a question in the episode, to such an extent that even the Doctor not only notices but uncomfortably warns her against "going native", with the implication that she's using this urge for adventure as a way of running from her grief.  Jenna's playing in this scene is extraordinary, the mask of trying to accept the kind of sympathy which is being offered by someone who's only really wanting to make themselves feel better, and wanting that conversation to end.  Let me direct you to this superb animation about the difference between sympathy and empathy (with a trigger warning for anyone touched by tragedy recently) (he says inelegantly), then go back and watch Jenna's face as she allows him to stop, especially her eyes.

But the Doctor's attitude and Capaldi's performance in that scene is the epitome of this new and improved Twelfth Doctor.  Arguably his attitude hasn't changed that much from last year, he's still rude and yes, lacking in empathy, but he's clearly aware that it's a problem with this incarnation and the introduction of the cards (screengrabs here) and the implications of them are warm and funny rather than horrific, I think.  This is no longer the man who simply dismissed all of the humans who went inside the Dalek with him out of hand and we're almost seeing a reverse of the Fourth and Leela during their Hinchcliffe season, of Clara attempting to civilise him.  There's also a quite shameless introduction of Tenth-like pop culture referencing too and switching on a dime from compliment to counterintuitive: the moment when he thanks Christie for turning on all the lights before asking her to do the reverse.  Continuing on from last week (although we don't really know the production order), the actor simply feels more confident in himself and his ability to do his childhood hero justice.

One item which on the one hand is obviously simply a quick way of making the crew trust him, but which otherwise has interesting implications, is how this 22nd century crew are not just aware of UNIT but also of the Doctor as an entity, even to the point of LINDA-like fandom.  Having spent the best part of a season not too long ago expunging himself from the knowledge of the universe, now some rando in an underwater mining facility has heard of him.  He seems unconcerned by this.  It's possible that this is simple foreshadowing for next week's episode when the Doctor potentially does something notorious enough with the help of future UNIT in the base's past to warrant this recognition.  But in the spin-offs, the "future" history of the organisation is sketchy (even if information about its employees' descendants isn't) so again, we're seeing a very confident script dropping potential hints for some future narrative, either next week or in the coming months.

Which returns us to the initial question.  “What’s in the baaaaax????”  The imdb page for the episode has a potentially spoilery casting line, but I'm not convinced it's that simple.  It's not Davros this time.  My boiler plate theory is that it's the Doctor's real body, him having become a ghost for reasons, and he'll spend the episode trying to explain this and how to put the two back together in her part of the upcoming episode, the Time Lord having discovered the identity of the person who thought they'd be going in the box, but that doesn't explain the fatalities.  Unless it is a cyberman.  Or it isn't actually revealed next week who is in the box, the Doctor having decided it's probably best not known.  But like The Satan Pit, a locked box simply can't stay closed, it has to be opened and my goodness I can't wait for the explanation, which is a wonderful, wonderful feeling.  Although it's become a disastrous night for English Rugby fans in the period it's taken me to write this review, for Doctor Who fans it has been the exact opposite.

October 03, 2015

On Display. by Feeling Listless

Art It's well known that a vast percentage of a museum or gallery's collection isn't on display, either because the archive is chock full of very average art which has been bequeathed by well meaning collectors with little taste, or because the work is difficult to adequately display, or because tastes have changed. But the percentages are far starker than I'd realised. Here's a BBC Culture paragraph which tots up some statistics:

"The walls of the Tate, the Met, the Louvre or MoMA may look perfectly well-hung, but the vast majority of art belonging to the world’s top art institutions (and in many countries, their taxpayers) is at any time hidden from public view in temperature-controlled, darkened, and meticulously organised storage facilities. Overall percentages paint an even more dramatic picture: the Tate shows about 20% of its permanent collection. The Louvre shows 8%, the Guggenheim a lowly 3% and the Berlinische Galerie – a Berlin museum whose mandate is to show, preserve and collect art made in the city – 2% of its holdings. These include approximately 6,000 sculptures and paintings, 80,000 photographs, and 15,000 prints by artists including George Grosz and Hannah Höch."
The overall piece explains why some pieces aren't shown more often. But there will be plenty of very good museum pieces in all our national collections, which we pay for, that don't see the light simply because of a lack of wall space.

There'll be matters of insurance and security, but why not loan these to smaller galleries throughout the country or other municipal institutions, creating small "national galleries" in the provinces augmenting existing art collections?

In some of these museums, for all the gems, there's also a fair amount of pretty average stuff which is essentially wall filling (because it feels like there should be some oils and works on paper are too perishable) and which could be replaced with works of international interest.

New Horizons releases new color pictures of Charon, high-resolution lookback photo of Pluto by The Planetary Society

Now that New Horizons is regularly sending back data, the mission is settling into a routine of releasing a set of captioned images on Thursdays, followed by raw LORRI images on Friday. The Thursday releases give us the opportunity to see lovely color data from the spacecraft's Ralph MVIC instrument. This week, the newly available color data set covered Charon.

Thousands of Photos by Apollo Astronauts now on Flickr by The Planetary Society

A cache of more than 8,400 unedited, high-resolution photos taken by Apollo astronauts during trips to the moon is now available for viewing and download on Flickr.

Mars Exploration Rovers Update: Opportunity Rocks on Ancient Water During Walkabout by The Planetary Society

Opportunity continued her walkabout around Marathon Valley in September and sent home more evidence of significant water alteration and, perhaps, an ancient environment inviting enough for the emergence of life.

October 02, 2015

Arena's Night and Day. by Feeling Listless

TV As part of its fortieth anniversary celebrations, the BBC's Arena strand will be setting up an internet stream which utilises the show's back catalogue to create a visual experience which maps the hours of a day in a kind of loose version of Christian Marclay's The Clock:

"As Arena approaches its 40th anniversary, the strand sets sail into uncharted waters. 'Night and Day' is a 24 hour visual experience following the pattern of day and night, drawn exclusively from Arena’s rich and varied archive. This is a unique and ground-breaking project, unlike anything seen on your television screens. Night and Day is designed to be experienced on a range of platforms – as a continuous cinema or art piece, as a 24 hour television broadcast, and living permanently as a continuous transmission online, all broadcast in real time, following the light through morning, noon and night. An accompanying second screen app in sync with the main art work will provide further information about the scenes as they unfold."
There'll also be a 90 minute condensed version, although I have a feeling I'll be spending a day or two with the longer version.

Going to Mars by Albert Wenger

I was thrilled to hear about NASA’s discovery of water on Mars. I am hoping it will contribute to interest in a human mission to Mars. Ideally, such a mission would not be under the banner of a single nation but rather include team members from around the world.

Why travel to space at all when there are so many problems to solve here? Many of the problems here have old historical gripes attached to them that will take a long time to overcome. Mars offers a unique opportunity to co-operate globally in a way that starts from scratch. The moon is much too close to Earth by comparison and still feels like a logical extension of our regional squabbles. It doesn't help that the initial space race was closely associated with the Cold War.

A counter argument would be that it will be impossible to assemble such a program internationally. I think that's true if we put politicians in charge. But most scientists and engineers care more about the thing itself than giving their nation some extra advantage. So if the right initial group comes together, it becomes more a question of funding it. Here too is an opportunity for some innovation. Over time some (maybe all?) of a mission could be crowdfunded, and that would be an interesting opportunity for the leading crowdfunding sites to collaborate to have the same project raise funds in all places.

Bringing humans to Mars will have other benefits in addition to giving us a common rallying point for bringing humanity closer together. For starters, it would be great to have a backup strategy for Earth. Having been involved with building several high performance resilient computer systems, I abhor single points of failure. And Earth is currently a single point of failure for humanity. It would also be nice to show that we don’t need to worry about overpopulation even if lifespans were to suddenly start growing a lot.

Most importantly, though, I think the actual difficulties of establishing human life on Mars will make us realize how important it is to take better care of Earth. When embattled in whatever conflict here on Earth, from the petty to the catastrophic, it is easy to lose sight of just how magical our planet is and how inhospitable the rest of the universe is to human life. Attempts to put humans on Mars and sustain them there, followed by all of us, will provide a constant reminder.

Until we embark on the mission in reality we will have to make do with fictional versions. And there I am excited about seeing The Martian this weekend after thoroughly enjoying the book.

Cosmic Rays Make for Windy Galaxies by Astrobites

Title: Launching Cosmic Ray-Driven Outflows from the Magnetized Interstellar Medium
Authors: Philipp Girichidis et al.
First author’s institution: Max-Planck-Institut für Astrophysik, Garching, Germany
Status: Submitted to ApJ

Galaxy evolution is a game of balance. Galaxies grow as they accumulate gas from what is called the intergalactic medium and as they merge with smaller galaxies. Over time, galaxies convert their gas into stars. This inflow of gas and the subsequent star formation is balanced by the outflow of gas in galactic winds, through a process known as feedback. As suggested by observations and seen in simulations, these winds are driven by supernova explosions that occur as stars die. Gas driven out may eventually fall back into the galaxy, continuing a stellar circle of life.

Although simulations have done a good job reproducing galaxy properties considering feedback from supernovae alone, this is far from the complete picture. Cosmic rays, or high energy protons, electrons, and nuclei moving near the speed of light, can create a significant pressure in galaxies through collisions with the gas they contain. Supernova explosions are important sources of cosmic rays in galaxies. Simulating cosmic rays is computationally challenging, yet they may be very important for understanding the life cycle and structure of galaxies. In addition, since cosmic rays are charged particles, accounting for how they interact with magnetic fields in galaxies may be very important. The authors of today's paper use hydrodynamic simulations with magnetic fields (or magnetohydrodynamics, MHD), cosmic rays, and supernovae to try and better understand their roles in driving galactic winds.

Testing Feedback


Figure 1: Density slices through the gas disk in each of three simulations. The vertical axis gives height above the disk. From left to right, the simulations include only thermal energy from supernova explosions, only cosmic rays from supernovae, and both. (Source: Figure 1 of Girichidis et al. 2015)


The authors aim to understand how supernovae, magnetic fields, and cosmic rays affect the evolution of gas contained within the disk of a galaxy. In particular, their test is to see which, if any, of the three best reproduces the density and temperature distribution of gas as a function of height above the gas disk. They perform three simulations of a galaxy disk, all of which include magnetic fields. One simulation includes only the thermal energy injected by supernova explosions, one includes only cosmic rays generated by supernovae, and the third includes both. The gas densities of these three simulations are shown (left to right) in Figure 1, 250 million years after the start of the simulation.

Figure 2: This is a more quantitative view of what is shown in Figure 1. Shown is the gas density as a function of height above the disk (z) for the run with thermal supernova energy only (black), cosmic rays only (blue), and both (red). These are compared against observations of the Milky Way (yellow). (Source: Figure 2 of Girichidis et al. 2015)

Putting numbers to Figure 1, Figure 2 shows the gas density as a function of height above the disk for all three simulations: thermal supernova energy only (black), cosmic rays only (blue), and both (red). The vertical lines show the position within which each simulation contains 90% of the gas mass. As shown, including only thermal supernova energy produces a dense disk, with little gas above the disk. Adding in cosmic rays changes this significantly, driving out quite a lot of gas mass to large heights above the disk. This is in part because cosmic rays are able to quickly diffuse to large distances above the disk. The gas then flows out from the disk to large heights, following the large pressure gradient established by the cosmic rays. The cosmic ray simulations do a much better job of matching the yellow line, which gives observational estimates of the gas density above the disk of our Milky Way.

In addition, the authors go on to show that, over time, including cosmic rays serves to slowly grow the thickness of the gas disk, and quickly dumps gas at large heights above the disk. They also show that the mass flow rate of galactic winds generated by cosmic rays is nearly an order of magnitude greater than those generated by thermal energy injection alone.

Developing an Accurate Model of Galaxy Formation

This research aims to better describe the evolution of galaxies by including the effects of supernova feedback as well as the not-well-understood effects of cosmic rays and magnetic fields in their simulations. Their work shows that cosmic rays are able to drive out a significant amount of gas from the disks of galaxies, potentially tilting the balance between gas inflow and star formation, and gas outflow. Understanding this process better with further work will bring the properties of simulated galaxies into better agreement with observed galaxies in our Universe.

Cargo Craft Completes Six-Hour Schlep to ISS by The Planetary Society

A Russian cargo craft laden with more than three tons of food, fuel and supplies arrived at the International Space Station today.

Favorite Astro Plots #1: Asteroid orbital parameters by The Planetary Society

This is the first in a series of posts in which scientists share favorite planetary science plots. For my #FaveAstroPlot, I explain what you can see when you look at how asteroid orbit eccentricity and inclination vary with distance from the Sun.

October 01, 2015

Out Class and Online. by Feeling Listless

TV When the official Doctor Who twitter feed knocked out this at six o'clock this evening pretty much everyone was excited:

Cue five hours of random speculation on the Twitters in which most fans seemed to convince themselves that it would be a missing episode announcement or that Osgood was going to be announced as a companion or some such. Unless that was just me. Either way when this happened:

And eventually:

Everyone was:

Even though:

Which is entirely correct.

Anyway, so yes, Doctor Who's getting a spin-off, it's called Class and it's set in Coal Hill School and it's written by Patrick Ness.

BUT, and I've rather buried the headline here, the channel it's for, BBC Three, goes online only in January, something which is unmentioned in the above press release (although the Radio Times has noticed).  So when this is "broadcast" next autumn, its main distribution point will be the iPlayer or a surviving BBC channel late night.

SO Class, like Spooks: Code 9 before it, is a YA spin-off of a very successful existing property designed to promote the reformatting of BBC Three.

Notice too that this is for BBC Three and not CBBC, so although it seems like a rerun of SJA to some degree, the tone they'll more likely be searching for is Buffy or Almost Human but not, we should suspect, Torchwood.

Speculation: the return of Courtney and Jenna Coleman as a series regular.  Ian Chesterton cameo and Capaldi appearing in at least one episode.

Patrick Ness is supposed to be quite good.  I wasn't a huge fan of his Puffin eBook from 2013 featuring Fifth and Nyssa, his only other Who credit so far, but you do have to ask why him?  Are we seeing the BBC training someone to take over from Moffat?  He only has one other screenwriting credit albeit on a major motion picture based on his own novel.

We don't know much about it but in general I'm still, well, hum.  The Caretaker was the third worst episode of last year but that was mainly due to the appalling gender politics rather than anything to do with the school.  But it doesn't feel like anything spectacularly new based on the press release, essentially Grange Hill with aliens and haven't we seen that already?

Which isn't to say that a Paternosters spin-off would be the better choice but there was certainly scope to introduce a thing in the main series, as per Torchwood, which was a bit more unusual.  But the scope of this seems firmly based on what could be budgeted relatively cheaply as per most precinct dramas.  I'll try not to pre-judge though.  We await casting announcements with great interest.

Empathy, augmented - public services as digital assistants by Richard Pope

sketches of a digital assistant for an imaginary public service

Google Now is probably the best known example of the so called 'intelligent digital assistants'*. It suggests relevant information based on your location, your calendar and your emails. So, for example, it might automatically track a parcel based on a confirmation email from Amazon, or nudge you with the quickest route home based on your location.

Google Now is (for now) confined to day-to-day admin, and using it feels very obviously like having a machine help you out (and I'd guess a machine that runs off simple 'given these events have happened, then do this thing' rules rather than any artificial intelligence cleverness).

In addition to Google Now, there are examples of personal assistants that combine contextual notifications with a conversational, instant messenger style interface. So you get pushed some relevant information or asked to complete a task, but you can also ask a question or add a comment.

Native and Vida are apps that help you book complex travel arrangements and diagnose food allergies respectively. There is a good write-up here of how they work.

Compared to Google Now, these seem much less obviously like a machine talking**. Instead, you are having a conversation with a person (and it almost certainly is a real person most of the time), but there are these automatic nudges and nuggets of information that make the conversation richer.

What is really nice with these examples is that the differences of dealing with a real person and dealing with the purely digital parts of the service are abstracted away. There is a single interface onto a complex domain with computers doing the things computers are good at (joining together disparate data sets / reacting to changes in context in real-time), and humans doing the things humans are good at (empathy, and understanding complex edge cases).

So, what is the relevance for public services? Well, for most public services, probably very little. You don't need an intelligent assistant to buy a fishing licence or book a driving test.

Where it is potentially revolutionary is in the delivery of complex services that require interaction over a long period of time and with many edge-cases - services where everybody is an edge-case and everything is always changing. Things like benefits, caring, health, special educational needs or mediation. Things that are complex and demand empathy.

What could a public-service-as-digital-assistant look like?

1) Smart to-do lists that make it clear exactly what steps a user needs to take next to navigate the system (a minimal sketch of this pattern appears below, after this list). Very much like cards in Google Now, items/cards get added to the list based on a user's context. Completing one task may trigger other tasks. For example, if a user is asked to confirm how many children are in their household, and the number has changed since they were first asked, new cards might appear asking them to enter the details of the children. New cards can be added automatically by the system, at a face-to-face meeting with a government advisor, or when a user is on the phone to a call centre; there is one interface regardless of the channel.

2) A dynamic overview of a user's situation right now. What this looks like will depend on the service, but should also change based on a user's exact context. For example, the overview when an advisor is beginning to understand the caring needs of a family member may be very different once help has been put in place. Broadly though, these should communicate where a user is right now, how they are progressing through the system and what to expect next. The same view that is visible to a user should be visible to the government advisors who are helping them.

3) Augmented conversations that, rather than remove human interaction from a service, instead augment it. So if a nurse mentions details of a medicine a user is going to be asked to try, then the contraindications are automatically presented. Or if a special education advisor mentions a school, then the travel time and school performance are linked too. Or if a user notes down 5 jobs they have applied for, the pay ranges and locations are automatically summarised for the user and government advisor to comment on.

(The closest to this currently happening in the public sector is the work the Universal Credit Digital Service team are doing with to-do lists.)
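To make the first pattern a little more concrete, here is a minimal Python sketch of a card-based to-do list driven by 'given this event happened, do this' rules. Everything in it (the card titles, the rule, the event fields) is invented for illustration rather than taken from any real service.

from dataclasses import dataclass, field

@dataclass
class Card:
    title: str
    done: bool = False

@dataclass
class TodoList:
    cards: list = field(default_factory=list)
    rules: dict = field(default_factory=dict)   # "given this event happened, do this" handlers

    def add_card(self, title):
        self.cards.append(Card(title))

    def complete(self, title, **event):
        # Mark the card done, then fire any rule registered for it with the event context.
        for card in self.cards:
            if card.title == title:
                card.done = True
        handler = self.rules.get(title)
        if handler:
            handler(self, event)

# Hypothetical rule: confirming household size triggers per-child detail cards.
def on_household_confirmed(todo, event):
    if event.get("children_changed"):
        for i in range(event["number_of_children"]):
            todo.add_card(f"Enter details for child {i + 1}")

todo = TodoList(rules={"Confirm number of children in household": on_household_confirmed})
todo.add_card("Confirm number of children in household")
todo.complete("Confirm number of children in household",
              children_changed=True, number_of_children=2)
print([card.title for card in todo.cards if not card.done])
# ['Enter details for child 1', 'Enter details for child 2']

The same shape of rule table could sit behind cards added by an advisor or a call centre, which is what makes the single-interface idea plausible.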

Personally, I think these patterns provide an opportunity to design services that genuinely understand and react to a citizen's needs, that seamlessly blend the online and the offline, the human and the automated into a single empathetic service.

I guess we’ll only find out by building some.

* I'm not counting Siri here, which is really more of a voice interface with a personality

** I've not used either of these directly, I'm just going on descriptions and screen grabs

The first detection of an inverse Rossiter-McLaughlin effect by Astrobites

How do you observe an Earth transit, from Earth? You look at reflected sunlight from large highly reflective surfaces. Good candidates include planets, and their moons. They are the largest mirrors in the solar system.

On Jan 5, 2014, Earth transited the Sun as seen from Jupiter and its moons. Jupiter itself is not a good sunlight reflector, due to its high rotational velocity and its turbulent atmosphere. Its major solid satellites are better mirrors. Therefore, the authors observed Jupiter’s moons Europa and Ganymede during the transit (see Figure 2), and took spectra of the Sun from the reflected sunlight with HARPS and HARPS-N, two very precise radial velocity spectrographs. The former spectrograph is located at La Silla in Chile, and the latter on La Palma in the Canary Islands. The authors’ goal was to measure the Rossiter-McLaughlin effect.

Fig 1: The Earth, and the Moon as they would appear to an observer on Jupiter on 5 January 2014, transiting the Sun. Figure 1 from the paper.

Fig 2: The geometric configuration of Jupiter and its moons, as seen from the Sun. Figure 2 from the paper.

Transits: sequential blocking of a star

The Rossiter-McLaughlin effect is a spectroscopic effect observed during transit events (see Figure 3). As a star rotates on its axis, half of the visible stellar photosphere moves towards the observer (blueshifted), while the other visible half of the star moves away from the observer (redshifted). As a transiting object, in our case a planet, moves across the star, it will block out one quadrant of the star first, and then the other. This sequential blocking of blue- and redshifted regions on the star causes the observed stellar spectrum to vary. More specifically, the uneven contribution from the two stellar quadrants distorts the spectral line profiles, causing the apparent radial velocity of the star to change, when in fact it does not. The effect can give information on a) the planet’s radius, and b) the angle between the sky projections of the planet’s orbital axis and the stellar rotational axis.
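For a rough sense of scale (my own back-of-the-envelope estimate, not a number from the paper), the amplitude of the effect scales roughly with the fraction of the solar disk the Earth covers times the Sun's projected rotational velocity, taken here to be about 2 km/s:

\Delta V_{\mathrm{RM}} \sim \left(\frac{R_\oplus}{R_\odot}\right)^{2} v \sin i_\odot \approx \left(\frac{6.4\times10^{3}\,\mathrm{km}}{7.0\times10^{5}\,\mathrm{km}}\right)^{2} \times 2\ \mathrm{km\,s^{-1}} \approx 17\ \mathrm{cm\,s^{-1}},

which is consistent with the roughly 20 cm/s expectation quoted later in the post.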

Fig 3: The Rossiter-McLaughlin effect: as a planet passes in front of a star it sequentially blocks blue- and redshifted regions of the star, causing the star’s apparent radial velocity to change, when in fact it does not. The viewer is at the bottom. Figure from Wikipedia.

Observations of the transit

Figure 4 shows the whole set of corrected radial velocities taken during the transit, including observations of Jupiter’s moons on the nights before and after. The transit, as seen from Jupiter’s moons, took about 9 hours and 40 minutes. The best coverage of the event, about 6 hours, came from HARPS-N at La Palma Observatory. HARPS at La Silla Observatory was able to observe the transit for about an hour.

Fig 4: Corrected radial velocities measured on Jan 4-6, 2014. Vertical dashed lines denote the start, middle, and end of the transit. Observations of Europa from La Palma cover about 6 hours of the transit (black circles). Colored points show observations of Ganymede (cyan) and Europa (red) from La Silla. Figure 4 from the paper.

An anomalous drift

The expected modulation in the solar radial velocities due to the transit was on the order of 20cm/s. The Moon, which also partook in the transit (see Figure 1 again), added a few cm/s to this number.

Instead of detecting the expected 20 cm/s modulation, the authors detected a much larger signal, on the order of -38 m/s, a modulation about 400 times larger than expected and opposite in sign (see the peak in Figure 4): an inverse Rossiter-McLaughlin effect.

The authors ruled out that the observed modulation could be caused by instrumental effects, as the two spectrographs showed consistent results. They also ruled out any dependence of the anomalous signal on the Sun’s magnetic activity, using observations conducted simultaneously with the Birmingham Solar Oscillations Network (BiSON). They had another idea.

The culprit: Europa’s opposition surge

The authors suggest that the anomaly is produced by Europa’s opposition surge.

The opposition surge is a brightening of a rocky celestial surface when it is observed at opposition. An example of an object at opposition is the full moon. The “surge” part has to do with the increase, or “surge”, in reflected solar radiation observed at opposition. This is due to a combination of two effects. First, at opposition the reflective surface has almost no shadows. Second, at opposition photons scattered by dust particles close to the reflective surface can interfere constructively, increasing its reflectivity. The latter effect is called coherent backscatter.

The authors created a simple model for Europa’s opposition surge and compared it to their observations (see Figure 5). It works. As the Earth moves across the face of the Sun, rather than blocking the light (as in the Rossiter-McLaughlin effect shown in Figure 3), the net effect is that the light grazing the Earth is amplified. The Earth thus acts as a lens, not only compensating for the light lost during the eclipse but actually making the Sun appear brighter. This explains both the opposite sign and the amplitude of the effect. Additionally, the amplification of reflected light is not confined to the transit itself, but grows gradually as Earth gets closer to transiting and as Europa gets closer to opposition. The effect is symmetric, and is analogously observed as Earth moves out of transit.

Fig 5: The model of the opposition surge (thick black line) compared to observations from HARPS-N at La Palma (red), and HARPS at La Silla (blue). The dotted blue line shows the originally expected Rossiter-McLaughlin effect, amplified 50-fold for visibility. It is much smaller than the observed signal. Figure 7 from the paper.


This is the first time an inverse Rossiter-McLaughlin effect, caused by a moon’s opposition surge, has been detected. The authors predict the effect can be observed again during the next conjunction of Earth and Jupiter in 2016. Although that will be a grazing transit with a smaller amplitude than the one studied in this paper, the authors can now predict with confidence the extent of the newly discovered effect in the upcoming event.

NASA announces five Discovery proposals selected for further study by The Planetary Society

NASA announced the first-round selections for its next Discovery mission today. A total of five planetary mission concepts -- three targeted at asteroids, two at Venus -- will move to the next stage of the competition.

The solar system at 1 kilometer per pixel: Can you identify these worlds? The answers by The Planetary Society

Last Friday I posted an image containing 18 samples of terrain, all shown at the same scale. Were you able to figure out which square was which? Here are the answers.

September 30, 2015

Rufford Old Hall. by Feeling Listless

A fine Tudor building, the home for stories of romance, wealth and 500 years of Hesketh family history.

Be wowed by the Tudor Great Hall with its fantastic furniture, arms, armour, tapestries and the carved oak screen, a rare survivor from the 1500s. History springs to life in the Hesketh's dining room, its food-laden table, lit candles and 'fire in the hearth' waiting to welcome the family's dinner guests.

And did Shakespeare spend a short time here in his youth? There’s reasonable evidence to suggest that he could once have known Rufford’s Great Hall for a few months whilst still in his teens. Ask us about the evidence and decide for yourself!

Then relax as you stroll through Rufford's Victorian and Edwardian gardens - and remember you're only a few feet (or metres) above sea level - making Rufford one of the lowest lying National Trust gardens in England.
Heritage  Sitting on my desk right now, next to this laptop, is a small brown cardboard box with the words My National Trust embossed into the lid with a logo containing the silhouette of an oak leaf above.  Everything about it is making me smile.  Becoming a member of the National Trust wasn't something I set out to do today.  But after surprising myself with a visit to Rufford Old Hall having only discovered its existence yesterday, and realising at the entrance gate that the fiver per month membership fee by direct debit was wholly affordable and cheaper than the average entrance fee (having previously looked at the lump sum approach of old and shivered), I handed over my Visa Debit details and was given in return a copy of the 2015 handbook.  The membership card will be in the post by the end of next month.

After completing the North West Art Collections project, I've been at a bit of a loose end, wanting to try something else.  Various ideas have been researched and rejected.  After watching the funeral of Richard III, I considered seeing the tombs of all the British monarchs.  But that's messy geographically, plus it felt like more of a box ticking exercise and the only proper way to do it would be chronologically and frankly, yes but no.  Another was to work through the remaining art collections in Your Paintings but many of them are in official buildings ill equipped for visits by members of the public so there'd never be a case of simply just turning up.  I almost decided to try and see everything by a particular artist but then there was the process of choosing the artist and it was inevitably going to be someone with lots of work in private collections.

The National Trust, in the end, was inevitable.  For one thing I've already visited a couple of the properties for the purposes of the other project as you'll see if you click the new tag at the bottom of this post.  Plus there's a decent iProduct app with a structure that shows properties closest to you so there's a fairly logical approach to fanning out from home into the country.  But there's also the fact that although there's a relatively finite amount of destinations, there are enough that I'm unlikely to run out ever.  So it's a project which feels like it has the potential for completion but really doesn't and also has the added bonus of forcing me to visit properties even further afield in places I wouldn't otherwise have a reason to visit, just like the other project (although just like the other project everything is public transport and cost permitting).

Rufford Old Hall, then. The origins of the house are messy. A property has stood on the land since the 14th century, but a version of the current building was erected in 1530 (of which the great hall (pictured) is the only surviving element), possibly by Thomas Hesketh after a series of inheritances of the kind which tended to happen then because women weren't allowed to keep hold of the money which we'd now quite rightly deem as being theirs. The house then stayed in the family for centuries, and they made a series of changes including the current extensions, although the main family subsequently moved to a Rufford New Hall, which led to this building getting its name. As well as the family, it also attracted numerous tenants, including a school which used the main hall during one of the periods of building work.

The entry on Your Paintings, the Wikipedia and this old local guide book transcript have versions of the story which somewhat contradict one another and even having also heard it described by one of the volunteer guides in the house, I'm still unclear as to the chronology and who these people were.  Perhaps I should have bought a guide book.  Perhaps the National Trust's own website should be more detailed.  Ultimately the house itself has resolved itself into two periods, the Tudor section of the great hall and the rest of the house which since being gifted to the Trust in 1936 has largely been regressed back to how it would have looked in the Victorian era when the majority of the fixtures and fittings were originally installed.  A lot of these had been moved to the family's eventual regular home at Easton Neston but then bought back when the contents of that property were sold in 2004.

It's at times like this that I remember wistfully the more linear philanthropic development of regional art collections.  On entering the house my first question, just to make sure, was whether there were any particularly distinguished paintings, and the first answer was no, but after exploring and chatting to the various volunteers this turned out to be not quite right.  The majority of the collection is production-line family portraits, and in some of these you can actually see how the body and background had been prepared by one artist ready for another to paint in the face of the given subject.  But in the dressing room upstairs there's a massive collection of flower watercolours by Ellen Stevens which had been bought by the Trust, who decided they'd be best presented at Rufford.  Minutely detailed and observed, they're almost worth the visit to the house by themselves.

The best oil painting in the house is the utterly thrilling An Extensive Landscape with Exotic Flowers, Fruit and Vegetables and a 'Noli me Tangere' in the Garden Beyond by the obscure Flemish painter Gommaert van der Gracht.  Glancing towards surrealism four hundred-odd years before Dali and Magritte, Gracht combines landscape, still life and animal painting with a religious scene.  Superbly detailed fruit fills almost half the composition, with a goat representing Adam eating fruit from a tree on which vines represent a serpent.  Meanwhile in the background, a resurrected Jesus visits Mary Magdalene, the allegorical message being that we're seeing the original sin being forgiven.  Here's an image, though the postage stamp you're looking at fails to capture the grandeur of this canvas, which fills half a wall in the dining room.

As you can see from the above quote from the Trust website, Rufford's other claim to fame is a Shakespeare connection, the idea being that the playwright spent part of his missing years here, both as a player in a touring company and as an assistant teacher, in around 1585.  The only potential documentary evidence appears to be a will by Alexander Houghton of Lea Hall near Preston, which states:
"Item. It is my mind and will that the said Thomas Hoghton of Brynescoules my brother shall have all my instruments belonging to music, and all manner of play clothes if he be minded to keep and do keep players.

"And if he will not keep and maintain players then it is my mind and will that Sir Thomas Hesketh knight shall have the same instruments and play clothes.

"And I most heartily require the said Sir Thomas to be friendly unto Fluke Gyllome and William Shakeshafte now dwelling with me and either to take them into his service or else to help them to some good master as my trust is he will."
Shakeshafte being Shakespeare in this instance. The Oxford Companion to Shakespeare is cynical, noting that Shakespeare has to be back in Stratford two years later to marry Anne Hathaway and that although Shakeshaft was a common name in the area at the time, there's no evidence of it having been used interchangeably in Shakespeare's family.  In other words, as is so often the case with the man, we simply don't know.  But there are some related books, including the RSC Complete Works in the gift shop just in case.

Although it's only a relatively small house, the visit filled about three and a half hours including a Lancashire cheese sandwich in the cafe which has been set up in the house's original kitchens.  The rest of the visitors were of retirement age and plenty of them seemed to buzz around within about half an hour (apart from one gentleman who was admirably thorough with his questions of the volunteers, right down to how the cutlery was arranged in the dining room).  But I tended to sit in each of the rooms flicking through the large visitor guides provided and enjoying being in the space, mostly because I know what 2015 is like and sometimes you just want to get away from it as much as possible.  Becoming a National Trust member genuinely feels like the next best thing to being handed the keys to a TARDIS.

Tanks! Why tank stories make great tech myths by Charlie Stross

Me again! M Harold Page, but you can call me "Martin" (I use my very fine middle name to differentiate myself from the folk singer and the French YA writer).

I've just published Swords Versus Tanks 1: "Armoured heroes clash across the centuries". It even has a cover quote from Charlie ("Holy ####!").  So now I'm here to shamelessly plug my new book (click through and take a look at the cover... Go on! You know you want to!).

However, you're a sophisticated lot, so call the above "A word from our sponsor" and let me tell you why I think tank stories make great tech myths.

First some examples...

We all know the one about the Panther and the T34. The Panther is the better tank, when the Russian mud hasn't knocked it out, when it doesn't need shipping to Czechoslovakia for repair, when it's not being spammed by cheap and cheerful T34s.

That's a story that ought to be taught to engineers and software developers. Sometimes perfect means "delivered now". Duke Nukem Forever is reputedly the same story, but that's just the best-known example of chasing perfection at the expense of practicality. I'm sure you guys have others.

Then there's the farce of under-gunned early WWII tanks on all sides unable to actually harm each other except by ramming - though, in North Africa, a driver did dismount mid battle and single-handedly capture a squadron of Italian tanks. They had no radios and communicated by runner, so all he had to do was rap on the hatch of each, then point his revolver at the commander... (We'll come to tanks and radios).

The example of the under-gunned tanks is a rich one.

It's a good example of how, when like competes with like, it's all down to strategy and resources. Note how MS Word gradually displaced its very similar rivals. 

It also highlights the need to consider a good range of use cases - do you really expect your tanks not to face other similar tanks? I don't know about you, but I've worked on software that was fine only as long as you adopted the workflow envisaged by the developers. The old RoboHelp, for example, forced you to plan everything top down and didn't leave much room for second thoughts. 

One reason that the tanks were undergunned is that they were shoehorned into old paradigms - the eternal curse of AFVs, e.g. modern armour being optimized for set-piece battles and then evolving painfully in contact with asymmetric war.

In 1939, the issue seems to have been twofold:

First, the one we all know. Allied tank doctrine was based around the infantry attack, with the heaviest tanks designed to be deployed in penny packets to support infantry but not fight each other. "We" knew this was a bad idea as far back as the dawn of tank warfare. However, the institutional momentum was on the side of the older arms.

Second, and less well known, tanks were specialised -- Light, Medium (Cruiser), Heavy -- according to a WWI paradigm. This quickly turned out to make no sense on the modern battlefield where the artillery had such a long reach, and where in the chaos, tanks would blunder into situations for which they were not designed, e.g. fighting other tanks.

The lesson is not only that old paradigms have their own momentum due to vested interests, but also that it's hard to see outside the box. It reminds me of what happened when computers first appeared in organisations. The PCs went first to the secretaries, because, after all, computers were just glorified electric typewriters. Job-conscious secretaries resisted handing over the resulting files, while middle-ranking staff resisted acquiring computers, because typing was what secretaries did. In other parts of the same organisation, databases sat on special computers, as did Internet access when it arrived, all because, historically, different activities required different physical spaces - the card index, the mail room... it just made intuitive sense.

Finally, we come to the radios.

Even in 1939, it was common to have only one radio per armoured unit, with the rest communicating by semaphore or runner. This is the military equivalent of the funny little chappy with the red flag scurrying along in front of your automobile. Why so few radios? Perhaps it was hierarchical thinking - radios are for the unit commanders. Perhaps it was a failure of vision - the older heads could not conceive of fluid warfare where tanks, even if initially massed, could end up operating independently.

The price for the French might well have been their nation. Supposedly, there was a moment when a French tank unit stumbled on a weak flank and the Nazi blitzkrieg could - perhaps - have been brought to an abrupt halt.

Not equipping all your tanks with radios is like letting your teenager go to a party but insisting they be home by 2230! You might as well save yourself the parental taxi trip (or is that the point? Hmm...).

The obvious Aesop here is "For want of a nail the kingdom was lost..." The less obvious one is that it's rarely enough to tick the boxes.  Remember the Dotcom Boom when everybody was putting up flashy Flash-laden websites and nailing "e" to their business name? I'm sure you tech-savvy denizens of the comment thread have better examples...


I think tank stories make great tech myths because they are *not* IT stories. You can't dive into the detail to justify people's actions. Nor do they hit hot buttons about operating systems and open source. Better yet, the outcomes are usually photographable, or at least easy to visualise, and memorable because, in hindsight and ignoring the tragedies, they are darkly funny.

Of course, there's plenty of dark humour in Swords Versus Tanks 1: "Armoured heroes clash across the centuries". 1930s-style quasi-communists have invaded their own past, not realizing that the magic works. It follows that a knight with a rune-etched sword can take on a tank.  (There's more to it than that, of course. The moderns are not necessarily the bad guys.)

You can download my book from Amazon (UK, US, CA, AU). You can also get Swords Versus Tanks merchandise, including mugs and - I jest you not - shower curtains. (Somebody has already bought pillow slips. Really.)

Questions to ask your design work by Tom Darlow

Last month I spent some time updating my portfolio. As a product / user experience / web designer, it can be quite tricky to present your work to others.

The rock star visual designers of this world can get away with presenting screenshots of their stunning visual work — job done, money due.

But if your day-to-day consists of making user flows, interaction designs, napkin wireframes, trade-offs, A/B tests, and minimum viable products, then you — like me — need to do a bit more groundwork to do the show and tell properly.

It’s not so much the what, but the why and the how.

With this in mind, I decided to structure my portfolio around the questions I ask myself when looking at other products and services:
A user experience, product design portfolio layout

1/ What was the context?

Was this a piece of work for a fast moving startup or a 100 year old family business? Setting the context lays some good foundations for answering the questions around what you did and why.

2/ What problem were you trying to solve?

The cornerstone of successful design projects is making sure you’re solving the right problems.

In this section of my portfolio pieces, I found it useful to pull apart the problem I was hired to solve.

Sometimes the problem definition is a bit blurry, e.g.: Do we really need to make the sign-up process faster? Or - to convert more prospects - should we enable users to access features without committing to signing up first?

3/ How did you design the solution?

Here we deep dive right into how you designed the product / interface / solution.

Unpack how you approached things. Show rather than tell. Let us see what you made.

Did new insights emerge as you moved through the project? If so, lay them out here — tell us how you responded.

4/ Review the above

The final section of the portfolio answers:

  • How did it go?
  • What impact did the solution make?
  • What happened when your work reached customers’ hands?
  • Can you surface some feedback or data that shows your solution is working out?

Arguably, the biggest thing you bring to the table as a designer is your ability to render an optimal solution — and answers to these questions go a long way in revealing how you do that.

The APOGEE Treasure Trove by Astrobites

  • Title: The Apache Point Observatory Galactic Evolution Experiment (APOGEE)
  • Authors: Steven R. Majewski, Ricardo P. Schiavon, Peter M. Frinchaboy, et al. (there are more than 70 co-authors)
  • First Author’s Institution: Dept. of Astronomy, University of Virginia, Charlottesville, VA (there are 50 institutions represented among the authors)
  • Paper Status: Submitted to The Astronomical Journal

Apache Point Observatory in Sunspot, NM. The SDSS 2.5-m telescope is to the right, pointing toward the center of the Milky Way. The full moon and light pollution from nearby El Paso don’t stop APOGEE! Figure 15 in the paper.

What’s black, white, and re(a)d all over… and spent three years looking all around the sky? It’s APOGEE (the Apache Point Observatory Galactic Evolution Experiment), a three-year campaign that used a single 2.5-m telescope in New Mexico to collect half a million near-infrared spectra for 146,000 stars!

Black and white? As shown below, the raw spectra from APOGEE look black and white, but appearances can be deceiving. Each horizontal stripe is the spectrum of one star, spanning a range of colors redder than your eye can see. To get the spectra nicely stacked in an image like this, fiber-optic cables are plugged into metal plates which are specially drilled to let in slits of light from individual stars in different regions of the sky. Each star gets one fiber, which corresponds to one row on the detector. An image like this allows APOGEE to gather data for a multitude of stars quickly.

Part of a raw 2D APOGEE image from observations near the bulge of the Milky Way. Each horizontal stripe is a portion of one star’s near-infrared spectrum. The x-axis will correspond to wavelength once the spectra are processed. Bright vertical lines are from airglow, dark vertical lines common to all stars are from molecules in Earth’s atmosphere, and the dark vertical lines that vary from star to star are scientifically interesting stellar absorption lines that correspond to various elements. Fainter and brighter stars are intentionally interspersed to reduce contamination between stars. Figure 14 in the paper.
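A cartoon of that one-fiber-per-row bookkeeping, purely for illustration (real reductions trace curved fiber paths and fit extraction profiles; the array sizes and the star name below are invented):

import numpy as np

n_fibers, n_pixels = 300, 2048                # invented sizes, for illustration only
raw_image = np.random.poisson(100.0, size=(n_fibers, n_pixels)).astype(float)

fiber_for_star = {"example_star": 42}         # hypothetical star-to-fiber assignment

def extract_spectrum(image, fiber_index):
    # One fiber = one star = one detector row, so "extraction" here is just row selection.
    return image[fiber_index, :]

spectrum = extract_spectrum(raw_image, fiber_for_star["example_star"])
print(spectrum.shape)                         # (2048,): one wavelength axis per star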

Re(a)d all over? Today’s paper accompanies the latest public data release, DR12, of the Sloan Digital Sky Survey (SDSS), a large collaboration which includes APOGEE. So in addition to focusing on red giant stars viewed in near-infrared light, all the APOGEE data are now freely available and may be read by anyone. Even so, the APOGEE team has been hard at work.

Probing to new galactic depths with the near-infrared

APOGEE is designed to primarily observe evolved red giant stars in the Milky Way using near-infrared light. What’s so special about this setup? First, red giants are some of the brightest stars, so it’s possible to see them farther away than Sun-like stars. Second, near-infrared light doesn’t get blocked by dust like visible light does, so it lets APOGEE observe stars toward the center of the Milky Way, which is otherwise obscured with thick dust lanes. This is really important if you want to understand how different stellar populations in the galaxy behave.

Mapping velocities and composition

Because APOGEE collects spectra of stars, not images, each observation contains lots of information. Spectra tell us how fast a star is moving towards or away from us (its radial velocity), how hot a star is, what its surface gravity is like, and what elements it is made of. Lots of work has gone into developing a pipeline to process the spectra and return this information reliably, because it’s not practical to look at hundreds of thousands of observations by hand.

APOGEE visits each star at least three times to check if it is varying for any reason. (For example, binary stars will have different radial velocities at different times, and the APOGEE team wants to exclude binaries when they use star velocities to measure the overall motion of the galaxy.) The figures below show how a subset of stars mapped by APOGEE vary in radial velocity (top) and chemical composition (i.e., metallicity, bottom). The stars in both figures lie within two kiloparsecs above or below the disk of the Milky Way, so we are essentially seeing a slice of the middle of the galaxy. Observations don’t exist for the lower right quadrant of either figure, because that region is only visible from Earth’s southern hemisphere.
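A toy version of that binary check might look like the following; the uncertainty and threshold are invented for illustration and are not the APOGEE pipeline's actual criteria:

import numpy as np

def looks_like_binary(visit_rvs_km_s, rv_error_km_s=0.1, n_sigma=5.0):
    # Flag stars whose visit-to-visit radial velocity scatter greatly exceeds
    # the per-visit measurement uncertainty.
    return float(np.std(visit_rvs_km_s)) > n_sigma * rv_error_km_s

print(looks_like_binary([12.3, 12.4, 12.3]))   # False: stable RV, usable for Galactic kinematics
print(looks_like_binary([12.3, 18.9, 5.7]))    # True: large swings, likely an unseen companion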


A map of stars observed by APOGEE, color-coded by radial velocity. The Sun is located at the center of the “spoke” of observations, and is defined as having zero radial velocity (greenish). An artist’s impression of the Milky Way is superimposed for context. This figure illustrates how the galaxy as a whole rotates. The Sun moves with the galaxy, and other stars’ relative motions depend on how far ahead of or behind us they are. This astrobite has more details. Figure 24 from the paper.

A map of stars observed by APOGEE, color-coded by metallicity. As above, the Sun is in the center of the observation “spokes” and an artist’s impression of the Milky Way is superimposed for context. The Sun is defined to have 0 metallicity (greenish). Stars that are more chemically enriched than the Sun are red, and stars that have fewer metals than the Sun are blue. This figure illuminates an overall galactic metallicity gradient. Figure 25 from the paper.

Together, maps like these provide an unprecedented look into our galaxy’s past, present, and future by combining kinematics and the locations of stars with different chemistry. Thanks to APOGEE’s success, plans are now underway for APOGEE-2 in the southern hemisphere using a telescope in Chile. This treasure trove of data will undoubtedly be put to good use for years to come.

Mars Week Continues: We've Released Our 'Humans Orbiting Mars' Workshop Report by The Planetary Society

Learn all about a sustainable, affordable path to get humans to the Red Planet—a path that goes through Mars orbit and Phobos.

September 29, 2015

The sky's gone dark by Charlie Stross

Here's a technological question with philosophical side-effects that's been bugging me for the past few days ...

Today, the commercial exploitation of outer space appears to be a growth area. Barely a week goes by without a satellite launch somewhere on the planet. SpaceX has a gigantic order book and a contract to ferry astronauts to the ISS, probably starting in 2018; United Launch Alliance have a similar manned space taxi under development, and there are multiple competing projects under way to fill low earth orbit with constellations of hundreds of small data relay satellites to bring internet connectivity to the entire planet. For the first time since the 1960s it's beginning to look as if human activity beyond low earth orbit is a distinct possibility within the next decade.

But there's a fly in the ointment.

Kessler Syndrome, or collisional cascading, is a nightmare scenario for space activity. Proposed by NASA scientist Donald Kessler in 1978, it predicts that at a certain critical density, orbiting debris shed by satellites and launch vehicles will begin to impact on and shatter other satellites, producing a cascade of more debris, so that the probability of any given satellite being hit rises, leading to a chain reaction that effectively renders access to low earth orbit unacceptably hazardous.

This isn't just fantasy. There are an estimated 300,000 pieces of debris already in orbit; a satellite is destroyed every year by an impact event. Even a fleck of shed paint a tenth of a millimeter across carries as much kinetic energy as a rifle bullet when it's traveling at orbital velocity, and the majority of this crud is clustered in low orbit, with a secondary belt of bits in geosynchronous orbit as well. The ISS carries patch kits in case of a micro-particle impact and periodically has to expend fuel to dodge dead satellites drifting into its orbit; on occasion the US space shuttles suffered windscreen impacts that necessitated ground repairs.

If a Kessler cascade erupts in low earth orbit, launching new satellites or manned spacecraft will become very hazardous, equivalent to running across a field under beaten fire from a machine gun with an infinite ammunition supply. Sooner or later you'll be hit. And the debris stays in orbit for a very long time, typically years to decades (centuries or millennia for the particles in higher orbits). Solar flares might mitigate the worst of the effect by causing the earth's ionosphere to bulge—it was added drag resulting from a solar event that took down Skylab prematurely in the 1970s—but it could still deny access to low orbit for long enough to kill the viability of any commercial launch business. And then there's the nightmare scenario: a Kessler cascade in geosynchronous orbit. The crud up there will take centuries to disperse, mostly due to radiation degradation and the solar wind gradually blowing it into higher orbits.

So here's my question.

Postulate a runaway Kessler syndrome kicks off around 2030, at a point when there are thousands of small comsats (and a couple of big space stations), ranging from very low orbits to a couple of thousand kilometers up. Human access to space is completely restricted; any launch at all becomes a game of Russian roulette. (You can't carry enough armor plating to protect a manned capsule against a Kessler cascade—larger bits of debris, and by "large" I mean with masses in the 0.1-10 gram range—carry as much kinetic energy as an armor-piercing anti-tank projectile.) Unmanned satellites are possible, but risk adding to the cascade. So basically we completely lose access to orbit.
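For a sense of scale, a quick back-of-the-envelope calculation; the roughly 10 km/s closing speed is my assumption for a typical low-orbit collision, not a figure from this post:

def debris_kinetic_energy_joules(mass_grams, closing_speed_km_s=10.0):
    # Plain 1/2 m v^2, with the closing speed left as an explicit assumption.
    mass_kg = mass_grams / 1000.0
    speed_m_s = closing_speed_km_s * 1000.0
    return 0.5 * mass_kg * speed_m_s ** 2

for mass in (0.1, 1.0, 10.0):   # the 0.1-10 gram range mentioned above
    print(f"{mass:4.1f} g -> {debris_kinetic_energy_joules(mass) / 1000.0:5.0f} kJ")
# 0.1 g -> 5 kJ, 1.0 g -> 50 kJ, 10.0 g -> 500 kJ at the assumed 10 km/s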

There are some proposals to mitigate the risk of Kessler Syndrome by using microsats to recover and deorbit larger bits of debris, and lasers to evaporate smaller particles, but let's ignore these for now: whether or not they work, they don't work unless we start using them before Kessler syndrome kicks in.

So, suppose that with the exception of already-on-orbit GPS clusters and high altitude comsats, we can't launch anything else for a century. What effect does it have on society and geopolitics when the sky goes dark?

My Favourite Film of 1979. by Feeling Listless

Film Glancing backwards through this list as we creep ever closer to the moment when I'll be choosing films released before the year of my birth, one pattern which is emerging is how often a film hasn't just been something I've enjoyed on a visceral or emotional level but has also been a kind of gateway to something else, be it the art of Seurat in the case of Ferris Bueller, the cosmology of Sunshine and Shakespeare's appearance in Star Trek VI (probably) (if I'm being honest).  There's a sense that I'm especially drawn to films which don't just exist as pure entertainment but also have the weight of being a kind of cultural event in which the narrative and character are extrapolated through and drawn around cultural artifacts leading to a much deeper experience.

Manhattan's a case in point.  When I saw the film on Channel 4, years after its original release, I was overwhelmed, not just by Gordon Willis's mythologising photography of the city or the psychologically complex characterisation, but also by the cultural references, most of which I didn't understand at the time but wanted to.  This was probably the first time I heard Gershwin, or heard of Flaubert, of Cezanne, of Fellini and of Zelda Fitzgerald.  There are also the locations: the Guggenheim, the American Museum of Natural History, Bloomingdale's, MOMA, the Whitney.  When watching Manhattan, you're not just watching a film, you're listening to a concert, you're visiting an art exhibition, reading literature and you're being given an architectural tour, and the director makes sure that you notice.  He wants you to notice.

Recently, there seem to be fewer of these films as "cultural events", because film culture in particular is increasingly scared of alienating its audience by presenting it with culture (and even a culture) it might not necessarily be aware of in an otherwise familiar context.  How often now do we see adaptations, even of contemporary novels, where the first elements to be excised or neutered are the cultural references?  Or, in order to make the life of an artist or poet acceptable to a wider audience, the work they create is backgrounded in favour of a love story or some other historical moment.  Pollock's a rare example of a film in which you actually learn something about his method and inspiration on top of his psychological underpinning and biography (indeed it's the very film which finally led me to understanding why his earlier work, at least, is great).

Which is why the rare occasions when films do embrace such things can be a joy.  The underrated Liberal Arts has a sequence in which its protagonist is introduced to a ton of classical music in a montage sequence through which we watch him experience a cultural awakening and I expect most of us are right along with him and straight to Spotify to listen to these pieces in full.  Which isn't to say that sometimes filmmakers don't get such things horrendously wrong.  Welcome as it is to see Jane Austen feature in The Rewrite, she's generally ridiculed and there's no way on earth a scholar of the calibre Allison Janney's apparently playing wouldn't be aware of Clueless or not seen any of the adaptations.  I have a theory that the professors Janney plays in each of these films are somehow related and argue the toss over such matters at Thanksgiving.

I've often talked about a kind of cultural awakening which happened in the early 90s and Manhattan and plenty of Woody Allen's other films and films in general will have been a contributing factor (and since it's important to credit the gateway, it was probably Ed Chigliak in Northern Exposure and his various dream sequences) (Northern Exposure was also a huge factor but this list doesn't feature television) (ahem).  Which isn't to say that they're designed that way.  When Alvy pulls out Marshall McLuhan in Annie Hall, it's in front of an audience he expects to know who Marshall McLuhan is (even if it's not necessarily necessary in order to understand the joke).  It's worth noting I've yet to see The Sorrow and the Pity and it's The Last Action Hero which led me to watch The Seventh Seal.

Nevertheless this seems to me to be one of the embraceable cliches.  From the moment we're born, we're constantly absorbing the culture around us, accepting some, rejecting others, leading to a set of behaviours or something to live up to.  Much of the time it's a manufactured fantasy but the trick, if we're to remain sane, is to counterintuitively follow Cypher's request in The Matrix to be re-inserted, otherwise we'll spend our lives like Neo, nostalgically glancing out of car windows at restaurants we used to eat at, with really good noodles, that we now know don't exist.  When I saw Manhattan, I didn't see any justifiable reason why, even if I could never live in this 1979 city, I couldn't at least enjoy some of its benefits.  Hello Rhapsody in Blue.  Hello Fellini.  Hello as many other films set in New York as I could find, which included Jon Jost's All The Vermeers in New York.  Hello Vermeer.

Which isn't to say I actively wanted to live my life like the characters in the film.  They seemed so old and stressed out and I didn't really fancy Mariel Hemingway (because it would have felt like I was cheating on Debbie Gibson and the girl I fancied on the 80 bus) (teenage boys are curious beings).  But it was the idea of having that sort of cultural awareness which was attractive, of being the sort of person who would sit in the evening listening to records or reading the latest novels rather than playing Wizball on the C64 and watching the Justice episode of Star Trek: TNG for the umpteenth time (like I said, teenage boys...).  I still aspire to all that and manage it sometimes, though in present circumstances I have noticeably been watching more films, which is always a fallback position but which, it should be added, I don't feel embarrassed about because why should I?

But when critics and audiences increasingly describe films as empty experiences, it's impossible not to attribute some of that to the loss of culture and of presenting characters connected to aspirationally higher art.  Film companies necessarily want to present audiences with characters they think they'll identify with, which means they're more likely to go to a rock concert than an opera house, and if they do visit an opera house, it'll be part of a "fish out of water" routine or an action sequence rather than their natural repose.  Say what you like about Amazon's television selection, but between Mozart in the Jungle and hiring Whit Stillman and Woody himself to produce series, it's unembarrassed by its high art tendencies.  If only this was still true of the big screen.  Still, we'll always have Manhattan.

Learn To Commit by Albert Wenger

This past Saturday Susan, our daughter Katie and I (with help from our son Michael) taught a one-day computer bootcamp for women at Cornell. Susan, who is a Cornell alum, had come up with the idea a few months earlier after noticing how many more male than female students showed up at hackathons. Ami Stuart, who organizes tech events for Cornell, liked the idea and did an amazing job pulling together the actual event, which wound up being sponsored by both Ziggeo (Susan’s company) and Accenture.

The goal for the day was to provide exposure to a wide set of topics, including how the web works, using the command line, using an editor to change html and css files, a bit of javascript and publishing everything using both github pages and a server. We had asked the students to do some pre-work including signing up for github, registering a domain, and installing Firefox, git and Atom on their laptops.

If all of this sounds a bit too ambitious to you that would be right, although not by much as it turns out. We had a full day from 9am to 5pm with only a 1 hour bio break and managed to get through most of our program. We cut the parts on Javascript a bit short and in the end only a few students could access their own servers. Nearly everyone though was able to publish the beginnings of their own site via github pages.

If you are wondering where the name “Learn To Commit” comes from then you probably should take the bootcamp. We put all the materials online.  As Susan wrote in her post about the event, we are thinking of repeating this at other schools, so if this is of interest to you, please contact us.

Missing: Several Large Planets by Astrobites

Title: Hunting for planets in the HL Tau disk
Authors: L. Testi, A. Skemer, Th. Henning et al.
First author’s institution: ESO, Karl Schwarzschild str. 2, D-85748 Garching bei Muenchen, Germany
Status: Accepted for publication in ApJ Letters


ALMA image of a disc of gas and dust around the young star HL Tau. The dark rings in the disc are thought to be gaps, carved out by giant planets. Image Credit: ALMA (ESO/NAOJ/NRAO)

Nearly a year ago, the ALMA collaboration released this stunning image of the young star HL Tau. The sub-millimeter wavelengths of light that ALMA detects revealed a vast disc of gas and dust, several times larger than Neptune’s orbit. Intriguingly, the disc was divided up into a series of well-defined, concentric rings.

The cause of the rings seemed clear: There must be planets around HL Tau, their gravity sculpting the gas and sweeping out the dark gaps in the disc.

But there was an issue with this hypothesis. HL Tau is a very young star, less than a million years old. Many planetary formation models assume that planets take much longer to grow to the kind of sizes needed to shape the disc like that. If the gaps are being made by planets, then those models will need a serious rethink.

The authors of today’s paper decided to take a closer look, and see if they could spot the hypothetical planets. This isn’t as easy as it may appear. In the part of the electromagnetic spectrum probed by ALMA, the star is relatively dim, allowing the light from the disc to be discerned. However, any planets present would shine in infrared light, with a much shorter wavelength. In infrared, the blinding light from HL Tau would easily outshine that from a planet.

Two techniques were used to overcome this problem. The first was simple: Use the biggest telescope that they could get their hands on, in this case the unique Large Binocular Telescope Interferometer (LBTI).

The second trick was to use adaptive optics. This technique uses a light source, such as a laser or, in the case of the LBTI, a well-known star, to correct for the distortions in light caused by the Earth’s atmosphere. As the telescope’s computers know what the guide star “should” look like, it can continuously  flex a small mirror to counteract the effects of the atmosphere. This makes the images much clearer, enough to directly image planets around some stars.

But even adaptive optics wasn’t enough to show up planets around HL Tau. The last obstacle was the disc itself. The very reason for looking for planets had become a hindrance to spotting them, scattering the light from the star out to much greater distances than usual.

To remove this scattered light, the authors made two infrared observations of HL Tau, one at a slightly redder wavelength than the other. In both, any signal from planets was drowned out by the scattered light.

But the exoplanets were predicted to be much redder than the scattered light. This meant that they wouldn’t show up at all in the less-red image, regardless of the scattered light. However, they should have been somewhere in the second image, with the scattered light roughly the same in both. Subtract the first image from the second, and the scattered light would disappear, leaving just the planets.
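A cartoon of that differencing step (not the authors' actual reduction, which also involves careful scaling and alignment; the image sizes, noise levels and planet brightness below are invented):

import numpy as np

rng = np.random.default_rng(0)
scattered = rng.normal(100.0, 1.0, size=(64, 64))         # disc-scattered starlight, common to both bands
k_band = scattered + rng.normal(0.0, 0.5, size=(64, 64))   # bluer image: scattered light only
l_band = scattered + rng.normal(0.0, 0.5, size=(64, 64))   # redder image: scattered light plus any red planet
l_band[32, 40] += 25.0                                      # a hypothetical planet bright only in the redder band

difference = l_band - k_band                                # common scattered light cancels; the planet survives
peak = np.unravel_index(np.argmax(difference), difference.shape)
print(peak)                                                 # the injected "planet" is recovered at row 32, column 40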


Left: K-band infrared image, with scattered light only. Right: Slightly redder L’ band image, showing both scattered light and (if they are there) planets. Subtract one from the other and…


…No planets. Oh well. The subtracted image, with blue lines showing the most prominent gaps in the disc, the red star the position of HL Tau, and the green circle the position of a candidate planet from an older observation. Planets should show up as white dots near the rings, of which there are none to be seen.

When the authors did this, they spotted…nothing. Based on the precision of their data, they conclude that there are no planets larger than 10-15 times the mass of Jupiter near the gaps in HL Tau’s disc.

At first glance that doesn’t seem to be a problem. Planets that large aren’t all that common, and there could easily be planets too small for the LBTI to detect hiding in the gaps.

But planets any smaller than 10 Jupiter masses wouldn’t have enough gravity to shape the disc in the way seen in the ALMA image. Planets or no planets, a new explanation for the complex structure of HL Tau’s disc may be needed.

The authors point out one possible way to solve this problem. ALMA is most sensitive to dust grains around a millimeter across, whilst the disc is probably made of a range of particle sizes. Smaller planets may have just enough gravity to move only the millimeter-scale particles into the observed rings, leaving the rest of the disc relatively untouched.

So are the gaps in the ring really caused by planets, or something else that we haven’t thought of yet? The paper ends by charting out the ways that astronomers can explore this system in the future. Longer observations by ALMA could broaden the range of dust sizes seen, allowing a more complete image of the disc structure to be made. And searches for smaller planets could be carried out, although such precise measurements will probably need to wait for the next generation of truly giant telescopes.


NASA's Mars Announcement: Present-day transient flows of briny water on steep slopes by The Planetary Society

NASA held a press briefing today to publicize a cool incremental result in the story of present-day liquid water on Mars. How big a deal is this story? Was all the pre-announcement hype justified? Is this just NASA discovering water on Mars for the zillionth time? What does this mean for things many space fans care about: life on Mars or future human exploration?

Dawn Journal: 8 Years in Space by The Planetary Society

On the 8th anniversary of the launch of the Dawn spacecraft, Chief Engineer and Mission Director Marc Rayman gives his annual summary of Dawn’s progress on its interplanetary travels.

September 28, 2015

On Doctor Who ratings. by Feeling Listless

TV The consolidated "ratings" for Doctor Who's The Magician's Apprentice are out:

As you'll see, this doesn't include the iPlayer numbers, so essentially they're a fiction and in no way reflect the actual number of people who watched the episode in subsequent days and, as will now be the case, weeks, since it'll be hanging around the streaming service for a good while yet.  We won't really know the implications of the numbers until the regular column is published in the association gazette in a month or so.

Which is the point.  You can look at The Witch's Familiar's numbers and bang your head against a table until your forehead bleeds but 3.7m overnight isn't awful in the current television climate.  Few things are appointment television any more especially drama and there are loads of die-hard fans of the kind which work on the show and its ancillary spin-off material who don't even bother watching it on broadcast.  The only reason I do is so I can get the review out that night.

Let's not worry about cancellations and hiatuses just yet.  Moffat's said in the past that the show's emergence on a Saturday night is increasingly becoming its "publication" time and that they only really care about how many people consume it across its life, rather like a movie which barely registers at the cinema but does well on dvd and streaming.  If people lose interest in these later moments, that's when we begin to worry.

One of the reasons viewers must be timeshifting is because it's often difficult to keep track of what time Doctor Who is on.  In its current broadcast position, Doctor Who is a slave to Strictly, TX dependent on the current duration of that lead-in programme.  Ideally Who would be on first and has been previously, but because of some astonishing lack of nerve in the face of Simon Cowell, the DCMS and who knows what, the drama's been sacrificed against The X Factor and the Rugby.

Updated!  29/09/2015  The Guardian's posted a ratings update hidden in a wider story about comments from BBC1 controller Charlotte Moore saying she'd be ok about a female Doctor.  Mentions that the show's had 1.5m viewings so far on the iPlayer which puts the show's "rating" at 8m - though of course 6.5m of that is extrapolated speculation rather than solid, countable streams.

Doctor Who News also mentions the overnights for the Sunday omnibus.  The share seems bizarrely low, but again it was opposite the rugby.  Nevertheless, it's still wrong to say the ratings are dropping.  The ratings are just fine.

More on Women in Film. by Feeling Listless

Film The Observer this weekend ran a couple of pieces about women in film which underscore, just as this TIFF Q&A does, the huge gap between men and women in the industry. Firstly an interview with Geena Davis, who is a key example of how Hollywood treats its actresses so poorly:

At 59, Davis is familiar with the crushing silence of a phone that never rings. Women in film are, she says “definitely” discriminated against because of their age.

“I was averaging about one movie a year my whole career and that was because I’m fussy. I probably could have done more. And then in my 40s I made one movie… And I was positive it wasn’t going to happen to me because I got a lot of great parts for women. I was very fortunate to have all that stuff happen and never get typecast, so I was just cruising along thinking: ‘Well yeah, it won’t happen to me.’ It did.”
Then nine women in film talk about the sexism they've either seen or experienced themselves. Agnes Godard, cinematographer:
"I have experienced sexism at work. Most of the time it’s a refusal to do what you’ve asked, or to doubt the legitimacy of the instruction. The most illustrative thing I went through was a long time ago, in 1983, when I was a focus puller on the Wim Wenders film Paris, Texas. The director of photography (DoP) was Robby Müller and I split my role with his usual focus puller. One week Robby wasn’t there and I set the camera and did a frame. When Robby arrived, he said, “Who did this beautiful frame? It’s really good”, and the grip, who was next to me, said it was the male assistant who made it. I was just speechless – I felt invisible. I think I said something, but it was like a whisper, because I was astonished. And I was shy and quite young at the start of my career, and I didn’t feel I could complain."
Trigger warning: the comments on both articles fulfil Lewis's Law.

September 27, 2015

Brief hiatus by Charlie Stross

I've fallen silent because I'm drinking my way around Amsterdam this weekend. Tomorrow (Monday) I'm flying out to Portland, Oregon, for the H. P. Lovecraft film festival and Cthulhucon, where I'm one of the guests of honor. And on Monday evening I aim to be eating and drinking in Deschutes Brewery in Portland from about 7pm in a desperate attempt to stave off jet lag. If you're in town, why not come along and help keep me awake until local evening?

The Witch's Familiar. by Feeling Listless

TV Ridiculous. Utterly ridiculous. There’s not a lot more you can say about The Witch's Familiar even though I expect I will as the night draws on. Much of Doctor Who is pretty ridiculous.  It’s why we love it so, and why all of it is amazing even when it is rubbish. Try to describe the plot of most stories to someone with only a passing interest in the series and they’ll generally look at you as though you spent the whole of the 60s dropping acid despite you having been born in the 70s. Try it some time, pick something at random, The Sunmakers, for example, and have a go. Watch carefully for the moment when they either (a) try to look for the quickest route out of the conversation or (b) have the phrase “And you watch this?” pop into their brains awaiting the most strategic moment of deployment.

This isn’t unique to Doctor Who. Most of science fiction and fantasy has to be pretty bonkers in order to justify its own existence and keep us entertained, but there’s just something expressively weird about Doctor Who because no matter how many times you think you have a story understood and you know what’s about to happen, some random element will introduce itself and everything will narratively head off in a direction you weren’t expecting. Indeed there’s an argument that Doctor Who fails when it isn’t doing that, when everything you think will happen in a story happens, when a particular story element is set up to occur in an episode and there is no twist and the outcome is as expected. But I think I’ve clobbered half of last year’s episodes more than enough.

That’s why The Witch's Familiar is so damn good. Throughout I had absolutely no idea what it was doing, where it was going and how it was going to end. Not one. As a friend pointed out to me last week, there were story elements in the first part, the Hand Mines, the planes with lasers being shot at by bows and arrows which would have been key elements of some other shows and yet they’re introduced and forgotten almost immediately. Such as we are with the hybrid Daleks. In another series, that would have been status quo from now onwards, the old threat regenerated. But in Doctor Who, it’s blown up almost as soon as it’s introduced as a way of underscoring how its main character reacts to danger, his compulsive expectation that he’s going to win (now wearing what amounts to a pair of Joo Janta 200s).

Even when it looks predictable, it really isn’t. Knowing full well that we didn’t really believe that the Daleks would have exterminated Missy and Clara (despite neither of the given actresses appearing in any of the published cast lists), the writer Steven Moffat, for it is he, boldly just sticks them in the teaser, explaining how they got out of that as a way of introducing the aforementioned compulsive expectation. Even then we know, because they tell us, that their part of the episode will be about them returning to the city, but because Missy’s an even more unhinged presence than ever before, “a nightmare dressed like a daydream” if you will, we’ve no idea what that will look like especially since the only reason she’s keeping Clara alive is because of some notion of murdering her further down the line.  Or sewer.

Sure enough what results is a version of the Hulk and Loki incident from The Avengers (Assemble) over and over and over again with Clara, the familiar in this case, on the receiving end. In this strand Moffat’s paying homage to The Mutants (or whatever Doctor Who Magazine’s deciding to call it now) to a large extent, with the companion in much the same position as the Thals in that earlier adventure, but whereas the unpredictable element then was dependent on which of the fair haired ciphers would be eaten by a tentacled something from the deep, here it's whatever horror Missy will subject the companion to. With the chemistry between Michelle and Jenna underpinning the comedy with dread and Hettie MacDonald’s nose for slapstick and editing, it’s all hilarious and scary. And sticky.

And perverse because here’s Missy also inflicting on Clara the shocking truth behind Oswin, the character the actress played in her first appearance in the series. Across the years Daleks have been supposed to be scary but there are few fans who haven’t also wanted to become one, running around in circular columns of printed pvc or a cardboard box with some blu-tac stuck on the end of a pencil applied to the front (depending on the disposable income of the parent). Yet here’s poor Oswin or at least a version of her, having a similar experience turned into a nightmare for a second time. Like Chesterton she finds herself locked inside. Unlike Chesterton that captivity extends to her ability to communicate. Expect an iOS or (ironically) Android translation doodat which does much the same thing at an app store in time for Christmas.

Except, all of this is just the B-story. Weaving throughout is the Doctor’s confrontation with Davros which barring the introductory scenes in which this Twelfth incarnation finally recreates the bitter John Birt end of the 1993 BBC VT Christmas tape (“So Jeannette, by increasing my assistant’s salary to above my own I can then point out to the governors the foolishness of the pay scale AND GUARANTEE FOR MYSELF A HANDSOME PAY RISE”) is a two hander between these old, old foes and friends. Again, ridiculously, we’re in episode two of a twelve episode run and it’s largely about emotional chicanery and referencing forty year old mythology at a time when in earlier series it was about properly introducing some new companion or showing a post-regenerative Doctor’s first adventure.  Thrilling, intellectually satisfying and also shifting that old mythology onwards still.

After his first couple of appearances, the original television run of the show tended to make a point of keeping these two separate for as long as possible which was nonsense because as even Davros knows, the reason Genesis of the Daleks is a classic is because of his lengthy conversation with the Doctor over the price of eggs or the universe (which is roughly the same in the Organic food section of Waitrose). In later years, Big Finish has thankfully noticed and Joseph Lidster’s Terror Firma in particular presents a spiritually similar conversation as appears here with Davros apparently close to death (which is why its absence was felt so much last week presumably due to licensing and BBC charter reasons). For all his whimsy, the Doctor’s always at his best when academically jousting with scientists even if he has the beating hearts of an artist.

As expected, Moffat’s Genesis wave from last week wasn’t just the introduction of some gratuitous continuity point and paid off this week (and how). There was always something slightly nonsensical about Fourth's attitude to those two wires in Genesis since he’d destroyed the Dalek race on numerous occasions already and all he’d be doing was saving his younger incarnations from doing much the same thing as Twelfth does at the conclusion of this story (somewhat referencing Power in the process). He mentions the effects on history (which various chronologies have since suggested happen anyway due to the line about setting back the development of the Daleks) and Russell T Davies has since suggested it was the original front of the Time War. But, yes, it’s a very odd scene in retrospect.

Even more than Journey’s End, the intensity of Julian Bleach’s depiction of Davros is breathtaking, continuing the legacy of Wisher, Gooderson and Molloy with the script providing him the opportunity for offering even greater emotional depths, at least for the screen version (Molloy has been astonishing in the audios too and I don’t want to draw away from his achievement). Davros’s eyes open and suddenly the least expressive element of the old mask is given force. We know now of course that both figures in the conversation are play acting, but anyone who's heard the biographical audio series about Davros (which Moffat studiously doesn’t contradict at all here) will find even greater poignancy in the action (even as we’re wondering if Moffat meant to paraphrase George Lucas or more probably Lawrence Kasdan here).

Up against him is the Doctor bringing his A-game. Yes, he is the Doctor. He really, really is. There were glimmers in the final production block episode of last year and Last Christmas that Capaldi had realised how to play him and Moffat to write him, but finally we have a figure that fulfils the promise of those technicolour eyebrows from The Day of the Doctor, all of the fierce, powerful forces, underpinned by tenderness, diplomacy and yes, compassion even if in the latter case it’s being deployed as a bluff. Peter finally looks like he’s properly enjoying himself and also that his Time Lord skin fits snugly rather than as something he’s been told to wear and is making the most of it. Even without the visual reference at the start of the episode, we can see the Fourth Doctor as a major influence, but Scottish (with a few Tennanty tics).

More than that, I also feel like my hero’s returned, which as anyone who held my hand through the dark times last year will know is huge. This man simply doesn’t feel like the cruel impostor who wandered through series eight, though I say that cautiously given that he only has a conversation with about half a dozen people across this entire story and none of them are strangers. We’ll see what happens next week. But when he smiles here, it’s with the gleeful, comfortable warmth of Tom or the other Peter or Matt rather than because his teeth want a divorce from his gums and can’t seem to find a decent solicitor. It’s been argued that what we saw last year was the mania of a post-regenerative cycle stretched across twelve episodes, but without that made plain in the script, it was really hard to take.

Then, just when it looks like everything’s about to resolve itself and the Dalek base is about to tip into its own sewers, there’s the supremely odd scene of Missy trying to convince the Doctor to murder Clara. Designed mainly to give the Time Lady a point in the story which isn’t that she’s not River Song, it’s the Doctor Who equivalent of the repeated fake out which causes the surviving cast of the various Scream sequels to have weapons to hand when the current wearer of Ghost Face looks like he’s already checked out. We know he won’t do it and on the surface it seems like the kind of slightly bland confrontation designed to ramp up some false tension that often ruins a good story.

But as has been the case in the rest of the story, it has a purpose: to return the Doctor to the moment of the cliff-hanger in the previous episode, the reasons for which are self-explanatory. Nevertheless there are staggering implications which are somewhat glossed over and are connected to the end scene of Listen. How is the TARDIS able to visit these points in time now, old Skaro and old Gallifrey, given the events of the Time War and why is he not asking that question? When young Davros was revealed last week, I thought half of the surprise was that the Doctor could even be in that space let alone be speaking to a pint-sized version of his arch enemy. Moffat’s said that all these stories will be linked in some way. Perhaps he’ll return to this element later in the year.

The upshot of all this is that I’m really excited about the next ten episodes plus Christmas special which is really good news since the last thing you want is to dread watching the next episode of what purports to be a favourite television series (as anyone who sat through s6 of Buffy and s5 of The West Wing will tell you). Moffat seems to be enjoying himself again, evidenced by the teaser and at the other end of the episode Missy’s first meeting with Davros after all these years. We’ll see how that transmits through the other writers, if this is a genuine new direction for the series or an aberration.  Nevertheless, for now, Doctor Who’s back to being the thing it should always be, ridiculous, utterly ridiculous. But in a good way.

September 26, 2015

The solar system at 1 kilometer per pixel: Can you identify these worlds? by The Planetary Society

A look at the surfaces of 18 worlds in our solar system, all at the same scale.

LightSail Gets Backup Burn Wire for 2016 Mission by The Planetary Society

LightSail's burn wire, the mission-critical component responsible for releasing the spacecraft's solar panels, will get a backup ahead of next year's solar sailing mission.

September 25, 2015

Doctor Who Speculation. by Feeling Listless

TV I've just tweeted the following but I'm putting a version of it here for posterity.

Jenna Coleman's not listed in the cast for episode two but Clara is mentioned in the official synopsis for episode three.

The speculation:

Clara really has been exterminated. She's gone.

But before the next adventure, either at the end of tomorrow night or the beginning of the third episode, the Doctor breaks into his own chronology and picks up a Clara from earlier in her timestream, before the version we saw at the beginning of The Magician's Apprentice and they travel the universe in the classical style, usual sorts of adventures, the Doctor knowing her fate.

Then at the end of the final episode, he drops her back on Earth, knowing it has to end some time, and knowing her fate.  Which she doesn't.

Or the final episode will be about him trying to get around the laws of time so that she can live. Something like that.

Yes, it's very reminiscent of at least three things Moffat's done before but if he and Doctor Who are capable of anything it's re-using old ideas that work.

Updated 27/9/2015  Well, I got that wrong.

Lose yourself in this high-resolution portrait of Pluto by The Planetary Society

Enlarge this image to its full 8000-pixel-square glory and lose yourself in it.

Mega Hemorrhaging Total Lunar Eclipse Sept 27-28, 2015 by The Planetary Society

There will be a spectacular total lunar eclipse on the night of Sept. 27-28, 2015, a newly dubbed Mega Hemorrhaging Eclipse. Here is info on what lunar eclipses are and how to observe the eclipse.

Towards a Jupiter Weather Forecast by The Planetary Society

Trying to keep track of the ever-changing face of Jupiter is a pretty big challenge—it's a dynamic world that can fascinate and surprise every time we turn our telescopes towards it.

Elizabeth Wurtzel on the BRCA mutation. by Feeling Listless

Health Something I had absolutely no idea about. From the NYT:

"The BRCA mutation entered the Jewish community in Poland some 500 years ago, and because the Jews of Eastern Europe lived in isolated communities, they incubated it among themselves. Entire families of women were wiped out by breast cancer, and no one knew why as they buried their dead.

"Even though the 14 million Jews of the world today have scattered and intermarried, the BRCA mutation still disproportionately affects Ashkenazi Jews."

September 24, 2015

When Stars Align by Astrobites

Title: Lens Masses and Distances from Microlens Parallax and Flux
Authors: J. C. Yee
First Author’s Institution: Harvard-Smithsonian Center for Astrophysics, Cambridge, MA
Status: Submitted to The Astrophysical Journal Letters



When stars were divine, and their journeys across the heavens foretold the events to unfold on our humble terrestrial sphere, the alignments of stars were studiously marked.  They signaled the rise of new dynasties, a lucky windfall, ill-fated love.  With the passage of a few millennia (and the realization that the wandering stars were planets), our modern sensibilities have been honed to instinctually interpret the apparent crossing of stellar paths as just a happy but natural coincidence with no deeper significance.  But perhaps mistakenly so.  For when the stars align, you just might catch a glimpse of things otherwise invisible:  binary stars in wide orbits, isolated black hole hermits, or the abandoned (or unruly) free-floating planet far from the star around which it was born.

These invisible wanderers are quite literally brought to light through a unique sequence of events that occurs when two celestial bodies align.  The force that orchestrates the event?  Gravity.  Black holes have gotten much fame for their gravitational brawn, which grants them the abilities to warp space and time and to bend light.  Such powers, however, are actually not limited to black holes alone—they’re bestowed on anything with mass.  Your everyday celestial body—say, a star or planet—can do precisely the same.  The share of the limelight that black holes have been given in this regard is fairly earned, however, as the ease with which you can see the distortions caused by a massive object scales with its mass.  Such objects can act as a lens by virtue of their spheroidal shapes, focusing the stray light beams passing from behind into a distorted image.  This process is known as gravitational lensing.

For objects as small as stars, you can’t see these images.  The images would be tiny—about a million times smaller than the angular diameter of the Moon—hence the name of this class of gravitational lensing events, microlensing.  But these minuscule images can have a disproportionately large detectable effect.  When a star crosses paths with another, you’d observe the background star to brighten drastically—as much as a factor of 1000!—then dim.  The two stars don’t have to pass directly in front of each other, but the closer they do, the more the light from the star behind (the “source” star) will be focused. Maximal brightening occurs when the source and the lens are at exactly the same position, a special point called the caustic. Over weeks to months, a distant observer would note a single brightening, then dimming of the star.  If the lens star had any companions—a fairly likely scenario, as stars often come in pairs, and most (if not all!) are believed to have planets—its caustic can morph from a point into a series of closed curves (see Figure 1). If the source star approached or crossed these curves, we’d observe additional brief spikes in light. Depending on how the mass of the companion compares to the lens star, these spikes can be as short as a day (for low mass companions such as planets)—a real challenge for planet hunters searching for microlensed systems.
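
For a single, isolated lens, the shape of that brightening is given by the standard point-lens magnification formula A(u) = (u^2 + 2) / (u * sqrt(u^2 + 4)), where u is the lens-source separation in units of the Einstein radius. Here is a minimal sketch (my own illustration, not taken from the paper, with made-up event parameters) of what such a light curve looks like:

    import numpy as np

    def point_lens_magnification(t, t0, tE, u0):
        """Standard single-lens magnification versus time.
        t0: time of closest approach, tE: Einstein crossing time (days),
        u0: impact parameter in units of the angular Einstein radius."""
        u = np.sqrt(u0**2 + ((t - t0) / tE)**2)
        return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

    # Illustrative numbers only: a 40-day event with a very close approach.
    t = np.linspace(-100, 100, 1001)              # days relative to the peak
    A = point_lens_magnification(t, t0=0.0, tE=40.0, u0=0.01)
    print(f"peak magnification: {A.max():.0f}x")  # u0 = 0.01 gives roughly 100x

A binary lens replaces this single smooth peak with the caustic-crossing spikes seen in Figure 2, but the single-lens case already shows why an event plays out over weeks to months.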


Figure 1. The geometry of a microlensing event.  The path of the background star (the “source” of light) is shown by the thin gray curve; the arrow shows the direction it moved along this line.  The foreground lensing object is a binary system, likely a brown dwarf (M1) with a planet (M2).  The background is shaded in different shades of gray to show how much the binary could cause the background star to brighten (see Figure 2 for what was observed).  The dark black curves denote the “caustics” of the binary lens: when the background star crosses a caustic, it momentarily becomes infinitely bright if the background star were a point (which is unrealistic—we know stars have finite sizes!).  Figure from Han et al. 2013.

Figure 2. A microlensing lightcurve.  The lightcurve (brightness over time, here in days) observed for the system shown in Figure 1.  The two peaks occur when the background star crosses the caustic (which, as you can see from Figure 1, occurs twice).  Figure from Han et al. 2013.






Microlensing events, however, lack one piece of information prized by astronomers—masses.  The lens mass affects how long a microlensing event lasts.  The duration of a microlensing event is easy to measure, but it also depends on three other things: how far away the lens is, how far the source is, and how fast they’re moving relative to each other.  Thus in order to derive the mass of the lens, we need to determine the other three.  The distance of the source is easy—typical sources are in the Galactic bulge, the concentration of stars at the center of our galaxy, which is a well known distance away (about 8 kpc).
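
To make that dependence concrete, here is a back-of-the-envelope sketch (mine, not the paper's, with purely illustrative numbers) of the Einstein crossing time, using theta_E = sqrt(kappa * M * pi_rel), where kappa = 4G/(c^2 au) is about 8.14 milliarcseconds per solar mass and pi_rel is the relative lens-source parallax:

    import numpy as np

    KAPPA = 8.144  # mas per solar mass; kappa = 4G / (c^2 * au)

    def einstein_timescale_days(M_lens, D_lens_kpc, D_source_kpc, mu_rel_mas_per_yr):
        """Einstein-radius crossing time for a point lens.
        M_lens in solar masses, distances in kpc, relative proper motion in mas/yr."""
        pi_rel = 1.0 / D_lens_kpc - 1.0 / D_source_kpc   # relative parallax, mas
        theta_E = np.sqrt(KAPPA * M_lens * pi_rel)       # angular Einstein radius, mas
        return 365.25 * theta_E / mu_rel_mas_per_yr

    # Illustrative values: a 0.5 solar-mass lens halfway to a bulge source at 8 kpc.
    print(einstein_timescale_days(0.5, D_lens_kpc=4.0, D_source_kpc=8.0,
                                  mu_rel_mas_per_yr=5.0))  # roughly 50 days

A heavier lens, a nearer lens, or a slower relative motion all stretch the event in the same way, which is exactly why the duration alone cannot pin down the mass.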

The author of today’s paper suggests a two-step process to determine the remaining three unknowns.  The first step is to make two microlensing observations of the same event, but at different places (and thus different angles).  Such a “microlens parallax” measurement—which has just recently become possible to obtain for many microlens events due to a new campaign to search for space-based microlensing events with the Spitzer Space Telescope—allows you to reduce the unknowns to the lens mass and distance.  This leaves you with the classic problem of having a single equation with two unknowns, for which there is an infinity of permitted combinations.


Figure 3. Disentangling the mass of the lensing star.  If you can observe a microlensing event from two different angles, you can derive a mass/magnitude-distance relation (black; the dashed line denotes the uncertainty).  Measuring the flux from the lensing star also produces a magnitude-distance relation (magenta).  The place where the two lines cross gives the mass and distance of the lens.  In the special case in which you have a binary lens, the size of the source star affects how bright it gets, which allows you to derive another mass-distance relation.  Figure from today’s paper.

The final key to the puzzle?  The flux from the lens, if it’s bright enough.  If the lens is a star, its flux tells us how far away it is, provided we know how luminous it is.  Since we don’t usually know the luminosity of a given object outright, this yields a magnitude-distance relationship.  Based on our understanding of stars, the mass-distance relationship obtained from a microlens parallax measurement can be converted into a second (and very different!) magnitude-distance relationship (see Figure 3).  Whatever magnitude-distance combination is permitted by both relationships gives you the distance to the lens—which finally allows you to solve for the mass of the lens.
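
As a toy numerical version of that final step (my own illustration, not the paper's analysis; the mass-luminosity relation and the "measured" values below are invented), the measured microlens parallax pi_E gives one mass-distance curve via M = pi_rel / (kappa * pi_E^2), the lens brightness gives another via a crude main-sequence luminosity scaling plus the distance modulus, and the crossing point of the two yields the lens mass and distance:

    import numpy as np

    KAPPA = 8.144        # mas per solar mass
    D_SOURCE_KPC = 8.0   # typical bulge source distance quoted in the text

    def mass_from_parallax(D_lens_kpc, pi_E):
        """Mass-distance relation implied by a measured microlens parallax pi_E."""
        pi_rel = 1.0 / D_lens_kpc - 1.0 / D_SOURCE_KPC   # mas
        return pi_rel / (KAPPA * pi_E**2)                # solar masses

    def apparent_mag(mass, D_lens_kpc):
        """Toy magnitude-distance relation: crude L ~ M^4 main-sequence scaling
        anchored to the Sun (M_V = 4.83), ignoring extinction entirely."""
        abs_mag = 4.83 - 10.0 * np.log10(mass)               # -2.5 * log10(M^4)
        return abs_mag + 5.0 * np.log10(D_lens_kpc) + 10.0   # plus distance modulus

    # Hypothetical measurements: pi_E = 0.3 and a lens flux equivalent to magnitude 19.
    pi_E, m_observed = 0.3, 19.0
    D = np.linspace(1.0, 7.5, 2000)                          # trial lens distances, kpc
    M = mass_from_parallax(D, pi_E)
    best = np.argmin(np.abs(apparent_mag(M, D) - m_observed))
    print(f"lens solution: ~{M[best]:.2f} solar masses at ~{D[best]:.1f} kpc")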

And there you have it.  It may seem like a long and difficult process to obtain the mass of a fleetingly visible object, but these mass measurements will help us to understand planets and stars of our galaxy that are currently unreachable by any other means.  With additional Spitzer microlensing campaigns—the first of which is already returning a treasure trove of results—as well as the revamped Kepler mission, K2, and the upcoming mission WFIRST, space-based microlensing surveys may become routine.  It’s an exciting time for microlensing—many new discoveries await!




Cover image:  A map of the amount of brightening you’d see if a distant star passed behind one of the stars in an equal-mass wide binary.  The black curves denote the lens’s caustic—if the background star crossed this curve, it would momentarily become infinitely bright if it were a point source.  The path of the distant background (source) star would appear as a line across the image.  You can predict the lightcurve of the microlensing event by plotting the brightening along the source star’s path.  Figure from Han & Gaudi 2008.


September 23, 2015

Peanutized. by Feeling Listless

Film The new Peanuts film has a website which offers you the chance to produce a version of yourself in the new threedification of Charles Schulz's style. Find me above, in all my man at Asda, t-shirt and jeans glory. As with the Paddington film, I'm expecting nothing but good things from this film. The trailer feels in keeping with the original comics and the reliance on that music in the publicity shows that the producers are clearly very clued in on what makes this franchise unique.

Direct Detection of Exomoons by Astrobites


Figure 1. A synthesized spectroastrometric signal, with the left panel representing the bluer wavelength dominated by the planet and the right panel representing the redder wavelength dominated by the moon within an absorption band where the planet’s spectrum is dark. The offset of the right image indicates that the flux is being given off by a probable moon. Figure 1 in the paper.


  • Title: The Center of Light: Spectroastrometric Detection of Exomoons
  • Authors: Eric Agol, Tiffany Jansen, Brianna Lacy, Tyler D. Robinson, Victoria Meadows
  • First Author’s Institution: University of Washington Astrobiology Program & NASA Astrobiology Institute’s Virtual Planetary Laboratory
  • Paper Status: Submitted to ApJ


There are 182 moons (and counting) in our Solar System that come in all shapes, sizes, and compositions. In fact, there are more moons in our Solar System that are known to harbor liquid water than planets. Jupiter’s moon Europa and Saturn’s moon Enceladus (thanks to this recent study) are believed to harbor global liquid water oceans beneath their icy crusts, possibly offering us the best opportunities for finding life elsewhere in the Solar System. Though we are far from adequately exploring these moons, astronomers are already looking at ways to study moons around planets outside of our Solar System, known as exomoons. Many factors go into whether an exomoon may be habitable (see this astrobite), but the abundance of gas giant planets discovered orbiting in the habitable zone of their stars may mean that there are many rocky moons with temperate climates, and planet-moon interactions may lead to heating of exomoons that do not occupy the traditional habitable zone.

Though none have yet been confirmed, attempts to indirectly detect these extrasolar satellites around confirmed exoplanets are underway; most techniques for doing this involve finding perturbations to the exoplanet’s transit signal and are discussed in this and this astrobite post. Today’s paper instead looks at the future of exomoon detection by investigating how upcoming telescopes could directly detect exomoons and subsequently uncover a wealth of knowledge from these systems. The authors consider a technique dubbed “spectroastrometry”, which as you may have guessed combines spectral analysis with astrometry, or looking at the precise movement of celestial bodies. The key to this idea is that the exomoon outshines the exoplanet at certain wavelengths of light, and likewise the exoplanet outshines the exomoon at certain wavelengths. If a telescope were large enough, had fine enough spectral resolution, and sufficiently blocked out the light from the system’s host star with a coronagraph or starshade, the moon-planet system could be identified and separately resolved by looking at the system at different wavelengths of light, as demonstrated in figure 1.
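
The observable here is simply the flux-weighted centre of light at each wavelength. A minimal sketch of the idea (my own, not the paper's model, with made-up fluxes and separation) shows how far the photocentre jumps between a planet-dominated band and a moon-dominated absorption band:

    def photocentre_offset_mas(separation_mas, f_planet, f_moon):
        """Flux-weighted centre of light along the planet-moon axis,
        measured from the planet, for a given sky separation in mas."""
        return separation_mas * f_moon / (f_planet + f_moon)

    sep = 2.0  # hypothetical planet-moon separation on the sky, milliarcseconds

    # In a band where the planet supplies ~90% of the light...
    x_planet_band = photocentre_offset_mas(sep, f_planet=0.90, f_moon=0.10)
    # ...versus a deep absorption band where the moon supplies nearly all of it.
    x_moon_band = photocentre_offset_mas(sep, f_planet=0.002, f_moon=0.998)

    print(f"photocentre shift between bands: {x_moon_band - x_planet_band:.2f} mas")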


Figure 2. The flux of the Earth-Moon analog system orbiting Alpha Centauri as a function of wavelength in microns. The top panel shows the flux density of the Earth (blue) and Moon (gray) analogs of the system. The bottom panel shows the percent of the flux density that is due to the Moon as a function of wavelength. Note the absorption features caused by the presence of oxygen, water, and carbon dioxide in the atmosphere of the Earth. The Moon’s spectrum is governed by reflected Solar and emitted thermal radiation. Figure 2 in the paper.


The detection of exomoons using spectroastrometry was analyzed in this study using two fiducial examples of exoplanet-exomoon systems in the habitable zone of a star: an Earth-Moon analog orbiting Alpha Centauri (1.34 parsecs away), and a Jovian planet with an Earth-sized moon orbiting a Sun-like star 10 parsecs away. Part of the spectrum of the Earth-Moon analog is shown in figure 2. Clearly, there are parts of the spectrum relatively close together in wavelength where the moon’s and planet’s individual fluxes dominate the total system’s flux, providing the best target for spectroastrometric analysis. In the infrared, the wavelength at which the Moon most dominates in the Earth-Moon analog is the water band at ~2.7 microns (see figure 2), where water in the atmosphere of the Earth absorbs the most light. At this wavelength, the Moon accounts for 99.8% of the total flux of the Earth-Moon system. Comparing the sky position of the system in this band and at a slightly shorter wavelength (where the planet dominates the flux) would allow the spatial separation of the planet and moon, such as in figure 1.

No current telescope can spatially resolve these example systems, and planned “next generation” telescopes like JWST (6.5-m primary mirror) will also have insufficient angular resolution. As an example, the spatial resolution needed to separate the Earth-Moon analog orbiting Alpha Centauri would be ~2 milliarcseconds in the infrared, whereas the Hubble (2.4-m primary mirror) would only be able to resolve objects with a separation of ~100 milliarcseconds at these wavelengths (and is not equipped with a coronagraph to block out the light of the host star, anyway). However, these abilities are expected to be within the capabilities of proposed second-generation telescopes like the 12-m High Definition Space Telescope. Figure 3 illustrates the size of an infrared space telescope (equipped with either a coronagraph or starshade to block out the light of the host star) that would be needed to detect an Earth-sized moon orbiting a warm Jupiter using spectroastrometry in the Solar neighborhood. One of the most exciting prospects of this method is that it will let us do more than just detect the presence of an exomoon; if we can spatially resolve the moon we may be able to look at the spectral features of its atmosphere to gauge if the moon is habitable or contains biosignatures. Also, the large exomoons that we will be able to characterize with spectroastrometry will be on our galactic doorstep, meaning that a trip to Pandora may be closer than we think.
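
Those numbers are easy to sanity-check. The sketch below (mine, not the paper's noise model) computes the Earth-Moon angular separation as seen from 1.34 parsecs and compares it with a rough photocentre precision, using the generic centroiding rule that the achievable precision is roughly the width of the point spread function divided by the signal-to-noise ratio:

    import numpy as np

    PC_IN_KM = 3.086e13
    MAS_PER_RAD = 206_264_806.0

    def separation_mas(sep_km, distance_pc):
        """Angular separation of two bodies on the sky, in milliarcseconds."""
        return sep_km / (distance_pc * PC_IN_KM) * MAS_PER_RAD

    def centroid_precision_mas(wavelength_m, mirror_m, snr):
        """Very rough photocentre precision: PSF sigma (FWHM / 2.355) over SNR.
        A generic centroiding scaling, not the paper's full calculation."""
        fwhm = 1.22 * wavelength_m / mirror_m * MAS_PER_RAD
        return fwhm / 2.355 / snr

    # The Earth-Moon photocentre can move by up to their separation, which seen
    # from Alpha Centauri at 1.34 pc is about 1.9 mas, matching the figure above.
    signal = separation_mas(384_400, 1.34)

    # Could a 12-m aperture at the 2.7-micron water band measure that at SNR ~ 100?
    noise = centroid_precision_mas(2.7e-6, 12.0, snr=100)
    print(f"signal ~ {signal:.1f} mas, centroid precision ~ {noise:.2f} mas")

Spectroastrometry does not need to fully resolve the pair; it only needs to measure that photocentre wobble, which is why a 12-m class aperture with a coronagraph or starshade starts to look feasible.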


Figure 3. Minimum telescope diameters needed for an optimal space-based telescope to detect an Earth-like exomoon orbiting a warm Jupiter between 0.45 and 0.89 microns with 5-sigma confidence using spectroastrometry. The nearby stars (less than 10 parsecs away) are plotted at their given distance. The legend indicates the colors that correspond to stellar type and the shapes that correspond to whether exoplanets have been detected around these stars. Figure 12 in the paper.


Checking in on Uranus and Neptune, September 2015 edition by The Planetary Society

There are no spacecraft at Uranus or Neptune, and there haven't been for 30 and 25 years, respectively. So we depend on Earth-based astronomers to monitor them, including Damian Peach.

Xtronaut – A New Approach to Education and Public Outreach by The Planetary Society

Historically, NASA missions set aside a portion of their budgets for education and public outreach, or EPO. However, the OSIRIS-REx EPO budget got deleted in 2013 as part of a broader federal policy change. Dante Lauretta decided to make a run at a private company to recover the lost OSIRIS-REx EPO program – and Xtronaut was born!

September 22, 2015

All that is old is new again (heavy politics dance remix) by Charlie Stross

This week's amusing (albeit arguably libellous) allegations may be grounds for mirth, but I'd caution anyone who actually believes the Prime Minister stuck his todger in a porker to first remember the words of the immortal Hunter S. Thompson, from Fear and Loathing on the Campaign Trail '72:

This is one of the oldest and most effective tricks in politics. Every hack in the business has used it in times of trouble, and it has even been elevated to the level of political mythology in a story about one of Lyndon Johnson's early campaigns in Texas. The race was close and Johnson was getting worried. Finally he told his campaign manager to start a massive rumor campaign about his opponent's life-long habit of enjoying carnal knowledge of his own barnyard sows.
"Christ, we can't get a way calling him a pig-fucker," the campaign manager protested. "Nobody's going to believe a thing like that."
"I know," Johnson replied. "But let's make the sonofabitch deny it."

I cannot speak to the nature of the bonding rituals of elitist Oxford University drinking societies, but I am fairly sure that Lord Ashcroft, as former treasurer and deputy chair of the Conservative party and being of an age to remember him, has read LBJ's play-book.

What I'm more concerned about is the question of who is supposed to replace Cameron in time to do damage control in the aftermath of the coming fiscal crisis. Theresa May, perhaps?

Financing Female-Led Films. by Feeling Listless

Film Find embedded above a TIFF Industry discussion about the financing of female-led films, encompassing what's gone wrong in the past and with an idea for the kinds of action which could and should be taken in the future. The too long to watch version is that because women have always been in the minority on boards and in decision making roles, even if discrimination in creative circles isn't actively intended by men it can happen unconsciously. The action plan is essentially to create goals. The best contributor is Anna Serner, the CEO of the Swedish Film Institute where 50% of the productions they finance are by women directors, and as she notes it's all very well women filmmakers getting together in support networks and talking but it's then up to them to take their talents up the hierarchy.

Frankly, and let's call them what they are, man films made by men about men for men. To unfairly select something at random, I watched the exceedingly average Out of the Furnace recently which stars Christian Bale as a mill worker who's trying to protect his younger wayward war veteran brother Casey Affleck. There's more to it than that, but suffice to say there's some punching and shooting and shouting and amongst a cast which includes the likes of Woody Harrelson and Willem Dafoe there's room for just one female character, played as she so often is by Zoe Saldana. Now the potential argument is that the film's portraying a world in which there are few women and in which these men don't much interact with women, but my counter-argument would be to ask why we have to see that again?

Despite only appearing in about three scenes, it's soon pretty clear that Saldana's character's story arc is a lot more original and interesting than the generically grim nonsense happening elsewhere but the filmmakers, all men, simply aren't interested in expanding her role beyond girlfriend who leaves and hooks up with someone else in order to cause the protagonist pain. To have done so would have led to a completely different film, but I suppose my point is we've otherwise seen this film. We haven't seen the version of this film which is about her character. Not that there's any reason, beyond the usual lazy gender stereotypes, why this story couldn't be told with women in the lead roles either, and that in and of itself would have created the necessary variance.

But as the discussion touches upon, it's also about recognising that women make up 50% of the world's population and that not having more than one woman in a lead role both behind and in front of the camera is morally wrong. Flicking through the winter preview section of this month's Empire, all I see are pages and pages of films with male protagonists, often with the character's name in the title. Apart from The Hunger Games (and possibly Star Wars), when women are visible it's as part of an ensemble and even then often as a romantic interest or daughter and the vast majority of this work is created by men. But when these things are made and are successful, as the panel agrees, it's treated as a fluke rather than something to be turned into a movement, and I'd add that if they're a failure it stops that kind of film being made again, as if that's what's important.

My Favourite Film of 1980. by Feeling Listless

TV Surprise. One of the benefits of launching into this list without creating any clear definitions as to what constitutes a "film" is that I can essentially make it up as I go along. To suggest a "film" can only be a "film" if it's been theatrically released denies status to a number of clearly very worthy films which have only been broadcast on television or released in a home format.

Perhaps we could look to the intent of the artist or production company, but again now that theatre is being broadcast in cinemas, films projected in theatres and most of it on an iProduct, I'm going to allow myself a certain flexibility. The IMDb lists these BBC Shakespeares as "TV Movies", which is good enough for me - and both the game shows Pointless and In It To Win It on several occasions.

Does the selection of a video taped studio production require this sort of justification? Probably. But of all the films released in 1980 which due to the list rules I could choose (no repetitions of director or franchise) there isn't anything theatrical which I've actually seen which quite matches up to the esteem I have for the BBC Shakespeare adaptation of Hamlet.

It's quite nice to be able to cross post (original post here) something in from one of my other long term projects, a rare example of something on this list which you can watch legally for free.  This means I don't feel so guilty for having to ignore the Branagh version in the mid-90s part of the list.  It seems quite fitting to have Lalla up there in the week that Doctor Who began again.

For the uninitiated, all three of you, The Hamlet Weblog was and is an effort to watch as many different versions of Hamlet as I can.  This is a sporadic endeavour largely because there's a balance between wanting to see the play but not wanting to see it too much so although there was a period in the past decade when I saw plenty of productions, it's slowed to a trickle.

As is so often the case with these projects, there's the process of reviewing the production afterwards for posterity and there are only so many interesting ways you can try to explain why a director might have chosen to include the scenes featuring Fortinbras.  Or not.

Perhaps I should mention I disagree with my slightly younger self on a few points (Lalla's Ophelia particularly), so greet what follows with all the caution of Horatio when he's visited by the ghost at the start of the play.

Hamlet, Prince of Denmark (TV Movie)

I knew when I began this process that there would be certain 'tentpole' productions, so renowned that I'd want to save them and relish them. The BBC Shakespeare Hamlet is one such presentation with its central performance from Derek Jacobi, Patrick Stewart's Claudius and Claire Bloom's Gertrude. But for this fanboy there's an extra level of interest because glancing through the cast list beforehand it would be quite easy to say 'I can't believe it's not Doctor Who'.

In casting terms that means Geoffrey Beevers who played the Doctor's nemesis The Master during the eighties, Lalla Ward (Ophelia), who famously companioned Tom Baker's Time Lord as Romana (before marrying him briefly in real life) and Jacobi who would later go on to play a version of the Doctor on an audio cd (Deadline), The Master in an animated story for the BBC website (The Scream of the Shalka) and is soon to appear in an episode of the new television series (Utopia).

But a range of actors who filled bit part roles in Hamlet would go on to do the same in Who. Geoffrey Bateman (Guildenstern) played Dymond in The Nightmare of Eden, Emrys James (First Player) was Aukon in State of Decay, Peter Burroughs (Player) was the Jester in The King's Demons, Peter Benson (Second Gravedigger) essayed the role of Bor in Terminus, Stuart Fell (Player) has been a whole vast range of different characters including Alpha Centauri in The Curse of Peladon and Reginald Jessop (Messenger) was typecast as a Servant in a number of episodes.

That connection continues behind the camera as the production is kinetically directed by Rodney Bennett who helmed a range of stories for that series in the same period (The Ark in Space, The Sontaran Experiment and The Masque of Mandragora), the fights were co-ordinated by B.H. Barry (The Mind Robber and Four To Doomsday) and the vision mixed by Shirley Coward (The Tenth Planet and Remembrance of the Daleks). The music too is supplied by that series' main composer during the Baker era, Dudley Simpson and indeed one of the few distractions is when Simpson's familiar brass section clashes in between acts or scenes, so redolent of a cliff hanger or the attack of a Wyrrrn.

This is a wonderful production. Tied though it is to the BBC drama department's idiom of the time, all studio bound, multi-camera setups shot on video, it straddles the divide between pure theatre and television and is one of the jewels in the BBC Shakespeare series, so traditional in many ways but radical in others. Perhaps acknowledging the limitations of the medium, Bennett favours performances over setting, a decision that pays dividends.

Series producer Cedric Messina's hope was that the big roles should be played by renowned actors and Jacobi certainly fitted the bill, having seen him in a famous 1977 West End production (more on which at a later date). At the planned time of taping, Jacobi was contracted to play Richard II on stage, so Messina waited until he would be free and thank goodness he did -- this recording captures one of the best characterisations of the role I've ever seen.

I don't think I've seen Jacobi give a poor performance -- even in Underworld: Evolution he manages to keep his dignity. What makes this so special is that the actor absolutely understands the range of emotions that Hamlet is dragged through and is able to successfully layer in the sheer frustration of not being able to carry out his dead father's wishes either because of the situation or his own fallibilities. Watch his face during The Mousetrap as he realises that his uncle hasn't reacted to the mime of the death of Gonzago and that he'll actually have to talk him through the deed, hammering home the message that he knows of the murder.

He's so very vulnerable too, slightly nervous, never entirely sure of his actions even when he's addressing the audience during soliloquies; rather like other fourth wall breakers in such films as High Fidelity, Alfie or Ferris Bueller's Day Off, there's a bond of trust between him and us as he imparts his feelings -- a connection which isn't granted to Claudius when he too sits alone and faces the emotional consequences of his actions (Stewart looks away from the lens even in close up). Only towards the end does Hamlet's loyalty really shift to his good friend Horatio, loyally played by Robert Swann with just a hint of homo-erotic tension.

It's also a very droll turn as Jacobi mines the seam of black comedy that Shakespeare has threaded through the dialogue that I've seen so few other actors take advantage of. Some moments are laugh out loud funny, such as his treatment of Rosencrantz and Guildenstern, here portrayed as nothing more than acquaintances suddenly dropping in unannounced rather like that email you sometimes get from someone you hardly knew at school who's signed up to Friends Reunited.

Some of this is made possible because of the choice to use a near complete text, allowing the actors the space to provide a more complete psychological arc for their characters. In this reading Claudius becomes a full blooded antagonist with almost as much screen time as Hamlet, Stewart relishing the opportunity to show both sides of the character, the public statesman who is privately guilt ridden. That tension is particularly clear in his dealings with a grief stricken Laertes (David Robb), nervously turning parental and sibling loss to his advantage.

There's certainly a grey area as to who the audience should be sympathising with. Although Claudius's murder of Hamlet Snr is unconscionable there's an inference that he took the action for the good of the country to help the peace process with Fortinbras who to my understanding lost part of his kingdom in a previous war. To an extent it's almost as though Hamlet isn't seeing the bigger picture, putting his own revenge plot ahead of the country's needs, Denmark's strength. This production makes plain that if Hamlet Snr hadn't visited his son the stable status quo would have continued -- it's Hamlet Jnr's plans which lead to the death of a family and the downfall of the kingdom. Comedy, tragedy, irony.

It's no pleasure though to report that I don't think Lalla Ward's Ophelia really works. Perhaps it's because her noble Romana in Doctor Who is so effective that here she seems defeated by the text, never once coming across as really being Laertes's sister or in love with Hamlet. Only later, during the descent into madness does the performance gain power but even then it's a forced mess of histrionics. Claire Bloom's Gertrude, by contrast, exudes nobility and a surprising eroticism (frankly she's a babe). Throughout there's an implication that her marriage with old Hamlet was a rather boring one and her shift to his brother not too difficult a choice and indeed that the bond with her son was broken long before his father's death.

As Susan Willis notes in her wonderful book, The BBC Shakespeare Plays: Making The Canon, from an initial push to produce backdrops that attempt to create a realistic period setting for each of the plays, as the productions drifted onward, taste shifted from representation to abstract with Don Homfray's designs for Hamlet being one of the first experiments. The exteriors then occur in a large empty studio, a grey void ringed with flooring at a slight incline, filled with mist for the battlement scenes, the sounds of the sea for the departing of Laertes and soil and a grave for Ophelia's funeral (which includes the sight of poor Lalla wrapped in drapes lying actually in the grave with mud dropped on top of her).

The interiors are even more experimental. Partitions have been painted with columns and vistas, bookshelves and libraries, paintings and wardrobes but they're generally used without regard for what's on them. During the scene when Claudius and Polonius spy on Hamlet's disposition with Ophelia they hide behind a wall with a landscape painted on to imply the view from the palace and Hamlet opens up the wall to see if he can find them hiding. It's the representation of a palace without regard for its geography which is by turns confusing and exhilarating and could be interpreted as an example of Hamlet losing his grip on reality, of the details of his surroundings losing their importance in comparison to his cause.

Having bought the box set, I'm slowly working my way through all of these BBC Shakespeare 'performances', geekily in production order minus the histories which I'm going to watch together at the end. Some have been better than others but I wouldn't describe any of them as awful. Inevitably I've loved the Measure for Measure and the As You Like It is far from the disaster its reputation suggests (in which you can see a young Helen Mirren and an old David Prowse acting in the same scene). If the Romeo and Juliet shows signs of early nerves, Twelfth Night is a lovely romp and The Tempest has real power. But I think this Hamlet almost towers above them all and will be hard to beat.

Kickstarter is Now a Public Benefit Corporation by Albert Wenger

I am super excited that Kickstarter is no longer an INC but rather now a PBC, a Public Benefit Corporation. This means the company has changed its Delaware charter to include its mission of supporting creators as pari passu with creating value for shareholders. I would encourage you all to read the new Kickstarter charter which lays out that mission and also this piece in the New York Times.

Benefit Corporations are a topic I care a great deal about and if you have been a long time reader of Continuations you know the history. I first found out about the movement in October 2011 and wrote my first blog post about it back then. I wrote a follow up post a year later in 2012 where I explicitly mention Kickstarter as a candidate for this.

I became involved in the effort to convince Delaware, where almost all venture backed companies are incorporated, to adopt a Public Benefit Corporation statute. That involved a trip down to Delaware and meeting with the corporate bar there as well as legislators and Leo Strine, the then Chancellor of the Delaware Chancery court.

Finally, I was thrilled to speak as then Governor Markell signed the bill into law. All the credit though should go to the fine folks at B Lab, who have worked tirelessly behind the scenes for many more years before that. They were the ones who got the ball rolling, got this adopted in many other states before Delaware and finally coordinated all the work that went into the Delaware legislation and its continued revision.

There are other venture backed companies that are Public Benefit Corporations such as Alt Schools. But Kickstarter is the first venture backed business that is profitable and has achieved scale to have converted. I look forward to many more venture backed businesses choosing this path!

Kids in VR, a quick anecdotal thought or two by Dan Catt

Modesty in VR

“Oh ignore all that,” I say, clicking past the no children under 13 warning and plunging my daughter into space. She’s in a small spaceship drifting outside the entrance to the nearby space station. Ships glide past impossibly close, engines humming, performing last second twisting and turning docking manoeuvres. Light from two brightly burning distant suns glints off the giant torus, slowly revolving to provide gravity to the agricultural fields within.

“Wow” she says before doing what all three of my kids immediately did.

  1. Reach out for the controls on the ship’s console before them.
  2. Say “Oh, where are my hands”.
  3. Look all the way round behind them, promptly standing up to attempt to walk to the cockpit door behind the pilot’s seat. I gently guide them back each time.

They were in Elite:Dangerous with the Oculus Rift VR helmet and I was using a 2nd monitor to see what they could see. Having kids means you sometimes get to try out tiny experiments; I wanted to see what the reactions of my 12, 8 & 6 year olds would be.

When I got the youngest floating through space, I think I finally found the thing in technology that I knew I was too old to instinctively get.

I’ve been searching for this for a while, the thing that’d make me feel old. Not something I don’t understand but rather that my brain just doesn’t quite do.

I could see she was experiencing it all differently to me, when I put on the VR goggles I think to myself “I’m going into virtual reality now”, she was different…

Over the next hour she explored under the sea with fishes and sunken boats, threw herself off tall buildings, drove a truck (badly) across the UK, lorded over a dungeon spread out before her (which is amazing btw), survived a dinosaur sniffing around and as a giant took huge steps across mountains and lakes.

Isobel and the VR

…she took it all in her stride, that’s when the feeling struck me, the one I can’t quite find the words to explain. But this is as close as I can get.

I call a touch screen a “touch screen”, because it’s a screen you can touch as opposed to a “normal” screen. To a kid under ten a touch screen is a “screen” and a “normal” screen is broken.

A child happily prods, pokes, slides and swooshes between apps on a phone or tablet. Changing from one game to another in VR is still a faff, but even with that friction my youngest was having no trouble with context switching between different realities. Virtual Reality is the “touch screen” that kids will just get, it’ll just be how things work.

At 6 she’s probably right at the upper “always been part of her life” boundary. I can still see it though, she’s growing up knowing what it feels like to fly… sort of… in a VR fashion. Knowing what it’s like to float in space, dive underwater, leap to the top of a castle.

She’s having experiences with weight, scale, location, colour, space and so on that I never had as a kid. She’s going to be multiple reality native, and this one, the “real” one is going to be in some ways broken.

And that’s the part my brain won’t be able to get. No matter how much I want to, no matter how happy I am to put on the VR head set (and I’m very happy to do so), I will never have the experience of always having had the experience of moving through strange lands.

Our kids will grow up knowing what it’s like to fly.

Epilogue - in the game.

The game that had the biggest reaction with all three was Skyrim with the help of VorpX. Experiencing VR is hard to explain, both written down and on video; you have to be in it to really understand the size and effect. The game world is no longer trapped within a rectangle, it's all around you.

I’d made various save games at points in the world the kids often went to, all three play Skyrim to various extents and all three knew Belethor’s General Goods store in Whiterun inside out. That’s where I dropped them into VR first.

Suddenly they weren't looking at the store from outside the world, they were in it. Properly in the store, standing looking at the counter and Belethor fidgeting around on the other side. Turning your head you saw the table, wall hangings, crackling fireplace.

For them it was magical, to be inside the thing they’d seen so often on the screen, transported into the actual game.

I look forward to when I retire and can join them and my grandchildren in some distant world… and blow shit up.

How to Download Weather Satellite Images from Space by The Planetary Society

For less than $50, you can download images from NOAA satellites using your laptop and a small radio antenna.

September 21, 2015

Visual Pleasure and Narrative Cinema. by Feeling Listless

Film Students are slowly beginning to return to university or beginning their courses and at around this time, ten years ago, I began my MA in Screen Studies at the University of Manchester. Setting aside the nostalgia implications, a bit, I've decided to celebrate by creating a series of posts highlighting some of my favourite pieces of film related academia.

We begin with Laura Mulvey's essay Visual Pleasure and Narrative Cinema, a pdf of which is available here.

Mulvey is a feminist theorist who's currently professor of film and media studies at Birkbeck, University of London but for many years worked at the BFI. This seminal essay, published in 1975, took a psychoanalytical approach to the representation of women in cinema as objects of desire, encapsulated in the concept of "the male gaze".

At its most basic level this amounts to a moment in a film when a shot lingers on a female form, then cuts to a man enjoying said form then cuts back, and the notion is that through editing we've been trained to appreciate the woman in a particular way.

Here's a video filled with examples that seems to have been gathered by a student for a class project on just this topic:

At university I was tasked with writing about this in relation to particular films on a couple of occasions and posted the first, about The Breakfast Club, here, focusing on the "Allison reveal" scene, which through modern eyes, as with all minor Pygmalion makeovers, looks utterly wrong.  Andrew should accept her for who she is.

The problem, as Mulvey identified forty years ago, is that it utterly destroys the agency of the female character because it becomes about her appearance rather than her existence as a human being.  It's about what she can do visually for the man rather than her own autonomy.

This is even true when a male character isn't in the scene, since the use of camera angles puts the viewer in that position (with the potential to suggest that a male camera operator, director and cinematographer are also fulfilling this requirement).  Here's another video with plenty of examples of the toe-to-head shot.

Whilst this kind of photography is still prevalent, the trend seems to be that if a film has a solid female protagonist, this kind of shot does not exist. I don't remember seeing it in Jurassic World, for example, or Mad Max: Fury Road.  I don't think Paul Feig uses it much either.

When they do occur, in MARVEL films for example, they tend to be counterbalanced in the opposite direction (Thor), though it's important to note that the implications of the female gaze and also how all of this works within queer theory are also markedly different.

But it's so ingrained in the language of cinema that it hasn't gone away and is still in use, partly because we as viewers have become trained to expect it.  There's an argument that the reason there was a negative reaction from some about Bryce Dallas Howard's character in Jurassic World is because the film didn't (again as far as I can remember) have Chris Pratt ogling her.

Ad-Blocking Points the Way To Decentralizing Power And That’s a Good Thing by Albert Wenger

Robert Reich has an op-ed piece in the NY Times today arguing that “Big Tech Has Become Way Too Powerful” and that we should apply anti-trust regulation. There is little doubt that network effects have made tech’s new incumbents such as Google and Facebook formidable. But applying traditional antitrust approaches is the wrong idea. We want the social benefits of large networks, but we also want them to be somewhat less powerful and put some of that power back in the hands of individuals. The way to do that is to have the equivalent of Ad-Blocking for mobile applications!

There are many people decrying ad-blocking as undermining journalism and dooming the independent web. That’s a deeply pessimistic view for which I want to substitute a positive one.

In the early days the web was full of ad-free content published by individuals. It was individuals who first populated the web with content long before institutions joined in. When the institutions joined, they brought with them their offline business models, including paid subscriptions and of course advertising. Along with the emergence of platforms such as Facebook and Twitter this resulted in a centralization of the web. More and more content was produced either on a platform or by traditional publishers.

But there is something different about the web. Its underlying standard, HTTP, allows for the existence of a user-agent commonly known as a web browser. The beauty of a browser is that it can truly represent its user. If I don’t want to see ads I can instruct my browser to strip them out *before* displaying the content to me. Similarly, I can instruct it to fill forms for me. To keep copies of the data I submit. And so on. It is code that I, the enduser, control.

Ad-blocking is an assertion of power by the enduser and that is a good thing in all respects. Just as a judge recently found that taxi companies have no special right to see their business model protected, neither do ad-supported publishers. And while in the short term this will mean publishers fleeing to apps (more about that in a second), in the long run it will mean more growth for content that is crowdfunded and/or micropayed, freely shareable and published using open formats. Put differently, rather than being the end of the open web, ad-blocking is really the beginning of its renaissance!

And that also points the way towards curtailing the centralizing power of network effects more generally: shift power to the endusers by allowing them to have user agents for mobile apps. I have been calling this for some time the “right to be represented by a bot.” The reason users don’t wield the same power on mobile is that native apps relegate us endusers once again to interacting with services just using our eyes, ears, brain and fingers. No code can execute on our behalf while the centralized providers use hundreds of thousands of servers and millions of lines of code.

How could this be accomplished? Every app talks to API endpoints that the service provider has created. Those same endpoints should be available programmatically. While that’s easily stated it is more difficult to see what an initial regulatory step might look like. My suggestion is to roll back any and all legislation that stands in the way of decompiling the code that runs on my device (any computer I own) and extracting keys from it. If this were legal, then it would be possible for third parties to develop user agents.

Such a mobile user-agent could then once again do things such as strip ads, keep copies of my responses to services, let me participate simultaneously in multiple services (and bridge those services for me), and so on. The way to help endusers is not to have government smash big tech companies, but rather for government to empower individuals to have code that executes on their behalf.

September 20, 2015

More on Subscriptions and Micropayments by Albert Wenger

Two days ago I wrote a post explaining why I am bullish on crowdfunding and skeptical on micropayments as an alternative to funding content, especially journalism. The explanation was a bit dry and abstract, but in the comments I provided a more concrete example that seemed to resonate, so here it is:

Would you pay $2 for a joke you haven’t heard yet? No because you don’t know if it will be funny. What is your incentive to pay after you have heard the joke? It is much diminished.

What about 1 cent? Here is where people make the big mistake: they think that 1 cent is close to paying zero, so why wouldn't I do it? Because my *actual* cost is much, much higher than that – it is the time I will spend listening to the joke in the first place (time I will never get back if the joke is a dud).

That’s why even a comedian as big as Louis CK doesn’t sell single jokes but rather a special. It is a large bundle of jokes and I know that in toto it will be worth my time *and* money.

Now in the earlier post I had also remarked that subscriptions can be a way around this sometimes. For instance, subscriptions seem to be working reasonably well for music. What has to be the case for a subscription to work? Well, getting more technical again, my expected benefit from the subscription has to be higher than my total cost (which once again includes the opportunity cost of spending time).

That explains why subscriptions work reasonably well for music. The cost of listening is relatively low as it is often a background activity (ie. I am listening to music and reading a book or doing email). And the average benefit on music that I like is high. This is an important caveat – if you tried a subscription and gave me random music it probably wouldn’t work very well either.

This is why a subscription for all written content is so hard. Reading is a foreground activity, just like listening to the joke in my example above. Now if I am only reading things that I have a high expectation of liking then a subscription might work – that’s for instance why we have a subscription at home to New York magazine and to the New Yorker.

But what if we tried to extend this to the internet at large, including say random tweets and blog posts? Well the more content you stuff into this subscription the lower you drive its *average* or expected value per unit of time. And eventually it breaks apart and no longer works at all and I will cancel the subscription.

All the “metered micropayment” schemes where you put up, say, $10 per month, which then gets spread out over all the content you read based on usage, are unfortunately just that: a subscription to the entire internet. So unless you make that a mandatory scheme, say by including it in the internet access fee you pay to your ISP, you will wind up with a low participation rate in the metering scheme.

That, by the way, is not a reason to just give up on it. There may, over time, be a reasonably high voluntary participation *if* you can use the blockchain to converge multiple such schemes into a single payment for content creators. But I am still skeptical that it will amount to much compared to the potential for crowdfunding both in the upfront and ongoing modes.

Why is that? Because crowdfunding taps into the fact that a small fraction of readers get high value from someone's content. And crowdfunding is a way for those readers to contribute directly in much higher amounts. For instance, Techdirt raised close to $70,000 on Beaconreader for stories on Net Neutrality. Because this is coverage I want to exist in the world, my benefit includes not just my own reading but the fact that many more people can read independent stories. So under a metering scheme my contribution might be a couple of cents per story at best (take $10 and divide by everything I read in a month on the web). So Techdirt might have gotten maybe $1 from me. Instead I gave significantly more on Beacon (I would put the amount here but somehow Beacon's UI makes it hard for me to find that).
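
To make the arithmetic concrete, here is a tiny code sketch (written as a Processing snippet, with made-up numbers for monthly reading volume and for how many of those items came from one publisher) showing why a metered scheme pays any single publisher so little compared to a direct pledge:

    void setup() {
      float monthlyPot = 10.0;            // the metered budget
      int itemsReadPerMonth = 500;        // assumed: tweets, posts and articles actually read
      int storiesFromOnePublisher = 50;   // assumed: how many of those came from one site

      float perItem = monthlyPot / itemsReadPerMonth;             // about 2 cents per item
      float publisherShare = perItem * storiesFromOnePublisher;   // about $1 for the month
      println("per item: $" + nf(perItem, 0, 2));
      println("one publisher's share: $" + nf(publisherShare, 0, 2));
    }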

Again, the two aren’t necessarily mutually exclusive but hopefully this makes clearer why I am bullish on crowdfunding and skeptical on micropayments.

A collection of things I did during the 6 weeks of Future Learn's Creative Coding by Dan Catt

I've been inundated with requests (2) for me to post some of the code I created as I worked my way through FutureLearn's Creative Coding course. All titles link through to the relevant GitHub page, the whole GitHub repository is here.

One thing I particularly like about Processing is the use of "Sketch" for a program; it feels relaxed, loose, inviting you to quickly sketch out ideas. As such the code here was never really supposed to be public, and some of it was iterated over quickly, sometimes changing direction halfway through. I'm a long-time coder, but have only dabbled in Processing, often giving up on it due to its terrible debugging tools, which are now partially addressed in version 3.

I've gone back into the code to add comments and clean away some of the redundant cruft that was left over from coding swerves. All of this is a long-winded way of apologising for the state of the code, much of which was written during late evenings.

There is however a narrative of sorts, an evolution of ideas running through the code. If this were a book it'd make sense to start reading the code at the beginning and work your way through to the somewhat lackluster end. The most probable action though is to jump right into the hexagons as they're the prettiest.

Either way, I hope there's something in here that helps someone somewhere, especially if you came via google looking for answers; they may be in here.

On naming conventions

Apparently I didn't do any coding during week one, but for the rest of the weeks I did stuff based on the course, and more stuff inspired by the course where I wanted to quickly thrash out ideas.

Sketches starting with "c0x_" are sketches done for the course, those with "c9x_bonus_" are extras often taking an element of the course and pushing into something I wanted to test.


Week 2

An introduction to rules and systems: how can we follow rules to create generative art?


25 Squares

Playing with handy, to give things a sketched feel. I promptly wanted to use it for every single new project, it was that fun/good.


Line Loops

In which I learn that cutting 500x500px animated gif loops is great for tumblr.


More with the handy library

Which introduced us to accepting keyboard input; meanwhile I was more interested in how awesome circles look with the handy library (again!!)


All the animated cubes

I wanted to move off grid, and I've always been interested in isometric graphics. An attempt was made to animate cubes, which in this case are really hexagons.

Once again in special tumblr format :)

Week 3

Has me linking sin() functions and frameCount together for more perfectly looping gifs. And into probably the prettiest part of the course, the screensaver/wallpaper lines. Which I then wanted to take into 3D.
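
As a flavour of that sin()/frameCount trick (a minimal sketch of my own, not the course code): drive everything from frameCount modulo a fixed loop length, so the last frame lines up with the first and the exported gif loops seamlessly.

    int loopFrames = 60;  // total frames in one loop

    void setup() {
      size(500, 500);
    }

    void draw() {
      background(255);
      // t sweeps 0..TWO_PI over exactly loopFrames frames, so the motion repeats perfectly
      float t = TWO_PI * (frameCount % loopFrames) / loopFrames;
      translate(width / 2, height / 2);
      noFill();
      stroke(0);
      for (int i = 0; i < 12; i++) {
        float a = t + TWO_PI * i / 12;
        float d = 20 + 10 * sin(2 * t + i);
        ellipse(150 * cos(a), 150 * sin(a), d, d);
      }
      // saveFrame("loop-####.png");  // uncomment to export exactly one loop's worth of frames
    }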


The loops

Sin waves, circles, the illusion of rotation oh my.


All the pretty lines

I think a lot of the success in these line images comes from an understanding of colours and opacity. By knocking back the opacity I knew what I was looking for wasn't the initial burst of lines, but rather the subtle build up of tones and saturation, while not letting the image get too busy.

Pulling from the more subtle CMYK palette rather than bolder RGB helps too.

What I'm getting at here is that a) I'm awesome and b) going into code with an understanding of colour, tone and texture beats trying to force an end result with just the code. Code itself is not enough.
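
For what it's worth, the opacity idea boils down to very little code. A minimal sketch (mine, not the course sketch): never clear the background, draw each line with a single-digit alpha, and let the tones accumulate over hundreds of frames.

    float x1, y1, x2, y2;

    void setup() {
      size(500, 500);
      background(255);
      x1 = random(width);   y1 = random(height);
      x2 = random(width);   y2 = random(height);
    }

    void draw() {
      // very low alpha: each line barely registers, the image is the slow build-up
      stroke(0, 60, 90, 8);
      line(x1, y1, x2, y2);
      // wander the endpoints a little each frame, staying on the canvas
      x1 = constrain(x1 + random(-5, 5), 0, width);
      y1 = constrain(y1 + random(-5, 5), 0, height);
      x2 = constrain(x2 + random(-5, 5), 0, width);
      y2 = constrain(y2 + random(-5, 5), 0, height);
    }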


Draw Zachary

An attempt to combine the line drawing code and an image, to have the effect of the line drawing the image. Arguably this didn't really work, which I think is more a function of the algorithm used to move the line around than anything else. Given time I'd do this one differently.


Too fast explody cubes

I wanted to move the lines code into 3D, but before that I figured I needed to explore 3D a little, which I did by writing some code to draw and rotate a cube in 3D, and to manipulate elements of it. There's something useful going on here for pattern making, but I didn't push it far enough.


Figuring out how to do isometric stuff with cubes, thinking I'd go off in that direction. The end objective was to have them properly lit by (maybe moving) lights rather than drawing "3D" cubes in 2D. I didn't get very far, but far enough to know it was do-able, which is what counts.


Bouncing points in 3D

Taking the "bouncing lines" code and turning it into 3D was relatively simple, although I had to separate out the code that moved the lines from the code that drew them, which were closely paired in the 2D version.

The gif above shows the points bouncing around inside a box; meanwhile the full code persists those lines within the 3D space. But it's harder to show what's going on in a single still from the whole sketch.
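
A stripped-down version of that separation (a sketch in the same spirit, not the repo code): the movement lives in its own function, completely apart from the drawing, so switching from 2D lines to 3D only meant changing draw().

    int n = 4;
    PVector[] pos = new PVector[n];
    PVector[] vel = new PVector[n];

    void setup() {
      size(500, 500, P3D);
      for (int i = 0; i < n; i++) {
        pos[i] = new PVector(random(-100, 100), random(-100, 100), random(-100, 100));
        vel[i] = PVector.random3D().mult(3);
      }
    }

    // update only: bounce each point off the walls of a 200-unit box
    void movePoints() {
      for (int i = 0; i < n; i++) {
        pos[i].add(vel[i]);
        if (abs(pos[i].x) > 100) vel[i].x *= -1;
        if (abs(pos[i].y) > 100) vel[i].y *= -1;
        if (abs(pos[i].z) > 100) vel[i].z *= -1;
      }
    }

    // draw only: render the box and the lines between the points
    void draw() {
      movePoints();
      background(255);
      translate(width / 2, height / 2);
      rotateY(frameCount * 0.01);
      stroke(0);
      noFill();
      box(200);
      for (int i = 0; i < n; i++) {
        PVector a = pos[i];
        PVector b = pos[(i + 1) % n];
        line(a.x, a.y, a.z, b.x, b.y, b.z);
      }
    }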

YouTube 3D Lines video

Week 4

This was probably the most fun week, and got the most interesting images out of me. I'd had an image in my head for such a long time from a cut & paste collage I'd seen years ago. Where triangles had been cut out of a design and swapped around.

Triangles pose an interesting challenge for coding, in that computers are great at rectangles. Not so much at triangles (and by extension, hexagons).

When we started with the pixel reading/writing code I knew it was a chance to have a go (and by chance I mean, time set aside that I normally wouldn't have).


One pixel Cinema

There are a number of ways to extract the main colour palette used in an image, and each has its own idiosyncrasies. Because I'm terribly slack, in the past I've always just scaled the image down to a 1x1, 5x1, 2x2, 3x2 and so on pixel image depending on my need, then read the very few pixel values and used those.

In this case I just scale the image down to a 2 pixel high image. Then in theory I could just scale the image back up. Here I'm actually reading the values and then drawing lines between the two values in case I want to do something different or clever with them (maybe throw in some curves).
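
Roughly what that looks like (a sketch of the approach, assuming a hypothetical "frame.jpg" in the sketch's data folder): resize() with a width of 0 keeps the aspect ratio, so the image ends up 2 pixels high, and the handful of remaining pixels become the palette.

    PImage src;

    void setup() {
      size(500, 120);
      src = loadImage("frame.jpg");   // hypothetical file name
      src.resize(0, 2);               // width 0 = keep aspect ratio, height forced to 2 pixels
    }

    void draw() {
      background(255);
      src.loadPixels();
      int cols = src.width;
      float w = width / float(cols);
      for (int x = 0; x < cols; x++) {
        color top = src.pixels[x];            // row 0
        color bottom = src.pixels[x + cols];  // row 1
        // draw a vertical gradient between the two sampled values for this column
        for (int y = 0; y < height; y++) {
          stroke(lerpColor(top, bottom, y / float(height)));
          line(x * w, y, (x + 1) * w, y);
        }
      }
      noLoop();
    }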


3D Landscape

Back into the land (natch) of 3D. This was very quick and a test of the random noise generator in Processing. Also to test the lighting a little bit. There's about a billion different ways this could be improved.
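
The bones of a noise()-driven landscape are small enough to show here (my own minimal sketch, in the spirit of the above rather than a copy of it): sample noise() on a grid, use the values as heights, and let lights() do the shading.

    int cols = 40, rows = 40;
    float cell = 15;

    void setup() {
      size(600, 400, P3D);
      noStroke();
    }

    void draw() {
      background(20);
      lights();   // simple default lighting to shade the slopes
      translate(width / 2, height / 2 + 50);
      rotateX(PI / 3);
      translate(-cols * cell / 2, -rows * cell / 2);
      fill(90, 140, 70);
      for (int y = 0; y < rows - 1; y++) {
        beginShape(TRIANGLE_STRIP);
        for (int x = 0; x < cols; x++) {
          // noise() gives smooth pseudo-random heights; the frameCount term scrolls the terrain
          float z1 = noise(x * 0.1, y * 0.1 + frameCount * 0.01) * 80;
          float z2 = noise(x * 0.1, (y + 1) * 0.1 + frameCount * 0.01) * 80;
          vertex(x * cell, y * cell, -z1);
          vertex(x * cell, (y + 1) * cell, -z2);
        }
        endShape();
      }
    }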



And here we are with the great triangle switch-a-roo. Almost everything I want to say about this is in the comments in the code. Again I love this because we don't often see triangles used in computer graphics. I used a number of source images and it seems to work best with primary colours and strong contrasts.

The code should just work, so try it for yourself.

Week 5

I don't recall there being much in the way of projects to code this week, more of an introduction to new concepts. I took the time out to finally get round to exploring hexes. Just like the triangle grid, it's fun to work off the square grid. There's the extra utility that if you have a hex grid, by its very nature you have a triangle grid (and the other way round, I guess).




Time and motion. An excuse to throw in the rotation matrix stuff I'd learnt. Pretty simple, with a couple of interesting things going on in the framing. I didn't want to let the numbers just float around in space, and yet the framing shouldn't distract from the numbers; after all, those are what you want to read, but it should still represent the tick/tock passing of time.



Hexes 2

This was the first pass at generating a hex grid; there are still a few logical/math flaws in the code, but they tend to be hidden away from most use cases.

This code, through various tweaking, generated a variety of images. The default one is pretty good at showing what's going on with the various sizes of hexes, but changing values around can give a surprising number of different effects.
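
For anyone who arrived here via a search for hex grid basics, the standard offset-row layout is only a few lines (a minimal sketch of my own, much simpler than the repo code): rows are 1.5 times the radius apart, and every other row shifts by half a hex width.

    float r = 24;  // centre-to-corner radius

    void setup() {
      size(500, 500);
      background(255);
      stroke(60);
      noFill();
      float w = sqrt(3) * r;   // width of a pointy-top hexagon
      int row = 0;
      for (float y = 0; y < height + r; y += 1.5 * r) {
        float xOff = (row % 2 == 0) ? 0 : w / 2;   // shift every other row by half a hex
        for (float x = xOff; x < width + w; x += w) {
          hexagon(x, y, r);
        }
        row++;
      }
    }

    // draw one pointy-top hexagon centred on (cx, cy)
    void hexagon(float cx, float cy, float radius) {
      beginShape();
      for (int i = 0; i < 6; i++) {
        float a = PI / 6 + i * PI / 3;   // corners at 30, 90, 150 degrees and so on
        vertex(cx + radius * cos(a), cy + radius * sin(a));
      }
      endShape(CLOSE);
    }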


Sketchy Hexes

Haven't touched the handy sketchy effect for a while, time to sort that out :D


Colourful Hexes

This started off as a failed experiment at importing and reusing an SVG to form a pattern. In the end I just hand rolled the code to divide a hex up into segments. I'll need to come back to importing SVGs another day. I suspect my problems were more caused by Illustrator and Photoshop than by Processing.

Again, I tweaked the code a lot while figuring out what I was doing.


Circles on a hex grid

What patterns, I thought, would you get if you put expanding circles on the hex grid? One way to find out.


3D hexes

After creating the hex images I was challenged on twitter to add a couple of dimensions. So I added an extra two, the 3rd and time.

Next up Hex-Bert.


Hex cuts

I got my hands on a craft paper cutting machine (like a laser cutter but less likely to burst into flames); you feed it SVGs and it cuts the shapes out.

I failed to import an SVG; how hard could it be to export one?

Super easy, as it happens: you start an SVG capture, draw all the shapes, then end the capture. With the SVG exported, I sent it (using the supplied software) to the cutting machine and got the above back.
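
The capture really is just a pair of calls around the drawing code. A minimal sketch using Processing's bundled SVG library (concentric circles here rather than hexes, with a made-up output file name):

    import processing.svg.*;

    void setup() {
      size(500, 500);
      noLoop();
    }

    void draw() {
      // everything drawn between beginRecord() and endRecord() is written to the SVG
      beginRecord(SVG, "cut-me.svg");
      background(255);
      noFill();
      stroke(0);
      for (int i = 0; i < 8; i++) {
        ellipse(width / 2, height / 2, 50 + i * 40, 50 + i * 40);
      }
      endRecord();
    }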

Week 6

Probably the trickiest of weeks because I had very little time and was probably trying to cram too much new stuff into that time. I aimed to learn all about Vectors (or at least understand them more than I did before) and FFT audio processing, which I've done in JavaScript before but not Processing.


Landscape 1

Landscape 2

Landscape 3

I never quite got this one right. There were some nice elements to it: the random colours, the stars that moved over time as the sky got darker, flowing landscapes that were slightly different each time, and a bunch of fireflies flitting around with boid-like behaviour. The speed of the aurora, the flies and the hex sizes were all controlled by different parts of the audio being played in.

Each part of the code was trying ideas out, so I could take the elements and work with them. But they never really worked together. Something to build on though.

I played the audio live (which is why it's a bit shonky) from a Native Instruments Komplete Kontrol, piping it in via SoundFlower. It would be easy to change the code to get the audio in from somewhere else.
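
For anyone wanting to wire audio in themselves, here's a minimal sketch using Processing's bundled Sound library (I'm not claiming this is the library used above); the spectrum bands drawn as bars could just as easily drive aurora speed, firefly counts or hex sizes.

    import processing.sound.*;

    AudioIn in;
    FFT fft;
    int bands = 64;
    float[] spectrum = new float[bands];

    void setup() {
      size(640, 240);
      in = new AudioIn(this, 0);   // whatever the default input is: SoundFlower, a mic, a loopback device
      in.start();
      fft = new FFT(this, bands);
      fft.input(in);
    }

    void draw() {
      background(0);
      fft.analyze(spectrum);       // fills spectrum with the current energy per band
      noStroke();
      fill(200, 230, 255);
      float w = width / float(bands);
      for (int i = 0; i < bands; i++) {
        rect(i * w, height, w - 1, -spectrum[i] * height * 10);   // scaled up for visibility
      }
    }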

Vimeo landscape

A Free, Online Course Exploring the Science of Phobos and Deimos by The Planetary Society

Impress your friends and wow your colleagues by learning all about Mars's moons.

The Magician's Apprentice. by Feeling Listless

TV Fucking-A! No sober reflection from me here as we embark on another series of Saturday evening ministrations. After last year, my expectations for these twelve weeks have been lowered to such an extent that anything half competent probably would have done, episode one of The Space Museum as opposed to the other three if you will, and yet here we are, judging by the Twitters, all relaxing in the splendour of watching the show that we know and love reconstituting itself before our eyes. Look at something like The Magician’s Apprentice with its swagger and general sense of “Yeah, we’re back” and it’s impossible not to think, “He knows he done wrong last time. Moffat knows he done wrong.”

Dispassionately (right?) you might wonder if another idea would simply have been to produce an excellent example of a more traditional piece of Doctor Who, with the Time Lord and Clara landing somewhere, chasing about for forty-five minutes heading into a cliffhanger and underscoring the fundamentals of the series, demonstrating that Doctor Who’s an event no matter what it’s about, not unlike the classical mode, a Terror of the Autons affair or Horror of Fang Rock. This is actually a moment when such a thing could have made sense, since a companion’s already in situ and a new incarnation isn’t being introduced. Last time we were in this space was The Impossible Astronaut, so yes, the other approach would have been something neutral.

I can relate. When the show was non-televisual, after a while it existed in two streams, the past Doctor material which archaeologically excavated the show’s history and the Eighth Doctor material which for the most part didn’t. Arguably it went off the rails slightly when it tried to be too experimental, the Divergent Universe on audio and whatever it was Lawrence Miles was doing in the books, but there was always a sense of trying to move forward, do something new. But as the reaction to the Divergent Universe on audio and whatever it was Lawrence Miles was doing in the books and arguably the modern tv equivalent which was last year’s season demonstrate, if you don’t get it completely right, you’re sunk. As is the case with most genre material, we want it different, just not too different.

Having acknowledged that, let’s just throw it out of an airlock (a proper one, not something pretending to be an alien planet) and remind ourselves that part of the mode of the revival is beginning each season with a big brassy occasion and that however much we tell ourselves the Weight Watchers Chocolate Roll is tasty enough and will do, what we really want is the Cadburys. We’ve been so spoilt now that not for us any more is a bog standard alien invasion story or base under siege. Which is a bit weird when you consider that during the Davies era, pretty much all his season openers were just that. Go back and read a synopsis of Smith and Jones some time in the context of tonight’s episode. For goodness sake.

Now, we’re simply not happy with a season opener unless the Doctor’s running to or from something with his current companion(s) wondering what the hell’s going on. Moffat knew that as he wrote The Impossible Astronaut and here he is again, but with the near-Sisyphean task of pulling back to the fold those of us who strayed and thought he’d lost his mind when putting something as expressively appalling as The Caretaker or Kill The Moon into production, and the way to do that is through our fan gene. Taking the chocolate roll metaphor to stretching point, The Magician’s Apprentice is the televisual equivalent of the Tesco Express that has just moved in around the corner with its aisles filled with pastries and cakes within five minutes walking distance. Sticking with the Weight Watchers has been an act of will. But I’m six stones lighter than when The Name of the Doctor was broadcast so …

Davros. Fucking Davros. You and me both know for how many years I’ve been making that joke about every mysterious figure which has darkened the narrative’s door would turn out to be Davros and finally, there he is, pre and post the Big Finish spin-off series (which hasn’t been contradicted yet – that began when he was sixteen years old). I think you can imagine me laughing rather than doing this which was presumably the reaction they were going for, but it’s so rare to be genuinely surprised at such an early stage in a season’s development. Excellent build up too, with the false sense of security surrounding the non-descript battlefield, which if you’ve heard the bullshit rumour that was floating around a couple of days ago, led me to wonder if this young chap was actually the Doctor.

There are multiple questions about this. If young Davros’s existence is pre-Time War, then how’s the Doctor able to go back in time to save or kill the boy? True, such things have become a bit fluid lately, as per Listen - in which Clara confusingly visits the past of a planet which is “lost” in the future – but even the Doctor’s progress through the time vortex wasn’t that easy back in Genesis and he didn’t have a time lock so stringent it sent Dalek Caan mad when he traversed it in order to retrieve Davros. You see, this is what happens when you stack an episode like this with mythology: sections of it are bound to topple over. Perhaps we’ll be gifted with an explanation next week, but probably not.  We'll talk more after the story resolves itself about the effects of the Doctor stepping into his foe's timeline.  Perhaps Hayley Westenra will sing on the soundtrack this time around.

In any case, the callback to Tom’s speech from Genesis was the last thing I expected to hear tonight, which means of course it’s probably just the thing I needed to hear. As the clips tumbled out of the speakers surrounding Julian Bleach (a feat of archival casting worthy of a John Wells drama in an episode filled with them), I almost expected Joseph Lidster’s Terra Firma to put in an appearance to represent the Eighth incarnation and felt slightly cheated when it didn’t. But then Tom’s face actually appeared and those sodding wires and I forgave them everything, even if it wasn’t clear exactly who captured the footage. In universe I mean. David Maloney directed the episode but the TARDIS Datacore entry is a bit thin on facts when it comes to camera operators.

Such are the way of things in an anniversary year, when Moffat’s in the mood to produce his version of The Five Doctors. With all of the references to the Davies era in the episode, it’s impossible not to think of the story as being in some way a tenth birthday celebration, even to the point of the Doctor holding up a gun, not just to a Dalek this time, but their creator. Oh well good. Not the gun thing - hopefully somebody will be there to stop him - because sometimes he needs someone to stop him - but the acknowledgement that although this is all one story, that something immense happened ten years ago. Incidentally, I’m also convinced I saw Martha’s silhouette in The Maldovarium so part of me’s hopeful Freema was able to make some time in her Sense8 schedule.

But this was still very solidly of the Moffat era. Anyone else think Kate and the rest of UNIT were especially useless in this episode, with Clara essentially telling them the plot? What was with all the freezer-like temperatures in the base? I’m having a lapse in memory but wasn’t a cold environment one of the requirements of the Zygon race? Were we looking at the real UNIT or their doppelgangers from the other anniversary story and if that’s the case why didn’t Clara pick up on it as she was striding about being oh so clever? I expect I’m reading too much into a slightly overaggressive piece of CG, but this could also be a new item of mythology to deal with autonomous Zygons lacking stored imprints, a different strain to the version portrayed in Alan Barnes’s Eighth Doctor audio Death in Blackpool.

Missy’s back too and we’ll forgive them the winking manner of it, the yes I’m alive and can’t even be bothered with an explanation because "So you escaped from Castrovalva..." was weak mead even in the 80s. Michelle Gomez feels like she’s been playing the character for years, but there’s a clear re-modulation of the approach in comparison to Dark Water with its Dutch angled push-ins and strange gesticulations, a realisation that a downbeat menace has a more long-term gain and is funnier in some respects when she’s bluffing away. Best moment (because it’s supposed to be): her approbation at being lower, if not non-existent, in the Doctor’s pecking order of arch enemies than Davros. Oh and clearly not dying again at the end.

Her re-emergence also sees Jenna Coleman up her game as she’s forced to rationalise how her character can interact in any kind of meaningful way with the Time Lord who slaughtered her boyfriend and put to the back of her mind seeing the deaths of those UNIT agents at the moment when she’s supposed to be working with this lunatic. Her approach is to mainly follow the requirements of the script, which sometimes puts her in companion mode to Missy. But watch her in the background of the shot, especially her eyes, and we're sure that at any moment she could treat this murderess with all the diplomacy that Amy Pond did with Madame Kovarian (taking us right back to Davros’s speech from Journey’s End about how the Doctor trains soldiers even if he doesn’t mean to).

Is she dead? Hmm, probably not yet, although given yesterday’s announcement, which now feels like a piece of stage management rather than post-tabloid damage control, I’m not entirely discounting the idea that they shot footage specially for the trailers featuring her in other locales and that the synopses for all the episodes after this are a complete blag. Unless Clara’s being played by a different, much younger actress. Or they’re different Claras from across history. Part of me even wishes this extermination was real, due to the poetry of a facet of the character having been somewhat introduced as being a Dalek and her now being killed by one. Well, not poetry exactly, but using a phrase like “narrative bookends” is a bit clunky in this context.  Or one of Lucas's rhymes.

Bestriding all of this is Capaldi, who now, after a shakier start than we’re used to in the revivals or expected, feels like the Doctor, feels like the same man who played tiddlywinks with Lenin and eloped with Marilyn Monroe, the benevolent alien deploying rudeness as a prop rather than simply his mode. Oh and stop it with your harrumphing over the guitar playing and the tank – this is precisely the sort of business you were expecting from an incarnation played by this actor and were disappointed when you largely didn’t get it last year. I know I was. Everything about Capaldi here is more confident, from his line readings, to his physicality, to his hair, which has finally decided to do that. A lot. The clothes help. The Pertwee homage of last year was fine, but the relaxed David Banks homage, albeit with a darker jacket over the t-shirt, feels more like him.  It's also about the feels, essentially, finally.

Just as an aside, notice how the two prologues that accompanied this episode online actually deepened the experience of the episode, especially The Doctor's Meditation which appeared on Facebook earlier today featuring more from the Doctor's friend Bors played by Daniel Hoffman-Gill (who I'm sure is one of the blokes from that horrendous deodorant commercial which filled the cinemas this year with the dance-off in the middle).  After seeing those five or so minutes, we have a real connection to the character, which makes his fate in the actual episode all the more horrible.  Although paradoxically releasing it before the broadcast of the episode did rather ruin the reveal of where the Doctor is and also one or two of his jokes.

With all this talk of characters and actors, it's all too easy to ignore the production elements, what's changed from last year, what hasn't. Having expected the show to return to the editing style of the Eleventh’s era, perhaps the more interesting aspect is that scene durations are still surprisingly long. One of the more striking choices last year was how some dialogue scenes continued for whole minutes, alongside a general lack of parallel storylines, and The Magician’s Apprentice still has those. The teaser is really just two long scenes and there’s a lot of characters chatting in rooms, sitting or standing, admittedly sometimes very large rooms in the case of the arena.

But it doesn’t feel static. Although we don't quite reach the level of editing inherent in something like The Crimson Horror, the cameras are certainly moving around a lot more, shooting from a greater number of angles than last year where directors in some cases seemed to have been asked to provide a mutation of the multi-camera set up of the 60s to 80s.  Now, there’s less of a sense of being able to see which scenes would have been shot on film back then. Everything has the same visual viscosity, not that the style isn’t still markedly different to Blink, Hettie MacDonald’s previous credit for the series. Plenty of the episode is similarly about atmosphere however, notably around the young Davros scenes. For a few brief moments up front, I was somewhat convinced we were about to see a return to The War Games.

But no, it’s Skaro, it’s the Daleks and it’s multiple Daleks from many eras, including what looked like a CG recreation of a classic version in the wilderness. The “new” paradigm’s pretty much dumped now isn’t it? Not since the Mechanoids has a new Dalek-related creation been set to one side and good riddance to them. Unless there was one there and I didn’t notice it. The return of the battle-model from Remembrance and such is another attempt to re-engage us older viewers and we’re suckers even though my wilderness-years-birthed fandom inoculates me a little bit, probably. Armies with a single, authoritarian visual look feel more impressive and deadly than this ragtag lot, whose slightly jaded appearance resembles the Dalek equivalent of LINDA, even if they clearly have just as much of a destructive capability.

As is the case with two-parters, of which this season is supposed to have several, we won’t really know if this is able to maintain the same level of squee-inducing slack-jawed intensity. We fans know that no matter what’s in the throw-forward trailer, we can never completely trust what we see with our eyes and the chances of any of those three having gone are pretty slim. Perhaps my biggest compliment is that I wanted to watch it again straight after it had finished, which is quite something when you consider that some of the discs in my copy of the blu-ray set from last year still haven’t left the trays in the amaray case. Welcome back Doctor and welcome back Doctor Who. Here’s to the next eleven weeks and Christmas. Fucking-A!

September 19, 2015

Writing Your NSF GRFP Essays by Astrobites

The National Science Foundation’s Graduate Research Fellowship Program (NSF GRFP, or “the NSF”) is a national fellowship for graduate students who are US citizens/permanent residents that provides three years of full funding at an accredited institution of your choice. Although there are a number of fellowships for graduate students, the NSF is one of the most common fellowships amongst astronomy students. NSF funding allows you to conduct research with professors who could not otherwise financially support you, and relieves you from teaching duty. Here on Astrobites, we’ve already written about the NSF, why you should apply and some of our experiences with the process.

This year’s application is due October 30th, so you might want to look at your application if you haven’t already. In this post, I will outline recent changes in the essay style and offer advice on how to master the written portion of the application. Note that there are other aspects to the NSF application, including letters of recommendation and your transcripts. A strong application needs support from all of these aspects!

Changes in the Application: 3 → 2

Before 2013 (and when Astrobites last touched upon this fellowship), the NSF required 3 essays: a personal statement, a summary of past research and a proposed graduate research project.

The application now requires two essays: a personal statement (3 pages) and a proposed graduate research project (2 pages). The new personal statement is a cohesive discussion about your previous research, your interests and your future goals. The graduate research project is your chance to propose an interesting research question to explore in graduate school. These essays are judged using two primary review criteria: intellectual merit and broader impacts. You can read more about these criteria and the review process here.

How Do I Even Start?

It might be intimidating to even begin these essays: one requires you to summarize your passion for astronomy in just three pages, and the other asks you to present yourself as an independent scientist with creative and feasible research ideas. Many students suffer from impostor syndrome and may think “They’ll think this is a terrible idea” or “I can’t do that”.

If you’re applying for graduate school, chances are that you are completely capable of writing great essays. Believe in yourself! You’ve learned and have grown since your first day of college; let your expert knowledge shine in your essays. Even if you feel that you may not have enough experience under your belt to win this time around, writing the application will be great practice for creating an effective proposal.

No matter how strong your writer’s block may be, begin your essays by simply writing your ideas down. For your personal statement, list past research, accomplishments and outreach. Start with a bulleted list and look for common threads amongst activities. For the proposed research project, list every astrophysical idea, topic or question which could form your eventual project. You can narrow down these ideas later. Ultimately these ideas will form the frameworks for your essays.

The Personal Statement: It’s All About You!

The personal statement is where you will describe yourself through your previous research experience, outreach activities and passions. The most common (and easiest) approach to the personal statement is to list your research experience or classes in chronological order and tack on a paragraph about outreach experience. This method can work, but it might not be the most effective way of communicating your background and growth.

Instead, try to summarize the personal aspects that you want to describe to the reviewers, and let this be a common thread throughout the essay. For example, Applicant Abby is a great communicator. Abby's strength allowed her to explore many subfields of astrophysics by quickly picking up jargon and techniques. Additionally, she used her adeptness in communication to convey tricky astrophysical concepts to the general public as part of her outreach experience. She organized her essay by selecting experiences that highlight this strength. Use the adjectives which describe you time and time again in your essay. You should even bold your keywords for the reviewers; they have a lot of applications to read!

Once you have your outline and organization settled, you'll have to decide how you will present your research or outreach experiences. The level of detail which you should supply will vary with the number of projects you'd like to talk about. Regardless, emphasize any knowledge or experience you gained and tangible outcomes (like an AAS poster or paper) within your research. You are not a robot with a set list of technical skills, so focus on your growth rather than the exact scientific methods you used. When discussing your outreach, talk about why you are passionate about what you are doing and what you gained from the experience.

Research Statement: Think Like a Scientist

In the research statement, the reviewers want to see that you can clearly state a research question and can outline a method to answer this question. If you are awarded an NSF fellowship, you aren’t held to this project, so feel free to propose anything you want! The project can be an extension of your current research or a new idea altogether. Talk to experts about your ideas to get feedback and relevant literature.

I recommend organizing your statement into three parts: a background section, a clear plan of action and a section devoted to possible results and broader impact. In the background section you demonstrate your expert knowledge of a subfield and explain why your proposed problem is important. Keep in mind that your reviewers are likely not experts; in fact, they may not even be astronomers! Next, explain the concrete steps you would need to take to execute your research. This can include resources you would need (telescopes, supercomputers, etc.), or expert knowledge you would need to gain. In this section it is also useful to tie in any previous work you've done which would make you an ideal candidate to implement the proposed research. Don't be afraid to spell out your plan in a numbered or bulleted list; again, the reviewers have a lot of applications to read! Finally, put your project into context and explain how your results will impact the field as a whole, whether it be through a new discovery or a public database. This is also an excellent place to tie in outreach opportunities.

Parting Words

My final advice to you is to ask anyone and everyone to read your essays. Friends, advisors, letter writers, graduate students and past winners. Everyone. Every editor will home in on a few aspects of your application, and the collective feedback can really clarify and enhance your ideas.

Remember that the NSF is highly competitive, and it’s ok if your application is not successful. Rejection does not mean that you won’t become a successful graduate student or that your proposed project was “bad”.  Most students do not get the NSF, but they can go on to become outstanding researchers.

Curiosity update, sols 1073-1107: Driving toward dunes, distracted by haloes by The Planetary Society

Since I last checked in with Curiosity, the rover has been steadily driving southward, heading directly toward the Bagnold dune field. They are looking for a place to drill into the Stimson sandstone unit, but have been distracted by intriguing pale haloes around rock fractures. Despite a rough road, the wheels are not showing a significant increase in damage.

September 18, 2015

A question about the future of the world wide web by Charlie Stross

(This is the preamble to a complex open-ended question, below the fold. Bear with me ...)

Back in 1994-96, during the Big Bang era of rapid expansion as the world wide web expanded into the outline of its future shape, there was considerable discussion of how best to pay for everything. Back in the early days of the internet NSFNet basically forbade commercial use of internet connected systems — this went out the window rapidly once the world wide web caught on as a publishing medium.

There were two contenders for the funding mechanism in the early days: micro-billing (in which you pay pennies, fractional or otherwise, for access to web pages) and advertising (in which the page is nominally free but you pay the bandwidth overheads of downloading someone else's idea of what they want you to see). Advertising won out because in the long-ago era of modem-based downloads micro-billing was expensive; you might only need to exchange a couple of KB of data to fund a transaction, but when many folks were still using modems that topped out at an asthmatic 9600 bits/second, the bandwidth cost was just too high to support microbilling.

So we ended up with banner ads and spam, and then by a hop, skip and a jump today's hideously bloated ecosystem of ad exchanges, trackers, ghost cookies, third-party javascripts that download megabytes of libraries to figure out who you are and who is willing to pay the most for a few seconds in front of your eyeballs ... and so on.

Now we get to the post-2007 era of multitouch smartphones and tablets and monetization/funding strategies. Google -- or should I say, DoubleClick, the world's largest advertising agency -- is reasonably easy to understand: they want their OS (Android) on as many devices as possible in order to funnel ads at the customers. Apple is somewhat less scrutable: they're best understood as a hardware company (although they also have diversified into a variety of fields, including content sales) and they want their hardware in as many hands as possible, which means providing a slick end-user experience. Both firms support an appified ecosystem in which the public web isn't so much locked out as rendered increasingly irrelevant. And Apple has taken the next logical step by permitting ad-blockers in their standard web browser (as of iOS 9).

(Warning to Apple-phobes: you may not like Apple, but current sales of iOS devices outweigh the sales volume of the entire PC industry. Apple won the race to be the first post-PC consumer electronics company, and now occupy a niche roughly analogous to Sony's in the pre-personal computing consumer electronics world: a premium brand and market maker with enormous clout. So for "Apple" read "everyone else in the post-PC device world, sooner or later".)

A lot of self-identified content creators are quite irate about the rise of ad-blocking; they've grown accustomed to their writing being funded by advertising sales. But to those of us who earn our crust from writing without ads, and who pay the atrocious bandwidth and performance bills imposed by the advertisers, it looks like the current state of the ad-funded web is a death-spiral and a race to the bottom. Casual information consumers won't pay for access to paywalled sites, and a lot of the struggling/bottom-feeding resources on the web are engaged in a zero-sum game for access to the same eyeballs that are increasingly irritated by the clickbait and attention-grabbing excesses of the worst advertisers.

Anyway, this leads to my question: is there any way to get to a micro-billing infrastructure from where we are today that doesn't involve burning down the web and starting again from scratch?

So: are there any promising distributed microbilling payment platforms out there on the horizon, other than the Google and Apple company stores? And if not, what is to be done?

(NB: anyone who says "Bitcoin" will be banned for stupidity. Non-BtC applications of the blockchain are another matter: but BtC itself is no more a solution to microbilling than stockpiling gold bullion is a solution to high-interest-rate credit cards targeting the poor and those with bad credit records.)

Upcoming events by Charlie Stross

I'm on the move again, from the middle of next week.

On September 24th, I'll be doing an interview, Q&A, and signing at the American Book Center in Amsterdam (the largest English language bookstore in the Netherlands).

This is while en route to Portland, Oregon, where from October 2nd to October 4th I'm guest of honor at the H. P. Lovecraft Film Festival and Cthulhucon. I'll post my program items here when they're finalized. In the meantime, you might want to keep an eye on my twitter feed for rapid-fire updates.

Bye then Jenna. by Feeling Listless

TV After days of the media thinking fans would be rioting in the streets about it as though a companion moving on was some new thing after fifty-odd years, Jenna Coleman confirmed on the radio this morning that she's already left Doctor Who, having filmed her final scenes even before the show's reached broadcast, Christopher Eccleston style - though obviously with warmer feelings about it.

Would we know this if The Mirror hadn't decided to stick it on the front page?  I shouldn't think so and it's another example of how the modern commercial media's scrutiny of the show affects how we interpret and enjoy the fiction.  Now the season will be about how Clara leaves rather than "Wow, she's leaving..."  Unless, as has been highly speculated about the Chris X cover story, it was leaked to a tabloid by someone on the production team.  On purpose.  For some reason.

As you know, I've always firmly been in the Team Clara camp, though I know she's not been an especially popular companion over the years.  This is partially due to her introduction as a plot point with multiple versions rather than as a straight in through the doors kind of figure, the version we have now only having shown up after us seeing two earlier fragments of the same personality.  Exciting idea in principle, well conveyed, but still with the inherent Dollhouse-style protagonist problems.

The above interview is as ever about what's not being said.  Jenna hints that it might not be before the end of the year and in fact so tied down is Cardiff now in terms of production blocks and the order of filming, it could well be that her final wrap scene wasn't in her final episode.  Her first episode shot was Hide, with the TARDIS scenes only being filmed months later.  One of these days I should get around to compiling that production calendar based on the Pixley specials.

Agent Carter finally reaches UK BD. by Feeling Listless

TV Why did we end up with BD as the acronym for blu-ray rather than BR? Yes, I know it's supposed to stand for Blu-ray Disc - in much the same way as Digital Versatile Disc for the other thing, but it really is confusing. Plus I never can tell whether to capitalise blu-ray or Blu-ray or Blu-Ray. The Guardian's style guide indicates "Blu-ray -- TM; full name is Blu-ray Disc (not Disk), abbreviation BD" because it's a trade mark, which makes sense I suppose.

In any case, Agent Carter is finally available on Blu-ray Disc in the UK.

DVD too.

Crowdfunding Is More Likely to Replace Ads Than Micropayments by Albert Wenger

The rollout of iOS 9, and along with it adblocking plugins for mobile Safari, has re-ignited the debate over the ethics of removing ads from pages. Marco Arment, who released the paid Peace adblocker, previously wrote a piece laying out his views, arguing that publishers have crossed a line with ads and trackers that are too intrusive and disruptive. In response there are now pieces bemoaning the end of independent publishers and the web altogether. And inevitably micropayments are proposed as an alternative to ads.

I continue to be skeptical though that micropayments are the right way to finance content. The reason can be found in Kenneth Arrow’s work on the “information paradox” from 1962! I don’t know how much a piece of content will be worth to me, so I am not willing to pay for it until I have read it. But then of course I have already consumed the content and now my willingness to pay for it drops radically.

This insight explains why it is so difficult to charge for small pieces of content. Their value will vary widely across readers and ex ante any one reader doesn't know the value. It also explains why it is a lot easier to charge for say a blockbuster movie or a big production value video game – there are enough potential customers who anticipate enjoying the content enough that they are willing to pay the price *before* experiencing the content. Finally, this also explains why charging for a subscription bundle is a superior strategy for some types of content, such as Netflix and HBO – for some math see this paper by Yannis Bakos and Erik Brynjolfsson.

All the talk about how the actual costs of making payments are the reason why micropayments haven't taken off is therefore likely a red herring. Even newfangled super low transaction cost blockchain-based systems do not get around Arrow's Information Paradox.

How then is journalism to be financed? As I wrote in 2014, I continue to believe that crowdfunding is the answer. Since then great progress has been made by Beaconreader, Kickstarter’s Journalism category, and also Patreon. Together the amounts are still small but it is early days. Apple’s decision to support these adblockers may well help accelerate the growth of crowdfunding and that would be a good thing – I don’t like slow page loads and distracting ads but I will happily support content creation directly (just highly unlikely to do so through micropayments while reading). All of this provides one more reason to support Universal Basic Income – a floor for every content creator and also more people who can participate in crowdfunding.

Update: Follow-up post on subscriptions and micropayments

Building a PC, Part VIII: Iterating by Jeff Atwood

The last time I seriously upgraded my PC was in 2011, because the PC is over. And in some ways, it truly is – they can slap a ton more CPU cores on a die, for sure, but the overall single core performance increase from a 2011 high end Intel CPU to today's high end Intel CPU is … really quite modest, on the order of maybe 30% to 40%.

In that same timespan, mobile and tablet CPU performance has continued to just about double every year. Which means the forthcoming iPhone 6s will be almost 10 times faster than the iPhone 4 was.

iPhone single core geekbench results

Remember, that's only single core CPU performance – I'm not even factoring in the move from single, to dual, to triple core as well as generally faster memory and storage. This stuff is old hat on desktop, where we've had mainstream dual cores for a decade now, but they are huge improvements for mobile.

When your mobile devices get 10 times faster in the span of four years, it's hard to muster much enthusiasm for a modest 1.3× or 1.4× iterative improvement in your PC's performance over the same time.

I've been slogging away at this for a while; my current PC build series spans 7 years:

The fun part of building a PC is that it's relatively easy to swap out the guts when something compelling comes along. CPU performance improvements may be modest these days, but there are still bright spots where performance is increasing more dramatically. Mainly in graphics hardware and, in this case, storage.

The current latest-and-greatest Intel CPU is Skylake. Like Sandy Bridge in 2011, which brought us much faster 6 Gbps SSD-friendly drive connectors (although only two of them), the Skylake platform brings us another key storage improvement – the ability to connect hard drives directly to the PCI Express lanes. Which looks like this:

… and performs like this:

Now there's the 3× performance increase we've been itching for! To be fair, a raw increase of 3× in drive performance doesn't necessarily equate to a computer that boots in one third the time. But here's why disk speed matters:

If the CPU registers are how long it takes you to fetch data from your brain, then going to disk is the equivalent of fetching data from Pluto.

What I've always loved about SSDs is that they attack the PC's worst-case performance scenario, when information has to come off the slowest device inside your computer – the hard drive. SSDs massively reduced the variability of requests for data. Let's compare L1 cache access time to minimum disk access time:

Traditional hard drive
0.9 ns → 10 ms (variability of 11,111,111×)

Solid state drive
0.9 ns → 150 µs (variability of 166,667×)

SSDs provide a reduction in overall performance variability of 66×! And when comparing latency:

7200rpm HDD — 1800ms
SATA SSD — 4ms
PCIe SSD — 0.34ms

Even going from a fast SATA SSD to a PCI Express SSD, you're looking at a 10x reduction in drive latency.
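
As a quick sanity check on those ratios – nothing more than the arithmetic, using only the numbers quoted above:

#include <stdio.h>

/* Back-of-the-envelope check of the latency ratios quoted above. */
int main(void) {
    double l1       = 0.9e-9;   /* L1 cache access, seconds        */
    double hdd      = 10e-3;    /* worst-case spinning disk access */
    double sata_ssd = 150e-6;   /* worst-case SATA SSD access      */

    printf("HDD variability: %.0fx\n", hdd / l1);        /* ~11,111,111x */
    printf("SSD variability: %.0fx\n", sata_ssd / l1);   /* ~166,667x    */
    printf("reduction:       %.1fx\n", hdd / sata_ssd);  /* ~66x         */

    printf("SATA SSD vs PCIe SSD latency: %.1fx\n", 4.0 / 0.34);  /* ~11.8x, call it 10x */
    return 0;
}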

Here's what you need:

These are the basics. It's best to use the M.2 connection as a fast boot / system drive, so I scaled it back to the smaller 256 GB version. I also had a lot of trouble getting my hands on the faster i7-6700k CPU, which appears supply constrained and is currently overpriced as a result.

Even though the days of doubling (or even 1.5×-ing) CPU performance are long gone for PCs, there are still some key iterative performance milestones to hit. Like mainstream 4k displays, I believe mainstream PCI express SSDs are another important step in the overall evolution of desktop computing. Or its corpse, anyway.


Spectacular New Horizons photo of Pluto's hazes and mountains: How it was made by The Planetary Society

Today, New Horizons released a stunning new image of Pluto's backlit mountains and hazes. I explain how the image was taken with its Ralph Multispectral Visible Imaging Camera.

Searching for the Origins of Earth’s Water by The Planetary Society

Three recently proposed low-cost space missions all aim to answer the same question: Where did Earth's abundant water come from?

September 17, 2015

Jimmy Carter's Cinema Choices. by Feeling Listless

Film As you know I like a good, massively depthful and time consuming annotation project. So here's a very good, massively depthful and time consuming annotation project. Matt Novak has worked his way through Jimmy Carter's presidential diaries and made a big long list of all the films that the President watched:

"It seems like Carter would watch anything and everything, with over 400 movies screened at the White House and Camp David while he was in office. Some of the screenings were private affairs with just the President and First Lady. Other times a movie was that night’s entertainment for guests at the White House. An April 30, 1979 screening of the Ingmar Bergman film Autumn Sonata notes that there were “approximately 48 members of the White House staff” on hand to watch."
Of course it's worth reminding younger viewers that in order to view these films in the late 70s and early 80s he would have had to have viewed an original print. The first film he apparently saw after his inauguration was All The President's Men, which is, of course, hilarious. But the title at least of his final film more than beats it.  I'd say if you wanted a pretty varied, cineliterate list to work through, this would do the job, albeit a few decades out of date.

Brief commercial interlude by Charlie Stross

Because I get asked this a lot ... the US Audiobook edition of "The Annihilation Score" has finally materialized. (Actually, it dropped a week or so ago but I've been flailing around in the guts of a novel and too busy to mention it.) You can find the Amazon Audible page here; if you know of different platforms/vendors feel free to add a comment.

No, there is no UK/EU audio edition. Nor are there other new UK audio editions of my books. Here's the (lengthy) explanation of why this is so.

Because Amazon hates your wallet, US customers can now buy an ebook bundle of all six Laundry Files novels. This includes "The Annihilation Score". (Yes, part of the high price is a side-effect of TAS still being priced against the hardcover. It'll get cheaper next June when the paperback comes out ... then shoot right up again a month later when they add "The Nightmare Stacks" to the bundle.) There's no UK bundle option yet, but I'm going to suggest it to the folks at Orbit.

If you're a completist, you may also want to add Three Tales from the Laundry Files — an ebook-only bundle consisting of "Overtime", "Equoid", and "Down on the Farm" (which are not collected in any other volumes so far). (UK customers should go here).

Isabel Hardman on the Twitters. by Feeling Listless

Social Media I might not agree with everything Isabel Hardman says, but I always find her contributions valuable, at least in terms of offering a verifiable direction as to where my own political compass is.

Although she writes for The Spectator, which is in so many ways anathema to me, she has an even hand and doesn't feel as ideologically guided as some of their other writers.

Now here she is on Medium talking about just how much of a pain that is:

"Those in a political tribe are the least able to judge what is “balanced” or what valid questions are. That is why the stream of angry tweets every time one of us writes an article about any political party or politician don’t make any difference to the journalists reading them, other than to foster a sense that Twitter is just becoming unbelievably tedious. The partiality of those complaining that we’re prioritising something they they consider tittle-tattle is so easy to spot. Often they seem only to have read pieces that they know will make them angry, rather than the ones that paint their opponents in an unfavourable light too. Write a piece about Jeremy Corbyn not singing the national anthem and you’ll receive hundreds of messages asking how you dare do such a thing. Write a piece about the Tory changes to tax credits, or disability benefit cuts, or countering Isil, and it’s as though they were never published. The mob is busy elsewhere."
Twitter has become really quite tedious and certainly not as much fun as a networking tool as it used to be. A lot of people who I used to follow/talk to have, I suspect, migrated to the walled garden of Facebook, a place which, like tumblr, I barely understand or like that much.

Ultra sonic anemometer plausibly solved on an Arduino by Goatchurch

Following on from last week’s episode and Emil’s sterling work reverse engineering the HC-SR04 echo location board, JD tapped into a couple of pins on the surface mounted microcontroller so we could get the echo signal directly.


I’ll try to report this as straightforwardly as possible without too many diversions about how we worked it all out.

With two HC-SR04s facing one another, and as per the datasheet, the scope tracing is as follows:
Cyan is the trigger to device A, Magenta is the 40kHz 8-cycle wave generated by device A (delayed to allow the MAX232A chip to charge up), Yellow is its corresponding echo pin which is programmed to go high once the 8 cycle signal goes out, and Blue is the received signal in device B. [The corresponding wave generated by device B is received at the same time by device A, which is programmed to take its echo pin (yellow) low.]

Zoomed in on the initial cycles to see that there are 8 of them and that their wavelength is about four to a division, or about 25 microseconds each, corresponding to 40kHz. The transducers probably resonate at this frequency (to give a better energy-to-sound conversion), which is why so many more waves are received than sent out.

This is the received signal, overlaying the half-second of samples (about 10, because I’m repeating the loop with a delay of 50 milliseconds), proving that it’s pretty stable.

By zooming in on a single wavelength and averaging over the previous 64 we can make it dance with the application of the fan.

Device B (the receiver) is the one by the oscilloscope. By directing the fan along the sound vector we’re able to blow back the sound signal by approximately half a wavelength, or about 12 microseconds.

I’ve spent hours playing with this setup, but it’s only going to be any good if we can read the values using a microcontroller, like an Arduino.

After a certain amount of looking at the pulseIn() function and hard-coding the pin ports, we have code that is as follows.

First this is the global array of 16 time measurements:

unsigned short ws[16]; 

We are going to log each event in this array by counting loops, and use the fact that an unsigned short wraps around at 65536, which provides a very useful way to trap for overflows in our loops so we don’t crash. See below that we bail out when (++c == 0) in the loop.

Now set the trigger pins on both HC-SR04s HIGH for 10 microseconds, as required in their instructions.

PORTB &= 0b11111100;    // both trigger pins low
PORTB |= 0b00000011;    // both trigger pins high
delayMicroseconds(10);  // hold the trigger high for the required 10 microseconds
PORTB &= 0b11111100;    // both trigger pins low again

Now the loop to detect the 8 cycle signal (in magenta):

unsigned short c = 0;      // 2 byte counter
unsigned short* pws = ws;
do {
    do { if (++c == 0)  return; } while ((PIND & 0x04) == 0); // wait while pin 2 is LOW
    do { if (++c == 0)  return; } while ((PIND & 0x04) != 0); // wait while pin 2 is HIGH
    *pws++ = c;            // record the loop count at this falling edge
} while (pws != (ws+8));

And finally the loop (running on from the above) to detect the first 8 received cycles (in blue):

do {
    do { if (++c == 0)  return; } while ((PIND & 0x40) == 0); // wait while pin 6 is LOW
    do { if (++c == 0)  return; } while ((PIND & 0x40) != 0); // wait while pin 6 is HIGH
    *pws++ = c;            // record the loop count at this falling edge
} while (pws != (ws+16));

Let’s have a look at that first do-while block in assembly code, because something amazing was done by the compiler:

     ldi  r30, 0x16  ; ws -> Z-register (low byte)
     ldi  r31, 0x01  ; ws -> Z-register (high byte)
     ldi  r24, 0x00  ; c = 0 (low byte)
     ldi  r25, 0x00  ; c = 0 (high byte)
166: adiw r24, 0x01  ; increment c by 1      (2 cycles)
     breq .+54       ; branch if c == 0 to 1a0 (location of return statement) (1 cycle)
     sbis 0x09, 2    ; skip if bit in I/O register is set !!!! (1 cycle)
     rjmp .-8        ; jump to 166 (2 cycles)

16e: adiw r24, 0x01  ; increment c by 1
     breq .+46       ; branch if c == 0 to 1a0 (return statement)
     sbic 0x09, 2    ; skip if bit in I/O register is clear !!!!
     rjmp .-8        ; jump to 16e
     st Z+, r24      ; *pws = c; ++pws (low byte)
     st Z+, r25      ; *pws = c; ++pws (high byte)

     ldi r18, 0x01   ; (ws+8) high byte
     cpi r30, 0x26   ; (ws+8) low byte
     cpc r31, r18    ; compare with carry high byte (how to compare two bytes consecutively)
     brne .-28       ; branch not equal to 166
1a0: ret

I have no idea how the compiler got clever enough to discover those SBIS and SBIC commands, which can only be applied because it could see that only one bit is set in the (PIND & 0x04) statement (bit 2, in fact), but it’s amazing.

What this means is the loop is only 6 processor cycles long, or 6/16 = 0.375 microseconds, which is a pretty good resolution to be working with.

Unfortunately, because we measure the outgoing signal with the same precision as the incoming one, the resolution on the measured travel time of the sound doubles to 0.75 microseconds.

The only way to get back to the 0.375 precision is if we bypass the HC-SR04’s microcontroller and generate the wave from the MAX232A directly, because then we’ll know exactly when it’s going out.

Now to look at the output, by running the function that shows us the timing differences:

Serial.print(ws[0]);                 // count to the first edge after the trigger
for (int i = 1; i < 16; i++) {
    Serial.print(" "); 
    Serial.print(ws[i] - ws[i-1]);   // gap between successive edges, in loop units
}
Serial.println();

We get lines like:

721 67 67 66 67 67 67 67 4625 67 66 67 65 65 65 64 
722 67 67 66 67 67 67 67 4623 67 66 66 66 66 65 65 
719 66 67 67 67 67 67 66 4624 67 66 66 66 65 65 65 

which are pretty stable.

But do the numbers add up?

Well, 67 loops amounts to 67*6 = 402 processor cycles which, for a processor running at 16 megahertz, comes to 25.125 microseconds per cycle of the wave, or 39801 cycles per second, which is within 0.5% of exactly 40kHz.

The travel time between the last impulse being sent to the transponder and the first wave being received is 4625*6/16 = 1734 microseconds, which is the time it takes sound to travel 59cm. The distance between the bottoms of the two cans is 58cm.

Now, it probably takes a few waves to get the transponder up to power, and I note that the echo signal (the yellow line) in the oscilloscope trace begins after a slight pause from the last of the 8 waves. This could easily be a tuning issue.

Anyway, this is what happens when I graph that middle number while turning the fan on and off:

The vertical jumps are about 67 units when the microphone decides to skip the first few waves, probably because they've been weakened by the noise from the fan. It's a bit of a mess.

A much better result is if we plot the result as points several times, shifting in Y by multiples of 67 units. Then you can see how it's supposed to line up, if we got the wave front right.
The difference is about 29 units, which means the sound arrived 29*6/16 = 10.9 microseconds earlier than its usual 1734 microseconds of travel in stationary air (a shift of 29/4625, or about 0.6%), meaning roughly 2m/s for the air from the fan.
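
For what it’s worth, here’s a sketch of the unit conversions used above (the helper names and constants are my own; it assumes the 6-cycle polling loop on a 16MHz clock and roughly 343m/s for the speed of sound in air):

// Conversions for the loop-count measurements above (assumptions: 6-cycle
// polling loop on a 16MHz AVR, speed of sound about 343 m/s).
const float CYCLES_PER_LOOP = 6.0;
const float CLOCK_HZ        = 16000000.0;
const float SOUND_M_PER_S   = 343.0;

float loops_to_seconds(unsigned short nloops) {
    return nloops * CYCLES_PER_LOOP / CLOCK_HZ;     // 1 loop = 0.375 microseconds
}

// Still-air gap of 4625 loops -> about 1734 microseconds -> about 0.59m of travel.
float path_length_m(unsigned short gap_loops) {
    return loops_to_seconds(gap_loops) * SOUND_M_PER_S;
}

// A shift of delta loops on a still-air gap of gap_loops is a fractional change
// in travel time, which for small shifts is the same fraction of the speed of
// sound blowing along the axis.  wind_speed_ms(29, 4625) is about 2.1 m/s.
float wind_speed_ms(short delta_loops, unsigned short gap_loops) {
    return SOUND_M_PER_S * (float)delta_loops / (float)gap_loops;
}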

So this is all looking very much like I could build something good here, subject to the problem of guessing the correct wavefront among the multiple coming through, but that's something I feel I can bodge in software. It would help if I used a shorter gap between my transponders.

It also seems plausible that I could make a 3D anemometer from this. It could be done from a single 5V Arduino Mini if I'm careful with the pin count, even while controlling the microphones directly. It's all good fun.

The final result would be a cheap contraption (about £30 in parts) that I could bolt on to a spar poking out of the nose of my glider in order to directly measure the airspeed, glide angle and side-slip, which would be pretty neat.

In the meantime I've determined to shelve this project and get back to some of the triangular machine tool stuff.

I also haven't done anything productive with any of the data I have collected so far from the glider, so I feel I should do that before I allow myself to build any more devices like this sort of thing.

Accessing a Jekyll site over your local wifi network by Zarino

Jekyll is a simple framework for generating static websites and blogs. Static websites are all the rage these days because they load way faster than dynamically generated sites (eg: sites built on WordPress or Rails) and they can be hosted entirely for free at places like GitHub Pages.

Jekyll and Hyde

When you’re developing your Jekyll site, you can test it out locally using Jekyll’s built-in server:

$ cd ~/my-site
$ jekyll serve
Configuration file: /Users/zarinozappia/my-site/_config.yml
            Source: /Users/zarinozappia/my-site
       Destination: /Users/zarinozappia/my-site/_site
      Generating... done.
    Server address: http://127.0.0.1:4000/
  Server running... press ctrl-c to stop.

And ta-da! Your site is accessible in a web browser at http://127.0.0.1:4000 or http://localhost:4000 (because “localhost” is an alias for “127.0.0.1” in most people’s hosts file).

But what about accessing the site from other devices on your network?

These days, you’d be mad to only test a website on the computer you developed it on. Mobile devices like phones and tablets account for over 50% of web traffic, and chances are you already have one in your pocket that you could test with.

The problem is, just connecting your iPad to the same wifi network as your Mac and assuming that will work isn’t enough.

Even if you substitute in your Mac’s Bonjour name (like zarinos-mac.local) or its local IP address on the wireless router’s network (something like 192.168.0.x), it still won’t work.

That’s because, even though the web request is being routed to your Mac, the Jekyll command line server only responds to requests for 127.0.0.1, so it doesn’t grace your iPad with a reply, and you get an error like “Safari cannot open the page because it could not connect to the server”.

Dr Jekyll says: Cripes old fellow, what a MONSTROUS predicament

Solution: Tell Jekyll which hostname to respond to

Once you’ve twigged what’s going on, it’s quite simple: Just tell Jekyll to respond to all incoming requests, rather than only those aimed explicitly at 127.0.0.1.

IPv4 has a nice shortcut for this: the magic IP 0.0.0.0, which acts as an alias for any IP destination on the local machine – whether that’s a request from an external device (via your local network IP or zarinos-mac.local, for example), or from your local device back to itself (via 127.0.0.1 or localhost). They all end up at 0.0.0.0. (Thanks @davea for the tip!)

So, just give that to Jekyll using the --host command line flag and you’re done:

$ jekyll serve --host 0.0.0.0
Configuration file: /Users/zarinozappia/my-site/_config.yml
            Source: /Users/zarinozappia/my-site
       Destination: /Users/zarinozappia/my-site/_site
      Generating... done.
    Server address: http://0.0.0.0:4000/
  Server running... press ctrl-c to stop.

Now, when other devices send HTTP requests to your local IP address (something like 192.168.0.x) or your Mac’s Bonjour name (zarinos-mac.local), Jekyll will respond, and you’ll be able to test your Jekyll site out on your iPhone or iPad, over wifi, without having to deploy it somewhere public first.

And because 0.0.0.0 also catches 127.0.0.1, Jekyll will still respond to requests for 127.0.0.1 or localhost, like normal. So you get the best of both worlds.
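
One more option, assuming your version of Jekyll lets you set serve options in the config file (recent versions do, but it’s worth checking against the one you’re running): put the host into _config.yml so you don’t have to type the flag every time:

host: 0.0.0.0

With that line in _config.yml, a plain jekyll serve binds to every interface, and --host on the command line still overrides it if you pass one.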

Mr Hyde growls: Hey babe, fancy some cross-browser testing?

Even if you don’t need to test your sites on hardware devices like phones and tablets, this trick works exactly the same for virtual machines – like the ones Microsoft provides to run old browsers like IE7, 8, and 9 in a virtual version of Windows XP.

A VM is normally part of your local network just like any other device. Boot it, fire up Internet Explorer, and type in the host’s IP address. Boom!

Feeding on the Cosmic Web by Astrobites

Title: Localized starbursts in dwarf galaxies produced by impact of low metallicity cosmic gas clouds
Author: J. Sanchez Almeida, B.G. Elmegreen, C. Munoz-Tunon, D. M. Elmegreen, E. Perez-Montero, R. Amorin, M.E. Filho, Y. Ascasibar, P. Papaderos, J.M. Vilchez
First author’s institution: Instituto de Astrofisica de Canarias, La Laguna, Tenerife, Spain
Status: Accepted for publication in Astrophysical Journal Letters.


Galaxies can grow in two ways. They can collide in violent mergers that combine the mass of two galaxies. Mergers turn orderly spiral galaxies into jumbles of stars called elliptical galaxies. Just such a merger is in store for our Milky Way and the Andromeda Galaxy in 4 billion years. Isolated galaxies grow in a more gentle fashion, feeding on inflowing gas from the cosmic web. The cosmic web has been described by simulations of the dark matter and gas in the universe (Figure 1). The inflowing gas is difficult to observe directly; it is cold and has not yet formed stars. This pristine gas reservoir is studied indirectly by using distant quasars as lighthouses to probe the gas in front of them.


Figure 1. A simulation of dark matter shows a lattice of filaments separated by voids: the cosmic web. The inset shows a 10 million light year region around gas filaments. Galaxies may grow by pulling in gas from such filaments. Credit: Anatoly Klypin and Joel Primack; insert: S. Cantalupo

Today’s paper presents observations of a handful of small galaxies to shed light on how infall of gas from the cosmic web can trigger the formation of new stars.

Measuring the Metals

The authors focus on ten small galaxies, known to contain very few metals. In astronomy, all elements heavier than hydrogen or helium are called metals. Metals are formed in stars, so the amount of metals in a galaxy represents the amount of star formation that has taken place. Thus, these galaxies have undergone very little star formation. The authors took new spectra of these galaxies to see if the amount of metals varies from point to point within each galaxy. Each galaxy’s light is dominated by a single region of active star formation, called a starburst. In Figure 2, the amount of metals in the starburst is much lower than the metal abundance in the rest of the galaxy. The same is true for the other galaxies. Because of its low metal abundance, the gas in the starburst has not experienced as much prior star formation as the rest of the galaxy. This discrepancy in metal abundance between the regions of most active star formation and the rest of the galaxy can be explained by infall of metal-poor gas from a nearby filament.


Figure 2. Top: Image of one of the galaxies studied. The bright blue dot is a star forming region. Bottom: Logarithmic metal abundance (red squares) and the density of the star formation rate (blue line) along the galaxy. The distance increases in the direction of the red arrow in the top panel. The star forming region has much lower metal abundance than the rest of the galaxy.

Fed by a Cosmic Filament?

Because of the small size of the galaxies, their rotation will quickly mix their gas, and the metal abundance should become constant across each galaxy. In order to maintain its lower metal abundance, the starburst must be tapping into a reservoir of gas with few metals. The infall of pristine gas from a filament is a natural explanation for the observed metal content. Such gas is part of the primordial cosmic web, and has not been enriched in metals by star formation. In this scenario, a gas clump in a filament collides with the galaxy and compresses its gas, triggering gravitational collapse and star formation in the diluted gas. These observations provide the best evidence yet that galaxies can grow by gas infall along filaments.

Orion Enters Fabrication Phase, but Possible Launch Slip Looms by The Planetary Society

NASA's Orion spacecraft has officially moved from preliminary design to fabrication, but the agency says the first crewed flight of the vehicle could slip two years, from 2021 to 2023.

Blue Origin to Launch Rockets from Cape Canaveral by The Planetary Society

Spaceflight company Blue Origin plans to launch its orbital rocket from Cape Canaveral, Florida, the company revealed Tuesday.

September 16, 2015

In Our Time: The Angry Years. by Feeling Listless

Radio Having embarked on a project to listen my way through the In Our Time podcast from Radio 4, the last thing I expected to hear was just how argumentative the programme was back when it was first broadcast in 1998. For just over the past decade, it's generally been a kind of encyclopedia in discussion format, with subjects broad and narrow explained in as much detail as is possible in half an hour by three academics refereed by Melvyn Bragg.

But the first series is markedly different, much closer to Start The Week, of which Melvyn had just completed a year's stewardship. In 1998/9 a guest academic with a new book out would be invited on to what was essentially a PhD supervision meeting with someone else in their field and Melvyn, whose style was far more analytical and journalistic back then. From what I've heard so far the discussions were often intriguingly fractious, with old scores bubbling to the surface. Here are some highlights.


Dr Helena Cronin vs Dr Germaine Greer.

There are few things in life more entertaining than hearing Greer in full flow and on point destroying the argument of someone who she disagrees with, and so it is with Cronin and her Darwinian theory about women in society, which is essentially that men and women have evolved to be capable of different things and that society should get around to accepting that. Things begin well when Greer, having heard Cronin's initial argument, gives a very lengthy, very telling pause. What follows is twenty minutes of evisceration in which Germaine more than lives up to her name. Utterly thrilling and also likely to promote a goodly amount of righteous indignation, notably when Cronin suggests that women are predisposed not to be geniuses...

Modern Culture

Roger Scruton vs Will Self.

Oh yes. I think you can imagine which way this goes and it's a typical example of how Melvyn and his guest would essentially tag team as they pick holes in the other guest's work. On this occasion, Scruton's selling his monograph An Intelligent Person’s Guide to Modern Culture which, if their discussion is anything to go by, ignores all popular music but for Nirvana and Oasis, film (something Bragg is especially distressed about) and modern technology in general. Admittedly for much of the duration it's good natured sniping, but there are moments when Scruton sounds like he wants to leave because he knows he's on to a loser. Self is superb on how the artificial walled gardens between high and popular culture are increasingly designed for no other reason than to make intellectuals feel better about themselves.

The Avant Garde's Decline and Fall in the 20th Century

Professor Eric Hobsbawm vs Frances Morris.

Bit of a surprising one, this. Morris was then a specialist in contemporary art and Art Programme Curator for the Tate Gallery of Modern Art, and so obviously has an interest in defending the avant-garde, something which Hobsbawm is keen to indicate is merely the process of watching the decline of a once great cultural force, suggesting that film is by far the more vibrant and important visual art of the twentieth century. Unlike the other two I can see this from both sides, mainly because comparing Gone With The Wind and Guernica seems like an essentially pointless exercise. Everything begins relatively genially and then Hobsbawm starts referring to Morris patronisingly as "my dear", which gets her back up, with good reason. It's an age thing, but it doesn't half start making you want to side with her.

Shakespeare and Literary Criticism

Professor Harold Bloom vs Jacqueline Rose.

There are few things more entertaining than when Melvyn scoffs and there's a classic one in here when Bloom suggests that Freud's Oedipal complex was more heavily influenced by Hamlet. "Do you have proof of that?" Bragg asks and we then hear a superb example of how the arguments some academics have around a subject are too complex to articulate in a few sentences, whichever side of the fence of correctness they're on. Worth listening for the moment when Rose utterly destroys Jane Eyre for any reader with a social conscience and Bloom proposes an utterly barmy new approach to teaching literature in universities.

Architecture in the 20th Century

Daniel Libeskind vs Richard Weston

Actually pretty good natured even though the participants are on opposite ends of the architectural belief spectrum. The real meat is in how Bragg and Weston attempt to describe Libeskind's buildings for the radio, notably the Spiral Extension to the Victoria and Albert Museum which wasn't subsequently built.  Most of it isn't anything the architect hasn't heard before, but there is still one moment which made me "oof" as I was walking around Sainsbury's.  As with all these discussions, they're time sensitive and there's plenty of talk about where architecture is in the 20th century and where it will be going in the next couple of decades.

On Demand Services and (Mis)Understanding Vertical Integration by Albert Wenger

While thinking about my talk at the On Demand Conference today in New York, I stumbled upon a tweet which quotes a Bloomberg piece on Baidu’s Robin Li (the CEO):

To Li, the math is simple. While search advertising is big, the services and retail market is much bigger. The initiative will “definitely” eclipse search revenue over time, he said. In cities like Beijing, the vision is already taking shape. Click an application on your smartphone and someone will stop by your home to wash and walk your dog. A few more taps and someone will deliver groceries or medicine.

This of course sounds a lot like the on demand services that entrepreneurs are building here in the US.

The basic logic here appears to be as follows: you could either operate a search engine and then hand off a customer to a service provider collecting either nothing (organic) or an advertising fee (paid search). Or you could integrate backwards vertically and provide the entire service (or sell the good) directly. Obviously your revenues will be much bigger. Your profits will also be bigger as an absolute number, but you may *not* be more profitable.

When it comes to profitability it is important to keep in mind that this is a ratio measure of profits over capital, effectively the return on capital (lots more detail to this, such as capital structure, but good enough as a first approximation). Now whether or not you can increase profitability by vertically integrating depends on whether you are a more “profitable” owner than the alternative owners (with the base case being separate businesses). 
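
A hypothetical illustration of that ratio point – the numbers below are invented for the example, not from the post: absolute profit can go up while the return on capital goes down.

#include <stdio.h>

/* Hypothetical numbers, purely to illustrate profit vs profitability. */
int main(void) {
    double search_profit = 100, search_capital = 200;           /* search business alone         */
    double integrated_profit = 150, integrated_capital = 500;   /* after integrating the service */

    printf("search only: %.0f profit, %.0f%% return on capital\n",
           search_profit, 100 * search_profit / search_capital);             /* 100, 50% */
    printf("integrated:  %.0f profit, %.0f%% return on capital\n",
           integrated_profit, 100 * integrated_profit / integrated_capital); /* 150, 30% */
    return 0;
}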

And the answer to that question hinges on many factors. One that weighs heavily though is the competitiveness of that market. The more competitive it is, the less likely that vertical integration will increase your profitability. Here is an interesting example: book stores and Amazon. Why did it make sense for Amazon to actually become a book retailer? Because book retail is not very competitive. And the reason it isn’t is that book stores generally don’t set their own prices, which are instead determined by publishers. And each publisher has a monopoly (by virtue of copyright) on the titles it publishes. So it turns out that vertically integrating let Amazon access some of those additional economics, and many of the subsequent legal battles with publishers were over just that!

Now think about something like laundry. In a city like New York you don’t have to walk more than a few blocks to find a dry cleaner. The pricing for laundry and dry cleaning is quite competitive as a result. Hence access to “rents” (excess profits) to increase overall profitability is not really a valid reason for vertical integration. The same would seem to be true for many other on demand services.

That doesn’t mean that there aren’t other valid reasons, largely having to do with contractual incompleteness and the need for added coordination in an on demand system (see the Paul Joskow paper for a great summary of the economic literature on vertical integration). It is super important though to keep in mind that you are actually *buying* that increased coordination. It comes at a cost and, if anything, as a result reduces overall profitability; it may also negatively impact motivation, as I described in my post on the trade-offs between firms, markets and networks.

So as you start, build or invest in an on-demand business that is vertically integrated, make sure you think long and hard about the existing and future economics of the business. The benefits from increased coordination have to outweigh their cost, which is largely a question of how inelastic the demand is for “on demand” – if you cannot add a significant price premium over the standalone base service then you could find yourself in trouble.

Triggered fragmentation in self-gravitating disks by Astrobites

Title: Triggered fragmentation in self-gravitating discs: forming fragments at small radii
Author: Farzana Meru
First author’s institution: Institute of Astronomy, University of Cambridge, UK.
Status: Accepted for publication in MNRAS.


Where do (the most) massive planets come from?

The story seems to be clear. The evidence is overwhelming that most planets originate from agglomerations of small dust particles, also known as the core accretion model. A short version of the story is summarised in this astrobite. Using modern versions of the core accretion scenario, computer simulations manage to form massive planets at large orbital radii beyond 10 AU and more.

However, the older theory of the origin of massive planets is called disk fragmentation (or global gravitational instability; in the astrophysical literature this term is sometimes confused with the settling of dust to the mid-plane of the disk, which is why we will stick to disk fragmentation). The idea, which is nicely summarised in this astrobite, is that the initial massive disk undergoes a collapse due to its own gravity and fragments into clumps. A video belonging to the simulations we will talk about later probably helps a lot to clarify what is going on.

Courtesy of Farzana Meru.

(Just as a side note, I personally very much admire the aesthetic and beauty of this theory, as it easily produces mesmerizing global disk insights like the video above or this, this or this.)

The problem which is hard to neglect is the fact that disk fragmentation rarely forms planet “clumps” in a realistic mass regime (meaning up to several Jupiter masses at maximum, not tens of Jupiter masses) and usually these clumps tend to migrate inwards extremely fast, such that they will be swallowed by the central star pretty soon.

“Do not disturb my circles”

Again, one thing seems to be clear here. If disk fragmentation operates on a level producing planet-like objects within the very first stages of protoplanetary disks, this happens in the outer regime at several tens of AU away from the central star. Only out there is the disk cold enough for the gas to collapse.

Therefore, we have two distinct regimes. Planet formation in the inner disk up to ~10 AU seems to be clearly dominated by core accretion processes. Disk fragmentation does not operate here because the disk is too hot, owing to the intense radiation from the star. Planet formation in the outer disk is harder to explain. Very far out (> 30-40 AU) disk fragmentation could operate as explained before. However, we are left with a huge gap in the intermediate range from ~10-30 AU. In this range the disk density is likely too low to feature dust densities high enough for particles to stick to each other – even if core accretion modelers have recently found ways to circumvent this problem. For disk fragmentation, though, the conditions also do not seem right, as the combination of intermediate temperatures and gas densities does not usually reach the crucial values (remember, we need the disk to be cold and massive).

Fragmentation – inside out, or outside in?

As computer simulations become more and more sophisticated (btw, read this bite, I seem to link to it every time), physical properties of seemingly simple physical systems change and emergent phenomena appear. In particular, the emphasis of today’s study is on the effect that clumps already induced via fragmentation have on the other parts of the disk. This has rarely been done before, as the initial conditions were mostly inappropriate to achieve the correct order of fragmentation. For example, isothermal models (without cooling or heating) tend to artificially enhance fragmentation and hence such disk models produce clumps in the inner parts, which is unrealistic.

Figures 1 and 2 show the subsequent evolution of the disk and the formation of two distinct clumps in the disk, which are heavier than ~1 Jupiter mass.


Figure 1 (Meru, 2015) : Velocity (left) and density (right) distribution within the disk, before the 2nd fragment has formed. The black and turquoise circles represent the star and the formed clumps in the disk, respectively. The presence of the outer fragment increases the velocity of the gas at the position of the arrow, which then migrates rapidly inwards. This enhances the gas density in the inner spiral arm and… (see Figure 2)


Figure 2 (Meru, 2015):  (see Figure 1) … triggers the fragmentation of the spiral arm to form a 2nd clump!

As the above images show, the presence of a massive clump in the outer part of the disk triggers a clump in the inner part of the disk, at a position where it could not form alone. Normally, this part of the disk does not feature the right conditions, but these get enhanced by the dynamics of the 1st clump. The question now is whether this is purely an effect of the clump or whether the phenomenon would also occur without the 1st clump. As Figure 3 indicates, it is indeed only the effect of the clump and is not due to the general disk dynamics.


Figure 3 (Meru, 2015): Normalised radial velocity in the disk, shortly before the formation of the 1st (solid line) and 2nd (dotted line) clumps. The dashed line shows the radial velocity for the time of the dotted line, but in a version where there has been no 1st clump at all. This means, the gas just outside of the innermost spiral arm in Figure 1 does not reach high velocities toward the central objects and hence does not trigger fragmentation in there.

What does all of this tell me?

Firstly, planet formation is a highly complex process and the story is not always as clear as people make us think. Especially in the intermediate disk regime, the community is not clear on how to produce planet-like bodies. Several mechanisms, which work in principle, have been proposed. Fortunately, some day in the not-so-far future we will be able to distinguish which version could possibly dominate the planet formation process in the intermediate disk range, as the outcomes from core accretion (aka pebble accretion) and disk fragmentation are wildly different. The former favors planetary objects of the size of Neptune, whereas the latter most often produces planets which are way more massive, up to tens of Jupiter masses.


Disclosure: The author of this article was once a member of the research group I am currently working in. However, I was never involved in this study at any point.

Apply to Write for Astrobites! by Astrobites

astrobites21Astrobites is seeking new graduate students to join the Astrobites collaboration.

Please share the information below with your graduate student colleagues. Applicants must be current graduate students. The deadline for applications is 1 November. Please email if you have any questions. Come join us!

Application Details

  • Deadline: 1 November
  • Required information: A sample astrobite post and two short essays, which can be submitted at
  • All graduate students in astronomy and astrophysics may apply. We aim to cover a wide variety of research topics from a diverse set of perspectives. Individuals from underrepresented groups are especially encouraged to apply. If you are passionate about sharing the latest research in astronomy and astrophysics and enjoy writing, we want to hear from you!
  • Astrobites is a volunteer collaboration and does not offer compensation. The benefits of joining our team include public outreach (disseminating journal articles to a wider audience) and professional development (reading papers, writing, and editing).
  • Applications are reviewed anonymously. Please do not include your name, affiliation, or other identifying information in your short essays or sample astrobite.


Sample Post Guidelines

Your astrobite should summarize a published astronomy or astrophysics journal article that is at least three months old and has not been featured on astrobites. Do not write about one of your own papers. Your post should discuss the motivation, methods, results, and conclusions of the paper. Title your sample astrobite as you would title a post on the site.

Write at a level appropriate for undergraduate physics or astronomy majors. Effective astrobites avoid jargon and thoroughly yet succinctly explain unfamiliar concepts. We encourage you to provide links to previous astrobites or other websites where appropriate. Your sample post should include at least one figure from the paper with an appropriate caption (not just the original caption). Astrobites are usually between 500 and 800 words in length, and your post should definitely not exceed 1000 words. We suggest you read a few published astrobites by different authors to get a sense of how posts are written.

Author Responsibilities

Authors write one astrobite each month and edit another author’s astrobite once per month.

Writing an astrobite typically takes 3-6 hours for experienced authors, which includes selecting and carefully reading a paper. It takes somewhat longer for new authors. Editing another author’s astrobite usually takes less than 30 minutes. Some of us also spend time contributing to the glossary pages, arranging for guest posts, representing astrobites at conferences, and maintaining the website. These activities are optional, so new astrobiters can choose how much time they would like to devote to the collaboration. Authors typically write for two years, but this can be adjusted on an individual basis.

The Hiring Process

The hiring committee will review submissions based on the quality of their sample astrobites and their responses to the two short essay questions. The names and affiliations of the applicants will be concealed from the hiring committee until after the final list of candidates is selected to promote equity. Successful candidates will be notified approximately one month after the application deadline.

Regardless of whether an applicant is selected to write for astrobites, submitted sample astrobites may be posted on astrobites as guest posts (with the applicant’s permission). We will notify applicants before posting their submissions.

How to Apply


Roundup of the September 11, 2015 New Horizons raw image release by The Planetary Society

Last Friday the Internet received its first post-encounter pile of goodies from the New Horizons flyby of the Pluto system.

Finding the Surveyor retro-rockets on the Moon by The Planetary Society

Planetary scientist Phil Stooke may have found the retro-rockets from NASA's Lunar Surveyor missions, sent to the Moon in preparation for Apollo.

In Pictures: A Partial Solar Eclipse from Space by The Planetary Society

Two sun-observing spacecraft in Earth orbit captured images of a partial solar eclipse Sunday morning.

September 15, 2015

How commodity is something? by Simon Wardley

tl;dr There are a number of different ways you can refine your maps but in most cases, it's rare that you should go beyond aggregated views. Most of the benefit can be gained by a group of people sitting around sharing and discussing the map

Everything (whether activities, practices, data or knowledge) evolves through the application of competition (supply and demand side). Each instance of an evolving act diffuses through a market over time and hence evolution can be seen as a series of diffusion curves (often many hundreds) with each diffusion curve having its own chasm

Those diffusion curves however have different markets and different time spans i.e. diffusion of the first phones was not the same as diffusion of later, more evolved, more mature phones. So when examining an act (A) which is evolving through ever more mature examples of the act (A1 to A5), you see a diffusion pattern that looks like figure 1.

Figure 1 - Diffusion of an activity A

At any moment in time in the market you may have many examples of the act in existence i.e. older phones to modern phones to the latest phone can all exist in the market at the same time. You cannot simply look at a market and see 50% of households having something and make a claim over how evolved it is. Whilst evolution contains many diffusion curves, evolution does not equal diffusion and you don't know which diffusion curve you're on.

However, there is a pattern to evolution which you can measure by examining the ubiquity and the certainty (i.e. maturity and completeness) of a thing. Unfortunately, the measurement can only be done for the past. Something has to become a commodity before its past can be measured accurately. This means that whilst I know the direction of travel of something (e.g. the genesis of an act becomes custom built becomes product becomes a commodity - see figure 2), I can not directly measure where something is but only where it was.

Figure 2 - Evolution

The reason for this is the certainty axis itself and the role of individual actors in the market. I can only say where something is with certainty once it becomes certain. The future unfortunately acts as an information barrier and until something has become certain (and I can therefore measure its past) I have to guess, unless I invoke some form of crystal ball.

Now this evolution curve provides the x-axis used in mapping. So, how do I know where to place something (e.g. activity A) on the x-axis if I can't measure it? (see figure 3).

Figure 3 - Where on the map is it?

There are a number of techniques you can use.

Ask yourself what this thing is.
The first step is to ask yourself is this thing :-
a) rare and poorly understood i.e. genesis?
b) uncommon and somewhat understood, normally provided by custom built examples often built by consultants?
c) quite common and reasonably understood, often provided as product or rental service through product vendors?
d) well understood and ubiquitous provided via utility services or as undifferentiated commodities?

If you take a number of people (say 2-4) with experience of the field then you can get a better answer by asking the group. This also helps when there are arguments, because that usually signals that the act consists of several components at different stages of evolution.

Look at the properties of the thing
The second step is to examine the characteristics of the thing. As it evolves its characteristics will change from the uncharted space to the industrialised (with a transitional space in between). See figure 4.

Figure 4 - Characteristics.

Hence look at the activity for example and ask yourself :- Is this rapidly changing? Is this a differential? Is this exciting? Is it stable? Does it seem chaotic? Is this well defined? Am I experimenting? etc.

Look at other maps.
Obviously everyone has bias, hence ideally you should use a group to determine how evolved something is. However, if you have several maps then you can use an aggregated view (i.e. a summary of common points on different maps) to determine how this thing should be treated. You do this by looking for clusters in order to remove bias (see figure 5).

Figure 5 - Aggregated View

The above graphic is a great way of removing duplication in an organisation and stopping groups from custom building the sort of thing which should be a commodity, e.g. user registration.

Look at publication types.
At this point I'd probably say you're going overboard. You don't need to get maps this accurate in order for them to be useful tools for organisational learning, collaboration, communication, scenario planning and strategic gameplay. However, if you want to go the extra mile then start examining the publication types (see figure 6).

Figure 6 - Publication Types

E.g. examine the frequency of different key publication types to determine roughly where you are. Please note, even this measure is rough until the act has become well defined (i.e. certain), in which case the volume of publication types can be used to determine where it was (past tense) on the certainty axis. However, you can use this as a weak signal; I mostly use it in anticipating future change.

NB. You have to be careful when examining publication type because certain words / phrases appear in multiple stages. For example people talk about platforms when describing a product (i.e. you can build on my product) and they also talk about platforms when describing utility services. The two are not the same.

At this point you're definitely going overboard and trying to create a perfect map. Don't bother, your map will change and you shouldn't spend more than a few hours to a day creating it. The only time you should be stepping into weak signals is when you're anticipating a very specific change and working out roughly when to attack a market.

There are a number of signals that can be used, for example when crossing a boundary you need to overcome inertia and that requires a number of factors to be in place - concept, suitability, technology and attitude.

For example in figure 7, we have an evolving act and every time it evolves from one stage (e.g. custom) to another (e.g. product) and hence crosses a boundary then there's inertia to the change created by the incumbent group. Hence the originator (for genesis to custom), consultants (for custom to product) and product vendors (for product to commodity) all resist a change that impacts their market. The size of the inertia barrier increases as the act becomes more evolved (and more established). 

Figure 7 - evolution and inertia.

In order to overcome an inertia barrier you need the four factors in place. The concept of providing the act in the next stage must exist. It has to be suitable (i.e. widespread and defined enough). The technology to achieve this must exist and finally there must be an attitude of acceptance for such a change by the customers. The latter is normally represented by the customers being dissatisfied with the existing arrangement.

You can go further into weak signals, for example the rate at which higher order systems can be built from lower components accelerates as the underlying systems become more industrialised. You can even use timing based upon common economic cycles such as peace, war and wonder (see figure 8). However, these are overkill for a simple mapping exercise and are more suitable when directing an attack.

Figure 8 - Economic cycles & speed of building higher order systems.

There are a number of different ways you can refine your maps but in most cases, it's rare that you should go beyond aggregated views. Most of the benefit can be gained by a group of people sitting around sharing and discussing the map.

My Favourite Film of 1981. by Feeling Listless

Film With the school exam results released in the past few weeks, I thought again, somewhat wistfully, of those days, though cautiously given that I was bullied mercilessly most of the time and didn't come away with anything like what might be described as decent exam results. But it's still possible for me to think of the good times, the aforementioned art classes (see the Adventures in Babysitting entry) and pre-GCSE English classes when a teacher took it upon himself to show feature films during English lessons for a year or two. One of those films was Raiders of the Lost Ark.

Perhaps it's important to contextualise this.  Some of the films were Shakespeare adaptations.  It's here I first saw - and giggled through, it has to be admitted - the Polanski Macbeth and the Zeffirelli Romeo and Juliet, my only recollection of that viewing being his boasting that part of it had been filmed at his house (something which to this day I've not been able to verify).  But some of his choices were rather more eclectic.  Even the afterlife of the Pythons made an appearance in the form of Jabberwocky and Time Bandits.

The television room at school was a tiny space which would otherwise have been used as an office for a head of department.  I can still smell the dust of the wooden varnished floors, and the chairs were those fairly standard 70s stacking examples with the tubular metal frames, as featured in this eBay entry.  The screen was a 26 inch DER colour CRT locked in a large television cupboard with a top-loading VHS player underneath.  All of the films we watched were off-air recordings with the adverts edited out via the pause button during broadcast where necessary.

Much of the time this English teacher would simply show the films.  But every now and then he'd throw in some rudimentary film studies, perhaps as a way of introducing us to how narratives work in preparation for future exam courses.  In Raiders of the Lost Ark, this meant the mechanics of the action sequences and character foreshadowing.  He'd pause the film now and then to highlight signposts.  How establishing shots emphasise the geography of the plane before Indy struggles against Pat Roach.  How Indy earlier notes his lack of appreciation for serpents before later being thrown into a pit full of snakes.

In that period just before the National Curriculum was introduced, when teachers didn't feel constricted about how to teach or indeed what to teach, I wonder what value any of this had other than to give us access to something which we might not otherwise have encountered, at least not in the context of serious discussion rather than pure entertainment.  Part of me wishes he'd been able to go further and still do.  Despite its presence in academia, film is still considered the poor cousin of literature despite, as I've since discovered, having the potential for just as much complexity to tax the young critical mind, through textual matters, themes, signs and meaning.

Product Land (Part 3) by Richard Pope


This is the 3rd and final part of an essay about design and possibilities.

The first part - You can't build what you can't think of in the first place - was about the process of design being too linear, taking inspiration from evolution and the concept of hyper-volumes of 'potential products'; the second part - Tools for exploring the margins - listed some approaches for thinking harder about the things that are possible in product design.

This final part is about power and about the obligations you now have if you make digital services in the 21st century.

Sampson diagram from The New Anatomy of Britain (1972) - largest bubbles are Civil Service, Industry, Parliament and Conservatives

The image above is from The New Anatomy of Britain by writer and journalist Anthony Sampson. He wrote a series of books on the subject of political power in Britain, published approximately every 10 years from 1962. Each included a diagram of what he considered the current state of play. The one above is from 1971, the one below is from the final book in the series, Who Runs This Place, published just before his death in 2004:

Sampson diagram from Who Runs This Place (2004) - largest bubbles are Media, Prime Minister and The Rich

I've always loved these diagrams (back at OpenTech in 2009 myself and Rob McKinnon used them to map civic tech projects).

Just like the biomorphs or 'History of the World' from part 1 of this essay, Sampson's drawings are attempts to help us think about a subject that is inherently multi-dimensional. They are a tool for thinking about a problem - in this case how power is distributed and, taken together over the years, how it can change.

What might Sampson have drawn today, in 2015?

Well, politics is about the distribution of power in society, and in the early 21st century digital products are exerting influence on how power is distributed among us.

Redrawn 11 years later it seems clear to me a Sampson diagram would have large bubbles for the big digital services.

Politics in the 21st century will, in part, be about control over the digital services we now rely on, and which hold an ever growing concentration of our personal and household data: how often we move (fitbit, jawbone) and where to (Google Play Services), what we tell people (WhatsApp, Facebook), and how often we burn our toast (Nest).

The same tight orbit that digital product design seems to be stuck in at the functional level (again, see part 1 of this essay) also exists at the organisational level: in the design of the organisations that run them.

The meme: 'The only way to solve a given problem is to create a private company, provide a free service to users and mine their data' is strong, but is also the equivalent in genetics of 'the only animal that could possibly exist is a hyena'.

And frankly, that's getting a bit scary. As everything from household appliances to the most basic transport infrastructure gain an IP address and become fonts of data, at the same time as the democratic organisations of the last century seem unable to keep up, it is only going to get more so.

Software is politics now.

This was a subject that Vitalik Buterin, founder of Ethereum (a distributed, auditable computer), talked about at Nesta's FutureFest event back in March.

There is a PDF of his slides here, but to try and summarise: the core utilities of the 19th and 20th centuries (roads, water, transport, the electricity system) were eventually run or regulated by governments, but the core utilities of the digital age (identity, communications, payment, sharing) are currently run by the first private company that happens to make its way to a near-monopoly. Ethereum, a distributed, auditable computer, is offered as an alternative to those unaccountable monopolies.

Just like water was in 19th-century London, where ad hoc organisations, with little accountability when things went wrong, were replaced first by the Metropolitan Board of Works and then by a wider municipal democracy in the form of the London County Council.

The story of the industrial revolution too often reads like that of entrepreneurs taking personal risk to weave the future against the odds. Now, granted, that is a history, but not the interesting one in my opinion.

The interesting history is the one of the building of institutions that had the concept of accountability to the public baked into them - not an evolution of one thing into another, but an active choice of a more accountable method of providing a service the public rely on.

Whether something like Ethereum, which binds services to behave in a certain way via immutable code, is the right answer, or whether we need organisations that account for themselves in more traditional ways - membership, voting, but built for and of the digital age - is not the important thing.

The first thing is recognising that the accountability mechanisms for a digital service are just another set of axes in product space - another thing that should be thought about and chosen.

Finding viable alternative models to run something like Uber, Google Now or HomeKit is going to be hard (much as municipal democracy had a spluttering start, and there were many failed attempts at finding a viable model for co-ops before the Rochdale Pioneers ended up with one that worked), but that's no reason not to try.

The second, I think, is recognising the risk of designing services that are superficially the height of simplicity, but can never be understood. To steal a phrase from Matt Jones' brilliant talk on design fiction and the understanding of systems - "magic is a power relationship".

If a user can never understand how something works, where is the opportunity for recourse? To pick an obvious example: what does it mean when you can't view source on an ever more powerful Google Now?

To address this problem, I think the accountability model for a service needs to be an intrinsic part of the design of that service. Accountability needs to be embraced as part of the service design rather than abstracted away.

This creates some interesting design constraints: there is a delicate balance between designing something that people can use without having to understand how it works and totally obfuscating the underlying workings of the service.

The third is that the private sector does not have a monopoly on good digital product design, but equally, more accountable digital products should not just be clones with a democratic overhead.

New technologies bring new possibilities for accountability. So design patterns like accountability at the point of use become relevant in a way they never would in a commercial context.

The final thing, though, is recognising that if you build or design digital products in 2015 you have a new responsibility.

You are not just building the best, simplest user experience, or the most elegant code. You need to be as vigilant against creating concentrations of power as you are diligent in creating efficiency.

The image of power flowing from one part of a Sampson diagram to another should be ever-present in your head.

The reason? If you accept the argument that software is politics, you are by definition also designing a power structure, and that is an important responsibility.

Or to put it another way, sometimes the user need is 'because democracy'.

Written between March and September 2015 - Brixton, Broadstairs and West Norwood.

Basic Income and On Demand Services by Albert Wenger

It is Basic Income Week and I am speaking at the On Demand Conference in New York. That is a fortunate coincidence, as On Demand companies ought to be strong supporters of Basic Income.

First, with Basic Income, the distinction between employee and contractor status becomes irrelevant and should go away entirely. The reason we have the distinction today is that we have historically tied employment together with benefits and with tax withholding. With Basic Income anyone can take care of their basic needs independent of employment status. And along with Basic Income we should vastly simplify the tax code to do away with deductions and treat all income equally.

Second, Basic Income should greatly reduce the fear of automation and the potential for conflict it creates between on demand companies and their current workforce. The role of humans in many on demand services is that of undifferentiated labor that could and should eventually be carried out by a machine. I say should because it is hard to make a case that many of the on demand tasks are a sustained source of purpose for the people carrying them out. Uber’s well-documented heavy investment in autonomous vehicles can serve as Exhibit A here.

Conversely, a doubling down on the current distinction, especially when accompanied by a $15 minimum hourly wage, would only serve to further the existing distortions, such as companies and even public sector agencies artificially restricting work hours. I am sympathetic to the motivation behind a higher minimum wage, as too many people are working hard and yet can’t meet their basic needs, but a Basic Income is a better long-run solution to that problem.  For instance, a $1,000 per month basic income is the equivalent of 16 hours per week at $15 per hour.
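
(A quick check of that equivalence: $15 per hour × 16 hours per week is $240 a week, and an average month runs roughly 4.33 weeks, so that is about $1,040 a month - close enough to the $1,000 figure for the comparison to hold.)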

So it will be interesting to see if I can help get some support from on demand companies for a Basic Income scheme.

September 14, 2015

Dr Hannah Fry on Ada Lovelace. by Feeling Listless

Science Dr Hannah Fry has a new BBC Four documentary on Thursday about Ada Lovelace. Which is, well, it is, isn't it?

"Ada Lovelace was a most unlikely computer pioneer. In this film, Dr Hannah Fry tells the story of Ada's remarkable life. Born in the early 19th century Ada was a countess of the realm, a scandalous socialite and an 'enchantress of numbers'. The film is an enthralling tale of how a life infused with brilliance, but blighted by illness and gambling addiction, helped give rise to the modern era of computing."
If you're as much of a fan of Fry's work as I am, you might like this collection of talks I put together earlier in the year.

September 13, 2015

Vancouver. by Feeling Listless

Film Welcome to the newest edition of Every Frame a Painting, the film essay channel in which @tonyszhou offers incredibly compelling slices of commentary in about ten minutes. The latest montage is an especially personal example in which he describes how his home city is constantly on film, but never as itself. Yes, it has a shot from the Doctor Who tv movie and yes it does acknowledge the existence of Continuum.  I wonder if a rep cinema or tv channel in Vancouver has ever scheduled a season of these films?

We've generally been luckier in Liverpool, for all the reasons you can imagine, but there has been an increase in parts of the city centre doubling for everywhere from London to other European capitals and especially New York, with the streets around Dale Street standing in for Manhattan and Brooklyn.  None of this will be on the scale of a Cardiff audience watching Doctor Who, perhaps, but it's still relatively jarring to watch the car chase in Fast & Furious 6 head through the Mersey tunnel or past the World Museum and William Brown Street.

Veteran Spacefarer, 2 Rookies Return to Earth by The Planetary Society

A three-person crew commanded by humanity's most experienced space traveler is back on Earth today.

September 12, 2015

UR #17: Finding Transiting Planets With LSST by Astrobites

The undergrad research series is where we feature the research that you’re doing. If you’ve missed the previous installments, you can find them under the “Undergraduate Research” category here.

Did you do an REU this summer? Or maybe you’re just getting started on an astro research project this semester? If you, too, have been working on a project that you want to share, we want to hear from you! Think you’re up to the challenge of describing your research carefully and clearly to a broad audience, in only one paragraph? Then send us a summary of it!

You can share what you’re doing by clicking here and using the form provided to submit a brief (fewer than 200 words) write-up of your work. The target audience is one familiar with astrophysics but not necessarily your specific subfield, so write clearly and try to avoid jargon. Feel free to also include either a visual regarding your research or else a photo of yourself.

We look forward to hearing from you!


Savannah Jacklin
Villanova University

Savannah worked on this project as an NSF REU student at Vanderbilt University after her junior year at Villanova. Her research has since been published in her first first-author paper in the Astronomical Journal.

Exoplanets with LSST: Period Detection of Planets Orbiting 1 Solar Mass Hosts

The Large Synoptic Survey Telescope (LSST) will photometrically monitor approximately 1 billion stars for ten years. The resulting light curves can be used to detect transiting exoplanets. In particular, as demonstrated by Lund et al. (2015), LSST will probe stellar populations currently undersampled in most exoplanet transit surveys, including out to extragalactic distances. In this paper we test the efficiency of the box-fitting least-squares (BLS) algorithm for accurately recovering the periods of transiting exoplanets using simulated LSST data. We model planets with a range of radii orbiting a solar-mass star at a distance of 7 kpc, with orbital periods ranging from 0.5 to 20 days. We find that standard-cadence LSST observations will be able to reliably recover the periods of Hot Jupiters with periods shorter than approximately 3 days; however, it will remain a challenge to confidently distinguish these transiting planets from false positives. At the same time, we find that the LSST deep drilling cadence is extremely powerful: the BLS algorithm successfully recovers at least 30% of sub-Saturn-size exoplanets with orbital periods as long as 20 days, and a simple BLS power criterion robustly distinguishes approximately 98% of these from photometric (i.e. statistical) false positives.
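
To make the method a little more concrete, here is a minimal sketch of BLS period recovery on a simulated light curve. It is not the paper's actual pipeline: it uses numpy and astropy.timeseries.BoxLeastSquares, and the cadence, noise level and transit parameters below are illustrative assumptions rather than the simulated LSST photometry used in the study.

import numpy as np
from astropy.timeseries import BoxLeastSquares

rng = np.random.default_rng(42)

# Sparse, irregular sampling over ten years (times in days) - a rough stand-in for a survey cadence
t = np.sort(rng.uniform(0.0, 3650.0, size=900))

# Hypothetical hot-Jupiter-like signal: 2.5-day period, 1% depth, 0.1-day transit duration
true_period, depth, duration = 2.5, 0.01, 0.1

# Box-shaped transit model plus Gaussian photometric noise
in_transit = (t % true_period) < duration
y = 1.0 - depth * in_transit + rng.normal(0.0, 0.003, size=t.size)
dy = np.full_like(t, 0.003)

# Box-fitting least-squares search over the 0.5-20 day period range used in the paper
model = BoxLeastSquares(t, y, dy)
periods = np.linspace(0.5, 20.0, 20000)
results = model.power(periods, duration)

best = periods[np.argmax(results.power)]
print(f"Recovered period: {best:.3f} d (true: {true_period} d)")
print(f"Peak BLS power: {results.power.max():.2f}")

In the study's framing, a peak found this way would also need to exceed a BLS power threshold (the 7.69526 and 7.32893 values quoted in the figure caption below) before it could be considered distinct from photometric false positives.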


Two-dimensional histograms of period recoverability across period and radius space. The top two plots consider only the accuracy of the top BLS peak; the bottom two also require every detection to have a BLS power greater than 7.69526 for regular cadence or 7.32893 for deep drilling cadence.


How the duck got its neck: Rapid temperature changes from self-shadowing may explain 67P's unusual activity and shape by The Planetary Society

When Rosetta approached comet Churyumov-Gerasimenko last summer, both its shape and its activity were surprising. It looked like two comets welded together at a skinny neck. A new paper explains how the neck may be steepening itself.

Updated using Planet on 7 October 2015, 05:48 AM