This combines on one page various news websites and diaries which I like to read.
March 26, 2015
At last week's Lunar and Planetary Science Conference, the MESSENGER team held a press briefing to share results from the past few months of incredibly low-altitude flight over Mercury's surface. The mission will last only about five weeks more.
Meet NASA's Winning Asteroid Redirect Spacecraft, and the Asteroid It May Visit by The Planetary Society
NASA has decided to pluck a small boulder off a large asteroid, instead of bagging an entire asteroid outright, the agency announced Wednesday.
One-Year ISS Mission Preview: 28 Experiments, 4 Expeditions and 2 Crew Members by The Planetary Society
This Friday, astronaut Scott Kelly and cosmonaut Mikhail Kornienko will embark on a one-year mission aboard the International Space Station.
March 25, 2015
Dinner. £3.50. Uncle Sam's Bistro & Restaurant, 94 Bold Street, Liverpool, Merseyside L1 4HY. Phone: 0151 709 2111. Website.
A new app that allows readers to swap swear words in their novels with sanitised versions is facing a backlash from furious authors, who have accused it of setting a dangerous precedent of censorship. The app, entitled Clean Reader, has been designed to take explicit words out of any book printed in electronic format - with or without permission from its author - to swap them with child-friendly versions.
(I'm not linking to Clean Reader directly—don't want to give them any free inbound Google mojo.)
Mangling an author's text is a clear violation of the author's moral rights, an element of copyright which is very weak in the United States and very strong elsewhere (primarily in civil law jurisdictions). (The moral right is the right of an author to be identified as the creator of a work, and for the work represented as their creation to be unaltered by other hands, so that the relationship between creator and created work is clear.) Mangling an author's text may be legal or illegal in the USA, depending on whether it occurs before or after sale. After all, I can't stop you buying one of my books and editing it with a sharpie: it's a physical object and, according to the first sale doctrine, it's yours to do with as you wish. I may be able to legally stop you modifying an ebook, though: ebooks are not sold, but a limited license to download and use them is granted in exchange for money—a fine legal distinction that was borrowed from the software business's tame sharks—and that limited license may permit or deny such usage.
Clean Reader claim to get around this by (a) being a licensed distributor (they provide the app and sell books for it sourced from PageFoundry, a distributor who back-end onto various publishers), and (b) the censorship is performed on the reader device by the reader app, once the book has been purchased and downloaded. There's a bunch of case law around whether or not it's legal to do this to movie rentals or downloads, or legal to skip advertisements in recorded programming on your TiVo—it gets murky fast. But let's suppose they're right and what they're doing ("protect the children! At any cost! From naughty words like 'breast' and 'fuck'!") is legal.
Speaking as an author who deeply resents the idea of his books being mutilated to fit the prejudices of a curious reader's blue-nosed and over-protective parents (hint: I write for adults—if you don't think my books are suitable for your or your child's tender eyes, don't buy them), what can I do about this?
It's worth quoting some correspondence posted on Absolute Write at this point. The PR contact for Clean Reader had this to say, in answer to a public enquiry:
As for how we deal with context, the app does look for specific sequences of letters like cock, shit, or f--k. But it also requires white space on both sides of the word. So your example of cockapoo would not be blocked by the app. But cock a poo would have cock blocked. There will be times when the app blocks a word that isn't being used as a profanity. Jesus Christ is another example. If a reader is reading the Bible with Clean Reader there will be quite a lot of words blocked; hell, damn, ass, Jesus, etc. The user will have to make a judgement call as to whether or not to use the "Clean Reader" feature with each book. If it's a religious book they may just opt to turn the feature off. Or if it's a book about chickens they may want to leave it off also. But for example, I'm currently reading American Sniper. It seems to have at least one F-word on every page and sometimes multiple per page. It's frankly a little over the top. Otherwise the book is fantastic and entertaining. So even if the app blocks out a word every now and again that wasn't necessarily being used as a profanity, I'd rather deal with that than have to read F--- every page. Those who have written articles about Clean Reader have typically downloaded a book that is riddled with swear words to show examples of how frustrating the book would be with Clean Reader. But I can tell you we aren't selling many of those types of books. I've read several books with the app and I typically only see a word blocked once every few pages. And it's usually pretty easy to get the gist of what was being said. It's just nice to not actually see it.
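The whitespace rule described in that reply is easy to sketch. This is a toy reconstruction of the idea, not Clean Reader's actual code, and the substitution table is invented for illustration:

```python
import re

# Toy substitution table, invented for illustration; not Clean Reader's real list.
REPLACEMENTS = {
    "cock": "groin",
    "breast": "chest",
    "damn": "darn",
}

def clean(text, replacements=REPLACEMENTS):
    """Swap a word only when it stands alone between whitespace (or string
    boundaries), mimicking the rule described in the reply above."""
    def swap(match):
        word = match.group(0)
        # Case is flattened for lookup; a real filter would preserve it.
        return replacements.get(word.lower(), word)
    # \S+ matches whitespace-delimited tokens, so only exact tokens are swapped.
    return re.sub(r"\S+", swap, text)

print(clean("a cockapoo"))      # -> a cockapoo   (no exact token match)
print(clean("cock a poo"))      # -> groin a poo
print(clean("chimney-breast"))  # -> chimney-breast (hyphenated compound survives)
```

Note that under a strict whitespace rule a hyphenated compound like "chimney-breast" slips through untouched; for the app to mangle one, it would also have to split on punctuation.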
So. While it might be possible to get my books pulled from that particular distributor, I am more inclined to deal with this idiocy by getting creative with my scatological vocabulary.
No more "fucks" freely interjected; instead I shall steal "unclefucker" from South Park.
No more "cunt!" as a free-standing gender-neutral insult[*]; instead it'll have to be "cuntfart!" or "pissflaps!" or "clunge!" (go look it up) ...
... But that's not going far enough.
I am pretty sure there are plenty of contexts in which the censorbot can be induced to fuck up a perfectly clean paragraph beyond all recognition, simply by replacing words delimited by whitespace. "Chimney-breast", for example, becomes "Chimney-chest". "The cunt line of the mainbrace" becomes "the bottom line of the mainbrace".
How far do you think I can take this?
Cory Doctorow takes a radically different approach ("I hate your censorship, but I'll defend to the death your right to censor"). I think he's missing the distinction between censorship and editing—that what's happening here is not straightforward "you can't read that" blocking, but actual substitution of someone else's words for my own, subtly or unsubtly corrupting and misrepresenting the author's words. One thing is clear, though: while we're having a doctrinal argument, it's important to keep in mind the essential fact that we both think that Clean Reader users are stupid poopy-heads.
[*] That's what "cunt" is, in Scottish vernacular—usage differs wildly across the anglophone world, and in Scotland it carries much less gendered misogynistic freight than it does in American usage.
Film I'll make this brief because there really isn't much to the idea. In recent years there's been much annoyance that the likes of Andy Serkis haven't been nominated for acting awards despite their motion-captured performances being an important part of the process of creating a computer generated character like Gollum or Kong. Extraordinary pieces of magic like Paddington (which I saw last night and which led to this brainwave) are lumped into the general special effects categories when the work on them is clearly of a different order to throwing a car at a motorway or what have you.
Last night it occurred to me that what could happen is that the characters themselves are nominated. In other words, the Oscar would go to Gollum or Paddington or Groot as the result of a collaboration and the collaborators who worked on the character as a group would win the award, the voice/actors, designers and animators with the name of the character as the representative element of the achievement (the members of that group decided upon by the production as part of the nomination process perhaps with the director/producers deciding which element they're most proud of).
To separate it from straight animated characters, it would be called something like "Best Animated Character Within a Live Action Context" or something snappier. So that different achievements can be represented, you'd also perhaps only allow one character per film, which would make things tricky for Guardians of the Galaxy. Also you'd have to limit it to characters who're predominantly CG. Stuff like cartoon Legolas wouldn't count.
All of this would reduce the discussions about how much of an actor's performance is enhanced by animators, how much of it is truly just about creating a fully computer generated make-up or simply copying a performance. As to who would actually get the award? A raffle? Or would it go to the animation studio? Dunno. Actually this still means Andy Serkis wouldn't end up with an award doesn't it? Oh hum.
Film Clearly the funniest part of 22 Jump Street is the closing titles, in which we're given a preview of upcoming sequels heading off into infinity along with all the logical diminishing returns. Here, in an absolutely massive post, creative directors, producers and the directors discuss the concepts and production:
"When did you begin to think about the end title sequence for the film?

Phil Lord: We had always planned to do something. Originally we ended the movie with Jonah and Channing walking away from Cube saying, “We are never ever going to do this again,” and then we were going to cut to future Jump Street movies with other actors."

The film is now available on Netflix UK and I really do recommend it. There's a clever sense of irony about the whole thing, a you-know-that-we-know-that-you-know-that-we-know that ...
Film Much as I love Kissing Jessica Stein, and I love Kissing Jessica Stein, over time I have begun to wonder how it's viewed in the LGBT community. After Ellen gave it a retro-review in 2008 and the four participants seemed in general OK with it, with a few reservations.
Authors: Alexander L. DeSouza and Shantanu Basu
First Author’s Institution: Department of Physics and Astronomy, University of Western Ontario
Status: Accepted by MNRAS
A major goal for the next generation of telescopes, such as the James Webb Space Telescope (JWST), is to study the first stars and galaxies in the universe. But what would they look like? Would JWST be able to see them? Recent studies have suggested that even the most massive specimens of the very first generation of stars, known as Population III stars, may be undetectable with JWST.
But not all hope is lost–one of the reasons why Population III stars are so hard to detect is that, unlike later generations of stars, they are believed to form in isolation. Later generations of stars (called Population I and Population II stars) usually form in clusters, from the fragmentation of large clouds of molecular gas. In contrast, cosmological simulations have suggested that Population III stars would form from gas collected in dark matter mini-halos of about a million solar masses in size, which would have virialized (reached dynamic equilibrium) by redshifts of about 20-50. Molecular hydrogen acts as a coolant in this scenario, allowing the gas to cool enough to condense down into a star. Early simulations showed that gravitational fragmentation would eventually produce one massive fragment–on the order of about a hundred solar masses–per halo. This molecular hydrogen, however, could easily be destroyed by the UV radiation from the first massive star formed, preventing others from forming from the same parent cloud of gas. So while Population III stars in this paradigm are thought to be much more massive than later generations of stars, they would also be isolated from other ancient stars.
However, there is a lot of uncertainty about the masses of these first stars, and recent papers have investigated the possibility that the picture could be more complicated than first thought. The molecular gas in the dark matter mini-halos could experience more fragmentation before it reaches stellar density, which may lead to multiple smaller stars, rather than one large one, forming from the same cloud of gas. These stars could then evolve relatively independently of each other. The authors of today’s paper investigate the idea that Population III stars could have formed in clusters and also study the luminosity of the resulting groups of stars.
The authors of today’s paper begin by arguing that the pristine, mostly atomic gas that collects in these early dark matter mini-halos could fragment by the Jeans criterion in a manner similar to the giant molecular clouds that we see today. This fragmentation would produce small clusters of stars that are relatively isolated from each other, so they are able to model each of the members in the cluster independently. They do this by using numerical hydrodynamical simulations in the thin-disk limit.
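For a rough sense of scale, the Jeans mass mentioned above can be estimated in a few lines. The prefactor convention, the mean molecular weight, and the example conditions below are illustrative assumptions of mine, not numbers from the paper:

```python
import math

# Physical constants (SI); values rounded, so results are approximate.
K_B = 1.381e-23      # Boltzmann constant, J/K
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_H = 1.673e-27      # hydrogen atom mass, kg
M_SUN = 1.989e30     # kg

def jeans_mass(T, n, mu=2.33):
    """Jeans mass (kg) for gas at temperature T (K) and number density n (m^-3),
    using one common convention: M_J = (5kT / G mu m_H)^(3/2) (3 / 4 pi rho)^(1/2).
    mu = 2.33 is typical for molecular gas with helium."""
    rho = mu * M_H * n
    return ((5 * K_B * T / (G * mu * M_H)) ** 1.5
            * (3 / (4 * math.pi * rho)) ** 0.5)

# A typical present-day molecular cloud core: T = 10 K, n = 1e4 cm^-3 = 1e10 m^-3
print(f"{jeans_mass(10, 1e10) / M_SUN:.1f} solar masses")
```

For these present-day conditions the Jeans mass comes out at a few solar masses; because the mass scales as T^(3/2), the warmer atomic gas in the early mini-halos pushes it far higher, which is one reason the first stars are expected to be so massive.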
Their fiducial model is a gas of 300 solar masses, about 0.5 pc in radius, and at a temperature of 300 K. They find that the disk that forms around the protostars (the large fragments of gas that have contracted out of the original cloud of gas) forms relatively quickly, within about 3 kyr of the formation of the protostar. The disk begins to fragment a few hundred years after it forms. These clumps can then accrete onto the protostar in bursts of accretion or get raised to higher orbits.
Most of the time, however, the protostar is in a quiescent phase and is accreting mass relatively smoothly. The luminosity of the overall star cluster increases during the bursts of accretion, and it also increases as new protostars are formed. The increasing luminosity of the stellar cluster can make it more difficult to detect single accretion events. For clusters of a moderate size of about 16 members, these competing effects result in the star cluster spending about 15% of its time at an elevated luminosity, sometimes even 1000 times the quiescent luminosity. The star clusters can then have luminosities approaching, and occasionally exceeding, 10^8 solar luminosities. Population III stars with masses ranging from 100-500 solar masses, on the other hand, are likely to have luminosities of about 10^6 to 10^7 solar luminosities.
These clusters would be some of the most luminous objects at these redshifts and would make good targets for telescopes such as ALMA and JWST. We have few constraints on the star formation rate at such high redshifts, and a lot of uncertainty about what the earliest stars would look like. So, should these clusters exist, even if we couldn't see massive individual Population III stars, we may still be able to detect these clusters of smaller stars and gain insight into what star formation looked like at the beginning of our universe.
March 24, 2015
"You were obsessive about trying to find the truth. And yet during the series a former detective told Sarah that, in some cases, detectives just want to build a case, not find the truth. Did you find that shocking?
We were completely shocked, this concept of “bad evidence” shocked the pants off us. That detectives might not go and learn some truth in case it didn’t support their theory of the case. I get why, but as a journalist you’re like – No it’s not that I want to find facts to support my theory, I just need to find the facts that tell me what happened."
I’m doing a lot of politics stuff running up to the general election, but I don’t feel like blogging about it much.
Actually, I’m not doing that much. What’s happening is that all the work I did ten years ago is finally being put into use today by other people, resulting in an event such as this fine rant from the LibDem MP in Bristol West:
To claim, as the website Public Whip does, that I ‘voted very strongly for selling England’s state owned forests’ is misleading in the extreme. I have never voted in favour of selling forest land – I voted against two poorly worded and hyperbolic motions submitted by the Labour Party.
I never believed I’d see the day where MPs would have to answer for the things they voted for in Parliament. Anyway, there’s that and electionleaflets.org and Francis’s Candidate CVs project gathering steam. What a difference five years of internet technology advancement and greater generational awareness can make.
Meanwhile, I was in Bristol for a few days helping a friend with some DIY, because I want to put this kind of laminate floor down in our kitchen on top of some real insulation:
Then I did some painting before getting relieved of my duties for spreading paint all up the paint brush handle.
I spent the night on the Blorenge, then flew at lunchtime completely alone for two hours until I suddenly got dumped down in the bottom landing field in Abergavenny. Nobody took any notice of my death spiral down to the ground; they just kept walking their dogs. I am still working hard to process the data into something meaningful — if this is possible.
An invite went out for a surf on the Dee Bore on Saturday morning. I thought it might be special, being the day after a partial solar eclipse, but it was a damp squib and most of us lost the wave within a few hundred metres.
- Title: Radial Velocity Prospects Current and Future: A White Paper Report prepared by the Study Analysis Group 8 for the Exoplanet Program Analysis Group (ExoPAG)
- Authors: Peter Plavchan et al.
- First Author’s Institution: Missouri State University
To date, we have confirmed more than 1500 extrasolar planets, with over 3300 other planet candidates waiting to be confirmed. These planets have been found with different methods (see Figure 1). The two currently most successful are the transit method and the radial velocity method. The former measures the periodic dimming of a star as an orbiting planet passes in front of it, and tends to find short-period large-radius planets. The latter works like this: as a planet orbits its host star, the planet tugs on the host star, causing the star to move in its own tiny orbit. This wobble motion —which increases with increasing planet mass— can be detected as tiny shifts in the star’s spectrum. We just found a planet.
That being said, in our quest to find even more exoplanets, where do we invest our time and money? Do we pick one method over another? Or do we spread our efforts, striving to advance all of them simultaneously? How do we assess how each of them is working; how do we even begin? Here it pays off to take a stand, to make some decisions on how to proceed, to set realistic and achievable goals, to define a path forward that the exoplanet community can agree to follow.
To do this effectively, and to ensure that the US exoplanet community has a plan, NASA’s Exoplanet Exploration Program (ExEP) appoints a so-called Program Analysis Group (ExoPAG). This group is responsible for coordinating community input into the development and execution of NASA’s exoplanetary goals, and serves as a forum to analyze its priorities for future exoplanetary exploration. Most of ExoPAG’s work is conducted in a number of Study Analysis Groups (SAGs). Each group focuses on one specific exoplanet topic, and is headed by some of the leading scientists in the corresponding sub-topic. These topics include: discussing future flagship missions, characterizing exoplanet atmospheres, and analyzing individual detection techniques and their future. A comprehensive list of the different SAGs is maintained here.
One of the SAGs focused their efforts on analyzing the current and future prospects of the radial velocity method. Recently, the group published an analysis report which discusses the current state-of-affairs of the radial velocity technique, and recommends future steps towards increasing its sensitivity. Today’s astrobite summarizes this report.
The questions this SAG studied can roughly be divided into three categories:
Categories 1-2: Radial velocity detection sensitivity is primarily limited by two kinds of systematic effects: first, long-term instrument stability, and second, astrophysical sources of jitter.
Category 3: Finding planets with the radial velocity technique requires large amounts of observing time. We thus have to account for which telescopes are available, and how we design effective radial velocity surveys.
We won’t say much about the last category in this astrobite, but let’s dive right into the former two.
No instrument is perfect. All instruments have something that ultimately limits their sensitivity. We can make more sensitive measurements with a ruler if we make the tick-marks denser. Make the tick-marks too dense, and we can’t tell them apart. Our sensitivity is limited.
Astronomical instruments that measure radial velocities —called spectrographs— are likewise limited in sensitivity. Their sensitivity is to a large extent controlled by how stable they are over long periods of time. Various environmental factors —such as mechanical vibrations, thermal variations, and pressure changes— cause unwanted shifts in the stellar spectra that can masquerade as a radial velocity signal. Minimize such variations, and work on correcting —or calibrating out— the unwanted signals they cause, and we increase the sensitivity. Not an easy job.
Still, it can be done, and we are getting better at it. Figure 2 shows that we are finding lighter and lighter planets — hand-in-hand with increasing instrument sensitivity: we are able to detect smaller and smaller wobble motions. Current state-of-the-art spectrographs are, in the optical, sensitive down to 1 m/s wobble motions, and only slightly worse (1-3 m/s) in the near-infrared. To put things in perspective, the Earth exerts a 9 cm/s wobble on the Sun. Thus, to find true Earth analogs, we need instruments sensitive to a few centimeters per second. The authors of the report note that achieving 10-20 cm/s instrument precision is realistic within a few years — some such instruments are even being developed as we speak. A further push on these next-generation spectrographs is strongly recommended by the authors, as it supports a path towards finding Earth analogues.
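As a sanity check on the numbers above (metre-per-second spectrographs versus the Sun's roughly 9 cm/s response to the Earth), here is a back-of-envelope calculation of the stellar wobble, the radial-velocity semi-amplitude, for a circular, edge-on orbit. This is an illustrative sketch, not code from the report:

```python
import math

# Physical constants (SI); values rounded, so results are approximate.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # kg
M_EARTH = 5.972e24   # kg
M_JUP = 1.898e27     # kg
YEAR = 3.156e7       # seconds

def rv_semi_amplitude(m_planet, m_star, period):
    """Stellar wobble K in m/s for a circular, edge-on orbit:
    K = (2*pi*G/P)^(1/3) * m_p / (M_star + m_p)^(2/3)."""
    return ((2 * math.pi * G / period) ** (1 / 3)
            * m_planet / (m_star + m_planet) ** (2 / 3))

print(f"Earth on the Sun:   {rv_semi_amplitude(M_EARTH, M_SUN, YEAR) * 100:.1f} cm/s")
print(f"Jupiter on the Sun: {rv_semi_amplitude(M_JUP, M_SUN, 11.86 * YEAR):.1f} m/s")
```

The Earth comes out at about 9 cm/s, matching the figure quoted above, while Jupiter induces a comfortable 12 m/s or so; that two-orders-of-magnitude gap is exactly why today's 1 m/s spectrographs find giants but not Earth twins.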
Having a perfect spectrograph, with perfect precision, would, however, not solve the whole problem. This is due to stellar jitter: the star itself can produce signals that can wrongly be interpreted as planets. Our ultimate sensitivity or precision is constrained by our physical understanding of the stars we observe.
Stellar jitter originates from various sources. The sources have different timescales, ranging from minutes and hours (e.g. granulation), to days and months (e.g. star spots), and even up to years (e.g. magnetic activity cycles). Figure 3 gives a good overview of the main sources of stellar jitter. Many of the sources are understood, and can be mitigated (green boxes), but other signals still pose problems (red boxes), and require more work. The blue boxes are more or less solved. We would like to see more green boxes.
The radial velocity method is one way to discover and characterize exoplanets. In this report, one of NASA’s Study Analysis Groups evaluates the current status of the method. Moreover, with input from the exoplanet community, the group discusses recommendations to move forward, to ensure that this method continues to be a workhorse method for finding and characterizing exoplanets. This will involve efficiently scheduled observatories, and significant investments in technology development (see a great list of current and future spectrographs here), in data analysis, and in our understanding of the astrophysics behind stellar jitter. With these efforts, we take steps towards discovering and characterizing true Earth analogs.
Full Disclosure: My adviser is one of the authors of the SAG white paper report. I chose to cover it here for two reasons: first, to further your insight into this exciting subfield, and second, to further my own.
Cassini recently took a long, high-resolution movie of the F ring, catching a view of its ringlets, clumps, and streamers, and two potato-shaped moons, Prometheus and Pandora.
Desert Moon, a 35-minute documentary that tells the story of Dr. Gerard Kuiper and the dawn of planetary science, is now available online.
March 23, 2015
Film David Bordwell and Kristin Thompson are two of the most important film theorists of the past few decades, and there are few people who've studied movies in an academic context who haven't read their work, or who couldn't say that work was the reason they managed to pass their course. If it hadn't been for either of them I certainly wouldn't have been able to write my MA dissertation, thanks to their lucid, clear and thoughtful explanations of narrative and genre, and of all the books which I read during that year, it's their Film Art: An Introduction which I've kept referring back to. They continue to write about film at their blog, which just yesterday posted about the origins of the very thing my dissertation was about.
Finding material featuring Thompson is tricky because there seem to be a few of them featured on YouTube, but I did manage to find this video essay:
... and appearances in Q&As at Ebertfest in 2011:
Bordwell was much easier to track. For chronological ease, I've put these appearances in the relevant years.
Ebertfest 2011 - Do Film Students Really Need To Know Much About Classic Films?
How Motion Pictures Became the Movies 1908-1920
Constructive Editing in Robert Bresson's Pickpocket
TIFF Keynote | Industry Dialogues 2012
Ebertfest 2012 - Wild and Weird Q&A
Ebertfest 2012 - ON DEMAND: Movies without Theaters
Fantasia 2012 - David Bordwell wins Lifetime Excellence Award.
Ebertfest 2012 Q&A: CITIZEN KANE [Commentary by Roger Ebert Intro by David Bordwell]
Which is incredibly poignant from the off, I think you'll agree. The commentary they're referring to appeared on this 2001 release of the film (along with another by Peter Bogdanovich) but is missing from all subsequent releases, replaced with a track by Ken Barnes.
Ebertfest 2013 - The Ballad of Narayama introduction
Ebertfest 2013 - The Ballad of Narayama Q&A:
Mildred Pierce: Murder Twice Over
CinemaScope: The Modern Miracle You See Without Glasses
ART of the MARTIAL ARTS FILM | David Bordwell | Higher Learning
On Master of the House:
The US (not UK, alas) ebook edition of "The Atrocity Archives", Laundry Files book 1, is on a special promotion: it's $1.99 for the rest of March! You can find the Amazon.com Kindle edition here, or the Barnes and Noble Nook edition here, the Google Play store version here, and the Apple iBooks store version here.
If you're reading this on my blog you're probably already aware of my Laundry Files series; in case you came here from elsewhere, "The Atrocity Archives" is book #1 in the sequence, and the latest novel, "The Annihilation Score" (book 6) comes out in the first week of July. In fact, I just finished checking the page proofs now, so it's heading back to the typesetter agency and then on to the printers next month ...
March 22, 2015
A new project—"Mars Academy"—aims to expand the cosmic horizon and offer a broader sense of opportunity for at least one group of underprivileged children in an impoverished neighborhood in Rio de Janeiro, Brazil.
March 21, 2015
People Steven Levy writes for Medium about attending this year's TED. For fans like me, it's almost like a film festival review, previewing upcoming attractions on their YouTube channel:
"Sitting through them makes your brain feel like a mushy piñata, whacked by one mind-blowing idea after another. Did you know that babies use sophisticated data analysis to guide the way they use squeeze toys? Meet the Frank Gehry of the rain forest, who creates the bamboo edifices in Bali. Believe it or not, when adulterers say to their betrayed spouses It’s not about you, they’re telling the truth. Oh, and here’s a guy who landed a spaceship on an asteroid."

The Monica Lewinsky talk has just been posted:
To make sure this good old blog code is still working.
After far too much wrangling with custom blog code I think I'm nearly there, at least on being able to upload new posts.
The longer version is this...
I'm trying to own my own content, images, posts, audio and video. To do that I'm basically writing posts as flat markdown files into a directory structure that's reflected in the URL structure /year/month/day.
Then a whole bunch of node code converts the markdown files into .html files and this all runs as a static website. All that was fine but at the same time it rebuilt the whole site each time, in turn making FTPing up all the files a PITA.
The new code only commits files to the file system if they are new or have changed. This in turn creates a list of the changed files in a .txt file, which is fed into the tar command. The last step is for the script to upload the compressed tar file and uncompress it on the server.

This gives me a single script to build and upload just the changes to the site.
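The change-detection step looks roughly like this (a Python sketch of the idea; the real version is node, and the file names and layout here are illustrative, not my actual code):

```python
import hashlib
import tarfile
from pathlib import Path

SRC = Path("site")               # the built .html files (illustrative name)
MANIFEST = Path("hashes.txt")    # content hashes recorded by the previous build

def file_hash(path):
    """Content hash, so an unchanged rebuild doesn't count as a change."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def changed_files():
    """Return files that are new or whose content has changed since last build."""
    old = {}
    if MANIFEST.exists():
        old = dict(line.split("\t") for line in MANIFEST.read_text().splitlines())
    changed = []
    new_manifest = []
    for path in sorted(SRC.rglob("*")):
        if path.is_file():
            digest = file_hash(path)
            new_manifest.append(f"{path}\t{digest}")
            if old.get(str(path)) != digest:
                changed.append(path)
    MANIFEST.write_text("\n".join(new_manifest))
    return changed

def pack(paths, archive="changes.tar.gz"):
    """Bundle only the changed files, ready to upload and unpack server-side."""
    with tarfile.open(archive, "w:gz") as tar:
        for path in paths:
            tar.add(path)
    return archive
```

Hashing rather than comparing timestamps means a full rebuild that produces identical output uploads nothing, which is the whole point.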
I want to bolt on an easy way for me to manage my images, which again involves just dropping them into the right folders and the script can deal with resizing them to all the sizes I need.
I think I'm rebuilding my own old version of flickr.
Oh, and also so that I can also output the whole lot in gopher format too for the gopher server I'm getting up and running.
I've been asked about my pen (for reals) a couple of times, so I thought I'd write a blog post about it. It's a Tombow Zoom 707 Ballpoint Pen (amazon UK/US), it cost £28 and I bought it for myself as a Christmas present.
I keep two Field Notes notebooks in my pocket, at night I take them out and put them on the bedside table. My life is dense, not hectic, not crazy busy, just every moment is filled. We have three kids, we home educate, the start-up I'm involved in is blowing up, I try to swim, I try to run, I'm learning the bass, I try and put together a podcast that takes an age, sometimes I even try to write a blog post or two. In all of that there's hardly any time to do other stuff, although that doesn't stop me thinking about other stuff. That other stuff goes down in one of the two notebooks.
When I think of something I often can't get to a laptop or my phone in time. I've tried; the thoughts don't stay in my head long enough to survive the gauntlet of children asking me things on the way upstairs. If you've watched the film Memento it's like that scene where he's looking around for a pen to write the thing down before he forgets it. I decided I needed notebooks and a pen with me at all times.
I think it's the most I've ever spent on a pen.
Before this I used the Field Notes pen that came with the notebooks. It's a good pen, feels nice to hold, flows well, but the clip doesn't hold it in my pocket properly. I can't slide it into my jeans without having to put a fingernail round the back of the clip to make sure it clips properly. When I sat down the pen didn't stay in the same place.
It was all kinds of wrong.
The Zoom 707 slides into the pocket right next to the seam, and better still it stays there, after all I didn't want to lose a £28 pen. For the next few days I'd reach down and feel for the red ball on the clip, to know it was still there.
Now it's a reflex action, I'll brush my hand past the side seam of my jeans and feel if the pen's clip is still there. When I feel it I know I can't forget anything, life is speeding on but in that one moment I know I haven't left anything behind. If I need to remember something it's in the notebook, if it's in the notebook I don't need to remember it. I can clear my mind and move onto the next thing.
When I stop to take a moment, I can touch the red ball, feel it against my fingertips, and the memory of the last thing I wrote comes back to me. It's a shortcut past having to open the notebook and read it back.
It's a memory machine, a meditation device and an anchor.
All this is fine and good, but how does it write?
It was all scratchy when I first used it, and I thought I'd made a terrible mistake. Other pens I'd had took a word or two for the ink to get going; with this one I wrote a page and it still didn't seem right. I just needed to give it more time: after a while of being in my pocket and being used, the ink started flowing properly.
I was used to the thick slick blackness of the Field Notes pen, the Zoom is light and narrow. My handwriting is spidery and the pen suits it well.
It turns out, importantly, that I can now write more words per line, more thoughts per page with the new pen. It's a little thing but these notebooks aren't big so utilising more of the page is a good thing. I wasn't expecting it, so it's a bonus.
It writes fine and good.
That is the story of my new pen.
- Title: A tidal encounter caught in the act: modelling a star-disc fly-by in the young RW Aurigae system
- Authors: Fei Dai, Stefano Facchini, Cathie J. Clarke, and Thomas J. Haworth
- First Author’s Institution: Kavli Institute, MIT, Cambridge, USA and Institute for Astronomy, Cambridge, UK
- Paper Status: Accepted for publication in MNRAS
Physical models in astronomy are generally as simple as possible. On the one hand, you don’t want to oversimplify reality. On the other hand, you don’t want to throw in more parameters than could ever be constrained from observations. But some cases deviate just enough from a basic textbook case to be really interesting, like the subject of today’s paper: a pair of stars called RW Aurigae that features a long arm of gas and dust wrapped around one star.
You can’t model RW Aurigae as a single star with a disk of material around it, because there is a second star. And you can’t model it as a regular old binary system either, because there are interactions between the stars and the asymmetric circumstellar disk. The authors of today’s paper create a comprehensive smoothed particle hydrodynamics (SPH) model that considers many different observations of RW Aurigae. They consider the system’s shape, size, motion, composition, and geometry, and they conduct simulated observations of the model for comparison with real observations.
A tidal encounter
The best-fitting model of RW Aurigae matches observations of many different aspects as observed today, including particle motions. Because the model is like a movie you can play backward or forward in time, the authors are able to show that the current long arm of gas most likely came from a tidal disruption. This was long suspected to be the case, based on appearance alone, but this paper’s detailed model shows that the physics checks out with our intuition.
What exactly is a tidal disruption? Well, in this case, over the last 600 years or so (a remarkably short time in astronomy!), star B passed close enough to the disk around star A to tear it apart with gravity. Because some parts of the disk were significantly closer to star B than others, they felt different amounts of gravitational force. As time went on, this changed the orbits of individual particles in the disk and caused the overall shape to change. This is the same effect that creates tides on Earth: the side of the Earth facing the Moon is closer to it than the opposite side, so the oceans on the near side feel a stronger pull than the oceans on the far side. The figure above shows present-day motions of simulated particles in RW Aurigae that resulted from the tidal encounter. The figure below shows snapshots from a movie of the hydrodynamic model, from about 600 years in the past to today. Instead of representing motion, the brighter regions represent more particles (higher density).
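The differential pull described above is easy to put rough numbers on. The short Python sketch below uses the familiar Earth-Moon system with approximate textbook values; it is an illustration invented for this post, not a calculation from the paper:

```python
# Differential (tidal) acceleration across the Earth due to the Moon.
# The near side feels a stronger lunar pull than the far side, and it is
# this *difference* that raises the tides. Rough textbook values below.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_MOON = 7.35e22   # mass of the Moon, kg
D = 3.844e8        # mean Earth-Moon distance, m
R_EARTH = 6.371e6  # radius of the Earth, m

a_near = G * M_MOON / (D - R_EARTH) ** 2  # acceleration of the near side
a_far = G * M_MOON / (D + R_EARTH) ** 2   # acceleration of the far side

# The difference is tiny (of order 1e-6 m/s^2) but acts continuously.
print(a_near - a_far)
```

The same reasoning, with a stellar-mass perturber and a disk tens of AU across, is what stretches RW Aurigae's disk into the long tidal arm.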
Observational astronomers are always constrained to viewing the cosmos from one angle. We don’t get to choose how systems like RW Aurigae are oriented on the sky. But models let us change our viewing angle and better understand the full 3D physical picture. The figure below shows the disk around star A if we could view it from above in the present. As before, brighter areas have more particles. Simulated observations, such as measuring the size of the disk in the figure below, agree well with actual observations of RW Aurigae.
The final mystery the authors of today’s paper explore is a dimming that happened during observations of RW Aurigae in 2010/2011. The model suggests this dimming was likely caused by the stream of material between stars A and B passing in front of star A from our line of sight. However, since the disk and related material are clumpy and still changing shape, they make no predictions about specific future dimming events. Interestingly, another recent astro-ph paper by Petrov et al. reports another deep dimming in 2014. They suggest it may arise from dust grains close to star A being “stirred up” by strong stellar winds and moving into our line of sight.
Combining models and observations like today’s paper does is an incredibly useful technique to learn about how structures of all sizes form in the Universe. Tides affect everything from Earth, to stars, to galaxies. This is one of the first cases we’ve seen of a protoplanetary disk having a tidal encounter. The Universe is a messy place, and understanding dynamic interactions like RW Aurigae’s is an important step toward a clearer picture of how stars, planets, and galaxies form and evolve.
When New Horizons flies past Pluto in July, we will see a new, alien landscape in stark detail. At that point, we will have a lot to talk about. The only way we can talk about it is if those features, whatever they turn out to be, have names.
A report released by NASA’s Office of Inspector General warns that the agency may be hard-pressed to have its Kennedy Space Center launch facilities ready by November 2018.
Planetary Society Media Producer Merc Boyan presents our new video resource.
March 20, 2015
Art Tomorrow morning Cass Arts opens a Liverpool outpost on School Lane near The Bluecoat in Liverpool. Full details here. It's in the newly vacated old Cath Kidston shop.
As part of the PR process, they were nice enough to send me some art supplies for me to try out the sort of thing they'll be selling.
I've been painting. pic.twitter.com/inZuaY0LZL
— Stuart Ian Burns (@feelinglistless) March 20, 2015
Suitably prodded, I decided to have another go. I say another go because it was during art at school that it became abundantly clear I can't paint; I don't have any artistic DNA in my genome. Much of my GCSE was spent copying out Disney characters and drawing terrible renditions of chimney boxes, and my single (non General Studies) A-Level, which happens to be in Art, was gained thanks to a sympathetic teacher who allowed me to create collages for two years. Needless to say, nothing has changed.
I can't paint. pic.twitter.com/4PV83Krr93
— Stuart Ian Burns (@feelinglistless) March 20, 2015
Still. I did watch this instructional video on YouTube where a man paints a tree, and had some idea about light and shading, but in the end, as you can see, I still did the collage, except in paint. Is it the sky? Is it the sea? Is it a bit of both? Who can tell? I don't have the patience now to learn how to do this effectively, or indeed the time. So for all my readiness to criticise the art of others, it's always with a sympathetic eye. At least they can do this.
I've been quiet due to (a) recovering from delivering the hopefully-final draft of "Dark State" (the first book in the "Empire Games" trilogy, due from Tor next April), (b) visiting relatives, (c) having a nasty head-cold, and (d) having the page proofs of "The Annihilation Score" (July's Laundry Files novel) land on my desk. Normal service will, as they say, resume as soon as possible.
My current plan is to tackle the aforementioned page proofs, work on the next book, then head for Dysprosium, the British eastercon, over the Easter bank holiday weekend. And before I go I really ought to fit in time to catch up with the last Jim Butcher book that I haven't read yet, because he's one of the two guests of honour at Dysprosium and I'm on program to interview him. (If you've been reading the Laundry Files you might have noticed a tip of the hat in his general direction.)
Finally, here is an extremely dangerous toy (probably illegal in all sane jurisdictions).
It’s good to at least see President Obama acknowledge this as an error. But he is still in office and he should do this using executive power. After all, he did sign an executive order in 2011 authorizing the indefinite detention of terrorism suspects, which is now a key ingredient in maintaining Gitmo, together with the National Defense Authorization Act of 2012, which does the same despite having been partially blocked by a judge.
So, I will call and write to the White House asking President Obama to use executive power to close down Guantanamo Bay and stop indefinite detention more broadly. Please take a minute to do the same.
Authors: R. Rodrigues da Silva, B. L. Canto Martins, and J. R. De Medeiros
First Author’s Institution: Department of Theoretical and Experimental Physics, Federal University of Rio Grande do Norte
Status of paper: Published in ApJ
Nothing sits still in our Universe. Everything is always on the move. Like planets, stars rotate. Really, they are the most obsessive ballet dancers, perpetually doing spins (or fouettés, if you will) until they die. The authors of this paper found that certain types of stars unexpectedly display rapid rotation when they are not supposed to.
Astronomers like — really like — to categorize things. Stars are categorized according to their spectral features (i.e., the presence of certain elements in the spectrum) and can be one of the following spectral types: O, B, A, F, G, K, or M (“Oh Be A Fine Girl Kiss Me”). Temperature decreases from spectral type O to spectral type M, with O stars being the hottest and M stars being the coolest. However, because stars of the same spectral type can have widely different luminosities (and so different radii by the Stefan-Boltzmann law, which relates the luminosity, radius, and surface temperature of a star), a second classification by luminosity is added, where stars are assigned Roman numerals from I (supergiants) down to V (main-sequence dwarfs). The paper today focuses on evolved stars of spectral type G and K and luminosity class IV (subgiants), III (normal giants), II (bright giants), and Ib (supergiants). Supergiants are the brightest and largest, followed by bright giants, normal giants, and finally subgiants.
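The Stefan-Boltzmann relation mentioned above (L = 4πR²σT⁴) can be turned around to give the radius implied by a star's luminosity and temperature, which is why same-temperature stars of very different luminosity must be very different in size. The sketch below uses rough textbook solar values and is only an illustration, not a calculation from the paper:

```python
# Radius implied by the Stefan-Boltzmann law: R = sqrt(L / (4*pi*sigma*T^4)).
# Rough textbook constants; invented comparison for illustration only.
import math

SIGMA_SB = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
L_SUN = 3.828e26      # solar luminosity, W
R_SUN = 6.957e8       # solar radius, m

def radius(luminosity, temperature):
    """Stellar radius implied by luminosity and surface temperature."""
    return math.sqrt(luminosity / (4 * math.pi * SIGMA_SB * temperature**4))

# A Sun-like G dwarf versus a hypothetical G giant 100x more luminous
# at the same surface temperature:
r_dwarf = radius(L_SUN, 5772)
r_giant = radius(100 * L_SUN, 5772)
print(r_dwarf / R_SUN)    # close to 1 solar radius
print(r_giant / r_dwarf)  # sqrt(100) = 10 times larger
```

Same temperature, a hundred times the luminosity, ten times the radius: that is why the luminosity classes separate dwarfs from giants and supergiants.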
Humans wind down over the years, and stars do too. Stars spin down as they age, a result of angular momentum carried away by outflows of gas ejected from stellar atmospheres, also known as stellar winds. Therefore we expect evolved stars to spin more slowly than young stars. Evolved G- and K-stars are known to be slow rotators (rotating at a few km/s), with rotation decreasing gradually from early-G stars to late-K stars. However, as is always the case in astronomy, there are counterexamples. As far back as four decades ago, astronomers found rapidly rotating G and K giant stars (luminosity class III) spinning as fast as 80 km/s. How and why these stars are able to spin this fast is still a puzzle, with proposed explanations ranging from coalescing binary stars to a sudden dredge-up of angular momentum from the stellar interior to the engulfment of hot Jupiters (Jupiter-sized exoplanets that orbit very close to their parent stars, hence the name) by giant stars causing a spin-up.
Using a set of criteria, the authors of this paper hunted for single rapidly-rotating G- and K-stars in the Bright Star Catalog and the catalog of G- and K-stars compiled by Egret (1980). Out of 2010 stars, they uncovered a total of 30 new rapidly-rotating stars among subgiants, giants, bright giants, and supergiants. To date, rapid rotators had only been found among giant stars; this work reports for the first time the presence of such rapid rotators among subgiants, bright giants, and supergiants. In fact, these objects make up more than half of the rapid rotators in their sample. Figure 1 shows the velocities along the line of sight (v sin i) versus effective temperatures for their sample of evolved rapid rotators, compared with G and K binaries (i.e., binary star systems consisting of G- and K-stars). The similarity between the two populations implies a synchronization mechanism linking the rotation of single evolved stars to something like the orbital motion of the binary systems. That interesting relation aside, the main point to note from the plot is the large observed velocities of the rapid rotators compared to the mean rotational velocities of G- and K-stars.
The rapidly-rotating stars were analyzed for far-IR excess emission, which may indicate the presence of warm dust surrounding the stars (warm dust emits radiation in the mid- to far-IR regime). Looking at figure 2, a trend of far-IR excess emission is clearly seen for almost all of the 23 stars analyzed. The origin of dust close to these stars is not well understood; some attribute it to stellar winds driven by magnetic activity, while others hypothesize that it comes from collisions of planetary companions around these stars. In any case, any theory that tries to explain the nature of rotation in these single systems needs to account for the presence of warm dust.
The authors propose that coalescence between a star and a low-mass stellar or substellar companion (i.e., a brown dwarf), or tidal interaction in planetary systems with hot Jupiters, are plausible scenarios that can explain single rapidly-rotating evolved stars. Because each scenario should produce different chemical abundances, the authors suggest searching for changes in specific abundance ratios, such as the relative enhancement of refractory over volatile elements, in these stars to differentiate between the various possible scenarios.
Spinning stars are cool. Even cooler are rapidly rotating giant-like stars that spin out of the clutches of theoretical predictions. While in the past rapid rotators among evolved giant stars could be explained away using small-number statistics, the authors of this paper have added an order of magnitude more objects to the list, forcing stellar astrophysicists to come face-to-face with the question of what drives these rapid rotations.
LPSC 2015: First results from Dawn at Ceres: provisional place names and possible plumes by The Planetary Society
Three talks on Tuesday at the Lunar and Planetary Science Conference concerned the first results from Dawn at Ceres. Chris Russell showed a map of "quads" with provisional names on Ceres, Andreas Nathues showed that Ceres' bright spot might be an area of plume-like activity, and Francesca Zambon showed color and temperature variations across the dwarf planet.
One presenter at the Lunar and Planetary Science Conference asked the audience not to blog about his talk because of the embargo policy of Science and Nature. I show how this results from an incorrect interpretation of those policies. TL;DR: media reports on conference presentations do not violate Science and Nature embargo policies. Let people Tweet!
A semi-authoritative ranking of creatures that co-inhabit rocket launch sites around the world.
Slides from the LPSC 2015 Session on the Community Response to NASA's Budget Request by The Planetary Society
The Planetary Society helped organize a community response to the latest NASA budget at the 2015 meeting of the Lunar and Planetary Science Conference.
March 19, 2015
Lunch. £4.25. East Avenue Bakehouse. 112 Bold Street, Liverpool, Merseyside L1 4HY. Phone:0151 708 621. Website.
TV File this under "unlikely Amazon purchases":
A friend on Twitter noticed this pre-order page for the looong delayed release of Doctor Who's The Underwater Menace, the only surviving episodes of Doctor Who still awaiting release (that we know of). So of course I put a pre-order in even though I don't really expect it to be posted to me by the 7th April and not at that price, which is back to the funny numbers for single stories of the original VHS releases (£39.99 for Revenge of the Cybermen? Yes, please!). Here are the problems:
(1) Kasterborous has a statement from the BBC which says it's been removed from the schedule.
(2) The BBFC website doesn't list a recent classification. The episode three assessments are from over a decade ago when it appeared in the Lost in Time boxed set and before that The Ice Warrior cardboard box.
(3) Gallifrey Base is very ¯\_(ツ)_/¯
(4) It's not available at the BBC Shop.
So it's probably a database error. But nothing in the world could stop me from pre-ordering. Just in case.
I have long been writing about climate change on Continuations. Mostly it has been about more evidence with the occasional call to action. Today is such a call if you live in London: Our portfolio companies AMEE and Twilio together with IBM, CapGemini and others are organizing an Environmental Data Hack Weekend this coming weekend.
Climate change is showing up in data everywhere. But many of the data sets are dispersed and not easily accessible. The Environmental Data Exchange was built by Digital Catapult and AMEE to bring together disparate datasets and make them more easily accessible. The goal of the Data Hack Weekend is to test out this new platform and build interesting applications on top of it.
I believe that as more people are exposed to the data that is already available and more data becomes available it will continue to raise awareness of how grave the situation already is. One example here in the US is the ongoing California drought that is turning into a major water emergency.
So, dear friends in London who care about this: go and join the Environmental Data Hack Weekend!
- Title: Evidence for Gamma-ray Emission from the Newly Discovered Dwarf Galaxy Reticulum 2
- Authors: Alex Geringer-Sameth, Matthew G. Walker, Savvas M. Koushiappas, et al.
- First Author’s Institution: Carnegie Mellon University
- Paper Status: Submitted to Physical Review Letters
Dark matter has been observed only “indirectly”
Most readers of this blog have likely heard of dark matter before; you may even take it for granted that it exists. A huge body of astronomical research exists offering conclusive evidence for some kind of mass in the universe which a) exerts gravitational force on regular matter, b) doesn’t emit light, and c) seems to be 5 times more abundant than regular matter. Evidence for this extra mass comes from gravitational effects, including
- The high velocities of stars/gas in galaxies, and of galaxies in larger clusters
- The oscillations of the CMB power spectrum
- Direct measurements of cluster masses from gravitational lensing (Fig. 1)
These observations indicate that there is much more mass in the universe than we can see with light.
But these gravitational clues are only “indirect evidence” for dark matter’s existence. It is like seeing a dancing marionette puppet (the gravitational effects), and inferring there must be strings (dark matter) to make it move.
Some models (such as Modified Newtonian Dynamics) try to explain these gravitational effects through a subtle change to the laws of gravity. These changes could make gravity stronger in certain places without requiring more mass. This explanation is like saying the puppet is held up by magnets: it could explain the puppet’s movement just as well as strings could. To prove dark matter really exists, we must actually see the strings, not just the puppet. Today’s paper offers some of the best evidence yet!
Where to look for “direct” evidence of dark matter
In order to conclusively prove the existence of dark matter, astronomers and physicists alike want so-called “direct evidence”: a unique measurement of a new particle, using some property other than gravity. Astrobites has previously covered many examples of ways to directly detect dark matter (assuming the particles are something like a WIMP). The most direct detection method is with particle detectors buried underground. Another direct method is to detect gamma-rays or radio waves emitted when two dark matter particles collide (Fig. 2).
Because dark matter particles don’t interact with light, they shouldn’t produce photons on their own when they collide. However, some models predict they could create a particle-antiparticle pair, like a tauon and an antitauon. These byproducts of the dark matter collision can then collide again, resulting in gamma-rays. They could also spiral around galactic magnetic fields, producing radio waves. This method is less direct than actually seeing a collision in a particle detector. But, it would be much more conclusive evidence for dark matter than the gravitational effects alone.
The best place to look for the emission from colliding dark matter particles is where dark matter is thought to be the most dense: the centers of galaxies and clusters. Observers claim to be very close to detecting gamma-rays coming from dark matter at the center of the Milky Way (see this Astrobite). But while large galaxies like the Milky Way should have a lot of dark matter, they also have a lot of background gamma-ray sources! Pulsars, supernova remnants, and our supermassive black hole all emit gamma rays like crazy, and are all clustered right near the center of the Milky Way. So even a strong dark matter signal could be tough to distinguish from the regular gamma-ray emission from the galaxy.
A better place to look for dark matter signatures is in dwarf galaxies. They may have less dark matter overall, but also have the benefits that:
- Dwarf galaxies are very common (there are more than 50 around the Milky Way and Andromeda)
- Many are located toward the galactic poles, away from the gamma-ray emission of the Milky Way
- They have many fewer internal gamma-ray sources, like pulsars and supernova remnants
Therefore, dwarf galaxies are thought to offer one of the best ways of detecting dark matter, if it actually emits gamma-rays at all.
A detection of gamma-rays from a dwarf galaxy
Previous attempts to observe gamma-rays coming from dwarf galaxies have been unsuccessful, yielding at best marginal signals (see this Astrobite). The authors of today’s paper analyzed archival measurements from the Fermi-LAT gamma-ray space telescope, looking for emission from the dwarf galaxy Reticulum 2. They estimate the expected background in two ways: one from the Fermi group’s model of diffuse emission in that region, and one from actual measurements, averaged over the regions around Reticulum 2.
In an exciting breakthrough, the authors find more gamma-ray flux than expected from either background model! This emission is observed to come from gamma-rays between 2 and 10 GeV (Fig. 3). If this emission comes from dark matter collisions, rather than some other source that hasn’t been identified yet, this energy range could help constrain the mass of the dark matter particles themselves.
The authors claim this to be a significant detection of dark matter. Values as high as 4σ are tossed around, suggesting less than a one-in-ten-thousand chance that this is just noise. This exciting result has been getting lots of press attention over the past week (such as in this Brown University press release, this Discovery article, and even the New York Times). However, these significance levels are dependent on the models chosen for both background emission and for the dark matter particle physics.
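For reference, a significance quoted in sigma can be converted to a tail probability with the standard Gaussian formula; the toy calculation below is my own illustration of where the "one-in-ten-thousand" figure comes from, not part of the paper's statistical analysis:

```python
# One-sided tail probability of a standard normal distribution at z sigma:
# p = 0.5 * erfc(z / sqrt(2)). Standard library only.
import math

def sigma_to_p(z):
    """One-sided Gaussian tail probability for a z-sigma result."""
    return 0.5 * math.erfc(z / math.sqrt(2))

p = sigma_to_p(4.0)
print(f"4 sigma -> p = {p:.2e}")  # about 3.2e-5, i.e. less than 1 in 10,000
```

This is only the nominal noise probability; as the paragraph above notes, the real uncertainty is dominated by the choice of background and particle-physics models, not by Gaussian statistics.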
In this Astrobiter’s opinion, the most that can be confirmed is the detection of an excess of gamma-rays from Reticulum 2. The authors themselves stress that there is a lot more work ahead to confirm this emission indeed comes from dark matter. More observations of this galaxy in gamma-rays are needed to improve the quality of the signal. The Fermi group is soon releasing a new data reduction algorithm, which will improve the resolution of the observations already completed. Perhaps even more important will be follow-up in other bands, like the optical or X-rays. These can identify whether another source that could produce gamma-rays (like a pulsar or a young, massive star) lies near this galaxy. Dark matter has not yet been conclusively detected, but these observations indicate we might be very close. It is unquestionable that this object (and other dwarf galaxies like it) will be getting a lot of attention in the years to come.
In my first post from the 2015 Lunar and Planetary Science Conference, I discuss the latest work on Philae images, and some cometary polymers.
A huge amount of effort goes into deciding where to try to collect a sample on Bennu. There are roughly nine months to survey, map and model the asteroid to help make this decision.
March 18, 2015
Film Irrespective of Richard Linklater's achievements, Before Sunset is a reminder of how films have forced a dedication in me which I should probably usefully apply to other things. Glancing through the review posted when the film was released, I'm reminded that because the film didn't turn up straight away at FACT in Liverpool, I travelled out to the Cornerhouse in Manchester after work, nearly an hour by train there and back again (this was when I was at Liverpool Direct storing up the money to go to university, or not depending on whether my application was accepted) (oddly enough moving to Paris was plan-B and at this point I'm not sure that wouldn't have been the better option) (but I digress). This wasn't an unusual trip. Years before I'd done the same with a friend for a screening of Singing in the Rain (and Hamlet even earlier) and would later kill myself to get to a screening of the first episode of Torchwood (not literally, clearly, though given what happened with the rest of that series it might have been preferable too).
With Netflix and rentals-by-post and the price of tickets, would I do this again now? Probably. I saw the sequel, Before Midnight, in the same screen at the same cinema in the same seat nine years later, at the end of a day spent shopping in Manchester, though it's fair to say I probably went to the city that week, on that day, because Before Midnight was playing. That would also describe my approach to Cedric Klapisch's Chinese Puzzle. What Netflix lacks, and rental-by-post only just manages to retain, is the sense of anticipation and the ritualistic aspect of going to the cinema: of buying a ticket, hoping your favourite seat is free and, if you're me, going to the toilet three times before the film starts because I've drunk too much coffee again. Oh, and reading the cinema's brochure advertising upcoming attractions, which in the case of the Cornerhouse is often films which I know I won't be watching for another six months while I wait for them to turn up on dvd or streaming (creating a different kind of anticipation).
But much of the trip has to do with the Cornerhouse itself. As you'll read in the coming weeks, the Hyde Park Picture House in Leeds was for a time my cavern of dreams. After that the 051 in Liverpool. Then FACT. But the Cornerhouse is the Cornerhouse. It's my favourite cinema. For me, it is cinema. Partly it's because, living in a different city, I can't take it for granted so it'll always be special. Partly it's because in earlier years, with its cushioned fabricy blue seats, basement screens of various sizes and box office across the street from its largest venue, it didn't seem like any cinema I'd ever visited. The smell. Plus, back in the day before the internet, those brochures always seemed filled with films which I'd never heard of and would never see. Even now, the Cornerhouse is the place I go when I want to see films which are important or feel like they're going to be important to me. The idea of getting on a train just to see a film at the Cornerhouse has never seemed ludicrous, because the experience of seeing a film there will be unlike seeing a film anywhere else.
Soon it'll be gone of course, replaced by a much larger and, to be fair, probably better designed venue up the road. Presumptuously called "Home", it'll be interesting to see if it retains the mystique. There'll be more screens, so a greater selection of films (one of the joys of the Cornerhouse was the limited selection at odd screening times, which meant that sometimes I'd stumble upon a film I wasn't expecting because it's all that happened to be on when I happened to be in Manchester). Plus a theatre. Plus a cafe. Plus the gallery space. It sounds like it could still be like no other cinema, bar FACT or the BFI, I've ever visited. But it'll also be nothing like the Cornerhouse, because initially it'll be without the memories which can only be gained by visiting a building repeatedly. That place may well be called "Home" but the Cornerhouse at the moment has a better claim to the description. The last film I saw there, quite deliberately, was Chinese Puzzle, but if it had been another installment in the life of Celine and Jesse that would have been just right too.
Here I am at a climbing wall in Bristol, having balked at the price and instead sat in the cafe. They ought to have a discount rate for people who don’t actually like climbing. Here’s what I enjoy instead. (I am so predictable):
Brrr, it was cold in grimmest South Wales on a grim Saturday when all the colours are grey and the light makes everything flat. I enjoyed a sublime hour long flight above Pontlottyn, even though I didn’t go anywhere except soar up and down the ridge alone and then top land in a howling gale.
My box of tricks is in the 3D printed purple box on my left next to the airspeed indicator. I’ve not had the chance to do anything with the data except plot it and go: “that’s pretty noisy” at the accelerometer data. I have plans to extract consistent correlations, barometer vs altitude vs temperature, bar position vs wind speed, roll angle vs turning rate registered on the compass, and whether I keep diving out of turns because I don’t have enough speed and control.
Unlike my so far doomed attempts at manipulating house and fridge temperature data, this flight dynamical system is memory-free. The same temperature, pressure, windspeed, and wing angle at any time of the flight should result in exactly the same response. Deformations of the wing are brief and temporary. This is not the case for the fridge where every cycle begins with a different temperature distribution within the dense cabbagy foodstuffs and chemical pumping machinery.
Subject to instrumental noise and turbulence, all the data should be with me, and I can only hope this isn’t going to end up as just another one of my expensive failed software projects that looked plausible when I began, but then crash landed in the trees.
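A minimal sketch of the kind of pairwise correlation planned above (say, barometer reading against altitude), using only the standard library. The synthetic flight data below is invented for illustration; real logger output would of course be far noisier:

```python
# Pearson correlation between two flight-log channels, pure standard library.
# The data is a made-up stand-in: pressure falls roughly linearly with
# altitude near the ground, plus a deterministic "noise" term for jitter.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Altitude in metres; pressure drops ~12 Pa per metre, plus sensor jitter.
altitude = [300 + 10 * i for i in range(50)]
pressure = [97700 - 12 * (a - 300) + 30 * math.sin(i)
            for i, a in enumerate(altitude)]

r = pearson(altitude, pressure)
print(round(r, 3))  # strongly negative: pressure falls as altitude rises
```

A correlation this clean is the memory-free property in action: each reading depends only on the current state, so the channels line up point by point with no history to untangle.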
When I discuss the impact of automation on the labor market I sometimes run into the argument that self driving cars are still years away and that there are a lot of edge cases to solve before they will replace human drivers.
That argument makes it appear as if there was some distinct moment in the future where we finally say “hurrah, the self driving car is completely finished” and that there will be no impact until then. But that is not how technology progresses.
Instead, technology tends to arrive piecemeal in fits and starts. And it has an impact all along the way already. The same is true for self driving cars. We already have some key components that have entered the market: the most impactful one has been GPS based navigation (which has now come to pretty much every car courtesy of smartphones). While this doesn’t automate the driving per se it does take over a crucial bit which is figuring out how to get from A to B. And that means that cheaper labor can now be substituted for trained labor in driving taxis and limousines!
What will be the next big step? Well, one possibility is remote driving. Suppose you have a self driving car but it can’t handle certain edge cases, like getting stumped on a New York City street that’s blocked by an oil delivery truck (we’ve all been there). Instead of having to solve a situation like that completely autonomously, one can envision a future of driver pools that stand by whenever a self driving car needs an assist. Here technology provides leverage — a single driver might now be able to “supervise” multiple cars. And of course that driver could be in a low cost part of the country.
This has of course already happened, although not yet with cars — the US military has many new drone pilots who have never logged an hour of flight in a real plane. And drones can keep themselves up in the air and take off and land by themselves. So even the number of pilot hours relative to flight hours has already changed. And while I don’t know anything about military pay, in commercial aviation the impact of automation has helped drive down airline pilot pay at the 75th percentile from $150K in 1999 to $100K in 2011.
So let’s please not pretend that we shouldn’t think about labor market consequences of self driving cars because fully autonomous cars en masse are still years away. The impact of what is becoming possible will be felt all along the way.
Authors: Mark Hollands, Boris Gaensicke, Detlev Koester
First Author’s Institution: Department of Physics, University of Warwick, Coventry, CV4 7AL, UK
The galaxy is littered with white dwarfs, the burnt out remnants of stars that have run out of hydrogen fuel in their cores, but were too small to explode as supernovae. But far from being lifeless orbs, around a tenth of white dwarfs have powerful magnetic fields, a million times stronger than that of the Sun. How did these magnetic white dwarfs become such strong magnets? And just how many are there? Hollands et al. set out to answer the second of these questions, in the hope that it will shed light on the first.
By far the easiest way to find magnetic fields in white dwarfs is to look at the absorption lines in their spectra. Normally, the light absorbed by an element at a certain wavelength is seen as a single, thin line. However, if the white dwarf is magnetic, then a process called the Zeeman Effect splits the line into three. The further the outer two lines are from the central, original absorption line, the stronger the field. This offers a simple way to both find magnetic white dwarfs and to measure their field strengths.
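In the linear (normal) Zeeman regime, the field strength follows directly from how far the outer components sit from the central line: delta_lambda = e * B * lambda0^2 / (4 * pi * m_e * c). A rough sketch of the inversion, using an illustrative line and shift rather than figures from the paper:

```python
import math

# Physical constants (SI)
E_CHARGE = 1.602176634e-19   # electron charge, C
M_E      = 9.1093837015e-31  # electron mass, kg
C        = 2.99792458e8      # speed of light, m/s

def field_from_splitting(lambda0_m, delta_lambda_m):
    """Infer B (tesla) from the wavelength shift of the outer Zeeman
    components, inverting delta_lambda = e*B*lambda0**2/(4*pi*m_e*c)."""
    return 4 * math.pi * M_E * C * delta_lambda_m / (E_CHARGE * lambda0_m**2)

# Example: a 16 angstrom shift of a line near 5893 angstroms
B = field_from_splitting(5893e-10, 16e-10)
print(f"{B / 100:.1f} MG")  # 1 MG = 100 T; this comes out near 1 MG
```

So megagauss fields split optical lines by tens of angstroms, which is easily visible in survey spectra.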
But as white dwarfs get older, they cool down, and the helium absorption lines in their spectra disappear as the atoms fall to their ground state. This hides the magnetic field, preventing us from knowing the magnetic behaviour of white dwarfs over their entire age range. Fortunately, a large fraction of these white dwarfs also have planetary systems. Asteroids and other planetesimals often fall onto the white dwarfs, producing the absorption lines needed to reveal their ancient magnetic fields. White dwarfs like that are known as DZs, Degenerate objects that have elements with a high atomic number (Z) in their atmospheres.
The authors used the huge Sloan Digital Sky Survey (SDSS), filtering down the thousands of objects observed to find 79 DZs. Of these, 10 were found to be magnetic, 7 of them previously unnoticed. Their magnetic fields were huge: up to 9.59 MegaGauss (nearly 1000 Tesla). For comparison, the large magnets used in MRI machines generate around 3 Tesla. The authors found that this implied that roughly 13 percent of DZ white dwarfs were magnetic. However, Hollands et al. point out that there are many biases involved, such as the difficulty of seeing dim objects, the difficulty of spotting weak magnetic fields in spectra with low signal to noise, and the uncertainty over the geometry of the magnetic fields. Accounting for all of these biases could lead to an even higher occurrence rate.
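The quoted rate is just the sample fraction, and with only 79 objects it carries a sizeable statistical error bar even before the observational biases the authors discuss. A naive binomial sketch (not the authors' own bias analysis) makes that concrete:

```python
import math

n_dz, n_mag = 79, 10  # DZ white dwarfs in the sample; magnetic ones

p = n_mag / n_dz                     # sample occurrence rate, ~13%
se = math.sqrt(p * (1 - p) / n_dz)   # simple binomial standard error
print(f"{p:.1%} +/- {se:.1%}")
```

This gives roughly 13% with an error of a few percentage points, so the true occurrence rate is only loosely pinned down by a sample this size.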
The authors then compared this rate with that found by others looking at different types of white dwarf, or over different temperature ranges. What they found was surprising: Magnetic fields appeared to be much less common in other types of white dwarf, and much weaker.
With this in mind, Hollands et al. then tried to answer the question of where the magnetic fields came from. The simplest explanation is that they were there from the start, left over from when the white dwarfs were stars. The progenitors of the magnetic white dwarfs could have been the curious Ap/Bp stars, which have much stronger magnetic fields than most stars, at around a few KiloGauss. As they shrunk down into white dwarfs, conservation of magnetic flux would have ramped up the field to the required MegaGauss levels. Unfortunately there aren’t enough Ap/Bp stars to account for the high occurrence rate of magnetic DZs. Nor does this explain why they found many more magnetic DZs compared to other types of white dwarf.
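The "fossil field" argument rests on flux conservation: B * R^2 stays roughly constant as the star shrinks, so the field grows as 1/R^2. A toy calculation with illustrative radii (not taken from the paper) shows kilogauss fields comfortably reaching megagauss strengths:

```python
def compressed_field(b_initial_gauss, r_initial, r_final):
    """Magnetic flux conservation: B * R**2 is constant, so the field
    scales as (r_initial / r_final)**2 as the star contracts."""
    return b_initial_gauss * (r_initial / r_final) ** 2

# Illustrative numbers: a 3 kG Ap star of ~2 solar radii collapsing
# to a white dwarf of ~0.01 solar radii (a factor-200 contraction)
b_wd = compressed_field(3e3, 2.0, 0.01)
print(f"{b_wd / 1e6:.0f} MG")  # factor 200**2 = 40,000, i.e. ~120 MG
```

So the physics of the fossil-field scenario works; the problem the authors identify is purely one of numbers, since Ap/Bp stars are too rare.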
The authors’ next suggestion is that the magnetic fields are the result of a binary merger, where the white dwarfs combined with companion stars into single objects. During this process a magnetic dynamo would naturally be created. This would also explain why these white dwarfs seem to have a higher average mass than other white dwarfs. The problem with this hypothesis is that the authors only know that the white dwarfs are magnetic thanks to the white dwarfs’ planetary systems, which probably wouldn’t survive such a merger. However, the authors say that it is possible that it could be a planet doing the merging, rather than a companion star. But this explanation again falls victim to the high incidence rate: models of planetary systems around white dwarfs suggest that such mergers can’t happen often enough to make all of the magnetic white dwarfs.
Thanks to these flaws in all of the potential explanations, the authors are forced to leave the question of how the magnetic white dwarfs formed unanswered. They finish by pointing out that the next set of results from the SDSS is on its way, promising to provide many more of the mysterious magnetic white dwarfs to investigate.
March 17, 2015
People This week BBC Four broadcast Climate Change by Numbers in which three mathematicians - Dr Hannah Fry, Prof Norman Fenton and Prof David Spiegelhalter - each utilised a set of numbers to explain why climate change should be the most pressing issue on planet Earth that we should all be dealing with, and why we should believe the 97% of climate scientists who attribute it to human activity. The whole thing is available to watch here for the next two weeks, assuming it isn't repeated, in which case it'll be a bit longer.
As her biography page at UCL explains, Dr. Fry, "is a lecturer in the mathematics of cities at the Centre for Advanced Spatial Analysis (CASA). She was trained as a mathematician with a first degree in mathematics and theoretical physics, followed by a PhD in fluid dynamics. After a brief period working in aerodynamics, she returned to UCL to take up a post-doctoral position researching a relatively new area of science - social and economic complex systems. This led to her appointment as a lecturer in the field in October 2012."
Here she is herself with a richer explanation of her interests from a German science conference's channel:
Which is fascinating. For the past few years she's appeared at a range of conferences and courses demonstrating this approach to data, and I've gathered as many of these talks as I can find. There's some repetition in places, but she's excellent at finding new ways of presenting similar data for different ends.
Let's begin with Fry's own channel where there are three pieces, two of which are provided for her students and (just slightly) above entry level:
Can maths predict a riot
"This video came about as a splinter project of some work I've been doing at UCL, trying to understand the 2011 London riots from a mathematical perspective. "
Linear relationships, power laws and exponentials
Logs and exps
The general UCLBASc channel has this piece explaining the course in greater detail:
But perhaps she's best known publicly for these brilliant TED Talks (the first also available as an eBook and which, as Fry's media page demonstrates, gained some interest around Valentine's Day this year):
The mathematics of love
"Finding the right mate is no cakewalk — but is it even mathematically likely? In a charming talk, mathematician Hannah Fry shows patterns in how we look for love, and gives her top three tips (verified by math!) for finding that special someone."
Is life really that complex?
"Hannah Fry trained as a mathematician, and completed her PhD in fluid dynamics in early 2011. After a brief period working as an aerodynamicist in the motorsport industry, she came back to UCL to work on a major interdisciplinary project in complexity science. The project spans several departments, including Mathematics and the Centre for Advanced Spatial Analysis, and focuses on understanding global social systems -- such as Trade, Migration and Security. Hannah's research interests revolve around creating new mathematical techniques to study these systems, with recent work including studies of the London Riots and Consumer Behaviour."
In 2014 she was invited to speak at Ada Lovelace Day 2014:
Can Maths Predict the Future?
"Hannah Fry shows how maths can explain real world events. From crimes to relationships, patterns in numbers such as Benford's law on the prevalence of numbers starting with 1', help us predict the future."
She extemporised on riot prediction theory at re:publica 2014 across a whole hour:
I predict a riot!
Synopsis: "It happens again and again: peaceful protests turn violent, paving stones become missiles, the police respond with truncheons, tear gas and water cannons. People are hurt, sometimes fatally. But why were certain areas and neighbourhoods affected by the violence, while others remained completely peaceful? Why does social unrest start in the first place? How does it spread?"
She's also a regular on the Numberphile channel:
And the BBC's Britlab:
She's also presented a Radio 4 documentary, Can Maths Combat Terrorism?
For more on Dr Hannah Fry, visit her website.
Think of a map in combat or a chessboard. There are two things that both instruments tell you.
First, is the position of things or WHERE things are in relation to other things. This Knight is on this part of the board next to the King etc. The troops are on this hill and the enemy is in the pass below.
The second thing they tell you is WHERE things can move to. We can move these troops positioned on the hill, down the hill to the south but we can't move the same troops north because there is a cliff which falls into the sea etc. We can move this Knight here or that Rook there.
A Wardley Map enables you to draw a line of the present (an existing business or organisation) on a landscape of position (value chain) vs movement (evolution). See figure 1.
Figure 1 - A MAP
"We're going to make the lives of cats better everywhere. It's part of our vision to become the best supplier of cat food. That's why we're building the internet of things for kittens!"
BUT be warned. When asked the same question you need to reply in the same vague, hand wavy way. Don't give away information. Misinform. So do a bit of digging on your competitors before you attack. Just in case they know what they're doing. Chances are, you'll be fine.
Which is the other point of mapping. Don't just map yourself, map your competitors. If a competitor doesn't understand the game, don't hesitate to help yourself to their market. It's always easier to build by taking out the easy prey before you tackle the hard targets.
Film When I wrote my MA Film Studies dissertation about "hyperlink" films, as you might expect I had to watch a lot of them, and although the process was mostly about analysing them looking for commonalities, pretty soon I did realise that some of them were better than others. Which is why I can just about agree with Nicholas Barber in today's G2 about how some films which would have worked perfectly well as portmanteau films found themselves mixed and matched in unuseful ways:
"... prefer a good honest portmanteau. It’s less tricksy than a hyperlink film. Every section has to stand or fall on its own merits, as well as complement the whole. If one segment doesn’t entertain, it risks being cut out: the other segments will survive without it. Sub-Altman hyperlink films, on the other hand, can use their constant back-and-forthing to disguise the weakness of the individual strands. You don’t get such shilly-shallying from Dr Terror’s House of Horrors, where every gruesome story packs a punch and has a twist to remember. Great title, too."But as I researched my dissertation, one of the questions I had to deal with in relation to considering if this was a genre and what its tropes might be was what the "pleasures" might be for the audience, the repeated element that people look for. In hyperlink films this is the moment when you realise how the people are connected, that two characters you've been following are actually (sorry) siblings or married or co-workers or whatever something which often doesn't happen until deep into the film causing you to re-evaluate what you've seen before. Todd Solondz is a master of this - Happiness being the primary example. Oddly, he doesn't pention Paris J'Taime which brilliantly is a portmantau film, until it isn't.
Last night at dinner while waiting for the food we played the “Google autocomplete game.” Here is how it goes. You go to Google and type the beginning of a search query. Then you tell the others the possible completions and they have to guess which is the most popular (you can even try to guess the whole order — this is loosely based on Family Feud and if you want a more elaborate version check out Google Feud).
So when it was Susan’s turn she typed “VCs are” and got this, which is pretty funny but also a bit revealing
I am not exactly sure if this list is entirely driven by the most frequent queries or takes other signals into account, but in any case it does reveal something about the perceived image of venture investors (see my posts about information cascades on why this should be taken with a grain of salt).
The first one is interesting because it is right but not necessarily in the way you might expect. Most likely what people are looking for are examples of VCs doing mean things to founders. There is another interpretation though. I have encountered a number of (especially younger) entrepreneurs who have selected their investors based on whom they got along with the easiest. But that’s a shallow definition of friend. Rather, a friend should be someone who will tell you when you are screwing up. With that definition of “friend” I agree that a lot of VCs are not friends, because they wait too long to tell entrepreneurs that they are screwing up, and by then it is usually too late.
The second one could be fed by many things but I suspect that it primarily arises from situations in which the interests of the VC and those of entrepreneurs diverge. There are many such situations that can arise over the lifetime of a company, ranging from giving entrepreneurs reasons for passing on an investment to negotiating the price at which an investment takes place. My own approach to these situations and others (e.g., sale of a company) has been to make the conflict completely transparent and discuss it openly and thoroughly. This takes more time and may lead to somewhat different outcomes. But I find it sure beats hiding behind claims that something is “standard” or possibly even claiming it is good for the entrepreneur when it is not.
The third one frankly came as a surprise to me. Personally I could be considered a failed academic (completed my PhD but didn’t go into academia), but many VCs I know are former operators, often with great success. So this morning I figured I would look at what this query actually returns, and the top result is an article from Australia titled “VCs are ‘failed academics’” that as far as I can tell has nothing to do with VCs. So here then we are left with a bit of a puzzle and potentially a sign of an information cascade of the type that landed Google in a defamation lawsuit brought by the German first lady.
The fourth one is, well, dumb, and I am sure applies to many other professions (I didn’t actually test this). For good measure then, here is what Google thinks about entrepreneurs
The Planetary Society's LightSail spacecraft made an appearance on national television Monday night during a two-minute segment by CBS Evening News.
For those of you who are here at LPSC 2015, we’ve organized a special session at noon on Tuesday, March 17th in the Montgomery Ballroom to bring together representatives from the three major professional organizations that represent planetary scientists to address your questions and concerns about NASA's 2016 budget request.
March 16, 2015
Continuous and Sustainable Competitive Advantage comes from Managing the Middle not the Ends by Simon Wardley
Why bother with a profile?
The components in the profile are all evolving from left to right (i.e. competition drives them to more industrialised). If nothing novel was ever created then it would all (assuming effective competition) become industrialised. You can use a profile to determine the balance of your organisation and whether you need more pioneers, more settlers or more town planners (see figure 6).
To give you an idea of how to apply the three party structure: I've covered this before (many, many times) but this diagram will help.
Figure 7 - Applying PST (Pioneer, Settler and Town Planner) to a Map
As with the need for using different methods whether for purchasing or project management (agile for genesis, lean for transition, six sigma for industrialised), the roles of pioneers, settlers and town planners are different. So is their culture. So is the way they organise. So are the types of people. See figure 8.
Figure 8 - Characteristics of Pioneer, Settler and Town Planner
Ok, but what has this to do with the middle?
This form of structure is the only way I know of sustainably solving the Salaman and Storey innovation paradox of 2002. This isn't "nice ideas"; it came from competitive practice over a decade ago and subsequent re-use. Why it worked so well could only be explained afterwards in 2007, when the evolution curve went from hand waving pattern to something more concrete.
"Of all the series mentioned above, “Harry Potter” is most pertinent to Boyhood. Their production periods overlapped considerably, and their directors faced similar major challenges. This despite the disparity in their finances. “Harry Potter” had a huge budget supplied by Warner Bros., ranging from the lowest at $100 million for Chamber of Secrets to the highest at $250 million for Half-Blood Prince. (These are the budgets as publicly acknowledged, taken from Box-Office Mojo, which has no figures for the last two films.) Boyhood had a lean budget of $200 thousand for each of the twelve years and totaling about $4 million with postproduction and other expenses added in. Still, as we shall see, the challenges really had nothing to do with the budgets."Previously. I love the nugget that the the IFC Center in New York ran all eight Potter films in a row as a homage to Boyhood. Kristin makes a convincing case for the two to be somehow inextricably linked, and the appearance of the Potter books in Boyhood suggests it is something Linklater was clearly conscious of.
The NASA Administrator declared that the Opportunity rover is a mission 'whose time has passed' and will be defunded next year. Will Congress act to save it?
March 15, 2015
These days I use maps of organisations to apply a cell based structure, ideally using a Pioneer, Settler, Town Planner model. The map is based upon a value chain (describes the organisation), how things evolve (describes change) and the changing characteristics of activities, practices and data between two extremes (the uncharted to the industrialised) - see figure 1.
Figure 1 - PST.
I use these maps across many business and governments to reduce duplication, bias & risk whilst improving situational awareness, gameplay, correct use of multiple methods & communication - see figure 2.
Figure 2 - Map of a Media Company.
Key to mapping is the evolution axis i.e. you can't actually anticipate change over time in any effective manner (we don't have a crystal ball). See figure 3.
Figure 3 - Evolution
A bit of HISTORY.
My early attempts to map an environment were a disaster because I attempted to use time and time based sequences to measure change. I eventually postulated a path for how things evolved and presented this at Euro Foo 2004, EuroOSCON 2006 and many conferences in between - see figure 4.
Figure 4 - Early Evolution [from EuroOSCON 2006].
- I don't use Colonist anymore. I use Pioneer, Settler and Town Planner.
- I don't use innovation anymore. I use the term genesis (of a novel and new thing) because innovation is so abused to mean everything.
- I don't use chaotic anymore. I use uncharted to describe the unknown space.
- I don't use linear anymore. I use industrialised to describe the defined space.
Mapping, the use of a three party system, and the extremes of uncharted to industrialised are not new. Despite refinement to the terms, this stuff is all nearing or over a decade old. The field has moved on since then into understanding common economic patterns, weak signal analysis and other forms of organisational learning.
By the time something of use gets written in a book or a publication, the value related to it has often been extracted long before. You can't get to the bleeding edge of competition by reading books or publications. You have to live it. Even models like pioneers, settlers and town planners are starting to feel dated and are being examined through lenses of a continuous spectrum and other structures.
The laggards are nowhere near getting to the spaces where the leading edge have long since moved on from.
The gap appears to be widening.
I used to think the gap between the leading edge and laggards was five to seven years, it now seems like fifteen or possibly much more (i.e. how long it will take the laggards to catch up with where the leading edge is today). This may be a normal consequence of inertia, an acceleration in the underlying cycle of change combined with poor situational awareness - of this, I'm not sure. It needs testing.
However, within the laggards it does feel as though the copying of easy to digest memes appears to be more rampant, circulated by those who benefit from the memes to those that need an 'easy solution' in a marriage of convenience. As Rosenzweig once said:
“Managers are busy people, under enormous pressure to deliver higher revenues, greater profits and ever larger returns for shareholders. They naturally search for ready-made answers, for tidy plug-and-play solutions that might give them a leg up on their rivals. And the people who write business books – consultants and business school professors and strategy gurus – are happy to oblige.”
I see an awful lot of companies plunging down paths they shouldn't with little or no situational awareness. This isn't getting better for some. There's going to be an awful lot of casualties.
March 14, 2015
- The cycle of commoditisation
- The evolution curve (ubiquity vs certainty)
- Evolution vs Diffusion
- Three layers of IT shifting from product to utility
- Why we had no choice (pressure for adoption, Red Queen)
- Accelerators to evolution (network effects, open source, standards)
- Polar extremes of an organisation (innovation to commodity)
- Properties of those polar extremes
- Why one size methods don't work and the need for different methods (Agile and Six Sigma)
- How to organise for these two extremes (Pioneers, Settlers and Town Planners)
- Different investment / competition effects as activities evolve
- The importance of maps (though I didn't cover how to map).
- The issue of bias
- Creative destruction
- The innovation paradox
- How reducing non strategic cost is both friend (operational efficiency) and foe (reducing barriers to entry)
I also provided Four Predictions
- An increase in the rate at which innovative services and products are released to the web.
- More disruption in the information markets.
- Increasing organisational pressure within IT.
- The architecture will never be completed.
7 years later ... the predictions all hold and every item on the presentation is still relevant.
These days, any arguments are about who created the ideas first. Usually some consultancy firm is claiming them. This makes me smile. Most of the ideas come from long dead economists. I like long dead economists, they work cheap and don't grumble.
It's marvellous to see how things have changed.
Today, I work at the LEF where many governments and companies now fund me to research into even more marvellous areas. Had I never taken the risk, spent my life savings and given the concepts all away then that would never have happened. I'd never have worked for Canonical. I'd never have written the 'Better for Less' paper. I wouldn't have met the good friends I have today.
Karma is kind.
The slides are just an aid to the talk. At some point I'll record the video again (I have the original text) and it'll make more sense but the slides are enough for now.
Apologies to my regular blog readers but I have been battling a nasty cold. So instead of getting up in the morning and writing a blog post I have been sleeping in a bit longer. I probably should have taken a couple of days off when I first got this cold a couple of weeks ago but have been kind of swamped at work and have ignored my own advice.
Lots going on that I would like to write about though, in particular I have been intrigued by the debate around Hillary Clinton using her own email account during her time as Secretary of State as well as the recent decision to retain state workers' email for only 90 days in New York. If you want to discuss some of this, I suggest heading over to USV where Privacy & Security are the topic of the week.
Adding Churyumov-Gerasimenko to my scale comparison of comets and asteroids by The Planetary Society
Having found a color photo of the comet, I finally added Churyumov-Gerasimenko to my scale comparison of comets and asteroids visited by spacecraft.
Next week is the 46th Lunar and Planetary Science Conference (LPSC), and Emily Lakdawalla will be attending to tweet and blog about news from Rosetta; Curiosity; MESSENGER; GRAIL; Chang'e 3; Dawn; New Horizons; Cassini; and more.
March 13, 2015
- Pioneers don't disrupt. There is nothing to disrupt.
- Settlers sometimes disrupt (as in product to product substitution in an industry). This is unpredictable.
- Town Planners often disrupt past industries. This can be anticipated quite a time in advance.
- There are two forms of disruption (predictable and non predictable) but that doesn't stop people pretending there is one.
- You are using an outside ecosystem to act as your pioneers (e.g. providing utility services through public APIs and mining consumption data)
- You introduce some form of forcing function to prevent novel activities being created internally e.g. introducing a requirement to create a press release before writing a business plan or prototyping anything. [Hint : people can't write press releases for novel, uncertain activities which haven't been explored yet]
- You are using an outside ecosystem (consumption) to act as your pioneers
- You are using an outside ecosystem (supply) to act as your town planners
Friendship is context-sensitive.
I wouldn't describe Terry as a friend, but as someone I'd been on a first-name acquaintanceship with since the mid-1980s. If you go to SF conventions (or partake of any subculture which has regular gatherings) you'll know the way it works: there are these people who you don't really see outside of this particular social context, but you're never surprised to see them in it, and you know each other's names, and when you meet you chat about stuff and maybe sink a pint together.
I haven't seen Terry since the Glasgow worldcon in 2005. The diagnosis of his illness came in 2007; I'd been spending a chunk of 05-07 out of the country, and after the bad news hit I didn't feel like being part of the throng pestering him (for reasons I'll get to later on in this piece.)
I first met him, incidentally, back in 1984, at a British eastercon in Leeds. It was, I think, my first SF convention. Or my second. I was a spotty 17- or 18-year-old nerd, wandering around with a manuscript in a carrier bag, looking for an editor—this was before the internet made it easy to discover that this was not the done thing, or indeed before word processors made typewritten manuscripts obsolescent. (Let's just say that if in a fit of enthusiasm you borrowed your future self's time machine and went back to that convention in search of me you'd have been disappointed.)
There were plenty of other embryonic personages floating around there, of course. I remember meeting this tall goth dude with shaggy hair, dressed all in black and wearing mirrorshades at midday, who resembled the bassist from the Sisters of Mercy. (He was called Neil, he wrote for a comic called 2000AD, and he had an oddly liminal superstar quality even then: everyone just knew he was going to be famous, or in a band.) And there was this thirty-something guy with glasses and a bushy beard propping up the bar. What set him apart from the other guys with beards and glasses was that he had a hat, and he was trying to cadge pints of beer with an interesting chat-up line: "I'm a fantasy writer, you know. My third book just came out—it's called 'The Colour of Magic'." So you'd buy him a drink because, I swear, he had some kind of bibulous mind-control thing going, and he'd tell you about the book, and then you'd end up buying the book because it sounded funny, and then you were trapped in his snare forever.
Back then, Terry was not some gigantic landmark of comedy literature, with famous critics in serious newspapers bending over to compare his impact on the world of letters to that of P. G. Wodehouse. Terry was earning his living as a press officer and writing on the side and didn't feel embarrassed about letting other people pay for the drinks. And so over the next few years I bought him a pint or two, and began to read the books. Which is why I only got hooked on Terry's shtick after I'd met him as Terry the convention-going SF fan.
Some time between about 1989 and 1992, something strange began to happen. I started seeing his name feature more prominently in bookshops, displays of his books planted face-out. He started turning up as guest of honour at more and more SF conventions. When a convention did a signing with Terry, suddenly there was a long queue. And when he walked into a room, heads turned and people began to close in on him. There's a curious phenomenon that goes with being famous in a particular subculture: if everybody knows you, you become a target for their projected fantasy of meeting their star. And they all want to shake your hand and say something, anything, that connects with what your work means to them in their own head. (If you want to see this at work today, just go to any function he's appearing at—other than the Oscars—and watch what happens when Neil Gaiman walks into the room. He is, I swear, the human Katamari.)
Being on the receiving end of this phenomenon is profoundly isolating, especially if you're one of those introverted author types who can emulate an extrovert for a few days at a time before you have to hide under the bed and gibber for a while: you're surrounded by strangers who desperately want to connect with you and after a time it becomes really hard to tell them apart, to remember that they're individuals with their own lives and stories and not just different faces emerging from the surface of a weird shape-shifting fame-tropic amoeboid alien. It's not just authors who get this: if anything we get off very lightly compared to actors, politicians, or rock stars. (For some insight into it, go listen to the lyrics of Pink Floyd's "The Wall".) I should add, this sort of introversion is really common among writers. It's an occupation that demands a certain degree of introspective self-absorption, alongside a constant distance from the people you're observing, who—they mostly don't know this, of course—may provide the raw fuel for your work. So, if you want to hang on to your sanity, eventually you either go and hide for a bit, or you surround yourself with people who aren't faintly threatening strangers who want a piece of your soul. Which is to say, you selectively hang out with your peers, or folks you met before you caught the fame virus.
Terry was not only a very funny man; he was an irascible (and occasionally bad-tempered) guy who did not suffer fools gladly. However, he was also big-hearted enough to forgive the fools around him if they were willing to go halfway to meeting him by ceasing to be foolish at him. He practised a gracious professionalism in his handling of the general public that spared them the harsh side of his tongue, and he was, above all, humane. As the fame snowballed, he withdrew a bit: appreciating that there was a difference between a sharp retort from your mate Terry at the bar and a put-down from Terry Pratchett, superstar, he stepped lightly and took pains to avoid anything that might cause distress.
Anyway, this isn't a biography, it's just the convoluted lead-in to an anecdote about the last time I saw him (which was a decade ago, so you'd better believe me when I say our relationship was "situational friend" rather than "personal friend").
On the last day of the worldcon in 2005, I was wandering around feeling extremely frazzled and a bit hunted. I'd just won my first Hugo award, and my right hand was sore from people I didn't know grabbing it. Eventually I realized that I just couldn't cope with the regular convention concourse in the conference centre—I was a walking target of opportunity for people who wanted to shake the hand that held the pen that wrote the ... something, I guess.
At a British worldcon, you can count on there being a really excellent real ale bar tucked away in a corner of one of the hotels or fan areas. I headed for the real ale bar and found a degree of comfort and shelter there, because it was mostly full of familiar faces who didn't need to push into my personal space because I was just some guy they'd been bumping into in convention bars for a decade or two. The rate of hand-grabbing dropped to a survivable level: I began to relax, and found a couple of old friends to hang with. And then I noticed Terry.
Terry had not won a Hugo. He didn't need to. (As he said, "I was in the audience at some literary awards ceremony or other with J. K. Rowling one time, and she was lamenting how they'd never give her one, so I turned to her and I said, Jo, me neither: we'll just have to cry ourselves to sleep on top of our mattresses stuffed with £20 notes." Money being, of course, the most honest token of appreciation a commercial author can receive.) Terry didn't need a shiny new Hugo award to find it nearly impossible to walk around a convention and just be a fan: I was getting my first taste of the downside of fame, but Terry had been living with being Terry Pratchett, OBE, Richest Author in all the Land, for more than a decade. He was looking tired, and morose, and a bit down in the dumps. So we went over to say hi.
At this point, he perked up. Omega, who I'd been chatting to, had first met him in the mid-80s, about the same time as me: Feorag got a pass for being married to one of us. He'd been having a hard time being Terry Pratchett in public for five consecutive days. He wasn't quite ready to go and hide out in his hotel room, but he needed some respite care from being a Boss-level target in every starry-eyed fan's first-person autograph shooter; so, as it was coming up on lunchtime, by mutual agreement we dragged him away from the SECC to Pancho Villa's in Glasgow for lunch. Okay, Glaswegian-Mexican food is not what you'd necessarily call good good. But it filled a corner and, more importantly, it got him far enough away from the convention to decompress a little in company that wasn't going to place any demands on him.
Now, Terry (like the late Iain Banks) seemed to feel a bit of noblesse oblige (or maybe just plain survivor's guilt) over the sheer mind-boggling scale of his success. ("I realized I was rich," he recounted, "when I got a call from my agent one Thursday. That cheque I mailed you—did you get it? He asked. And I realized I couldn't find it: lost down the back of the sofa or something. Can you cancel it and mail me a new one? I said. And he said, yes I can do that, but you realize you won't be able to deposit it before next week and you'll lose the interest on it? And I said sure, just go ahead, cancel it, and send me a new one. Then I put the phone down and realized it was for half a million pounds.") Things had obviously changed since the days when he had to cadge drinks off fans in convention bars: and I realised that I hadn't bought him a pint since about 1989, and this rankled a little bit. Nobody likes to think of themselves as a charity case. Also, I'd just won a Hugo and landed a new three book deal and was beginning to feel a bit of that survivor's guilt myself.
So at the end of the meal, while he went to the toilet, I tried to pick up the bill. But the waitress was slow, he got back to the table before she could make off with my credit card, and when he pulled out his gold visa card, snarled "who's the rich bastard here?!?", and chuckled to himself, I knew I was beat. And I never did get to buy him lunch, in the end.
Anyway, those are some of my memories of Terry Pratchett.
He was generous not just with money, but with his soul. He was irascible, yes, and did not suffer fools gladly: but he was empathic as well, and willing to forgive. Witty. Angry. Eloquent. A little bit burned by his own fame, and secretly guilty over it, but still human. And the world is smaller and darker without him, and I miss him deeply.
Some of you may be aware that there's a tabletop role-playing game set in the Laundry Files universe, sold by Cubicle 7 Games.
It's available on paper, and as PDF downloads via the usual folks (such as DriveThruRPG).
Anyway, there's a special promo for the next couple of weeks; Bundle of Holding, who do Humble Bundle-style sales of RPG materials, are doing a special Bundle of Laundry offer. For $8.95 or more, you get the core rule book and the player's handbook as PDFs; if you pay more than their median price (currently $24.32) you get a whole bunch of extra supplements—basically the entire RPG for under $25 (or about £16.50 in real money). Oh, and this stuff? Is all DRM-free.
So if you've had a vague yen to dust off a tabletop RPG for an evening's fun with friends, why not see if you, too, can survive your training as a Laundry operative without losing your mind?
Title: The Three-Dimensional Evolution to Core Collapse of a Massive Star
Authors: S. M. Couch, E. Chatzopoulos, W. D. Arnett, F. X. Timmes
First Author’s Institution: TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Pasadena, CA
Status: Submitted to ApJ Letters
There are hordes of them out there. Giant behemoths that masquerade as massive stars, but that never birthed a single radiant photon nor fused a pair of hydrogen nuclei. All of them are found on Earth. Instead of atoms, they’re built up of strings of 0s and 1s and live in computers of all shapes and sizes across the world (though they’d prefer the roomier accommodations of supercomputers, if you ask them). Inspired by their real, yet enigmatic counterparts in the physical universe, we first brought them to life in simple, spherically symmetric, one-dimensional form. We quickly bestowed on them full three-dimensional complexity and an increasingly comprehensive set of input physics as soon as our computers possessed the computational brawn to handle them, feathery convective plumes and other instabilities, mottled compositional complexions, freewheeling invisible neutrinos, and all.
All for one goal: to understand how they die. We’ve observed their real counterparts’ supernovae over and over and over again, at a rate of about one every five seconds, bursting in galaxies near and far, their spectacular showers of photons (sometimes rivaling that of an entire galaxy) often traveling cosmic distances to reach our army of telescopes on Earth. We’ve sought to replicate these supernovae in our virtual massive stars, giving them encouraging nudges by injecting extra boosts of energy-imparting, explosion-inducing neutrinos, giving them a bit of a spin, tweaking how their mass is distributed, sending in sound waves, draping them with magnetic fields.
And yet. Despite the care with which we’ve crafted them, our virtual massive stars almost always refuse to explode.
What might we be missing in our theories of massive stellar death? It’s a question we’ve been asking for decades. Instead of focusing on the properties of the stellar cores that collapse and usher in their deaths, the authors of today’s paper turned to consider the life of the star preceding collapse. They were motivated particularly by hints that the silicon-burning shell surrounding the pre-collapse core could be violently turbulent, stirred by convective motions in the shell. The authors thus concocted a new star, one 15 times the mass of our Sun. They harnessed the power of MESA, a special-purpose code built specifically for modeling the life of stars in 1D, from their early lives burning hydrogen, then helium, carbon, all the way to silicon and the formation of an iron core, about three minutes shy of core collapse. In order to focus on the effects of convection in the silicon-burning shell, they stripped their stars of any complexifying qualities: no rotation, no magnetic fields.
At this point, however, their star possessed no convection, which cannot develop in 1D. Thus the authors turned to FLASH, a powerful hydrodynamics code that can follow the evolution of the complex gas motions that give rise to convection in stars. And this time, they let the star evolve in 3D. At the end of this, they had a star with a fully convective silicon-burning shell (see Figure 1), replete with characteristic convective plumes—spectacular ones that spanned the entire width of the silicon-burning shell and churned at velocities of several hundreds of kilometers a second, whirling around a 1.3 solar mass iron core on the verge of collapse.
And then, of course, came the collapse. The authors exploded two stars, twin stars, identical in every way except that one lived in one dimension and was thus spherically symmetric, while the other lived in three (though because of the computational complexity they modeled only an octant of the star) and thus retained its convectively-stirred, complex 3D structures. To help the stars explode, they were given identical shots of extra energy in the form of neutrino heating, then let go. And go they did—and differently in some key ways. Their cores initially evolved in much the same way: they collapsed and rebounded, each giving birth to a shock, and both shocks successfully continued to grow. When the shock reached the silicon-burning shell, substantial differences began to show: the 3D convective star’s shock grew more rapidly than its 1D twin’s, and had a larger explosion energy. Though the authors did not evolve the collapse long enough to determine whether or not the star eventually exploded, these were promising signs that an explosion could be achieved more readily.
So does this mean that we now have it—the secret to the deaths of massive stars? Not quite. Many assumptions and simplifications—the initial 1D models, the 3D octant of the star, to name a few—were made. But while these new models were necessarily contrived, given the limits of today’s computational brawn, they are still an instructive demonstration that the turbulent environments in which the cores of massive stars breathe their last can affect how the rest of the star’s death plays out.
An internal ocean on Ganymede: Hooray for consistency with previous results! by The Planetary Society
A newly published paper confirms a subsurface ocean at Ganymede. An ocean there was already suspected from its magnetic field and predicted by geophysics; new Hubble data confirms it, and even says it is in the same place we thought it was before. Such consistency is rare enough in planetary science to be worth celebration.
Three International Space Station crew members are back on Earth today following a morning Soyuz landing on the snowy steppes of Kazakhstan.
March 12, 2015
Ok, for those who want to learn how to map, I've provided a few links to what I consider useful posts.
A quick video (speaking is my preferred medium)
A longer and more detailed version.
NB Some use the older terms Chaotic & Linear which I changed to Uncharted & Industrialised (more apt). I've put this list here (there's about 950+ posts on this blog). I'll try and add more, tidy things up and you never know ... persuade someone else to turn it into something readable.
An introduction to Wardley Maps
A step guide to mapping
On creating a value chain
Getting stuff done.
A guide to mapping.
From strategy to mapping to pioneers (slides)
The good bits about mapping
The amazing bits of mapping
The wow of mapping
Properties of evolution
There are many chasms to cross
The Two Extremes
Oh, no Six Sigma vs Agile
When to use a curve
Of Perils and Alignment.
On Ecosystems and Porter.
What's right and wrong with Christensen.
On evolution, disruption and the pace of change.
Does maturity matter
Ten graphs on organisational warfare.
Basics of operation
Let the Towers Burn
How we used to organise ourselves
Pioneers, Settlers and Town Planners
(another post on PST)
Two speed IT / Bimodal.
Other Tools I use with Mapping
Business Model Canvas ... the end of a long journey
Maps and the Target Operating Model.
Maps are imperfect but that's ok.
Rough guide - use cloud, build cloud and microservices.
On maps, components and markets
This is not the data you are looking for.
Why no consultants.
Open source, gameplay and cloud
Strategy vs Action
On disruption and executive failure.
Epic Fails of Sensible CEOs
Tower and Moat.
On Chess and Business
Preparing for War
Dungeons and Dragons vs The art of Business.
On D&D and Ant Battles.
Quick route to building a strategy.
Four basic smackdowns on competition
Does commoditisation lead to centralisation.
Fast Follower Conundrum.
Attack, defend and the Dark Arts.
Self disruption and super linear.
On the death of great companies.
The interesting thing about cutting costs.
Jevons in a nutshell
A useful summary post
Project, Products, Open source and Proprietary.
Context, situation and components.
Is war the mother of invention?
The abuse of innovation
Half completed book
Final Last Few
Continuous and Sustainable Competitive Advantage comes from Managing the Middle not the Ends
Two Speed Business? Feels more like inertia.
On Pioneers, Settlers, Town Planners and Theft.
There's also the team at WardleyMaps (with whom I'm not affiliated) who are trying to turn my long and sometimes rambling concepts into something readable.
For reference, all my writings are creative commons 3.0 share alike licensed, as is the entire mapping concept, maps and evolution diagrams.
I have a new book coming out in the first week of July: it's The Annihilation Score (UK ebook link), and here's the cover Orbit have done for the British edition!
And in case that's not enough, because it's published on both sides of the pond, here's the US ebook edition, and the American cover art:
If you detect a certain violin-theme running through both covers, you'd be perfectly right. Because this may be the sixth Laundry Files novel, but there's a new twist: this one isn't about Bob, it's about Mo. And superheroes. And a certain bone-white instrument ...
Liverpool Life Lately, for various reasons, this past couple of years I've been getting the bus home from the Liverpool One bus station: the 80, the 80s, the 75, the 75a or the 76. Despite my only living at Sefton Park, during rush hour the process of getting through the traffic in the city and collecting passengers at the various stops means the journey can take up to fifty minutes if I get on at 4pm.
Except this month and next, from the beginning of March to the end of April, due to gas maintenance work on Hanover Street, these buses are not at Liverpool One. They're picking up and dropping off at Great Charlotte Street. Coming into town this means I've been getting off at the old Lewis's building and walking down Ranelagh Street into Hanover Street and onwards to my destination, which I've quite enjoyed, especially being able to pass through WH Smiths at Central (underground) Station on the way.
Coming home, being an entirely lazy human and wanting to avoid the crush of Great Charlotte Street in rush hour and all the "hey, there's a queue here" moans which come with that bus stop, I decided to throw out convention and catch a 27 bus, the Sheil Road circular to close to home and walk from there instead. I assumed this would cock up the whole routine, making the process of coming home even more taxing.
Um, no. It's not worked out that way. In fact, I can't imagine why I didn't think of this before.
Catching the 27 cuts out the whole of the city centre. Travelling towards Parliament Street thence to Park Road to Princes Avenue takes about ten minutes, twenty minutes shorter than it takes to get through the city streets on the edge of being engulfed by rush hour traffic. Then, being as I said inherently lazy and with a Day Rider or Saveway-type ticket, I've jumped off the 27 and onto a 75 which has taken me to my usual bus stop near home.
A home I'm now getting to a full half hour sooner than I have for the past two years on those days. Like I said, I can't imagine getting a different bus from there now.
-- This can only work with these buses. Even when they return, this will not save you from the murder of the 86 or 86a. Unless you get the 27 to Princes Avenue, swap to an 80 or 75, then swap again on Smithdown Road at the stop near the post office. This seems like it could be unnecessarily complicated, but I guess there are probably enough 86s on the roads that the wait times will still be shorter than the mess of getting out of the city centre on a busy day.
-- This happened on Monday:
I wouldn't have any clue who amongst the multitude I was supposed to stand behind and who'd got there and when. Just him presumably. (2)
— Stuart Ian Burns (@feelinglistless) March 9, 2015
On Tuesday I missed the 27 (though they run about every five minutes) and this queue, which looks like a daily occurrence, seems actually to have been for the X1, the Runcorn express bus which goes up the Dock Road to Aigburth Road and seems to be the way that people who'd usually get the 82 skip the city centre themselves. Quite how the gentleman expected me to know this, I'm not sure.
In spite of being up to lots of things, I’ve not been very interested in blogging of late.
I got my first flight of the year — a 3 minute top-to-bottom that began with a nil-wind terror swoop on take-off, followed by my almost forgetting to unzip the harness on landing due to being distracted by the sight of ducks paddling around in one corner of the water-logged field.
Here’s the data stream from the landing.
Vertical lines at 5 second intervals. Yellow for barometer (air pressure rises as I descend), red for airspeed, cyan for GPS ground speed (seeming to correspond), white for the accelerometer pitch measurement, showing the pathetic flare coming into landing when all the speed drops off. The previous hump may correspond to the final approach turn (you have to push out to tighten the turn to a turning circle of about 35 metres).
Here’s the take-off sequence, with a slight push-out which was not held long enough, so I dropped very fast. The yellow for the barometer briefly goes below the starting value showing that at one time I got a bit of lift and could have been almost half a metre above take-off.
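Incidentally, the way a barometer trace maps to height can be sketched with the standard-atmosphere formula. This is a hypothetical helper, not the actual logger code: the record format (time in seconds, pressure in pascals) and the sea-level reference are my assumptions.

```python
def pressure_to_altitude(p_pa, p0_pa=101325.0):
    # International Standard Atmosphere barometric formula (troposphere):
    # altitude in metres above the p0 reference level
    return 44330.0 * (1.0 - (p_pa / p0_pa) ** (1.0 / 5.255))

def mean_vertical_speed(samples):
    # samples: list of (time_s, pressure_pa) tuples from the logger;
    # returns average vertical speed in m/s (negative = descending)
    t0, p0 = samples[0]
    t1, p1 = samples[-1]
    return (pressure_to_altitude(p1) - pressure_to_altitude(p0)) / (t1 - t0)

# A descending minute of flight: pressure rises as height is lost
log = [(0.0, 100000.0), (60.0, 100500.0)]
```

Rising pressure in the log comes out as a negative vertical speed, which matches the observation that the yellow trace climbs as the glider descends.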
All in all, quite disappointing, but I’m glad to have some data to work with from my electronics device. I’m going to really appreciate the next flight when I stay up for a bit.
Oh yeah, here’s a close-up of the one corner of my dog’s breakfast electronics project.
Luckily, 3D printers can print anything — including the abominable box I’ve “designed” in OpenSCAD to cram that electronics stuff into.
Meanwhile, all this will probably be shelved due to this widget showing up in the hack-space this lunchtime. More later.
- Title: 3D printing meets computational astrophysics: deciphering the structure of η Carinae’s inner colliding winds
- Authors: T. I. Madura, N. Clementel, T. R. Gull, C. J. H. Kruip, J.-P. Paardekooper
- First Author’s Institution: Astrophysics Science Division, Code 667, NASA Goddard Space Flight Center, Greenbelt, MD 20771, USA
- Paper Status: Accepted for publication in MNRAS
A new dimension
In a sense, our view of the Universe is two dimensional. We are relatively confined to our cosmic location, and can only observe the passage of minuscule spans of time compared to the timescales of most cosmic processes. As we look out into space, we see light from stars and galaxies as a 2D sheet stretched over the inside of a giant spherical shell, and are unable to spin around a supernova remnant to view it at different angles or click fast forward to view the entirety of a galactic merger event. Modern technology has helped to alleviate this constrained view, as computer simulations allow astronomers to view these events as more than a static picture on the sky and gain knowledge of how they evolve in time. Today’s paper utilizes computational astrophysics to examine a time-dependent event, but also incorporates another technological advance into their research arsenal. The authors brought a new dimension to their results by including 3D interactive features into their journal publication and creating the first 3D prints of an astrophysical supercomputer simulation’s output. From rocket engines to replacement human organs, applications of 3D printing have offered enormous potential for technological advance, and as this paper shows, can also lead to new discoveries in astrophysics.
Eta Carinae – A big fan of stellar winds
Eta Carinae is a poster-child of stellar explosions. In the 1840s, this massive binary star system produced the biggest non-terminal stellar explosion ever recorded, which made it the second brightest object in the non-solar-system sky and later produced the famous Homunculus nebula. Though its light has dimmed since then, the attention it receives in the astrophysical community certainly has not. Massive stellar binaries such as Eta Carinae are relatively rare in our galaxy, and the interaction of these behemoths is a test bed for interesting physics.
Because they are so luminous, both stars have powerful radiation-driven stellar winds. In fact, the more luminous primary component of the system (ηA) has one of the densest known stellar winds, whereas the less luminous member (ηB) has extremely fast winds that are believed to fly at a swift 3000 km/s. The authors explored the region where the stellar winds of the two stars collided and interacted, known as the wind-wind interface region (WWIR). This violent collisional interface results in strong X-ray emission but is tough to model; with a short period of ~5.5 years and high eccentricity of ~0.9, the orbital motion of the system greatly affects the geometry and dynamics of this region. Smoothed particle hydrodynamics (SPH) simulations, which model the fluid as many smeared-out interacting particles, were used to model the colliding winds.
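The core idea of SPH—estimating the fluid density at each particle by summing smoothing kernels over its neighbours—can be illustrated in a few lines. This is a toy 1D sketch with a Gaussian kernel, not the kernel or the code the authors actually used:

```python
import math

def gaussian_kernel_1d(r, h):
    # Gaussian smoothing kernel, normalised to integrate to 1 in 1D
    return math.exp(-(r / h) ** 2) / (h * math.sqrt(math.pi))

def sph_density(positions, masses, h):
    # rho_i = sum_j m_j * W(|x_i - x_j|, h): each particle's mass is
    # "smeared out" over a smoothing length h and summed at x_i
    return [sum(m_j * gaussian_kernel_1d(abs(x_i - x_j), h)
                for x_j, m_j in zip(positions, masses))
            for x_i in positions]

# Uniformly spaced unit-mass particles along a line
dx = 0.1
xs = [i * dx for i in range(200)]
rho = sph_density(xs, [1.0] * len(xs), h=2 * dx)
```

For interior particles this recovers the expected uniform density of about 1/dx; production SPH codes use compact-support kernels and neighbour lists so that each sum runs over only nearby particles rather than every particle in the simulation.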
3D visualizations of the simulation were created for 3 specific phases of the orbit: apastron (furthest orbital separation), periastron (closest orbital separation), and 3 months after periastron. Figure 1 shows 2D slices of the wind-wind interface at these three orbital positions, but if you have Adobe Acrobat Reader, you can see the real deal 3D representation. Though the interface region at apastron (figure 1, left column) is by far the “cleanest”, it is still riddled with protrusions due to Kelvin-Helmholtz and non-linear thin shell instabilities (instabilities that arise from a thin wind being bound by shocks on both sides). The visualizations at periastron (figure 1, middle column) and 3 months post-periastron (figure 1, right column) are far more complicated, but also far more intriguing, and uncovered features that went unnoticed by previous studies using 2D visualizations. One feature is a clear “hole” in the trailing arm of the WWIR. The authors believe that this is due to the high eccentricity of the orbit; when ηB’s orbital dance brings it closest to the massive ηA, it becomes embedded in ηA’s dense wind and its own wind is forced out in a confined direction. Figure 2 shows the cavity being blown out in the WWIR by mapping the wind speed and density for a small space around the stars in the simulation (this can be seen much more clearly in the interactive video, which can be found in the paper). The complex post-periastron geometry contained another surprising new feature uncovered by 3D analysis – protruding tubes in the WWIR of ηB’s hot wind encased in thin shells of ηA’s dense wind (see figure 1, right column). About 2 dozen of these “fingers” were produced in the WWIR, and projected out several AU. The study suggested that the fingers are the result of instabilities in the colliding winds such as thin-shell or Rayleigh-Taylor instabilities, though if the fingers are in fact real, not even Hubble’s sharp eyes would be able to spatially resolve them.
Holding a star in your hands
3D prints have been made of astrophysical objects before (even of Eta Carinae), but all have been rendered based on observations and not simulations. This study created the first 3D models from the output of simulations, which allowed even better examination of the system’s detailed geometry and its variations over time. Though the intricacies of the models led to a broken finger (not a human finger, thankfully), it was nothing that a little glue couldn’t remedy. Figure 3 shows one of the 3D prints of Eta Carinae’s WWIR.
This paper stresses the importance of incorporating 3D interactive figures into PDF journal publications, and the benefits of using 3D visualizations and models to best understand complex, time-dependent astrophysical phenomena. 2D figures are dominant in the literature because of the need to display 3D data in a classic paper-journal format, but now that most astronomical journals are available online this need is becoming obsolete. Though still in its nascent stages, 3D printing in astronomy offers a new and interactive way of communicating complex cosmic systems to the public, and even offers a medium to better describe the wonders of astronomy to the blind and visually impaired. Maybe one day soon a subscription to the Astrophysical Journal will come equipped with a pair of 3D glasses.
A blast of fire and smoke lit up the hills of Promontory, Utah this morning as NASA and Orbital ATK completed a test firing of a Space Launch System solid rocket booster.
March 11, 2015
Film The other key film during my film studies days was Don Roos's Happy Endings which was the reason I ended up writing a dissertation about hyperlink cinema. The original subject was metafiction in Woody Allen's films, notably Annie Hall, Deconstructing Harry and one other which I could never quite decide on, but after reading around on the topic I realised that very little had been written about it in film terms at that time, which meant I'd spend most of my time reading literary criticism and I didn't want to do that. Luckily I happened to be reading Jason Kottke's blog one afternoon and noticed him writing about hyperlink cinema and the articles he linked to became the backbone of what I'd spend the rest of that summer writing and led to the ability to say that I actually wrote about Richard Curtis's Love Actually for my dissertation. Actually. The guts of what I wrote about that film is here and I had planned to write something just as long about why Happy Endings isn't rubbish, but having reviewed the chapter it turns out I wrote a lot more about what Curtis did than either Altman (in Short Cuts) or Roos, presumably because there was a lot more to say about it but also because I sensed, I think, that Happy Endings isn't really a hyperlink film, but an ensemble piece more akin to Hannah and her Sisters or Parenthood with its familial connections and the like. Expect spoilers.
The narrative in Happy Endings is closer to the more straightforward structure presented in most ensemble films with just three stories running in parallel. Don Roos identifies his work as a comedy ‘but obviously gets deeper’ (Johnson, 2005) and is not tied to a familiar generic story pattern. Quart confers hyperlink cinema status on the film because of the complexity with which those stories are told: ‘Roos takes the baggy plotting of the Altman picaresque into web territory: in Happy Endings playing games with time and personal history are a given’ (Quart, 2005: 48). Roos’s motivation for using the form was similar to both Altman and Curtis: ‘There’s not one story line that has to deliver everything […] because you have several stories, the audience can be freshened up. They can feel different things as they go from story to story’ (Johnson, 2005). The film opens with Mamie running into the path of a car, flashing back to the moment when she and Charley conceived their son and the aftermath, then forwards again to the scene when Mamie meets Javier for what appears to be a regular meeting. From here the film unfolds fairly conventionally, with three forms of disruption occurring in the set-up – Nicky blackmails Mamie into helping him make a film so that he can get the information about her son, Charley begins to suspect that Max might be his partner Gil’s son and Otis introduces Jude to his life, his house and his father. The connections between the characters are obvious from then on because Roos was wary of trying to force the connections: ‘I don’t like it when they all kind of connect co-incidentally at the end. Like Crash (2005) they all connect to something and I prefer it when it’s casual’ (Roos et al., 2005).
Narrative density is increased however because of the employment of non-diegetic captions that interrupt the mise-en-scène, presenting information regarding the characters and story outside of exposition within dialogue. The first instance is after Mamie’s shocking motor accident to explain to the audience that ‘She’s not dead. No one dies in this movie, not on-screen. It’s a comedy, sort of.’ These effectively introduce an extra level of subtext into each scene, with details that the spectator would not otherwise have been aware of, impacting upon their relationship to the action, potentially increasing their level of suture because the concentration of information being presented is greater than the standard shot/reverse shot. In his opening scene Nicky is introduced to the audience before Mamie enters the café, and the caption explains that ‘Nicky is 25, oldest of three kids. He has a gun which he is realizing he left in the car. He has to pee’, de-threatening the character and changing the tone of the ensuing scene outside of the diegetic space (one wonders, for example, if some of Nicky’s desperation is a result of bodily needs). The implications and meaning of these captions change on subsequent viewings – it is later revealed that Nicky is the adopted older brother of the boy that Mamie could not abort. Note that these captions only ever complement the action and never intrude on scenes presenting important verbal or visual narration – usually action will pause (as occurs with Nicky’s introduction) or be of a humdrum nature (Mamie’s arrival at the salon) – so that the attention of the audience is still directed in a linear fashion.
The captions eventually restructure the climax, because once the narrative reaches its apparent conclusion, Roos explains the fates of the characters, sometimes years or decades after the timeframe of the film. Unlike the ‘where are they now’ section of National Lampoon’s Animal House (1978) or Friday Night Lights (2004), as Victor Morton identifies, there are ‘enough changes in fortune (i.e. drama) to make a whole new movie. Compressed into three minutes. And then with a coda of its own’ (Morton, 2005). It could be inferred that this is partially the result of the editing process, since as with Love Actually the first assembly was three hours long and so the director needed to ‘cut a lot of scenes when it was done’ (Lee, 2005). Like Curtis but unlike Altman, Roos appears locked into a need to complete the narrative structure identified by Todorov, even if it means increasing the plot duration exponentially. The closing montage sequence includes a flash forwards ten years to show Mamie and Charley meeting their son, possibly completing both of their story arcs, but in other cases the captions present exposition that reaches even further than that - it is explained that Otis ‘watches Ted and Charley’s dogs sometimes and never plays the drums again. But in 20 years he’s happier than anyone else here. But that’s another story.’ Indeed, this whole section also allows new key character relationships to be created and their ‘happy endings’ sometimes occur because of these chance or synchronous meetings, running counter to the normal expectations of hyperlink cinema that such incidents will motivate the central action.
By producing Happy Endings as an ‘independent’ film, Roos is able to make two of these characters gay without their stories being about their sexuality: ‘It’s a rare studio movie that you can talk about the things I want to talk about. You can have gay characters in a studio movie, but it has to be about them being gay, or else they’re the sidekick. […] You can’t really talk about the love life of a gay man, like I did in The Opposite of Sex (1998)’ (Cavagna, 2005). Despite Alyssa Quart’s insistence that the characters exist without hierarchy (Quart, 2005: 51), each of the three stories has a main protagonist. Two are clearly Mamie and Charley; and although Roos thinks of the third as Jude’s tale – ‘The girl meets the boy, she changes her mind, she attaches herself to the father, she blackmails the boy to keep silent, she finds out in the meantime she’s fallen in love with the father, she’s exposed, she has to give him up’ (Cavagna, 2005) – the narrative and mise-en-scène suggest it is Otis’s. Jude’s appearance creates the disruption in his life and it is only when she has left that the equilibrium returns. In the scene after the band rehearsal, Otis’s nervy reactions appear in relative close-up as he leans against the counter; the reverse shot is over his shoulder, giving the spectator his long-shot point of view as Jude rifles through the kitchen looking for food, and in a later seduction scene the camera is angled so that Otis’s reactions are prioritised over hers. In the main, Jude appears as antagonist to Otis’s protagonist.
See what I mean? Apart from odd lines noting how the architecture of the characters’ houses reflects their class, and how Roos has a better idea of representing diversity in his film (although it’s still through a secondary character), there isn’t much more to say. The key element which gave Quart and others the impression that they were watching something akin to Short Cuts must be the on-screen captions, but the rest of it simply isn’t akin to Crash or indeed Love Actually in how the stories are told. It is still a remarkable film, though, because of those captions and because of the performances, notably from Maggie Gyllenhaal, whose character, a singer, contributes to one of my favourite film soundtracks. It’s on this blog’s old Forgotten Films list and although I haven’t seen it recently, images are stuck in my head. Images like:
Cavagna, Carlo. 2005. Interview: Don Roos. AboutFilm. Available at: http://www.aboutfilm.com/features/happyendings/roos.htm. Accessed: 17th July 2006.
Johnson, Tonisha. 2005. Happy Endings: An Interview with Director Don Roos, Jesse Bradford and Jason Ritter. Black Film. Available at: http://www.blackfilm.com/20050715/features/happyendint2.shtml. Accessed: 17th July 2006.
Lee, Michael J. 2005. Don Roos. Radio Free Entertainment. Available at: http://movies.radiofree.com/interviews/happyend_don_roos.shtml. Accessed: 17th July 2006.
Morton, Victor. 2005. Tone Deaf: 'Happy Endings' Can't Get It Right. TheFactIs.org. Available at: http://www.thefactis.org/default.aspx?control=ArticleMaster&aid=1048. Accessed: 17th July 2006.
Quart, Alyssa. 2005. Networked. Film Comment 41:4.
- Title: James Webb Space Telescope can Detect Kilonovae in Gravitational Wave Follow-up
- Authors: Imre Bartos, Tracy L. Huard, Szabolcs Marka
- First Author’s Institution: Columbia University
- Paper Status: Submitted to The Astrophysical Journal
Hopefully by now you have heard about just how awesome the James Webb Space Telescope will be (cue the George Lucas-style teaser…). In short, the Webb telescope is the successor to the famous Hubble Space Telescope — but the Webb telescope is much more than that. Using a mirror with roughly seven times the light-collecting area of Hubble’s and a multi-object spectrograph, Webb will be able to study hot topics like the atmospheres of distant planets or the hectic lives of the earliest galaxies. Its capabilities seem limited only by our collective imagination — which is why scientists share their ideas for Webb as white papers on the arXiv. White papers are not intended for publication in the typical scientific journals; they are a means for scientists to informally share ideas with a wide audience. Webb has no shortage of such papers. Today’s paper is actually a submitted article in which the authors propose marrying Webb with another major instrument: LIGO.
LIGO and Kilonovae
The Laser Interferometer Gravitational-Wave Observatory (LIGO) is made of two massive Michelson interferometers which can detect extremely small spatial fluctuations using a high-powered laser. These tiny fluctuations aren’t caused by seismic tremors or LSU football games; they are ripples of spacetime called gravitational waves. These waves originate from a number of astrophysical events, but today we’ll focus on just one source: the kind of merger that produces a kilonova.
Kilonovae result from the collision of two compact objects, such as two neutron stars. Before the neutron stars collide, they spiral inwards, emitting strong gravitational waves. LIGO will hear this inspiral as a “chirp,” while telescopes will see the collision as a bright explosion about 1000 times as bright as a classical nova (hence the name “kilonova”). Due to the complex nucleosynthesis in kilonovae, these objects are brightest at longer (infrared) wavelengths of light. (You can read more about that process in this astrobite.)
Unfortunately, LIGO hasn’t detected any gravitational waves yet. But have no fear! Scientists are working on a series of upgrades to LIGO to create Advanced LIGO, which should be 10 times more sensitive to gravitational waves. Advanced LIGO will be able to detect kilonovae as far as 1 billion light years away. There is one problem: Advanced LIGO will not be able to constrain the location of the kilonova very well — only to about 10 square degrees (roughly 50 times the area of the full Moon). If we want to follow up LIGO detections with telescope observations, we need to search this area of the sky efficiently, because kilonovae dim dramatically a week or so after the explosion. If kilonovae were as bright as supernovae, this would be an easy task: just look for the brightest thing in that patch of the sky. Unfortunately, kilonovae are fairly dim; they peak at only around 24th magnitude in the optical (although they are 1000 times brighter in the infrared).
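As a quick sanity check on those brightness figures (the 24th-magnitude and 1000× values come from the text above; the conversion is just the standard astronomical magnitude formula), a factor of 1000 in flux corresponds to a fixed step on the logarithmic magnitude scale:

```python
import math

def delta_mag(flux_ratio):
    """Magnitude difference corresponding to a brightness (flux) ratio."""
    return 2.5 * math.log10(flux_ratio)

dm = delta_mag(1000)               # kilonovae are ~1000x brighter in the IR
optical_peak = 24.0                # peak optical magnitude quoted above
infrared_peak = optical_peak - dm  # smaller magnitude means brighter
print(dm, infrared_peak)
```

So a kilonova peaking at 24th magnitude in the optical would sit near 16.5 in the infrared, which is why an infrared telescope is such an attractive follow-up instrument.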
Webb to the Rescue
The authors of today’s paper argue that the Webb telescope is a great tool for finding the source of LIGO’s detections, as shown schematically in Figure 1. They offer a few interconnected reasons why we should use Webb. First, Webb will have a near-infrared camera; as previously mentioned, the infrared is approximately where kilonovae should be brightest. Additionally, because Webb hosts a huge primary mirror, it is capable of detecting a kilonova within a two-second exposure. Figure 2 shows this exposure time as a function of filter and days since the kilonova explosion. Finally, Webb is equipped with an array of microshutters which can be operated independently, drastically cutting the time it takes Webb to read out images from its camera.
This all sounds great, but there is one major setback: Webb’s slew time (the time it takes the telescope to change position between observations) is very slow. In fact, if you naïvely used the telescope to tile the entire 10 square degrees in question, it could take over 50 hours to find the explosion!
Bartos and collaborators suggest taking a systematic approach to searching for the kilonovae. Previous studies have shown that kilonovae often occur in places with lots of star formation (galaxies with a high “star formation rate”). These star-forming galaxies often show strong H-alpha emission from hydrogen ionized by their hot young stars. Simply put, this means that galaxies with strong H-alpha emission are more likely to host kilonovae. Therefore, if we limit our Webb search to known galaxies with substantial H-alpha emission within 1 billion light years of Earth, we can reduce the time to survey the 10 square degree region to just 2-3 hours, depending on the number of galaxies.
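A back-of-the-envelope sketch of why targeting helps (the 2-second exposure is from the text; the per-pointing overhead and the pointing counts are made-up illustrative numbers, not values from Bartos et al.): when slew overhead dominates, total survey time simply scales with the number of pointings.

```python
def survey_hours(n_pointings, exposure_s=2.0, overhead_s=60.0):
    """Total time to visit n_pointings, each costing one short exposure
    plus slew/settle/readout overhead (the overhead dominates)."""
    return n_pointings * (exposure_s + overhead_s) / 3600.0

blind_tiling = survey_hours(3000)  # naively tiling the full ~10 deg^2
targeted = survey_hours(150)       # only H-alpha-bright galaxies
print(f"blind: {blind_tiling:.0f} h, targeted: {targeted:.1f} h")
```

With these illustrative numbers the blind search costs about 52 hours while the targeted one takes under 3, the same order-of-magnitude contrast the authors report.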
With such a short time commitment, the Webb telescope will be able to follow up many LIGO targets over multiple nights, giving us unprecedented access to kilonova light curves and statistics. Hopefully, these observations can help us better understand the production of neutron-rich (r-process) elements and the equation of state of neutron stars and other compact objects.
What to expect when you're expecting a flyby: Planning your July around New Horizons' Pluto pictures by The Planetary Society
As New Horizons approaches Pluto, when will the images get good? In this explainer, I tell you what images will be coming down from Pluto, when. Mark your calendars!
Interview with an SLS Engineer: How Booster Test May Help Drive Golden Spike for Interplanetary Railroad by The Planetary Society
With NASA and Orbital ATK preparing for an important test tomorrow in Utah, an SLS engineer describes the inner workings of the vehicle's new solid rocket boosters.
March 10, 2015
Music On the occasion of the loss of filmmaker Albert Maysles, Diffuser looks at his work on concert films and more widely how there's a huge difference between a standard recording of a set and what's achieved when someone with a sense of narrative and wider context is involved. There are some interesting nuggets throughout, like this on Woodstock:
"Michael Wadleigh’s Woodstock documentary makes similar historical omissions for various reasons. Turning three days of peace, love and music into three hours of cinema mandates that some artists simply aren’t going to make the final cut. Additionally, technical issues seriously affected footage of some bands, and in a couple of cases artists didn’t want anything to do with the film."

The BBC's Glasto coverage ever expands but even then some bands find themselves omitted because the BBC doesn't happen to be covering a given stage. Or, as with Young, the band decides they don't want to be available on iPlayer for the following month.
"Neil Young stands as the most famous of the latter group. Crosby, Stills and Nash’s acoustic set remains a highlight of the movie, but their electric set with their fourth member is lost to the ages because Young refused to play with Wadleigh’s camera crew on stage, feeling that they were too invasive. As a result, the Woodstock mythology remains cemented as a wonderful night for CSN without the Y."
It is now six years since we at USV held our one-day mini conference on “Hacking Education.” A lot has happened since then. We have made investments in Codecademy, Duolingo, Edmodo and Skillshare. MOOCs really took off, with classes that have been taken by tens of thousands of students. Susan and I started homeschooling our children. And yet when I think about what is possible in theory, it still feels like we are early on.
One of the things that I envision is an interactive learning environment that seamlessly combines math, physics and programming (and possibly a lot more). Imagine that when you learn about coordinates, you have a coordinate system where you can either drag a point around and see its coordinates change or change the coordinates and watch the point move. And right there and then you can add some code (e.g. a loop) to animate the point by programmatically changing its coordinates. You can then see how that could quickly be used to show the trajectory of a ball that is thrown in the air.
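A minimal sketch of the kind of loop a learner might write in such an environment (the interactive coordinate widget is imagined, so plain Python simply records the point's coordinates at each step of a thrown ball's flight):

```python
# Animate a point along the trajectory of a thrown ball by
# programmatically updating its (x, y) coordinates each time step.
g = 9.81             # gravity, m/s^2
vx, vy = 4.0, 12.0   # initial velocity components, m/s
x, y = 0.0, 0.0
dt = 0.25            # time step, s

points = []
while y >= 0.0:
    points.append((round(x, 2), round(y, 2)))
    x += vx * dt
    vy -= g * dt
    y += vy * dt

print(points)
```

In the environment described above, each iteration would move the visible point on the coordinate system; here the loop just collects the rising-then-falling arc of positions.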
Ideally, these learning objects would be fully embeddable inside other content (e.g., inside this blog post) and would also allow links to be attached inside of them, for instance a link out to the wikipedia entry on Cartesian coordinates. If these components themselves were open source with a permissive license then people could not only use them any which way they wanted but also contribute their own components and help extend and further develop existing ones.
The learning system itself would then take care of presenting various paths through knowledge and providing continuous testing of knowledge to wind up with a completely individualized learning experience (much like Duolingo provides for language learning). The system would know what you have mastered, where you need reinforcement, and offer you potentially new things to learn. It would also begin to understand whether you learn better by reading a text or watching a brief video. And if you don’t grok a concept in one presentation it will surface other ones for you automatically.
One of the closest things to the level of interactivity I envision is the incredibly well done Desmos graphing calculator that’s available on the web and as apps. At present its components seem proprietary and it is not a full-on learning system yet. But if you graph a line, for instance, all you have to do is enter y = mx + b and it automatically provides you with sliders to adjust the value of the slope m and intercept b. This shows nicely what is possible. Nothing I have described above is beyond what can be built today. If this is something you are working on or interested in working on I would love to hear from you (including connections to the Desmos team).
These two imaginary guys influenced me heavily as a programmer.
Instead of guaranteeing fancy features or compatibility or error free operation, Beagle Bros software promised something else altogether: fun.
Playing with the Beagle Bros' quirky Apple II floppies in middle school and high school, and the smorgasbord of oddball hobbyist ephemera collected on them, was a rite of passage for me.
Here were a bunch of goofballs writing terrible AppleSoft BASIC code like me, but doing it for a living – and clearly having fun in the process. Apparently, the best way to create fun programs for users is to make sure you had fun writing them in the first place.
But more than that, they taught me how much more fun it was to learn by playing with an interactive, dynamic program instead of passively reading about concepts in a book.
That experience is another reason I've always resisted calls to add "intro videos", external documentation, walkthroughs and so forth.
One of the programs on these Beagle Bros floppies, and I can't for the life of me remember which one, or in what context this happened, printed the following on the screen:
One day, all books will be interactive and animated.
I thought, wow. That's it. That's what these floppies were trying to be! Interactive, animated textbooks that taught you about programming and the Apple II! Incredible.
This idea has been burned into my brain for twenty years, ever since I originally read it on that monochrome Apple //c screen. Imagine a world where textbooks didn't just present a wall of text to you, the learner, but actually engaged you, played with you, and invited experimentation. Right there on the page.
(Also, if you can find and screenshot the specific Beagle Bros program that I'm thinking of here, I'd be very grateful: there's a free CODE Keyboard with your name on it.)
Here are a few great examples I've collected. Screenshots don't tell the full story, so click through and experiment.
Visualizing Algorithms – amazing dynamic visualizations of several interesting and popular algorithms.
Parable of the Polygons – a playable post on the shape of society.
Sight and Light – interactive explanation of 2D visibility calculations.
Rolling Shutters – an animated explanation of the visual glitches introduced in digital cameras by CMOS sensors when taking pictures of fast moving objects.
Sorting.at – a live visualization of common sorting algorithms.
The future of games history is workplace theft – illustrates software history by embedding an emulated, fully playable version of Wolfenstein 3D right in the page.
Feel free to leave links to more examples in the comments, and I'll update this post with the best ones.
In the bad old days, we learned programming by reading books. But instead of reading this dry old text:
Now we can learn the same concepts interactively, by reading a bit, then experimenting with live code on the same page as the book, and watching the results as we type.
C'mon. Type something. See what happens.
I certainly want my three children to learn from other kids and their teachers, as humans have since time began. But I also want them to have access to a better class of books than I did. Books that are effectively programs. Interactive, animated books that let them play and experiment and create, not just passively read.
I want them to learn, as I did, that our programs are fun to use.
Title: The Fastest Unbound Star in our Galaxy Ejected by a Thermonuclear Supernova
Authors: S. Geier et al.
First Author’s Institution: European Southern Observatory
Status: Published in Science
You’ve probably noticed the action movie trope of “cool guys walking away from explosions”. The authors of today’s paper argue that this cinematic cliche is playing out on a scale much grander than Hollywood. The hero of this cosmic blockbuster is a sub-dwarf O-type star 8.5 kpc away in our galaxy’s halo. The explosion was a supernova in the Galactic disk which may have detonated 14 million years ago.
The vast majority of stars in the Milky Way orbit on roughly circular trajectories around the center of the galaxy. Since 2005, astronomers have found a handful of stars that don’t obey this rule. These so-called hypervelocity stars are speeding away from the galaxy at such high velocities that the galactic gravitational potential will not be able to stop them. These stars will leave the Milky Way entirely and travel off into intergalactic space. Many of these stars are on trajectories originating near the center of the galaxy, as predicted by Hills in 1988, and were likely flung outward in an interaction between a binary star system and the central supermassive black hole. The authors of today’s paper focus on the fastest of the hypervelocity stars, called US 708.
To determine the trajectory of US 708 in space, the authors first measured the radial velocity of the star along our line of sight using spectra. Measuring the Doppler shift in absorption lines coming from the star (Figure 1), the authors found that US 708 is moving away from us at 900 km/s. But this is just the motion along the line of sight. To recover the full motion of the star through space, the authors measured its proper motion using images obtained over 50 years from the Digitized Sky Survey, Sloan Digital Sky Survey, and Pan-STARRS. The measured proper motion is shown in Figure 2. Together, these observations indicate that the star is moving at about 1200 km/s away from the disk of the Milky Way. Now we know where the star is going, but where did it come from, and how was it ejected from the galaxy?
Where in the galaxy did this runaway star come from? In Figure 3, the likely region of the galactic disk is shown. Using a model of the galactic gravitational potential, the authors run dynamical simulations of the star’s past trajectory, and determine where the star intersects the disk of the Milky Way. These simulations show that US 708 did not originate near the central supermassive black hole, but was most likely ejected from the disk about 8 kpc from the center. This rules out the traditional ejection mechanism – US 708 was not hurled out into space by the galaxy’s black hole. So how did this star get its hyper velocity?
The authors propose that US 708 was part of a binary system, orbiting closely around a white dwarf. The two stars were close enough that the white dwarf pulled material from US 708 and eventually detonated in a Type Ia supernova. The supernova released US 708 from its tight orbit, and US 708 flew off along a straight line at the same speed at which it had been orbiting. The closer the binary system, the faster the ejected star. The observed velocity of US 708 indicates that this progenitor binary system had an orbital period of only 10 minutes! Another hint that US 708 originated in a close binary is its rapid rotation. Take another look at the spectra in Figure 1. The absorption lines are broadened by a rotation velocity of about 100 km/s, which makes US 708 by far the fastest rotating runaway star known. In a close binary, the rotation of each star is synchronized with its orbit, a phenomenon known as tidal locking. As we have seen, the orbital velocity of this system was very large, so the ejected companion should be rotating quickly.
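Kepler's third law shows why such a tight orbit implies a huge ejection speed. A sketch assuming illustrative component masses of 0.3 solar masses for the sdO star and 1.0 for the white dwarf (the 10-minute period is from the text; the masses are assumptions, not the paper's fit):

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg

def ejection_speed(period_s, m_star, m_companion):
    """Orbital speed of m_star about the binary's center of mass for a
    circular orbit; roughly the speed it keeps when released."""
    m_tot = (m_star + m_companion) * M_SUN
    a = (G * m_tot * period_s**2 / (4 * math.pi**2)) ** (1 / 3)
    v_rel = 2 * math.pi * a / period_s
    return v_rel * m_companion / (m_star + m_companion)

v_kms = ejection_speed(600.0, 0.3, 1.0) / 1000.0
print(round(v_kms))
```

With these assumed masses the orbital speed comes out around 900 km/s, the same order as US 708's observed ~1200 km/s space velocity.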
If a supernova was responsible for ejecting US 708 from the galaxy, this hints at an unusual flavor of Type Ia supernovae. Traditionally, Type Ia supernovae are thought to form when a white dwarf accretes mass from a red giant, exploding when it reaches the Chandrasekhar limit. However, US 708 is a compact sub-dwarf O-type star, similar to the helium core of a red giant that has been stripped of its puffy outer hydrogen layers. Thus, US 708 would have donated helium onto its white dwarf companion, which can detonate the white dwarf well below the Chandrasekhar limit. This type of detonation may be one of the explanations for the recently observed Type Iax supernovae.
If US 708 was ejected from the galaxy by a Type Ia supernova, a closer study of its composition could help to unlock the origins of these cosmic explosions which cosmologists rely on as standard candles. The nearby supernova may have polluted the surface of US 708 with heavier elements, warranting a closer spectroscopic study. Soon, missions like Gaia are expected to greatly expand the number of hypervelocity stars known, perhaps uncovering more like US 708 that have been ejected from the galactic disk. Until then, look for US 708 at a theater near you this summer; this star was walking away from explosions 14 million years before it was cool.
A cadre of CubeSats including The Planetary Society’s LightSail spacecraft completed a cross-country journey to Florida, where they await installation aboard an Atlas V rocket.
In pictures of the planets, the stars aren't usually visible. But when they do appear, they're spectacular.
March 09, 2015
"Between November 1975 and September 1976, a man named Roy Colmer decided to photograph New York City's doors. Not all of New York City's doors. No doors in particular. And in no real particular order. But his aptly named Doors, NYC project amounted to more than 3,000 photos, which now live with the New York Public Library."

My favourite? The one which says boldly, "CLOGS OF COURSE" because of course, clogs.
"If you're like me and want to obsessively look at every single one, the best way to do that is here. But then, I did that so you don't have to. Firstly, note the door on the bottom left. For every dozen-ish nondescript doors, you'll find a little treat — like a poster of a cat ..."
March 08, 2015
Life Why do I think the way I do?
This isn't the first time I've wondered this and since it's International Women's Day, I thought I'd attempt to trace through my memory to try and work out why I think the way I do about, well everything.
Have I always thought this way?
Quite honestly I don't know.
The only reason I'm asking is that there seem to be so many people who, for some reason, don't. I feel sorry for them, and unlike them I don't think they're entitled to that opinion, not in 2015.
I do have some memories.
My first two best friends were girls, I think because their Mums were my Mum's best friends. I have photos of a birthday party at about the age of five, the three of us sitting around the birthday cake.
At primary school, I remember the story books and later on text books featuring the usual gender roles. Men went out to work. Mothers were housewives.
All my real friends were girls. I have vivid memories of sunny break times sitting in the grass making daisy chains while all the rest of the boys played football. I had friends who were boys but it wasn't the same. I didn't like football.
That hasn't really changed. I find women much easier to talk to than men. I still don't like football.
Except when I began secondary school, it was an all boys school and just as puberty hit, I lost the ability to talk to girls. I'd get nervous. Odd. All my friends were male for years.
Then girls arrived in the sixth form and they were utterly brilliant, and I thought so even though I couldn't speak to most of them.
There was also the moment at university at a hall formal, which was at a hotel, stuck on a toilet overhearing two blokes at the urinals outside referring to potential conquests as "the blonde one" and "the ginger" and wincing and wishing to god I'd never be anything like them.
I was also bullied a lot at school which has led to a dim view of any kind of oppression. Gender, race, anything.
Is any of this really relevant? I don't know. Probably not.
But what I'm trying to say is that I can't remember the moment when I became a feminist or at least thought women should have the same rights as men. There's no one thing which made me "get it".
I've just always thought so and can't understand why anyone wouldn't.
Is this unusual? I don't know that either.
People just have the experiences they have, I suppose. I was reading Wonder Woman comics at an early age. Watched a lot of Star Trek: The Next Generation as a teenager, and I expect a lot of my liberalism can be traced back to that. Reading Shakespeare's Measure for Measure and Chaucer and being shocked at the treatment of women by the societies of the past. Listening to a lot of female singer-songwriters dealing with their experiences through lyrics. Tending to identify with female protagonists in films more than male ones. Reading The Guardian's women's pages.
See what I mean? It's all a bit woolly.
If anything it's become even more focused these past few years, thanks to social media, reading feminist writing, watching my way through this and a general sense of injustice, while knowing full well I'm not the right gender to really understand what it's like to live within a patriarchal society, or as I've taken to calling it "the fucking patriarchy".
I have no answer. So I'll just be pleased that I can see it and hope that someday everyone will.
One axis describes what something is (the value chain, from the user needs to the components required to meet it through a chain of needs). The other axis (evolution) describes change.
I've posted before on how the evolution axis was determined but when exploring this subject, I didn't start with the evolution axis as I had to discover it. In the very early days (2000 - 2004), I tried all sorts of different forms of measuring change. All failed.
Figure 1 - Evolution
4) Usefulness. There is a long list of reasons why mapping across the evolution axis is useful.
7) Recursive. Evolution can be equally applied at a high level (e.g. a specific business activity) or at a low level (e.g. a component of an IT system).
The early days
Before discovering the process of evolution, I did try to map with all sorts of things. These were inevitably failures. I thought I'd go through a couple and explain why.
Figure 2 - Map based upon evolution, direction.
Figure 3 - Map based upon diffusion, direction.
Hence, from a mapping perspective, diffusion curves are useless. However, don't confuse that with diffusion curves being useless in general; in other contexts they are extremely useful.
Hype cycles are extremely useful in many contexts. They are not, however, based upon any physical measurement but are instead aggregated opinion. This means they are not testable, so we just have to accept them at face value (even when the axis changes from visibility to expectations). The other axis is time (though it used to be maturity). Time, in this case, is more a vague direction of travel than any measurement of time. It also suffers from a lack of direction in the same way diffusion curves do.
However, we have software as a service / ASP in the slope of enlightenment in 2005 and cloud computing in the peak of inflated expectations in 2010. Now, this all depends upon opinion and whatever definition is chosen. But, as with diffusion, there's no direction, i.e. one thing doesn't evolve into another; rather, one thing in the slope of enlightenment becomes another related thing in the peak of inflated expectations. So when it comes to mapping, rather than compute (as a product) evolving to compute (as a rental service), then to compute (as a commodity) including compute (as a utility), with the hype cycle you have the same back and forth that you have with diffusion curves.
Figure 4 - Four Hype Cycles
As I found out long ago, the hype cycle in the context of mapping is useless (as with diffusion curves). Don't confuse that with the hype cycle being useless itself. In many contexts as a source of aggregated opinion, then it is very useful.
--- 13th March 2015
Based upon Henry's question (below), I've added a graph to show the link between diffusion curves (Adoption vs Time) and Ubiquity.
The first graph shows adoption curves for different instances of an evolving activity A [through various instances A1 to A6], but rather than simply using adoption as the Y axis, I've used applicable market. The reason is that adoption is relative to an applicable market, and the market sizes for each evolving instance are different.
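A minimal numerical sketch of that picture (the logistic shape and the market sizes are my illustrative assumptions, not Wardley's data): each instance A1..A6 follows its own diffusion curve, but into a different, here growing, applicable market.

```python
import math

def adoption(t, t_mid, rate, market_size):
    """Logistic diffusion of one instance within its applicable market."""
    return market_size / (1 + math.exp(-rate * (t - t_mid)))

# Illustrative instances A1..A6: later instances appear later and
# address larger applicable markets (made-up numbers).
instances = [(5 + 4 * i, 1.0, 10 ** i) for i in range(6)]

for t_mid, rate, market in instances:
    saturated = adoption(t_mid + 20, t_mid, rate, market)
    print(f"market {market}: adoption saturates near {saturated:.1f}")
```

The point of plotting applicable market rather than plain adoption is visible here: each curve saturates at a different ceiling, so stacking raw adoption curves on one axis would hide the fact that "fully adopted" means something different for each instance.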
Figure 5 - Diffusion Curves (adoption vs time).
I've then added where each evolved instance of the act A sits on the evolution curve.
Figure 6 - Evolution
[The two graphs above are purely illustrative. I've taken a few liberties to make the transition less complex.]
Now, a couple of things are worth noting.
Ubiquity is a measure of how commonplace something has become. In order to determine it, you need to know the end point, i.e. when the activity has become a well-established commodity. This point of ubiquity can only be accurately estimated once the act has become certain. At that point the past history can be graphed.
I have to emphasise that the accuracy with which you can determine where something is, or was, on the graph increases as the act becomes more certain, i.e. closer to a commodity. If the act is not a commodity, then where it currently sits on the ubiquity axis becomes increasingly uncertain the newer the act is.
For this reason, when something novel appears (e.g. the genesis of electricity with the Parthian Battery in 400 AD), you cannot determine how ubiquitous electricity will become some 1600 years later. No-one can; that requires a crystal ball. However, by the 1960s you have a pretty good idea of how ubiquitous electricity is and can graph the past.
The certainty axis is determined from publication types (see figure 7). In particular, it is a relative measure of type II and type III publication types.
Figure 7 Publication types.
To graph the past, you need to first use the publication types (used to create the certainty axis) to determine if something has become an established commodity. You can then determine the point of ubiquity (what I call the reference point) and use this to determine how ubiquitous something was. You can then plot ubiquity against certainty in the past. An example of this covering a range of entirely different activities is provided in figure 8.
Figure 8 - Ubiquity vs Certainty.
You can then overlay the different areas where different states (custom built, product etc) dominate which gives figure 1 - the evolution curve at the beginning of the post.
To graph the future: you can't. The best you can do is make a best guess as to where something is on the curve, which is why with mapping I use broad categories - genesis, custom built, product etc. I get groups of people (with experience of the field) to estimate how far along it is. The accuracy of this will increase as the act becomes more certain.
To see evolution you need to abolish time from the graph (i.e. there is NO way to measure evolution over time in any form of repeatable pattern). Whilst things will evolve over some (unspecified length of) time, you cannot accurately demonstrate where something is on the evolution curve until it has become a commodity. Then you can accurately demonstrate where it was.
I cannot emphasise this more. The future is an UNCERTAINTY barrier which we cannot peek through. The more UNCERTAIN something is (i.e. genesis, custom built) the less able we are to see the end state. ONLY when something is close to being CERTAIN can we see the point of ubiquity (the reference point).
I do see people draw diffusion curves and start making claims about them - here is a commodity, this end is an innovation. Alas, there is no repeatable graph of evolution over time.
We do NOT have a crystal ball. No-one does.
Diffusion <> Evolution. Evolution consists of many diffusion curves and there are many chasms to cross. The two are not the same, mix this up and you'll get lost.
Despite lacking any crystal ball, we can however say that something will evolve from genesis, custom built to product (+rental services) to commodity (+utility services). It will become more common and more certain. This change is driven by supply and demand competition. This is what figure 1 shows.
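As a rough illustration of "graphing the past" (using hypothetical adoption figures, not real data), the idea can be sketched as: once the publication types tell you the act has become a commodity, that observation becomes the reference point, and the earlier history is rescaled against it.

```python
# A minimal sketch (hypothetical data) of graphing the past:
# once an act has become a commodity, its adoption level at the
# reference point defines 100% ubiquity, and earlier history is
# rescaled against that reference.

def ubiquity_history(adoption, reference_index):
    """Rescale an adoption time series so that the point of ubiquity
    (the reference point, known only in hindsight) equals 1.0."""
    reference = adoption[reference_index]
    return [a / reference for a in adoption]

# Hypothetical adoption figures for some act, oldest first.
adoption = [1, 5, 20, 60, 90, 100]

# Suppose publication types tell us the act became a commodity at the
# last observation - that is our reference point.
past = ubiquity_history(adoption, reference_index=-1)
print(past)  # each value is now a fraction of the point of ubiquity
```

Note the asymmetry this makes concrete: the rescaling is only possible once the reference point exists, which is exactly why the past can be graphed but the future cannot.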
--- Just for Fun (13th March 2015)
Oh, and just because I like a good debate, I'll leave you with this graph of progress. Whilst being efficient with energy is good, anyone who thinks we can somehow permanently reduce energy consumption whilst maintaining progress needs to think again. Whatever we do, if progress continues our energy consumption will eventually exceed today's. If you want to solve energy-related issues then you need to look for less damaging forms of production. Competition (whether in life or business) is a constant exercise in reducing entropy within the local system through the consumption of energy.
Ubiquity vs Adoption (16th March 2015)
I thought this was fairly obvious but it seems there is still some confusion. I need to clarify.
100% Adoption of the iPhone 1 <> 100% Adoption of the iPhone 6. The markets are different.
Q. But can't we look at the smartphone market?
Well, what do you measure against?
Q. When it's ubiquitous?
Q. When everyone has one?
So, gold bars aren't ubiquitous because not everyone has one? Gilts aren't ubiquitous because not everyone has one? CRM systems aren't ubiquitous because not everyone has one? Coffee beans aren't ubiquitous because not everyone has one? Computer servers are not ubiquitous because not everyone has one?
Except ... Gold bars, Gilts, CRM, Coffee beans and computer servers are all ubiquitous in their markets. It's just their markets are very different.
Q. But how do you convert adoption to ubiquity?
You need to find the point of ubiquity in a market. To do this you need to look at publication types and work out when something is a commodity. Then you can find the point of ubiquity and plot back how ubiquitous something was vs how certain (i.e. well defined, well understood) something was.
You cannot simply take adoption of something in its market and call that ubiquity. You cannot jump from an Everett Rogers diffusion curve to an evolution curve. Just because they both happen to have an S-curve shape doesn't make them the same thing.
Q. But that's what everyone does!
Well, they are all wrong.
March 07, 2015
Music Tonight, while I was watching a double bill of Ozon's Jeune & Jolie and the Melanie Lynskey-starring divorcee indie Hello I Must Be Going, both of which were telling similar stories in different ways, the BBC announced this year's UK entry for the Eurovision. Having tried viewers' votes for the actual songs and artists, then just the songs having chosen the artists, and then neither, we're now at the point where they hold some kind of open cattle call and some lyrics are given to some singers at the end. In other words: here you go, cheer this on then. Here it is above. Some notes:
(1) It's really weak. I'm quite the fan of 20s & 30s revivalism, especially when The Puppini Sisters and Christina Aguilera have tried it, but this doesn't take us much further than Doop by Doop, and that was 1994. Plus it's for the Eurovision in an anniversary year and we're witnessing a total loss of confidence. After fielding a half-decent record last year which came 17th, they've clearly decided on the "fuck it, let's just do novelty" strategy on the assumption we'll probably come mid-table again anyway, but at least there'll be a solid reason for it other than Europe hating us.
(2) The bullshit gender politics in the lyrics. Especially from the second verse onwards. See below. It's all about the man telling the woman how to deport herself while he's not in her company and her submissively agreeing. Which might have been acceptable writing a hundred years ago, but not now. Not now at all.
I'm willing to admit the odd word may have been poorly transcribed - I'm not sure in particular about that final line - but on the whole I don't understand any of this anyway. Unless she's simply telling him that and just doing what she wants anyway? I hope so. Not that you can tell from the performance. If I'm off base on this do tell me. It's not Doctor Who's The Caretaker, but it feels pretty close. Minus points too for the official BBC press release not including the lyrics, which are supposed to be half the point.
Him: "Some younger guys, with roving eyes, may tantalise you with their lies, you must be wise and realise, leave well alone 'til you get home, dear."
Her: "Won't see other fellas, won't make you jealous, no need to fear when you're not here, I'm still in love with you."
Him: "Don't walk on the red light, don't stay out at midnight, don't get in a fist fight, that pretty face can't be replaced."
Her: "Won't be out at night hon, it wouldn't be right hon, no need to fear when you're not here, I'm still in love with you."
Him: "Don't make a fuss you have to trust, this is how it always must be, when I stop to think of us, I can assure you, I adore you."
Her: "God you're so gorgeous, no need to be cautious, take good care when I'm not there, I'm still in love with you."
Him: "You have a fun time, soak up that sunshine, but don't drink too much wine, just one or two will have to do."
Her: "I know what you're thinking, so won't be drinking, no need to fear, when I'm not here, I'm still in love with you."
(3) The music is a confused mish-mash. Perhaps having sensed that previous performers from other countries have run aground by keeping strictly within genre, they've decided to lather the thing in ill-fitting electronica and country riffs, which simply confuse the whole business.
(4) Here are the biographies of the writers:
David Mindel has had a successful career in song-writing, working with the likes of Olivia-Newton John, Barry Manilow, The Shadows, John Travolta, Mud and Musical Youth to name but a few.
Following a successful song writing career, David embarked on a new chapter - writing and recording some four thousand TV and radio commercials, including penning the themes for BBC One’s National Lottery and Euromillions TV shows.
Adrian Bax White is a classically trained multi-instrumentalist whose eclectic music career has spanned pop to fusion jazz with everything in between. Adrian has worked with a multitude of singers from multi Grammy award-winners such as John McLaughlin and Narada Michael Walden to underground indies such as Blue Orchids and acid jazz godfather Lonnie Liston Smith.
Mindel's IMDb expands on some of the other tracks he's worked on: Bob's Weekend, Coogan's Run, Challenge Anneka, The District Nurse, Real Life, The Hot Shoe Show, I Get Your Act Together, Rory Bremner, Who Else?, Harty, Food and Drink and Jim'll Fix It. Some of which I used to love.
(5) In a world where Taylor Swift's 1989 exists, where even I'll admit glancing at the UK top 40 demonstrates there's some really interesting, ballsy pop music in production, the BBC and whoever have seen fit to choose this, which says nothing about the British music industry or the state of the art. Once again we're treating Eurovision as a "fun party" and "nothing too serious" and "a joke" when it could be a celebration of who we are and what our music industry is. Which I know we already do across the world with the "real" music, but wouldn't it have been amazing if Scott Mills had introduced something tonight and our reaction had been the collective awe of hearing "Shake It Off" or "Let It Go" or "Happy" or "Firework" or whatever Beyoncé's doing this week, something with a push and a donk on it, instead of hearing this and trying to rationalise it, which judging by social media even people who're generally supportive are doing?
(6) I just don't like it, ok? Sigh.
A response to Jim's Cloud Post.
Radio In case you hadn't noticed from the lack of posts, I've been a bit ill this week with the cold/manflu thing which has been going around. After not knowing what to do with myself, it's actually left me with cold sores, or as is now the case scabs, around the mouth and lips, which makes it incredibly difficult to talk, or smile, or eat, or do anything a mouth and lips are meant to do without some pain. Though it is getting easier and I have some antibiotics from The Doctor. Sorry, a Doctor.
All of which explains why I entirely failed to notice that the Drama on Radio 3 last week was a new production of As You Like It:
"A new production of Shakespeare's most joyous comedy with an all star cast and music composed by actor and singer Johnny Flynn of acclaimed folk rock band Johnny Flynn and The Sussex Wit.
Lust, love, cross dressing and mistaken identity are the order of the day as Rosalind flees her uncle's court and finds refuge in the Forest of Arden. There she finds poems pinned to trees proclaiming the young Orlando's love for her. Mayhem and merriment ensue as Rosalind wittily embarks upon educating Orlando in the ways of women.
With an introduction by Pippa Nixon who played Rosalind to great acclaim at the RSC and now reprises her role as Shakespeare's greatest heroine."
It's available to listen to here for the next few weeks. You can also download it here.
[Note: the streaming page says it's only an hour and a half, but it is actually 2h 15m or so; the cuts will presumably be pretty standard. I won't know until I can listen, and I won't be doing that until I can smile properly.]
- Title: Volatile Delivery to Planets from Water-rich Planetesimals around Low Mass Stars
- Authors: Fred J. Ciesla, Gijs D. Mulders, Ilaria Pascucci, Daniel Apai.
- First Author’s Institution: Department of the Geophysical Sciences, The University of Chicago, 5734 South Ellis Avenue, Chicago, IL 60637.
- Paper Status: In preparation for the Astrophysical Journal.
Water (still) seems to be important to harbour life
Scientists in many fields see the presence of water as a major ingredient for the potential of an ecosystem to support emerging life - however you want to define life. This might result from our lack of creativity in imagining beings in the absence of water, and is likely a direct consequence of the fact that we have only one "life benchmark system" that we know of for sure. Of course, we are working hard on resolving this drawback of our ideas, as was just shown by researchers who modeled an oxygen-free cell membrane which might fit the chemical environment of Saturn's moon Titan. Anyway, since hydrogen is ubiquitous in the Universe and we can be sure that a watery environment acts as an excellent breeding ground, looking for water in other worlds still seems to be our best bet.
Water delivery via accretion of small bodies
To be able to make a statement about the conditions all around us in the Universe, we first need to understand where all the water on Earth came from. A currently favoured idea to explain the massive amount of water on the Earth's surface is that most of it was accreted from water-rich impactors. In the early phases of its formation the Solar System underwent a phase of chaotic dynamics - the so-called Late Heavy Bombardment. Lunar rock samples from the Apollo missions suggest that this event most likely happened approximately 4 billion years ago. During this time a high number of asteroids collided with the terrestrial planets. But why is this important for the water on Earth? Since the Earth formed at a distance from our Sun where temperatures in the protoplanetary disk are high (a lot of astrobites already exist on this topic here and in general here), not much water was incorporated into its bulk material. (This is also explained comprehensively in this astrobite.) However, the small bodies in the outer parts of the Solar System formed farther away and therefore harbour higher reservoirs of volatiles (chemical species that go into the gas phase easily, including water) than the inner terrestrial planets. This led to the idea that most of the water we see in the oceans today was brought to our planet after its formation.
What about other planetary systems?
When looking at other systems far away from ours, we can ask: did they emerge from the same conditions as our own? Probably not always. Sure, we find a multitude of systems that resemble the Solar System, systems with multiple planets, super-Earths and so on. However, there is a crucial ingredient which determines the amount of volatiles in the primitive planetary bodies: the amount of heating by short-lived radioactive isotopes such as ²⁶Al. These dominated the heating in early Solar System bodies and are the cause of most of their volatile loss (by outgassing to the surroundings). The abundance of such isotopes is likely variable across different planetary systems around solar-type stars, and therefore other planetary systems might have more volatiles and thus more water!
The consequences for systems with more water
And that’s now the author’s approach in this study. They adopt conditions for higher volatile enrichment than in our Solar System and calculate the gravitational dynamics and evolution of the bodies in these systems. To do so they use the N-body code MERCURY which models each body around the star as a particle and then calculates the gravitational interactions in between them and the resulting orbits and eventually impacts. Each individual particle is given a certain volatile mass fraction. They also account for the differences in stellar masses and the resulting changes in the extent of the habitable zone (in which liquid water can exist) and the mass of the disk around them (higher stellar mass in principle means there was more material available in the beginning and thus we also expect more material in the surrounding).
Figure 2 gives an idea of what the outcome of such a simulation looks like. Initially all bodies are arranged in a certain distribution around the central star and are then dynamically evolved. After a certain amount of time most of the initially small bodies have been accreted into bigger bodies, and the remaining small ones are on very eccentric orbits. Running simulations with initial values like those of the Solar System and with higher volatile fractions enables the researchers to study the impact of such changes.
Sadly, we cannot know for sure at this point. We can imagine a lot, and put as much knowledge, hard work and precision into our simulations as we can - in the end all our physical descriptions of the real world out there are just approximations, some better, some worse. Therefore, the authors of this study try to identify trends in the results of their simulations that might in the future enable them or others to compare the outcomes with the distributions of observed exoplanets (see Figure 3). This is a very hot topic in exoplanetary research: do statistics with the observed distribution of planets and identify trends to understand the origin of planets in general. What they find is that adding water to the outer bodies, in comparison with the value of the Solar System, leads to much wetter planets. Additionally, they find that low mass stars host more planets of lower masses, whereas stars with higher masses end up with fewer, but more massive, planets. These are relevant descriptions of the stochasticity of planet population synthesis (the study of the planetary population on a statistical basis) and thus give clues as to whether our ideas are right and may or may not coincide with reality. Eventually this will give observers something to refer to once we have detected and characterised (!) enough exo-worlds to do reliable statistics and find out whether at least some of our understanding withstands a critical check.
There is a first time for everything. Riding a bike, stargazing, and yes, even lobbying Congress. Jack Kiraly describes his first Legislative Blitz with Michael Briganti and Casey Dreier on Capitol Hill last week.
Mini mission updates: Dawn in orbit; Curiosity short circuit; Rosetta image release; Hayabusa 2 in cruise phase; and more by The Planetary Society
Dawn has successfully entered orbit at Ceres, becoming the first mission to orbit a dwarf planet and the first to orbit two different bodies beyond Earth. I also have updates on Curiosity, Rosetta, Mars Express, Hayabusa 2, the Chang'e program, InSight, and OSIRIS-REx.
Dawn's Chief Engineer, Marc Rayman, gives an update on the mission's highly anticipated arrival at Ceres.
Technology writer Paul Gilster shares his interest in how we depict astronomical objects, focusing on the dwarf planet Ceres.
March 06, 2015
I am still suspended head-down in a vat of boiling edits. The deadline is next Friday, so don't expect normal blogging to resume before then.
(The book in question, "Dark State", is tentatively due out from Tor in April 2016—assuming I can hit that deadline.)
"More innovation, giving civil servants the opportunity to experiment and explore solutions in a risk-free environment. techUK's 'innovation den' model will be used to provide a test platform for new projects, and is designed to overcome the problem of public sector innovation being strangled by the fear of failure. techUK will develop a 'techmap' of suppliers, ensuring Government is aware of all the options available to them."
About During the week, Tumblr emailed to remind me that I set up one of those eight years ago, when it was still in beta testing.
Across time it's generally been a place to collect together content from various places and I've decided (now that If This Then That is intermingling the APIs) to try something similar again.
Here it is.
At present it's posts from this blog, links from Twitter and flickr images. This will hopefully be the first post if everything is working properly. I know some of you prefer it over there (or here if you're reading this on Tumblr) so here we are. Hope you like the choice of title bar.
Not that I actually understand Tumblr in the same way that I didn't understand LiveJournal back in the day. Did you know LiveJournal's still going?
Figure 2 Maps and Sub Maps
Critical Path / Fault Tree
I also use maps to help determine critical paths and fault trees i.e. if something happens to my recommendation engine where does that impact me? In the above scenario I can still commission shows and deliver a streaming service but it does impact the web site. If however I lose internal power then multiple systems can be impacted and no service is delivered. Hence I try to mitigate this with multiple sources of power or by pushing a higher order system (such as compute) into a more resilient environment. In this case because compute is a commodity, I'd tend to use a volume operations based service in which I distribute my risk across a wide geography (e.g. cloud services like AWS). I've shown an example of such analysis in figure 3.
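The fault-tree reading of a map can be sketched as a dependency graph walk. This is a minimal illustration (the component names loosely follow the example above, and the `depends_on` structure is my own invention): a component's failure impacts everything that transitively depends on it.

```python
# Minimal sketch of fault-tree analysis over a map: hypothetical
# components and their dependencies, following the scenario above.

depends_on = {
    "web site":              ["recommendation engine", "streaming service"],
    "streaming service":     ["compute"],
    "recommendation engine": ["compute"],
    "compute":               ["internal power"],
}

def impacted_by(failed, depends_on):
    """Return the set of components knocked out when `failed` fails,
    by repeatedly propagating the failure to dependants."""
    out = set()
    changed = True
    while changed:
        changed = False
        for component, deps in depends_on.items():
            if component not in out and (failed in deps or out & set(deps)):
                out.add(component)
                changed = True
    return out

# Losing the recommendation engine only hurts the web site, but losing
# internal power takes out compute and hence everything above it.
print(impacted_by("recommendation engine", depends_on))
print(impacted_by("internal power", depends_on))
```

The second result is what motivates pushing commodity components like compute into a resilient, geographically distributed environment: it shortens the list of things a single failure can reach.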
Figure 8 - Scenario Planning
Team Structure
I like numbers particularly when those numbers represent cash in a profitable way. When we think of business, I tend to think in terms of flows of revenue and cost. I hence use the map to create different flows through the system, identify cost and revenue and create business flow diagrams (see figure 9).
Figure 9 - Business Flow Diagram
Sometimes I'll find one business flow is more profitable than another. Sometimes an act will evolve and a particular flow will become unprofitable or another will become profitable or a further will become possible or most likely a combination of all the above will happen. These diagrams can be a bit tedious hence it's a good idea to like making money. However, the upside is that you can often find entirely new ways of making revenue during the process and it certainly helps you to focus on this.
I'll use maps to look for opportunity i.e. points which we can attack. An example is being the first mover to create an industrialised component and then building an ecosystem on top. I might also look for areas where we might wish to gamble and experiment building an uncertain need.
My preferred approach is not to gamble but to be the first mover to industrialise and the fast follower to any new activities built on those industrialised components. This is a technique known as Innovate, Leverage and Commoditise. You commoditise an act, get everyone else to Innovate on top of it and then mine (i.e. Leverage) the consumption data of your ecosystem to spot future successful innovations that are evolving. You can then Commoditise those activities (or data) and rinse & repeat. See figure 10.
Figure 10 - ILC play.
Operations is one of my favourite topics (along with strategy) but in this case, it's all about getting stuff done. It's the execution side, and that means people (but not in a Wolf Hall sense). I first start by using a map to break out the team structures that I'll need - see figure 12.
Figure 12 - Team structure
Before you ask about the all important "Why?" of strategy. Do remember "Why?" is a relative statement as in "Why here over there?". Maps help you find the many wheres and hence determine the why. Strategy however is another discussion which I've touched upon in many previous posts.
With the NASDAQ going above 5,000 for the first time since the year 2000, valuations in tech are once again on everyone's mind, mine included. For a long period writing here on Continuations I have thought valuations were too high, and they have only gone higher since. All the arguments I provided back in 2012 are still true, in particular the line that "rates of return available on many other investments are at historic lows" - this is especially true for interest rates. Very low interest rates provide double fuel for stock prices. First, because investment dollars migrate from fixed income to equities. Second, and more importantly, because discount rates are low. If you have ever built a DCF model you know just how insanely sensitive the valuation is to the interest rate. And with technological deflation it is actually a reasonable expectation that interest rates will stay super low.
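That sensitivity is easy to demonstrate with a toy DCF (the cash flows below are made up purely for illustration): the same stream of cash is worth dramatically more when the discount rate is low.

```python
# Toy illustration (made-up cash flows) of DCF sensitivity to the
# discount rate: identical cash flows, very different present values.

def dcf(cashflows, rate):
    """Present value of a series of annual cash flows at a given
    discount rate (first cash flow arrives one year out)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

# Ten years of $100 cash flows:
cashflows = [100] * 10

low = dcf(cashflows, 0.02)   # low-rate world
high = dcf(cashflows, 0.08)  # higher-rate world
print(round(low, 2), round(high, 2), round(low / high, 2))
```

With just a six-point move in the discount rate the valuation changes by roughly a third, and the effect compounds further for cash flows projected out more than ten years, which is exactly the profile of growth-stage tech companies.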
Yesterday, Mark Cuban wrote a click-bait-titled rant on valuations which I read after it had popped up multiple times in my Twitter stream. While I didn't think it was entirely coherent, it did give me an interesting idea for a weird interaction between public and private markets that I had not previously considered. During the original Internet bubble, public market valuations relatively quickly skyrocketed in an all-out frenzy. Private valuations didn't have much time to follow because a lot of companies went public quickly and then the bubble burst.
This time round things have been growing much more gradually, and the supply of public companies has been small compared to the amount of private wealth creation, as companies have been slow to go public or have avoided it altogether (see my related posts on still waiting for IPO 2.0). So now we have a different phenomenon: demand in the public markets outstrips supply, which results in, well, higher prices. But then in turn it is those high public market multiples that inform private market valuations. And voilà, you have a case of M.C. Escher's famous picture of hands drawing each other (dear Internet: someone please put "public" and "private" on the hands and have them write 10x each; addendum: gorgeous illustration provided by Alec Hutson).
Because it is happening gradually, and because the logic looks internally consistent (add to that the low interest rates), this could continue for quite some time. This strikes me as a classic case of Nassim Taleb's point about fat-tailed distributions, where it is the higher order moments (kurtosis) that really matter. So the process looks very smooth and gradual for quite some time until there is a sudden and fairly violent swing.
Title: Bayesian ages for early-type stars from isochrones including rotation, and a possible old age for the Hyades
Authors: Timothy D. Brandt & Chelsea X. Huang
First Author’s Institution: Institute for Advanced Study
Status: Submitted to ApJ
The Hyades cluster forms the head of Taurus the bull in the zodiac constellation. It is one of the most famous open clusters - a group of stars that all formed at the same time from the same cloud of gas. This cluster was thought to be 625 million years old; however, new research suggests that the Hyades is much older. This makes for a slightly awkward situation: the Hyades underpins our understanding of stellar ages, and if its age is wrong then a lot of other ages are wrong too…
To understand why the Hyades plays such an important role in stellar dating it’s necessary to explain a little bit about how the game works (and what a mess it is!).
Isochrones, models and interpolation
It can be difficult to tell the age of a main-sequence star just by looking at it; changes that happen deep in the stellar core don’t have much impact on their outward appearance. Stars do change a little in brightness over their main sequence lifetimes, and by measuring their luminosities in different colours we can use some (slightly convoluted) trickery to infer their ages. We can measure the apparent magnitude of a star in different colours—this is a pretty easy measurement, so available for lots of stars. You can almost think of it as a really low resolution spectrum. If you know how far away your star is, from parallax measurements, for example, you can figure out the star’s absolute magnitude in those colours.
The next step involves matching the observations up with theoretical predictions. Complex physical models of stars are used to produce predictions of the luminosity you would expect, given a mass, metallicity and age. You can't do this for all values of mass, metallicity and age - that would take forever! The model predictions are therefore computed on a grid, at discrete intervals of the three properties, to produce isochrones. Isochrones are lines of constant age on a colour-magnitude diagram, or Hertzsprung-Russell diagram (see figure 1). Two stars that lie on the same isochrone should be the same age, and their mass and metallicity define where on the isochrone they sit. By finding the position on an isochrone that is closest to your star's position on the colour-magnitude diagram you can estimate its mass, age and metallicity. Even better: don't simply match your star to the nearest isochrone, but interpolate across the grid to find the values of mass, metallicity and age at the exact position of your star.
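The interpolation step can be sketched with a toy one-dimensional example. The grid values below are invented, and real isochrone fitting interpolates over mass and metallicity as well, but the principle is the same: read off an age between grid points rather than snapping to the nearest isochrone.

```python
# Toy sketch of isochrone interpolation (invented model grid): a grid
# predicts absolute magnitude at discrete ages for a star of fixed mass
# and metallicity; interpolation gives an age at the observed magnitude.

ages_gyr = [0.5, 1.0, 2.0, 4.0]        # grid ages in Gyr
magnitudes = [4.80, 4.75, 4.60, 4.20]  # model magnitudes, decreasing with age

def age_from_magnitude(observed, ages, mags):
    """Linearly interpolate a monotonically brightening grid to
    estimate the age at an observed absolute magnitude."""
    for (a0, m0), (a1, m1) in zip(zip(ages, mags), zip(ages[1:], mags[1:])):
        if m1 <= observed <= m0:
            frac = (m0 - observed) / (m0 - m1)
            return a0 + frac * (a1 - a0)
    raise ValueError("observed magnitude outside the model grid")

estimated_age = age_from_magnitude(4.70, ages_gyr, magnitudes)
print(estimated_age)  # an age between the 1.0 and 2.0 Gyr isochrones
```

The large age uncertainties quoted below follow directly from the shallow slope of this relation: a small magnitude error slides the answer a long way along the age axis.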
I should mention by the way, this method of finding an age for a star isn’t great. It’s still the best thing we can do, but ages measured by isochrone fitting tend to have uncertainties of at least 50% and often more than 100%. That’s just because, like I said, stars don’t change their outward appearance very much over their main-sequence lifetimes.
The method outlined above is standard practice. So what's new in this paper? Well, usually, these theoretical models that predict a luminosity for a given mass, metallicity and age don't include the effects of stellar rotation. All stars rotate though, and some rotate quickly enough to make a big difference to their positions on the colour-magnitude diagram. Stars that rotate rapidly fling out the material around their equators, increasing their radii slightly. The equatorial parts of the star are further away from the hot stellar interior and are therefore a little cooler and less luminous. Both of these factors affect the star's luminosity. Try to fit a rapidly rotating star to an isochrone that doesn't take rotation into account and you may well measure the wrong age for that star.
The Hyades’ age problem
The authors of this paper use rotating stellar models to measure the age of the Hyades cluster and, guess what? It turns out to be older than previously thought. It's gone from a youthful 625 million years to a slightly less youthful, but still pretty youthful, 950 million years. Of course, this difference is minuscule when you consider that most Hyades stars will live for billions of years; aren't I splitting hairs? Here's the thing though: stellar ages are really, really hard to measure for main sequence field stars, but much easier for cluster stars. That's because they are stellar populations - they're all the same age and roughly the same metallicity. You can fit an isochrone to an ensemble of stars, which gives you a much tighter constraint on its age (assuming the stellar models are correct, of course). For this reason, clusters are used to calibrate other stellar dating methods. Gyrochronology, for example - the method of dating a star from its mass and rotation period - relies enormously on the fact that the age of the Hyades has very small error bars! If the age of the Hyades is wrong, that could have a serious domino effect. Hundreds of stars could have the incorrect age as a result.
Posteriors, posteriors, posteriors!
Everything the authors of this paper do is probabilistic. They don't report one age for the Hyades, they report a probability distribution over ages (and provide samples from the posterior probability distribution). They even provide an awesome web interface where you can enter the name of a star, along with a rough estimate of metallicity (actually a metallicity prior), and it spits out the posterior probability distribution for the age of that star. The posterior probability distribution for the Hyades' age and metallicity is shown in Figure 2.
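What "report a posterior rather than a point estimate" means in practice can be sketched with a minimal example (the grid and measurement are invented; the real analysis marginalises over mass and metallicity and uses carefully chosen priors): evaluate a Gaussian likelihood for the observed magnitude over a grid of model ages, apply a prior, and normalise.

```python
# Minimal sketch of a grid-based age posterior (invented values): a
# Gaussian likelihood over model ages with a flat prior, normalised so
# the result is a probability distribution over ages, not a single age.

import math

ages = [0.5, 1.0, 2.0, 4.0]            # Gyr, model grid
model_mag = [4.80, 4.75, 4.60, 4.20]   # predicted magnitude at each grid age
obs, sigma = 4.65, 0.10                # observed magnitude and its uncertainty

def posterior(obs, sigma, model):
    """Posterior over grid ages under a flat prior: proportional to the
    Gaussian likelihood of the observation at each model prediction."""
    like = [math.exp(-0.5 * ((m - obs) / sigma) ** 2) for m in model]
    total = sum(like)
    return [l / total for l in like]

post = posterior(obs, sigma, model_mag)
print([round(p, 3) for p in post])  # a distribution over ages, not one age
```

Reporting the whole distribution is what lets the error bars on the Hyades' age propagate honestly into everything calibrated against it.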
This conflict may be resolved soon—the Kepler spacecraft (now reincarnated as K2) is currently observing the Hyades. It will be able to detect asteroseismic oscillations in some of its stars, revealing their true ages. Hundreds of inferences rely on the age of this cluster—unveiling the mystery will be an exciting moment for stellar astronomy!
- Albert Wenger
- Charlie Stross
- Dan Catt
- Emily Short
- Fairphone blog
- Feeling Listless
- Jeff Atwood
- Simon Wardley
- The Planetary Society
- Vinay Gupta
Updated using Planet on 26 March 2015, 06:48 AM