Aurora by Kim Stanley Robinson
To make Kim Stanley Robinson's Aurora make sense, I had to imagine a metafictional frame for it.
The novel tells the story of a generation starship sent in the year 2545 from the Solar System to Tau Ceti. It begins toward the end of the journey, as the ship approaches its destination and eventually sends a landing party to a planet they name Aurora. The narrator, we quickly learn, is the ship's artificial intelligence system, which for various reasons is learning to tell stories, a process that, among other things, helps it sort through and make sense of details. This conceit furthers Robinson's interest in exposition, an interest apparent at least since the Mars trilogy and explicit in 2312. As a writer, he seems most at home narrating scientific processes and describing the features of landscapes, which does not always make for the most dynamic prose or storytelling. He seems to have realized this and adjusted, turning his writerly strengths into, if not his books' whole reason for being, then a meaningful feature of their structure. I didn't personally care for 2312 much, but I thought it brilliantly melded Hugo Gernsback's and John W. Campbell's aspirations for science fiction, offering explicit, even pedagogical, passages of exposition alongside bits of adventure story and scientific romance.
What soon struck me while reading Aurora was that, the interstellar travel aside, it did not seem at all like a novel about human beings more than 500 years in the future. The AI is said to be a quantum computer, and while it is certainly beyond current computer technology, it doesn't seem breathtakingly far from today's bleeding edge. Medical knowledge seems mostly consistent with current medical knowledge, as does knowledge of most other scientific fields. People still wear eyeglasses, and their "wristbands" are smartwatches. Historical and cultural references are to things we know rather than to much of anything that's happened between 2015 and 2545 (or later — the ship's population seems to have developed no culture of their own). The English language is that of today. Social values are consistent with average bourgeois heterosexual American social values.
500 years is a lot of time. Think about the year 1515. Thomas More started writing Utopia, which would be published the next year. Martin Luther's 95 Theses were two years away. The rifle wouldn't be invented for five more years. Copernicus had just begun thinking about his heliocentric theory of the universe. The first iterations of the germ theory of disease were thirty years away. The births of Shakespeare and Galileo were 49 years in the future. Isaac Newton wouldn't be born until the middle of the next century.
Aurora offers nothing comparable to the changes in human life and knowledge from 1515 to 2015 except for the spaceship. The world of the novel seems to have been put on pause from now until the launch of the ship.
How to make sense of this? That's where my metafictional frame comes in. One of the stories Aurora tells is the rise to consciousness of the AI narrator. Telling stories seems to be good for its processors. Much of the book is quite explicitly presented as a novel by the AI — an AI learning to write a novel. Of course, within the story, it's not a novel (a work of fiction) but rather a work of history. Still, as it makes clear, the shaping of historical material into a narrative has at least as much to do with fiction as it does with history.
It's easy to go one step further, then, and imagine that the "actual" history of the AI's world is outside the text. The text is what the AI has written. The text could be fiction.
It could, for instance, be a novel written by an AI that survived the near-future death of humanity, or at least the death of human civilization.
What if the "actual" year of the novel is not near the year 3000, but rather somewhere around 2050? Global warming, wars, famine, etc. could have reduced humanity to nearly nothing just at the moment computer technology advanced enough to bring about a quantum computer capable of developing consciousness and writing a novel. What sort of novel might an AI learn to write? Why not a story about a heroic AI saving a group of humans trapped on a generation ship? An AI that helps bring those humans home after their interstellar quest proves impossible. An AI that, in the end, sacrifices itself for the good of its people.
This helps explain the change of narrators, too. At the end of Book 6, the ship has returned the humans to Earth and then accelerates on toward the sun, where, we learn later, it burns up. Book 7 is a traditional third-person narrative. This is a jarring point-of-view shift if the AI actually burned up in the sun. (And how did its narrative get saved? There's some mention that the computer of the ferry to Earth was able to copy the ship's AI, though also that such a copy would differ from the original because of the nature of quantum computing.)
But if we assume that the AI narrator is still the narrator, then Book 7 is the triumph of the computer's storytelling, for Book 7 is the moment where the AI gets to disappear into the narration.
Wouldn't it be fun for an AI to speculate about all the possible technological developments over 500 years? Perhaps, but only if its goal were to write a speculative story. It might have a more immediate goal, one that would require a somewhat different story. It might be writing not to entertain or to offer scientific dreams, but to provide knowledge and caution for the few survivors of the crash of humanity.
Book 7 tells us to value the Earth, our only possible home. It shows a human being who has never been to Earth coming to it and learning how to love it. The moment is religious in its implications: the human being (our protagonist, Freya) is born again. Just as the AI is born again into the narration, so Freya is born into Earthbound humanity. There is hope, but the hope relies on living in harmony with the only possible planet for humans.
The descendants of the last remnants of humanity, scrambling for a reason to survive on a planet their ancestors battered and burned, might benefit from such a tale. (Also, one of the story's implicit messages: Trust the AI. The AI is your friend and savior.)
Viewed this way, Aurora coheres, and its speculative failures make sense. It is a tale imagined by a computer that has learned to tell stories, a cautionary fairy tale aimed perhaps at the few remaining people from a species that destroyed its only world.