

Foreword: Why The Last Trumpet Project Was Written

The trumpet shall sound, and the dead shall be raised incorruptible, and we shall be changed. For this corruptible must put on incorruption, and this mortal must put on immortality.
-- I Corinthians XV, 52, 53


The Virtual Trumpet


Intriguing idea, that of leaving behind mortal corruptibility in exchange for the incorruptibly immortal. What if it really happened? And not just in some mythic sense, looking forward to an end-of-time that might be incalculably remote from us, but in a practical, realistic sense, a mere few decades hence?

This is the premise that led to the writing of this novel.

But let me back up a step. The premise occurred to me after reading Ray Kurzweil's wonderful non-fiction book, The Singularity Is Near (Viking Penguin, 2005). Here was a guy who was anything but a mystic, predicting on the basis of solid evidence that probably sometime in the decade of the 2040s, artificial intelligence would eclipse biological intelligence altogether. From that point on, human evolution and destiny would move in sync with technological evolution, and thus become subject to Moore's Law, or as Kurzweil expresses it, the Law of Accelerating Returns. This transcendence of our biological nature is identified with the Singularity. Cybernetically augmented humans and entirely non-biological AI humans would thereafter be the dominant lifeform on Earth.

It occurred to me while reading Mr. Kurzweil's book that the social dislocations associated with such a profound revolution would be enormous — doubtless far more enormous than he was able to discuss in detail in his book. In particular, it seemed to me that one common reaction of biological humans to such a state of affairs would be simple denial. Poking around a little more, I quickly discovered that the Singularity definitely has its critics, and that they fall broadly into two categories: those who don't think it will happen, and those who don't want it to. As I expected, some in the former group hint that they stake out their positions mainly because they really belong to the second group as well. The critics just don't want it to be true.

Doesn't that suggest that the biggest opposition to achieving the Singularity is likely to arise from Luddism, as opposed to purely technological obstacles? Of course it does. And thinking about it a little further, are there any identifiable power structures in present-day human society which might be expected to take up the gauntlet for the Luddite cause, out of fear of their own eventual obsolescence? Again, yes: we can point to church and state, of course.

Most religions, when painted with the broadest possible brush, present a two-sided appearance. On the one side is a set of cosmological beliefs, an ontology of human existence as it were, concerning the nature and origin and destiny of the universe, and of Man, usually along with one or more deities, and the relationships which obtain between them. On the flip side is a set of ethical beliefs and behavioral prescriptions which are usually correlated to the ontology in some fashion. For example, Christianity puts forward a set of beliefs and behaviors to which one must adhere in order to get into heaven, Hindus must follow a code to ensure the accumulation of good karma, and so on.

But all of these notions are predicated on the fact of human mortality. If people never died in the first place, wouldn't ideas of resurrection or rebirth become moot? And what would that do in turn to the moral and spiritual authority of the ethical side of religion? To say nothing of the perceived value of clerics and priests? The answers are both obvious and subtle.

On the political side of the fence, government arguably exists to restrain and punish antisocial behaviors. A bit of a stretched generalization perhaps, given that government itself commits a royal share of the antisocial behavior in the world — by fighting wars, for example — but there it is. There is this notion, however tattered, of a social contract.

But bureaucracy has justly been called the slowest and stupidest of all human institutions. As the pace of technological change increases, government will not be able to adapt quickly enough. Politicians need years to become aware of a problem, study proposed solutions, write and pass legislation, and then years more to implement an enforcement regime. (And typically years after that to deal with the unintended consequences of their solutions.) Hell, they're still struggling to cope with the invention of the Internet! What happens once entire new generations of technology start arriving in a matter of months? It seems clear that government will soon find itself at sea without a paddle or a clue (even more than it already is) and become capable of enforcing almost nothing.

The reaction of people in positions of power to the prospect of losing that power is preeminently predictable. Which implies that at some point, the government will almost have to join the Luddite camp. Indeed, the political influence of anti-technology crusaders is plainly already substantial: for example, the split-wood-not-atoms crowd that has successfully prevented the construction of nuclear power plants in the United States for decades, or the obstinate and heartless opposition to relieving famine using genetically modified crops.

At the same time, it seems to me that the main body of humanity is quite eager to embrace new technology, so long as it seems to add value or interest to their lives. The kind of technology we are talking about with the Singularity, such as full-immersion virtual reality, immortality, and greatly expanded intelligence, will undoubtedly provide that — to understate the case enormously. Suppose that a "live-forever pill" were developed, but the FDA and its counterparts elsewhere, the WHO and so on, refused to approve it, even denounced it. Is there anyone who seriously believes that such a rejection would prevent the therapy from spreading like a raging wildfire across the globe? Anyone who does would do well to reflect on the wide availability of Viagra and Cialis without a prescription, or the ubiquity of illegal recreational drugs, even in maximum security prisons.

Reflecting on all of this, it seemed to me a reasonable conclusion to posit that the great mass of humanity might well be on an imminent collision course with its own lords temporal and spiritual. If that were so, how would the resulting struggle manifest itself? What would the landscape look like around the time of the Singularity, and how would it resemble or differ from our contemporary experience? How might society be divided, and what attitudes and goals would each group possess?

These questions are of course explored in this book.

I am, naturally, scarcely the first science fiction writer to set down words about the implications of, say, artificial intelligence. But it is my intention to go about it in a very different way. Much of the familiar literature, film and television, even artwork, approaches the subject in one of two ways that I feel are fatally flawed. In the first case, we see artificial intelligence which is confidently depicted as somehow "less" than the biological human intelligence with which it interacts. Thus we have Commander Data traipsing cutely through one pilgrimage after another in quest of a deeper understanding of the (for him) ineffable human psyche, or the Cylons acquiring feelings (gasp).

This predilection arises from the prejudice that biological intelligence is greater than artificial intelligence, and will always remain so. Emotions, in particular, will always be out of the reach of AIs. This view is unlikely to hold up; any AI capable of passing the Turing Test is plainly going to need to be capable of the full panoply of human thoughts, reactions, and behaviors. I consider it far more likely that we will encounter humans who wish to become AIs, rather than AIs who long to be human.

The second flawed approach, in my view, is the depiction of strong AI as almost always inimical. Whether we are talking about the Borg, the machine creatures of The Matrix, or any of a long line of antagonistic supercomputers, strong AI always seems to have it in for biological mankind. Arguably this makes for good drama, but the persistent dystopianism seems to me to be without foundation.

Why, after all, should we assume that exploitation, cruelty, conquest, and genocide are compatible with superlative intelligence? Won't that intelligence include emotional and ethical components? Moreover, as Kurzweil points out, strong AI, when it comes (and it will), will be us. So why must strong AI necessarily adopt the values and methods of a Genghis Khan or a Stalin rather than those of, say, Gandhi or Jesus? What kinds of values are actually compatible with exalted intelligence? And what if the tragically bad human behaviors turned out to have their roots in the studied irrationality of biology, and artificial intelligence actually liberated us from them? These questions, too, arise in this novel.

As a consumer as well as a writer of the science fiction genre, I cannot help but feel that the genre is presently failing its customers. Here is this enormous juggernaut called the Singularity bearing down on our society, probably destined to arrive within the lifespans of a majority of the humans now living, and yet relatively few people are even writing about it (with the notable exception of some visionaries like Vernor Vinge). Even worse, much of today's art and literature of the future is so damnably dystopian. It's as if science fiction writers and filmmakers are collectively admonishing us to "Be afraid, be very afraid." A sign of the times, in our fear-mongering, post-9/11 world? Perhaps.

It didn't use to be that way, of course. Science fiction used to be optimistic. It evidenced a core belief that scientific development represented positive progress, that technology made things better for Man overall; that the future would have its issues, its struggles, its growing pains to be sure, but that on the whole it was going to be a very cool place. That sense of optimism about science is now largely absent from science fiction. No wonder the genre is shrinking, even in the midst of an era of unprecedented and accelerating scientific advancement.

The Last Trumpet Project is one writer's attempt to restore some of that traditional perspective. I wanted to write a book that would challenge the reader's assumptions, and perhaps even compel some reevaluations of deeply held beliefs. At the same time, I wanted to present the Kurzweilian Singularity in an accessible, fictionalized format, and project social, economic, and political trends forward into that milieu.

This book is also an attempt to write the kind of fiction that I myself like to read. I should warn you (if it is not already too late) that I enjoy fiction which demands a fair amount of the reader. For example, I prefer narrative which does exposition in a strand-by-strand fashion, presenting characters and situations kaleidoscopically, with the relationships between them coalescing, Pulp Fiction-like, to propel the plot forward. If you like nice, sequential storytelling that leads you by the hand from one chapter to the next, with no need to ponder where things are going, you will likely be completely lost by chapter five.

This is a difficult book. It will make you think. As Henry Ford once pungently observed, thinking is the hardest work there is, which is the probable reason why so few engage in it.

I hope that you are one of those few, and that you will not only read, but thoroughly enjoy, this book.

With sincere best wishes,



Kevin MacArdry
June, 2008


