Nexus Rothdas book review
3.0 Stars
1-1-2015

This isn't a great book, or a hugely intelligent or clever book, but it is at least wrong in interesting and provocative/demonstrative ways. The setting is the world in 2040, where technology promises/threatens the creation of post-humans who have mental and physical abilities well beyond the baseline. In particular, the book focuses on technologies and programs that allow the inspection of minds and direct communication between minds. The main story is a kind of action-techno-thriller as different parties try to shape how these technologies will be used and disseminated. As an action-thriller, it is decent and readable.

So, the problems. One is that I found myself skimming surprisingly large sections of the book. Some of the fight scenes became a bit boring and overdone, but the primary culprits were the frequent sections devoted to LSD-like trips and cosmic oneness and light and unity. These weren't particularly moving; the author doesn't have the skill to convey religious experience in an interesting way.

This leads into the second problem, which is that while the author has taken the first step of agreeing that brains make minds, he hasn't really allowed the implications of that to seep into the rest of his worldview. For example, if he can watch a lab animal with a wire stuck in its brain to make it feel pure joy, and then turn around and uncritically describe his own metaphysical experience of light and unity as a sort of ultimate and objective good, I think he has failed to put two and two together.

In addition, in his descriptions of brain tech, the author often falls into the kind of implicit Cartesian dualism that Dennett complains about. If some nano-probes are in your head mucking up your memories, you wouldn't feel tendrils in your mind, since there is no homunculus in your head to observe the changes. To put it another way, you wouldn't feel the change any more than you would if someone changed a webpage somewhere out on the web. When the change happened there would be nothing for you to notice; it's just that the next time you visited the page, the content would be different. Or at least that's my understanding of things. Similarly, it doesn't make any sense to say that a character uses their will to resist these sorts of physical-level changes. There are a number of related problems I could go on about, but I will leave it at the above two.

Hmm, what else can I complain about. The main protagonist is a moderately skilled programmer who develops software that runs on and affects brains. He uses his own mind as a test machine for this, i.e. he rolls out changes to his own mind without testing them anywhere else first. And he's writing in, like, low-level C. As a programmer, I can't look on these practices with anything but horror.

I also find myself disagreeing with the author on his main ideological points. He seems to think that group consciousness would be a wonderful thing, while the last thing in the world I would want is to let other people's trash minds touch mine. More seriously, consider programming, where you lay out semi-thoughts in a semi-physical form: the first thing a programmer wants to do when they come to someone else's codebase is to rewrite and refactor everything into their own personal style. The other person's code seems terrible and smelly and alien, and you want to make it right. I'm not saying this instinct to refactor is a good one, but it is definitely there and it is definitely common. If the author looked at his own experience with programmers, who can't even deal well with this small shadow of another person's mind, I don't see how he could claim that more direct contact between minds would be better.

The author also thinks that these sorts of transformational technologies should be freely available to anyone, and tries to sell that at length in the book. My own opinion is that if we get to the point where anyone with access to a high school biology lab and an internet connection can wipe out large swathes of the human race, we'd be pretty fucked. I'm not a huge fan of behemothic surveillance states, but existential tech threats are one of the few valid justifications I can think of for them.

Ok, those were the complaints. Despite all of them, I didn't mind the book that much, as it did trigger more thought than most novels. I wouldn't want to read a sequel, but this one was good.