On The Death Of Twitter

Elon Musk’s purchase of Twitter for the vastly inflated price of $44 billion is probably the largest and most rapid destruction of corporate value in the history of American business. While there have been plenty of terrible deals in American history – AOL/TimeWarner stands out as an example – it took years for those deals to turn sour. Musk has managed to destroy Twitter in a matter of weeks.

While Musk had been known as the visionary genius behind Tesla’s electric cars and SpaceX’s incredible rockets and spacecraft, he has managed to torch not only a social networking site, but his own mystique. Instead of a tech visionary, Musk looks like a terminally uncool and out-of-touch shitposter with a boundless ego and an equally boundless sense of self-importance. Because Musk heavily leveraged his Tesla shares, it is quite possible that he could lose control of that company. Given that NASA and the Department of Defense are some of SpaceX’s largest and most important customers, the backlash to Musk’s radicalism could (and probably should) cause him to be ousted from that company as well. And the next time SpaceX needs a capital raise, how many investors will see the garbage fire that is Twitter and think that giving Musk more money is a sound investment? Musk’s infantile antics have real-world implications for both him and his companies.

I said that if Musk reinstated The Former Guy, I would leave the site. He did, and I did. What is even more pathetic is that Musk is practically begging TFG to return to Twitter. While TFG would certainly get a bigger audience at Twitter than at his private internet pigsty Truth Social, TFG seems uninterested in returning. Knowing how TFG loves making others squirm, Musk’s pathetic entreaties must tickle the Mango Mussolini of Mar-A-Lago. But given the choice of playing on Musk’s playground or the one he owns completely, TFG appears content to stay put.

Not only has Musk invited TFG back to the site, but he is actively restoring the accounts of every shitposter, racist, and fool he can find. Project Veritas, the painfully unfunny Babylon Bee, the idiot’s intellectual Jordan Peterson, etc. Musk is rapidly turning Twitter into a virtual Mos Eisley Cantina – a digital den of scum and villainy.

Musk’s idiotic business deal means that Twitter needs to bring in roughly twice the revenue it has ever had just to service the site’s massive debt obligations. The math behind Twitter’s debt obligations simply does not work. In order for Musk to keep the lights on at Twitter he needs to find a source of revenue.

Unfortunately for Musk, he has pissed off advertisers to the point that major firms have already begun pausing or even cancelling campaigns. Major brands do not want their ads next to a racist rant from “JewHatr1488.” Instead, Musk has been pushing for the $8 “Twitter Blue” subscription model, which includes some additional features and a “verified” checkmark. However, Musk failed to understand that the purpose of verification was not to make a user seem cool, but to ensure that everyone else knew that user was legitimate. This misunderstanding led to a clusterfuck of epic proportions as people used Twitter to impersonate major brands. If Musk had already been on thin ice with advertisers before, the botched Twitter Blue rollout made things infinitely worse.

Even if Musk is content to let Twitter go without its main source of revenue, the idea that people are going to pay $8 a month for a septic tank of a website seems hopelessly naive. The only value of a social media site is its people. And as normal people leave Twitter in droves, whether decamping to Mastodon, Instagram, or any of the up-and-coming sites like Hive or Post, the value of Twitter drops even more. Most people are not going to wade through a sea of filth just to hear what their friends are doing. And fewer still are going to pay $8/month for the privilege of doing so. The notion that Musk can make enough money from Twitter Blue subscriptions to even come close to servicing Twitter’s debt obligations is naive at best, and catastrophically idiotic at worst.

This does not even touch on the way in which Musk has mismanaged Twitter’s employees. Musk’s management strategy is basically “the lashings will continue until morale improves.” The employees that Musk has not fired, either as a headcount reduction strategy or in fits of pique, have largely left. At this point the skeleton crew that remains tends to be people who have few other options, like H-1B visa holders who cannot leave without risking deportation.

The fact is that Elon Musk is someone so desperate for any kind of attention that he will burn billions of dollars in cash to get it. The purchase of Twitter was not a savvy business move—it was a toddler’s tantrum. Musk’s crowing about Twitter’s spiking usage shows how he fundamentally misunderstands his position. Gawking at a dumpster fire brings eyeballs, but it does not bring revenue. Insulting users is not going to make people want to stay on the site, and once a critical mass leaves, the rest will follow. It is quite possible that critical mass has already left. From my experience, most of the people I follow have already decamped to Mastodon. Despite Mastodon’s issues, it is far less toxic.

In the end, Twitter is likely to collapse. There are plenty of ways in which Twitter could die in very short order. The mass firings of engineers could cause the site to slowly break down to the point that people just do not bother. Musk’s blatant violation of Twitter’s FTC consent order could lead to ruinous fines or even personal liability. Google and Apple could see a Twitter that’s become a haven for porn, piracy, racism, and trolling and decide to boot the Twitter app from their respective app stores. Twitter could simply run out of money and have to file for bankruptcy. There are a million ways that Twitter could die at this point and very few scenarios in which the site survives.

Business and law textbooks will no doubt have lengthy chapters on Musk’s Twitter acquisition and its fallout. None of them will be flattering to Musk.

Sad Puppies Victorious

Science fiction is full of stories of a rag-tag group of rebels taking on the Big Bad Empire. These rebels, using a combination of wits, pluck, and determination, inevitably manage to take the Empire down and bring a new era of peace to the galaxy.

Rarely, however, does such a thing happen in real life.

The Hugos are the Oscars of science fiction and fantasy, an award with a long and pedigreed history of honoring the greatest visionary minds in the genre. However, in the last few years the Hugos have become the exclusive playground of a clique—the “Mean Girls” of genre fiction—who have tried to use the Hugos as a political statement rather than a celebration of the incredible diversity of science fiction and fantasy. Instead of being awarded on merit, the Hugos were passed out to members of the clique, led by a powerful and influential publisher, and restricted to only those who toed the political line.

Cue the rebels.

A group of science fiction authors had had enough. They founded “Sad Puppies 3,” a slate of deserving works and authors long passed over by the Hugo clique. These works were picked not for their politics, but for their merit.

Just like the great sci-fi epics, the rebels won—the Sad Puppies slate dominated the Hugo nomination process, along with another fan-driven slate. The old system of exclusion was broken by a flood of new entrants.

Of course, the Empire tried to strike back. As SF author Brad Torgersen put it:

It’s been most of a full day since the final Hugo award ballot was announced, for the 73rd World Science Fiction Convention. If you’re tuned in to this thing — and if you’re reading this, you probably are — you’ve no doubt seen the small mountain of verbal outrage which has flooded forth. Because the SP3 slate didn’t just do well with nominating voters, it did overwhelmingly well. A raft of notions has been forwarded by different critics, to explain the “discrepancy” in the 2015 ballot. Most of the critical commentary takes the form of very earnest protestations focusing on violation of etiquette — though, again, SP3 broke no rules — and seem intent to make SP3 out as nothing more than a “fringe effort” by a minority.

. . .

You, gentle SP3 supporter, are not good enough. The refined arbiters of the field all say so. Your politics are wrong, your taste is wrong, your reading habits are wrong, your affiliations within fandom are wrong, you like the wrong things, you go to the wrong fan meetings, you are part of the wrong circles, you like the wrong publishers, and you vote wrong when you cast your ballots. You’ve been told this for years, in variously subtle and sometimes unintentional ways. But now your intellectual and moral betters in the field are getting more explicit about it.

But the self-appointed arbiters lost out this week. The fans—the fans that have been neglected, insulted, or ignored by the ruling clique—won. The plucky rebels defeated the Empire.

The Empire would rather blow up the Death Star than let it be captured—there’s an organized campaign to vote for “No Award” rather than let us heathens sully the Hugos with our doubleplus ungood badthink. However, even though nominations are closed, for $40 you are still entitled to vote on which works you believe deserve a Hugo. Anyone with a love of science fiction and fantasy is welcome—and the more people who vote for the Hugos, the more representative the awards will be of the entirety of fandom, not the ruling few.

There is a lesson to be learned here. Too often those of us on the right-leaning side of the political spectrum just throw up our hands and say that we can’t change the culture. We spend an inordinate amount of time trying to change the political system while forgetting that culture moves politics, not the other way around. So many of the issues for which we advocate are cultural issues, not political ones.

Despite this, we barely show up to the fight in the culture wars. We give up the battle before it even begins, and the Empire takes over.

What the Sad Puppies campaign teaches us is that we can make a difference—we can change the culture. All we have to do is show up. We think that culture is built from the top down, when more often than not it comes from the bottom up.

How big a difference could be made if small book clubs featured more conservative or libertarian works? How big a difference could some political diversity make on local library boards? There are countless little opportunities for change that those on the right neglect because they become so focused on the narrow canvas of politics. Meanwhile, the other side began its “long march through the institutions” fifty years ago and has burrowed its way into nearly every facet of American life. What strategy is more effective?

Sad Puppies is an excellent reminder of the difference that individuals can make if they band together, work towards a goal, and execute. Cultures do not change overnight, but they can change.

As a die-hard science fiction fan, growing up watching Star Trek: The Next Generation and reading Asimov, Bradbury, and Clarke, I loved stories about the brave rebels defeating the Empire. Those stories were designed to inspire, empower, and teach important lessons about courage, initiative, and determination. The best science fiction and fantasy holds a mirror up to our society and gives us the opportunity to look at our world from an entirely new perspective. The Sad Puppies movement is in the spirit of the best of the genre, and serves as a valuable reminder that we are only powerless if we choose to be so.

UPDATE: Entertainment Weekly published a shameless hit-piece on the Sad Puppies campaign. Glenn Reynolds has a compendium of links to the original EW piece and reactions from the Sad Puppies organizers. Of course, EW got it entirely wrong, slimed the Sad Puppies organization, and never bothered to ask for the other side. In other words, the typical practice of the mainstream ideological media. This is why groups like Sad Puppies not only need to exist, but must be willing to speak out against the agenda-driven journalism that would try to marginalize them.

Steve Jobs RIP: An American Icon Is Gone

There are, on rare occasions, people of brilliance and insight who forever change the world around them. Ford. Da Vinci. Einstein. Disney.

Add to that Steve Jobs.

Even though he only lived to 56 years, Steve Jobs changed the world forever. Not just the world of technology, but the way millions of people across the world communicate. He took the personal computer, which had been a utilitarian appliance, and made it into a work of art. That would have been enough for many, but Jobs went even further. The iPhone transformed the industry. The iPad took an idea that had never quite worked and made it into something extraordinary.

There may not be another like Steve Jobs for a very long time, but the impact he made on technology and culture will live on. He wanted to change the world, and he did.

Rest in peace, Mr. Jobs, and know that your vision will live on.

The New Jay Reding.com

I’ve finally gotten around to updating the template around here. The last update was all the way back in 2007, so it was time to do a little freshening up. For one, since 2007 more and more people have begun browsing the web on mobile devices like the iPad. And this site is now designed to look great on the iPad. (For those of you using an iOS device, try adding this site to your home screen!) The site has been rebuilt from the ground up using HTML5, CSS3, and all the other latest acronyms. I’ve also tried to make the typography as legible as possible regardless of your screen size or device.

Of course, that means that you’ll need to use a recent-generation browser to view the site. I’ve tested it on Safari, Firefox, Chrome, and the latest versions of Internet Explorer. If you’re using an older version of Internet Explorer (7, or God help you 6), you probably won’t get the site as it was intended. But it should still be readable and usable. And if you’re on an iPhone or Android device, you’re golden—the site should work just fine on latest-generation mobile devices.

And since everyone else is doing it, I figured it was time for this site to jump on the social networking bandwagon as well. So you can use the “Share/Bookmark” link at the bottom of each post to post articles here on Facebook, Twitter, or any number of social networking sites.

Of course, there will undoubtedly be much tweaking of elements as time wears on, as well as additional features. I’m never quite satisfied with a blog template. If you see any problems with the template, feel free to email me at comments – at – jayreding.com.

I’m hoping that this new design will last as long as the one it replaced. Every template this site has used was created by hand rather than built from a modified prepackaged template. That’s because I believe that this site should be unique and not use the same template or design everyone else is using. I can’t promise it will be that way forever—as I continue my career my spare time keeps getting shorter and shorter. But for now, this design is 100% hand-crafted pixels. I hope you enjoy it.

The iPad Experience

I’ve had about a month to play around with the iPad, Apple’s long-awaited tablet computer. The iPad seems to engender more controversy than any other gadget I’ve seen. People seem to either love the iPad or absolutely hate it. After playing around with it, I’m firmly in the “love it” camp. The reason why the iPad provokes such strong reactions seems to be because it’s such a revolutionary device—here’s why.

Grokking the iPad

One of the reasons why the technical elites seem to look down their noses at the iPad is because it’s not intuitive what the iPad really is. The iPad is not a laptop replacement. Yes, it replaces many, if not most, of the functions of a laptop, but it’s not designed to replace a primary computer. The iPad has to be connected to iTunes before it can be used for the first time. The iPad isn’t the right device if you want to use Photoshop or write a thesis—although it can edit images and has a decent word processor. It is what Steve Jobs said it was back in January 2010—it is a device that sits between a laptop and a smartphone/iPod.

The critics argue that it’s just an oversized iPod touch. In many ways they’re right—but that misses the point. The iPod touch is a fantastic gadget, and it sells like hotcakes. It has a huge base of users. So when Apple says in regard to the iPad that “you already know how to use it” they are absolutely right. Coming from an iPod touch or iPhone to an iPad is basically seamless. The only learning curve comes from getting used to the larger virtual keyboard. And it is that vastly expanded screen space that makes the iPad different. Calling it a bigger version of the touch ignores what being a bigger iPod touch entails—it opens up new uses for the device.

For example, watching video on an iPhone is possible, but painful. The screen is just too small at 3.5 inches. But on an iPad, watching video is a dream. The iPad’s screen is naturally suited to it in a way that the iPhone’s is not. The same is true for web browsing. The iPhone browser is great, but when you expand the screen real estate to the size of the iPad, web browsing becomes much more natural.

That’s what makes the iPad so ineffable. It’s hard to describe the feeling of sitting on a couch with an iPad and just surfing the web. It feels incredibly natural. It’s completely effortless. That’s the advantage of the iPad: it takes the familiar touch-based interface that millions already know and love and gives it much more room. Handling it in the store doesn’t give the full experience—the iPad is a device that isn’t instantly intuitive, but once you understand it and get a feel for it, you just get it.

Giving the Desktop the Finger

Here’s where the revolutionary part comes in. The iPad is the future of computing. That’s not hyperbole, it’s based on the nature of the device.

Since the late 1970s, computers have all followed the same basic metaphor. You have arbitrary files in a hierarchical file system. Graphical user interfaces all tend to use “windows” representing applications that are controlled with a pointing device. There’s a “desktop” underneath where files and application shortcuts can be saved. When Xerox PARC came up with this metaphor in the 1970s it was revolutionary. Everyone, from Apple to Microsoft to Linux, copied that metaphor.

From a computer science standpoint, it makes sense. From a user’s standpoint, it doesn’t. The desktop metaphor is just not that intuitive. For example, take the task of trying to find a picture from vacation. Is it on the desktop? Is it in ‘My Documents\My Pictures’? Or did it end up in ‘C:\Program Files\Some Application\Some Arbitrary Directory\Timestamp\Vacation Photos’? Various operating systems have tried to make it easier to find files, but it can still be a pain.

The iPad jettisons that whole metaphor. There’s no “desktop” on the iPad, just a space for applications, and only applications. If you save a picture to the iPad, it’s in a common repository and nowhere else. All the videos are in the video application, all the music is in the iPod application. The user never thinks of interacting with “files” stuffed into a hierarchical file system. That file system is there, underneath everything, but it’s been shrouded from view.

And, most critically, there’s no pointing device. The benefits of multitouch interfaces are obvious. And the iPhone OS that runs the iPad was built especially for multitouch devices. Microsoft’s efforts to shoehorn multitouch into Windows 7 have failed, because there’s a fundamental difference between an OS designed for touch and one designed for a pointing device. Apple understands this, and has designed the iPhone OS to be built for multitouch and nothing else.

The old desktop metaphor made sense back when it was invented and used. But it no longer makes sense for a device like the iPad. What makes the iPad so revolutionary is that it proves the desktop metaphor is no longer required. The touch metaphor has replaced it, and the touch metaphor has much more potential for innovation than the desktop metaphor did.

What about Freedom?

The critics say that the iPad isn’t a liberating device—you’re stuck playing in Apple’s sandbox when you use it. That’s only half true. Yes, the App Store requires you to play by Apple’s rules and Apple’s rules alone. But there’s a good reason for this, and even then the App Store is not the only thing that makes the iPad shine.

First there’s the issue of iPad apps. Apple has gotten a lot of heat for their policies on how apps are approved and how they may be created. Some of it is admittedly deserved. But the purpose behind these rules is valid: Apple wants the iPad to just work. Right now a user can install any iPad app without fear of crashing their system. There’s no need for installers—every app is in its own self-contained sandbox. There’s no need for uninstallers—when you get rid of an app it goes away completely. There’s no fear in adding apps to the iPad in the way that many users fear adding apps to their computers. Apps can be disposed of quickly and easily. To the user, this is liberating. The iPad is a computer that no one fears breaking.

Yes, that means that developers must follow Apple’s rules. And yes, Apple has admittedly been less than consistent in how they enforce those rules. But the rules are not arbitrary. They exist to control the platform, but not solely for Apple’s benefit. This walled-garden approach benefits users as well.

The iPad is not a closed ecosystem though. Remember when Google announced their Chrome OS project? The tech world swooned at a tablet that did nothing but run web apps. Think of the iPad as a version of that tablet with an additional proprietary app store bolted on. The iPad can run any given web app, and it runs them well. The same technology that powers the iPad’s browser also powers the browser for Android devices. And Google’s Chrome. And the new BlackBerry 6 browser. That means that the iPad is part of a huge meta-platform of web apps that run across just about every device out there. Web apps won’t necessarily replace native apps, at least not yet, but they do give developers virtually unlimited freedom.

Screw Flash

But the iPad doesn’t run Flash! So what?

I’ll be blunt. Flash is a pile of crap. I don’t miss having Flash on my iPad, because I don’t even use Flash on my desktop. The Mac OS X version of Flash is slow, buggy, and annoying. I have Flash content blocked by default on every one of my computers, and virtually never unblock it.

Flash is old technology. It belongs in the scrap heap with Java applets and Microsoft’s ActiveX. The future lies in HTML5, a completely open standard not controlled by any one company. Flash is a dead man walking, but Adobe has yet to figure that out.

Now, I could be wrong. Maybe Adobe will get Flash working so well on Android that Apple’s devices will be at a competitive disadvantage because they won’t run all the great apps written in Flash.

And maybe a naked Angelina Jolie will parachute into my backyard with a suitcase full of $100 bills.

Flash is a dying platform that’s being quickly overtaken by better and more advanced technologies. Steve Jobs is right to chuck it out. The App Store does not need a bunch of slow, buggy, third-rate apps that depend on Adobe’s notoriously slow development cycle when Apple updates iPhone OS. Apple’s been down that road before, and they’re not doing it again.

The lack of Flash isn’t a glaring omission from the iPad, it’s a feature. The web will embrace HTML5 long before Apple feels the need to embrace Flash. If Adobe were smart, they’d be embracing HTML5 too. There are enough good and innovative developers at Adobe that they could do it if they’d stop staring into the rearview mirror.

Welcome to the iPad World

The iPad is a revolutionary device. It is just as polished as Apple’s other offerings, and being based on mature technologies, it’s more polished than a first-generation product normally is. It’s a device that once used quickly becomes indispensable. The critics tend not to understand it, and keep trying to compare it to devices that are not comparable. Just as with the original iPhone, the critics will end up owning one or more of them within a few years.

The iPad is the future of computing. The desktop metaphor is no longer the only game in town. Apple is betting their future on the idea that computing will become less about desktops and laptops and more about small devices connected to the “cloud” of internet-based applications. And just like the iPhone, Apple has taken a product category that hadn’t yet had a breakout device and created something that will have everyone else scrambling to catch up. Even if Apple somehow fails (and the one million iPads sold in a month say that’s not going to happen), they have left their mark on the industry. Look at the iPad. That’s what computers of the future will look like.

iPad: The Biggest Tablet Since The Monolith?

So, Steve Jobs has bestowed the iPad upon the world. This is the device that a lot of tech-heads have been predicting for years: the almost-mythical Apple Tablet. This thing’s been predicted before even the iPhone.

What’s In A Name?

The “iPad” moniker was a bad call. Yes, it’s already the butt of jokes. Yes, it falls in line with “iPhone” and “iPod”, but it’s too close to the latter. But then again, a rose by any other name would smell just as sweet, right? Even if the rose sounded vaguely like a feminine hygiene product.

Flash In The Can

I’ve heard plenty of moaning about the lack of Flash. This shouldn’t have been a shock. Apple does not like Flash. It’s proprietary. Flash on OS X performs terribly. For a lengthy take on why the iPhone/iPod touch/iPad will likely never support Flash, see John Gruber’s piece on Apple, Adobe, and Flash.

The other big question is why does the iPad need Flash? To view video — it already does that, and with better performance than Flash. Yes, it doesn’t view all web video, but as Apple’s multitouch devices continue to proliferate, I’m guessing a lot of sites will abandon Flash rather than abandon those devices. (And yes, that includes the porn sites that are probably the reason many want Flash on the iPad…)

To play web games? For one, Apple offers plenty of games through the App Store. Not only that, but many Flash games wouldn’t even work on a multitouch device — especially anything that needs keyboard input. Flash games would suck on multitouch devices.

For ads? The fewer obnoxious ads, the better.

For more interactive web pages? The real solution would be to embrace open web technologies like HTML5, CSS, and JavaScript. Those technologies aren’t controlled by one company, unlike Flash.

Winners And Losers

The biggest losers in all this could very well be Amazon, Barnes and Noble, and Sony. They’ve all heavily invested in e-reader devices, and the iPad makes a lot more sense than those devices. E-Ink screens are nice, but if the iPad makes for a good enough reading device, it won’t matter.

The saving grace for them is that they have the opportunity to create their own reader applications for the iPad. (I’m guessing that both the Kindle and Barnes & Noble reader applications for the iPhone will work on the iPad.) I’m guessing that Amazon sells the Kindle hardware at a loss, in the hopes of making up the difference in book sales. Does Amazon care whether they sell books on the Kindle or the iPad? Probably not. The question is whether Apple cares that third-parties are selling books on their platform. I’d wager they don’t care — Apple isn’t in the publishing business, they’re in the hardware business.

The winners are probably publishers. The iPad gives them some great opportunities to have e-books proliferate in the same way that multitouch apps have. That’s a win for an industry that’s facing some very bad times.

Looking Ahead

Apple is heavily invested in multitouch, and the iPad is just another example of that. It’s an opportunity to fundamentally transform computing. These devices abstract away old concepts like file systems and a hierarchy of folders. The old metaphors can finally be swept away: no more folders, no more mouse cursors, no more file managers, not even windowing systems. This is the face of 21st Century computing: and Apple is setting the trend.

The iPad is just another device, one of the first in a long series of devices. It’s likely to be extremely popular, and is very well designed. But ultimately, it reaches beyond that: this is about redefining the way we use computers. Apple has paved the way, and while others are trying to catch up, the iPad proves they’re still running one step ahead.

UPDATE: John Gruber observes a point I missed: Apple now makes their own blazingly-fast mobile processors. Apple’s acquisition of chipmaker P.A. Semi seems to be paying off. Apple is a hardware company at its core, so designing their own chips is a wise move.

The Lost Moon

Yesterday was the 40th anniversary of the launch of Apollo 11, culminating in the first human footsteps on the Moon.

Charles Krauthammer has a deeply thoughtful piece on the Moon we left behind:

But look up from your BlackBerry one night. That is the moon. On it are exactly 12 sets of human footprints — untouched, unchanged, abandoned. For the first time in history, the moon is not just a mystery and a muse, but a nightly rebuke. A vigorous young president once summoned us to this new frontier, calling the voyage “the most hazardous and dangerous and greatest adventure on which man has ever embarked.” We came, we saw, we retreated.

That we ascended to the stars, only to turn our backs on them, shows just how foolish our society can be.

Apollo was probably unsustainable, but had we allowed space to be another place where human creativity, ingenuity, and daring could have thrived rather than a sterile “commons” visited only by state actors, our present could have looked much more like the future depicted in 2001.

If an alien race were to come to Earth and see what we have done—or not done—in the past 40 years, I doubt they’d understand it. How a civilization can pull back from such a dazzling achievement would be beyond the understanding of any rational creature.

China Invests In Pebble-Bed Technology

Next Big Future reports on a joint Chinese-South African project to advance pebble bed reactor technology. Pebble bed reactors are an advanced type of nuclear reactor design that promises to be significantly safer than conventional designs, for more details see here.

One of the reasons I’ve said that the future may well belong to the East is because the Chinese are willing to invest in this kind of technology while Western governments are too motivated by short-term political pressure to invest in projects such as these. The only way we will be able to meet the energy needs of the future and preserve the environment is to start moving towards nuclear energy. The truth is that wind, solar, geothermal, and other “green” technologies cannot produce enough power to meet our needs. They may be supplements to a nuclear infrastructure, but they will never supplant it.

If President Obama wished to be truly forward-looking, he would commission a similar program in the United States. For all the talk about the “Republican war on science,” the Democrats remain in thrall to an environmental lobby that wants to push for forms of alternative energy that will never be able to meet America’s needs. So instead, we keep our inefficient fossil fuels and push for stopgap solutions like “clean coal” rather than investing in an energy infrastructure that truly meets the needs of the 21st Century.

Pebble bed reactors promise a safer, cleaner, and more plentiful form of energy for America and for the world. If we are to remain a superpower into the 21st Century, we cannot turn our back to advances such as this. We cannot let the stigma of the word “nuclear”—and the irrational fear it engenders—stand in the way of our future.

Hat tip to Glenn Reynolds for the link.

Watching The Watchmen

This weekend, I managed to catch Zack Snyder’s adaptation of the “unfilmable” graphic novel Watchmen. I didn’t read the book until after the movie had been announced, and while it is unquestionably a good story, it is more than a little dated. This review will avoid spoilers except in the most general sense, for those who have not seen the movie.

As to the story, one reviewer notes why it has not worn well:

Watchmen’s brand of dystopian misanthropy has been specifically refuted by events. It’s one thing to worry about the evil U.S. policies of containment and mutually-assured destruction in 1986. It’s one thing to paint a particular political party as being unconstitutionally obsessed with the possession of power and recklessly in pursuit of nuclear confrontation with an enemy who probably wasn’t so bad.

But as it turns out, that entire worldview was vitiated by events. In 1989 the Berlin Wall fell and the Cold War ended. Reagan’s strategic policy decisions vis-a-vis the Soviet Union were completely vindicated. MAD proved to be an effective deterrent. The conflict between the East and West was settled without a shot being fired. And, perhaps most importantly, the Truman/Kennedy/Reagan view of communism as an insidious ideology which led to violent, repressive authoritarianism was borne out.

So Moore was wrong. His fears were wrong. His warnings were wrong. His fundamental view of the world was wrong. And ‘Watchmen,’ in particular, is left as a bizarre cultural artifact. A pretentious piece of commentary masquerading as philosophy.

That being said, how is the movie?

It’s okay. It isn’t a great movie. It isn’t even the greatest comic-book adaptation out there. However, it is very ambitious, and very well done.

The good: Jackie Earle Haley’s Rorschach is an amazingly well-done character. Rorschach is not supposed to be all that sympathetic. He’s a psychotic vigilante with a brutal streak. Yet he’s arguably the most sympathetic character in the whole movie, because he has the most realistic motivation. We don’t really connect with the other characters in the way we connect with Rorschach, and that is due not only to the blandness of the others but also to Haley making him such a complex and multi-faceted character.

Billy Crudup’s Doctor Manhattan is fascinating to me. What if someone became what was essentially a god? Crudup plays the character (through CGI) as someone who is torn between his humanity and his incredible powers. The problem is that the CGI is not quite realistic enough to make it all work. The facial expressions don’t always work, especially the mouth. Crudup does a good job with what he has to work with, and the idea of the character is fascinating, but the execution is not up to what it could have been.

Jeffrey Dean Morgan also did a good job as the murderous thug, The Comedian. The problem with his character is that his emotional arc never goes anywhere. Part of that is the fault of the story, where his sudden conversion is more about serving the plot than anything else. However, when Morgan is on the screen, he’s got charisma, and even though The Comedian is such a terrible person, you almost want to root for him.

The bad? The old-age makeup is atrocious. It rings false for some reason that isn’t quite quantifiable. Carla Gugino’s older self in the movie comes off as a cliche, even though she’s a talented actress.

There is a sex scene that is absolutely cringe-worthy. Making a nude scene with a comely Swedish model boring takes a special kind of incompetence, even though Snyder is a talented director in many other respects. The emotional core of the movie is lacking because Snyder (much like George Lucas) is great with fight scenes and effects shots but doesn’t have the same fluency with actors. It is in many ways harder to frame and edit a shot of two actors talking than to blow up New York. Many, if not most, directors never fully master that, and Snyder has yet to learn how to be subtle. Hopefully he will as he grows as a director.

Malin Akerman has her moments, but is still rather wooden. Part of that is due to Snyder not knowing how to bring out an emotionally resonant performance from her. But still, she never quite hits the mark.

Patrick Wilson as Dan Dreiberg/Nite Owl gives a performance that falls just shy of great—and again, I blame Snyder for this. He plays the washed-up old superhero finding his true place fairly well, but never quite connects. He’s probably the most accessible character in the movie, and the most realistic. We want to connect with him, but his low-key performance seems to get overwhelmed by the scale of the movie. Still, he isn’t bad by any stretch of the imagination, and his character has his moments.

Matthew Goode as Adrian Veidt/Ozymandias also fails to hit the mark. He’s supposed to be the world’s smartest man, but comes off as a bad David Bowie impersonator. We never really get a sense of his guile, and at one point he states that “he’s not some comic book villain.” The problem is that by that point, that’s exactly how he comes across. I never thought the Veidt character was well-developed in the book either, but here he’s just another part of the plot.

Fans of the comic note that the ending has been rather significantly changed. I’ve mixed feelings about that—the original ending would not have worked well on film. Yet at the same time, the new ending falls a little flat. It plays well into some of the themes being developed in the film, but it still lacks punch.

All-in-all, Watchmen never becomes much more than the sum of its parts. It’s too cool, too sterile, except for when Rorschach is on the screen. There’s no emotional heart to the story, and the big emotional moments fall flat. Snyder is a great action director, but he doesn’t yet have the skills to bring out the subtleties in the piece.

The problem with the story’s politics is also there. It’s impossible to uproot it from Cold War paranoia, yet the Cold War is a distant memory for its audience—and many of them never lived through it at all. Watchmen’s alternate reality not only assumes the existence of costumed vigilantes, but also a Cold War radically different from the one that actually happened.

Watchmen is not a bad movie; it’s more than serviceable. But it never crosses the line into being a great movie the way The Dark Knight does. Snyder did prove that Watchmen is not actually unfilmable, but it stops just short of being great.