Saturday, February 14, 2009

And now for something completely WordPress

I officially shifted the blog to WordPress hosting -- which you'd know if you were checking it via DorkmanScott.com! For shame.

Anyway, that's where I plan to post new posts from here on. All posts and their comments to date have been transferred over, so it'll be like nothing changed, aside from the look. See you there!

Thursday, February 12, 2009

Happy Darwin Day!

Today marks the 200th anniversary of the birth of naturalist Charles Darwin; this year also brings the 150th anniversary of the publication of his book On the Origin of Species (November 24, if you want to mark your calendar). Darwin did not create the idea of evolution, which dates back to the Greeks; what he postulated was the mechanism by which it occurs: natural selection.

Despite the efforts of certain people to discredit evolutionary theory, because of a vested interest in Bronze Age superstitions that cannot be reconciled with observable fact, our understanding of evolution forms the cornerstone not only of modern biology, but resonates throughout all of the natural sciences, from chemistry to archaeology to anthropology to medicine.

If evolution weren't true, none of the modern medications or antibiotics we have would work. But they do, because it is.

There was a lot that Darwin didn't know about -- he had no knowledge of what we today call genetics, for example. Although DNA would be discovered during his lifetime, its implications in relation to natural selection would not be understood until the mid-20th century. With all the advancements in knowledge we have made in the last century and a half, Darwin would likely find much of the current work being done in the field of evolutionary biology completely mystifying.

But through 150 years of science, evolution by means of natural selection has been confirmed and strengthened by every new discovery. There are certainly conceivable discoveries that could be made that would render the theory invalid, or at least inadequate to explain them; but such discoveries have, to date, never been made. Anyone who tells you that evolution is a "theory in crisis" is either ignorant of the facts, or lying about them.

Ars Technica has an article about appreciating evolution. I'm sure you will find many more out there to enjoy and learn from.

Today, by the way, also happens to be the actual 200th anniversary of Abraham Lincoln's birthday. So props to him too, though the official birthday celebration has been paired with the celebration of Washington's birthday and gets everyone not working in post-production a day off on Monday.

Wednesday, February 11, 2009

Sony Releases New Piece of Shit that Doesn't Fucking Work

Brilliant, biting, and like all good satire, way too fucking true.

Where the Wild Things Are -- Wild Thing Pics

Slashfilm has some images from the Spike Jonze film adaptation of Where the Wild Things Are, showing for the first time the eponymous Wild Things.

WTWTA is a classic children's book -- I doubt there are many of my generation or younger who don't have a strong affection for it. And everything I've seen and heard about the adaptation sounds fantastic. There was a brief scare after Jonze turned in his first cut -- it was deemed too scary for kids and too weird for adults. This worried the studio. But it excited me.

Even though WTWTA is an extremely short book (only 48 pages, and averaging a single sentence per two-page spread), it sounds like Jonze used it as a jumping-off point to create what could be an equally classic, timeless film. Bizarrely, even though there is limited source material and I expect (and even desire) deviations, my expectations are higher for this than for any other adaptation in recent memory.

Loved Jonze's work to date. He's just weird enough to pull this off. Fingers crossed that, come October 2009, it turns out to be what I hope it is.

Monday, February 09, 2009

New Job Joys

So as a brief note, I've landed a full-time visual effects gig for the next couple of months with Digiscope. I can't say yet what I'm working on, I'm afraid, but I think I will be able to by the time the trailer hits.

This isn't my first experience working at an FX house. I had a brief stint at Glowgun at the end of last year working on Feast 3 (which I assume is safe for me to say since they've already added it to my IMDB profile). But this is the first time I'm working on a high-budget, high-profile movie that will see a wide theatrical release. And again, I can't say more than that right now.

I've got a ten-hour workday and I've still got a bunch of personal projects that need my attention in the off-hours, so while I'm not suspending blog activity, I'm going to shift the focus for a little while. Instead of the longer, more opinion-driven posts, it'll probably be more re-blogging. YouTube videos, news articles, stuff like that, with maybe a very brief commentary. That's aside from Secular Sunday posts, which will still be relatively comprehensive.

So I'll still be posting frequently -- possibly even more frequently than before, since the posts will be brief. And stay tuned for when I can actually say what I'm doing here!

Saturday, February 07, 2009

Secular Sunday: Atheist Q&A!

So I actually can't find my copy of Case for a Creator right now. I picked it up to have it with me to write this week's entry and now I can't remember where I put it, and with a new job taking up ten of my daily waking hours, I actually don't have a lot of time to look for it.

It probably sounds like I chucked it in the bin, but I didn't -- I'd tell you if I did. So please, nobody send me another one. I will find it.

But this week, we're not falling far from that tree, because I'm still going to talk about a Lee Strobel topic. Lee Strobel was asked some questions by an atheist, and he answered them.

I may address his answers another time, but I am led to think that maybe I've been a little harsh on the guy. I've accused him of intentionally obscuring or distorting the truth, but it appears quite possible that he really just has poor critical thinking skills, no doubt atrophied from years of disuse. It seems like he may honestly believe that the things he writes and relays in his books really are logically sound.

To paraphrase Gandalf: a fool he may be; but perhaps, at least, an honest one.

But as I said, that's not what I'm going to post about today. In response to the atheist questions posed to him, he and some of his apologist buddies came up with some theist questions they would like to hear answered by atheists. Other atheist blogs have addressed them, but I thought I'd take my own crack at it.

What I say is not the "official atheist answer," as no such thing can exist. Atheism has no tenets or dogma and thus cannot have an "official" position other than the non-belief in gods. These are only my responses to these questions.

By the way, some Harry Potter spoilers slipped in below by way of comparison. If you haven't read the books, particularly the last two, then you should have by now, but I'll still tag the spoilers in case you want to avoid them.

Christian apologist Mike Licona: "What turns you off about Christianity? Irrespective of one's worldview, many experience periods of doubt. Do you ever doubt your atheism and, if so, what is it about theism or Christianity that is most troubling to your atheism?"

Licona's first error, of course, is in assuming that the only alternative to atheism is Christianity. I might ask him what "turns him off" about Buddhism, Hinduism, or Islam. Perhaps he would give reasons that regarded the behavior of certain of those religions' adherents, but ultimately I think it would come down to "I just don't buy what they're selling." As the saying goes, we are both nonbelievers in Apollo, Thor, Mithra, Shiva, and the Flying Spaghetti Monster, among thousands of others. I only take it one God further than he does (or three, depending on your perspective).

My issue, first and foremost, is not that Christianity has "turn offs." It is that theism in general lacks sufficient evidence to indicate the existence of any god, much less any one(/three) in particular.

Though I have both experienced theistic belief firsthand and continue to research it, I have yet to come across any evidence that has "troubled" me with regard to my current lack of belief. I would be more than willing to acknowledge such evidence, should it ever be presented, but I'm not holding my breath.

Don't get me wrong, there's a lot about what's written in the Bible that I find repulsive, and I'm pretty sure from a literary standpoint that God is actually the villain of the story. And there's a lot about the intolerance and arrogance that Christianity has a tendency to engender in its followers that "turns me off." And I think that it damages critical thinking skills, and does not allow for sufficient questioning or doubt. But none of that has anything to do with the reason I don't believe in it.

Christian philosopher and apologist William Lane Craig: "What's the real reason you don't believe in God? How and when do you lose your faith in God?"

Well, first of all, I object to the way this question is phrased. Asking for the "real reason" implies that I have or would give a "fake" one.

That aside: I don't believe in God because I have not been shown any compelling reason that I should. It's the same reason I don't believe in unicorns, faeries, goblins, or Lord Voldemort.

The second question is equally presumptuous, as it assumes that the atheist being questioned has ever had faith in any god in the first place. It happens to be true in my case, but that doesn't change the fact that it's a very loaded question.

I've already written my answer to this, but the short version is that I lost my faith in God when I went looking for evidence to strengthen my faith and sharpen my apologetic skills, and found at every turn that none existed. And so I was forced to determine -- against my heart's desire, at the time -- that God, too, most probably did not exist.

Author and Christian pastor John Ortberg: "How can you create a meaningful life in a meaningless universe?"

My question in return is: how does the meaningfulness of the universe impact the meaningfulness of one's own life?

Sure, millions of years from now, not only will I be long gone, but the entire human race will be gone. There will be no one left to remember my name or my deeds, and the universe will continue to do what it does as if humanity had never existed. But that's true whether God exists or not, isn't it? My life doesn't mean anything to the dust of Mars, whether there is a God or isn't.

Does that really preoccupy anyone on a day-to-day basis?

Quite honestly, I think life has more meaning when that meaning is ours to determine and create, rather than just fulfilling a grand "plan" in which our every action is already anticipated and accounted for. Where is the meaning there, when your part to play is given to you by some outside entity rather than self-determined? How is this life "meaningful" when it is supposedly the lesser of the two lives one will live?

I create a meaningful life by making use of my life to improve and enhance the lives of those around me. It is fleeting for all of us, and that should make us more determined to make it as enjoyable as possible. Meaning is whatever we make of it.

Resurrection apologist Gary Habermas: "Utilizing each of the historical facts conceded by virtually all contemporary scholars, please produce a comprehensive natural explanation of Jesus' resurrection that makes better sense than the event itself. These historical facts are:

-Jesus was killed by crucifixion
-Jesus' disciples believed that he rose and appeared to them
-The conversion of the church persecutor Saul
-The conversion of the skeptic James, Jesus' half-brother
-The empty tomb of Jesus

These "minimal facts" are strongly evidenced and are regarded as historical by the vast majority of scholars, including skeptics, who have written about the resurrection in French, German, and English since 1975. While the fifth fact doesn't enjoy quite the same universal consensus, nevertheless it is conceded by 75 percent of these scholars and is well supported by the historical data if assessed without preconceptions."

That's a lot of assertion, and a lot of bandying-about of the word "fact" without actually backing it up.

Who are these "contemporary scholars"? I want names, and the reasons that they "concede" these "historical facts." Better yet, skip the appeal to authority and just tell me what the evidence is that makes those assertions "facts." As far as I can tell, they are not facts at all, just a semantic ploy. Borrowing from another atheist blogger's answers, I will rephrase your "facts" in the form of questions, because that's really what they are: questions to be answered.

Was Jesus killed by crucifixion? Skipping the questionable nature of the very existence of Jesus at all (and yes, it is questionable), it is reasonable to believe that he might have been crucified. It was an actual method of execution, so it is not outside the boundaries of possibility that a rabble-rouser named Jesus was executed by means of crucifixion.

Did Jesus' disciples believe that he rose and appeared to them? Again granting that he existed at all, sure. His followers may very well have believed that Jesus rose and appeared to them. But the Aztecs believed that human sacrifice made the sun rise. Scientologists believe that our bodies are filled with alien ghosts. Just because a person or group of people believe something does not mean that it is true.

Did the church persecutor Saul convert to Christianity? Once again, the first assumption is that such a person ever existed, although admittedly it is likely that he did. Saul of Tarsus may genuinely have converted to Christianity, and may genuinely have believed that he met Jesus on the road to Damascus. But again, because someone believes something does not make it true. As no one was with him when he had his vision, it seems perfectly possible that he hallucinated the experience. He was walking in desert heat, maybe he got sunstroke. That he genuinely believed it happened does not mean it really happened.

My pet theory, on the other hand, is that Saul realized that he could benefit much more by conning believers than by killing them. Paul realized he could make some serious cash off the whole "tithing" thing if he got in at the high levels of the church, which he did. He's also the one who invented the notion, out of thin air, that Jesus' salvation applied to Gentile as much as Jew. Sounds like he was trying to add more members to swell the coffers.

I have no evidence of that, but it's certainly a "natural explanation...that makes better sense than [resurrection]."

Did the skeptic James, Jesus' half-brother, convert to Christianity? As I hope is clear by now, I find this point irrelevant. Doesn't make it true even if he did.

Was Jesus' tomb empty? So what if it was? The best and most sensible explanation you've got for a dead body not being where it's supposed to be is that it un-died? Grave-robbing is a perfectly reasonable explanation for this, if in fact there even was a Jesus and if in fact there even was an empty tomb. I might as well say that [HARRY POTTER SPOILERS]Dumbledore's cracked tomb is evidence of Voldemort's return[/SPOILERS]. If we can't even establish the existence of the tomb, much less its empty or cracked state, then it's fatuous to claim that its emptiness is evidence of supernatural events.

In fact, of all explanations for all the so-called "historical facts," even if their historicity was totally undisputed, the resurrection explanation is the one that makes the least sense, is the least reasonable, and has the least evidentiary support.

These "minimal facts" are strongly evidenced and are regarded as historical by the vast majority of scholars, including skeptics, who have written about the resurrection in French, German, and English since 1975.

Name them, and their writings. Don't just say "there's lots of them, srsly." Doesn't fly.

Christian philosopher and apologist Paul Copan: "Given the commonly recognized and scientifically supported belief that the universe (all matter, energy, space, time) began to exist a finite time ago and that the universe is remarkably finely tuned for life, does this not (strongly) suggest that the universe is ontologically haunted and that this fact should require further exploration, given the metaphysically staggering implications?"

"And, second, granted that the major objection to belief in God is the problem of evil, does the concept of evil itself not suggest a standard of goodness or a design plan from which things deviate, so that if things ought to be a certain way (rather than just happening to be the way they are in nature), don't such ‘injustices' or ‘evils' seem to suggest a moral/design plan independent of nature?"

Well, to the first question. I'm not sure that the belief that all matter and energy began a finite time ago is "commonly recognized." Certainly the universe as we know it had a "beginning," which we call the Big Bang, but it was not a sudden creation of matter -- just a sudden expansion. It represented a change in the state of matter and energy, but not necessarily the beginnings of them.

I'm sure the "fine tuning" argument will come up later in Case for a Creator, so I won't go into it now, but I have a counter-question: if the universe is so "finely tuned" for life, why is there remarkably little life in the universe? Why is so much of the universe hostile to life as we know it? The vacuum of space does not support life, nor does any other planet of which we are currently aware. Even our own planet has large swaths of its surface that are hostile to life. It's a bit like finding a single silver atom in a 20 ton granite boulder, and saying that the boulder was "finely tuned" for silver.

A universe "finely tuned" to support life should presumably be teeming with it. It seems to me that life as we know it has finely tuned itself to survive within the constraints of this universe, rather than the reverse.

As for the question of evil, it's an easily observable fact that there is no universal morality or concept of evil that transcends boundaries of culture. We believe the actions of Muslim terrorists are evil; they in turn think the same of our actions. Who is right?

Everyone defines evil in their own way, and cultures create a consensus, one that can shift drastically (see, for example, the shift in Western culture from considering homosexuality "evil" to merely "undesirable," and now very nearly to "acceptable").

The notion of something being "bad" or "wrong" is not remarkable when each culture, and each individual, ultimately defines it for themselves.

Radio host Frank Pastore: "Please explain how something can come from nothing, how life can come from death, how mind can come from brain, and how our moral senses developed from an amoral source."

Okay, one at a time:

How does something come from nothing? Atheists aren't the ones that say it does. Theists are, and they have no answer for how other than "magic." In the beginning God created etc.

I happen to think that all the matter in the universe has always existed in some form. So I can't answer the question because I don't believe the assertion I'm being asked to defend.

How can life come from death? Life doesn't come from death. Life, as we define it, comes from natural chemical processes that occur in various reproductive cycles.

How can mind come from brain? Dunno how. It's a fascinating question currently without an answer.

But despite the fact that we don't yet know how it does, we do have strong evidence indicating that it does. With MRI and other scanning technology, we can see brain activity occurring when a person engages their higher functions of thought and reasoning, and the areas of the brain triggered have a consistent correlation with the types of thought processes occurring. And we have plenty of documented cases in which brain damage has drastically altered a person's personality and thought patterns (aka what we would call "mind").

How did our moral senses develop from an amoral source? This is a question that would be done a disservice with a short blog answer. Entire books can be (and have been) written on the subject, and I suggest you look into them for a more comprehensive answer. But for the sake of the Q&A, the short-to-the-point-of-oversimplification version is that humans are pack animals, a cooperative species. In our evolutionary past, we would have survived better working together than against each other, and so it would have benefitted us as a species to evolve a sense of how to get along with each other. Hence what we call "morality."

Not to mention basic empathy. There's nothing mystical about "I don't want it to happen to me, so I won't make it happen to others."

Christian apologist Greg Koukl: "Why is something here rather than nothing here? Clearly, the physical universe is not eternal (Second Law of Thermodynamics, Big Bang cosmology). Either everything came from something outside the material universe, or everything came from nothing (Law of Excluded Middle). Which of those two is the most reasonable alternative? As an atheist, you seem to have opted for the latter. Why?"

The first question implies that there is a "why," and also that "nothing" being here is even a possibility, neither of which we have reason to claim are or could be the case. As I mentioned above, just because the universe as we know it is not eternal, does not mean that the matter comprising the universe is in some way finite. On what basis would you expect there to be "nothing" here?

The "two" options are not only a false dichotomy -- they are actually saying the same thing. If everything has to come from somewhere, then the "something outside the physical universe" had to come from somewhere. Or else it came from nothing. So if you believe that something outside of the universe created the universe, you're still stating that everything came from nothing, you're just pushing that "nothing" back a step. What's the point of that?

Neither of the two options presented is particularly reasonable, and as a result, I have not "opted for" either one.

What about the third option, that the universe has always existed? That's the answer you would give to "where did God come from," isn't it? "He's always been there." So why can't that answer be true of the universe, and just skip the tacked-on anthropomorphized "cause"?

Well, that was fun. Presumably back to Case for a Creator next week.

Wednesday, February 04, 2009

Continuing my education: Casablanca

I didn't go to film school. I've stated my thoughts on film school many times -- though never on this blog, so I probably should do a post just to have it on the record -- but the short version is I think it's a vestigial concept that was useful once but is now generally a waste of money, at least in terms of writing or directing. (I think programs for the more technical skills like editing, cinematography, sound and VFX are still useful because there's an element of learning the technology hand-in-hand with the technique, but that's a caveat for another day.)

My point is, for better or worse, what I know about films and filmmaking is largely self-taught. I've always believed that in order to make movies, you have to watch movies, and if there's anything in my life about which I could be considered "religious," it's probably that. I'm fortunate to have been born in the age of the video rental, and my grandparents kept a membership to a Mom & Pop video store through the 80s and 90s, just for me. My late grandfather had two VCRs, so that he could dupe the rental to another tape, and I could watch the movies over and over without having to re-rent them.1

Still, my movie-watching education has been a curriculum of my own devising, and I have many "classic" films yet to check off the list. See, for example, my being so late to the party re: The Godfather. So here's the latest in what I consider the lifelong continuation of my film education: Casablanca.

So what did I learn?

Well, just to have it said: I liked it a lot. A few re-watches and I might even love it. Well-structured, moved along at a good pace without feeling rushed. And my God, the dialogue! No wonder this movie is so oft-quoted. It's pretty theatrical, even corny sometimes -- and according to IMDB and Wikipedia, the screenwriters themselves acknowledged as much -- but somehow it just all works.

Some of the exchanges between Rick and the less scrupulous denizens of Casablanca are particularly sharp and witty, though like most of the Golden Age films, it has a certain quality to it that initially makes it feel like "very good writing" rather than "believable conversation." Part of it is no doubt the fact that every line had to be very carefully constructed to get approved under the Hays Code. Part of it is the fact that film was still fairly new, and most writers were playwrights first and had a more theatrical style.

But part of it, I think, is outside the writing and in the performance: performers were essentially studio property, loaned out and traded like baseball cards, generally lacking any passion for the projects they did. They just showed up and read the lines for their day, and it has the (pleasantly) odd, workmanlike, even formal feel that I've frankly come to appreciate about the old films.

They're almost like feature-length demonstrations of the Kuleshov effect: films that give you, as an audience member, very little in the way of emotional inflection, and yet, through the things the characters say and do, stimulate your own emotions and even cause you to project emotions onto the characters who are otherwise not expressing them.2

There's not a lot of screaming, or sobbing, or contorting of faces, or even moving much faster than a brisk walk. But in that stark simplicity -- whether it was by design, or simply an inevitable result of the assembly-line nature of Golden Era filmmaking (and I suspect the latter) -- they created a canvas with strong outlines, leaving it to the audience to complete the picture. And engaging the audience, as I said just recently, is a crucial factor in whether or not they give a damn about the movie while they're sitting in their seats, to say nothing of when they leave the theatre. I can definitely see why people might prefer this style to the more raw, even if perhaps more "real," style of acting today.

A modern disciple of this style would be David Mamet, from whose book On Directing Film I apprehended "inflection" as a filmmaking term (though I don't know for certain if he coined it). He generally seems to allow a single character a single emotional outburst in his dialogue-heavy films; otherwise the performances are very calm, restrained, and straightforward. Yet in that calm, because of the situations he creates, there is somehow anger; or fear; or arrogance; uncertainty; desperation; triumph -- all created by juxtaposition of the words and images, rather than by the actors actively emoting. And because the emotions are not expressed, they are breathtakingly potent, boiling beneath the surface. (I still need to do my review of last year's phenomenal Redbelt, in which I can get into this more deeply.)

This is also, I would venture a guess, why many people of my generation and younger find these kinds of movies "boring." They are not used to having to do part of the storytelling work; they are used to taking emotional cues from the actors' faces and tone of voice rather than their choice of words. It is a different kind of movie, a more challenging kind of movie for both the audience and (if he's doing it intentionally) the filmmaker. It relies on both sides of the equation placing a certain degree of trust in the other, and a certain degree of effort into the creation of something that can become intensely personal to everyone who views it. And it keeps the movie alive and relevant, because part of its vibrancy comes from the viewer, not from the screen.

In short, I learned for myself why Casablanca is a classic. I'm glad that I watched this before many other films, because a less-quotable film may not have caught my attention and helped me understand my role in the filmmaking experience -- both expanding it as a viewer, and perhaps contracting it when I'm in the hot seat. I think I will appreciate films of the era more, having experienced Casablanca first, and I do think it will improve my own filmmaking. It deserves its place in the pantheon of great films.

Instead of closing out this post with one of the old standard quotes -- "looking at you, kid," "beautiful friendship," etc. -- I'll go with one that Peter Lorre's character says in his one scene in the film (though I bet they used his name prominently in the advertising).

I use it because I think it relates perfectly to the odd way that the actors' and filmmakers' dispassionate, occasionally even hostile relationship with Golden Age material could sometimes, somehow, produce films of profound, lasting truth and power:

You know, Rick, I have many a friend in Casablanca. But somehow, just because you despise me, you are the only one I trust.



  1. That's right, my grandfather was pirating movies decades before it was cool. He even duplicated the FBI warning telling you not to do that, the deliciousness of which I appreciated when I got older.

    On a footnote side-note: despite my grandfather being perfectly capable of understanding how to daisy-chain two VCRs together and have one record to the other, while still outputting the movie to the television so I could watch and dupe simultaneously, the clocks on both VCRs flashed 12:00 for the entirety of my formative years.

  2. The Kuleshov effect technically only refers to two images juxtaposed, via editing, without inflection, or any sort of visual "editorializing" telling you what to think; and yet their juxtaposition creates an implied statement or emotion. But I would say that this applies equally to two lines of dialogue presented in the same manner.

Tuesday, February 03, 2009

Bale-istic Remixes

Welcome to this wonderful thing we call the web. Within 24 hours of Bale's outburst it has already been remixed several times.

My favorite so far is Christian Bale vs. Bill O'Reilly.

And from master celebrity re-mixer RevoLucian, who during the campaign season brought us Sarah Palin Remixed, we now have Bale Out.

Turn-around is fast on these things.

UPDATE: RevoLucian has put up a link to download a high-quality MP3 of Bale Out. Get it now!

I'm with Bale

Apparently there was a bit of a row on the set of Terminator 4 last summer. Christian Bale blew his top at the DP, apparently for walking onto the set during a take. And now TMZ, that perfect encapsulation of our celebrity-obsessed culture, has the audio.

First impression: that audio is really clean.

Second impression: I'm siding with Bale.

Now, I wasn't there. I don't know what really went on. And I know a lot of people hear that audio and think "Holy shit, what a fucking diva asshole. I would never work with that dick and I've lost a ton of respect for him." But quite honestly, I disagree.

Acting is hard. If you want to do it well, it's hard. You've got to live in the space, you've got to really believe everything you're saying every moment that the camera is on you. Worse yet, in the film world, you have to believe it in five-minute chunks, aka takes. You've got to know your lines, take direction, make sure you're made up, make sure you hit your marks, and maneuver around the lumbering apparatus that is a shooting crew -- while simultaneously looking like you aren't doing any of that at all. While looking like the character you're playing is a real person in a real situation.

Gary Oldman has lamented the fact that the crew gets to take hours to do their jobs, and yet he's expected to show up, say his lines, get it right the first couple of times and move on. Everybody always makes a big thing about how an actor in a biopic -- whether it be Carrey in Man on the Moon or Langella in Frost/Nixon -- never breaks character while on set. The fact is that this is the only way they can be sure they're doing their job properly. If they don't commit themselves wholeheartedly to respectful personification, it is far, far too easy in the staccato world of production to just fall into impersonation. The actor's job is to forget that they are acting, so that you, in turn, can also forget that they are acting. Sometimes, with very complex characters, that means they have to never acknowledge they are acting as long as they are on the set, or else they will not be able to maintain the character's reality before the cameras.

On a big-budget picture, I imagine the pressure is immense. There's an awareness that you are burning cash at a terrifying rate just by standing there. You've only got the brief period between "action" and "cut" to actually focus in and put yourself in the world. You need to use that time to immerse yourself in the fantasy world of the film. Because you care about doing a good job, doing the best job. The quality of your work matters to you no matter what the project, so within the limitations you are going to cast everything out of your mind, and just be in that other world, with everything you've got.

So imagine that, in that brief period between "action" and "cut" that is yours -- that moment when you need everyone to disappear from your awareness so you can be that character -- the DP goes wandering onto the set, right in your line of sight, right in the middle of the scene. And he thinks it's okay because the camera can't see him.

He's showing total disrespect for your craft, to the extent that he doesn't even seem to acknowledge that your work is important -- doesn't acknowledge that you're even working at all. Listen to his excuse -- he's "checking the light." You can't wait for "cut," guy? You can't just look at the frigging monitor?

If you're an actor that cares about the quality of your work, how do you NOT go apeshit over something like that?

David After Dentist



I saw this kind of behavior more than once at/after parties while I was in college. It's much, much cuter and funnier when it's a seven-year-old tripping out on anaesthesia, though.

Sunday, February 01, 2009

Secular Sunday: Day of Rest

No post today, sorry. I've got a bunch of other things I'm in the midst of writing, and some FX work for both Sandrima Rising and another project I'm not allowed to talk about yet. Yesterday's post on evolution will have to suffice in its stead.

I won't make this a habit, and in fact I've discovered the function that lets me write a post but publish it much later; in future I'll do my best to make sure I've got the post written before Sunday.

So for today, enjoy your Super Bowl, or your To Catch a Predator marathon, or whatever it is you do today.

Saturday, January 31, 2009

Newsweek drops the ball

What the hell, Newsweek. You act like you want to be taken as a serious journalistic source for, you know, news. And yet you go publishing this utter nonsense about a supposed re-emergence of Lamarckist theory?

The short version of Lamarckist theory is that it is "evolution via acquired traits." The article gives a good example of the theory: giraffes' necks got longer because short-necked giraffes stretched their necks to reach the trees, and their offspring were born with the longer necks. This is tantamount to claiming that if a very light-skinned person spends a lot of time in the sun and gets tan, he or she will have tan-skinned offspring. The theory is obviously absurd to us today, but before Darwin hit on natural selection, the jury was still out on how exactly evolution took place (although the fact that it did take place was not generally under dispute).

The explanation of Lamarckism is about the only thing this article gets right.

First of all, what the hell is this Sharon Begley character doing writing Newsweek's science articles? A glance at the other articles to her credit turns up "Can God Love Darwin, Too?" and "Science Finds God." I'm sure that's just a coincidence.

She starts this article with the following:

Alas, poor Darwin. By all rights, 2009 should be his year, as books, museums and scholarly conclaves celebrate his 200th birthday (Feb. 12) and the 150th anniversary of "On the Origin of Species" (Nov. 24), the book that changed forever how man views himself and the creation.

That's right. Not the universe. The "creation."

Oh, but it gets better. From Begley's biography:

Sharon Begley, widely known for her ability to break down complex scientific theories and write about them in simple prose, returned to Newsweek in March 2007 from the Wall Street Journal, where she wrote the "Science Journal" column for five years.

This woman is supposed to be an actual science writer. It would seem that by "breaking down complex scientific theories," the writer of this biography (no doubt Begley herself) means "completely misunderstanding and misrepresenting complex scientific theories, so that they can be explained simply and reach the conclusion she desires."

Though granted, I'm only going off one abysmal article. Maybe her others do better, but I haven't read them so I can't address them. Let's talk about the current article.

Some water fleas sport a spiny helmet that deters predators; others, with identical DNA sequences, have bare heads. What differs between the two is not their genes but their mothers' experiences. If mom had a run-in with predators, her offspring have helmets, an effect one wag called "bite the mother, fight the daughter." If mom lived her life unthreatened, her offspring have no helmets. Same DNA, different traits. Somehow, the experience of the mother, not only her DNA sequences, has been transmitted to her offspring.

Second paragraph in, and already she's got so much wrong. Only the mother's DNA sequences are transmitted to her offspring. Her experiences affect the expression of those DNA sequences. Ms. Begley, science writer extraordinaire, seems not to grasp the fundamental difference between a phenotype and a genotype.

I'm not a programmer, so I can't put this into an actual programming language like a clever blogger, but the developmental process we're talking about basically works like this:

For TRAIT A:
-if CIRCUMSTANCE B is present, express TRAIT A
-if CIRCUMSTANCE B is not present, do not express TRAIT A

That's the very very very simplified version of what we're dealing with here. Begley's article makes it sound like, if a water flea's mother is in threatening circumstances, the offspring is spontaneously born with a spiny helmet, apropos of nothing as far as the DNA is concerned.

But the truth of the matter is quite different: the gene for the spiny helmet (TRAIT A) exists in every water flea. It is part of the genotype of that organism. The ones whose mothers were not put in a threatening situation (CIRCUMSTANCE B) simply did not express the trait, meaning that the organisms of the same genotype are of a different phenotype.
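
A cleverer blogger than me might render it in actual code, so here's a toy Python sketch of the idea as I understand it. The names and the single true/false "trigger" are my own inventions for illustration -- real gene regulation is vastly more complicated:

    # Every water flea carries the helmet gene: that's the genotype.
    # Whether the gene is expressed -- the phenotype -- depends on a
    # chemical trigger tied to the mother's experiences.
    GENOTYPE = {"spiny_helmet": True}  # same DNA in every flea

    def develop(mother_sensed_predators):
        """Toy developmental process: return one offspring's phenotype."""
        phenotype = {}
        # The gene is always present; the trigger only toggles expression.
        phenotype["spiny_helmet"] = GENOTYPE["spiny_helmet"] and mother_sensed_predators
        return phenotype

    print(develop(True))   # {'spiny_helmet': True}  -- helmeted offspring
    print(develop(False))  # {'spiny_helmet': False} -- bare head, same DNA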

It is a well-known principle of gestational development that chemical triggers cause genetic traits to be expressed, or not. For one example, a chemical trigger applied to a human fetus at the right time will cause the fetus to express male traits, i.e. it makes a boy; if that chemical trigger is not applied, the fetus will develop to be a baby girl.1

It is also well-known that environmental factors can affect both the gametes of an organism, and the development of offspring even after the egg has been fertilized. That's why you shouldn't smoke or drink during pregnancy: it screws up the chemical processes of the body, and as the fetus develops, some important functions may never be triggered on; or some traits that ought to stay off get turned on.

I really shouldn't keep using code analogies since I'm not a programmer, but here goes: Since evolution has been a blind process, extraneous information has not necessarily been culled from the genome to keep things tidy. Each version of the software has added or altered code as needed, but not necessarily subtracted if not needed, only made it inactive. There's a lot of legacy code in our genes, some of which can be very harmful if it is re-activated.

All that to say that the existence of a life-threatening situation simply results in a particular chemical trigger that kicks in during some water fleas' gestational development, causing them to express different traits than water fleas without that chemical trigger.

It is not a level of science that is outside the understanding of a layperson, and certainly not an observation that gives those who accept Darwinian evolution "heart palpitations" as she claims. If a human woman experiences a great deal of stress during pregnancy, it can affect the child's development, too. This is no different in principle; the only difference is that the water flea genome has a contingency trait should that situation arise (TRAIT A for CIRCUMSTANCE B).

Begley gives another example, about the diets of pregnant mice affecting traits in their offspring by altering the DNA of their eggs (gametes). Begley "emphasizes" that "this is not a mutation," and Begley is frankly stupid to do so. Of course this is a mutation. The alteration of DNA is the definition of genetic mutation.

Let me emphasize something in my own turn: what she is talking about are not acquired traits. The mother flea does not get a spiny helmet from somewhere else and pass it on to her kids. The mother mouse does not turn brown and pass that on to her kids. They undergo experiences which alter the development of their offspring, either at the DNA level or during key stages in development, to express dormant or recessive traits. That is just standard genetics. Nothing new or revolutionary here at all.

So let's skip right to the end here:

The existence of this parallel means of inheritance, in which something a parent experiences alters the DNA he or she passes on to children, suggests that evolution might happen much faster than the Darwinian model implies. "Darwinian evolution is quite slow," says Whitelaw. But if children can inherit DNA that bears the physical marks of their parents' experiences, they are likely to be much better adapted to the world they're born into, all in a single generation. Water fleas pop out helmets immediately if mom lived in a world of predators; by Darwin's lights, a population of helmeted fleas would take many generations to emerge through random variation and natural selection.

It's true that natural selection usually moves fairly slowly. The point, which anyone with even a passing understanding of evolution would manage to grasp, is that those many generations to create the genetic code for helmeted fleas have already occurred, leaving us with the water fleas as we observe them today. The helmets are not a new trait, they are an existing trait that is either expressed or not.

It is certainly interesting that they developed in such a way that allows environmental factors to inform the expression of the trait, rather than just automatically having all water fleas be helmeted. But this is in no way an affront to the theory of evolution via natural selection, and is in fact easily accounted for as a positive survival adaptation. The fact that this article attempts to make it sound like it could invalidate or undermine the theory is not only sensationalism, it's just plain bad science, and Newsweek should be ashamed.


  1. The human Y chromosome is a modified X chromosome. This throws something of a wrench into the arguments of those creationists who would claim that scientific discoveries are only proving information that has always been in the Bible. Gestational development indicates that Adam, in fact, comes from Eve.

Friday, January 30, 2009

DorkmanScott.com working again

Sorry to those of you who took my advice and updated your bookmarks only to find yourself 404'd. There was a problem with my account at 1and1 and it took a few days to fix. All good now!

Monday, January 26, 2009

I heart Scrivener

I’m a fairly disorganized person by my nature. I’m easily distracted — though I’ve never been formally diagnosed, I’m fairly certain I have ADD. My thought patterns will often skip from one thing to the next, and sometimes it can be hard to focus in.

I personally think a little mental anarchy can be good for creativity. I think it allows you to make unique and interesting connections, to synthesize old ideas into new in surprising ways (sometimes surprising even to myself, for my part). But at the same time, it’s hard to sit down and actually put things together one after the other; worse yet, it’s sometimes hard for me to keep track of where I keep all my ideas.

I have dozens of spiral notebooks and idea journals with only a few pages written upon apiece. I keep trying to get in the habit of carrying a notepad with me for when inspiration strikes, but ultimately that only results in barely-used notepads being left all over the place. I have a lot of ideas that I want to help shepherd into fully-fledged stories, but often I’ll completely forget them until I stumble across a notebook during a move or a cleaning binge. (Such a discovery will usually result in the end of said binge, as I suddenly become involved with the idea again.) Or I’ll have a great idea for a story moment that I totally forget about, but later I come across it and wonder how it’s possible to forget it since it was the solution to a major story problem.

There’s also the matter of organization. Even once I’ve got all the ideas and I think the story is there somewhere, I have a lot of trouble putting the pieces together. Like the notebooks and journals, I can’t count how many stacks of index cards I’ve bought, thinking I’d write down scenes and pin them to a corkboard and shuffle them around until the goddamn thing made sense, just like the real writers do. (I’ve even bought a corkboard, still in near-mint condition.)

But I don’t like writing by hand. It’s too slow, too clumsy. I’ve been using computers since I was three years old (and happy 25th, Apple!), I type WAY faster than I can write by hand. Cursive never took and my attempts to write that way are sheer chaos. So I prefer to work at a keyboard.

I’ve tried lots of writing tools: the ones that do the digital index cards, the ones that are supposed to help you plot the whole damn thing and have it practically ready to print when you’re done. And when it comes to writing software — really, when it comes to any software — the best program is the one that gets the hell out of your way and facilitates what you want to do. To date, the only specialized writing software I’ve really found worthwhile beyond Microsoft Word (though I’m now using Pages, it’s essentially the same thing) has been Final Draft.

I’ve been a user since Version 3, and I just love FD. Its attempts to add fancy feature sets have been spotty. The “reports” it generates can be useful, but the included auxiliary program, Final Draft Tagger, is so buggy and unreliable as to render it totally useless. But the software’s raison d’être, which is to conform your writing to accepted industry screenplay format, is a workhorse that never lets me down.

More importantly, for me, it’s functionally transparent. I don’t have to stop what I’m doing to pull down a menu item, I don’t even have to use hotkeys. If I’m not typing words, I’m either using ENTER to move to the next line, or TAB to change the input type (from “Action” to “Character,” for example, or “dialogue” to “parenthetical”). I forget that the software is there, and I just write.

A few months ago, some folks on Twitter started raving about a program called Scrivener. I checked out the webpage and wasn’t really convinced. To me it looked like just another word processor with a few extra but largely unnecessary features. Between Word/Pages and Final Draft, I figured I had it covered. I wasn’t sure I saw the benefit of a lot of the features, especially a “full-screen mode,” the prominent advertisement of which I found somewhat inexplicable. But people kept taking up the recommendation, trying it out, and raving and recommending it themselves, so I figured I might as well check the thing out.

Even after downloading, I sat on the demo for several weeks before yet more people’s positive tweets compelled me to sit down and go through the software tutorial, which walks you through the feature set and gives you a sense of what Scrivener can do.

Immediately after I finished the tutorial, I paid my $40 to get the full license — I still had 29 more days of the demo1, but I knew it was $40 well-spent. That was two days ago, and now here I am, coming full circle to recommend it to my fellow (Mac-based) writers out there.

And here’s why.

Scrivener is, in fact, not a word processor. It is actually a database management tool disguised as a word processor. Within Scrivener you can create multiple discrete documents — different chapters of a novel, or scenes of a screenplay, or each one can be a character bio, or each one just a little doodle of an idea — and you can view them together or separately, create as many as you want for whatever uses you want, all organized into folders as part of a “Draft.”

You can also import reference material such as images, video and audio files, even web pages. Once imported, they're kept locally within the Scrivener (".scriv") file, which means you can take the .scriv to any computer with Scrivener installed, and all your content will be there. You can choose to associate the reference material with certain documents or drafts — for example, Anthony likes to use certain songs as inspiration for certain scenes in his writing, so he could have the songs directly accessible from the relevant document. Likewise you can associate documents with one another, so that you can connect, say, a character bio to a scene including that character, in a sort of interconnected pseudo-Wiki to help you keep track of all your thoughts.

This all exists and is easily manipulated within the Scrivener interface, but if you look under the hood, the .scriv file is really an archive file, like a .zip, and Scrivener is the UI for dynamically adding, rearranging, and viewing the content within the archive. It does the work of creating a file structure and all of that behind the scenes, making the creation, addition, or connecting of content dynamic and creative rather than a lot of “housekeeping.”

I love that. I love that I can just throw everything I’ve got at Scrivener, and although it may be a bunch of different documents, different resources, it’s considered a single file by Scrivener, one which I can easily move around and be sure I'm not losing any of my work. I can shuffle and rework at will without worrying if I’ve forgotten something or buggered the organization, as I would do if I were maintaining the file structure myself.

One way Scrivener uses this to its advantage is with the “snapshots” feature, a smaller-scale version of OSX’s “Time Machine” function. If you’ve got a document that you want to try something new with, but you don’t want to lose your old version, you just create a “snapshot,” and you can call up or restore any snapshot at any time. You can have an effectively unlimited number of snapshots because in truth, the software is just doing an incremental save, and putting the older versions somewhere safe within the database. But from the point of view of the user, you can be sure you’re always working with the latest version, with the older versions right within reach. No more confusion over which version of the document is the most up-to-date.
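
If you think of it in code terms, a snapshot boils down to something like this toy sketch (my illustration of the concept only; I have no idea what Scrivener’s actual implementation looks like):

    # Toy sketch of the "snapshot" idea: stash old versions out of the
    # way inside the project itself. (My illustration only -- not
    # Scrivener's real code.)
    class Document:
        def __init__(self, text=""):
            self.text = text
            self._snapshots = []  # older versions, kept safe behind the scenes

        def snapshot(self):
            self._snapshots.append(self.text)

        def restore(self, index=-1):
            self.text = self._snapshots[index]

    doc = Document("It was a dark and stormy night.")
    doc.snapshot()                    # freeze the current version
    doc.text = "A shot rang out."     # try something new, risk-free
    doc.restore()                     # changed my mind; back it comes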

Also, the fullscreen mode is, indeed, fantastic. As I said, I’m easily distracted, and while I’m writing it’s all too easy for me to go clicking on the Safari icon and checking my e-mail instead of getting the words down, or opening any other program and finding any other excuse. There’s so much on my computer I can be doing, I feel like I should be doing more things at any given moment.

So I set the fullscreen preferences in Scrivener to display green text on an otherwise black screen. The toolbar is invisible, as is the mouse arrow (unless I move it), and my fully-loaded laptop suddenly becomes a simple, old-school word processor.2 The psychological value of the visual simplicity is hard to describe, but try it and see if you don’t notice a difference in how much writing you can get done that way. I’ve actually written this whole post in Scrivener’s fullscreen mode, and enjoyed the experience tremendously.3

I’m not going to go into a full blow-by-blow of how to use the app, because there’s a tutorial for that. If you’re serious about writing, it will be well worth your while to download the free demo, and take 30 minutes or so to work along with the provided tutorial file, to get a sense of what Scrivener can do. Even try the demo for the 30 days before you make up your mind. I would guess you would quickly see how the program is worth your $40. For me, it's exactly what I've always needed.


  1. I discovered afterward that Scrivener's 30-day demo is "30 days of using the program." A day only counts toward the limit if you fire up the program on that day, rather than it counting 30 calendar days from first use. So if you only used Scrivener every other day, the demo lasts 60 calendar days, etc. So no need to worry about firing it up to have a play if you "won't have time" afterward -- unlike other software, the demo works around your schedule. And for the record, yes. I still would have bought the license on day 2 even if I had known.

  2. Fullscreen mode also has a “typewriter style” carriage return, in which the line of text you are currently editing is always in the middle of your screen, as opposed to most word processors where you write your way down to the bottom of the screen and stay there. It sounds like a small thing, but as with fullscreen mode in general, it’s surprising how much you appreciate it once you start rocking and rolling.

  3. There is another program which is just a word processor in fullscreen mode, called WriteRoom. WriteRoom's default scheme is the green on black, which is what compelled me to set up Scrivener the same way. If you just want the "distraction free" writing without all the other features of Scrivener, WriteRoom will get you there, although considering the pricing (WriteRoom goes for $24.95), I feel like the extra $15 for Scrivener is worth it. They're from different developers, as far as I can tell, so it's not as simple as upgrading if you change your mind.

Sunday, January 25, 2009

Secular Sunday: The Case for a Creator: Chapter Three, Part 3

Picking up in Chapter Three, still in the Wells interview, we address “icon of evolution” number two: “Darwin’s Tree of Life.”

In brief, Wells makes the assertion that, while the ever-branching tree of life, where everything flows and diverges from a common ancestor, is a good representation of Darwin’s theory, it isn’t supported by the fossil record.

This is, in a word, a lie. Part of it is Wells' denial that any "transitional forms" exist in the fossil record (but we'll get to that when he starts in on archaeopteryx), and the other part of it is the Cambrian explosion:

"The Cambrian was a geological period that we think began a little more than 540 million years ago. The Cambrian Explosion has been called the 'Biological Big Bang' because it gave rise to the sudden appearance of most of the major animal phyla that are still alive today, as well as some that are now extinct." [page 43]

Okay, this part is admittedly not a flat-out lie. The issue is more in the presentation -- once again, he clearly expects people not to know and not to do any research.

The Cambrian has been called the Biological Big Bang, but unlike the Big Bang, it isn't theorized to have occurred suddenly, at an instant in time. Wells, in using the word "sudden," makes it sound like it happened in a very brief period of time, but the Cambrian is actually a period spanning tens of millions of years. An eye-blink in geological time scales, sure; but in terms of the process of evolution, it's more than enough time for life forms to diversify.

He then states that it "gave rise to...most of the major animal phyla that are alive today." I think that he knows people will read "most of the major animal phyla" and understand it as "most of the animals."

Let's go back to high school science class, and scientific classification. The mnemonic device I learned was:

Kings
Play
Chess
On
Fine
Grained
Sand

This is to help remember the scientific classifications in order, from the most general to the most specific:

Kingdom
Phylum
Class
Order
Family
Genus
Species

See how far up "phylum" is? It's the second most general level of classification. Even today, with well over a million named species, and millions more probably still undiscovered, you know how many animal phyla there are?

About thirty-five. So it's not really inconceivable that over the course of tens of millions of years, life could diversify in a couple dozen directions for a start.1 Wells makes a true statement, but phrases it in such a way that it sounds like the current forms of life all popped up at once, fully formed (and if you think I'm putting words or intentions in his mouth, he makes his intentions very clear in the following paragraphs, as you will see).

This is simply not the case. The forms of life that arose at that time were still very, very primitive.
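
Just to make the generality of "phylum" concrete, here's the whole ladder applied to a familiar species (my own illustration, obviously, not anything from the book):

```python
# The Linnaean ranks from most general to most specific, applied
# to humans. Note how early "phylum" appears: every animal with a
# notochord -- fish, birds, whales, us -- shares the one phylum
# Chordata.
ranks = [
    ("Kingdom", "Animalia"),
    ("Phylum",  "Chordata"),
    ("Class",   "Mammalia"),
    ("Order",   "Primates"),
    ("Family",  "Hominidae"),
    ("Genus",   "Homo"),
    ("Species", "Homo sapiens"),
]

for rank, taxon in ranks:
    print(f"{rank:>8}: {taxon}")
```

Roughly 35 phyla at the top; well over a million named species at the bottom. "Most of the major animal phyla" is a far, far smaller claim than Wells makes it sound.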

Continuing his description, Wells says:

"[A]t the beginning of the Cambrian -- boom! -- all of a sudden, we see representatives of the arthropods, modern representatives of which are insects, crabs, and the like; echinoderms, which include modern starfish and sea urchins; chordates, which include modern vertebrates; and so forth. Mammals came later, but the chordates -- the major group to which they belong -- were right there at the beginning of the Cambrian." [page 44]

This quote plays the same word games, though Wells is getting a bit bolder with his disinformation. Notice how he mixes the "modern representatives" of the various phyla into the discussion of the much earlier forms. If you weren't reading closely, you could easily misconstrue this as saying that modern animals, essentially in their current form, appeared at the beginning of the Cambrian period. The "boom!" again makes it sound near-instantaneous, instead of a process spanning tens of millions of years.

He also just skips merrily over the part where mammals "came later." Where did they come from if not evolution? But of course Wells doesn't bother to answer the question. Stunningly, he doesn't even seem to realize he's raised one.

Goaded on by Strobel, Wells continues with a football analogy that really goes for broke in misrepresenting the Cambrian explosion:

"Okay," he said, "imagine yourself on one goal line of a football field. That line represents the first fossil, a microscopic, single celled organism. Now start marching down the field. You pass the twenty-yard line, the forty-yard line, you pass midfield, and you're approaching the other goal line. All you've seen this entire time are these microscopic, single-celled organisms.

"You come to the sixteen-yard line on the far end of the field, and now you see these sponges and maybe some jellyfish and worms. Then -- boom! -- in the space of a single stride, all these other forms of animals suddenly appear. As one evolutionary scientist said, the major animal groups 'appear in the fossil record as Athena did from the head of Zeus -- full blown and raring to go.'

"Either way, nobody can call that a branching tree!" [page 44]

Ignoring the fact that football fields don't have a "sixteen-yard line," this is a fairly accurate representation of the geological time scale. Richard Dawkins has a similar illustration, which goes something like this (I'm paraphrasing from memory): hold both your arms out as wide as you can and let that span represent the history of the universe, with the tip of your left middle finger as the beginning and the tip of your right middle finger as the present. Life appears somewhere around your right wrist; complex life appears at about the first knuckle of your right middle finger; and the whole of human history is the sliver of dust scraped off the nail by a single light stroke of a nail file.

Cosmic. The problem, again, is that Wells never attaches any concrete numbers to the abstraction. The "single stride," the recycled "boom!" -- all of it is meant to make the Cambrian sound like a much shorter time than it was; an impossibly short time, in other words. And it simply wasn't. Not to belabor the point, but that "single stride" covers tens of millions of years. The reasons the Cambrian explosion occurred are still debated by evolutionary biologists, but it is not seriously considered a problem for evolutionary theory.
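
And since I keep complaining that Wells never runs the numbers, let me run them for the Dawkins version. Strictly back-of-the-envelope -- the arm span and the dates are my assumptions, not Dawkins' exact figures:

```python
# Map "years ago" onto an outstretched arm span representing the
# whole history of the universe.
SPAN_MM = 1800.0         # assumed fingertip-to-fingertip span
UNIVERSE_YEARS = 13.8e9  # rough age of the universe

def mm_from_present_fingertip(years_ago):
    """Distance from the "present" fingertip for an event."""
    return SPAN_MM * (years_ago / UNIVERSE_YEARS)

events = {
    "earliest life (~3.8 billion years ago)": 3.8e9,
    "Cambrian begins (~540 million years ago)": 540e6,
    "all of human recorded history (~10,000 years)": 1e4,
}

for name, years in events.items():
    print(f"{name}: {mm_from_present_fingertip(years):.4f} mm in")

# The Cambrian's *duration* (~55 million years) at this scale:
print(f"width of the Cambrian: {mm_from_present_fingertip(55e6):.1f} mm")
```

Human history works out to about a thousandth of a millimeter -- genuinely nail-dust territory. But the Cambrian itself is about seven millimeters wide at this scale: small, sure, yet nothing like the knife-edge "single stride" Wells wants you to picture.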

Skeptical Strobel makes a comeback, and this time he actually raises a sensible objection, although it doesn't really seem to follow from what they've been talking about. "Maybe...Darwin was right after all -- the fossil record is still incomplete. Who knows how natural history might be rewritten next week by a discovery that will be made in a fossil dig somewhere?" [page 45]

Wells, surprisingly, admits that it is a possibility that a future fossil discovery will "suddenly fill the gaps...But I sure don't think that's likely...It hasn't happened after all this time, and millions of fossils have already been dug up." [ibid]

What Wells -- and most creationists/ID proponents who make this argument -- seems not to realize is that fossilization is extremely rare. A large number of circumstances must all fall into place to create a fossil. It is, frankly, astonishing that we have found the millions of fossils Wells admits we have -- all telling the same story and aligning perfectly with evolutionary understanding, I might add. We have never found a fossil of an animal from a later period in strata dated earlier. There are no fossils of, for example, Jurassic rabbits. The fossil record that we do have is completely consistent with evolutionary theory.

And as an aside: "it hasn't happened after all this time." All this time? What arrogance!

Remember that human history has been too brief to even register as a blip on the cosmological radar. We are coming to the party several billion years late, and have only undertaken the study of paleontology at a serious level for a few hundred years. And yet if we haven't figured out the answer to every question in that time, there must not be one?

That's like walking into a friend's house, and immediately he tells you he's been looking for his keys for three days and asks you to help. Before you can even blink, he says "What, you haven't found them yet? Well, they must not be anywhere!"

"After all this time?" What is Wells smoking?

So they spend a couple of pages insisting that the fossil record doesn't support evolutionary theory. Again, a flat-out lie. That's what fossils are: evidence of the progression of life.

Strobel says, amusingly: "Protestations from Darwinists aside, the evidence has failed to substantiate the predictions that Darwin made." [page 46]

I can only conclude that when Strobel says "protestations from Darwinists," he actually means "evidence presented by people who actually know what they're talking about, but which I choose to ignore." This is another typical strategy -- ask for evidence, but when it is presented, dismiss the person giving the evidence, use that to deflect having to address their evidence, and claim that no evidence has been presented.

See also: "ad hominem."

Fuck, are we still not done with this chapter? Next week might be a long entry; I'm going to plow through as much as I can, because we've barely hit the halfway mark and I'm really tired of this clown. Wells could help me immensely by choosing not to speak in sentences composed almost entirely of falsehoods and fallacies in dire need of explanation and correction, but I don't think I can count on that happening.


  1. At least, no more inconceivable than a time scale of "tens of millions of years" is in general.

Wednesday, January 21, 2009

Viral Marketing -- ur doin it rite

Okay, so I lied about not posting about Watchmen no more.

But this viral piece for Watchmen comes on the heels of a disastrously failed viral marketing campaign in Australia, which I knew about because I follow a number of Aussies on Twitter.

Viral marketing is hard to do right. It's hard to predict what people will latch on to and really start talking about and pass on to their friends. I get a lot of people asking for advice on how to make their video a viral hit, and there are factors you can look at. High-quality content is likely to get passed around. Content attached to some kind of celebrity will probably get passed around. Funny or uplifting usually has a better chance than somber. But beyond that, I dunno. The RvD films are a total fluke -- it's not like we planned for them to be smash hits (although we hoped, on the second one), and I'm not sure you can plan that kind of thing -- although the Ask a Ninja guys might disagree with me.

It seems you can't go wrong with cute animals acting strange. You've seen the sneezing panda? Of course you have. Everyone has. Fucking bear has 30 million views on YouTube. I don't even know how many views the frigging dramatic prairie dog/chipmunk/gopher has, because it's been uploaded about 4000 separate times -- but most of the search results have half a million hits or more.

But trying to actually make an ad badass enough to catch on? It happens. With Super Bowl spots, you can be all but guaranteed people will seek the ad out, and if you run other spots like that throughout the year, there's a good chance people will talk about them and look them up online. But that's word-of-mouth from traditional advertising, not viral marketing, which seeks to make the audience do the work: they spread it around, they show it, they talk about it, and it becomes part of the zeitgeist, at least for a little while.

Viral marketing for movies had its genesis with The Blair Witch Project. They set up a website -- when the vast interconnected community they call "Web 2.0" was only just starting to appear on the scene -- which basically asserted that the film was a real documentary about real events. The campaign was so successful that not only did everybody know about this micro-budget indie flick, with almost no real marketing to speak of, but for years afterward I would meet people who still thought it was real.

Quite frankly, I think viral marketing for a film can be a beautiful thing. In the old days, going to the movies was more like live theatre. You sat in your seat, the curtains went up, and an overture played. As in live theatre, the overture was meant both to accommodate stragglers finding their seats and to set the mood -- if it was a musical, it would hint at the themes you were about to hear. But most of all, it provided a buffer zone between your real life and the fantasy you were about to see on the screen.

We don't have that anymore, outside of a few specialty theatres. Most theatres are little boxes with chintzy decor. You're bombarded with advertisements for various products and other movies, plus reprimands about proper etiquette that people seem to ignore anyway. These days most movies don't even have opening credits, which means you just have to hit the ground running when the film starts.

Viral marketing like the Watchmen piece below helps, I think, to fill that gap. It creates a whole "experience" of the film's reality, giving you an early taste of accepting, understanding, and engaging with the world of the film.

When it's done right, the seams are invisible. For one thing, note that this video never once mentions the film Watchmen. It's not really an advertisement so much as supplemental material about what is ultimately one of the central concerns of the story (mild spoilers): Dr. Manhattan changed the world, and no one in that world can imagine it without him.

It sets the stage for the time period (an alternate 1980s) by being a very faithfully produced replica of a 70s-era news broadcast, complete with "bad VHS"-style degradation, which is heaviest early on.

It also frees the filmmakers from having to set all of this up in the movie itself. The world will be different, and what we have here is three minutes of exposition that are unlikely to be crucial to the story, but that create, as I said, a fuller, more immersive world.

The last "nice touch" is that the video is posted by the user "thenewfrontiersman," which is the name of a sort of widely-read, conspiracy-theory type newspaper in the Watchmen world. The kind of paper that sometimes gets a scoop but is usually just adding editorial paranoia to otherwise innocuous events (i.e. the kind of paper an unfortunate number of people, in our America as well as theirs, tend to believe). If they do more videos, we may get to become acquainted with the personality of The New Frontiersmen, as well as other characters in the film.

This kind of marketing becomes fun, almost interactive. A kind of spontaneous roleplaying has already shown up in the comments, with people pretending that this world really exists, that this news broadcast is a genuine part of our history.

"I gotta say, I miss all those costumed heroes," one says. "Sure they were reckless, but they made things a lot more interesting."

If a video gets posted about Adrian Veidt, we'll probably see comments praising his products and his humanitarian efforts, while others malign him as a sell-out running a heartless mega-corporation, probably even borrowing anti-Wal-Mart rhetoric for realistic flair. People are engaging with the movie and they haven't even seen it. They are becoming part of the tapestry of the film.

Zack Snyder and the producers have already shown a strong grasp of making people feel like they're part of this film -- they held a short film competition to produce advertisements for Veidt products, the best of which would appear as television commercials within the film itself. When you get people to feel a sense of ownership of the movie, to feel that they helped make it what it is, you are more successful creatively (because they're more engaged) and more successful commercially (because they're more likely to come).

I'm really very excited about this movie because more than anything, it just seems like they really get it. This clip is no exception.

Sunday, January 18, 2009

Secular Sunday: Or, to summarize...

Secular Sunday: Nobody believes in Zeus anymore...

Last week, toward the end of my analysis of the latest section of CFC, I mentioned something quickly and dismissively in passing that actually deserves more focus. So before I move on today (or instead of moving on -- we'll see how long this ends up being), I want to go back to it.

Strobel makes the case, sort of, that abiogenesis -- the natural origin of life -- is nothing short of miraculous. This has been addressed by better and more intelligent writers than myself (let alone Strobel), such as Richard Dawkins, who points out (I'm paraphrasing) that if the odds of all the qualities of a planet aligning perfectly to support life as we know it are one in a billion billion, then across a billion billion planets it is not merely possible but expected that on one planet, life will arise. If only one planet in the universe has life, then we are that one in however many planets there are -- those are our odds.

In an infinite universe, it's not miraculous that life arose here. In an infinite universe, it would actually be miraculous if life as we define it didn't arise somewhere. (Of course, if that were the case, there would be nobody to marvel at the miracle.)
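
For the mathematically inclined, the arithmetic behind that paraphrase checks out. A quick sketch -- the specific numbers are my stand-ins, not a quote from Dawkins:

```python
import math

p = 1e-18  # assumed odds of life arising on any one planet
N = 1e18   # assumed number of candidate planets

expected = N * p  # expected number of life-bearing planets

# Naively computing 1 - (1 - p)**N underflows in double precision
# (1 - 1e-18 rounds to 1.0), so use log1p for an accurate answer.
p_at_least_one = 1 - math.exp(N * math.log1p(-p))

print(f"expected life-bearing planets: {expected:.0f}")  # 1
print(f"P(at least one): {p_at_least_one:.2%}")          # ~63%

# Give the universe ten times as many planets and it's a near-certainty:
print(f"P(at least one, 10x planets): {1 - math.exp(10 * N * math.log1p(-p)):.4%}")
```

With those exact numbers, the expected count is one and the odds of at least one life-bearing planet are about 63 percent; multiply the planet count by a mere ten and it rounds to certainty. Whatever you call that, "miraculous" isn't it.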

But that's not what I want to address (as I said, Dawkins among others has covered it much better). What I want to address is the following quote, attributed to Walter Bradley, "origin-of-life expert":

If there isn't a natural explanation and there doesn't seem to be the potential of finding one, then I believe it's appropriate to look at a supernatural explanation...I think that's the most reasonable inference based on the evidence [page 42].

No. And no. And NO.

I need to be very clear about this, because this is extremely important. Ultimately this is the primary failure of this entire book, the foundational misunderstanding upon which Strobel is building his eponymous Case:

Science.

Does not.

Work that way.

To make it easily apparent why I have such a strong objection to Bradley's statement, let me rephrase, to have it say explicitly what Bradley is only implying:

If something happens that we don't understand, and its reasons for happening are not immediately apparent, we should feel free to make up any explanation that suits us.

As I said, this is the fundamental departure point, the fundamental mistake Strobel and others like him make. If they don't know the answer, they make it up. Or, conversely, they decide on the answer they will choose to accept before even bothering to look at the evidence.

Any scientist will tell you that many things occur in the world that science can't answer. And scientists will have their hypotheses for the reasons that these events occur, based on a sort of triangulation of the observations that they've made ("because A, and B, and C, it seems to make sense that D is occurring").

Through repeated experimentation they will either verify the hypothesis -- in which case it eventually becomes the accepted explanation, and is considered something we "know" -- or falsify the hypothesis -- in which case they will begin searching for a new answer to test.

What they absolutely do not do is fabricate a "supernatural" and untestable "reason" that has no relationship to the evidence given, nor do they force themselves to adhere to a predetermined "explanation" for new information.

A true scientist is not afraid to say, "I don't know."

And saying that, as a corollary, is not the same as saying, "No one will ever know."
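
If you wanted to caricature the loop in code -- and this is purely my own cartoon, not anything from the book -- it might look like this. Note the honest failure mode:

```python
def run_science(data, candidate_hypotheses):
    """Keep the first hypothesis whose predictions survive testing
    against every observation; discard the rest. There is no
    "declare it supernatural" branch."""
    for hypothesis in candidate_hypotheses:
        if all(hypothesis(x) == y for x, y in data):
            return hypothesis  # provisionally accepted -- and still testable
    return None  # the honest answer: "I don't know" -- keep looking

# Toy example: observations secretly generated by y = 2x + 1.
data = [(x, 2 * x + 1) for x in range(10)]
hypotheses = [
    lambda x: x + 1,      # falsified by the data
    lambda x: 3 * x,      # falsified by the data
    lambda x: 2 * x + 1,  # survives every test
]

print("accepted" if run_science(data, hypotheses) else "I don't know")
```

Hypotheses that fail the data get discarded; when nothing survives, the output is "I don't know" -- not "therefore, magic."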

Bradley's attitude is the opposite of scientific inquiry -- the death of scientific inquiry. Pick a scientific discovery of significance. Say electricity. Or antibiotics.

Until the 17th century, human beings had little to no awareness of the microscopic world. We didn't know about bacteria, which means that we didn't know how people got sick. Following Bradley's exact line of thought, those who came down with illnesses were thought to be either cursed by God/the gods, or possessed by evil spirits. There wasn't a natural explanation, and there wasn't the potential of finding one. So they pursued the supernatural explanation.

Except that there was a natural explanation, and eventually we found it, because despite people like Bradley, who were happy with their comforting-but-completely-unjustified "answer," some people kept looking.

Admittedly, even attempts at "naturalistic" explanation can be, and have been, wrong too. Humourism, for example, was the dominant non-supernatural theory of medicine for nearly the entirety of Western history -- a theory that modern medicine has since completely discredited.

But the important component in this example is that even despite believing that they had the answer, despite having held to and operated under this theory for 18 centuries1, scientists kept looking to make sure. And when they started to make observations that humourism couldn't answer, to create alternate hypotheses that had a higher success rate of explaining and predicting related occurrences, the long-held theory of humourism was eventually discarded.

A true scientist is not afraid to say, "I was wrong."

This is the strength of the scientific method, and the reason it is the only reliable means we have of determining objective truth about reality. Science is not emotional, it is not entrenched, it is continuously adapting -- indeed, science is constantly evolving. Scientific discoveries in one discipline ripple across our entire understanding of the universe. If a paradigm for understanding the universe cannot accommodate objective observation, that paradigm must be discarded.

In ancient Greece, Bradley would have said that Zeus was obviously the source of lightning, because there was no natural explanation for it and no potential for finding one. And at the time, he would have been right on both counts. But now we know exactly what the natural explanation for lightning is -- and while that doesn't automatically mean Zeus isn't the one making nature work that way, that's just pushing the "Zeus" answer one step back. Not because there's evidence for the Zeus answer, but because its adherents can't bear to let it go.

As you all know, nobody seriously believes in Zeus anymore. And yet intelligent design is exactly the same move. They say God did it; once you show how nature did it, they retreat a step to "God made nature do it like that," with no evidence to back that up, or even a reason to think so. They call that lack of evidence (a.k.a. ignorance) faith, and they are inexplicably proud of it.

People like Bradley are afraid to say, "I don't know." People like Bradley are afraid to say "I was wrong." People like Bradley choose a comforting answer because it is comforting. Not because it is appropriate, correct, or even warranted. People like Bradley are not proper scientists.

Because science doesn't work that way.

As I thought might happen, this post got long enough that I think it's enough for today. But it was important. One of the reasons supernatural explanations are so compelling is that they're easy to communicate, and sometimes more intuitive than the natural explanation. One sentence of creationist claptrap takes paragraphs and paragraphs to answer in a way that is both relatively accurate and intellectually accessible to people who are not scientists.2 (Not to mention the challenge of making sure I understand it right, not being a scientist myself.)

Now I can simply point back to this post, or even use the acronym SDWTW, and you will know what I mean without having to spend paragraphs explaining myself. (Even given that, by the end of this whole endeavor, I may very well have written more words about the book than Strobel did in it.)

More CFC next week.



  1. A common defense of religion is that its longevity and tenacity somehow give it credence. How could a wrong idea survive so many centuries? My answer to that, as with humourism, is simply "because people didn't know any better."

  2. One of the creationist tactics in "debates" with qualified scientists -- and apparently one of Strobel's tactics in this book -- is to rattle off in quick succession half a dozen or more wholly incorrect but succinctly stated talking points. Their opponent becomes flustered by the assault, frustrated at being unable to communicate the answers clearly in the allotted time (and/or by the ridiculous nature of the claims), or winds up forgetting or not being allowed to answer one of the points, which makes it appear to the audience that s/he had no answer to give. That's why I'm taking my time going through this; I don't want to leave any stone unturned.

Friday, January 16, 2009

Last post on the Watchmen debacle

If you hadn't heard, Warner Bros. and Fox have reached a settlement over Watchmen, so the March 6 release will not be delayed.

This means I probably won't be posting about this project again until I give my review on March 6 (midnight showing, what-what). Let's hope that after all this drama (and publicity), the flick doesn't suck.

Thursday, January 15, 2009

DorkmanScott.com -- Your Source for Whatever This Is

I've gone a long time coasting by without the professional accoutrements -- no website, no business cards; until just recently I hadn't even bothered to put my professional reel together (and I still haven't posted it online).

But no more! I'm working on getting my act together, and the first step is registering dorkmanscott.com as my official site address.1 As of now, if you type that in, you'll just be forwarded back here. But over time I hope to build it into more of a real site, with a portfolio and all that jazz.

It's worth updating your bookmarks and/or muscle memory, as I may also shift the blog from Blogspot to WordPress at some point. I haven't had major problems with Blogspot, but WordPress seems to be more customizable and to have more reliable servers, both of which may matter if things start to pick up. If you get used to the dorkmanscott.com portal, the switch from one to the other will be relatively seamless for you if and when it occurs.



  1. Dorkman.com is being squatted on, MichaelScott.com is some country western singer, and Michael-Scott.com is porn (NSFW!). Straight porn, no less.

Wednesday, January 14, 2009

The Descendants -- the Story So Far

I keep saying that I want to use this blog to talk about my experiences in the serious-business Hollywood machine. But so far, what few I've had I've kept to myself -- mainly because they're deals-in-progress with nothing solid to announce yet, and I'm afraid I'll jinx them by announcing ahead of time.

But there's one project that, with the beginning of a new year, I think it's safe to talk about -- especially since it's likely to be indicative of how things are going to go from here on: The Descendants. I've made oblique references to the project, and people following my YouTube page have probably seen the trailer. But I want to tell the story of the project thus far.

Shortly after we put out RvD2 -- nearly two years ago, gah -- a creative executive from Dark Horse Entertainment contacted me and Ryan, interested in meeting and discussing our plans for the future, including future projects we might want to do.

I put together a number of pitches, and we met Chris, the exec, for dinner at a bar near Dark Horse HQ (the HQ at the time; they've since relocated to bigger and better). I told him some of my ideas, and he listened patiently before making a counter-pitch: they had a project they'd been developing from an independent comic book they'd acquired, an action-fantasy story about a monster-killing mercenary named Charlie Stone, with Ray Park attached to star.

Given that the fight with Darth Maul in The Phantom Menace is what inspired me (and Ryan) to pick up some sticks and start making lightsaber fights, it's not too much of a stretch to say that I have Ray Park to thank, to some degree, for RvD2 and all that came after.1 So the idea of working with him was very exciting.

(As an aside, I had actually met Ray once before, randomly, at an EZ-Lube. We both happened to be there for an oil change, and he recognized my shoes as martial arts shoes, which led to a brief conversation.2 When we met at Dark Horse to discuss the project some months later, and I mentioned we had met, he actually remembered. "Oh yeah, the shoes!")

At the time, the comic ran three issues (another has since been published). I had read them and was excited and intrigued by the ideas, though I thought the material could benefit from some expansion and development as a film. We discussed what we wanted to do with the film and the character, and we all seemed to be on the same page: we wanted funny, we wanted a little overconfident -- a combination of Jackie Chan and Indiana Jones.

Our first plan was to produce a short film that took place in the Descendants world but wasn't necessarily part of the story canon we were planning. Joey (Andrade, the creator) and I wrote a ten-minute script for the project, but given what we wanted to do with it, it was too expensive for a spec project.

I also started to feel leery of it, much as I liked the script -- since it didn't represent the overall story of the project, people who didn't like it would get the wrong impression, and people who did like it would also get the wrong impression. Lose-lose.

But as summer 2007 came around, a new opportunity arose: a company that will remain nameless3 wanted to develop Descendants as a possible web series. The decision was made to produce a 90-second teaser trailer for the project, which would premiere at Comic-Con and be some of the first content available on the new site.

The production of the teaser is a tale in itself. In summer 2007 I was in Florida shooting Sandrima Rising; we took a break in July for logistical reasons, which meant I couldn't prep before July. Additionally, Ray was out of town until the weekend before Comic-Con, so we literally had only two non-consecutive days to shoot (the Friday and Monday preceding Comic-Con) and four days of post, to premiere it Saturday.

I don't know how, but somehow we managed it, and the teaser is available on YouTube.

(It's worth noting that when we made the trailer, we didn't really know what this "web series" would be about, other than vague concepts we were kicking around; the RED camera also hadn't been released yet. So although it was intended as a "proof of concept," the teaser neither reflects the expected visual quality of the project, nor do any of the events in the trailer actually appear in any finished script.)

So the trailer appeared exclusively on the unnamed site for a time. But they began to dick us around regarding our continued deal, and it quickly became apparent that they had no actual plan to produce an ongoing Descendants series; they just wanted the traffic from the trailer. We pulled the teaser from their site and started thinking about other directions for the project.

We batted around web series, mini-series, TV pilot -- and ultimately we decided that, whatever we did, we needed a script. Since we didn't know what form it would take, we decided to write it as a feature film. Joey and I started working on a treatment, and after multiple drafts, we finally got a go-ahead on the script.

And then the WGA strike hit.

Now, I'm not WGA, and Dark Horse is apparently not a WGA signatory company. But I still didn't want to risk my future ability to join the union by writing during the strike. So it was agreed that I wouldn't turn in any work until after the strike ended. And that took several months, as you may or may not recall.

In February 2008, the strike ended and I was able to finish off what I had done -- more or less. The writing of the script had opened up holes that I hadn't noticed at the treatment level, and the first draft was kind of a mess. I actually told Dark Horse that I didn't want to show them this first draft, preferring instead to repair the damage first.

This was kind of unprofessional, and if I'd been hired by a big studio I would have had to turn in that draft and would have been promptly fired, and probably blacklisted. Fortunately, the relationship with Dark Horse is more relaxed (and less official), and they understood that this was my first time writing-to-order, so Chris was willing to wait for draft two.

Like the treatment, it took several drafts and rounds of notes to get Descendants to a place where we were all happy with it. There was a lot to juggle with the adaptation, in attempting to stay true to Joey's original concept, while expanding it beyond the page and the first three issues, giving it a more cinematically-satisfying structure, and also giving us somewhere to go from there.

During this time, Dark Horse had signed a "first look" deal with Universal Studios. For those who haven't heard the term before, this means Universal has dibs on anything and everything Dark Horse develops. Before Dark Horse can take a project anywhere else, they have to take it to Universal. If Universal passes (Hollywood-ese for "no, go away"), then we can take it anywhere else.

The script gets done and Dark Horse takes it to Universal; specifically, "Uni Digital," their new media department. We are assured that UniDigi plans to read it right away -- the Senior VP is going to take the script home with him and read it overnight, which we are told he never does with other projects.

This is Hollywood-ese for blowing smoke up your ass. When you start working in Hollywood, you'll start to get this a lot. People will tell you how excited they are, how they will make your project/script their first priority, how they are taking it home this weekend, this VERY NIGHT, so that they can be sure to read it immediately and be ready to move.

Translation: don't expect to hear back for several weeks. And at that point they'll apologize, because they still won't have read it, and they'll sing you the same song then, too.

I've had the good fortune to be involved with Dark Horse, which is a legitimate company in Uni's good graces. Can you imagine how slow the response would be if it were just me and a script? No matter how much "heat" the script had, it'd be months, I'm sure.

It probably sounds like I'm bitter about this. I'm really not. I've read a lot of books on the industry that talked about exactly this, so I haven't been taken by surprise. It's annoying, and makes me impatient, but that's how it works.

Anyway, they eventually passed.

So we've been taking it around to other places, and we've found a place that is interested in the project, based apparently only on a verbal pitch of the concept and the attachments (me to direct, Ray to star). It's essentially a foreign presale deal -- they give us the money to make the movie in exchange for the right to distribute the film overseas.

The catch: they're willing to give us $4 million. The script, according to an experienced line producer Dark Horse brought in, is a $40 million project. I feel confident that we can make a film look like much more than it actually costs. I think we could make a $4 million movie that looks like a $10 million movie. But we can't make $4M look like $40M. There are limits -- as the line producer said, "You can get five pounds into a two pound bag, but you can't get twenty pounds in."

So we were faced with a choice:

We could attempt to make our $40 million script for a tenth of the appropriate budget. Doing so, we felt, would hurt the project and everyone involved. There was no way we could do justice to the script, or the concept, by making a film that was too ambitious for its own good. So we decided not to go this route.

Another option was to see if we could find another taker for the script. But the script comes attached to a first-time director and a lead actor who, while a great guy with geek cred, is no Tom Cruise in terms of getting butts in seats. All things considered, we figured we weren't likely to get more than $4 million anywhere we took it.

So that left us with the third option: write yet another script, of a smaller scale, to make for $4 million. At first we thought it might be a smaller version of the existing script, but the concerns of doing it justice, as mentioned above, made us decide to develop and write a brand new script. Above and beyond any of the events in the story, what has always stood out has been Charlie Stone. His voice has been loudest and clearest and given the project its vitality. So we determined that if we wrote another story, as long as Charlie was in the center of it, we would be okay.

While I can't go into either storyline, the new script basically functions as a lead-in to the too-expensive script -- in other words, the IMDb trivia will someday say the "sequel" was written before the "original." Some adjustments will have to be made if Descendants is successful and we get the chance to make what is now likely to be Descendants 2 -- people who meet for the first time in the current script will have already met in the previous movie, stuff like that. But I think it benefits the story in the long run: the new script partly expands on what was originally a side-story, which now gets its own film, making the eventual "sequel" less crowded story-wise.

So that's where Descendants has been over the last two years, and that's where it is now. We're working on ironing out the new treatment and I'll be writing the new script, with the goal of finishing by Valentine's Day, and hopefully we'll have made a deal by my 26th birthday (end of March) for my first feature film.

And then, the real fun will start.



  1. Grudgingly, I suppose by extension that means I have George Lucas to thank for Phantom Menace. Perhaps a distasteful admission, but an undeniable one.

  2. When he introduced himself saying "I'm Ray," I barely restrained myself from responding "I know."

  3. They still exist, but I'm not interested in giving them any traffic by naming them.