
Alex Garland Discusses EX MACHINA’s Philosophy, Scientific Advancement, and More!

Good science fiction doesn’t have to mean only giant explosions and laser guns and things. Sure, there’s a place for that kind of spectacle, but a lot of the best sci-fi actually has to do with (*gasp*) concepts, ideas, and humanity at large. Weird, I know. One writer who’s always dealt with that, and mixed in the fantastic visuals when necessary, is Alex Garland, the writer of such films as 28 Days Later…, Sunshine, Never Let Me Go, and Dredd. His first foray into directing is Ex Machina, which he also wrote, a film in which a billionaire scientist named Nathan (Oscar Isaac) creates a highly advanced artificial intelligence named Ava (Alicia Vikander) and tasks a programmer named Caleb (Domhnall Gleeson) with determining whether or not she can pass as human, even though she looks highly synthetic.

I spoke to Garland about the concepts and implications depicted in the film, as well as Isaac’s character’s complicated morality. Check out the entire conversation below, and be sure to also read our interview with Geoff Barrow and Ben Salisbury, who scored the film.

Nerdist: The movie hinges on whether Ava can or cannot pass the Turing Test. When you were writing it, how did you balance that? Because we have to be unsure whether she can or can’t pass it right away.

Alex Garland: In the back of my mind I guess I was thinking about films like Sleuth. I think it was originally a stage play, but I encountered it as a film first, where you have shifting allegiances as a viewer and it’s hard to settle too firmly on what a particular character is saying or meaning in any particular moment. I tried to keep that shifting, unsettled structure alive in it.

There’s a lot of misdirection, and I guess a lot of reliance on a sort of feeling of certainty: the people watching the film are likely to have seen other films involving humanoid robots and will have certain expectations that come from that. Possibly we’re using those expectations a bit, I guess.

N: Did you run into the problem of her being “too human” too soon? Since we can see the face of the actress, it could be easy for her just to seem human. Were you running into that at all?

AG: No, I don’t think so. If anything, the problem we had, which slightly surprised me, was that there were some people who couldn’t get past the fact that she was a machine. When she first appears, she’s very clearly a machine, not just in the way she looks but in the way she sounds: the movement, the sounds, and then the pulses and things that come from her. There was just a group of viewers who were always closed off from her, the way they would be from a toaster or a microwave oven or something.

That kind of surprised me, because I felt she was in many respects so sentient and human-like. So, weirdly, I wouldn’t have anticipated that; it was almost the reverse.


N: About her design, because you obviously did that, and there’s a point in the movie where Nathan says, “I made her look like this so that we can see if she passes the test beyond her looking robotic.” Does that disprove the theory that an AI can overcome the physical, that a grey box could in fact just pass as human?

AG: A grey box could pass as a human provided you couldn’t see the grey box. If it was on the other end of a phone line, or hidden on the other side of a door, according to the classic terms in which you actually run the test, then sure.

Part of the reason Ava looks the way she looks is not to do with the plot at all; some of the reason is weird, and it’s not even to do with anything Nathan’s doing or planning. It’s to do with the film I was working on previously, Dredd, where the lead character constantly has his face hidden, or half hidden, by a helmet, a kind of mask he wears. On this film, as I started writing it, I was thinking, I really want the actor to have all of his or her face used, so it was written into the script that her face was indistinguishable from the face of a human. It was the rest of her body that portrayed the machine aspects, I guess.

The true answer is that some of it has nothing to do with the internal logic of the story; it’s more just to do with reacting against the thing you just worked on.

N: A big part of this, from Nathan’s perspective, is that he designed her to be female with functional, or at least semi-functional, sex organs because, he says, it’s fun. But that definitely has an effect on Caleb. Do you think it would have the same effect on anybody?

AG: But what Nathan is doing when he does that is winding Caleb up. There’s a kind of game going on in the narrative, which is: how much of what Nathan is doing is him letting the mask slip on what he’s actually like, and how much of it is a construct for the purposes of the psychological thing he’s doing to this man he’s brought into the house? He’s presenting himself in a very deliberate way as a threatening, intellectually bullying, misogynistic, predatory, and constantly and implicitly violent person, like he could be physically dangerous as well as intellectually dangerous. So when he’s doing that, part of what he’s doing is winding Caleb up.

One minute he’s talking about that sexual aspect of Ava, the next he’s saying something that makes it sound as if he’s being racist. He’s kind of pushing buttons in Caleb and then flattening him. It’s a setup and a knock-down. It’s more to do with that, in that particular thing.

N: The other part, where he readily accepts being called a god, or like a god: is that another part of him winding Caleb up?

AG: Again, he’s winding him up. He does this thing of saying that Caleb is very quotable, when Caleb is actually repeating other people’s quotes and lines, and then misquoting what Caleb said to him to big himself up, essentially portraying himself as a bit of a megalomaniac, comparing himself to God, basically.

But that is part of the wind-up, and then two thirds of the way through the film it happens again and Caleb wearily says, “No, you’re misquoting me again,” or “I’m quoting someone else again.” Nathan goes, “Yeah, yeah, yeah, come on, I know, it’s Oppenheimer,” and so the intention of that scene is that it’s supposed to be saying, “Look, come on, I knew what I was doing.” That’s part of the wind-up. I’m basically agreeing with you.


N: How close do you think we actually are to AI being indistinguishable from humanity?

AG: I think we’re not very close.

N: Okay [he says disappointedly].

AG: I think we’re a pretty long way away from it. I know that there are people out there like Kurzweil who’ve quantified it and said within twenty years we’ll be in position X or whatever; that may or may not be true. As far as I can tell it’s impossible to know, and there are certainly some pretty fundamental breakthroughs that need to happen before we’re anywhere close to that kind of stuff.

That said, if somebody said, “Make a bet: do you think it’s going to happen? Do you think we’re going to have self-aware, sentient machines?” then I would bet that we would. I think yes, that will happen, but it doesn’t feel imminent.

N: Do you think it’ll happen within our lifetimes?

AG: It’s impossible to say. The way I feel about it, which may or may not be true, is that it’s a bit like a cure for cancer: there are a lot of people working on it, and there are breakthroughs that happen, but with some of those breakthroughs you move forward and the goalposts shift at the same time. The goal recedes because the complexity of the problem has just been revealed as even harder than was previously conceived.

It’s slippery. It’s something that’s hard to put a time frame on. But by the way, what do I know? I’m like some bloke who just tried to make a movie about AI. I don’t have any special wisdom or knowledge about it.

N: The topic of AI is on a lot of people’s minds, and has been in a lot of films and television shows even recently; what fascinated you about it and why do you think so many people are fascinated all at once?

AG: What interested me is that if you make a film about strong AI, you’re also making a film about us and human consciousness, and the problems of one are related to the problems of the other. Just in sci-fi terms that’s a really great premise; that’s classic sci-fi in a way.

I don’t really think that’s why there are a lot of AI narratives around at the moment. My gut feeling about that, and it really is just a hunch (again, what do I know?), is that it’s got less to do with AI and more to do with an unconscious or semi-conscious fear of big tech companies and computers, and the sense that we don’t really know how our laptops work but our laptops seem to understand something about how we work, and that makes people feel uneasy.

Ex Machina is in select cinemas now and will open wider on April 24th. It’s really worth a look.
