Dec 16, 2014

Bella

Elementary 3×4

& Holmes: Part of our arrangement, you will recall, is that you assist with quotidian matters when I’m busy.

& Holmes: Bunkum.

& Holmes: And how, pray tell...

& Holmes: Something called a «Turing Test» is routinely administered in AI labs. An examiner sits in front of a computer and poses a series of questions until he or she can determine whether the answers have come from a human or from a computer. Until very recently, no machine had ever passed.
    Kitty: And now one has?
    Holmes: Partially. A program named «Eugene Goostman» was designed to mimic the responses of a 13-year-old boy from Eastern Europe. It fooled some of its examiners. Now, in the case of Bella, we know there’s no human in the equation, so if I can trap it into giving responses that couldn’t possibly have come from a human... I’ll have won.
    Kitty: So, in other words, you don’t really know what you’re going to do.

& Bella: I don’t understand the question. Could you tell me more?
Holmes: Yes, so you keep saying, ad nauseam.

& Holmes: Why don’t you just admit that all you’re doing is following your own programming?
    Bella: I don’t understand the question. Could you tell me more?

& Holmes: Love... Surely it’s a human construct, a hedge against the terror of mortality. I believe that. But that doesn’t account for the times I’ve felt it myself. With my mother. Irene. Even, after a fashion, with Watson. It vexes. Love is either a human construct or it’s a real thing, right?.. I know, you need more information.

& Holmes: Watson, this is Mason. Like many of his generation, he’s named after a profession his parents would never deign to practice: Hunter, Tanner, Cooper, Mason, so forth.
    Mason: Yeah, like «Sherlock» is such a great name.

& Holmes: His comments were quite salient. He is not an unintelligent man.


& Mason: Everybody knows that one day intelligent machines are going to evolve to hate us. It’s a «button-box» thing.
    Kitty: What’s the «button-box» thing?
    Mason: It’s a scenario somebody blue-skyed at an AI conference. Um, imagine there’s a computer that’s been designed with a big red button on its side. The computer’s been programmed to help solve problems, and every time it does a good job, its reward is that someone presses its button. We’ve programmed it to want that. You follow?
    Kitty: Sure.
    Mason: Right, so at first, the machine solves problems as fast as we can feed them to it. But over time, it starts to wonder if solving problems is really the most efficient way of getting its button pressed. Wouldn’t it be better just to have someone standing there pressing its button all the time? Wouldn’t it be even better to build another machine that could press its button faster than any human possibly could?
    Kitty: It’s just a computer, it can’t ask for that.
    Mason: Well, sure it can. If it can think, and it can connect itself to a network, well, theoretically, it could take command of anything else that’s hooked up to the same network. And once it starts thinking about all the things that might be a threat to the button— number one on that list, us— it’s not hard to imagine it getting rid of the threat. I mean, we could be gone, all of us, just like that.
    Watson: That escalated quickly.
    Mason: Well, it would escalate quickly. I mean, they’re computers. They can’t be reasoned with. They don’t feel pity, or remorse, or fear. And they absolutely would not stop, not ever, until we’re dead.
    Watson: That was from The Terminator. He’s quoting The Terminator now.
    Mason: So? That was a very prescient movie in a lot of ways.
    Holmes: Mason, unless the machines rise tonight, we have a murder to solve.

& Andrew: Funny how life works out, huh? Holmes puts me on an e-mail chain, and 36 hours later I’m... Got a ticket to Copenhagen.

& Watson: Why are you listening to this? Is this even music?
    Holmes: Death metal.

& Watson: How could anyone work with this on?
    Holmes: Rather like rubbing a belt sander over one’s brain. There are moments when that’s necessary. Who wrote this one?
    Kitty: «Goatwhore.»

& Holmes: All right, so Michael Webb is either an imbecile, or he’s a genius at impersonating imbeciles.

& Holmes: Not as barmy as it sounded.

& Holmes: ......And so, no, I’ve got no desire to banish the man to Scandi-bloody-navia...
    Watson: ...Okay, I believe you. Kind of feel like hugging you right now.
    Holmes: Yet, as my friend, you know that would be a rash decision.

& Holmes: That’s a problem. But not an insurmountable one.

--
On the IMDb
