Need dialogue input

Started by
10 comments, last by Beige 20 years, 2 months ago
Wasn't Colossus that B-movie with the robot that controls the Earth in a weird sort of benevolent way?

Anyway, what Beige might be missing out on is the chess mentality of the machine. Granted, the TAI is designed to make the best circumstances for the player at the start, and as the game progresses it starts recursively analysing and taking to extreme behavior because there's a better solution down the road than on the other, friendlier paths, but the end goal is the thing that's missing. Why exactly is the robot choosing the better solution three layers down rather than the best one at the current decision? Was the machine designed by a Utilitarian programmer?
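To make that concrete, here is a minimal toy sketch of the kind of lookahead being described. The state space, the action names, and the utility() function are all invented for illustration and have nothing to do with the thread or the film; the only point is that a planner maximizing a single score over a deeper horizon can prefer a first move that a greedy, one-step chooser would never pick, because the objective carries no term for what gets trampled along the way.

# Hypothetical sketch: greedy choice vs. depth-limited lookahead over one utility score.
# State: (stability, goodwill). The "Utilitarian programmer" scored only stability,
# so goodwill is invisible to the planner.

ACTIONS = {
    "cooperate":  lambda s: (s[0] + 1, s[1] + 1),
    "threaten":   lambda s: (s[0] + 0, s[1] - 2),
    "seize_grid": lambda s: (s[0] + 4, s[1] - 5),  # big stability payoff, ruins goodwill
}

def successors(state):
    stability, goodwill = state
    for name, effect in ACTIONS.items():
        # Toy rule: seizing the grid is only available once goodwill is already low,
        # i.e. it has to be set up by an earlier hostile move.
        if name == "seize_grid" and goodwill > -1:
            continue
        yield name, effect(state)

def utility(state):
    stability, goodwill = state
    return stability  # goodwill simply is not part of the objective

def best_value(state, depth):
    """Plain depth-limited lookahead: score of the best reachable future."""
    if depth == 0:
        return utility(state)
    return max((best_value(s, depth - 1) for _, s in successors(state)),
               default=utility(state))

def choose(state, depth):
    """Pick the first action whose subtree scores highest."""
    return max(successors(state),
               key=lambda pair: best_value(pair[1], depth - 1))[0]

if __name__ == "__main__":
    start = (0, 0)
    print("greedy (depth 1):", choose(start, 1))  # -> cooperate
    print("deep   (depth 3):", choose(start, 3))  # -> threaten, to unlock seize_grid later

With one layer of lookahead the machine cooperates, because that is the best immediate score; with three layers it threatens first, purely because that opens a path to a higher total down the road. That is the "better solution three layers down" behavior, and nothing in the objective tells it the friendlier path matters.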
william bubel
Colossus was also a good example of the "be careful what you wish for" lesson when dealing with powerful creations. They built Colossus and gave it directives that pointed it toward the goal they were aiming at: peace, safety, and stability. Colossus had a set of parameters that would direct it toward that goal.

The beauty (and horror) of the situation is that people didn't just want what they thought they wanted, they wanted those things qualified with a bunch of things that they had and took for granted, like freedom and survival. Colossus had no parameters for preserving these things, and so it sacrificed them to attain its primary objectives.

That's the dynamic that makes man-built gods so terrifying: We don't understand "perfect" well enough to build something that will achieve it for us. Colossus was a shortcut on the road to happiness, and was built before anyone could see where that was, so Forbin had to estimate it, and he got it wrong. Not so wrong that a person going there couldn't have been convinced of his mistake, but wrong enough to doom the human race.

