Artificial Intelligence and Computing Science October 19, 2012
AI in the Seventies and Eighties A belated realization: in most situations, intelligent behaviour requires knowledge
How many things do you know? ``I can’t think of a hundred things that I know a thousand things about’’ (Marvin Minsky, `Semantic Information Processing’, 1968)
AI in the Seventies
Winograd, SHRDLU
Expert Systems
Minsky, `Frames’
Rumelhart and McClelland, `Parallel Distributed Processing’ (return of the Perceptron)
SHRDLU’s World
Sample Dialogue
Winograd: What’s on the green block?
Shrdlu: A pyramid
Winograd: Put the red block on the green block
Shrdlu: OK
Winograd: Why did you move the purple pyramid?
Shrdlu: To clear off the green block
Winograd: Why did you put the red block on the green block?
Shrdlu: You told me to
Winograd: Pick up a big red block
Shrdlu: OK
Winograd: Grasp the pyramid
Shrdlu: I don’t understand which pyramid you mean
Winograd: Find a block which is taller than the one you are holding and put it into the box
Shrdlu: By `it’, I assume you mean the block which is taller than the one I am holding. OK.
Winograd: What does the box contain?
Shrdlu: The blue pyramid and the blue block
Winograd: What is the pyramid supported by?
Shrdlu: The box
Winograd: Can the table pick up blocks?
Shrdlu: No
Things SHRDLU doesn’t know:
`red’ and `green’ are colours
SHRDLU’s world is flat
A table has legs but no arms
SHRDLU is a computer program
Expert Systems (rule-based)
Knowledge can be represented by a number of `if…then’ rules plus an inference engine. E.g., ``IF temperature is high AND rash is present, THEN patient has measles.’’
We can extract the rules from human experts via interviews. This process is known as `knowledge engineering’:
`If an animal has fur, it is a mammal’
`If an animal has feathers, it is a bird’
`If an animal is a bird, it can fly’
`If an animal has scales, it is a fish’
`If an animal is a fish, it can swim’
`If an animal lays eggs and has fur, it is a duck-billed platypus’
This gives us a set of rules that an inference engine (or `expert system shell’) can reason about. Two popular modes of reasoning are forward chaining and backward chaining:
`If an animal has fur, it is a mammal’
`If an animal has feathers, it is a bird’
`If an animal is a bird, it can fly’
`If an animal is a bird, it lays eggs’
`If an animal has scales, it is a fish’
`If an animal is a fish, it can swim’
`If an animal lays eggs and has fur, it is a duck-billed platypus’
Forward chaining: Given a new fact (`Tweety has feathers’), search for all matching conditionals, draw all possible conclusions, and add them to the knowledge base:
⇒ Tweety is a bird
⇒ Tweety can fly
⇒ Tweety lays eggs
Potential problem: we run into the combinatorial explosion again
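A minimal sketch of this kind of forward chaining (my own illustration, not part of the original slides; the rule encoding and the function name are assumptions) might look like this in Python:

```python
# Minimal forward-chaining sketch: each rule is (set of conditions, conclusion).
RULES = [
    ({"has feathers"}, "is a bird"),
    ({"is a bird"}, "can fly"),
    ({"is a bird"}, "lays eggs"),
    ({"has fur"}, "is a mammal"),
    ({"has scales"}, "is a fish"),
    ({"is a fish"}, "can swim"),
    ({"lays eggs", "has fur"}, "is a duck-billed platypus"),
]

def forward_chain(facts):
    """Fire every rule whose conditions are satisfied, add the conclusions,
    and repeat until no new facts appear."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain({"has feathers"}))
# -> {'has feathers', 'is a bird', 'can fly', 'lays eggs'}
```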
Backward chaining: Given a query (`Does Tweety lay eggs?’), search for all matching consequents and see if the database satisfies the conditionals:
`If an animal has fur, it is a mammal’
`If an animal has feathers, it is a bird’
`If an animal is a bird, it can fly’
`If an animal is a bird, it lays eggs’
`If an animal has scales, it is a fish’
`If an animal is a fish, it can swim’
`If an animal lays eggs and has fur, it is a duck-billed platypus’
`Tweety has feathers’
Backward chaining, step by step:
`Does Tweety lay eggs?’ matches the consequent of `If an animal is a bird, it lays eggs’, so ask: `Is Tweety a bird?’
`Is Tweety a bird?’ matches the consequent of `If an animal has feathers, it is a bird’, so ask: `Does Tweety have feathers?’
`Does Tweety have feathers?’ is satisfied directly by the database fact `Tweety has feathers’
Backward chaining, conclusion: Yes, Tweety does lay eggs. This method is used by Prolog, for example.
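Backward chaining can be sketched in the same style, reusing the hypothetical RULES list from the forward-chaining sketch above; again, this is my own illustration rather than anything from the slides:

```python
def backward_chain(goal, facts):
    """Can `goal` be established from `facts` and RULES?
    Work backwards: find rules whose conclusion is the goal and
    try to establish each of their conditions recursively."""
    if goal in facts:
        return True
    for conditions, conclusion in RULES:
        if conclusion == goal and all(backward_chain(c, facts) for c in conditions):
            return True
    return False

print(backward_chain("lays eggs", {"has feathers"}))   # True
print(backward_chain("can swim", {"has feathers"}))    # False
```

Real Prolog adds unification over variables and copes with much larger rule bases, but the control flow is essentially this.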
`If an animal has feathers, it is a bird’
`If an animal is a bird, it can fly’
`If an animal is a bird, it lays eggs’
Potential problem: a lot of rules have exceptions (a penguin is a bird, but it cannot fly).
Frames (Marvin Minsky, 1974)
A frame allows us to fill in default knowledge about a situation from a partial description. For example, ``Sam was hungry. He went into a McDonald’s and ordered a hamburger. Later he went to a movie.’’ Did Sam eat the hamburger?
So we can economically represent knowledge by defining properties at the most general level, then letting specific cases inherit those properties…
Event → Transaction → Buying something → Buying a hamburger
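As an illustrative sketch (mine, not from the slides), this kind of inheritance of default properties maps directly onto a class hierarchy; the attribute names below are invented for the example:

```python
# Defaults are defined once at the most general level and inherited downwards.
class Event:
    has_time = True
    has_place = True

class Transaction(Event):
    involves_money = True

class BuyingSomething(Transaction):
    buyer_gets_goods = True

class BuyingAHamburger(BuyingSomething):
    item = "hamburger"
    item_is_eaten = True        # default: the buyer normally eats it

sam_lunch = BuyingAHamburger()
print(sam_lunch.has_time, sam_lunch.involves_money, sam_lunch.item_is_eaten)
# -> True True True   (all filled in from defaults up the hierarchy,
#    which is why we assume Sam ate the hamburger unless told otherwise)
```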
Return of the perceptron (now called a `neural net’)
Changes since 1969:
- Hidden layers
- Non-linear activation function
- Back-propagation allows learning
Rumelhart and McClelland, `Parallel Distributed Processing’: use neural nets to represent knowledge by the strengths of associations between different concepts, rather than as lists of facts, yielding programs that can learn from example.
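To make the three changes concrete, here is a minimal sketch (my own, assuming NumPy is available) of a one-hidden-layer network with sigmoid activations trained by back-propagation on XOR, a function no single-layer perceptron can represent:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR is not linearly separable, so a 1969-style perceptron fails on it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input  -> hidden layer
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output

lr = 0.5
for _ in range(10000):
    # Forward pass through the non-linear hidden layer.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Back-propagation: push the output error back through both layers.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

print(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).round(2).ravel())
# typically converges to roughly [0, 1, 1, 0]
```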
Conventional Computer Memory
Register One: 0110
Register Two: 11100110
Register Three: 00101101
…
AI: 1979–2000
Douglas Lenat, `CYC’
Douglas Hofstadter, `Fluid Analogies’
Patrick Hayes, `Naïve Physics’
CYC’s data are written in CycL, which is a descendant of Frege’s predicate calculus (via Lisp). For example,
(#$isa #$BarackObama #$UnitedStatesPresident)
or
(#$genls #$Mammal #$Animal)
The same language gives rules for deducing new knowledge:
(#$implies
  (#$and (#$isa ?OBJ ?SUBSET) (#$genls ?SUBSET ?SUPERSET))
  (#$isa ?OBJ ?SUPERSET))
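Read informally, the rule says: if ?OBJ is an instance of ?SUBSET and ?SUBSET is a kind of ?SUPERSET, then ?OBJ is an instance of ?SUPERSET. A small Python sketch of that single inference (my own illustration, over a made-up miniature knowledge base) might look like this:

```python
# Tiny knowledge base of (predicate, arg1, arg2) triples, in the spirit of CycL.
kb = {
    ("isa", "BarackObama", "UnitedStatesPresident"),
    ("genls", "UnitedStatesPresident", "Person"),
    ("genls", "Person", "Animal"),
}

def apply_isa_genls_rule(kb):
    """One pass of: (isa ?OBJ ?SUB) and (genls ?SUB ?SUPER) => (isa ?OBJ ?SUPER)."""
    new = set()
    for p1, obj, sub in kb:
        if p1 != "isa":
            continue
        for p2, s, sup in kb:
            if p2 == "genls" and s == sub:
                new.add(("isa", obj, sup))
    return new - kb

# Iterate until no new facts appear: isa facts propagate up the genls hierarchy.
while True:
    derived = apply_isa_genls_rule(kb)
    if not derived:
        break
    kb |= derived

print(sorted(f for f in kb if f[0] == "isa"))
# BarackObama ends up isa Person and, after a second pass, isa Animal.
```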
What Cycorp says CYC knows about `intangible things’: Intangible Things are things that are not physical, that is, not made of, or encoded in, matter. These include events, like going to work, eating dinner, or shopping online. They also include ideas, like those expressed in a book or on a website: not the physical books themselves, but the ideas expressed in those books. It is useful for a software application to know that something is intangible, so that it can avoid commonsense errors, like, for example, asking a user the color of next Tuesday's meeting.
Questions CYC couldn’t answer in 1994 (Prof. Vaughan Pratt):
What colour is the sky?
What shape is the Earth?
If it’s 20 km from Vancouver to Victoria, and 20 km from Victoria to Sydney, can Sydney be 400 km from Vancouver?
How old are you?
Hofstadter: Fluid Analogies Human beings can understand similes, such as ``Mr Pickwick is like Christmas’’
Example: Who is the Michelle Obama of Canada?
Michaëlle Jean, Governor General
Head of government Spouse
Head of State Spouse
One of Hofstadter’s approaches to solving these problems is `Copycat’, a collection of independent competing agents.
If efg becomes efw, what does ghi become?
If aabc becomes aabd, what does ijkk become?
Inside Copycat: competing groupings of `ijkk’, such as ij(kk) and (ijk)k, give competing candidate answers to `aabc becomes aabd’: ij(ll), (ijk)l, jjkk, hjkk
If efg becomes efw, what does ghi become? COPYCAT suggests whi and ghw.
If aabc becomes aabd, what does ijkk become? COPYCAT suggests ijll, ijkl, jjkk, and hjkk.
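Copycat itself is a large society of competing agents and is not reproduced here; as a contrast, here is a deliberately naive sketch (my own) that reads the aabc → aabd example as a literal positional rule and can therefore only ever produce one rigid answer:

```python
def literal_rule(before, after):
    """Find the single position that changed and by how much (a rigid, literal reading)."""
    diffs = [i for i, (x, y) in enumerate(zip(before, after)) if x != y]
    i = diffs[0]
    shift = ord(after[i]) - ord(before[i])
    return i, shift

def apply_rule(word, rule):
    i, shift = rule
    return word[:i] + chr(ord(word[i]) + shift) + word[i + 1:]

rule = literal_rule("aabc", "aabd")
print(apply_rule("ijkk", rule))
# -> 'ijkl': the rigid answer; this approach can never find jjkk, hjkk, or ijll
```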
Hofstadter: ``What happens in the first 500 milliseconds? ”
Find the O
XXXXXXXXXXX XXXXOXX XXXXXX
Find the X
XXXXXXXXXXX XXXXXX
Find the O
XXOXXOX XXXXXOXXOXX XXXOXXOXOXX XXOXXOXXOXX OXXXXXOXXOX
What eye sees What I see
The Cutaneous Rabbit
Naive model of perception: World → Vision → Awareness
Better model of perception: World → Vision and Knowledge → Awareness
We recognise all these as instances of the letter `A’. No computer can do this.
Hofstadter’s program `Letter Spirit’ attempts to design a font.
Naïve Physics
Hayes, `Naïve Physics Manifesto’, 1978: ``About an order of magnitude more work than any previous AI project…’’
Hayes, `Second Naïve Physics Manifesto’, 1985: ``About two or three orders of magnitude more work than any previous AI project…’’
One sub-project of naïve physics: Write down what an intelligent 10-year-old knows about fluids
Part of this is knowing how we talk about fluids: For example: Suppose Lake Chad dries up in the dry season and comes back in the wet season. Is it the same lake when it comes back?
Suppose I buy a cup of coffee, drink it, then get a free refill. Is it the same cup of coffee after the refill?
2011: IBM’s Watson Wins Jeopardy
Inside Watson:
4 terabytes of disk storage: 200 million pages (including all of Wikipedia)
16 terabytes of RAM
90 eight-core 3.5-GHz processors
One of the components of Watson is a Google-like search algorithm. For example, a typical Jeopardy question in the category `American Presidents’ might be ``The father of his country, he didn’t really chop down a cherry tree.’’ Try typing `father country cherry tree’ into Google: the first hit is `George Washington – Wikipedia’. But Watson also needs to know how confident it should be in its answers.
Conspicuous Failures, Invisible Successes In 2012, we have nothing remotely comparable to 2001’s HAL. On the other hand, some complex tasks, such as attaching a printer to a computer, have become trivially easy
A different approach: robot intelligence Grey Walter’s machina speculatrix, 1948
BEAM robotics, Queen Ant, a light-seeking hexapod, 2009
AI Now: Robot Intelligence
Rodney Brooks, `Cambrian Intelligence’:
- Complex behaviour can arise when a simple system interacts with a complex world.
- Intelligent behaviour does not require a symbolic representation of the world.
SPIRIT: Two years on Mars and still going.
Brooks’s approach invites us to reconsider our definition of intelligence: …is it the quality that distinguishes Albert from Homer?
…or the quality that distinguishes Albert and Homer from a rock?
`Chess is the touchstone of intellect’ -- Goethe …but perhaps we are most impressed by just those of our mental processes that move slowly enough for us to notice them…
Strong AI: ``We can build a machine that will have a mind. ’’ Weak AI: ``We can build a machine that acts like it has a mind. ’’
Strong AI (restatement): ``We can build a machine that, solely by virtue of its manipulation of formal symbols, will have a mind. ’’
Hans Moravec: ``We will have humanlike competence in a $1,000 machine in about forty years.’’ (1998)
Hubert Dreyfus: ``No computer can ever pass the Turing Test, or do any of the following things [long list, including `play master-level Chess’].’’ (1965; MacHack beat Dreyfus in 1967)
…and if a program did pass the Turing test, what then?
John Searle: ``Even if a computer did pass the Turing test, it would not be intelligent, as we can see from the Chinese Room argument’’
John Bird: ``A computer is a deterministic system, and hence can have neither free will, nor responsibility, nor intelligence, whether it passes the Turing test or not.
This is an AND gate: inputs A and B, output C.
A B | C
0 0 | 0
0 1 | 0
1 0 | 0
1 1 | 1
Given A and B, does the computer have any choice about the value of C?
… but a computer is just a collection of AND gates and similar components. If none of these components can make a free choice, the computer cannot make a free choice. ’’
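To make the determinism point concrete, here is a trivial sketch (my own illustration) of an AND gate and a small circuit composed from such gates; in both cases the output is a pure function of the inputs:

```python
def and_gate(a: int, b: int) -> int:
    """A deterministic AND gate: the output is fixed entirely by the inputs."""
    return a & b

def xor_gate(a: int, b: int) -> int:
    return a ^ b

def half_adder(a: int, b: int):
    """Composing deterministic gates yields an equally deterministic circuit."""
    return xor_gate(a, b), and_gate(a, b)   # (sum, carry)

# Every run of the truth table gives exactly the same result.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", and_gate(a, b), half_adder(a, b))
```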
The Brazen Head