# Final guide

You can bring a notecard (5in x 7in max) with notes or whatnot.

## Prolog

Given,

```prolog
member(X, [X|_]).
member(X, [_|Tail]) :- member(X, Tail).

foobar(X, List) :- member(Y, List), X \= Y.
```
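For intuition only, here is a hedged Python sketch of what the two Prolog predicates above compute. Prolog's backtracking enumerates solutions one at a time, which we mimic with generators; the names `member`, `foobar`, and `succeeds` mirror the Prolog but are otherwise my own illustration, not course code.

```python
def member(x, lst):
    """Yield once for every position of lst whose element equals x
    (mirrors member/2 succeeding once per matching position)."""
    for item in lst:
        if item == x:
            yield True

def foobar(x, lst):
    """Mirrors foobar/2: succeeds when some Y in lst satisfies X \\= Y,
    i.e., lst contains at least one element different from x."""
    for y in lst:
        if x != y:
            yield True

def succeeds(gen):
    """A query 'succeeds' if its generator yields at least once."""
    return next(gen, None) is not None
```

For example, `succeeds(member(2, [1, 2, 3]))` is `True` because the recursion eventually reaches a list whose head is 2.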


What is the result of each of these queries? (Write "true" or "false", or give one value of the variable X.)

```prolog
member(5, [1, 2, 3]).
member(X, [1, 2, 3]).
foobar(1, [1, 2, 3]).
foobar(1, [1, 1, 1]).
```


Given,

```prolog
family(10392,
       person(tom, fox, born(7, may, 1960), works(cnn, 152000)),
       person(ann, fox, born(19, april, 1961), works(nyu, 65000)),
       % here are the children...
       [person(pat, fox, born(5, october, 1983), unemployed),
        person(jim, fox, born(1, june, 1986), unemployed),
        person(amy, fox, born(17, december, 1990), unemployed)]).

exists(Person) :- family(_, Person, _, _).
exists(Person) :- family(_, _, Person, _).
exists(Person) :- family(_, _, _, Children), member(Person, Children).

foo(x, y).
foo(a, y).
foo(b, c).

bar(x, w).
bar(x, x).
bar(c, a).
```


What is the result of each of these queries? (Write "true" or "false", or give one value for each of the variables.) Note: if the result is a list, order matters.

```prolog
family(_, person(_, _, born(_, _, Year), _), _, _), Year > 1960.

% don't forget to give one value for each variable:
% FirstName, LastName, and X
exists(person(FirstName, LastName, _, X)), X \= unemployed.

% just give the value of Stuff
findall(Z, (foo(Z, y), bar(X, Z)), Stuff).
```


Do the following pairs unify? If so, and if any variables are involved, give one set of variable bindings that makes the unification work.

• `foo` and `foo`
• `foo(X)` and `X`
• `foo(foo(foo))` and `foo(X)`
• `foo(X, foo)` and `foo(foo, X)`
• `foo(X, foo)` and `foo(Y)`
• `foo(foo(X, Y))` and `foo(Z)`

Suppose we have the following knowledge base:

```prolog
f(a).
f(b).

g(a, a).
g(b, c).

h(b).
h(c).

k(X, Y) :- f(X), g(X, Y), h(Y).
```


Draw the resolution tree for the first successful proof of k(X, c).

## Planning

### General Problem Solver

Modify the following "program" so that GPS can find the solution.

```python
problem = {
    "start": ["door locked"],
    "finish": ["door open"],
    "ops": [
        {
            "action": "open door",
            "preconds": ["door unlocked", "door closed"],
            "add": ["door open"],
            "delete": ["door closed"]
        },
        {
            "action": "unlock door",
            "preconds": ["door closed"],
            "add": [],
            "delete": ["door locked"]
        }
    ]
}
```
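As a reminder of the operator mechanics (check `preconds`, apply `add` and `delete`), here is a hedged sketch of a planner over problems in this format. It is a simple breadth-first forward search, a stand-in for GPS's means-ends analysis rather than the course's implementation, and the function name `plan` is my own.

```python
from collections import deque

def plan(problem):
    """Breadth-first search over states (frozensets of condition strings).
    An op applies when all its preconds hold in the current state."""
    start = frozenset(problem["start"])
    goal = set(problem["finish"])
    frontier = deque([(start, [])])
    seen = {start}
    while frontier:
        state, actions = frontier.popleft()
        if goal <= state:
            return actions  # list of action names, in order
        for op in problem["ops"]:
            if set(op["preconds"]) <= state:
                nxt = frozenset((state - set(op["delete"])) | set(op["add"]))
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, actions + [op["action"]]))
    return None  # no plan exists
```

Running `plan` on the problem as given above returns `None`, which is one way to check whether your modification actually makes a solution reachable.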


### PDDL

Given the following domain and problem files, write a successful plan.

```lisp
;; bogus-domain.pddl

(define (domain bogus)
  (:requirements :strips)
  (:predicates (baz ?x ?y)
               (quux ?x))
  (:action do-abc
    :parameters (?a ?b ?c)
    :precondition (baz ?a ?a)
    :effect (and (baz ?a ?b) (quux ?c) (not (baz ?b ?b))))
  (:action do-xyz
    :parameters (?a ?b)
    :precondition (and (quux ?b) (baz ?a ?b))
    :effect (and (baz ?b ?b) (not (quux ?a)))))

;; bogus-prob1.pddl

(define (problem bogus-prob1)
  (:domain bogus)
  (:objects r s t)
  (:init (baz r s) (quux s))
  (:goal (and (quux t) (baz s r))))
```


## Multi-agent systems

What are two principles for designing agent-based simulations?

## Learning

Describe the difference between "supervised" and "unsupervised" learning.

### k-means clustering

Is k-means clustering a supervised or an unsupervised learning strategy?

What does the choice of $$k$$ represent?

How does the choice of initial clusters affect the outcome?

Note: also be able to perform k-means clustering on some data as in Homework 8.
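For practice, here is a minimal k-means sketch in plain Python (Euclidean distance, fixed number of iterations). It is my own illustration under those assumptions; Homework 8's exact procedure and stopping rule may differ.

```python
import math

def kmeans(points, centroids, iterations=10):
    """points and centroids are lists of equal-length numeric tuples;
    the initial centroids are the caller's choice of k starting points."""
    for _ in range(iterations):
        # assignment step: group each point with its nearest centroid
        clusters = [[] for _ in centroids]
        for p in points:
            nearest = min(range(len(centroids)),
                          key=lambda i: math.dist(p, centroids[i]))
            clusters[nearest].append(p)
        # update step: move each centroid to the mean of its cluster
        # (an empty cluster keeps its previous centroid)
        centroids = [tuple(sum(dim) / len(cluster) for dim in zip(*cluster))
                     if cluster else centroids[i]
                     for i, cluster in enumerate(clusters)]
    return centroids
```

Note how the result depends on the `centroids` argument: different initial clusters can converge to different final groupings, which is the point of the question above.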

### Classification evaluation

Define true positive (TP). Define false positive (FP). Define false negative (FN). Define precision (in terms of TP and/or FP and/or FN). Define recall (in terms of TP and/or FP and/or FN). Define F-score (in terms of precision and/or recall).
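The standard definitions can be checked against this hedged Python sketch (the helper name `prf` and the pairing of true/predicted label lists are my own framing, not course code):

```python
def prf(true_labels, predicted, positive):
    """Count TP/FP/FN for the given positive class, then compute
    precision, recall, and F-score from those counts."""
    pairs = list(zip(true_labels, predicted))
    tp = sum(1 for t, p in pairs if p == positive and t == positive)
    fp = sum(1 for t, p in pairs if p == positive and t != positive)
    fn = sum(1 for t, p in pairs if p != positive and t == positive)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f_score = 2 * precision * recall / (precision + recall)
    return precision, recall, f_score
```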

Suppose we make our classification engine more cautious; that is, it is less likely overall to predict any category. Does precision go up or down or remain unchanged? Does recall go up or down or remain unchanged?

What are the precision and recall for the following scenario:

• The true categories for some data points are:
• {noise, noise, signal, noise, signal}
• The predicted categories for the data are (same ordering of the data points):
• {noise, signal, signal, signal, noise}

Consider "signal" to be a "positive" claim and "noise" to be a "negative" claim.

What does "10-fold cross validation" mean?

### k-nearest neighbor

What does k-nearest neighbor allow us to do with a new, unknown data point?

Is k-nearest neighbor a supervised or an unsupervised learning strategy?

What does the choice of $$k$$ represent?

What problem may a very small value of $$k$$ cause?

What problem may a very large value of $$k$$ cause?

Is there one value for $$k$$ that works best for nearly all data sets? If so, what is it?

Give one benefit of k-nearest neighbor learning.

Give one drawback of k-nearest neighbor learning.

Note: also be able to perform k-nearest neighbor classification on some data as in Homework 8.
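For practice, here is a minimal k-nearest-neighbor classifier sketch (Euclidean distance, majority vote among the $$k$$ nearest training points). The helper name `knn_classify` and the `(point, label)` representation are my own assumptions, not Homework 8's code.

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """train is a list of (point, label) pairs; point and query are
    equal-length numeric tuples. Returns the majority label among the
    k training points nearest to query."""
    nearest = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]
```

Trying different values of `k` on a small data set is a quick way to see the small-`k` versus large-`k` trade-off asked about above.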

## Philosophy

### The "Chinese room" argument

What is the essential goal of "strong AI?"

What is the most critical assumption in the Chinese room argument?

If you believe the Chinese room argument, can you also (reasonably) believe that passing the Turing test gives proof that a machine possesses a mind (i.e., can be said to truly understand things)?

## Extra credit

What is bigger than an ant?

Intro to AI material by Joshua Eckroth is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.