Anticipation (artificial intelligence)


In artificial intelligence, anticipation occurs when an agent makes decisions based on its explicit beliefs about the future. More broadly, "anticipation" can also refer to the ability to act in ways that appropriately take future events into account, without necessarily possessing an explicit model of those future events.

In AI

An agent employing anticipation tries to predict the future state of the environment and to use those predictions in its decision making. For example:

    If the sky is cloudy and the air pressure is low,
    then it will probably rain soon,
    so take the umbrella with you;
    otherwise, leave the umbrella at home.

Such a rule explicitly takes possible future events into account.
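A minimal sketch of such an anticipatory rule in Python; the function name, inputs, and the pressure threshold are illustrative assumptions, not part of any standard:

```python
def decide_umbrella(sky_is_cloudy: bool, air_pressure_hpa: float) -> str:
    """Anticipatory rule: act now on a prediction about a future state.

    The 1005 hPa threshold is an illustrative assumption, not a
    meteorological standard.
    """
    # Predict the future state of the environment ...
    rain_likely = sky_is_cloudy and air_pressure_hpa < 1005.0
    # ... and use that prediction in the current decision.
    return "take the umbrella" if rain_likely else "leave the umbrella at home"

print(decide_umbrella(True, 998.0))    # cloudy sky, low pressure
print(decide_umbrella(False, 1020.0))  # clear sky, high pressure
```

The decision made now ("take the umbrella") is justified only by the predicted later state ("it will probably rain"), which is what distinguishes an anticipatory rule from a purely reactive one.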
In 1985, Robert Rosen defined an anticipatory system as a system containing a predictive model of itself and/or of its environment, which allows the system to change state at an instant in accord with the model's predictions pertaining to a later instant.
To some extent, Rosen's definition of anticipation applies to any system incorporating machine learning. At issue is how much of a system's behaviour should or indeed can be determined by reasoning over dedicated representations, how much by on-line planning, and how much must be provided by the system's designers.

In animals

Humans can make decisions based on explicit beliefs about the future. More broadly, animals can act in ways that appropriately take future events into account without necessarily having an explicit cognitive model of the future; evolution may instead have shaped simpler systemic mechanisms that produce adaptive anticipatory behavior within a narrow domain. For example, hibernation is anticipatory behavior, but it does not appear to be driven by a cognitive model of the future.