Model of computation


In computability theory and computational complexity theory, a model of computation defines the set of allowable operations used in computation, together with their respective costs.

Description

A model of computation is used to measure the complexity of an algorithm in terms of execution time and/or memory space.

By assuming a certain model of computation, it is possible to analyze the computational resources required or to discuss the limitations of algorithms or computers.
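For instance, under a simple unit-cost comparison model (an assumption for illustration, where each comparison counts as one step and everything else is free), the cost of two search algorithms can be compared directly. This is a minimal sketch, not a definitive formalization:

```python
def linear_search(items, target):
    """Return (index, cost), where cost counts comparisons (1 each)."""
    cost = 0
    for i, x in enumerate(items):
        cost += 1  # one comparison under the unit-cost model
        if x == target:
            return i, cost
    return -1, cost

def binary_search(items, target):
    """Return (index, cost) for a sorted list; one cost unit per probe."""
    lo, hi, cost = 0, len(items) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        cost += 1  # one probe (comparison) under the unit-cost model
        if items[mid] == target:
            return mid, cost
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, cost

data = list(range(1024))
_, linear_cost = linear_search(data, 1023)   # worst case: n comparisons
_, binary_cost = binary_search(data, 1023)   # roughly log2(n) comparisons
```

Changing the model changes the analysis: if memory accesses were charged instead of comparisons, or if the machine were a Turing machine rather than a random-access machine, the same algorithms would be assigned different costs.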
