Model of computation


In computability theory and computational complexity theory, a model of computation defines the set of allowable operations used in a computation and their respective costs.

It is used for measuring the complexity of an algorithm in execution time and memory space: by assuming a certain model of computation, it is possible to analyze the computational resources an algorithm requires, or to discuss the limitations of algorithms or computers.
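For instance, under the commonly assumed unit-cost random-access machine (RAM) model, each basic operation, such as a comparison, is charged one unit of time. The following Python sketch (a minimal illustration; the function name and cost accounting are chosen for this example) counts the comparisons performed by a linear search, showing that its worst-case cost under this model equals the input length, i.e. O(n).

    # A minimal sketch: counting unit-cost operations under an assumed
    # random-access machine (RAM) model, where each comparison costs 1.

    def linear_search(items, target):
        """Return (index, cost): the index of target (or -1 if absent)
        and the number of comparisons, the resource this model charges."""
        cost = 0
        for i, item in enumerate(items):
            cost += 1                 # one unit-cost comparison
            if item == target:
                return i, cost
        return -1, cost

    # Worst case: the target is absent, so cost equals len(items),
    # i.e. O(n) comparisons under the unit-cost RAM model.
    index, cost = linear_search([3, 1, 4, 1, 5, 9], 7)
    print(index, cost)  # -1 6

Changing the model changes the analysis: the same search charged by memory accesses, or run on a single-tape Turing machine, would be assigned different costs.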
