Model of computation

In computability theory and computational complexity theory, a model of computation is the definition of the set of allowable operations used in computation and their respective costs.

It is used for measuring the complexity of an algorithm in execution time and/or memory space: by assuming a certain model of computation, it is possible to analyze the computational resources an algorithm requires, or to discuss the limitations of algorithms or computers.
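For example, here is a minimal sketch (illustrative, not from the original article) assuming the unit-cost RAM model, in which each comparison is charged one step. Counting those steps is what it means to measure execution time under an assumed model; the function and variable names below are hypothetical.

    def linear_search(items, target):
        """Search items for target, charging one unit-cost step per comparison."""
        steps = 0
        for i in range(len(items)):
            steps += 1              # unit-cost RAM: one step per comparison
            if items[i] == target:
                return i, steps
        return -1, steps

    # Worst case: the target is absent, so all n comparisons are charged.
    index, steps = linear_search([3, 1, 4, 1, 5], 9)
    print(index, steps)             # prints: -1 5

In the worst case all n comparisons are charged, giving O(n) time under this model; a model with different allowable operations or costs could yield a different bound for the same algorithm.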

See also

External links