Model of computation
In computability theory and computational complexity theory, a model of computation defines the set of allowable operations used in a computation and their respective costs.
Description
A model of computation is used to measure the complexity of an algorithm in execution time and/or memory space: by assuming a particular model of computation, it is possible to analyze the computational resources an algorithm requires, or to discuss the limitations of algorithms or computers.
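For example, under the commonly assumed unit-cost random-access machine (RAM) model, each elementary step such as a comparison or assignment is charged one unit of time. The following Python sketch is illustrative only (the function name and step-counting scheme are assumptions for this example, not a standard API); it shows how counting unit-cost operations yields a running-time bound:

# A minimal sketch, assuming a unit-cost RAM model in which each comparison
# counts as one time step; linear_search and its step counter are illustrative
# names chosen for this example.
def linear_search(items, target):
    """Return (index, steps) for a linear scan of items, counting comparisons."""
    steps = 0
    for i, value in enumerate(items):
        steps += 1              # one unit-cost comparison per element
        if value == target:
            return i, steps
    return -1, steps

# Worst case: the target is absent, so the cost is n comparisons for n items,
# i.e. O(n) time under this model.
print(linear_search([4, 8, 15, 16, 23, 42], 23))   # (4, 5)

A different model, such as a single-tape Turing machine, would assign different costs to the same task, which is why the choice of model matters when stating complexity bounds.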
See also
- Algorithm
- Computability theory
- Computation
- Computational complexity theory
- Computer science
- Theoretical computer science
- Turing machine
External links
- Model of computation @ Wikipedia