Model of computation
From Wiki @ Karl Jones dot com
In computability theory and computational complexity theory, a model of computation is the definition of the set of allowable operations used in computation and their respective costs.
Description
A model of computation is used to measure the complexity of an algorithm in execution time and/or memory space.
By assuming a certain model of computation, it is possible to analyze the computational resources required or to discuss the limitations of algorithms or computers.
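As an illustrative sketch of this idea, the example below assumes a unit-cost RAM model, in which each comparison counts as one step; the function name and the cost-counting convention are hypothetical choices for illustration, not part of any standard definition.

```python
def linear_search(items, target):
    """Return (index, cost), where cost counts comparisons
    under an assumed unit-cost RAM model (each comparison = 1 step)."""
    cost = 0
    for i, item in enumerate(items):
        cost += 1  # one comparison charged at unit cost
        if item == target:
            return i, cost
    return -1, cost  # target absent: cost equals len(items)

# Under this model, linear search costs at most n steps on n items,
# so its worst-case time complexity is O(n).
index, cost = linear_search([4, 8, 15, 16, 23, 42], 23)
```

Choosing a different model (for example, a Turing machine, where moving to an element is not a single step) would assign different costs to the same algorithm, which is why complexity results are always stated relative to a model.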
See also
- Algorithm
- Computability theory
- Computation
- Computational complexity theory
- Computer science
- Theoretical computer science
- Turing machine
External links
- Model of computation @ Wikipedia