Lexical analysis
In computer science, lexical analysis is the process of converting a sequence of characters (such as a computer program or web page) into a sequence of tokens (strings with an identified "meaning").
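As an illustration of this character-to-token conversion, here is a minimal lexer sketch in Python for a toy arithmetic language. The token names (NUMBER, IDENT, OP) and the regular expressions are assumptions made for this example, not part of any particular tool.

```python
import re

# Token specification for a toy arithmetic language (assumed for this example).
TOKEN_SPEC = [
    ("NUMBER", r"\d+(?:\.\d+)?"),   # integer or decimal literal
    ("IDENT",  r"[A-Za-z_]\w*"),    # identifier
    ("OP",     r"[+\-*/=()]"),      # single-character operator or parenthesis
    ("SKIP",   r"[ \t]+"),          # whitespace (discarded)
]
MASTER_RE = re.compile("|".join(f"(?P<{name}>{pattern})" for name, pattern in TOKEN_SPEC))

def tokenize(text):
    """Convert a sequence of characters into a sequence of (kind, value) tokens."""
    pos = 0
    while pos < len(text):
        match = MASTER_RE.match(text, pos)
        if not match:
            raise SyntaxError(f"Unexpected character {text[pos]!r} at position {pos}")
        pos = match.end()
        kind = match.lastgroup
        if kind != "SKIP":          # whitespace carries no "meaning" here, so it is dropped
            yield (kind, match.group())

print(list(tokenize("total = price * 2")))
# [('IDENT', 'total'), ('OP', '='), ('IDENT', 'price'), ('OP', '*'), ('NUMBER', '2')]
```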
Description
A program that performs lexical analysis may be called a lexer, tokenizer, or scanner (though "scanner" is also used to refer to the first stage of a lexer).
A lexer is generally combined with a parser; together they analyze the syntax of programming languages, web pages, and so forth.
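As a sketch of how the two stages fit together, the following recursive-descent parser consumes a list of (kind, value) tokens such as the lexer sketch above produces. The grammar (expressions built from +, -, *, /) and the class and method names are assumptions chosen for illustration, not a reference implementation.

```python
# A minimal recursive-descent parser over (kind, value) tokens,
# e.g. the output of the tokenize() sketch above. Assumed grammar:
#   expr   := term (("+" | "-") term)*
#   term   := factor (("*" | "/") factor)*
#   factor := NUMBER | IDENT

class Parser:
    def __init__(self, tokens):
        self.tokens = list(tokens)
        self.pos = 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else (None, None)

    def next(self):
        token = self.peek()
        self.pos += 1
        return token

    def parse_expr(self):
        node = self.parse_term()
        while self.peek() in (("OP", "+"), ("OP", "-")):
            op = self.next()[1]
            node = (op, node, self.parse_term())
        return node

    def parse_term(self):
        node = self.parse_factor()
        while self.peek() in (("OP", "*"), ("OP", "/")):
            op = self.next()[1]
            node = (op, node, self.parse_factor())
        return node

    def parse_factor(self):
        kind, value = self.next()
        if kind in ("NUMBER", "IDENT"):
            return value
        raise SyntaxError(f"Unexpected token {value!r}")

# Lexer output for "price * 2 + tax", written out so the sketch stands alone.
tokens = [("IDENT", "price"), ("OP", "*"), ("NUMBER", "2"), ("OP", "+"), ("IDENT", "tax")]
print(Parser(tokens).parse_expr())   # ('+', ('*', 'price', '2'), 'tax')
```

Note that the parser never re-examines individual characters; it works entirely on the token stream, which is the division of labour described above.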
See also
- Computer program
- Computer programming
- Computer science
- Compiler
- Programming language
- Syntax (programming languages)
External links
- Lexical analysis @ Wikipedia