Lexical analysis
In computer science, lexical analysis is the process of converting a sequence of characters (such as a computer program or web page) into a sequence of tokens (strings with an identified "meaning").
A program that performs lexical analysis may be called a lexer, tokenizer, or scanner (though "scanner" is also used to refer to the first stage of a lexer).
Such a lexer is generally combined with a parser; together they analyze the syntax of programming languages, web pages, and so forth.
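As an illustration, the following is a minimal regular-expression-based tokenizer sketch in Python. The token set, names, and grammar here are illustrative assumptions, not part of the original article; real lexers are usually generated from a fuller specification.

 # Minimal lexer sketch (illustrative; token set is an assumption).
 # Converts a string of characters into (type, value) tokens.
 import re

 # Token specification: each pair maps a token type to a regular expression.
 TOKEN_SPEC = [
     ("NUMBER", r"\d+"),           # integer literals
     ("IDENT",  r"[A-Za-z_]\w*"),  # identifiers
     ("OP",     r"[+\-*/=]"),      # operators
     ("SKIP",   r"\s+"),           # whitespace, discarded
 ]

 MASTER_RE = re.compile(
     "|".join(f"(?P<{name}>{pattern})" for name, pattern in TOKEN_SPEC)
 )

 def tokenize(text):
     """Yield (type, value) tokens; raise on unexpected characters."""
     pos = 0
     while pos < len(text):
         match = MASTER_RE.match(text, pos)
         if not match:
             raise SyntaxError(f"Unexpected character {text[pos]!r} at position {pos}")
         if match.lastgroup != "SKIP":
             yield (match.lastgroup, match.group())
         pos = match.end()

For example, tokenize("x = 42 + y") yields ("IDENT", "x"), ("OP", "="), ("NUMBER", "42"), ("OP", "+"), ("IDENT", "y") — a token stream a parser could then consume.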
External links
- Lexical analysis @ Wikipedia