Lexical analysis

From Wiki @ Karl Jones dot com
Revision as of 10:29, 2 September 2015

In computer science, lexical analysis is the process of converting a sequence of characters (such as a computer program or web page) into a sequence of tokens (strings with an identified "meaning").

Description

A program that performs lexical analysis may be called a lexer, tokenizer, or scanner (though "scanner" is also used to refer to the first stage of a lexer).
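The character-to-token conversion a lexer performs can be sketched in a few lines. The following is a minimal illustrative example (the token names and the `tokenize` function are assumptions for this sketch, not part of any particular tool): a regular expression with named groups splits the input into `(kind, value)` token pairs and discards whitespace.

```python
import re

# Illustrative token classes for a tiny language (names are assumptions,
# not from any real lexer generator).
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),          # integer literals
    ("IDENT",  r"[A-Za-z_]\w*"), # identifiers
    ("OP",     r"[+\-*/=]"),     # single-character operators
    ("SKIP",   r"\s+"),          # whitespace, dropped below
]
TOKEN_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(text):
    """Convert a character sequence into a list of (kind, value) tokens."""
    tokens = []
    for match in TOKEN_RE.finditer(text):
        if match.lastgroup != "SKIP":  # keep everything except whitespace
            tokens.append((match.lastgroup, match.group()))
    return tokens
```

For example, `tokenize("x = 42 + y")` yields tokens such as `("IDENT", "x")`, `("OP", "=")`, and `("NUMBER", "42")` — strings with an identified "meaning", as described above.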

Such a lexer is generally combined with a parser, which together analyze the syntax of programming languages, web pages, and so forth.
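How a lexer's token stream feeds a parser can be shown with a toy arithmetic grammar. The sketch below (function names and the grammar are assumptions for illustration, not any real compiler's design) lexes digits and operators, then a recursive-descent parser consumes the tokens, giving `*` higher precedence than `+`.

```python
import re

def lex(text):
    # Lexer: characters in, tokens out (numbers and the operators + and *).
    return re.findall(r"\d+|[+*]", text)

def parse_term(tokens):
    # term := NUMBER ('*' NUMBER)*
    value, rest = int(tokens[0]), tokens[1:]
    while rest and rest[0] == "*":
        value *= int(rest[1])
        rest = rest[2:]
    return value, rest

def parse_expr(tokens):
    # expr := term ('+' term)*
    value, rest = parse_term(tokens)
    while rest and rest[0] == "+":
        rhs, rest = parse_term(rest[1:])
        value += rhs
    return value, rest

def evaluate(text):
    # Lexer and parser combined: analyze and evaluate an expression.
    value, rest = parse_expr(lex(text))
    assert not rest, "unexpected trailing tokens"
    return value
```

Here `evaluate("1 + 2 * 3")` lexes the string into five tokens and parses them as `1 + (2 * 3)`, returning 7 — the lexer handles characters, while the parser handles structure.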

See also

* [[Computer science]]
* [[Compiler]]
* [[Programming language]]
* [[Syntax (programming languages)]]

External links