Difficulty: Easy
Correct Answer: recognition of basic elements and creation of uniform symbols
Explanation:
Introduction / Context:
Compilers transform high-level source code into executable form through a pipeline of phases. The first substantive phase after preprocessing is lexical analysis (scanning), which converts raw character streams into tokens for the parser.
Given Data / Assumptions:
A conventional multi-phase compiler is assumed; the question asks which task the lexical-analysis (scanning) phase performs.
Concept / Approach:
Lexical analysis groups characters into meaningful units: identifiers, keywords, literals, operators, delimiters. It may also remove whitespace/comments and normalize representations (e.g., number formats). The output is a token stream, often with symbol table entries for identifiers and literals, which feeds the parser for grammar-based analysis.
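To make this concrete, below is a minimal sketch of a regular-expression-based scanner in Python. The token names, keyword set, and input string are illustrative assumptions for this example only, not the rules of any particular language or compiler.

```python
import re

# Hypothetical token set for the sketch: each rule pairs a token name with a pattern.
TOKEN_SPEC = [
    ("NUMBER",   r"\d+(\.\d+)?"),
    ("IDENT",    r"[A-Za-z_]\w*"),
    ("OP",       r"[+\-*/=]"),
    ("LPAREN",   r"\("),
    ("RPAREN",   r"\)"),
    ("SKIP",     r"[ \t]+"),      # whitespace is discarded, not tokenized
    ("MISMATCH", r"."),           # any other character is reported as an error
]
MASTER_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

KEYWORDS = {"if", "else", "while"}  # identifiers promoted to keyword tokens

def tokenize(source):
    """Yield (kind, lexeme) pairs -- the 'uniform symbols' handed to the parser."""
    for match in MASTER_RE.finditer(source):
        kind, lexeme = match.lastgroup, match.group()
        if kind == "SKIP":
            continue                          # remove whitespace
        if kind == "IDENT" and lexeme in KEYWORDS:
            kind = "KEYWORD"                  # keywords are distinguished from identifiers
        if kind == "MISMATCH":
            raise SyntaxError(f"unexpected character: {lexeme!r}")
        yield (kind, lexeme)

print(list(tokenize("count = count + 42")))
# [('IDENT', 'count'), ('OP', '='), ('IDENT', 'count'), ('OP', '+'), ('NUMBER', '42')]
```

A production scanner would also track line numbers and enter identifiers and literals into a symbol table, but the core job is the same: recognize basic elements and emit uniform symbols.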
Step-by-Step Solution:
1. Read the raw character stream produced by preprocessing.
2. Group characters into lexemes using pattern rules (typically regular expressions).
3. Classify each lexeme as a token, i.e., a uniform symbol such as IDENTIFIER, KEYWORD, NUMBER, or OPERATOR, recording identifiers and literals in the symbol table.
4. Discard whitespace and comments, and pass the resulting token stream to the parser.
Hence the phase performs recognition of basic elements and creation of uniform symbols.
Verification / Alternative check:
Standard compiler-design references separate the scanner and parser phases exactly along these lines: the scanner produces the token stream, and the parser performs grammar-based analysis on it.
Why Other Options Are Wrong:
Common Pitfalls:
Confusing lexical analysis with syntax analysis: the scanner recognizes individual tokens but does not check grammatical structure, which is the parser's job. Likewise, the scanner does not perform optimization or code generation; those belong to later phases.
Final Answer:
recognition of basic elements and creation of uniform symbols.