Difficulty: Easy
Correct Answer: All of the above
Explanation:
Introduction / Context:
In compiler design, the lexical analysis (scanner) phase is the front-door of the compilation pipeline. It transforms raw source code characters into a structured stream of tokens and prepares auxiliary tables that later phases use. This question tests recognition of the standard responsibilities of lexical analysis.
Given Data / Assumptions:
Concept / Approach:
Lexical analysis groups characters into lexemes and maps them to tokens (keywords, identifiers, literals, operators). Alongside tokenization, it typically records encountered identifiers and constants in tables, assigns token codes, and produces a uniform symbol representation consumed by later stages.
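The grouping of characters into lexemes can be sketched with a tiny scanner. This is a minimal illustration in Python with a made-up token set (the keyword list, token names, and regular expressions here are assumptions for the example, not any particular language's definition):

```python
import re

# Hypothetical token classes for a toy language, ordered so that
# keywords are tried before the general identifier pattern.
TOKEN_SPEC = [
    ("KEYWORD", r"\b(?:if|else|while|return)\b"),
    ("NUMBER",  r"\d+"),
    ("IDENT",   r"[A-Za-z_]\w*"),
    ("OP",      r"[+\-*/=<>]"),
    ("SKIP",    r"\s+"),            # whitespace is consumed, not emitted
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(source):
    """Group characters into lexemes and emit (token_type, lexeme) pairs."""
    tokens = []
    for m in MASTER.finditer(source):
        if m.lastgroup != "SKIP":
            tokens.append((m.lastgroup, m.group()))
    return tokens

print(tokenize("if count = 42"))
# [('KEYWORD', 'if'), ('IDENT', 'count'), ('OP', '='), ('NUMBER', '42')]
```

Note that the scanner only classifies lexemes; it does not check whether the token sequence is grammatical, which is the parser's job.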
Step-by-Step Solution:
1. Recognize lexemes and emit tokens for the parser.
2. Record identifiers (names) and literals (constants) in dedicated tables with their attributes.
3. Produce a uniform symbol (token) stream whose entries reference those tables, standardizing input for later phases.
Therefore, all of the listed tasks (tokenization, table building, uniform symbols) belong to lexical analysis.
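Steps 2 and 3 can be sketched as follows: the scanner records each identifier and literal once in its table and replaces the lexeme with a table index in the uniform stream. The function name and tuple layout are illustrative assumptions, not a standard API:

```python
def scan_with_tables(tokens):
    """Turn (kind, lexeme) pairs into a uniform stream referencing tables.

    Identifiers and numeric literals are entered into their own tables
    (once per distinct lexeme); the uniform stream carries table indices
    instead of raw lexemes, so later phases see a standardized form.
    """
    ident_table, literal_table = [], []
    uniform = []
    for kind, lexeme in tokens:
        if kind == "IDENT":
            if lexeme not in ident_table:
                ident_table.append(lexeme)
            uniform.append(("IDENT", ident_table.index(lexeme)))
        elif kind == "NUMBER":
            if lexeme not in literal_table:
                literal_table.append(lexeme)
            uniform.append(("NUMBER", literal_table.index(lexeme)))
        else:
            uniform.append((kind, lexeme))
    return uniform, ident_table, literal_table

stream, idents, lits = scan_with_tables(
    [("IDENT", "x"), ("OP", "="), ("NUMBER", "42"), ("OP", "+"), ("IDENT", "x")]
)
print(stream)   # [('IDENT', 0), ('OP', '='), ('NUMBER', 0), ('OP', '+'), ('IDENT', 0)]
print(idents)   # ['x']
print(lits)     # ['42']
```

Both occurrences of `x` map to the same table entry, which is exactly the "uniform symbols" idea: later phases compare small indices rather than raw character strings.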
Verification / Alternative check:
Most textbooks separate scanning (lexical) from parsing (syntax). Anything involving grammar productions is parser work; anything involving raw character grouping is scanner work.
Why Other Options Are Wrong:
(e) Target code generation is a back-end activity; it happens long after lexical analysis and is not a scanner task.
Common Pitfalls:
Confusing parsing with scanning, or assuming code generation happens during tokenization. Also, some projects split table-building across phases; nevertheless, establishing identifiers/literals at scan time is standard.
Final Answer:
All of the above.