Data Warehousing Questions
In data warehousing design, what does the term “snowflake schema” refer to, and which type of tables are being normalized in that schema?
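A snowflake schema normalizes the *dimension* tables (the fact table stays as-is), so a dimension such as product is split into product and category tables linked by keys. A minimal sketch with sqlite3, using illustrative table and column names of my own (not from any particular warehouse):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Snowflaked dimension: the product dimension is normalized into
# dim_product + dim_category; the fact table is left untouched.
cur.executescript("""
CREATE TABLE dim_category (
    category_id   INTEGER PRIMARY KEY,
    category_name TEXT
);
CREATE TABLE dim_product (
    product_id   INTEGER PRIMARY KEY,
    product_name TEXT,
    category_id  INTEGER REFERENCES dim_category(category_id)
);
CREATE TABLE fact_sales (
    sale_id    INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product(product_id),
    amount     REAL
);
""")

cur.execute("INSERT INTO dim_category VALUES (1, 'Beverages')")
cur.execute("INSERT INTO dim_product VALUES (10, 'Coffee', 1)")
cur.execute("INSERT INTO fact_sales VALUES (100, 10, 4.50)")

# Querying the snowflake costs an extra join through the normalized layer.
row = cur.execute("""
    SELECT c.category_name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product  p ON f.product_id = p.product_id
    JOIN dim_category c ON p.category_id = c.category_id
    GROUP BY c.category_name
""").fetchone()
print(row)  # ('Beverages', 4.5)
```

In a plain star schema, `category_name` would simply be a column of `dim_product`; snowflaking trades that redundancy for an extra join at query time.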
In dimensional modeling, how are fact tables typically structured with respect to normalization in a data warehouse?
In information systems, which description best matches an operational system (OLTP)—that is, the system used to run the business day-to-day?
In an active (near real-time) data warehouse architecture, which components are typically present?
Which of the following is a core goal of data mining in analytics projects?
Within ETL, what does the “extract” process typically capture from operational systems?
Which statement best describes a data warehouse according to classic definitions (subject-oriented, integrated, time-variant, non-volatile)?
Within ETL, which description correctly reflects a form of data transformation commonly performed before loading to the warehouse?
In a generic two-level data warehouse architecture, which component is typically included alongside the core warehouse?
In the context of data quality for warehousing, what best describes “data scrubbing” (also called data cleansing)?
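Data scrubbing detects and corrects dirty source data before it reaches the warehouse: normalizing formats, stripping noise, and standardizing codes. A toy sketch with cleansing rules of my own devising (field names and the country-code table are illustrative assumptions):

```python
def scrub_record(rec):
    """Apply simple cleansing rules to one source record."""
    cleaned = dict(rec)
    # Normalize whitespace and capitalization in names.
    cleaned["name"] = " ".join(rec["name"].split()).title()
    # Keep only digits in phone numbers.
    cleaned["phone"] = "".join(ch for ch in rec["phone"] if ch.isdigit())
    # Standardize country codes via a (toy) lookup table.
    lookup = {"usa": "US", "u.s.": "US", "united states": "US"}
    key = rec["country"].strip().lower()
    cleaned["country"] = lookup.get(key, rec["country"].strip().upper())
    return cleaned

dirty = {"name": "  aLiCe   smith ", "phone": "(555) 123-4567", "country": "usa"}
print(scrub_record(dirty))
# {'name': 'Alice Smith', 'phone': '5551234567', 'country': 'US'}
```

Real pipelines externalize such rules into data-quality tooling, but the pattern is the same: inconsistent source values in, standardized values out.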
In data warehousing design, what kind of relationship exists between a dimension table and the central fact table in a classic Star Schema (considering how keys link rows for analysis)?
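In a star schema the relationship is one-to-many: one dimension row is referenced, via its key, by many fact rows. A minimal in-memory sketch (the rows and field names are invented for illustration):

```python
# One date-dimension row...
dim_date = {20240101: {"day": "2024-01-01", "quarter": "Q1"}}

# ...referenced by many fact rows through the foreign key date_key.
fact_sales = [
    {"date_key": 20240101, "amount": 10.0},
    {"date_key": 20240101, "amount": 25.0},
    {"date_key": 20240101, "amount": 5.0},
]

# Analysis joins facts to the dimension by resolving the key.
q1_total = sum(f["amount"] for f in fact_sales
               if dim_date[f["date_key"]]["quarter"] == "Q1")
print(q1_total)  # 40.0
```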
In data warehousing terminology, what is meant by transient data (as opposed to periodic, history-preserving data)?
In Enterprise Data Warehousing, what best defines reconciled data (the enterprise, current, integrated layer used to feed downstream analytics)?
In ETL for data warehousing, what does the term “load and index” typically mean in the context of preparing warehouse tables for analytics?
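"Load and index" means bulk-loading the transformed rows into the warehouse table and then building the indexes that analytic queries need. Building indexes *after* the load avoids paying index-maintenance cost on every insert. A sketch with sqlite3 (table and index names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE fact_orders (order_id INTEGER, customer_id INTEGER, amount REAL)"
)

# Load: bulk-insert the transformed rows.
rows = [(i, i % 3, float(i)) for i in range(100)]
cur.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", rows)

# Index: created after the load, ready for analytic lookups.
cur.execute("CREATE INDEX idx_orders_customer ON fact_orders(customer_id)")

n = cur.execute(
    "SELECT COUNT(*) FROM fact_orders WHERE customer_id = 1"
).fetchone()[0]
print(n)  # 33
```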
In data transformation during ETL, what can a “multifield transformation” accomplish when mapping source fields to target fields?
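A multifield transformation maps many source fields to one target field, or one source field to many target fields. Two toy mappings, with made-up field names, as a sketch:

```python
def many_to_one(src):
    """Many source fields -> one target field (concatenation)."""
    return {"full_name": f"{src['first_name']} {src['last_name']}"}

def one_to_many(src):
    """One source field -> many target fields (splitting)."""
    city, state = src["city_state"].split(", ")
    return {"city": city, "state": state}

print(many_to_one({"first_name": "Ada", "last_name": "Lovelace"}))
# {'full_name': 'Ada Lovelace'}
print(one_to_many({"city_state": "Austin, TX"}))
# {'city': 'Austin', 'state': 'TX'}
```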
Data warehousing and ETL quality: Evaluate the statement below and choose the most accurate option.
“During Extract–Transform–Load (ETL), the role of the process is to identify erroneous data and to fix them.”
Assume a modern DW/BI pipeline with data quality rules, profiling, and stewardship.
Data marts and workload focus: Judge the statement.
“A data mart is designed to optimize performance for well-defined and predictable uses.”
Operational data characteristics: Decide whether the following is accurate.
“Data in operational systems are typically fragmented and inconsistent.”
Schema suitability: Evaluate the claim.
“Star schema is suited to online transaction processing (OLTP) and therefore is generally used in operational systems, operational data stores (ODS), or an enterprise data warehouse (EDW).”
Prerequisites for successful data warehousing: Consider this statement.
“Implementing a formal Total Quality Management (TQM) program is required for data warehousing success.”