Most-asked ETL MCQ Questions and Answers for University Exams


ETL MCQ Questions and Answers

These are the 25+ most important ETL (Extract, Transform, Load) MCQ questions and answers for online exams of various universities such as SPPU, MU, Anna University, and many others. The ETL multiple-choice questions below are also drawn from interviews at MNCs such as TCS, Infosys, Capgemini, and Accenture.

So studying these MCQ questions on ETL will not only help you crack your university exam but also help you land a job in the Business Intelligence (BI) domain.

1. Which of these statements related to the ‘non-volatile’ feature of a data warehouse (DWH) is TRUE?
In a DWH, data is only written, never read
In a DWH, data is only read, not written
In a DWH, all CRUD operations are done; but data marts are only read
All types of OLTP operations are done in a DWH

In a DWH, data is only read, not written

2. Which of these steps is executed at the end of every stage of ETL – extract, clean, conform?
Logging the activity to a flat file
Displaying the data to the user
Staging the data to the database
Sending a message about the tasks

Staging the data to the database

3. The ETL execution or operation approach falls into which of these two major categories?
Planning & Execution
Implementation & Testing
Scheduling & Support
Maintenance & Support

Scheduling & Support

4. Which criterion of the ETL process ensures that the extracted data is trustworthy at any granular level?
Maintainability
Reliability
Availability
Manageability

Reliability

5. How do you ensure that a data warehouse (DWH), once initially loaded, continues to receive recent day-to-day transaction data regularly?
DWH admin has to manually extract it from the sources and keep loading them.
A complex software interface between DWH & operational system has to be built for this.
Appropriate ETL jobs have to run regularly to keep refreshing DWH data.
Not possible. DWH always works only with initially loaded data

Appropriate ETL jobs have to run regularly to keep refreshing DWH data.

6. One of the tasks involved during the load process is:
to build the logical data map
to filter the source data
to understand the structure of source data
aggregating the data

aggregating the data

7. Arrange these ETL steps in a correct order:
i. Load data to data warehouse
ii. Clean and conform the data
iii. Build logical data map
iv. Extract the source data
(iii), (iv), (ii), (i)
(ii), (iv), (iii), (i)
(iii), (i), (iv), (ii)
(iii), (iv), (i), (ii)

(iii), (iv), (ii), (i)
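The correct order above can be sketched as a tiny pipeline. This is only an illustrative skeleton, not a real ETL tool's API; the data-map contents, column names, and sample row are all made up:

```python
# Minimal sketch of the ETL order: (iii) build the logical data map,
# (iv) extract, (ii) clean and conform, (i) load. All names are illustrative.

def build_logical_data_map():
    # Maps each target column to its source column and transformation rule.
    return {"cust_name": ("src_customers.name", "trim + title-case")}

def extract(source_rows):
    return list(source_rows)

def clean_and_conform(rows):
    return [{"cust_name": r["name"].strip().title()} for r in rows]

def load(warehouse, rows):
    warehouse.extend(rows)

warehouse = []
data_map = build_logical_data_map()           # step (iii)
raw = extract([{"name": "  alice smith "}])   # step (iv)
staged = clean_and_conform(raw)               # step (ii)
load(warehouse, staged)                       # step (i)
print(warehouse)  # [{'cust_name': 'Alice Smith'}]
```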

8. While loading the data to the data warehouse, if the data is already present in it,
it’s simply overwritten
always keep its history
need to decide whether to overwrite it or keep its history
it’s deleted and overwritten

need to decide whether to overwrite it or keep its history
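The two choices in this answer correspond to slowly changing dimension (SCD) types 1 and 2. Below is a hedged sketch of both; the row layout and column names are illustrative, not from any specific tool:

```python
# SCD type 1: overwrite in place, losing history.
# SCD type 2: expire the current row and append a new version, keeping history.

def scd_type1(dim_rows, key, new_attrs):
    for r in dim_rows:
        if r["natural_key"] == key:
            r.update(new_attrs)  # overwrite: old value is lost

def scd_type2(dim_rows, key, new_attrs):
    for r in dim_rows:
        if r["natural_key"] == key and r["current"]:
            r["current"] = False  # expire the old version
    dim_rows.append({"natural_key": key, "current": True, **new_attrs})

dim_rows = [{"natural_key": "C1", "current": True, "city": "Pune"}]
scd_type2(dim_rows, "C1", {"city": "Mumbai"})
print(len(dim_rows))  # 2: old version expired, new version current
```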

9. “Filtering out the data items” from the source data is done at the ______ stage of the ETL process.
extract
transform
load
query

transform
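Filtering during the transform stage can look like the sketch below. The business rule and field names here are invented for illustration:

```python
# Rows that fail a business rule are dropped during transform,
# before anything is loaded into the warehouse.

raw_rows = [
    {"order_id": 1, "amount": 250.0},
    {"order_id": 2, "amount": -10.0},   # invalid: negative amount
    {"order_id": 3, "amount": None},    # invalid: missing amount
]

def transform(rows):
    # Keep only rows with a positive, non-null amount.
    return [r for r in rows if r["amount"] is not None and r["amount"] > 0]

clean = transform(raw_rows)
print([r["order_id"] for r in clean])  # [1]
```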

10. Differences between data warehouse (DWH) and data mart (DM) are (Choose the correct ones):
i. DWH holds disparate data, DM holds specific dept. data
ii. Data held by the DWH is specific, while data held by the DM is detailed
iii. DWH is easy to maintain, DM is difficult to maintain
Only (i) & (ii) are correct
Only (i) & (iii) are correct
Only (i) is correct
All are correct

Only (i) is correct

11. One of the ways the data warehouse can be used in a bank is:
For opening & closing loan accounts
For doing daily transactions
To provide internet banking facility to its customers
None of these

None of these

12. In the statement, “During a book exhibition, a popular book store sold 1000 books worth Rs. 1,23,456.00 in a day”:
Rs. is a measure, book is a dimension, 1000 & 1,23,456.00 are the variables.
Book is a measure, 1000 & 1,23,456.00 are a dimension, Rs is fact
Book is variable, Rs is a dimension, 1000 & 1,23,456.00 are a measure
In fact, all these are facts

Rs. is a measure, book is a dimension, 1000 & 1,23,456.00 are the variables.

13. In a data warehouse (DWH) architecture, the role of data mart is
Just to partition the DWH
To host the data from the original sources
To transform the data back into their respective sources
To contain subset of DWH specific to a particular dept.

To contain subset of DWH specific to a particular dept.

14. One of the real needs for creating a data warehouse is that though there is data everywhere, ___________
They are stored in RDBMS tables
There are many database users
Data comes in different versions and flavours
It has to be processed before it can be used for decision making.

It has to be processed before it can be used for decision making.

15. End-user applications of the DWH can access the following database areas:
(i) DWH database
(ii) ETL staging database
(iii) Source system databases
Only (i)
Only (i) & (ii)
Only (ii) & (iii)
All of these

Only (i)

1. One of the requirements while designing an ETL system is how quickly source data can be delivered to end users. This is referred to as:
Data speed
Data lineage
Data latency
Data availability

Data latency

2. Which of these important activities, if done by the ETL team, amounts to accomplishing their main mission?
to handover loaded dimension and fact tables
to connect to disparate source systems
to extract data from various source systems
to standardize the extracted and cleaned data

to handover loaded dimension and fact tables

3. Which of these statements is true w.r.t. dimension table?
It can use the primary key of any of the source tables as its primary key to ensure uniqueness
It uses a composite primary keys from multiple source tables as its primary key
It uses a surrogate key as its primary key
It uses a composite primary key formed from its own attributes

It uses a surrogate key as its primary key
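In Kimball-style dimensional designs, the warehouse generates its own integer surrogate key rather than reusing any source system's primary key. A minimal sketch (all names and keys here are illustrative):

```python
# The warehouse controls its own key sequence, independent of source systems.
import itertools

key_gen = itertools.count(start=1)   # warehouse-controlled sequence
dim_customer = {}                    # natural key -> dimension row

def load_dimension_row(natural_key, attrs):
    if natural_key not in dim_customer:
        dim_customer[natural_key] = {"surrogate_key": next(key_gen),
                                     "natural_key": natural_key, **attrs}
    return dim_customer[natural_key]["surrogate_key"]

sk1 = load_dimension_row("CRM-1001", {"name": "Asha"})
sk2 = load_dimension_row("ERP-77", {"name": "Ravi"})
print(sk1, sk2)  # 1 2
```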

4. In dimensional modeling, _____ tables contain the measurements and the ______ tables provide the context for those measurements.
dimension, RDBMS
aggregate, staging
fact, dimension
dimension, sub-dimension

fact, dimension

5. While filling the fact tables, the first step is:
to fill it with legitimate foreign keys from the dimension tables
to fill it with appropriate foreign keys from other fact tables
to identify its primary key
to build an index for it

to fill it with legitimate foreign keys from the dimension tables
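That first step (often called the surrogate-key lookup) can be sketched as below; the keys and columns are invented for illustration:

```python
# Each incoming fact row's natural keys are swapped for the surrogate
# keys already assigned in the dimension tables, giving legitimate FKs.

dim_product = {"P-10": 1, "P-20": 2}   # natural key -> surrogate key
dim_date = {"2024-01-05": 101}

incoming = {"product": "P-20", "date": "2024-01-05", "qty": 3}

fact_row = {
    "product_key": dim_product[incoming["product"]],  # legitimate FK
    "date_key": dim_date[incoming["date"]],           # legitimate FK
    "qty": incoming["qty"],                           # the measurement
}
print(fact_row)  # {'product_key': 2, 'date_key': 101, 'qty': 3}
```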

6. Almost all of the attributes of dimension tables are in ______ format.
numeric
textual
date
fixed-length string

textual

7. Extracting the data from the source systems is involved in the _______ step of the ETL process.
extract
transform
load
planning

extract

ETL Interview Questions

8. The data profiling activity, done during the initial ETL requirements stage, provides a thorough understanding of:
the amount of data present in the source system
content, structure and quality of the data
ownership information of the data
None of these

content, structure and quality of the data

9. ETL architecture spans the front room and the back room of the database. Which of these ‘rooms’ is accessible by end-user applications?
front room
back room
Any of these
Neither of these.

front room

10. Which of these cannot be a dimension table in a typical insurance business scenario?
Policy
Agent
Effective date
Policy amount

Policy amount

11. Which of these columns need not be present in a Logical Data Map?
Table name
Slowly changing dimension
Business term
Table type

Business term

12. How do you plan to extract the data from an ERP system?
Just like extracting data from any other source system.
Consult ERP expert and then plan
Use any export utility
Ignore them.

Consult ERP expert and then plan

13. The ‘weight’ measure of the ‘product’ dimension in a retail business arrives from various source systems in different units, such as grams, ounces, and kilograms. In the ‘conforming’ stage, it is best converted to:
Kg
Ounce
Grams
Keep them as it is.

Grams
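Conforming the weight measure to a single unit can be sketched as below. The conversion factors are standard; the row layout and field names are illustrative:

```python
# Convert every incoming weight to grams so values from different
# source systems become directly comparable.

TO_GRAMS = {"g": 1.0, "kg": 1000.0, "oz": 28.3495}

def conform_weight(row):
    return {"product": row["product"],
            "weight_g": row["weight"] * TO_GRAMS[row["unit"]]}

rows = [{"product": "A", "weight": 2, "unit": "kg"},
        {"product": "B", "weight": 500, "unit": "g"}]
conformed = [conform_weight(r) for r in rows]
print(conformed)  # weights now comparable: 2000.0 g and 500.0 g
```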

14. While loading dimension tables, there can be more than one surrogate key for the same natural key, say Customer ID. Which example justifies this?
There can be more than one customer
One person can be customers for different dept.s or units
Many numeric attributes like total sale amt, total qty for different time periods will become measurements in fact tables
There should be only one surrogate key to one natural key.

Many numeric attributes like total sale amt, total qty for different time periods will become measurements in fact tables

15. The primary purpose of conforming the data during the ETL process is:
to ensure that the data is clean
to confirm that data from all relevant sources are extracted
to have standardized dimension and fact tables
to standardize them for merging the fact and dimension tables

to have standardized dimension and fact tables

16. Similar data (say, sales figures) extracted from various source systems needs to have a common unit of measurement. This can be ensured using the _______ process.
translating
transferring
confirming
conforming

conforming

Extract, Transform, Load MCQ

17. An organization can decide to go for hand-coding of the ETL processes if:
there are many end users
there is need for simpler and faster development
there is availability of in-house programmers
there is a need to automatically generate ETL metadata

there is availability of in-house programmers

18. The data extracted from different source systems cannot be directly loaded into the data warehouse (DWH). How is this managed?
By copying them into a staging area first
By copying them into some flat files in the respective source systems
By copying them into flat files in the file system of the DWH server
Data from various source systems are loaded directly into the DWH one-by-one

By copying them into a staging area first
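The staging step can be sketched as below. In practice the staging area is a set of database tables or flat files; the in-memory lists and the validation rule here are stand-ins for illustration:

```python
# Extracts land in a staging area first; only validated data
# moves on to the warehouse.

staging = []     # staging area (often staging tables or flat files)
warehouse = []

def extract_to_staging(source_rows):
    staging.extend(source_rows)        # raw copy, no transformation yet

def load_from_staging():
    for row in staging:
        if row.get("id") is not None:  # a stand-in validation rule
            warehouse.append(row)
    staging.clear()

extract_to_staging([{"id": 1}, {"id": None}])
load_from_staging()
print(len(warehouse), len(staging))  # 1 0
```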

19. One of the intentions of ‘cleaning’ the source data is,
to reduce the column widths of source data
to reduce the no. of rows loaded into the data warehouse
to ensure consistency and data quality needs
so that it’s easy to copy the data into the data warehouse

to ensure consistency and data quality needs

20. Which of these is a non-relational data source?
i. Oracle database table
ii. ‘Customer.XLS’ file
iii. ims_manifest.xml
Only (ii)
Only (iii)
Only (i) & (ii)
Only (ii) & (iii)

Only (ii) & (iii)
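A non-relational XML source like (iii) can be read with Python's standard library. The element names and content below are invented for illustration, not the actual ims_manifest.xml schema:

```python
# Parse an XML source and pull out the values of interest.
import xml.etree.ElementTree as ET

xml_text = "<customers><customer id='1'>Asha</customer></customers>"
root = ET.fromstring(xml_text)
names = [c.text for c in root.findall("customer")]
print(names)  # ['Asha']
```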

21. The strength of an XML file, one of the ETL data structures, is
It is a flat file
It is an internet based file
It is a Windows based file
It is a universal data exchange language

It is a universal data exchange language

22. Arrange these steps that help implement an ETL system in the proper order:
i. Walk-thru data warehouse data model
ii. Validate calculations & formulas
iii. Analyze source data
iv. Build logical data map
(iv), (iii), (ii), (i)
(i), (iii), (ii), (iv)
(ii), (iii), (iv), (i)
(ii), (iii), (i), (iv)

(iv), (iii), (ii), (i)

