Data Mining and Data Warehousing MCQ


Hello friends! In this post we discuss multiple choice (objective type) questions on Data Mining and Data Warehousing.

Q1. What does Apriori algorithm do?

a. It mines all frequent patterns by pruning rules with lesser support

b. It mines all frequent patterns by pruning rules with higher support

c. Both a and b

d. None of the above

Ans: A
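
To see Q1's idea in practice, here is a minimal Python sketch of Apriori (our own illustration, with min_support taken as an absolute count): a bottom-up, level-wise search that joins frequent (k-1)-itemsets into k-item candidates and prunes any candidate with an infrequent subset or below-threshold support.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Minimal Apriori sketch; min_support is an absolute count."""
    transactions = [frozenset(t) for t in transactions]

    def count(candidates):
        return {c: sum(c <= t for t in transactions) for c in candidates}

    # Level 1: frequent 1-itemsets.
    singletons = {frozenset([i]) for t in transactions for i in t}
    frequent = {c: s for c, s in count(singletons).items() if s >= min_support}
    all_frequent, k = dict(frequent), 2
    while frequent:
        # Join step: combine frequent (k-1)-itemsets into k-item candidates.
        prev = list(frequent)
        candidates = {a | b for a in prev for b in prev if len(a | b) == k}
        # Prune step (Apriori property): drop candidates with an infrequent subset.
        candidates = {c for c in candidates
                      if all(frozenset(s) in frequent for s in combinations(c, k - 1))}
        frequent = {c: s for c, s in count(candidates).items() if s >= min_support}
        all_frequent.update(frequent)
        k += 1
    return all_frequent

# Example: apriori([{"a", "b"}, {"a", "b"}, {"a", "c"}], min_support=2)
# returns supports {a}: 3, {b}: 2, {a, b}: 2.
```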

Q2. What does FP growth algorithm do?

a. It mines all frequent patterns by pruning rules with lesser support

b. It mines all frequent patterns by pruning rules with higher support

c. It mines all frequent patterns by constructing an FP-tree

d. All of the above

Ans: C
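
By contrast, FP-growth compresses the database into an FP-tree in just two scans. The sketch below (our own illustration) shows only the tree-construction step; the recursive mining of conditional pattern bases is omitted for brevity.

```python
from collections import defaultdict

class FPNode:
    def __init__(self, item, parent):
        self.item, self.count, self.parent = item, 1, parent
        self.children = {}

def build_fp_tree(transactions, min_support):
    # Scan 1: count item frequencies and keep only frequent items.
    counts = defaultdict(int)
    for t in transactions:
        for item in t:
            counts[item] += 1
    frequent = {i for i, c in counts.items() if c >= min_support}

    # Scan 2: insert each transaction with items ordered by descending
    # frequency, so common prefixes share the same branch of the tree.
    root = FPNode(None, None)
    for t in transactions:
        items = sorted((i for i in t if i in frequent),
                       key=lambda i: (-counts[i], i))
        node = root
        for item in items:
            if item in node.children:
                node.children[item].count += 1
            else:
                node.children[item] = FPNode(item, node)
            node = node.children[item]
    return root
```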

Q3. What techniques can be used to improve the efficiency of the Apriori algorithm?

a. Hash-based techniques

b. Transaction Reduction

c. Partitioning

d. All of the above

Ans: D

Q4. What do you mean by support(A)?

a. Total number of transactions containing A

b. Total number of transactions not containing A

c. Number of transactions containing A / Total number of transactions

d. Number of transactions not containing A / Total number of transactions

Ans: C

Q5. How do you calculate Confidence(A -> B)?

a. Support(A ∪ B) / Support(A)

b. Support(A ∪ B) / Support(B)

c. Support(A ∩ B) / Support(A)

d. Support(A ∩ B) / Support(B)

Ans: A
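
A small worked example ties Q4 and Q5 together (the five transactions below are made up for illustration):

```python
transactions = [
    {"bread", "milk"},
    {"bread", "diapers", "beer"},
    {"milk", "diapers", "beer"},
    {"bread", "milk", "diapers"},
    {"bread", "milk", "beer"},
]

def support(itemset):
    # Q4, option c: transactions containing the itemset / total transactions.
    return sum(itemset <= t for t in transactions) / len(transactions)

a, b = {"bread"}, {"milk"}
print(support(a))                   # 4/5 = 0.8
print(support(a | b))               # 3/5 = 0.6
# Q5, option a: Confidence(A -> B) = Support(A ∪ B) / Support(A)
print(support(a | b) / support(a))  # 0.6 / 0.8 = 0.75
```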

Q6. Which of the following is a direct application of frequent itemset mining?

a. Social Network Analysis

b. Market Basket Analysis

c. Outlier Detection

d. Intrusion Detection

Ans: B

Q7. Which of the following is not true about the FP-growth algorithm?

a. It mines frequent itemsets without candidate generation.

b. There are chances that the FP-tree may not fit in memory

c. FP-trees are very expensive to build

d. It expands the original database to build FP-trees

Ans: D

Q8. When do you consider an association rule interesting?

a. If it only satisfies min_support

b. If it only satisfies min_confidence

c. If it satisfies both min_support and min_confidence

d. There are other measures that check this as well

Ans: C

Q9. What is the difference between absolute and relative support?

a. Absolute – Minimum support count threshold and Relative – Minimum support threshold

b. Absolute – Minimum support threshold and Relative – Minimum support count threshold

c. Both mean the same

Ans: A

Q10. What is the relation between candidate and frequent itemsets?

a. A candidate itemset is always a frequent itemset

b. A frequent itemset must be a candidate itemset

c. No relation between the two

d. Both are the same

Ans: B

Q11. Which technique finds the frequent itemsets in just two database scans?

a. Partitioning

b. Sampling

c. Hashing

d. Dynamic itemset counting

Ans: A
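
Here is a sketch of that two-scan partitioning technique (our own illustration, using a deliberately brute-force local miner that is only practical on toy partitions). The guarantee it relies on: any globally frequent itemset must be locally frequent in at least one partition.

```python
from itertools import combinations

def local_frequent(partition, min_frac):
    # Brute-force enumeration: exponential in general, fine for a toy example.
    items = sorted({i for t in partition for i in t})
    found = set()
    for k in range(1, len(items) + 1):
        for cand in combinations(items, k):
            c = frozenset(cand)
            if sum(c <= t for t in partition) >= min_frac * len(partition):
                found.add(c)
    return found

def partition_mine(transactions, min_frac, n_parts=2):
    transactions = [frozenset(t) for t in transactions]
    size = -(-len(transactions) // n_parts)  # ceiling division
    # Scan 1: gather locally frequent itemsets from every partition.
    candidates = set()
    for p in range(n_parts):
        candidates |= local_frequent(transactions[p * size:(p + 1) * size], min_frac)
    # Scan 2: verify each candidate once against the full database.
    return {c for c in candidates
            if sum(c <= t for t in transactions) >= min_frac * len(transactions)}
```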

Q12. Which of the following is true?

a. Both Apriori and FP-growth use the horizontal data format

b. Both Apriori and FP-growth use the vertical data format

c. Apriori uses the horizontal and FP-growth uses the vertical data format

d. Apriori uses the vertical and FP-growth uses the horizontal data format

Ans: A
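
The two layouts from Q12, side by side on a made-up toy database:

```python
# Horizontal format (Apriori, FP-growth): each TID maps to its items.
horizontal = {
    "T1": {"bread", "milk"},
    "T2": {"bread", "beer"},
    "T3": {"milk", "beer"},
}

# Vertical format (Eclat): each item maps to its TID-set, so
# support({bread, beer}) is just len(vertical["bread"] & vertical["beer"]).
vertical = {
    "bread": {"T1", "T2"},
    "milk":  {"T1", "T3"},
    "beer":  {"T2", "T3"},
}
```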

Q13. What is the principle on which the Apriori algorithm works?

a. If a rule is infrequent, its specialized rules are also infrequent

b. If a rule is infrequent, its generalized rules are also infrequent

c. Both a and b

d. None of the above

Ans: A

Q14. Which of these is not a frequent pattern mining algorithm?

a. Apriori

b. FP growth

c. Decision trees

d. Eclat

Ans: C

Q15. Which algorithm requires fewer scans of data?

a. Apriori

b. FP growth

c. Both a and b

d. None of the above

Ans: B

Q16. What are Max_confidence, Cosine similarity, All_confidence?

a. Frequent pattern mining algorithms

b. Measures to improve the efficiency of Apriori

c. Pattern evaluation measures

d. None of the above

Ans: C

Q17. What are closed itemsets?

a. An itemset for which at least one proper super-itemset has the same support

b. An itemset of which no proper super-itemset has the same support

c. An itemset for which at least one proper super-itemset has the same confidence

d. An itemset of which no proper super-itemset has the same confidence

Ans: B

Q18. What are closed frequent itemsets?

a. A closed itemset

b. A frequent itemset

c. An itemset which is both closed and frequent

d. None of the above

Ans: C

Q19. What are maximal frequent itemsets?

a. A frequent itemset of which no super-itemset is frequent

b. A frequent itemset with a super-itemset that is also frequent

c. A non-frequent itemset with a frequent super-itemset

d. None of the above

Ans: A
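
Q17–Q19 can be checked mechanically on a toy database (our own illustration). Below, {b} is frequent but not closed, because its proper superset {a, b} has the same support, while {a, b} is both closed and the only maximal frequent itemset:

```python
from itertools import combinations

transactions = [{"a", "b"}, {"a", "b"}, {"a"}]
min_support = 2  # absolute count

def support(itemset):
    return sum(itemset <= t for t in transactions)

items = sorted({i for t in transactions for i in t})
frequent = [frozenset(c) for k in range(1, len(items) + 1)
            for c in combinations(items, k) if support(set(c)) >= min_support]
# Supports here: {a}: 3, {b}: 2, {a, b}: 2.

closed = [f for f in frequent
          if not any(f < g and support(g) == support(f) for g in frequent)]
maximal = [f for f in frequent if not any(f < g for g in frequent)]
print(closed)   # [{a}, {a, b}] -- {b} is excluded
print(maximal)  # [{a, b}]     -- every maximal frequent itemset is also closed
```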

Q20. Why is correlation analysis important?

a. To make apriori memory efficient

b. To weed out uninteresting frequent itemsets

c. To find large number of interesting itemsets

d. To restrict the number of database iterations

Ans: B
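
A classic worked example of why that weeding-out matters (the counts are assumed for illustration): a rule can clear both min_support and min_confidence and still be misleading, which lift exposes.

```python
# Assumed toy counts: 1000 transactions, 600 contain A,
# 750 contain B, and 400 contain both.
n, n_a, n_b, n_ab = 1000, 600, 750, 400

support_a, support_b, support_ab = n_a / n, n_b / n, n_ab / n
confidence = support_ab / support_a          # 0.667: looks like a strong rule
lift = support_ab / (support_a * support_b)  # 0.889 < 1: negative correlation

# Despite the high confidence, lift < 1 shows A -> B is uninteresting:
# buying A actually makes B slightly *less* likely.
```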

Q21. What will happen if the minimum support threshold is reduced?

a. The number of frequent itemsets remains the same

b. Some itemsets will be added to the current set of frequent itemsets

c. Some itemsets will become infrequent while others will become frequent

d. Cannot say

Ans: B

Q22. Find all strong association rules, given min_support = 0.6 and min_confidence = 0.8.

[The transaction table this question refers to, and the itemsets in the rule options (a–d), were lost in extraction; only the answer key below survives.]

Ans: B

Q23. What is the effect of reducing the min_confidence threshold on the rule set?

a. The number of association rules remains the same

b. Some association rules will be added to the current set of association rules

c. Some association rules will become invalid while others might become rules

d. Cannot say

Ans: B

Q24. Can the FP-growth algorithm be used if the FP-tree cannot fit in memory?

a. Yes

b. No

c. Both a and b

d. None of the above

Ans: B

Q25. What is association rule mining?

a. Same as frequent itemset mining

b. Finding of strong association rules using frequent itemsets

c. Using association to analyse correlation rules

d. None of the above

Ans: B
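
The second step named in that answer, generating strong rules from already-mined frequent itemsets, can be sketched as follows (our own illustration; frequent_supports is assumed to map every frequent itemset, including all of its subsets, to its support):

```python
from itertools import combinations

def generate_rules(frequent_supports, min_conf):
    """frequent_supports: dict of frozenset -> support. Returns strong rules."""
    rules = []
    for itemset, supp in frequent_supports.items():
        if len(itemset) < 2:
            continue  # a rule needs a non-empty antecedent and consequent
        for k in range(1, len(itemset)):
            for ante in combinations(sorted(itemset), k):
                ante = frozenset(ante)
                # Confidence(A -> B) = Support(A ∪ B) / Support(A).
                conf = supp / frequent_supports[ante]
                if conf >= min_conf:
                    rules.append((set(ante), set(itemset - ante), conf))
    return rules
```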

Q26. What is frequent pattern growth?

a. Same as frequent itemset mining

b. Use of hashing to make discovery of frequent itemsets more efficient

c. Mining of frequent itemsets without candidate generation

d. None of the above

Ans: C

Q27. When is sub-itemset pruning done?

a. A frequent itemset ‘P’ is a proper subset of another frequent itemset ‘Q’

b. Support(P) = Support(Q)

c. When both a and b are true

d. When a is true and b is not

Ans: C

Q28. Which of the following is not a null-invariant measure (i.e., a measure whose value is unaffected by null transactions)?

a. all_confidence

b. max_confidence

c. cosine measure

d. lift

Ans: D
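
A quick numeric check of Q28 (counts made up): adding null transactions, i.e. transactions containing neither A nor B, inflates lift but leaves all_confidence untouched; being unaffected like this is exactly what null invariance means.

```python
def measures(n, n_a, n_b, n_ab):
    lift = (n_ab / n) / ((n_a / n) * (n_b / n))
    all_conf = n_ab / max(n_a, n_b)  # all_confidence
    return lift, all_conf

# Same item counts; the second database just adds 10,000 null transactions.
print(measures(1_000, 100, 100, 50))   # lift = 5.0,  all_conf = 0.5
print(measures(11_000, 100, 100, 50))  # lift = 55.0, all_conf = 0.5
```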

Q29. The Apriori algorithm works in a ________ and ________ fashion.

a. top-down and depth-first

b. top-down and breadth-first

c. bottom-up and depth-first

d. bottom-up and breadth-first

Ans: D

