Hadoop interview questions part 1

Take as many assessments as you can to validate and improve your skill rating.

Total Questions: 20

1. User applications can instruct the NameNode to cache files by

Correct Answer is : adding a cache directive to a cache pool
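
As a quick illustration, HDFS centralized cache management exposes this through the hdfs cacheadmin tool; a minimal sketch, where the pool name reportsPool and the path /user/data/reports are hypothetical:

    # create a cache pool, then add a directive that pins a path into it
    hdfs cacheadmin -addPool reportsPool
    hdfs cacheadmin -addDirective -path /user/data/reports -pool reportsPool
    # list the directives the NameNode is currently tracking
    hdfs cacheadmin -listDirectives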

2. HDFS can be accessed over HTTP using

Correct Answer is : webhdfs URI scheme
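
For context, the same namespace can be reached either through the webhdfs:// URI scheme or through the WebHDFS REST endpoint over plain HTTP; a minimal sketch, assuming a hypothetical host namenode-host and the default Hadoop 3 HTTP port 9870 (older releases use 50070):

    # list a directory via the webhdfs:// URI scheme
    hdfs dfs -ls webhdfs://namenode-host:9870/user/data
    # the equivalent raw REST call over HTTP
    curl -i "http://namenode-host:9870/webhdfs/v1/user/data?op=LISTSTATUS"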

3. Which of the below properties is configured in core-site.xml ?

Correct Answer is : Directory names to store HDFS files.
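
For reference, core-site.xml carries cluster-wide settings such as the default filesystem URI and hadoop.tmp.dir, the base directory under which HDFS keeps its files unless hdfs-site.xml overrides it; a minimal sketch with hypothetical host and path values:

    <configuration>
      <property>
        <name>fs.defaultFS</name>
        <value>hdfs://namenode-host:8020</value>
      </property>
      <property>
        <!-- base directory; HDFS name/data dirs default to paths under it -->
        <name>hadoop.tmp.dir</name>
        <value>/var/hadoop/tmp</value>
      </property>
    </configuration>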

4. Which of the below Apache systems deals with ingesting streaming data into Hadoop

Correct Answer is : Flume
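
As an illustration, a Flume agent is wired together in a properties file as source -> channel -> sink; the sketch below, with hypothetical agent, file, and host names, tails a log file and lands the events in HDFS:

    agent1.sources = src1
    agent1.channels = ch1
    agent1.sinks = sink1

    # stream new lines from a local log file
    agent1.sources.src1.type = exec
    agent1.sources.src1.command = tail -F /var/log/app.log
    agent1.sources.src1.channels = ch1

    agent1.channels.ch1.type = memory

    # write the events into HDFS
    agent1.sinks.sink1.type = hdfs
    agent1.sinks.sink1.hdfs.path = hdfs://namenode-host:8020/flume/events
    agent1.sinks.sink1.channel = ch1

    # start it with: flume-ng agent --conf conf --conf-file agent1.conf --name agent1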

5. When a file in HDFS is deleted by a user

Correct Answer is : It goes to trash if configured.
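
For completeness, trash is enabled by setting fs.trash.interval (minutes to keep deleted files) in core-site.xml; a deleted file is then moved under the user's .Trash directory instead of being removed at once. A small sketch, with the interval and path chosen arbitrarily:

    <!-- core-site.xml: keep deleted files in trash for 24 hours -->
    <property>
      <name>fs.trash.interval</name>
      <value>1440</value>
    </property>

    # with trash enabled, the file is moved to /user/<username>/.Trash
    hdfs dfs -rm /user/data/old_report.csv
    # bypass trash and delete permanently
    hdfs dfs -rm -skipTrash /user/data/old_report.csv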

6. The Amazon ____________ is a Web-based service that allows business subscribers to run application programs in the Amazon.com computing environment.

Correct Answer is : None of the mentioned

7. Point out the correct statement :

Correct Answer is : All of the mentioned

8. Amazon ___________ is a Web service that provides real-time monitoring to Amazon’s EC2 customers.

Correct Answer is : CloudWatch
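
As an aside, CloudWatch metrics for an instance can be pulled with the AWS CLI; a minimal sketch with a hypothetical instance ID and time window:

    # average CPU utilization of one EC2 instance, in 5-minute periods
    aws cloudwatch get-metric-statistics \
      --namespace AWS/EC2 \
      --metric-name CPUUtilization \
      --dimensions Name=InstanceId,Value=i-0123456789abcdef0 \
      --statistics Average \
      --period 300 \
      --start-time 2024-01-01T00:00:00Z \
      --end-time 2024-01-01T01:00:00Z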

9. Amazon ___________ provides developers the tools to build failure resilient applications and isolate themselves from common failure scenarios.

Correct Answer is : EC2

10. Point out the wrong statement :

Correct Answer is : None of the mentioned

11. Amazon EC2 provides virtual computing environments, known as :

Correct Answer is : instances
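
To make that concrete, an instance is launched from an AMI with the AWS CLI; a minimal sketch, where the AMI ID and key-pair name are placeholders:

    # launch one t3.micro instance from an AMI
    aws ec2 run-instances \
      --image-id ami-0abcdef1234567890 \
      --instance-type t3.micro \
      --count 1 \
      --key-name my-key-pair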

12. Amazon ___________ is well suited to transfer bulk amounts of data.

Correct Answer is : S3

13. The EC2 can serve as a practically unlimited set of ___________ machines.

Correct Answer is : virtual

14. EC2 capacity can be increased or decreased in real time from as few as one to more than ___________ virtual machines simultaneously.

Correct Answer is : 1000

15. AMI is uploaded to the Amazon _______ and registered with Amazon EC2, creating a so-called AMI identifier (AMI ID).

Correct Answer is : S3
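
For reference, an instance-store image bundle that has been uploaded to S3 is registered roughly like this (bucket, manifest, and image names are hypothetical); the call returns the AMI ID:

    # register the bundle's S3 manifest; the output is the AMI ID
    aws ec2 register-image \
      --image-location my-bucket/my-image.manifest.xml \
      --name "my-ami"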

16. Amazon EMR also allows you to run multiple versions concurrently, allowing you to control your ___________ version upgrade.

Correct Answer is : Hive

17. Point out the correct statement :

Correct Answer is : Amazon Elastic MapReduce (Amazon EMR) provides support for Apache Hive.
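
As an illustration, Hive is enabled on an EMR cluster at launch time; a minimal sketch, with the cluster name, release label, and sizes chosen arbitrarily:

    # launch an EMR cluster with Hive installed
    aws emr create-cluster \
      --name "hive-cluster" \
      --release-label emr-6.10.0 \
      --applications Name=Hive \
      --instance-type m5.xlarge \
      --instance-count 3 \
      --use-default-roles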

18. The Amazon EMR default input format for Hive is :

Correct Answer is : org.apache.hadoop.hive.ql.io.CombineHiveInputFormat
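
For context, the active input format can be inspected or overridden from the Hive session itself; hive.input.format is a standard Hive setting:

    -- show the current value inside the Hive CLI / Beeline
    SET hive.input.format;
    -- override it for the session
    SET hive.input.format=org.apache.hadoop.hive.ql.io.CombineHiveInputFormat;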

19. Hadoop clusters running on Amazon EMR use ______ instances as virtual Linux servers for the master and slave nodes.

Correct Answer is : EC2
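
Building on the earlier create-cluster sketch, the master and slave (core) nodes map to EC2 instance groups that can be sized explicitly; the instance types and counts below are placeholders:

    # one master node and two core (slave) nodes, all EC2 instances
    aws emr create-cluster \
      --name "sample-cluster" \
      --release-label emr-6.10.0 \
      --use-default-roles \
      --instance-groups \
        InstanceGroupType=MASTER,InstanceCount=1,InstanceType=m5.xlarge \
        InstanceGroupType=CORE,InstanceCount=2,InstanceType=m5.xlarge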

20. Point out the wrong statement :

Correct Answer is : None of the mentioned
