Hadoop interview questions part 44
Take as many assessments as you can to improve and validate your skill rating.
Total Questions: 20
1. Which command is used to disable all the tables matching the given regex?
A. remove all
B. drop all
C. disable_all
D. all of the mentioned
Correct Answer is: disable_all
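In the HBase shell, disable_all takes a table-name regex, for example disable_all 'test_.*'. Below is a minimal Java sketch of the equivalent Admin calls; the pattern and connection setup are assumptions for illustration, not part of the question.

import java.util.regex.Pattern;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

public class DisableAllExample {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    try (Connection connection = ConnectionFactory.createConnection(conf);
         Admin admin = connection.getAdmin()) {
      // Shell equivalent: disable_all 'test_.*' (pattern is illustrative)
      for (TableName table : admin.listTableNames(Pattern.compile("test_.*"))) {
        admin.disableTable(table); // disable every table whose name matches the regex
      }
    }
  }
}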
2. The __________ command disables, drops and recreates a table.
A. drop
B. truncate
C. delete
D. none of the mentioned
Correct Answer is: truncate
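The shell truncate command disables, drops and recreates the table in one step. A minimal Java sketch of the same operation through the HBase Admin API, where the table is disabled explicitly first; the table name is an assumption for illustration.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

public class TruncateExample {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    try (Connection connection = ConnectionFactory.createConnection(conf);
         Admin admin = connection.getAdmin()) {
      TableName table = TableName.valueOf("test_table"); // illustrative table name
      admin.disableTable(table);         // the table must be disabled first
      admin.truncateTable(table, false); // false = do not preserve region splits
    }
  }
}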
3. Correct and valid syntax for the count command is:
A. count ‘…’
B. count ‘…’
C. count ‘…’
D. None of the mentioned
Correct Answer is: count ‘…’
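In the HBase shell the count command takes a quoted table name, for example count 'test_table'. Below is a rough Java sketch of the same row count done as a full table scan; the table name is an assumption for illustration.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;

public class CountRowsExample {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    try (Connection connection = ConnectionFactory.createConnection(conf);
         Table table = connection.getTable(TableName.valueOf("test_table"));
         ResultScanner scanner = table.getScanner(new Scan())) {
      long rows = 0;
      for (Result row : scanner) { // each Result returned by the scanner is one row
        rows++;
      }
      System.out.println("row count = " + rows);
    }
  }
}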
4. For running Hadoop service daemons in secure mode, ___________ principals are required.
A. SSL
B. Kerberos
C. SSH
D. None of the mentioned
Correct Answer is: Kerberos
5. Point out the correct statement:
A. Hadoop has the definition of group by itself
B. The MapReduce JobHistory server runs as the same user, such as mapred
C. SSO environment is managed using Kerberos with LDAP for Hadoop in secure mode
D. None of the mentioned
Correct Answer is: SSO environment is managed using Kerberos with LDAP for Hadoop in secure mode
6. The simplest way to do authentication is using the _________ command of Kerberos.
A. auth
B. kinit
C. authorize
D. all of the mentioned
Correct Answer is: kinit
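kinit is run from the shell (for example, kinit user@REALM) to obtain a Kerberos ticket. As a hedged sketch, the programmatic analogue inside a Hadoop client is a keytab login; the principal and keytab path below are made up for illustration, and this assumes hadoop.security.authentication is already set to kerberos (see question 9).

import org.apache.hadoop.security.UserGroupInformation;

public class KerberosLoginExample {
  public static void main(String[] args) throws Exception {
    // Shell equivalent: kinit -kt /etc/security/keytabs/alice.keytab alice@EXAMPLE.COM
    UserGroupInformation.loginUserFromKeytab(
        "alice@EXAMPLE.COM", "/etc/security/keytabs/alice.keytab");
    System.out.println("Logged in as " + UserGroupInformation.getLoginUser());
  }
}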
7. Data transfer between the Web console and clients is protected by using:
A. SSL
B. Kerberos
C. SSH
D. None of the mentioned
Correct Answer is: SSL
8. Point out the wrong statement:
A. Data transfer protocol of DataNode does not use the RPC framework of Hadoop
B. Apache Oozie, which accesses the services of Hadoop on behalf of end users, needs to be able to impersonate end users
C. DataNode must authenticate itself by using privileged ports which are specified by dfs.datanode.address and dfs.datanode.http.address
D. None of the mentioned
Correct Answer is: None of the mentioned
9. In order to turn on RPC authentication in Hadoop, set the value of the hadoop.security.authentication property to:
A. zero
B. kerberos
C. FALSE
D. none of the mentioned
Correct Answer is: kerberos
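A minimal sketch of setting the property programmatically; in a real cluster the value normally lives in core-site.xml, and the property name and value are the only details taken from the question.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class EnableRpcAuthExample {
  public static void main(String[] args) {
    Configuration conf = new Configuration();
    conf.set("hadoop.security.authentication", "kerberos"); // turn on RPC authentication
    UserGroupInformation.setConfiguration(conf);
    System.out.println("security enabled: " + UserGroupInformation.isSecurityEnabled());
  }
}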
10. The __________ provides a proxy between the web applications exported by an application and an end user.
A. ProxyServer
B. WebAppProxy
C. WebProxy
D. None of the mentioned
Correct Answer is: WebAppProxy
11. ___________ is used by the YARN framework to define how any container is launched and controlled.
A. Container
B. ContainerExecutor
C. Executor
D. All of the mentioned
Correct Answer is: ContainerExecutor
12. The ____________ requires that paths including and leading up to the directories specified in yarn.nodemanager.local-dirs and yarn.nodemanager.log-dirs be set to 755 permissions.
A. TaskController
B. LinuxTaskController
C. LinuxController
D. None of the mentioned
Correct Answer is: LinuxTaskController
13. The configuration file must be owned by the user running:
A. DataManager
B. NodeManager
C. ValidationManager
D. None of the mentioned
Correct Answer is: NodeManager
14. Apache _______ is a serialization framework that produces data in a compact binary format.
A. Oozie
B. Impala
C. Kafka
D. Avro
Correct Answer is: Avro
15. Point out the correct statement:
A. Apache Avro is a framework that allows you to serialize data in a format that has a schema built in
B. The serialized data is in a compact binary format that doesn’t require proxy objects or code generation
C. Including schemas with the Avro messages allows any application to deserialize the data
D. All of the mentioned
Correct Answer is: All of the mentioned
16. Avro schemas describe the format of the message and are defined using:
A. JSON
B. XML
C. JS
D. All of the mentioned
Correct Answer is: JSON
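For example, an Avro schema is just a JSON document. A minimal sketch of parsing one in Java; the record and field names are invented for illustration.

import org.apache.avro.Schema;

public class AvroSchemaExample {
  public static void main(String[] args) {
    String json = "{"
        + "\"type\": \"record\","
        + "\"name\": \"Model\","
        + "\"fields\": ["
        + "  {\"name\": \"name\", \"type\": \"string\"},"
        + "  {\"name\": \"year\", \"type\": \"int\"}"
        + "]}";
    Schema schema = new Schema.Parser().parse(json); // schemas are plain JSON
    System.out.println(schema.toString(true));
  }
}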
17. The ____________ is an iterator which reads through the file and returns objects using the next() method.
A. DatReader
B. DatumReader
C. DatumRead
D. None of the mentioned
Correct Answer is: DatumReader
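In the Avro Java API the DatumReader decodes individual records while a DataFileReader drives the iteration with hasNext()/next(). A hedged sketch of reading a data file, with the file name assumed for illustration:

import java.io.File;
import org.apache.avro.file.DataFileReader;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DatumReader;

public class ReadAvroFileExample {
  public static void main(String[] args) throws Exception {
    DatumReader<GenericRecord> datumReader = new GenericDatumReader<GenericRecord>();
    try (DataFileReader<GenericRecord> fileReader =
             new DataFileReader<GenericRecord>(new File("models.avro"), datumReader)) {
      while (fileReader.hasNext()) {
        GenericRecord record = fileReader.next(); // next() deserializes one datum
        System.out.println(record);
      }
    }
  }
}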
18. Point out the wrong statement:
A. Java code is used to deserialize the contents of the file into objects
B. Avro allows you to use complex data structures within Hadoop MapReduce jobs
C. The m2e plugin automatically downloads the newly added JAR files and their dependencies
D. None of the mentioned
Correct Answer is: None of the mentioned
19. The ____________ class extends and implements several Hadoop-supplied interfaces.
A. AvroReducer
B. Mapper
C. AvroMapper
D. None of the mentioned
Correct Answer is: AvroMapper
20. The ____________ class accepts the values that the ModelCountMapper object has collected.
A. AvroReducer
B. Mapper
C. AvroMapper
D. None of the mentioned
Correct Answer is: AvroReducer
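A hedged sketch of the older org.apache.avro.mapred API that questions 19 and 20 refer to, patterned after the standard Avro word-count example; the ModelCount class names and the (model, count) pair layout are assumptions for illustration.

import java.io.IOException;
import org.apache.avro.mapred.AvroCollector;
import org.apache.avro.mapred.AvroMapper;
import org.apache.avro.mapred.AvroReducer;
import org.apache.avro.mapred.Pair;
import org.apache.avro.util.Utf8;
import org.apache.hadoop.mapred.Reporter;

// AvroMapper subclasses receive deserialized Avro datums directly.
public class ModelCountMapper extends AvroMapper<Utf8, Pair<Utf8, Long>> {
  @Override
  public void map(Utf8 model, AvroCollector<Pair<Utf8, Long>> collector, Reporter reporter)
      throws IOException {
    collector.collect(new Pair<Utf8, Long>(model, 1L)); // emit (model, 1)
  }
}

// The AvroReducer accepts the values the mapper collected and sums them per model.
class ModelCountReducer extends AvroReducer<Utf8, Long, Pair<Utf8, Long>> {
  @Override
  public void reduce(Utf8 model, Iterable<Long> counts,
      AvroCollector<Pair<Utf8, Long>> collector, Reporter reporter) throws IOException {
    long total = 0;
    for (long count : counts) {
      total += count;
    }
    collector.collect(new Pair<Utf8, Long>(model, total));
  }
}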