Question
13. Explain the functionalities and important features of Fair Scheduler. What is DRF policy? Explain queue management. […+2]
14. What does a client do once it figures out that a data block is corrupt? What does a NameNode do when it gets a report that a block replica is corrupt? [2+2]
15. What is meant by the splittable property of a data compression scheme? Is that relevant for any Unix file system? Give one example each of a codec that is splittable and one that is not. [1+1+2]
16. What is the RPC mechanism? Explain four important desirable properties that an RPC mechanism should have. How is that different from a storage mechanism? [1+4+…]
17. What are serialization and deserialization? Explain, by means of a diagram, how serialization and deserialization are required in MapReduce job processing. [2+3]

Explanation / Answer
This question has a lot of subparts. Please post the remaining parts as separate questions.
17.
Serialization is the process of converting structured objects into a byte stream. It is done for two main purposes: transmission over a network (interprocess communication) and writing to persistent storage. In Hadoop, interprocess communication between nodes in the system is implemented using remote procedure calls (RPCs). The RPC protocol uses serialization to render the message as a binary stream to be sent to the remote node, which receives and deserializes the binary stream back into the original message.
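As a minimal sketch of this serialize/deserialize round trip (assuming Hadoop's org.apache.hadoop.io.IntWritable is on the classpath; this illustrates the Writable mechanics, not Hadoop's actual RPC code path):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;

public class RoundTrip {
    public static void main(String[] args) throws IOException {
        // Serialization: structured object -> byte stream
        IntWritable original = new IntWritable(163);
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buffer);
        original.write(out);                // Writable.write() emits the raw bytes
        out.close();

        // Deserialization: byte stream -> structured object
        IntWritable restored = new IntWritable();
        DataInputStream in = new DataInputStream(
                new ByteArrayInputStream(buffer.toByteArray()));
        restored.readFields(in);            // Writable.readFields() rebuilds the value
        in.close();

        System.out.println(restored.get()); // prints 163
    }
}
```

In a real RPC call the byte stream travels over the network rather than through an in-memory buffer, but the write()/readFields() contract is the same.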
In general, an RPC serialization format is expected to have four desirable properties:

- Compact: a compact format makes the best use of network bandwidth.
- Fast: interprocess communication forms the backbone of a distributed system, so serialization and deserialization must impose as little overhead as possible.
- Extensible: protocols change over time to meet new requirements, so it should be straightforward to evolve the format in a controlled manner, for example to add a new argument to a method call.
- Interoperable: for some systems it is desirable to support clients written in different languages, so the format needs to be designed to make that possible.

It should be noted that a data format for persistent storage has its own requirements: the same four properties are desirable, but with different emphases, since persistent data outlives any single RPC exchange.
Hadoop uses its own serialization format, Writables. Writable is compact and fast, but not extensible or interoperable.
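To make "compact and fast, but not extensible" concrete, here is a hypothetical custom Writable (PointWritable is an illustrative name, not a Hadoop class). The on-the-wire layout is exactly the raw fields in order, with no field names or schema information, which is why the format cannot evolve without breaking old readers:

```java
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.Writable;

public class PointWritable implements Writable {
    private int x;
    private int y;

    public PointWritable() { }   // no-arg constructor, needed so Hadoop
                                 // can instantiate the type by reflection
    public PointWritable(int x, int y) { this.x = x; this.y = y; }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeInt(x);   // exactly 8 bytes on the wire: compact,
        out.writeInt(y);   // but adding a field would break old readers
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        x = in.readInt();
        y = in.readInt();
    }
}
```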
16.
Hadoop has its own RPC mechanism that dates back to when Hadoop was a part of Nutch. It's used throughout Hadoop as the mechanism by which daemons talk to each other. For example, a DataNode communicates with the NameNode using the RPC interface DatanodeProtocol.
Protocols are defined using Java interfaces whose arguments and return types are primitives, Strings, Writables, or arrays of these. All of these types can be serialized using Hadoop's specialized serialization format, based on Writable. Combined with Java dynamic proxies, this yields a simple RPC mechanism that, to the caller, looks like an ordinary Java interface.
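As a sketch of what such a protocol definition looks like (MyProtocol and echo are made-up names for illustration; DatanodeProtocol follows the same pattern in classic Hadoop):

```java
import org.apache.hadoop.io.Text;
import org.apache.hadoop.ipc.VersionedProtocol;

// Arguments and return types are primitives, Strings, or Writables,
// so Hadoop's Writable-based serialization can put the call on the wire.
public interface MyProtocol extends VersionedProtocol {
    long versionID = 1L;   // checked at connection time so that client
                           // and server agree on the protocol version

    // The client invokes this through a dynamic proxy (RPC.getProxy());
    // the proxy serializes the argument, ships it to the server, and
    // deserializes the returned Writable.
    Text echo(Text message);
}
```

A server implementing this interface would be started with Hadoop's RPC server machinery; the key point is that the caller only ever sees the plain Java interface.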