These are not either/or choices; the answer may be all of the above.
Securing a Hadoop cluster starts with identifying what type of data will be stored in it: PII, security-sensitive data, web logs, or low-value/high-volume data.
Then consider how users will access the data: through a middleware application (perhaps a BI tool) or directly. What controls are placed in the middleware? Are those controls sufficient? If they aren't, or the consequences of a breach are high, enable Kerberos. Then consider putting a firewall around your Hadoop cluster. Often an end user does not have a direct line of sight to an enterprise database, and a Hadoop cluster may need to be secured similarly.
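Once Kerberos is enabled, every user and service has to authenticate before touching the cluster. A minimal sketch of the end-user flow, assuming a Kerberized cluster; the realm and principal names are hypothetical:

```shell
# Obtain a Kerberos ticket before issuing any Hadoop commands.
# "analyst@EXAMPLE.COM" is a hypothetical principal in a hypothetical realm.
kinit analyst@EXAMPLE.COM

# Verify the ticket was granted.
klist

# HDFS commands now authenticate with the ticket instead of just a claimed username.
hdfs dfs -ls /data
```

Without a valid ticket, the same `hdfs dfs` command is rejected, which is the point: identity is proven, not asserted.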
If you have applications accessing Hadoop services over REST, Apache Knox is your friend: put it between the application and the Hadoop cluster so REST traffic goes through a single, securable gateway.
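For example, a client can reach WebHDFS through the Knox gateway over HTTPS instead of talking to the NameNode directly. A sketch, where the gateway host, port, credentials, and topology name are all assumptions for illustration:

```shell
# List an HDFS directory via WebHDFS, routed through Knox rather than the NameNode.
# knox.example.com, port 8443, the "default" topology, and the credentials
# are hypothetical; substitute your own deployment's values.
curl -k -u analyst:password \
  'https://knox.example.com:8443/gateway/default/webhdfs/v1/data?op=LISTSTATUS'
```

The application only ever sees the gateway URL, so the cluster's internal hosts stay behind the firewall.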
There are also authorization controls at various layers in Hadoop, from ACLs on MapReduce queues to HDFS file permissions, and more access-control improvements are coming.
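For instance, HDFS permissions (and, in recent releases, extended ACLs) can restrict a sensitive directory to a specific owner and group. The paths, users, and group names below are hypothetical:

```shell
# Restrict a PII directory to its owning user and the "finance" group.
hdfs dfs -chown etl:finance /data/pii
hdfs dfs -chmod 750 /data/pii

# Extended ACL (available in newer Hadoop releases):
# grant a single additional user read access without widening the group.
hdfs dfs -setfacl -m user:auditor:r-x /data/pii

# Inspect the resulting permissions and ACL entries.
hdfs dfs -getfacl /data/pii
```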
Consider enabling wire encryption to protect data as it moves through Hadoop, and consider custom or third-party solutions for encrypting data at rest (as it sits in HDFS). In the next couple of releases there will be Hadoop-native solutions for data-at-rest encryption.
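As a sketch, encryption for the HDFS block data-transfer protocol is switched on in hdfs-site.xml (RPC and shuffle encryption are configured separately, via hadoop.rpc.protection in core-site.xml and the MapReduce shuffle settings):

```xml
<!-- hdfs-site.xml: encrypt block data in transit between clients and DataNodes -->
<property>
  <name>dfs.encrypt.data.transfer</name>
  <value>true</value>
</property>
```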
You may also want to review the Hadoop Security labs (http://hortonworks.com/labs/security/); they give a good overview of Hadoop security, where it is heading, and links to a growing list of how-tos.