Hortonworks Partner


3Soft is a Community ISV/IHV Partner / Gold Reseller Partner in the Partnerworks Program.

3Soft offers comprehensive Big Data services customized for individual clients and their industries. We have a professional team of specialists who design the architecture and implement the Hadoop platform. We take responsibility for either the software layer or the cluster infrastructure.


  1. Design and analysis of data models
  • carrying out workshops with client’s representatives regarding data usage
  • determining ways of data collection from the Hadoop platform for the purposes of analytical systems
  • establishing requirements for NoSQL databases
  • defining data structures on HDFS
  • specifying ways of file data encryption on HDFS and determining optimal file sizes given the amount of data in the cluster
  • defining security and availability requirements of the cluster
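Determining optimal file sizes, as mentioned above, usually means avoiding many small files and instead targeting files that span several full HDFS blocks. A minimal sketch of that arithmetic (the helper name and the eight-blocks-per-file target are illustrative assumptions, not 3Soft's actual methodology):

```python
def plan_file_sizes(total_bytes, block_size=128 * 1024 * 1024, blocks_per_file=8):
    """Estimate a target file size and file count for a given data volume.

    Hypothetical helper: HDFS handles a few large files far better than
    many small ones, so we aim for `blocks_per_file` full blocks per file.
    """
    target_file_size = block_size * blocks_per_file   # e.g. 1 GiB per file
    file_count = -(-total_bytes // target_file_size)  # ceiling division
    return target_file_size, file_count

# 10 TiB of raw data with the defaults -> 1 GiB target files, 10240 of them
size, count = plan_file_sizes(10 * 1024**4)
```

The default 128 MiB block size matches the common HDFS default; in practice the block size and target would be read from the cluster configuration.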


  2. Providing the cluster with data from domain systems and data warehouses
  • determining ways of data collection from domain systems
  • designing and implementing interfaces to communicate with external systems
  • migrating archived data
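An ingestion interface of this kind typically batches records from a domain system into files of a sensible size before they land on HDFS. A minimal, self-contained sketch of that batching step (the function and batch size are hypothetical; a real pipeline would use Sqoop, Flume, or a custom connector):

```python
import json

def batch_records(records, batch_size=3):
    """Group domain-system records into fixed-size batches, each of which
    would be written out as one file on HDFS (illustrative sketch only)."""
    batch = []
    for rec in records:
        batch.append(json.dumps(rec))
        if len(batch) == batch_size:
            yield "\n".join(batch)
            batch = []
    if batch:
        yield "\n".join(batch)  # flush the final partial batch

# Seven records with batch_size=3 -> three "files": 3 + 3 + 1 records
files = list(batch_records([{"id": i} for i in range(7)]))
```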


  3. Implementation and testing of distributed MapReduce algorithms
  • implementation and verification in a test environment, including unit testing
  • measuring the performance of MapReduce algorithms
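The unit-testing point above is where MapReduce logic is usually verified outside the cluster: the map and reduce functions are run locally with a simulated shuffle/sort. A minimal word-count sketch in that Hadoop Streaming style (pure Python, assumed names; real jobs would run under Hadoop):

```python
from itertools import groupby
from operator import itemgetter

def mapper(line):
    """Map phase: emit a (word, 1) pair for each word in the line."""
    for word in line.split():
        yield word.lower(), 1

def reducer(pairs):
    """Reduce phase: sum the counts per key after a simulated shuffle/sort."""
    for word, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield word, sum(count for _, count in group)

lines = ["Hadoop stores data", "Hadoop processes data"]
pairs = [kv for line in lines for kv in mapper(line)]
counts = dict(reducer(pairs))
# counts == {'data': 2, 'hadoop': 2, 'processes': 1, 'stores': 1}
```

Because the mapper and reducer are plain functions, they can be asserted against known inputs in a test suite before the job is submitted to the cluster.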


  4. Administering the Hadoop platform within the cluster in terms of:
  • roles (NameNode, DataNode, RegionServer, etc.)
  • privileges (access to data on HDFS, integration with AD, etc.)
  • data processing (workflow)
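The workflow point covers sequencing dependent actions, the job that tools like Oozie perform on the platform. A toy sketch of resolving such a dependency graph into an execution order (the pipeline stages and helper are invented for illustration; it also assumes the graph is acyclic):

```python
def run_order(workflow):
    """Return an execution order for workflow actions given their
    dependencies (a toy stand-in for an Oozie workflow definition)."""
    order, done = [], set()
    while len(order) < len(workflow):
        for action, deps in workflow.items():
            if action not in done and all(d in done for d in deps):
                order.append(action)
                done.add(action)
    return order

# Hypothetical nightly pipeline: ingest -> clean -> (aggregate, index)
workflow = {"ingest": [], "clean": ["ingest"],
            "aggregate": ["clean"], "index": ["clean"]}
order = run_order(workflow)  # e.g. ['ingest', 'clean', 'aggregate', 'index']
```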


  5. Monitoring and maintenance of cluster infrastructure in the area of:
  • hardware (processors, memory, hard drives, network)
  • operating system (OS compatibility, patching, maintaining version consistency, etc.)
  • Hadoop platform components (HDFS, MapReduce, Oozie, Hue, HBase, Flume, etc.)
  • custom applications running on the Hadoop platform
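Hardware monitoring of the kind listed above boils down to polling node metrics against thresholds. A minimal disk-usage check using only the standard library (the function name and 85% threshold are illustrative; production clusters rely on Ambari-, Nagios-, or Ganglia-style monitoring):

```python
import shutil

def disk_alert(path="/", threshold=0.85):
    """Flag a node whose disk usage on `path` exceeds `threshold`.

    Hypothetical check: returns (alert, used_ratio) so a monitoring loop
    can both page an operator and log the current utilization.
    """
    usage = shutil.disk_usage(path)
    used_ratio = (usage.total - usage.free) / usage.total
    return used_ratio > threshold, round(used_ratio, 3)

alert, ratio = disk_alert("/")
```

The same pattern extends to memory and network counters; a real deployment would push these readings into the cluster's central monitoring stack rather than evaluate them ad hoc.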