HDP on Linux – Installation Forum

Adding additional disk space to existing data nodes

  • #25759
    P K
    Participant

    I created a new filesystem and am trying to add it to the existing HDFS by updating the dfs.data.dir property in hdfs-site.xml on all data nodes. After restarting HDFS and the other components, I don't see the HDFS capacity increase. I also tried to perform the same task using Ambari, but the option to add an additional filesystem is greyed out. Any ideas how this can be done?
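    For reference, in Hadoop 1.x the DataNode storage property is dfs.data.dir, which takes a comma-separated list of directories (no spaces). A sketch of the relevant hdfs-site.xml entry, with placeholder paths:

    ```xml
    <!-- hdfs-site.xml on each DataNode; paths are examples only -->
    <property>
      <name>dfs.data.dir</name>
      <!-- comma-separated list, no spaces; the new mount is appended -->
      <value>/hadoop/hdfs/data,/newdisk/hdfs/data</value>
    </property>
    ```

    Each DataNode must be restarted after the change, and the new directory must be writable by the HDFS user.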


  • #25785
    Jeff Sposetti
    Moderator

    Hi,

    Which version of Ambari are you running? On your Ambari server, run:

    yum info ambari-server

    – Jeff

    #25827
    P K
    Participant

    Hi Jeff, please see below.
    Thanks

    yum info ambari-server
    Loaded plugins: product-id, rhnplugin, security, subscription-manager
    This system is not registered to Red Hat Subscription Management. You can use subscription-manager to register.
    This system is receiving updates from RHN Classic or RHN Satellite.
    Installed Packages
    Name : ambari-server
    Arch : noarch
    Version : 1.2.2.5
    Release : 1
    Size : 42 M
    Repo : installed
    From repo : Updates-ambari-1.x
    Summary : Ambari Server
    License : 2012, Apache Software Foundation
    Description : Maven Recipe: RPM Package.

    Available Packages
    Name : ambari-server
    Arch : noarch
    Version : 1.2.3.6
    Release : 1
    Size : 32 M
    Repo : Updates-ambari-1.x
    Summary : Ambari Server
    License : 2012, Apache Software Foundation
    Description : Maven Recipe: RPM Package.

    #25828
    Chris Miller
    Member

    Same issue here. When we modify the hdfs-site.xml configs on the data nodes directly, they get overwritten by what is in Ambari Server. We're using the Ambari Server that comes with HDP 1.2.2.

    #25932
    Seth Lyubich
    Moderator

    Hi all,

    The ability to change this configuration via Ambari was added in Ambari 1.2.3. You can upgrade and try again.

    Hope this helps.

    Thanks,
    Seth

    #25959
    Jeff Sposetti
    Moderator

    Links to the latest HDP version that includes Ambari 1.2.3 can be found here:

    http://docs.hortonworks.com/

    There is a chapter on Upgrading Ambari in the “Automated Install (Ambari)” doc.

    #26059
    Chris Miller
    Member

    Is there a way to update this parameter via Ambari’s REST API?

    e.g.

    {
      "Clusters": {
        "desired_config": {
          "type": "hdfs-site",
          "tag": "version1",
          "properties": {
            "dfs_data_dir": "/data",
          }
        }
    }

    However, I get the following error. Perhaps this method on the Ambari API was not implemented?

    Status Code: 500 Unable to parse json: org.codehaus.jackson.JsonParseException: Unexpected character ('}' (code 125)): was expecting double-quote to start field name at [Source: java.io.StringReader@44ffe3b2; line: 8, column: 8]
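    For what it's worth, that 500 looks like a plain JSON syntax error rather than an unimplemented API: the trailing comma after "/data" leaves the parser expecting another field name (line 8, column 8 is the closing brace of "properties"), and the payload is also missing one closing brace. A corrected body, which validates cleanly, might look like this (the dotted property name is the hdfs-site style; the underscore form belongs to Ambari's "global" config, and the paths are examples only):

    ```shell
    # Write the corrected payload: no trailing comma, four closing braces.
    cat > /tmp/hdfs-site-payload.json <<'EOF'
    {
      "Clusters": {
        "desired_config": {
          "type": "hdfs-site",
          "tag": "version2",
          "properties": {
            "dfs.data.dir": "/hadoop/hdfs/data,/data"
          }
        }
      }
    }
    EOF

    # Confirm it parses before sending it to the API.
    python3 -m json.tool /tmp/hdfs-site-payload.json > /dev/null && echo "payload OK"
    ```

    Note the "tag" must differ from the currently active one, since each change registers a new config version.
    
    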

    #26060
    Chris Miller
    Member

    FYI – I have Ambari 1.2.2.5 installed.

    #26062
    Larry Liu
    Moderator

    Hi, Chris, P K,

    We have a newer version of Ambari that allows you to make changes to hdfs-site.xml from the Ambari UI. Please go to the following page to upgrade:

    http://hortonworks.com/download/

    Thanks
    Larry

    #26086
    Chris Miller
    Member

    I know you have a newer version of Ambari out that we could upgrade to.

    The question is whether these Management API calls should work in 1.2.2.5. When were they introduced?

    Chris

    #26091
    Yi Zhang
    Moderator

    Hi Chris,

    The latest as of this moment is 1.2.3.6, which allows more configuration changes through the UI. The Ambari API supports host-level and cluster-level overrides. You basically create a newer version of the config.
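    The "create a newer version of the config" step can be sketched as a PUT against the cluster resource with a fresh tag. Everything below (host, cluster name, credentials, paths) is a placeholder, not taken from this thread, and the curl command is only printed, not executed:

    ```shell
    # Hypothetical cluster details -- substitute your own.
    AMBARI_HOST=ambari.example.com
    CLUSTER=MyCluster
    NEW_TAG="version$(date +%s)"   # tag must differ from the current one

    # Print the PUT that would register a new hdfs-site config version.
    echo curl -u admin:admin -X PUT \
      -d "{\"Clusters\":{\"desired_config\":{\"type\":\"hdfs-site\",\"tag\":\"${NEW_TAG}\",\"properties\":{\"dfs.data.dir\":\"/hadoop/hdfs/data,/data\"}}}}" \
      "http://${AMBARI_HOST}:8080/api/v1/clusters/${CLUSTER}"
    ```

    After the new config version is registered, the affected services still need a restart for the change to take effect.
    
    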

    That being said, is the new filesystem a different filesystem format than the existing ones? What kind?

    Thanks,
    Yi

