Hadoop 3.3.6 Configuration

The document outlines the steps to set up a Hadoop environment, including user creation, SSH key generation, and configuration of various Hadoop XML files. It specifies the necessary environment variables and directories for Hadoop installation and configuration. Finally, it includes commands to format the namenode and start the Hadoop services.


sudo adduser sce254001

sudo usermod -aG sudo sce254001
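Run the remaining steps as this user; if you created it from another account, you can switch to it with:

su - sce254001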

ssh-keygen -t rsa -P ""

cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
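Before moving on, it can help to tighten the .ssh permissions and confirm that passwordless SSH to localhost works (a quick check, assuming the key above was generated for the current user):

chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
# should log in and exit without asking for a password
ssh localhost exit && echo "passwordless SSH OK"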

tar -xvzf hadoop-3.3.6.tar.gz

1. hadoop-env.sh
..................
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
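If you are not sure of the exact JDK path on your machine, one way to check (assuming OpenJDK 11 installed from the distribution packages) is:

# resolves the real location of the java binary
readlink -f "$(which java)"
# typically prints /usr/lib/jvm/java-11-openjdk-amd64/bin/java on Ubuntu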

2. core-site.xml
.............
<configuration>
<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>
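Once the HADOOP_HOME and PATH variables later in this document are in place, the effective setting can be double-checked with hdfs getconf:

# prints the configured default filesystem URI
hdfs getconf -confKey fs.defaultFS
# expected: hdfs://localhost:9000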

3. hdfs-site.xml
....................
Create the HDFS storage directories (they must match the dfs.namenode.name.dir and dfs.datanode.data.dir values below):

mkdir -p /home/test/Downloads/hadoop-3.3.6/hadoop_store/hdfs/namenode
mkdir -p /home/test/Downloads/hadoop-3.3.6/hadoop_store/hdfs/datanode

<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
<property>
<name>dfs.namenode.name.dir</name>
<value>file:/home/test/Downloads/hadoop-3.3.6/hadoop_store/hdfs/namenode</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>file:/home/test/Downloads/hadoop-3.3.6/hadoop_store/hdfs/datanode</value>
</property>
</configuration>
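Before formatting the NameNode, it is worth confirming that both storage directories exist and are owned by the user that will run the daemons (paths assume the values above):

ls -ld /home/test/Downloads/hadoop-3.3.6/hadoop_store/hdfs/namenode \
       /home/test/Downloads/hadoop-3.3.6/hadoop_store/hdfs/datanode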

4. yarn-site.xml
................

<configuration>
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
</configuration>
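After the daemons are started (last step of this document), NodeManager registration with the ResourceManager can be verified with:

# lists the NodeManagers known to the ResourceManager
yarn node -list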

5. mapred-site.xml
...............
<configuration>
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
<property>
<name>yarn.app.mapreduce.am.env</name>
<value>HADOOP_MAPRED_HOME=/home/test/Downloads/hadoop-3.3.6</value>
</property>
<property>
<name>mapreduce.map.env</name>
<value>HADOOP_MAPRED_HOME=/home/test/Downloads/hadoop-3.3.6</value>
</property>
<property>
<name>mapreduce.reduce.env</name>
<value>HADOOP_MAPRED_HOME=/home/test/Downloads/hadoop-3.3.6</value>
</property>
</configuration>
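With the cluster running, a bundled example job makes a simple end-to-end test of HDFS, YARN and the MapReduce settings above (the jar name assumes the stock 3.3.6 distribution):

# estimates pi with 2 map tasks and 5 samples each
hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.3.6.jar pi 2 5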

gedit ~/.bashrc

Profile
...............
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export HADOOP_HOME=/home/test/Downloads/hadoop-3.3.6
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export HADOOP_YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"

source ~/.bashrc

Close the existing terminal and open a new one so the updated environment variables are picked up.
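In the new terminal, a quick check that the variables took effect:

# both should point at the 3.3.6 installation
echo $HADOOP_HOME
hadoop version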


hdfs namenode -format
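If the format succeeded, the NameNode storage directory should now contain a current/VERSION file (path assumes the dfs.namenode.name.dir value set earlier):

# shows the clusterID, namespaceID, etc. written by the format
cat /home/test/Downloads/hadoop-3.3.6/hadoop_store/hdfs/namenode/current/VERSION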

Go to the sbin directory and start the Hadoop daemons:

cd $HADOOP_HOME/sbin
./start-all.sh
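Once start-all.sh returns, the running daemons can be listed with jps; in this pseudo-distributed setup you would expect NameNode, DataNode, SecondaryNameNode, ResourceManager and NodeManager. The default Hadoop 3.x web UIs are:

jps
# NameNode web UI:        http://localhost:9870
# ResourceManager web UI: http://localhost:8088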
