1. Introduction
  2. Prerequisites
  3. Set Environment Variables
  4. Setup SSH daemon
  5. Download Hadoop and place it in the home directory
  6. Unpack Hadoop
  7. Configure Hadoop
  8. Format the namenode
  9. Setup Hadoop plugin
  10. Start the cluster
  11. Setup Hadoop location
  12. Upload data
  13. Create and run a test project

Setup Hadoop Location in Eclipse

The next step is to configure a Hadoop location in the Eclipse environment.

  1. Launch the Eclipse environment.
  2. Open the Map/Reduce perspective: click the Open Perspective icon, select "Other" from the menu, and then select "Map/Reduce" from the list of perspectives.

  3. After switching to the Map/Reduce perspective, select the Map/Reduce Locations tab at the bottom of the Eclipse window. Right-click the blank space in that tab and select "New Hadoop location..." from the context menu. You should see a dialog box similar to the one shown below.

    Setting up new Map/Reduce location

  4. Fill in the following items, as shown in the figure above.
    • Location Name -- localhost
    • Map/Reduce Master
      • Host -- localhost
      • Port -- 9101
    • DFS Master
      • Check "Use M/R Master Host"
      • Port -- 9100
    • User name -- User

    Then press the Finish button.

  5. After closing the Hadoop location settings dialog, you should see a new location in the "Map/Reduce Locations" tab.
  6. In the Project Explorer tab on the left-hand side of the Eclipse window, find the DFS Locations item. Open it using the "+" icon on its left. Inside, you should see the localhost location reference with the blue elephant icon. Keep opening the items below it until you see something like the image below.

    Browsing HDFS location
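The two ports entered in the dialog should match the addresses set in Hadoop's configuration during the earlier "Configure Hadoop" step. A sketch of the relevant entries, assuming the localhost values used above (the file name and values here reflect this tutorial's setup; adjust them to your own configuration):

```xml
<!-- hadoop-site.xml (assumed from the earlier "Configure Hadoop" step) -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- DFS Master: port 9100 in the location dialog -->
    <value>hdfs://localhost:9100</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <!-- Map/Reduce Master: port 9101 in the location dialog -->
    <value>localhost:9101</value>
  </property>
</configuration>
```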

You can now move on to the next step.
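If the DFS Locations tree fails to open, a quick way to narrow down the problem is to check that the NameNode and JobTracker are actually listening on the ports you entered in the dialog. A minimal sketch, assuming the localhost ports used above:

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Ports taken from the location dialog above; adjust if your
# Hadoop configuration uses different values.
for name, port in [("DFS Master (NameNode)", 9100),
                   ("Map/Reduce Master (JobTracker)", 9101)]:
    state = "listening" if port_open("localhost", port) else "not reachable"
    print("%s on port %d: %s" % (name, port, state))
```

If either port is not reachable, revisit the "Start the cluster" step before retrying the Eclipse connection.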

 

