Hadoop Basic Commands for beginners

  • how to hadoop
  • 2019-02-11
  • 315

Tags: hadoop commands, hadoop basics, hadoop, hadoop basics for beginners, hadoop basic concepts, big data basics, big data basic tutorial, hadoop fs commands with examples, hadoop copy from local to hdfs, hadoop copy file from local to hdfs, hadoop copy from hdfs to hdfs, hadoop put from local to hdfs, hadoop commands with examples, hadoop commands tutorial for beginners, hadoop linux commands, hadoop linux install tutorial, hadoop linux install

Video description: Hadoop Basic Commands for beginners

This tutorial is for practicing basic Hadoop commands. The commands below are listed with instructions:

Open a terminal window to the current working directory.
/home/hadoop
1. Print the Hadoop version
hadoop version
2. List the contents of the root directory in HDFS
hadoop fs -ls /
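If you also want to see everything below the root, the -R flag lists directories recursively (standard hadoop fs behaviour, shown here only as an optional variant):
hadoop fs -ls -R /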
3. Report the amount of space used and available on the currently mounted file system
hadoop fs -df hdfs:/
4. Count the number of directories, files and bytes under the paths that match the specified file pattern
hadoop fs -count hdfs:/
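As a side note, both -df and -count accept a -h flag that prints sizes in human-readable form, and the -count output columns are DIR_COUNT, FILE_COUNT, CONTENT_SIZE and PATHNAME (assuming a Hadoop 2.x or later shell):
hadoop fs -df -h hdfs:/
hadoop fs -count -h hdfs:/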
5. Run a DFS file system checking utility
hadoop fsck /
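fsck can also report more detail; for example, the -files and -blocks options list the files it checks and their blocks (an optional variant of the command above):
hadoop fsck / -files -blocks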
6. Run a cluster balancing utility
hadoop balancer
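On newer releases the balancer is usually started through the hdfs command, and an optional -threshold value (the percentage of disk-usage imbalance to tolerate) can be given; this is an alternative to the command above, not what the video runs:
hdfs balancer -threshold 5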
7. Create a new directory
hadoop fs -mkdir /home/hadoop/work/sample
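If the parent directories do not exist yet, the -p flag creates them in one go (an optional variant of the command above):
hadoop fs -mkdir -p /home/hadoop/work/sample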

8. Add a sample text file from the local directory
hadoop fs -put /home/hadoop/work/sample/sample.txt /home/hadoop/work/sample
9. List the contents of this new directory in HDFS.
hadoop fs -ls /home/hadoop/work/sample
10. Add the entire local directory called “tchadoop2” to /home/hadoop/work/sample in HDFS.
hadoop fs -put /home/hadoop/work/sample/tchadoop2 /home/hadoop/work/sample
11. Since /home/hadoop/work is your home directory in HDFS, any command that does not have an absolute path is interpreted as relative to that directory. The next command will therefore list your home directory, and should show the items you’ve just added there.
hadoop fs -ls
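To illustrate the relative-path behaviour described in step 11, and assuming /home/hadoop/work really is your HDFS home directory, the following two commands list the same directory:
hadoop fs -ls sample
hadoop fs -ls /home/hadoop/work/sample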
12. See how much space the “pragim” directory occupies in HDFS.
hadoop fs -du -s -h /home/hadoop/work/pragim
13. Delete the file ‘class2’ from the “pragim” directory.
hadoop fs -rm /home/hadoop/work/pragim/class2
14. Ensure this file is no longer in HDFS.
hadoop fs -ls /home/hadoop/work/pragim/class2
15. Delete all files from the “pragim” directory using a wildcard.
hadoop fs -rm /home/hadoop/work/pragim/*
16. To empty the trash
hadoop fs -expunge
17. Finally, remove the entire pragim directory and all of its contents in HDFS.
hadoop fs -rm -r /home/hadoop/work/pragim
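A related option worth knowing (not used in the video): -skipTrash deletes immediately instead of moving files to the trash, so no -expunge is needed afterwards:
hadoop fs -rm -r -skipTrash /home/hadoop/work/pragim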
18. List the hadoop directory again
hadoop fs -ls hadoop
19. Add the purchases.txt file from the local directory “/home/hadoop/work/sample/purchases.txt” to the directory you created in HDFS
hadoop fs -copyFromLocal /home/hadoop/work/sample/purchases.txt /home/hadoop/work/sample
20. View the contents of the text file purchases.txt which is present in your hadoop directory.
hadoop fs -cat /home/hadoop/work/sample/purchases.txt
21. Copy the purchases.txt file from the HDFS path “/home/hadoop/work/sample/purchases.txt” to the local path “/home/hadoop/work/sample/purchase1.txt”
hadoop fs -copyToLocal /home/hadoop/work/sample/purchases.txt /home/hadoop/work/sample/purchase1.txt

22. cp is used to copy files between directories present in HDFS
hadoop fs -cp /home/hadoop/work/sample/*.txt /home/hadoop/work/sample/cp
23. The ‘-get’ command can be used as an alternative to ‘-copyToLocal’
hadoop fs -get /home/hadoop/work/sample/purchases.txt /home/hadoop/work/sample/purchases3.txt
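A related command is -getmerge, which concatenates every file in an HDFS directory into a single local file; the merged.txt name below is just an example:
hadoop fs -getmerge /home/hadoop/work/sample /home/hadoop/work/sample/merged.txt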

24. Display the last kilobyte of the file “purchases.txt” to stdout.
hadoop fs -tail /home/hadoop/work/sample/purchases.txt
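There is no matching built-in for the first lines in older releases, but you can pipe -cat through the local head command to get the same effect (a plain shell pipe, not a Hadoop option):
hadoop fs -cat /home/hadoop/work/sample/purchases.txt | head -n 20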
25. New files in HDFS start from a base permission of 666, which the default umask of 022 reduces to 644.
Use the ‘-chmod’ command to change the permissions of a file
hadoop fs -ls /home/hadoop/work/sample/purchases.txt
sudo -u hdfs hadoop fs -chmod 600 /home/hadoop/work/sample/purchases.txt
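To verify the change, list the file again; the permissions column should now show rw------- (600). Ownership can be changed the same way with -chown, where hadoop:hadoop below is only an example owner and group:
hadoop fs -ls /home/hadoop/work/sample/purchases.txt
sudo -u hdfs hadoop fs -chown hadoop:hadoop /home/hadoop/work/sample/purchases.txt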

26. Move a directory from one location to other
hadoop fs -mv hadoop apache_hadoop
27. The default replication factor for a file is 3.
Use the ‘-setrep’ command to change the replication factor of a file
hadoop fs -setrep -w 2 apache_hadoop/sample.txt
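To confirm the new replication factor, -stat with the %r format prints the replication count of a file (the second column of -ls shows the same value):
hadoop fs -stat %r apache_hadoop/sample.txt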
28. Copy a directory from one cluster to another
Use the ‘distcp’ tool to copy,
the -overwrite option to overwrite existing files,
and the -update option to synchronize both directories
hadoop distcp hdfs://namenodeA/apache_hadoop hdfs://namenodeB/hadoop
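As a sketch of how the options combine with the command above, -overwrite and -update are normally used on separate runs, depending on whether you want to force a full copy or only bring the destination up to date:
hadoop distcp -overwrite hdfs://namenodeA/apache_hadoop hdfs://namenodeB/hadoop
hadoop distcp -update hdfs://namenodeA/apache_hadoop hdfs://namenodeB/hadoop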
29. Command to make the NameNode leave safe mode
sudo -u hdfs hdfs dfsadmin -safemode leave
30. List all the hadoop file system shell commands
hadoop fs
31. Last but not least, always ask for help!
hadoop fs -help
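Help is also available per command, for example for -ls:
hadoop fs -help ls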
