Course Outline

  • Apache Pig Basics
  • Apache Pig Diagnostic Operators
  • Apache Pig Grouping and Joining
  • Apache Pig Combining and Splitting
  • Apache Pig Filtering
  • Apache Pig Sorting
  • Pig Latin Built-in Functions
  • Apache Pig Other Execution Modes


Diagnostic Operators


The LOAD statement simply loads the data into the specified relation in Apache Pig. To verify the execution of the LOAD statement, you have to use the diagnostic operators. Pig Latin provides four different types of diagnostic operators:

  • Dump operator
  • Describe operator
  • Explain operator
  • Illustrate operator

In this chapter, we will discuss the Dump operator of Pig Latin.
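
For orientation, here is a minimal sketch of how the four diagnostic operators are typically invoked from the Grunt shell, assuming a relation named student has already been loaded (the other three operators are covered in later chapters):

  grunt> Dump student;        -- run the statements and print the relation to the screen
  grunt> Describe student;    -- print the schema of the relation
  grunt> Explain student;     -- show the logical, physical, and MapReduce execution plans
  grunt> Illustrate student;  -- show a step-by-step sample execution of the statements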

Dump Operator

The Dump operator is used to run Pig Latin statements and display the results on the screen. It is generally used for debugging purposes.

Syntax

The syntax of the Dump operator is given below.

  grunt> Dump Relation_Name;

Assume that we have a file student_data.txt in HDFS with the following content.

  001,Rajiv,Reddy,9848022337,Hyderabad
  002,siddarth,Battacharya,9848022338,Kolkata
  003,Rajesh,Khanna,9848022339,Delhi
  004,Preethi,Agarwal,9848022330,Pune
  005,Trupthi,Mohanthy,9848022336,Bhuwaneshwar
  006,Archana,Mishra,9848022335,Chennai
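
If the file is not yet in HDFS, one possible way to place it there directly from the Grunt shell is through the built-in fs commands. This is only a sketch: the local file name student_data.txt and the directory /pig_data are assumptions chosen to match the path used in the LOAD statement below.

  grunt> fs -mkdir hdfs://localhost:9000/pig_data
  grunt> fs -put student_data.txt hdfs://localhost:9000/pig_data/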

We have read it into a relation named student using the LOAD operator, as shown below.

  grunt> student = LOAD 'hdfs://localhost:9000/pig_data/student_data.txt'
     USING PigStorage(',')
     AS (id:int, firstname:chararray, lastname:chararray, phone:chararray,
     city:chararray);

Now, let us print the contents of the relation using the Dump operator, as shown below.

  grunt> Dump student;

Once the above Pig Latin statement is executed, it starts a MapReduce job to read the data from HDFS and produces the following output.

  2015-10-01 15:05:27,642 [main]
  INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher -
  100% complete
  2015-10-01 15:05:27,652 [main]
  INFO org.apache.pig.tools.pigstats.mapreduce.SimplePigStats - Script Statistics:
  HadoopVersion  PigVersion  UserId  StartedAt            FinishedAt        Features
  2.6.0          0.15.0      Hadoop  2015-10-01 15:03:11  2015-10-01 05:27  UNKNOWN
  Success!
  Job Stats (time in seconds):
  JobId             job_14459_0004
  Maps              1
  Reduces           0
  MaxMapTime        n/a
  MinMapTime        n/a
  AvgMapTime        n/a
  MedianMapTime     n/a
  MaxReduceTime     0
  MinReduceTime     0
  AvgReduceTime     0
  MedianReducetime  0
  Alias             student
  Feature           MAP_ONLY
  Outputs           hdfs://localhost:9000/tmp/temp580182027/tmp757878456,

  Input(s): Successfully read 0 records from: "hdfs://localhost:9000/pig_data/student_data.txt"
  Output(s): Successfully stored 0 records in: "hdfs://localhost:9000/tmp/temp580182027/tmp757878456"

  Counters: Total records written : 0 Total bytes written : 0 Spillable Memory Manager
  spill count : 0 Total bags proactively spilled: 0 Total records proactively spilled: 0

  Job DAG: job_1443519499159_0004
  2015-10-01 15:06:28,403 [main]
  INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Success!
  2015-10-01 15:06:28,441 [main] INFO org.apache.pig.data.SchemaTupleBackend -
  Key [pig.schematuple] was not set... will not generate code.
  2015-10-01 15:06:28,485 [main]
  INFO org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to process : 1
  2015-10-01 15:06:28,485 [main]
  INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths to process : 1

  (1,Rajiv,Reddy,9848022337,Hyderabad)
  (2,siddarth,Battacharya,9848022338,Kolkata)
  (3,Rajesh,Khanna,9848022339,Delhi)
  (4,Preethi,Agarwal,9848022330,Pune)
  (5,Trupthi,Mohanthy,9848022336,Bhuwaneshwar)
  (6,Archana,Mishra,9848022335,Chennai)
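
Note that Dump writes the entire relation to the console. On larger data sets, a common pattern is to dump only a handful of tuples first; here is a minimal sketch using the LIMIT operator on the student relation loaded above (the alias student_few is an arbitrary name chosen for this example):

  grunt> student_few = LIMIT student 3;   -- keep only the first three tuples
  grunt> Dump student_few;                -- print just the limited relation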