Problem description
My code is:

    Configuration conf = new Configuration();
    Job job = new Job(conf, "wordcount");
    conf.set("fs.defaultFS", "hdfs://192.168.117.128:8020/");
    conf.set("hadoop.job.ugi", "root");
    conf.set("mapred.job.tracker", "192.168.117.128:8021");
    Path inputPath = new Path("/input/FB_Bank_Comments_abi_required_original.txt");
    Path outputPath = new Path("/output/wordcount" + System.currentTimeMillis());
    job.setJobName("wordcount_analysis");
    job.setJarByClass(Map.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    job.setMapperClass(Map.class);
    job.setReducerClass(Reduce.class);
    job.setInputFormatClass(TextInputFormat.class);
    job.setOutputFormatClass(TextOutputFormat.class);
    FileInputFormat.addInputPath(job, inputPath);
    FileOutputFormat.setOutputPath(job, outputPath);
    job.waitForCompletion(true);

It always fails with the exception below. Yet if I package the program as a jar and run it directly on the Linux master node, it works fine.

    Exception in thread "main" java.lang.NullPointerException
        at java.lang.ProcessBuilder.start(Unknown Source)
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:404)
        at org.apache.hadoop.util.Shell.run(Shell.java:379)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:678)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:661)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
        at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:435)
        at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:277)
        at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:125)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:344)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Unknown Source)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
        at org.myorg.WordCount.main(WordCount.java:69)
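Editor's note, not from any reply in this thread: on Hadoop 2.x, this exact trace (a NullPointerException thrown by ProcessBuilder.start inside Shell.runCommand while creating the staging directory) is the classic symptom of submitting from a Windows client that lacks Hadoop's native winutils.exe helper, which would also explain why the same jar runs fine on the Linux master. A commonly cited workaround is to place winutils.exe in some directory's bin\ folder and point the hadoop.home.dir system property at that directory before any Hadoop class loads; the install path below is an assumption:

```java
public class WinutilsWorkaround {
    public static void main(String[] args) {
        // Hypothetical install path: hadoop.home.dir must contain bin\winutils.exe.
        // Hadoop's Shell class reads this system property (falling back to the
        // HADOOP_HOME environment variable) to locate its native helper binaries,
        // so it has to be set before the first Hadoop class is loaded.
        System.setProperty("hadoop.home.dir", "C:\\hadoop");
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

This only sketches the property being set; the winutils.exe binary itself must match the Hadoop version in use.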
Solutions
Solution 2:
Did you ever solve it? I have the same problem; OP, please share the fix.
Solution 3:
Same problem here, still unsolved. Could it be that the Eclipse plugin is incompatible with Hadoop 2.2?
Solution 4:
At first I used a hadoop-eclipse plugin downloaded from the web and hit this problem. I then built a plugin myself with ant, and still got the same error. I compiled the plugin with Eclipse SDK for Linux 4.3.1, while my Eclipse on Windows is the Juno release; could it be an Eclipse version mismatch? I followed the guide step by step and still hit this. Still working on it...
Solution 5:
Hi, I have the same problem. How did you solve it?
Solution 6:
After switching to a Linux machine and running Eclipse there, the problem went away.
Solution 7:
Same here: it works fine on a Linux machine, but the same error appears as soon as I switch to Windows.
Solution 8:
Same problem on Mac OS X 10.9.3 with Eclipse Kepler Service Release 2 (build id 20140224-0627). Strangely, it actually ran successfully once. Just once, and I have no idea how.
Solution 9:
Quoting reply #7 by huoqi12:
    Same problem on Mac OS X 10.9.3 with Eclipse Kepler Service Release 2 (build id 20140224-0627). Strangely, it actually ran successfully once. Just once, and I have no idea how.
Mine is solved. It was also a NullPointerException, but in my case it was thrown from org.apache.hadoop.mapreduce.lib.input.FileInputFormat. Going back over the code, my input was only a directory path; I had forgotten to include the file name...
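The fix in Solution 9 boils down to what gets passed to FileInputFormat.addInputPath: in that poster's setup, the missing file name was the culprit (in general FileInputFormat does accept directories, so this only applies to cases like theirs). As an illustration only, using a hypothetical helper that is not part of Hadoop, a crude sanity check on an input path string might look like:

```java
import java.nio.file.Paths;

public class InputPathCheck {
    // Hypothetical helper: flags paths whose last segment carries no extension,
    // which in this thread's setup meant a bare directory was passed by mistake.
    static boolean looksLikeFile(String path) {
        return Paths.get(path).getFileName().toString().contains(".");
    }

    public static void main(String[] args) {
        // The file path from the original question vs. a bare directory.
        System.out.println(looksLikeFile("/input/FB_Bank_Comments_abi_required_original.txt")); // true
        System.out.println(looksLikeFile("/input")); // false
    }
}
```

The extension heuristic is deliberately crude; it only catches the specific mistake described in Solution 9, not every invalid input path.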