Problem Description
The following error appears when running a crawl job with Nutch 1.8 + Hadoop 2.2.1:

15/01/12 22:35:23 ERROR crawl.Injector: Injector: java.lang.IllegalArgumentException: Wrong FS: hdfs://192.168.137.131:9000/user/haduser/crawld/crawldb/1501539946, expected: file:///
    at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:642)
    at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:69)
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:516)
    at org.apache.hadoop.fs.FileSystem.isDirectory(FileSystem.java:1410)
    at org.apache.hadoop.fs.ChecksumFileSystem.rename(ChecksumFileSystem.java:496)
    at org.apache.nutch.crawl.CrawlDb.install(CrawlDb.java:159)
    at org.apache.nutch.crawl.Injector.inject(Injector.java:295)
    at org.apache.nutch.crawl.Injector.run(Injector.java:316)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.nutch.crawl.Injector.main(Injector.java:306)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

Does anyone know how to fix this?
Solutions
Solution 2:
Have you copied the Hadoop configuration files into Nutch's conf directory and rebuilt the package?
Solution 3:
Quoting reply #1 from wulinshishen:
Have you copied the Hadoop configuration files into Nutch's conf directory and rebuilt the package?
I already tried that and still get the same error. So frustrating.
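Editor's note on the suggested fix: the message "Wrong FS: hdfs://... expected: file:///" means the job resolved the path against the local filesystem, i.e. Hadoop's site configuration (with the HDFS address) was not on Nutch's classpath at build time, so fs.defaultFS fell back to file:///. A minimal sketch of the core-site.xml that would need to be present in Nutch's conf directory before rebuilding; the host and port here are taken from the error message itself, and the file location in your Hadoop install may differ:

    <?xml version="1.0"?>
    <configuration>
      <!-- Must match the NameNode address seen in the error message.
           Without this, Hadoop's FileSystem defaults to file:/// and
           the Injector fails when renaming the crawldb on HDFS. -->
      <property>
        <name>fs.defaultFS</name>
        <value>hdfs://192.168.137.131:9000</value>
      </property>
    </configuration>

After placing this (and hdfs-site.xml) in Nutch's conf directory, rebuild the job jar with `ant runtime` from the Nutch source root so the configuration is packaged into the jar that is submitted to Hadoop. Note that in Hadoop 2.x the property is fs.defaultFS; the older fs.default.name is deprecated but still accepted.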