[Question title]: Why does Crawler4j's non-blocking method not wait for links in the queue?
[Posted]: 2016-02-03 18:58:05
[Question]:

Given this simple code:

CrawlConfig config = new CrawlConfig();
config.setMaxDepthOfCrawling(1);
config.setPolitenessDelay(1000);
config.setResumableCrawling(false);
config.setIncludeBinaryContentInCrawling(false);
config.setCrawlStorageFolder(Config.get(Config.CRAWLER_SHARED_DIR) + "test/");
config.setShutdownOnEmptyQueue(false);
PageFetcher pageFetcher = new PageFetcher(config);
RobotstxtConfig robotstxtConfig = new RobotstxtConfig();
robotstxtConfig.setEnabled(false);
RobotstxtServer robotstxtServer = new RobotstxtServer(robotstxtConfig, pageFetcher);
CrawlController controller = new CrawlController(config, pageFetcher, robotstxtServer);
controller.addSeed("http://localhost/test");

controller.startNonBlocking(WebCrawler.class, 1);


long counter = 1;
while(Thread.currentThread().isAlive()) {
    System.out.println(config.toString());
    for (int i = 0; i < 4; i++) {
        System.out.println("Adding link");
        controller.addSeed("http://localhost/test" + ++counter + "/");
    }

    try {
        TimeUnit.SECONDS.sleep(5);
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
}

The program's output is:

18:48:02.411 [main] INFO  - Obtained 6791 TLD from packaged file tld-names.txt
18:48:02.441 [main] INFO  - Deleted contents of: /home/scraper/test/frontier ( as you have configured resumable crawling to false )
18:48:02.636 [main] INFO  - Crawler 1 started
18:48:02.636 [Crawler 1] INFO  - Crawler Crawler 1 started!
Adding link
Adding link
Adding link
Adding link
18:48:02.685 [Crawler 1] WARN  - Skipping URL: http://localhost/test, StatusCode: 404, text/html; charset=iso-8859-1, Not Found
18:48:03.642 [Crawler 1] WARN  - Skipping URL: http://localhost/test2/, StatusCode: 404, text/html; charset=iso-8859-1, Not Found
18:48:04.642 [Crawler 1] WARN  - Skipping URL: http://localhost/test3/, StatusCode: 404, text/html; charset=iso-8859-1, Not Found
18:48:05.643 [Crawler 1] WARN  - Skipping URL: http://localhost/test4/, StatusCode: 404, text/html; charset=iso-8859-1, Not Found
18:48:06.642 [Crawler 1] WARN  - Skipping URL: http://localhost/test5/, StatusCode: 404, text/html; charset=iso-8859-1, Not Found
Adding link
Adding link
Adding link
Adding link
Adding link
Adding link
Adding link
Adding link

Why doesn't crawler4j visit test6, test7 and beyond?

As you can see, all of the links added before them were correctly added and visited.

When I set "http://localhost/" as the seed URL (before starting the crawler), it processes at most 13 links, and then the same problem occurs.

What I want to achieve is a setup where I can add URLs from other threads, at runtime, for the running crawler to visit.
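The intended pattern can be shown without crawler4j at all. Below is a minimal, hypothetical sketch (the class, method, and POISON sentinel are my own names, not crawler4j API): a worker thread blocks on a queue of seed URLs instead of exiting when the queue is empty, while a feeder thread keeps adding URLs at runtime.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch, not crawler4j code: a crawler loop fed at runtime.
public class SeedFeedDemo {
    static final String POISON = "__STOP__"; // sentinel that ends the loop

    // Drains URLs until the poison pill arrives; blocks while the queue is empty.
    static List<String> crawlLoop(BlockingQueue<String> seeds) throws InterruptedException {
        List<String> visited = new ArrayList<>();
        while (true) {
            String url = seeds.take();   // waits instead of shutting down on an empty queue
            if (POISON.equals(url)) break;
            visited.add(url);            // a real crawler would fetch the page here
        }
        return visited;
    }

    public static void main(String[] args) throws Exception {
        BlockingQueue<String> seeds = new LinkedBlockingQueue<>();
        Thread feeder = new Thread(() -> {
            try {
                for (int i = 1; i <= 4; i++) {
                    seeds.put("http://localhost/test" + i + "/");
                    TimeUnit.MILLISECONDS.sleep(50); // simulate seeds arriving over time
                }
                seeds.put(POISON);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        feeder.start();
        List<String> visited = crawlLoop(seeds);
        feeder.join();
        System.out.println("Visited " + visited.size() + " URLs");
    }
}
```

With `setShutdownOnEmptyQueue(false)`, crawler4j's `addSeed` is supposed to play the role of `seeds.put(...)` here, which is why the observed hang is surprising.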

[Edit]: I looked at the thread dump as @Seth suggested, but I can't figure out why it isn't working.

"Thread-1" #25 prio=5 os_prio=0 tid=0x00007ff32854b800 nid=0x56e3 waiting on condition [0x00007ff2de403000]
   java.lang.Thread.State: TIMED_WAITING (sleeping)
    at java.lang.Thread.sleep(Native Method)
    at edu.uci.ics.crawler4j.crawler.CrawlController.sleep(CrawlController.java:367)
    at edu.uci.ics.crawler4j.crawler.CrawlController$1.run(CrawlController.java:243)
    - locked <0x00000005959baff8> (a java.lang.Object)
    at java.lang.Thread.run(Thread.java:745)

   Locked ownable synchronizers:
    - None

"Crawler 1" #24 prio=5 os_prio=0 tid=0x00007ff328544000 nid=0x56e2 in Object.wait() [0x00007ff2de504000]
   java.lang.Thread.State: WAITING (on object monitor)
    at java.lang.Object.wait(Native Method)
    - waiting on <0x0000000596afdd28> (a java.lang.Object)
    at java.lang.Object.wait(Object.java:502)
    at edu.uci.ics.crawler4j.frontier.Frontier.getNextURLs(Frontier.java:151)
    - locked <0x0000000596afdd28> (a java.lang.Object)
    at edu.uci.ics.crawler4j.crawler.WebCrawler.run(WebCrawler.java:259)
    at java.lang.Thread.run(Thread.java:745)

   Locked ownable synchronizers:
    - None

[Question discussion]:

  • Why are you using a for loop limited to 4?
  • @Seth I wanted to simulate adding 4 links to crawl from a remote source. The exact number doesn't matter here.
  • Noted; I'm currently trying to understand your crawler, which is why I asked.
  • @Seth What I want to achieve is a crawler, running in a separate thread, that only crawls sites added by other threads at runtime. If it is not initialized with seeds, the crawler needs to wait for them.
  • Ah! OK, I see. May I ask why you're doing it this way? You could just collect links, wait until you have 10, then call the crawl method with them and wait again. Or is this for a specific task?

标签: java web-scraping web-crawler crawler4j


[Solution 1]:

So I found what the problem was. It is the same issue as this pull request
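The thread dump above points at the class of bug: "Crawler 1" is parked in `Object.wait()` inside `Frontier.getNextURLs`, and nothing wakes it when new seeds are scheduled. The following is a simplified sketch of that wait/notify pattern (my own class, not crawler4j's actual code): the producer side must `notifyAll()` on the same monitor the consumer waits on, otherwise a crawler that went to sleep on an empty queue sleeps forever.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Simplified sketch of a frontier's wait/notify handshake (not crawler4j source).
public class FrontierSketch {
    private final Deque<String> workQueue = new ArrayDeque<>();
    private final Object waitingList = new Object(); // monitor the crawler waits on

    // Producer side: enqueue a URL and wake any waiting crawler thread.
    public void scheduleURL(String url) {
        synchronized (waitingList) {
            workQueue.add(url);
            waitingList.notifyAll(); // omitting this wake-up produces the observed hang
        }
    }

    // Consumer side: block until a URL is available, as in the thread dump's
    // WAITING (on object monitor) state.
    public String getNextURL() throws InterruptedException {
        synchronized (waitingList) {
            while (workQueue.isEmpty()) {
                waitingList.wait();
            }
            return workQueue.poll();
        }
    }
}
```

If the notification is dropped on some code path (which is what the linked pull request addresses), `addSeed` still enqueues the URL, but the waiting crawler thread never re-checks the queue.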

[Discussion]:
