【Question Title】: ElasticSearch Scroll API not going past 10000 limit
【Posted】: 2021-01-28 11:53:05
【Question】:

I'm using the Scroll API to fetch more than 10,000 documents from our Elasticsearch cluster, but whenever I query past the 10k mark, I get the following error:

Elasticsearch exception [type=search_phase_execution_exception, reason=all shards failed]

Here is my code:

    try {
        // 1. Build the search request
        final Scroll scroll = new Scroll(TimeValue.timeValueMinutes(1L));
        SearchRequest searchRequest = new SearchRequest(eventId);
        SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder();

        searchSourceBuilder.query(queryBuilder);
        searchSourceBuilder.size(limit);

        searchSourceBuilder.profile(true); // used to profile the execution of queries and aggregations for a specific search

        searchSourceBuilder.timeout(new TimeValue(60, TimeUnit.SECONDS)); // optional parameter that controls how long the search is allowed to take

        if (CollectionUtils.isNotEmpty(sortBy)) {
            for (int i = 0; i < sortBy.size(); i++) {
                String sortByField = sortBy.get(i);
                String orderByField = orderBy.get(i < orderBy.size() ? i : orderBy.size() - 1);
                SortOrder sortOrder = (orderByField != null && orderByField.trim().equalsIgnoreCase("asc")) ? SortOrder.ASC : SortOrder.DESC;
                if (keywordFields.contains(sortByField)) {
                    sortByField = sortByField + ".keyword";
                } else if (rawFields.contains(sortByField)) {
                    sortByField = sortByField + ".raw";
                }
                searchSourceBuilder.sort(new FieldSortBuilder(sortByField).order(sortOrder));
            }
        }
        searchSourceBuilder.sort(new FieldSortBuilder("_id").order(SortOrder.ASC));

        if (includes != null) {
            String[] excludes = {""};
            searchSourceBuilder.fetchSource(includes, excludes);
        }

        if (CollectionUtils.isNotEmpty(aggregations)) {
            aggregations.forEach(searchSourceBuilder::aggregation);
        }

        searchRequest.scroll(scroll);
        searchRequest.source(searchSourceBuilder);

        SearchResponse resp = client.search(searchRequest, RequestOptions.DEFAULT);
        String scrollId = resp.getScrollId();
        SearchHit[] searchHits = resp.getHits().getHits();

        // Pagination - will continue to call ES until there are no more pages
        while (searchHits != null && searchHits.length > 0) {
            SearchScrollRequest scrollRequest = new SearchScrollRequest(scrollId);
            scrollRequest.scroll(scroll);
            resp = client.scroll(scrollRequest, RequestOptions.DEFAULT);
            scrollId = resp.getScrollId();
            searchHits = resp.getHits().getHits();
        }

        // Clear scroll request to release the search context
        ClearScrollRequest clearScrollRequest = new ClearScrollRequest();
        clearScrollRequest.addScrollId(scrollId);
        client.clearScroll(clearScrollRequest, RequestOptions.DEFAULT);

    } catch (Exception e) {
        String msg = "Could not get search result. Exception=" + ExceptionUtilsEx.getExceptionInformation(e);
        throw new Exception(msg);
    }

I was implementing the solution from this link: https://www.elastic.co/guide/en/elasticsearch/client/java-rest/current/java-rest-high-search-scroll.html

Can anyone tell me what I'm doing wrong, and what I need to do to get past 10,000 documents with the Scroll API?

【Question Comments】:

  • Can you show the full error you're getting?
  • Here is the error: Elasticsearch exception [type=search_phase_execution_exception, reason=all shards failed]
  • That's not the full error. Can you check the ES server logs?
  • This is running on localhost. This is the exception I got at a breakpoint. It gives the reason "no search context for id {id}".
  • Does an iteration take longer than 60 seconds?

Tags: elasticsearch


【Solution 1】:

If your iterations take longer than 5 minutes, then you need to adjust the scroll timeout. Change this line to make sure the scroll context doesn't vanish after 1 minute:

final Scroll scroll = new Scroll(TimeValue.timeValueMinutes(10L));

Then remove this line:

searchSourceBuilder.timeout(new TimeValue(60, TimeUnit.SECONDS)); // optional parameter that controls how long the search is allowed to take
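With those two changes, the scroll loop simply drains batches until one comes back empty, which is how it gets past the 10k `max_result_window` cap that a plain from/size search hits. The control flow can be sketched without a live cluster: in this minimal, self-contained example, the hypothetical `nextBatch` method stands in for the `client.search`/`client.scroll` round-trips (it is not part of any Elasticsearch API), and each batch is processed inside the loop before fetching the next one.

```java
import java.util.ArrayList;
import java.util.List;

public class ScrollDemo {

    // Hypothetical stand-in for one scroll round-trip: returns up to
    // pageSize document ids starting at 'from', or an empty list when
    // the "index" of 'total' documents is exhausted.
    static List<Integer> nextBatch(int from, int pageSize, int total) {
        List<Integer> batch = new ArrayList<>();
        for (int i = from; i < Math.min(from + pageSize, total); i++) {
            batch.add(i);
        }
        return batch;
    }

    // Drains the whole source batch by batch, mirroring the scroll loop:
    // issue the initial search, process each batch inside the loop,
    // then ask for the next batch until an empty one signals the end.
    static int collect(int total, int pageSize) {
        int collected = 0;
        List<Integer> hits = nextBatch(0, pageSize, total); // initial search
        while (!hits.isEmpty()) {
            collected += hits.size();                       // process this batch here
            hits = nextBatch(collected, pageSize, total);   // next "scroll" call
        }
        return collected;
    }

    public static void main(String[] args) {
        // 25,000 documents drained in 10,000-hit batches: three round-trips,
        // well past the 10k window of a plain from/size search.
        System.out.println(collect(25_000, 10_000)); // prints 25000
    }
}
```

The point of the sketch is that each round-trip must finish within the scroll keep-alive; if processing a batch takes longer than the keep-alive, the next scroll call finds its search context already expired.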

【Discussion】:

  • Sorry, I haven't tried it yet. This exploration has been put on hold in favor of more urgent bugs.
  • However, do you have a rough idea of what would cause the Scroll API to throw an exception and not retrieve more than 10k documents?
  • As I said, the exception `no search context for id {id}` is usually thrown because an iteration takes longer than the scroll time you configured. If you increase the scroll time to a duration long enough for each iteration to complete, you won't get this exception anymore.