【Title】: Elasticsearch high CPU usage
【Posted】: 2014-08-24 00:49:18
【Description】:

I have a 5-node cluster, with 1 replica per shard. The total data size is 216 MB across 853,000 documents. I am seeing very high CPU usage, around 60%–80%, every hour on the hour and every day from roughly 05:00 to 09:00. Only Elasticsearch runs on these servers.

I suspect something is wrong with the ES process, because there are only a few requests hitting the servers at those peak times, and no cron jobs are scheduled then either.

It happens every hour and every early morning around 05:00–09:00, and I have no idea what Elasticsearch is doing at those moments. Can anyone tell me what is going on here? Please..
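To see what the cluster is actually busy with during a peak, one option is to sample the hot threads API (available in Elasticsearch 1.1; the host and port below are assumptions for a local node):

```shell
# Sample the three hottest CPU-consuming threads on every node over a
# 500ms interval; run this while the CPU spike is happening.
# (localhost:9200 is an assumption -- point it at one of your nodes.)
curl -s 'http://localhost:9200/_nodes/hot_threads?threads=3&interval=500ms'
```

The output names the Java threads burning CPU (search, merge, GC, etc.), which narrows down whether the load comes from queries, indexing, or background work.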

$ ./elasticsearch -v 
Version: 1.1.1, Build: f1585f0/2014-04-16T14:27:12Z, JVM: 1.7.0_55 

$ java -version 
java version "1.7.0_55" 
Java(TM) SE Runtime Environment (build 1.7.0_55-b13) 
Java HotSpot(TM) 64-Bit Server VM (build 24.55-b03, mixed mode) 

Plugins installed on elasticsearch: HQ, bigdesk, head, kopf, sense

ES log at CPU peak time:

[2014-07-03 08:01:00,045][DEBUG][action.search.type       ] [node1] [search][4], node[GJjzCrLvQQ-ZRRoqL13MrQ], [P], s[STARTED]: Failed to execute [org.elasticsearch.action.search.SearchRequest@451f9e7c] lastShard [true] 
org.elasticsearch.common.util.concurrent.EsRejectedExecutionException: rejected execution (queue capacity 300) on org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction$4@68ab486b 
    at org.elasticsearch.common.util.concurrent.EsAbortPolicy.rejectedExecution(EsAbortPolicy.java:62) 
    at java.util.concurrent.ThreadPoolExecutor.reject(Unknown Source) 
    at java.util.concurrent.ThreadPoolExecutor.execute(Unknown Source) 
    at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction.onFirstPhaseResult(TransportSearchTypeAction.java:293) 
    at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction.onFirstPhaseResult(TransportSearchTypeAction.java:300) 
    at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction.start(TransportSearchTypeAction.java:190) 
    at org.elasticsearch.action.search.type.TransportSearchQueryThenFetchAction.doExecute(TransportSearchQueryThenFetchAction.java:59) 
    at org.elasticsearch.action.search.type.TransportSearchQueryThenFetchAction.doExecute(TransportSearchQueryThenFetchAction.java:49) 
    at org.elasticsearch.action.support.TransportAction.execute(TransportAction.java:63) 
    at org.elasticsearch.action.search.TransportSearchAction.doExecute(TransportSearchAction.java:108) 
    at org.elasticsearch.action.search.TransportSearchAction.doExecute(TransportSearchAction.java:43) 
    at org.elasticsearch.action.support.TransportAction.execute(TransportAction.java:63) 
    at org.elasticsearch.client.node.NodeClient.execute(NodeClient.java:92) 
    at org.elasticsearch.client.support.AbstractClient.search(AbstractClient.java:212) 
    at org.elasticsearch.rest.action.search.RestSearchAction.handleRequest(RestSearchAction.java:98) 
    at org.elasticsearch.rest.RestController.executeHandler(RestController.java:159) 
    at org.elasticsearch.rest.RestController.dispatchRequest(RestController.java:142) 
    at org.elasticsearch.http.HttpServer.internalDispatchRequest(HttpServer.java:121) 
    at org.elasticsearch.http.HttpServer$Dispatcher.dispatchRequest(HttpServer.java:83) 
    at org.elasticsearch.http.netty.NettyHttpServerTransport.dispatchRequest(NettyHttpServerTransport.java:291) 
    at org.elasticsearch.http.netty.HttpRequestHandler.messageReceived(HttpRequestHandler.java:43) 
    at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) 
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) 
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) 
    at org.elasticsearch.common.netty.handler.codec.http.HttpChunkAggregator.messageReceived(HttpChunkAggregator.java:145) 
    at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) 
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) 
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) 
    at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:296) 
    at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:459) 
    at org.elasticsearch.common.netty.handler.codec.replay.ReplayingDecoder.callDecode(ReplayingDecoder.java:536) 
    at org.elasticsearch.common.netty.handler.codec.replay.ReplayingDecoder.messageReceived(ReplayingDecoder.java:435) 
    at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) 
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) 
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) 
    at org.elasticsearch.common.netty.OpenChannelsHandler.handleUpstream(OpenChannelsHandler.java:74) 
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) 
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559) 
    at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:268) 
    at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:255) 
    at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88) 
    at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108) 
    at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:318) 
    at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89) 
    at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178) 
    at org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) 
    at org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) 
    at java.lang.Thread.run(Unknown Source)

【Discussion】:

    Tags: elasticsearch cpu-usage


    【Solution 1】:

    Are you 100% sure that only a few requests are in flight when this happens?

    The log shows that so many queries are running that new ones are being rejected, so I would expect bigdesk to show a very large number of queries.
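    One quick way to confirm this (a sketch, assuming a node is reachable on localhost:9200) is the cat thread-pool API, which reports active, queued, and rejected task counts per node:

```shell
# Show per-node thread-pool pressure; a climbing "search.rejected" count
# corresponds to the EsRejectedExecutionException (queue capacity 300) in the log.
# (localhost:9200 is an assumption -- use one of your cluster nodes.)
curl -s 'http://localhost:9200/_cat/thread_pool?v'
```

    Running it during a peak and again during a quiet period makes the difference in search volume obvious.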

    There must be some kind of batch/automated process flooding your system with queries. Been there a few times.

    You should check the search slow log, and possibly lower its thresholds for a short period so that most queries get logged. See here for details: http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/index-modules-slowlog.html
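    The slow-log thresholds are dynamic index settings, so they can be lowered temporarily without a restart. A minimal sketch (the threshold values are illustrative, and localhost:9200 is an assumption):

```shell
# Temporarily log every query slower than 500ms across all indices;
# revert or raise these thresholds once you have identified the culprit.
curl -XPUT 'http://localhost:9200/_settings' -d '{
  "index.search.slowlog.threshold.query.warn": "1s",
  "index.search.slowlog.threshold.query.info": "500ms"
}'
```

    The offending queries then show up in the per-index slow log with their source, which usually identifies the automated process sending them.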

    【Discussion】:
