【Title】: Logstash index error: [logstash-*] IndexNotFoundException[no such index]
【Posted】: 2016-04-03 12:03:22
【Question】:

I'm new to ELK. I'm using: elasticsearch-2.1.0, logstash-2.1.1, and kibana-4.3.0-windows. I'm trying to configure ELK to monitor my application logs, and I've followed different tutorials with different Logstash configurations, but when I open Kibana and it sends its request to Elasticsearch, I get this error:

[logstash-*] IndexNotFoundException[no such index]

Here is my Logstash configuration:

input {
    file {
        path => "/var/logs/*.log"
        type => "syslog"
    }
}
filter {
    grok { match => [ "message", "%{COMBINEDAPACHELOG}" ] }
}
output {
    elasticsearch { hosts => localhost }
    stdout { codec => rubydebug }
}

I tried deleting all the folders, reinstalling, and following this tutorial step by step: https://www.elastic.co/guide/en/logstash/current/advanced-pipeline.html

But no index of any kind gets created, and Kibana shows the same Elasticsearch index error again.

Any help?

Regards

Debug log:

C:\Users\xxx\Desktop\LOGS\logstash-2.1.1\bin>logstash -f first-pipeline.conf --debug
io/console not supported; tty will not be manipulated
Reading config file {:config_file=>"C:/Users/xxx/Desktop/LOGS/logstash-2.1.1/bin/first-pipeline.conf", :level=>:debug, :file=>"/Users/xxx/Desktop/LOGS/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.1.1-java/lib/logstash/agent.rb", :line=>"325", :method=>"local_config"}
Compiled pipeline code:
        @inputs = []
        @filters = []
        @outputs = []
        @periodic_flushers = []
        @shutdown_flushers = []
        @input_file_1 = plugin("input", "file", LogStash::Util.hash_merge_many({ "path" => ("/var/logs/logstash-tutorial-dataset") }, { "start_position" => ("beginning") }))
        @inputs << @input_file_1
        @filter_grok_2 = plugin("filter", "grok", LogStash::Util.hash_merge_many({ "match" => {("message") => ("%{COMBINEDAPACHELOG}")} }))
        @filters << @filter_grok_2
            @filter_grok_2_flush = lambda do |options, &block|
              @logger.debug? && @logger.debug("Flushing", :plugin => @filter_grok_2)
              events = @filter_grok_2.flush(options)
              return if events.nil? || events.empty?
              @logger.debug? && @logger.debug("Flushing", :plugin => @filter_grok_2, :events => events)
                          events = @filter_geoip_3.multi_filter(events)
              events.each{|e| block.call(e)}
            end
            if @filter_grok_2.respond_to?(:flush)
              @periodic_flushers << @filter_grok_2_flush if @filter_grok_2.periodic_flush
              @shutdown_flushers << @filter_grok_2_flush
            end
          @filter_geoip_3 = plugin("filter", "geoip", LogStash::Util.hash_merge_many({ "source" => ("clientip") }))
          @filters << @filter_geoip_3
            @filter_geoip_3_flush = lambda do |options, &block|
              @logger.debug? && @logger.debug("Flushing", :plugin => @filter_geoip_3)
              events = @filter_geoip_3.flush(options)
              return if events.nil? || events.empty?
              @logger.debug? && @logger.debug("Flushing", :plugin => @filter_geoip_3, :events => events)
              events.each{|e| block.call(e)}
            end
            if @filter_geoip_3.respond_to?(:flush)
              @periodic_flushers << @filter_geoip_3_flush if @filter_geoip_3.periodic_flush
              @shutdown_flushers << @filter_geoip_3_flush
            end
          @output_elasticsearch_4 = plugin("output", "elasticsearch", LogStash::Util.hash_merge_many({ "hosts" => [("localhost")] }))
          @outputs << @output_elasticsearch_4
  def filter_func(event)
    events = [event]
    @logger.debug? && @logger.debug("filter received", :event => event.to_hash)
              events = @filter_grok_2.multi_filter(events)
              events = @filter_geoip_3.multi_filter(events)
    events
  end
  def output_func(event)
    @logger.debug? && @logger.debug("output received", :event => event.to_hash)
    @output_elasticsearch_4.handle(event)
  end {:level=>:debug, :file=>"/Users/xxx/Desktop/LOGS/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.1.1-java/lib/logstash/pipeline.rb", :line=>"38", :method=>"initialize"}
Plugin not defined in namespace, checking for plugin file {:type=>"input", :name=>"file", :path=>"logstash/inputs/file", :level=>:debug, :file=>"/Users/xxx/Desktop/LOGS/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.1.1-java/lib/logstash/plugin.rb", :line=>"76", :method=>"lookup"}
[...]
Logstash startup completed
Flushing buffer at interval {:instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0x75375e77@stopping=#<Concurrent::AtomicBoolean:0x61b12c0>, @last_flush=2015-12-29 15:45:27 +0000, @flush_thread=#<Thread:0x7008acbf run>, @max_size=500, @operations_lock=#<Java::JavaUtilConcurrentLocks::ReentrantLock:0x4985690f>, @submit_proc=#<Proc:0x3c9b0727@C:/Users/xxx/Desktop/LOGS/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/common.rb:55>, @flush_interval=1, @logger=#<Cabin::Channel:0x65f2b086 @subscriber_lock=#<Mutex:0x202361b4>, @data={}, @metrics=#<Cabin::Metrics:0x72e380e7 @channel=#<Cabin::Channel:0x65f2b086 ...>, @metrics={}, @metrics_lock=#<Mutex:0x3623f89e>>, @subscribers={12592=>#<Cabin::Outputs::IO:0x316290ee @lock=#<Mutex:0x3e191296>, @io=#<IO:fd 1>>}, @level=:debug>, @buffer=[], @operations_mutex=#<Mutex:0x601355b3>>", :interval=>1, :level=>:info, :file=>"/Users/xxx/Desktop/LOGS/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/buffer.rb", :line=>"90", :method=>"interval_flush"}
_globbed_files: /var/logs/logstash-tutorial-dataset: glob is: ["/var/logs/logstash-tutorial-dataset"] {:level=>:debug, :file=>"/Users/xxx/Desktop/LOGS/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/filewatch-0.6.7/lib/filewatch/watch.rb", :line=>"190", :method=>"_globbed_files"}

elasticsearch.log:

[2015-12-29 15:15:01,702][WARN ][bootstrap                ] unable to install syscall filter: syscall filtering not supported for OS: 'Windows 8.1'
[2015-12-29 15:15:01,879][INFO ][node                     ] [Blue Marvel] version[2.1.1], pid[10152], build[40e2c53/2015-12-15T13:05:55Z]
[2015-12-29 15:15:01,880][INFO ][node                     ] [Blue Marvel] initializing ...
[2015-12-29 15:15:01,923][INFO ][plugins                  ] [Blue Marvel] loaded [], sites []
[2015-12-29 15:15:01,941][INFO ][env                      ] [Blue Marvel] using [1] data paths, mounts [[OS (C:)]], net usable_space [242.8gb], net total_space [458.4gb], spins? [unknown], types [NTFS]
[2015-12-29 15:15:03,135][INFO ][node                     ] [Blue Marvel] initialized
[2015-12-29 15:15:03,135][INFO ][node                     ] [Blue Marvel] starting ...
[2015-12-29 15:15:03,249][INFO ][transport                ] [Blue Marvel] publish_address {127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}, {[::1]:9300}
[2015-12-29 15:15:03,255][INFO ][discovery                ] [Blue Marvel] elasticsearch/3DpYKTroSke4ruP21QefmA
[2015-12-29 15:15:07,287][INFO ][cluster.service          ] [Blue Marvel] new_master {Blue Marvel}{3DpYKTroSke4ruP21QefmA}{127.0.0.1}{127.0.0.1:9300}, reason: zen-disco-join(elected_as_master, [0] joins received)
[2015-12-29 15:15:07,377][INFO ][http                     ] [Blue Marvel] publish_address {127.0.0.1:9200}, bound_addresses {127.0.0.1:9200}, {[::1]:9200}
[2015-12-29 15:15:07,382][INFO ][node                     ] [Blue Marvel] started
[2015-12-29 15:15:07,399][INFO ][gateway                  ] [Blue Marvel] recovered [1] indices into cluster_state
[2015-12-29 16:33:00,715][INFO ][rest.suppressed          ] /logstash-$DATE/_search Params: {index=logstash-$DATE, q=response=200}
[logstash-$DATE] IndexNotFoundException[no such index]
    at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver$WildcardExpressionResolver.resolve(IndexNameExpressionResolver.java:566)
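
A side note on the `logstash-$DATE` index name in that log: the request reached Elasticsearch with a literal `$DATE` in the URL, which suggests the shell variable was never expanded, most likely because the tutorial's curl command was wrapped in single quotes (this is an inference from the log, not stated in the question). A quick sketch of the difference:

```shell
# Single quotes suppress shell variable expansion; double quotes allow it.
DATE=$(date -u +%Y.%m.%d)   # e.g. 2015.12.29
echo 'logstash-$DATE'       # prints the literal text logstash-$DATE (what the log shows)
echo "logstash-$DATE"       # prints e.g. logstash-2015.12.29 (what was intended)
```

So even once indexing works, a single-quoted `$DATE` in a curl URL would still produce an IndexNotFoundException for the literal name.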

【Question comments】:

  • Could you try running logstash with the --debug command-line switch and update your question with the output you get?
  • I tried, but logstash produces a huge amount of logging. What should I show?
  • That's the idea ;-) Just want to see what flows where, or whether the error is flying under the radar.
  • Can I save it to a file?
  • You don't need to store all of it, just from when logstash starts up and begins consuming your logs. That's it. About 200 lines; you can paste them into your question without a problem.

Tags: elasticsearch logstash kibana


【Solution 1】:

From what I can see, you are not providing a port number in the elasticsearch output of your Logstash config. By default, Elasticsearch listens on port 9200 (which is what most tutorials assume). Try changing the output section of your Logstash config to the following and let me know if it works:

output {
      elasticsearch { hosts => ["localhost:9200"] }
      stdout { codec => rubydebug }
}

【Discussion】:

  • I added your example to the output, but it still doesn't work. And at http://localhost:9200/_cat/indices I only get this: yellow open .kibana 1 1 1 0 3.1kb 3.1kb
  • I tried the same steps in Ubuntu and it worked right away. Thanks for your time.
  • Erratum: I tried the same steps in Ubuntu and it works. Then I deleted the index in Elasticsearch: curl -XDELETE http://localhost:9200/logstash-2015.12.30/ and tried to recreate it with a different config file, but Logstash no longer sends the new index to Elasticsearch. Does anyone know why?
【Solution 2】:

I fixed the problem by adding this:

input {
    file {
        path => "/path/to/logstash-tutorial.log"
        start_position => beginning
        sincedb_path => "/dev/null"
    }
}

Now Logstash is sending the index to Elasticsearch.
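
A likely reason this helps (a hedged sketch, not stated in the answer): Logstash's file input records how far it has read each file in a "sincedb" file, so a file that was already read to the end emits no new events on restart. Pointing `sincedb_path` at the null device discards that state (on Windows the null device is `NUL`, not `/dev/null`). Alternatively, deleting the default sincedb files forces a full re-read; the `~/.sincedb_*` location below is the usual default for Logstash 2.x but may vary by version:

```shell
# Illustrative only: show any default sincedb files, then remove them so the
# file input re-reads the logs from the beginning on the next Logstash run.
ls "$HOME"/.sincedb_* 2>/dev/null || echo "no sincedb files found"
rm -f "$HOME"/.sincedb_*
```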

【Discussion】:

【Solution 3】:

This problem can be solved with the following changes to the Logstash config file.

input {
    file {
        path => "/path/to/logfile.log"
        start_position => beginning
    }
}

filter {
}

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "logstash-%{+YYYY.MM.dd}"
    }
    stdout { codec => rubydebug }
}
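
For context, the `%{+YYYY.MM.dd}` in the index setting is Logstash's date-format syntax: it expands from each event's @timestamp, so events land in a daily index that Kibana's default `logstash-*` pattern matches. A quick shell sketch of the name today's events would get (assuming event timestamps are current and in UTC):

```shell
# Print the daily index name an event timestamped now (UTC) would be written to.
date -u '+logstash-%Y.%m.%d'
```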
    

【Discussion】:
