logstash-input-jdbc fails to sync
Date: 2019-04-19  Source: 开源中国 (OSChina)
Background
[root@local14 conf.d]# cat /etc/logstash/conf.d/logstash-mysql.conf
# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.
input {
  stdin { }
  jdbc {
    jdbc_connection_string => "jdbc:mysql://192.168.56.101:3306/test"
    jdbc_user => "root"
    jdbc_password => ""
    jdbc_driver_library => "/root/mysql-connector-java-5.1.46.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "1000"
    statement => "SELECT * from user test_id > :last_sql_value"
    schedule => "* * * * *"
    record_last_run => true
    use_column_value => true
    lowercase_column_names => true
    tracking_column => "test_id"
    tracking_column_type => "numeric"
    type => "jdbc"
  }
}
filter {
  json {
    source => "message"
    remove_field => ["message"]
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "user"
    document_id => "%{test_id}"
    #user => "elastic"
    #password => "changeme"
  }
  stdout { codec => json_lines }
}
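For comparison, here is a minimal sketch of the incremental-sync pattern as documented for logstash-input-jdbc: the built-in placeholder is :sql_last_value, and it is used inside an explicit WHERE clause together with use_column_value and tracking_column. The connection settings, table name user, and column test_id below are simply copied from the config above; everything else is a sketch, not a confirmed fix.

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://192.168.56.101:3306/test"
    jdbc_user => "root"
    jdbc_password => ""
    jdbc_driver_library => "/root/mysql-connector-java-5.1.46.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # :sql_last_value is the placeholder the plugin substitutes with the last saved tracking value
    statement => "SELECT * FROM user WHERE test_id > :sql_last_value"
    use_column_value => true
    tracking_column => "test_id"
    tracking_column_type => "numeric"
    schedule => "* * * * *"
    record_last_run => true
  }
}

With record_last_run enabled, the last seen test_id is persisted between scheduled runs (by default in a .logstash_jdbc_last_run file in the home directory), so each run only fetches rows newer than that value.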
And the Logstash log:

[2019-04-19T16:38:32,945][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.0.0"}
[2019-04-19T16:38:50,660][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-04-19T16:38:51,320][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-04-19T16:38:51,474][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-04-19T16:38:51,507][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-04-19T16:38:51,609][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2019-04-19T16:38:51,637][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-04-19T16:38:51,691][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>125, :thread=>"#<Thread:0x687ff20a run>"}
[2019-04-19T16:38:51,948][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1, "index.lifecycle.name"=>"logstash-policy", "index.lifecycle.rollover_alias"=>"logstash"}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-04-19T16:38:53,041][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-04-19T16:38:53,510][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-04-19T16:38:55,212][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
But no data is ever read from MySQL. Waiting online for an answer, this is urgent!!!
