These examples demonstrate how to use the Vert.x Micrometer Metrics module.
There are examples for Prometheus, InfluxDB and JMX backends.
Each of them deploys 3 verticles:
SimpleWebServer: creates an HTTP server - visit http://localhost:8080/ to trigger requests.
EventbusProducer: sends messages over the event bus.
EventbusConsumer: receives event bus messages.
Each simulates a random processing time.
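For reference, the producer/consumer pair typically looks like the following minimal sketch. The event bus address ("greeting"), the send interval and the delay range are illustrative assumptions, not necessarily what the example classes use.

import io.vertx.core.AbstractVerticle;
import java.util.Random;

// Periodically sends a message on a hypothetical "greeting" address
class ProducerSketch extends AbstractVerticle {
  @Override
  public void start() {
    vertx.setPeriodic(1000, id ->
        vertx.eventBus().request("greeting", "hello", reply -> { /* reply ignored */ }));
  }
}

// Consumes the messages and simulates a random processing time before replying
class ConsumerSketch extends AbstractVerticle {
  private final Random random = new Random();

  @Override
  public void start() {
    vertx.eventBus().<String>consumer("greeting", message -> {
      long delayMs = 1 + random.nextInt(50);
      vertx.setTimer(delayMs, id -> message.reply("hi"));
    });
  }
}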
For Prometheus, the verticle WebServerForBoundPrometheus is an alternative to SimpleWebServer that binds the Prometheus /metrics endpoint to an existing HTTP server.
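As a rough sketch of that binding (not the literal source of WebServerForBoundPrometheus; the class name below is only for illustration), the Prometheus scraping handler provided by vertx-micrometer-metrics can be mounted on the application's own router:

import io.vertx.core.AbstractVerticle;
import io.vertx.ext.web.Router;
import io.vertx.micrometer.PrometheusScrapingHandler;

class BoundMetricsWebServerSketch extends AbstractVerticle {
  @Override
  public void start() {
    Router router = Router.router(vertx);
    // Expose the Prometheus registry on the same HTTP server as the application
    router.route("/metrics").handler(PrometheusScrapingHandler.create());
    // Regular application route
    router.route("/").handler(ctx -> ctx.response().end("Hello from the web server"));
    vertx.createHttpServer().requestHandler(router).listen(8080);
  }
}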
First, build the fat jar:
mvn clean package
Then run any of these Main classes, depending on the metrics backend you want to try.

For the Prometheus backend, check the Prometheus getting started guide. You need to configure the Prometheus server to scrape localhost:8081:
  - job_name: 'vertx-8081'
    static_configs:
      - targets: ['localhost:8081']
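For context, the embedded /metrics server on port 8081 is the kind of setup enabled when the Vertx instance is created. A minimal sketch, which may differ in details from the example's Main class (the class name here is only for illustration):

import io.vertx.core.Vertx;
import io.vertx.core.VertxOptions;
import io.vertx.core.http.HttpServerOptions;
import io.vertx.micrometer.MicrometerMetricsOptions;
import io.vertx.micrometer.VertxPrometheusOptions;

public class PrometheusSetupSketch {
  public static void main(String[] args) {
    Vertx vertx = Vertx.vertx(new VertxOptions().setMetricsOptions(
        new MicrometerMetricsOptions()
            .setPrometheusOptions(new VertxPrometheusOptions()
                .setEnabled(true)
                // Serve /metrics from a dedicated embedded HTTP server on port 8081
                .setStartEmbeddedServer(true)
                .setEmbeddedServerOptions(new HttpServerOptions().setPort(8081)))
            .setEnabled(true)));
    // ... deploy SimpleWebServer, EventbusProducer and EventbusConsumer with this vertx instance
  }
}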
Tip: to run a pre-configured Prometheus server on your machine with Docker, go to this example directory and run:

docker run --network host \
  -v ${PWD}/prometheus:/etc/prometheus \
  -it prom/prometheus
To start the example from the command line:
java -cp target/micrometer-metrics-examples-4.1.2-fat.jar io.vertx.example.micrometer.prometheus.Main
By default, histogram sampling is disabled.
You can enable it manually in the Micrometer registry, as shown in the commented code in class io.vertx.example.micrometer.prometheus.Main:
registry.config().meterFilter(
    new MeterFilter() {
      @Override
      public DistributionStatisticConfig configure(Meter.Id id, DistributionStatisticConfig config) {
        return DistributionStatisticConfig.builder()
            .percentilesHistogram(true)
            .build()
            .merge(config);
      }
    });
See also the Micrometer documentation.
To start the variant that binds the Prometheus endpoint to the existing HTTP server:
java -cp target/micrometer-metrics-examples-4.1.2-fat.jar io.vertx.example.micrometer.prometheus.MainWithBoundPrometheus
You need to configure the Prometheus server to scrape localhost:8080:

  - job_name: 'vertx-8080'
    static_configs:
      - targets: ['localhost:8080']
The InfluxDB example expects an InfluxDB server running on localhost, port 8086, without authentication. For a quick setup, you can run it with this Docker command:
docker run -p 8086:8086 influxdb
Start the application:
java -cp target/micrometer-metrics-examples-4.1.2-fat.jar io.vertx.example.micrometer.influxdb.Main
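For context, the InfluxDB reporter is enabled through the Micrometer metrics options when the Vertx instance is created. A minimal sketch, assuming the default URI http://localhost:8086 and a hypothetical database name (the example's Main class may use different values; the class name here is only for illustration):

import io.vertx.core.Vertx;
import io.vertx.core.VertxOptions;
import io.vertx.micrometer.MicrometerMetricsOptions;
import io.vertx.micrometer.VertxInfluxDbOptions;

public class InfluxDbSetupSketch {
  public static void main(String[] args) {
    Vertx vertx = Vertx.vertx(new VertxOptions().setMetricsOptions(
        new MicrometerMetricsOptions()
            .setInfluxDbOptions(new VertxInfluxDbOptions()
                .setEnabled(true)
                .setUri("http://localhost:8086") // local InfluxDB, no authentication
                .setDb("vertx-metrics"))         // hypothetical database name
            .setEnabled(true)));
    // ... deploy the verticles with this vertx instance
  }
}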
You can trigger some workload to see the impact on HTTP server metrics:
while true
do curl http://localhost:8080/
sleep .8
done
Metrics can be observed in Grafana.
Tip: to run a Grafana server on your machine with Docker, go to this example directory and run:

docker run --network host \
  -it grafana/grafana
These dashboards track some HTTP server and event bus metrics.