Apache Spark Receiver

Status
Stability: development (metrics)
Distributions: contrib
Code Owners: @djaglowski, @Caleb-Hurshman, @mrsillydog

This receiver fetches metrics for an Apache Spark cluster through the Apache Spark REST API - specifically, the /metrics/json, /api/v1/applications/[app-id]/stages, /api/v1/applications/[app-id]/executors, and /api/v1/applications/[app-id]/jobs endpoints.
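The endpoints above can be sketched as URL construction in Go. This is only an illustrative helper, not the receiver's actual client code; the base URL and application ID shown are placeholders:

```go
package main

import "fmt"

// endpointURLs builds the Spark REST API URLs listed above for one
// application. The base URL and app ID passed in main are examples only.
func endpointURLs(base, appID string) []string {
	return []string{
		base + "/metrics/json",
		fmt.Sprintf("%s/api/v1/applications/%s/stages", base, appID),
		fmt.Sprintf("%s/api/v1/applications/%s/executors", base, appID),
		fmt.Sprintf("%s/api/v1/applications/%s/jobs", base, appID),
	}
}

func main() {
	// Print the four endpoints the receiver would scrape for this app.
	for _, u := range endpointURLs("http://localhost:4040", "app-20230101000000-0000") {
		fmt.Println(u)
	}
}
```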

Purpose

The purpose of this component is to allow monitoring of Apache Spark clusters and the applications running on them through the collection of performance metrics like memory utilization, CPU utilization, shuffle operations, garbage collection time, I/O operations, and more.

Prerequisites

This receiver supports Apache Spark versions:

  • 3.3.2+

Configuration

These configuration options are for connecting to an Apache Spark application.

The following settings are optional:

  • collection_interval (default = 60s): This receiver collects metrics on an interval. This value must be a string readable by Go's time.ParseDuration. Valid time units are ns, us (or µs), ms, s, m, h.
  • initial_delay (default = 1s): Defines how long this receiver waits before starting.
  • endpoint (default = http://localhost:4040): Apache Spark endpoint to connect to, in the form of [http][://]{host}[:{port}].
  • application_names: An array of Spark application names for which metrics should be collected. If no application names are specified, metrics are collected for all Spark applications running on the cluster at the specified endpoint.

Example Configuration

receivers:
  apachespark:
    collection_interval: 60s
    endpoint: http://localhost:4040
    application_names:
    - PythonStatusAPIDemo
    - PythonLR

The full list of settings exposed for this receiver is documented in config.go, with detailed sample configurations in testdata.

Metrics

Details about the metrics produced by this receiver can be found in metadata.yaml.