Missing node_zfs_zpool_state metrics #2442

Closed
onedr0p opened this issue Aug 5, 2022 · 1 comment
onedr0p commented Aug 5, 2022

Host operating system: output of uname -a

Linux expanse 5.15.0-39-generic #42-Ubuntu SMP Thu Jun 9 23:42:32 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux

node_exporter version: output of node_exporter --version

v1.3.1

Are you running node_exporter in Docker?

Yes, via Docker Compose:

---
version: "3.8"

services:
  node-exporter:
    image: quay.io/prometheus/node-exporter:v1.3.1
    container_name: node-exporter
    restart: unless-stopped
    command:
      - --path.procfs=/host/proc
      - --path.rootfs=/rootfs
      - --path.sysfs=/host/sys
      - --collector.filesystem.mount-points-exclude=^/(sys|proc|dev|host|etc)($$|/)
    volumes:
      - /proc:/host/proc:ro
      - /sys:/host/sys:ro
      - /:/rootfs:ro,rslave
    ports:
      - 9100:9100
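
For reference, here is a minimal sketch (in Go, and not node_exporter's actual collector code) of what reading pool state out of the mounted procfs involves, assuming the --path.procfs=/host/proc mapping from the compose file above:

package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

func main() {
	// Same procfs mapping as the compose file above (--path.procfs=/host/proc).
	procfs := "/host/proc"
	// Each imported pool has a kstat "state" file, e.g.
	// /host/proc/spl/kstat/zfs/tycho/state containing "ONLINE".
	paths, err := filepath.Glob(filepath.Join(procfs, "spl/kstat/zfs/*/state"))
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	for _, p := range paths {
		data, err := os.ReadFile(p)
		if err != nil {
			fmt.Fprintf(os.Stderr, "read %s: %v\n", p, err)
			continue
		}
		pool := filepath.Base(filepath.Dir(p))
		state := strings.ToLower(strings.TrimSpace(string(data)))
		fmt.Printf("zpool=%s state=%s\n", pool, state)
	}
}

Running this inside the container is one way to check that the state files are readable through the bind mounts configured above.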

What did you do that produced an error?

What did you expect to see?

I expected to see node_zfs_zpool_state metrics
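
For illustration, something along the lines of the following (a sample of the expected shape, not output captured from this host):

node_zfs_zpool_state{state="online",zpool="tycho"} 1
node_zfs_zpool_state{state="degraded",zpool="tycho"} 0
...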

What did you see instead?

The logs show the zfs collector is enabled:

❯ docker logs ebef38fa83bf -f
ts=2022-08-05T16:29:01.797Z caller=node_exporter.go:182 level=info msg="Starting node_exporter" version="(version=1.3.1, branch=HEAD, revision=a2321e7b940ddcff26873612bccdf7cd4c42b6b6)"
ts=2022-08-05T16:29:01.797Z caller=node_exporter.go:183 level=info msg="Build context" build_context="(go=go1.17.3, user=root@243aafa5525c, date=20211205-11:09:49)"
ts=2022-08-05T16:29:01.798Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(sys|proc|dev|host|etc)($|/)
ts=2022-08-05T16:29:01.798Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:108 level=info msg="Enabled collectors"
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=arp
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=bcache
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=bonding
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=btrfs
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=conntrack
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=cpu
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=cpufreq
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=diskstats
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=dmi
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=edac
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=entropy
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=fibrechannel
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=filefd
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=filesystem
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=hwmon
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=infiniband
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=ipvs
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=loadavg
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=mdadm
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=meminfo
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=netclass
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=netdev
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=netstat
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=nfs
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=nfsd
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=nvme
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=os
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=powersupplyclass
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=pressure
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=rapl
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=schedstat
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=sockstat
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=softnet
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=stat
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=tapestats
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=textfile
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=thermal_zone
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=time
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=timex
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=udp_queues
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=uname
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=vmstat
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=xfs
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:115 level=info collector=zfs
ts=2022-08-05T16:29:01.798Z caller=node_exporter.go:199 level=info msg="Listening on" address=:9100
ts=2022-08-05T16:29:01.798Z caller=tls_config.go:195 level=info msg="TLS is disabled." http2=false

No node_zfs_zpool_state metrics are returned:

curl -sL 127.0.0.1:9100/metrics | grep zpool

(nothing found)

Other zfs metrics are returned:

curl -sL 127.0.0.1:9100/metrics | grep zfs_ | head -n 10

# HELP node_zfs_abd_linear_cnt kstat.zfs.misc.abdstats.linear_cnt
# TYPE node_zfs_abd_linear_cnt untyped
node_zfs_abd_linear_cnt 300
# HELP node_zfs_abd_linear_data_size kstat.zfs.misc.abdstats.linear_data_size
# TYPE node_zfs_abd_linear_data_size untyped
node_zfs_abd_linear_data_size 172544
# HELP node_zfs_abd_scatter_chunk_waste kstat.zfs.misc.abdstats.scatter_chunk_waste
# TYPE node_zfs_abd_scatter_chunk_waste untyped
node_zfs_abd_scatter_chunk_waste 216064
# HELP node_zfs_abd_scatter_cnt kstat.zfs.misc.abdstats.scatter_cnt

The state file is present on the host, at the path where node_exporter should be looking:

cat /proc/spl/kstat/zfs/tycho/state

ONLINE

and it is also visible inside the Docker container:

cat /host/proc/spl/kstat/zfs/tycho/state

ONLINE
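
Given that state string, exposing the expected metric boils down to turning the single state value into a per-state gauge. A hedged sketch of that mapping (the state list here is an assumption for illustration, not taken from node_exporter's source):

package main

import (
	"fmt"
	"strings"
)

// Candidate pool states; this list is illustrative only.
var zpoolStates = []string{"online", "degraded", "faulted", "offline", "unavail", "removed", "suspended"}

func main() {
	// "ONLINE" is what /host/proc/spl/kstat/zfs/tycho/state reports above.
	current := strings.ToLower(strings.TrimSpace("ONLINE"))
	for _, s := range zpoolStates {
		value := 0
		if s == current {
			value = 1
		}
		fmt.Printf("node_zfs_zpool_state{state=%q,zpool=%q} %d\n", s, "tycho", value)
	}
}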

zfs version:

zfs --version

zfs-2.1.4-0ubuntu0.1
zfs-kmod-2.1.2-1ubuntu3
discordianfish (Member) commented

Duplicate of #2068
