Compare commits

..

7 commits

Author    SHA         Message                                      Date
mrkmntal  590d6d87c7  FF13 gaming session 11/10/2025               2025-11-10 20:26:51 -05:00
mrkmntal  df03b1b3a6  New test data for new runner                 2025-11-09 14:38:40 -05:00
mrkmntal  7205a6772b  Updated workflow for new repo level runner   2025-11-09 14:37:49 -05:00
mrkmntal  e608111e7c  11/9 2025, mid-day gaming session            2025-11-09 14:19:39 -05:00
mrkmntal  e37d4a8e4d  Folding and Uma session 11/8                 2025-11-08 12:43:23 -05:00
mrkmntal  6a048792eb  Hour power update                            2025-11-07 13:09:17 -05:00
mrkmntal  ca2dac8c78  Start of dev branch                          2025-11-07 12:10:31 -05:00
5 changed files with 3618 additions and 509 deletions


@@ -2,12 +2,12 @@ name: Generate Power Graph
 on:
   push:
-    branches: [ master ]
+    branches: [ master, dev ]
     paths: [ power_log.csv ]
 jobs:
   plot:
-    runs-on: [self-hosted]
+    runs-on: [docker]
     env:
       BRANCH: ${{ github.ref_name }}
       COMMIT: ${{ github.sha }}
@@ -36,4 +36,3 @@ jobs:
       with:
         name: power-graph
         path: power_graph.png


@@ -47,7 +47,7 @@ A Forgejo workflow (`.forgejo/workflows/power-graph.yml`) automates the chart generation
 What it does:
-1. Runs on a self-hosted runner for separation of processing (You will need to set one up if using forgejo).
+1. Runs on a self-hosted runner because it needs local hwmon logs and Python packages.
 2. Manually clones the repo with a read token stored in `secrets.PAT_READ_REPO`.
 3. Installs Python (via `apt`), prepares a virtualenv, and pulls `pandas` + `matplotlib`.
 4. Executes `python plot_power.py`, producing `power_graph.png`.


@@ -7,7 +7,7 @@ APU_HWMON="/sys/class/hwmon/hwmon5"  # APU sensor
 GPU_HWMON="/sys/class/drm/card0/device/hwmon/hwmon4"  # dGPU sensor
 INTERVAL=1               # seconds between refreshes
 LOGFILE="power_log.csv"  # CSV output file
-MAX_LINES=500            # limit to last 500 lines
+RETENTION_SECONDS=3600   # keep roughly 1 hour of samples

 # --- Helpers ---
 read_watts() {
@@ -57,12 +57,16 @@ while true; do
   # --- Append to CSV ---
   echo "$TIME,$APU_PWR,$GPU_PWR,$TOTAL,$APU_TEMP,$GPU_TEMP" >> "$LOGFILE"

-  # --- Keep only last $MAX_LINES lines ---
+  # --- Keep roughly 1 hour of samples (plus header) ---
+  MAX_SAMPLES=$(( (RETENTION_SECONDS + INTERVAL - 1) / INTERVAL ))
+  MAX_LINES_WITH_HEADER=$(( MAX_SAMPLES + 1 ))
   LINES=$(wc -l < "$LOGFILE")
-  if (( LINES > MAX_LINES )); then
-    tail -n $MAX_LINES "$LOGFILE" > "${LOGFILE}.tmp" && mv "${LOGFILE}.tmp" "$LOGFILE"
+  if (( LINES > MAX_LINES_WITH_HEADER )); then
+    {
+      head -n 1 "$LOGFILE"
+      tail -n "$MAX_SAMPLES" "$LOGFILE"
+    } > "${LOGFILE}.tmp" && mv "${LOGFILE}.tmp" "$LOGFILE"
   fi
   sleep $INTERVAL
 done
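The new trim logic rounds the sample budget up with a ceiling division, so a slower polling interval still covers the full retention window, then rewrites the file as the header plus only the newest samples. A minimal Python sketch of the same arithmetic (the `INTERVAL=7` value and the `trim` helper are hypothetical, for illustration only):

```python
# Sketch of the retention logic from the shell diff above (illustrative only).
RETENTION_SECONDS = 3600
INTERVAL = 7  # hypothetical slow polling interval, in seconds

# Ceiling division: round up so the window is never shorter than the target.
max_samples = (RETENTION_SECONDS + INTERVAL - 1) // INTERVAL

def trim(lines, max_samples):
    """Keep the CSV header plus the newest max_samples rows."""
    if len(lines) <= max_samples + 1:
        return lines
    return [lines[0]] + lines[-max_samples:]

log = ["timestamp,apu_w,gpu_w,total,apu_t,gpu_t"] + [f"row{i}" for i in range(600)]
trimmed = trim(log, max_samples)
```

With `INTERVAL=7`, plain floor division would keep only 514 samples (3598 s); rounding up keeps 515, which is why the old `tail -n $MAX_LINES` one-liner (which also ate the header row) was replaced.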


@@ -11,6 +11,7 @@ import matplotlib.pyplot as plt
 CSV_PATH = "power_log.csv"
 OUTPUT_FILE = "power_graph.png"
+RETENTION_SECONDS = 3600

 print(f"📊 Reading {CSV_PATH}...")
@@ -23,6 +24,11 @@ df = pd.read_csv(
 # --- Convert timestamps ---
 df["timestamp"] = pd.to_datetime(df["timestamp"], errors="coerce")
+latest_ts = df["timestamp"].dropna().max()
+if pd.notna(latest_ts):
+    cutoff = latest_ts - pd.Timedelta(seconds=RETENTION_SECONDS)
+    df = df[df["timestamp"] >= cutoff]
 df["time_fmt"] = df["timestamp"].dt.strftime("%Y-%m-%d %H:%M:%S")

 # --- Downsample if too many entries ---
@@ -61,4 +67,3 @@ plt.grid(axis="y", alpha=0.3)
 plt.tight_layout()
 plt.savefig(OUTPUT_FILE, dpi=150)
 print(f"✅ Saved graph: {OUTPUT_FILE}")
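The plotting change mirrors the logger's retention window: rows older than `RETENTION_SECONDS` before the newest parsed timestamp are dropped before formatting, and the `pd.notna` guard skips the filter when no timestamp parsed at all. A small self-contained pandas example of that cutoff filter (synthetic sample data, not the project's CSV):

```python
import pandas as pd

RETENTION_SECONDS = 3600

# Synthetic log: one sample 2 hours old, one 30 minutes old, one current.
df = pd.DataFrame({
    "timestamp": ["2025-11-10 18:00:00", "2025-11-10 19:30:00", "2025-11-10 20:00:00"],
    "total_w": [120.0, 95.5, 140.2],
})
df["timestamp"] = pd.to_datetime(df["timestamp"], errors="coerce")

# Same guard as the diff: only filter when at least one timestamp parsed.
latest_ts = df["timestamp"].dropna().max()
if pd.notna(latest_ts):
    cutoff = latest_ts - pd.Timedelta(seconds=RETENTION_SECONDS)
    df = df[df["timestamp"] >= cutoff]
```

Here the cutoff lands at 19:00:00, so the 2-hour-old row is dropped and two rows survive. Anchoring the window to the newest timestamp in the file (rather than wall-clock time) keeps the chart populated even when the CI run happens long after the last sample.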

File diff suppressed because it is too large