Compare commits
7 commits
| Author | SHA1 | Date |
|---|---|---|
|  | 590d6d87c7 |  |
|  | df03b1b3a6 |  |
|  | 7205a6772b |  |
|  | e608111e7c |  |
|  | e37d4a8e4d |  |
|  | 6a048792eb |  |
|  | ca2dac8c78 |  |
5 changed files with 3618 additions and 509 deletions
```diff
@@ -2,12 +2,12 @@ name: Generate Power Graph
 
 on:
   push:
-    branches: [ master ]
+    branches: [ master, dev ]
     paths: [ power_log.csv ]
 
 jobs:
   plot:
-    runs-on: [self-hosted]
+    runs-on: [docker]
     env:
       BRANCH: ${{ github.ref_name }}
       COMMIT: ${{ github.sha }}
@@ -36,4 +36,3 @@ jobs:
       with:
         name: power-graph
         path: power_graph.png
-
```
```diff
@@ -47,7 +47,7 @@ A Forgejo workflow (`.forgejo/workflows/power-graph.yml`) automates the chart generation:
 
 What it does:
 
-1. Runs on a self-hosted runner for separation of processing (You will need to set one up if using forgejo).
+1. Runs on a self-hosted runner because it needs local hwmon logs and Python packages.
 2. Manually clones the repo with a read token stored in `secrets.PAT_READ_REPO`.
 3. Installs Python (via `apt`), prepares a virtualenv, and pulls `pandas` + `matplotlib`.
 4. Executes `python plot_power.py`, producing `power_graph.png`.
```
```diff
@@ -7,7 +7,7 @@ APU_HWMON="/sys/class/hwmon/hwmon5"  # APU sensor
 GPU_HWMON="/sys/class/drm/card0/device/hwmon/hwmon4"  # dGPU sensor
 INTERVAL=1               # seconds between refreshes
 LOGFILE="power_log.csv"  # CSV output file
-MAX_LINES=500            # limit to last 500 lines
+RETENTION_SECONDS=3600   # keep roughly 1 hour of samples
 
 # --- Helpers ---
 read_watts() {
@@ -57,12 +57,16 @@ while true; do
     # --- Append to CSV ---
     echo "$TIME,$APU_PWR,$GPU_PWR,$TOTAL,$APU_TEMP,$GPU_TEMP" >> "$LOGFILE"
 
-    # --- Keep only last $MAX_LINES lines ---
+    # --- Keep roughly 1 hour of samples (plus header) ---
+    MAX_SAMPLES=$(( (RETENTION_SECONDS + INTERVAL - 1) / INTERVAL ))
+    MAX_LINES_WITH_HEADER=$(( MAX_SAMPLES + 1 ))
     LINES=$(wc -l < "$LOGFILE")
-    if (( LINES > MAX_LINES )); then
-        tail -n $MAX_LINES "$LOGFILE" > "${LOGFILE}.tmp" && mv "${LOGFILE}.tmp" "$LOGFILE"
+    if (( LINES > MAX_LINES_WITH_HEADER )); then
+        {
+            head -n 1 "$LOGFILE"
+            tail -n "$MAX_SAMPLES" "$LOGFILE"
+        } > "${LOGFILE}.tmp" && mv "${LOGFILE}.tmp" "$LOGFILE"
     fi
 
     sleep $INTERVAL
 done
```
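The trimming change above swaps a fixed 500-line cap for a time-based window: ceiling-divide `RETENTION_SECONDS` by `INTERVAL` to get the number of samples to keep, add one for the CSV header, and rewrite the file as the header plus the newest samples. A minimal Python sketch of the same arithmetic and trim (the helper names are illustrative, not from the repository):

```python
# Sketch of the time-based retention trim used in the shell script above.
# Function names are hypothetical; only the arithmetic mirrors the diff.

def max_samples(retention_seconds: int, interval: int) -> int:
    # Ceiling division, same as $(( (RETENTION_SECONDS + INTERVAL - 1) / INTERVAL ))
    return (retention_seconds + interval - 1) // interval

def trim_log(lines: list[str], retention_seconds: int = 3600, interval: int = 1) -> list[str]:
    n = max_samples(retention_seconds, interval)
    if len(lines) <= n + 1:          # header plus samples already fit
        return lines
    return [lines[0]] + lines[-n:]   # keep header, then the newest n samples

# With a 1 s interval, one hour of retention keeps 3600 samples.
print(max_samples(3600, 1))  # 3600
print(max_samples(3600, 7))  # 515 (ceiling of 3600 / 7)
```

The ceiling division matters when `INTERVAL` does not divide `RETENTION_SECONDS` evenly; plain integer division would keep slightly less than the intended window.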
```diff
@@ -11,6 +11,7 @@ import matplotlib.pyplot as plt
 
 CSV_PATH = "power_log.csv"
 OUTPUT_FILE = "power_graph.png"
+RETENTION_SECONDS = 3600
 
 print(f"📊 Reading {CSV_PATH}...")
 
@@ -23,6 +24,11 @@ df = pd.read_csv(
 
 # --- Convert timestamps ---
 df["timestamp"] = pd.to_datetime(df["timestamp"], errors="coerce")
+latest_ts = df["timestamp"].dropna().max()
+if pd.notna(latest_ts):
+    cutoff = latest_ts - pd.Timedelta(seconds=RETENTION_SECONDS)
+    df = df[df["timestamp"] >= cutoff]
+
 df["time_fmt"] = df["timestamp"].dt.strftime("%Y-%m-%d %H:%M:%S")
 
 # --- Downsample if too many entries ---
@@ -61,4 +67,3 @@ plt.grid(axis="y", alpha=0.3)
 plt.tight_layout()
 plt.savefig(OUTPUT_FILE, dpi=150)
 print(f"✅ Saved graph: {OUTPUT_FILE}")
-
```
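The added lines in `plot_power.py` drop samples older than `RETENTION_SECONDS` relative to the newest timestamp before plotting. A self-contained sketch of that filter on made-up data, using the same pandas calls as the diff:

```python
import pandas as pd

RETENTION_SECONDS = 3600

# Synthetic log: one sample every 20 minutes over two hours (values are made up).
df = pd.DataFrame({
    "timestamp": pd.date_range("2024-01-01 00:00", periods=7, freq="20min"),
    "total_w": [30, 32, 31, 45, 50, 48, 47],
})

# Same filter as the diff: coerce bad timestamps to NaT, then keep only
# rows within RETENTION_SECONDS of the newest valid timestamp.
df["timestamp"] = pd.to_datetime(df["timestamp"], errors="coerce")
latest_ts = df["timestamp"].dropna().max()
if pd.notna(latest_ts):
    cutoff = latest_ts - pd.Timedelta(seconds=RETENTION_SECONDS)
    df = df[df["timestamp"] >= cutoff]

# Only the samples from 01:00 through 02:00 survive the one-hour window.
print(len(df))  # 4
```

Anchoring the cutoff to the newest timestamp in the file, rather than to the wall clock, means the chart still shows the last recorded hour even if the logger has been stopped for a while.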
power_log.csv (4099)
File diff suppressed because it is too large