😫 The Pain Point
Server crashed. The log file server.log is 500MB. Notepad freezes when trying to open it.
You need to find what happened at "10:00 AM".
🤖 Agentic Solution
Stream Reader: Reads the file line-by-line, never loading the whole thing into RAM. Fast and lightweight.
Key Features:
- Filter: Show only lines containing "ERROR" or "CRITICAL".
- Time Window: Extract logs between 09:00 and 10:00 (see the sketch below).
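A minimal sketch of both features, assuming each line starts with a `YYYY-MM-DD HH:MM:SS` timestamp (the function names and the timestamp slicing are illustrative assumptions, not part of the original prompt):

```python
# Hypothetical sketch: stream-filter a large log without loading it into RAM.
def filter_keywords(path, keywords):
    """Yield only the lines that contain any of the given keywords."""
    with open(path, "r", errors="replace") as f:
        for line in f:                      # the file object iterates lazily, one line at a time
            if any(kw in line for kw in keywords):
                yield line

def time_window(path, start="09:00", end="10:00"):
    """Yield lines whose HH:MM timestamp falls inside [start, end]."""
    with open(path, "r", errors="replace") as f:
        for line in f:
            hhmm = line[11:16]              # assumes a "YYYY-MM-DD HH:MM:SS ..." line layout
            if start <= hhmm <= end:
                yield line

if __name__ == "__main__":
    for line in filter_keywords("server.log", ["ERROR", "CRITICAL"]):
        print(line, end="")
```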
✏️ Phase 1: Commander (Quick Fix)
For Sysadmins in a hurry.
Prompt:
"Read server.log. Extract all lines containing the word 'ERROR'. Save these lines to errors.txt. Handle the file line-by-line to avoid memory issues."
Result: A small, readable error log.
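For reference, the throwaway script this prompt tends to produce is only a few lines; the sketch below is one plausible version (the file names come from the prompt, the counter is an illustrative extra):

```python
# Minimal sketch of the Phase 1 quick fix: stream server.log and keep the ERROR lines.
matches = 0
with open("server.log", "r", errors="replace") as src, \
     open("errors.txt", "w") as dst:
    for line in src:            # line-by-line, so RAM usage stays flat
        if "ERROR" in line:
            dst.write(line)
            matches += 1
print(f"Done: {matches} ERROR lines written to errors.txt")
```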
🏛️ Phase 2: Architect (Permanent Tool)
For DevOps/QA.
Engineering Prompt:
**Role:** Python Backend Developer
**Task:** Create a "Large Log Filter Tool".
**Requirements:**
1. **GUI:**
* Select Log File.
* Input Keywords (comma separated, e.g., "ERROR, FAIL").
* "Extract" button.
2. **Logic:**
* Use `open(file, 'r')` iterator to read line-by-line (RAM Safe).
* Check if keyword in line.
* Write matching lines to output file.
* Show "Lines Scanned" counter.
3. **Deliverables:** `log_filter.py`, `run.bat` (Windows), `run.sh` (Mac).
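A minimal sketch of the kind of script this prompt might produce, assuming Tkinter for the GUI; the widget layout, output file name, and counter refresh interval are all illustrative choices, not the prompt's required output:

```python
# log_filter.py -- illustrative sketch of the "Large Log Filter Tool".
# Assumes Tkinter (ships with CPython); all names below are placeholder choices.
import tkinter as tk
from tkinter import filedialog, messagebox


class LogFilterApp:
    def __init__(self, root):
        self.root = root
        root.title("Large Log Filter Tool")

        self.path_var = tk.StringVar()
        self.keywords_var = tk.StringVar(value="ERROR, FAIL")
        self.status_var = tk.StringVar(value="Lines scanned: 0")

        tk.Button(root, text="Select Log File", command=self.pick_file).pack(fill="x", padx=8, pady=4)
        tk.Label(root, textvariable=self.path_var, anchor="w").pack(fill="x", padx=8)
        tk.Label(root, text="Keywords (comma separated):").pack(anchor="w", padx=8)
        tk.Entry(root, textvariable=self.keywords_var).pack(fill="x", padx=8)
        tk.Button(root, text="Extract", command=self.extract).pack(fill="x", padx=8, pady=4)
        tk.Label(root, textvariable=self.status_var).pack(anchor="w", padx=8, pady=4)

    def pick_file(self):
        path = filedialog.askopenfilename(title="Select log file")
        if path:
            self.path_var.set(path)

    def extract(self):
        path = self.path_var.get()
        keywords = [k.strip() for k in self.keywords_var.get().split(",") if k.strip()]
        if not path or not keywords:
            messagebox.showwarning("Missing input", "Select a log file and enter at least one keyword.")
            return

        out_path = path + ".filtered.txt"
        scanned = matched = 0
        # Stream the file line-by-line so even multi-GB logs stay RAM-safe.
        with open(path, "r", errors="replace") as src, open(out_path, "w") as dst:
            for line in src:
                scanned += 1
                if any(kw in line for kw in keywords):
                    dst.write(line)
                    matched += 1
                if scanned % 100_000 == 0:          # refresh the counter occasionally
                    self.status_var.set(f"Lines scanned: {scanned:,}")
                    self.root.update_idletasks()
        self.status_var.set(f"Lines scanned: {scanned:,} | matches: {matched:,} -> {out_path}")


if __name__ == "__main__":
    root = tk.Tk()
    LogFilterApp(root)
    root.mainloop()
```

The `run.bat` / `run.sh` deliverables would typically just be one-line wrappers that call `python log_filter.py`.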
🧠 Prompt Decoding
- Memory Safe: When handling large files (text logs over ~100 MB), calling `read()` pulls the entire file into RAM and can freeze or crash the machine. The "line-by-line" instruction is the key engineering constraint here.
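The contrast in code (illustrative only):

```python
# Risky on huge files: read() pulls all 500 MB into one Python string.
with open("server.log", "r", errors="replace") as f:
    data = f.read()

# RAM-safe: the file object is an iterator, so each loop turn holds only one line.
with open("server.log", "r", errors="replace") as f:
    for line in f:
        pass  # filter / count / write here
```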
🛠️ Instructions
- Copy Prompt -> Paste -> Run.
- Select Log -> Keyword "ERROR" -> Scan.