End-to-end reference for the automated daily pipeline and on-demand deep research: scheduler triggers, multi-engine CLI execution (Claude, Codex, Gemini, Copilot), multi-agent research, Notion publishing, Teams and Slack delivery, custom topic briefs, and maintenance workflows.
From scheduler trigger to Notion and optional multi-channel webhook delivery (Teams + Slack).
```mermaid
flowchart LR
    subgraph Schedulers
        A1["macOS launchd"]
        A2["Windows Task Scheduler"]
        A3["Manual / make run"]
    end
    A1 --> B["briefing.sh / .ps1"]
    A2 --> B
    A3 --> B
    B --> C["prompt.md"]
    C --> D["AI Engine (Claude/Codex/Gemini/Copilot)"]
    D --> E["WebSearch (9 topics)"]
    D --> F["Notion MCP"]
    F --> G["Notion Page"]
    D --> H["card.json"]
    B --> I{"Teams webhook?"}
    I -->|Yes| J["notify-teams"]
    J --> K["Teams Channel"]
    B --> L{"Slack webhook?"}
    L -->|Yes| M["notify-slack"]
    M --> N["Slack Channel"]
```
```mermaid
flowchart TD
    subgraph Triggers
        A1["macOS launchd"]
        A2["Windows Task Scheduler"]
        A3["Manual make run"]
    end
    A1 --> B1["briefing.sh"]
    A2 --> B2["briefing.ps1"]
    A3 --> B1
    A3 --> B2
    B1 --> C["prompt.md"]
    B2 --> C
    C --> D["AI Engine"]
    D --> E["WebSearch"]
    D --> F["Notion MCP"]
    F --> G["Notion briefing page"]
    D --> CS["logs/covered-stories.txt"]
    D --> H["logs/YYYY-MM-DD.log"]
    D --> I["logs/YYYY-MM-DD-card.json"]
    B1 --> JT{"AI_BRIEFING_TEAMS_WEBHOOK set?"}
    B2 --> JT
    JT -->|"No"| KT["Skip Teams"]
    JT -->|"Yes"| LT["notify-teams script"]
    LT --> MT{"card JSON exists and is valid?"}
    MT -->|"Yes"| NT["POST Adaptive Card to Teams webhooks"]
    MT -->|"No"| OT["Teams notify failed"]
    B1 --> JS{"AI_BRIEFING_SLACK_WEBHOOK set?"}
    B2 --> JS
    JS -->|"No"| KS["Skip Slack"]
    JS -->|"Yes"| LS["notify-slack script"]
    LS --> MS{"card JSON exists and conversion succeeds?"}
    MS -->|"Yes"| NS["Convert to Block Kit and POST to Slack webhooks"]
    MS -->|"No"| OS["Slack notify failed"]
```
```mermaid
sequenceDiagram
    participant S as Scheduler/Manual
    participant E as Entry Script
    participant C as AI Engine
    participant W as WebSearch
    participant N as Notion MCP
    participant T as notify-teams
    participant TW as Teams Webhook(s)
    participant SL as notify-slack
    participant SW as Slack Webhook(s)
    S->>E: Start briefing script
    E->>E: Setup date, logs, env
    E->>E: Refresh webhook env (Windows)
    E->>C: invoke selected engine (fallback chain)
    C->>C: Step 0a read logs/covered-stories.txt
    C->>N: Step 0b fetch most recent briefing
    N-->>C: prior coverage
    loop 9 topic areas
        C->>W: search recent news
        W-->>C: results
    end
    C->>N: Step 3 create today's page
    N-->>C: page URL
    C->>C: Step 5 append to covered-stories.txt
    C-->>E: exit 0 + output
    E->>E: append logs/YYYY-MM-DD.log
    opt Teams webhook configured
        E->>T: call notify-teams --all
        T->>T: validate card.json
        T->>TW: POST payload
        TW-->>T: 2xx
    end
    opt Slack webhook configured
        E->>SL: call notify-slack --all
        SL->>SL: convert card via teams-to-slack.py
        SL->>SW: POST Block Kit payload
        SW-->>SL: 2xx
    end
```
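The dedup steps in the sequence above (0a and 0b) reduce to building one exclusion set from prior coverage. A minimal sketch in Python; the one-story-per-line file format and function names are illustrative assumptions, not the prompt's actual internals:

```python
from pathlib import Path

def build_exclusion_list(covered_file: str, prior_page_titles: list[str]) -> set[str]:
    """Merge locally logged stories (Step 0a) with titles from the most
    recent Notion briefing (Step 0b) into one exclusion set."""
    seen: set[str] = set()
    path = Path(covered_file)
    if path.exists():  # Step 0a: assumed one previously covered story per line
        seen.update(line.strip() for line in path.read_text().splitlines() if line.strip())
    seen.update(t.strip() for t in prior_page_titles)  # Step 0b: prior Notion page
    return seen

def is_fresh(story_title: str, seen: set[str]) -> bool:
    """A story qualifies for today's briefing only if it is not in the set."""
    return story_title.strip() not in seen
```

Step 5 then appends today's stories back to `covered-stories.txt`, closing the loop for the next run.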
What each stage expects and produces, and where failures usually happen.
| Stage | Input | Output | Primary File/Tool |
|---|---|---|---|
| Trigger | 08:00 schedule or manual command | Script execution starts | com.ainews.briefing.plist, install-task.ps1, Makefile |
| Bootstrap | Date + environment | Prompt text + runtime context | briefing.sh, briefing.ps1 |
| Dedup | logs/covered-stories.txt + latest Notion page | Exclusion list for this run | Step 0a (file) + Step 0b (Notion MCP) |
| AI Run | prompt.md | Notion briefing + card.json + covered-stories update | AI engine (Claude/Codex/Gemini/Copilot), WebSearch, Notion MCP |
| Teams Notify | logs/YYYY-MM-DD-card.json + AI_BRIEFING_TEAMS_WEBHOOK | Adaptive Card post(s) | notify-teams.sh, notify-teams.ps1 |
| Slack Notify | logs/YYYY-MM-DD-card.json + AI_BRIEFING_SLACK_WEBHOOK | Block Kit post(s) | notify-slack.sh, notify-slack.ps1, teams-to-slack.py |
| Multi-URL Routing | Semicolon-separated webhook URLs | First URL by default, all URLs with --all/-All | Teams + Slack notify scripts |
| Cleanup | Historic log files | Old *.log removed | Entry script cleanup step |
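The multi-URL routing row amounts to simple string splitting. A sketch of the semicolon-separated webhook logic; the function name is illustrative, not the notify scripts' actual internals:

```python
def select_webhooks(env_value: str, fan_out_all: bool = False) -> list[str]:
    """Split an AI_BRIEFING_*_WEBHOOK value on ';' and pick the first URL
    by default, or every URL when --all / -All is passed."""
    urls = [u.strip() for u in env_value.split(";") if u.strip()]
    return urls if fan_out_all else urls[:1]
```

Empty segments (trailing semicolons, blank values) are dropped, so a lone URL and a list behave identically in the default single-URL mode.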
Resolved: `prompt.md` Step 4 generates `logs/YYYY-MM-DD-card.json` directly. The Teams notifier posts this file as-is, and the Slack notifier converts the same file to Block Kit before POSTing.
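To make the artifact contract concrete, here is a sketch of writing a card in the expected location; it assumes only the standard Adaptive Card envelope, and the real fields produced by Step 4 may differ:

```python
import json
from datetime import date

def write_card(log_dir: str, headline: str, items: list[str]) -> str:
    """Write logs/YYYY-MM-DD-card.json as a minimal Adaptive Card."""
    card = {
        "type": "AdaptiveCard",
        "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
        "version": "1.4",
        "body": [{"type": "TextBlock", "text": headline, "weight": "Bolder"}]
                + [{"type": "TextBlock", "text": i, "wrap": True} for i in items],
    }
    path = f"{log_dir}/{date.today():%Y-%m-%d}-card.json"
    with open(path, "w") as f:
        json.dump(card, f, indent=2)
    return path
```

Because both notifiers read this one file, replaying a past delivery is just pointing `--card-file` at an older date's JSON.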
```mermaid
flowchart TD
    A[Entry script exits success] --> B{card.json exists?}
    B -->|No| B1[Notify paths fail fast]
    B -->|Yes| T{Teams webhook env set?}
    B -->|Yes| S{Slack webhook env set?}
    T -->|No| T0[Skip Teams]
    T -->|Yes| T1[notify-teams]
    T1 --> T2{JSON valid + URL list valid?}
    T2 -->|No| T3[Teams notify failed]
    T2 -->|Yes| T4[POST to first URL or all URLs]
    S -->|No| S0[Skip Slack]
    S -->|Yes| S1[notify-slack]
    S1 --> S2{teams-to-slack conversion OK?}
    S2 -->|No| S3[Slack notify failed]
    S2 -->|Yes| S4[POST Block Kit to first URL or all URLs]
```
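The fail-fast guards above reduce to three checks: webhook variable set, card file present, JSON parses. A hedged sketch of that gate (the function and the env-variable handling are illustrative, not the scripts' actual code):

```python
import json
import os

def card_is_usable(card_path: str, webhook_env: str) -> bool:
    """Mirror the notifier guards: an unset webhook variable means skip,
    and a missing or unparseable card means fail fast before any POST."""
    if not os.environ.get(webhook_env, "").strip():
        return False  # "Skip" branch: nothing configured for this channel
    if not os.path.isfile(card_path):
        return False  # notify paths fail fast on a missing card
    try:
        with open(card_path) as f:
            json.load(f)
    except (json.JSONDecodeError, OSError):
        return False  # invalid JSON never reaches the webhook
    return True
```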
The briefing targets short-horizon AI news (past 24 hours) across the following domains.
- Releases, features, Anthropic updates
- Model changes, API updates, product launches
- Cursor, Windsurf, Copilot, JetBrains, Xcode AI
- LangChain, CrewAI, AutoGen, MCP and agents
- Major lab announcements, benchmarks, launches
- Llama, Mistral, DeepSeek, Hugging Face
- Rounds, acquisitions, startup launches
- Government policy and AI safety regulation
- Vercel, Next.js, React Native, TypeScript tooling
Pick a topic, choose your destinations, and get a comprehensive news-focused research briefing with linked citations and dates.
```mermaid
flowchart LR
    A["Topic Input<br/>(CLI flags or REPL)"] --> B["custom-brief.sh / .ps1"]
    B --> C["prompt-custom-brief.md"]
    C --> D["AI Engine"]
    subgraph "Phase 1: Parallel Discovery"
        D --> A1["Agent 1: Breaking News"]
        D --> A2["Agent 2: Technical Analysis"]
        D --> A3["Agent 3: Industry Impact"]
        D --> A4["Agent 4: Trends"]
        D --> A5["Agent 5: Policy"]
        D --> A6["Agent 6+: Topic-specific"]
    end
    A1 --> DD["Phase 2: Deep Dive"]
    A2 --> DD
    A3 --> DD
    A4 --> DD
    A5 --> DD
    A6 --> DD
    DD --> S["Phase 3: Synthesis"]
    S --> OUT["Terminal Output"]
    S -->|"--notion"| N["Notion Page"]
    S -->|"--teams/--slack"| CARD["Card JSON"]
    CARD --> T["Teams"]
    CARD --> SL["Slack"]
```
```mermaid
flowchart TD
    subgraph "Entry Points"
        SH["custom-brief.sh"]
        PS["custom-brief.ps1"]
        SK["/custom-brief skill"]
    end
    SH --> PT["prompt-custom-brief.md"]
    PS --> PT
    SK --> CC["AI Engine"]
    PT --> CC
    subgraph "Phase 1: Parallel Discovery"
        CC --> A1["Agent 1: Breaking News"]
        CC --> A2["Agent 2: Technical Analysis"]
        CC --> A3["Agent 3: Industry Impact"]
        CC --> A4["Agent 4: Trend Trajectory"]
        CC --> A5["Agent 5: Policy and Ethics"]
    end
    subgraph "Phase 2-3: Synthesis"
        A1 --> DD["Deep Dive Follow-ups"]
        A2 --> DD
        A3 --> DD
        A4 --> DD
        A5 --> DD
        DD --> SYNTH["Thematic Synthesis"]
    end
    subgraph "Output"
        SYNTH --> STDOUT["Terminal"]
        SYNTH -->|"--notion"| NOTION["Notion Page"]
        SYNTH -->|"--teams/--slack"| CARD["Card JSON"]
        CARD --> NT["notify-teams"]
        CARD --> NS["notify-slack"]
    end
```
```bash
# Full research with all destinations
./custom-brief.sh --topic "AI in healthcare" --notion --teams --slack

# Terminal + Notion only
./custom-brief.sh -t "quantum computing" -n

# Interactive mode (prompts for topic and destinations)
./custom-brief.sh

# PowerShell
.\custom-brief.ps1 -Topic "AI regulation EU" -Notion -Teams

# Make target
make custom-brief T="open source LLMs" NOTION=1 TEAMS=1
```
| Aspect | Daily Briefing | Custom Brief |
|---|---|---|
| Trigger | Scheduled (8:00 AM daily) | On-demand |
| Scope | 9 fixed AI topics | Any user-defined topic |
| Depth | Broad scan (search per topic) | Deep (5 parallel agents + follow-ups) |
| Deduplication | Yes (covered-stories.txt) | No (standalone) |
| Notion title | YYYY-MM-DD - AI Daily Briefing | YYYY-MM-DD - Custom Brief: [Topic] |
| CLI output | Logged to file only | Printed to terminal + logged |
| Card filename | logs/YYYY-MM-DD-card.json | logs/custom-TIMESTAMP-card.json |
Daily operation commands for both standard runs and historical date backfills.
```bash
# Clone + install scheduler
git clone https://github.com/hoangsonww/AI-News-Briefing
cd AI-News-Briefing
make install

# Run now
make run

# Backfill a specific date
make run D=2026-03-15

# Watch logs
make tail
```
```bash
# macOS install (launchd)
chmod +x briefing.sh
cp com.ainews.briefing.plist ~/Library/LaunchAgents/
launchctl load ~/Library/LaunchAgents/com.ainews.briefing.plist
```

```powershell
# Windows install (Task Scheduler)
.\install-task.ps1

# Windows custom schedule example
.\install-task.ps1 -Hour 7 -Minute 30
```
| Action | macOS/Linux | Windows |
|---|---|---|
| Run now | bash briefing.sh | .\briefing.ps1 |
| Backfill date | bash briefing.sh 2026-03-15 | .\briefing.ps1 -BriefingDate 2026-03-15 |
| Trigger scheduler | launchctl kickstart "gui/$(id -u)/com.ainews.briefing" | schtasks /run /tn AiNewsBriefing |
| Tail today log | tail -f logs/$(date +%Y-%m-%d).log | Get-Content .\logs\$(Get-Date -Format yyyy-MM-dd).log -Wait |
| Scheduler status | launchctl list \| grep ainews | schtasks /query /tn AiNewsBriefing |
Single interface for execution, logs, scheduler management, and validation.
| Command | Category | Description |
|---|---|---|
| make run | Execution | Run briefing in foreground |
| make run-bg | Execution | Run briefing in background |
| make run D=YYYY-MM-DD | Execution | Backfill for a target date |
| make run-scheduled | Execution | Trigger via scheduler service |
| make custom-brief T="topic" | Execution | Deep-research a specific topic |
| make custom-brief T="..." NOTION=1 TEAMS=1 | Execution | Research + publish to Notion/Teams |
| make custom-brief-bg T="..." | Execution | Custom brief in background |
| make tail | Logs | Tail today's log |
| make logs | Logs | List all logs |
| make log-date D=YYYY-MM-DD | Logs | Print a date-specific log |
| make clean-logs | Logs | Delete logs older than 30 days |
| make install | Scheduler | Install scheduler for current platform |
| make uninstall | Scheduler | Remove scheduler |
| make status | Scheduler | Show scheduler status |
| make check | Validate | Verify Claude binary path |
| make validate | Validate | Validate key project files and prompt steps |
| make info | Info | Show platform/model/topic summary |
Every script ships in both `.sh` and `.ps1` variants.
| Script | Purpose | Example |
|---|---|---|
| health-check | Verify install, prompt structure, scheduler, CLI | bash scripts/health-check.sh |
| dry-run | Run pipeline without writing to Notion | bash scripts/dry-run.sh --model haiku --budget 1.00 |
| test-notion | Test Notion MCP connectivity | bash scripts/test-notion.sh |
| log-summary | Summarize recent run outcomes | bash scripts/log-summary.sh 14 |
| log-search | Search logs with optional context | bash scripts/log-search.sh --search "Anthropic" --context 2 |
| cost-report | Estimate spend by period | bash scripts/cost-report.sh --month 2026-03 |
| export-logs | Archive logs to tar.gz/zip | bash scripts/export-logs.sh --from 2026-03-01 --to 2026-03-09 |
| backup-prompt | Version and restore prompt.md | bash scripts/backup-prompt.sh --list |
| topic-edit | Add/remove/list topics in prompt.md | bash scripts/topic-edit.sh --list |
| update-schedule | Adjust scheduler time | bash scripts/update-schedule.sh --hour 7 --minute 30 |
| notify | Native OS status notification | bash scripts/notify.sh |
| notify-teams | Validate and POST Adaptive Card JSON to Teams webhook(s) | bash scripts/notify-teams.sh --all --card-file logs/2026-03-24-card.json |
| notify-slack | Convert Teams card JSON to Block Kit and POST to Slack webhook(s) | bash scripts/notify-slack.sh --all --card-file logs/2026-03-24-card.json |
| uninstall | Remove scheduler and optional artifacts | bash scripts/uninstall.sh --all |
Both channels use the same generated `logs/YYYY-MM-DD-card.json`, with Slack converting it to Block Kit first.
```mermaid
flowchart LR
    A["Claude writes logs/YYYY-MM-DD-card.json"] --> B{"Channel notifier"}
    B --> C["notify-teams.sh / notify-teams.ps1"]
    B --> D["notify-slack.sh / notify-slack.ps1"]
    C --> E["POST Adaptive Card JSON to Teams webhooks"]
    D --> F["teams-to-slack.py conversion"]
    F --> G["POST Block Kit JSON to Slack webhooks"]
```
```bash
# Configure multiple webhook URLs (semicolon-separated)
export AI_BRIEFING_TEAMS_WEBHOOK="https://teams-one;https://teams-two"
export AI_BRIEFING_SLACK_WEBHOOK="https://hooks.slack.com/services/T.../B.../one;https://hooks.slack.com/services/T.../B.../two"

# Default: first URL only
bash scripts/notify-teams.sh
bash scripts/notify-slack.sh

# Fan out to all configured URLs
bash scripts/notify-teams.sh --all
bash scripts/notify-slack.sh --all

# Explicit card file for replay/testing
bash scripts/notify-teams.sh --all --card-file logs/2026-03-24-card.json
bash scripts/notify-slack.sh --all --card-file logs/2026-03-24-card.json
```
```powershell
# Persist multiple webhook URLs (semicolon-separated)
[Environment]::SetEnvironmentVariable("AI_BRIEFING_TEAMS_WEBHOOK", "https://teams-one;https://teams-two", "User")
[Environment]::SetEnvironmentVariable("AI_BRIEFING_SLACK_WEBHOOK", "https://hooks.slack.com/services/T.../B.../one;https://hooks.slack.com/services/T.../B.../two", "User")

# Default: first URL only
.\scripts\notify-teams.ps1
.\scripts\notify-slack.ps1

# Fan out to all configured URLs
.\scripts\notify-teams.ps1 -All
.\scripts\notify-slack.ps1 -All

# Explicit card file for replay/testing
.\scripts\notify-teams.ps1 -All -CardFile ".\logs\2026-03-24-card.json"
.\scripts\notify-slack.ps1 -All -CardFile ".\logs\2026-03-24-card.json"
```
Legacy note: `scripts/build-teams-card.py` remains for historical parser flow, but current runtime uses direct card generation in `prompt.md` Step 4. Slack reuses that same card through `scripts/teams-to-slack.py`.
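The conversion in `scripts/teams-to-slack.py` can be pictured as a TextBlock-to-block mapping. This sketch assumes only the generic Adaptive Card shape and standard Block Kit block types; it is not the actual converter, which handles more element types:

```python
def adaptive_to_block_kit(card: dict) -> dict:
    """Map Adaptive Card TextBlocks to Slack Block Kit blocks:
    bold text becomes a header, everything else a mrkdwn section.
    (Illustrative only; see scripts/teams-to-slack.py for the real logic.)"""
    blocks = []
    for element in card.get("body", []):
        if element.get("type") != "TextBlock":
            continue  # sketch ignores non-text elements
        text = element.get("text", "")
        if element.get("weight") == "Bolder":
            blocks.append({"type": "header",
                           "text": {"type": "plain_text", "text": text}})
        else:
            blocks.append({"type": "section",
                           "text": {"type": "mrkdwn", "text": text}})
    return {"blocks": blocks}
```

The resulting `{"blocks": [...]}` payload is what the Slack notifier POSTs to each webhook URL.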
The suites verify syntax, structure, argument handling, template substitution, card JSON, notification error paths, and cross-platform portability. No external services are called.
```mermaid
flowchart TD
    subgraph "Bash (macOS / Linux / Git Bash)"
        R["tests/run-all.sh"] --> T1["test-custom-brief.sh\n37 tests"]
        R --> T2["test-daily-brief.sh\n56 tests"]
        R --> T3["test-notifications.sh\n37 tests"]
        R --> T4["test-portability.sh\n26 tests"]
    end
    subgraph "PowerShell (Windows)"
        PS["tests/test-all.ps1\n91 tests"]
    end
    T1 --> X["Args, templates,\nprompts, skills"]
    T2 --> Y["Steps, topics,\nchangelogs, scripts"]
    T3 --> Z["Card JSON, converter,\nerror handling"]
    T4 --> W["Bash 3.2, awk, date,\nANSI safety"]
    PS --> X
    PS --> Y
    PS --> Z
```
```bash
# All bash tests (macOS / Linux / Git Bash)
bash tests/run-all.sh

# Individual suites
bash tests/test-custom-brief.sh
bash tests/test-daily-brief.sh
bash tests/test-notifications.sh
bash tests/test-portability.sh
```

```powershell
# PowerShell (Windows)
powershell -ExecutionPolicy Bypass -File tests\test-all.ps1
```
| Suite | Tests | Coverage |
|---|---|---|
| test-custom-brief.sh | 37 | Args, template substitution, prompt structure, skill |
| test-daily-brief.sh | 56 | Prompt steps, 9 topics, 8 changelog URLs, entry scripts, dedup |
| test-notifications.sh | 37 | Card JSON validity, Adaptive Card structure, converter, errors |
| test-portability.sh | 26 | Bash 3.2 compat, awk, date, -f not -x, ANSI safety |
| test-all.ps1 | 91 | PowerShell syntax, all prompts, cards, converter, docs |
Baseline cost estimates assume the Opus model, 9 topics, and no hard budget cap.
Primary docs to keep this wiki and runtime behavior aligned.
- User-facing setup, topic scope, scheduler installation, Makefile commands, and Teams + Slack delivery overview.
- Deep architecture walkthrough with component-level design decisions and operational constraints.
- Detailed end-to-end execution, sequence, failure paths, artifact contracts, and alignment options.
- Teams webhook setup, multi-webhook configuration, Adaptive Card behavior, and troubleshooting delivery issues.
- Converter that transforms Teams Adaptive Card JSON into Slack Block Kit payloads for Slack webhook delivery.
- Agent instructions for search scope, formatting requirements, and the Notion write operation.
- Custom topic deep research: CLI usage, agent architecture, output formats, and comparison with the daily briefing.
- Full setup guide: AI CLI engines, Notion MCP, webhook configuration, scheduler install, and verification.
- Slack webhook setup, Block Kit conversion, multi-webhook configuration, and troubleshooting.
- Test suite documentation: architecture, per-suite coverage tables, run commands, design principles, how to add tests.
- Log tailing and management: live tail, read, search, summarize, export, and cleanup for daily and custom briefs.
- Cross-platform operator interface for run, schedule, custom-brief, log, validation, and status workflows.