Last refreshed: April 15, 2026

AI News Briefing
Comprehensive Project Wiki

End-to-end reference for the automated daily pipeline and on-demand deep research: scheduler triggers, multi-engine CLI execution (Claude, Codex, Gemini, Copilot), multi-agent research, Notion publishing, Obsidian vault integration with graph visualization, Teams and Slack delivery, custom topic briefs, and maintenance workflows.

Claude Code Codex Gemini GitHub Copilot Notion MCP Obsidian Bash PowerShell Python Microsoft Teams Slack macOS launchd Task Scheduler

Current system topology

From scheduler trigger to Notion, Obsidian vault, and optional multi-channel webhook delivery (Teams + Slack).

The system operates across five primary layers: a platform-native scheduler (launchd or Task Scheduler), a scripted entry point, an AI CLI engine registry (Claude, Codex, Gemini, or Copilot), the AI execution layer, and the output destinations. This modular design ensures that the core prompting and research logic remains decoupled from the specific OS or AI model being used.

flowchart LR
    subgraph Schedulers
        A1["macOS launchd"]
        A2["Windows Task Scheduler"]
        A3["Manual / make run"]
    end
    A1 --> B["briefing.sh / .ps1"]
    A2 --> B
    A3 --> B
    B --> C["prompt.md"]
    C --> D["AI Engine (Claude/Codex/Gemini/Copilot)"]
    D --> E["WebSearch (9 topics)"]
    D --> F["Notion MCP"]
    F --> G["Notion Page"]
    D --> H["card.json"]
    D --> OBS["obsidian.md"]
    B --> I{"Teams webhook?"}
    I -->|Yes| J["notify-teams"]
    J --> K["Teams Channel"]
    B --> L{"Slack webhook?"}
    L -->|Yes| M["notify-slack"]
    M --> N["Slack Channel"]
    B --> OC{"Obsidian vault?"}
    OC -->|Yes| OP["publish-obsidian"]
    OP --> OV["Obsidian Vault +\nGraph View"]
    

Runtime Topology

flowchart TD
    subgraph Triggers
        A1["macOS launchd"]
        A2["Windows Task Scheduler"]
        A3["Manual make run"]
    end

    A1 --> B1["briefing.sh"]
    A2 --> B2["briefing.ps1"]
    A3 --> B1
    A3 --> B2

    B1 --> C["prompt.md"]
    B2 --> C

    C --> D["AI Engine"]
    D --> E["WebSearch"]
    D --> F["Notion MCP"]
    F --> G["Notion briefing page"]

    D --> CS["logs/covered-stories.txt"]
    D --> H["logs/YYYY-MM-DD.log"]
    D --> I["logs/YYYY-MM-DD-card.json"]
    D --> OI["logs/YYYY-MM-DD-obsidian.md"]

    B1 --> JT{"AI_BRIEFING_TEAMS_WEBHOOK set?"}
    B2 --> JT
    JT -->|"No"| KT["Skip Teams"]
    JT -->|"Yes"| LT["notify-teams script"]
    LT --> MT{"card JSON exists and is valid?"}
    MT -->|"Yes"| NT["POST Adaptive Card to Teams webhooks"]
    MT -->|"No"| OT["Teams notify failed"]

    B1 --> JS{"AI_BRIEFING_SLACK_WEBHOOK set?"}
    B2 --> JS
    JS -->|"No"| KS["Skip Slack"]
    JS -->|"Yes"| LS["notify-slack script"]
    LS --> MS{"card JSON exists and conversion succeeds?"}
    MS -->|"Yes"| NS["Convert to Block Kit and POST to Slack webhooks"]
    MS -->|"No"| OS["Slack notify failed"]

    B1 --> JO{"AI_BRIEFING_OBSIDIAN_VAULT set?"}
    B2 --> JO
    JO -->|"No"| KO["Skip Obsidian"]
    JO -->|"Yes"| LO["publish-obsidian script"]
    LO --> MO{"obsidian.md exists and vault writable?"}
    MO -->|"Yes"| NO["Copy to vault + create topic stubs"]
    MO -->|"No"| OO["Obsidian publish skipped"]
      

Successful Execution Sequence

sequenceDiagram
    participant S as Scheduler/Manual
    participant E as Entry Script
    participant C as AI Engine
    participant W as WebSearch
    participant N as Notion MCP
    participant T as notify-teams
    participant TW as Teams Webhook(s)
    participant SL as notify-slack
    participant SW as Slack Webhook(s)
    participant OB as publish-obsidian
    participant OV as Obsidian Vault

    S->>E: Start briefing script
    E->>E: Setup date, logs, env
    E->>E: Refresh webhook env (Windows)
    E->>C: invoke selected engine (AI_BRIEFING_CLI or daily fallback)

    C->>C: Step 0a read logs/covered-stories.txt
    C->>N: Step 0b fetch most recent briefing
    N-->>C: prior coverage

    loop 9 topic areas
      C->>W: search recent news
      W-->>C: results
    end

    C->>N: Step 3 create today's page
    N-->>C: page URL

    C->>C: Step 5 append to covered-stories.txt

    C-->>E: exit 0 + output
    E->>E: append logs/YYYY-MM-DD.log

    opt Teams webhook configured
      E->>T: call notify-teams --all
      T->>T: validate card.json
      T->>TW: POST payload
      TW-->>T: 2xx
    end

    opt Slack webhook configured
      E->>SL: call notify-slack --all
      SL->>SL: convert card via teams-to-slack.py
      SL->>SW: POST Block Kit payload
      SW-->>SL: 2xx
    end

    opt Obsidian vault configured
      E->>OB: call publish-obsidian
      OB->>OB: extract [[wikilinks]]
      OB->>OV: copy briefing + create topic stubs
      OV-->>OB: success
    end
      

Execution guarantees and gaps

What each stage expects, produces, and where failures usually happen.

The end-to-end flow is fully headless. From the moment the scheduler fires, the entry scripts handle environment sanitization, log rotation, and secure API execution. The daily briefing falls back through engines (`claude → codex → gemini → copilot`) when no explicit engine is set. The custom brief supports the same engines but resolves them explicitly: `--cli` / `-Cli` first, then `AI_BRIEFING_CLI`, then `claude` (no fallback chain).
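The engine resolution can be sketched in Python. `pick_engine` is an illustrative helper (the real logic lives in the Bash/PowerShell entry scripts); it assumes each engine is a CLI binary discoverable on PATH:

```python
import shutil

# Fallback order used by the daily briefing when no engine is set explicitly.
DAILY_FALLBACK = ["claude", "codex", "gemini", "copilot"]

def pick_engine(explicit=None, env_value=None, chain=DAILY_FALLBACK):
    """Resolve the engine: explicit flag first, then the AI_BRIEFING_CLI
    value, then the first CLI from the fallback chain found on PATH."""
    if explicit:
        return explicit
    if env_value:
        return env_value
    for cli in chain:
        if shutil.which(cli):
            return cli
    raise RuntimeError("no supported AI CLI found on PATH")
```

For the custom brief, the same helper applies with an empty `chain`, since it defaults to `claude` rather than falling back.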

| Stage | Input | Output | Primary File/Tool |
| --- | --- | --- | --- |
| Trigger | 08:00 schedule or manual command | Script execution starts | com.ainews.briefing.plist, install-task.ps1, Makefile |
| Bootstrap | Date + environment | Prompt text + runtime context | briefing.sh, briefing.ps1 |
| Dedup | logs/covered-stories.txt + latest Notion page | Exclusion list for this run | Step 0a (file) + Step 0b (Notion MCP) |
| AI Run | prompt.md | Notion briefing + card.json + obsidian.md + covered-stories update | AI engine (Claude/Codex/Gemini/Copilot), WebSearch, Notion MCP |
| Teams Notify | logs/YYYY-MM-DD-card.json + AI_BRIEFING_TEAMS_WEBHOOK | Adaptive Card post(s) | notify-teams.sh, notify-teams.ps1 |
| Slack Notify | logs/YYYY-MM-DD-card.json + AI_BRIEFING_SLACK_WEBHOOK | Block Kit post(s) | notify-slack.sh, notify-slack.ps1, teams-to-slack.py |
| Obsidian Publish | logs/YYYY-MM-DD-obsidian.md + AI_BRIEFING_OBSIDIAN_VAULT | Vault briefing page + topic stub pages | publish-obsidian.sh, publish-obsidian.ps1 |
| Multi-URL Routing | Semicolon-separated webhook URLs | First URL by default, all URLs with --all/-All | Teams + Slack notify scripts |
| Cleanup | Historic log files | Old *.log removed | Entry script cleanup step |
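The multi-URL routing stage can be sketched as a small Python helper. `select_webhooks` is hypothetical (the actual splitting happens inside the notify scripts), but it captures the documented contract: semicolon-separated URLs, first one by default, all of them with `--all`/`-All`:

```python
def select_webhooks(env_value, fan_out_all=False):
    """Split a semicolon-separated webhook list; return the first URL by
    default, or every URL when --all / -All fan-out is requested."""
    urls = [u.strip() for u in (env_value or "").split(";") if u.strip()]
    return urls if fan_out_all else urls[:1]
```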

Prompt/Runtime Aligned

Resolved: `prompt.md` Step 4 generates `logs/YYYY-MM-DD-card.json` directly. Step 5 generates `logs/YYYY-MM-DD-obsidian.md` with [[wikilinks]]. Teams notifier posts the card as-is, Slack notifier converts it to Block Kit, and Obsidian publisher copies the markdown to the vault with topic stub pages.

flowchart TD
    A[Entry script exits success] --> B{card.json exists?}
    B -->|No| B1[Notify paths fail fast]
    B -->|Yes| T{Teams webhook env set?}
    B -->|Yes| S{Slack webhook env set?}

    T -->|No| T0[Skip Teams]
    T -->|Yes| T1[notify-teams]
    T1 --> T2{JSON valid + URL list valid?}
    T2 -->|No| T3[Teams notify failed]
    T2 -->|Yes| T4[POST to first URL or all URLs]

    S -->|No| S0[Skip Slack]
    S -->|Yes| S1[notify-slack]
    S1 --> S2{teams-to-slack conversion OK?}
    S2 -->|No| S3[Slack notify failed]
    S2 -->|Yes| S4[POST Block Kit to first URL or all URLs]

    A --> OB{obsidian.md exists?}
    OB -->|No| OB0[Skip Obsidian]
    OB -->|Yes| OC{Vault env set + writable?}
    OC -->|No| OC0[Obsidian publish skipped]
    OC -->|Yes| OC1[publish-obsidian]
    OC1 --> OC2[Copy to AI-News-Briefings/]
    OC1 --> OC3["Extract [[wikilinks]]"]
    OC3 --> OC4[Create Topics/ stub pages]
    

9 Core Topic Areas Searched Daily

Our automated briefing engine targets ultra-short-horizon AI news (strictly the past 24 hours). By segmenting the web search across these 9 highly specific domains, we bypass algorithmic noise and surface the exact technical, financial, and strategic updates you actually care about.

1

Anthropic & Claude Ecosystem

Real-time monitoring of the Anthropic ecosystem. This covers unannounced model deprecations, API rate limit changes, new system prompt capabilities, Claude Code CLI updates, and official blog announcements regarding safety benchmarks or enterprise tier features.

2

OpenAI, Codex & ChatGPT

Comprehensive tracking of OpenAI's rapidly shifting product lines. We monitor silent API parameter updates, GPT model rollouts, OpenAI Codex integration news, ChatGPT Plus feature drops, and strategic partnership announcements (e.g., Apple, Microsoft).

3

AI Coding IDEs & Workspaces

The developer tooling war is moving fast. We track paradigm shifts in AI-native editors like Cursor and Windsurf, alongside major extension updates for GitHub Copilot, JetBrains AI Assistant, and the integration of local SLMs directly into Xcode and VS Code.

4

Agentic AI & Orchestration

Beyond single-turn chat, we track the execution layer. This includes the latest advancements in LangChain, CrewAI, and Microsoft AutoGen, alongside new integrations for the Model Context Protocol (MCP), browser-automation agents, and multi-agent consensus frameworks.

5

Industry Titans & Hardware

Macro-level shifts from the tech giants. We monitor Google DeepMind, Meta, and xAI for major lab announcements, while simultaneously tracking NVIDIA, AMD, and Groq for semiconductor news, cluster deployments, and inference hardware breakthroughs.

6

Open Source & Local Models

The bleeding edge of the open-weights community. We track Hugging Face leaderboards, new quantization methods (GGUF, AWQ), and massive drops like Llama 3/4 iterations, Mistral architectural shifts, and DeepSeek's latest parameter-efficient milestones.

7

Startups, VC & M&A

Following the money in the AI boom. We scrape news of unannounced Seed and Series A funding rounds for stealth startups, major acqui-hires (talent poaching) between leading AI labs, and massive compute-infrastructure capital expenditures.

8

Policy, Safety & Regulation

The legal landscape defining AI's future. We track sweeping government policy changes (EU AI Act, US Executive Orders), copyright infringement lawsuits from major publishers against AI labs, and developments in cryptographic provenance and watermarking.

9

Frontend Dev Tools & Frameworks

How AI is fundamentally altering web development. We track Vercel's v0, Next.js AI integrations, React Native copilot generation, and the overall shift toward AI-generated UIs, dynamic rendering, and automated TypeScript type-safety tooling.

Deep research on any topic, on demand

Pick a topic, choose your destinations, and get a comprehensive news-focused research briefing with linked citations and dates.

flowchart LR
    A["Topic Input\n(CLI flags or REPL)"] --> B["custom-brief.sh / .ps1"]
    B --> C["prompt-custom-brief.md"]
    C --> D["AI Engine"]
    subgraph "Phase 1: Parallel Discovery"
        D --> A1["Agent 1: Breaking News"]
        D --> A2["Agent 2: Technical Analysis"]
        D --> A3["Agent 3: Industry Impact"]
        D --> A4["Agent 4: Trends"]
        D --> A5["Agent 5: Policy"]
        D --> A6["Agent 6+: Topic-specific"]
    end
    A1 --> DD["Phase 2: Deep Dive"]
    A2 --> DD
    A3 --> DD
    A4 --> DD
    A5 --> DD
    A6 --> DD
    DD --> S["Phase 3: Synthesis"]
    S --> OUT["Terminal Output"]
    S -->|"--notion"| N["Notion Page"]
    S -->|"--teams/--slack"| CARD["Card JSON"]
    S -->|"--obsidian"| OBS["Obsidian Vault +\nGraph View"]
    CARD --> T["Teams"]
    CARD --> SL["Slack"]

Research Agent Architecture

flowchart TD
    subgraph "Entry Points"
        SH["custom-brief.sh"]
        PS["custom-brief.ps1"]
        SK["/custom-brief skill"]
    end

    SH --> PT["prompt-custom-brief.md"]
    PS --> PT
    SK --> CC["AI Engine"]
    PT --> CC

    subgraph "Phase 1: Parallel Discovery"
        CC --> A1["Agent 1: Breaking News"]
        CC --> A2["Agent 2: Technical Analysis"]
        CC --> A3["Agent 3: Industry Impact"]
        CC --> A4["Agent 4: Trend Trajectory"]
        CC --> A5["Agent 5: Policy and Ethics"]
    end

    subgraph "Phase 2-3: Synthesis"
        A1 --> DD["Deep Dive Follow-ups"]
        A2 --> DD
        A3 --> DD
        A4 --> DD
        A5 --> DD
        DD --> SYNTH["Thematic Synthesis"]
    end

    subgraph "Output"
        SYNTH --> STDOUT["Terminal"]
        SYNTH -->|"--notion"| NOTION["Notion Page"]
        SYNTH -->|"--teams/--slack"| CARD["Card JSON"]
        SYNTH -->|"--obsidian"| OBSMD["Obsidian Markdown"]
        CARD --> NT["notify-teams"]
        CARD --> NS["notify-slack"]
        OBSMD --> PO["publish-obsidian"]
        PO --> OV["Vault +\nTopic Stubs"]
    end
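Phase 1's fan-out can be sketched with a thread pool. `run_discovery` and its `search` callback are illustrative stand-ins for the engine's parallel agents, not project code:

```python
from concurrent.futures import ThreadPoolExecutor

# The five standing research angles from Phase 1 (Agent 6+ is topic-specific).
ANGLES = ["Breaking News", "Technical Analysis", "Industry Impact",
          "Trend Trajectory", "Policy and Ethics"]

def run_discovery(topic, search):
    """Fan the topic out to one agent per angle, in parallel, and gather
    the results keyed by angle. `search` stands in for a web-search call."""
    with ThreadPoolExecutor(max_workers=len(ANGLES)) as pool:
        futures = {a: pool.submit(search, f"{topic}: {a}") for a in ANGLES}
        return {a: f.result() for a, f in futures.items()}
```

The Phase 2 deep-dive then iterates over these per-angle results before synthesis.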
      
custom brief usage
# Full research with all destinations
./custom-brief.sh --topic "AI in healthcare" --notion --teams --slack --obsidian

# Terminal + Notion only
./custom-brief.sh -t "quantum computing" -n

# Terminal + Obsidian vault (graph-enabled)
./custom-brief.sh -t "AI coding assistants" --obsidian

# Interactive mode (prompts for topic and destinations)
./custom-brief.sh

# Pick a specific engine
./custom-brief.sh --cli codex -t "AI safety" -n

# PowerShell
.\custom-brief.ps1 -Topic "AI regulation EU" -Notion -Teams -Obsidian

# Make target
make custom-brief T="open source LLMs" NOTION=1 TEAMS=1 OBSIDIAN=1
| Aspect | Daily Briefing | Custom Brief |
| --- | --- | --- |
| Trigger | Scheduled (8:00 AM daily) | On-demand |
| Scope | 9 fixed AI topics | Any user-defined topic |
| Depth | Broad scan (search per topic) | Deep (5 parallel agents + follow-ups) |
| Deduplication | Yes (covered-stories.txt) | No (standalone) |
| Notion title | YYYY-MM-DD - AI Daily Briefing | YYYY-MM-DD - Custom Brief: [Topic] |
| Engine selection | AI_BRIEFING_CLI or fallback chain | --cli/-Cli, then AI_BRIEFING_CLI, then claude |
| Obsidian vault file | AI-News-Briefings/YYYY-MM-DD.md | AI-News-Briefings/YYYY-MM-DD - [Topic].md |
| CLI output | Logged to file only | Printed to terminal + logged |
| Card filename | logs/YYYY-MM-DD-card.json | logs/custom-TIMESTAMP-card.json |
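The deduplication difference hinges on `covered-stories.txt`. A minimal sketch of that exclusion step (the real check is performed by the AI engine in Step 0a, not by this hypothetical helper) might look like:

```python
def filter_new_stories(headlines, covered_path):
    """Drop headlines already recorded in logs/covered-stories.txt.
    A missing file means a first run: nothing is excluded."""
    try:
        with open(covered_path, encoding="utf-8") as f:
            covered = {line.strip() for line in f if line.strip()}
    except FileNotFoundError:
        covered = set()
    return [h for h in headlines if h.strip() not in covered]
```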

Install, run, backfill, inspect

Daily operation commands for both standard runs and historical date backfills.

Operations are standardized around a cross-platform Makefile, giving you a unified surface to install schedulers, trigger runs, inspect logs, and validate the environment. All outputs are captured in date-stamped logs with an automatic 30-day rotation policy to keep your disk clean.
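The 30-day rotation can be sketched as follows. `clean_old_logs` is an illustrative Python equivalent of the entry scripts' cleanup step; keying off file modification time is an assumption about how the scripts decide age:

```python
import os
import time

def clean_old_logs(log_dir, days=30):
    """Remove *.log files older than `days`, mirroring the entry scripts'
    rotation step. Returns the names removed, sorted for stable output."""
    cutoff = time.time() - days * 86400
    removed = []
    for name in os.listdir(log_dir):
        path = os.path.join(log_dir, name)
        if name.endswith(".log") and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return sorted(removed)
```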

quick start (make)
# Clone + install scheduler
git clone https://github.com/hoangsonww/AI-News-Briefing
cd AI-News-Briefing
make install

# Run now
make run

# Backfill a specific date
make run D=2026-03-15

# Watch logs
make tail
platform-native scheduling
# macOS install (launchd)
chmod +x briefing.sh
cp com.ainews.briefing.plist ~/Library/LaunchAgents/
launchctl load ~/Library/LaunchAgents/com.ainews.briefing.plist

# Windows install (Task Scheduler)
.\install-task.ps1

# Windows custom schedule example
.\install-task.ps1 -Hour 7 -Minute 30
| Action | macOS/Linux | Windows |
| --- | --- | --- |
| Run now | bash briefing.sh | .\briefing.ps1 |
| Backfill date | bash briefing.sh 2026-03-15 | .\briefing.ps1 -BriefingDate 2026-03-15 |
| Trigger scheduler | launchctl kickstart "gui/$(id -u)/com.ainews.briefing" | schtasks /run /tn AiNewsBriefing |
| Tail today's log | tail -f logs/$(date +%Y-%m-%d).log | Get-Content .\logs\$(Get-Date -Format yyyy-MM-dd).log -Wait |
| Scheduler status | launchctl list \| grep ainews | schtasks /query /tn AiNewsBriefing |

Research Ops Plugins

A unified suite of 10 intelligence agents built natively for Claude Code, OpenAI Codex, and Gemini CLI.

We provide a suite of highly specialized research tools capable of scanning GitHub trends, parsing SEC filings, summarizing ArXiv papers, tracking crypto tokenomics, and performing deep social intelligence scraping across Reddit and X.

graph TD
    subgraph "CLI Environments"
        CC[Claude Code]
        CX[OpenAI Codex]
        GM[Gemini CLI]
    end

    subgraph "Marketplace Catalogs"
        CMP[.claude-plugin/marketplace.json]
        CXMP[.agents/plugins/marketplace.json]
    end

    subgraph "Plugin Types (10 Available)"
        News[News & Media: ai-news, last30days, podcast-summarizer]
        Dev[Tech & Dev: trend-spotter, paper-reader, repo-auditor]
        Biz[Business: earnings, competitor-intel, startup-scout, crypto]
    end

    CC -->|/plugin install| CMP
    CX -->|Local UI| CXMP
    GM -->|extensions link| News
    GM -->|extensions link| Dev
    GM -->|extensions link| Biz

    CMP --> News
    CMP --> Dev
    CMP --> Biz

    CXMP --> News
    CXMP --> Dev
    CXMP --> Biz
    
📰

ai-news-briefing

Automates daily AI news research, Notion publishing, and Teams/Slack webhooks. Deep multi-angle research.

🔥

last30days

AI agent search engine scored by upvotes and real money. Searches Reddit, X, HN, YouTube, and Polymarket.

📈

trend-spotter

Analyzes GitHub trending repos, package downloads, and tech Twitter to identify emerging developer tools.

💰

earnings-analyzer

Fetches SEC filings and earnings call transcripts to synthesize an objective business briefing.

📚

paper-reader

Connects to ArXiv and Semantic Scholar to explain complex methodology in plain English (ELI5).

⚔️

competitor-intel

Finds top competitors, compares feature releases, pricing changes, and customer sentiment on Reddit/G2.

🛡️

repo-auditor

Analyzes GitHub repos for security vulnerabilities, stale dependencies, code quality, and bus factor.

🎙️

podcast-summarizer

Extracts transcripts from YouTube/podcasts and synthesizes actionable takeaways in a scannable format.

🚀

startup-scout

Tracks Y Combinator, Product Hunt, and VC funding rounds to identify promising early-stage startups.

🪙

crypto-tracker

Analyzes tokenomics, whitepapers, on-chain activity, and crypto Twitter sentiment for Web3 projects.

Cross-platform command surface

Single interface for execution, logs, scheduler management, and validation.

| Command | Category | Description |
| --- | --- | --- |
| make run | Execution | Run briefing in foreground |
| make run-bg | Execution | Run briefing in background |
| make run D=YYYY-MM-DD | Execution | Backfill for a target date |
| make run-scheduled | Execution | Trigger via scheduler service |
| make custom-brief T="topic" | Execution | Deep-research a specific topic |
| make custom-brief T="..." NOTION=1 TEAMS=1 | Execution | Research + publish to Notion/Teams |
| make custom-brief T="..." OBSIDIAN=1 | Execution | Research + publish to Obsidian vault |
| make custom-brief-bg T="..." | Execution | Custom brief in background |
| make tail | Logs | Tail today's log |
| make logs | Logs | List all logs |
| make log-date D=YYYY-MM-DD | Logs | Print date-specific log |
| make clean-logs | Logs | Delete logs older than 30 days |
| make install | Scheduler | Install scheduler for current platform |
| make uninstall | Scheduler | Remove scheduler |
| make status | Scheduler | Show scheduler status |
| make check | Validate | Verify Claude binary path |
| make validate | Validate | Validate key project files and prompt steps |
| make info | Info | Show platform/model/topic summary |

Utility Script Catalog

Script pairs exist for both `.sh` and `.ps1` variants.

| Script | Purpose | Example |
| --- | --- | --- |
| health-check | Verify install, prompt structure, scheduler, CLI | bash scripts/health-check.sh |
| dry-run | Run pipeline without writing to Notion | bash scripts/dry-run.sh --model haiku --budget 1.00 |
| test-notion | Test Notion MCP connectivity | bash scripts/test-notion.sh |
| log-summary | Summarize recent run outcomes | bash scripts/log-summary.sh 14 |
| log-search | Search logs with optional context | bash scripts/log-search.sh --search "Anthropic" --context 2 |
| cost-report | Estimate spend by period | bash scripts/cost-report.sh --month 2026-03 |
| export-logs | Archive logs to tar.gz/zip | bash scripts/export-logs.sh --from 2026-03-01 --to 2026-03-09 |
| backup-prompt | Version and restore prompt.md | bash scripts/backup-prompt.sh --list |
| topic-edit | Add/remove/list topics in prompt.md | bash scripts/topic-edit.sh --list |
| update-schedule | Adjust scheduler time | bash scripts/update-schedule.sh --hour 7 --minute 30 |
| notify | Native OS status notification | bash scripts/notify.sh |
| notify-teams | Validate and POST Adaptive Card JSON to Teams webhook(s) | bash scripts/notify-teams.sh --all --card-file logs/2026-03-24-card.json |
| notify-slack | Convert Teams card JSON to Block Kit and POST to Slack webhook(s) | bash scripts/notify-slack.sh --all --card-file logs/2026-03-24-card.json |
| publish-obsidian | Copy Obsidian markdown to vault and create [[wikilink]] topic stubs | bash scripts/publish-obsidian.sh --file logs/2026-04-11-obsidian.md |
| test-obsidian | Test Obsidian vault connectivity, permissions, and .obsidian config | bash scripts/test-obsidian.sh |
| uninstall | Remove scheduler and optional artifacts | bash scripts/uninstall.sh --all |

Teams, Slack, and Obsidian publishing paths

Teams and Slack reuse the same logs/YYYY-MM-DD-card.json. Obsidian uses the dedicated logs/YYYY-MM-DD-obsidian.md with [[wikilinks]] for graph visualization.

The notification pipeline uses a fan-out design: a single Adaptive Card JSON file is generated during the AI run, converted on demand into platform-specific formats (such as Slack Block Kit), and dispatched to one or more webhook URLs without any intermediate service.
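The card-to-Block-Kit conversion can be sketched minimally. This illustrative `card_to_block_kit` handles only `TextBlock` elements; the real converter, `scripts/teams-to-slack.py`, covers the fuller card schema:

```python
def card_to_block_kit(card):
    """Map Adaptive Card TextBlock elements to Slack section blocks.
    Minimal sketch: unknown element types are simply skipped."""
    blocks = []
    for element in card.get("body", []):
        if element.get("type") == "TextBlock":
            blocks.append({
                "type": "section",
                "text": {"type": "mrkdwn", "text": element.get("text", "")},
            })
    return {"blocks": blocks}
```

The resulting `{"blocks": [...]}` payload is what the Slack notifier POSTs to each webhook URL.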

flowchart LR
    A["Claude writes logs/YYYY-MM-DD-card.json"] --> B{"Channel notifier"}
    B --> C["notify-teams.sh / notify-teams.ps1"]
    B --> D["notify-slack.sh / notify-slack.ps1"]
    C --> E["POST Adaptive Card JSON to Teams webhooks"]
    D --> F["teams-to-slack.py conversion"]
    F --> G["POST Block Kit JSON to Slack webhooks"]

    H["Claude writes logs/YYYY-MM-DD-obsidian.md"] --> I["publish-obsidian.sh / .ps1"]
    I --> J["Copy to vault/AI-News-Briefings/"]
    I --> K["Extract [[wikilinks]]"]
    K --> L["Create Topics/ stub pages"]
    L --> M["Obsidian Graph View\nvisualizes topic connections"]
    
configure + test delivery (macOS/Linux)
# Configure multiple webhook URLs (semicolon-separated)
export AI_BRIEFING_TEAMS_WEBHOOK="https://teams-one;https://teams-two"
export AI_BRIEFING_SLACK_WEBHOOK="https://hooks.slack.com/services/T.../B.../one;https://hooks.slack.com/services/T.../B.../two"

# Default: first URL only
bash scripts/notify-teams.sh
bash scripts/notify-slack.sh

# Fan out to all configured URLs
bash scripts/notify-teams.sh --all
bash scripts/notify-slack.sh --all

# Explicit card file for replay/testing
bash scripts/notify-teams.sh --all --card-file logs/2026-03-24-card.json
bash scripts/notify-slack.sh --all --card-file logs/2026-03-24-card.json
configure + test delivery (Windows PowerShell)
# Persist multiple webhook URLs (semicolon-separated)
[Environment]::SetEnvironmentVariable("AI_BRIEFING_TEAMS_WEBHOOK", "https://teams-one;https://teams-two", "User")
[Environment]::SetEnvironmentVariable("AI_BRIEFING_SLACK_WEBHOOK", "https://hooks.slack.com/services/T.../B.../one;https://hooks.slack.com/services/T.../B.../two", "User")

# Default: first URL only
.\scripts\notify-teams.ps1
.\scripts\notify-slack.ps1

# Fan out to all configured URLs
.\scripts\notify-teams.ps1 -All
.\scripts\notify-slack.ps1 -All

# Explicit card file for replay/testing
.\scripts\notify-teams.ps1 -All -CardFile ".\logs\2026-03-24-card.json"
.\scripts\notify-slack.ps1 -All -CardFile ".\logs\2026-03-24-card.json"
configure + test Obsidian vault (macOS/Linux)
# Set vault path (absolute path to your Obsidian vault root)
export AI_BRIEFING_OBSIDIAN_VAULT="/Users/you/Documents/ObsidianVault"

# Test vault connectivity
bash scripts/test-obsidian.sh

# Manual publish (auto-runs after daily briefing when vault is configured)
bash scripts/publish-obsidian.sh --file logs/2026-04-11-obsidian.md
configure + test Obsidian vault (Windows PowerShell)
# Persist vault path
[Environment]::SetEnvironmentVariable("AI_BRIEFING_OBSIDIAN_VAULT", "C:\Users\you\Documents\ObsidianVault", "User")

# Test vault connectivity
.\scripts\test-obsidian.ps1

# Manual publish
.\scripts\publish-obsidian.ps1 -File ".\logs\2026-04-11-obsidian.md"

Legacy note: `scripts/build-teams-card.py` remains from the earlier parser-based flow; the current runtime generates the card directly in `prompt.md` Step 4. Step 5 generates Obsidian-formatted markdown. Slack reuses the card through `scripts/teams-to-slack.py`, and Obsidian publishing writes directly to the local vault.

Local-first vault publishing with graph visualization

Briefings are written as markdown files with [[wikilinks]] and YAML frontmatter. Obsidian's graph view automatically maps topic connections across all briefings.
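The wikilink pass can be sketched as a regex scan. `extract_wikilinks` is illustrative and assumes plain `[[Target]]` or `[[Target|alias]]` links:

```python
import re

# Capture the link target, stopping at a closing bracket, alias pipe, or heading anchor.
WIKILINK = re.compile(r"\[\[([^\]|#]+)")

def extract_wikilinks(markdown_text):
    """Collect unique [[wikilink]] targets from a briefing, sorted for
    deterministic stub creation."""
    return sorted({m.strip() for m in WIKILINK.findall(markdown_text)})
```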

flowchart TD
    subgraph "Claude CLI Output"
        A["logs/YYYY-MM-DD-obsidian.md"]
    end

    subgraph "publish-obsidian.sh"
        B["Copy briefing to vault"]
        C["Extract all [[wikilinks]]"]
        D{"Topic stub exists?"}
        D -->|No| E["Create Topics/Name.md\nwith YAML frontmatter"]
        D -->|Yes| F["Skip (idempotent)"]
    end

    subgraph "Obsidian Vault"
        G["AI-News-Briefings/\n2026-04-11.md"]
        H["Topics/Claude Code.md"]
        I["Topics/OpenAI.md"]
        J["Topics/AI Coding IDEs.md"]
        K["Topics/..."]
    end

    A --> B --> G
    A --> C --> D
    G -.->|"[[Claude Code]]"| H
    G -.->|"[[OpenAI]]"| I
    G -.->|"[[AI Coding IDEs]]"| J
    H -.->|"graph edges"| I
    I -.->|"graph edges"| J
    

Vault Directory Structure

your-obsidian-vault/
  AI-News-Briefings/ # briefing pages (one per run)
    2026-04-11.md # daily briefing with [[wikilinks]]
    2026-04-11 - AI Coding Assistants.md # custom brief
    2026-04-10.md
  Topics/ # graph hub nodes (auto-created)
    Claude Code.md # type: topic, links back to briefings
    OpenAI.md
    AI Coding IDEs.md
    Anthropic.md
  .obsidian/ # Obsidian app config (auto-created by Obsidian)

Graph View Concept

graph LR
    B1["2026-04-11\nDaily Briefing"] --> CC["Claude Code"]
    B1 --> OA["OpenAI"]
    B1 --> ACI["AI Coding IDEs"]
    B1 --> AG["Agentic AI"]
    B2["2026-04-10\nDaily Briefing"] --> CC
    B2 --> OA
    B2 --> MS["Mistral"]
    B2 --> HF["Hugging Face"]
    B3["2026-04-11\nCustom: AI Regulation"] --> POL["AI Policy"]
    B3 --> OA
    B3 --> EU["EU AI Act"]
    CC --> ANT["Anthropic"]
    ACI --> CUR["Cursor"]
    ACI --> WS["Windsurf"]
    style CC fill:#7c3aed,stroke:#a78bfa,color:#fff
    style OA fill:#7c3aed,stroke:#a78bfa,color:#fff
    style ACI fill:#7c3aed,stroke:#a78bfa,color:#fff
    style AG fill:#7c3aed,stroke:#a78bfa,color:#fff
    style MS fill:#7c3aed,stroke:#a78bfa,color:#fff
    style HF fill:#7c3aed,stroke:#a78bfa,color:#fff
    style POL fill:#7c3aed,stroke:#a78bfa,color:#fff
    style EU fill:#7c3aed,stroke:#a78bfa,color:#fff
    style ANT fill:#7c3aed,stroke:#a78bfa,color:#fff
    style CUR fill:#7c3aed,stroke:#a78bfa,color:#fff
    style WS fill:#7c3aed,stroke:#a78bfa,color:#fff
    style B1 fill:#1e3a5f,stroke:#3b82f6,color:#fff
    style B2 fill:#1e3a5f,stroke:#3b82f6,color:#fff
    style B3 fill:#1e3a5f,stroke:#3b82f6,color:#fff
    
| Feature | Implementation |
| --- | --- |
| Wikilinks | Section headings use ## [[Topic Name]]; inline mentions wrapped in [[brackets]] |
| YAML Frontmatter | Briefings: type: briefing, date, topics list. Stubs: type: topic, created |
| Topic Hub Pages | Auto-created in Topics/ on first reference; idempotent on subsequent runs |
| Graph Edges | Each [[wikilink]] creates a bidirectional edge in Obsidian's graph view |
| Cross-Briefing Links | A "Related Topics" header at the top of each briefing links all topics for maximum connectivity |
| Tags | YAML tags array enables tag-based filtering in Obsidian search |
| Env Variable | AI_BRIEFING_OBSIDIAN_VAULT: absolute path to the vault root directory |
| Local-First | No API, no cloud sync; files are written directly to disk. Obsidian Sync is optional. |

How the graph grows: Each daily run adds one briefing page linking to ~9 topic nodes. Each custom brief adds another page with its own topic nodes. Over time, Obsidian's graph view reveals which topics cluster together, which are trending, and how AI news themes evolve. Topic hub pages serve as the gravitational centers of the graph.
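The idempotent stub creation can be sketched in Python. `ensure_topic_stub` is a hypothetical equivalent of the publisher's stub step; the exact frontmatter beyond `type` and `created` is an assumption:

```python
from datetime import date
from pathlib import Path

def ensure_topic_stub(vault_root, topic):
    """Create Topics/<topic>.md with minimal YAML frontmatter on first
    reference; return False (no-op) when the stub already exists."""
    stub = Path(vault_root) / "Topics" / f"{topic}.md"
    if stub.exists():
        return False
    stub.parent.mkdir(parents=True, exist_ok=True)
    stub.write_text(
        f"---\ntype: topic\ncreated: {date.today().isoformat()}\n---\n\n# {topic}\n",
        encoding="utf-8",
    )
    return True
```

Because existing stubs are left untouched, repeated daily runs only add new hub nodes, never overwrite notes a user has edited.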

201 non-blocking tests

Verify syntax, structure, arg handling, template substitution, card JSON, notification error paths, Obsidian vault publishing, and cross-platform portability. No external services called.

Reliability is paramount. We maintain a comprehensive suite of 201 non-blocking tests across Bash and PowerShell. These tests validate syntax, argument handling, prompt templates, and notification schemas without requiring active API connections, ensuring the pipeline remains robust across all platforms.

flowchart TD
    subgraph "Bash (macOS / Linux / Git Bash)"
        R["tests/run-all.sh"] --> T1["test-custom-brief.sh\n48 tests"]
        R --> T2["test-daily-brief.sh\n80 tests"]
        R --> T3["test-notifications.sh\n17 tests"]
        R --> T4["test-portability.sh\n26 tests"]
        R --> T5["test-obsidian.sh\n30 tests"]
    end
    subgraph "PowerShell (Windows)"
        PS["tests/test-all.ps1\n91 tests"]
    end
    T1 --> X["Args, templates,\nprompts, skills, Obsidian flag"]
    T2 --> Y["Steps, topics,\nchangelogs, scripts, Obsidian"]
    T3 --> Z["Card JSON, converter,\nerror handling"]
    T4 --> W["Bash 3.2, awk, date,\nANSI safety"]
    T5 --> V["Vault structure, topic stubs,\nfrontmatter, idempotency"]
    PS --> X
    PS --> Y
    PS --> Z
    
run tests
# All bash tests (macOS / Linux / Git Bash)
bash tests/run-all.sh

# Individual suites
bash tests/test-custom-brief.sh
bash tests/test-daily-brief.sh
bash tests/test-notifications.sh
bash tests/test-portability.sh
bash tests/test-obsidian.sh

# PowerShell (Windows)
powershell -ExecutionPolicy Bypass -File tests\test-all.ps1
| Suite | Tests | Coverage |
| --- | --- | --- |
| test-custom-brief.sh | 48 | Args, template substitution, prompt structure, skill, Obsidian flag |
| test-daily-brief.sh | 80 | Prompt steps, 9 topics, 8 changelog URLs, entry scripts, dedup, Obsidian integration |
| test-notifications.sh | 17 | Card JSON validity, Adaptive Card structure, converter, errors |
| test-portability.sh | 26 | Bash 3.2 compat, awk, date, -f not -x, ANSI safety |
| test-obsidian.sh | 30 | Vault simulation, publish flow, topic stubs, frontmatter, idempotency |
| test-all.ps1 | 91 | PowerShell syntax, all prompts, cards, converter, docs |
Full test documentation: TESTS.md

Default operating envelope

Baseline estimates with Opus model, 9 topics, no hard budget cap.

| Metric | Estimate | Notes |
| --- | --- | --- |
| Per run | ~$3-5 | Opus model, uncapped |
| Monthly (daily runs) | ~$90-150 | Typical; varies by news volume |
| Hard cap | None | No --max-budget-usd set |
| Runtime | 4-8 min | Search, dedup, compile, publish |

Source docs and references

Primary docs to keep this wiki and runtime behavior aligned.

README.md

User-facing setup, topic scope, scheduler installation, Makefile commands, Teams, Slack, and Obsidian delivery overview.

ARCHITECTURE.md

Deep architecture walkthrough with component-level design decisions and operational constraints.

E2E_FLOW.md

Detailed end-to-end execution, sequence, failure paths, artifact contracts, and alignment options.

NOTIFY_TEAMS.md

Teams webhook setup, multi-webhook configuration, Adaptive Card behavior, and troubleshooting delivery issues.

scripts/teams-to-slack.py

Converter that transforms Teams Adaptive Card JSON into Slack Block Kit payloads for Slack webhook delivery.

prompt.md

Agent instructions for search scope, formatting requirements, Notion write, and Obsidian markdown with [[wikilinks]].

CUSTOM_BRIEF.md

Custom topic deep research: CLI usage, agent architecture, output formats, Obsidian graph support, and comparison with daily briefing.

SETUP.md

Full setup guide: AI CLI engines, Notion MCP, Obsidian vault, webhook configuration, scheduler install, and verification.

NOTIFY_SLACK.md

Slack webhook setup, Block Kit conversion, multi-webhook configuration, and troubleshooting.

TESTS.md

Test suite documentation: architecture, per-suite coverage tables (including Obsidian suite), run commands, design principles.

LOGS.md

Log tailing and management: live tail, read, search, summarize, export, and cleanup for daily and custom briefs.

Makefile

Cross-platform operator interface for run, schedule, custom-brief, log, validation, and status workflows.

ai-news-briefing/
  briefing.sh / briefing.ps1 # daily briefing entry points
  prompt.md # daily briefing agent prompt
  custom-brief.sh / custom-brief.ps1 # custom topic deep research CLI
  prompt-custom-brief.md # multi-agent research prompt template
  commands/ # Claude Code interactive skills
    ai-news-briefing.md # daily briefing skill
    custom-brief.md # custom topic skill
  scripts/ # diagnostics, maintenance, notify paths
    notify-teams.sh / .ps1 # Teams webhook notifier
    notify-slack.sh / .ps1 # Slack webhook notifier
    publish-obsidian.sh / .ps1 # Obsidian vault publisher
    test-obsidian.sh / .ps1 # Obsidian vault connectivity test
    teams-to-slack.py # Adaptive Card to Block Kit converter
  tests/ # 201 non-blocking tests (bash + PowerShell)
  logs/ # runtime output (gitignored)
    covered-stories.txt # dedup headline tracker
  README.md # setup + usage
  ARCHITECTURE.md # deep architecture
  CUSTOM_BRIEF.md # custom brief documentation
  SETUP.md # full setup guide
  E2E_FLOW.md # execution contracts
  NOTIFY_TEAMS.md / NOTIFY_SLACK.md # integration guides
  index.html # this wiki page