Building an AI-Powered Weekly News Digest

Automating content creation with n8n and Claude—from raw news feeds to publication-ready blog posts, committed directly to GitHub

n8n · Claude Sonnet 4 · GitHub API

The Challenge

Maintaining both a professional tech blog and a personal lifestyle blog meant spending 3–4 hours every week manually researching news, filtering articles, and writing digest posts. The repetitive nature of this task was clear, but the quality bar remained high—readers expected well-researched, insightful content that went beyond simple link aggregation.

The question became: Could I automate 90% of this workflow while maintaining the quality and voice my audience expected?

The Solution: An Intelligent Content Pipeline

I built a fully automated weekly news agent using n8n for workflow automation and Claude AI for content generation. The system transforms raw news data into publication-ready blog posts and commits them directly to my GitHub repository—all without human intervention.

Pipeline Architecture

Weekly Trigger → News Gathering → AI Filtering → Content Generation → Git Deployment

The workflow handles two distinct content streams: a professional blog covering AI/LLMs, .NET/Blazor, and TypeScript/React for software engineers, and a personal blog covering trail running, cycling, climbing, and fitness for adventure enthusiasts.

Tech Stack

  • Workflow Automation: n8n (self-hosted)
  • AI Model: Claude Sonnet 4 (Anthropic API)
  • News Source: NewsAPI
  • Version Control: GitHub API

Intelligent Topic Configuration

Rather than hardcoding search queries, I created a flexible configuration system that defines topics with precision. Each topic includes semantic keywords optimized for NewsAPI search, category mapping for proper blog taxonomy, and tone and audience specifications for personalized content generation.

```js
{
  name: 'AI & LLMs',
  keywords: 'artificial intelligence product launch OR LLM release OR Claude AI OR GPT-4',
  categoryValue: 'tech-review'
}
```

The system currently tracks 8 different topics across both blogs, with each topic generating targeted search queries.
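As a sketch, the full registry might look like the snippet below. Only `name`, `keywords`, and `categoryValue` appear in the example above; the `blog`, `tone`, and `audience` field names are my illustration of the tone and audience specifications the post describes, not the exact production schema.

```javascript
// Hypothetical topic registry — field names beyond `name`, `keywords`,
// and `categoryValue` are illustrative.
const TOPICS = [
  {
    blog: 'professional',
    name: 'AI & LLMs',
    keywords: 'artificial intelligence product launch OR LLM release OR Claude AI OR GPT-4',
    categoryValue: 'tech-review',
    tone: 'analytical, technical',
    audience: 'software engineers',
  },
  {
    blog: 'personal',
    name: 'Trail Running',
    keywords: 'trail running OR ultramarathon OR trail race results',
    categoryValue: 'fitness',
    tone: 'first-person, conversational',
    audience: 'adventure enthusiasts',
  },
  // ...remaining topics across both blogs
];

// Each topic yields one URL-safe NewsAPI query string.
function buildQuery(topic) {
  return encodeURIComponent(topic.keywords);
}
```

Keeping the search terms, taxonomy mapping, and voice hints in one object per topic means adding a ninth topic is a config change, not a workflow change.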

Multi-Stage News Gathering

Stage 1: Fetch from NewsAPI

  • Queries last 7 days of articles
  • Filters by language (English), relevancy, and source quality
  • Retrieves top 15 articles per topic
  • Handles authentication via HTTP Query Auth
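Stage 1 boils down to one HTTP request per topic against NewsAPI's public `/v2/everything` endpoint. A minimal URL builder, assuming those documented query parameters (the helper name is mine; in the real workflow the key is injected by n8n's HTTP Query Auth credential):

```javascript
// Build the NewsAPI `everything` request for one topic.
function buildNewsApiUrl(keywords, apiKey) {
  // Only articles from the past 7 days.
  const from = new Date(Date.now() - 7 * 24 * 60 * 60 * 1000)
    .toISOString()
    .slice(0, 10); // YYYY-MM-DD
  const params = new URLSearchParams({
    q: keywords,
    from,
    language: 'en',     // English only
    sortBy: 'relevancy',
    pageSize: '15',     // top 15 articles per topic
    apiKey,
  });
  return `https://newsapi.org/v2/everything?${params}`;
}
```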

Stage 2: Article Validation

  • Filters out articles missing title, description, or URL
  • Extracts meaningful content snippets (500 chars)
  • Preserves source attribution and publication dates
  • Formats for AI consumption
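A minimal version of that validation step might look like this (field names mirror NewsAPI's article shape; the 500-character cutoff is the one listed above):

```javascript
// Keep only articles with the required fields, trimmed into a
// compact shape the AI prompt can consume.
function validateArticles(articles) {
  return articles
    .filter((a) => a.title && a.description && a.url)
    .map((a) => ({
      title: a.title,
      url: a.url,
      source: a.source?.name ?? 'unknown',   // preserve attribution
      publishedAt: a.publishedAt ?? null,    // preserve publication date
      snippet: `${a.description} ${a.content ?? ''}`.trim().slice(0, 500),
    }));
}
```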

AI-Powered Quality Filtering

This is where the magic happens. Raw news feeds are noisy—filled with duplicates, clickbait, and off-topic matches. I use Claude AI with a sophisticated filtering prompt that prioritizes original reporting, excludes promotional content, favors recent articles, and deduplicates coverage of the same story.

Claude returns structured JSON:

```json
{
  "has_relevant_news": true,
  "selected_articles": [...],
  "summary": "A cohesive narrative explaining what happened and why it matters"
}
```

Key Innovation: The AI doesn't just filter—it synthesizes a narrative thread that becomes the foundation for the final blog post.

Context-Aware Content Generation

After grouping articles by category, the workflow constructs two different AI personas to match each blog's voice and audience.

Professional Voice

  • Senior tech journalist writing for software engineers
  • Analytical, technical, strategic implications
  • 800+ words with 4–6 substantive sections

Personal Voice

  • Outdoor enthusiast sharing discoveries with friends
  • First-person, conversational, enthusiastic
  • 600+ words with natural flow

Each prompt includes formatted article summaries grouped by topic, target audience specifications, valid category taxonomy, and required markdown structure with frontmatter.
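A small prompt builder can express the two personas; the wording below paraphrases the requirements listed above and is illustrative, not the production prompt:

```javascript
// Hypothetical persona-driven prompt builder — the real prompts
// are longer and more specific.
const PERSONAS = {
  professional: {
    role: 'a senior tech journalist writing for software engineers',
    style: 'analytical and technical, focused on strategic implications',
    length: '800+ words with 4-6 substantive sections',
  },
  personal: {
    role: 'an outdoor enthusiast sharing discoveries with friends',
    style: 'first-person, conversational, and enthusiastic',
    length: '600+ words with natural flow',
  },
};

function buildGenerationPrompt(blog, articlesByTopic, validCategories) {
  const p = PERSONAS[blog];
  return [
    `You are ${p.role}. Write in a style that is ${p.style}.`,
    `Target length: ${p.length}.`,
    `Use exactly one of these categories: ${validCategories.join(', ')}.`,
    'Output markdown with YAML frontmatter.',
    'Source articles, grouped by topic:',
    JSON.stringify(articlesByTopic, null, 2),
  ].join('\n\n');
}
```

Because the persona lives in data rather than in two near-duplicate prompts, both blogs share one generation node.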

Automated Git Deployment

The final stage integrates directly with GitHub. The workflow fetches the current TypeScript blog data file, parses and inserts the new post object at the beginning of the array, commits with dynamic metadata and timestamps, and preserves TypeScript structure and formatting throughout.

Generated output format:

```ts
{
  id: "ai-llm-developments-2026-02-07",
  title: "...",
  excerpt: "...",
  content: `...`,
  publishedAt: "2026-02-07",
  readTime: 6,
  category: "tech-review",
  tags: ["AI", "LLM"],
  sources: [...]
}
```
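The GitHub leg uses the standard Contents API: `GET /repos/{owner}/{repo}/contents/{path}` returns the file's base64-encoded content plus its `sha`, and a `PUT` to the same path with the new base64 content, a commit message, and that `sha` creates the commit. The pure insertion step in between might look like this, assuming the data file exports a `posts` array (the marker string is an assumption about the repo's file layout):

```javascript
// Insert a serialized post object at the head of the exported array
// in the TypeScript data file, leaving the rest of the file untouched.
function insertPost(fileContent, postSource) {
  const marker = 'export const posts = [';
  const i = fileContent.indexOf(marker);
  if (i === -1) throw new Error('posts array not found');
  const insertAt = i + marker.length;
  return (
    fileContent.slice(0, insertAt) +
    '\n  ' + postSource + ',' +
    fileContent.slice(insertAt)
  );
}
```

Treating the file as text and only touching the array head is what preserves the surrounding TypeScript structure and formatting.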

Key Learnings

Prompt Engineering is Critical

The quality of generated content directly correlates with prompt specificity. Including exact output format requirements, category taxonomy validation, audience personas, and quality criteria resulted in 3× better first-draft quality.

Multi-Stage AI Works Better Than Single-Shot

Splitting the workflow into Filter → Synthesize → Generate stages produced significantly better results than asking Claude to do everything at once.

Structured Data Output is Essential

Enforcing JSON responses with strict schemas eliminated parsing errors and enabled reliable downstream processing.
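In practice that means extracting and validating the JSON before anything downstream touches it. A minimal guard for the filtering stage's response (helper name is mine):

```javascript
// Pull the first JSON object out of a model response and verify
// the keys the rest of the pipeline depends on.
function parseFilterResponse(text) {
  const start = text.indexOf('{');
  const end = text.lastIndexOf('}');
  if (start === -1 || end <= start) throw new Error('no JSON object in response');
  const data = JSON.parse(text.slice(start, end + 1));
  for (const key of ['has_relevant_news', 'selected_articles', 'summary']) {
    if (!(key in data)) throw new Error(`missing key: ${key}`);
  }
  return data;
}
```

Failing loudly here is the point: a malformed response stops the run instead of committing a broken post.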

Voice Preservation Requires Context

Providing tone specifications, audience descriptions, and example structures helped Claude maintain consistent voice across generated posts.