United States
Web (n8n Automation)
AI Content Generation & Publishing
n8n + GPT-4o-mini + LinkedIn API
Live & Active
Overview
Automated Open Source Content Discovery, AI Writing & LinkedIn Publishing
CueBytes needed a hands-off system to consistently publish high-quality LinkedIn content about trending open source projects. The goal: build thought leadership in the developer community without spending hours every day researching projects, writing posts, and manually publishing.
We built a 4-phase automated pipeline running on n8n that discovers trending GitHub repositories daily, generates polished LinkedIn posts using GPT-4o-mini, and publishes them on a set schedule — all without human intervention.
The pipeline has been running in production 24/7, publishing one curated open source project to LinkedIn every day at 8 AM UTC.
Tech Stack
The Problem
What Manual Content Publishing Looked Like
Manually browsing GitHub Trending every day to find interesting repositories
Reading through READMEs and documentation to understand each project
Writing a LinkedIn post from scratch with the right tone and structure
Logging into LinkedIn, formatting the post, and publishing at the optimal time
No tracking of what had already been posted, creating a risk of duplicate content
No way to maintain a consistent posting cadence during busy weeks
Architecture
4 Interconnected n8n Workflows
Each phase runs on its own schedule with staggered timing. One project flows through the entire pipeline per day — from discovery to published LinkedIn post.
Scheduling Strategy
| Time (UTC) | Workflow | Action |
|---|---|---|
| 2:00 AM | Phase 1 | Scrape GitHub, insert new projects |
| 6:00 AM | Phase 2 | Generate LinkedIn content for 1 pending project |
| 8:00 AM | Phase 3 | Publish 1 queued post to LinkedIn |
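The staggered timing above maps to three ordinary cron expressions, as they might appear in n8n Schedule Trigger nodes. This is an illustrative sketch: the cron strings encode the documented times, and the helper just verifies the multi-hour gap between phases.

```javascript
// Illustrative sketch: the three daily schedules as standard cron strings.
const schedules = [
  { phase: "Phase 1: GitHub scraper",   cron: "0 2 * * *" }, // 02:00 UTC
  { phase: "Phase 2: AI content",       cron: "0 6 * * *" }, // 06:00 UTC
  { phase: "Phase 3: LinkedIn publish", cron: "0 8 * * *" }, // 08:00 UTC
];

// Sanity check: each phase fires strictly after the previous one,
// leaving a multi-hour gap for the upstream step to finish.
function hoursBetween(cronA, cronB) {
  const hour = (c) => Number(c.split(" ")[1]); // second cron field is the hour
  return hour(cronB) - hour(cronA);
}
```

The 4-hour and 2-hour gaps are generous on purpose: each workflow handles a single project per day, so there is no risk of the next phase starting before the previous one finishes.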
GitHub Trending Scraper
Trigger: Daily at 2:00 AM UTC
- GitHub API search for repos with 100+ stars, sorted by popularity
- Extract project name, URL, stars, forks, language, and description
- Filter by minimum stars and recency (updated within 365 days)
- Deduplicate against existing projects in PostgreSQL
- Fetch full README content via GitHub API for AI context
- Insert new projects with status=pending
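The discovery step above boils down to one call against GitHub's public repository search endpoint. A minimal sketch of the URL construction, assuming the documented thresholds (100 stars, 365-day recency); the `stars:` and `pushed:` qualifiers are part of the real GitHub search syntax, while the helper name and page size are illustrative.

```javascript
// Sketch of the Phase 1 search call. Thresholds come from the pipeline
// config described above; per_page is an assumed value.
function buildSearchUrl(minStars, maxAgeDays, now = new Date()) {
  const cutoff = new Date(now.getTime() - maxAgeDays * 24 * 60 * 60 * 1000);
  const pushed = cutoff.toISOString().slice(0, 10); // YYYY-MM-DD
  const q = `stars:>=${minStars} pushed:>=${pushed}`;
  return (
    "https://api.github.com/search/repositories" +
    `?q=${encodeURIComponent(q)}&sort=stars&order=desc&per_page=25`
  );
}
```

Sorting by stars descending means the highest-profile new repositories land at the top of the result set before deduplication against PostgreSQL.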
AI Content Generation
Trigger: Daily at 6:00 AM UTC
- Select highest-starred pending project from PostgreSQL
- Decode base64 README and build structured prompt
- GPT-4o-mini generates LinkedIn post with developer-audience tone
- Content escaped for SQL safety and stored on the project record
- Status transitions from pending to queued
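The prompt-assembly step can be sketched as follows. The base64 decoding mirrors how the GitHub contents API returns README bodies; the field names on the `project` record, the excerpt length, and the prompt wording are all assumptions, not the production system prompt.

```javascript
// Illustrative sketch of Phase 2's prompt assembly (field names assumed).
function buildPrompt(project) {
  // GitHub's contents API delivers the README base64-encoded.
  const readme = Buffer.from(project.readme_base64, "base64").toString("utf8");
  return [
    "You write LinkedIn posts for a developer audience.",
    `Project: ${project.name} (${project.stars} stars, ${project.language})`,
    `Repo: ${project.url}`,
    "README excerpt:",
    readme.slice(0, 4000), // keep the prompt within the model's context budget
    "Write a concise, engaging LinkedIn post about this project.",
  ].join("\n\n");
}
```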
LinkedIn Publishing
Trigger: Daily at 8:00 AM UTC
- Select oldest queued project (FIFO order)
- Publish via n8n LinkedIn node with OAuth2 credential
- Extract LinkedIn post URN from API response
- Update project status to published with post ID
- Log to posting_history table for audit trail
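The bookkeeping after a successful publish can be sketched like this. The response shape (an `id` field holding the post URN) and the table and column names are assumptions standing in for the real n8n node output and schema.

```javascript
// Hypothetical sketch of Phase 3's post-publish bookkeeping.
function recordPublish(response, projectId) {
  const urn = response.id; // e.g. "urn:li:share:7123456789" (assumed shape)
  if (!urn || !urn.startsWith("urn:li:")) {
    throw new Error("LinkedIn response did not contain a post URN");
  }
  return {
    // Values for the UPDATE on the projects table.
    updateProject: { id: projectId, status: "published", linkedin_post_id: urn },
    // Row for the posting_history audit table.
    historyRow: { project_id: projectId, platform: "linkedin", post_id: urn, success: true },
  };
}
```

Failing loudly when the URN is missing lets the error handler catch a silent publish failure instead of marking the project published with no traceable post ID.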
Error Handler
Trigger: On any workflow failure
- Parse error details: workflow name, node, message, stack trace
- Classify severity: critical, high, medium, low
- Log to workflow_errors or error_logs table
- Send formatted Discord webhook alert
- Mark failed projects with status=error to prevent retries
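The severity classification step might look like the sketch below. The keyword rules are illustrative assumptions; the production workflow's mapping may differ, but the shape — inspect workflow name and message, return one of four levels — is what the list above describes.

```javascript
// Sketch of severity classification (rules are assumed, not the real mapping).
function classifySeverity(error) {
  const msg = (error.message || "").toLowerCase();
  // Publishing or credential failures block the daily post entirely.
  if (error.workflow === "Phase 3" || msg.includes("auth")) return "critical";
  if (msg.includes("rate limit") || msg.includes("timeout")) return "high";
  // Discovery failures self-heal on the next nightly run.
  if (error.workflow === "Phase 1") return "medium";
  return "low";
}
```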
What We Built
8 Core Capabilities
GitHub Trending Discovery
Automated GitHub API search for repositories with 100+ stars, sorted by popularity. Fetches full README content for deep AI context. Runs nightly at 2 AM UTC.
AI Content Generation
GPT-4o-mini generates polished LinkedIn posts from project data and README content. Carefully tuned system prompt ensures developer-audience tone and SQL-safe formatting.
LinkedIn Auto-Publishing
Native n8n LinkedIn node with OAuth2 credential publishes one curated post daily at 8 AM UTC. Post URN tracked for analytics.
Full Status Lifecycle
Every project flows through pending, queued, published, or error states. PostgreSQL tracks the entire journey with timestamps at each transition.
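The lifecycle above is a small state machine: pending → queued → published, with error reachable from any non-terminal state. A minimal sketch of the allowed transitions (the guard function is illustrative):

```javascript
// Allowed status transitions, as described in the lifecycle above.
const TRANSITIONS = {
  pending: ["queued", "error"],
  queued: ["published", "error"],
  published: [], // terminal
  error: [],     // terminal: failed projects are not retried
};

function canTransition(from, to) {
  return (TRANSITIONS[from] || []).includes(to);
}
```

Because published and error are terminal, a project can never be picked up twice, which is what eliminates the duplicate-content risk of the manual process.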
Dual-Layer Error Handling
Workflow-level and project-level error handlers with severity classification. Critical failures trigger @everyone Discord alerts. Bad projects are isolated so the pipeline continues.
Complete Audit Trail
Posting history table records every published post with platform, post ID, content, and success status. Full traceability from discovery to publication.
Staggered Schedule Design
Three phases run at 2 AM, 6 AM, and 8 AM with 2-4 hour gaps. Ensures each step completes before the next begins. One project per day for quality control.
Competitor Content Scraping
Companion workflows scrape RSS feeds from developer content sites. Human-in-the-loop approval gate ensures quality before publishing competitor-sourced ideas.
Data Flow
Project Status Lifecycle
Backed by three PostgreSQL tables: projects, posting_history, and workflow_errors.
Outcome
What's Running in Production
Fully automated daily LinkedIn publishing with zero manual intervention
GitHub trending discovery running every night at 2 AM
AI-generated posts tailored for developer audience
One high-quality post published daily at 8 AM UTC
Complete status tracking: pending, queued, published, error
Full posting history audit trail with LinkedIn post URNs
Dual-layer error handling with Discord alerting
Companion competitor scraping with human-approval gate
Need an automated content pipeline for your brand?