Automating Social Media Posts from Drupal with n8n, Claude, and Postiz

April 02, 2026

You just hit "Publish" on a blog post you spent three hours writing. Now comes the other hour: crafting a LinkedIn post that sounds professional, a Bluesky post that fits in 300 characters, a Facebook post that won't get buried by the algorithm, and a Mastodon post that doesn't make you sound like a brand robot. Oh, and each one needs a unique tracking URL so you can actually measure what's working.

This is the tax nobody budgets for. And most automation tools "solve" it by blasting the same generic blurb everywhere — which is worse than not posting at all.

I built a pipeline that eliminates this entirely. Drupal publishes a blog post, and within seconds, platform-specific social posts — each with the right tone, character limits, and tracking URLs — are scheduled across four networks. No manual steps. No copy-paste. And here's the part I'm most proud of: Drupal drives the prompt engineering, not just the content. Adding a new social platform requires zero workflow changes.

Let me walk you through the build.

Bottom Line for Stakeholders

| Concern | Answer |
| --- | --- |
| What does it do? | Automatically creates and schedules platform-optimized social posts when a blog entry publishes. |
| Which platforms? | Facebook, LinkedIn, Mastodon, Bluesky — expandable without developer intervention. |
| Time saved per post? | 30–60 minutes of manual crafting and scheduling. |
| Cost? | Self-hosted n8n + Anthropic API usage (~$0.01–0.03 per post) + Postiz (open-source). |
| Risk? | Posts are scheduled 30 minutes after publish — enough time to catch issues before social traffic arrives. |

The Architecture at 10,000 Feet

The pipeline is a 13-node n8n workflow with two parallel branches that merge before a single API call schedules everything:

**Drupal** → ECA presave webhook → n8n → two parallel branches:
- Branch A (Content): Fetch blog data → extract HTML → convert to Markdown → get Postiz integrations → build dynamic prompt → call Claude → parse response
- Branch B (Image): Upload image to Postiz via URL 

Both branches produce exactly one item. They merge (Combine → Merge By Position), a Code node assembles the final payload, and a single POST /api/public/v1/posts call schedules posts across all platforms simultaneously.

The trigger is straightforward. The ECA module detects a presave event on Blog nodes and fires a webhook:

POST https://n8n.yourdomain.com/webhook/blog-social-post?nid=239

You could swap ECA for the Rules module or a custom `hook_entity_presave()` implementation. The webhook just needs the node ID.
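For illustration, the URL construction behind that request can be sketched in JavaScript; the path and `nid` parameter mirror the example request above, and the helper name is mine:

```javascript
// Build the webhook URL that the presave event fires at n8n.
// Path and query parameter match the example request above.
function buildWebhookUrl(baseUrl, nid) {
  const url = new URL('/webhook/blog-social-post', baseUrl);
  url.searchParams.set('nid', String(nid));
  return url.toString();
}

// For node 239:
// buildWebhookUrl('https://n8n.yourdomain.com', 239)
//   → 'https://n8n.yourdomain.com/webhook/blog-social-post?nid=239'
```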

Drupal as the Prompt Engine

This is where the architecture gets interesting. Most people think of Drupal's role in this kind of pipeline as "the thing that has the content." But in this build, Drupal is also the thing that tells the AI how to write for each platform.

The custom JSON endpoint returns a structure like this:

{
  "nid": 239,
  "title": "From Publish to Posted",
  "body": "<article>...</article>",
  "links": {
    "facebook": {
      "url": "https://example.com/blog/post?utm_source=facebook",
      "ai_attribute": "Max 63k;Helpful, visual & trusted."
    },
    "bluesky": {
      "url": "https://example.com/blog/post?utm_source=bluesky",
      "ai_attribute": "Max 300;Witty, human & niche."
    },
    "linkedin": {
      "url": "https://example.com/blog/post?utm_source=linkedin",
      "ai_attribute": "Max 3k;Professional, authoritative & insightful."
    },
    "youtube": {
      "url": "https://example.com/blog/post?utm_source=youtube"
    }
  },
  "image_url": "https://example.com/files/styles/postiz_image/public/hero.png",
  "publish_date": "2025-01-15T10:00:00Z",
  "canonical_url": "https://example.com/blog/post"
}

Notice YouTube has no ai_attribute. That's intentional — it's present for other uses but the workflow ignores it because it won't match any active Postiz integration. That's the whole extensibility model. Want to add Threads next month? Create a Postiz connection, add a threads key with its ai_attribute to the Drupal endpoint, and the workflow picks it up automatically. Zero n8n changes.

The endpoint is protected by a custom X-AI-Access-Token header validated in the route controller. Not OAuth-level security, but sufficient for a server-to-server call where both ends are under your control.
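On the n8n side, the request that consumes this endpoint only needs the shared header. A minimal sketch — the endpoint path and token are placeholders I invented for illustration; only the header name comes from the build:

```javascript
// Build the HTTP request options for fetching the blog JSON from Drupal.
// The path and token are placeholder values; the header name is the real one.
function buildBlogRequest(baseUrl, nid, token) {
  return {
    method: 'GET',
    url: `${baseUrl}/api/blog-social/${nid}`, // hypothetical endpoint path
    headers: {
      'X-AI-Access-Token': token,
      Accept: 'application/json',
    },
  };
}
```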

The Dynamic Prompt Builder and Claude

Branch A's real workhorse is the Code node that builds Claude's prompt dynamically. It pulls the list of active Postiz integrations (fetched via GET /api/public/v1/integrations), then cross-references them against the links object from Drupal:

// `blogData` is the JSON from the Drupal endpoint; `integrations` is the
// response from GET /api/public/v1/integrations.
const links = blogData.links;
const postizPlatforms = integrations.map(i => i.type);

let constraintBlock = '';
let outputFormat = '';

for (const [platform, data] of Object.entries(links)) {
  // Skip platforms without AI attributes or without active Postiz connections
  if (!data.ai_attribute || !postizPlatforms.includes(platform)) continue;

  constraintBlock += `\n${platform}: ${data.ai_attribute} (include link: ${data.url})`;
  outputFormat += `\n  "${platform}": "Your post text here"`;
}

The `prompt_limits` variable enforces straight ASCII quotes in the JSON output and mandates that character limits account for the tracking URL length. That second part matters — if Bluesky's limit is 300 characters, Claude needs to know that 60 of those are already consumed by the URL.
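A hedged sketch of how those pieces might combine into the final prompt — the instruction wording here is illustrative, not the workflow's exact text; only the constraint and output-format blocks come from the loop above:

```javascript
// Assemble the prompt text from the accumulated blocks. The structure matches
// the workflow; the exact instruction wording is my own illustration.
function buildPrompt(constraintBlock, outputFormat, markdownBody) {
  return [
    'Write one social post per platform below.',
    'Use straight ASCII quotes in the JSON. Character limits INCLUDE the tracking URL.',
    `Platform constraints:${constraintBlock}`,
    `Respond with JSON in exactly this shape:\n{${outputFormat}\n}`,
    `Blog post (Markdown):\n\n${markdownBody}`,
  ].join('\n\n');
}
```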

The assembled prompt, along with the Markdown-converted blog body, gets sent to the Anthropic API:

  • Model: claude-sonnet-4-6
  • Max tokens: 4096
  • Header: anthropic-version: 2023-06-01
  • Message: Single user message with prompt + full blog content as Markdown

The parse node extracts Claude's response using a regex:

// `rawText` is the text content of Claude's reply from the previous node.
const match = rawText.match(/```json\n([\s\S]*?)\n```/);
const parsed = match ? JSON.parse(match[1]) : JSON.parse(rawText);

The fallback to JSON.parse(rawText) handles cases where Claude returns clean JSON without a code fence. Defensive parsing — because you will get both formats.
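Both branches of that parse are easy to exercise in isolation. In this self-contained version the sample payloads are invented, and the fence marker is built from escapes so it doesn't collide with this code block:

```javascript
// Defensive extraction of Claude's JSON output: prefer a fenced json code
// block, fall back to treating the whole response as JSON.
const FENCE = '\u0060\u0060\u0060'; // three backticks
const JSON_FENCE_RE = new RegExp(FENCE + 'json\\n([\\s\\S]*?)\\n' + FENCE);

function parseClaudeJson(rawText) {
  const match = rawText.match(JSON_FENCE_RE);
  return match ? JSON.parse(match[1]) : JSON.parse(rawText);
}

// Both response shapes yield the same object:
const fenced = FENCE + 'json\n{"bluesky":"Short and witty"}\n' + FENCE;
const bare = '{"bluesky":"Short and witty"}';
```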

Here's a critical detail that cost me time: This parse node is also where I append publish_date and image_url from the $('Get blog data').first().json reference. Why here and not downstream? Because after the Merge node, you cannot reference nodes that exist in only one branch. The Merge doesn't carry ancestry from both paths — it combines outputs. If you need data from a branch-specific node after the merge, inject it before the merge.
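The injection itself is just a spread on the parsed object — sketched here with the node reference replaced by a plain variable so it runs standalone:

```javascript
// Attach branch-only fields BEFORE the Merge node, because afterwards the
// 'Get blog data' node is no longer reachable via $('...') references.
// `blogData` stands in for $('Get blog data').first().json.
function injectBranchData(parsedPosts, blogData) {
  return {
    ...parsedPosts,
    publish_date: blogData.publish_date,
    image_url: blogData.image_url,
  };
}
```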

The Gotchas That'll Cost You Hours

Three traps. All underdocumented. All discovered the hard way.

1. Postiz Upload Strips Filenames

The POST /api/public/v1/upload-from-url endpoint accepts { "url": "..." } and returns an id and path. Sounds clean. Except Postiz strips the source filename and stores files as extensionless hashes — something like .../cfd6106b6c4c3132291fe2c6333e779c. The name field in the response? Empty.

Using that returned path in the create-post endpoint produces a "successful" post with a broken image. The fix:

// Use Postiz's id, but override path with the original Drupal URL
const imagePayload = {
 id: uploadResponse.id,
 path: sourceUrl.split('?')[0]  // Strip query params from Drupal image style URL
};

Both id and path are required; omitting either field returns an HTTP 400 error.

2. Drupal's WebP Effect Creates Double Extensions

I had an image style called postiz_image with a resize effect and a WebP conversion effect. Drupal's WebP conversion appends .webp to the original extension: hero.png becomes hero.png.webp. Postiz validates file extensions from the URL path and rejects double extensions outright.

The fix is surgical: remove the WebP conversion from the postiz_image style. A plain resize/crop outputting .png or .jpg at 1200×630 works perfectly.

3. n8n Node Names Break Silently

Every $('Node Name') reference in n8n Code nodes is a string match against the node's display name. Duplicate a node? Import a workflow? n8n may append a number or subtly rename it. Your references break silently — no error, just undefined data propagating through the pipeline.

After any workflow import or node duplication, verify every $('...') reference. Ask me how I know.

Scheduling, Shortlinks, and the 30-Minute Offset

The final Code node assembles the Postiz payload. The scheduling timestamp adds a 30-minute offset to publish_date:

const scheduleDate = new Date(
 new Date(publishDate).getTime() + 30 * 60 * 1000
).toISOString();

Why 30 minutes? Honestly, it's arbitrary, but it helps with CDN and cache warming. When social traffic hits your site, you want those pages already cached and indexed, not triggering a cold Drupal bootstrap under load.
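The offset is easy to sanity-check against the publish_date from the sample payload earlier:

```javascript
// Worked example: the sample payload's publish_date plus the 30-minute offset.
const publishDate = '2025-01-15T10:00:00Z';
const scheduleDate = new Date(
  new Date(publishDate).getTime() + 30 * 60 * 1000
).toISOString();
// scheduleDate === '2025-01-15T10:30:00.000Z'
```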

The shortLink flag is set to false. Claude already embedded the tracking URLs in each post's content. If Postiz shortens them again, you'd get a shortened-shortened URL and your UTM parameters would be gone.

The posts array contains one entry per platform, mapped via a platformMap object:

const platformMap = {
 facebook: { __type: 'facebook' },
 linkedin: { __type: 'linkedin' },
 mastodon: { __type: 'mastodon' },
 bluesky:  { __type: 'bluesky' }
};

Each entry in the array:

{
 integration: { id: integrationId },
 value: [{ content: claudeOutput[platform], image: imagePayload }],
 settings: platformMap[platform]
}

One POST /api/public/v1/posts with type: 'schedule' fires it all off. Four platforms, four tailored posts, one API call.
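Taken together, the assembly of the posts array can be sketched as a pure function. The entry shapes and platformMap follow the article; the helper name is mine, and I assume each integration object carries `id` and `type` as described in the integrations call earlier:

```javascript
// Build one posts[] entry per platform that has BOTH an active Postiz
// integration and a Claude-written post.
const platformMap = {
  facebook: { __type: 'facebook' },
  linkedin: { __type: 'linkedin' },
  mastodon: { __type: 'mastodon' },
  bluesky: { __type: 'bluesky' },
};

function buildPosts(claudeOutput, integrations, imagePayload) {
  return integrations
    .filter((i) => claudeOutput[i.type] && platformMap[i.type])
    .map((i) => ({
      integration: { id: i.id },
      value: [{ content: claudeOutput[i.type], image: imagePayload }],
      settings: platformMap[i.type],
    }));
}
```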

The whole pipeline — from Drupal presave to scheduled social posts — runs in under 50 seconds. The part that took weeks was the architectural decision to make Drupal the prompt engine rather than hardcoding platform rules in n8n. That choice is what makes this system composable instead of brittle.

Is this overkill for a blog? Probably. Is it the kind of repeatable integration pattern — Drupal as content origin, n8n as orchestrator, external APIs as consumers — that scales to a 50-site enterprise? Absolutely.

Have you built something similar, or found a better way to handle the Postiz upload bug? Open an issue or share it with the community — the Odyssey is better when we travel together.


Author

Ron Ferguson
