
Integration

This guide covers how to integrate Tech Story Teller with your Rust repositories and customize output destinations.

GitHub Integration

Webhook Setup

  1. In your GitHub repository, go to Settings → Webhooks
  2. Add a new webhook with:
    • Payload URL: Your n8n webhook endpoint
    • Content type: application/json
    • Events: Select “Push” or specific events
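
Once the webhook fires, n8n receives GitHub's push-event payload, and the changed Rust files can be pulled out in a Function node. A minimal sketch, assuming the standard push-event schema (the helper name is ours, not part of n8n):

```javascript
// Extract changed .rs files from a GitHub push webhook payload.
// Uses the `commits[].added` / `commits[].modified` arrays from
// GitHub's push-event schema; deduplicates across commits.
function changedRustFiles(payload) {
  const files = new Set();
  for (const commit of payload.commits || []) {
    for (const f of [...(commit.added || []), ...(commit.modified || [])]) {
      if (f.endsWith('.rs')) files.add(f);
    }
  }
  return [...files];
}

// Example payload, truncated to the fields used above:
const example = {
  commits: [
    { added: ['src/lib.rs'], modified: ['README.md'] },
    { added: [], modified: ['src/main.rs'] }
  ]
};
console.log(changedRustFiles(example)); // [ 'src/lib.rs', 'src/main.rs' ]
```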

GitHub Action Trigger

Alternatively, trigger from GitHub Actions:
name: Analyze Rust Files
on:
  push:
    paths:
      - '**.rs'

jobs:
  analyze:
    runs-on: ubuntu-latest
    steps:
      - name: Check out repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
      # One way to compute the changed-file list the curl step below
      # references; without a step with id "changed-files", that
      # expression would be empty.
      - name: Get changed Rust files
        id: changed-files
        uses: tj-actions/changed-files@v44
        with:
          files: '**.rs'
      - name: Trigger Tech Story Teller
        run: |
          curl -X POST ${{ secrets.N8N_WEBHOOK_URL }} \
            -H "Content-Type: application/json" \
            -d '{"files": "${{ steps.changed-files.outputs.all_changed_files }}"}'

Output Destinations

Slack

Configure the Slack webhook URL in Workflow D:
{
  "url": "https://hooks.slack.com/services/YOUR_WEBHOOK",
  "body": {
    "text": "Rust Morning Report:\n\n{{$json[\"report\"]}}"
  }
}
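
Slack treats &, <, and > as control characters in message text, so escape them before interpolating a report that may contain Rust generics like Vec<u8> (per Slack's text-formatting docs). A small helper sketch (the function name is ours):

```javascript
// Escape the three characters Slack reserves for control sequences
// before placing report text into the message payload.
function slackEscape(text) {
  return text
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
}

console.log(slackEscape('Vec<u8> & friends')); // Vec&lt;u8&gt; &amp; friends
```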

Notion

Replace the Slack output with a Notion integration:
  1. Create a Notion integration at developers.notion.com
  2. Share a database with the integration
  3. Use the Notion node in n8n to create pages
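
For reference, this is roughly the request body Notion's "create a page" endpoint (POST /v1/pages) expects when the parent is a database. The property name "Name" assumes the database's default title column, and the helper itself is our sketch, not part of n8n or the Notion SDK:

```javascript
// Build the body for Notion's create-page endpoint: the page lands
// in the shared database, with the report text as a paragraph block.
function notionPageBody(databaseId, title, report) {
  return {
    parent: { database_id: databaseId },
    properties: {
      // "Name" must match your database's title property.
      Name: { title: [{ text: { content: title } }] }
    },
    children: [{
      object: 'block',
      type: 'paragraph',
      paragraph: { rich_text: [{ type: 'text', text: { content: report } }] }
    }]
  };
}
```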

GitHub PR Comments

Generate PR comments instead of Slack messages:
- name: Comment on PR
  uses: actions/github-script@v6
  env:
    REPORT: ${{ steps.report.outputs.content }}
  with:
    script: |
      github.rest.issues.createComment({
        owner: context.repo.owner,
        repo: context.repo.repo,
        issue_number: context.issue.number,
        // Pass the report through an env var so quotes and newlines
        // in the generated content don't break the inline script.
        body: process.env.REPORT
      })

Model Configuration

Supported Models

  • OpenAI: gpt-4o, gpt-4o-mini
  • Anthropic: Claude (via API)
  • AWS Bedrock: Claude, Titan
  • Local Models: Via OpenAI-compatible API

Switching Models

Update the model configuration in each workflow’s LLM node:
{
  "model": "gpt-4o",
  "temperature": 0.3
}
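
Those two fields map directly onto a chat-completions request, which is why switching between OpenAI and an OpenAI-compatible local server is purely a configuration change. A sketch, where the helper and the base URL (a common local-server default) are our assumptions:

```javascript
// Build a request for an OpenAI-compatible chat-completions endpoint.
// Swapping models only changes `model` (and, for local servers, the
// base URL); the payload shape stays the same.
const BASE_URL = 'http://localhost:11434/v1'; // e.g. a local OpenAI-compatible server

function chatCompletionRequest(model, prompt, temperature = 0.3) {
  return {
    url: `${BASE_URL}/chat/completions`,
    body: {
      model,
      temperature,
      messages: [{ role: 'user', content: prompt }]
    }
  };
}
```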

Customization

File Filters

Modify Workflow A to filter specific files:
// Only process src/ files
return items.filter(item => 
  item.json.file.startsWith('src/')
);
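
A slightly fuller filter in the same Function-node style: keep only Rust sources under src/, skip anything in a tests/ directory, and cap the batch size. The cap and the path conventions are our assumptions, not requirements of Workflow A:

```javascript
// Filter n8n items to Rust sources under src/, excluding test
// directories, and limit how many files go to the LLM per run.
function filterRustItems(items, maxFiles = 20) {
  return items
    .filter(item => {
      const f = item.json.file;
      return f.startsWith('src/') && f.endsWith('.rs') && !f.includes('tests/');
    })
    .slice(0, maxFiles);
}
```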

Report Templates

Customize the final report structure in Workflow D’s LLM prompt to match your team’s documentation standards.
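
One way to structure that prompt is to assemble it from fixed section headings plus the per-file summaries. A sketch, where the section names are examples to adapt, not part of the workflow:

```javascript
// Assemble a structured report prompt for Workflow D's LLM node from
// per-file summaries; the headings are placeholders for your team's
// documentation standards.
function reportPrompt(summaries) {
  return [
    'Write a morning report for the Rust team using these sections:',
    '## Highlights',
    '## Risky Changes',
    '## Follow-ups',
    '',
    'File summaries:',
    ...summaries.map(s => `- ${s.file}: ${s.summary}`)
  ].join('\n');
}
```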