Developer Docs

Launch live integrations first. Go deeper only when you need to.

Start with one-click setup across Google Ads, Meta Ads, TikTok Ads, LinkedIn Ads, Microsoft Ads, Snapchat Ads, and Pinterest Ads, plus Shopify, Reddit Ads, BigQuery, Google Sheets, n8n, Slack, and generic webhooks. Then use advanced custom setup, SDKs, and infrastructure guides when your team wants tighter execution control.

  • 7 live ad platforms: released paid social and search connectors teams can launch now.
  • 7 routing + destination tools: Shopify, Reddit Ads, BigQuery, Google Sheets, n8n, Slack, and webhooks.
  • 2 setup paths: managed one-click setup, or advanced custom setup in your own stack.

One-click setup

Best when your team wants the fastest launch in the TrendsAGI UI with managed connections and preview-first workflows.

Advanced custom setup

Best when engineering wants provider writes, worker execution, and automation logic to stay inside your own infrastructure.

View live integration docs

Start Here

Welcome to TrendsAGI docs. Start with the setup path that matches your team: one-click integrations for the fastest launch, or advanced custom setup when you want execution to stay inside your own infrastructure.

Pick The Rollout Path That Fits

Use one-click setup when you want the fastest path in the UI. Use advanced custom setup when you want provider writes and worker execution to stay in your own infrastructure.

1. API Authentication

Your API key identifies your workspace and manages its rate limits.

  1. Navigate to your Profile Page.
  2. Select the "API Keys" tab.
  3. Generate a key for your integration or automation workflow.
Security Best Practice

Treat this key like an application secret. Store it in environment variables (for example, TRENDSAGI_API_KEY), not in source code.
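Following that practice, a minimal sketch of loading the key at startup, so a misconfigured environment fails fast with a clear message instead of sending unauthenticated requests (the helper name is illustrative):

```python
import os

def load_api_key() -> str:
    """Read the workspace API key from the environment, failing fast if unset."""
    key = os.getenv("TRENDSAGI_API_KEY")
    if not key:
        raise RuntimeError("TRENDSAGI_API_KEY is not set; export it before running.")
    return key
```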

2. Initialize the Client

The Python client is the fastest way to inspect live signals, prototype routing, or test an integration flow.

pip install trendsagi

3. Pull Your First Live Signals

Start with one category and verify the payload shape first. Once the signal looks right, wire it into your workflow.

python: agent_init.py
from trendsagi import TrendsAGIClient, APIError
import os

# Initialize the signal layer
client = TrendsAGIClient(api_key=os.getenv("TRENDSAGI_API_KEY"))

try:
    # Check for high-velocity signals in the Technology category
    signals = client.get_trends(limit=5, category="Technology", sort_by="velocity")
    
    print(f"Signal Layer Initialized. Detected {len(signals.trends)} active signals.")
    for signal in signals.trends:
        print(f" - Signal Detected: {signal.name} (Volume: {signal.volume})")
        
except APIError as e:
    print(f"Signal Fetch Failed ({e.status_code}): {e.error_detail}")

Ads Activation


One-click ad integrations for paid social and search execution. Connect Google Ads, Meta Ads, TikTok Ads, LinkedIn Ads, Microsoft Ads, Snapchat Ads, and Pinterest Ads in one click, then optionally move execution into your own infrastructure. TrendsAGI turns live demand shifts into audience and keyword refreshes with preview/apply guardrails.

Use Case 1: Audience And Keyword Refreshes

Your workflow can refresh campaign settings using live themes and trend categories instead of static targeting.

  1. Discover: Select a category or trend from the Ads mapping flow.
  2. Analyze: TrendsAGI resolves the latest positive trend and extracts audience keywords.
  3. Execute: Preview or apply guardrailed targeting updates to mapped campaign entities.
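The preview/apply guardrail in step 3 can be sketched locally. This is a minimal illustration of the pattern, not the actual TrendsAGI connector schema; `build_targeting_update` and the payload fields are assumptions.

```python
def build_targeting_update(campaign_id: str, keywords: list, dry_run: bool = True) -> dict:
    """Assemble a guardrailed targeting update for one mapped campaign (illustrative shape)."""
    return {
        "campaign_id": campaign_id,
        "add_keywords": sorted(set(keywords)),  # de-duplicate before sending
        "mode": "preview" if dry_run else "apply",
    }

# Preview first; inspect the proposed changes before switching dry_run off.
update = build_targeting_update("cmp_001", ["ai agents", "ai agents", "llm tooling"])
print(update["mode"], update["add_keywords"])
```

Defaulting to preview mode mirrors the preview-first workflow: nothing is applied until a human (or a separate approval step) flips `dry_run` off.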
Two Rollout Paths

One-click setup: Managed account connection and token handling in TrendsAGI.

Advanced custom setup: Keep provider credentials and execution workers fully in your own infrastructure.

Use Case 2: Context-Aware Copy Support

Prevent generic ad copy. Inject "Sentiment Summary" and "Key Themes" into your prompt so the output aligns with the current public mood.

Python: Context Injection For Ad Workflow
# Example: preparing prompt context for an ad copy workflow
trend_id = 123  # example trend ID
insights = client.get_ai_insights(trend_id=trend_id)

# Constructing the System Prompt Context
# Note: These are retrieved from the cache. New insights must be triggered via the dashboard.
system_context = {
    "topic": insights.trend_name,
    "current_public_mood": insights.sentiment_summary,
    "marketing_hook": insights.suggested_marketing_angle,
    "taboo_subjects": insights.negative_themes # Guardrails
}

prompt = f"""
You are an expert copywriter. 
Topic: {system_context['topic']}
Mood: {system_context['current_public_mood']}
Hook: {system_context['marketing_hook']}

Task: Write 3 Facebook Ad headlines that align with this mood.
"""

# llm.predict(prompt)

Destinations + Routing


Deploy destination and routing workflows for the released connector surface. The integrations tab now centers on BigQuery, Google Sheets, n8n, Slack, and webhook delivery patterns instead of generic channel recipes.

Use Case 1: BigQuery + Google Sheets destination sync

  1. Authorize Google once for BigQuery or Google Sheets.
  2. Discover projects, datasets, tables, spreadsheets, and tabs from the connector routes.
  3. Save the destination selection that should receive released delivery rows.
Destination Connector Order
1) GET /api/integrations/bigquery/status
2) GET /api/integrations/bigquery/projects
3) GET /api/integrations/bigquery/datasets
4) GET /api/integrations/bigquery/tables
5) PUT /api/integrations/bigquery/selection

or

1) GET /api/integrations/google-sheets/status
2) GET /api/integrations/google-sheets/spreadsheets
3) GET /api/integrations/google-sheets/sheets
4) PUT /api/integrations/google-sheets/selection
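The BigQuery walk above can be sketched with an in-memory stub standing in for the HTTP calls. The route paths mirror the order listed; the response shapes are assumptions, so check them against the live API.

```python
# Stubbed responses keyed by route path (shapes are illustrative)
FAKE_API = {
    "/api/integrations/bigquery/status":   {"connected": True},
    "/api/integrations/bigquery/projects": ["analytics-prod"],
    "/api/integrations/bigquery/datasets": ["trends"],
    "/api/integrations/bigquery/tables":   ["daily_signals"],
}

def api_get(path: str):
    """Stand-in for an authenticated GET against the connector routes."""
    return FAKE_API[path]

selection = None
if api_get("/api/integrations/bigquery/status")["connected"]:
    selection = {
        "project": api_get("/api/integrations/bigquery/projects")[0],
        "dataset": api_get("/api/integrations/bigquery/datasets")[0],
        "table":   api_get("/api/integrations/bigquery/tables")[0],
    }
    # PUT /api/integrations/bigquery/selection would receive `selection`
    print(selection)
```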

Use Case 2: Slack, n8n, and webhook routing bridge

Connect the managed destination when it exists, then fall back to webhooks only when you need a custom POST target.

Routing Connector Order
1) GET /api/integrations/slack/auth
2) POST /api/integrations/slack/save-channel
3) POST /api/integrations/n8n/connect
4) PUT /api/integrations/n8n/selection
5) POST /api/integrations/n8n/push
6) POST /api/integrations/webhooks
7) POST /api/integrations/webhooks/{id}/test
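The managed-first, webhook-fallback decision can be expressed as a small routing helper. This is a sketch; the function and its return shape are illustrative, not part of the SDK.

```python
from typing import Optional

def pick_route(managed: set, requested: str, custom_url: Optional[str] = None) -> dict:
    """Prefer a managed connector; fall back to a raw webhook POST target only when needed."""
    if requested in managed:
        return {"route": requested, "endpoint": f"/api/integrations/{requested}"}
    if custom_url:
        return {"route": "webhook", "endpoint": "/api/integrations/webhooks", "url": custom_url}
    raise ValueError(f"No managed connector for {requested!r} and no webhook URL given")

MANAGED = {"slack", "n8n"}
route_a = pick_route(MANAGED, "slack")                                   # managed destination
route_b = pick_route(MANAGED, "discord", custom_url="https://example.com/hook")  # webhook fallback
print(route_a["route"], route_b["route"])
```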

Content Workflows

Run content workflows that respond to rising topics and search demand without relying on stale research.

Use Case 1: Newsjacking Workflow

Create a workflow that monitors your niche for breakout velocity. When a topic crosses a specific threshold, it can draft a blog post outline automatically.

  1. Poll get_trends sorted by growth.
  2. If growth > 200%, trigger the content workflow.
  3. Use ai_insights.suggested_content_angle as the H1 title.
  4. Use ai_insights.key_themes to generate H2 headers.
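The threshold-and-draft steps above can be sketched as two pure functions. The dict keys mirror the field names mentioned in the steps, but the shapes here are assumptions for illustration.

```python
def should_trigger(trend: dict, growth_threshold: float = 200.0) -> bool:
    """Step 2: fire only when growth crosses the breakout threshold."""
    return trend.get("growth_pct", 0.0) > growth_threshold

def draft_outline(insights: dict) -> dict:
    """Steps 3-4: map the suggested angle to the H1 and key themes to H2s."""
    return {
        "h1": insights["suggested_content_angle"],
        "h2": list(insights["key_themes"]),
    }

trend = {"name": "on-device LLMs", "growth_pct": 340.0}
outline = None
if should_trigger(trend):
    outline = draft_outline({
        "suggested_content_angle": "Why On-Device LLMs Are Surging",
        "key_themes": ["latency", "privacy", "cost"],
    })
    print(outline["h1"])
```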

Use Case 2: Grounded Content Generation

LLMs often make up facts when writing about new topics. Use our API to inject a fact sheet into the context window so the workflow has grounded truth to work from.

Python: Grounding The Workflow
# Fetch grounded, cached context before generation
trends_response = client.get_trends(search="sustainable fashion", limit=1)
trend = trends_response.trends[0]
insights = client.get_ai_insights(trend_id=trend.id)

print("--- Fact Sheet For Workflow ---")
print(f"Context: {insights.trend_name}")
print(f"Verified Themes: {', '.join(insights.key_themes)}")
print(f"Audience Sentiment: {insights.sentiment_category}")

# Use 'Verified Themes' as a checklist for content sections.

Commerce Signals

Power Shopify storefront workflows that push released trend context into product metafields so your storefront or internal automation can react without manual copy-and-paste.

Use Case: Shopify Trend Context Sync

Map a live trend category to one Shopify product and let the connector enrich that product with structured trend context when the rule matches.

Implementation Logic

The live connector path is: Trend signal → selection rule → Shopify metafield destination.

Shopify Connector Order
1) GET /api/integrations/shopify/auth?shop=example-store.myshopify.com
2) Complete authorization in browser
3) GET /api/integrations/shopify/products
4) PUT /api/integrations/shopify/selection
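Once the selection is saved, the connector enriches the mapped product with structured trend context. A sketch of what that metafield payload might look like follows; the `trendsagi` namespace, key, and value fields are assumptions, not the connector's documented schema.

```python
def trend_metafield(trend: dict) -> dict:
    """Shape trend context as a Shopify-style metafield payload (illustrative namespace/key)."""
    return {
        "namespace": "trendsagi",
        "key": "trend_context",
        "type": "json",
        "value": {
            "trend": trend["name"],
            "category": trend["category"],
            "sentiment": trend["sentiment"],
        },
    }

field = trend_metafield({"name": "quiet luxury", "category": "Fashion", "sentiment": "positive"})
print(field["namespace"], field["value"]["trend"])
```

Keeping the value as structured JSON lets storefront themes or internal automation read individual fields instead of parsing free text.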

Low-Code Orchestration


Orchestrate low-code delivery workflows with the dedicated n8n connector. Store one workspace, choose a workflow, and push released TrendsAGI payloads without starting from generic webhooks.

Workflow: Dedicated n8n Connector Handoff

  1. Connect: Store the n8n API key and base URL with POST /api/integrations/n8n/connect.
  2. Discover: Fetch workflows from GET /api/integrations/n8n/workflows.
  3. Select: Save the workflow or webhook destination with PUT /api/integrations/n8n/selection.
  4. Push: Deliver released payloads with POST /api/integrations/n8n/push.
No Code Required

This keeps n8n as the orchestration layer while TrendsAGI manages connector state, selection defaults, and delivery metadata.
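The connect → discover → select → push handoff can be sketched with in-memory stand-ins for the HTTP calls; real traffic goes through the `/api/integrations/n8n` routes listed above, and the payload shapes here are assumptions.

```python
def connect(api_key: str, base_url: str) -> dict:
    """Stand-in for POST /api/integrations/n8n/connect."""
    return {"connected": bool(api_key and base_url), "base_url": base_url}

def select_workflow(workflows: list, name: str) -> dict:
    """Pick the workflow whose selection would be saved via PUT .../selection."""
    return next(w for w in workflows if w["name"] == name)

state = connect("n8n_key_example", "https://n8n.example.com")
workflows = [{"id": "wf_1", "name": "trend-router"}, {"id": "wf_2", "name": "digest"}]
selection = select_workflow(workflows, "trend-router")

# POST /api/integrations/n8n/push would deliver something like this
push_payload = {"workflow_id": selection["id"], "event": "trend_detected"}
print(push_payload)
```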

Advanced Custom Setup (Vertex)


Build advanced custom tools on Google Cloud Vertex AI. Create custom tools for LangChain or AutoGPT workflows so they can query live trends instead of guessing.

Use Case: Market Analyst Tool

Define a custom tool function. When a user asks what is happening in tech right now, the workflow can call this tool instead of hallucinating an answer.

Python: LangChain Tool Definition
from langchain.agents import tool
from trendsagi import TrendsAGIClient

# Initialize the Context Layer
trends_client = TrendsAGIClient(api_key="YOUR_KEY")

@tool
def check_market_signals(sector: str) -> str:
    """
    Useful for when you need to know what is currently trending or 
    popular in a specific market sector. Returns real-time data.
    """
    try:
        response = trends_client.get_trends(category=sector, limit=5)
        return str([t.name for t in response.trends])
    except Exception:
        return "Data unavailable."

# The LLM now has a "sense" of the market.
# agent_chain = initialize_agent(tools=[check_market_signals], ...)

Knowledge + Context

Your workflows need grounded context. Instead of stuffing every prompt with generic text, use the Context Intelligence Suite to create structured knowledge bases. Upload product reference guides, PDF specs, or style guides, and let the API handle retrieval.

Use Case: Support Knowledge Base

Create a dedicated knowledge base for customer support. Upload your latest policy documents so responses stay aligned with the most up-to-date information.

Python: Managing Workflow Knowledge
# 1. Create a dedicated knowledge base
project = client.create_context_project(name="Support Bot Knowledge")

# 2. Upload the official policy document (PDF/Text/Image)
client.upload_context_file(
    project_id=project.id,
    file_path="./refund_policy_2025.pdf",
    item_type="reference_doc"
)

# 3. Runtime: retrieve context relevant to user query
user_query = "Can I get a refund after 30 days?"
relevant_items = client.query_context(
    project_id=project.id,
    search=user_query
)

# 4. Inject into your LLM or support workflow
context_block = "\n".join([item.content for item in relevant_items])
# response = llm.predict(f"Context: {context_block}\nQuestion: {user_query}")

Webhook Actions


The Webhooks Reactor is a universal integration layer. If a platform can receive a POST request, it can be integrated with TrendsAGI. This allows you to trigger workflows, serverless functions, or chat notifications instantly when a trend is detected.

Universal Compatibility

We send a standardized JSON payload to your configured URL. This makes our service compatible with virtually any modern stack or SaaS tool.
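Before acting on a payload, verify its signature. The Node example below checks an `x-webhook-signature` header; here is the same idea as a Python sketch, assuming an HMAC-SHA256 hex-digest scheme (confirm the actual algorithm and header format in your webhook settings).

```python
import hashlib
import hmac

def verify_signature(body: bytes, signature: str, secret: str) -> bool:
    """Constant-time HMAC-SHA256 check (hex-digest scheme is an assumption)."""
    expected = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

secret = "whsec_example"
body = b'{"trend_name": "solar storms", "velocity": 97}'
good_sig = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()

print(verify_signature(body, good_sig, secret))  # True
print(verify_signature(body, "tampered", secret))  # False
```

`hmac.compare_digest` avoids timing side channels that a plain `==` comparison would leak.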

Supported Platforms (Extensive List)

Since we use standard Webhooks, you can integrate with:

  • Serverless: AWS Lambda, Google Cloud Functions, Azure Functions, Vercel, Netlify, Cloudflare Workers
  • Automation: n8n, IFTTT, Tray.io, and custom webhook orchestrators
  • Communication: Slack, generic webhooks, Microsoft Teams, Telegram, WhatsApp Business API, Twilio, SendGrid
  • DevOps: GitHub, GitLab, Bitbucket, PagerDuty, Datadog, New Relic, Splunk, Jira, Linear
  • E-commerce: Shopify, WooCommerce, Magento, BigCommerce, Stripe, PayPal, Square
  • CRM/Sales: Salesforce, HubSpot, Zoho, Pipedrive, Zendesk, Intercom
  • Data/Productivity: Google Sheets, Airtable, Notion, Trello, Asana, Monday.com, ClickUp
  • CMS: WordPress, Webflow, Contentful, Strapi, Ghost, Drupal
  • Infrastructure: Heroku, DigitalOcean, Render, Railway, Fly.io, Supabase, Firebase

Example: Cloud Function Reactor

Trigger a Google Cloud Function to run a complex sentiment analysis job whenever a new trend hits a velocity threshold.

Node.js: Cloud Function Receiver
exports.trendsReactor = async (req, res) => {
    // 1. Verify Signature (Security)
    const signature = req.headers['x-webhook-signature'];
    if (!verifySignature(req.body, signature)) {
        return res.status(401).send('Unauthorized');
    }

    // 2. Parse Payload
    const { trend_name, velocity, sentiment } = req.body;

    // 3. Execute Business Logic
    console.log(`Reacting to trend: ${trend_name} (Velocity: ${velocity})`);
    await triggerMarketingCampaign(trend_name);

    res.status(200).send('Reactor Executed');
};

Real-Time Event Triggers

For high-frequency workflows, polling REST APIs is too slow. Our WebSocket API pushes data the moment it is detected, allowing for sub-second reaction times.

Plan Limits

Real-time event streams are available on all plans, with throughput and concurrent stream limits scaling by tier (Scale includes priority throughput).

Stream 1: Industry Events (/ws/industry-live)

Your workflow receives a push payload immediately when a regulatory update is released or an economic indicator is published.

Python: Async Event Listener
import asyncio
import json
import os

import websockets

API_KEY = os.getenv("TRENDSAGI_API_KEY")
URI = f"wss://api.trendsagi.com/ws/industry-live?token={API_KEY}"

async def run_listener():
    async with websockets.connect(URI) as websocket:
        print("--> Listener connected")
        while True:
            message = await websocket.recv()
            data = json.loads(message)
            
            # Immediate reaction: route high-velocity alerts to your own handler
            if data['type'] == 'trend_velocity_alert' and data['payload']['velocity_bucket'] == 'high':
                await trigger_response_protocol(data['payload'])  # your coroutine

# asyncio.run(run_listener())

Stream 2: Trend Velocity (/ws/trends-live)

Wake up creative workflows the moment a topic goes viral. Filter by specific keywords to create specialized monitoring streams.

wss://api.trendsagi.com/ws/trends-live?token=KEY&trends=AI,Technology
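Building that filtered URL programmatically keeps keyword lists out of string literals. A minimal sketch (the helper name is illustrative):

```python
def trends_live_url(token: str, trends: list) -> str:
    """Compose a keyword-filtered trends-live WebSocket URL."""
    return f"wss://api.trendsagi.com/ws/trends-live?token={token}&trends={','.join(trends)}"

url = trends_live_url("KEY", ["AI", "Technology"])
print(url)
```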

Exports + Model Inputs


Long-term memory and model training: load our daily data dumps into your data lake to build historical datasets. This data is valuable for fine-tuning your own models (LoRA/QLoRA) to understand market dynamics specific to your industry.

Data Pipeline Integration

  1. Configure your S3/GCS bucket in the Export Dashboard.
  2. We push .parquet or .csv files daily.
  3. Ingest these files into your vector database (for example, Pinecone or Milvus) to expand your workflow's long-term context.
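Step 3 can be sketched with the `.csv` variant of a daily dump parsed into records ready for a vector-DB upsert. The column names here are assumptions about the export schema; check them against your actual files.

```python
import csv
import io

# Stand-in for one daily .csv dump (column names are illustrative)
daily_dump = io.StringIO(
    "trend_name,category,velocity\n"
    "solid-state batteries,Technology,88\n"
    "regenerative farming,Food,61\n"
)

# One record per row: a stable id, the text to embed, and numeric metadata
records = [
    {
        "id": f"{row['category']}:{row['trend_name']}",
        "text": row["trend_name"],
        "velocity": int(row["velocity"]),
    }
    for row in csv.DictReader(daily_dump)
]
print(len(records), records[0]["id"])
```

For the `.parquet` variant, a reader such as pyarrow or pandas would replace the `csv` module; the record shape stays the same.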

Full API Reference

This guide focused on launch patterns and workflow design. For raw technical specifications, endpoint schemas, and parameter definitions, consult the API Reference.

Technical Specs

The API Reference contains the Swagger/OpenAPI specifications needed for generating client libraries or integrating with strict-schema tools.