N8N Integration
Step-by-step guide to using the Tuberalytics API with N8N's HTTP Request node. Works with both N8N Cloud and self-hosted instances.
Prerequisites
- A Tuberalytics account with an API key (generate one at API Keys)
- An N8N instance (cloud or self-hosted)
Creating Reusable Credentials
Set up Header Auth credentials so you don't have to configure authentication on every node.
- Go to Settings > Credentials in N8N
- Click Add Credential and search for Header Auth
- Configure:
  - Name: Tuberalytics API
  - Header Name: Authorization
  - Header Value: Bearer sk_live_your_api_key
- Click Save
Now select this credential in any HTTP Request node's Authentication dropdown.
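Outside N8N, the credential reduces to a single request header. Here is a minimal JavaScript sketch of the equivalent call, using Node 18+'s built-in fetch; the search endpoint is the one from Example 1 below, and the key is a placeholder:

```javascript
// The header the Header Auth credential attaches to every request.
const headers = {
  Authorization: "Bearer sk_live_your_api_key",
  Accept: "application/json",
};

// Build the search URL, mirroring the node's Query Parameters tab.
function buildSearchUrl(query, perPage = 25) {
  const url = new URL("https://tuberalytics.com/api/v1/channels/search");
  url.searchParams.set("q", query);
  url.searchParams.set("per_page", String(perPage));
  return url.toString();
}

// The call the HTTP Request node performs under the hood.
async function searchChannels(query) {
  const res = await fetch(buildSearchUrl(query), { headers });
  if (!res.ok) throw new Error(`Tuberalytics API error: HTTP ${res.status}`);
  return res.json(); // parsed JSON, the same shape N8N exposes as $json
}
```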
Example 1: Search for Channels
Search for YouTube channels by keyword and process the results.
Step 1: Add an HTTP Request Node
- Add an HTTP Request node to your workflow
- Configure it:
  - Method: GET
  - URL: https://tuberalytics.com/api/v1/channels/search
  - Authentication: Header Auth > Tuberalytics API
  - Query Parameters: q=ai automation, per_page=25
- Click Test step to execute
Step 2: Access the Response
N8N automatically parses the JSON response. Access fields using expressions:
- {{ $json.data }} — array of channel objects
- {{ $json.data[0].title }} — first channel's name
- {{ $json.meta.total_count }} — total matching channels
Example Response
{
"data": [
{
"id": 170,
"title": "Nick Saraev",
"youtube_id": "UCxxxx",
"subscriber_count": 250000,
"video_count": 180
}
],
"meta": {
"current_page": 1,
"per_page": 25,
"total_pages": 3,
"total_count": 62
}
}
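The expressions above map directly onto this payload. As a sanity check in plain JavaScript, using the example response verbatim (summarizeSearchPage is a hypothetical helper, not part of the API):

```javascript
// Summarize one page of search results, mirroring the N8N expressions
// {{ $json.data[0].title }} and {{ $json.meta.total_count }}.
function summarizeSearchPage(json) {
  return {
    firstTitle: json.data[0]?.title ?? null,
    pageCount: json.data.length,
    totalCount: json.meta.total_count,
  };
}

// The example response from above:
const page = {
  data: [
    { id: 170, title: "Nick Saraev", youtube_id: "UCxxxx", subscriber_count: 250000, video_count: 180 },
  ],
  meta: { current_page: 1, per_page: 25, total_pages: 3, total_count: 62 },
};

// summarizeSearchPage(page)
// → { firstTitle: "Nick Saraev", pageCount: 1, totalCount: 62 }
```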
Example 2: Trigger Analysis and Wait for Completion
Run AI analysis on a channel. Since analysis is async, you need to wait for the result.
The channel ID is passed in the URL path (e.g., /channels/170/analyze). Use the id field from a previous search result or from the POST /api/v1/me/channels response. No request body is needed.
Option A: Fire and Forget with Webhook
- HTTP Request — POST https://tuberalytics.com/api/v1/channels/{{ $json.data[0].id }}/analyze
  - Authentication: Header Auth > Tuberalytics API
  - No body required
  - Replace {{ $json.data[0].id }} with an expression referencing the channel ID from a previous node
Create a separate workflow triggered by a Webhook node:
- Set up a Webhook endpoint in Tuberalytics pointing to your N8N webhook URL
- Subscribe to channel.analysis.completed events
- Process the analysis result when the webhook fires
Option B: Polling with Wait Node
- HTTP Request — POST https://tuberalytics.com/api/v1/channels/{{ $json.data[0].id }}/analyze
- Wait — Pause for 60 seconds
- HTTP Request — GET https://tuberalytics.com/api/v1/channels/{{ $json.data[0].id }}/profile
- IF — Check if {{ $json.data }} exists. If not, loop back to the Wait node.
POST Response (HTTP 202)
{
"data": {
"status": "pending",
"analysis_id": 456,
"message": "Channel analysis queued."
}
}
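Option B's trigger-wait-check loop can also be sketched outside N8N. The endpoints and the 202 "pending" response come from this page; the attempt cap and the exact readiness check (data present on /profile) are assumptions:

```javascript
// Small helper mirroring the Wait node's pause.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Trigger analysis, then poll the profile endpoint until it has data.
// maxAttempts and the 60 s interval are arbitrary choices, not API requirements.
async function waitForAnalysis(channelId, apiKey, maxAttempts = 10) {
  const headers = { Authorization: `Bearer ${apiKey}` };
  const base = "https://tuberalytics.com/api/v1";

  // Kick off the analysis (returns HTTP 202 with status "pending").
  await fetch(`${base}/channels/${channelId}/analyze`, { method: "POST", headers });

  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    await sleep(60_000); // same pause as the Wait node
    const res = await fetch(`${base}/channels/${channelId}/profile`, { headers });
    const json = await res.json();
    if (json.data) return json.data; // profile ready; mirrors the IF node check
  }
  throw new Error(`Analysis not ready after ${maxAttempts} attempts`);
}
```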
Recommended: Use the webhook approach. It's more reliable and doesn't waste workflow executions on polling.
Example 3: Niche Research to Slack Summary
Build a workflow that researches a niche and sends a summary to Slack.
Workflow Nodes
- Schedule Trigger — Run every Monday at 9am
- HTTP Request — Search niches: GET /api/v1/niches/search?q=ai+automation
- HTTP Request — Get niche channels: GET /api/v1/niches/{{ $json.data[0].id }}/channels?per_page=10
- HTTP Request — Get leaderboard: GET /api/v1/leaderboards/videos?niche_id={{ $json.data[0].id }}&period=week
- Code — Format the data into a Slack message
- Slack — Send message to a channel
Code Node (Format Message)
// Inputs arrive as merged items: item 0 from the niche-channels request,
// item 1 from the leaderboard request (order follows the workflow above).
const channels = $input.all()[0].json.data;
const videos = $input.all()[1].json.data;
let message = "*Weekly Niche Report: AI Automation*\n\n";
message += "*Top Channels:*\n";
channels.slice(0, 5).forEach((ch, i) => {
message += `${i + 1}. ${ch.title} — ${ch.subscriber_count.toLocaleString()} subs\n`;
});
message += "\n*Top Videos This Week:*\n";
videos.slice(0, 5).forEach((v, i) => {
message += `${i + 1}. ${v.title} — ${v.view_count.toLocaleString()} views\n`;
});
return [{ json: { message } }];
Working with Pagination
Process all pages of results using N8N's Loop Over Items node.
Pattern: Pagination Loop
- Set — Initialize page=1 and hasMore=true
- Loop Over Items — Loop while hasMore is true
- HTTP Request — GET /api/v1/channels/search?q=ai+automation&page={{ $json.page }}&per_page=100
- Your processing nodes — Process data[] items
- IF — Check {{ $json.meta.current_page < $json.meta.total_pages }}
  - True: Set page={{ $json.page + 1 }}, continue loop
  - False: Set hasMore=false, exit loop
Tip: Use per_page=100 (the maximum) to reduce API calls. Check Rate Limits for your tier's limits.
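The same loop, sketched in plain JavaScript for reference. fetchAllPages and fetchPage are hypothetical names; fetchPage stands in for the HTTP Request node so the loop logic is visible on its own:

```javascript
// Walk every page of a paginated endpoint. fetchPage(page) should return
// the parsed JSON for GET ...&page={page}&per_page=100; it is injected so
// the loop can be exercised without a network call.
async function fetchAllPages(fetchPage) {
  const items = [];
  let page = 1;
  let hasMore = true; // mirrors the Set node's hasMore flag
  while (hasMore) {
    const json = await fetchPage(page);
    items.push(...json.data);
    // The IF node check: more pages remain while current_page < total_pages.
    hasMore = json.meta.current_page < json.meta.total_pages;
    page += 1;
  }
  return items;
}
```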
Error Handling
Retry on Failure
Configure automatic retries on each HTTP Request node:
- Open the node's Settings tab
- Enable Retry On Fail
- Set:
  - Max Retries: 3
  - Wait Between Retries: 60000 ms (60 seconds — allows rate limits to reset)
Error Workflow
Create a dedicated error workflow for alerting:
- Go to Settings > Error Workflow in your workflow settings
- Select or create an error-handling workflow
- In the error workflow, add a Slack or Email node to notify you of failures
Common Error Responses
| Status | Meaning | Action |
|---|---|---|
| 401 | Invalid API key | Verify your Header Auth credential |
| 404 | Resource not found | Check that the channel/niche ID exists |
| 422 | Validation error | Read the error field for details |
| 429 | Rate limit exceeded | Retry after 60 seconds (auto-retry handles this) |
| 500 | Server error | Retry after a short delay |
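If you handle retries in a Code node instead of the node's Retry On Fail setting, the table translates to a small helper. Treating 429 and 5xx as retryable follows the table above; requestWithRetry and doRequest are hypothetical names:

```javascript
const pause = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Retry transient errors (429 and 5xx) with a fixed delay, mirroring the
// node settings: 3 retries, 60 s apart. doRequest() returns a fetch Response.
async function requestWithRetry(doRequest, { maxRetries = 3, waitMs = 60_000 } = {}) {
  for (let attempt = 0; ; attempt++) {
    const res = await doRequest();
    if (res.ok) return res;
    const retryable = res.status === 429 || res.status >= 500;
    if (!retryable || attempt >= maxRetries) {
      throw new Error(`Request failed with HTTP ${res.status}`);
    }
    await pause(waitMs); // let the rate limit window reset
  }
}
```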
Tips and Best Practices
- Use the Schedule Trigger node for recurring research. Weekly or daily runs are ideal for niche monitoring without burning through API quota.
- Use expressions for dynamic URLs. Reference data from previous nodes: https://tuberalytics.com/api/v1/channels/{{ $json.data[0].id }}/videos
- Split large batches. When processing many channels, use the Split In Batches node to process 10 at a time. This prevents rate limit errors and keeps workflows manageable.
- Cache with a database. Connect a Postgres, MySQL, or Google Sheets node to store results. Only fetch data that's changed since your last run by comparing timestamps.
- Pin test data. Use N8N's pin data feature to save example API responses while building your workflow. This avoids making real API calls during development.
- Use sub-workflows for reusable API patterns. Create a sub-workflow for "get channel with profile" that you can call from any parent workflow.