
HTTP Request and Notion integration

Save yourself the work of writing custom integrations for HTTP Request and Notion and use n8n instead. Build adaptable and scalable Development, Core Nodes, and Productivity workflows that work with your technology stack. All within a building experience you will love.

How to connect HTTP Request and Notion

  • Step 1: Create a new workflow
  • Step 2: Add and configure nodes
  • Step 3: Connect
  • Step 4: Customize and extend your integration
  • Step 5: Test and activate your workflow

Step 1: Create a new workflow and add the first step

In n8n, click the "Add workflow" button in the Workflows tab to create a new workflow. Add the starting point: a trigger that determines when your workflow should run, such as an app event, a schedule, a webhook call, another workflow, an AI chat, or a manual trigger. Sometimes, the HTTP Request node might already serve as your starting point.


Step 2: Add and configure HTTP Request and Notion nodes

You can find HTTP Request and Notion in the nodes panel. Drag them onto your workflow canvas and select the actions they should perform. Click each node, choose a credential, and authenticate to grant n8n access. Configure the HTTP Request and Notion nodes one by one: input data on the left, parameters in the middle, and output data on the right.
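If you prefer to see what such a configuration amounts to on the wire, the following sketch (not part of the page above) shows the kind of raw call to Notion's API that a configured HTTP Request node or Notion credential performs. The token and database ID are placeholders.

```python
# Minimal sketch of the raw Notion API call behind a configured node.
# The token and database ID below are placeholders, not real values.
import requests

NOTION_TOKEN = "secret_xxx"                           # from your Notion integration (placeholder)
DATABASE_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

headers = {
    "Authorization": f"Bearer {NOTION_TOKEN}",
    "Notion-Version": "2022-06-28",  # Notion API version header
}

# Retrieve database metadata to confirm the credential has access
resp = requests.get(f"https://api.notion.com/v1/databases/{DATABASE_ID}", headers=headers)
resp.raise_for_status()
print(resp.json()["title"])
```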


Step 3: Connect HTTP Request and Notion

A connection establishes a link between HTTP Request and Notion (or vice versa) to route data through the workflow. Data flows from the output of one node to the input of another. You can have single or multiple connections for each node.
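As a rough illustration of what travels along a connection: n8n passes a list of items from node to node, each item holding its data under a json key. The field names below are made up for the example.

```python
# Illustrative sketch of the item structure n8n passes along a connection:
# each node outputs a list of items, and each item carries its data under "json".
items_from_http_request = [
    {"json": {"company": "ACME Corp", "website": "https://example.com"}},
    {"json": {"company": "Globex", "website": "https://example.org"}},
]

# A downstream node (e.g. Notion) receives these items on its input and
# typically maps fields from item["json"] into its own parameters.
for item in items_from_http_request:
    print(item["json"]["company"])
```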


Step 4: Customize and extend your HTTP Request and Notion integration

Use n8n's core nodes such as If, Split Out, Merge, and others to transform and manipulate data. Write custom JavaScript or Python in the Code node and run it as a step in your workflow. Connect HTTP Request and Notion with any of n8n’s 1000+ integrations, and incorporate advanced AI logic into your workflows.
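For instance, a Code node placed between HTTP Request and Notion might reshape the response into flat fields the Notion node can map. This is only a sketch; the field names and sample data are assumptions, not part of any specific template.

```python
# Hypothetical transformation you might run in a Code node (Python mode)
# between the HTTP Request and Notion nodes. Field names are illustrative only.
def transform(items):
    """Reshape raw API output into flat fields a Notion node can map."""
    results = []
    for item in items:
        data = item["json"]
        results.append({
            "json": {
                "Name": data.get("company", "Unknown"),
                "Website": data.get("website", ""),
                "Imported at": data.get("fetched_at", ""),
            }
        })
    return results

# Inside n8n you would feed the node's input items into this function;
# here we call it with sample data to show the shape of the output.
print(transform([{"json": {"company": "ACME Corp", "website": "https://example.com"}}]))
```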


Step 5: Test and activate your HTTP Request and Notion workflow

Save and run the workflow to see if everything works as expected. Based on your configuration, data should flow from HTTP Request to Notion or vice versa. Debugging is straightforward: you can check past executions to isolate and fix any mistakes. Once you've tested everything, save your workflow and activate it.


Automate Competitor Research with Exa.ai, Notion and AI Agents

This n8n workflow demonstrates a simple multi-agent setup for competitor research. It showcases how using the HTTP Request tool can reduce the number of nodes needed to build a workflow like this.

How it works
For this template, the user defines a source company, which is sent to Exa.ai to find competitors.
Each competitor is then funnelled through 3 AI agents that go out onto the internet and retrieve specific datapoints about the competitor: company overview, product offering, and customer reviews.
Once the agents are finished, the results are compiled into a report, which is then inserted into a Notion database.

Check out an example output here: https://jimleuk.notion.site/2d1c3c726e8e42f3aecec6338fd24333?v=de020fa196f34cdeb676daaeae44e110&pvs=4
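As a hedged illustration of the final step described above (inserting the compiled report into a Notion database), the snippet below calls Notion's pages endpoint directly. The property name "Company", the token, and the database ID are placeholders rather than the template's actual configuration.

```python
# Sketch of saving a compiled competitor report as a new page in a Notion database.
import requests

NOTION_TOKEN = "secret_xxx"       # placeholder
DATABASE_ID = "your-database-id"  # placeholder

def save_report(company: str, report: str) -> None:
    payload = {
        "parent": {"database_id": DATABASE_ID},
        "properties": {
            # Assumed title property name; match it to your database schema
            "Company": {"title": [{"text": {"content": company}}]},
        },
        # Page body: the report text as a single paragraph block
        # (truncated to Notion's 2000-character rich text limit)
        "children": [
            {
                "object": "block",
                "type": "paragraph",
                "paragraph": {"rich_text": [{"text": {"content": report[:2000]}}]},
            }
        ],
    }
    resp = requests.post(
        "https://api.notion.com/v1/pages",
        headers={
            "Authorization": f"Bearer {NOTION_TOKEN}",
            "Notion-Version": "2022-06-28",
            "Content-Type": "application/json",
        },
        json=payload,
    )
    resp.raise_for_status()

save_report("ACME Corp", "Company overview, product offering and customer reviews go here.")
```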

Requirements

  • An OpenAI account for the LLM.
  • An Exa.ai account for access to their AI search engine.
  • A SerpAPI account for Google search.
  • A Firecrawl.dev account for web scraping.
  • A Notion.com account for the database that stores the final reports.

Customising the workflow

Add additional agents to gather more datapoints such as SEO keywords and metrics.

Not using Notion? Feel free to swap it out for your own database.


Popular HTTP Request and Notion workflows


Host Your Own AI Deep Research Agent with n8n, Apify and OpenAI o3

This template attempts to replicate OpenAI's DeepResearch feature which, at the time of writing, is only available to their pro subscribers.

> An agent that uses reasoning to synthesize large amounts of online information and complete multi-step research tasks for you. (Source)

Though the inner workings of DeepResearch have not been made public, it is presumed the feature relies on the ability to deep-search the web, scrape web content and invoke reasoning models to generate reports. All of which n8n is really good at! Using this workflow, n8n users can enjoy a variation of the Deep Research experience for themselves and their teams at a fraction of the cost. Better yet, they can learn from and customise this Deep Research template for their businesses and/or organisations.

Check out the generated reports here: https://jimleuk.notion.site/19486dd60c0c80da9cb7eb1468ea9afd?v=19486dd60c0c805c8e0c000ce8c87acf

How it works
A form is used to first capture the user's research query and how deep they'd like the researcher to go. Once submitted, a blank Notion page is created which will later hold the final report, and the researcher gets to work. The user's query goes through a recursive series of web searches and web scraping to collect data on the research topic and generate partial learnings. Once complete, all learnings are combined and given to a reasoning LLM to generate the final report. The report is then written to the placeholder Notion page created earlier.

How to use
  • Duplicate this Notion database template and make sure all Notion-related nodes point to it.
  • Sign up for an APIFY.com API key for the web search and scraping services.
  • Ensure you have access to OpenAI's o3-mini model; alternatively, switch it out for the o1 series.
  • You must publish this workflow and ensure the form URL is publicly accessible.

On depth & breadth configuration
For more detailed reports, increase depth and breadth, but be warned the workflow will take exponentially longer and cost more to complete. The recommended defaults are usually good enough.
  • Depth=1 & Breadth=2 will take about 5-10 mins.
  • Depth=1 & Breadth=3 will take about 15-20 mins.
  • Depth=3 & Breadth=5 will take about 2+ hours!

Customising this workflow
I deliberately chose not to use AI-powered scrapers like Firecrawl as I felt these were quite costly and quotas would be quickly exhausted. However, feel free to switch to the web search and scraping services that suit your environment. Maybe you decide not to source the web at all and instead collect data from internal documents; this template gives you the freedom to change this. Experiment with different Reasoning/Thinking models such as Deepseek and Google's Gemini 2.0. Finally, the LLM prompts could definitely be improved; refine them to fit your use case.

Credits
This template is largely based off the work by David Zhang (dzhng) and his open source implementation of Deep Research: https://github.com/dzhng/deep-research
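To make the recursive depth/breadth idea more concrete, here is a minimal Python sketch of such a loop. The web_search, scrape, and summarise helpers are stand-ins for the Apify and LLM calls the template actually uses, so treat this as an outline rather than the template's implementation.

```python
# Rough sketch of a recursive depth/breadth research loop.
def research(query: str, depth: int, breadth: int, learnings: list[str]) -> list[str]:
    if depth == 0:
        return learnings
    # Breadth controls how many search results are followed per level
    for url in web_search(query)[:breadth]:
        page_text = scrape(url)
        learning, follow_up_query = summarise(query, page_text)
        learnings.append(learning)
        # Depth controls how many levels of follow-up questions are explored
        research(follow_up_query, depth - 1, breadth, learnings)
    return learnings

def web_search(query):       # stand-in for the web search service
    return []

def scrape(url):             # stand-in for the scraping service
    return ""

def summarise(query, text):  # stand-in for the partial-learning LLM call
    return ("", query)

all_learnings = research("example research topic", depth=1, breadth=2, learnings=[])
# all_learnings would then be combined and passed to a reasoning model
# to draft the final report written back to the Notion placeholder page.
```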

Sync Notion to Clockify including Clients Projects and Tasks

Purpose
This workflow synchronizes three entities from Notion to Clockify, allowing tracked time to be linked to client-related projects or tasks.

Demo & Explanation

How it works
  • On every run, active Clients, Projects and Tasks are retrieved from both Notion and Clockify before being compared by the Clockify ID, which in turn is stored in Notion for reference.
  • Potential differences are then applied to Clockify.
  • If an item has been archived or closed in Notion, it is also marked as archived in Clockify.
  • All entities are processed sequentially, since they are related hierarchically to each other.
  • By default this workflow runs once per day or when called via webhook (e.g. embedded into a Notion button).

Prerequisites
A set of Notion databases with a specific structure is required to use this workflow. You can either start with this Notion template or adapt your system based on the requirements described in the big yellow sticky note of this workflow template.

Setup
  • Clone the workflow and select the corresponding credentials.
  • Follow the instructions given in the yellow sticky notes.
  • Activate the workflow.

Related workflows: Backup Clockify to Github based on monthly reports; Prevent simultaneous workflow executions with Redis.
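The comparison step can be pictured roughly as follows: entities coming from Notion (which carry the stored Clockify ID) are matched against entities fetched from Clockify, and only the differences are applied. The field names in this sketch are assumptions, not the workflow's actual property names.

```python
# Illustrative diff between Notion entities and their Clockify counterparts.
def diff_entities(notion_items: list[dict], clockify_items: list[dict]) -> dict:
    clockify_by_id = {c["id"]: c for c in clockify_items}
    to_create, to_update, to_archive = [], [], []

    for n in notion_items:
        clockify_id = n.get("clockify_id")
        if not clockify_id or clockify_id not in clockify_by_id:
            to_create.append(n)                  # exists only in Notion
        else:
            c = clockify_by_id[clockify_id]
            if n["archived"] and not c["archived"]:
                to_archive.append(c)             # closed in Notion -> archive in Clockify
            elif n["name"] != c["name"]:
                to_update.append((n, c))         # renamed in Notion -> update Clockify
    return {"create": to_create, "update": to_update, "archive": to_archive}

changes = diff_entities(
    [{"clockify_id": "c1", "name": "Client A", "archived": False}],
    [{"id": "c1", "name": "Client A (old)", "archived": False}],
)
print(changes["update"])
```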

CallForge - 06 - Automate Sales Insights with Gong.io, Notion & AI

CallForge - AI-Powered Sales Call Data Processor
Automate sales call analysis and store structured insights in Notion with AI-powered intelligence.

Who is this for?
This workflow is ideal for:
✅ Sales teams looking to automate call insight processing.
✅ Sales operations managers managing AI-driven call analysis.
✅ Revenue teams using Gong, Fireflies.ai, Otter.ai, or similar transcription tools.
It streamlines sales call intelligence, ensuring that insights such as competitor mentions, objections, and customer pain points are efficiently categorized and stored in Notion for easy access.

🔍 What problem does this workflow solve?
Manually reviewing and documenting sales call takeaways is time-consuming and error-prone. With CallForge, you can:
✔ Identify competitors mentioned in sales calls.
✔ Capture objections and customer pain points for follow-up.
✔ Track sales call outcomes and categorize insights automatically.
✔ Store structured sales intelligence in Notion for future reference.
✔ Improve sales strategy with AI-driven, automated call analysis.

📌 Key features & workflow steps
🎙️ AI-powered call data processing. This workflow processes AI-generated sales call insights and structures them in Notion databases:
  • Triggers automatically when AI call analysis data is received.
  • Extracts competitor mentions from the call transcript and logs them in Notion.
  • Identifies and categorizes sales objections for better follow-ups.
  • Processes integration mentions, capturing tools or platforms referenced in the call.
  • Extracts customer use cases, categorizing pain points and feature requests.
  • Aggregates all extracted insights and updates the relevant Notion databases.

📊 Notion database integration
  • Competitors → Logs mentioned competitors for sales intelligence.
  • Objections → Tracks and categorizes common objections from prospects.
  • Integrations → Captures third-party tools & platforms discussed in calls.
  • Use Cases → Stores customer challenges & product feature requests.

🛠 How to set up this workflow
  • Prepare your AI call analysis data. Ensure AI-generated sales call data is passed into the workflow. Compatible with Gong, Fireflies.ai, Otter.ai, and other AI transcription tools.
  • Connect your Notion databases. Set up Notion databases for: Competitors (tracks competing products), Objections (logs customer objections & concerns), Integrations (captures mentioned platforms & tools), and Use Cases (categorizes customer pain points & feature requests).
  • Configure n8n API integrations. Connect your Notion API key in n8n under "Notion API Credentials." Set up webhook triggers to receive data from your AI transcription tool. Test the workflow using a sample AI-generated call transcript.

Other workflows in this series: CallForge 01 - Filter Gong Calls Synced to Salesforce by Opportunity Stage; CallForge 02 - Prep Gong Calls with Sheets & Notion for AI Summarization; CallForge 03 - Gong Transcript Processor and Salesforce Enricher; CallForge 04 - AI Workflow for Gong.io Sales Calls; CallForge 05 - Gong.io Call Analysis with Azure AI & CRM Sync; CallForge 06 - Automate Sales Insights with Gong.io, Notion & AI; CallForge 07 - AI Marketing Data Processing with Gong & Notion; CallForge 08 - AI Product Insights from Sales Calls with Notion.

🔧 How to customize this workflow
💡 Modify Notion data structure – adjust fields to match your company's CRM setup.
💡 Enhance AI data processing – align fields with different AI transcription providers.
💡 Expand with CRM integration – sync insights with HubSpot, Salesforce, or Pipedrive.
💡 Add notifications – send alerts via Slack, email, or webhook when key competitor mentions or objections are detected.

⚙️ Key nodes used in this workflow
🔹 If nodes – check if AI-generated data includes competitors, integrations, objections, or use cases.
🔹 Notion nodes – create or update entries in Notion databases.
🔹 Split Out & Aggregate nodes – process multiple insights and consolidate AI outputs.
🔹 Wait nodes – ensure smooth sequencing of API calls and database updates.
🔹 HTTP Request node – sends AI-extracted insights to Notion for structured storage.

🚀 Why use this workflow?
✔ Eliminates manual data entry and speeds up sales intelligence processing.
✔ Ensures structured and categorized sales insights for decision-making.
✔ Improves team collaboration with AI-powered competitor tracking & objections logging.
✔ Seamlessly integrates with Notion to centralize and manage sales call insights.
✔ Scalable for teams using n8n Cloud or self-hosted deployments.
This workflow empowers sales teams with automated AI insights, streamlining sales strategy and follow-ups with minimal effort. 🚀

Convert Notion to Markdown and Back to Notion

This workflow converts Notion pages to markdown, and then converts that markdown back to Notion blocks. It will triple the content of the last updated page it finds. This is useless by itself, but you can copy-paste from this workflow to create your own.

Prerequisites
A Notion account with some pages or databases.

Setup instructions
Create a Notion credential and share some pages with it as described here: https://docs.n8n.io/integrations/builtin/credentials/notion/

How it works
The HTTP Request node gets Notion child blocks from a page, because the default n8n node only gets plain text and no links. The first Code node converts them to markdown. The second Code node converts the markdown back to Notion blocks. The last HTTP Request node appends everything to the original Notion page, essentially duplicating it for the purpose of demoing the script.

I hope in the future we get official n8n nodes that extract markdown, or use markdown to write to Notion. There is a community node that also does this, but this template is easier: you can simply copy-paste the nodes from this workflow.
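To give a feel for the first conversion step, here is a simplified sketch of turning Notion child blocks (as returned by GET /v1/blocks/{block_id}/children) into markdown while keeping links. It only handles a couple of block types; the template's Code node covers more.

```python
# Simplified Notion-blocks-to-markdown conversion, preserving hyperlinks.
def rich_text_to_md(rich_text: list[dict]) -> str:
    parts = []
    for rt in rich_text:
        text = rt.get("plain_text", "")
        href = rt.get("href")
        parts.append(f"[{text}]({href})" if href else text)
    return "".join(parts)

def blocks_to_markdown(blocks: list[dict]) -> str:
    lines = []
    for block in blocks:
        btype = block.get("type")
        if btype == "paragraph":
            lines.append(rich_text_to_md(block["paragraph"]["rich_text"]))
        elif btype == "heading_1":
            lines.append("# " + rich_text_to_md(block["heading_1"]["rich_text"]))
        elif btype == "bulleted_list_item":
            lines.append("- " + rich_text_to_md(block["bulleted_list_item"]["rich_text"]))
    return "\n\n".join(lines)

sample = [{"type": "paragraph", "paragraph": {"rich_text": [
    {"plain_text": "See the docs", "href": "https://docs.n8n.io"}]}}]
print(blocks_to_markdown(sample))
```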

Realtime Notion Todoist 2-way Sync with Redis

Purpose
This solution enables you to manage all your Notion and Todoist tasks from different workspaces, as well as your calendar events, in a single place. All tasks can be managed in Todoist, and additionally Fantastical can be used to manage scheduled tasks & events all together.

Demo & Explanation

How it works
The realtime sync consists of two workflows, both triggered by a registered webhook from either Notion or Todoist. To avoid overwrites by late-arriving webhook calls, the current task is retrieved from both sides every time. Redis is used to prevent endless loops, since an update in one system triggers another webhook call again: using the ID of the task, the trigger is locked down for 15 seconds. Depending on the detected changes, the other side is updated accordingly. Generally Notion is treated as the main source. Using an "Obsolete" status, it is guaranteed that tasks never get deleted entirely by accident. The Todoist ID is stored in the Notion task, so they stay linked together. An additional full-sync workflow runs daily to fix any inconsistencies that occurred, since webhooks cannot be trusted entirely. Since Todoist requires a more complex setup, a tiny workflow helps with activating the webhook. Another tiny workflow helps with generating a global config, which is used by all workflows for mapping purposes.

Mapping (Notion >> Todoist)
  • Name: Task Name
  • Priority: Priority (1: do first, 2: urgent, 3: important, 4: unset)
  • Due: Date
  • Status: Section (Done: completed, Obsolete: deleted)
  • <page_link>: Description (read-only)
  • Todoist ID: <task_id>

Current limitations
  • Changes on the same task cannot be made simultaneously in both systems within a 15-20 second time frame.
  • Subtasks are not linked automatically to their parent yet.
  • Recurring tasks are not supported yet.
  • Task names do not support URLs yet.

Prerequisites
Notion: A database must already exist (get a basic template here) with the following properties (case matters!):
  • Text: "Name"
  • Status: "Status", containing at least the options "Backlog", "In progress", "Done", "Obsolete"
  • Select: "Priority", containing the options "do first", "urgent", "important"
  • Date: "Due"
  • Checkbox: "Focus"
  • Text: "Todoist ID"
Todoist: A project must already exist with the same sections as defined for the Status property in Notion (except Done and Obsolete).
Redis: Create a free Redis Cloud instance or self-host.

Setup
The setup involves quite a lot of steps, yet many of them can be automated for business-internal purposes. Just follow the video or do the following steps:
  • Set up credentials for Notion (access token), Todoist (access token) and Redis. You can also create empty credentials and populate these later during further setup.
  • Clone this workflow by clicking the "Use workflow" button and then choosing your n8n instance; otherwise you need to map the credentials of many nodes.
  • Follow the instructions described within the bundle of sticky notes on the top left of the workflow.

How to use
You can apply changes (create, update, delete) to tasks both in Notion and Todoist, which then get synced over within a couple of seconds (this is handled by the differential realtime sync). The daily running full sync resolves possible discrepancies in Todoist and sends a summary via email if anything needed to be updated. In case that contains an unintended change, you can jump to the task from the email directly to fix it manually.
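The 15-second lock described above can be sketched with a single Redis command: SET with NX and EX only succeeds if the key does not exist yet. The key naming below is an assumption; only the TTL follows the description.

```python
# Loop suppression: when a webhook fires for a task, take a short-lived Redis
# lock keyed by the task ID; if the lock is already held, treat the event as an
# echo of our own update and skip it.
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def should_process(task_id: str, ttl_seconds: int = 15) -> bool:
    # SET key value NX EX ttl: only succeeds if the key does not exist yet
    acquired = r.set(f"sync-lock:{task_id}", "1", nx=True, ex=ttl_seconds)
    return bool(acquired)

if should_process("notion-task-123"):
    print("apply the change to the other system")
else:
    print("lock held: skip this webhook call to avoid an endless loop")
```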

Analyse papers from Hugging Face with AI and store them in Notion

This workflow automates the process of retrieving Hugging Face paper summaries, analyzing them with OpenAI, and storing the results in Notion. Here's a breakdown of how it works:
  • ⏰ Scheduled trigger: The flow is set to run automatically at 8 AM on weekdays.
  • 📄 Fetching paper data: It fetches Hugging Face paper summaries using their API.
  • 🔍 Data check: Before processing, the workflow checks if the paper already exists in Notion to avoid duplicates.
  • 🤖 Content analysis with OpenAI: If the paper is new, it extracts the summary and uses OpenAI to analyze the content.
  • 📥 Store results in Notion: After analysis, the summarized data is saved in Notion for easy reference.

⚙️ Set up steps
Follow these steps to set up this automated workflow with Hugging Face, OpenAI, and Notion integration:
  • 🔑 Obtain API tokens: You'll need the Notion and OpenAI API tokens to authenticate and connect these services with n8n.
  • 🔗 Integration in n8n: Link Hugging Face, OpenAI, and Notion by configuring the appropriate nodes in n8n.
  • 🔧 Configure workflow logic: Set up a cron trigger for automatic execution at 8 AM on weekdays. Use an HTTP Request node to fetch Hugging Face paper data. Add logic to check if the data already exists in Notion. Set up the OpenAI integration to analyze the paper's summary. Store the results in Notion for easy access and reference.
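The duplicate check can be pictured as a query against the Notion database before a new entry is created. In this sketch the "URL" property name, token, and database ID are placeholders, not the template's actual values.

```python
# Query the Notion database for an existing page matching the paper's URL.
import requests

NOTION_TOKEN = "secret_xxx"       # placeholder
DATABASE_ID = "your-database-id"  # placeholder
HEADERS = {
    "Authorization": f"Bearer {NOTION_TOKEN}",
    "Notion-Version": "2022-06-28",
    "Content-Type": "application/json",
}

def paper_already_stored(paper_url: str) -> bool:
    # Filter on an assumed "URL" property of type URL
    query = {"filter": {"property": "URL", "url": {"equals": paper_url}}}
    resp = requests.post(
        f"https://api.notion.com/v1/databases/{DATABASE_ID}/query",
        headers=HEADERS,
        json=query,
    )
    resp.raise_for_status()
    return len(resp.json()["results"]) > 0

if not paper_already_stored("https://huggingface.co/papers/2501.00001"):
    print("new paper: analyse with OpenAI and create a Notion page")
```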

Build your own HTTP Request and Notion integration

Create custom HTTP Request and Notion workflows by choosing triggers and actions. Nodes come with global operations and settings, as well as app-specific parameters that can be configured. You can also use the HTTP Request node to query data from any app or service with a REST API.
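In other words, the HTTP Request node is a configurable REST client. The sketch below shows the kind of call it makes against any JSON API; the endpoint, parameters, and header are placeholders.

```python
# Generic REST call of the kind an HTTP Request node performs.
import requests

resp = requests.get(
    "https://api.example.com/v1/items",          # any REST endpoint (placeholder)
    params={"limit": 10},                        # query parameters
    headers={"Authorization": "Bearer <token>"}, # credential header, if needed
    timeout=30,
)
resp.raise_for_status()
# Assuming the endpoint returns a JSON array of records
for record in resp.json():
    print(record)
```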

Notion supported actions

Block
  • Append After: Append a block
  • Get Child Blocks: Get many child blocks

Database
  • Get: Get a database
  • Get Many: Get many databases
  • Search: Search databases using text search

Database Page
  • Create: Create a page in a database
  • Get: Get a page in a database
  • Get Many: Get many pages in a database
  • Update: Update pages in a database

Page
  • Create: Create a page
  • Get: Get a page
  • Search: Text search of pages
  • Archive: Archive a page

User
  • Get: Get a user
  • Get Many: Get many users
Use case

Save engineering resources

Reduce time spent on customer integrations, engineer faster POCs, and keep your customer-specific functionality separate from your product, all without having to code.


FAQs

  • Can HTTP Request connect with Notion?

  • Can I use HTTP Request’s API with n8n?

  • Can I use Notion’s API with n8n?

  • Is n8n secure for integrating HTTP Request and Notion?

  • How do I get started with the HTTP Request and Notion integration in n8n?

Need help setting up your HTTP Request and Notion integration?

Discover the latest recommendations from our community and join the discussions about the HTTP Request and Notion integration.

Looking to integrate HTTP Request and Notion in your company?

Over 3000 companies switch to n8n every single week

Why use n8n to integrate HTTP Request with Notion

Build complex workflows, really fast

Handle branching, merging and iteration easily.
Pause your workflow to wait for external events.

Code when you need it, UI when you don't

Simple debugging

Your data is displayed alongside your settings, making edge cases easy to track down.

Use templates to get started fast

Use 1000+ workflow templates available from our core team and our community.

Reuse your work

Copy and paste, easily import and export workflows.

Implement complex processes faster with n8n
