Google Sheets node
HTTP Request node

Google Sheets and HTTP Request integration

Save yourself the work of writing custom integrations for Google Sheets and HTTP Request and use n8n instead. Build adaptable and scalable Data & Storage, Productivity, Development, and Core Nodes workflows that work with your technology stack. All within a building experience you will love.

How to connect Google Sheets and HTTP Request

  • Step 1: Create a new workflow
  • Step 2: Add and configure nodes
  • Step 3: Connect
  • Step 4: Customize and extend your integration
  • Step 5: Test and activate your workflow

Step 1: Create a new workflow and add the first step

In n8n, click the "Add workflow" button in the Workflows tab to create a new workflow. Add the starting point – a trigger for when your workflow should run: an app event, a schedule, a webhook call, another workflow, an AI chat, or a manual trigger. Sometimes, the HTTP Request node might already serve as your starting point.


Step 2: Add and configure Google Sheets and HTTP Request nodes

You can find Google Sheets and HTTP Request in the nodes panel. Drag them onto your workflow canvas, selecting their actions. Click each node, choose a credential, and authenticate to grant n8n access. Configure Google Sheets and HTTP Request nodes one by one: input data on the left, parameters in the middle, and output data on the right.
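
To make the division of labor concrete, here is a minimal Python sketch (outside n8n) of what this node pair typically does: an HTTP request fetches JSON from an API, and the results are appended to a Google Sheet. The endpoint, spreadsheet ID, and the use of the gspread library are illustrative assumptions, not part of any specific template.

```python
# Minimal sketch (outside n8n) of what the two nodes do together:
# an HTTP request fetches JSON, and rows are appended to a Google Sheet.
# Assumes the gspread library, a service-account JSON file, and a
# placeholder API URL / spreadsheet ID - adjust all of these to your setup.
import requests
import gspread

API_URL = "https://api.example.com/items"           # hypothetical endpoint
SPREADSHEET_ID = "your-spreadsheet-id"              # placeholder

records = requests.get(API_URL, timeout=30).json()  # roughly what the HTTP Request node returns

gc = gspread.service_account(filename="service_account.json")
worksheet = gc.open_by_key(SPREADSHEET_ID).sheet1

for record in records:                              # roughly the Google Sheets "append" operation
    worksheet.append_row([record.get("name"), record.get("url")])
```

In n8n itself, the same two steps are configured in the node parameters rather than written as code.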


Step 3: Connect Google Sheets and HTTP Request

A connection establishes a link between Google Sheets and HTTP Request (or vice versa) to route data through the workflow. Data flows from the output of one node to the input of another. You can have single or multiple connections for each node.
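
Under the hood, n8n passes data between connected nodes as a list of items, each wrapping its fields under a json key. A rough Python illustration of that handoff (the field names are hypothetical):

```python
# Rough illustration of how items flow from one node's output to the next node's input.
# Each n8n item wraps its data under a "json" key; the field names here are hypothetical.
sheets_output = [
    {"json": {"Company": "Acme", "Website": "https://acme.example"}},
    {"json": {"Company": "Globex", "Website": "https://globex.example"}},
]

# The next node (e.g. HTTP Request) receives those items as its input
# and typically runs once per item, reading fields with expressions.
http_request_input = sheets_output
for item in http_request_input:
    url = item["json"]["Website"]   # equivalent of the expression {{ $json.Website }}
    print(f"would request: {url}")
```

This is why an expression like {{ $json.Website }} in the HTTP Request node resolves once per item coming out of the Google Sheets node.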


Step 4: Customize and extend your Google Sheets and HTTP Request integration

Use n8n's core nodes such as If, Split Out, Merge, and others to transform and manipulate data. Write custom JavaScript or Python in the Code node and run it as a step in your workflow. Connect Google Sheets and HTTP Request with any of n8n’s 1000+ integrations, and incorporate advanced AI logic into your workflows.
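
As an example of the Code node step, here is a short sketch in Python mode that normalizes a URL field before it reaches the HTTP Request node. The _input helper reflects n8n's Python Code node conventions, but treat the exact helpers and the "Website"/"Company" column names as assumptions to adapt to your own workflow and n8n version.

```python
# A sketch for the Code node (Python mode): normalize a URL column before the
# HTTP Request node uses it. The _input helper and the "Website"/"Company"
# columns are assumptions - check the Code node docs for your n8n version.
results = []
for item in _input.all():
    website = str(item.json.get("Website") or "").strip()
    if website and not website.startswith("http"):
        website = "https://" + website        # ensure the URL has a scheme
    results.append({"json": {"Company": item.json.get("Company"), "Website": website}})
return results
```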


Step 5: Test and activate your Google Sheets and HTTP Request workflow

Save and run the workflow to see if everything works as expected. Based on your configuration, data should flow from Google Sheets to HTTP Request or vice versa. Easily debug your workflow: you can check past executions to isolate and fix the mistake. Once you've tested everything, make sure to save your workflow and activate it.


OpenAI GPT-3: Company Enrichment from website content

Enrich your company lists with OpenAI GPT-3 ↓

You’ll get valuable information such as:

  • Market (B2B or B2C)
  • Industry
  • Target Audience
  • Value Proposition

This will help you to:

  • add more personalization to your outreach
  • make informed decisions about which accounts to target

I've made the process easy with an n8n workflow.

Here is what it does:

  • Retrieve website URLs from Google Sheets
  • Extract the content for each website
  • Analyze it with GPT-3
  • Update Google Sheets with GPT-3 data
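
For readers who want the logic spelled out, below is a rough standalone Python sketch of the same loop. It is not the workflow itself: reading and writing the Google Sheet is omitted, the OpenAI chat completions endpoint is used in place of the original GPT-3 completions setup, and the model name and prompt are illustrative.

```python
# Standalone sketch of the enrichment loop described above - not the workflow itself.
# Reading/writing the sheet is left out; the OpenAI endpoint is real, but the
# model name and prompt are illustrative assumptions.
import requests

OPENAI_KEY = "sk-..."                                    # placeholder
company_urls = ["https://acme.example"]                  # normally read from Google Sheets

for url in company_urls:
    page_text = requests.get(url, timeout=30).text       # "extract the content for each website"
    answer = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {OPENAI_KEY}"},
        json={
            "model": "gpt-3.5-turbo",                     # assumed model
            "messages": [{
                "role": "user",
                "content": "From this website text, return the market (B2B/B2C), "
                           "industry, target audience and value proposition:\n" + page_text[:8000],
            }],
        },
        timeout=60,
    ).json()["choices"][0]["message"]["content"]
    print(url, "->", answer)                              # normally written back to the sheet
```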


Popular Google Sheets and HTTP Request workflows

Google Sheets node
HTTP Request node
Merge node
+4

OpenAI GPT-3: Company Enrichment from website content

Enrich your company lists with OpenAI GPT-3. You'll get valuable information such as market (B2B or B2C), industry, target audience, and value proposition. This will help you add more personalization to your outreach and make informed decisions about which accounts to target. The workflow retrieves website URLs from Google Sheets, extracts the content of each website, analyzes it with GPT-3, and updates Google Sheets with the GPT-3 data.
Split Out node
Convert to File node
HTML node
+5

Automated Web Scraping: email a CSV, save to Google Sheets & Microsoft Excel

How it works: The workflow starts by sending a request to a website to retrieve its HTML content. It then parses the HTML, extracting the relevant information. The extracted data is stored and converted into a CSV file. The CSV file is attached to an email and sent to your specified address. The data is simultaneously saved to both Google Sheets and Microsoft Excel for further analysis or use. Set-up steps: Change the website to scrape in the "Fetch website content" node. Configure Microsoft Azure credentials with Microsoft Graph permissions (required for the Save to Microsoft Excel 365 node). Configure Google Cloud credentials with access to the Google Drive, Google Sheets, and Gmail APIs (the latter is required for the Send CSV via e-mail node).
Code node
Google Sheets node
HTTP Request node
Item Lists node
+5

Google Maps Scraper

This workflow allows you to scrape Google Maps data in an efficient way using SerpAPI. You'll get all data from Google Maps at a cheaper cost than the Google Maps API. Add your Google Maps search URL as input, and you'll get a list of places with many data points such as phone number, website, rating, reviews, address, and much more. The full guide to implement the workflow is here: https://lempire.notion.site/Scrape-Google-Maps-places-with-n8n-b7f1785c3d474e858b7ee61ad4c21136?pvs=4
Google Sheets node
HTTP Request node

Push JSON data into an App or to Spreadsheet file

This workflow template shows how to load JSON data into a workflow and push that data into an App or convert it into a Spreadsheet file. Specifically, this workflow shows how to make a generic API request that returns JSON. It then shows how to load that into a Google Sheets spreadsheet, or convert it to .CSV file format. However, you can use the general pattern to load data into any app or convert to any spreadsheet file format (such as .xlsx).
OpenAI Model node
HTTP Request node
Gmail Trigger node
+5

Invoice data extraction with LlamaParse and OpenAI

This n8n workflow automates the process of parsing and extracting data from PDF invoices. With this workflow, accounts and finance people can realise huge time and cost savings in their busy schedules. Read the blog: https://blog.n8n.io/how-to-extract-data-from-pdf-to-excel-spreadsheet-advance-parsing-with-n8n-io-and-llamaparse/ How it works: This workflow watches an email inbox for incoming invoices from suppliers. It downloads the attached PDFs and processes them through a third-party service called LlamaParse. LlamaParse is specifically designed to handle and convert complex PDF data structures such as tables to markdown. Markdown is easy for LLM models to process, so the data extraction by our AI agent is more accurate and reliable. The workflow exports the extracted data from the AI agent to Google Sheets once the job is complete. Requirements: The criteria of the email trigger must be configured to capture emails with attachments. The Gmail label "invoice synced" must be created before using this workflow. A LlamaIndex.ai account to use the LlamaParse service. An OpenAI account to use GPT for AI work. Google Sheets to save the output of the data extraction process, although this can be replaced with whatever suits your needs. Customizing this workflow: This workflow uses Gmail and Google Sheets, but these can easily be swapped out for equivalent services such as Outlook and Excel. Not using Excel? Simply redirect the output of the AI agent to your accounting software of choice.
Google Sheets node
HTTP Request node

Import JSON data into Google Sheets and CSV file

This workflow gets data from an API and exports it into Google Sheets and a CSV file.
Google Sheets node
Slack node
Gmail node
+4

Host your own Uptime Monitoring with Scheduled Triggers

This n8n workflow demonstrates how to build a simple uptime monitoring service using scheduled triggers. Useful for webmasters with a handful of sites who want a cost-effective solution without the need for all the bells and whistles. How it works: A scheduled trigger reads a list of website URLs from a Google Sheet every 5 minutes. Each website URL is checked using the HTTP Request node, which determines whether the website is in the UP or DOWN state. An email and Slack message are sent for websites in the DOWN state. The Google Sheet is updated with the website's state and a log entry is created; logs can be used to determine the total percentage of UP and DOWN time over a period. Requirements: a Google Sheet for storing websites to monitor and their states, Gmail for email alerts, and Slack for channel alerts. Customising the workflow: Don't use Google Sheets? This can easily be exchanged for Excel or Airtable.
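
The core UP/DOWN decision in this template is simple enough to sketch in a few lines of Python. This is only an illustration of the check the HTTP Request node performs; alerting, logging, and the Google Sheet update are handled by the other nodes.

```python
# Minimal sketch of the UP/DOWN check performed for each URL. The real template
# uses the HTTP Request node plus Gmail/Slack nodes; this is just the core logic.
import requests

def check_site(url: str) -> str:
    try:
        response = requests.get(url, timeout=10)
        return "UP" if response.status_code < 400 else "DOWN"
    except requests.RequestException:
        return "DOWN"

for url in ["https://example.com", "https://example.org"]:   # normally read from the Google Sheet
    state = check_site(url)
    print(url, state)   # the workflow would update the sheet and alert on DOWN

```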
HTTP Request node
+3

Enrich Linkedin profile URLs stored in a Google Sheet

Template Information. Who is this template for? This template is for users looking to retrieve email information from LinkedIn profiles and update Google Sheets with the collected data. 🎥 Quick set-up video available. How it works: The template utilizes a series of nodes to fetch email information from LinkedIn profiles. It starts with a Schedule Trigger node that sets the interval for the workflow. The Conditional Check node verifies that fields like Name, Gender, Job Title, Summary, and LinkedIn URL are not empty. The HTTP Request node sends a POST request to the specified URL with the API key and profile information. The Data Merge node merges the data collected, the Field Editing node modifies the fields as needed, and finally the Google Sheets Update node updates the Google Sheets document with the gathered information. Set-up instructions: Make sure to have the necessary credentials and permissions for accessing LinkedIn and Google Sheets. Set up the API key required for the HTTP Request node. Configure the Google Sheets Update node with the appropriate document ID and sheet name. Check and adjust field mappings in the Field Editing node according to your needs. Run the workflow and monitor the updates in your Google Sheets document. Overview: The workflow is designed to find contact information for LinkedIn profile URLs stored in a Google Sheet. It involves various nodes for different operations such as making HTTP requests, scheduling triggers, reading from and updating Google Sheets, field editing, data merging, and conditional checks. A video demonstrating the workflow process can be accessed here. Copy this template to get started: Google Sheets. Using the Prospeo.io LinkedIn Email Finder API with cURL: To use the API endpoint "https://api.prospeo.io/linkedin-email-finder" with cURL, run the following command, replacing "your_api_key" with your actual API key and updating the "url" field with the LinkedIn profile URL for which you want to find the email address:
curl -X POST \
  -H "Content-Type: application/json" \
  -H "X-KEY: your_api_key" \
  -d '{ "url": "https://www.linkedin.com/in/john-doe/" }' \
  "https://api.prospeo.io/linkedin-email-finder"
To get access to this API and obtain your API key, sign up on the Prospeo platform and subscribe to their LinkedIn email finder service. Once you have subscribed, you will receive an API key that you can use to authenticate your requests to the API endpoint. Node descriptions: Schedule Trigger: triggers the workflow based on a defined schedule interval, in this case based on minutes. Google Sheets Read: reads data from a Google Sheets document and sheet based on the provided document ID and sheet name. Conditional Check: checks multiple conditions based on the input data and performs actions accordingly. HTTP Request: sends an HTTP POST request to a specified URL with headers and body parameters. No Operation, do nothing: placeholder node that does not perform any operation. Data Merge: merges data based on the specified mode and combination settings. Field Editing: edits fields by setting specific values for each field based on input data. Google Sheets Update: updates data in a Google Sheets document and sheet based on specified columns and values.
Google Sheets node
HTTP Request node
Merge node
+3

Read XML file and store content in Google Sheets

This workflow shows a low code approach to parsing an XML file and storing its contents in a Google Sheets spreadsheet. To run the workflow: Make sure you are running n8n 0.197 or newer Have n8n authenticated with Google Sheets How it's done: This workflow first downloads an example file using the HTTP Request node and reads this file using the XML node. It then runs the Item Lists node to split out the individual food items from the example file. It then splits up the workflow into a separate branch creating a new spreadsheet file using the Google Sheets node. To read the column names we're using the Object.keys() method inside a Set node. Once the spreadsheet is created (the workflow waits for this using the Merge node), the data is appended to the newly created sheet (again using the Google Sheets node).
HTML node
Split Out node
Default Data Loader node
+5

Customer Insights with Qdrant, Python and Information Extractor

This n8n template is one of a 3-part series exploring use-cases for clustering vector embeddings: Survey Insights Customer Insights Community Insights This template demonstrates the Customer Insights scenario where Trustpilot reviews can be quickly grouped by similarity and an AI agent can generate insights on those groupings. With this workflow, marketers can save days and even weeks of work breaking down their own or competitor reviews and identify frequently mentioned positives and negatives. Sample Output: https://docs.google.com/spreadsheets/d/e/2PACX-1vQ6ipJnXWXgr5wlUJnhioNpeYrxaIpsRYZCwN3C-fFXumkbh9TAsA_JzE0kbv7DcGAVIP7az0L46_2P/pubhtml How it works Trustpilot reviews are scraped for a particular company using the HTTP request node. Reviews are then inserted into a Qdrant collection carefully tagged with the question and Trustpilot metadata. Reviews are fetched and put through a clustering algorithm using the Python Code node. The Qdrant points are returned in clustered groups. Each group is looped to fetch the payloads of the points and feed them to the AI agent to summarise and generate insights for. The resulting insights and raw responses are then saved to the Google Spreadsheet for further analysis by the marketer. Requirements Qdrant Vectorstore for storing embeddings. OpenAI account for embeddings and LLM. Customising the Template Adjust clustering parameters which make sense for your data. Consider expanding date range of reviews for insights over common intervals: 3mth, 6mth and YTD.
HTTP Request node
Google Sheets node

Import CSV from URL to Google Sheets

This workflow automatically imports data from a CSV file located at a specific URL and then updates the Google Sheets document with the provided data. Below is a step-by-step description of what this workflow does: The workflow is started manually using the "When you click 'Execute Workflow'" node. The CSV file is then uploaded from the specified URL "https://opendata.ecdc.europa.eu/covid19/testing/csv/data.csv" using the "Upload CSV" node. The "Import CSV" node accepts the uploaded CSV file and converts it into JSON formatted data. The "Add Unique Field" node generates a unique key by combining the 'country_code' and 'year_week' fields from the JSON data, which will be further used in the Google Sheets document. The 'Keep only DACH in 2023' node filters the data to keep only records where 'country_code' is either 'DE', 'AT', or 'CH' and 'year_week' starts with '2023'. Google's API has limitations on the speed of read and write operations, so only a subset of the data is taken. The filtered data is loaded into the specified Google Sheets document via the 'Load to Spreadsheet' node. The operation is set to 'appendOrUpdate' and the document ID and sheet name are specified. Also, the previously generated 'unique_key' key is set as the key to match the columns.
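
Outside n8n, the filtering and key-building steps described above could look roughly like the pandas sketch below. The column names come from the ECDC dataset mentioned in the description; the separator used for the unique key is an assumption.

```python
# Rough pandas equivalent of the filtering and key-building steps described above,
# shown outside n8n. Column names come from the ECDC dataset; the unique_key
# separator is an assumption.
import pandas as pd

CSV_URL = "https://opendata.ecdc.europa.eu/covid19/testing/csv/data.csv"

df = pd.read_csv(CSV_URL)
df["unique_key"] = df["country_code"] + "_" + df["year_week"]
dach_2023 = df[df["country_code"].isin(["DE", "AT", "CH"]) & df["year_week"].str.startswith("2023")]
print(dach_2023.head())   # this subset is what gets appended/updated in Google Sheets
```
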
OpenAI Chat Model node
Split Out node
+5

Enrich FAQ sections on your website pages at scale with AI

This n8n workflow template lets you easily generate comprehensive FAQ (Frequently Asked Questions) content for multiple services (or any items or pages you need to add the FAQs to). Simply provide the Google Sheets document containing the items to scrape, and the workflow automatically creates detailed, AI-enhanced FAQ documents. How it works The workflow reads data from a Google Sheets document containing information about different services and categories (again, in your case - whatever objects you need). For each service and category, it generates a set of standard questions and answers covering setup, permissions, integrations, use cases, and pricing benefits. An AI model (OpenAI's GPT) is used to enhance or complete some of the answers, making the content more comprehensive and natural-sounding. The workflow formats the Q&A pairs, combining AI-generated content with predefined answers where applicable. It creates a text file (JSON) for each service or category, containing the formatted Q&A pairs. The generated files are saved to specific folders in Google Drive, organized by the type of integration (native, credential-only, non-native) or category. After processing each service or category, it updates the status in the original Google Sheets document to mark it as completed. Ideal for: Marketing teams: Rapidly create comprehensive FAQ documents for multiple products or services. Customer support: Generate consistent and detailed answers for common customer queries. Product managers: Easily maintain up-to-date documentation as products evolve. Content creators: Streamline the process of creating informative content about various offerings. Accounts required Google account (for Google Sheets and Google Drive) OpenAI API account (for AI-enhanced content generation) n8n.io account (for workflow execution) Set up instructions Set up the required credentials for Google Sheets, Google Drive, and OpenAI when you first open the workflow. Prepare your Google Sheets document with the service/category information. Here's an example of Google Sheet. Fill the "Define Sheets" node with your sheets Adjust the folder IDs in the "Prepare Job" node to match your Google Drive structure. Configure the OpenAI model settings in the "OpenAI Chat Model" node if needed. Test the workflow with a small subset of data before running it on your entire dataset. Adjust the questions asked in the "Create your Q&A templates" section After testing, activate your workflow for automated FAQ generation. 🙏 Big, big kudos to Jim Le for his ideas, input and support when building this workflow. Your approach to AI workflows is always super helpful!
Google Sheets node
HTTP Request node
Twilio node
Telegram node
Telegram Trigger node

Add data from a photo to Google Sheets

Automatically adding expense receipts to Google Sheets with Telegram, Mindee API, Twilio, and n8n.
Item Lists node
HTTP Request node
Google Sheets node
Merge node
+3

Scrape Crunchbase recent funding rounds

Get recent funding rounds from Crunchbase in Google Sheets, along with 10+ data points (LinkedIn URL, monthly traffic, company size, etc.). You'll be able to create a custom database, reach out to interesting leads at the right time, and send custom alerts to your tools. This workflow scrapes recent funding rounds from Crunchbase and adds them to Google Sheets. It uses the Piloterr API to get this data with ease. The full guide can be found here: https://lempire.notion.site/Get-recent-fundraising-in-Google-Sheets-dafbbda2635544b4925c4fb04abac8f5?pvs=74
Google Sheets node
HTTP Request node
+3

Zalando Price Patrol: Monitor price evolution with email notification

Monitor Zalando product pricing and get notified if a Zalando product price falls under a limit you have defined. This n8n workflow lets you follow the evolution of the price of products you select. For each product, you define a minimal price. The workflow automatically scrapes the price for you on a daily basis. If the price falls under your minimal price settings, you receive a notification. This workflow is very easy to use. From a simple form, just paste the URL of the Zalando product you want to monitor and fill in the minimal price. Features Monitor Zalando Product price: follow the price evolution of your favorite Zalando products. Email notification: set a minimal price, if the product price falls below this limit, you get notified by email. Visual price evolution: get a graphical overview of the product pricing evolutions. Automated Daily check-up: this workflow automatically checks the price of your selected Zalando products on a daily basis. Set up Copy this workflow to your n8n interface. Create a new Google Spreadsheet, copy this template Setup your workflow with your Google credential, your email, and your copy of the Spreadsheet. Activate the Workflow and start pasting Zalando product URLs. I hope you will enjoy this workflow that is probably one of the simplest ways to monitor the pricing evolution of your favorite Zalando products. Feel free to contact me should you have any questions or suggestions. Created by the n8n.inja ✨ follow on X 📺 follow on YT
Code node
HTTP Request node
HTML node
+5

Monitor G2 competitors reviews [Google Sheets, ScrapingBee, Slack]

This workflow monitors G2 review URLs. When a new review is published, it will trigger a Slack notification and record the review in Google Sheets. To install it, you'll need access to Slack, Google Sheets, and ScrapingBee. Full guide here: https://lempire.notion.site/Scrape-G2-reviews-with-n8n-3f46e280e8f24a68b3797f98d2fba433?pvs=4
HTTP Request node
Code node
Google Drive Trigger node
+3

Recognize invoices / receipts from Google Drive and put them into Google Sheets

This workflow lets you recognize receipts or invoices in a folder (make sure your files are in .pdf, .png, or .jpg format). The workflow can be triggered via the "Test workflow" button, and it also monitors the folder for new files, automatically recognizing them. Video demo: https://youtu.be/mGPt7fqGQD8 1. n8n import glitch: After import, the trigger node "When clicking 'Test workflow'" might be disconnected. You need to connect it via 2 arrows to the "Google Sheets1" and "Google Drive" nodes. The workflow has 2 triggers – via the button, and via the "new file" event – and both of these triggers should be connected to the 2 nodes. Here is how it should look: https://ocr.oakpdf.com/n8n_fix.png 2. Set up the RapidAPI HTTP auth key: Create a new "HTTP header" n8n credential and paste your RapidAPI key from https://rapidapi.com/restyler/api/receipt-and-invoice-ocr-api into it: https://ocr.oakpdf.com/n8n_api_key.png Make sure the "HTTP Request" node uses this credential. 3. Set up your Google auth: You need a Google connection to work with your Google Sheets and Google Drive accounts: https://docs.n8n.io/integrations/builtin/credentials/google/oauth-generic/#finish-your-n8n-credential 4. Set up Google Sheets: Copy this Google Sheets document: https://docs.google.com/spreadsheets/d/1G0w-OMdFRrtvzOLPpfFJpsBVNqJ9cfRLMKCVWfrTQBg/edit?usp=sharing For custom document formats and advanced usage: Email: [email protected] LinkedIn: https://www.linkedin.com/in/anthony-sidashin/
Google Sheets node
HTTP Request node
+3

Move data between JSON and spreadsheets

This workflow illustrates how to convert data from JSON to binary format and import JSON data or files into Google Sheets or local spreadsheets.
Google Sheets node
HTTP Request node
AWS Rekognition node

Collects images from web search results and send to Google Sheets

This workflow collects images from web search results on a specific query, analyzes the images for labels, formats the text, and adds the information to Google Sheets. The HTTP Request node gets images from the web. The AWS Rekognition node analyzes the images (in this case, it detects text in an image). The Set node sets the values necessary for the data set. The Function node transforms the text detected in the image to lower case. The Google Sheets node adds the image information to a sheet that serves as the data set.
Embeddings OpenAI node
Default Data Loader node
Google Sheets node
+5

Survey Insights with Qdrant, Python and Information Extractor

This n8n template is one of a 3-part series exploring use-cases for clustering vector embeddings: Survey Insights, Customer Insights, and Community Insights. This template demonstrates the Survey Insights scenario, where survey participant responses can be quickly grouped by similarity and an AI agent can generate insights on those groupings. With this workflow, researchers can save days and even weeks of work breaking down cohorts of participants and identifying frequently mentioned positives and negatives. Sample output: https://docs.google.com/spreadsheets/d/e/2PACX-1vT6m8XH8JWJTUAfwojc68NAUGC7q0lO7iV738J7aO5fuVjiVzdTRRPkMmT1C4N8TwejaiT0XrmF1Q48/pubhtml# How it works: All survey questions and responses are imported from a Google Sheet. Responses are then inserted into a Qdrant collection, carefully tagged with the question and survey metadata. For each question, all relevant responses are put through a clustering algorithm using the Python Code node. The Qdrant points are returned in clustered groups. Each group is looped over to fetch the payloads of the points and feed them to the AI agent to summarise and generate insights. The resulting insights and raw responses are then saved to the Google Spreadsheet for further analysis by the researcher. Requirements: Survey data and format as shown in the attached Google Sheet. Qdrant vector store for storing embeddings. OpenAI account for embeddings and LLM. Customising the template: Adjust clustering parameters that make sense for your data. Add more clusters for open-ended questions and fewer clusters when responses are multiple choice.
Google Sheets node
HTTP Request node

Automate company research using ProspectLens and Google Sheets

This n8n workflow automates the process of researching companies by gathering relevant data such as traffic volume, foundation details, funding information, founders, and more. The workflow leverages the ProspectLens API, which is particularly useful for researching companies commonly found on Crunchbase and LinkedIn. ProspectLens is an API that provides very detailed company data. All you need to do is supply the company's domain name. You can obtain your ProspectLens API key here: https://apiroad.net/marketplace/apis/prospectlens In n8n, create a new "HTTP Header" credential. Set x-apiroad-key as the "Name" and enter your APIRoad API key as the "Value". Use this credential in the HTTP Request node of the workflow.
Google Drive node
Google Gemini Chat Model node
+5

Visual Regression Testing with Apify and AI Vision Model

This n8n workflow is a proof-of-concept template exploring how we might work with multimodal LLMs and their multi-image analysis capabilities. In this demo, we compare 2 screenshots of a webpage taken at different timestamps and pass both to our multimodal LLM for a visual comparison of differences. Handling multiple binary inputs (ie. images) in an AI request is supported by n8n's basic LLM node. How it works This template is intended to run as 2 parts: first to generate the base screenshots and next to run the visual regression test which captures fresh screenshots. Starting with a list of webpages captured in a Google sheet, base screenshots are captured for each using a external web scraping service called Apify.com (I prefer Apify but feel free to use whichever web scraping service available to you) These base screenshots are uploaded to Google Drive and will be referenced later when we run our testing. Phase 2 of the workflow, we'll use a scheduled trigger to fire sometime in the future which will reuse our web scraping service to generate fresh screenshots of our desired webpages. Next, re-download our base screenshots in parallel and with both old and new captures, we'll pass these to our LLM node. In the LLM node's options, we'll define 2 "user message" inputs with the type of binary (data) for our images. Finally, we'll prompt our LLM with our testing criteria and capture the regressions detected. Note, results will vary depending on which LLM you use. A final report can be generated using the LLM's output and is uploaded to Linear. Requirements Apify.com API key for web screenshotting service Google Drive and Sheets access to store list of webpages and captures Customising this workflow Have your own preferred web screenshotting service? Feel free to swap out Apify with your service of choice. If the web screenshot is too large, it may prove difficult for the LLM to spot differences with precision. Try splitting up captures into smaller images instead.
Zoom node
HTTP Request node
Gmail node
Google Sheets node
+5

Streamline Your Zoom Meetings with Secure, Automated Stripe Payments

Unlock streamlined Zoom Meeting organization and exclusive access management with this n8n workflow. Designed for educators, event organizers, and businesses, this tool automates your event logistics so you can focus on delivering valuable content. Features: Zoom Meetings Creation: instantly generate new Zoom meetings with the n8n built-in form. Collect Payments Using Stripe: effortlessly monetize your events with secure, automatically created Stripe payment pages for each meeting. Exclusive Gated Access: ensure your content remains exclusive by sending Zoom meeting passwords only to verified subscribers who have completed their payment through Stripe. Participant Email Notifications: automate the distribution of Zoom meeting details post-payment, eliminating the need for manual email management and ensuring participants are promptly informed. Instant and Easy Participant Overview: manage and track your event registrations with ease. All related data is stored in a Google Sheets document that you own, and you're notified via email with each new subscription, simplifying participant management. Set-up steps: Connect your Zoom, Stripe, Gmail, and Google Sheets credentials. Create an empty Google Sheet in your Google Drive. Fill in the config node (Sheet URL, email, and currency). Edit the email text. This n8n workflow template is designed to minimize setup time and maximize efficiency, allowing you to focus on delivering value to your subscribers. With just a few clicks, you can automate the entire process of organizing and monetizing your Zoom meetings. Created by the n8ninja.
Google Sheets Trigger node
Remove Duplicates node
HTTP Request node
Google Sheets node

Verifying Email deliverability using google sheets and Effibotics API

This workflow helps marketers verify and update data using the EffiBotics Email Verifier API. Copy and create a list with emails like this one: https://docs.google.com/spreadsheets/d/1rzuojNGTaBvaUEON6cakQRDva3ueGg5kNu9v12aaSP4/edit#gid=0 The trigger checks for any updates in the number of rows present in a sheet and updates the verified emails on Google Sheets. Once you update a new cell, the new data is read and the email is checked for its validity. The results are then updated in real time on the sheet. Happy emailing!
OpenAI Chat Model node
Google Sheets node
HTTP Request node
+5

Enrich Company Data from Google Sheet with OpenAI Agent and ScrapingBee

This workflow demonstrates how to enrich data from a list of companies in a spreadsheet. While this workflow is production-ready if all steps are followed, adding error handling would enhance its robustness. Important notes Check legal regulations**: This workflow involves scraping, so make sure to check the legal regulations around scraping in your country before getting started. Better safe than sorry! Mind those tokens**: OpenAI tokens can add up fast, so keep an eye on usage unless you want a surprising bill that could knock your socks off! 💸 Main Workflow Node 1 - Webhook This node triggers the workflow via a webhook call. You can replace it with any other trigger of your choice, such as form submission, a new row added in Google Sheets, or a manual trigger. Node 2 - Get Rows from Google Sheet This node retrieves the list of companies from your spreadsheet. The columns in this Google Sheet are: Company**: The name of the company Website**: The website URL of the company These two fields are required at this step. Business Area**: The business area deduced by OpenAI from the scraped data Offer**: The offer deduced by OpenAI from the scraped data Value Proposition**: The value proposition deduced by OpenAI from the scraped data Business Model**: The business model deduced by OpenAI from the scraped data ICP**: The Ideal Customer Profile deduced by OpenAI from the scraped data Additional Information**: Information related to the scraped data, including: Information Sufficiency: Description: Indicates if the information was sufficient to provide a full analysis. Options: "Sufficient" or "Insufficient" Insufficient Details: Description: If labeled "Insufficient," specifies what information was missing or needed to complete the analysis. Mismatched Content: Description: Indicates whether the page content aligns with that of a typical company page. Suggested Actions: Description: Provides recommendations if the page content is insufficient or mismatched, such as verifying the URL or searching for alternative sources. Node 3 - Loop Over Items This node ensures that, in subsequent steps, the website in "extra workflow input" corresponds to the row being processed. You can delete this node, but you'll need to ensure that the "query" sent to the scraping workflow corresponds to the website of the specific company being scraped (rather than just the first row). Node 4 - AI Agent This AI agent is configured with a prompt to extract data from the content it receives. The node has three sub-nodes: OpenAI Chat Model: The model used is currently gpt4-o-mini. Call n8n Workflow: This sub-node calls the workflow to use ScrapingBee and retrieves the scraped data. Structured Output Parser: This parser structures the output for clarity and ease of use, and then adds rows to the Google Sheet. Node 5 - Update Company Row in Google Sheet This node updates the specific company's row in Google Sheets with the enriched data. Scraper Agent Workflow Node 1 - Tool Called from Agent This is the trigger for when the AI Agent calls the Scraper. A query is sent with: Company name Website (the URL of the website) Node 2 - Set Company URL This node renames a field, which may seem trivial but is useful for performing transformations on data received from the AI Agent. Node 3 - ScrapingBee: Scrape Company's Website This node scrapes data from the URL provided using ScrapingBee. You can use any scraper of your choice, but ScrapingBee is recommended, as it allows you to configure scraper behavior directly. 
Once configured, copy the provided "curl" command and import it into n8n. Node 4 - HTML to Markdown This node converts the scraped HTML data to Markdown, which is then sent to OpenAI. The Markdown format generally uses fewer tokens than HTML. Improving the Workflow It's always a pleasure to share workflows, but creators sometimes want to keep some magic to themselves ✨. Here are some ways you can enhance this workflow: Handle potential errors Configure the scraper tool to scrape other pages on the website. Although this will cost more tokens, it can be useful (e.g., scraping "Pricing" or "About Us" pages in addition to the homepage). Instead of Google Sheets, connect directly to your CRM to enrich company data. Trigger the workflow from form submissions on your website and send the scraped data about the lead to a Slack or Teams channel.
Hacker News node
Split Out node
Qdrant Vector Store node
+5

Community Insights using Qdrant, Python and Information Extractor

This n8n template is one of a 3-part series exploring use-cases for clustering vector embeddings: Survey Insights, Customer Insights, and Community Insights. This template demonstrates the Community Insights scenario, where HN comments can be quickly grouped by similarity and an AI agent can generate insights on those groupings. With this workflow, researchers or HN users can quickly break down community consensus on a particular topic and identify frequently mentioned positives and negatives. Sample output: https://docs.google.com/spreadsheets/d/e/2PACX-1vQXaQU9XxsxnUIIeqmmf1PuYRuYtwviVXTv6Mz9Vo6_a4ty-XaJHSeZsptjWXS3wGGDG8Z4u16rvE7l/pubhtml How it works: HN comments are imported via the Hacker News API node. Comments are then inserted into a Qdrant collection, carefully tagged with the Hacker News API metadata. Comments are then fetched and put through a clustering algorithm using the Python Code node. The Qdrant points are returned in clustered groups. Each group is looped over to fetch the payloads of the points and feed them to the AI agent to summarise and generate insights. The resulting insights and raw responses are then saved to the Google Spreadsheet for further analysis by the researcher or the HN user. Requirements: Works best with lots of comments! Qdrant vector store for storing embeddings. OpenAI account for embeddings and LLM. Customising the template: Adjust clustering parameters that make sense for your data. Adjust the sentimentality setting if comments are overwhelmingly negative at times.
Google Sheets node
HTTP Request node
+2

Update specific post-call information on Syncro

This workflow takes Dialpad call information after a call is disconnected and pushes it into Syncro as a ticket timer update, matching the start time and end time provided by Dialpad, with a note containing the contact or customer name and number. > This workflow is part of an MSP collection. The original can be found here: https://github.com/bionemesis/n8nsyncro
Google Sheets node
HTTP Request node
AWS Rekognition node

Collect and label images and send to Google Sheets

This workflow collects images from web search on a specific query, detects labels in them, and stores this information in a Google Sheet.
HTTP Request node
OpenAI Chat Model node
+5

🚀 Local Multi-LLM Testing & Performance Tracker

🚀 Local Multi-LLM Testing & Performance Tracker This workflow is perfect for developers, researchers, and data scientists benchmarking multiple LLMs with LM Studio. It dynamically fetches active models, tests prompts, and tracks metrics like word count, readability, and response time, logging results into Google Sheets. Easily adjust temperature 🔥 and top P 🎯 for flexible model testing. Level of Effort: 🟢 Easy – Minimal setup with customizable options. Setup Steps: Install LM Studio and configure models. Update IP to connect to LM Studio. Create a Google Sheet for result tracking. Key Outcomes: Benchmark LLM performance. Automate results in Google Sheets for easy comparison. Version 1.0
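
To give a sense of what is being measured, here is a hedged Python sketch of one benchmark call. LM Studio exposes an OpenAI-compatible local server; the host/port and model id below are assumptions (LM Studio shows the real values in its UI), and the workflow logs similar metrics to Google Sheets.

```python
# Sketch of the benchmarking idea: call a local LM Studio model through its
# OpenAI-compatible endpoint, then record simple metrics. Host/port and model
# name are assumptions - adjust to what LM Studio shows in its UI.
import time
import requests

LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"   # default LM Studio server (adjust IP)

start = time.time()
reply = requests.post(
    LMSTUDIO_URL,
    json={
        "model": "local-model",                               # placeholder model id
        "messages": [{"role": "user", "content": "Explain HTTP in one paragraph."}],
        "temperature": 0.7,
        "top_p": 0.9,
    },
    timeout=120,
).json()["choices"][0]["message"]["content"]

elapsed = time.time() - start
word_count = len(reply.split())
print({"seconds": round(elapsed, 2), "words": word_count})    # these values would go to Google Sheets
```
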
Webhook node
Google Sheets node
HTTP Request node

Add product ideas to Google Sheets via a Slack

Use Case This workflow is a slight variation of a workflow we're using at n8n. In most companies, employees have a lot of great ideas. That was the same for us at n8n. We wanted to make it as easy as possible to allow everyone to add their ideas to some formatted database - it should be somewhere where everyone is all the time and could add a new idea without much extra effort. Since we're using Slack, this seemed to be the perfect place to easily add ideas. In this example, we're adding the ideas to Google Sheets instead of Notion, like we do. What this workflow does This workflow waits for a webhook call within Slack, that gets fired when users use the /idea command on a bot that you will create as part of this template. It then checks the command, adds the idea to Google Sheets and notifies the user about the newly added idea as you can see below: Creating your Slack bot Visit https://api.slack.com/apps, click on New App and choose a name and workspace. Click on OAuth & Permissions and scroll down to Scopes -> Bot token Scopes Add the chat:write scope Head over to Slash Commands and click on Create New Command Use /idea as the command Copy the test URL from the Webhook node into Request URL Add whatever feels best to the description and usage hint Go to Install app and click install Setup Create a Google Sheets document with the columns Name and Creator Add your Google credentials Fill the Set me up node. Create your Slack app (see other sticky) Click Test workflow and use the /idea comment in Slack Activate the workflow and exchange the Request URL with the production URL from the webhook How to adjust it to your needs You can adjust the table in Google Sheets and for example, add different types of ideas or areas that they impact Rename the Slack command as it works best for you How to enhance this workflow At n8n we use this workflow in combination with some others. E.g. we have the following things on top: We additionally have a /bug Slack command that adds a new bug to Linear. Here we're using AI to classify the bugs and move it to the right team. (Bug command workflow and Ai Classifier workflow) We also added other types, like /pain to be less solution-driven To make it easier for everyone to give input, we added a Votes column that allows everyone to vote on ideas/pain points in the list We're also running a workflow once a week that highlights the most popular new ideas and the most active voters
Code node
Google Sheets node
HTTP Request node

Batch verify emails in Google Sheet with Icypeas

This template can be used to verify email addresses with Icypeas. Be sure to have an active account to use this template. How it works: This workflow can be divided into five steps: The workflow initiates with a manual trigger (on clicking 'execute'). It reads your Google Sheets file. It converts your file to an array. It connects to your Icypeas account. It performs an HTTP request to verify the emails. Set-up steps: You will need a formatted Google Sheets file with email addresses. You will need a working Icypeas account to run the workflow and get your API Key, API Secret, and User ID. You will need email addresses to verify.
HTTP Request node
Clearbit node
Split Out node
+3

Enrich website visitors with Leadfeeder & Clearbit and save to Google Sheets

Use case: When trying to maximize your outreach, website visitors are often an overlooked source of qualified new leads. This workflow allows you to track and enrich new website visitors and saves them to a Google Sheet once they meet a pre-defined criterion. What this workflow does: This workflow fires once a day and gets all your leads saved in Leadfeeder. It then takes the leads that meet a pre-defined engagement criterion, e.g. that they visited your site 3 times, and enriches them additionally with Clearbit. From there it filters the leads again by a criterion on the company, e.g. a minimum employee count, and saves matching leads into a Google Sheet document. Setup: Add your Leadfeeder credentials. The name should be Authorization and the value Token token=yourapitoken. You can find your token via Settings -> Personal -> API-Token. Add your Google Sheets credentials. Save the Leadfeeder account names you want to use in the Setup node. Copy the Google Sheets template and add its URL to the Setup node. How to adjust this to your needs: Adjust and/or remove the engagement and company criteria. Add more ways to enrich a company. Potential ideas to enhance the use of this workflow: Automatically reach out to users that meet the criteria / that get added to the sheet. Create a workflow that finds the right employee in companies that are identified by this workflow.
Code node
Google Sheets node
HTTP Request node

Batch process domains/companies in Google Sheet with Icypeas (Bulk Search)

This template can be used to verify email addresses with Icypeas. Be sure to have an active account to use this template. How it works: This workflow can be divided into four steps: The workflow initiates with a manual trigger (on clicking 'execute'). It reads your Google Sheets file. It connects to your Icypeas account. It performs an HTTP request to scan the domains/companies. Set-up steps: You will need a formatted Google Sheets file with company/domain names. You will need a working Icypeas account to run the workflow and get your API Key, API Secret, and User ID. You will need domain/company names to scan.
Code node
HTTP Request node
+3

Import multiple Manufacturers from Google Sheets to Shopware 6

Effortlessly optimize your workflow by automatically importing hundreds of manufacturers from a Google Sheet into your Shopware online store, saving countless hours of manual work. How it works: Retrieve all manufacturers from a Google Sheet. Add each manufacturer via the Shopware sync API endpoint to Shopware. Upload a logo for each manufacturer from a provided public URL to Shopware. Set-up steps: Add your Shopware URL to the first node, called Settings. Create a Google Sheet in your Google account with the following columns (Demo Sheet): name (the name of the manufacturer, which has to be unique and is required); website (URL of the manufacturer website); description; logo_url (public manufacturer logo URL; has to be a PNG, JPG, or SVG file); translation_language_code_1 (optional: language code of your language, for example 'es-ES' for Spanish; you have to make sure a language with this code exists in your Shopware shop); translation_name_1 (optional: manufacturer name translated to the language you defined at translation_language_code_1); translation_description_1 (optional: manufacturer description translated to the language you defined at translation_language_code_1); translation_language_code_2 (optional: same as translation_language_code_1 for another language); translation_name_2 (optional: same as translation_name_1 for another language); translation_description_2 (optional: same as translation_description_1 for another language); translation_language_code_3 (optional: same as translation_language_code_1 for another language); translation_name_3 (optional: same as translation_name_1 for another language); translation_description_3 (optional: same as translation_description_1 for another language). Connect to your Google account. Connect to your Shopware account: create a Shopware Integration and connect to Shopware at the "Import Manufacturer" and "Upload Manufacturer Logo" nodes using Generic OAuth2 API authentication with grant type "Client Credentials". The Access Token URL is https://your-shopware-domain.com/api/oauth/token. Run the workflow.
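
For reference, the "Client Credentials" authentication mentioned in the set-up steps can be sketched in Python as follows. The token URL matches the one given above; the sync payload shape is only indicative, so check the Shopware 6 Sync API documentation before relying on it.

```python
# Sketch of the "Client Credentials" flow the description refers to: exchange a
# Shopware integration's ID/secret for an access token, then call the admin API.
# The sync payload below is only indicative - see the Shopware 6 docs for details.
import requests

SHOP_URL = "https://your-shopware-domain.com"          # same base URL as in the Settings node

token = requests.post(
    f"{SHOP_URL}/api/oauth/token",
    json={
        "grant_type": "client_credentials",
        "client_id": "YOUR_INTEGRATION_ID",
        "client_secret": "YOUR_INTEGRATION_SECRET",
    },
    timeout=30,
).json()["access_token"]

headers = {"Authorization": f"Bearer {token}"}
# e.g. upsert one manufacturer via the sync endpoint (payload shape is an assumption here)
payload = {"write-manufacturer": {"entity": "product_manufacturer", "action": "upsert",
                                  "payload": [{"name": "Acme GmbH", "link": "https://acme.example"}]}}
response = requests.post(f"{SHOP_URL}/api/_action/sync", json=payload, headers=headers, timeout=30)
print(response.status_code)
```
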
Google Sheets node
HTTP Request node
+3

Push Dialpad call information to Syncro

This workflow takes Dialpad call information for an answered call and pushes it into Syncro as either a ticket or an update to an existing ticket. You will need to have a workflow for each technician at this time. It also saves call/ticket information to a Google Sheet to be queried by the dialpad_to_syncro_timer.json workflow. This will match to inbound and outbound calls, so if that's not desired you need to add in an IF to only proceed on either inbound or outbound calls. > This workflow is part of an MSP collection, The original can be found here: https://github.com/bionemesis/n8nsyncro
Limit node
Code node
Google Sheets node
+3

Find out which Chrome extensions are tracked by Linkedin

What this workflow does: LinkedIn tracks which Chrome extensions are installed in your browser. This workflow uses a huge raw JSON of Chrome extension IDs, extracted from LinkedIn pages, and builds a pretty Google Sheet with the list of these extensions. The workflow web-scrapes Google search results for each Chrome extension ID and extracts the first search result. Setup: Clone this Google Sheet template: https://docs.google.com/spreadsheets/d/1nVtoqx-wxRl6ckP9rBHSL3xiCURZ8pbyywvEor0VwOY/edit?gid=0#gid=0 Get an API key for Google SERP API access here: https://rapidapi.com/restyler/api/serp-api1 Create an n8n header auth credential for the Google SERP API. Some context and discussion: https://www.linkedin.com/feed/update/urn:li:activity:7245006911807393792/ Follow the author and get the final Google Sheet with 1300+ Chrome extensions: https://www.linkedin.com/in/anthony-sidashin/
Code node
Google Sheets node
HTTP Request node

Perform email searches with Icypeas (bulk)

This template can be used to search for email addresses with Icypeas. Be sure to have an active account to use this template. How it works: This workflow can be divided into four steps: The workflow initiates with a manual trigger (on clicking 'execute'). It reads your Google Sheets file. It connects to your Icypeas account. It performs an HTTP request to search for the email addresses. Set-up steps: You will need a formatted Google Sheets file with first names, last names, and company/domain names. You will need a working Icypeas account to run the workflow and get your API Key, API Secret, and User ID. You will need a person's first name, last name, and domain/company name to perform the search.
Google Sheets node
HTTP Request node
+3

Sync timer entries from Clockify to Syncro

This workflow will take a timer entry from Clockify and submit it to a matching ticket in Syncro. It saves the time entry ID from Clockify and the time entry ID from Syncro into a Google Sheets. Then, it will check if a match already exists from a previous update and will update the same time entry if the description or time is changed in Clockify. There is a Set node with the name and Syncro IDs of technicians. If you have multiple technicians with the same name, this won't work for you. Likewise, if the name in Clockify doesn't exactly match what you put in the Set, it won't work. You also need to setup a webhook in Clockify set to trigger on "Time entry updated (anyone)" and pointed at your workflow. Configured this way, you can start and stop time entries at will and it won't do anything until you change the description. > This workflow is part of an MSP collection, The original can be found here: https://github.com/bionemesis/n8nsyncro
HTTP Request node
+5

Telegram Payment, Invoicing and Refund Workflow for Stars

This workflow provides a complete solution for handling Telegram Stars payments, invoicing and refunds using n8n. It automates the process of sending invoices, managing pre-checkout approvals, recording transactions, and processing refunds for stars, making it ideal for businesses using Telegram Stars for digital payments. What are Telegram Stars? Learn more here. Key Features Payment Handling**: Automate invoice creation and sending via Telegram, with pre-checkout approval for smooth payment processing. Refund Management: Simplify the refund process using user IDs and payment charge IDs from successful **Telegram Stars transactions. Transaction Recording**: Record all payment details, such as user information and payment charge IDs, in Google Sheets for transparent financial tracking. Who Can Use This Workflow? Developers and Businesses: Looking to implement **Telegram Stars as a payment system within Telegram. Service Providers**: Managing subscriptions, donations, or digital sales through Telegram automation. Use Cases Subscription Sales Automation**: Use the workflow to issue invoices and automatically process payments for recurring subscriptions. Infopreneurs and Marketers: Use the workflow for delivering lead magnets, tripwires, and further automating sales via **Telegram Stars. Course Sales Automation**: Automate invoicing and refunds for educational platforms selling online courses. Developers and Businesses: Looking to implement **Telegram Stars as a payment system within Telegram. Service Providers**: Managing subscriptions, donations, or digital sales through Telegram automation. Online Educational Platforms**: Automate billing for courses and handle refunds easily. Setup Instructions Replace the Telegram API credentials with your bot API token from BotFather. Customize invoice details, including product name, description, and payment amount. Connect your Google Sheets for storing transaction logs. Configure refund steps for easy processing of star refunds when needed. Note: The setup is very simple—just follow the instructions provided on the yellow sticky notes within the workflow and insert your data. All other nodes are pre-configured and require no additional customization. The entire setup process takes just 1 minute. I provided a short Loom record with an explanation. Extensibility This workflow can be further customized to include user profile management, payment analytics, or integration with external services like CRMs or accounting tools. Additional modules can be easily connected to manage advanced features like Telegram User Registration. Available Templates Telegram Bot Starter Template: A foundational setup for creating custom Telegram bots. Telegram User Registration Workflow: Automate user registration and data collection via Telegram. Telegram Payment and Refund Workflow for Stars**: Streamline your Telegram payment processing with stars, invoices, and refunds. Support and Updates This workflow is supported and regularly updated to stay compatible with the latest Telegram features and n8n improvements. If you encounter any issues, technical support is available to ensure smooth integration and setup. Key terms: Telegram Stars payment workflow, Telegram refund automation, n8n payment template, Google Sheets transaction logging, Telegram bot for payments, automated refunds on Telegram, Telegram Stars invoice workflow. Please reach out to Victor if you need further assistance with your n8n workflows and automation!
HTTP Request node
Merge node
Code node
+5

Create LinkedIn Contributions with AI and Notify Users On Slack

This workflow automates the process of gathering LinkedIn advice articles, extracting their content, and generating unique contributions for each article using an AI model. The contributions are then posted to a Slack channel and a NocoDB database for record-keeping. The workflow is triggered weekly to ensure new articles are continuously collected and responded to. Who is this for? This workflow is designed for professionals, marketers, and content creators looking to boost their LinkedIn presence by regularly engaging with LinkedIn advice articles. It’s especially useful for those who want to be seen as a "thought leader" or "top voice" in their niche by contributing relevant and unique advice to trending topics. What problem is this workflow solving? Manually searching for relevant LinkedIn articles, reading through them, and crafting thoughtful contributions can be time-consuming. This workflow solves that by automating the process of finding new articles, extracting key content, and generating AI-powered contributions. It helps users stay consistently active on LinkedIn, contributing value to trending discussions. What this workflow does Triggers Weekly: The workflow is set to run every Monday at 8:00 AM. Search Google for LinkedIn Advice Articles: Uses a predefined Google search URL to find the latest LinkedIn advice articles based on the user's area of expertise. Extract LinkedIn Article Links: A code node extracts all LinkedIn advice article links from the search results. Retrieve Article Content: For each article link, the workflow retrieves the HTML content and extracts the article title, topics, and existing contributions. Generate AI-Powered Contributions: The workflow sends the extracted article content to an AI model, which generates unique, helpful advice for each topic within the article. Post to Slack & NocoDB: The AI-generated contributions, along with the article links, are posted to a designated Slack channel and stored in a NocoDB database for future reference. Setup Google Search URL: Update the Google search URL with the relevant LinkedIn advice query for your field (e.g., "site:linkedin.com/advice 'marketing automation'"). Slack Integration: Connect your Slack account and specify the Slack channel where you want the contributions to be posted. NocoDB Integration: Set up your NocoDB project to store the generated contributions along with the article titles and links. How to customize this workflow Change Search Terms**: Modify the Google search URL to focus on a different LinkedIn topic or expertise area. Adjust Trigger Frequency**: The workflow is set to run weekly, but you can adjust the frequency by changing the schedule trigger. Enhance Contribution Quality**: Customize the AI model's prompt to generate contributions that align with your brand voice or content strategy. Workflow Summary This workflow helps users maintain a consistent presence on LinkedIn by automating the discovery of new advice articles and generating unique contributions using AI. It is ideal for professionals who want to engage with LinkedIn content regularly without spending too much time manually searching and drafting responses.
Code node
HTTP Request node
Split Out node
Anthropic Chat Model node

Monthly Spotify Track Archiving and Playlist Classification

This n8n workflow automatically archives your monthly Spotify liked tracks in a Google Sheet, along with playlist details and descriptions. Based on this data, Claude 3.5 classifies each track into one or more playlists and adds the tracks in bulk.

Who is this template for?

This workflow template is perfect for Spotify users who want to systematically archive their listening history and organize their tracks into custom playlists.

What problem does this workflow solve?

It automates the monthly process of tracking, storing, and categorizing Spotify tracks into relevant playlists, helping users maintain well-organized music collections and keep a historical record of their listening habits.

Workflow Overview

  • Trigger Options: Can be initiated manually or on a set schedule.
  • Spotify Playlists Retrieval: Fetches the current playlists and filters them by owner.
  • Track Details Collection: Retrieves information such as track ID and popularity from the user's library.
  • Audio Features Fetching: Uses Spotify's API to get audio features for each track.
  • Data Merging: Combines track information with the corresponding audio features.
  • Duplicate Checking: Filters out tracks that have already been logged in Google Sheets.
  • Data Logging: Archives new tracks into a Google Sheet.
  • AI Classification: Uses an AI model to classify tracks into suitable playlists.
  • Playlist Updates: Adds classified tracks to the corresponding playlists.

Setup Instructions

  • Credentials Setup: Make sure you have valid Spotify OAuth2 and Google Sheets credentials.
  • Trigger Configuration: Choose between manual or scheduled triggers to start the workflow.
  • Google Sheets Preparation: Set up a Google Sheet with the necessary structure for logging track details.
  • Spotify Playlists Setup: Have a diverse range of playlists with exhaustive descriptions (see the examples below) ready to accommodate different music genres and moods.

Customization Options

  • Adjust Playlist Conditions: Modify the AI model's classification criteria to align with your personal music preferences.
  • Enhance Track Analysis: Incorporate additional audio features or external data sources for more refined track categorization.
  • Personalize Data Logging: Customize which track attributes to log in Google Sheets based on your archival preferences.
  • Configure Scheduling: Set a preferred schedule for periodic track archiving, e.g., monthly or weekly.

Cost Estimate

For 300 tracks, the token usage amounts to approximately 60,000 tokens (58,000 for input and 2,000 for completion), costing around 20 cents with Claude 3.5 Sonnet (as of October 2024).

Playlist Description Examples

| Playlist Name | Playlist Description |
|---|---|
| Classique | Indulge in the timeless beauty of classical music with this refined playlist. From baroque to romantic periods, this collection showcases renowned compositions. |
| Poi | Find your flow with this dynamic playlist tailored for poi, staff, and ball juggling. Featuring rhythmic tracks that complement your movements. |
| Pro Sound | Boost your productivity and focus with this carefully selected mix of concentration-enhancing music. Ideal for work or study sessions. |
| ChillySleep | Drift off to dreamland with this soothing playlist of sleep-inducing tracks. Gentle melodies and ambient sounds create a peaceful atmosphere for restful sleep. |
| To Sing | Warm up your vocal cords and sing your heart out with karaoke-friendly tracks. Featuring popular songs, perfect for solo performances or group sing-alongs. |
| 1990s | Relive the diverse musical landscape of the 90s with this eclectic mix. From grunge to pop, hip-hop to electronic, this playlist showcases defining genres. |
| 1980s | Take a nostalgic trip back to the era of big hair and neon with this 80s playlist. Packed with iconic hits and forgotten gems, capturing the energy of the decade. |
| Groove Up | Elevate your mood and energy with this upbeat playlist. Featuring a mix of feel-good tracks across various genres to lift your spirits and get you moving. |
| Reggae & Dub | Relax and unwind with the laid-back vibes of reggae and dub. This playlist combines classic reggae tunes with deep, spacious dub tracks for a chilled-out vibe. |
| Psytrance | Embark on a mind-bending journey with this collection of psychedelic trance tracks. Ideal for late-night dance sessions or intense focus. |
| Cumbia | Sway to the infectious rhythms of Cumbia with this lively playlist. Blending traditional Latin American sounds with modern interpretations for a danceable mix. |
| Funky Groove | Get your body moving with this collection of funk and disco tracks. Featuring irresistible basslines and catchy rhythms, perfect for dance parties. |
| French Chanson | Experience the romance and charm of France with this mix of classic and modern French songs, capturing the essence of French musical culture. |
| Workout Motivation | Push your limits and power through your exercise routine with this high-energy playlist. From warm-up to cool-down, these tracks will keep you motivated. |
| Cinematic Instrumentals | Immerse yourself in a world of atmospheric sounds with this collection of cinematic instrumental tracks, perfect for focus, relaxation, or contemplation. |
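To illustrate the track-collection and duplicate-checking steps, here is a rough Python sketch of what the Spotify and filtering nodes do conceptually: page through the user's Liked Songs via the Spotify Web API and drop anything whose ID is already logged in the sheet. The access token is a placeholder (you need a user token with the library-read scope), and field names beyond the Spotify response structure are illustrative.

```python
import requests

SPOTIFY_TOKEN = "your-spotify-access-token"  # placeholder OAuth2 token

def fetch_saved_tracks(limit: int = 50) -> list[dict]:
    """Page through the user's Liked Songs and keep the fields the sheet archives."""
    tracks, offset = [], 0
    while True:
        resp = requests.get(
            "https://api.spotify.com/v1/me/tracks",
            headers={"Authorization": f"Bearer {SPOTIFY_TOKEN}"},
            params={"limit": limit, "offset": offset},
        ).json()
        for item in resp.get("items", []):
            track = item["track"]
            tracks.append({
                "id": track["id"],
                "name": track["name"],
                "artist": track["artists"][0]["name"],
                "popularity": track["popularity"],
                "added_at": item["added_at"],
            })
        if resp.get("next") is None:
            break
        offset += limit
    return tracks

def new_tracks_only(tracks: list[dict], logged_ids: set[str]) -> list[dict]:
    """Mirror the duplicate-checking step: keep only tracks not yet in the Google Sheet."""
    return [t for t in tracks if t["id"] not in logged_ids]
```

In the actual template, pagination, merging with audio features, and the Google Sheets lookup are handled by dedicated nodes rather than hand-written code.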

Build your own Google Sheets and HTTP Request integration

Create custom Google Sheets and HTTP Request workflows by choosing triggers and actions. Nodes come with global operations and settings, as well as app-specific parameters that can be configured. You can also use the HTTP Request node to query data from any app or service with a REST API.
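As a rough picture of what such a workflow does under the hood, the sketch below queries an arbitrary REST API and appends the result as a row using the Google Sheets API's values.append endpoint. The endpoint URL, spreadsheet ID, range, and field names are placeholders; in n8n, both calls are configured in the HTTP Request and Google Sheets nodes with stored credentials rather than written as code.

```python
import requests

API_URL = "https://api.example.com/v1/orders/latest"  # placeholder REST endpoint
SPREADSHEET_ID = "your-spreadsheet-id"                # placeholder
SHEETS_TOKEN = "your-google-oauth-token"              # placeholder OAuth2 access token

# 1. What the HTTP Request node does: fetch JSON from any REST API.
order = requests.get(API_URL).json()

# 2. What the Google Sheets "Append Row" operation does: add a row to a sheet.
requests.post(
    f"https://sheets.googleapis.com/v4/spreadsheets/{SPREADSHEET_ID}"
    "/values/Sheet1!A:C:append",
    headers={"Authorization": f"Bearer {SHEETS_TOKEN}"},
    params={"valueInputOption": "USER_ENTERED"},
    json={"values": [[order["id"], order["customer"], order["total"]]]},
)
```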

Google Sheets supported actions

  • Create: Create a spreadsheet
  • Delete: Delete a spreadsheet
  • Append or Update Row: Append a new row or update an existing one (upsert; a rough sketch of this logic follows the list)
  • Append Row: Create a new row in a sheet
  • Clear: Delete all the contents or a part of a sheet
  • Create: Create a new sheet
  • Delete: Permanently delete a sheet
  • Delete Rows or Columns: Delete columns or rows from a sheet
  • Get Row(s): Retrieve one or more rows from a sheet
  • Update Row: Update an existing row in a sheet
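The "Append or Update Row" (upsert) operation is worth spelling out: it looks for a row matching a key column and updates it in place, appending a new row only if no match exists. The node handles this for you; the Python sketch below is only an illustration of that logic against the raw Google Sheets API, with a placeholder spreadsheet ID, token, range, and a hypothetical key in column A.

```python
import requests

SPREADSHEET_ID = "your-spreadsheet-id"   # placeholder
TOKEN = "your-google-oauth-token"        # placeholder OAuth2 access token
BASE = f"https://sheets.googleapis.com/v4/spreadsheets/{SPREADSHEET_ID}/values"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def upsert_row(key: str, row: list[str]) -> None:
    """Update the row whose first column matches `key`, or append it if absent."""
    existing = requests.get(f"{BASE}/Sheet1!A:C", headers=HEADERS).json().get("values", [])
    for i, values in enumerate(existing, start=1):
        if values and values[0] == key:
            # Overwrite the matching row in place.
            requests.put(
                f"{BASE}/Sheet1!A{i}:C{i}",
                headers=HEADERS,
                params={"valueInputOption": "USER_ENTERED"},
                json={"values": [row]},
            )
            return
    # No match found: append the row at the end of the range.
    requests.post(
        f"{BASE}/Sheet1!A:C:append",
        headers=HEADERS,
        params={"valueInputOption": "USER_ENTERED"},
        json={"values": [row]},
    )
```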
Use case

Save engineering resources

Reduce time spent on customer integrations, engineer faster POCs, and keep your customer-specific functionality separate from your product, all without having to code.

Learn more

FAQs

  • Can Google Sheets connect with HTTP Request?

  • Can I use the Google Sheets API with n8n?

  • Can I use the HTTP Request node with any API in n8n?

  • Is n8n secure for integrating Google Sheets and HTTP Request?

  • How do I get started with the Google Sheets and HTTP Request integration in n8n?

Looking to integrate Google Sheets and HTTP Request in your company?

Over 3000 companies switch to n8n every single week

Why use n8n to integrate Google Sheets with HTTP Request

Build complex workflows, really fast

Handle branching, merging and iteration easily.
Pause your workflow to wait for external events.

Code when you need it, UI when you don't

Simple debugging

Your data is displayed alongside your settings, making edge cases easy to track down.

Use templates to get started fast

Use 1000+ workflow templates available from our core team and our community.

Reuse your work

Copy and paste, easily import and export workflows.

Implement complex processes faster with n8n
