Limit node integration

Integrate Limit with 500+ apps and services

Unlock Limit’s full potential with n8n by connecting it to similar Core Nodes apps and over 1,000 other services. Create adaptable and scalable workflows between Limit and your stack, all within a building experience you will love.

Popular ways to use Limit integration

HTTP Request node
Webhook node
Respond to Webhook node
+8

Ultimate Scraper Workflow for n8n

What this template does

The Ultimate Scraper for n8n uses Selenium and AI to retrieve any information displayed on a webpage. You can also use session cookies to log in to the targeted webpage for more advanced scraping needs.

⚠️ Important: This project requires specific setup instructions. Please follow the guidelines provided in the GitHub repository (n8n Ultimate Scraper Setup): https://github.com/Touxan/n8n-ultimate-scraper/tree/main. The workflow version on n8n and the GitHub project may differ; however, the most up-to-date version will always be the one available in the GitHub repository: https://github.com/Touxan/n8n-ultimate-scraper/tree/main.

How to use

Deploy the project with all the requirements and call your webhook. Example request:

curl -X POST http://localhost:5678/webhook-test/yourwebhookid \
  -H "Content-Type: application/json" \
  -d '{
    "subject": "Hugging Face",
    "Url": "github.com",
    "Target data": [
      { "DataName": "Followers", "description": "The number of followers of the GitHub page" },
      { "DataName": "Total Stars", "description": "The total number of stars across the different repos" }
    ],
    "cookie": []
  }'

Or, to simply scrape a URL:

curl -X POST http://localhost:5678/webhook-test/67d77918-2d5b-48c1-ae73-2004b32125f0 \
  -H "Content-Type: application/json" \
  -d '{
    "Target Url": "https://github.com",
    "Target data": [
      { "DataName": "Followers", "description": "The number of followers of the GitHub page" },
      { "DataName": "Total Stars", "description": "The total number of stars across the different repos" }
    ],
    "cookies": []
  }'
pablobarrier
Pablo
Google Sheets node
HTTP Request node
Merge node
+3

Find out which Chrome extensions are tracked by LinkedIn

What this workflow does

LinkedIn tracks which Chrome extensions are installed in your browser. This workflow takes a large raw JSON of Chrome extension IDs, extracted from LinkedIn pages, and builds a clean Google Sheet listing these extensions. For each extension ID, the workflow scrapes Google search results and extracts the first hit (a sketch of this lookup step follows below).

Setup

Clone this Google Sheet template: https://docs.google.com/spreadsheets/d/1nVtoqx-wxRl6ckP9rBHSL3xiCURZ8pbyywvEor0VwOY/edit?gid=0#gid=0
Get an API key for Google SERP API access here: https://rapidapi.com/restyler/api/serp-api1
Create an n8n header auth credential for the Google SERP API.

Some context and discussion: https://www.linkedin.com/feed/update/urn:li:activity:7245006911807393792/

Follow the author and get the final Google Sheet with 1300+ Chrome extensions: https://www.linkedin.com/in/anthony-sidashin/
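As a rough illustration of the per-extension lookup step done outside n8n, here is a minimal Python sketch. It assumes a generic SERP API that accepts a query and returns JSON with a "results" list; the endpoint, header names, and response fields are placeholders, so check the RapidAPI documentation for the exact contract.

# Sketch of the "search Google for a Chrome extension ID" step.
# Endpoint, headers, and response fields are assumptions, not the exact API.
import requests

SERP_ENDPOINT = "https://serp-api1.p.rapidapi.com/search"  # placeholder endpoint
API_KEY = "YOUR_RAPIDAPI_KEY"

def first_result_for_extension(extension_id: str) -> dict:
    resp = requests.get(
        SERP_ENDPOINT,
        params={"q": f"chrome extension {extension_id}"},
        headers={"X-RapidAPI-Key": API_KEY},
        timeout=30,
    )
    resp.raise_for_status()
    results = resp.json().get("results", [])
    # Keep only the first organic hit, mirroring the workflow's "first search result" step.
    return results[0] if results else {}

print(first_result_for_extension("gighmmpiobklfepjocnamgkkbiglidom"))  # example extension ID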
scrapeninja
Anthony
Aggregate node
+4

Chat with OpenAI Assistant (by adding a memory)

OpenAI Assistant is a powerful tool, but at the time of writing it doesn't automatically remember past messages from a conversation. This workflow demonstrates how to get around this, by managing the chat history in n8n and passing it to the assistant when required. This makes it possible to use OpenAI Assistant for chatbot use cases. Note that to use this template, you need to be on n8n version 1.28.0 or later.
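The general pattern the template implements in n8n is to keep the conversation history yourself and resend it on every call, so the model "remembers" earlier turns. The Python sketch below shows that idea with the plain Chat Completions API rather than the Assistant node; it is an illustration of the pattern, not the template's internals, and the model name is an assumption.

# Illustrative sketch: manage chat history yourself and pass it on each request.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
history = [{"role": "system", "content": "You are a helpful assistant."}]

def chat(user_message: str) -> str:
    history.append({"role": "user", "content": user_message})
    reply = client.chat.completions.create(model="gpt-4o-mini", messages=history)
    answer = reply.choices[0].message.content
    # Store the assistant's answer so the next call includes the full context.
    history.append({"role": "assistant", "content": answer})
    return answer

print(chat("My name is Ada."))
print(chat("What is my name?"))  # answered correctly because the history was resent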
davidn8n
David Roberts
HTTP Request node
Notion node
+8

Automate Competitor Research with Exa.ai, Notion and AI Agents

This n8n workflow demonstrates a simple multi-agent setup for competitor research. It showcases how the HTTP Request tool can reduce the number of nodes needed to build a workflow like this.

How it works

The user defines a source company, which is sent to Exa.ai to find competitors (a sketch of this search step is shown below). Each competitor is then funnelled through 3 AI agents that go out onto the internet and retrieve specific datapoints about the competitor: company overview, product offering and customer reviews. Once the agents are finished, the results are compiled into a report which is inserted into a Notion database. Check out an example output here: https://jimleuk.notion.site/2d1c3c726e8e42f3aecec6338fd24333?v=de020fa196f34cdeb676daaeae44e110&pvs=4

Requirements

An OpenAI account for the LLM. An Exa.ai account for access to their AI search engine. A SerpAPI account for Google search. A Firecrawl.dev account for web scraping. A Notion.com account for the database that stores the final reports.

Customising the workflow

Add additional agents to gather more datapoints such as SEO keywords and metrics. Not using Notion? Feel free to swap it out for your own database.
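For orientation, here is a rough Python sketch of the "find competitors" search step called directly rather than via the n8n HTTP Request tool. The endpoint and field names reflect Exa's public search API as I understand it; verify them against the Exa documentation before relying on them, and note that the source company used here is hypothetical.

# Sketch of querying Exa.ai for competitors of a source company.
import requests

EXA_API_KEY = "YOUR_EXA_API_KEY"
source_company = "Acme Analytics"  # hypothetical source company

resp = requests.post(
    "https://api.exa.ai/search",
    headers={"x-api-key": EXA_API_KEY, "Content-Type": "application/json"},
    json={"query": f"competitors of {source_company}", "numResults": 5},
    timeout=30,
)
resp.raise_for_status()
for result in resp.json().get("results", []):
    # Each result would then be handed to the research agents in the workflow.
    print(result.get("title"), "-", result.get("url"))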
jimleuk
Jimleuk
HTTP Request node
Gmail node
Markdown node
+5

Compose reply draft in Gmail with OpenAI Assistant

This workflow uses OpenAI Assistant to compose draft replies for labeled email messages and automatically attaches the drafts to their Gmail threads. 💡 You can add a knowledge base to your OpenAI Assistant to make your reply drafts highly customized (e.g. compose responses with product information when answering a customer inquiry). 🎬 See this workflow in action in my YouTube video about automating Gmail.

How it works?

The workflow is triggered at regular intervals (default: every 1 minute – you can change this value) to check for messages with a specific label (e.g. "AI"). The content of the retrieved email message is then forwarded to the OpenAI Assistant node, and a reply draft is generated. Next, the response from the Assistant is converted to HTML, and a raw message in the RFC standard is composed (a sketch of this step follows below). 💡 You can learn more about composing drafts with the Gmail API in the official Google documentation. The raw email message (reply draft) is encoded and attached to the original thread ID. Finally, the trigger label (in this case: "AI") is removed to prevent the workflow from looping.

Set up steps

Set credentials for Gmail and OpenAI. Add a new label in your Gmail account for messages that should be handled by the workflow (e.g. name it "AI"). Select this label in the first and last Gmail nodes in the workflow. Create and configure your OpenAI Assistant, then select it in the "OpenAI Assistant" node. Optionally, change the trigger interval (1 minute by default). If you like this workflow, please subscribe to my YouTube channel and/or my newsletter.
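As a hedged sketch of the "compose a raw draft on a thread" step, here is the equivalent done with the Gmail API in Python instead of inside n8n nodes. It assumes you already have an authorized googleapiclient service with a Gmail compose scope; the function, variable names, and reply text are placeholders for illustration.

# Sketch: build a MIME reply, base64url-encode it, and attach the draft to a thread.
import base64
from email.message import EmailMessage

def create_reply_draft(service, thread_id, to_addr, subject, html_body, in_reply_to):
    msg = EmailMessage()
    msg["To"] = to_addr
    msg["Subject"] = subject if subject.lower().startswith("re:") else f"Re: {subject}"
    # These headers keep the reply threaded with the original conversation.
    msg["In-Reply-To"] = in_reply_to
    msg["References"] = in_reply_to
    msg.set_content("This reply is best viewed as HTML.")  # plain-text fallback
    msg.add_alternative(html_body, subtype="html")

    raw = base64.urlsafe_b64encode(msg.as_bytes()).decode()
    draft = service.users().drafts().create(
        userId="me",
        body={"message": {"raw": raw, "threadId": thread_id}},
    ).execute()
    return draft["id"]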
workfloows
Oskar
HTTP Request node
+4

Convert URL HTML to Markdown Format and Get Page Links

Use Case

Transform web pages into AI-friendly markdown format. Use this workflow when you need to process webpage content for LLM analysis, want to extract both content and links from web pages, need clean, formatted text without HTML markup, or want to respect API rate limits while crawling pages.

What this Workflow Does

The workflow uses the Firecrawl.dev API to process webpages: it converts HTML content to markdown format, extracts all links from each webpage, handles API rate limiting automatically, and processes URLs in batches from your database. A sketch of the underlying API call follows below.

Setup

Create a Firecrawl.dev account and get your API key. Add your Firecrawl API key to the HTTP Request node's Authorization header. Connect your URL database to the input node (the column name must be "Page"), or edit the array in "Example fields from data source". Configure your preferred output database connection.

How to Adjust it to Your Needs

Modify the input source to pull URLs from different databases, adjust the rate limiting parameters if needed, and customize the output format for your specific use case.

Made by Simon @ automake.io
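Here is a minimal Python sketch of the single-page conversion the workflow performs with its HTTP Request node. The endpoint and body fields follow Firecrawl's scrape API as I understand it; double-check the Firecrawl documentation, since exact field names may differ, and the example URLs and sleep interval are assumptions.

# Sketch: convert a page to markdown and collect its links via Firecrawl.
import time
import requests

FIRECRAWL_API_KEY = "YOUR_FIRECRAWL_API_KEY"

def page_to_markdown(url: str) -> dict:
    resp = requests.post(
        "https://api.firecrawl.dev/v1/scrape",
        headers={"Authorization": f"Bearer {FIRECRAWL_API_KEY}"},
        json={"url": url, "formats": ["markdown", "links"]},
        timeout=60,
    )
    resp.raise_for_status()
    data = resp.json().get("data", {})
    return {"markdown": data.get("markdown", ""), "links": data.get("links", [])}

for page in ["https://example.com", "https://example.org"]:
    result = page_to_markdown(page)
    print(page, "->", len(result["links"]), "links")
    time.sleep(6)  # crude rate limiting between requests, mirroring the workflow's delay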
simonscrapes
simonscrapes

Over 3000 companies switch to n8n every single week

Connect Limit with your company’s tech stack and create automation workflows

Last week I automated much of the back office work for a small design studio in less than 8hrs and I am still mind-blown about it.

n8n is a game-changer and should be known by all SMBs and even enterprise companies.

in other news I installed @n8n_io tonight and holy moly it’s good

it’s compatible with EVERYTHING

We're using the @n8n_io cloud for our internal automation tasks since the beta started. It's awesome! Also, support is super fast and always helpful. 🤗

Implement complex processes faster with n8n
