Cursor is an Integrated Development Environment (IDE) that speeds up coding with AI-driven code generation, intelligent auto-completion, seamless code refactoring, and natural language search across your codebase, all within a familiar VS Code-like editor.
Before moving forward, make sure you have completed the following steps:
Download Cursor from the official website, install it on your device, and sign in to your account.
Get user credentials for Oxylabs Web Scraper API or Web Unblocker.
Both products come with a free trial, no credit card needed: Web Scraper API includes 5K results, and Web Unblocker includes 1 GB of traffic.
You can add the Oxylabs MCP server to Cursor in one of three ways:
Via uvx: the most stable option, enabling direct API calls.
Via Smithery CLI: uses the Smithery CLI to call the MCP server hosted on Smithery.
Via a Local/Dev setup: host the MCP server locally in your own environment.
Whichever option you pick, the integration exposes the following tools:
oxylabs_universal_scraper: Scrapes any public website URL you provide.
oxylabs_google_search_scraper: Scrapes Google search results by search term.
oxylabs_amazon_search_scraper: Scrapes Amazon search results by search term.
oxylabs_amazon_product_scraper: Scrapes Amazon product details by ASIN.
oxylabs_web_unblocker: Uses an AI-driven proxy solution and returns HTML.
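In Agent mode, you typically don't call these tools by name; Cursor's agent picks the appropriate tool based on your prompt. For example, a request like the one below would normally be routed to oxylabs_amazon_search_scraper:
Search Amazon for "wireless earbuds" and summarize the top five results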
uvx is part of the uv package. To install uv, follow the official installation guide and pick the method that works best for you; on macOS, for instance, you can use Homebrew:
brew install uv
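If you're not on macOS or prefer not to use Homebrew, the uv documentation also provides a standalone installer script and a pip package. The commands below reflect those docs at the time of writing, so double-check them against the official guide:
curl -LsSf https://astral.sh/uv/install.sh | sh
pip install uv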
Next, check whether uvx installed successfully:
uvx -V
In your Cursor app, head to Settings → MCP → Add new global MCP server.
This will open the mcp.json file where you can paste the following Oxylabs MCP configuration:
{
  "mcpServers": {
    "oxylabs_scraper_uvx": {
      "command": "uvx",
      "args": ["oxylabs-mcp"],
      "env": {
        "OXYLABS_USERNAME": "YOUR_USERNAME",
        "OXYLABS_PASSWORD": "YOUR_PASSWORD"
      }
    }
  }
}
Replace YOUR_USERNAME and YOUR_PASSWORD with your Web Scraper API or Web Unblocker user credentials.
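Optionally, you can also confirm from any terminal that uvx is able to fetch and launch the server. This is just a quick sanity check using the same command and environment variables as the configuration above; the server communicates over stdio, so it should simply wait for input until you stop it with Ctrl+C:
OXYLABS_USERNAME="YOUR_USERNAME" OXYLABS_PASSWORD="YOUR_PASSWORD" uvx oxylabs-mcp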
Open the chat box, select the Agent mode, and run a test prompt, for example:
Search Google for "best AI agent frameworks" and summarize results
For the Smithery CLI method, first install Node.js version 16 or higher to ensure compatibility.
After that, verify that npx is available on your system by running a simple version check:
npx -v
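If npx isn't found or the version check fails, verify the Node.js installation itself:
node -v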
Open the terminal in Cursor and automatically install the Oxylabs MCP server using the Smithery CLI:
npx -y @smithery/cli install @oxylabs/oxylabs-mcp --client cursor
Then, enter your Web Scraper API or Web Unblocker credentials as prompted.
Go to Settings → MCP, where you should now see the oxylabs-mcp card. Make sure there are no errors and that a green dot appears next to the card's name.
Open the chat window, select the Agent mode, and submit your prompt, for example:
Search Google for "best AI agent frameworks" and summarize results
After a successful integration, you can start using Oxylabs solutions in Cursor for more advanced use cases, such as quickly creating an LLM training set:
Scrape https://en.wikipedia.org/wiki/Miss_Meyers without additional parameters and create a production-quality JSON training set with detailed metadata following LLM training best practices. Don't summarize the content, use full text from the MD file.
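The exact structure depends on your prompt and the model, but a single record in the resulting training set might look roughly like the following sketch (field names and values here are purely illustrative, not actual tool output):
{
  "id": "miss_meyers_001",
  "source_url": "https://en.wikipedia.org/wiki/Miss_Meyers",
  "title": "Miss Meyers",
  "language": "en",
  "text": "<full article text from the scraped Markdown>",
  "metadata": {
    "scraped_with": "oxylabs_universal_scraper",
    "retrieved_at": "2025-01-01T00:00:00Z",
    "license": "CC BY-SA 4.0"
  }
}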
Visit the Oxylabs MCP GitHub repository for comprehensive details about our MCP server, including setup instructions for Claude Desktop and Local/Dev configuration. Eager to expand your AI toolkit? Check out the specialized tutorials listed below.
Our team is always ready to assist with any questions about our solutions – simply connect with us through email or live chat.
Please be aware that this is a third-party tool not owned or controlled by Oxylabs. Each third-party provider is responsible for its own software and services. Consequently, Oxylabs will have no liability or responsibility to you regarding those services. Please carefully review the third party's policies and practices and/or conduct due diligence before accessing or using third-party services.
How to Navigate AI, Legal, and Web Scraping: Asking a Professional
In this interview, we sit down with a legal professional to shed light on the ever-changing legal framework surrounding web scraping.
Acquiring High-Quality Web Data for LLM Fine-Tuning
Discover data categories, large-scale scraping strategies, and cost optimization tips for fine-tuning your AI models.
Building Web Scraping Architecture for AI Companies
The process of building a workflow that allows you to collect large-scale web data for AI training.