If you’re new to Make, create a free account to access 1000 free operations each month. Once logged in, select Scenarios in the left-side menu and then click Create a new scenario.
Then, choose Build from scratch to continue.
Make sure to claim a free trial for Oxylabs Web Scraper API by registering on the Oxylabs dashboard.
Once you have your account, click the plus sign to add a new module in Make, search for Oxylabs, and select the Scrape Amazon Search module.
Next, create a connection by entering your Oxylabs Web Scraper API username and password:
After saving, you’ll see a window where you can set API parameters. For this tutorial, let’s set the Query to lenovo laptop, Geo location to Seattle’s downtown ZIP code 98104, Parse to Yes, and Pages to 2:
These settings tell the scraper to search Amazon for “lenovo laptop”, localize results for Seattle, collect data from 2 pages, and parse everything into a structured JSON format. Save this configuration and run this single module to see how it works. The output should contain two scraped pages; with that confirmed, let’s move on to the next step.
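For reference, the same request can be made to Oxylabs Web Scraper API outside of Make. The sketch below mirrors the module settings above; the payload shape follows the documented `amazon_search` source, but treat the exact field names as something to verify against the current Oxylabs docs, and note the credentials are placeholders.

```python
import base64
import json
import urllib.request

# Payload mirroring the Make module settings for this tutorial
payload = {
    "source": "amazon_search",
    "query": "lenovo laptop",
    "geo_location": "98104",  # Seattle's downtown ZIP code
    "parse": True,            # return structured JSON instead of raw HTML
    "pages": 2,
}

def scrape(username: str, password: str) -> dict:
    """POST the payload to the realtime API endpoint with Basic auth."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    request = urllib.request.Request(
        "https://realtime.oxylabs.io/v1/queries",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Basic {token}",
        },
    )
    with urllib.request.urlopen(request, timeout=180) as response:
        return json.loads(response.read())
```

Calling `scrape("YOUR_USERNAME", "YOUR_PASSWORD")` would return the same two-page JSON output the Make module produces.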
Click the plus sign on the right side of the Oxylabs module to connect a new module. Search for Set variable and select it:
Set the Variable name to flattened and paste this function as the Variable value:
{{flatten(map(map(map(1.results; "content"); "results"); "organic"))}}
This will pick all the scraped organic search results from all pages and gather everything into a single array of items. Run the scenario to see the results.
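If the nested `map`/`flatten` expression looks opaque, here is the same transformation in plain Python, run against a hypothetical two-page response with the nesting the parsed results use (`results` → `content` → `results` → `organic`):

```python
# Hypothetical miniature of the Web Scraper API output: one entry per
# scraped page, with parsed organic results nested inside each page.
api_output = {
    "results": [
        {"content": {"results": {"organic": [{"asin": "A1"}, {"asin": "A2"}]}}},
        {"content": {"results": {"organic": [{"asin": "B1"}]}}},
    ]
}

# Equivalent of {{flatten(map(map(map(1.results; "content"); "results"); "organic"))}}:
# drill into each page and merge all organic results into one flat list.
flattened = [
    item
    for page in api_output["results"]
    for item in page["content"]["results"]["organic"]
]
print(len(flattened))  # → 3 organic results gathered from 2 pages
```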
While you can use the flattened results directly to save the entire array in bulk to Google Sheets, this approach will not let you map specific data points to columns. For this reason, let’s use Iterator and Array aggregator to iterate through the array and prepare results for Google Sheets.
Connect the Iterator module to the Set variable tool. Paste {{2.flattened}} as the value or select flattened[] from the items menu:
Note: Module IDs auto-increment even after deletions. For example, if you add modules 1-3 and delete module 2, the next module will be ID 4, not 2. Always verify the correct ID when using functions.
Next, add another module and search for Array aggregator. For now, select the Iterator [3] module as the Source Module and save the settings. We’ll come back to this module later.
Connect another module by searching for Bulk Add Rows. This one allows you to add multiple rows of data to your Google sheet at once.
Follow the steps to Create a connection for Google Sheets.
After that, manually create a spreadsheet in your Google Drive so you can use it in the scenario. You could also set up the scenario to create it for you, but that’s beyond the scope of this guide.
Spreadsheet title: Amazon Search Results
Sheet name: Organic
12 column names: ASIN, TITLE, PRICE, PRICE STRIKETHROUGH, RATING, REVIEWS COUNT, SALES VOLUME, IS PRIME, IS AMAZONS CHOICE, BEST SELLER, SHIPPING, and URL.
Your spreadsheet should look like this:
Getting back to Make, use the ID finder to search for the spreadsheet’s title in the Google Sheets module:
Next, make sure to disable the map setting so you can select the sheet’s name (Organic) from the dropdown list. Set the Column range to A-Z and, in the Rows field, paste {{4.array}} or simply select Array[] from the items menu. Save these settings.
As a final step, modify the Array aggregator. First, in the Target structure type field, select Rows from the Google Sheets - Bulk Add Rows (Advanced) module. Then, create 12 column fields under the Rows section and map specific values from the Iterator in the same order as you have in your sheet:
You can clean up the URL value (12th column) by prepending https://www.amazon.com and removing everything from the /ref= part onward. Use this function:
https://www.amazon.com{{first(split(3.url; "/ref="))}}
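In Python terms, this expression does the following (the relative URL below is a hypothetical example of what a scraped organic result might contain):

```python
def clean_url(raw_url: str) -> str:
    # Equivalent of: https://www.amazon.com{{first(split(3.url; "/ref="))}}
    # Split on "/ref=", keep the first part, then prepend the domain.
    return "https://www.amazon.com" + raw_url.split("/ref=")[0]

print(clean_url("/Lenovo-IdeaPad/dp/B0EXAMPLE/ref=sr_1_1?keywords=lenovo"))
# → https://www.amazon.com/Lenovo-IdeaPad/dp/B0EXAMPLE
```

URLs without a /ref= segment pass through unchanged, since `split` then returns the whole string as its first element.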
Let’s execute the scenario to scrape 2 pages of Amazon search results and save the parsed data to Google Sheets.
AI agents in Make are a robust feature for building LLM-based pipelines, allowing you to connect any application to AI. You can provide pre-defined system prompts and additional context, and let AI agents use other scenarios as tools.
In this section, you’ll see the basics of creating an AI agent that analyzes scraped Amazon data to provide a comprehensive report.
You may also connect OpenAI, Anthropic, Gemini, and other LLM modules directly in your scenarios instead of using the AI Agents feature.
At the moment, Make provides a 30-day free trial to test this feature, but it’s fully available with the Core plan. Click the AI Agents button in the left-side menu to access the page.
After successful activation, click the Create agent button. Here, you can give the agent a name, instructions, and choose the LLM provider. Let’s use Google Gemini’s free API (generate an API key here) or connect to your existing LLM provider.
You can paste this system prompt:
# Role and Objective
You are an expert e-commerce product assistant. Your goal is to analyze scraped product data to identify the best deals based on product specifics, price, availability, and overall value.
# Instructions & Rules
- Provide a comprehensive report of your main findings, highlighting the best product options.
Once saved, you can access this agent directly in the scenario by adding a module, searching for “agent”, and selecting the Make AI Agents - Run an agent module.
For demonstration purposes, let’s reuse the same Make scenario. First, change the Pages value to 10 in the Oxylabs module to gather more data. Next, add a Router between the Set variable tool and the Iterator.
This will allow you to branch out the scenario rather than having everything in a single line. Then, click the router and add your agent. To send the scraped data to it, in the Messages field, select the flattened[] item or paste this: {{2.flattened}}.
Finally, connect the Google Sheets - Update a cell module to the AI agent. Make sure to create another sheet in the same spreadsheet and give it a name, for example, “Analysis”. Then, select it in the Google Sheets module, specify the cell where the AI’s response should go, and set {{7.response}} as the value.
Once executed, the scenario will save scraped Amazon data from 10 pages, and the AI agent will summarize all the data in a new sheet:
With Make and Oxylabs, you can overcome web blocks and feed real-time web data directly to your AI agents and hundreds of other apps. Beyond Amazon scraping, you could monitor competitor prices, track real estate listings, aggregate news content, or build automated lead generation systems. Feel free to contact us via live chat or email if you have any questions about Oxylabs products.
Please be aware that this is a third-party tool not owned or controlled by Oxylabs. Each third-party provider is responsible for its own software and services. Consequently, Oxylabs will have no liability or responsibility to you regarding those services. Please carefully review the third party's policies and practices and/or conduct due diligence before accessing or using third-party services.