OpenAI Agents SDK Integration With Oxylabs Web Scraper API


Vytenis Kaubrė
Last updated on
2025-03-14
3 min read
OpenAI just made it easier to orchestrate AI agent workflows with the release of its production-ready Agents SDK. Integrating Oxylabs Web Scraper API significantly cuts web search costs and ensures real-time access to web data for a ceiling price of $1.35/1k results. Follow this guide to see how you can leverage both tools for building smart AI agents.
The Agents SDK is a powerful open-source developer toolkit for building goal-driven, autonomous agents powered by OpenAI’s LLMs. Replacing the experimental Swarm SDK, it offers a streamlined way to create agents that can use tools, maintain memory, and make multi-step decisions to accomplish complex tasks.
Under the hood, the SDK utilizes the Responses API and Chat Completions API and features tools for custom function calls, web search, file search, and computer use, on top of other capabilities. Developers can also monitor and debug agent behavior in real time and continuously improve performance using the dashboard's evaluation tool.
Let’s create a basic agent that scrapes any website using Web Scraper API and implements Agents SDK to answer a question using the data. You can integrate Web Scraper API in two ways:
By using the Oxylabs MCP server (recommended)
By using the @function_tool decorator with direct calls to the API
This guide will utilize the Oxylabs MCP server, which provides a standardized framework for AI tools to access fresh web data. You can learn more about Model Context Protocol (MCP) in our other post.
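For reference, the second approach can look something like the sketch below. It wraps a direct call to Web Scraper API's documented real-time endpoint in a plain Python function; in an actual agent you would register it with the @function_tool decorator from the agents package. The function name and payload fields are illustrative, so adjust them to your use case.

```python
# Hypothetical sketch of the @function_tool approach (not used in this guide).
# Decorate scrape_url() with @function_tool from the agents package and pass
# it via tools=[...] when creating the Agent.
import base64
import json
import os
import urllib.request


def scrape_url(url: str) -> str:
    """Scrape a page via Oxylabs Web Scraper API and return its content."""
    credentials = f"{os.getenv('OXYLABS_USERNAME')}:{os.getenv('OXYLABS_PASSWORD')}"
    token = base64.b64encode(credentials.encode()).decode()
    payload = json.dumps({'source': 'universal', 'url': url}).encode()
    request = urllib.request.Request(
        'https://realtime.oxylabs.io/v1/queries',
        data=payload,
        headers={
            'Content-Type': 'application/json',
            'Authorization': f'Basic {token}'
        }
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())['results'][0]['content']
```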
Claim your 1-week free trial to test Web Scraper API for your project needs.
Before moving on, make sure you have Web Scraper API credentials and OpenAI API key ready. To use the Oxylabs MCP server, install the uv package by following this official installation guide. There are plenty of ways to install it; for instance, on macOS devices, you may use Homebrew:
brew install uv
Next, check whether uv installed successfully:
uv -V
After that, create a new Python project and install Agents SDK:
pip install openai-agents
To keep your credentials secure, create a .env file in your project’s directory and store your Web Scraper API user credentials and OpenAI API key as follows:
OXYLABS_USERNAME=your_API_username
OXYLABS_PASSWORD=your_API_password
OPENAI_API_KEY=your-openai-key
You may also save the environment variables using a terminal or by simply specifying the authentication credentials directly in code if you prefer to do so.
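If you prefer the terminal route, exporting the variables in your shell session can look like this (the values shown are placeholders for your own credentials):

```shell
# Export credentials for the current shell session only;
# replace the placeholder values with your own keys.
export OXYLABS_USERNAME=your_API_username
export OXYLABS_PASSWORD=your_API_password
export OPENAI_API_KEY=your-openai-key
```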
Inside a new Python script file, import the necessary libraries as follows and load the .env file:
import asyncio
import os
from dotenv import load_dotenv
from agents import Agent, Runner
from agents.mcp import MCPServerStdio
load_dotenv()
Create an asynchronous function and initialize an MCP server connection using an async context manager. The timeout parameter overrides the default 5-second timeout, ensuring the agent waits an appropriate time for the server’s response.
async def main():
    async with MCPServerStdio(
        params={
            'command': 'uvx',
            'args': ['oxylabs-mcp'],
            'env': {
                'OXYLABS_USERNAME': os.getenv('OXYLABS_USERNAME'),
                'OXYLABS_PASSWORD': os.getenv('OXYLABS_PASSWORD')
            }
        },
        client_session_timeout_seconds=20
    ) as oxylabs_mcp_server:
Next, create an AI agent that utilizes the oxylabs_mcp_server connection. You may define as many specialized AI agents as your application requires, each configured with its own instructions and tools.
        agent = Agent(
            name='Web Search Assistant',
            instructions='Use the MCP server tools to answer the question.',
            mcp_servers=[oxylabs_mcp_server]
        )
Finally, let’s make a very simple interactive command-line interface for communicating with the agent.
        while True:
            question = input('\033[1;34mQuestion ->\033[0m ')
            result = await Runner.run(agent, input=question)
            print(
                '\033[1;33mAnswer -> ' +
                f'\n\033[1;32m{result.final_output}\033[0m'
            )

if __name__ == '__main__':
    asyncio.run(main())
By now, you should have compiled the following script that’s ready for execution:
import asyncio
import os
from dotenv import load_dotenv
from agents import Agent, Runner
from agents.mcp import MCPServerStdio

load_dotenv()


async def main():
    async with MCPServerStdio(
        params={
            'command': 'uvx',
            'args': ['oxylabs-mcp'],
            'env': {
                'OXYLABS_USERNAME': os.getenv('OXYLABS_USERNAME'),
                'OXYLABS_PASSWORD': os.getenv('OXYLABS_PASSWORD')
            }
        },
        client_session_timeout_seconds=20
    ) as oxylabs_mcp_server:
        agent = Agent(
            name='Web Search Assistant',
            instructions='Use the MCP server tools to answer the question.',
            mcp_servers=[oxylabs_mcp_server]
        )
        while True:
            question = input('\033[1;34mQuestion ->\033[0m ')
            result = await Runner.run(agent, input=question)
            print(
                '\033[1;33mAnswer -> ' +
                f'\n\033[1;32m{result.final_output}\033[0m'
            )

if __name__ == '__main__':
    asyncio.run(main())
Run the script and type a question: the agent processes it, calls the MCP server to scrape the relevant data, and delivers an answer.
With Agents SDK and Web Scraper API, you've got a powerful combination that unlocks endless possibilities for AI agent development. This integration provides cost-effective web access, enabling you to build specialized agents that can search, analyze, and transform real-time web data into insights. You can try the API at no cost by activating your free trial through the dashboard.
If you have any questions, don’t hesitate to contact us via live chat or email.
About the author
Vytenis Kaubrė
Technical Copywriter
Vytenis Kaubrė is a Technical Copywriter at Oxylabs. His love for creative writing and a growing interest in technology fuels his daily work, where he crafts technical content and web scrapers with Oxylabs’ solutions. Off duty, you might catch him working on personal projects, coding with Python, or jamming on his electric guitar.
All information on Oxylabs Blog is provided on an "as is" basis and for informational purposes only. We make no representation and disclaim all liability with respect to your use of any information contained on Oxylabs Blog or any third-party websites that may be linked therein. Before engaging in scraping activities of any kind you should consult your legal advisors and carefully read the particular website's terms of service or receive a scraping license.