
AI-Fueled Proxy Service Revolution: Next-Gen Residential Proxies

Jurgita Tuzikaite

2020-08-05 | 4 min read

We have officially released Next-Gen Residential Proxies, a technology that will irreversibly change the proxy service industry. To mark the occasion, we took the opportunity to interview the proxy expert behind this revolutionary web data gathering solution. Why is it so revolutionary, you ask?

We caught up with Aleksandras Sulzenko, the man who oversaw the development of Next-Gen Residential Proxies, to find out. Six months ago, Aleksandras was challenged to redefine innovation in the proxy industry. Since then, he has been leading a team of developers, data analysts, and system administrators to bring a never-before-seen solution to life.

Aleksandras Sulzenko, Product Owner of Next-Gen Residential Proxies

Aleksandras, can you tell us how the idea of Next-Gen Residential Proxies came about?

When I first joined Oxylabs three years ago, the company mainly specialized in Datacenter Proxies. This type of proxy offers significant advantages, including impressive operation speed. However, it also has its drawbacks, namely recurring blocks. I witnessed numerous cases where our clients struggled with data gathering because their IPs kept getting blocked, and their business would grind to a halt.

Naturally, these situations are quite stressful, especially when web data gathering and processing is a company's core business. We employed numerous tactics to help our clients overcome these issues. Our team did it all, from replacing proxy sets to providing consultations on best web data gathering practices. However, it was clear that our efforts to fix constantly recurring problems were not a sustainable solution. We needed to develop a product that would, by design, avoid blocks.

We equipped the product with the most progressive technologies available to make it block-free, and six months later, Next-Gen Residential Proxies were born.

Next-Gen Residential Proxies development team

What were the challenges of building something that has never been done before?

In part, the answer is already in the question. Innovating is never easy, as there are no guidelines. Many of the obstacles my team faced had no precedent, so we needed to be creative.

Our goal to deliver a radically different solution pushed us to re-evaluate the processes behind proxy services. We had to dive deep into the heart of the problem and understand the essence of common pitfalls. This helped us understand which features would make Next-Gen Residential Proxies infallible. If you wish to learn more about Next-Gen Residential Proxies, watch the video below.

Can you explain what is so different about Next-Gen Residential Proxies?

To understand this fully, you need some context on proxy services. HTTP(S) proxies have been in use for decades. One thing that makes innovation tricky is that proxy technology itself was not designed with web data gathering in mind.

Web scraper bots tend to get blocked quite often if they don't convincingly mimic human online behavior. This means that to gather data from any given site, you first need to figure out how to program the bot to imitate real human browsing patterns. The unique aspect of Next-Gen Residential Proxies is that they are already set up to operate in a way that resembles a human browsing session, saving our clients valuable time.

More specifically, Next-Gen Residential Proxies come with a toolkit that handles the most common web scraping concerns: AI-powered dynamic fingerprinting, automatic retries, JavaScript rendering, IP management, adaptive parsing, browsing session control, and response evaluation, just to name a few. Nothing like this has ever been offered with a proxy service.
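To make the workflow concrete, here is a minimal sketch of how a scraper might route requests through a residential proxy endpoint using Python's requests library. The host, port, and credentials below are placeholders, not the actual Oxylabs entry point; consult the product documentation for real connection details.

```python
import requests

# Placeholder credentials and entry point -- NOT the actual Oxylabs endpoint.
PROXY_USER = "customer-username"
PROXY_PASS = "password"
PROXY_ENTRY = "proxy.example.com:60000"

proxies = {
    "http": f"http://{PROXY_USER}:{PROXY_PASS}@{PROXY_ENTRY}",
    "https": f"http://{PROXY_USER}:{PROXY_PASS}@{PROXY_ENTRY}",
}

# The proxy layer, not this client code, is what handles fingerprinting,
# retries, IP management, and the other concerns listed above.
response = requests.get(
    "https://example.com/some-page",
    proxies=proxies,
    timeout=30,
)
print(response.status_code, len(response.text))
```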

Next-Gen Residential Proxies' expanding potential

How does AI-powered dynamic fingerprinting work exactly?

To put it very simply, AI dynamic fingerprinting is a feature that allows the scraper bot to appear as a real-life website visitor. This ability is essential for block-free, sustainable data gathering. When a bot equipped with AI dynamic fingerprinting visits a site, it can collect information undetected by presenting realistic user-related information, such as the browser type and version, among many other variables.

Before Next-Gen Residential Proxies were introduced, the only solution was the tedious task of testing many combinations of user-related variables and selecting the ones that yielded the highest success rate. This method required serious data analysis chops and a well-prepared infrastructure.
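As an illustration of that manual workflow, the sketch below brute-forces a few header combinations and keeps the one with the best success rate. The target URL, header values, and success criterion are all placeholders; a real setup would cover far more variables and far more requests.

```python
import itertools
import requests

# Placeholder candidate values for two user-related variables.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)",
]
LANGUAGES = ["en-US,en;q=0.9", "de-DE,de;q=0.7"]

def success_rate(headers, url="https://example.com", attempts=5):
    # Count how many attempts return HTTP 200 with these headers.
    ok = 0
    for _ in range(attempts):
        try:
            if requests.get(url, headers=headers, timeout=10).status_code == 200:
                ok += 1
        except requests.RequestException:
            pass
    return ok / attempts

# Test every combination and keep the best-performing "fingerprint".
candidates = [
    {"User-Agent": ua, "Accept-Language": lang}
    for ua, lang in itertools.product(USER_AGENTS, LANGUAGES)
]
best = max(candidates, key=success_rate)
print("Best-performing header combination:", best)
```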

The good news is that this lengthy process can now be skipped. Next-Gen Residential Proxies automate the testing and selection phase: AI-powered dynamic fingerprinting adopts the most appropriate fingerprint for any given site instantly. This maximizes the success rate and ensures that high-quality data is collected quickly and effortlessly. There is no other proxy on the market that can do anything like this.

What about the Auto-Retry system?

There is a saying: those who refuse to give up are destined to succeed eventually. We adopted the same philosophy for Next-Gen Residential Proxies.

When it comes to web scraping, you need to understand whether the data collection was successful or not. "Successful" means that the extracted data is of high quality and can be used to fulfill the business need. To figure this out, data collectors used to evaluate every session separately and, in case of failed attempts, retry the whole process. I hope it is clear to everyone how long this evaluation process can take.

That is why we integrated the Auto-Retry system, which is smart enough to recognize poor-quality data and restart the process automatically, as many times as needed, until a satisfactory result is received. In other words, Next-Gen Residential Proxies don't know how to fail.
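A simplified, client-side version of that idea might look like the loop below: evaluate each response against a quality check and retry until it passes or the attempts run out. The looks_valid check is a placeholder for whatever "high quality" means for a given use case, whereas the Auto-Retry system described above handles this evaluation automatically.

```python
from typing import Optional

import requests

def looks_valid(html: str) -> bool:
    # Placeholder quality check: a real one might verify that the expected
    # fields are present and that the page is not a block or CAPTCHA screen.
    return "captcha" not in html.lower() and len(html) > 1000

def fetch_with_retries(url: str, max_attempts: int = 5) -> Optional[str]:
    for attempt in range(1, max_attempts + 1):
        try:
            resp = requests.get(url, timeout=30)
            if resp.ok and looks_valid(resp.text):
                return resp.text  # satisfactory result, stop retrying
        except requests.RequestException:
            pass
        print(f"Attempt {attempt} returned poor-quality data, retrying...")
    return None  # gave up after max_attempts
```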

What part does JavaScript rendering play in all of this?

We also included the optional JavaScript rendering feature to make life easier for our clients.

In this day and age, most websites rely on JavaScript code to load content in the browser. This means that a data gathering process can return completely useless information if JavaScript is not executed. Professionals in the field are aware of this, and they usually maintain an expensive browser infrastructure just for this task.

Next-Gen Residential Proxies provide an option to execute JavaScript code before the results are returned, saving our clients valuable time and resources.
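For comparison, the self-managed alternative mentioned above typically means running a headless browser so that JavaScript executes before the page content is read. A minimal sketch using Playwright (one common choice, not necessarily what Oxylabs uses internally) might look like this:

```python
from playwright.sync_api import sync_playwright

# Render a JavaScript-heavy page in a headless browser and read the
# resulting HTML. This is the kind of browser infrastructure that the
# built-in JavaScript rendering option is meant to replace.
with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    page.goto("https://example.com", wait_until="networkidle")
    rendered_html = page.content()  # HTML after scripts have run
    browser.close()

print(len(rendered_html))
```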

Is there any room for improvement left?

In information technology, there is no limit to improvement. The same goes for Next-Gen Residential Proxies. In fact, we are in the last stages of integrating one more feature, which employs machine-learning-based adaptive parsing to deliver structured data from a vast number of e-commerce sites. I can give another secret away: we are not stopping here.
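To illustrate what "structured data" means here, the snippet below turns a fragment of e-commerce HTML into a record using BeautifulSoup. The markup and CSS selectors are invented placeholders; the point of adaptive parsing is to learn such mappings per site rather than hard-coding them as done here.

```python
from bs4 import BeautifulSoup

# Invented sample markup standing in for a product page.
html = """
<div class="product">
  <h1 class="title">Example Phone</h1>
  <span class="price">$299.99</span>
</div>
"""

soup = BeautifulSoup(html, "html.parser")
record = {
    "title": soup.select_one(".title").get_text(strip=True),
    "price": soup.select_one(".price").get_text(strip=True),
}
print(record)  # {'title': 'Example Phone', 'price': '$299.99'}
```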

Would you like to start using Oxylabs’ Next-Gen Residential Proxies? Book a call with our sales team, or email us at hello@oxylabs.io.

About the author

Jurgita Tuzikaite

Former Communications Specialist

Jurgita Tuzikaite is a former Communications Specialist at Oxylabs. Her inspiration for original ideas comes from observing nature and exploring unknown paths, which often lead to unexpected adventures. Jurgita’s background in humanitarian work has formed her work ethic and moral compass, which resulted in placing positive intention behind everything she does. She values wisdom and places importance on bringing value to other people through knowledge, creativity, and compassion.

All information on Oxylabs Blog is provided on an "as is" basis and for informational purposes only. We make no representation and disclaim all liability with respect to your use of any information contained on Oxylabs Blog or any third-party websites that may be linked therein. Before engaging in scraping activities of any kind you should consult your legal advisors and carefully read the particular website's terms of service or receive a scraping license.
