Recently I was given a bunch of sites to scrape, most of them simple e-commerce. I decided to try Zyte AI-powered spiders. To use them, I had to apply for a Zyte API subscription and access to Scrapy Cloud. Zyte AI proved to be a good choice for fast data extraction and delivery through spiders that are regular Scrapy spiders. Below you can see my experience and the results.
I have written another “experience” post on using the Zyte platform.
I’ve described my initial experience with Zyte AI spiders leveraging the Zyte API and Scrapy Cloud units. You may find it here. Now I’d like to share a more sobering report of what happened with a data aggregator scrape.
It’s hard to overestimate the importance of information – “Master of information, master of situation”. Nowadays, we have everything we need to become that “master of situation”: tools like spiders and parsers that can scrape various data from websites. Today we will consider scraping Amazon with a web spider equipped with proxy services.
From eCommerce and market research to competitive analysis and more, web scraping has become an integral part of data collection. And for some, it’s the secret sauce for success.
But with great scraping power comes great responsibility.
Web scraping can result in IP bans and other harsh restrictions. To avoid these issues, many turn to proxies, which act as intermediaries between your requests and the target website. In this article, we’ll explore the top 3 proxy types for web scraping and focus on the key benefits of each type. Let’s go!
Bright Data’s Business Capabilities
Bright Data offers its customers a full suite of real-time data collection tools that help them gain and maintain a competitive market edge. Bright Data prides itself on its ethical and 100% legally compliant approach.
When traversing the digital landscape for high-quality proxy services, the name NetNut often emerges as a frontrunner. This review aims to delve into the facets of NetNut’s offerings, emphasizing the distinctive aspects that set it apart in an intensely competitive environment.
An Introduction to NetNut
In the world of proxies, NetNut.io has built a reputation for speed, reliability, and robustness. Leveraging its partnership with a major US data carrier, NetNut offers a residential proxy network that is both unique and high-performing. The company’s network is expansive, with coverage that spans millions of IPs globally, presenting an impressive degree of scalability.
| | Octoparse | Dexi.io | Mozenda | Sequentum SaaS | Import.io |
|---|---|---|---|---|---|
| Able to set up robot/agent | 3 min | 3 failures in a row | | "For some insight, we are working with customers in managed service engagements for large scale, mission critical web integration requirements - so we no longer have a SaaS tool offering. We have a heavy focus in digital commerce and work with customers on use cases in ecomm/retail, travel/hospitality, and tickets/events." - customer service | |
| Support response | 12 hours. It does an excellent job. | 12 hours | 12 hours | 12 hours | |
| Base64 encoding | no | Using a JavaScript step; btoa() is a function that takes a string and encodes it to Base64. | | yes, one can encode the given value in the Transformation Script of any command | |
| Robot/agent development assistance | yes | | | | |
Working with a backconnect proxy service (Oxylabs.io), we spent a long time looking for a way to authorize it. Originally we used JSoup to get the web pages’ content. The proxy() method can be used there when setting up the connection, yet it only accepts a host and a port; no authentication is possible. One of the options we found was the following:
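The snippet itself is not reproduced above, so here is a minimal sketch of one common workaround (not necessarily the exact option from the original post): registering a JVM-wide java.net.Authenticator so that credentials are supplied to the proxy even though Jsoup’s proxy(host, port) takes none. The host, port, and credentials below are placeholders.

```java
import java.net.Authenticator;
import java.net.PasswordAuthentication;

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;

public class ProxyAuthFetch {
    public static void main(String[] args) throws Exception {
        // Placeholder backconnect entry point and credentials -- use the values
        // from your proxy provider's dashboard.
        final String proxyHost = "pr.oxylabs.io";
        final int proxyPort = 7777;
        final String proxyUser = "USERNAME";
        final String proxyPass = "PASSWORD";

        // Since JDK 8u111, Basic auth is disabled for HTTPS proxy tunnels by default;
        // clearing this property lets the credentials be sent on the CONNECT request.
        System.setProperty("jdk.http.auth.tunneling.disabledSchemes", "");

        // Jsoup's proxy(host, port) carries no credentials, so register them
        // JVM-wide; the underlying HttpURLConnection picks them up when the
        // proxy challenges the request.
        Authenticator.setDefault(new Authenticator() {
            @Override
            protected PasswordAuthentication getPasswordAuthentication() {
                return new PasswordAuthentication(proxyUser, proxyPass.toCharArray());
            }
        });

        Document doc = Jsoup.connect("https://example.com/")
                .proxy(proxyHost, proxyPort)
                .userAgent("Mozilla/5.0")
                .get();
        System.out.println(doc.title());
    }
}
```

For plain-HTTP targets one could also send a Proxy-Authorization header directly, but for HTTPS pages the Authenticator route is the reliable one, since the credentials must accompany the CONNECT tunnel request.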
DataFlowKit review
Recently we encountered a new service that helps users scrape the modern Web 2.0. It’s a simple, convenient, easy-to-learn service – https://dataflowkit.com
Let’s first highlight some of its outstanding features:
- Visual online scraper tool: point, click and extract.
- JavaScript rendering: any interactive site can be scraped by headless Chrome running in the cloud
- Open-source back-end
- Scrape a website behind a login form
- Web page interactions: Input, Click, Wait, Scroll, etc.
- Proxy support, incl. geo-targeted proxying
- Scraper API (see the sketch after this list)
- Follows the directives of robots.txt
- Export results to Google Drive, Dropbox, MS OneDrive.
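As for the Scraper API item, the post does not include a request example, so below is a purely hypothetical sketch (Java 11+) of posting a fetch job to such an API. The endpoint URL, the api_key query parameter, and the JSON fields are illustrative assumptions; check the DataFlowKit documentation for the real request format.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class DataFlowKitFetchSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical values: the endpoint path, api_key parameter, and JSON
        // fields are illustrative only -- consult the DataFlowKit docs.
        String apiKey = "YOUR_API_KEY";
        String endpoint = "https://api.dataflowkit.com/v1/fetch?api_key=" + apiKey;

        // Ask the service to render the page (e.g. with headless Chrome) and
        // return the resulting HTML.
        String payload = "{\"url\": \"https://example.com/\", \"type\": \"chrome\"}";

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(endpoint))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}
```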
Oxylabs.io at a glance
Oxylabs.io is an experienced player in the proxy market. In the past few years, they have significantly expanded their proxy pool.
Right now they have a residential proxy pool with over 60M IPs and over 2M datacenter proxies. Their residential proxies cover every country in the world (!) and offer city-level targeting. Oxylabs datacenter proxies come from 82 locations and feature 7850 subnets.
Oxylabs is mainly focused on businesses, and this is reflected in its subscription plans. But recently they have introduced a Fast-Checkout feature, where customers can purchase residential proxies in a few clicks. Together with a recently added smaller plan ($300/month for 20 GB of traffic), Oxylabs becomes much more attractive to smaller customers as well.