Categories
Development, Web Scraping Software

My experience choosing a web scraping platform for a company-critical data feed

Recently we engaged with an online e-commerce startup that needed government tenders/RFPs scraped. Since the project size is immense, we had to switch from hand-made scripted extractors to an enterprise-grade scraping platform. Below I share my experience with the scraping platforms as a feature table.

| Service | Residential | Cost/month | Traffic/month | $ per GB | Rotating | IP whitelisting | Performance and notes |
|---|---|---|---|---|---|---|---|
| MarsProxies | | N/A | N/A | 3.5 | yes | yes | 500K+ IPs, 190+ locations. SOCKS5 supported; grey-zone proxy restrictions apply (see test results). |
| Oxylabs.io | | N/A | 25 GB | 9-12 ("pay-as-you-go": 15) | yes | yes | 100M+ IPs, 192 countries. Test run: 30K requests, 1.3 GB of data, 5K pages crawled. Does not allow scraping some grey-zone targets, incl. LinkedIn. |
| Smartproxy | | link to the price page | N/A | 5.2-7 ("pay-as-you-go": 8.5) | yes | yes | 65M+ IPs, 195+ countries. Free trial. Does not allow scraping some grey-zone targets, incl. LinkedIn. |
| Infatica.io | | N/A | N/A | 3-6.5 ("pay-as-you-go": 8) | yes | yes | Over 95% success rate; Cloudflare bans are also few, below 5%. Keeps a blacklist of sites the proxies do not work with. 1000 ports per proxy list; up to 20 proxy lists at a time; usage via API tool; ISP-level targeting; rotation time selection. |
| Mango Proxy | | N/A | 1-50 GB | 3-8 ("pay-as-you-go": 8) | yes | yes | 90M+ IPs, 240+ countries. |
| IPRoyal | | N/A | N/A | 4.55 | yes | yes | 32M+ IPs, 195 countries. Does not allow scraping some grey-zone targets, incl. Facebook; publishes a list of blocked sites. |
| Rainproxy.io | yes | $4 | from 1 GB | 4 | yes | | |
| BrightData | yes | | | 15 | | | |
| ScrapeOps Proxy Aggregator | yes | API credits per month | N/A | N/A | yes | | All-in-one proxy API giving access to 20+ proxy providers from a single API. Allows multithreading; provides browsers on its own servers, so N cloud browsers can be run from a local machine. The number of threads depends on the subscription (min. 5). |
| Lunaproxy.com | yes | from $15 | x GB per 90 days | 0.85-5 | | | Each plan allows a certain traffic amount within a 90-day limit. |
| LiveProxies.io | yes | from $45 | 4-50 GB | 5-12 | yes | yes | E.g. 200 IPs with 4 GB for $70.00, for a 30-day limit. |
| Charity Engine (docs) | yes | - | - | from 3.6 | | | Additionally, CPU computing: from $0.01 per avg CPU core-hour, from $0.10 per GPU-hour (source). Failed to connect so far. |
| proxy-sale.com | yes | from $17 | N/A | 3-6 ("pay-as-you-go": 7) | yes | yes | 10M+ IPs, 210+ countries. 30-day limit for a single proxy batch. |
| Tabproxy.com | yes | from $15 | N/A | 0.8-3 (lowest price is for a 1000 GB chunk) | yes | yes | 200M+ IPs, 195 countries. 30-180-day limit for a single proxy batch (e.g. 5 GB). |
| proxy-seller.com | yes | N/A | N/A | 4.5-6 ("pay-as-you-go": 7) | yes | yes | 15M+ IPs, 220 countries. Generation of up to 1000 proxy ports in each proxy list; HTTP/SOCKS5 support; an unlimited number of proxies can be generated by assigning unique parameters to each list. |
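Whichever provider we pick, the integration side of the scraper looks similar: requests go out through a gateway endpoint with username/password auth, and the provider rotates the exit IP behind it. A minimal Node.js sketch of the wiring (the gateway host, port, and credentials below are placeholders, not any specific provider's endpoint):

```javascript
// Build the proxy URL format accepted by most HTTP(S) proxy agents.
// All values are placeholders; substitute your provider's gateway data.
function buildProxyUrl({ user, pass, host, port }) {
  return `http://${encodeURIComponent(user)}:${encodeURIComponent(pass)}@${host}:${port}`;
}

const proxyUrl = buildProxyUrl({
  user: 'customer-user1',          // hypothetical account name
  pass: 'secret',
  host: 'gate.example-proxy.com',  // hypothetical rotating gateway
  port: 7000,
});
console.log(proxyUrl); // → http://customer-user1:secret@gate.example-proxy.com:7000

// With a library such as https-proxy-agent, the URL plugs straight in:
// const { HttpsProxyAgent } = require('https-proxy-agent');
// const agent = new HttpsProxyAgent(proxyUrl);
// https.get('https://example.com/', { agent }, (res) => { /* ... */ });
```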
Categories
Development

Simple Apify Puppeteer crawler

The Apify crawler below gathers names, addresses, and emails from the given web URLs.

const Apify = require('apify');

const totalData = [];
// "Firstname Lastname" followed by punctuation or whitespace.
const regexName = /[A-Z][a-z]+\s[A-Z][a-z]+(?=\.|,|\s|!|\?)/g;
// Address fragment following "stand:", optionally wrapped in a </strong> tag.
const regexAddress = /stand:(<\/strong>)?\s+(\w+\s+\w+),?\s+(\w+\s+\w+)?/g;
// Simple email pattern: local part, @, then a dotted domain or a bracketed IPv4.
const regexEmail = /(([^<>()\[\]\\.,;:\s@"]+(\.[^<>()\[\]\\.,;:\s@"]+)*)|(".+"))@((\[[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\])|(([a-zA-Z\-0-9]+\.)+[a-zA-Z]{2,}))/gi;

Apify.main(async () => {
    const requestQueue = await Apify.openRequestQueue('123');
    await requestQueue.addRequest({ url: 'https://www.freeletics.com/de/pages/imprint/' });
    await requestQueue.addRequest({ url: 'https://di1ara.com/pages/impressum' });
    console.log('\nStart PuppeteerCrawler\n');

    const crawler = new Apify.PuppeteerCrawler({
        requestQueue,
        handlePageFunction: async ({ request, page }) => {
            const title = await page.title();
            console.log(`Title of ${request.url}: ${title}`);
            const pageContent = await page.content();
            console.log('Page content size:', pageContent.length);

            const obj = { url: request.url };
            // The global regexes are shared between pages, so reset their state.
            regexName.lastIndex = regexAddress.lastIndex = regexEmail.lastIndex = 0;
            let m;

            // Names: collect every match.
            const names = [];
            while ((m = regexName.exec(pageContent)) !== null) {
                // Necessary to avoid infinite loops with zero-width matches.
                if (m.index === regexName.lastIndex) regexName.lastIndex++;
                names.push(m[0]);
            }
            obj.names = names.join(', ');
            console.log('Names:', obj.names);

            // Address: keep the last match, stripping a leading </strong> if present.
            while ((m = regexAddress.exec(pageContent)) !== null) {
                if (m.index === regexAddress.lastIndex) regexAddress.lastIndex++;
                const address = m[0].includes('</strong>') ? m[0].split('</strong>')[1] : m[0];
                obj.address = address.replace('<', '');
            }
            console.log('Address:', obj.address);

            // Email: the first match is enough.
            m = regexEmail.exec(pageContent);
            if (m) obj.email = m[0];
            console.log('Email:', obj.email);

            totalData.push(obj);
            console.log(obj);
        },
        maxRequestsPerCrawl: 2000000,
        maxConcurrency: 20,
    });

    await crawler.run();
    console.log('Total data:');
    console.log(totalData);
});
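Before pointing the crawler at live pages, the extraction patterns can be sanity-checked against a static snippet. A small standalone check of the email pattern (the HTML fragment below is made up for illustration):

```javascript
// The same email pattern the crawler uses, exercised on a static string.
const regexEmail = /(([^<>()\[\]\\.,;:\s@"]+(\.[^<>()\[\]\\.,;:\s@"]+)*)|(".+"))@((\[[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\])|(([a-zA-Z\-0-9]+\.)+[a-zA-Z]{2,}))/i;

const sample = '<p>Contact: <a href="mailto:info@example.com">write us</a></p>';
const m = regexEmail.exec(sample);
console.log(m ? m[0] : 'no match'); // → info@example.com
```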
Categories
Development

Hoppscotch – API ecosystem

Hoppscotch is an open-source API development ecosystem.

Categories
Development

Add a composite constraint and view it, MySQL/MariaDB

Add

Adding a unique constraint NetAppToken composed of three columns: network, application, and token.
Note: make sure you have selected the right database first:
use <database_name>;

ALTER TABLE crypto   
    ADD CONSTRAINT NetAppToken UNIQUE (network, application, token);

View

SELECT 
   table_schema,  
   table_name,    
   constraint_name
FROM information_schema.table_constraints
WHERE table_name = 'crypto';

Result

+--------------+------------+-----------------+
| table_schema | table_name | constraint_name |
+--------------+------------+-----------------+
| admin_crypto | crypto     | PRIMARY         |
| admin_crypto | crypto     | NetAppToken     |
+--------------+------------+-----------------+
2 rows in set (0.06 sec)
Categories
Development

cURL request into PHP cURL code

Recently I needed to transform a cURL request into PHP cURL code, with binary data and the compressed option involved. Here is the query itself:

curl 'https://terraswap-graph.terra.dev/graphql' \
  -H 'Accept-Encoding: gzip, deflate, br' \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json' \
  -H 'Connection: keep-alive' \
  -H 'DNT: 1' \
  -H 'Origin: https://terraswap-graph.terra.dev' \
  --data-binary '{"query":"{\n  pairs {\n    pairAddress\n    latestLiquidityUST\n    token0 {\n      tokenAddress\n      symbol\n    }\n    token1 {\n      tokenAddress\n      symbol\n    }\n    commissionAPR\n    volume24h {\n      volumeUST\n    }\n  }\n}\n"}' \
  --compressed
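Expressed with PHP cURL, the same request might look like the sketch below (a minimal translation, not tested against the live endpoint). Setting CURLOPT_ENCODING to an empty string corresponds to the --compressed flag: curl advertises all supported encodings (gzip, deflate, br) and decompresses the response transparently.

```php
<?php
// Same GraphQL query as the --data-binary payload above.
$payload = '{"query":"{\n  pairs {\n    pairAddress\n    latestLiquidityUST\n    token0 {\n      tokenAddress\n      symbol\n    }\n    token1 {\n      tokenAddress\n      symbol\n    }\n    commissionAPR\n    volume24h {\n      volumeUST\n    }\n  }\n}\n"}';

$ch = curl_init('https://terraswap-graph.terra.dev/graphql');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => $payload,   // --data-binary
    CURLOPT_ENCODING       => '',         // --compressed
    CURLOPT_HTTPHEADER     => [
        'Content-Type: application/json',
        'Accept: application/json',
        'Connection: keep-alive',
        'DNT: 1',
        'Origin: https://terraswap-graph.terra.dev',
    ],
]);
$response = curl_exec($ch);
curl_close($ch);
echo $response;
```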
Categories
Development

Auth proxy with JAVA

In this post we'll show how to use an authenticated proxy (with login/password) from a Java application.

Categories
Development

Vesta CP install SSL certificate for a subdomain

In this post I'll share how I added a Let's Encrypt SSL certificate to a subdomain on a CentOS 7 VPS using Vesta CP.

Categories
Development

Subdomain at Centos 7 with Laravel project

This post walks through the steps to create a subdomain (CentOS 7 with Vesta CP) and map a Laravel project folder to it.

Categories
Challenge SaaS

Web Scraper IDE to scrape tough websites

Recently we encountered a powerful new scraping service called Web Scraper IDE, by Bright Data. A live test and a thorough drill-down are coming soon, but for now we want to highlight the main features that strongly impressed us.

Categories
Development

How to add Git Personal Access Token (PAT) into git console

  1. Remove the previous git origin:
git remote remove origin
  2. Add a new origin with the PAT (<TOKEN>):
git remote add origin https://<TOKEN>@github.com/<USERNAME>/<REPO>.git
  3. Push once with --set-upstream:
git push --set-upstream origin main
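To double-check the rewiring without touching a real project, the same steps can be replayed in a throwaway repo (the <TOKEN>, <USERNAME>, and <REPO> placeholders are kept as-is):

```shell
# Create a scratch repo and point its origin at a PAT-style URL.
tmp=$(mktemp -d)
cd "$tmp"
git init -q demo
cd demo
git remote add origin 'https://<TOKEN>@github.com/<USERNAME>/<REPO>.git'
# Verify the remote; note the PAT sits in plain text in .git/config.
git remote -v
```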

Now you can push changes to the remote repo without adding the PAT to the push command every time.

If you need to create a PAT, use the following tutorial.