In this post I want to share how one may scrape business directory and real estate data using the Scrapy framework.
PHP Curl POSTing JSON example
<?php
$base_url  = "https://openapi.octoparse.com";
$token_url = $base_url . '/token';
$post = [
    'username'   => 'igorsavinkin',
    'password'   => '<xxxxxx>',
    'grant_type' => 'password'
];
$payload = json_encode($post);
$headers = [
    'Content-Type: application/json',
    'Content-Length: ' . strlen($payload)
];
$timeout = 30;

$ch_upload = curl_init();
curl_setopt($ch_upload, CURLOPT_URL, $token_url);
if ($headers) {
    curl_setopt($ch_upload, CURLOPT_HTTPHEADER, $headers);
}
curl_setopt($ch_upload, CURLOPT_POST, true);
curl_setopt($ch_upload, CURLOPT_POSTFIELDS, $payload);
curl_setopt($ch_upload, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch_upload, CURLOPT_CONNECTTIMEOUT, $timeout);

$response = curl_exec($ch_upload);
if (curl_errno($ch_upload)) {
    echo 'Curl Error: ' . curl_error($ch_upload);
}
curl_close($ch_upload);

echo $response;

// save the token response for later use
$fp = fopen('octoparse-api-token.json', 'w');
fwrite($fp, $response);
fclose($fp);
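Once the response is saved, the token can be read back out of the JSON file. Below is a minimal sketch; the exact key names (access_token, possibly nested under data) are an assumption and should be checked against the actual Octoparse reply:

<?php
// decode the saved token response; the key names used here are assumptions
$json = json_decode(file_get_contents('octoparse-api-token.json'), true);
$access_token = $json['data']['access_token'] ?? $json['access_token'] ?? null;

if ($access_token) {
    // subsequent API calls would send the token as a Bearer header
    $auth_header = 'Authorization: Bearer ' . $access_token;
    echo 'Token acquired, length: ', strlen($access_token), PHP_EOL;
} else {
    echo 'No access token found in the response.', PHP_EOL;
}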
| | Octoparse | Dexi.io | Mozenda | Sequentum SaaS | Import.io |
| --- | --- | --- | --- | --- | --- |
| Able to set up robot/agent | 3 min | 3 failures in a row | | "For some insight, we are working with customers in managed service engagements for large scale, mission critical web integration requirements - so we no longer have a SaaS tool offering. We have a heavy focus in digital commerce and work with customers on use cases in ecomm/retail, travel/hospitality, and tickets/events." - customer service | |
| Support response | 12 hours. It does an excellent job. | 12 hours | 12 hours | 12 hours | |
| Base64 encoding | no | Using a JavaScript step; btoa() is a function that takes a string and encodes it to Base64. | | yes, one can encode the given value in the Transformation Script of any command | |
| Robot/agent development assistance | yes | | | | |
Simple Apify Puppeteer crawler
const Apify = require('apify');

var total_data = [];
const regex_name = /[A-Z][a-z]+\s[A-Z][a-z]+(?=\.|,|\s|\!|\?)/gm;
const regex_address = /stand:(<\/strong>)?\s+(\w+\s+\w+),?\s+(\w+\s+\w+)?/gm;
const regex_email = /(([^<>()\[\]\\.,;:\s@"]+(\.[^<>()\[\]\\.,;:\s@"]+)*)|(".+"))@((\[[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\])|(([a-zA-Z\-0-9]+\.)+[a-zA-Z]{2,}))/i;

Apify.main(async () => {
    const requestQueue = await Apify.openRequestQueue('123');
    await requestQueue.addRequest(new Apify.Request({ url: 'https://www.freeletics.com/de/pages/imprint/' }));
    await requestQueue.addRequest(new Apify.Request({ url: 'https://di1ara.com/pages/impressum' }));

    console.log('\nStart PuppeteerCrawler\n');

    const crawler = new Apify.PuppeteerCrawler({
        requestQueue,
        handlePageFunction: async ({ request, page }) => {
            const title = await page.title();
            console.log(`Title of ${request.url}: ${title}`);

            const page_content = await page.content();
            console.log(`Page content size:`, page_content.length);

            let obj = { 'url': request.url };
            let m;

            console.log('Names:');
            while ((m = regex_name.exec(page_content)) !== null) {
                // This is necessary to avoid infinite loops with zero-width matches
                if (m.index === regex_name.lastIndex) {
                    regex_name.lastIndex++;
                }
                // The result can be accessed through the `m` variable.
                m.forEach((match, groupIndex) => {
                    console.log(`Found match, group ${groupIndex}: ${match}`);
                    if (match !== undefined) {
                        obj['names'] = (obj['names'] || '') + match + ', ';
                    }
                });
            }

            console.log('\nAddress:');
            while ((m = regex_address.exec(page_content)) !== null) {
                // This is necessary to avoid infinite loops with zero-width matches
                if (m.index === regex_address.lastIndex) {
                    regex_address.lastIndex++;
                }
                m.forEach((match, groupIndex) => {
                    console.log(`Found match, group ${groupIndex}: ${match}`);
                });
                // strip a possible "</strong>" prefix and a stray "<" from the match
                m[0] = m[0].includes('</strong>') ? m[0].split('</strong>')[1] : m[0];
                m[0] = m[0].replace('<', '');
                obj['address'] = m[0] ?? '';
            }

            console.log('\nEmail:');
            while ((m = regex_email.exec(page_content)) !== null) {
                // This is necessary to avoid infinite loops with zero-width matches
                if (m.index === regex_email.lastIndex) {
                    regex_email.lastIndex++;
                }
                m.forEach((match, groupIndex) => {
                    console.log(`Found match, group ${groupIndex}: ${match}`);
                });
                // keep only the first email found
                if (m[0]) {
                    obj['email'] = m[0];
                    break;
                }
            }

            total_data.push(obj);
            console.log(obj);
        },
        maxRequestsPerCrawl: 2000000,
        maxConcurrency: 20,
    });

    await crawler.run();

    console.log('Total data:');
    console.log(total_data);
});
Hoppscotch – API ecosystem
Add
Adding a constraint named NetAppToken composed of 3 columns: network, application, token. (A quick duplicate-insert check is sketched after the result below.)
Note: you are supposed to have already selected the right database: use <database_name>;
ALTER TABLE crypto ADD CONSTRAINT NetAppToken UNIQUE (network, application, token);
View
SELECT table_schema, table_name, constraint_name FROM information_schema.table_constraints WHERE table_name = 'crypto';
Result
+--------------+------------+-----------------+
| table_schema | table_name | constraint_name |
+--------------+------------+-----------------+
| admin_crypto | crypto | PRIMARY |
| admin_crypto | crypto | NetAppToken |
+--------------+------------+-----------------+
2 rows in set (0.06 sec)
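With the constraint in place, an insert that repeats an existing (network, application, token) combination fails with a duplicate-entry error. Below is a minimal PHP/PDO sketch of that behaviour; the connection credentials and the sample values are placeholders, and it assumes the crypto table accepts an insert of just these three columns:

<?php
// placeholders: adjust host, database name, user and password to your setup
$pdo = new PDO('mysql:host=localhost;dbname=admin_crypto;charset=utf8mb4', 'db_user', 'db_password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $pdo->prepare('INSERT INTO crypto (network, application, token) VALUES (?, ?, ?)');
try {
    // inserting the same combination twice triggers the NetAppToken unique constraint
    $stmt->execute(['ethereum', 'uniswap', 'USDT']);
    $stmt->execute(['ethereum', 'uniswap', 'USDT']);
} catch (PDOException $e) {
    // MySQL error 1062 = duplicate entry for a unique key
    if (($e->errorInfo[1] ?? null) == 1062) {
        echo 'Rejected by NetAppToken: ' . $e->getMessage(), PHP_EOL;
    } else {
        throw $e;
    }
}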
CURL request into Curl PHP code
Recently I needed to transform a cURL request into PHP cURL code, with binary data and the compressed option involved. See the query itself:
curl 'https://terraswap-graph.terra.dev/graphql' -H 'Accept-Encoding: gzip, deflate, br' -H 'Content-Type: application/json' -H 'Accept: application/json' -H 'Connection: keep-alive' -H 'DNT: 1' -H 'Origin: https://terraswap-graph.terra.dev' --data-binary '{"query":"{\n pairs {\n pairAddress\n latestLiquidityUST\n token0 {\n tokenAddress\n symbol\n }\n token1 {\n tokenAddress\n symbol\n }\n commissionAPR\n volume24h {\n volumeUST\n }\n }\n}\n"}' --compressed
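A minimal sketch of the equivalent PHP cURL call might look as follows (not necessarily the exact code from the original post; the GraphQL payload is copied verbatim from the request above, and CURLOPT_ENCODING with an empty string stands in for both the Accept-Encoding header and the --compressed flag):

<?php
$url = 'https://terraswap-graph.terra.dev/graphql';
// the GraphQL payload, copied verbatim from the cURL request above
$payload = '{"query":"{\n pairs {\n pairAddress\n latestLiquidityUST\n token0 {\n tokenAddress\n symbol\n }\n token1 {\n tokenAddress\n symbol\n }\n commissionAPR\n volume24h {\n volumeUST\n }\n }\n}\n"}';
$headers = [
    'Content-Type: application/json',
    'Accept: application/json',
    'Connection: keep-alive',
    'DNT: 1',
    'Origin: https://terraswap-graph.terra.dev',
];

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $payload);   // --data-binary
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_ENCODING, '');           // --compressed: accept the encodings libcurl supports and decode automatically
$response = curl_exec($ch);
if (curl_errno($ch)) {
    echo 'Curl Error: ' . curl_error($ch);
}
curl_close($ch);
echo $response;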
Auth proxy with JAVA
In the post we’ll show how to leverage an auth proxy (with login/password) for a JAVA application.
In this post I’ll share how I’ve added a LetsEncrypt SSL certificate to a subdomain on a VPS running CentOS 7 with Vesta CP.
This post is devoted to the steps of creating a subdomain (CentOS 7 and Vesta CP) and mapping a [Laravel] project folder to it.