chore: le readme
vladfrangu committed Nov 7, 2023
1 parent 6744c98 commit 1fecbc4
Showing 1 changed file with 16 additions and 5 deletions.
21 changes: 16 additions & 5 deletions README.md
@@ -9,7 +9,7 @@ $ npm install got-scraping
```

**Note:**
- > - Node.js >=15.10.0 is required due to instability of HTTP/2 support in lower versions.
+ > - Node.js >=16 is required due to instability of HTTP/2 support in lower versions.
## API

@@ -18,11 +18,22 @@ Got scraping package is built using the [`got.extend(...)`](https://github.com/s
Interested in what's [under the hood](#under-the-hood)?

```javascript
- const { gotScraping } = require('got-scraping');
+ import { gotScraping } from 'got-scraping';

gotScraping
.get('https://apify.com')
- .then(({ body }) => console.log(body))
+ .then(({ body }) => console.log(body));
```

```javascript
// If you're still using CJS and cannot use the import syntax
let gotScraping;

async function fetchWithGotScraping(url) {
gotScraping ??= (await import('got-scraping')).gotScraping;

return gotScraping.get(url);
}
```
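The helper above caches the dynamically imported module, so the `import()` runs at most once no matter how many times the function is called. A minimal, self-contained sketch of that caching pattern (using a stand-in promise rather than the real `import('got-scraping')`; names here are illustrative, not part of the library's API):

```javascript
// Stand-in for the lazily loaded module; a real implementation
// would await import('got-scraping') here instead.
let cachedClient;

async function getClient() {
    // `??=` assigns only while cachedClient is still undefined,
    // so the loader on the right-hand side runs at most once.
    cachedClient ??= await Promise.resolve({ loadedAt: Date.now() });
    return cachedClient;
}

// Both calls resolve to the same cached object.
getClient().then((first) =>
    getClient().then((second) => console.log(first === second)) // prints "true"
);
```

Note that, as in the README's snippet, two calls racing before the first load resolves would each trigger the loader; caching the promise itself instead of the awaited value avoids that if it matters for your use case.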
### options
@@ -34,14 +45,14 @@ Type: **`string`**
URL of the HTTP or HTTPS based proxy. HTTP/2 proxies are supported as well.
```javascript
- const { gotScraping } = require('got-scraping');
+ import { gotScraping } from 'got-scraping';

gotScraping
.get({
url: 'https://apify.com',
proxyUrl: 'http://usernamed:[email protected]:1234',
})
- .then(({ body }) => console.log(body))
+ .then(({ body }) => console.log(body));
```
#### `useHeaderGenerator`
