Web scraping using Fetch
Web scraping, or web harvesting, is a method for extracting data from other websites. I found the Fetch API much easier to use than other ways of getting data, such as axios. However, fetch() has two main differences from the jQuery.ajax() method:
- The Promise returned from fetch() won’t reject on HTTP error status even if the response is an HTTP 404 or 500.
- By default, fetch() won’t send or receive cross-site cookies; cookie behavior is controlled by the credentials init option
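The first point means we have to check the HTTP status ourselves before using the response. A minimal sketch (the helper name checkStatus is my own, not part of the Fetch API):

```javascript
// fetch() only rejects on network failures, so HTTP error statuses
// like 404 or 500 still resolve; this helper turns them into errors.
function checkStatus(response) {
  if (!response.ok) {
    throw new Error(`HTTP error: ${response.status}`);
  }
  return response;
}

// Typical usage (URL is just for illustration):
// fetch('https://example.com/data.json')
//   .then(checkStatus)
//   .then(response => response.json())
//   .catch(err => console.error(err));
```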
Fetch in React (example)
First, we need a piece of state where we can store the data; we’ll create an empty state value called data. In the function getDataFromApi we fetch data from a simple JSON file provided by Facebook: https://facebook.github.io/react-native/movies.json. We also need to make sure the response is parsed as JSON, which is what response => response.json() does. After that, the resolved value holds the data from the JSON file, and we store it in the previously defined data state. Finally, we just map over that data to build a list of items, using their properties, like title.
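The same fetch → parse → map flow can be sketched outside React too. Here the movies endpoint is stubbed with a local object so the example runs offline; the stub contents and the listTitles helper are my own, while getDataFromApi mirrors the function described above:

```javascript
// Stand-in for fetch() so the sketch runs without a network; real code
// would call fetch('https://facebook.github.io/react-native/movies.json').
function fakeFetch() {
  const body = {
    movies: [
      { id: '1', title: 'Star Wars', releaseYear: '1977' },
      { id: '2', title: 'Back to the Future', releaseYear: '1985' },
    ],
  };
  return Promise.resolve({ ok: true, json: () => Promise.resolve(body) });
}

// Mirrors getDataFromApi: fetch, parse the body as JSON, keep the movies.
function getDataFromApi() {
  return fakeFetch()
    .then(response => response.json()) // read the body as JSON
    .then(data => data.movies);        // in React this would go into state
}

// Mirrors the render step: map the data into a list of titles.
function listTitles(movies) {
  return movies.map(movie => movie.title);
}
```

In a React component, the result of getDataFromApi would go into state via setState (or a useState setter), and listTitles corresponds to the map call inside render.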
Get HTML instead of JSON
As you can see, here we’re using two npm packages, cheerio and react-render-html, to display the HTML we’ve fetched. Cheerio is a package that lets us use almost-standard jQuery selectors to pick the right elements out of the HTML. Once we have those elements, we pass them into a new state and render them in the render function using renderHTML from the other package, react-render-html.
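On the fetch side, the only change when scraping HTML instead of JSON is reading the body with response.text() rather than response.json(); cheerio then works on that string. A sketch with the response stubbed so it runs offline (the stub and its markup are my own):

```javascript
// Stubbed response; real code would start from fetch('https://example.com').
const fakeHtmlResponse = Promise.resolve({
  ok: true,
  text: () => Promise.resolve('<ul><li class="movie">Star Wars</li></ul>'),
});

function getHtml() {
  // For HTML we read the body as plain text, not JSON.
  return fakeHtmlResponse.then(response => response.text());
}

// With cheerio, the scraped string would then be queried jQuery-style:
//   const $ = cheerio.load(html);
//   const titles = $('.movie').map((i, el) => $(el).text()).get();
```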
To fetch data in React, we can simply use fetch() to get whatever data we want. To do something more with that data, we can use ES6 map to render a list from JSON, or cheerio to display scraped HTML on our website.