
I am trying to download products from a URL and add them to my site — that is, insert the data into my database and save the pictures to my images folder. I searched a lot but could not find an example. Please tell me what to do and which approach is better.

2 Answers


I haven't checked whether Laravel has specific functions or libraries for this, but with plain PHP you can do the following:

  1. Get the contents of the URL with file_get_contents('http://foo.example') or with cURL. For file_get_contents to work with URLs, the allow_url_fopen setting must be enabled; see https://www.php.net/manual/es/filesystem.configuration.php#ini.allow-url-fopen.
  2. Use PHP's DOM extension (http://php.net/manual/en/book.dom.php) to parse the document and extract the information you need. Extracting text is fairly straightforward, but for images you will need to call file_get_contents('https://foo.example/image.jpg') again for each image URL and write the result to a file.
  3. Once you have the text and images, use Laravel to store that information however you want.

You will find plenty of help for the first two points.
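The steps above can be sketched in plain PHP like this. Note that the page structure here is entirely hypothetical — the XPath selectors (`h2` with class `product-name`, `img` with class `product-image`) are placeholders you would replace with whatever the real page actually uses:

```php
<?php
/**
 * Parse an HTML document and extract product names and image URLs.
 * Returns ['names' => [...], 'images' => [...]].
 */
function extractProducts(string $html): array
{
    $doc = new DOMDocument();

    // Real-world HTML is rarely perfectly valid; suppress parser warnings.
    libxml_use_internal_errors(true);
    $doc->loadHTML($html);
    libxml_clear_errors();

    $xpath = new DOMXPath($doc);

    $names = [];
    foreach ($xpath->query("//h2[@class='product-name']") as $node) {
        $names[] = trim($node->textContent);
    }

    $images = [];
    foreach ($xpath->query("//img[@class='product-image']") as $img) {
        $images[] = $img->getAttribute('src');
    }

    return ['names' => $names, 'images' => $images];
}

// Step 1: fetch the page (requires allow_url_fopen; cURL works as well).
// $html     = file_get_contents('http://foo.example/products');
// $products = extractProducts($html);

// Step 2, for images: download each one into your images folder.
// foreach ($products['images'] as $url) {
//     file_put_contents('images/' . basename($url), file_get_contents($url));
// }
```

From there, step 3 is just passing `$products` into your Laravel models, e.g. an Eloquent `create()` call per product.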


You can use the Web Scraper Chrome extension for that:

https://chrome.google.com/webstore/detail/web-scraper-free-web-scra/jnhgnonknehpejjnehehllkliplmbmhn?hl=en

There are only a couple of steps you will need to learn in order to master web scraping:

  1. Install the extension and open the Web Scraper tab in developer tools (which has to be placed at the bottom of the screen);
  2. Create a new sitemap;
  3. Add data extraction selectors to the sitemap;
  4. Lastly, launch the scraper and export scraped data. It’s as easy as that!
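For step 2, a Web Scraper sitemap is a small JSON document you can also import directly. The sketch below is an assumed example — the start URL and the CSS selector `h2.product-name` are hypothetical and would need to match the page you are scraping:

```json
{
  "_id": "example-products",
  "startUrl": ["http://foo.example/products"],
  "selectors": [
    {
      "id": "product-name",
      "type": "SelectorText",
      "parentSelectors": ["_root"],
      "selector": "h2.product-name",
      "multiple": true
    }
  ]
}
```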
