OK, so correct me if I'm wrong. Here is what you're trying to achieve:
1. scrape several e-commerce websites every X minutes / hours
2. store the data you scraped in a database.
3. update an Excel file with the new data you stored in step 2.
Is that correct?
If so, I also want to know what system you're using for development. Is it Unix-based?
Also, what is your programming background? What languages do you know?
I could help, but I need a few details to make it work.
From what I know, I can only give you a few pieces of advice:
1. First of all, think about the structure of your database: what tables are you going to create? What fields?
This depends on the data you're scraping, e.g. if you're scraping products you might want a table called "product" with fields like "name", "price", "picture_url", ... For example:
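A minimal sketch of the schema (assuming MySQL; the column names are just examples, adapt them to your data):
Code
CREATE TABLE product (
    id          INT AUTO_INCREMENT PRIMARY KEY,
    name        VARCHAR(255) NOT NULL,
    price       DECIMAL(10,2),   -- DECIMAL avoids float rounding issues for money
    picture_url VARCHAR(512),
    scraped_at  DATETIME         -- when the row was scraped, handy for step 3
);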
2. Use the Linux cron feature to periodically run the server-side script that does the scraping (it can be PHP / Ruby / JavaScript), as in the example below.
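For instance, a crontab entry like this (edit with `crontab -e`; the paths are placeholders you'd adapt) runs a PHP script every 30 minutes and appends its output to a log:
Code
# m h dom mon dow command
*/30 * * * * /usr/bin/php /home/you/scraper/run.php >> /var/log/scraper.log 2>&1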
3. Use a scraping library; that will save you lots of time.
Type "FriendsOfPHP/Goutte" into Google, and see the sketch below for an idea of how it's used.
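Just to give you an idea, here is a minimal sketch with Goutte (the URL and CSS selectors are placeholders for whatever site you target; install it with `composer require fabpot/goutte`):
Code
<?php
require_once 'vendor/autoload.php';

use Goutte\Client;

$client = new Client();
// fetch the listing page -- replace the URL with the site you want to scrape
$crawler = $client->request('GET', 'https://example.com/products');

// loop over every node matching the '.product' CSS selector (a placeholder)
$crawler->filter('.product')->each(function ($node) {
    $name  = $node->filter('.product-name')->text();
    $price = $node->filter('.product-price')->text();
    echo $name . ' => ' . $price . "\n"; // here you'd save to the DB instead
});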
4. Use an ORM to insert your scraped data into the DB more easily. I copy-pasted this usage example from the Doctrine2 documentation:
Code
<?php
// create_product.php <name>
require_once "bootstrap.php"; // sets up $entityManager, see the Doctrine2 setup docs

$newProductName = $argv[1];

// create the entity; Doctrine generates the INSERT for you
$product = new Product();
$product->setName($newProductName);

$entityManager->persist($product); // queue the insert
$entityManager->flush();           // execute it against the database

echo "Created Product with ID " . $product->getId() . "\n";
5. The same goes for filling the Excel sheet: use a PHP library (check PHPExcel on GitHub), something like the sketch below.
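A minimal sketch with PHPExcel (the file name and cell layout are just examples):
Code
<?php
require_once 'Classes/PHPExcel.php'; // or vendor/autoload.php if installed via Composer

// build a workbook with a header row and one data row
$excel = new PHPExcel();
$sheet = $excel->getActiveSheet();
$sheet->setCellValue('A1', 'Name');
$sheet->setCellValue('B1', 'Price');
$sheet->setCellValue('A2', 'Example product'); // replace with your scraped data
$sheet->setCellValue('B2', 19.99);

// write it out as an .xlsx file
$writer = PHPExcel_IOFactory::createWriter($excel, 'Excel2007');
$writer->save('products.xlsx');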
If you feel like you don't want to deal with all of this, I can write it for you; feel free to PM me.