For our webapp (a car marketplace) we scrape data from foreign websites with a custom tool written in Python (the Tool). All the scraped data is collected and prepared in the Tool's own database, an SQLite database.
We then migrate the data (with a PHP script) to our website database (MySQL).
This migration process is slow and we have loading issues:
- The Tool holds about 2 million records, and the website database should handle the same 2 million records.
- The database currently has about 20 tables; the table count will grow to about 40 later.
- The website database is updated on average every 5 minutes with about 15,000 records.
- The records are car data (make, model, etc., no images or videos), roughly 20 columns per record.
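To make the pipeline concrete, here is a minimal sketch of the transfer step in Python (our real migration script is PHP; this is only to illustrate the idea). The file name `tool.db`, the `cars` table, the connection settings, and the three columns shown are placeholders for our roughly 20-column car records. The sketch batches reads with fetchmany and writes with executemany, committing once per batch rather than once per row.

```python
# Sketch of a batched SQLite -> MySQL transfer. All names/credentials are placeholders.
import sqlite3
import mysql.connector  # MySQL Connector/Python

BATCH_SIZE = 5000

src = sqlite3.connect("tool.db")  # the Tool's SQLite database
dst = mysql.connector.connect(host="localhost", user="app",
                              password="secret", database="website")

read_cur = src.cursor()
write_cur = dst.cursor()

# Only 3 of the ~20 columns are shown here, for brevity.
read_cur.execute("SELECT make, model, price FROM cars")
insert_sql = "INSERT INTO cars (make, model, price) VALUES (%s, %s, %s)"

while True:
    rows = read_cur.fetchmany(BATCH_SIZE)
    if not rows:
        break
    write_cur.executemany(insert_sql, rows)  # one multi-row insert per batch
    dst.commit()                             # commit per batch, not per row

dst.close()
src.close()
```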
Performance has the highest priority; we are looking for the solution with the best performance.
Database specialists only, please: we already have a large team of good developers.