How to upload db of 6M users?

dante

Member
Hi,

I have just compiled all my recent data and it's around 6-7M rows, but as we know a .csv opened in Excel tops out at around 1M rows.

So I am wondering if I can upload it as an Access file instead?

I certainly don't want to upload 6-7 separate files and send the same campaign to all of them; that would just increase my workload and time.

Thank you so much!
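
For reference, the ~1M cap is Excel's row limit rather than a limit of the .csv format itself, so the file should still hold all 6-7M rows. One rough way to confirm the real row count without opening it in Excel is a small script along these lines (the filename and the header-row assumption are placeholders):

```python
# Count the records in a large CSV without opening it in Excel.
import csv

def count_rows(path: str) -> int:
    with open(path, newline="", encoding="utf-8", errors="replace") as f:
        reader = csv.reader(f)
        next(reader, None)              # skip the header row, if there is one
        return sum(1 for _ in reader)   # counts records, not raw lines

if __name__ == "__main__":
    print(count_rows("subscribers.csv"))  # hypothetical filename
```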
 
Hi guys,

Thanks for the suggestions. I uploaded the 6M-user CSV list via the CLI import option, added the cron job and everything, and it was working really well for a few hours, but now it has just stopped importing at "672,236 / 672,262" and won't go any further.

What should I do? Where can I see what is causing this issue?
 
#1
Did you create a new database (e.g. in phpMyAdmin, MySQL Workbench, or at the SQL CLI) solely for the purpose of importing the CSV? That should have imported everything.
#2
Did you then import that MySQL database into the MailWizz database? (/customer/index.php/lists/ab1234c5de678/import#database-import-modal)
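
A rough sketch of what step #1 could look like, assuming Python with PyMySQL and MySQL's LOAD DATA LOCAL INFILE; the database name, table, columns, and credentials below are placeholders, not anything MailWizz-specific:

```python
# Step #1 sketch: load the big CSV straight into a scratch MySQL table.
# Requires `pip install pymysql` and local_infile enabled on the MySQL server.
import pymysql

conn = pymysql.connect(
    host="localhost",
    user="import_user",          # placeholder credentials
    password="secret",
    database="staging_import",   # the new database created just for this CSV
    local_infile=True,
)

with conn.cursor() as cur:
    # Scratch table layout is hypothetical; match it to the CSV columns.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS subscribers_raw (
            email VARCHAR(255),
            fname VARCHAR(100),
            lname VARCHAR(100)
        )
    """)
    # Server-side bulk load; far faster than inserting row by row.
    cur.execute("""
        LOAD DATA LOCAL INFILE 'subscribers.csv'
        INTO TABLE subscribers_raw
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES
        (email, fname, lname)
    """)
conn.commit()
conn.close()
```

Step #2 would then presumably be done from the MailWizz list import page linked above, pointed at that staging database.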
 
Damn! Didn't do any of that.

Will try now.
BTW the import resumed and is now stuck at 704,152 / 704,204.
 
Just to avoid a misunderstanding: in step #1, the CSV gets imported directly into that new database (not via MailWizz), using whichever interface you prefer (phpMyAdmin, Workbench, or the CLI).
 
@dante - Can you check, maybe it lost connectivity?
Yup, that's not the issue.

Actually it is working; the data I uploaded had roughly 10x dupes lol, so the dupes + bad data got removed and all I was left with was ~700k.

I did a dupe check in Access and that's how I found out there were a shit ton of dupes lol.
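
The same kind of dupe check can also be done with a short script instead of Access; a minimal sketch, assuming the email address is in the first column (the filename and column position are placeholders):

```python
# Count duplicate email addresses in a big CSV without loading it into Access.
import csv
from collections import Counter

def dupe_report(path: str, email_col: int = 0) -> None:
    counts = Counter()
    with open(path, newline="", encoding="utf-8", errors="replace") as f:
        reader = csv.reader(f)
        next(reader, None)                         # skip the header row
        for row in reader:
            if row:
                counts[row[email_col].strip().lower()] += 1
    total = sum(counts.values())
    unique = len(counts)
    print(f"{total} rows, {unique} unique emails, {total - unique} duplicates")

dupe_report("subscribers.csv")  # hypothetical filename
```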

So to test this I uploaded another 1.3M-row CSV and then bingoooo... all the data was uploaded in a few hours. Didn't even need to mess with MySQL.

I only enabled the CLI import option and added the cron job for it.

Thanks all, especially @frm.mwz and @Vpul Shah :)
 
Which import method is the fastest: direct from the database (localhost) or from a file? ... For #2 I am not understanding the steps.
"Did you then import that MySQL database into the MailWizz database? (/customer/index.php/lists/ab1234c5de678/import#database-import-modal)" ... Are you saying that once we import the dbase.sql locally, we can use that import page?
 
depends on the db structure

the db import is live

Is there any way to run the live db import from the command line, so it runs in the background rather than in the browser? I thought about doing it manually with the MySQL shell, or is there a specific console.php command I can issue to run the live db import in the background?
 
There isn't one atm; this is what I have been trying to say in my last few messages...
Well, can I mimic the SQL statement MailWizz uses to do the live transfer? I can manually connect to MySQL from the shell and let the transfer run; I would then disconnect from the shell and let it keep running in the background. If so, do you know the statement?
 