How to increase the speed of list import

pradeep sharma

Active Member
Hi Everyone, @twisted1919 @Rob @Salman Habib @Krasimir Nikolov
I am trying to import 2 million email addresses into MailWizz, but it is taking a lot of time. I checked my servers and they are only 5-7% utilized while the import is running. I have an 8-core CPU with 16 GB RAM on both the MailWizz application server and the remote MySQL database server (both the application and the database are in the same private network, in the same DC).

I have selected the following import settings:
Memory Limit: 5 GB
File Size: 200 MB
Import at Once: 20000
Pause: 1 sec
My php.ini settings are very high, and my my.cnf settings are also very high.
I am uploading a file of 100k subscribers at a time and it takes around 30-40 minutes on average to process.
What is wrong with my setup?
How can I increase this import speed?

Regards

Pradeep
 
Hello Pradeepji,

In your php.ini, set the upload file size to 400 MB and allocate 10 GB of RAM; in the import settings, set Import at Once to 5k.

Try these settings and see how it works.
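
It is also worth verifying which limits PHP actually loaded, since the CLI and the web server can read different php.ini files, and the values you edit may not be the ones in effect. A minimal sketch:

=
<?php
// Quick sanity check: print the limits PHP actually loaded. The CLI and the
// web server can read different php.ini files, so the values you edited may
// not be the ones in effect for the import.
$directives = ['memory_limit', 'upload_max_filesize', 'post_max_size', 'max_execution_time'];
foreach ($directives as $directive) {
    printf("%-20s = %s\n", $directive, ini_get($directive));
}
=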
 
Hello,
Any news from Pradeep?
I have the same problem. I need to import 12 million email addresses and I don't have much time.
 
@Leo El - the upcoming version will split large lists into smaller ones and process them in parallel, which will be faster.
Thank you for the reply. For now I am using the console.php list-import folder command, and I split the files into abc-1.csv, abc-2.csv, etc. after reading your previous posts. When I run the PHP command, I can see in top that multiple files are being imported. Now, if I want to add more files and I rerun the import command, will the script go into a loop and try to reimport abc-1 and abc-2, will it detect the previous cron running and only import one at a time, or can it detect the last record updated and resume?
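
For reference, the splitting itself can be scripted. A minimal sketch, assuming a plain CSV with a header row; the file names and the 10,000-row batch size are only illustrative:

=
<?php
// Minimal sketch: split abc.csv into abc-1.csv, abc-2.csv, ... with at most
// 10,000 records each, repeating the header row in every chunk so each file
// can be imported on its own.

$source  = 'abc.csv'; // illustrative input file
$maxRows = 10000;     // records per output file

$in     = fopen($source, 'r');
$header = fgetcsv($in);

$out  = null;
$part = 0;
$rows = 0;

while (($row = fgetcsv($in)) !== false) {
    if ($out === null || $rows === $maxRows) {
        if ($out !== null) {
            fclose($out);
        }
        $out  = fopen(sprintf('abc-%d.csv', ++$part), 'w');
        $rows = 0;
        fputcsv($out, $header); // repeat the header in every chunk
    }
    fputcsv($out, $row);
    $rows++;
}

if ($out !== null) {
    fclose($out);
}
fclose($in);
=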
 
@Leo El - the way the command-line import works, it loads at most 10 files and sets a lock so that successive calls will not process anything, to avoid processing the same file multiple times. These 10 files are processed in parallel. When they are done, they are removed from the server and another 10 can be processed, and so on.
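
For anyone curious what that lock behaviour looks like in practice, here is a minimal sketch (not MailWizz's actual code; the lock path and import routine are placeholders) of the pattern described above:

=
<?php
// Sketch of the lock-and-batch pattern: take at most 10 files per run and
// hold a lock so an overlapping cron run exits immediately instead of
// re-importing the same files.

$lock = fopen('/tmp/list-import.lock', 'c'); // hypothetical lock file

if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit(0); // a previous run is still active; do nothing
}

// At most 10 files per run; processed files are deleted afterwards, so a
// rerun simply picks up whatever is left in the folder.
$files = array_slice(glob('/path/to/import-folder/*.csv') ?: [], 0, 10);

foreach ($files as $file) {
    importFile($file); // MailWizz runs these in parallel; sequential here
    unlink($file);     // remove the file once it has been imported
}

flock($lock, LOCK_UN);
fclose($lock);

function importFile(string $file): void
{
    echo "Imported {$file}\n"; // placeholder for the real import logic
}
=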
 
@Leo El - the upcoming version will split large lists into smaller ones and process them in parallel, which will be faster.

It splits the list into batches of 10k each; this number is hardcoded (10,000), so a 2-million-subscriber list, for example, becomes 200 files.

=
/**
* @var int maximum number of records allowed per file.
* Above this number, files will be split into smaller files
*/
public $max_records_per_file_split = 10000;
=

Can it be configured somewhere?
 
Hello Shreem Digitals,

Check your import settings: for the file size, allow something like 10 MB.

You can set a Pause if you need one.

Thanks,
VPul.
 