How to increase the Speed of List Import

Discussion in 'General discussions' started by pradeep sharma, Dec 1, 2015.

  1. pradeep sharma

    pradeep sharma Active Member

    Hi Everyone, @twisted1919 @Rob @Salman Habib @Krasimir Nikolov
    I am trying to import 2 million email addresses into MailWizz, but it is taking a lot of time. I checked my servers and they are only 5-7% utilized while the import is running. Both the MailWizz application server and the remote MySQL database server have an 8-core CPU with 16 GB of RAM, and they sit on the same private network in the same data center.

    I have selected the following import settings:
    Memory Limit: 5 GB
    File Size: 200 MB
    Import at Once: 20,000
    Pause: 1 sec
    My php.ini settings are already very generous, and so are my my.cnf settings.
    I am uploading a file of 100k subscribers at a time and it is taking around 30-40 minutes on average to process.
    What is wrong with my setup?
    How can I increase this import speed?

    Regards

    Pradeep
     
  2. Salman Habib

    Salman Habib Member

    Can you please share your my.cnf settings? That could be the cause of the issue.
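    In particular, values along these lines usually decide how fast bulk inserts go (illustrative numbers only, tune them for your 16 GB of RAM):

        [mysqld]
        # let InnoDB keep most of the imported data in memory
        innodb_buffer_pool_size = 8G
        # a larger redo log reduces flushing during heavy inserts
        innodb_log_file_size = 512M
        # relax durability a little to cut fsync overhead during imports
        innodb_flush_log_at_trx_commit = 2
        max_allowed_packet = 256M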
     
  3. Vpul Shah

    Vpul Shah Active Member Support Staff

    Hello Pradeepji,

    In your php.ini, raise the upload file size to 400 MB and allocate 10 GB of RAM, then set "Import at once" to 5k in the MailWizz import settings.
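    Roughly, that maps to php.ini values like these (illustrative only, adjust to your own server):

        upload_max_filesize = 400M
        post_max_size = 400M
        memory_limit = 10240M
        ; no execution time limit while large imports run
        max_execution_time = 0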

    Try these settings and see how it works.
     
  4. Teebox

    Teebox Member

    Hello,
    Any news from pradeep?
    I have the same problem. I need to import 12 million email addresses and I don't have much time.
     
  5. pradeep sharma

    pradeep sharma Active Member

    Use the CLI import option; the import will run in the background,
    so there is no need to wait in the frontend.
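    For example, a cron entry along these lines (the path to your MailWizz installation will differ):

        # runs the folder-based list import in the background every 5 minutes
        # (install path is only an example)
        */5 * * * * /usr/bin/php -q /var/www/mailwizz/apps/console/console.php list-import folder >/dev/null 2>&1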
     
  6. Teebox

    Teebox Member

    So, should I choose "Queue import CSV"?
     
  7. Kyrah414

    Kyrah414 Member

    @pradeep sharma how much time did it take for your 100k with the CLI method?
     
  8. Leo El

    Leo El New Member

    Bump. Does anyone have recommendations for increasing the import speed?
     
  9. twisted1919

    twisted1919 Administrator Staff Member

    @Leo El - the upcoming version will split large lists into smaller ones and process them in parallel, which will be faster.
     
  10. Leo El

    Leo El New Member

    Thank you for the reply. For now I am using console.php list-import folder, and I split the files into abc-1.csv, abc-2.csv, etc., based on your previous posts. When I run the PHP command, I can see in top that multiple files are being imported. Now, if I want to add more files and rerun the import command, will the script go into a loop and try to reimport abc-1 and abc-2, will it detect the previous cron running and only import one at a time, or can it detect the last record updated and resume?
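    (For reference, I split the file with something along these lines - GNU split, and the file names are just an example:)

        # split subscribers.csv into ~100k-row chunks named abc-00.csv, abc-01.csv, ...
        # keep the CSV header row aside and re-add it to every chunk
        head -n 1 subscribers.csv > header.csv
        tail -n +2 subscribers.csv | split -l 100000 -d --additional-suffix=.csv - abc-
        for f in abc-*.csv; do cat header.csv "$f" > "$f.tmp" && mv "$f.tmp" "$f"; done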
     
  11. twisted1919

    twisted1919 Administrator Staff Member

    @Leo El - the way the command line import works, it loads at most 10 files and sets a lock so that successive calls will not process anything, to avoid processing the same file multiple times. These 10 files are processed in parallel. When they are done, they are removed from the server and another 10 can be processed, and so on.
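    (A minimal sketch of the locking idea, not the actual implementation:)

        <?php
        // Illustration only: take an exclusive, non-blocking lock so that
        // overlapping cron runs exit instead of picking up the same files again.
        $fp = fopen('/tmp/mailwizz-list-import.lock', 'c');
        if ($fp === false || !flock($fp, LOCK_EX | LOCK_NB)) {
            exit(0); // another import run already holds the lock
        }
        // ... load at most 10 files, import them in parallel, remove them when done ...
        flock($fp, LOCK_UN);
        fclose($fp);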
     
  12. Rodrigo Bustos

    Rodrigo Bustos New Member

    It splits into batches of 10k each; this number is hardcoded (10,000):

        /**
         * @var int maximum number of records allowed per file.
         * Above this number, files will be split into smaller files
         */
        public $max_records_per_file_split = 10000;

    Can it be configured somewhere?
     
