Query from hell :)

Joseph smith

New Member
Hi there,

I've started trying out a couple of MySQL performance tips and everything is running well, except this:
EXPLAIN SELECT COUNT(DISTINCT t.subscriber_id) AS counter
FROM `mw_list_subscriber` `t`
LEFT JOIN `mw_list_field_value` `fieldValues308` ON (`fieldValues308`.`subscriber_id` = `t`.`subscriber_id`)
LEFT JOIN `mw_list_field_value` `fieldValues299` ON (`fieldValues299`.`subscriber_id` = `t`.`subscriber_id`)
LEFT JOIN `mw_list_field_value` `fieldValues305` ON (`fieldValues305`.`subscriber_id` = `t`.`subscriber_id`)
WHERE (t.list_id = '16')
  AND (t.status = 'confirmed')
  AND (`fieldValues308`.`field_id` = 308 AND (CAST(`fieldValues308`.`value` AS UNSIGNED) > 1))
  AND (`fieldValues299`.`field_id` = 299 AND (CAST(`fieldValues299`.`value` AS UNSIGNED) = 4))
  AND (`fieldValues305`.`field_id` = 305 AND (`fieldValues305`.`value` < '2017-09-09'));
I'm using mytop, and according to it the query spends a lot of time per state, mostly in "storing" and "sending".
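Since the LEFT JOINs are immediately turned into inner joins by the WHERE conditions, one thing worth trying (just a sketch based on the table and column names in the post, not a verified fix) is replacing the three self-joins on `mw_list_field_value` with EXISTS subqueries. This avoids the join fan-out, so the expensive COUNT(DISTINCT ...) can become a plain COUNT(*):

```sql
SELECT COUNT(*) AS counter
FROM mw_list_subscriber t
WHERE t.list_id = 16
  AND t.status = 'confirmed'
  AND EXISTS (SELECT 1 FROM mw_list_field_value v
              WHERE v.subscriber_id = t.subscriber_id
                AND v.field_id = 308
                AND CAST(v.value AS UNSIGNED) > 1)
  AND EXISTS (SELECT 1 FROM mw_list_field_value v
              WHERE v.subscriber_id = t.subscriber_id
                AND v.field_id = 299
                AND CAST(v.value AS UNSIGNED) = 4)
  AND EXISTS (SELECT 1 FROM mw_list_field_value v
              WHERE v.subscriber_id = t.subscriber_id
                AND v.field_id = 305
                AND v.value < '2017-09-09');
```

Each EXISTS can stop at the first matching row per subscriber instead of materializing every matching `mw_list_field_value` row, which is usually what drives the "storing" time on queries like this. Whether it actually wins here depends on the indexes present.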

Any replies or answers would be appreciated!!! Thanks for everything. Ahh, in addition, just for info:

This query takes a long time to complete. While debugging it, check this out:
These are my config settings, or rather the important variables I've modified in MySQL.
Specs: cloud server => 8 virtual cores, 30 GB RAM, 60 GB SSD

table_open_cache = 30000
query_cache_limit = 5M
query_cache_size = 256M
query_cache_type = 1
tmp_table_size = 64M
max_heap_table_size = 64M
table_open_cache = 512
table_open_cache_instances = 16
join_buffer_size = 512K
sort_buffer_size = 512K
read_rnd_buffer_size = 512K

innodb_flush_method = O_DIRECT
innodb_flush_log_at_trx_commit = 2
innodb_buffer_pool_instances = 10

innodb_io_capacity = 9000
innodb_io_capacity_max = 9200

innodb_read_io_threads = 8
innodb_thread_concurrency = 8
innodb_write_io_threads = 8
innodb_checksum_algorithm = crc32
innodb_buffer_pool_size = 10G
innodb_log_file_size = 256M
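One quick sanity check on that config (a suggestion, not something from the original thread): with 30 GB of RAM and `innodb_buffer_pool_size = 10G`, it's worth confirming the working set actually fits in the buffer pool before tuning anything else. The standard status counters show this:

```sql
-- Innodb_buffer_pool_read_requests = logical reads (from memory);
-- Innodb_buffer_pool_reads          = reads that had to go to disk.
-- If the second counter keeps climbing relative to the first while the
-- slow query runs, the buffer pool is too small for the working set.
SHOW GLOBAL STATUS LIKE 'Innodb_buffer_pool_read%';
```

Also note that `table_open_cache` appears twice in the list (30000 and 512); MySQL will use whichever assignment comes last in the actual my.cnf, so it's worth checking which value is really in effect with `SHOW VARIABLES LIKE 'table_open_cache';`.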
This is what's happening inside those segments; I haven't found a better/faster way to do this.
The query itself, if you run EXPLAIN on it, is pretty good. The thing is there are a lot of rows to scan, which automatically means a lot of time.
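If the row scans are the bottleneck, a covering composite index on the value table can shrink them considerably. A sketch, assuming no such index exists yet (the index name is made up; the `value` prefix length may need adjusting if the column is a long VARCHAR/TEXT):

```sql
-- Lets each join/subquery look up exactly the (subscriber, field) pair it
-- needs and evaluate the value predicate from the index, instead of
-- scanning all field values for the subscriber.
ALTER TABLE mw_list_field_value
  ADD INDEX idx_sub_field_value (subscriber_id, field_id, value(20));
```

Note that the `CAST(value AS UNSIGNED)` comparisons still can't be resolved purely from the index order (the cast defeats range use on `value`), but the index still narrows the scan to the matching `field_id` rows per subscriber.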
Fully understandable. I'm going to try a couple of things (a custom controller / database query). If I make any improvements in the application, I'll reply here; I think I can share what I learn in that case.

Great app, again!