Hi, we are trialling the wpDataTables plugin and it looks promising. However, a major issue we've encountered is really poor performance from tables that query very large external SQL database tables; some of our tables have roughly ~137 million rows. It takes 12-15 minutes to load the first 25 entries. Both the external SQL database (MariaDB) and the WordPress installation are in Docker containers running on the same server. Any help/ideas to improve the loading times? Thanks, Mitch
I've done some further testing today and trialled the WP Data Access plugin. Running the same SQL query, the data is returned in seconds, as opposed to the wpDataTables query, which takes minutes, so it's looking very much like an issue with the wpDataTables plugin. I have moved the website to https://web.dcssentinel.com.au/ and I've set up two pages with the corresponding plugins.
Firstly, I would like to sincerely apologize for the delayed response as we have been experiencing an unusually high number of tickets. I am sorry that it has taken longer than usual to respond to your concern and your patience is highly appreciated.
I am not able to load the related page - I also tried with a VPN, and it says the site is down from my end.
Can you check whether the page loads on your end, or maybe it's not the correct URL?
We are not fully familiar with how the WP Data Access plugin renders SQL queries, so we can't comment on that.
We can only advise in regards to our plugin.
Our plugin's SQL query functionality is not the same as running the query through database management software (e.g. phpMyAdmin).
The logic is based on a PHP SQL parser which has full support for the SQL dialect for the following statement types:
SELECT, INSERT, UPDATE, DELETE, REPLACE, RENAME, SHOW, SET, DROP, CREATE INDEX, CREATE TABLE, EXPLAIN and DESCRIBE.
Some of them are disabled for security reasons.
Filtering, sorting, and search may not work properly if the query includes more complex constructs (e.g. JOINs or subqueries).
You can try preparing a MySQL view which will return the data that you need (call it e.g. "view1"), and then build a wpDataTable based on a simple query like "SELECT * FROM view1".
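As a rough sketch, the view approach could look like this on the database side (the table name "sensor_readings" and the column names are hypothetical placeholders, not taken from your setup):

```sql
-- Hypothetical example: substitute your real table and column names.
-- Define the view once, so the complex filtering/selection happens
-- inside MariaDB rather than in the plugin's SQL parser:
CREATE VIEW view1 AS
SELECT reading_time, loop_value
FROM sensor_readings
WHERE reading_time >= '2024-01-01';
```

wpDataTables would then be pointed at the trivial query "SELECT * FROM view1", which the parser handles without issues.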
The loading speed depends on a lot of factors - your hosting plan, server's performance, amount of other data on the page, internet speed, etc.
For example, you can take a look at the table in our documentation, which contains around 2,600,000 rows, and see how it loads. The more data the table has, the more chance it will impact the loading time.
If it takes 12-15 minutes to just load the first 25 entries and server-side processing is enabled, that sounds strange.
Maybe you have too many unique values for filters on some columns.
Can you please check whether you have too many entries in the filters?
All of those values are rendered into the HTML of the page, and that HTML then passes through a piece of WordPress core code that cannot handle that much data.
Check, in the Data tab of each column's settings, how many possible values are allowed to load for filtering:
- When this is set to "All" (and you have over 12,000 unique values for the filter), the content breaks.
The solution is to add this to the functions.php file of your theme, or child theme
(we recommend using a child theme, because updating a theme can overwrite this):
// Disable WordPress's smiley conversion, which runs over the page HTML
// and struggles with very large table output.
add_filter( 'option_use_smilies', '__return_false' );
Let me know if that helped.
If not, we can get help from our 2nd level Team to try to advise you how you might increase the loading speed.
We would need access to check the back-end of the table, though.
Please provide me a temporary WP-admin (administrator) user for your site where this happens, so we could log in and take a look ‘from the inside’ as that’s the most efficient way to see and resolve the issue.
We do not interfere with any data or anything else except for the plugin (in case that’s a production version of the site), and of course, we do not provide login data to third parties.
You can write the credentials here; just check "PRIVATE Reply" so nobody can see them except us.
And point us to the table ID.
Could you also send us the SQL query that is used, and an SQL dump of the tables from your database, so that we can try to reproduce the slow-load issue on our test server/site?
(Of course, if any of the data you send is sensitive, please mark the reply as Private.)
Miloš Jovanović [email protected]
Apologies about the access to the webpage. I had commented on my own post that I had changed the URL. The new address is https://web.dcssentinel.com.au/
I had already created a view and am querying it, as per your documentation.
I checked the filtering of columns for unique values as you suggested, but I'm not seeing that option - it appears it is only relevant for string data types? The table I'm querying has a single Datetime column and the remainder are floats.
I've created an SQL dump file and uploaded it to Dropbox; the file is pretty big, ~8GB.
The data table ID is 2
Thank you for the access, and for the SQL dump - I downloaded it.
We will do our best to isolate the issue as quickly as possible, and I will keep you posted.
I can see that table ID 2 is loading extremely slowly even on the back-end.
Could you please send us the SQL Query that is behind the View as well?
The view query is SELECT * FROM `DragLoops`
My apologies once again for all this waiting time.
My senior colleagues from the 2nd level are working on this, and we can't seem to get a working solution for you.
It seems that the sheer size of the data set is the biggest obstacle for our SQL parser,
but I will give them some more time to confirm that for us; I don't wish to conclude anything without being sure.
We will do our best to confirm this for you as quickly as possible, since you have been waiting so long already.
As soon as they give me new information, I will report back right away.
Thank you for your patience.
Our 2nd level Team went through everything with me again.
Unfortunately, we tried everything, but we are not able to create a table from the SQL dump that you sent.
The servers on our test sites have been optimized by our developers, but even they are not able to handle the magnitude of this data set;
there are simply too many cells in the data, unfortunately.
So, on our end, on our test servers, we are not even able to load the data to create tables.
The limitation comes from the servers, so, unfortunately, there is nothing we can do to help with a data set that is as large as this.
The best we can advise is to try to get the best possible server.
We conclude that the issue does not come from our plugin; rather, the sheer number of cells in the data is going to be difficult for almost any server to load properly without crashing or being very slow.
I am truly sorry that we are not able to provide a working solution at this time for this kind of data set.