As the years passed and I added tables to my project, I noticed that everything was getting slower. Now that I have reached 94 tables, populating a list from a lookup field, even with a small number of records, takes 15 - 25 seconds, which makes the application very problematic to use.
I noticed that:
1) when I select a lookup field, the MySQL server's CPU stays near 100% for as long as it takes to populate the list
2) when a user has limited access to tables (two or three tables), the lookup operation is very fast, almost instantaneous.
Why does it behave like this?
How can I improve performance?
ALex.
Reached 94 tables in my project -> lookup very slow
Attachments: Clipboard01.jpg (211.2 KiB)
Re: Reached 94 tables in my project -> lookup very slow
Can no one confirm that an increase in the number of tables can cause a severe slowdown in field lookups?
Our application has become almost unusable.
ALex.
Re: Reached 94 tables in my project -> lookup very slow
If this was my project, I'd start narrowing down.
The first thing that comes to mind: is it really a matter of the number of tables, or a matter of the server-side query fetching data for your lookup field? Since performance changes depending on permissions, I would debug the final SQL statement (the one causing the delay) and consider adding useful indexes to the relevant tables.
In another project, when we were getting timeouts after 2 minutes (the query never came to an end), I was able to reduce query time to less than 3 seconds by rethinking the SQL statement and by adding two indexes to one table. That case was not due to standard lookups but due to custom code, but it proved to me that we can boost standard generated AppGini code just by improving the database when it comes to large amounts of data.
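As a sketch of that debugging approach (the table and column names below are hypothetical, not from the original post), you can run the captured lookup query through EXPLAIN and add an index where MySQL reports a full table scan:

```sql
-- Hypothetical example: a lookup dropdown reading from a "customers" table.
-- EXPLAIN shows whether MySQL scans the whole table for this query:
EXPLAIN SELECT id, name FROM customers WHERE name LIKE 'Ac%' ORDER BY name;

-- If the plan shows "type: ALL" (full table scan), an index on the
-- filtered/sorted column usually helps:
CREATE INDEX ix_customers_name ON customers (name);

-- Re-run EXPLAIN afterwards; the access type should improve to "range" or "ref".
```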
Just my 5 cents.
Kind regards,
<js />
My AppGini Blog:
https://appgini.bizzworxx.de/blog
You can help us help you:
Please always put code fragments inside [code]...[/code] blocks for better readability
AppGini 24.10 Revision 1579 + all AppGini Helper tools
Re: Reached 94 tables in my project -> lookup very slow
You can usually solve database bottlenecks by checking which fields are being frequently used in WHERE conditions in slow queries and adding an index to those fields. This dramatically increases performance. In modern versions of AppGini, we added a Query logs page in the admin area (accessible from the utilities menu). This page lists slow queries so you can check the exact queries slowing down the server.
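If you are on an older AppGini version without the Query logs page, MySQL's own slow query log can capture the same information. A minimal sketch (assumes you have admin privileges on the MySQL server; thresholds are illustrative):

```sql
-- Enable the slow query log at runtime and log any query slower than 1 second:
SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 1;

-- Show where MySQL writes the log file so you can inspect the captured queries:
SHOW VARIABLES LIKE 'slow_query_log_file';
```

The WHERE columns appearing repeatedly in that log are the usual candidates for new indexes.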
AppGini plugins to add more power to your apps:
- DataTalk is an innovative AppGini plugin based on ChatGPT that allows you to interact with your AppGini database using natural language questions, without writing any SQL. Check the demo video
- Mass Update plugin: Update multiple records at once and improve your workflow efficiency.
- Check our other plugins and get a generous discount of up to 30% when buying 2 or more plugins.
Re: Reached 94 tables in my project -> lookup very slow
Hi,
I want to suggest running "ANALYZE", "REPAIR" and "OPTIMIZE" on the database tables.
I once transferred tables via SQL to another server, but that (for whatever reason) did not seem to create the indexes right away. After running ANALYZE it was smooth (= fast) again.
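For reference, those maintenance statements can be run directly from any MySQL client; the table names below are placeholders for your own tables:

```sql
-- Rebuild index statistics so the optimizer picks good query plans:
ANALYZE TABLE customers, orders;

-- Rebuild/defragment table and index data:
OPTIMIZE TABLE customers, orders;

-- REPAIR TABLE only applies to MyISAM/ARCHIVE/CSV tables (InnoDB is not supported):
REPAIR TABLE legacy_myisam_table;
```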
Olaf
Some postings I was involved in that you might find useful:
SingleEdit - Prevent concurrent edits on records; Field Permissions; Column-Value-Based-Permissions; Custom (error) message; Audit Log; Backup your database; Two Factor Authentication; Block brute force (failed) logins; Add 2nd SAVE CHANGES button; Place a search on details view