DBGrid Export Excel


N.Marzio

I have the following Query that returns approximately 8,000,000 records. It takes 8 seconds to run and 3 minutes to display on the grid.
SELECT H.Empresa, H.Empleado, H.Periodo, H.Liquidacion, H.NroCorrLiquidacion, H.Concepto, H.NroCorrConcepto, H.Devengamiento, H.Unitario, H.Importe, H.Porcentaje, H.Cantidad, H.Convenio, H.Agrupamiento, H.CCostoNivelUno, H.CCostoNivelDos, H.CCostoNivelTres, H.CCostoNivelCuatro, H.CCostoNivelCinco
FROM LqEmpleadosActivos LQ INNER JOIN HiDetalle H ON LQ.Empresa = H.Empresa AND LQ.Empleado = H.Empleado
WHERE LQ.Trabajaactualmente = 'S'
Is there a way to make the information display faster in the grid?
I have tried using AdsQuery and FDQuery and the result is the same.
Thanks

9 hours ago, ygerdel said:

I have the following Query that returns approximately 8,000,000 records. It takes 8 seconds to run and 3 minutes to display on the grid.

Hello,

Well, your query returns a lot of data.

You should use pagination.
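For instance, the paging can be pushed into SQL itself, so the grid only ever receives one page at a time. Here is a minimal sketch in Python/SQLite; the toy table is a stand-in for the real HiDetalle schema, and the names are only illustrative:

```python
import sqlite3

# Minimal paging sketch: a toy stand-in for the real HiDetalle table.
# Only one page of rows ever travels to the grid.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE HiDetalle (Empleado INTEGER, Importe REAL)")
conn.executemany("INSERT INTO HiDetalle VALUES (?, ?)",
                 [(i, i * 1.5) for i in range(100)])

PAGE_SIZE = 25

def fetch_page(page):
    """Return one page of rows instead of the whole result set."""
    cur = conn.execute(
        "SELECT Empleado, Importe FROM HiDetalle "
        "ORDER BY Empleado LIMIT ? OFFSET ?",
        (PAGE_SIZE, page * PAGE_SIZE))
    return cur.fetchall()

print(len(fetch_page(0)))  # 25 rows, not the full table
```

The same LIMIT/OFFSET idea maps directly onto MySQL queries issued from a Delphi dataset component; only the page the user is looking at is ever transferred.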


14 hours ago, ygerdel said:

Is there a way to make the information display faster in the grid?

Do you think any user can find useful information in 8 million records? Is he an alien, or a robot? 😁

You need to reduce the amount of data shown to users. No one on Earth needs to view 8 million records simultaneously.


1 hour ago, Tokay said:

Do you think any user can find useful information in 8 million records? Is he an alien? 😁

Hi,

Please don't take this as a joke. Depending on the project, it is a more common problem than you might think (maybe not with 8 million records 🙂).

In our case, we have tens of millions of installation points; for hundreds of thousands of them, we have to migrate their technology between this year and the next, with millions of service control records carrying their sales, objectives, offers and opportunities. A given operator profile may see a grid with only a few hundred records, but his boss sees a few thousand, his boss's boss several hundred thousand, and so on.

Thank goodness most high-profile users prefer executive summaries, but even so, it's very easy for a given grid to occasionally end up receiving many, many records. Filters initially help keep record counts "relatively" low, but those filters get altered simply because, for example, a manager is searching for some kind of anomaly affecting hundreds of thousands of installations for which he is responsible.

So the database is our most important tool, and its design is simply critical. In UniGUI, I usually solve the problem by designing different views depending on the profile (I usually need about 3 or 4 profile views for a simple project), but it is not the first nor the last time that a profile insists on / needs to see the same data as his subordinates, and UniGUI is simply not ready for that yet. I asked for advice a while ago, but without results.

Regards,

 


It's physically impossible to take in such an amount of data. Your software architecture is simply wrong. You need to select and show users only as much information as they can absorb and analyze; then neither you nor your users will have any issues with displaying millions of records.


Hahahaha... As you like... As I said, I only have this issue at very specific moments, when the wonderful design cannot be adjusted to what is requested. Hopefully you will never have that problem, but the fact is that the UniGUI grid is far from handling large datasets the way our Salesforce does. It's possible, of course, but I have to build that kind of grid myself.


21 minutes ago, Sherzod said:

Hello,

I think that, in any case, you don't need to display so much data in the grid.

Instead, you need to export the data on the server side, and send it to the client.

This is exactly what I do in those cases...

What I was trying to say is that UniGUI does not know how to optimize communication between the database server and the UniGUI server for grids whose datasets are very large (as, for example, Salesforce does)... We all know that no one is going to look in detail at a dataset of 300,000 lines (not even 2,000), but the UniGUI server cannot paginate without having the complete dataset, and the problem (for me) is the transfer time that operation needs.

This is what I need to do to simulate what Salesforce does: "Create a custom system to replace a large handful of very useful features of UniGUI's DBGrid: pagination/infinite scrolling, grid filters and sorters. That is, use a DBGrid without paging and implement a totally new paging system: do a "SELECT COUNT(*)" to retrieve the number of rows, use my own controls for pagination, and link them to SQL's "LIMIT OFFSET, ROWS" to retrieve only the desired page. And, obviously, disable all client-side filtering and sorting and replace it with server-side filtering and sorting."
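The custom paging described above (one COUNT for the total, LIMIT/OFFSET for the visible page, filtering and sorting pushed to SQL) can be sketched like this. Python/SQLite stand in for the Delphi/MySQL stack, and the sales table, column names and helper functions are invented for illustration:

```python
import sqlite3

# Sketch of a custom server-side paging system: count once, then fetch only
# the requested page, with filtering and sorting done in SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north" if i % 2 else "south", float(i))
                  for i in range(1000)])

def count_rows(where_sql="1=1", params=()):
    """SELECT COUNT(*) once, to size the custom pager controls."""
    # where_sql would come from a trusted server-side filter builder,
    # never raw user input.
    return conn.execute(
        f"SELECT COUNT(*) FROM sales WHERE {where_sql}", params).fetchone()[0]

ALLOWED_SORT = {"region", "amount"}  # whitelist sort columns

def fetch_page(page, page_size, sort_col="amount",
               where_sql="1=1", params=()):
    """Server-side filter + sort, returning only one page of rows."""
    if sort_col not in ALLOWED_SORT:
        raise ValueError(sort_col)
    sql = (f"SELECT region, amount FROM sales WHERE {where_sql} "
           f"ORDER BY {sort_col} LIMIT ? OFFSET ?")
    return conn.execute(sql, (*params, page_size, page * page_size)).fetchall()

total = count_rows("region = ?", ("north",))
page0 = fetch_page(0, 50, "amount", "region = ?", ("north",))
```

The grid then binds only to `page0`, and the pager controls use `total` to compute how many pages exist; client-side filter and sort widgets are disabled and re-issued as new SQL instead.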

 


11 minutes ago, Sherzod said:

I will try to test UniDBGrid for big data...

Thank you very much... Please pay attention to the first point, which is where most of the time is wasted (transferring the complete data set over the network):

"I think the normal sequence in a UniGUI project is the following:
1.    The MySQL server executes the query and returns the result to the UniGUI server. As I can only use client-side cursors with UniGUI controls, this huge amount of data travels from the MySQL server to the UniGUI server, consuming network resources and time.
2.    The UniGUI server sends a much smaller amount of data to the UniGUI client with a typical paged DBGrid, so this part is done quickly. A DBGrid's filters and sorters operate on data temporarily stored on the UniGUI server, so those operations are very fast."


I would work with a summarized data model for the first search, and for each record the user could expand the data, like a drill-down.

E.g.: you do the first search grouping by the most important information; this reduces the number of lines by millions.

From this first SELECT with a GROUP BY, your user selects the desired record in the grid and double-clicks it; that opens a new query based on the selected record. In this way you can chain queries until your user finds the pertinent information.

E.g.: I have a sales table with 30 million records.

The first search returns sales totals by region.

Selecting a region returns more sales detail, only for that region, and in every grid that opens I apply a new filter, until the user reaches the lowest level of detail.

Of course you don't need your tool to display 8 million rows in a grid; that is totally useless for any user. But you can get the desired information by guiding your user through the filtering steps.
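The drill-down chain described above is just two (or more) queries, each filtered by the row the user double-clicked at the previous level. A sketch in Python/SQLite; the tiny sales table and its values are purely illustrative:

```python
import sqlite3

# Drill-down sketch: a GROUP BY summary first, then a detail query filtered
# by the row the user selected. Schema and data are made up for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, city TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("north", "oslo", 10.0), ("north", "oslo", 5.0),
    ("north", "bergen", 7.0), ("south", "rome", 3.0),
])

# Level 1: totals by region -- a handful of rows instead of millions.
summary = conn.execute(
    "SELECT region, SUM(amount) FROM sales "
    "GROUP BY region ORDER BY region").fetchall()

# Level 2: the user double-clicks 'north'; only that region's detail
# is fetched, again grouped one level deeper.
detail = conn.execute(
    "SELECT city, SUM(amount) FROM sales WHERE region = ? "
    "GROUP BY city ORDER BY city", ("north",)).fetchall()
```

Each level's WHERE clause carries the keys of the selected row, so every grid in the chain stays small no matter how large the base table is.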

 

You say in your text:
"a manager is looking for some kind of anomaly that affects hundreds of thousands of installations"

It would be good if your application understood what an anomaly can be and displayed only the pertinent data, rather than letting your user search for it.

If the manager identifies such a record as wrong, and there is a rule by which he reaches that conclusion, then put that rule into the system and help the user find the records that really matter.

 


Thanks for the suggestions, but I'm afraid my English is so bad that I'm not being understood.

Let's start again, this time with a list:

  • I don't usually have problems with large data sets, because I use profiles, different views, filters, exports, etc. I told you our numbers only so you can see how easy it is to end up in a situation with a huge number of records. That's all.
  • Even so, in some cases a large data set is generated.
  • I ran a test with UniGUI against a similarly huge dataset in our Salesforce. Salesforce's grid ALWAYS starts displaying data within 2-3 seconds; the UniGUI grid needs 25-35 seconds.
  • After researching how they manage that, I changed the classic UniGUI grid's way of working, building my own paging system, and "voilà": the "new conceptual" grid now starts displaying data in 2-3 seconds. This is why I say that UniGUI does not work well with large data sets.
  • This solution is very fast, but it has its own problems and limitations, including the loss of many DBGrid features (infinite scrolling, paging, filters, etc.).
  • This is what I tried to convey to the UniGUI team when I said "what I need", but I see it has not been understood. I was simply trying to explain the need for a different type of grid for specific cases.
  • By the way, this issue is not specific to UniGUI. The same test in VCL gives similar results, because the main problem is transferring the data from the database server to the UniGUI server and building the data set there.

Hoping it's clearer now.

Regards,

 

