Statistics with SQL queries or in JavaScript

I have an Excel-like table with a *should* value and an *is* value for each day of a month:

descrip. |        | 01 | 02 | 03 | 04 |
_______________________________________
column 1 | should | 60 |  0 | 60 |  0 |
         | is     | 60 |  0 | 60 | 60 |
_______________________________________
column 2 | should |  0 | 15 |  0 | 15 |
         | is     |  0 |  0 |  0 | 15 |

I need the values of this table for two purposes:

  1. Extract some statistics (total, should/is ratio, etc.)
  2. Based on the values, see whether the actions (entries in a column) are done
    • just once
    • once every week
    • daily
    • periodically

Should I get the statistics with SQL queries, or calculate them with JavaScript based on the JSON response? What are the (dis)advantages of each approach?

Some additional information:

I’m using Spring Boot with JPA and a PostgreSQL database.

Tables:

  • Chart
  • Column
  • Action (should / is values)

Here is a part of my JSON response:

"columns": [{
        "id": 12,
        "should": [{
            "id": 13,
            "date": 1438552800000,
            "min": 60
        }],
        "is": []
    }]
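For the client-side option, a minimal sketch of the statistics calculation might look like this. It assumes a response shaped like the excerpt above (a `columns` array whose `should`/`is` entries carry a `min` value); the sample data beyond the excerpt is made up for illustration.

```javascript
// Compute totals and the should/is ratio for one column of the
// JSON response. Assumes each entry's "min" field is the value to sum.
function columnStats(column) {
  const sum = (entries) => entries.reduce((acc, e) => acc + e.min, 0);
  const shouldTotal = sum(column.should);
  const isTotal = sum(column.is);
  return {
    shouldTotal,
    isTotal,
    // Avoid dividing by zero when nothing was planned.
    ratio: shouldTotal === 0 ? null : isTotal / shouldTotal,
  };
}

// Hypothetical response extending the excerpt above with a second
// "should" entry, purely to make the ratio non-trivial.
const response = {
  columns: [{
    id: 12,
    should: [
      { id: 13, date: 1438552800000, min: 60 },
      { id: 14, date: 1438639200000, min: 60 },
    ],
    is: [{ id: 15, date: 1438552800000, min: 60 }],
  }],
};

console.log(columnStats(response.columns[0]));
// { shouldTotal: 120, isTotal: 60, ratio: 0.5 }
```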


As @Ewan has said, it would be better to have these statistics precalculated (there are several ways to do that; Ewan pointed to one of them).

Back to the question, here are some disadvantages of each approach:

  • Client side:
    Depending on the complexity of the calculations and the volume of data to retrieve, loading the page may take longer than usual.
    On mobile clients it also has an impact on memory and battery (whose charge is limited).
    Then comes the fact that the statistics have to be recalculated every time the page is loaded.

    It is also worth noting that, if the data is not persisted in some sort of cache, it is quite easy to overload the DBMS just by hitting F5 like crazy.

    Summarising: it affects the UX and, in some cases, the client’s resources (memory, battery, CPU, …).

  • DB side:
    The DB is a shared resource. Some calculations may cause serious performance problems for the DBMS, and every application connected to the DB will suffer the consequences.

    At this point it is good to know the concurrency (the possible number of simultaneous requests asking the DB for the statistics) and the frequency of those requests. The volume of the data and the complexity of the calculations also matter.

    Then comes maintenance: depending on the complexity of the SQL statements, that task can be painful.

    That said, DBMSs are good at calculations, aggregates, etc. But be careful: you can end up taking the DBMS down (true story).
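For the DB-side option, the aggregation could be pushed into a single PostgreSQL query. This is only a sketch: the table and column names (`action`, `column_id`, `kind`, `min`) are hypothetical and would need to match the actual JPA-mapped schema.

```sql
-- Per-column totals and should/is ratio, computed in the database.
-- PostgreSQL's FILTER clause picks the rows for each aggregate;
-- NULLIF avoids a division-by-zero when nothing was planned.
SELECT a.column_id,
       SUM(a.min) FILTER (WHERE a.kind = 'should') AS should_total,
       SUM(a.min) FILTER (WHERE a.kind = 'is')     AS is_total,
       SUM(a.min) FILTER (WHERE a.kind = 'is')::numeric
         / NULLIF(SUM(a.min) FILTER (WHERE a.kind = 'should'), 0) AS ratio
FROM action a
GROUP BY a.column_id;
```

From Spring Boot, a query like this could be exposed via a JPA repository method annotated with `@Query(nativeQuery = true)`.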

Neither. You should use a data warehouse, calculate your statistics with an MDX query, and store the result.

Your website can then present the stored results without having to recalculate them on each page request.
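Short of a full data warehouse, the same store-the-result idea can be had in the poster's existing PostgreSQL with a materialized view, refreshed on a schedule so that page requests only read precomputed numbers. Table and column names here are hypothetical, as above.

```sql
-- Precompute the per-column totals once and store them.
CREATE MATERIALIZED VIEW column_stats AS
SELECT column_id,
       SUM(min) FILTER (WHERE kind = 'should') AS should_total,
       SUM(min) FILTER (WHERE kind = 'is')     AS is_total
FROM action
GROUP BY column_id;

-- Recompute periodically, e.g. from cron or a Spring @Scheduled job.
REFRESH MATERIALIZED VIEW column_stats;
```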

