Optimising the performance of a data visualisation web application

I'm rewriting a data visualisation web tool that was written 3 years ago. Since then, browser JavaScript engines have become much faster, so I'm thinking of transferring part of the work from the server to the client.

On the page, the data is visualised both in a table and in a map (or chart). Both views use the same data but present it in different ways, so the two algorithms that prepare the data for display are different.

Previously, at every user interaction with the data dropdown selectors (3 main selectors plus 2 sub-selectors depending on the 3 main ones), 3 AJAX requests were sent. PHP did all the work and sent back only the necessary data (HTML for the table, XML for the chart) in very small responses, so there was no performance issue, and JavaScript did little more than append the responses and handle change events.

So performance was OK, but at every single change of criteria the user had to wait for an AJAX response. :/

Now my idea is to send back a JSON object in a single AJAX request, issued only when the combination of the 3 main criteria changes, and then have JavaScript populate the table and the chart/map on AJAX success, and again whenever the 2 sub-criteria change.
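To make that concrete, here is a minimal sketch of the flow I have in mind (assuming jQuery; the selector IDs and the renderTable()/renderChart() helpers are hypothetical stand-ins, not my actual code):

    var rawData = null;

    function refresh() {
        // Re-run both presentation algorithms against the cached raw data,
        // taking the current sub-criteria into account.
        var sub1 = $('#sub1').val(),
            sub2 = $('#sub2').val();
        renderTable(rawData, sub1, sub2);
        renderChart(rawData, sub1, sub2);
    }

    // One AJAX request per combination of the 3 main criteria...
    $('#main1, #main2, #main3').on('change', function () {
        $.getJSON('data.php', {
            m1: $('#main1').val(),
            m2: $('#main2').val(),
            m3: $('#main3').val()
        }, function (json) {
            rawData = json;
            refresh(); // populate table and chart on AJAX success
        });
    });

    // ...and no request at all for the 2 sub-criteria.
    $('#sub1, #sub2').on('change', refresh);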

My hesitation concerns the structure of the JSON sent by the server, i.e. how to balance the payload.

Indeed, if only one algorithm were needed to turn the raw data into the JSON structure used for display, I would have PHP process the data into that object, ready for JavaScript to consume without any additional treatment; but there are 2.

So:

  • if I make PHP process the data to create 2 objects (one for the table, one for the chart), I will roughly double the size of the JSON response and increase memory usage on the client side; I don't like this approach because these two objects contain the same data, just structured differently, and redundancy is evil, isn't it?

  • if I send the raw object and let JavaScript work out what to display and where, I'm giving the client a lot of work, and that at every sub-criteria change (or I could build all the structured objects once on AJAX success so they are ready when a sub-criterion changes; see the sketch after this list); here I'm a little worried about users with old browsers or little RAM...

(Depending on the criteria, the raw, untreated JSON object varies between 3 KB and 12 KB, i.e. between 500 and 2,000 records.)
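To illustrate the variant of the second option, here is a sketch of building both structured objects once per AJAX response, so that sub-criteria changes only swap what is displayed; the record fields (year, region, value) are invented for the example:

    var structured = { table: [], chart: {} };

    function prepare(rawData) {
        structured.table = [];
        structured.chart = {};
        for (var i = 0; i < rawData.length; i++) {
            var r = rawData[i];
            // Algorithm 1: one flat row per record for the table.
            structured.table.push([r.year, r.region, r.value]);
            // Algorithm 2: one [year, value] series per region for the chart.
            if (!structured.chart[r.region]) {
                structured.chart[r.region] = [];
            }
            structured.chart[r.region].push([r.year, r.value]);
        }
    }

With 500 to 2,000 records a single pass like this is cheap; the cost is only the extra memory of holding both shapes between sub-criteria changes.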

I'm failing to spot the best approach...

So, for this job of turning a single set of raw data into multiple structured objects, would you have PHP (increasing the response size and sending redundant data) or JavaScript (increasing the client-side workload) process the raw data?

Thanks a ton for your opinion.

I found an appropriate solution, so I will answer my own question.

I have followed @Daverandom's advice:

  • PHP sends the raw data (along with a couple of parameters that depend on the combination of the main criteria)

  • JavaScript processes the raw data and renders it on the page

  • JavaScript reprocesses the raw data whenever the sub-criteria change: testing showed that the looping is very fast and doesn't freeze the browser at all, so there is no need to keep the structured objects in scope

  • Aggressive caching headers are sent with the JSON AJAX response (the data never changes; new records are only added each year) in case the user re-consults data they have already viewed, so raw data is not kept in the JavaScript scope when it is not being displayed (see the sketch after this list)

  • On top of that, the JSON strings echoed by PHP are cached on the server (again because the data never changes), which reduces database queries and improves response time
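For the record, a sketch of how the caching behaves from the JavaScript side, with hypothetical names again: thanks to the aggressive headers, re-requesting an already-consulted combination is served from the browser's HTTP cache, so nothing needs to be hoarded in JavaScript variables:

    function loadCombination(m1, m2, m3, done) {
        $.ajax({
            url: 'data.php',
            data: { m1: m1, m2: m2, m3: m3 },
            dataType: 'json',
            // 'cache: true' is already jQuery's default for JSON requests;
            // it is spelled out here because the whole scheme relies on the
            // browser's HTTP cache honouring the server's headers.
            cache: true,
            success: done // reprocess and render the raw data from here
        });
    }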

The final code is neat, easy to maintain, and the application works flawlessly.

Thanks to @Daverandom for the help.