In PHP, is array_slice() good enough for processing a large data array that cannot be paginated at the database level, because it is not stored in a single table but is calculated from several other tables?

I have an array of around 50k records, and that number may grow. On page load, all 50k records are fetched and then sliced with array_slice() for AJAX-based pagination. Will this cause server load problems in the future, since every page load fetches all the records?
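For reference, here is a minimal sketch of the setup described above (computeRowsFromOtherTables() is a hypothetical stand-in for the calculation across the other tables). The array_slice() call itself is cheap; the expensive part is rebuilding the full array on every request:

```php
<?php
// Hypothetical sketch of the current approach: the full ~50k-row array
// is computed up front, then one page is cut out for the AJAX response.

$allRows = computeRowsFromOtherTables(); // expensive: builds the full array each request

$page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$perPage = 50;
$offset  = ($page - 1) * $perPage;

// array_slice() is fast; memory and CPU go into building $allRows above.
$pageRows = array_slice($allRows, $offset, $perPage);

header('Content-Type: application/json');
echo json_encode([
    'page'  => $page,
    'total' => count($allRows),
    'rows'  => $pageRows,
]);
```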
First of all, building an array of 50k entries (and growing) on every page load is a bad idea: under high traffic it can eat all of your memory. Also, where do you store the sliced parts of the array between AJAX requests?
If you cannot set a LIMIT in the original query, I would create an additional table that holds the precomputed data (populated by a cron job, for example) and serve users from that table using LIMIT for pagination. Alternatively, add a caching layer (or use an existing caching system such as a file cache or Memcached) and write an algorithm for refreshing the cache; the details depend on your application logic. A sketch of the first option is below.
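As a rough sketch of the precomputed-table approach, assume a summary table named report_rows that a cron job refreshes from the source tables (the table name, columns, and connection details are all placeholders). Only one page of rows is ever loaded into PHP memory per request:

```php
<?php
// Hypothetical: paginate a cron-maintained summary table with LIMIT/OFFSET.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass', [
    PDO::ATTR_ERRMODE          => PDO::ERRMODE_EXCEPTION,
    // Use native prepares so integer LIMIT/OFFSET parameters bind correctly.
    PDO::ATTR_EMULATE_PREPARES => false,
]);

$page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$perPage = 50;
$offset  = ($page - 1) * $perPage;

// Fetch just the requested page from the precomputed table.
$stmt = $pdo->prepare('SELECT * FROM report_rows ORDER BY id LIMIT :limit OFFSET :offset');
$stmt->bindValue(':limit', $perPage, PDO::PARAM_INT);
$stmt->bindValue(':offset', $offset, PDO::PARAM_INT);
$stmt->execute();

$total = (int) $pdo->query('SELECT COUNT(*) FROM report_rows')->fetchColumn();

header('Content-Type: application/json');
echo json_encode([
    'page'  => $page,
    'total' => $total,
    'rows'  => $stmt->fetchAll(PDO::FETCH_ASSOC),
]);
```

The same page handler works for the caching variant: instead of querying report_rows, you would look the page up in Memcached (or a file cache) and fall back to recomputing and storing it when the cache entry is missing or stale.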