I have 6 database tables with 130k rows in total. I want to show them in a table, but rendering even 10 rows takes up to 10 minutes. How can I fix this so the data loads faster?
Route::post('/xx/data', [
    'as' => 'data',
    'uses' => function () {
        $results = DB::select('SELECT
            ero.FehlerCounter, ero.AllFehler, "x" as platz1,
            pa.
            pa.
            pa.
            pa.
            "xx" as platz2,
            ju.
            ju.
            ju.
            ju.
            "xxx" as platz3,
            ncr.
            ncr.
            ncr.
            "xxxx" as platz4,
            wm.
            wm.
            wm.
            "xxxxx" as platz5,
            cpi.
            cpi.
            cpi.
            FROM Error_tabel ero
            LEFT JOIN x_tabel pa ON (ero.Seriennummer = pa.Seriennummer)
            LEFT JOIN xx_tabel ju ON (ero.Seriennummer = ju.SerialNo)
            LEFT JOIN xxx_tabel ncr ON (ero.Seriennummer = ncr.Serial)
            LEFT JOIN xxxx_tabel wm ON (ero.Seriennummer = wm.SerialNum)
            LEFT JOIN xxxxx_tabel cpi ON (ero.Seriennummer = cpi.SerialNumber)', array());

        return \Yajra\Datatables\Datatables::of(collect($results))->make();
        // return $results;
    }
]);
It is going to take time because you are loading all of the data to the front-end at once. Laravel provides a more direct way to handle this: look at chunking the results.
If you need to work with thousands of database records, consider using the chunk method. This method retrieves a small chunk of the results at a time and feeds each chunk into a Closure for processing. This method is very useful for writing Artisan commands that process thousands of records. For example, let's work with the entire users table in chunks of 100 records at a time.
DB::table('users')->orderBy('id')->chunk(100, function ($users) {
    foreach ($users as $user) {
        // process each record here
    }
});
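For this specific case there is another option worth noting: Yajra DataTables can accept a query builder instead of a pre-fetched collection, in which case it applies the `LIMIT`/`OFFSET` for the current page in SQL, so only ~10 rows are pulled from the database per request instead of all 130k. A minimal sketch, rebuilt from the tables and join keys shown in the question (the elided column lists are left out and must be filled in; `make(true)` and the builder-based `of()` usage are standard Yajra API, but verify against your installed version):

```php
Route::post('/xx/data', [
    'as' => 'data',
    'uses' => function () {
        // Build a query, but do not execute it here: Datatables will add
        // the per-page LIMIT/OFFSET and any search filters itself.
        $query = DB::table('Error_tabel as ero')
            ->leftJoin('x_tabel as pa', 'ero.Seriennummer', '=', 'pa.Seriennummer')
            ->leftJoin('xx_tabel as ju', 'ero.Seriennummer', '=', 'ju.SerialNo')
            ->leftJoin('xxx_tabel as ncr', 'ero.Seriennummer', '=', 'ncr.Serial')
            ->leftJoin('xxxx_tabel as wm', 'ero.Seriennummer', '=', 'wm.SerialNum')
            ->leftJoin('xxxxx_tabel as cpi', 'ero.Seriennummer', '=', 'cpi.SerialNumber')
            // add the remaining columns from your original SELECT here
            ->select('ero.FehlerCounter', 'ero.AllFehler');

        // Passing the builder (not collect($results)) enables server-side paging.
        return \Yajra\Datatables\Datatables::of($query)->make(true);
    }
]);
```

Also make sure the join columns (`Seriennummer`, `SerialNo`, `Serial`, `SerialNum`, `SerialNumber`) are indexed; without indexes each `LEFT JOIN` has to scan its whole table, which alone can account for minutes of query time.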