This is a follow-up to this question.
I have now found that my two jobs take 2-4 seconds and 8-15 seconds respectively.
When I run the same two jobs through the BigQuery Go client library, they are only done after 50-70 seconds. I use job.Wait() and have also tried my own polling function (polling every 5 seconds, roughly as sketched below), but there is no noticeable change. Is this an issue in the client library, on the link level, or in my code?
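For reference, the polling I tried is roughly equivalent to this simplified sketch (the function name waitByPolling and its exact structure are just for illustration; it uses job.Status from cloud.google.com/go/bigquery and the context, time imports):

// waitByPolling is a rough sketch of the polling I tried instead of job.Wait:
// ask for the job status every 5 seconds until the job reports it is done.
func waitByPolling(ctx context.Context, job *bigquery.Job) error {
    for {
        status, err := job.Status(ctx)
        if err != nil {
            return err
        }
        if status.Done() {
            // Done; return the job-level error, if any.
            return status.Err()
        }
        time.Sleep(5 * time.Second)
    }
}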
Job 1 (DB query, ~5s when checked in the console)
// Run the database query and wait for the resulting job to finish.
job, err := database_query.Run(ctx)
if err != nil {
    return err
}
// Look the job up again by its ID (extra round trip via the API).
jobID := job.ID()
job, err = client.JobFromID(ctx, jobID)
if err != nil {
    return err
}
status, err := job.Wait(ctx)
if err != nil {
    return err
}
Job 2 (extract, ~12s when checked in the console)
// Run the extract job and wait for it to finish.
job_extract, err := extractor.Run(ctx)
if err != nil {
    return err
}
status, err = job_extract.Wait(ctx)
if err != nil {
    return err
}
// The job finished; check whether it reported a job-level error.
if status.Err() != nil {
    handler.Logger.Criticalf("Job failed with error %v", status.Err())
    return status.Err()
}
After this there is just a simple return with status code 200.
I am aware that some of the code above does nothing useful; it will be removed later.