I have a cron task that I want to implement the best way in Golang. A first webservice returns a large JSON that is stored in a sellers table in a database. Then, for each seller, I need to browse another large JSON webservice, with the sellersID as parameter, and save the result to another table named customers. A customer has an initial state; if this state has changed compared to the data from the second webservice, I need to store the difference in another table, changes, to keep a history of changes. My current code:

var wg sync.WaitGroup
action.FetchSellers() // fetch large JSON and store in sellers table, ~2min
sellers := action.ListSellers()
for _, s := range sellers {
wg.Add(1)
go action.FetchCustomers(&wg, s) // fetch multiple large JSON and store in customers table and store notify... ~20sec
}
wg.Wait()
The action.FetchCustomers function does a lot of work that I think could be done concurrently. I need to run this code every hour, so it needs to be well built; it currently works, but not in the best way. I am considering using worker pools in Go, as in this example: Go by Example: Worker Pools. But I have trouble conceiving it.
Not to be a jerk! But I would use a queue for this kind of thing. I have already created a library for this and use it: github.com/AnikHasibul/queue
package main

import (
	"fmt"
	"time"

	"github.com/AnikHasibul/queue"
)

func main() {
	// Limit the max number of concurrent jobs
	maximumJobLimit := 50
	// Open a new queue with the limit
	q := queue.New(maximumJobLimit)
	defer q.Close()
	// Simulate a large amount of jobs
	for i := 0; i < 1000; i++ {
		// Add a job to the queue
		q.Add()
		// Run your long long long job here in a goroutine
		go func(c int) {
			// Must call Done() after finishing the job
			defer q.Done()
			time.Sleep(time.Second)
			fmt.Println(c)
		}(i)
	}
	// Wait for all jobs to finish
	q.Wait()
	// Done!
}