I am trying to build (to keep it as simple as possible) "something like Google Analytics". That means: I want to store small event objects (<2 KB) from a web page in some storage and be able to query them.
I have JS code that sends those event objects to a PHP endpoint on Google App Engine, and the endpoint then inserts them into Google BigQuery. Here is my problem: the insertion is done via the Google API PHP library as a REST request, so every incoming event triggers a synchronous HTTP call to BigQuery, which is very slow.
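For context, the insert currently looks roughly like this (a simplified sketch using the google-api-php-client streaming insert; the project, dataset, and table names are placeholders):

```php
<?php
// Simplified sketch of the current synchronous insert
// (google-api-php-client; identifiers are placeholders).
$client = new Google_Client();
$client->useApplicationDefaultCredentials();
$client->addScope(Google_Service_Bigquery::BIGQUERY);
$bigquery = new Google_Service_Bigquery($client);

$row = new Google_Service_Bigquery_TableDataInsertAllRequestRows();
$row->setJson($eventObject);          // the <2 KB event from the JS client
$row->setInsertId(uniqid('', true)); // dedup key for the streaming API

$request = new Google_Service_Bigquery_TableDataInsertAllRequest();
$request->setRows([$row]);

// This blocks the web request on an HTTPS round trip to BigQuery.
$bigquery->tabledata->insertAll('my-project', 'analytics', 'events', $request);
```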
My question here is: is there a better way to store the events in the Google Cloud environment? Would it be better (and more cost-effective) to push the events into Pub/Sub or Redis and have some workers in the background that load that queue into BigQuery?
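For example, the publish side of the Pub/Sub variant could look something like this (a minimal sketch with the google/cloud-pubsub client library; the project ID and topic name are made up, and a background worker would still have to pull the messages and stream them into BigQuery):

```php
<?php
require 'vendor/autoload.php';

use Google\Cloud\PubSub\PubSubClient;

// Minimal sketch: publish the raw event to a topic instead of
// calling BigQuery synchronously from the web request.
$pubsub = new PubSubClient(['projectId' => 'my-project']); // placeholder
$topic  = $pubsub->topic('analytics-events');              // placeholder

$topic->publish(['data' => json_encode($eventObject)]);
```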
Any idea how to do this as efficiently (in both performance and cost) as possible would be greatly appreciated!
If I had to do this, I would first have the endpoint handler save the raw data into a push queue, because enqueuing is relatively fast on App Engine. The processing of the data and the BigQuery API calls would then happen later, in the task queue handler.
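A rough sketch of the enqueue step, assuming the App Engine PHP SDK's task queue API (the `/bq-worker` URL is made up; the worker handler behind it would decode the payload and make the BigQuery insertAll call there instead of in the public endpoint):

```php
<?php
use google\appengine\api\taskqueue\PushTask;

// In the public endpoint: enqueue the raw event and return immediately.
// The slow BigQuery call moves to the /bq-worker task handler.
$task = new PushTask(
    '/bq-worker',                        // hypothetical worker URL
    ['event' => json_encode($_POST)]     // raw event payload
);
$task->add();                            // goes to the default push queue
```

This keeps the user-facing request fast, and the task queue gives you retries for free if a BigQuery insert fails.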
I guess the performance and cost you can get also vary a bit depending on the App Engine language (PHP, Go, Java, ...).