I have a Raspberry Pi collecting data from a break beam sensor that I wish to use as part of an already developed Laravel application. I was wondering what the best way to transfer the data would be.
I was thinking of creating a JSON file, uploading it to a directory, and then running an hourly cron job to pick up any new files and run them through a Laravel controller to update the database and send emails.
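For context, this is roughly what the Raspberry Pi side of that approach could look like. It's a minimal sketch only: the drop directory, sensor ID, and field names are placeholders, not anything from my actual app.

```python
import json
import time
from datetime import datetime, timezone
from pathlib import Path

# Assumed drop directory that the hourly cron job / Laravel command would scan.
DROP_DIR = Path("/var/www/laravel/storage/app/sensor-drops")

def write_event(beam_broken: bool) -> None:
    """Write one break-beam event as its own JSON file for later pickup."""
    DROP_DIR.mkdir(parents=True, exist_ok=True)
    event = {
        "sensor": "break_beam_1",  # placeholder sensor id
        "beam_broken": beam_broken,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }
    # Timestamped filename so every event lands as a separate, new file.
    target = DROP_DIR / f"event_{time.time_ns()}.json"
    tmp = target.with_suffix(".tmp")
    tmp.write_text(json.dumps(event))
    tmp.rename(target)  # atomic rename so the cron job never reads a half-written file

if __name__ == "__main__":
    write_event(beam_broken=True)
```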
I would like to pass the data through the Laravel application rather than sending it directly from Python, for management purposes. Could anyone see any issues with this approach, or suggest a better way?
Your approach sounds fine - the only caveat is that you will not have "real-time" data. You rely on the schedule of your cron job to sync the data around - of course, you could run it every minute if you wanted to, which would remove most of that delay.
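If you do go the scheduled route and drive it through Laravel's scheduler (an assumption about how you'd wire it up), the usual pattern is a single crontab entry that runs every minute and lets Laravel decide which scheduled tasks actually execute; the project path below is a placeholder:

```
* * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1
```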
The other option is to expose an API in your Laravel application which can accept the JSON payload from your Python script and process it immediately. This approach offers real-time processing and less work overall because it's on demand, but it also requires you to properly secure your API endpoint, which you wouldn't need to do with a cron-based approach.
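A rough sketch of the Python side of that, assuming a hypothetical `/api/sensor-events` route and a bearer token for authentication (adjust to whatever your routes and auth setup actually are):

```python
import requests
from datetime import datetime, timezone

# Assumed Laravel API route and token; match these to your routes/api.php and auth middleware.
API_URL = "https://your-app.example.com/api/sensor-events"
API_TOKEN = "your-secret-token"

payload = {
    "sensor": "break_beam_1",
    "beam_broken": True,
    "recorded_at": datetime.now(timezone.utc).isoformat(),
}

response = requests.post(
    API_URL,
    json=payload,  # sends the body as JSON with the right Content-Type header
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=10,
)
response.raise_for_status()  # fail loudly if Laravel rejects the request
```

On the Laravel side you could protect that route with token-based middleware (for example Laravel Sanctum or a simple shared-secret check) so only your Pi can post to it.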
For the record, I highly recommend using JSON as the data transfer format. Unless you need to implement schema validation (in which case possibly look at XML), JSON is easy to work with on both the PHP and Python sides.
Use Python to extract the data from the serial ports of the Raspberry Pi, JSON-encode it, and store it in the web directory of your Laravel project files. Later, JSON-decode it and present the data on the web end via Laravel/PHP. That approach is fine. That being said, another way is to take the data from Python and make a cURL POST request to your PHP project, and collect the data there.