I have multiple Python processes running on a server (Raspberry Pi 5) via cron. They read data from web APIs and then write it into a common InfluxDB database, into the same bucket. However, some of the data is lost. The code that writes to Influx is:
from influxdb_client import InfluxDBClient, Point

influxdb_client = InfluxDBClient(url=url, token=token, org=org)
...

def f(df):
    # write_api() with no arguments returns a batching (asynchronous) writer.
    write_api = influxdb_client.write_api()
    ...
    record = []
    for i in range(df.shape[0]):
        point = Point(measurement).tag("location", ...).time(...)
        for col in df.columns:
            value = df.loc[i, col]
            point = point.field(col, value)
        record.append(point)
    write_api.write(bucket=bucket, org=org, record=record)
    ...

## df is a pandas DataFrame with 20-500 rows and 10-20 columns.
f(df)
What could be the reason for this issue? Could it be a problem with asynchronous vs. synchronous writes?
Thanks
InfluxDB is not "thread safe" in the sense you might hope: points that share the same measurement, tag set, and timestamp overwrite each other, so when many processes write to the same bucket, one process can silently replace values written by another. Maybe you should have only one process that accesses InfluxDB, and the others should use a queue to send their data to that process.