Yes, we convert the CSV to JSON and then insert it into Postgres. This is because our pg library doesn't currently support the pg COPY command (see nikita-volkov/hasql#1); once that's supported we could insert the CSV in one go with no additional processing.
Bulk inserting JSON could also be made a lot faster, see #690.
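For context, this is roughly what the "one go" COPY path looks like when loading a CSV into Postgres directly (outside PostgREST), sketched here with psycopg2. The connection string, file name and CSV options are placeholder assumptions; only the shape_value table name comes from this issue.

```python
# Sketch only: COPY streams the CSV straight into the table, with no
# intermediate CSV -> JSON conversion step. Connection details and the
# file name are placeholders.
import psycopg2

def copy_csv(path: str) -> None:
    with psycopg2.connect("postgresql://user:pass@localhost:5432/db") as conn:
        with conn.cursor() as cur, open(path, "r") as f:
            cur.copy_expert(
                "COPY shape_value FROM STDIN WITH (FORMAT csv, HEADER)",
                f,
            )

copy_csv("shape_values.csv")
```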
@nathancahill No, not really. In fact I don't see us using COPY now since it'll prevent using libpq pipeline mode (#2707).
I think the solution for this is #2826, which will provide an escape hatch for doing your own CSV parsing in SQL (or any pg language, like plrust) with higher performance. I expect to work on it for the next release.
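To make the "your own CSV parsing in SQL" idea concrete, here is a purely hypothetical sketch of a plpgsql function that takes the raw CSV body as text and inserts the rows itself. It is not the interface of #2826, and the column names and types are made up; only the shape_value table name comes from this issue.

```python
# Hypothetical sketch only: a plpgsql function that parses a raw CSV body
# itself (skipping the header row) and inserts into shape_value. Column
# names and types are invented; the real #2826 interface may differ.
import psycopg2

DDL = r"""
create or replace function ingest_shape_values(body text) returns void
language plpgsql as $$
begin
  insert into shape_value (shape_id, ts, value)
  select cols[1]::int, cols[2]::timestamptz, cols[3]::numeric
  from (
    select string_to_array(line, ',') as cols
    from unnest(string_to_array(body, E'\n')) with ordinality as t(line, n)
    where n > 1 and line <> ''  -- skip the header row and trailing blank line
  ) parsed;
end
$$;
"""

with psycopg2.connect("postgresql://user:pass@localhost:5432/db") as conn:
    with conn.cursor() as cur:
        cur.execute(DDL)
```

Exposed over an RPC-style endpoint, something like this would let the CSV body be handed to the database once and parsed there, which is where the performance win would come from.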
Environment
- postgres:9.6 (container f0b44c71fee9)
- postgrest/postgrest (container 3153f5b6d430)
Description of issue
Not an issue per se, however I am getting poor performance when uploading a relatively small CSV file. I am essentially uploading time series data.
Below is the relevant excerpt of the db schema. The key table is the shape_value table.
Test
I have uploaded the CSV file I am uploading as a gist.
I have noticed in the db logs that Postgres seems to be converting the CSV to JSON before inserting; is this the expected behaviour? Any help would be greatly appreciated, thanks!
I imagine I am sending the request incorrectly.
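For the request itself, PostgREST accepts CSV bulk inserts as a plain POST to the table endpoint with Content-Type: text/csv, where the first row names the target columns. A minimal sketch, assuming a local PostgREST on port 3000 and a placeholder file name:

```python
# Minimal sketch: POST the raw CSV to the shape_value endpoint. URL, port
# and file name are placeholders; add an Authorization header if the
# schema is not anonymous.
import requests

with open("shape_values.csv", "rb") as f:
    resp = requests.post(
        "http://localhost:3000/shape_value",
        data=f,
        headers={"Content-Type": "text/csv"},
    )
resp.raise_for_status()
print(resp.status_code)  # expect 201 Created on success
```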