
ActiveRecord::StatementInvalid: PG::SyntaxError: ERROR: VALUES lists must all be the same length #12

Open
sergeykish opened this issue May 4, 2017 · 2 comments


@sergeykish
Contributor

When bulk_insert is run with records whose attribute counts differ:

records = [
  { :name => "Foo", :age => 30 },
  { :name => "Bar" }
]
expect { SampleRecord.bulk_insert(records) }.to change { SampleRecord.count }.by(records.size)

it fails:

 ActiveRecord::StatementInvalid:
   SQLite3::SQLException: all VALUES must have the same number of terms:       INSERT INTO "sample_records"
           (name, age, created_at, updated_at)
         VALUES
           ('Foo', 30, '2017-05-04 12:44:57.825195', '2017-05-04 12:44:57.825195'),('Bar', '2017-05-04 12:44:57.825351', '2017-05-04 12:44:57.825351')
 # ./lib/active_record_bulk_insert.rb:40:in `bulk_insert'
 # ./spec/sample_record_spec.rb:24:in `block (4 levels) in <top (required)>'
 # ./spec/sample_record_spec.rb:24:in `block (3 levels) in <top (required)>'

As you can see, it does not fill the gap:

INSERT INTO "sample_records" 
  (name, age, created_at, updated_at)
VALUES                       
  ('Foo', 30, '2017-05-04 12:44:57.825195', '2017-05-04 12:44:57.825195'),
  ('Bar',     '2017-05-04 12:44:57.825351', '2017-05-04 12:44:57.825351')
          ^^^ here
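A workaround on the caller's side is to pad every hash to the union of all keys before calling bulk_insert. The normalize_records helper below is a hypothetical sketch, not part of this gem:

```ruby
# Hypothetical helper: pad every hash to the union of all keys seen across
# the records, filling missing attributes with a default (nil here) so each
# VALUES row ends up with the same number of terms.
def normalize_records(records, default: nil)
  keys = records.flat_map(&:keys).uniq
  records.map do |record|
    keys.each_with_object({}) { |key, row| row[key] = record.fetch(key, default) }
  end
end

records = [
  { name: "Foo", age: 30 },
  { name: "Bar" }
]

normalize_records(records)
# => [{ name: "Foo", age: 30 }, { name: "Bar", age: nil }]

# SampleRecord.bulk_insert(normalize_records(records)) would then emit
# VALUES rows of equal length.
```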
@bjhaid
Owner

bjhaid commented May 4, 2017

@sergeykish I think your spec is inadequate; I am leaning towards a wontfix for this issue.

@sergeykish
Contributor Author

I'm not pushing for this.

There was a problem in my project; I fixed it, shared the results, and I'm OK with keeping a fork.

What I don't agree with is:

ensuring that the hashes passed in contains the same set of keys and a default supplied to normalize hashes with fewer keys should be the job of whatever calls bulk_insert and not bulk_insert.

If you allow the exception to propagate, this should be reflected in the API: just add a bang bulk_insert!, describe the input requirements in the README, and that's it.
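For illustration, a strict bulk_insert! could validate the key sets up front and raise a clear error instead of letting the database surface a SyntaxError. This is a hypothetical sketch, not the gem's actual API; bulk_insert here delegates to the existing method:

```ruby
# Hypothetical strict variant: refuse records with differing key sets early,
# with a message naming the mismatch, rather than failing inside the database.
def bulk_insert!(records)
  key_sets = records.map { |record| record.keys.sort }.uniq
  if key_sets.size > 1
    raise ArgumentError,
          "bulk_insert! requires all records to have the same keys, got: #{key_sets.inspect}"
  end
  bulk_insert(records) # delegate to the existing implementation
end
```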
