reduce elasticsearch indexing batch size to 50
Martin Fenner committed Jan 6, 2021
1 parent 5550d9c commit 23953db
Showing 2 changed files with 3 additions and 3 deletions.
4 changes: 2 additions & 2 deletions app/models/datacite_doi.rb
@@ -44,7 +44,7 @@ def self.import_by_ids(options = {})
   # TODO remove query for type once STI is enabled
   # SQS message size limit is 256 kB, up to 2 GB with S3
   DataciteDoi.where(type: "DataciteDoi").where(id: from_id..until_id).
-    find_in_batches(batch_size: 100) do |dois|
+    find_in_batches(batch_size: 50) do |dois|
     ids = dois.pluck(:id)
     DataciteDoiImportInBulkJob.perform_later(ids, index: index)
     count += ids.length
@@ -71,7 +71,7 @@ def self.import_by_client(client_id)

   # TODO remove query for type once STI is enabled
   DataciteDoi.where(type: "DataciteDoi").where(datacentre: client.id).
-    find_in_batches(batch_size: 100) do |dois|
+    find_in_batches(batch_size: 50) do |dois|
     ids = dois.pluck(:id)
     DataciteDoiImportInBulkJob.perform_later(ids, index: self.active_index)
   end
2 changes: 1 addition & 1 deletion app/models/other_doi.rb
@@ -95,7 +95,7 @@ def self.import_by_ids(options = {})

   # TODO remove query for type once STI is enabled
   DataciteDoi.where(type: "OtherDoi").where(id: from_id..until_id).
-    find_in_batches(batch_size: 100) do |dois|
+    find_in_batches(batch_size: 50) do |dois|
     ids = dois.pluck(:id)
     OtherDoiImportInBulkJob.perform_later(ids, index: index)
     count += ids.length
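The change halves the number of ids handed to each bulk-import job. A minimal sketch of the batching arithmetic in plain Ruby (with hypothetical numeric ids standing in for DataciteDoi primary keys; not the app's actual code path) illustrates how `each_slice`, like `find_in_batches(batch_size: 50)`, carves the id list into per-job payloads that stay far below the 256 kB SQS limit mentioned in the diff's comment:

```ruby
require "json"

SQS_LIMIT_BYTES = 256 * 1024 # SQS message size limit noted in the diff comment

# Hypothetical ids standing in for DataciteDoi primary keys.
ids = (1..10_000).to_a

# Mirrors find_in_batches(batch_size: 50): each slice of 50 ids
# becomes one bulk-import job payload.
batches = ids.each_slice(50).to_a

# Largest serialized payload across all batches.
max_payload = batches.map { |batch| JSON.generate(batch).bytesize }.max

puts batches.length                # 200 jobs enqueued
puts max_payload < SQS_LIMIT_BYTES # true
```

Since a 50-id JSON array is only a few hundred bytes, the smaller batch size here trades SQS headroom for a larger number of enqueued jobs, which smooths load on the Elasticsearch indexer rather than addressing the message-size ceiling itself.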
