
Hi all,

I have been attempting to upload the City of Milwaukee’s largest accession yet in terms of number of files (over 6,000). I ran Directory Printer on the records in order to associate base metadata (format, collection identifiers, dates, etc.) with the records, copied the output into the CSV template, and attempted to upload using the usual bulk metadata upload process. This… did not go well, to the tune of “hung for half an hour and then kicked me out without uploading”.
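For anyone unfamiliar with the prep step, it amounts to something like the sketch below (Python, purely for illustration; the column names, paths, and collection identifier are placeholders rather than the real template headers):

# Hypothetical sketch: walk the records directory and write one CSV row per file
# with the kind of base metadata mentioned above. Column names are placeholders,
# not the actual Preservica template headers.
import csv
import datetime
from pathlib import Path

RECORDS_DIR = Path("records")          # assumed location of the accession
OUTPUT_CSV = Path("metadata_draft.csv")

with OUTPUT_CSV.open("w", newline="", encoding="utf-8") as fh:
    writer = csv.writer(fh)
    writer.writerow(["filename", "format", "collection_id", "modified_date"])
    for path in sorted(RECORDS_DIR.rglob("*")):
        if path.is_file():
            modified = datetime.datetime.fromtimestamp(path.stat().st_mtime)
            writer.writerow([
                path.name,
                path.suffix.lstrip(".").upper(),   # crude format guess from extension
                "MKE-2024-001",                    # placeholder collection identifier
                modified.date().isoformat(),
            ])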

I have confirmed that I am able to upload the files WITHOUT the metadata in a relatively expedited manner. I have also confirmed, using a subset of the files, that I have the template formatted correctly to perform the upload. So what I assume is happening is that the parser is going through the spreadsheet and mapping rows to the files in the upload package… a process that takes an extra-long time with 6,000+ files.
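If it helps anyone reproduce the check I did on the subset, a pre-flight comparison along these lines (again Python for illustration; the "filename" column name and paths are placeholders for whatever the template actually uses) will show whether every CSV row has a matching file in the package and vice versa:

# Hypothetical pre-flight check: confirm every row in the CSV maps to a file in
# the upload package and flag anything unmatched, before the parser has to do it.
import csv
from pathlib import Path

UPLOAD_DIR = Path("upload_package")    # assumed package directory
CSV_PATH = Path("metadata.csv")

with CSV_PATH.open(newline="", encoding="utf-8") as fh:
    csv_names = {row["filename"] for row in csv.DictReader(fh)}

package_names = {p.name for p in UPLOAD_DIR.rglob("*") if p.is_file()}

missing_files = csv_names - package_names      # rows with no matching file
missing_rows = package_names - csv_names       # files with no metadata row

print(f"{len(csv_names)} CSV rows, {len(package_names)} files in package")
print(f"{len(missing_files)} rows without a file, {len(missing_rows)} files without a row")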

So I put it to the community: Have any of you discovered a soft cap on the number of files you can import with metadata at once? Are there other factors I might not be considering that could explain why the big transfer is hanging up? I’ve pretty much resigned myself to chunking the process out, but if there’s data as to an ideal chunk size I’d like to know that before I start randomly guessing. Thanks.
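If I do end up chunking, I’m imagining something like the sketch below, which splits the metadata CSV into batches and copies the matching files into per-batch folders, each of which could then be uploaded with its own CSV (the 500-row chunk size is just a starting guess, and the "filename" column is again a placeholder):

# Hypothetical chunking sketch: split the metadata CSV into batches of CHUNK_SIZE
# rows and copy the matching files into per-batch folders for separate uploads.
import csv
import shutil
from pathlib import Path

CHUNK_SIZE = 500                       # starting guess; adjust if batches still hang
SOURCE_DIR = Path("upload_package")
CSV_PATH = Path("metadata.csv")
BATCH_ROOT = Path("batches")

with CSV_PATH.open(newline="", encoding="utf-8") as fh:
    reader = csv.DictReader(fh)
    fieldnames = reader.fieldnames
    rows = list(reader)

for i in range(0, len(rows), CHUNK_SIZE):
    batch = rows[i:i + CHUNK_SIZE]
    batch_dir = BATCH_ROOT / f"batch_{i // CHUNK_SIZE + 1:03d}"
    batch_dir.mkdir(parents=True, exist_ok=True)

    # Write this batch's slice of the metadata CSV.
    with (batch_dir / "metadata.csv").open("w", newline="", encoding="utf-8") as out:
        writer = csv.DictWriter(out, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(batch)

    # Copy the files named in this batch alongside the CSV.
    for row in batch:
        src = SOURCE_DIR / row["filename"]     # "filename" column is an assumption
        if src.exists():
            shutil.copy2(src, batch_dir / src.name)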


Brad

Hi Brad,

There are limits to the number of files and/or the size of packages that can be ingested, I believe due to browser and/or local policies for data transfer via a browser. However, it also depends on how you are ingesting them: via PUT? An OPEX incremental ingest package? Etc. You may also be limited by your Preservica license and may need to verify that with your rep.

Keep us posted if you figure it out.

Cheers

