InternalError: We encountered an internal error. Please try again.
Failed to get necessary short term bucket lock for mbid-5757569b-3b22-4457-b240-0afce313d069, please try again (fc5eddaf-f95f-40a9-a194-b2d96cf51032)
read timeout at /home/musicbrainz/carton-local/lib/perl5/Net/HTTP/Methods.pm line 274.
Hi cover art folks… any pro tips to upload scans with less suffering? Maybe a CLI interface? I’m wanting to get into scanning more carefully, but the upload experience for large numbers of high resolution images is extremely frustrating:
Adding pictures tries to thumbnail them. With multiple large files, this seems to cause the computer to run out of memory, and the OOM killer shuts down Firefox. The file manager can thumbnail them without issue, so I think this is something to do with the uploader.
So, I can’t queue them, I have to do them one by one, or a few at a time. But that means I’ll have to wait for each batch to finish before I can upload the next batch.
Uploads are sloooow. I can upload at least 1 GB per hour to Google Drive, but it’s probably something like 10 or 20 times slower to CAA. Combined with the impossibility of queuing more than a few uploads, this looks like it’ll take several days to upload a single CD’s scans (~7 GB at 2400 dpi).
I guess the obvious answer is lossy compression and lower resolution, but that’s not really my kind of rodeo. And it’s 2025, these really aren’t very big files nowadays.
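For reference, here’s a rough back-of-envelope on those numbers (the ~1 GB/h and 10–20× figures above are my own estimates, not measurements of CAA):

```javascript
// Rough upload-time estimate from the figures above (all assumptions):
// ~1 GB/h to Google Drive, CAA roughly 10-20x slower, ~7 GB per CD.
const gbPerCd = 7;
const driveGbPerHour = 1;
const slowdownLow = 10;
const slowdownHigh = 20;

// Effective CAA throughput is Drive throughput divided by the slowdown.
const hoursLow = gbPerCd / (driveGbPerHour / slowdownLow);   // 70 h
const hoursHigh = gbPerCd / (driveGbPerHour / slowdownHigh); // 140 h

console.log(`${hoursLow / 24} to ${hoursHigh / 24} days per CD`);
// i.e. roughly 3 to 6 days
```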
I upload a ton of CD scans. I can upload 25-30 images in one hit without too much trouble. Generally 600dpi, 85% JPGs around 2-15MB in size each.
Literally drag and drop a whole batch onto the page. Then resort them. (Though the awkward sorting means I tend to drag and drop in small batches which is still fast)
Once I have hit < Enter Edit > and have a batch uploading, I just open another tab in the browser and go do something else. This includes just uploading another batch of images to the next release.
What kind of size images are you working with? If you are one of those uploading 50MB PNGs then don’t be surprised if you get hassles.
A comparison between Google Drive and MB/CAA upload is a little unfair on budget sizes. Some days the CAA is being hit especially hard with bandwidth issues.
Oof. I don’t know if there is any current solution… particularly regarding it taking a long time to upload (the Internet Archive is a godsend but it can crawl) and uploading a lot of scans using a lot of browser memory (can confirm).
I think the most proactive thing to do, if you have a little bit of a tech brain, is to make tickets for improvements.
e.g. it seems that you’ve identified the loading/creation of thumbnails browser-side as a sticking point? I wonder if adding a toggle for this would be an “easy fix™” in terms of dev time.
In the meantime… I guess you’ll have to leave them uploading in the background, as IvanDobsky mentions
Hats off for that legendary scan resolution!!
The scans are mostly around 350-450MB. Uploading them unedited except for the front covers since I have a couple big boxes of CDs to do (hm, let’s see how many I can get done before I get distracted… ooh, a squirrel!), and for an archive raw seems ideal anyway. The files are a bit oversized since I’m scanning with a color guide (because of this other forum post) in case anyone wants to correct them later on.
A comparison between Google Drive and MB/CAA upload is a little unfair on budget sizes
True… soz, wrote my last rant/post when I’d just lost my queued uploads for the nth time and was feeling more than a little “salty”. Just meant it seems to be struggling on the server end, not my own connection.
I think the most proactive thing to do, if you have a little bit of a tech brain, is to make tickets for improvements.
MBS-12524 Improve performance of cover art uploader
Yes, that MBS-12524 looks exactly like the issue I ran into! Those details look helpful.
I might play around with making a userscript. I’m not good at writing them like some of y’all are, but wonder if I could switch off the thumbnails that way. Potentially making it send more than one scan at once might help them go faster too. I’ll post back here if I can figure out anything.
Made a hacky userscript that seems to be working (for my use case), posting if anyone else might want it. (It’s probably full of bugs)
The changes it makes are: hides thumbnails, runs multiple uploads simultaneously, and (untested - edit: sorry, retries don’t work reliably, I actually had to delete the “disabled” attribute from the button using right-click→Inspect to retry in one case) tries to automatically retry failed uploads.
Thanks again all for your pointers!
// ==UserScript==
// @name CAA upload faster
// @description CAA upload faster
// @version 2025-4-23-2
// @license GPL-2+ https://github.com/metabrainz/musicbrainz-server/blob/master/COPYING.md
// @namespace https://
// @match https://musicbrainz.org/release/*/add-cover-art
// @match https://*.musicbrainz.org/release/*/add-cover-art
// @grant none
// ==/UserScript==
(function() {
'use strict';
// https://tickets.metabrainz.org/browse/MBS-12524
const removePreviews = function() {
document.querySelectorAll('.uploader-preview-image').forEach(e => e.remove());
window.setTimeout(removePreviews, 1000);
}
removePreviews();
MB.Art.file_data_uri = function (file) {
var deferred = $.Deferred();
// Resolve after a tick with an empty data URI so no preview is built
// (the original passed deferred.resolve('') directly, calling it immediately).
window.setTimeout(function () { deferred.resolve(''); }, 100);
return deferred.promise();
};
MB.Art.add_art_submit = function (gid, upvm) {
var pos = parseInt($(`#id-add-cover-art\\.position`).val(), 10);
$('.add-files.row').hide();
$(`#cover-art-position-row`).hide();
$('#content')[0].scrollIntoView();
$(`#add-cover-art-submit`).prop('disabled', true);
var queue = MB.Art.process_upload_queue(gid, upvm, pos);
executePromisesWithDelay(queue)
.done(function () {
window.location.href =
`/release/${gid}/cover-art`;
})
.fail(function () {
$(`#add-cover-art-submit`).prop('disabled', false);
});
};
// This function is modified by Microsoft Copilot (sigh...) based loosely on https://github.com/metabrainz/musicbrainz-server/blob/fb0b86872f180e740360bd63d5efd1d99c8e604d/root/static/scripts/edit/MB/Art.js#L631
function executePromisesWithDelay(promises) {
var deferred = $.Deferred();
var results = [];
var pendingCount = promises.length;
var delay = 10000; // Delay in milliseconds
function executePromise(index) {
if (index >= promises.length) return;
setTimeout(() => {
promises[index]()
.then(
(result) => {
results[index] = result; // Store the result
if (--pendingCount === 0) {
deferred.resolve(results);
}
},
() => {
// Retry the failed promise once; if the retry also fails,
// reject so the caller's .fail() handler re-enables the button
// (the original never rejected, so a double failure hung forever).
promises[index]()
.then((result) => {
results[index] = result;
if (--pendingCount === 0) {
deferred.resolve(results);
}
})
.fail(() => {
deferred.reject();
});
});
// Start the next promise execution after delay
executePromise(index + 1);
}, index * delay);
}
executePromise(0); // Start executing promises
return deferred.promise();
}
})();
Be careful with retries. Put a decent timer in there. Some days CAA struggles. And uploading 3GB of images WILL make it struggle. Feed a system designed for hundreds of MB multiple GBs and it will strain it.
I am also scared to think what Picard will do on that too. Especially those people who drag in 10000s of files… LOL
YOUR broadband is not the choke point. It is the server’s ability to handle that kind of hit. If your collection in Picard just happens to have a dozen albums with 3GB of artwork each, that is very different to the average load.
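On the “decent timer” point: one way to be kinder to CAA than the userscript’s fixed delay is exponential backoff between retries. A sketch (the function name, attempt count, and delays are my own picks, not anything from the MB codebase):

```javascript
// Retry a promise-returning upload with exponential backoff.
// All names and delay values here are illustrative assumptions.
function retryWithBackoff(uploadFn, attempts = 3, baseDelayMs = 30000) {
  return new Promise((resolve, reject) => {
    const attempt = (n) => {
      uploadFn().then(resolve, (err) => {
        if (n >= attempts) {
          reject(err); // give up after the last attempt
          return;
        }
        // Wait 30 s, then 60 s, then 120 s, ... so a struggling
        // server isn't hammered with immediate re-requests.
        setTimeout(() => attempt(n + 1), baseDelayMs * 2 ** (n - 1));
      });
    };
    attempt(1);
  });
}
```

Doubling the wait each time means a server having a bad day gets progressively more breathing room instead of the same hit every few seconds.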
It looks like the default is to use the 500px thumbnails, or at least that’s how it’s set on mine. If anyone has it set to “Full size” for tagging I guess that could cause pain, hope they leave it on 500 or 1200
(maybe would make sense to have an option to use “original, or 1200px if the original is >50mb” or something like that…)
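The fallback you describe could be as simple as this (the field names below are made up purely to illustrate the idea, they aren’t Picard’s or CAA’s actual API):

```javascript
// Hypothetical client-side choice: use the original image unless it is
// over a size cap, then fall back to the CAA's 1200px thumbnail.
// `size`, `thumb1200Url`, and `originalUrl` are illustrative names only.
const SIZE_CAP = 50 * 1024 * 1024; // 50 MB

function pickArtworkUrl(image) {
  return image.size > SIZE_CAP ? image.thumb1200Url : image.originalUrl;
}
```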
For me 1200px artwork is not good enough. So I have it on “full sized”. But also set to download all artwork. To me this is awkward when I hit a high-res booklet. Just takes time…
Personally I then archive this down to something more manageable for my collection as I don’t need artwork taking up more space than the music. And the screen is only 4K.
There is something being created for Picard to have more options to adjust this I believe. Not sure if that was to happen server side or on the destination. Would be ideal for this kind of archived image.
The 1200px thumbnails are generated on the CAA server, which makes that setting the most efficient.