Test dataset submission problem



@Gentlecat, as I said in the blog comment, I can’t submit my test dataset.
It says “Can’t add this dataset into evaluation queue because it’s incomplete” :sweat:


Maybe try adding more than one Recording to “without vocals”? I don’t know if this is the reason for the message (and if so, it should probably be made more explicit that’s what’s incomplete about it :wink: ), but I guess it’s worth a try?

Oh, and I also think the challenge thing only works on beta.ab.o? Did you try and submit it on beta.ab.o or ab.o?


I did try on beta again just now. I will remember to add some recordings next time. :slight_smile:


Hm. I’m getting the “incomplete” message now for a dataset that has only had more recordings added since it was last run. @alastairp, can you take a look at https://beta.acousticbrainz.org/datasets/ff148dfe-272e-44e9-b3a0-707aa9f5d03e and see what’s “incomplete” about it? Is it because not all Recordings have actually been submitted to AB?


Check with @Gentlecat if he released https://github.com/metabrainz/acousticbrainz-server/pull/206 yet, which will tell you the reason for a dataset being incomplete.


I guess he didn’t if it doesn’t. :slight_smile:


I just released that update on both the main website and beta.


Superb @Gentlecat, thanks very much, now I can see a detailed error message! :smiley: :tada:

Cannot add this dataset because of a validation error: Can’t find
low-level data for recording: 0091e4e9-e521-4ef9-8085-61960bb2b156

I should probably start by submitting an AcoustID :sweat_smile:

EDIT: It would be helpful if the plain-text MBID were a link instead (here, 0091e4e9-e521-4ef9-8085-61960bb2b156).

EDIT2: Sorry, I confused AcousticBrainz and AcoustID. :sweat:
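For anyone else hitting this: before adding recordings to a dataset you can check whether AcousticBrainz already has low-level data for an MBID via the public API (`/api/v1/<mbid>/low-level` returns 404 when nothing has been submitted). A minimal sketch, just stdlib; the helper names are mine, not part of the AB codebase:

```python
import json
import urllib.error
import urllib.request

def low_level_url(mbid, base="https://acousticbrainz.org"):
    """Build the AcousticBrainz low-level endpoint URL for a recording MBID."""
    return f"{base}/api/v1/{mbid}/low-level"

def has_low_level(mbid):
    """Return True if AB has low-level data for the MBID, False on 404."""
    try:
        with urllib.request.urlopen(low_level_url(mbid)) as resp:
            json.load(resp)  # valid JSON document means a submission exists
            return True
    except urllib.error.HTTPError as err:
        if err.code == 404:  # no low-level data submitted for this MBID yet
            return False
        raise

# Example (requires network access):
# has_low_level("0091e4e9-e521-4ef9-8085-61960bb2b156")
```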


That’s a good idea. This message comes from an exception, so converting it to HTML won’t be a super simple task, but I agree that it’s a good idea. Let me think about the best way to do it (normally we link MBIDs to the AB page, but that doesn’t make sense for ones that aren’t in AB yet…)


I filed a ticket about simply not including any MBIDs not submitted to AB when evaluating datasets: