A couple of questions related to merges

  1. When entities are merged, are their tags also merged?

  2. How does merging affect AcousticBrainz data?
    2a) What should you do if you think two recordings are the same, but AcousticBrainz reports different keys for them? (I had an example but I can’t find it right now… will add later if I do)


The test server is very useful for answering questions like this, because you can just submit an arbitrary merge and instantly apply it to see what happens.

Anyway, I just tried it on two recordings, and the tags from the source were indeed moved to the target.


Found that AcousticBrainz example:

I have the compilation track, and a comparison by ear with this sounds the same to me.


AcousticBrainz computation of keys and key strength was recently discussed when we added keys to the recording sidebar: https://github.com/metabrainz/musicbrainz-server/pull/1578

If you look at Figure 11 in the scientific paper linked there, you will see the usual mistakes made by the algorithm: a half step up or down, a fifth up or down, and confusing relative major/minor keys. I haven’t read the paper in full, so I can’t tell you more.

So if your mismatch falls into one of these cases, I would imagine it’s not a reason not to merge.
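To make those confusion cases concrete, here is a small sketch (my own illustration, not code from the paper or from AcousticBrainz) that, for a given key, lists the keys a detector typically mistakes it for: a half step up/down, a fifth up/down, and the relative major/minor.

```python
# Pitch classes in semitone order (using sharps only, for simplicity).
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def transpose(note, semitones):
    """Move a pitch class up or down by a number of semitones."""
    return NOTES[(NOTES.index(note) + semitones) % 12]

def commonly_confused_keys(note, scale):
    """Keys a detector often reports instead of (note, scale)."""
    confusions = {
        (transpose(note, 1), scale),    # half step up
        (transpose(note, -1), scale),   # half step down
        (transpose(note, 7), scale),    # fifth up
        (transpose(note, -7), scale),   # fifth down
    }
    # The relative major/minor shares the same set of notes.
    if scale == "major":
        confusions.add((transpose(note, -3), "minor"))
    else:
        confusions.add((transpose(note, 3), "major"))
    return confusions

# A track in G minor could plausibly come back as D minor (a fifth up):
print(("D", "minor") in commonly_confused_keys("G", "minor"))  # True
```

Note that the Gm vs. D disagreement mentioned later in this thread is exactly a fifth apart, so it fits this pattern.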


Thanks. This one may just be challenging, lol. I threw it into some other “key analysis” software out of curiosity, and I think one came up with Gm and one came up with D (I don’t remember if that was major or minor). Also, I was looking at the AcousticBrainz data, and unless I’m misunderstanding what it means, it says the vocals are female, which… I think there might be a woman’s scream at one point?

So it looks like merging recordings on MusicBrainz only keeps the AcousticBrainz info from the target recording. https://acousticbrainz.org/85345aa0-c5ca-491d-a92b-54cf27382c70?n=0

Is that desired? Is there any way to get the other info linked again?

Hi there,
Sorry that I didn’t reply to this when it was asked. @loujin did a good job of answering too!

Some of our automated tools can give different results for a number of reasons, including bugs in our code and even differences in encoding formats and parameters (e.g. low-bitrate MP3s).

When data is submitted to AcousticBrainz, it stores only the MBID that it was submitted with. This means a few different situations can arise:

  • Two MBIDs that are merged in MusicBrainz as being the same recording, but whose data was submitted under the two separate IDs
  • Someone submitting data for an MBID that actually belongs to a different recording

We have some in-progress work that we hope to complete that will use the merged information in MusicBrainz to identify that two different submissions in AcousticBrainz are the same. This will allow us to ensure that we report the same value no matter what MBID you request.
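As a rough sketch of that idea (this is my own illustration with made-up IDs and structures, not the actual AcousticBrainz implementation), a requested MBID can be resolved through the MusicBrainz merge redirects so that every MBID pointing at the same surviving recording returns the same pooled set of submissions:

```python
# Made-up redirect table: old (merged-away) MBID -> surviving MBID.
REDIRECTS = {"old-mbid-1": "canonical-mbid", "old-mbid-2": "canonical-mbid"}

# Made-up raw submissions, keyed by the MBID they were submitted with.
SUBMISSIONS = {
    "old-mbid-1": [{"key": "G", "scale": "minor"}],
    "canonical-mbid": [{"key": "D", "scale": "minor"}],
}

def canonical(mbid):
    """Follow merge redirects until we reach the surviving recording."""
    while mbid in REDIRECTS:
        mbid = REDIRECTS[mbid]
    return mbid

def all_submissions(mbid):
    """Pool submissions from every MBID that resolves to the same recording."""
    target = canonical(mbid)
    merged = []
    for source, subs in SUBMISSIONS.items():
        if canonical(source) == target:
            merged.extend(subs)
    return merged

# Both submissions come back no matter which MBID you request with:
print(len(all_submissions("old-mbid-1")))  # 2
```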

It’s also possible that the submitted data is incorrect. With the tools developed by @aidanlw17 (https://github.com/metabrainz/acousticbrainz-server/pull/364) we hope to identify cases where we have multiple submissions and some of the data is incorrect (e.g. if there are 9 recordings that say G major and 1 that says A minor, then that one is probably incorrect!).
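The 9-vs-1 idea above can be sketched as a simple majority vote (again just an illustration of the concept, not the code from the linked PR): if one value covers a clear majority of the submissions, the disagreeing values are flagged as likely outliers.

```python
from collections import Counter

def majority_and_outliers(values, threshold=0.8):
    """Return (majority value, outliers) if one value covers at least
    `threshold` of the submissions, else (None, [])."""
    counts = Counter(values)
    value, n = counts.most_common(1)[0]
    if n / len(values) >= threshold:
        outliers = [v for v in values if v != value]
        return value, outliers
    return None, []

# Nine submissions say G major, one says A minor:
keys = ["G major"] * 9 + ["A minor"]
print(majority_and_outliers(keys))  # ('G major', ['A minor'])
```

With no clear majority the function returns `(None, [])`, i.e. nothing gets flagged.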
