Classical music - Accessing work and parts structure

Using Picard, it is possible to access the “Work”, but only as a single flat string including all parts from the top level down to the individual movement. MusicBrainz can actually store a structure of up to 8 levels (although two or three levels is most common). Is there a plugin that retrieves the additional structure from MusicBrainz and makes it available in Picard via tags? If not, is it feasible to write such a plugin (I have some programming capability, but no experience of writing Picard plugins)?
Thanks

Where do you get the 8 from? I don’t see why it wouldn’t be possible to do “infinite” levels.

Yes, it would be possible to write a plugin that extracts the information you want.
From a plugin you can find the work ID of a song.
You could then make an HTTP request to the web service to get more information about that work.
You could then parse the results, see if there is a relationship to another work, and make another HTTP request to the service.
Repeat until you have all the information you need.
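Something like the following shows that loop as a standalone sketch (not Picard code). It assumes the MusicBrainz web service v2 with JSON output, and the check for a “parts” relationship in the “backward” direction is my reading of how the parent work appears in the response:

```python
# Standalone sketch of the lookup loop described above (not a Picard plugin).
import time
import requests  # assumed available; inside Picard you would use its own web service helpers

MB_WORK_URL = "https://musicbrainz.org/ws/2/work/{}"

def work_chain(work_id):
    """Return [(id, title), ...] from the given work up to its top-level parent."""
    chain = []
    while work_id:
        resp = requests.get(
            MB_WORK_URL.format(work_id),
            params={"inc": "work-rels", "fmt": "json"},
            headers={"User-Agent": "work-hierarchy-example/0.1 (you@example.org)"},
        )
        resp.raise_for_status()
        work = resp.json()
        chain.append((work["id"], work["title"]))
        # A "parts" relationship in the "backward" direction should point at the
        # work that this one is part of (my assumption about the response layout).
        parents = [rel["work"] for rel in work.get("relations", [])
                   if rel.get("type") == "parts" and rel.get("direction") == "backward"]
        work_id = parents[0]["id"] if parents else None
        time.sleep(1)  # stay within the MusicBrainz rate limit
    return chain
```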

Plugins are written in Python; check out the GitHub repo for plugins.


I would look at albumartist_website or wikidata as examples that make further requests to the MusicBrainz service for more information.
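For orientation, a minimal track metadata processor might look roughly like this (a sketch only: the plugin constants are the usual Picard plugin boilerplate, and the processor signature follows the pattern used later in this thread):

```python
PLUGIN_NAME = "Work hierarchy example"
PLUGIN_AUTHOR = "Example"
PLUGIN_DESCRIPTION = "Illustrative skeleton only."
PLUGIN_VERSION = "0.1"
PLUGIN_API_VERSIONS = ["1.3.0"]

from picard import log
from picard.metadata import register_track_metadata_processor


def add_work_levels(album, metadata, track, release):
    # Picard stores the bottom-level work MBID(s) in this tag.
    work_ids = metadata.getall('musicbrainz_workid')
    if work_ids:
        log.debug("workparts: found work id(s) %r", work_ids)
        # ...follow the part-of relationships from here (see the sketch above)...


register_track_metadata_processor(add_work_levels)
```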


You are almost certainly right - although I am not aware of any more than eight in practice!

Thanks for the suggestions. I’ve done some simple python stuff. I’ll look into it.

Cool. It would be interesting to be able to do more with all the available information. It’s a shame music players can’t really do anything useful with a data hierarchy, though.

Most music players may not, but Muso, which is a music library manager (particularly, but not only, in the Squeezebox ecosystem), has excellent features and will now link directly to MusicBrainz - see http://klarita.net/muso.html. It is to make use of the features of Muso that I wish to create the additional tags - you can effectively have a three-level hierarchy of work/sub-part/movement. In the UPnP world, MinimServer supports such structures also.


MusicBee has also recently added support for work and movement, similar to iTunes.

I have made a start on this, following the suggested approach, but am a bit confused by the sequence of events. What I have written is (in sequence):
A) Defined a class PartLevels with a method add_worktag. This is registered via register_track_metadata_processor(PartLevels().add_worktag)
B) add_worktag gets the workId for the recording from the metadata
C) It then calls the method add_partof, which contains the XML lookup, namely: return tagger.tagger.xmlws.get(host, port, path, partial(self.process_rels, metadata), queryargs=queryargs)
D) process_rels traverses the response to extract the work which the workId is “part of” and updates the metadata
E) add_worktag updates the “bottom level” metadata.

All the individual bits seem to work OK, but not in this order. Having put various traces in, it is clear from the log that what happens is:

  1. Picard loads and clusters the files
  2. add_worktag executes and updates the metadata for each track (i.e. E above), but without the additional level from process_rels which does not execute at this stage.
  3. Picard then moves the files to the right-hand pane.
  4. Now process_rels executes, finds the parent work and updates the metadata object (i.e. D above). BUT it has “missed the boat” because Picard has finished tagging, and the metadata in Picard has not been updated with the new list items in the metadata object (which I can see clearly in the log).

So it seems that add_partof returns control to add_worktag which completes without process_rels having executed yet.
Maybe I’m being a bit dim and maybe there is a better explanation of the Picard API somewhere so I can work out where I’m going wrong?
Many thanks for any light shed.

Are you able to send me your code?
Pastebin, GitHub, or the contact editor form on MusicBrainz: https://musicbrainz.org/user/dns_server

Picard is designed to process metadata all in one method:
  1. Picard loads the metadata for that item.
  2. Picard calls all plugins.
  3. Picard finishes processing metadata.

As you are making asynchronous API calls, add_worktag has triggered the HTTP call and returned to Picard, suggesting that it has finished.

The hack that the albumartist_website plugin uses is to change the _requests variable to indicate that there are pending requests for this item, which delays Picard from finalizing the album.
You then need to call _finalize_loading once there are no pending requests:
album._requests += 1
album._requests -= 1
album._finalize_loading(None)
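Putting that together with the xmlws.get() call quoted earlier in the thread, the shape of the hack is roughly as follows (a sketch only: the server_host/server_port settings and the (document, reply, error) callback arguments are my assumptions about the Picard 1.x internals, based on the albumartist_website plugin):

```python
from functools import partial
from picard import config
from picard.metadata import register_track_metadata_processor


class PartLookup:

    def add_worktag(self, album, metadata, track, release):
        work_ids = metadata.getall('musicbrainz_workid')
        if not work_ids:
            return
        # Tell Picard the album still has a request in flight so it does not
        # finalize the metadata before our callback has run.
        album._requests += 1
        host = config.setting["server_host"]
        port = config.setting["server_port"]
        path = "/ws/2/work/%s" % work_ids[0]
        queryargs = {"inc": "work-rels"}
        return album.tagger.xmlws.get(
            host, port, path,
            partial(self.process_rels, album, metadata),
            queryargs=queryargs)

    def process_rels(self, album, metadata, document, reply, error):
        try:
            if not error:
                pass  # parse the parent work out of 'document' and update 'metadata' here
        finally:
            # Always release the pending-request count, then let Picard finish loading.
            album._requests -= 1
            album._finalize_loading(None)


register_track_metadata_processor(PartLookup().add_worktag)
```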

With the wikidata plugin I’ve added some threading locks as well, to get around some threading issues I had, so this may be needed too.

It would be good if the Picard developers could suggest a better way of doing this: a proper API that plugins can use to let Picard know that they are still doing asynchronous tasks in the background and have not finished processing metadata. This is a hack that works, but I would welcome suggestions of a better approach.

Thanks for the help. I had written the plugin from scratch as I prefer to write my own code - at least I stand a chance of understanding what I have written. Instead I have now used the Album Artist Website as a starting point and have got the basics working (one level deep only). Now just need to add the extra levels :pensive:


First version of this is written (“workparts.py”). I have tested it and it seems to be OK. It adds custom tags: work_0 (and workId_0) for the bottom level, and work_parent_n (and work_parentId_n) for each successive level n above that.
If there is more than one work at the bottom level, then it should create work_0.1 etc., but I haven’t found a suitable album to test that (and arguably it is bad style to have more than one…). If there is more than one “parent” work of a lower-level work, it uses the one with the longest name, on the grounds that the longest-named is likely to be the lowest level; this might possibly be improved.
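For example (hypothetical values, purely to illustrate the naming scheme), a recording of a single movement might end up with tags along the lines of:
work_0 = Symphony No. 5 in C minor, op. 67: II. Andante con moto
workId_0 = <MBID of the movement work>
work_parent_1 = Symphony No. 5 in C minor, op. 67
work_parentId_1 = <MBID of the parent work>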
Many thanks to Daniel Sobey for assistance and to Sophist for the albumartist_website code which I have used extensively. I have tried to add some more comments to help any others trying the same techniques. There are also extensive log.info and log.debug statements which could be slimmed down once testing is complete.
Further enhancements (among others) would be to discriminate by the type of “parts” relationship and to add arranger etc.
I now need to put it on GitHub for others to test, but I have never used Git before, so am not sure of the easiest and simplest way to upload it (I have created an account). I would be grateful for some guidance on this, then others can play with it.

https://github.com/metabrainz/picard-plugins is the repository for plugins.

git is designed around pulling changes rather than pushing features to the main repository.
You can “fork” to create a copy of the repository under your account.
You then make the changes in your copy until you are happy with your work.
You then let everyone know you have a copy and issue a “pull request” to say that you have a change, and the owners of the repository will take your changes and merge them into the master repository.

Log in to GitHub and navigate to https://github.com/metabrainz/picard-plugins
On the second row there is a “Fork” button; this will create a copy in your account.
Now that you have a copy under your account, you can add your files.
You can use the github web page to upload files if that is simpler for you.

The normal way of doing it is to clone the repository to your PC, edit it there, commit the changes and push them to your GitHub page.
There is a green clone button; clicking this will give you a URL. Clone it with something like the below:
git clone https://github.com/metabrainz/picard-plugins.git
cd picard-plugins/picard
mkdir workparts
cd workparts
# copy workparts.py into this directory
git add workparts.py
git commit -m "Add workparts plugin"   # the message should describe your changes
git push

You should now have a commit pushed to your repository.
Go back to GitHub and create a pull request; there is a “New pull request” button.


Thanks. Hope I’ve done that right. See https://github.com/metabrainz/picard-plugins/pull/94

Revised version now at https://github.com/MetaTunes/picard-plugins/tree/1563fac8ee1e74ff81a29a8273d0bad182f16f5d/plugins/classical_workparts
Read the readme there for a full explanation.

A completely revised and updated version, “Classical Extras”, is now available in beta test. See the “Classical Extras plugin” thread for information and any further updates.