'Use original values' for a tag on a large selection takes a very long time, with high CPU usage

Not a big concern, but mainly wondering:

When I have loaded and matched a large number of releases in the right panel, select them all, and then right-click on a tag and choose ‘Use original values’, it sometimes takes Picard a very long time to process, with a rather high CPU (and memory) load on my aging system.

Uneducated me thinks: “why is telling Picard not to do something to a tag such a burden?”

If others don’t experience this, perhaps it is something specific to my system? (settings, plugins, scripts)

As I am typing this, Picard has been processing such an action for some 15 (edit: 30) minutes now, and is showing ‘Not Responding’.

I’m expecting it to finish, but if it doesn’t I will see if I can find something in a log and report on it.
(edit2: and it’s finished)

(W10, Picard portable)


I guess that’s another one of those instances where Picard becomes very busy updating the UI. We have had a couple of optimizations for the metadata box and related UI updates, but I don’t think anybody has looked in detail at this specific use case yet.

I added https://tickets.metabrainz.org/browse/PICARD-2166 for this.

@gabrielcarvfer , maybe you want to take a look at this as our performance wizard :smiley:


I feel like we’ve looked into this. When you click to use the original tags you’re not just ticking a checkbox and doing a single refresh of the metadata box; you’re actually changing every single tag that changed in every single file selected. I originally prevented these changes from updating the metadata box until the file selection changed, but people noticed that right away (which I didn’t think would happen during normal use) and it got rolled back.

# Excerpt from the metadata box context menu handler: for every tag whose
# status is CHANGED or REMOVED, build one callback per affected file, track
# and album that restores the original values (partial comes from functools).
useorigs = []
...
if status == TagStatus.CHANGED or status == TagStatus.REMOVED:
    file_tracks = []
    track_albums = set()
    for file in self.files:
        objects = [file]
        # If this is the only selected file of its track, update the track too.
        if file.parent in self.tracks and len(self.files & set(file.parent.files)) == 1:
            objects.append(file.parent)
            file_tracks.append(file.parent)
            track_albums.add(file.parent.album)
        orig_values = list(file.orig_metadata.getall(tag)) or [""]
        useorigs.append(partial(self.set_tag_values, tag, orig_values, objects))
    # Tracks that are selected without any of their files, then their albums.
    for track in set(self.tracks) - set(file_tracks):
        objects = [track]
        orig_values = list(track.orig_metadata.getall(tag)) or [""]
        useorigs.append(partial(self.set_tag_values, tag, orig_values, objects))
        track_albums.add(track.album)
    for album in track_albums:
        objects = [album]
        orig_values = list(album.orig_metadata.getall(tag)) or [""]
        useorigs.append(partial(self.set_tag_values, tag, orig_values, objects))
...
# All callbacks are applied in one go when the context menu action fires.
use_orig_value_action.triggered.connect(partial(self._apply_update_funcs, useorigs))
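For scale: with N selected files and M changed tags that loop produces on the order of N × M callbacks, and, as discussed above, each callback ends up triggering UI work of its own. Purely to illustrate the scaling, here is a hypothetical, self-contained sketch; FakeMetadataBox, the 1 ms refresh cost and this simplified set_tag_values are made-up stand-ins, not Picard code:

from functools import partial
import time

class FakeMetadataBox:
    """Stand-in for the UI widget: it only counts and fakes the cost of refreshes."""

    def __init__(self):
        self.refreshes = 0

    def refresh(self):
        self.refreshes += 1
        time.sleep(0.001)  # pretend rebuilding the tag diff view costs ~1 ms

def set_tag_values(box, tag, values, objects):
    # Simplified stand-in: write the tag on every object, then refresh the UI.
    for obj in objects:
        obj[tag] = values
    box.refresh()

if __name__ == "__main__":
    box = FakeMetadataBox()
    files = [{} for _ in range(1500)]                    # 1.5k selected files
    changed_tags = ["title", "artist", "album", "date"]  # 4 changed tags

    # One callback per changed tag per file, like the useorigs list above.
    useorigs = [partial(set_tag_values, box, tag, ["original"], [f])
                for f in files for tag in changed_tags]

    start = time.perf_counter()
    for func in useorigs:
        func()
    print(f"{box.refreshes} refreshes in {time.perf_counter() - start:.1f} s")

Even at 1 ms per refresh, the per-callback updates dominate the run time; batching them into a single refresh is what the change below is about.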

@outsidecontext possible change

def _apply_update_funcs(self, funcs):
    # Prevent the individual file updates from triggering selection updates.
    with self.parent.ignore_selection_changes:
        for f in funcs:
            f()
    # One combined UI update after everything has been applied.
    self.parent.update_selection(new_selection=False, drop_album_caches=True)
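The shape of the fix: a guard that the per-item update paths check (so they become no-ops while the batch runs), followed by exactly one combined update at the end. Below is a minimal, hypothetical sketch of that pattern; SelectionGuard and Window are illustrative names, not how Picard actually implements ignore_selection_changes:

class SelectionGuard:
    """Reentrant flag usable as a context manager: truthy while any
    'with guard:' block is active."""

    def __init__(self):
        self._depth = 0

    def __enter__(self):
        self._depth += 1
        return self

    def __exit__(self, exc_type, exc, tb):
        self._depth -= 1
        return False

    def __bool__(self):
        return self._depth > 0


class Window:
    """Stand-in for the parent window that owns the selection handling."""

    def __init__(self):
        self.ignore_selection_changes = SelectionGuard()
        self.updates = 0

    def on_selection_changed(self):
        if self.ignore_selection_changes:
            return  # suppressed while a batch is being applied
        self.update_selection()

    def update_selection(self, new_selection=True, drop_album_caches=False):
        self.updates += 1  # stand-in for the expensive UI rebuild


def apply_update_funcs(window, funcs):
    # Same shape as _apply_update_funcs above: suppress per-item updates,
    # then do a single combined refresh at the end.
    with window.ignore_selection_changes:
        for f in funcs:
            f()
    window.update_selection(new_selection=False, drop_album_caches=True)


if __name__ == "__main__":
    w = Window()
    # 6000 "changes" that would otherwise each have triggered an update.
    apply_update_funcs(w, [w.on_selection_changed] * 6000)
    print(w.updates)  # prints 1

The trade-off is that nothing reacts to selection changes until the batch finishes, which is where the side effect mentioned further down comes from.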

How does that affect things? (changing 4 text tags in 41 files without linked albums and rolling them back to the originals)
[screenshot]

Side effect: you won’t be able to change the selection while this is processing. Hopefully it will be so fast that it won’t matter. =x

Now testing for 1.5k files with a single tag change.
[screenshot]
