I am in exactly the same position. Using the JSON API live is not an option due to its performance constraints. Caching the data is the only feasible option, but that requires regular batch updates. I just ran a small test update (~20,000 releases) and had to spread it over two days to stay within the rate limit. How am I supposed to update a few hundred thousand entities?
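To make the constraint concrete, here is a minimal sketch of the kind of throttled batch updater I am running. The function names and the 1-request-per-second figure are my assumptions, not an official client:

```python
import time

def rate_limited(items, process, max_per_sec=1.0):
    """Apply `process` to each item, never exceeding max_per_sec calls per second.

    `process` would be the actual API fetch in a real updater; here it is
    any callable, so the throttling logic can be tested in isolation.
    """
    interval = 1.0 / max_per_sec
    results = []
    for item in items:
        start = time.monotonic()
        results.append(process(item))
        # Sleep off the remainder of the per-request time slot, if any.
        elapsed = time.monotonic() - start
        if elapsed < interval:
            time.sleep(interval - elapsed)
    return results
```

At one request per second, a single pass over 20,000 releases already takes more than five hours, which is why the update had to be spread over days once retries and related lookups are included.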
It should not be too much of a challenge to add a few log statements whenever a key entity changes (at a minimum: “artist”, “release” and “recording”). If you simply expose this change log, we can take it from there - parsing logs is trivial.
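To show just how trivial, here is a sketch of a parser for one possible log format. The tab-separated layout (timestamp, entity type, MBID) is entirely my invention, purely to illustrate that any simple line-oriented format would do:

```python
import datetime

def parse_change_log(lines):
    """Parse hypothetical change-log lines of the form:

        2024-05-01T12:00:00Z<TAB>release<TAB><MBID>

    Returns a list of (timestamp, entity_type, mbid) tuples.
    """
    changes = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        ts, entity_type, mbid = line.split("\t")
        # Python's fromisoformat doesn't accept a trailing "Z" before 3.11,
        # so normalize it to an explicit UTC offset.
        timestamp = datetime.datetime.fromisoformat(ts.replace("Z", "+00:00"))
        changes.append((timestamp, entity_type, mbid))
    return changes
```

A consumer would then fetch only the entities listed since its last sync, instead of re-polling everything.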
This will help prevent millions of unnecessary API calls, freeing bandwidth for editors.
If it helps, I can give you free cloud storage space (Google Cloud Storage) for such logs, and I can write a simple public JSON API for anyone to use - without touching your servers or bandwidth.
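The public API I have in mind would be little more than this: given a client's last-sync timestamp, return everything newer as JSON. A minimal sketch of that core lookup, with an in-memory store standing in for whatever backend would actually hold the logs (names and shape are my assumptions):

```python
import json

def changes_since(changes, since_iso):
    """Return a JSON document of all changes strictly after `since_iso`.

    `changes` is a hypothetical in-memory list of (iso_timestamp,
    entity_type, mbid) tuples. ISO-8601 UTC timestamps compare correctly
    as plain strings, so no date parsing is needed for the filter.
    """
    recent = [
        {"timestamp": ts, "type": etype, "id": mbid}
        for ts, etype, mbid in changes
        if ts > since_iso
    ]
    return json.dumps({"changes": recent})
```

Wrapped in any HTTP framework, this becomes a single read-only endpoint serving static-ish data from cloud storage, so it would put zero load on your infrastructure.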