This was a problem on a project I worked on just before Christmas; sorry for the delayed report (and lack of details). I had notes on a few other improvements and feature ideas gained from insights on the project, but lost them to a power cut while writing them down.
This was a "low poly" environment mesh of 2 million triangles (the original is a point cloud in the billions), with 100 UDIMs and a few thousand UV islands (a mixture of auto seams and manual ones).
After applying the set texel density feature, I recall all 128 GB of system RAM being used, but no crash. The operation completed (it takes a while, and there seems to be no way to cancel it), but it failed to apply, bringing up an error about vector size or something similar.
This vector error continued when attempting other operations, and saving the current progress and reloading the file did not reset or resolve it; the bug persisted across a save.
Apologies for not having more useful information. If this is due to mesh size, I doubt most users will ever run into it, since most studios would split such a large mesh into smaller chunks beforehand.