Nah, Java is alright. All the old, complicated “enterprise” community and frameworks gave it a bad reputation. It was designed to be an easier, less bloated C++ (in terms of features and programming paradigms), and it executes fairly efficiently. Last time I checked, a program that runs in some amount of time in C typically takes about 2x as long in Java, versus something like 200x as long in Python. Here are some recent benchmarks: https://benchmarksgame-team.pages.debian.net/benchmarksgame/fastest/python3-java.html
I haven’t had a chance to try Rust yet, but want to. Interestingly, Rust scores poorly on the benchmarks game’s source-code size metric (their rough proxy for complexity): https://benchmarksgame-team.pages.debian.net/benchmarksgame/how-programs-are-measured.html#source-code
The Stable Diffusion algorithm is strange; I’m surprised someone thought of it, and even more surprised it works.
IIRC it works like this: Stable Diffusion starts with an image of pure random noise. The idea is to treat that noise as if it were a real image, matching the text prompt, that had noise added to it. So, given the text, the model tries to predict what the image would look like with a little of that noise removed. It repeats this, step by step, until the image is fully denoised.
So, it’s very easy for the algorithm to make a “mistake” in one iteration, say by coloring the wrong pixels black. It can’t correct its mistake in later denoising iterations; it just fills in the pixels around it with whatever it thinks looks plausible. And it can’t really plan ahead, it can only do one denoising step at a time.
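Just to make the shape of that loop concrete, here’s a rough sketch (not the real thing: `predict_noise` is a placeholder for the actual text-conditioned U-Net, the noise schedule is made up, and real samplers like DDPM/DDIM use more careful update math):

```python
import numpy as np

# Placeholder for the real model: in Stable Diffusion this is a large neural
# network that predicts the noise present in the image, conditioned on the
# text prompt. Here it just returns random values so the loop runs.
def predict_noise(image, text_embedding, step):
    return np.random.default_rng(step).normal(size=image.shape)

def generate(text_embedding, size=(64, 64), steps=50):
    # Start from an image of pure Gaussian noise.
    image = np.random.normal(size=size)
    for step in reversed(range(steps)):
        # "Assuming this is a noisy version of an image matching the prompt,
        # what noise was added?"
        noise_estimate = predict_noise(image, text_embedding, step)
        # Remove a small fraction of the estimated noise. Each pass commits to
        # its guess; there's no going back to fix earlier pixels.
        image = image - (1.0 / steps) * noise_estimate
    return image

img = generate(text_embedding=None)
```

Each iteration only sees the current (partially denoised) image, which is why a wrong guess early on just gets painted around rather than undone.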