The way I understand it, bloom is used because of the limited brightness of computer monitors.
The problem is that monitors typically work with 8 bits per channel, i.e. 256^3 colors, which means only 256 brightness levels per channel (values 0 through 255). The actual luminance those steps map to depends on the brightness of the particular monitor, but in short this means there are only 256 steps between pure black and pure white, and anything brighter than "pure white" simply clips.
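A minimal sketch of that clipping problem, assuming scene brightness stored as linear floats where 1.0 is the monitor's white point (the specific values are just illustrative):

```python
import numpy as np

# Linear-light scene brightnesses: shadow, midtone, white,
# then two over-bright sources (flame-ish, sun-ish).
scene = np.array([0.001, 0.5, 1.0, 50.0, 10000.0])

# An 8-bit display clips everything above 1.0 into the same value.
display = np.round(np.clip(scene, 0.0, 1.0) * 255).astype(np.uint8)
print(display)  # [  0 128 255 255 255] -- the flame and the sun look identical
```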
In real life, there are WAY more steps: brightness is effectively continuous, and a sunlit scene can be millions of times brighter than a dim room. However, until someone develops a monitor that can match the light intensity of, say, the sun or a magnesium flame (both pretty bright), accurate light reproduction is impossible.
What bloom does is compensate for brightness levels outside the monitor's range by simulating what your eye (or a camera) would see if the monitor actually could produce them: very bright light scatters and appears to glow past its edges. In practice the renderer extracts the over-bright parts of the image, blurs them, and adds the result back onto the normal image (in 256^3 colors, of course).
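A sketch of that extract-blur-add approach, assuming a linear-light float image; the threshold, blur radius, and strength are illustrative parameters, not taken from any particular engine:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def bloom(hdr, threshold=1.0, sigma=8.0, strength=0.6):
    """Minimal bloom sketch for a linear-light float image of shape (H, W, 3)."""
    # Keep only the energy above what the display can show.
    bright = np.maximum(hdr - threshold, 0.0)
    # Blur it to imitate light scattering in the eye/camera.
    glow = gaussian_filter(bright, sigma=(sigma, sigma, 0))
    # Add the glow back onto the original image.
    return hdr + strength * glow
```

Real renderers do this on the GPU with downsampled blur passes for speed, but the principle is the same.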
HDR is more or less the same principle. The lighting is computed over a much wider range than the monitor is capable of displaying (or the eye is capable of taking in at one moment), and a tone-mapping step then uses that data, together with the average image brightness, to compress it into an image that monitors can display.
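One common way to do that compression is the global Reinhard operator, sketched below; the actual game or engine may use a different curve, and the "key" value (target average brightness) is an assumption for illustration:

```python
import numpy as np

def tone_map(hdr, key=0.18, eps=1e-6):
    """Sketch of Reinhard-style tone mapping for a linear HDR image (H, W, 3)."""
    # Per-pixel luminance (Rec. 709 weights).
    lum = hdr @ np.array([0.2126, 0.7152, 0.0722])
    # Log-average luminance stands in for "average image brightness".
    avg = np.exp(np.mean(np.log(lum + eps)))
    scaled = (key / avg) * lum          # exposure based on the scene average
    mapped = scaled / (1.0 + scaled)    # compress [0, inf) into [0, 1)
    ldr = hdr * (mapped / (lum + eps))[..., None]
    return np.clip(ldr * 255, 0, 255).astype(np.uint8)
```

Note how the average brightness sets the exposure: the same bright window looks blown out from inside a dark room but normal from outside, which is exactly the eye-adaptation effect HDR games imitate.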
This has little to do with camera lens effects, and more with the sensor (or your retina): bloom models light spilling between sensor cells or scattering inside the eye. Lens flares, on the other hand, yes, those are genuinely lens effects.