Same meaning, and for valid reasons: there are applications that are not that
sensitive to strict semantics, but have a lot to gain by cutting corners.
> A computer lets you make more mistakes faster than any invention in
> human history - with the possible exceptions of handguns and tequila.
On the AV front, NTSC is a great example of how many bits you can lose if you
really need to be backwards compatible with the old B/W signal, and don't mind
telling people not to wear checked suits.
Steganography (see <http://www.stego.com/>) is an example of making use of
those low-order bits that were otherwise useless for human consumption. Jim
and Rohit need to look for covert channels in the Nordquist corpus: if it
doesn't contain the date of the next market crash, it probably at least
contains the proper instructions for animating a golem...
On the compiler-writing front, assuming memory is not "volatile" is the
trivial example: the compiler caches a value in a register, and we'd lose bits
if something else were writing to that memory, but in practice it's usually
not worth dealing with.
The problem of aliasing is a bit more subtle, and unless the language has
taken steps to avoid it, the compiler writer has to be conservative. In that
case, it's up to a human to assure the compiler that of course the code in
question doesn't have any potential aliasing, at least not in the intended
application, and that anyway, no eyes will be poked out even if there were
this alleged aliasing, so go ahead and make it faster...