Why Does Floating-Point Output Differ Across Platforms?
As far as I know, under the IEEE 754 standard, extremely large numbers entered by the user cannot be stored exactly in binary floating-point format. When such imprecisely stored values are converted back to decimal for display on the console, should the output be identical across different operating systems (Windows/Linux) and CPU architectures?
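For concreteness, here is a minimal C sketch of the situation I mean (the literal value and the `%.17g` format are just examples I picked, assuming a typical 64-bit IEEE 754 `double`):

```c
#include <stdio.h>

int main(void) {
    /* 9876543210987654321 exceeds the 53-bit significand of a double,
       so the stored value is only the nearest representable binary number. */
    double x = 9876543210987654321.0;

    /* Print with enough digits to expose the rounding. My question is
       whether this printed text is guaranteed to be byte-for-byte the same
       on every OS and CPU architecture. */
    printf("%.17g\n", x);
    return 0;
}
```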