Conversions To/From String Representations

Posted by Beetle B. on Wed 06 February 2019

This section addresses how to convert between character sequences and decimal/binary floating point formats.

Decimal Character Sequence To Decimal Format

Input and output conversions are correctly rounded according to the applicable rounding direction.
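As a minimal sketch (not from the original post), Python's decimal module can stand in for an IEEE decimal format: a context with fixed precision parses a decimal character sequence and rounds it in one correctly rounded step.

```python
from decimal import Context, ROUND_HALF_EVEN

# Emulate a 7-digit decimal significand (as in decimal32) using
# Python's decimal module as a stand-in for a hardware format.
ctx = Context(prec=7, rounding=ROUND_HALF_EVEN)

# create_decimal parses the character sequence and rounds the result
# to the context's precision in a single correctly rounded step.
d = ctx.create_decimal("1.23456789")
print(d)  # 1.234568 (7 significant digits, ties rounded to even)
```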

Hexadecimal Character Sequence To Binary Format

Input and output conversions are correctly rounded. Every binary number is representable by a finite hexadecimal character sequence, since 16 is a power of 2.
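Python exposes this exact correspondence through float.hex() and float.fromhex(); a quick illustration (not from the original post):

```python
# Every binary64 value has an exact, finite hexadecimal character
# sequence because 16 is a power of 2.
x = 0.1
s = x.hex()                   # exact hex representation of x
print(s)                      # 0x1.999999999999ap-4
assert float.fromhex(s) == x  # the round trip is exact
```

Note that the hex string shows the true binary value stored for 0.1, which is not 0.1 exactly.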

Decimal Character Sequence To Binary Format

To convert a binary format to a decimal character sequence, use enough significant digits to guarantee a lossless round trip: 17 digits suffice for binary64, 9 for binary32.
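A small Python demonstration of the digit count (my own example, not from the post): 17 significant digits round-trip a binary64 value, while 16 can lose information.

```python
# 17 significant decimal digits are enough to round-trip any binary64
# value; fewer digits may lose information.
x = 0.1 + 0.2              # stored as 0.30000000000000004...
s17 = "%.17g" % x
assert float(s17) == x     # exact round trip with 17 digits
s16 = "%.16g" % x          # too few digits for this value
assert float(s16) != x     # information was lost
print(s17, s16)
```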

There is some detail about converting in the other direction (decimal character sequence to binary format) that I did not read thoroughly.