Hello all,
I developed a DLL in Intel C++ that performs long double arithmetic.
The DLL allocates 128 bits for each long double variable and delivers 19 significant digits of precision when the host application is written in C++ or Delphi.
With a C# host application the DLL still allocates 128 bits, but the number of significant digits drops to 15 (the same as double).
How is this possible, and what do I have to do to bring the number of significant digits back up to 19?
Thanks.