How did it happen that 0.1 + 0.2 = 0.30000000000000004?

From childhood we were taught that 0.1 + 0.2 equals 0.3. However, in the mysterious world of computing, things work differently. I recently started writing JavaScript code, and while reading about data types, I noticed strange behavior: 0.1 + 0.2 is not equal to 0.3.
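
You can see it for yourself in any browser console or Node.js REPL:

    console.log(0.1 + 0.2);         // 0.30000000000000004
    console.log(0.1 + 0.2 === 0.3); // false
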
I turned to Stack Overflow for help and found a couple of posts that helped. The first one asked almost exactly my question: adding 0.1 and 0.2 produces a strange result, and comparing the sum with 0.3 returns false. Why does this happen?

Or, as the title of the post puts it: how did it happen that 0.1 + 0.2 = 0.30000000000000004?

The top answer explains that this is not a bug and not something specific to JavaScript: you would get exactly the same result in Java and C, because this is simply how binary floating point arithmetic works. The advice: never test floating point numbers for exact equality; compare them with a small tolerance instead.

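A minimal sketch of such a tolerance check (the helper name nearlyEqual is mine; Number.EPSILON, the gap between 1 and the next representable double, is a common default tolerance):

    const nearlyEqual = (a, b, eps = Number.EPSILON) =>
      Math.abs(a - b) < eps;

    console.log(nearlyEqual(0.1 + 0.2, 0.3)); // true
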
That told me what was happening, but not why. So I decided to dig deeper.

The first question to answer: what is a floating point number? In essence, it is the computer's version of scientific notation. Let's recall what that looks like:

In scientific notation, a number is written as a significand multiplied by a base raised to an exponent. For example, the number 0.0005606 becomes:

5.606 × 10^-4

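JavaScript can print a number in this notation directly:

    console.log((0.0005606).toExponential()); // 5.606e-4
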
The significand holds the significant digits of the number (here 5.606), the base is the numeral system in use (here 10), and the exponent shows how many positions the point was shifted (here -4). Nothing stops us from doing the same thing in base 2, and that is exactly what computers do.

Floating point numbers come in different sizes: single precision numbers occupy 32 bits, double precision numbers occupy 64 bits.

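The difference in precision is easy to observe: JavaScript lets you squeeze a number into 32 bits via a Float32Array, and the rounding error becomes much larger:

    const f32 = new Float32Array(1);
    f32[0] = 0.1;        // stored as a 32-bit float
    console.log(f32[0]); // 0.10000000149011612
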
Like many other languages, JavaScript uses the 64-bit double precision format defined by the IEEE 754 standard for all of its numbers.

In the 64-bit format, the significand (the fraction) occupies bits 0 through 51, the exponent occupies bits 52 through 62, and bit 63 holds the sign.

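You can pull these three fields out of any number yourself. A small sketch (the helper name float64Bits is mine) that reinterprets a number's eight bytes as a bit string:

    // Returns the sign, exponent and fraction fields of a number's
    // IEEE 754 double precision representation.
    function float64Bits(x) {
      const view = new DataView(new ArrayBuffer(8));
      view.setFloat64(0, x); // big-endian by default
      let bits = "";
      for (let i = 0; i < 8; i++) {
        bits += view.getUint8(i).toString(2).padStart(8, "0");
      }
      return {
        sign: bits[0],
        exponent: bits.slice(1, 12),
        fraction: bits.slice(12),
      };
    }

    console.log(float64Bits(0.1));
    // { sign: '0',
    //   exponent: '01111111011',
    //   fraction: '1001100110011001100110011001100110011001100110011010' }
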
Let's trace how 0.1 ends up in this 64-bit IEEE 754 format.

The first step is to convert 0.1 from base 10 to base 2.

And here the trouble begins: in base 2, 0.1 turns out to be an infinitely repeating fraction:

0.000110011001100110011... (the group 1100 repeats forever)

Normalized to scientific notation, this is 1.10011001100110011... × 2^-4.

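You can ask JavaScript for this binary expansion directly. What it prints is already the stored, rounded value, so the repeating tail eventually stops:

    console.log((0.1).toString(2));
    // 0.0001100110011001100110011001100110011001100110011001101
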
But we only have 64 bits for the whole number, and only 52 of them for the significand's fraction, so the infinite tail has to be cut off and rounded.

Rounded to 52 bits, the fraction (the leading 1 before the point is implied and not stored) becomes:

1001100110011001100110011001100110011001100110011010

The discarded tail was more than half of the last bit, so the last bits were rounded up from ...1001 to ...1010. This tiny rounding error is the root of the whole story: what is stored is no longer exactly 0.1, but the closest representable number to it.

Now for the exponent. During normalization the point moved four positions to the left, so the exponent is -4. Here is how it is stored:

The exponent field is 11 bits wide, and in the 64-bit format it is stored with a bias of 1023. So the exponent -4 is stored as -4 + 1023 = 1019, which in binary is 01111111011.

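Using the float64Bits sketch from above, you can check this:

    const { exponent } = float64Bits(0.1);
    console.log(exponent);                     // 01111111011
    console.log(parseInt(exponent, 2) - 1023); // -4
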
Putting the pieces together, 0.1 is stored as:

0 01111111011 1001100110011001100110011001100110011001100110011010

(sign, exponent, significand fraction)

The same procedure for 0.2 gives almost the same bits. Since 0.2 is exactly twice 0.1, the fraction is identical and only the exponent changes, from -4 to -3 (stored as 1020, binary 01111111100):

0 01111111100 1001100110011001100110011001100110011001100110011010

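Again, the helper confirms it:

    console.log(float64Bits(0.2));
    // { sign: '0',
    //   exponent: '01111111100',
    //   fraction: '1001100110011001100110011001100110011001100110011010' }
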
Now let's add them. To add two floating point numbers, their exponents must first be aligned: the significand of 0.1 is shifted one bit to the right so that both numbers use the exponent -3. Then the significands are added:

      1.1001100110011001100110011001100110011001100110011010    (0.2)
    + 0.11001100110011001100110011001100110011001100110011010   (0.1, shifted)

Which gives:

     10.01100110011001100110011001100110011001100110011001110   × 2^-3

The sum no longer fits the 1.xxxx pattern, so it is normalized (the point moves one position to the left and the exponent becomes -2) and rounded to 52 bits once more. This is the binary representation of 0.1 + 0.2:

0 01111111101 0011001100110011001100110011001100110011001100110100

And that is not the binary representation of 0.3: it is one least significant bit above it. Converted back to decimal, this value is exactly 0.3000000000000000444089209850062616169452667236328125, which JavaScript, printing the shortest string that still identifies the number, displays as 0.30000000000000004. That is how it happened that 0.1 + 0.2 = 0.30000000000000004.
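
One last check ties it all together: the sum and the literal 0.3 really are two neighboring doubles, one tick apart:

    console.log(float64Bits(0.1 + 0.2).fraction); // ...110100
    console.log(float64Bits(0.3).fraction);       // ...110011 (one bit lower)

    console.log((0.1 + 0.2).toFixed(20)); // 0.30000000000000004441
    console.log((0.3).toFixed(20));       // 0.29999999999999998890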