
I'm running the following code, and I expected both results to be the same. However, they are not.

money = 100
count = 1
while count < 8:
    money = money * 1.1
    count += 1
    print(money)
print("final result for money =", money)
print(100 * 1.1 ** 7)

Here is the output:

110.00000000000001
121.00000000000003
133.10000000000005
146.41000000000008
161.0510000000001
177.15610000000012
194.87171000000015
final result for money = 194.87171000000015
194.87171000000012

Can you please help me understand the difference? In case it helps, I'm running the code in a Python emulator, on datacamp.com.
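For reference, the gap is reproducible outside the emulator too. The sketch below (plain CPython; the `fractions` module is used only to get an exact rational reference value) shows the two rounding paths side by side: the loop rounds to the nearest double after each of the seven multiplications, while `1.1 ** 7` is rounded once by the power operation and once more by the final multiplication, so the accumulated error differs.

```python
from fractions import Fraction

# Seven separate multiplications: each step rounds to the nearest double.
money = 100
for _ in range(7):
    money = money * 1.1

# One rounding in pow, one in the multiplication by 100.
power = 100 * 1.1 ** 7

# Exact rational reference. Note: float 1.1 is itself the binary double
# closest to 1.1, not 1.1 exactly, so we raise that same value exactly.
exact = Fraction(100) * Fraction(1.1) ** 7

print(money)         # 194.87171000000015
print(power)         # 194.87171000000012
print(float(exact))  # correctly rounded value of the exact product
```

Neither result is "wrong"; they are two slightly different sequences of correctly rounded binary operations, so their last few bits disagree.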

Sebastian
  • Hi @Sebastian, roughly a third of `precision`-tagged questions come down to this same issue: a misunderstanding of how floating point works. If you are dealing with money, you may want to use Python's [decimal](https://stackoverflow.com/questions/20354423/clarification-on-the-decimal-type-in-python) type. – Arc Jan 11 '22 at 05:04
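Following that comment's suggestion, here is a minimal sketch of the `decimal` approach. `Decimal("1.1")` stores 1.1 exactly, so the repeated multiplication stays exact (each step just appends one decimal digit) and both computations agree:

```python
from decimal import Decimal

# Exact decimal arithmetic: no binary rounding of 1.1.
money = Decimal("100")
for _ in range(7):
    money *= Decimal("1.1")

print(money)                                 # 194.8717100
print(Decimal("100") * Decimal("1.1") ** 7)  # same value
```

Construct `Decimal` from a *string*, not from the float `1.1`; `Decimal(1.1)` would faithfully copy the binary rounding error into the decimal value.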

0 Answers