
I am a Python beginner. I'm trying to write a simple but neat piece of code for the following problem: run 10,000 experiments of 100 coin flips each and work out how often a streak of six heads or tails in a row turns up.

I've written the code below, but somehow it always evaluates to roughly 325%. What might be wrong?

import random 
numberOfStreaks = 0 
flip = []
for experimentNumber in range(10000):
# Code that creates a list of 100 'heads' or 'tails' values.

    for i in range(101):    
        flip.append(random.randint(0,1))

# Code that checks if there is a streak of 6 heads or tails in a row.
    for i in flip:
        if flip[i] == flip[i+1]:
            if flip[i] == flip[i+2]:
                if flip[i] == flip[i+3]:
                    if flip[i] == flip[i+4]:
                        if flip[i] == flip[i+5]:
                            numberOfStreaks += 1


    flip = []

print('Chance of streak: %s%%' % (numberOfStreaks / 100))

1 Answer


This is happening because you are dividing numberOfStreaks by 100 rather than by the size of the total population.

If you want the proportion of successes in your sample, you have to divide the number of successes by the population size, which here is the number of experiments you ran.

In other words...

numberOfExperiments = 10000
print('Chance of streak: {}'.format(numberOfStreaks / numberOfExperiments))
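
For reference, here is a minimal runnable sketch of the whole experiment with that division in place. It assumes the goal is the percentage of the 10,000 experiments that contain at least one streak of six identical flips; the index-based streak check and the count-each-experiment-once rule are my own choices, not something taken from your original code.

import random

numberOfExperiments = 10000   # assumed population size, matching your outer loop
numberOfStreaks = 0           # experiments containing at least one streak of six

for experimentNumber in range(numberOfExperiments):
    # 100 coin flips: 0 for tails, 1 for heads
    flips = [random.randint(0, 1) for _ in range(100)]

    # Scan by index and stop 5 short of the end so flips[i + 5] stays in range.
    for i in range(len(flips) - 5):
        if flips[i:i + 6] == [flips[i]] * 6:
            numberOfStreaks += 1
            break    # count each experiment at most once

# Divide successes by the number of experiments, then scale to a percentage.
print('Chance of streak: {:.2f}%'.format(100 * numberOfStreaks / numberOfExperiments))

Counting each experiment at most once keeps numberOfStreaks no larger than numberOfExperiments, so the printed value always lands between 0% and 100%.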