Sorry for the bad title, I wasn't sure how to ask this specific question.
So for an (extra credit) homework assignment, I wrote a Python program for my differential equations class that should model the following problem:
A group of immigrants is crossing borders. At each border crossing, each immigrant flips a coin to decide whether to pass through or stay in the country they are in. How many are left after the Nth border?
I modelled it like this: $$N' + 0.5N = 0, \quad N(0) = 1000$$ Solving the linear ODE: $$N = 1000e^{-0.5x}$$
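Just to sanity-check that solution (this isn't part of the assignment, I only ran it to convince myself), sympy's `dsolve` with the initial condition gives the same thing:

```python
import sympy as sp

x = sp.symbols('x')
N = sp.Function('N')

# N' + 0.5*N = 0 with N(0) = 1000
sol = sp.dsolve(sp.Eq(N(x).diff(x) + sp.Rational(1, 2) * N(x), 0),
                N(x), ics={N(0): 1000})
print(sol)  # N(x) = 1000*exp(-x/2)
```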
So that should be the equation of the curve, right? We were supposed to model it by flipping 50 coins, but I said nope to that and instead wrote ~90 lines of Python that go something like this:
- start with 1000 immigrants
- at each border, subtract a binomial random number of them (for example, `1000 - np.random.binomial(1000, 0.5)` ≈ 500)
- repeat the whole simulation 10 times
- average the y-values and take the natural log
- do a linear regression on the logged y-values
- print out the fitted equation and the chart
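The steps above, condensed into a minimal sketch (not my actual code, which is at the repo link below; the 8 borders and the fixed seed here are just choices I made so it runs standalone):

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

START = 1000   # initial population
N_BORDERS = 8  # number of border crossings (assumed here)
N_TRIALS = 10  # repeat the simulation 10 times

runs = []
for _ in range(N_TRIALS):
    n = START
    counts = [n]
    for _ in range(N_BORDERS):
        # each remaining immigrant flips a fair coin;
        # the binomial draw is how many leave at this border
        n -= rng.binomial(n, 0.5)
        counts.append(n)
    runs.append(counts)

# average the runs, then fit log(N) vs x with a straight line:
# log(y) = log(a) + b*x  corresponds to  y = a*e^(b*x)
avg = np.mean(runs, axis=0)
x = np.arange(N_BORDERS + 1)
b, log_a = np.polyfit(x, np.log(avg), 1)
print(f"fit: y = {np.exp(log_a):.0f} * e^({b:.2f} x)")
```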
The resulting chart looks something like this: [chart of the data, the fit line, and the expected line]
This result is unexpected. The grey lines are the data, the red line is the line of best fit (approximately $y = 1050e^{-0.7x}$), and the blue line is the expected curve ($y = 1000e^{-0.5x}$). I have tried my linear regression step a number of different ways (including scipy's exponential fit feature), but I always get a $b$ value (where $y = ae^{bx}$) of about $-0.68$ to $-0.73$.
What's up with that? Is it my model, or is something wrong with my code?
Edit: the code I'm working with is here, in case someone would like to view it: https://github.com/isademigod/populationproblem