This article is a small project for the Probability Theory course.

Some Thoughts on the St. Petersburg Paradox

First, a simulation:
```python
import random

import matplotlib.pyplot as plt

MaxN = 10000000

def getAward():
    # One game: the pot doubles on each flip, and each flip
    # ends the game with probability 1/2.
    award = 1
    while True:
        award *= 2
        if random.random() <= 0.5:
            break
    return award

sumAward = 0
x_v = []
y_v = []
for i in range(MaxN):
    sumAward += getAward()
    x_v.append(i + 1)
    y_v.append(sumAward / (i + 1))  # running average payoff

plt.xlabel("Count of rounds")
plt.ylabel("Average award")
plt.plot(x_v, y_v)
plt.show()
```
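The running average never settles, because the game's expected payoff is infinite: every term of the expectation contributes exactly 1. As a quick check (a minimal sketch, assuming the same payoff distribution as `getAward` above, i.e. \(2^k\) with probability \(2^{-k}\) for \(k \ge 1\)), the expectation truncated at \(m\) flips is exactly \(m\), which grows without bound:

```python
# Expectation of the payoff, truncated at m flips: each term
# (2^k) * (0.5^k) equals 1, so the sum is exactly m.
def truncated_expectation(m):
    return sum((2 ** k) * (0.5 ** k) for k in range(1, m + 1))

for m in (10, 20, 30):
    print(m, truncated_expectation(m))  # prints 10.0, 20.0, 30.0
```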
```python
from scipy.stats import norm

n = 10000000
m = 30                # truncate games at m = 30 flips
p = pow(0.5, m)       # probability a single game lasts more than m flips
Ey = m / (1 - p)                    # E[Y] of a game, conditioned on ending within m flips
Dy = (pow(2, m + 1) - 2) / (1 - p)  # E[Y^2] of the truncated game (approximates its variance)
x = norm.ppf(1 - p)   # normal quantile for tail probability p
M = x * pow(n * Dy, 0.5) + n * Ey   # CLT upper bound on the total payoff of n games
print("M = ", M)
```
Output: M = 1180628405.2149527
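As a sanity check (my own verification, not part of the original), the closed forms used for `Ey` and `Dy` can be compared against direct summation of the conditional moments over \(k = 1, \dots, m\):

```python
m = 30
p = 0.5 ** m  # probability the game lasts more than m flips

# Conditional moments of the payoff Y = 2^k, k = 1..m,
# given that the game ends within m flips (probability 1 - p).
Ey_sum = sum((2 ** k) * (0.5 ** k) for k in range(1, m + 1)) / (1 - p)
Ey2_sum = sum((4 ** k) * (0.5 ** k) for k in range(1, m + 1)) / (1 - p)

# They agree with the closed forms m/(1-p) and (2^(m+1) - 2)/(1-p).
assert abs(Ey_sum - m / (1 - p)) < 1e-9
assert abs(Ey2_sum - (2 ** (m + 1) - 2) / (1 - p)) < 1e-3
```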
Having solved for \(M\), \(E(X/n \mid X \le M)\) can be approximated as \(\log_2 M\):
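A one-line heuristic for this approximation (an interpretive sketch, not spelled out in the original): truncating a single game's payoff at \(M\) leaves only the terms with \(2^k \le M\), each contributing 1, so

\[
E\left[Y \,\mathbf{1}\{Y \le M\}\right] = \sum_{k=1}^{\lfloor \log_2 M \rfloor} 2^k \cdot 2^{-k} = \lfloor \log_2 M \rfloor \approx \log_2 M,
\]

and the conditional average payoff per game is therefore roughly \(\log_2 M\).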
```python
import math

print(math.log(M) / math.log(2))
```

Output: 30.136907811683777
Now compare the two curves:
```python
y_v_2 = []
for i in range(MaxN):
    n = i + 1
    # recompute the CLT bound M for each number of rounds n
    M = x * pow(n * Dy, 0.5) + n * Ey
    y_v_2.append(math.log(M) / math.log(2))

plt.plot(x_v, y_v, label="real")
plt.plot(x_v, y_v_2, label="estimate", color='red')
plt.legend()
plt.show()
```
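As a side note (a sketch using numpy, which the original code does not use): the number of flips per game is geometric with \(p = 1/2\), so the whole simulation can be vectorized, and the running average becomes a cumulative sum:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100000  # fewer rounds than MaxN, for speed

# Number of flips per game ~ Geometric(1/2); payoff is 2^k, as in getAward().
k = rng.geometric(0.5, size=n)
payoffs = 2.0 ** k

# Running average payoff after each round.
running_avg = np.cumsum(payoffs) / np.arange(1, n + 1)
print(running_avg[-1])
```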
The estimate turns out to be quite accurate.
From: https://blog.51cto.com/u_16105286/7987842