
Poisson Process


1. Counting process

​ We say that \(\{N(t),t\ge0\}\) is a counting process if \(N(t)\) represents the total number of "events" that occur by time \(t\) and satisfies the following:

  1. \(N(t)\ge0\)
  2. \(N(t)\) is integer valued
  3. \(N(s)\le N(t)\) if \(s\le t\)
  4. for \(s<t\), \(N(t)-N(s)\) equals the number of events that occur in the interval \((s,t]\).

​ A counting process is said to possess independent increments if the numbers of events that occur in disjoint time intervals are independent. It is said to possess stationary increments if the distribution of the number of events that occur in any interval depends only on the length of the interval.

2. Poisson process

1st definition

​ The counting process \(\{N(t), t\ge0\}\) is said to be a Poisson process having rate \(\lambda,\lambda>0\), if

  1. \(N(0)=0\)

  2. The process has independent increments

  3. The number of events in any interval of length \(t\) is Poisson distributed with mean \(\lambda t\); that is, for any \(s, t\ge0\) (a numerical check of this formula is sketched after the definition),

    \[P\left\{N(s+t)-N(s)=n\right\} = \displaystyle{\frac{(\lambda t)^ne^{-\lambda t}}{n!}},\quad n=0,1,... \]
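
As a quick numerical check of condition 3 (a minimal sketch, assuming the hypothetical values \(\lambda=2\) and \(t=1.5\)), the probabilities \(P\{N(s+t)-N(s)=n\}\) should sum to one and have mean \(\lambda t\):

```python
import math

# Hypothetical parameters for illustration: rate lambda = 2.0, interval length t = 1.5.
lam, t = 2.0, 1.5
mean = lam * t

def poisson_pmf(n, mean):
    """P{N(s+t) - N(s) = n} = mean^n * e^{-mean} / n!"""
    return mean**n * math.exp(-mean) / math.factorial(n)

probs = [poisson_pmf(n, mean) for n in range(50)]
print(sum(probs))                               # ~1.0: the probabilities sum to one
print(sum(n * p for n, p in enumerate(probs)))  # ~3.0 = lambda * t, the stated mean
```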

2nd definition

​ The counting process \(\{N(t),t\ge0\}\) is said to be a Poisson process having rate \(\lambda,\lambda >0\) if

  1. \(N(0)=0\)
  2. The process has stationary and independent increments
  3. \(P\{N(h)=1\}=\lambda h+o(h)\)
  4. \(P\{N(h)\ge2\}=o(h)\)
Proof of the equivalence of the above two definitions

​ First we prove that the 2nd definition implies the 1st. To start, fix \(u\ge0\) and let

\[g(t) = E[\exp\{-uN(t)\}] \]

​ We derive a differential equation for \(g(t)\) as follows:

\[\begin{align} g(t+h) &= E\left[\exp\{-uN(t+h)\}\right]\\ &=E[\exp\{-u(N(t+h)-N(t))\}\cdot \exp\{-uN(t)\}]\\ &=g(t)E[\exp\{-u(N(t+h)-N(t))\}]\\ &=g(t)E[\exp\{-uN(h)\}] \end{align} \]

where the third equality follows from independent increments and the last from stationary increments. By conditions 3 and 4 of the 2nd definition,

\[E[\exp\{-uN(h)\}]=P\{N(h)=0\}+e^{-u}P\{N(h)=1\}+o(h)=1-\lambda h+e^{-u}\lambda h+o(h)\\ g(t+h)=g(t)\left(1-\lambda h+e^{-u}\lambda h+o(h)\right) \]

which implies that

\[\frac{g(t+h)-g(t)}{h} = \lambda(e^{-u}-1)g(t)+\frac{o(h)}{h}\\ \lim_{h\to0}\frac{g(t+h)-g(t)}{h}=\lambda(e^{-u}-1)g(t)\\ g'(t) = \lambda(e^{-u}-1)g(t) \]

Solving this differential equation, we get

\[g(t) = C\exp\{\lambda(e^{-u}-1)t\} \]

Since \(g(0)=E[\exp\{-uN(0)\}]=1\), the constant is \(C=1\). That is, the Laplace transform of \(N(t)\) evaluated at \(u\) is \(e^{\lambda t(e^{-u}-1)}\). Since that is also the Laplace transform of a Poisson random variable with mean \(\lambda t\), the result follows from the fact that the distribution of a nonnegative random variable is uniquely determined by its Laplace transform. Conversely, if \(\{N(t),t\ge0\}\) satisfies the 1st definition, then \(P\{N(h)=1\}=\lambda he^{-\lambda h}=\lambda h+o(h)\) and \(P\{N(h)\ge2\}=1-e^{-\lambda h}-\lambda he^{-\lambda h}=o(h)\), so the conditions of the 2nd definition hold as well.
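
The key identity used above can also be checked numerically: for a Poisson random variable with mean \(\lambda t\), the Laplace transform \(E[e^{-uN(t)}]\) should equal \(e^{\lambda t(e^{-u}-1)}\). A minimal sketch, with hypothetical values for \(\lambda\), \(t\), and \(u\):

```python
import math

# Numerically compare E[e^{-u N(t)}] for N(t) ~ Poisson(lambda * t) with
# exp(lambda * t * (e^{-u} - 1)).  lambda, t, u are hypothetical values.
lam, t, u = 1.3, 2.0, 0.7
mean = lam * t

lhs = sum(math.exp(-u * n) * mean**n * math.exp(-mean) / math.factorial(n)
          for n in range(80))
rhs = math.exp(lam * t * (math.exp(-u) - 1.0))
print(lhs, rhs)   # the two values agree to floating-point accuracy
```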

3. Interarrival and Waiting Time Distributions

​ Denote the time of the first event by \(T_1\), and for \(n>1\) denote the elapsed time between the \((n-1)\)st and the \(n\)th event by \(T_n\).

​ Now we determine the distribution of the \(T_n\). Starting with \(T_1\), note that

\[\begin{align} P(T_1\le t) &= 1-P(T_1>t)\\ &= 1-P\{N(t)=0\} \end{align} \]

The event \(\{T_1>t\}\) takes place if and only if no events of the Poisson process occur in the interval \([0,t]\) and thus,

\[P(T_1>t) = P(N(t) =0)=e^{-\lambda t} \]

Hence, \(T_1\) has an exponential distribution with mean \(1/\lambda\). Now,

\[P(T_2>t) = E[P(T_2>t\ |\ T_1)] \]

However,

\[\begin{align} P(T_2>t\ |\ T_1=s) &= P\{N(s+t)-N(s)=0\ |\ T_1=s\}\\ &=P\{N(s+t)-N(s)=0\}\\ &=e^{-\lambda t} \end{align} \]

where the last two equations followed from independent and stationary increments. Therefore, we conclude that \(T_2\) is also an exponential random variable with mean \(1/\lambda\) and, furthermore, that \(T_2\) is independent of \(T_1\). Repeating the same argument yields the following.

Proposition 3.1 \(T_n,\ n=1,2,...\), are independent identically distributed exponential random variables having mean \(1/\lambda\).

​ The arrival time of the \(n\)th event, denoted by \(S_n\), is also called the waiting time until the \(n\)th event:

\[S_n = \sum_{i=1}^nT_i,\quad n\ge1 \]

\(S_n\) has a gamma distribution with parameters \(n\) and \(\lambda\); its probability density is given by

\[f_{S_n}(t)=\lambda e^{-\lambda t}\frac{(\lambda t)^{n-1}}{(n-1)!},\quad t\ge0 \]
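
A minimal simulation sketch of Proposition 3.1 and the gamma distribution of \(S_n\), assuming the hypothetical values \(\lambda=2\) and \(n=5\): the sample mean and variance of \(S_n\) should be close to \(n/\lambda\) and \(n/\lambda^2\).

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n = 2.0, 5            # hypothetical rate and event index
reps = 200_000

# Proposition 3.1: the T_i are i.i.d. Exponential(lambda), so
# S_n = T_1 + ... + T_n should be Gamma(n, lambda).
T = rng.exponential(scale=1.0 / lam, size=(reps, n))
S_n = T.sum(axis=1)

print(T[:, 0].mean(), 1.0 / lam)   # ~0.50: mean of T_1
print(S_n.mean(), n / lam)         # ~2.50: gamma mean n/lambda
print(S_n.var(), n / lam**2)       # ~1.25: gamma variance n/lambda^2
```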

4. Further Properties of Poisson Process

​ Suppose that each event occurring in a Poisson process having rate \(\lambda\) is classified as either type I or type II. Suppose further that each event is classified as a type I event with probability \(p\) and as a type II event with probability \(1-p\), independently of all other events.

​ Let \(N_1(t)\) and \(N_2(t)\) denote respectively the number of type I and type II events occurring in \([0,t]\).

Proposition 4.1 \(\{N_1(t),t\ge0\}\) and \(\{N_2(t),t\ge0\}\) are both Poisson processes having respective rates \(\lambda p\) and \(\lambda(1-p)\). Furthermore, the two processes are independent.

​ Now we verify that \(\{N_1(t),t\ge0\}\) is a Poisson process:

  • \[\begin{align} P\{N_1(h)=1\} &= P\{N_1(h)=1\ |\ N(h)=1 \}P\{N(h)=1\}\\ &+P\{N_1(h)=1\ |\ N(h)\ge 2 \}P\{N(h)\ge2\}\\ &= p\lambda h+o(h) \end{align} \]

  • \(P\{N_1(h)\ge2\}\le P\{N(h)\ge2\}=o(h)\)

Thus, since \(N_1(0)=0\) and \(\{N_1(t),t\ge0\}\) inherits stationary and independent increments from \(\{N(t),t\ge0\}\) (the classifications being independent of everything else), \(\{N_1(t),t\ge0\}\) satisfies the 2nd definition of a Poisson process with rate \(\lambda p\).
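
A simulation sketch of Proposition 4.1, assuming the hypothetical values \(\lambda=3\), \(p=0.4\), \(t=10\). It exploits the fact that, given \(N(t)\), the number of type I events is binomial with parameters \(N(t)\) and \(p\); the sample means of \(N_1(t)\) and \(N_2(t)\) should be close to \(\lambda pt\) and \(\lambda(1-p)t\), and their sample covariance close to zero.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, p, t = 3.0, 0.4, 10.0   # hypothetical rate, type I probability, horizon
reps = 100_000

# Simulate N(t), then classify each event as type I with probability p,
# independently of everything else.
N = rng.poisson(lam * t, size=reps)
N1 = rng.binomial(N, p)      # number of type I events
N2 = N - N1                  # number of type II events

print(N1.mean(), lam * p * t)          # ~12.0: Poisson mean of N_1(t)
print(N2.mean(), lam * (1 - p) * t)    # ~18.0: Poisson mean of N_2(t)
print(np.cov(N1, N2)[0, 1])            # ~0: consistent with independence
```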


The Coupon Collecting Problem

Problem There are \(m\) different types of coupons. Each time a person collects a coupon it is, independently of those previously obtained, a type \(j\) coupon with probability \(p_j\), \(\sum_{j=1}^mp_j=1\). Let \(N\) denote the number of coupons one needs to collect in order to have a complete collection of at least one of each type. Find \(E[N]\).

Solution Let \(N_j\) be the number of coupons we must collect to obtain a type \(j\) coupon; then we can express \(N\) as

\[N = \max_{1\le j\le m}N_j \]

Now suppose that coupons are collected at the times of a Poisson process with rate \(\lambda=1\), and let \(X_j\) denote the time at which the first type \(j\) coupon is obtained, so that \(X=\max_{1\le j\le m}X_j\) is the time at which a complete collection is amassed. By the thinning argument of Section 4, the type \(j\) coupons arrive according to independent Poisson processes with respective rates \(p_j\); hence the \(X_j\) are independent and exponentially distributed with parameters \(p_j\), and

\[P(X<t) = \prod_{j=1}^m(1-e^{-p_jt}) \]

Therefore,

\[\begin{align} E[X] &= \int_0^\infin P(X>t)\ \mathrm dt\\ &=\int_0^\infin\left\{1-\prod_{j=1}^m(1-e^{-p_jt}) \right\}\ \mathrm dt \end{align} \]

Let \(T_i\) denote the \(i\)th interarrival time of the rate-1 Poisson process that counts the number of coupons obtained. Since the coupon types, and hence \(N\), are independent of these interarrival times and \(E[T_i]=1/\lambda=1\),

\[X=\sum_{i=1}^NT_i\\ E[X|N] = NE[T_i] = N\\ \]

Therefore

\[E[X] = E[N] \]
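
A sketch that evaluates \(E[N]=E[X]=\int_0^\infin\left\{1-\prod_j(1-e^{-p_jt})\right\}\mathrm dt\) by a simple Riemann sum and compares it with a direct simulation of the coupon collecting problem, assuming the hypothetical probabilities \(p=(0.5,0.3,0.2)\):

```python
import numpy as np

rng = np.random.default_rng(2)
p = np.array([0.5, 0.3, 0.2])   # hypothetical type probabilities, summing to 1

# E[N] = E[X] = int_0^inf (1 - prod_j (1 - e^{-p_j t})) dt,
# approximated by a Riemann sum on a truncated grid.
grid = np.linspace(0.0, 200.0, 400_001)
integrand = 1.0 - np.prod(1.0 - np.exp(-np.outer(grid, p)), axis=1)
e_n_formula = integrand.sum() * (grid[1] - grid[0])

# Direct simulation of the coupon collecting problem for comparison.
def collect(rng, p):
    seen, draws = set(), 0
    while len(seen) < len(p):
        seen.add(int(rng.choice(len(p), p=p)))
        draws += 1
    return draws

e_n_sim = np.mean([collect(rng, p) for _ in range(10_000)])
print(e_n_formula, e_n_sim)   # both ~6.65 for these probabilities
```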


5. Conditional Distribution of the Arrival Times

​ Suppose we are told that exactly one event of a Poisson process has taken place by time \(t\), and we are asked to determine the distribution of the time at which the event occurred. For \(s\le t\)

\[\begin{align} P\{T_1<s|N(t)=1\} &= \frac{P\{T_1<s,N(t)=1\}}{P\{N(t)=1\}}\\ &=\frac{P\{N(s)=1,N(t)-N(s)=0\}}{P\{N(t)=1\}}\\ &=\frac{\lambda se^{-\lambda s}e^{-\lambda (t-s)}}{\lambda te^{-\lambda t}}\\ &=\frac st \end{align} \]

Theorem 5.1 Given that \(N(t)=n\), the \(n\) arrival times \(S_1,S_2,...,S_n\) have the same distribution as the order statistics corresponding to \(n\) independent random variables uniformly distributed on the interval \((0,t)\).

Definition 5.1 We say that \(Y_{(1)},...,Y_{(n)}\) are the order statistics corresponding to the \(n\) random variables \(Y_1,...,Y_n\) if \(Y_{(k)}\) is the \(k\)th smallest value among \(Y_1,...,Y_n\). If the \(Y_i,i=1,...,n\), are independent identically distributed continuous random variables with density function \(f\), the joint density of the order statistics \(Y_{(1)},...,Y_{(n)}\) is given by

\[f(y_1,...,y_n) = n!\prod_{i=1}^nf(y_i), \quad y_1<y_2<\cdots<y_n \]

Proof of Theorem 5.1 Note that the event that \(S_1=s_1,S_2=s_2,...,S_n=s_n,N(t)=n\) is equivalent to the event that the first \(n+1\) interarrival times satisfy \(T_1=s_1,T_2=s_2-s_1,...,T_n=s_n-s_{n-1},T_{n+1}>t-s_n\). By Proposition 3.1, the conditional joint density of \(S_1,...,S_n\) given that \(N(t)=n\) is as follows:

\[\begin{align} f(s_1,...,s_n|n)&=\frac{f(s_1,...,s_n,n)}{P\{N(t)=n\}}\\ &=\frac {\lambda e^{-\lambda s_1}\,\lambda e^{-\lambda(s_2-s_1)}\cdots\lambda e^{-\lambda (s_n-s_{n-1})}\,e^{-\lambda(t-s_n)}} {e^{-\lambda t}(\lambda t)^n/n!}\\ &= \frac{n!}{t^n},\quad 0<s_1<\cdots<s_n<t \end{align} \]
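
A simulation sketch of Theorem 5.1, assuming the hypothetical values \(\lambda=1.5\), \(t=4\), \(n=3\): conditional on \(N(t)=n\), the arrival times should match the order statistics of \(n\) independent Uniform\((0,t)\) draws, whose \(k\)th mean is \(tk/(n+1)\).

```python
import numpy as np

rng = np.random.default_rng(3)
lam, t, n = 1.5, 4.0, 3     # hypothetical rate, horizon, conditioning value
reps = 300_000

# N(t) = n exactly when S_n <= t < S_{n+1}, so n + 1 interarrival times suffice.
gaps = rng.exponential(1.0 / lam, size=(reps, n + 1))
arrivals = np.cumsum(gaps, axis=1)
keep = (arrivals[:, n - 1] <= t) & (arrivals[:, n] > t)
S = arrivals[keep, :n]      # conditional arrival times S_1 < ... < S_n given N(t) = n

# Theorem 5.1: these should look like order statistics of n Uniform(0, t) draws.
U = np.sort(rng.uniform(0.0, t, size=(S.shape[0], n)), axis=1)
print(S.mean(axis=0))       # ~[1.0, 2.0, 3.0] = t * k / (n + 1)
print(U.mean(axis=0))       # ~the same means
```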

Proposition 5.1 Suppose that each event of a Poisson process with rate \(\lambda\) is classified as being one of \(k\) types, and that an event occurring at time \(s\) is, independently of anything that has previously occurred, classified as a type \(i\) event with probability \(P_i(s)\), \(i=1,...,k\), \(\sum_{i=1}^kP_i(s)=1\). If \(N_i(t),i=1,...,k\), represents the number of type \(i\) events occurring by time \(t\), then \(N_i(t), i=1,...,k\), are independent Poisson random variables having means

\[E[N_i(t)]=\lambda\int_0^tP_i(s)\mathrm ds \]

Proof Consider an arbitrary event that occurred in the interval \([0,t]\). If it occurred at time \(s\), then the probability that it is a type \(i\) event is \(P_i(s)\). By Theorem 5.1, given \(N(t)\), the time at which this event occurred is uniformly distributed on \((0,t)\); hence the probability that it is a type \(i\) event is

\[P_i = \frac1t\int_0^tP_i(s)\mathrm ds \]

which is independent of the other events. Hence the joint probability

\[P\left\{N_i(t)=n_i,i=1,...,k\ \left|\ N(t)=\sum_{i=1}^kn_i \right.\right\}=\frac{(\sum_{i=1}^kn_i)!}{n_1!\cdots n_k!}P_1^{n_1}\cdots P_k^{n_k} \]

Consequently,

\[\begin{align} P\{N_i(t)=n_i,i=1,...,k\} &= \frac{(\sum_in_i)!}{n_1!\cdots n_k!}P_1^{n_1}\cdots P_k^{n_k} \left(e^{-\lambda t}\frac{(\lambda t)^{\sum_in_i}}{\left(\sum_in_i\right)!}\right)\\ &= \prod_{i=1}^k e^{-\lambda tP_i}\frac{(\lambda tP_i)^{n_i}}{n_i!} \end{align} \]

Since the joint probability mass function factors into a product of Poisson probabilities, the \(N_i(t), i=1,...,k\), are independent, and each \(N_i(t)\) has probability mass function

\[P\{N_i(t)=n_i\} =e^{-\lambda tP_i}\frac{(\lambda tP_i)^{n_i}}{n_i!} \]

that is, \(N_i(t)\) is Poisson distributed with mean \(\lambda tP_i\). Therefore

\[E[N_i(t)] = \lambda tP_i = \lambda \int_0^tP_i(s)\mathrm ds \]

and the proof is complete.
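
A simulation sketch of Proposition 5.1 with \(k=2\) types, assuming the hypothetical values \(\lambda=2\), \(t=3\) and the hypothetical classification probability \(P_1(s)=e^{-s}\), so that \(E[N_1(t)]=\lambda(1-e^{-t})\). It uses Theorem 5.1 to place the event times uniformly on \((0,t)\).

```python
import numpy as np

rng = np.random.default_rng(4)
lam, t = 2.0, 3.0            # hypothetical rate and horizon
reps = 20_000

def P1(s):
    # Hypothetical probability that an event occurring at time s is of type 1.
    return np.exp(-s)

N1 = np.empty(reps)
for r in range(reps):
    n = rng.poisson(lam * t)
    s = rng.uniform(0.0, t, size=n)           # Theorem 5.1: uniform event times
    N1[r] = np.count_nonzero(rng.random(n) < P1(s))

print(N1.mean(), lam * (1.0 - np.exp(-t)))    # both ~1.90 = lambda * int_0^t P1(s) ds
print(N1.var())                               # ~the same value (Poisson: variance = mean)
```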

Proposition 5.2 Given that \(S_n=t\), the set \(S_1,...,S_{n-1}\) has the distribution of a set of \(n-1\) independent uniform \((0,t)\) random variables.

6. Generalizations of the Poisson Process

6.1 Nonhomogeneous Poisson Process

Definition 6.1 The counting process \(\{N(t),t\ge0\}\) is said to be a nonhomogeneous Poisson process with intensity function \(\lambda(t),t\ge0\), if the following hold (a simulation sketch is given after the definition):

  1. \(N(0)=0\)
  2. \(\{N(t),t\ge0\}\) has independent increments
  3. \(P\{N(t+h)-N(t)\ge2\}=o(h)\)
  4. \(P\{N(t+h)-N(t)=1\}=\lambda(t)h+o(h)\)
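
A minimal simulation sketch of Definition 6.1. The thinning method used here is not described above: generate a homogeneous Poisson process with rate \(\bar\lambda\ge\lambda(s)\) and keep an event at time \(s\) with probability \(\lambda(s)/\bar\lambda\). The intensity \(\lambda(s)=1+\sin^2 s\) and the horizon are hypothetical choices; the count \(N(t)\) should have mean (and variance) \(m(t)=\int_0^t\lambda(s)\,\mathrm ds\).

```python
import numpy as np

rng = np.random.default_rng(5)
t_max = 5.0

def lam_fn(s):
    return 1.0 + np.sin(s) ** 2   # hypothetical intensity function, bounded by 2

lam_bar = 2.0                     # upper bound on lam_fn over [0, t_max]

def simulate_nhpp(rng):
    """One path on [0, t_max] by thinning: generate a homogeneous rate-lam_bar
    process and keep an event at time s with probability lam_fn(s) / lam_bar."""
    times, s = [], 0.0
    while True:
        s += rng.exponential(1.0 / lam_bar)
        if s > t_max:
            return times
        if rng.random() < lam_fn(s) / lam_bar:
            times.append(s)

counts = np.array([len(simulate_nhpp(rng)) for _ in range(20_000)])
m = t_max + (t_max - np.sin(t_max) * np.cos(t_max)) / 2.0   # int_0^t_max lam_fn(s) ds
print(counts.mean(), m)   # both ~7.64: N(t_max) is Poisson with mean m(t_max)
print(counts.var())       # ~the same value, as expected for a Poisson count
```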

Proposition 6.1 Let \(\{N(t),t\ge0\}\) and \(\{M(t),t\ge0\}\), be independent nonhomogeneous Poisson processes, with respective intensity functions \(\lambda(t)\) and \(\mu(t)\), and let \(N^*(t)=N(t)+M(t)\). Then, the following are true.

  1. \(\{N^*(t),t\ge0\}\) is a nonhomogeneous Poisson process with intensity function \(\lambda(t)+\mu(t)\).
  2. Given that an event of the \(\{N^*(t),t\ge0\}\) process occurs at time \(t\), then, independently of what occurred prior to \(t\), the event came from the \(\{N(t)\}\) process with probability \(\displaystyle{\frac{\lambda(t)}{\lambda(t)+\mu(t)}}\).

6.2 Compound Poisson Process

Definition 6.2 A stochastic process \(\{X(t),t\ge0\}\) is said to be a compound Poisson process if it can be expressed as

\[X(t)=\sum_{i=1}^{N(t)}Y_i,\quad t\ge0 \]

where \(\{N(t),t\ge0\}\) is a Poisson process, and \(\{Y_i,i\ge1\}\) is a family of independent and identically distributed random variables that is also independent of \(\{N(t),t\ge0\}\).

​ We have

\[\begin{align} E[X(t)] &= E\left[\sum_{i=1}^{N(t)}Y_i\right]\\ &= \sum_{n=1}^\infin E\left[\sum_{i=1}^nY_i\left|N(t)=n\right.\right]P\{N(t)=n\}\\ &=\sum_{n=1}^\infin \left(E\left[\sum_{i=1}^nY_i\right]P\{N(t)=n\}\right)\\ &= \sum_{n=1}^\infin\left(nE[Y]P\{N(t)=n\}\right) \\ &= E[Y]\sum_{n=1}^\infin\left(nP\{N(t)=n\}\right) \\ &=\lambda tE[Y] \end{align} \]

Similarly,

\[\mathrm{Var}(X(t)) = \lambda tE[Y^2] \]
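
A simulation sketch of these two formulas, assuming the hypothetical values \(\lambda=2\), \(t=4\) and jumps \(Y_i\sim\) Uniform\((0,3)\), so that \(E[Y]=1.5\) and \(E[Y^2]=3\):

```python
import numpy as np

rng = np.random.default_rng(6)
lam, t = 2.0, 4.0            # hypothetical rate and horizon
reps = 50_000

# Hypothetical jump distribution: Y_i ~ Uniform(0, 3), so E[Y] = 1.5, E[Y^2] = 3.
N = rng.poisson(lam * t, size=reps)
X = np.array([rng.uniform(0.0, 3.0, size=n).sum() for n in N])

print(X.mean(), lam * t * 1.5)   # ~12.0 = lambda * t * E[Y]
print(X.var(), lam * t * 3.0)    # ~24.0 = lambda * t * E[Y^2]
```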

6.3 Conditional or Mixed Poisson Processes

​ Let \(\{N(t),t\ge0\}\) be a counting process whose probabilities are defined as follows. There is a positive random variable \(L\) such that, conditional on \(L=\lambda\), the counting process is a Poisson process with rate \(\lambda\). Such a counting process is called a conditional or a mixed Poisson process.

​ Suppose that \(L\) is continuous with density function \(g\). Then

\[\begin{align} P\{N(t+s)-N(s)=n\} &= \int_0^\infin P\left\{N(t+s)-N(s)=n\ |\ L=\lambda \right\}g(\lambda)\ \mathrm d\lambda\\ &=\int_0^\infin e^{-\lambda t}\frac{(\lambda t)^n}{n!}g(\lambda)\ \mathrm d\lambda \end{align} \]

we see that a conditional Poisson process has stationary increments. However, since knowing how many events occur in an interval gives information about the possible value of \(L\), which in turn affects the distribution of the number of events in any other interval, a conditional Poisson process does not generally have independent increments. Consequently, a conditional Poisson process is not generally a Poisson process.
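
A simulation sketch of this last point, assuming the hypothetical mixing distribution \(L\sim\) Gamma(shape \(2\), scale \(1\)): counts in the disjoint intervals \((0,1]\) and \((1,2]\) have the same marginal distribution (stationary increments) but are positively correlated, so the increments are not independent.

```python
import numpy as np

rng = np.random.default_rng(7)
reps = 200_000

# Hypothetical mixing distribution: L ~ Gamma(shape=2, scale=1), so E[L] = 2, Var(L) = 2.
L = rng.gamma(shape=2.0, scale=1.0, size=reps)

# Conditional on L, the counts in the disjoint unit intervals (0,1] and (1,2]
# are independent Poisson(L) random variables.
A = rng.poisson(L)   # N(1) - N(0)
B = rng.poisson(L)   # N(2) - N(1)

print(A.mean(), B.mean())       # equal marginal means: stationary increments
print(np.corrcoef(A, B)[0, 1])  # ~0.5 = Var(L)/(Var(L)+E[L]) > 0: not independent
```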

From: https://www.cnblogs.com/kaleidopink/p/17832119.html
