Markov's Inequality
If \(X\) is a nonnegative random variable and \(a>0\), then the probability that \(X\) is at least \(a\) is at most the expectation of \(X\) divided by \(a\):
\[\mathbf{P}(X\ge a)\le\frac{\mathbf{E}(X)}{a} \]
In the language of measure theory, Markov's inequality states that if \((X,\Sigma,\mu)\) is a measure space, \(f\) is a measurable extended real-valued function, and \(\varepsilon>0\), then
\[\mu\left(\left\{x \in X:|f(x)| \geq \varepsilon\right\}\right) \leq \frac{1}{\varepsilon} \int_{X}|f| \, d \mu \]
This measure-theoretic form is itself sometimes referred to as Chebyshev's inequality.
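As a quick sanity check, the bound can be compared against an exact tail probability. A minimal sketch in Python, using an exponential distribution with mean 1 (a distribution chosen purely for illustration):

```python
import math
import random

# X ~ Exponential(rate=1), so E[X] = 1 and P(X >= a) = exp(-a).
a = 2.0
markov_bound = 1.0 / a          # E[X] / a
exact_tail = math.exp(-a)       # true P(X >= a), about 0.1353

# The Markov bound (0.5) must dominate the exact tail (about 0.1353).
print(exact_tail, markov_bound)

# Empirical check with a Monte Carlo sample.
random.seed(0)
n = 100_000
sample = [random.expovariate(1.0) for _ in range(n)]
empirical_tail = sum(x >= a for x in sample) / n
print(empirical_tail <= markov_bound)
```

Markov's bound is typically loose, as here: it guarantees at most 0.5, while the true tail is about 0.135. Its value lies in requiring nothing about \(X\) beyond nonnegativity and a finite mean.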
Chebyshev's Inequality
Let \(X\) be any random variable. If you define \(Y=(X-EX)^2\), then \(Y\) is a nonnegative random variable, so we can apply Markov's inequality to \(Y\). In particular, for any positive real number \(b\), we have
\[P(Y\ge b^2)\le \frac{EY}{b^2} \]
Note that
\[\begin{array}{l} E Y=E(X-E X)^{2}=\operatorname{Var}(X) \\ P\left(Y \geq b^{2}\right)=P\left((X-E X)^{2} \geq b^{2}\right)=P(|X-E X| \geq b) \end{array}\]
Thus, we conclude that
\[P(|X-E X| \geq b) \leq \frac{\operatorname{Var}(X)}{b^{2}} \]
Derivation of Markov's Inequality
The derivation below assumes \(X\) is a continuous nonnegative random variable with density \(f_X\) (the discrete case is analogous); the second equality holds because \(f_X(x)=0\) for \(x<0\).
\[\begin{aligned} E X & =\int_{-\infty}^{\infty} x f_{X}(x) d x \\ & =\int_{0}^{\infty} x f_{X}(x) d x \\ & \geq \int_{a}^{\infty} x f_{X}(x) d x \\ & \geq \int_{a}^{\infty} a f_{X}(x) d x \\ & =a \int_{a}^{\infty} f_{X}(x) d x \\ & =a P(X \geq a) \end{aligned}\]
Dividing both sides by \(a\) yields Markov's inequality.
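As with Markov's inequality, the Chebyshev bound can be checked numerically. A minimal sketch in Python, using a normal distribution whose mean and standard deviation are chosen purely for illustration:

```python
import random

# X ~ Normal(mu=5, sigma=2), so Var(X) = 4.
mu, sigma = 5.0, 2.0
b = 3.0  # deviation threshold
chebyshev_bound = sigma**2 / b**2  # Var(X) / b^2 = 4/9

random.seed(0)
n = 100_000
sample = [random.gauss(mu, sigma) for _ in range(n)]
empirical = sum(abs(x - mu) >= b for x in sample) / n

# For a normal distribution the true tail P(|X - mu| >= 3) is about 0.134,
# comfortably below the Chebyshev bound 4/9 (about 0.444).
print(empirical, chebyshev_bound)
```

Chebyshev's inequality uses only the variance, so it applies to any distribution with \(\operatorname{Var}(X)<\infty\); for a specific distribution such as the normal, the true tail is usually much smaller than the bound.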