# Lewko's blog

## An Exact Asymptotic for the Square Variation of Partial Sum Processes

Posted in math.PR, Paper by Mark Lewko on June 6, 2011

Allison and I just arxiv’ed our paper An Exact Asymptotic for the Square Variation of Partial Sum Processes.

Let $\{X_{i}\}$ be a sequence of independent, identically distributed random variables with finite mean $\mu$. The strong law of large numbers asserts that

$\sum_{i=1}^{N}X_{i} \sim N\mu$

almost surely. Without loss of generality, one can assume that the $X_{i}$ are mean-zero by replacing $X_{i}$ with $Y_{i}=X_{i}-\mu$. If we further assume a finite variance, that is $\mathbb{E}\left[|X_{i}|^2 \right] = \sigma^2 < \infty$, the Hartman-Wintner law of the iterated logarithm gives an exact error estimate for the strong law of large numbers. More precisely,

$\left|\sum_{i=1}^{N} X_{i} \right|^2\leq (2+o(1))\sigma^2 N \ln\ln (N)$

where the constant $2$ cannot be replaced by a smaller constant. That is, for any $\epsilon > 0$ the quantity $\sum_{i=1}^{N}X_{i}$ gets as large/small as $\pm \sqrt{ (2-\epsilon) \sigma^2 N \ln\ln (N)}$ infinitely often. The purpose of our current work is to prove a more delicate variational asymptotic that refines the law of the iterated logarithm and captures more subtle information about the oscillations of a sum of i.i.d. random variables about its expected value. More precisely,

Theorem Let $\{X_{i}\}$ be a sequence of independent, identically distributed mean-zero random variables with variance $\sigma^2$ and satisfying $\mathbb{E}\left[|X_{i}|^{2+\delta}\right] < \infty$ for some $\delta > 0$. If we let $\mathcal{P}_{N}$ denote the set of all possible partitions of the interval $[N] = \{1, 2, \ldots, N\}$ into subintervals, then we have almost surely:

$\max_{\pi \in \mathcal{P}_{N}} \sum_{I \in \pi } | \sum_{i\in I} X_{i}|^2 \sim 2 \sigma^2N \ln \ln(N)$.

Choosing the partition $\pi$ to contain the single interval $J=[1,N]$ immediately recovers the upper bound in the law of the iterated logarithm. This result also strengthens earlier work of J. Qian.
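For a finite sample, the maximum over interval partitions appearing on the left-hand side of the theorem can be computed exactly by a simple $O(N^2)$ dynamic program over right endpoints. Here is a minimal sketch (the function name `square_variation` is ours, for illustration; it is not from the paper):

```python
# Compute max over partitions of [1, N] into subintervals of
# sum over intervals I of (sum_{i in I} X_i)^2, via the recursion
#   best[j] = max_{0 <= k < j} best[k] + (S[j] - S[k])^2,
# where S is the sequence of prefix sums and best[0] = 0.

def square_variation(xs):
    n = len(xs)
    # prefix sums: S[j] = x_1 + ... + x_j, with S[0] = 0
    S = [0.0] * (n + 1)
    for i, x in enumerate(xs, start=1):
        S[i] = S[i - 1] + x
    best = [0.0] * (n + 1)
    for j in range(1, n + 1):
        # the last interval of an optimal partition of [1, j] is (k, j]
        best[j] = max(best[k] + (S[j] - S[k]) ** 2 for k in range(j))
    return best[n]

# Example: for (1, -2, 3) the optimal partition is the three singletons,
# giving 1 + 4 + 9 = 14.
print(square_variation([1, -2, 3]))  # -> 14.0
```

Note that taking $\pi$ to be all singletons or the single interval $[1,N]$ gives the lower bounds $\sum_i X_i^2$ and $(\sum_i X_i)^2$ respectively, which the dynamic program always matches or beats.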

An interesting problem left open by this work is deciding whether the moment condition $\mathbb{E}\left[|X_{i}|^{2+\delta}\right] < \infty$ can be removed. Without an auxiliary moment condition we are able to establish the following weaker 'in probability' result.

Theorem Let $\{X_i\}$ be a sequence of independent, identically distributed mean-zero random variables with finite variance $\sigma^2$. We then have that

$\frac{\max_{\pi \in \mathcal{P}_{N}} \sum_{I \in \pi } | \sum_{i\in I} X_{i}|^2}{2 \sigma^2 N \ln \ln(N)} \xrightarrow{p} 1$ as $N \to \infty$.
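As a quick sanity check (ours, not from the paper), one can simulate Rademacher signs, compute the maximal square variation with the dynamic program described above, and compare it to the normalization $2 \sigma^2 N \ln\ln(N)$. With $\sigma^2 = 1$ the ratio should drift toward $1$, though the $\ln\ln$ scale makes convergence very slow at sample sizes one can simulate:

```python
import math
import random

def square_variation(xs):
    # best[j] = max_{k < j} best[k] + (S[j] - S[k])^2 over prefix sums S
    n = len(xs)
    S = [0.0] * (n + 1)
    for i, x in enumerate(xs, start=1):
        S[i] = S[i - 1] + x
    best = [0.0] * (n + 1)
    for j in range(1, n + 1):
        best[j] = max(best[k] + (S[j] - S[k]) ** 2 for k in range(j))
    return best[n]

random.seed(0)
N = 2000
xs = [random.choice((-1.0, 1.0)) for _ in range(N)]  # Rademacher signs: sigma^2 = 1
ratio = square_variation(xs) / (2.0 * N * math.log(math.log(N)))
print(round(ratio, 3))  # at this modest N, expect a value of order 1, not exactly 1
```

Even the trivial singleton partition already forces ratio $\geq N / (2 N \ln\ln N) \approx 0.25$ here, so a single run mainly illustrates the normalization rather than the sharp constant.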