Abstract
A rate of complete convergence for weighted sums of arrays of rowwise independent random variables was obtained by Sung and Volodin (2011). In this paper, we extend this result to negatively associated and negatively dependent random variables. Similar results for sequences of $\varphi$-mixing and $\rho^{*}$-mixing random variables are also obtained. Our results improve and generalize the results of Baek et al. (2008), Kuczmaszewska (2009), and Wang et al. (2010).
1. Introduction
The concept of complete convergence of a sequence of random variables was introduced by Hsu and Robbins [1]. A sequence $\{X_n, n\ge1\}$ of random variables converges completely to the constant $\theta$ if
$$\sum_{n=1}^{\infty}P(|X_n-\theta|>\varepsilon)<\infty\quad\text{for all }\varepsilon>0.$$
In view of the Borel-Cantelli lemma, this implies that $X_n\to\theta$ almost surely. Therefore, complete convergence is a very important tool in establishing almost sure convergence of sums of random variables as well as weighted sums of random variables. Hsu and Robbins [1] proved that the sequence of arithmetic means of independent and identically distributed random variables converges completely to the expected value if the variance of the summands is finite. Erdős [2] proved the converse. The result of Hsu-Robbins-Erdős is a fundamental theorem in probability theory and has been generalized and extended in several directions by many authors.
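As a numerical illustration (not from the paper), the series in the definition of complete convergence can be evaluated exactly for i.i.d. standard normal summands: the arithmetic mean is $N(0,1/n)$, so $P(|\bar{X}_n|>\varepsilon)=2(1-\Phi(\varepsilon\sqrt{n}))$, and the tail-probability series is seen to converge. The choice of distribution and of $\varepsilon$ here is an arbitrary example.

```python
# Illustration (not from the paper): for i.i.d. N(0,1) summands, the mean
# Xbar_n is N(0, 1/n), so P(|Xbar_n| > eps) = 2*(1 - Phi(eps*sqrt(n))) is
# exact, and the series defining complete convergence can be summed numerically.
import math

def phi(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def tail_prob(n, eps):
    # P(|Xbar_n| > eps) for Xbar_n ~ N(0, 1/n)
    return 2.0 * (1.0 - phi(eps * math.sqrt(n)))

eps = 0.5
partial = sum(tail_prob(n, eps) for n in range(1, 2001))
remainder = sum(tail_prob(n, eps) for n in range(2001, 4001))

print(partial)    # partial sum of the series: finite, a few units in size
print(remainder)  # the tail beyond n = 2000 is numerically negligible
```

The rapid (Gaussian) decay of the tail probabilities is what makes the series finite; by contrast, Chebyshev's bound $\sigma^2/(n\varepsilon^2)$ alone would not be summable.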
Ahmed et al. [3] obtained complete convergence for weighted sums of arrays of rowwise independent Banach-space-valued random elements.
We recall that the array $\{X_{ni}, i\ge1, n\ge1\}$ of random variables is said to be stochastically dominated by a random variable $X$ if
$$P(|X_{ni}|>x)\le C\,P(|X|>x)\quad\text{for all }x\ge0,\ i\ge1,\ n\ge1,$$
where $C$ is a positive constant.
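A small concrete check of this definition (an illustration, not from the paper): the array $X_{ni}\sim\mathrm{Uniform}[0,1/i]$ is stochastically dominated by $X\sim\mathrm{Uniform}[0,1]$ with $C=1$, since $P(X_{ni}>x)=\max(0,1-ix)\le\max(0,1-x)=P(X>x)$. The distributions are hypothetical examples chosen for transparency.

```python
# Illustration (not from the paper): X_{ni} ~ Uniform[0, 1/i] is stochastically
# dominated by X ~ Uniform[0, 1] with constant C = 1, because
# P(X_{ni} > x) = max(0, 1 - i*x) <= max(0, 1 - x) = P(X > x) for all x >= 0.
def tail_Xni(i, x):
    # P(X_{ni} > x) for X_{ni} ~ Uniform[0, 1/i]
    return max(0.0, 1.0 - i * x)

def tail_X(x):
    # P(X > x) for X ~ Uniform[0, 1]
    return max(0.0, 1.0 - x)

C = 1.0
dominated = all(
    tail_Xni(i, x) <= C * tail_X(x) + 1e-12
    for i in range(1, 50)
    for x in [k / 100.0 for k in range(0, 201)]
)
print(dominated)  # True: the domination inequality holds on the grid
```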
Theorem 1.1 (Ahmed et al. [3]). Let $\{X_{ni}, i\ge1, n\ge1\}$ be an array of rowwise independent random elements which are stochastically dominated by a random variable $X$. Let $\{a_{ni}, i\ge1, n\ge1\}$ be an array of constants satisfying
$$\sup_{i\ge1}|a_{ni}|=O(n^{-\gamma})\quad\text{for some }\gamma>0,\tag{1.3}$$
$$\sum_{i=1}^{\infty}|a_{ni}|=O(n^{\alpha})\quad\text{for some }\alpha\in[0,\gamma).\tag{1.4}$$
Suppose that there exists $\nu>0$ such that $E\|X\|^{\nu}<\infty$, where $\nu$, together with $\alpha$, $\beta$, and $\gamma$, satisfies the relation given in [3]. If $\sum_{i=1}^{\infty}a_{ni}X_{ni}\to0$ in probability, then
$$\sum_{n=1}^{\infty}n^{\beta}P\left(\left\|\sum_{i=1}^{\infty}a_{ni}X_{ni}\right\|>\varepsilon\right)<\infty\quad\text{for all }\varepsilon>0.\tag{1.5}$$
Note that there was a typographical error in the statement of this theorem in Ahmed et al. [3]. If $\beta<-1$, then the conclusion of Theorem 1.1 is immediate, since $\sum_{n=1}^{\infty}n^{\beta}<\infty$. Hence, Theorem 1.1 is of interest only for $\beta\ge-1$.
Baek et al. [4] extended Theorem 1.1 to negatively associated random variables.
Theorem 1.2 (Baek et al. [4]). Let $\{X_{ni}, i\ge1, n\ge1\}$ be an array of rowwise negatively associated random variables which are stochastically dominated by a random variable $X$. Let $\{a_{ni}\}$ be an array of constants satisfying (1.3) and (1.4). Suppose that there exists $\nu>0$ such that $E|X|^{\nu}<\infty$, where $\nu$, together with $\alpha$, $\beta$, and $\gamma$, satisfies the relation given in [4]. If $EX_{ni}=0$ for all $i\ge1$ and $n\ge1$, and (1.6) holds, then (1.7) holds.
Sung and Volodin [5] improved Theorem 1.1 as follows.
Theorem 1.3 (Sung and Volodin [5]). Suppose that $\beta\ge-1$. Let $\{X_{ni}, i\ge1, n\ge1\}$ be an array of rowwise independent random elements which are stochastically dominated by a random variable $X$. Let $\{a_{ni}\}$ be an array of constants satisfying (1.3) and (1.4). Assume that $\sum_{i=1}^{\infty}a_{ni}X_{ni}\to0$ in probability. If the moment condition (1.8) holds, then (1.5) holds.
In this paper, we extend Theorem 1.3 to negatively associated and negatively dependent random variables. We also obtain similar results for sequences of $\varphi$-mixing and $\rho^{*}$-mixing random variables. Our results improve and generalize the results of Baek et al. [4], Kuczmaszewska [6], and Wang et al. [7].
Throughout this paper, the symbol $C$ denotes a positive constant which is not necessarily the same in each appearance. It proves convenient to define $\log x=\ln\max\{x,e\}$, where $\ln$ denotes the natural logarithm.
2. Preliminaries
In this section, we present some background materials which will be useful in the proofs of our main results.
The following lemma is well known, and its proof is standard.
Lemma 2.1. Let $\{X_n, n\ge1\}$ be a sequence of random variables which are stochastically dominated by a random variable $X$. For any $p>0$ and $b>0$, the following statements hold:
(i) $E|X_n|^{p}I(|X_n|\le b)\le C\{E|X|^{p}I(|X|\le b)+b^{p}P(|X|>b)\}$,
(ii) $E|X_n|^{p}I(|X_n|>b)\le C\,E|X|^{p}I(|X|>b)$.
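Since the proof is standard and omitted, it may help to sketch the computation behind the first truncated-moment bound; the form given below is a commonly used version of such bounds, stated here as a sketch rather than as the paper's exact display.

```latex
\begin{align*}
E|X_n|^{p} I(|X_n|\le b)
  &= \int_{0}^{\infty} P\bigl(|X_n|^{p} I(|X_n|\le b) > t\bigr)\,dt
   \le \int_{0}^{b^{p}} P\bigl(|X_n| > t^{1/p}\bigr)\,dt\\
  &\le C \int_{0}^{b^{p}} \Bigl[ P\bigl(t^{1/p} < |X| \le b\bigr) + P(|X| > b) \Bigr]\,dt\\
  &\le C \bigl\{ E|X|^{p} I(|X|\le b) + b^{p} P(|X| > b) \bigr\},
\end{align*}
```

where the second inequality uses the stochastic domination $P(|X_n|>x)\le C\,P(|X|>x)$, and the last step integrates the indicator $I(t<|X|^{p})$ over $t\in(0,b^{p})$.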
Lemma 2.2 (Sung [8]). Let $X$ be a random variable with $E|X|^{\nu}<\infty$ for some $\nu>0$. Then three elementary bounds on the tail probabilities and truncated moments of $X$ hold; we refer to [8] for the precise statements, which are used repeatedly in the proofs below.
The Rosenthal-type inequality plays an important role in establishing complete convergence. The Rosenthal-type inequalities for sequences of dependent random variables have been established by many authors.
The concept of negatively associated random variables was introduced by Alam and Saxena [9] and carefully studied by Joag-Dev and Proschan [10]. A finite family of random variables $\{X_i, 1\le i\le n\}$ is said to be negatively associated if for every pair of disjoint subsets $A$ and $B$ of $\{1,2,\dots,n\}$,
$$\operatorname{Cov}\bigl(f(X_i, i\in A),\,g(X_j, j\in B)\bigr)\le0$$
whenever $f$ and $g$ are coordinatewise increasing and the covariance exists. An infinite family of random variables is negatively associated if every finite subfamily is negatively associated.
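A classical example from Joag-Dev and Proschan [10] is that permutation distributions are negatively associated. The following small enumeration (an illustration, not from the paper; the values and functions are arbitrary choices) verifies the defining covariance inequality for two disjoint coordinates of a uniformly random permutation.

```python
# Illustration (not from the paper): coordinates of a uniformly random
# permutation of fixed values are negatively associated. We enumerate all
# permutations and verify Cov(f(X_1), g(X_2)) <= 0 for two increasing functions.
from itertools import permutations

values = [0.0, 1.0, 4.0]
f = lambda x: x            # increasing
g = lambda x: x * x        # increasing on the nonnegative values above

perms = list(permutations(values))   # all 6 equally likely outcomes

def mean(h, coord):
    return sum(h(p[coord]) for p in perms) / len(perms)

Ef, Eg = mean(f, 0), mean(g, 1)
Efg = sum(f(p[0]) * g(p[1]) for p in perms) / len(perms)
cov = Efg - Ef * Eg

print(cov)  # strictly negative for these values
```

Intuitively, a large value in one coordinate of a permutation forces smaller values elsewhere, which is exactly the negative-association mechanism.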
The following lemma is a Rosenthal-type inequality for negatively associated random variables.
Lemma 2.3 (Shao [11]). Let $\{X_n, n\ge1\}$ be a sequence of negatively associated random variables with $EX_n=0$ and $E|X_n|^{q}<\infty$ for some $q\ge2$ and all $n\ge1$. Then there exists a constant $C_q$ depending only on $q$ such that
$$E\max_{1\le k\le n}\left|\sum_{i=1}^{k}X_i\right|^{q}\le C_q\left\{\sum_{i=1}^{n}E|X_i|^{q}+\left(\sum_{i=1}^{n}EX_i^{2}\right)^{q/2}\right\}.$$
The concept of negatively dependent random variables was given by Lehmann [12]. A finite family of random variables $\{X_i, 1\le i\le n\}$ is said to be negatively dependent (or negatively orthant dependent) if, for all real numbers $x_1,\dots,x_n$, the following two inequalities hold:
$$P(X_1\le x_1,\dots,X_n\le x_n)\le\prod_{i=1}^{n}P(X_i\le x_i),$$
$$P(X_1>x_1,\dots,X_n>x_n)\le\prod_{i=1}^{n}P(X_i>x_i).$$
An infinite family of random variables is negatively dependent if every finite subfamily is negatively dependent.
Obviously, negative association implies negative dependence, but the converse is not true.
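The two orthant inequalities can be checked exactly for a small discrete example (an illustration, not from the paper; the joint pmf below is a hand-picked assumption that puts most mass on the anti-diagonal of $\{0,1\}^2$).

```python
# Illustration (not from the paper): exact verification of the two orthant
# inequalities defining negative dependence, for a joint pmf on {0,1}^2.
pmf = {(0, 0): 0.1, (0, 1): 0.4, (1, 0): 0.4, (1, 1): 0.1}

def prob(event):
    # total mass of the outcomes satisfying `event`
    return sum(p for xy, p in pmf.items() if event(xy))

ok = True
for x in (0, 1):
    for y in (0, 1):
        lower = prob(lambda xy: xy[0] <= x and xy[1] <= y)
        upper = prob(lambda xy: xy[0] > x and xy[1] > y)
        px_low, py_low = prob(lambda xy: xy[0] <= x), prob(lambda xy: xy[1] <= y)
        px_up, py_up = prob(lambda xy: xy[0] > x), prob(lambda xy: xy[1] > y)
        ok = ok and lower <= px_low * py_low + 1e-12
        ok = ok and upper <= px_up * py_up + 1e-12

print(ok)  # True: this (X, Y) is negatively dependent
```

For the support $\{0,1\}$ it suffices to test thresholds $x,y\in\{0,1\}$, since the joint and marginal distribution functions are piecewise constant.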
The following lemma is a Rosenthal-type inequality for negatively dependent random variables.
Lemma 2.4 (Asadian et al. [13]). Let $\{X_n, n\ge1\}$ be a sequence of negatively dependent random variables with $EX_n=0$ and $E|X_n|^{q}<\infty$ for some $q\ge2$ and all $n\ge1$. Then there exists a constant $C_q$ depending only on $q$ such that
$$E\left|\sum_{i=1}^{n}X_i\right|^{q}\le C_q\left\{\sum_{i=1}^{n}E|X_i|^{q}+\left(\sum_{i=1}^{n}EX_i^{2}\right)^{q/2}\right\}.$$
For a sequence $\{X_n, n\ge1\}$ of random variables defined on a probability space $(\Omega,\mathcal{F},P)$, let $\mathcal{F}_{n}^{m}=\sigma(X_i,\ n\le i\le m)$ denote the $\sigma$-algebra generated by the random variables $X_n,X_{n+1},\dots,X_m$. Define the $\varphi$-mixing coefficients by
$$\varphi(n)=\sup_{k\ge1}\ \sup\left\{|P(B\mid A)-P(B)|\ :\ A\in\mathcal{F}_{1}^{k},\ P(A)>0,\ B\in\mathcal{F}_{k+n}^{\infty}\right\}.$$
The sequence is called $\varphi$-mixing (or uniformly strong mixing) if $\varphi(n)\to0$ as $n\to\infty$.
For any $S\subseteq\mathbb{N}$, let $\mathcal{F}_{S}=\sigma(X_i,\ i\in S)$. Define the $\rho^{*}$-mixing coefficients by
$$\rho^{*}(n)=\sup\left\{|\operatorname{Corr}(f,g)|\ :\ f\in L_{2}(\mathcal{F}_{S}),\ g\in L_{2}(\mathcal{F}_{T})\right\},$$
where the supremum is taken over all finite subsets $S,T\subseteq\mathbb{N}$ with $\operatorname{dist}(S,T)\ge n$, and all $f\in L_{2}(\mathcal{F}_{S})$ and $g\in L_{2}(\mathcal{F}_{T})$. The sequence is called $\rho^{*}$-mixing (or $\tilde{\rho}$-mixing) if there exists $N\in\mathbb{N}$ such that $\rho^{*}(N)<1$.
Note that if $\{X_n, n\ge1\}$ is a sequence of independent random variables, then $\varphi(n)=0$ and $\rho^{*}(n)=0$ for all $n\ge1$.
The following lemma is a Rosenthal-type inequality for $\varphi$-mixing random variables.
Lemma 2.5 (Wang et al. [7]). Let $\{X_n, n\ge1\}$ be a sequence of $\varphi$-mixing random variables with $EX_n=0$ and $E|X_n|^{q}<\infty$ for some $q\ge2$ and all $n\ge1$. Assume that $\sum_{n=1}^{\infty}\varphi^{1/2}(n)<\infty$. Then there exists a constant $C$ depending only on $q$ and $\varphi(\cdot)$ such that
$$E\max_{1\le k\le n}\left|\sum_{i=1}^{k}X_i\right|^{q}\le C\left\{\sum_{i=1}^{n}E|X_i|^{q}+\left(\sum_{i=1}^{n}EX_i^{2}\right)^{q/2}\right\}.$$
The following lemma is a Rosenthal-type inequality for $\rho^{*}$-mixing random variables.
Lemma 2.6 (Utev and Peligrad [14]). Let $\{X_n, n\ge1\}$ be a sequence of random variables with $EX_n=0$ and $E|X_n|^{q}<\infty$ for some $q\ge2$ and all $n\ge1$. If $\rho^{*}(N)<1$ for some $N\ge1$, then there exists a constant $C$ depending only on $q$, $N$, and $\rho^{*}(N)$ such that
$$E\max_{1\le k\le n}\left|\sum_{i=1}^{k}X_i\right|^{q}\le C\left\{\sum_{i=1}^{n}E|X_i|^{q}+\left(\sum_{i=1}^{n}EX_i^{2}\right)^{q/2}\right\}.$$
3. Main Results
In this section, we extend Theorem 1.3 to negatively associated and negatively dependent random variables. We also obtain similar results for sequences of $\varphi$-mixing and $\rho^{*}$-mixing random variables.
The following theorem extends Theorem 1.3 to negatively associated random variables.
Theorem 3.1. Suppose that $\beta\ge-1$. Let $\{X_{ni}, i\ge1, n\ge1\}$ be an array of rowwise negatively associated random variables which are stochastically dominated by a random variable $X$. Let $\{a_{ni}\}$ be an array of constants satisfying (1.3) and (1.4). If $EX_{ni}=0$ for all $i\ge1$ and $n\ge1$, and (1.8) holds, then
$$\sum_{n=1}^{\infty}n^{\beta}P\left(\sup_{k\ge1}\left|\sum_{i=1}^{k}a_{ni}X_{ni}\right|>\varepsilon\right)<\infty\quad\text{for all }\varepsilon>0.\tag{3.1}$$
Proof. Since $a_{ni}=a_{ni}^{+}-a_{ni}^{-}$, we may assume that $a_{ni}\ge0$ for all $i\ge1$ and $n\ge1$. For $i\ge1$ and $n\ge1$, define $Y_{ni}$ as the monotone truncation of $a_{ni}X_{ni}$ at a fixed level.
Then $\{Y_{ni}, i\ge1\}$ is still an array of rowwise negatively associated random variables, since negative association is preserved under coordinatewise increasing transformations. Moreover, $\{Y_{ni}-EY_{ni}, i\ge1\}$ is also an array of rowwise negatively associated random variables. Since $EX_{ni}=0$ for all $i\ge1$ and $n\ge1$, it suffices to show that (3.3) holds.
We will prove (3.3) by distinguishing three cases.
Case 1. For the first term, we get by Markov's inequality, Lemmas 2.1-2.3, (1.3), and (1.4) that it is summable.
The fifth inequality follows from Lemma 2.2.
For the second term, we get by Markov's inequality, stochastic domination, and (1.4) that it is summable as well.
Case 2. As in Case 1, we have that the first term is summable.
As in Case 1, we also have that the second term is summable.
Case 3. For the first term, we take $q>2$ sufficiently large (depending on $\alpha$, $\beta$, and $\gamma$). Then we obtain by Markov's inequality and Lemma 2.3 the required bound.
As in Case 1, we obtain that the second term is summable.
Noting the choice of $q$, we obtain by (1.3) and (1.4) that the remaining series is finite, since $q$ was taken large enough. Hence, the first term is summable. As in Case 2, we obtain that the proof is complete.
Remark 3.2. The moment condition of Theorem 3.1 is weaker than that of Theorem 1.2. Also, the conclusion of Theorem 3.1 implies the conclusion of Theorem 1.2. Hence, Theorem 3.1 improves Theorem 1.2. Moreover, the method of the proof of Theorem 3.1 is simpler than that of the proof of Theorem 1.2.
Corollary 3.3. Let $\{X_{ni}, i\ge1, n\ge1\}$ be an array of rowwise negatively associated random variables which are stochastically dominated by a random variable $X$. Let $\{a_{ni}\}$ be a Toeplitz array satisfying (3.10). If the moment condition (3.11) holds, then $\sum_{i=1}^{\infty}a_{ni}X_{ni}\to0$ completely.
Proof. For the first case, the result can be proved directly. For the second case, we introduce a suitable normalization of the weights and observe that the normalized weights satisfy (1.3) and (1.4). By Theorem 3.1, applied with the weights replaced by the normalized ones, we get the required series bound. To complete the proof, it only remains to check that the contribution of the normalizing term is negligible, which holds since that term tends to zero as $n\to\infty$.
Remark 3.4. Under an additional condition, Corollary 3.3 holds without the negative association assumption. Kuczmaszewska [6, Corollary 2.4] proved Corollary 3.3 under a stronger moment condition than (3.11).
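As a concrete instance of the weights in Corollary 3.3 (an illustration, not from the paper), the Cesàro weights $a_{ni}=1/n$ for $i\le n$ (and $0$ otherwise) form a standard Toeplitz array: the row sums stay bounded while the largest weight in each row tends to zero.

```python
# Illustration (not from the paper): Cesaro weights a_{ni} = 1/n for i <= n
# form a Toeplitz array: bounded row sums, vanishing maximal weight.
def row(n):
    # the nonzero weights of row n
    return [1.0 / n] * n

row_sums = [sum(row(n)) for n in range(1, 101)]
max_weights = [max(row(n)) for n in range(1, 101)]

print(max(row_sums))    # row sums are identically 1 (up to rounding)
print(max_weights[-1])  # the largest weight in row 100 is 1/100
```

With these weights, the weighted sum $\sum_i a_{ni}X_i$ is just the arithmetic mean, so the corollary specializes to complete convergence of sample means.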
The following theorem extends Theorem 1.3 to negatively dependent random variables.
Theorem 3.5. Suppose that $\beta\ge-1$. Let $\{X_{ni}, i\ge1, n\ge1\}$ be an array of rowwise negatively dependent random variables which are stochastically dominated by a random variable $X$. Let $\{a_{ni}\}$ be an array of constants satisfying (1.3) and (1.4). If $EX_{ni}=0$ for all $i\ge1$ and $n\ge1$, and (1.8) holds, then (1.7) holds.
Proof. The proof is the same as that of Theorem 3.1 except that we use Lemma 2.4 instead of Lemma 2.3.
If the array $\{X_{ni}, i\ge1, n\ge1\}$ in Theorem 3.1 is replaced by the sequence $\{X_n, n\ge1\}$, then we can extend Theorem 3.1 to $\varphi$-mixing and $\rho^{*}$-mixing random variables.
Theorem 3.6. Suppose that $\beta\ge-1$. Let $\{X_n, n\ge1\}$ be a sequence of $\varphi$-mixing random variables which are stochastically dominated by a random variable $X$. Let $\{a_{ni}\}$ be an array of constants satisfying (1.3) and (1.4). Assume that $\sum_{n=1}^{\infty}\varphi^{1/2}(n)<\infty$. If $EX_n=0$ for all $n\ge1$, and (1.8) holds, then
$$\sum_{n=1}^{\infty}n^{\beta}P\left(\sup_{k\ge1}\left|\sum_{i=1}^{k}a_{ni}X_{i}\right|>\varepsilon\right)<\infty\quad\text{for all }\varepsilon>0.\tag{3.17}$$
Proof. Since $EX_n=0$ for all $n\ge1$, it suffices to show the corresponding bound for the truncated variables. The rest of the proof is the same as that of Theorem 3.1, except that we use Lemma 2.5 instead of Lemma 2.3, and is therefore omitted.
Remark 3.7. Can Theorem 3.6 be extended to an array $\{X_{ni}\}$ of rowwise $\varphi$-mixing random variables? Let $\{\varphi_n(\cdot)\}$ be the sequence of $\varphi$-mixing coefficients for the $n$th row of the array. When we apply Lemma 2.5 to the $n$th row, the constant depends on both $q$ and $\varphi_n(\cdot)$. That is, the constant depends on $n$. Hence, we cannot extend Theorem 3.6 to the array case by using the method of the proof of Theorem 3.1.
Corollary 3.8. Let $\{X_n, n\ge1\}$ be a sequence of $\varphi$-mixing random variables which are stochastically dominated by a random variable $X$. Let $\{a_{ni}\}$ be a Toeplitz array satisfying (3.10). Assume that $\sum_{n=1}^{\infty}\varphi^{1/2}(n)<\infty$. If (3.11) holds, then $\sum_{i=1}^{\infty}a_{ni}X_{i}\to0$ completely.
Proof. The proof is the same as that of Corollary 3.3 except that we use Theorem 3.6 instead of Theorem 3.1.
Remark 3.9. Under an additional condition, Corollary 3.8 holds without the $\varphi$-mixing assumption. Wang et al. [7, Theorem 2.5] proved Corollary 3.8 under a stronger moment condition than (3.11).
Theorem 3.10. Suppose that $\beta\ge-1$. Let $\{X_n, n\ge1\}$ be a sequence of $\rho^{*}$-mixing random variables which are stochastically dominated by a random variable $X$. Let $\{a_{ni}\}$ be an array of constants satisfying (1.3) and (1.4). If $EX_n=0$ for all $n\ge1$, and (1.8) holds, then (3.17) holds.
Proof. The proof is the same as that of Theorem 3.6 except that we use Lemma 2.6 instead of Lemma 2.5.
Remark 3.11. As in Remark 3.7, we also cannot extend Theorem 3.10 to the array case by using the method of the proof of Theorem 3.1.
Acknowledgments
The author would like to thank the Editor Zhenya Yan and an anonymous referee for the helpful comments. This research was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (2010-0013131).