
Complete convergence and complete moment convergence for weighted sums of extended negatively dependent random variables under sub-linear expectation.

Haoyuan Zhong, Qunying Wu.

Abstract

In this paper, we study the complete convergence and complete moment convergence for weighted sums of extended negatively dependent (END) random variables under sub-linear expectation space, under the condition [Formula: see text] and, further, [Formula: see text], [Formula: see text] ([Formula: see text] is a slowly varying and monotone nondecreasing function). As an application, a Baum-Katz type result for weighted sums of END random variables is established under sub-linear expectation space. The results obtained in this article extend the complete convergence and complete moment convergence known under the classical linear expectation space.


Keywords:  END random variables; complete convergence; complete moment convergence; sub-linear expectation space

Year:  2017        PMID: 29104399      PMCID: PMC5651798          DOI: 10.1186/s13660-017-1538-1

Source DB:  PubMed          Journal:  J Inequal Appl        ISSN: 1025-5834            Impact factor:   2.491


Introduction

Additivity has generally been regarded as a fairly natural assumption, so classical probability theorems have been established under additive probabilities and linear expectations. However, many uncertain phenomena do not satisfy this assumption. Peng [1-5] therefore introduced the notion of sub-linear expectation to extend the classical linear expectation, and established the general theoretical framework of the sub-linear expectation space. Theorems under sub-linear expectations are widely used to assess financial risk under uncertainty. For complete convergence and complete moment convergence, there are few results under sub-linear expectations. This paper aims to obtain complete convergence and complete moment convergence under sub-linear expectation space with the condition [Formula: see text] and, further, [Formula: see text], [Formula: see text]. In addition, the results and conditions of this paper involve a slowly varying and monotone nondecreasing function, so the theorems are more general than traditional complete convergence results. In short, extending complete convergence and complete moment convergence to the sub-linear setting is meaningful.

Sub-linear expectations exhibit many interesting properties unlike those of linear expectations, and problems under sub-linear expectations are more challenging, so many scholars have paid attention to them. A number of results have been established. For example, Peng [1-5] obtained a weak law of large numbers and a central limit theorem under sub-linear expectation space. Chen [6] obtained the law of large numbers for independent identically distributed random variables under the condition [Formula: see text]. Powerful tools such as moment inequalities and Kolmogorov's exponential inequalities were established by Zhang [7-9], who also obtained the Hartman-Wintner law of the iterated logarithm and Kolmogorov's strong law of large numbers for identically distributed and extended negatively dependent random variables.
Wu and Chen [10] also studied the law of the iterated logarithm, and Cheng [11] studied the strong law of large numbers under a general moment condition [Formula: see text], and so on. Since many powerful inequalities and conventional methods for linear expectations and probabilities are no longer valid, the study of limit theorems under sub-linear expectations becomes much more challenging.

Complete convergence has a relatively complete development in probability limit theory. The notion of complete convergence was introduced by Hsu and Robbins [12], and Chow [13] established complete moment convergence, which is a more general version of complete convergence. Many results on complete convergence and complete moment convergence for various sequences have been obtained under the classical probability space; see, for example, Shen et al. [14], Wang et al. [15], and Wu and Jiang [16]. Some recent papers contain new results on complete convergence and complete moment convergence. For instance, Wang et al. [17] obtained general results on complete convergence and complete moment convergence for weighted sums of a class of random variables, and Wang et al. [18] studied complete convergence and complete moment convergence for a class of random variables. In addition, the theorems of this paper extend those of [14] to the sub-linear expectation space, and we prove them under the condition [Formula: see text] and, further, [Formula: see text], [Formula: see text] ([Formula: see text] is a slowly varying function).

In the next section we introduce basic notation and concepts, related properties under sub-linear expectations, and preliminary lemmas that are useful for proving the main theorems. In Section 3 the complete convergence and complete moment convergence results under sub-linear expectation space are stated. The proofs of these theorems are given in the last section.
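For orientation, the classical notions due to Hsu-Robbins and Chow recalled above read as follows in the linear-expectation setting (the paper's versions replace the probability and expectation by a capacity and the sub-linear expectation; the normalizing sequences below are generic):

```latex
% Complete convergence (Hsu--Robbins): Y_n -> 0 completely if
\sum_{n=1}^{\infty} \mathbb{P}\bigl(|Y_n| > \varepsilon\bigr) < \infty
\quad \text{for all } \varepsilon > 0;
% Complete moment convergence (Chow): for positive constants a_n, b_n and q > 0,
\sum_{n=1}^{\infty} a_n\, \mathbb{E}\bigl(b_n^{-1}|Y_n| - \varepsilon\bigr)_{+}^{q} < \infty
\quad \text{for all } \varepsilon > 0.
```

By Markov's inequality, complete moment convergence implies the corresponding complete convergence, which is the sense in which it is the more general notion.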

Basic settings

This paper uses the framework and notation established by Peng [1-5]. We therefore omit the definitions of sub-linear expectation [Formula: see text], capacity [Formula: see text], countable sub-additivity, Choquet integrals/expectations, and so on.
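As a concrete illustration of the non-additivity that motivates the paper: a sub-linear expectation can be represented as a supremum of linear expectations over a family of probability measures (Peng's representation). The following numerical sketch is ours, not the paper's (the finite sample space and the two measures are hypothetical choices), and it checks sub-additivity and positive homogeneity directly:

```python
import numpy as np

# Toy setup: 3-point sample space, two candidate probability measures.
# A sub-linear expectation takes the supremum of linear expectations over them.
measures = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.3, 0.5],
])

def sublinear_E(X):
    """E_hat[X] = sup over the family of E_theta[X] (Peng's representation)."""
    return float(np.max(measures @ np.asarray(X, dtype=float)))

X = np.array([1.0, 0.0, -1.0])
Y = np.array([-1.0, 2.0, 0.0])

# Sub-additivity: E_hat[X + Y] <= E_hat[X] + E_hat[Y]
assert sublinear_E(X + Y) <= sublinear_E(X) + sublinear_E(Y)
# Positive homogeneity: E_hat[c X] = c * E_hat[X] for c >= 0
assert abs(sublinear_E(2.0 * X) - 2.0 * sublinear_E(X)) < 1e-12
# Non-additivity: here E_hat[X] = E_hat[-X] = 0.3, so E_hat[X] + E_hat[-X] > 0
assert sublinear_E(X) + sublinear_E(-X) > 0
```

The failure of additivity (E_hat[X] + E_hat[-X] > 0) is exactly why the classical proof techniques mentioned in the introduction break down.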

Definition 2.1

(Peng [1, 2], Zhang [8])

(i) (Identical distribution) Assume that [Formula: see text] and [Formula: see text] are two n-dimensional random vectors defined respectively in the sub-linear expectation spaces [Formula: see text] and [Formula: see text]. They are said to be identically distributed if [Formula: see text] whenever the sub-expectations are finite. A sequence of random variables is said to be identically distributed if, for each [Formula: see text], [Formula: see text] and [Formula: see text] are identically distributed.

(ii) (Extended negative dependence) A sequence of random variables is said to be upper (resp. lower) extended negatively dependent if there is some dominating constant [Formula: see text] such that [Formula: see text] whenever the nonnegative functions [Formula: see text], [Formula: see text], are all nondecreasing (resp. all nonincreasing). The sequence is said to be extended negatively dependent (END) if it is both upper and lower extended negatively dependent. Clearly, if [Formula: see text] is a sequence of END random variables with dominating constant [Formula: see text], then so is [Formula: see text] with the same constant; and if [Formula: see text] is a sequence of upper (resp. lower) extended negatively dependent random variables and the [Formula: see text] are all nondecreasing (resp. all nonincreasing) functions, then [Formula: see text] is also a sequence of upper (resp. lower) extended negatively dependent random variables. It should be noted that the extended negative dependence of [Formula: see text] under [Formula: see text] does not imply extended negative dependence under [Formula: see text].

In the following, let [Formula: see text] be a sequence of random variables in [Formula: see text] and [Formula: see text]. The symbol C stands for a generic positive constant which may differ from one place to another. Let [Formula: see text] denote that there exists a constant [Formula: see text] such that [Formula: see text] for sufficiently large n; [Formula: see text] denotes an indicator function; and [Formula: see text] denotes [Formula: see text]. Also, let [Formula: see text] denote that there exist constants [Formula: see text] and [Formula: see text] such that [Formula: see text] for sufficiently large n. The following three lemmas are needed in the proofs of our theorems.
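With the dropped display restored, the upper END condition is usually written as follows (this is the standard formulation following Zhang [8]; the symbols for the variables, test functions, and dominating constant are supplied by us):

```latex
% Upper END: there is a dominating constant K >= 1 such that, for all n >= 1,
\widehat{\mathbb{E}}\Bigl[\prod_{i=1}^{n} g_i(X_i)\Bigr]
  \le K \prod_{i=1}^{n} \widehat{\mathbb{E}}\bigl[g_i(X_i)\bigr],
```

whenever the nonnegative functions g_i are all nondecreasing; lower END requires the same inequality with the g_i all nonincreasing.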

Lemma 2.1

([19]) [Formula: see text] is a slowly varying function if and only if [Formula: see text], where [Formula: see text], [Formula: see text], and [Formula: see text].
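Lemma 2.1 is Karamata's representation theorem; with the elided formulas restored in the standard notation, it reads:

```latex
% l is slowly varying if and only if it can be written as
\ell(x) = c(x)\,\exp\Bigl(\int_{a}^{x} \frac{\varepsilon(t)}{t}\,dt\Bigr), \qquad x \ge a > 0,
```

where c(x) → c ∈ (0, ∞) and ε(t) → 0 as x, t → ∞.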

Lemma 2.2

Suppose [Formula: see text], [Formula: see text], [Formula: see text], and that [Formula: see text] is a slowly varying function. Then: (i) for [Formula: see text], [Formula: see text]; (ii) if [Formula: see text], then for any [Formula: see text] and [Formula: see text], [Formula: see text].
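Lemma 2.2 serves below as a Karamata-type comparison for sums weighted by a slowly varying function; the standard bounds of this kind (see, e.g., [19]), which the lemma's elided displays specialize, read:

```latex
% For l slowly varying and C a positive constant:
\sum_{n=1}^{m} n^{s}\,\ell(n) \le C\, m^{s+1}\,\ell(m), \qquad s > -1,
\qquad\text{and}\qquad
\sum_{n=m}^{\infty} n^{s}\,\ell(n) \le C\, m^{s+1}\,\ell(m), \qquad s < -1 .
```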

Proof

(i) By Lemma 2.1, we can express [Formula: see text] as in equality (2.1), with [Formula: see text] as [Formula: see text] and [Formula: see text] as [Formula: see text]. Let [Formula: see text] be the inverse function of [Formula: see text]; then [Formula: see text] is a slowly varying function, and for any [Formula: see text] we have [Formula: see text]. So [Formula: see text].

(ii) From the proof of (i), it follows that for any [Formula: see text], [Formula: see text]. □

Lemma 2.3

(Zhang [9], Rosenthal's inequalities) Let [Formula: see text] be a sequence of upper extended negatively dependent random variables in [Formula: see text] with [Formula: see text], [Formula: see text]. Then [Formula: see text].
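For the reader's convenience, a representative form of Zhang's Rosenthal-type inequality under sub-linear expectations is recorded here; the exact constants and hypotheses are as in [9], and this version is the commonly quoted one for moments of order p ≥ 2:

```latex
\widehat{\mathbb{E}}\Bigl[\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k} X_i\Bigr|^{p}\Bigr]
 \le C_p\Biggl\{\sum_{i=1}^{n}\widehat{\mathbb{E}}\bigl[|X_i|^{p}\bigr]
 + \Bigl(\sum_{i=1}^{n}\widehat{\mathbb{E}}\bigl[X_i^{2}\bigr]\Bigr)^{p/2}
 + \Bigl(\sum_{i=1}^{n}\bigl[(\widehat{\mathbb{E}}[X_i])^{+}
        + (\widehat{\mathbb{E}}[-X_i])^{+}\bigr]\Bigr)^{p}\Biggr\}.
```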

Main results

Theorem 3.1

Let [Formula: see text], [Formula: see text], [Formula: see text], and let [Formula: see text] be a sequence of END and identically distributed random variables under sub-linear expectations. Let [Formula: see text] be a slowly varying and monotone nondecreasing function, and let [Formula: see text] be an array of real numbers such that [Formula: see text]. If, further, for [Formula: see text], [Formula: see text], then, for any [Formula: see text], [Formula: see text], where [Formula: see text] if [Formula: see text] and [Formula: see text] if [Formula: see text]; and [Formula: see text], where [Formula: see text] if [Formula: see text] and [Formula: see text] if [Formula: see text]. In particular, if [Formula: see text], then [Formula: see text], where [Formula: see text] if [Formula: see text] and [Formula: see text] if [Formula: see text].
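For comparison, the classical Baum-Katz theorem that results of this type generalize states, in the i.i.d. linear-expectation setting with partial sums S_k:

```latex
% For \alpha > 1/2 and \alpha p > 1 (with E X = 0 required when \alpha \le 1):
\mathbb{E}|X|^{p} < \infty
\iff
\sum_{n=1}^{\infty} n^{\alpha p - 2}\,
  \mathbb{P}\Bigl(\max_{1 \le k \le n} |S_k| > \varepsilon n^{\alpha}\Bigr) < \infty
\quad \text{for all } \varepsilon > 0 .
```

Theorem 3.1 replaces the probability by a capacity, allows weighted sums of END variables, and inserts the slowly varying factor [Formula: see text] into the moment condition and the series.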

Theorem 3.2

Suppose that the conditions of Theorem 3.1 hold and that [Formula: see text], [Formula: see text]. Then, for any [Formula: see text], [Formula: see text].

Theorem 3.3

Suppose that [Formula: see text] and the other conditions of Theorem 3.1 hold. Let [Formula: see text] be a monotone nondecreasing function. Assume further that [Formula: see text] is an array of real numbers such that (3.1) holds and [Formula: see text]. If [Formula: see text], then, for [Formula: see text], [Formula: see text].

Proof

Proof of Theorem 3.1

Without loss of generality, we can assume that [Formula: see text] when [Formula: see text]. We only need to prove (3.5); replacing [Formula: see text] by [Formula: see text] in (3.5) then yields (3.6). Noting that [Formula: see text], without loss of generality we can assume that [Formula: see text] and [Formula: see text] for all [Formula: see text] and [Formula: see text]. It follows from (3.2) and Hölder's inequality that [Formula: see text]. For fixed [Formula: see text], denote for [Formula: see text]: [Formula: see text]. It is easily checked that for [Formula: see text], [Formula: see text], which implies that [Formula: see text]. For [Formula: see text], let [Formula: see text] be a decreasing function with [Formula: see text] for all x, [Formula: see text] if [Formula: see text], and [Formula: see text] if [Formula: see text]. Then [Formula: see text]. In order to prove (3.5), it suffices to show [Formula: see text] and [Formula: see text]. By Lemma 2.2(i) and the identical distribution of the random variables, we get [Formula: see text].

In the following, we prove that [Formula: see text]. First, we prove that [Formula: see text].

Case 1: [Formula: see text]. For any [Formula: see text], by the inequality [Formula: see text] and (4.4), [Formula: see text]. So, by (4.3), it follows that [Formula: see text]. By (2.3), [Formula: see text] and [Formula: see text], so we get [Formula: see text] as [Formula: see text]. Next, we estimate [Formula: see text]. Let [Formula: see text] be such that [Formula: see text] for all x, [Formula: see text] if [Formula: see text], and [Formula: see text] if [Formula: see text] or [Formula: see text]. Then [Formula: see text]. For every n there exists k such that [Formula: see text]; thus, by (4.7), [Formula: see text], and [Formula: see text], from [Formula: see text] we get [Formula: see text]. Noting that by (2.4), [Formula: see text], it follows from the Kronecker lemma and [Formula: see text] that [Formula: see text].

Case 2: [Formula: see text]. By (3.4), we get [Formula: see text]. By (4.9) and [Formula: see text], [Formula: see text], one can get [Formula: see text]. It follows that for all n large enough, [Formula: see text], which implies that [Formula: see text]. By Definition 2.1(ii), for fixed [Formula: see text] the [Formula: see text] are still END random variables. Hence, by Lemma 2.3 (taking [Formula: see text]), [Formula: see text]. By (4.6), we have [Formula: see text]. By Lemma 2.2(i), we get [Formula: see text]. Noting that by (4.8), [Formula: see text]; by [Formula: see text], we get [Formula: see text]. Next we estimate [Formula: see text]. By (2.4), it follows that [Formula: see text]. Hence, [Formula: see text]. By [Formula: see text] and [Formula: see text], we get [Formula: see text]. This finishes the proof of Theorem 3.1. □

Proof of Theorem 3.2

Without loss of generality, we can assume that [Formula: see text] when [Formula: see text], and assume that [Formula: see text]. For [Formula: see text], we have by Theorem 3.1 that [Formula: see text]. Hence, it suffices to show that [Formula: see text]. For [Formula: see text], denote [Formula: see text] and [Formula: see text]. Since [Formula: see text], it follows that [Formula: see text]. Note that by Lemma 2.2(i), [Formula: see text].

In the following, we prove that [Formula: see text]. First, we show that [Formula: see text].

Case 1: [Formula: see text]. Combining (4.10) and (4.4) yields [Formula: see text]. So, for [Formula: see text], we get [Formula: see text]. We get [Formula: see text] as in the proof of (4.7). Next, we estimate [Formula: see text]. For every n there exists k such that [Formula: see text]; thus, by (4.8), (4.13), [Formula: see text], and [Formula: see text], from [Formula: see text] we get [Formula: see text]. Noting that by (2.4), [Formula: see text], it follows from the Kronecker lemma and [Formula: see text] that [Formula: see text].

Case 2: [Formula: see text]. By [Formula: see text] and [Formula: see text], [Formula: see text], we get [Formula: see text]. It follows that for all n large enough, [Formula: see text], which implies that [Formula: see text]. For fixed [Formula: see text] and [Formula: see text], it is easily seen that the [Formula: see text] are still END random variables. Hence, by Markov's inequality, Lemma 2.3, (4.3), (4.12), (4.13), and Lemma 2.2(i), [Formula: see text]. This finishes the proof of Theorem 3.2. □

Proof of Theorem 3.3

We use the same notation as in Theorem 3.1. The proof is similar to that of Theorem 3.1; we only need to show that [Formula: see text]. Because [Formula: see text] is a monotone nondecreasing function, we have [Formula: see text], which together with (3.8) yields [Formula: see text]. Noting that [Formula: see text] and [Formula: see text], we have [Formula: see text]. □
References

1.  Complete Convergence and the Law of Large Numbers.

Authors:  P L Hsu; H Robbins
Journal:  Proc Natl Acad Sci U S A       Date:  1947-02       Impact factor: 11.205

