Bayesian Estimation Analysis of Bernoulli Measurement Error Model for Longitudinal Data

Bayesian methods are used to study inference for a semiparametric measurement error (ME) model with longitudinal data. A semiparametric Bayesian approach that combines a Dirichlet process prior with a hybrid Gibbs sampling and Metropolis-Hastings (MH) algorithm is applied to simulate observations from the posterior distribution, yielding joint Bayesian estimates of the unknown parameters and of the measurement errors. We obtain Bayesian estimates of the parameters and of the error-prone covariates of the measurement error model. Under three different prior assumptions, four simulation studies illustrate the effectiveness and utility of the proposed method.


Introduction
Longitudinal data arise when the same individuals are measured repeatedly at different time points. Such data are widespread in biomedicine, epidemiology, and occupational medicine; for example, biomedical longitudinal samples are typically obtained through clinical trials and observational cohort studies. Longitudinal data are also common in finance, economics, and related fields. They are generally unbalanced and are usually analyzed with linear mixed models. For various reasons, measurement error data and missing data are often encountered. When covariates contain measurement error, You (2006) [1] proposed a profile least squares estimation method with error correction; Zhou (2009) [2] studied statistical inference with auxiliary information when model covariates are measured with error; Wei (2010) [3] studied parameter estimation when the response variable is missing and covariates contain measurement error; and Wei (2012) [4] studied constrained estimation and hypothesis testing for the parameters of varying-coefficient partially linear measurement error models. Such models have also been studied by Liang, Härdle and Carroll (1999) [5], Ma and Carroll (2006) [6], Liang, Wang and Carroll (2007) [7], and Pan, Zeng and Lin (2008) [8], among others. This paper proposes a hybrid algorithm for generating the observations required for Bayesian inference from the posterior distribution of the parameters and of the error-prone covariates of the ME model. The algorithm combines Gibbs sampling, under priors built from normal and mixed normal distributions, with the MH algorithm.

The Measurement Error Model
For i = 1, …, n, suppose Y_i is the response variable, X_i is an unobservable covariate vector of order p × 1, and U_i is an observable covariate vector of order q × 1. Let Z_i = (X_i, U_i); we assume the observations are conditionally independent. For longitudinal data, we consider a generalized linear measurement error model of the following structure:

f(y_i | x_i, u_i) = exp{ [y_i θ_i − d(θ_i)] / φ + c(y_i, φ) }.

Here φ is a dispersion parameter, d(⋅) and c(⋅,⋅) are known differentiable functions, and we write ḋ(θ) = ∂d(θ)/∂θ and d̈(θ) = ∂²d(θ)/∂θ². For example, for a Bernoulli response, d(θ) = log(1 + e^θ) and φ = 1. The conditional mean μ_i = E(Y_i | X_i, U_i) = ḋ(θ_i) satisfies the following equation:

g(μ_i) = X_i^T β_x + U_i^T β_u.
Here g(⋅) is a monotonic differentiable link function, and β = (β_x^T, β_u^T)^T is an unknown (p + q) × 1 vector of regression coefficients. Following reference [9], for each individual i we take m_i measurements of the true covariate X_ij, with measurement errors independent of X_ij and of each other. That is, for j = 1, …, m_i,

W_ij = X_ij + ε_ij,

so we cannot observe X_ij but can observe W_ij.
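To see why the measurement error cannot simply be ignored, note that regressing on the error-prone W instead of the true X attenuates the slope. A quick sketch (hypothetical variances and coefficients, for illustration only, not the paper's simulation design):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5000
X = rng.normal(0.0, 1.0, n)           # true covariate
W = X + rng.normal(0.0, 1.0, n)       # observed with classical additive error
Y = 2.0 * X + rng.normal(0.0, 0.5, n)

slope_true = np.polyfit(X, Y, 1)[0]   # regression on the true covariate
slope_naive = np.polyfit(W, Y, 1)[0]  # regression on the error-prone covariate
# Classical attenuation: E[slope_naive] = 2 * var(X) / (var(X) + var(eps)) = 1
```

With equal covariate and error variances, the naive slope is biased toward zero by a factor of one half, which is the motivation for the error-corrected inference developed here.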
These measurement errors ε_ij follow an unknown distribution and are independent of the true values X_ij. Following Lachos (2010) [10], we assume the distribution of ε_ij follows a mixture model based on the Dirichlet process (DP).
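As an illustration of the flexibility of such a prior, the following sketch (a minimal illustration, not the paper's implementation; the function name and all parameter values are hypothetical) draws measurement errors from a truncated DP mixture of normals via the stick-breaking construction:

```python
import numpy as np

def sample_dp_mixture(n, alpha=1.0, base_mu=0.0, base_sd=1.0,
                      comp_sd=0.5, truncation=50, rng=None):
    """Draw n errors from a (truncated) Dirichlet-process mixture of
    normals using the stick-breaking construction: atoms come from the
    base measure N(base_mu, base_sd^2), errors are normal around them."""
    rng = np.random.default_rng(rng)
    # Stick-breaking weights: v_k ~ Beta(1, alpha), w_k = v_k * prod(1 - v_l)
    v = rng.beta(1.0, alpha, size=truncation)
    w = v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))
    w /= w.sum()                       # renormalize after truncation
    # Atom locations drawn from the base measure
    atoms = rng.normal(base_mu, base_sd, size=truncation)
    # Pick a component for each error, then sample around its atom
    labels = rng.choice(truncation, size=n, p=w)
    return rng.normal(atoms[labels], comp_sd)

errors = sample_dp_mixture(1000, alpha=2.0, rng=42)
```

Small values of the concentration parameter alpha favor a few dominant mixture components (nearly unimodal errors), while large values produce many components, which is how the DP mixture adapts to skewed or multimodal error distributions.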
To complete the covariate measurement error model specified above, we also need to define a model for the true covariates X_ij (j = 1, …, m_i). Its parameters, the regression coefficients and the variance σ², are assigned prior distributions whose hyperparameters are assumed to be given by prior information. From the joint probability density function given above and these priors, we can use Bayesian methods to draw statistical inferences on the unknown parameters. In addition, we use Gibbs sampling combined with the Metropolis-Hastings algorithm to analyze the measurement error model with longitudinal data, and thereby obtain the posterior distribution of the parameters of interest.
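The MH-within-Gibbs scheme can be sketched on a toy version of the model. The snippet below is a minimal illustration, not the paper's actual sampler: it assumes normal errors with known variances and a flat prior on β, alternating a conjugate Gibbs draw for the regression coefficients with a random-walk MH update for the latent true covariates — the step that remains necessary when the error distribution is non-normal:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate toy data: Y depends on the true covariate X, but only
# W = X + u is observed (classical measurement error).
n = 200
beta_true = np.array([1.0, 2.0])
sig_y, sig_u, mu_x, sig_x = 0.5, 0.4, 0.0, 1.0
X = rng.normal(mu_x, sig_x, n)
W = X + rng.normal(0.0, sig_u, n)
Y = beta_true[0] + beta_true[1] * X + rng.normal(0.0, sig_y, n)

def gibbs_mh(Y, W, iters=4000, burn=2000, step=0.3):
    """MH-within-Gibbs: conjugate normal draw for beta, random-walk
    MH update for each latent true covariate X_i."""
    Xc = W.copy()                      # initialize latent X at observed W
    keep = []
    for t in range(iters):
        # Gibbs step: beta | X, Y is normal under a flat prior
        A = np.column_stack([np.ones_like(Xc), Xc])
        AtA_inv = np.linalg.inv(A.T @ A)
        beta_hat = AtA_inv @ A.T @ Y
        beta = rng.multivariate_normal(beta_hat, sig_y**2 * AtA_inv)
        # MH step: propose X_i' ~ N(X_i, step^2), accept by posterior ratio
        prop = Xc + step * rng.normal(size=len(Xc))
        def logpost(x):
            return (-(Y - beta[0] - beta[1] * x)**2 / (2 * sig_y**2)
                    - (W - x)**2 / (2 * sig_u**2)
                    - (x - mu_x)**2 / (2 * sig_x**2))
        accept = np.log(rng.uniform(size=len(Xc))) < logpost(prop) - logpost(Xc)
        Xc = np.where(accept, prop, Xc)
        if t >= burn:
            keep.append(beta)
    return np.mean(keep, axis=0)       # posterior means of (beta_0, beta_1)

beta_est = gibbs_mh(Y, W)
```

Because the latent X_i are sampled jointly with β, the resulting slope estimate does not suffer the attenuation that a naive regression of Y on W would exhibit.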

Simulation and Bayesian Estimation
To test the feasibility of the Bayesian method when the measurement errors ε_ij follow a variety of distributions, we conducted four simulation studies. To study the sensitivity of the Bayesian estimates to the prior, we consider three prior assumptions (Types A, B, and C) for the parameters. In each run we discard the first 5000 iterations of all parameters as burn-in and collect the 5000 draws that follow; 100 replicate data sets are generated, and Markov chain Monte Carlo (MCMC) sampling from the posterior distribution of the full data is used to evaluate the Bayesian estimates. The results under these assumptions and the three prior designs are given in Tables 1-4, where 'Bias' is the absolute value of the difference between the true value and the mean of the parameter estimates over the 100 replicates, and 'RMS' is the root mean square error of the parameter estimates about the true values over the 100 replicates. We also plot the posterior densities of the parameter estimates for Simulation 4 under the Type C prior in Fig. 1.

International Journal of Applied Physics and Mathematics
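The Bias and RMS criteria can be computed from the replicate estimates as follows (a minimal sketch in which simulated draws stand in for the 100 replicate posterior-mean estimates; the true value and spread are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
true_value = 2.0
# Each entry stands in for the posterior-mean estimate of one parameter
# from one of 100 replicated data sets.
estimates = true_value + 0.05 * rng.normal(size=100)

bias = abs(estimates.mean() - true_value)            # |mean estimate - truth|
rms = np.sqrt(np.mean((estimates - true_value)**2))  # root mean square error
```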

Conclusion
From Tables 1-4 and Fig. 1 we conclude that (i) Bayesian estimation of the model is reasonable and accurate regardless of the error distribution and the prior assumptions, since the Bias values of the unknown parameters are below 0.1 and the RMS values are below 0.2; (ii) the Dirichlet process prior is generally flexible enough to capture the characteristics of the various distributional assumptions on the measurement errors; and (iii) the proposed method estimates the distribution of the measurement errors well.