Decomposition of the Random Error Vector of a General Linear Model

Jaesung Choi
Keimyung University



Abstract

This paper deals with the decomposition of a random error vector in order to identify how it is related to the expected value of an observation vector under a general linear sample model, since the error vector is defined as the deviation of the observation vector from its expected value. The main idea is that a random error vector can be decomposed into two orthogonal component vectors: one lying in the vector space generated by the coefficient matrix of the unknown parameter vector, and the other in its orthogonal complement. Two related topics are discussed: partitioning the observation vector and constructing its covariance structure. The paper also shows why a projection method would be preferred to a least squares method.
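The decomposition described in the abstract can be illustrated numerically. The sketch below (not the paper's own notation; `X` and `y` are hypothetical data) projects a vector onto the column space of a coefficient matrix and onto its orthogonal complement, and checks that the two components are orthogonal and sum back to the original vector.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 2))   # hypothetical coefficient matrix
y = rng.normal(size=6)        # hypothetical observation vector

# Orthogonal projector onto C(X); pinv also handles a rank-deficient X'X.
P = X @ np.linalg.pinv(X.T @ X) @ X.T
y_col = P @ y                 # component in the space generated by X
y_perp = y - y_col            # component in the orthogonal complement

# The two components are orthogonal and reconstruct y.
assert np.isclose(y_col @ y_perp, 0.0)
assert np.allclose(y_col + y_perp, y)
```

Under the usual linear-model assumptions, the second component is the residual vector produced by a projection, which is one way to see the connection the abstract draws between projection and least squares.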



Funding

No external funding was declared for this work.

Conflict of Interest

The authors declare no conflict of interest.

Ethical Approval

No ethics committee approval was required for this article type.

Data Availability

Not applicable for this article.

How to Cite This Article

Jaesung Choi. 2026. "Decomposition of the Random Error Vector of a General Linear Model". Global Journal of Science Frontier Research - F: Mathematics & Decision GJSFR-F Volume 23 (GJSFR Volume 23 Issue F2).


GJSFR Volume 23 Issue F2
Pg. 31-37
Journal Specifications

Crossref Journal DOI 10.17406/GJSFR

Print ISSN 0975-5896

e-ISSN 2249-4626

Classification
GJSFR-F Classification: MSC 2010: 15A03
Version of record

v1.2

Issue date

April 13, 2023

Language
en


