Possibility of Decrease in a Level of Data Correlation During Processing Small Samples Using Neural Networks by Generating New Statistic Tests
Published in: Journal of Physics: Conference Series, 2020-05, Vol. 1546 (1), p. 012080
Main Authors:
Format: Article
Language: English
Subjects:
Summary: Statistical tests created in the 20th century can each be mapped to an equivalent artificial neuron. As a result, a network of dozens of artificial neurons, combining dozens of known statistical tests, can be used to validate the normality hypothesis. The quality of the decisions made by such a network depends on the number of neurons (tests) used. This gives rise to the task of creating new statistical tests (neurons) whose decisions, first and foremost, correlate weakly with those of the known tests. The paper presents a forecast of the attainable confidence probabilities for validating the normality hypothesis on small samples using a network of 21 artificial neurons, each mapped to one traditional statistical test. When new tests (to be created in the 21st century) are used, the correlation between test decisions is expected to drop substantially, which should allow an approximately 10-fold reduction in error probabilities. (A minimal illustrative sketch of this combination scheme follows the record below.)
ISSN: 1742-6588 (print), 1742-6596 (online)
DOI: 10.1088/1742-6596/1546/1/012080
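
The record does not specify which 21 tests the authors used or how their network combines the decisions. The following is a minimal sketch of the general scheme the abstract describes, assuming a handful of classical normality tests from scipy.stats as the "neurons" and a simple majority vote as the combination rule; both choices are illustrative assumptions, not the paper's configuration.

```python
# Illustrative sketch only: each "neuron" wraps a classical normality test
# and outputs a binary accept/reject decision; the network combines the
# decisions. The paper's actual 21 tests and combination rule are not given
# in this record, so four well-known tests and a majority vote stand in.
import numpy as np
from scipy import stats

ALPHA = 0.05  # assumed per-test significance level


def shapiro_neuron(x):
    # Shapiro-Wilk test: 1 if the normality hypothesis is accepted.
    return stats.shapiro(x).pvalue > ALPHA


def dagostino_neuron(x):
    # D'Agostino-Pearson omnibus test (skewness + kurtosis).
    return stats.normaltest(x).pvalue > ALPHA


def jarque_bera_neuron(x):
    return stats.jarque_bera(x).pvalue > ALPHA


def ks_neuron(x):
    # Kolmogorov-Smirnov against N(0, 1) after standardizing the sample.
    z = (x - x.mean()) / x.std(ddof=1)
    return stats.kstest(z, "norm").pvalue > ALPHA


NEURONS = [shapiro_neuron, dagostino_neuron, jarque_bera_neuron, ks_neuron]


def network_accepts_normality(x):
    """Combine per-test decisions; a majority vote stands in for the
    paper's (unspecified) neural combination rule."""
    votes = sum(neuron(x) for neuron in NEURONS)
    return votes > len(NEURONS) / 2


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sample = rng.normal(size=21)  # small sample, as in the abstract
    print(network_accepts_normality(sample))                     # normal data
    print(network_accepts_normality(rng.exponential(size=21)))   # skewed data
```

Because the decisions of classical tests on the same small sample are strongly correlated, adding more such neurons to a vote of this kind yields diminishing returns; this is exactly the motivation the abstract gives for designing new, weakly correlated tests.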