Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review
Differential Privacy for Statistical Data of Educational Institutions. / Podsevalov, Ivan; Podsevalov, Alexei; Korkhov, Vladimir.
Computational Science and Its Applications – ICCSA 2022 Workshops. 2022. p. 603-615 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 13380).
TY - GEN
T1 - Differential Privacy for Statistical Data of Educational Institutions
AU - Podsevalov, Ivan
AU - Podsevalov, Alexei
AU - Korkhov, Vladimir
N1 - Conference code: 22
PY - 2022
Y1 - 2022
N2 - Electronic tools for managing the educational process are gaining popularity, and many applications for such record keeping have appeared in recent years. As a result, the protection of personal data requires increased attention. The coronavirus pandemic has led to a significant increase in the amount of data handled remotely, so information security must be ensured continuously for a wider range of staff. In this article we consider differential privacy, a relatively new mechanism designed to help protect personal data. Differential privacy provides a strict mathematical definition of the risks incurred when sensitive data are made publicly accessible. By estimating the probabilities of possible data leakage, one can build an appropriate policy for adding noise to publicly released statistics. This approach makes it possible to strike a compromise between preserving general patterns in the data and protecting the personal data of the participants in the educational process.
AB - Electronic tools for managing the educational process are gaining popularity, and many applications for such record keeping have appeared in recent years. As a result, the protection of personal data requires increased attention. The coronavirus pandemic has led to a significant increase in the amount of data handled remotely, so information security must be ensured continuously for a wider range of staff. In this article we consider differential privacy, a relatively new mechanism designed to help protect personal data. Differential privacy provides a strict mathematical definition of the risks incurred when sensitive data are made publicly accessible. By estimating the probabilities of possible data leakage, one can build an appropriate policy for adding noise to publicly released statistics. This approach makes it possible to strike a compromise between preserving general patterns in the data and protecting the personal data of the participants in the educational process.
KW - Differential privacy
KW - Education
KW - Security
KW - Statistics
UR - http://www.scopus.com/inward/record.url?scp=85135932105&partnerID=8YFLogxK
UR - https://www.mendeley.com/catalogue/a29f4560-b96f-3216-b4ba-975ffe8cb1da/
U2 - 10.1007/978-3-031-10542-5_41
DO - 10.1007/978-3-031-10542-5_41
M3 - Conference contribution
AN - SCOPUS:85135932105
SN - 9783031105418
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 603
EP - 615
BT - Computational Science and Its Applications – ICCSA 2022 Workshops
T2 - 22nd International Conference on Computational Science and Its Applications, ICCSA 2022
Y2 - 4 July 2022 through 7 July 2022
ER -
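
The abstract refers to adding noise to publicly released statistics. As a minimal illustrative sketch (not code from the cited paper), the standard Laplace mechanism for epsilon-differential privacy on a numeric query could look as follows in Python; the sensitivity and epsilon values below are illustrative assumptions.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Return a differentially private estimate of a numeric statistic.

    Noise is drawn from Laplace(0, sensitivity / epsilon), the standard
    mechanism for epsilon-differential privacy on numeric queries.
    """
    rng = rng or np.random.default_rng()
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

# Hypothetical example: publishing the count of students enrolled in a course.
# A counting query has sensitivity 1 (adding or removing one student changes
# the result by at most 1); epsilon controls the privacy/utility trade-off.
true_count = 42
noisy_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
print(f"published (noisy) count: {noisy_count:.1f}")
```

Smaller epsilon means stronger privacy but noisier published statistics, which is the compromise between data utility and personal-data protection described in the abstract.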