
Selecting an Effective Entropy Estimator for Short Sequences of Bits and Bytes with Maximum Entropy.

Lianet Contreras Rodríguez1, Evaristo José Madarro-Capó1, Carlos Miguel Legón-Pérez1, Omar Rojas2, Guillermo Sosa-Gómez2.   

Abstract

Entropy makes it possible to measure the uncertainty about an information source from the distribution of its output symbols. It is known that the maximum Shannon entropy of a discrete information source is reached when its symbols follow a uniform distribution. Such sources are of great interest in cryptography, since they allow the highest security standards to be reached. In this work, the most effective estimator is selected for estimating the entropy of short samples of bytes and bits with maximum entropy. To this end, 18 estimators were compared, and results concerning previously published comparisons between these estimators are discussed. The most suitable estimator is determined experimentally, based on its bias and its mean square error on short samples of bytes and bits.
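The bias the abstract refers to can be seen with the simplest estimator, the plug-in (maximum-likelihood) estimate, which substitutes observed symbol frequencies into the Shannon formula. The sketch below is illustrative only and is not one of the 18 estimators compared in the paper; the function name `plugin_entropy` and the sample sizes are assumptions for the demonstration. On a short sample from a uniform byte source (true entropy 8 bits/symbol), many of the 256 symbols are unobserved, so the estimate comes out low; a longer sample largely removes the bias.

```python
import math
import random
from collections import Counter

def plugin_entropy(sample):
    """Plug-in (maximum-likelihood) Shannon entropy estimate, in bits.

    Illustrative sketch: substitutes observed relative frequencies
    into H = -sum p * log2(p); known to be biased low on short samples.
    """
    n = len(sample)
    counts = Counter(sample)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)  # deterministic demo

# A uniform byte source has true entropy log2(256) = 8 bits per symbol.
short = [random.randrange(256) for _ in range(256)]    # undersampled: n = 256
long_ = [random.randrange(256) for _ in range(65536)]  # well sampled: n = 65536

print(plugin_entropy(short))  # noticeably below 8 bits (negative bias)
print(plugin_entropy(long_))  # close to 8 bits
```

The gap between the two estimates is the undersampling bias that motivates comparing more sophisticated estimators on short sequences.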


Keywords:  cryptography; entropy; estimation; randomness; undersample

Year:  2021        PMID: 33946438     DOI: 10.3390/e23050561

Source DB:  PubMed          Journal:  Entropy (Basel)        ISSN: 1099-4300            Impact factor:   2.524


Related references: 14 in total

1.  Entropy estimation of symbol sequences.

Authors:  Thomas Schürmann; Peter Grassberger
Journal:  Chaos       Date:  1996-09       Impact factor: 3.642

2.  A Note on Entropy Estimation.

Authors:  Thomas Schürmann
Journal:  Neural Comput       Date:  2015-08-27       Impact factor: 2.026

3.  (Review) A Tutorial for Information Theory in Neuroscience.

Authors:  Nicholas M Timme; Christopher Lapish
Journal:  eNeuro       Date:  2018-09-11

4.  Estimating functions of probability distributions from a finite set of samples.

Authors: 
Journal:  Phys Rev E Stat Phys Plasmas Fluids Relat Interdiscip Topics       Date:  1995-12

5.  Efficient Randomness Certification by Quantum Probability Estimation.

Authors:  Yanbao Zhang; Honghao Fu; Emanuel Knill
Journal:  Phys Rev Res       Date:  2020

6.  Nonparametric estimation of Kullback-Leibler divergence.

Authors:  Zhiyi Zhang; Michael Grabchak
Journal:  Neural Comput       Date:  2014-07-24       Impact factor: 2.026

7.  minet: A R/Bioconductor package for inferring large transcriptional networks using mutual information.

Authors:  Patrick E Meyer; Frédéric Lafitte; Gianluca Bontempi
Journal:  BMC Bioinformatics       Date:  2008-10-29       Impact factor: 3.169

8.  ARACNE: an algorithm for the reconstruction of gene regulatory networks in a mammalian cellular context.

Authors:  Adam A Margolin; Ilya Nemenman; Katia Basso; Chris Wiggins; Gustavo Stolovitzky; Riccardo Dalla Favera; Andrea Califano
Journal:  BMC Bioinformatics       Date:  2006-03-20       Impact factor: 3.169

9.  Estimating mutual information using B-spline functions--an improved similarity measure for analysing gene expression data.

Authors:  Carsten O Daub; Ralf Steuer; Joachim Selbig; Sebastian Kloska
Journal:  BMC Bioinformatics       Date:  2004-08-31       Impact factor: 3.169

10.  A Chaotic-Based Encryption/Decryption Framework for Secure Multimedia Communications.

Authors:  Ibrahim Yasser; Mohamed A Mohamed; Ahmed S Samra; Fahmi Khalifa
Journal:  Entropy (Basel)       Date:  2020-11-04       Impact factor: 2.524

