On randomness of compressed data using non-parametric randomness tests

Bibliographic Details
Main Authors: Al-Khayyat, Kamal A., Alshaikhli, Imad Fakhri Taha, Vijayakumar, V.
Format: Article
Language: English
Published: Lembaga Penerbitan dan Publikasi Ilmiah (LPPI), Universitas Ahmad Dahlan, 2018
Subjects:
Online Access: http://irep.iium.edu.my/62964/
http://irep.iium.edu.my/62964/1/62964_On%20randomness%20of%20compressed%20data%20using.pdf
http://irep.iium.edu.my/62964/2/62964_On%20randomness%20of%20compressed%20data%20using_SCOPUS.pdf
Description
Summary: Four randomness tests were used to test the outputs (compressed files) of four lossless compression algorithms: JPEG-LS and JPEG-2000 are image-dedicated algorithms, while 7z and Bzip2 are general-purpose algorithms. The relationship between the results of the randomness tests and the compression ratio was investigated. This paper reports the important relationship between the statistical information behind these tests and the compression ratio. It shows that this statistical information is almost the same, at least for the four lossless algorithms under test. This information shows that 50% of the compressed data consists of groupings of runs, 50% of it has positive signs when comparing adjacent values, 66% of the files contain turning points, and, using the Cox-Stuart test, 25% of the file gives positive signs, which reflects the similarity aspects of compressed data. Regarding the relationship between the compression ratio and this statistical information, the paper also shows that the greater the values of these statistics, the greater the compression ratio.
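
To make the four statistics concrete, the following is a minimal sketch (not taken from the paper) of how the turning-point count, difference-sign count, runs count, and Cox-Stuart sign count could be computed over the bytes of a compressed file. The definitions follow standard non-parametric test formulations; the file name and helper names are illustrative assumptions.

from pathlib import Path

def turning_points(xs):
    # A turning point occurs where x[i-1] < x[i] > x[i+1] or x[i-1] > x[i] < x[i+1].
    return sum(
        1 for i in range(1, len(xs) - 1)
        if (xs[i - 1] < xs[i] > xs[i + 1]) or (xs[i - 1] > xs[i] < xs[i + 1])
    )

def positive_difference_signs(xs):
    # Difference-sign statistic: count adjacent pairs where the later value is larger.
    return sum(1 for a, b in zip(xs, xs[1:]) if b > a)

def runs_above_below_median(xs):
    # Runs statistic: number of maximal runs of values above/below the median.
    med = sorted(xs)[len(xs) // 2]
    signs = [x > med for x in xs if x != med]  # drop values tied with the median
    return 1 + sum(1 for a, b in zip(signs, signs[1:]) if a != b) if signs else 0

def cox_stuart_positive_signs(xs):
    # Cox-Stuart statistic: pair the first half with the second half and
    # count pairs where the later value is larger (a "positive sign").
    half = len(xs) // 2
    return sum(1 for a, b in zip(xs[:half], xs[-half:]) if b > a)

if __name__ == "__main__":
    data = Path("example.7z").read_bytes()  # hypothetical compressed file
    xs = list(data)
    n = len(xs)
    print(f"turning points:       {turning_points(xs) / (n - 2):.2%} of interior bytes")
    print(f"positive diff. signs: {positive_difference_signs(xs) / (n - 1):.2%} of adjacent pairs")
    print(f"runs above/below med: {runs_above_below_median(xs)}")
    print(f"Cox-Stuart + signs:   {cox_stuart_positive_signs(xs) / (n // 2):.2%} of paired values")

Under the paper's reported relationship, larger values of these statistics for a given compressed file would be expected to accompany a larger compression ratio.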