Entropy is a concept from information theory and statistics used to measure the degree of uncertainty or disorder in a system. In the context of data, entropy indicates how diverse or complex the information is.
Entropy in simple terms
The greater the entropy, the greater the uncertainty or diversity in the data.
Low entropy indicates that the data is more ordered or less diverse.
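The two statements above can be made concrete with Shannon entropy, the standard measure from information theory. A minimal sketch (the function name and sample strings are illustrative, not from the original post):

```python
import math
from collections import Counter

def shannon_entropy(data: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Ordered data: a single repeated symbol gives zero entropy (no uncertainty).
print(shannon_entropy("aaaaaaaa"))

# Diverse data: 8 equally likely symbols give the maximum 3 bits per symbol.
print(shannon_entropy("abcdefgh"))  # → 3.0
```

The more evenly spread the symbol frequencies, the higher the value, which matches the intuition that diverse data carries more uncertainty.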
Example of entropy
A good example of entropy is archiving text files. A text file with 1000 identical lines containing the word 'thisone' has low entropy, so the compression ratio during archiving will be high (identical data is easier to organize and compress).
A text file that contains 1000 different words has higher entropy, and the compression ratio will be lower.
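The archiving example is easy to verify directly. A minimal sketch using Python's standard `zlib` compressor (the second "diverse" file is a hypothetical stand-in, generated rather than taken from the post):

```python
import zlib

# Low-entropy file: the same word repeated 1000 times, as in the example.
low_entropy = ("thisone\n" * 1000).encode()

# Higher-entropy file: 1000 distinct words (a generated stand-in for varied text).
high_entropy = "\n".join(f"word{i}" for i in range(1000)).encode()

low_ratio = len(zlib.compress(low_entropy)) / len(low_entropy)
high_ratio = len(zlib.compress(high_entropy)) / len(high_entropy)

print(f"repeated text:  {low_ratio:.1%} of original size")
print(f"distinct words: {high_ratio:.1%} of original size")
# The repeated file shrinks far more than the diverse one.
```

Running this shows the repeated file compressing to a tiny fraction of its original size, while the file with distinct words compresses much less, exactly as the entropy argument predicts.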