Understanding Entropy: A Deep Dive into Information Theory and Its Application in the Financial Market

Brijesh Mishra

Introduction

In many fields, from physics to machine learning, entropy plays a crucial role in understanding the unpredictability and complexity of systems. Originally a thermodynamic concept, entropy was adapted in information theory to quantify uncertainty, or “randomness,” in a set of possibilities. In this article, we will explore the concept of entropy, examine the relationship between surprise and probability, and break down the entropy formula piece by piece. Finally, we’ll see how these concepts apply to real-life scenarios like the stock market.

What is Entropy?

At its core, entropy is a measure of uncertainty or surprise within a distribution. While the mathematical formula for entropy may seem unclear at first glance, by the end of this article, we’ll break it down and clarify each component, making the concept and formula much easier to understand.

H(X) = -\sum_{i=1}^{n} p(x_i) \log_2 p(x_i)

Where:

  • H(X) is the entropy of the random variable X
  • n is the number of possible outcomes
  • p(x_i) is the probability of the i-th outcome
  • \log_2 measures the result in bits
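
To make the formula concrete, here is a minimal Python sketch that computes Shannon entropy for a discrete distribution. The coin-flip distributions below are illustrative assumptions chosen to show the extremes, not data from any particular market:

```python
import math

def shannon_entropy(probabilities):
    """Compute the Shannon entropy (in bits) of a discrete distribution."""
    # Terms with p = 0 contribute nothing, since p * log2(p) -> 0 as p -> 0.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain: entropy = 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin is far more predictable: entropy < 1 bit.
print(shannon_entropy([0.9, 0.1]))   # ~0.469

# A certain outcome carries no uncertainty at all: entropy = 0.
print(shannon_entropy([1.0]))        # 0.0
```

Notice how entropy peaks when the outcomes are equally likely and falls toward zero as one outcome comes to dominate.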

Surprise as the Inverse of Probability

To understand entropy, we must first explore surprise, which is inversely related to probability.

  • When we encounter something with a high probability (e.g., your morning routine unfolding exactly as expected), it carries very little surprise.
  • When we encounter something with a low probability (e.g., a sudden market crash), it is highly surprising.
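
This inverse relationship is commonly formalized as the information content, or surprise, of an event: I(x) = \log_2(1 / p(x)). The rarer the event, the more bits of surprise it carries, and entropy turns out to be the probability-weighted average of this surprise. A small sketch, using illustrative probabilities rather than real market figures:

```python
import math

def surprise(p):
    """Information content ('surprise') of an event with probability p, in bits."""
    return math.log2(1 / p)

# High-probability events are barely surprising...
print(surprise(0.99))   # ~0.014 bits

# ...while rare events carry a great deal of information.
print(surprise(0.01))   # ~6.64 bits

# Entropy is the average surprise, weighted by probability:
probs = [0.9, 0.1]
avg_surprise = sum(p * surprise(p) for p in probs)
print(avg_surprise)     # ~0.469 bits, matching shannon_entropy above
```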
