This is an idea I had about 10 years ago and couldn’t find written down anywhere.
In cryptography, “indistinguishability” is perhaps the most important notion. Two (sequences of) distributions $X_n$, $Y_n$ are indistinguishable with respect to a family of tests $\mathcal{F}$ if for every $f \in \mathcal{F}$, the difference $|\mathsf{Pr}[f(X_n) = 1] - \mathsf{Pr}[f(Y_n) = 1]|$ goes to 0 quickly in $n$ (i.e., is negligible). In other words, no test $f$ can reliably tell whether its input comes from $X$ or from $Y$. Perhaps a more intuitive (and equivalent) definition is that the probability that a test $f$ correctly identifies whether its input is from $X$ or from $Y$ is at most very slightly above 1/2. In other words, it barely beats a random guess.
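To make the two formulations concrete, here is a toy sketch (the distributions and the test are made up for illustration, nothing cryptographic): it estimates the advantage $|\mathsf{Pr}[f(X) = 1] - \mathsf{Pr}[f(Y) = 1]|$ by sampling, then plays the guessing game, whose success rate comes out to $1/2 + \text{advantage}/2$.

```python
import random

# Toy distributions: X is uniform over {0, ..., 255}, Y is uniform over even bytes only.
def sample_X():
    return random.randrange(256)

def sample_Y():
    return 2 * random.randrange(128)

# A simple test: output 1 if the sample is odd.
def f(sample):
    return 1 if sample % 2 == 1 else 0

TRIALS = 100_000

# Estimate |Pr[f(X) = 1] - Pr[f(Y) = 1]|.
p_x = sum(f(sample_X()) for _ in range(TRIALS)) / TRIALS
p_y = sum(f(sample_Y()) for _ in range(TRIALS)) / TRIALS
advantage = abs(p_x - p_y)

# Equivalent guessing game: a secret bit b picks the distribution, and the
# test tries to guess b.  Its success probability is 1/2 + advantage/2.
correct = 0
for _ in range(TRIALS):
    b = random.randrange(2)
    sample = sample_X() if b == 0 else sample_Y()
    guess = 0 if f(sample) == 1 else 1   # an odd sample can only come from X
    correct += (guess == b)

print("advantage  ~", round(advantage, 3))        # ~0.5 for these toy distributions
print("guess rate ~", round(correct / TRIALS, 3)) # ~1/2 + advantage/2 = 0.75
```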
Often we talk about distributions which are indistinguishable with respect to all efficient (i.e., polynomial-time) algorithms. For example, the basic definition of a secure encryption scheme requires that the encryption of one message be indistinguishable from the encryption of any other.
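As a minimal illustration, consider a one-time pad on a single byte. This is stronger than the computational setting above: encryptions of any two messages are *identically* distributed, so no test at all, efficient or not, can tell them apart.

```python
import os
from collections import Counter

# One-time pad on a single byte: ciphertext = message XOR uniform key.
def encrypt(key: int, message: int) -> int:
    return message ^ key

def ciphertext_distribution(message: int, trials: int = 50_000) -> Counter:
    counts = Counter()
    for _ in range(trials):
        key = os.urandom(1)[0]              # fresh uniform key per encryption
        counts[encrypt(key, message)] += 1
    return counts

# Encryptions of two different messages: both empirical distributions are
# (close to) uniform over all 256 byte values, differing only by sampling noise.
dist_a = ciphertext_distribution(ord('a'))
dist_b = ciphertext_distribution(ord('b'))
print(max(dist_a.values()) - min(dist_a.values()))
print(max(dist_b.values()) - min(dist_b.values()))
```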
Obfuscation turns semantic equality (two programs that compute the same function) into syntactic non-inequality (no efficient test can tell their obfuscations apart), a double-negated form of equality.
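For concreteness, here is a toy illustration of the "semantic equality" half only; the obfuscator itself is left abstract, since no short sketch can do it justice. The two functions below are syntactically quite different programs that compute the same predicate, and an indistinguishability obfuscator would be required to map them to programs that no efficient test can tell apart.

```python
# Two syntactically different programs computing the same predicate,
# "is n a multiple of 3?", on 8-bit inputs.
def prog_a(n: int) -> bool:
    return n % 3 == 0

def prog_b(n: int) -> bool:
    # Repeatedly sum decimal digits; a number is divisible by 3
    # iff its digit sum is.
    while n >= 10:
        n = sum(int(d) for d in str(n))
    return n in (0, 3, 6, 9)

# Semantic equality: identical behaviour on every input in the domain.
assert all(prog_a(n) == prog_b(n) for n in range(256))

# An indistinguishability obfuscator O would have to make O(prog_a) and
# O(prog_b) indistinguishable to every efficient test, precisely because
# the two programs are semantically equal.
```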