“If you try to improve the performance of a system of people, machines, and procedures by setting numerical goals for the improvement of individual parts of the system, the system will defeat your efforts and you will pay a price where you least expect it.” – Myron Tribus
Myron Tribus was the source of the anecdote about why Shannon chose to name his measure of information after the thermodynamic concept of entropy. Furthermore, Shannon expressed to him “… misgivings about using his definition of entropy for applications beyond communication channels”.
I don’t know whether Shannon knew much about thermodynamics – he was an engineer and mathematician. Nor do I know whether Von Neumann (who certainly did know physics) suggested the name “entropy” based only on the superficial syntactic similarity between Shannon’s −∑ pᵢ log pᵢ and Boltzmann’s −Nk ∑ pᵢ log pᵢ … or whether he grokked a deeper connection of the kind E. T. Jaynes later tried to pursue.
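For reference, here is a minimal side-by-side of the two expressions in their standard textbook forms – the choice of log base, the N factor, and the symbol names are the usual conventions, not anything taken from Tribus’s account:

```latex
% Shannon's information entropy of a distribution {p_i}
% (base-2 log gives the value in bits)
H = -\sum_i p_i \log_2 p_i

% Boltzmann/Gibbs statistical entropy for N particles,
% where k is Boltzmann's constant (N = 1 recovers the Gibbs form)
S = -N k \sum_i p_i \ln p_i
```

Up to the constant factor Nk and the base of the logarithm, the two formulas are literally the same functional of the probabilities – which is exactly the “superficial syntactic similarity” at issue.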