I didn't understand probability until I read E. T. Jaynes' Probability Theory: The Logic of Science.

It builds up the basic blocks of probability very, very slowly and never hand-waves anything. It presents the "Bayesian" view of probability, but that's honestly the easier one to understand.
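For what it's worth, the core of that Bayesian view is just updating degrees of belief with Bayes' rule. A minimal sketch with made-up numbers (a diagnostic test, not an example from the book):

```python
# Bayes' rule: P(H|D) = P(D|H) * P(H) / P(D)
# Illustrative numbers, not from Jaynes: a test that is 99% sensitive
# and 95% specific, for a condition with 1% prior prevalence.
prior = 0.01              # P(H): prior degree of belief in the hypothesis
p_pos_given_h = 0.99      # P(D|H): chance of a positive result if H is true
p_pos_given_not_h = 0.05  # P(D|~H): false-positive rate

# Total probability of observing the data (a positive result)
p_pos = p_pos_given_h * prior + p_pos_given_not_h * (1 - prior)

# Posterior: the updated degree of belief after seeing the data
posterior = p_pos_given_h * prior / p_pos
print(f"{posterior:.3f}")  # ~0.167: a positive test still leaves H unlikely
```

The surprise that a 99%-sensitive test yields only a ~17% posterior is exactly the kind of thing the book makes intuitive rather than mysterious.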

I took a course on Applied Bayesian Statistics taught by David Draper in grad school, and we covered Bayesian Data Analysis (Gelman et al.): https://www.amazon.com/dp/0521592712/ref=cm_sw_r_em_apa_i_v3...

The former is a highly recommended book since it's very comprehensive, builds everything from the ground up, and was the basis for the entire course. The latter is a beast of its own; we covered only what was effectively its first chapter as part of the course.

I might add that what made me understand these concepts was the writings of physicist and probability theorist E. T. Jaynes, especially his unpublished manuscripts: http://bayes.wustl.edu/etj/node2.html

I think if he had been alive at the right time, these would have been blog posts. Before reading them, I had taken an intro class in thermodynamics which left me completely confused.

Read "The Evolution of Carnot's Principle" ( http://bayes.wustl.edu/etj/articles/ccarnot.pdf ) for incredible insights on how Carnot pioneered thermodynamics by trying to optimize steam engines.

Also, if you think you dislike statistics and probability but you like math in general, his book might change your mind: Probability Theory: The Logic of Science.
Free draft: http://omega.albany.edu:8008/JaynesBook.html

In fact, understanding his stance on probabilities, and the mind projection fallacy in particular, might be a prerequisite to understanding thermodynamics. The fundamental point is that entropy is not really a direct property of matter; it is more of a meta-property about knowledge or information, which here means correlations across aggregate matter.
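A toy illustration of that point: Shannon entropy is computed from a probability distribution, i.e. from a state of knowledge, not from the physical object itself. The distributions below are invented for illustration, not taken from Jaynes' text:

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The same 8-microstate system, described under two states of knowledge.
# Knowing nothing, all microstates are equally plausible...
uninformed = [1/8] * 8
# ...while a measurement that nearly pins down the state sharpens the
# distribution, and the entropy we assign drops accordingly.
informed = [0.9, 0.1, 0, 0, 0, 0, 0, 0]

print(shannon_entropy(uninformed))  # 3.0 bits
print(shannon_entropy(informed))    # ~0.469 bits
```

Nothing about the matter changed between the two lines; only the describer's information did, which is the sense in which entropy is "about knowledge."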

I can't find the book with that title; do you have a link or ISBN? http://www.amazon.com/Probability-Theory-Science-T-Jaynes/dp... is the closest match, but it does not seem to match your description.

https://www.amazon.com/Probability-Theory-Science-T-Jaynes/d...

And there is a website with more information and a collection of his papers:

https://bayes.wustl.edu/

https://bayes.wustl.edu/etj/node1.html

http://bayes.wustl.edu/etj/prob/book.pdf
