All Comments
TopTalkedBooks posted at August 19, 2017

Unity has already been well covered here by other redditors.

Nevertheless, if you are also in the process of learning how to program, I would like to contribute with two pointers.

Code: The Hidden Language of Computer Hardware and Software, by Charles Petzold

For the very basics of computer science

Inside C#, by Tom Archer and Andrew Whitechapel

This is quite old now, but it's very easy to read and good for understanding the language

If you are not in the US, just google around for other sources.

Happy hacking

TopTalkedBooks posted at August 19, 2017

Some more specific replies already, but if you want a great book that starts from the ground up and builds the concepts of how a computer works, I can't recommend this highly enough:

Code: The Hidden Language of Computer Hardware and Software

TopTalkedBooks posted at August 19, 2017
Recently, I've come to know "Code: The Hidden Language of Computer Hardware and Software"[1], and I really enjoyed it. I don't think it's necessarily the first book every programmer should read, but I do think it's a book every programmer should read. It's an easy read, it's fun, and really does provide what it promises. Highly recommended.


TopTalkedBooks posted at August 19, 2017
Haven't read the book you're recommending, but I feel it's more or less close to Code[0] by Charles Petzold, which in itself is a fascinating read.


TopTalkedBooks posted at August 19, 2017
Some resources on making tiny Hello World programs down to the kernel level that may be useful:

A wee bit heavy, but it's comprehensive. It deals with what happens when you run code and how the architecture of the computer works (by and large), including at the logic level:

If you want to go lower (and higher), look at Understanding the Linux Kernel for a good understanding of how an OS is put together, with specific examples, i.e. Linux.

Code, by Petzold, deals with logic and computers from the ground up. It starts with relays and builds them up into gates and usable arithmetic blocks.

The physics is fairly simple, at least from a CRT or LED display perspective. It gets trickier when dealing with interconnecting microprocessors, because a good chunk of that is vendor-specific.

I think this kind of project is well suited to a guide on how to build a computer from the ground up: starting with logic gates, writing a real-time OS, and developing a scripting language that will run and compile on it. Then you can skip a lot of largely extraneous stuff and still have a solid understanding of how the hardware works.

TopTalkedBooks posted at August 19, 2017

I'll throw some out there, these are books I revere:

The Evolution of God by Robert Wright

  • Traces the evolution of religion from prehistoric times and attempts to explain why "primitive" or "animalistic" religion came up at first, then polytheism, then monotheism
  • This is the book I'd give militant atheists and skeptics like us readers of SSC (hm actually I wonder what the recent demographics survey said about how many of us are religious, but I digress)—this is the book I'd give us to explain why religion is not going anywhere anytime soon, and why I actually think it's going to evolve again and come back stronger than ever
  • If you want a sneak peek of what I mean by that, and to see that I'm not being hyperbolic, from Robert Wright:

> I don’t think a “materialist” account of religion’s origin, history, and future — like the one I’m giving here — precludes the validity of a religious worldview. In fact, I contend that the history of religion presented in this book, materialist though it is, actually affirms the validity of a religious worldview; not a traditionally religious worldview, but a worldview that is in some meaningful sense religious.
>
> It sounds paradoxical. On the one hand, I think gods arose as illusions, and that the subsequent history of the idea of god is, in some sense, the evolution of an illusion. On the other hand: (1) the story of this evolution itself points to the existence of something you can meaningfully call divinity; and (2) the “illusion,” in the course of evolving, has gotten streamlined in a way that moved it closer to plausibility. In both of these senses, the illusion has gotten less and less illusory.

I loved this book!

The Machinery of Life by David Goodsell

  • This book has very high quality computer generated graphics!
  • But don't let that give you the wrong impression... it's one of the most technical books I've read.
  • I'd say it's the equivalent of Code by Charles Petzold, which some of you engineers might know of. Code does an amazing job of explaining how computers work, from the very, very smallest details, all the way up.
  • The Machinery of Life does the same for microbiology. It was super trippy to read this book because I'd think about all of that stuff happening in my body as I was reading.

The Story of Art by E H Gombrich

  • If you've ever wondered what the point of modern art is or why it can be so ridiculous look no further than this book.
  • It also has high-quality photographs, so its 700-page read is maybe more like 500 pages?
  • But this book was such a beautiful read; very, very interesting. It starts at prehistoric times and goes through history, attempting to explain why and how art has changed the way it has.

The Unfolding of Language

  • The book traces the evolution of language through history
  • It was a very fascinating read, I'll share a tidbit with you:

"Going to" used to be an expression that related only to space. So we would say something like "I'm going to the lake." But then that sense of being in a new place in the future started to become more abstract, and "going to" became a way to refer to anything that will happen in the future, as in "Is it going to stop raining?" The way our minds make analogies like this is one of the ways language has come to be.

Far From the Tree by Andrew Solomon

  • Each chapter spends ~50-90ish? pages exploring an identity that can make a kid feel "far from the tree", such as deafness, dwarfism, schizophrenia, multiple severe disabilities.
  • This was a really, really touching book... and reading one of the topics it covers that I know very well affirmed for me that it's well researched. The reason it was so touching is that the author spent ten years interviewing hundreds of families, and there are so many poignant stories.
  • And while you're at it you get to learn about these different disabilities/identities.

Confront and Conceal by David E Sanger

  • The author of this book is the chief Washington correspondent for the New York Times and is an amazing journalist.
  • He writes about the Obama administration and explores foreign affairs around Pakistan and Afghanistan, Iran, Drones and Cyber warfare, the Arab Spring, and China and North Korea.
  • I loved this book a lot because it was such a well-written overview of, and sneak peek into, US foreign affairs during the time of Obama.

You can empathize with me now when I learn that someone absolutely loves non-fiction books too, only to hear them mention Malcolm Gladwell, hahaha.

TopTalkedBooks posted at August 19, 2017

There are definitely a lot of posts here for stressed out/overwhelmed folks that recommend the Time Management book. I wonder if it would be bad form to buy it for my boss? ;-)

Actually, just from the chapter titles and descriptions, it seems useful to a lot of my non-IT co-workers as well.

I hadn't heard of The Compassionate Geek but will add it to the list.

As someone that feels pretty deficient in their hardware knowledge, I figured I'd start at the basics and came across this book:

Code: The Hidden Language of Computer Hardware and Software -

Very engaging. Takes you from flicking flashlights on and off to the microprocessor. Might be a bit basic for a lot of folks but I like drilling down to get to essentials on things I don't know and thought it a good place to start.

TopTalkedBooks posted at August 19, 2017
I'm extremely grateful and was not at all expecting such an explanation.

I want to explain a few things.

Let me rephrase what I meant by "minimize the time wasting". You see, there is a lot of great advice available online. You ask something on a subreddit or here, and people will share great resources. I love this kind of learning. My concern is that sometimes these resources and this advice are given along the lines of "although it's not completely necessary, it'll still be an experience in itself".

The problem is that this kind of learning sometimes wastes too much time and leaves you confused. People ask so many questions about CompSci every day, and you'll find books recommended ranging from the complete basics of computers, like Code or the Nand2tetris course, to something very sophisticated like AI. I hope you can understand that if a person spends too much time on these kinds of things, given that he has a job or is a student at a university with a sweet CompSci curriculum (you know what I mean), then it's a problem. Although the above-mentioned resources are exceptional, there are others too which teach the same things. Can a person read all of them one by one, "just to satisfy his curiosity, thinking that it'll help him in the future"?

RE is already an extremely sophisticated and vast field which requires computer mastery. I'm in college, and it has made me hate things I loved. I'm an extremely curious guy and can easily spend 10-20 hours in front of a PC. I have ~6 years of experience with Linux. But I'm literally not in a state to read two or three 400-800-page books on a single topic which I don't even know will be required in RE. Some topics are quite difficult, but at least if we know a topic IS mandatory for RE, then we can be sure and refer to other resources. If you don't even know your syllabus, how can you concentrate on it and master it, let alone learn it? RE requires you to study every minute detail of a computer system, but wasting too much time on all that horrible digital logic and design is really not worth it.

So my purpose is to make it completely clear what I actually need to know, so that I can focus on it instead of reading each and every topic in complete detail, thinking that if I miss the direction of even a single electron in I/O I won't be able to do efficient reversing. I'm literally fed up with those architecture diagrams full of arrows, and with cramming the definitions of ROM, EPROM, EEPROM... again and again for tests and assignments.

I have a few questions for you:

You mentioned Computer Organization and Design, which I think is authored by Patterson and Hennessy and is used by almost all universities. I'm just curious about its not-so-good-looking Amazon reviews. Also, what's your opinion on Tanenbaum's books, which you mentioned in that Reddit link?

Now let's summarize what I've understood (PLEASE correct me if I'm wrong):

>>>> UNDERSTANDING the system you want to hack

> Learn the most-used fundamental programming languages (the way we TALK with computers):
>
> 1. C (also C++ in some cases)
> 2. Python or Ruby (given their dominance in the industry right now thanks to their productive nature; also used for exploit writing)
> 3. Java or C# (object-oriented programming, which along with the above languages completes our programming fundamentals)
> 4. Assembly (obviously needed in RE)
>
> It need not be mentioned that we need a good grasp of Data Structures and Algorithms in the above languages (obviously not all of them).

> Understand each and every data flow and HOW a computer system works

Computer Organization and Design and Architecture

(OS fundamentals, memory management, virtual memory, paging, caching, etc.; I think Linux (macOS too) and Windows internals come in here)

You restored my faith in humanity when you said I can skip the hardware and microcode parts (please explain which specific topics; I swear I won't look at them again until I'm done with the required topics).

> Network fundamentals and programming: basics of HTTP, TCP/IP and other protocols... socket programming


> Learning WHAT loopholes exist in the above process of reading and writing data: types of attacks (buffer overflows, heap overflows...)

> HOW those loopholes are exploited

> Reverse engineering (learning the tools of the trade: IDA, gdb...), learning and practising reversing. Fuzzing

> Exploiting the bugs and writing exploits.

Please review and correct. Thanks again.

TopTalkedBooks posted at August 19, 2017
But software is so pervasive that within a generation or two, not understanding how it works will put you at a severe disadvantage.

Unfortunately the corporations seem determined to put a stop to that sort of pervasive knowledge, if only for the purpose of controlling and monetising their users. They don't want people to know how easy it is to do things like strip DRM or remove artificial limitations from software. [See Stallman's famous story, or the Cory Doctorow articles on the demise of general-purpose computing.]

And thus most of the "learn to code" efforts I've seen have seemed to focus on some ultra-high-level language, removed from reality and often sandboxed, so that while people do learn the very-high-level concepts of how computers work, they are no less unclear about the functioning of the actual systems they use daily --- or how to use them fully to their advantage. In some ways, they're trying to teach how to write programs without understanding existing ones.

The document said children should be writing programs and understand how computers store information by their final years of primary or intermediate school.

However, this sounds more promising. Especially if they're starting more along the lines of this book:

TopTalkedBooks posted at August 20, 2017

For these languages, broadly, return ends the current function and returns execution flow to the calling function. You can optionally pass a value with the return statement, which becomes the result of the current function in the context of the calling function.

For example:

int OtherFunction();  /* declared up front so CallingFunction can call it */

void CallingFunction() {
    int x = OtherFunction();
}

int OtherFunction() {
    return 3;
}
The return statement ends OtherFunction(), i.e. any other statements after the return would not be executed, and the value of x in CallingFunction() will be 3, since that is what OtherFunction() returned.

This idea of functions and returning values is a very basic, core concept for this style of programming. As others have pointed out in the comments, you really should check out a book or find a mentor who can help you get started.

Code is a very easy to read, high-level overview of the concepts you'll need.

TopTalkedBooks posted at August 20, 2017

Two really great books come to mind.

Also, online you can read Bit Twiddling Hacks.

TopTalkedBooks posted at August 20, 2017

To multiply by any value of 2 to the power of N (i.e. 2^N), shift the bits N times to the left.

0000 0001 = 1 

times 4 = (2^2 => N = 2) = 2-bit shift : 0000 0100 = 4

times 8 (2^3 => N = 3) = 3-bit shift, applied to the 4 : 0010 0000 = 32


To divide by 2^N, shift the bits N times to the right.

The bits are whole 1s or 0s; you can't shift by part of a bit. Thus, if the number you're multiplying by is not a whole power of 2, you combine shifts with additions, i.e.:

since: 17 = 16  + 1 
thus:  17 = 2^4 + 1

therefore: x * 17 = (x * 16) + x, in other words 17 x's

thus, to multiply by 17, you have to do a 4-bit shift to the left and then add the original number again:

==> x * 17 = (x * 2^4) + x 
==> x * 17 = (x shifted to left by 4 bits) + x 

so let x = 3 = 0000 0011 

times 16 = (2^4 => N = 4) = 4 bit shift : 0011 0000 = 48

plus the x (0000 0011)


    0011 0000  (48)
+   0000 0011   (3)
=   0011 0011  (51)

Edit: Update to original answer. Charles Petzold has written a fantastic book 'Code' that will explain all of this and more in the easiest of ways. I thoroughly recommend this.

TopTalkedBooks posted at August 20, 2017
I still haven't read anything better than Code by Charles Petzold [1] and it's not even close.


TopTalkedBooks posted at October 16, 2017
Teaching CS is hard, but I think the root of it is our ability to teach people how to think. Learning to program is easy compared to the problem of learning to think algorithmically.

IMHO this is because a lot of CS courses start at a very high level with very abstract concepts, which might also be because they can't afford to start with the very basics of "how does a computer work" due to the limited time available.

On the other hand, I think CS should always start with a book like this one, which truly does start at the basics:

A large part of why beginners fail is that they expect too much of the machine and don't comprehend how it works. I believe that once they see how this complexity is actually the result of a large number of very, very simple operations, they will have a better understanding of what it means to instruct the machine.

TopTalkedBooks posted at December 06, 2017

You'll do little to no traditional math, but you will use Boolean algebra all the time. A lot of what an admin does is identify parts of a process that can be automated from existing data. Since very little of the data in Salesforce is numerical, you'll rarely do traditional math.

I constantly recommend this book to people transitioning into CS from other careers because it explains why Boolean algebra is so important to how systems work, and it's extremely entertaining. It also doesn't have a single line of code in the whole book; it's really just about the concepts that power code and how those concepts can be applied in any system.

With all of that said, the Salesforce ecosystem is so large that you don't have to be an admin. If you like teaching but want a raise, there are tons of companies that hire Salesforce trainers. They are typically Salesforce admins who really enjoy working with users over development. Get your cert, get some experience, then find your niche.

TopTalkedBooks posted at December 06, 2017

Just to include a slightly different angle, the author of this book gives a fascinating description of how you could build analog mechanical logic gates using telegraph switches.

TopTalkedBooks posted at December 06, 2017

While a formal education covers most of these questions, they are still worth brushing up on every now and then. The next book I'll be reading is one that I believe falls in line with what you're looking for.

TopTalkedBooks posted at January 06, 2018
Chapter 22 of "Code: The Hidden Language of Computer Hardware and Software" uses CP/M to illustrate the inner workings of operating systems. I highly recommend this book overall:
TopTalkedBooks posted at March 18, 2018

Well done. This video seems to be based on the book Code by Charles Petzold; if you were interested in the video, I highly recommend you read the book!

TopTalkedBooks posted at March 18, 2018

Get the book CODE by Charles Petzold. It's a super easy read, and by the time you're done with it you'll understand how CPUs work.

TopTalkedBooks posted at March 18, 2018

CODE by Charles Petzold is the book to read to understand computers at a base level. It literally starts at a single bit and moves all the way up the stack. I cannot recommend this book enough for someone starting out.

Top Books
We collected top books from Hacker News, Stack Overflow, and Reddit, recommended by amazing people.