The End of Error: Unum Computing (Chapman & Hall/CRC Computational Science)

Category: Computer Science
Author: John L. Gustafson

Comments

by daly   2021-04-27
The field of software is exploding in many directions (machine learning, quantum programming, program proofs, dependent types, category theory, etc.).

At the hardware level it is also exploding. There are a dozen new computer architectures and instruction sets. It is now possible to design your own CPU (yes, I am) and build it in an FPGA for a few dollars. That forces you to learn Verilog and processor design. It also forces you to learn electronics in order to breadboard your ideas.

The field is also converging. Intel now has a CPU that also has an FPGA. (This seems to be available only to the FAANG players, which is a source of frustration for me. Why, Intel, why?) Imagine being able to build your own instructions "on the fly". I want to implement Gustafson's unum arithmetic (see "The End of Error", https://www.amazon.com/End-Error-Computing-Chapman-Computati...).
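To sketch the flavor of what I'm after, here is a toy interval ("ubound-ish") type in Python. It is not the book's actual variable-width unum encoding, and the names Bound, widen, add, and exact are mine, not Gustafson's; it just shows the payoff he argues for: every answer carries a guaranteed enclosure instead of a silently rounded float.

```python
# Toy interval arithmetic illustrating the "answers with bounds" idea
# behind unums/ubounds (NOT the book's actual bit-level encoding).
# Bound, widen, add, and exact are names I made up for this sketch.
import math
from dataclasses import dataclass

@dataclass
class Bound:
    lo: float  # guaranteed lower endpoint
    hi: float  # guaranteed upper endpoint

def widen(b: Bound) -> Bound:
    # Nudge each endpoint outward by one ULP so float rounding
    # inside an operation can never shrink the enclosure.
    return Bound(math.nextafter(b.lo, -math.inf),
                 math.nextafter(b.hi, math.inf))

def add(a: Bound, b: Bound) -> Bound:
    return widen(Bound(a.lo + b.lo, a.hi + b.hi))

def exact(x: float) -> Bound:
    # Treat the stored double as an exact input.
    return Bound(x, x)

# Plain doubles: summing 0.1 ten times silently misses 1.0.
print(sum([0.1] * 10) == 1.0)   # False, and nothing tells you so

# With bounds: the result is an interval that provably encloses the
# true sum of the ten stored values, so the error is visible.
s = exact(0.0)
for _ in range(10):
    s = add(s, exact(0.1))
print((s.lo, s.hi))
```

The real unums do this far more compactly, with a ubit and self-describing exponent and fraction sizes, which is exactly the kind of thing I'd like to put in an FPGA soft core.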

It's not clear what the long-term shakeout will be, but it is fun to try to learn what the leading edge is doing.

Like the Red Queen said in Through the Looking-Glass:

'Now, here, you see, it takes all the running you can do, to keep in the same place. If you want to get somewhere else, you must run at least twice as fast as that!'

I'm doing all the running I can to ride the leading edge.

What could be more fun?

by andrewl   2019-07-16
I'd never heard of him. He's done a lot of interesting stuff, mostly in high performance computing. From his personal page:

"Gustafson has recently finished writing a book, The End of Error: Unum Computing, that presents a new approach to computer arithmetic: the unum. The universal number, or unum format, encompasses all IEEE floating-point formats as well as fixed-point and exact integer arithmetic. This approach obtains more accurate answers than floating-point arithmetic yet uses fewer bits in many cases, saving memory, bandwidth, energy, and power."

Has anybody read it?

https://www.amazon.com/dp/1482239868

by daly   2019-04-12
Are they planning to implement the algorithms from "The End of Error" (https://www.amazon.com/End-Error-Computing-Chapman-Computati...)?