Brain Floating Point (Bfloat16)

by ml_basics on 5/28/23, 6:27 PM with 1 comment

bfloat16 is probably familiar only to ML practitioners; it's a reduced-precision floating-point format designed for machine learning models. I was surprised to learn that the "b" stands for "brain", as in Google Brain, the team at Google that developed it along with many other advances in machine learning.
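For context on why the format is convenient: bfloat16 keeps float32's sign bit and full 8-bit exponent (so the same dynamic range) but truncates the mantissa from 23 bits to 7. That means converting a float32 to bfloat16 can be as simple as dropping the low 16 bits of its bit pattern. A minimal sketch in Python (truncation only, no rounding; the helper names are my own):

```python
import struct

def float32_to_bfloat16_bits(x: float) -> int:
    """Return the 16-bit bfloat16 encoding of x, by truncating
    the low 16 bits of its float32 representation."""
    (bits,) = struct.unpack("<I", struct.pack("<f", x))
    return bits >> 16

def bfloat16_bits_to_float32(b: int) -> float:
    """Widen a bfloat16 bit pattern back to float32 by
    zero-filling the dropped mantissa bits."""
    (x,) = struct.unpack("<f", struct.pack("<I", b << 16))
    return x

# With only 7 mantissa bits, precision is roughly 2-3 decimal
# digits, but the exponent range matches float32.
pi_bf16 = bfloat16_bits_to_float32(float32_to_bfloat16_bits(3.14159))
# pi_bf16 is 3.140625: close, but visibly coarser than float32.
```

Real implementations (e.g. in hardware or ML frameworks) typically round to nearest rather than truncate, but the "chop off half the bits" picture is the essence of the format.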