Floating point is ubiquitous in computing: it is the default way to represent non-integer numbers. Yet few people understand it well. We have all seen its weird behavior from time to time, and many programmers treat it as a mystical, imprecise system of math that just works, until suddenly it doesn't.
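As one illustration of that weirdness, here is a minimal sketch in Python (any language using IEEE 754 double precision behaves the same way):

```python
# 0.1, 0.2, and 0.3 have no exact binary representation, so the
# computed sum of the first two is not bit-for-bit equal to the third.
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False
```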