FLOPS (floating-point operations per second) is a measure of a
computer's ability to perform arithmetic on decimal numbers
(floating-point numbers). Specifically, it refers to how many
mathematical operations, such as addition, subtraction, multiplication,
or division, a processor can complete each second.
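To make the definition concrete, here is a minimal Python sketch of the common back-of-the-envelope formula for a chip's theoretical peak FLOPS (cores × clock speed × floating-point operations per cycle). The core count, clock speed, and operations-per-cycle values are illustrative assumptions, not figures for any particular device.

```python
# Back-of-the-envelope estimate of theoretical peak FLOPS.
# All hardware numbers below are illustrative assumptions.

cores = 8                  # number of CPU cores (assumed)
clock_hz = 3.5e9           # clock speed in Hz, i.e. 3.5 GHz (assumed)
flops_per_cycle = 16       # floating-point ops each core can issue per
                           # cycle, e.g. via SIMD/FMA units (assumed)

peak_flops = cores * clock_hz * flops_per_cycle
print(f"Theoretical peak: {peak_flops:.2e} FLOPS "
      f"({peak_flops / 1e12:.2f} TFLOPS)")
```

Real-world performance is usually well below this theoretical peak, which is one reason measured FLOPS figures vary between benchmarks.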
More FLOPS generally mean more raw computational power, but there are
important caveats:
- ⚡ Power Efficiency – Not all FLOPS are equal: some devices achieve higher
  performance per watt, which isn't reflected in raw FLOPS (see the sketch
  after this list).
- 🧠 Specialised Chips – Some chips may seem slower on paper but are designed
  to be highly effective at a specific task, which may not be reflected in
  their FLOPS (e.g. Neural Processing Units are highly optimised for specific
  AI tasks like machine learning).
- 💸 Cost per FLOPS – Supercomputers push the limits of performance, but
  consumer devices often offer better FLOPS per dollar.
- ⚛️ Quantum Computing – Quantum processors, like IBM's Eagle, don't use
  FLOPS but instead measure performance through qubits and quantum gate
  complexity. Their power scales differently, making direct comparisons
  with classical computers tricky.
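As a rough illustration of the efficiency and cost caveats above, the sketch below compares two entirely hypothetical devices by FLOPS per watt and FLOPS per dollar. Every figure in it is a made-up placeholder, not a real product specification.

```python
# Comparing two hypothetical devices on efficiency and cost metrics.
# All specs are placeholders chosen for illustration only.

devices = {
    "Device A": {"flops": 5.0e13, "watts": 300, "price_usd": 1500},
    "Device B": {"flops": 2.0e13, "watts": 60,  "price_usd": 400},
}

for name, spec in devices.items():
    per_watt = spec["flops"] / spec["watts"]      # FLOPS per watt
    per_dollar = spec["flops"] / spec["price_usd"]  # FLOPS per dollar
    print(f"{name}: {spec['flops']:.1e} FLOPS, "
          f"{per_watt:.1e} FLOPS/W, {per_dollar:.1e} FLOPS/$")
```

In this toy example, the device with fewer raw FLOPS can still come out ahead on efficiency or value, which is exactly why raw FLOPS alone can be misleading.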
Note on Accuracy: FLOPS can be measured in different ways, and some
devices (particularly supercomputers) are upgraded over time.
Additionally, certain devices are sold in multiple configurations. We've
aimed to be consistent with the FLOPS scores we use and have selected
the most accurate numbers available.