A default cost model for SFL with SetType=BitSet<64>, based on benchmarks.
The numbers here were obtained in February 2026 by:
- For a variety of machines:
  - Running a fixed collection of ~385000 clusters, found through random generation and fuzzing that optimized for difficulty of linearization.
  - Linearizing each ~3000 times, with different random seeds, sometimes without an input linearization and sometimes with a bad one.
  - Gathering cycle counts for each of the operations included in this cost model, broken down by their parameters.
  - Correcting the data by subtracting the runtime of obtaining the cycle count itself (see the first sketch after this list).
  - Dropping the top and bottom 5% of samples from each cycle-count dataset, and computing the average of the remaining samples.
  - Fitting, for each operation, a least-squares linear function through the samples (see the second sketch after this list).
- Rescaling all machines' expressions to make their total time match, as we only care about the relative cost of each operation.
- Taking the per-operation average of these expressions across all machines, to construct expressions for an average machine.
- Approximating the result with integer coefficients. Each cost unit corresponds to somewhere between 0.5 ns and 2.5 ns, depending on the hardware.
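To make the correction step concrete, here is a minimal sketch of gathering a cycle count and subtracting the cost of reading the counter itself. It is not the actual benchmark harness (which is not part of cluster_linearize.h); it assumes an x86 compiler providing __rdtsc, and the names TimerOverhead and MeasureCycles are illustrative only.

    #include <cstdint>
    #include <x86intrin.h>  // __rdtsc (x86-specific; an assumption of this sketch)

    // Estimate the runtime of obtaining a cycle count itself, by timing
    // back-to-back counter reads and keeping the smallest observed delta.
    inline uint64_t TimerOverhead()
    {
        uint64_t best = ~uint64_t{0};
        for (int i = 0; i < 1000; ++i) {
            uint64_t t0 = __rdtsc();
            uint64_t t1 = __rdtsc();
            if (t1 - t0 < best) best = t1 - t0;
        }
        return best;
    }

    // Measure one invocation of fn in cycles, corrected by subtracting the
    // previously estimated timer overhead.
    template<typename Fn>
    uint64_t MeasureCycles(Fn&& fn, uint64_t overhead)
    {
        uint64_t t0 = __rdtsc();
        fn();
        uint64_t t1 = __rdtsc();
        uint64_t raw = t1 - t0;
        return raw > overhead ? raw - overhead : 0;
    }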
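Likewise, a minimal sketch of the per-operation post-processing: a 5% trimmed mean per parameter value, followed by a least-squares fit of a linear cost function. TrimmedMean, FitLinear, and FitOperationCost are hypothetical helpers, not part of the actual code.

    #include <algorithm>
    #include <cstddef>
    #include <map>
    #include <utility>
    #include <vector>

    // Average of the samples that remain after dropping the top and bottom 5%.
    double TrimmedMean(std::vector<double> samples)
    {
        std::sort(samples.begin(), samples.end());
        std::size_t drop = samples.size() / 20;  // 5% on each side
        double sum = 0.0;
        std::size_t count = 0;
        for (std::size_t i = drop; i + drop < samples.size(); ++i) {
            sum += samples[i];
            ++count;
        }
        return count ? sum / count : 0.0;
    }

    // Least-squares fit of y = a + b*x through the given (x, y) points; returns {a, b}.
    std::pair<double, double> FitLinear(const std::vector<std::pair<double, double>>& pts)
    {
        if (pts.empty()) return {0.0, 0.0};
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (const auto& [x, y] : pts) {
            sx += x; sy += y; sxx += x * x; sxy += x * y;
        }
        double n = pts.size();
        double denom = n * sxx - sx * sx;
        double b = denom != 0.0 ? (n * sxy - sx * sy) / denom : 0.0;
        double a = (sy - b * sx) / n;
        return {a, b};
    }

    // For one operation: reduce each parameter value's corrected cycle counts to a
    // trimmed mean, then fit a linear cost function over the parameter.
    std::pair<double, double> FitOperationCost(const std::map<int, std::vector<double>>& samples_by_param)
    {
        std::vector<std::pair<double, double>> pts;
        for (const auto& [param, samples] : samples_by_param) {
            pts.emplace_back(double(param), TrimmedMean(samples));
        }
        return FitLinear(pts);
    }

The fitted {constant, slope} pair plays the role of an "expression" above: after rescaling per machine, averaging across machines, and rounding, it yields the integer coefficients used in this cost model.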
Definition at line 496 of file cluster_linearize.h.