0.5 + 0.4 * cos(x) + 0.1 * cos(2 x)

Percentage Accurate: 100.0% → 100.0%
Time: 4.1s
Alternatives: 8
Speedup: 1.0×

Specification

\[-1000 \leq x \land x \leq 1000\]
\[\begin{array}{l} \\ \left(0.5 + 0.4 \cdot \cos x\right) + 0.1 \cdot \cos \left(2 \cdot x\right) \end{array} \]
(FPCore (x)
 :precision binary64
 (+ (+ 0.5 (* 0.4 (cos x))) (* 0.1 (cos (* 2.0 x)))))
double code(double x) {
	return (0.5 + (0.4 * cos(x))) + (0.1 * cos((2.0 * x)));
}
real(8) function code(x)
    real(8), intent (in) :: x
    code = (0.5d0 + (0.4d0 * cos(x))) + (0.1d0 * cos((2.0d0 * x)))
end function
public static double code(double x) {
	return (0.5 + (0.4 * Math.cos(x))) + (0.1 * Math.cos((2.0 * x)));
}
def code(x):
	return (0.5 + (0.4 * math.cos(x))) + (0.1 * math.cos((2.0 * x)))
function code(x)
	return Float64(Float64(0.5 + Float64(0.4 * cos(x))) + Float64(0.1 * cos(Float64(2.0 * x))))
end
function tmp = code(x)
	tmp = (0.5 + (0.4 * cos(x))) + (0.1 * cos((2.0 * x)));
end
code[x_] := N[(N[(0.5 + N[(0.4 * N[Cos[x], $MachinePrecision]), $MachinePrecision]), $MachinePrecision] + N[(0.1 * N[Cos[N[(2.0 * x), $MachinePrecision]], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]
\begin{array}{l}

\\
\left(0.5 + 0.4 \cdot \cos x\right) + 0.1 \cdot \cos \left(2 \cdot x\right)
\end{array}

Sampling outcomes in binary64 precision:

Local Percentage Accuracy vs x

The average percentage accuracy by input value. The horizontal axis shows the value of an input variable; the variable is chosen in the title. The vertical axis is accuracy; higher is better. Red represents the original program, while blue represents Herbie's suggestion. These can be toggled with buttons below the plot. The line is an average, while the dots represent individual samples.

Accuracy vs Speed

Herbie found 8 alternatives:

Alternative | Accuracy | Speedup
The accuracy (vertical axis) and speed (horizontal axis) of each alternative. Up and to the right is better. The red square shows the initial program, and each blue circle shows an alternative. The line shows the best available speed-accuracy tradeoffs.

Initial Program: 100.0% accurate, 1.0× speedup

\[\begin{array}{l} \\ \left(0.5 + 0.4 \cdot \cos x\right) + 0.1 \cdot \cos \left(2 \cdot x\right) \end{array} \]
(FPCore (x)
 :precision binary64
 (+ (+ 0.5 (* 0.4 (cos x))) (* 0.1 (cos (* 2.0 x)))))
double code(double x) {
	return (0.5 + (0.4 * cos(x))) + (0.1 * cos((2.0 * x)));
}
real(8) function code(x)
    real(8), intent (in) :: x
    code = (0.5d0 + (0.4d0 * cos(x))) + (0.1d0 * cos((2.0d0 * x)))
end function
public static double code(double x) {
	return (0.5 + (0.4 * Math.cos(x))) + (0.1 * Math.cos((2.0 * x)));
}
def code(x):
	return (0.5 + (0.4 * math.cos(x))) + (0.1 * math.cos((2.0 * x)))
function code(x)
	return Float64(Float64(0.5 + Float64(0.4 * cos(x))) + Float64(0.1 * cos(Float64(2.0 * x))))
end
function tmp = code(x)
	tmp = (0.5 + (0.4 * cos(x))) + (0.1 * cos((2.0 * x)));
end
code[x_] := N[(N[(0.5 + N[(0.4 * N[Cos[x], $MachinePrecision]), $MachinePrecision]), $MachinePrecision] + N[(0.1 * N[Cos[N[(2.0 * x), $MachinePrecision]], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]
\begin{array}{l}

\\
\left(0.5 + 0.4 \cdot \cos x\right) + 0.1 \cdot \cos \left(2 \cdot x\right)
\end{array}

Alternative 1: 100.0% accurate, 1.0× speedup

\[\begin{array}{l} \\ \left(0.5 + 0.4 \cdot \cos x\right) + 0.1 \cdot \cos \left(2 \cdot x\right) \end{array} \]
(FPCore (x)
 :precision binary64
 (+ (+ 0.5 (* 0.4 (cos x))) (* 0.1 (cos (* 2.0 x)))))
double code(double x) {
	return (0.5 + (0.4 * cos(x))) + (0.1 * cos((2.0 * x)));
}
real(8) function code(x)
    real(8), intent (in) :: x
    code = (0.5d0 + (0.4d0 * cos(x))) + (0.1d0 * cos((2.0d0 * x)))
end function
public static double code(double x) {
	return (0.5 + (0.4 * Math.cos(x))) + (0.1 * Math.cos((2.0 * x)));
}
def code(x):
	return (0.5 + (0.4 * math.cos(x))) + (0.1 * math.cos((2.0 * x)))
function code(x)
	return Float64(Float64(0.5 + Float64(0.4 * cos(x))) + Float64(0.1 * cos(Float64(2.0 * x))))
end
function tmp = code(x)
	tmp = (0.5 + (0.4 * cos(x))) + (0.1 * cos((2.0 * x)));
end
code[x_] := N[(N[(0.5 + N[(0.4 * N[Cos[x], $MachinePrecision]), $MachinePrecision]), $MachinePrecision] + N[(0.1 * N[Cos[N[(2.0 * x), $MachinePrecision]], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]
\begin{array}{l}

\\
\left(0.5 + 0.4 \cdot \cos x\right) + 0.1 \cdot \cos \left(2 \cdot x\right)
\end{array}
Derivation
  1. Initial program 100.0%

    \[\left(0.5 + 0.4 \cdot \cos x\right) + 0.1 \cdot \cos \left(2 \cdot x\right) \]
  2. Add Preprocessing
  3. Add Preprocessing

Alternative 2: 100.0% accurate, 1.0× speedup

\[\begin{array}{l} \\ \mathsf{fma}\left(\cos \left(x + x\right), 0.1, \mathsf{fma}\left(\cos x, 0.4, 0.5\right)\right) \end{array} \]
(FPCore (x) :precision binary64 (fma (cos (+ x x)) 0.1 (fma (cos x) 0.4 0.5)))
double code(double x) {
	return fma(cos((x + x)), 0.1, fma(cos(x), 0.4, 0.5));
}
function code(x)
	return fma(cos(Float64(x + x)), 0.1, fma(cos(x), 0.4, 0.5))
end
code[x_] := N[(N[Cos[N[(x + x), $MachinePrecision]], $MachinePrecision] * 0.1 + N[(N[Cos[x], $MachinePrecision] * 0.4 + 0.5), $MachinePrecision]), $MachinePrecision]
\begin{array}{l}

\\
\mathsf{fma}\left(\cos \left(x + x\right), 0.1, \mathsf{fma}\left(\cos x, 0.4, 0.5\right)\right)
\end{array}
Derivation
  1. Initial program 100.0%

    \[\left(0.5 + 0.4 \cdot \cos x\right) + 0.1 \cdot \cos \left(2 \cdot x\right) \]
  2. Add Preprocessing
  3. Step-by-step derivation
    1. lift-+.f64 [N/A]

      \[\leadsto \color{blue}{\left(\frac{1}{2} + \frac{3602879701896397}{9007199254740992} \cdot \cos x\right) + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right)} \]
    2. +-commutative [N/A]

      \[\leadsto \color{blue}{\frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) + \left(\frac{1}{2} + \frac{3602879701896397}{9007199254740992} \cdot \cos x\right)} \]
    3. lift-*.f64 [N/A]

      \[\leadsto \color{blue}{\frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right)} + \left(\frac{1}{2} + \frac{3602879701896397}{9007199254740992} \cdot \cos x\right) \]
    4. *-commutative [N/A]

      \[\leadsto \color{blue}{\cos \left(2 \cdot x\right) \cdot \frac{3602879701896397}{36028797018963968}} + \left(\frac{1}{2} + \frac{3602879701896397}{9007199254740992} \cdot \cos x\right) \]
    5. lower-fma.f64 [100.0%]

      \[\leadsto \color{blue}{\mathsf{fma}\left(\cos \left(2 \cdot x\right), 0.1, 0.5 + 0.4 \cdot \cos x\right)} \]
    6. lift-+.f64 [N/A]

      \[\leadsto \mathsf{fma}\left(\cos \left(2 \cdot x\right), \frac{3602879701896397}{36028797018963968}, \color{blue}{\frac{1}{2} + \frac{3602879701896397}{9007199254740992} \cdot \cos x}\right) \]
    7. +-commutative [N/A]

      \[\leadsto \mathsf{fma}\left(\cos \left(2 \cdot x\right), \frac{3602879701896397}{36028797018963968}, \color{blue}{\frac{3602879701896397}{9007199254740992} \cdot \cos x + \frac{1}{2}}\right) \]
    8. lift-*.f64 [N/A]

      \[\leadsto \mathsf{fma}\left(\cos \left(2 \cdot x\right), \frac{3602879701896397}{36028797018963968}, \color{blue}{\frac{3602879701896397}{9007199254740992} \cdot \cos x} + \frac{1}{2}\right) \]
    9. *-commutative [N/A]

      \[\leadsto \mathsf{fma}\left(\cos \left(2 \cdot x\right), \frac{3602879701896397}{36028797018963968}, \color{blue}{\cos x \cdot \frac{3602879701896397}{9007199254740992}} + \frac{1}{2}\right) \]
    10. lower-fma.f64 [100.0%]

      \[\leadsto \mathsf{fma}\left(\cos \left(2 \cdot x\right), 0.1, \color{blue}{\mathsf{fma}\left(\cos x, 0.4, 0.5\right)}\right) \]
  4. Applied rewrites [100.0%]

    \[\leadsto \color{blue}{\mathsf{fma}\left(\cos \left(2 \cdot x\right), 0.1, \mathsf{fma}\left(\cos x, 0.4, 0.5\right)\right)} \]
  5. Step-by-step derivation
    1. lift-cos.f64 [N/A]

      \[\leadsto \mathsf{fma}\left(\color{blue}{\cos \left(2 \cdot x\right)}, \frac{3602879701896397}{36028797018963968}, \mathsf{fma}\left(\cos x, \frac{3602879701896397}{9007199254740992}, \frac{1}{2}\right)\right) \]
    2. lift-*.f64 [N/A]

      \[\leadsto \mathsf{fma}\left(\cos \color{blue}{\left(2 \cdot x\right)}, \frac{3602879701896397}{36028797018963968}, \mathsf{fma}\left(\cos x, \frac{3602879701896397}{9007199254740992}, \frac{1}{2}\right)\right) \]
    3. cos-2 [N/A]

      \[\leadsto \mathsf{fma}\left(\color{blue}{\cos x \cdot \cos x - \sin x \cdot \sin x}, \frac{3602879701896397}{36028797018963968}, \mathsf{fma}\left(\cos x, \frac{3602879701896397}{9007199254740992}, \frac{1}{2}\right)\right) \]
    4. cos-sum [N/A]

      \[\leadsto \mathsf{fma}\left(\color{blue}{\cos \left(x + x\right)}, \frac{3602879701896397}{36028797018963968}, \mathsf{fma}\left(\cos x, \frac{3602879701896397}{9007199254740992}, \frac{1}{2}\right)\right) \]
    5. lower-cos.f64 [N/A]

      \[\leadsto \mathsf{fma}\left(\color{blue}{\cos \left(x + x\right)}, \frac{3602879701896397}{36028797018963968}, \mathsf{fma}\left(\cos x, \frac{3602879701896397}{9007199254740992}, \frac{1}{2}\right)\right) \]
    6. lower-+.f64 [100.0%]

      \[\leadsto \mathsf{fma}\left(\cos \color{blue}{\left(x + x\right)}, 0.1, \mathsf{fma}\left(\cos x, 0.4, 0.5\right)\right) \]
  6. Applied rewrites [100.0%]

    \[\leadsto \mathsf{fma}\left(\color{blue}{\cos \left(x + x\right)}, 0.1, \mathsf{fma}\left(\cos x, 0.4, 0.5\right)\right) \]
  7. Add Preprocessing
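The rewrite above is algebraically exact: it only replaces 2·x with x + x and regroups the sum into fused multiply-adds. As a quick numerical sanity check, here is a small Python sketch with hypothetical helper names; fma is emulated with a separate multiply and add, since a true fused multiply-add rounds only once:

```python
import math

def original(x):
    # (0.5 + 0.4*cos(x)) + 0.1*cos(2*x), as in the initial program
    return (0.5 + 0.4 * math.cos(x)) + 0.1 * math.cos(2.0 * x)

def fma(a, b, c):
    # Emulation only: a hardware fma rounds once; a*b + c rounds twice.
    return a * b + c

def rewritten(x):
    # fma(cos(x + x), 0.1, fma(cos(x), 0.4, 0.5))
    return fma(math.cos(x + x), 0.1, fma(math.cos(x), 0.4, 0.5))

for x in [-1000.0, -1.5, 0.0, 0.5, 3.14159, 1000.0]:
    assert abs(original(x) - rewritten(x)) < 1e-12
```

With a true fma (e.g. C's `fma` or Julia's `fma` as in the listings above), the two forms can differ by at most a few ulps from the original.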

Alternative 3: 99.0% accurate, 1.6× speedup

\[\begin{array}{l} \\ \mathsf{fma}\left(\mathsf{fma}\left(0.016666666666666666, x \cdot x, -0.2\right), x \cdot x, 0.9\right) + 0.1 \cdot \cos \left(2 \cdot x\right) \end{array} \]
(FPCore (x)
 :precision binary64
 (+
  (fma (fma 0.016666666666666666 (* x x) -0.2) (* x x) 0.9)
  (* 0.1 (cos (* 2.0 x)))))
double code(double x) {
	return fma(fma(0.016666666666666666, (x * x), -0.2), (x * x), 0.9) + (0.1 * cos((2.0 * x)));
}
function code(x)
	return Float64(fma(fma(0.016666666666666666, Float64(x * x), -0.2), Float64(x * x), 0.9) + Float64(0.1 * cos(Float64(2.0 * x))))
end
code[x_] := N[(N[(N[(0.016666666666666666 * N[(x * x), $MachinePrecision] + -0.2), $MachinePrecision] * N[(x * x), $MachinePrecision] + 0.9), $MachinePrecision] + N[(0.1 * N[Cos[N[(2.0 * x), $MachinePrecision]], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]
\begin{array}{l}

\\
\mathsf{fma}\left(\mathsf{fma}\left(0.016666666666666666, x \cdot x, -0.2\right), x \cdot x, 0.9\right) + 0.1 \cdot \cos \left(2 \cdot x\right)
\end{array}
Derivation
  1. Initial program 100.0%

    \[\left(0.5 + 0.4 \cdot \cos x\right) + 0.1 \cdot \cos \left(2 \cdot x\right) \]
  2. Add Preprocessing
  3. Taylor expanded in x around 0

    \[\leadsto \color{blue}{\left(\frac{8106479329266893}{9007199254740992} + {x}^{2} \cdot \left(\frac{3602879701896397}{216172782113783808} \cdot {x}^{2} - \frac{3602879701896397}{18014398509481984}\right)\right)} + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
  4. Step-by-step derivation
    1. +-commutative [N/A]

      \[\leadsto \color{blue}{\left({x}^{2} \cdot \left(\frac{3602879701896397}{216172782113783808} \cdot {x}^{2} - \frac{3602879701896397}{18014398509481984}\right) + \frac{8106479329266893}{9007199254740992}\right)} + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
    2. *-commutative [N/A]

      \[\leadsto \left(\color{blue}{\left(\frac{3602879701896397}{216172782113783808} \cdot {x}^{2} - \frac{3602879701896397}{18014398509481984}\right) \cdot {x}^{2}} + \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
    3. lower-fma.f64 [N/A]

      \[\leadsto \color{blue}{\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808} \cdot {x}^{2} - \frac{3602879701896397}{18014398509481984}, {x}^{2}, \frac{8106479329266893}{9007199254740992}\right)} + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
    4. sub-neg [N/A]

      \[\leadsto \mathsf{fma}\left(\color{blue}{\frac{3602879701896397}{216172782113783808} \cdot {x}^{2} + \left(\mathsf{neg}\left(\frac{3602879701896397}{18014398509481984}\right)\right)}, {x}^{2}, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
    5. metadata-eval [N/A]

      \[\leadsto \mathsf{fma}\left(\frac{3602879701896397}{216172782113783808} \cdot {x}^{2} + \color{blue}{\frac{-3602879701896397}{18014398509481984}}, {x}^{2}, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
    6. lower-fma.f64 [N/A]

      \[\leadsto \mathsf{fma}\left(\color{blue}{\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808}, {x}^{2}, \frac{-3602879701896397}{18014398509481984}\right)}, {x}^{2}, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
    7. unpow2 [N/A]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808}, \color{blue}{x \cdot x}, \frac{-3602879701896397}{18014398509481984}\right), {x}^{2}, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
    8. lower-*.f64 [N/A]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808}, \color{blue}{x \cdot x}, \frac{-3602879701896397}{18014398509481984}\right), {x}^{2}, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
    9. unpow2 [N/A]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808}, x \cdot x, \frac{-3602879701896397}{18014398509481984}\right), \color{blue}{x \cdot x}, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
    10. lower-*.f64 [99.0%]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(0.016666666666666666, x \cdot x, -0.2\right), \color{blue}{x \cdot x}, 0.9\right) + 0.1 \cdot \cos \left(2 \cdot x\right) \]
  5. Applied rewrites [99.0%]

    \[\leadsto \color{blue}{\mathsf{fma}\left(\mathsf{fma}\left(0.016666666666666666, x \cdot x, -0.2\right), x \cdot x, 0.9\right)} + 0.1 \cdot \cos \left(2 \cdot x\right) \]
  6. Add Preprocessing
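The inner fma chain is just the degree-4 Taylor polynomial of 0.5 + 0.4·cos x about x = 0, namely 0.9 - 0.2x² + x⁴/60 (0.016666… is 1/60), while 0.1·cos(2x) is kept exact. A small Python sketch with a hypothetical helper name, checking the approximation near 0:

```python
import math

def cos_part_taylor(x):
    # fma(fma(1/60, x*x, -0.2), x*x, 0.9), written with plain arithmetic
    xx = x * x
    return (0.016666666666666666 * xx - 0.2) * xx + 0.9

# Degree-4 Taylor error is O(x^6), so agreement is tight near 0
for x in [0.0, 0.01, 0.1, 0.25]:
    assert abs(cos_part_taylor(x) - (0.5 + 0.4 * math.cos(x))) < 1e-6
```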

Alternative 4: 98.9% accurate, 1.7× speedup

\[\begin{array}{l} \\ \left(0.5 + 0.4 \cdot \cos x\right) + \mathsf{fma}\left(\mathsf{fma}\left(0.06666666666666667, x \cdot x, -0.2\right), x \cdot x, 0.1\right) \end{array} \]
(FPCore (x)
 :precision binary64
 (+
  (+ 0.5 (* 0.4 (cos x)))
  (fma (fma 0.06666666666666667 (* x x) -0.2) (* x x) 0.1)))
double code(double x) {
	return (0.5 + (0.4 * cos(x))) + fma(fma(0.06666666666666667, (x * x), -0.2), (x * x), 0.1);
}
function code(x)
	return Float64(Float64(0.5 + Float64(0.4 * cos(x))) + fma(fma(0.06666666666666667, Float64(x * x), -0.2), Float64(x * x), 0.1))
end
code[x_] := N[(N[(0.5 + N[(0.4 * N[Cos[x], $MachinePrecision]), $MachinePrecision]), $MachinePrecision] + N[(N[(0.06666666666666667 * N[(x * x), $MachinePrecision] + -0.2), $MachinePrecision] * N[(x * x), $MachinePrecision] + 0.1), $MachinePrecision]), $MachinePrecision]
\begin{array}{l}

\\
\left(0.5 + 0.4 \cdot \cos x\right) + \mathsf{fma}\left(\mathsf{fma}\left(0.06666666666666667, x \cdot x, -0.2\right), x \cdot x, 0.1\right)
\end{array}
Derivation
  1. Initial program 100.0%

    \[\left(0.5 + 0.4 \cdot \cos x\right) + 0.1 \cdot \cos \left(2 \cdot x\right) \]
  2. Add Preprocessing
  3. Taylor expanded in x around 0

    \[\leadsto \left(\frac{1}{2} + \frac{3602879701896397}{9007199254740992} \cdot \cos x\right) + \color{blue}{\left(\frac{3602879701896397}{36028797018963968} + {x}^{2} \cdot \left(\frac{3602879701896397}{54043195528445952} \cdot {x}^{2} - \frac{3602879701896397}{18014398509481984}\right)\right)} \]
  4. Step-by-step derivation
    1. +-commutative [N/A]

      \[\leadsto \left(\frac{1}{2} + \frac{3602879701896397}{9007199254740992} \cdot \cos x\right) + \color{blue}{\left({x}^{2} \cdot \left(\frac{3602879701896397}{54043195528445952} \cdot {x}^{2} - \frac{3602879701896397}{18014398509481984}\right) + \frac{3602879701896397}{36028797018963968}\right)} \]
    2. *-commutative [N/A]

      \[\leadsto \left(\frac{1}{2} + \frac{3602879701896397}{9007199254740992} \cdot \cos x\right) + \left(\color{blue}{\left(\frac{3602879701896397}{54043195528445952} \cdot {x}^{2} - \frac{3602879701896397}{18014398509481984}\right) \cdot {x}^{2}} + \frac{3602879701896397}{36028797018963968}\right) \]
    3. lower-fma.f64 [N/A]

      \[\leadsto \left(\frac{1}{2} + \frac{3602879701896397}{9007199254740992} \cdot \cos x\right) + \color{blue}{\mathsf{fma}\left(\frac{3602879701896397}{54043195528445952} \cdot {x}^{2} - \frac{3602879701896397}{18014398509481984}, {x}^{2}, \frac{3602879701896397}{36028797018963968}\right)} \]
    4. sub-neg [N/A]

      \[\leadsto \left(\frac{1}{2} + \frac{3602879701896397}{9007199254740992} \cdot \cos x\right) + \mathsf{fma}\left(\color{blue}{\frac{3602879701896397}{54043195528445952} \cdot {x}^{2} + \left(\mathsf{neg}\left(\frac{3602879701896397}{18014398509481984}\right)\right)}, {x}^{2}, \frac{3602879701896397}{36028797018963968}\right) \]
    5. metadata-eval [N/A]

      \[\leadsto \left(\frac{1}{2} + \frac{3602879701896397}{9007199254740992} \cdot \cos x\right) + \mathsf{fma}\left(\frac{3602879701896397}{54043195528445952} \cdot {x}^{2} + \color{blue}{\frac{-3602879701896397}{18014398509481984}}, {x}^{2}, \frac{3602879701896397}{36028797018963968}\right) \]
    6. lower-fma.f64 [N/A]

      \[\leadsto \left(\frac{1}{2} + \frac{3602879701896397}{9007199254740992} \cdot \cos x\right) + \mathsf{fma}\left(\color{blue}{\mathsf{fma}\left(\frac{3602879701896397}{54043195528445952}, {x}^{2}, \frac{-3602879701896397}{18014398509481984}\right)}, {x}^{2}, \frac{3602879701896397}{36028797018963968}\right) \]
    7. unpow2 [N/A]

      \[\leadsto \left(\frac{1}{2} + \frac{3602879701896397}{9007199254740992} \cdot \cos x\right) + \mathsf{fma}\left(\mathsf{fma}\left(\frac{3602879701896397}{54043195528445952}, \color{blue}{x \cdot x}, \frac{-3602879701896397}{18014398509481984}\right), {x}^{2}, \frac{3602879701896397}{36028797018963968}\right) \]
    8. lower-*.f64 [N/A]

      \[\leadsto \left(\frac{1}{2} + \frac{3602879701896397}{9007199254740992} \cdot \cos x\right) + \mathsf{fma}\left(\mathsf{fma}\left(\frac{3602879701896397}{54043195528445952}, \color{blue}{x \cdot x}, \frac{-3602879701896397}{18014398509481984}\right), {x}^{2}, \frac{3602879701896397}{36028797018963968}\right) \]
    9. unpow2 [N/A]

      \[\leadsto \left(\frac{1}{2} + \frac{3602879701896397}{9007199254740992} \cdot \cos x\right) + \mathsf{fma}\left(\mathsf{fma}\left(\frac{3602879701896397}{54043195528445952}, x \cdot x, \frac{-3602879701896397}{18014398509481984}\right), \color{blue}{x \cdot x}, \frac{3602879701896397}{36028797018963968}\right) \]
    10. lower-*.f64 [99.0%]

      \[\leadsto \left(0.5 + 0.4 \cdot \cos x\right) + \mathsf{fma}\left(\mathsf{fma}\left(0.06666666666666667, x \cdot x, -0.2\right), \color{blue}{x \cdot x}, 0.1\right) \]
  5. Applied rewrites [99.0%]

    \[\leadsto \left(0.5 + 0.4 \cdot \cos x\right) + \color{blue}{\mathsf{fma}\left(\mathsf{fma}\left(0.06666666666666667, x \cdot x, -0.2\right), x \cdot x, 0.1\right)} \]
  6. Add Preprocessing
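Here the roles are swapped relative to Alternative 3: 0.5 + 0.4·cos x stays exact, and 0.1·cos(2x) is replaced by its degree-4 Taylor polynomial 0.1 - 0.2x² + x⁴/15 (0.066666… is 1/15). A small Python sketch with a hypothetical helper name:

```python
import math

def cos2x_part_taylor(x):
    # fma(fma(1/15, x*x, -0.2), x*x, 0.1), written with plain arithmetic
    xx = x * x
    return (0.06666666666666667 * xx - 0.2) * xx + 0.1

# 0.1*cos(2x) = 0.1 - 0.2*x^2 + x^4/15 - O(x^6) near 0
for x in [0.0, 0.05, 0.1, 0.2]:
    assert abs(cos2x_part_taylor(x) - 0.1 * math.cos(2.0 * x)) < 1e-6
```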

Alternative 5: 98.9% accurate, 4.2× speedup

\[\begin{array}{l} \\ \mathsf{fma}\left(\mathsf{fma}\left(0.016666666666666666, x \cdot x, -0.2\right), x \cdot x, 0.9\right) + 0.1 \cdot \mathsf{fma}\left(\mathsf{fma}\left(0.6666666666666666, x \cdot x, -2\right), x \cdot x, 1\right) \end{array} \]
(FPCore (x)
 :precision binary64
 (+
  (fma (fma 0.016666666666666666 (* x x) -0.2) (* x x) 0.9)
  (* 0.1 (fma (fma 0.6666666666666666 (* x x) -2.0) (* x x) 1.0))))
double code(double x) {
	return fma(fma(0.016666666666666666, (x * x), -0.2), (x * x), 0.9) + (0.1 * fma(fma(0.6666666666666666, (x * x), -2.0), (x * x), 1.0));
}
function code(x)
	return Float64(fma(fma(0.016666666666666666, Float64(x * x), -0.2), Float64(x * x), 0.9) + Float64(0.1 * fma(fma(0.6666666666666666, Float64(x * x), -2.0), Float64(x * x), 1.0)))
end
code[x_] := N[(N[(N[(0.016666666666666666 * N[(x * x), $MachinePrecision] + -0.2), $MachinePrecision] * N[(x * x), $MachinePrecision] + 0.9), $MachinePrecision] + N[(0.1 * N[(N[(0.6666666666666666 * N[(x * x), $MachinePrecision] + -2.0), $MachinePrecision] * N[(x * x), $MachinePrecision] + 1.0), $MachinePrecision]), $MachinePrecision]), $MachinePrecision]
\begin{array}{l}

\\
\mathsf{fma}\left(\mathsf{fma}\left(0.016666666666666666, x \cdot x, -0.2\right), x \cdot x, 0.9\right) + 0.1 \cdot \mathsf{fma}\left(\mathsf{fma}\left(0.6666666666666666, x \cdot x, -2\right), x \cdot x, 1\right)
\end{array}
Derivation
  1. Initial program 100.0%

    \[\left(0.5 + 0.4 \cdot \cos x\right) + 0.1 \cdot \cos \left(2 \cdot x\right) \]
  2. Add Preprocessing
  3. Taylor expanded in x around 0

    \[\leadsto \color{blue}{\left(\frac{8106479329266893}{9007199254740992} + {x}^{2} \cdot \left(\frac{3602879701896397}{216172782113783808} \cdot {x}^{2} - \frac{3602879701896397}{18014398509481984}\right)\right)} + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
  4. Step-by-step derivation
    1. +-commutative [N/A]

      \[\leadsto \color{blue}{\left({x}^{2} \cdot \left(\frac{3602879701896397}{216172782113783808} \cdot {x}^{2} - \frac{3602879701896397}{18014398509481984}\right) + \frac{8106479329266893}{9007199254740992}\right)} + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
    2. *-commutative [N/A]

      \[\leadsto \left(\color{blue}{\left(\frac{3602879701896397}{216172782113783808} \cdot {x}^{2} - \frac{3602879701896397}{18014398509481984}\right) \cdot {x}^{2}} + \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
    3. lower-fma.f64 [N/A]

      \[\leadsto \color{blue}{\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808} \cdot {x}^{2} - \frac{3602879701896397}{18014398509481984}, {x}^{2}, \frac{8106479329266893}{9007199254740992}\right)} + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
    4. sub-neg [N/A]

      \[\leadsto \mathsf{fma}\left(\color{blue}{\frac{3602879701896397}{216172782113783808} \cdot {x}^{2} + \left(\mathsf{neg}\left(\frac{3602879701896397}{18014398509481984}\right)\right)}, {x}^{2}, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
    5. metadata-eval [N/A]

      \[\leadsto \mathsf{fma}\left(\frac{3602879701896397}{216172782113783808} \cdot {x}^{2} + \color{blue}{\frac{-3602879701896397}{18014398509481984}}, {x}^{2}, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
    6. lower-fma.f64 [N/A]

      \[\leadsto \mathsf{fma}\left(\color{blue}{\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808}, {x}^{2}, \frac{-3602879701896397}{18014398509481984}\right)}, {x}^{2}, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
    7. unpow2 [N/A]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808}, \color{blue}{x \cdot x}, \frac{-3602879701896397}{18014398509481984}\right), {x}^{2}, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
    8. lower-*.f64 [N/A]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808}, \color{blue}{x \cdot x}, \frac{-3602879701896397}{18014398509481984}\right), {x}^{2}, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
    9. unpow2 [N/A]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808}, x \cdot x, \frac{-3602879701896397}{18014398509481984}\right), \color{blue}{x \cdot x}, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \cos \left(2 \cdot x\right) \]
    10. lower-*.f64 [99.0%]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(0.016666666666666666, x \cdot x, -0.2\right), \color{blue}{x \cdot x}, 0.9\right) + 0.1 \cdot \cos \left(2 \cdot x\right) \]
  5. Applied rewrites [99.0%]

    \[\leadsto \color{blue}{\mathsf{fma}\left(\mathsf{fma}\left(0.016666666666666666, x \cdot x, -0.2\right), x \cdot x, 0.9\right)} + 0.1 \cdot \cos \left(2 \cdot x\right) \]
  6. Taylor expanded in x around 0

    \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808}, x \cdot x, \frac{-3602879701896397}{18014398509481984}\right), x \cdot x, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \color{blue}{\left(1 + {x}^{2} \cdot \left(\frac{2}{3} \cdot {x}^{2} - 2\right)\right)} \]
  7. Step-by-step derivation
    1. +-commutative [N/A]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808}, x \cdot x, \frac{-3602879701896397}{18014398509481984}\right), x \cdot x, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \color{blue}{\left({x}^{2} \cdot \left(\frac{2}{3} \cdot {x}^{2} - 2\right) + 1\right)} \]
    2. *-commutative [N/A]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808}, x \cdot x, \frac{-3602879701896397}{18014398509481984}\right), x \cdot x, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \left(\color{blue}{\left(\frac{2}{3} \cdot {x}^{2} - 2\right) \cdot {x}^{2}} + 1\right) \]
    3. lower-fma.f64 [N/A]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808}, x \cdot x, \frac{-3602879701896397}{18014398509481984}\right), x \cdot x, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \color{blue}{\mathsf{fma}\left(\frac{2}{3} \cdot {x}^{2} - 2, {x}^{2}, 1\right)} \]
    4. sub-neg [N/A]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808}, x \cdot x, \frac{-3602879701896397}{18014398509481984}\right), x \cdot x, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \mathsf{fma}\left(\color{blue}{\frac{2}{3} \cdot {x}^{2} + \left(\mathsf{neg}\left(2\right)\right)}, {x}^{2}, 1\right) \]
    5. metadata-eval [N/A]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808}, x \cdot x, \frac{-3602879701896397}{18014398509481984}\right), x \cdot x, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \mathsf{fma}\left(\frac{2}{3} \cdot {x}^{2} + \color{blue}{-2}, {x}^{2}, 1\right) \]
    6. lower-fma.f64 [N/A]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808}, x \cdot x, \frac{-3602879701896397}{18014398509481984}\right), x \cdot x, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \mathsf{fma}\left(\color{blue}{\mathsf{fma}\left(\frac{2}{3}, {x}^{2}, -2\right)}, {x}^{2}, 1\right) \]
    7. unpow2 [N/A]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808}, x \cdot x, \frac{-3602879701896397}{18014398509481984}\right), x \cdot x, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \mathsf{fma}\left(\mathsf{fma}\left(\frac{2}{3}, \color{blue}{x \cdot x}, -2\right), {x}^{2}, 1\right) \]
    8. lower-*.f64 [N/A]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808}, x \cdot x, \frac{-3602879701896397}{18014398509481984}\right), x \cdot x, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \mathsf{fma}\left(\mathsf{fma}\left(\frac{2}{3}, \color{blue}{x \cdot x}, -2\right), {x}^{2}, 1\right) \]
    9. unpow2 [N/A]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(\frac{3602879701896397}{216172782113783808}, x \cdot x, \frac{-3602879701896397}{18014398509481984}\right), x \cdot x, \frac{8106479329266893}{9007199254740992}\right) + \frac{3602879701896397}{36028797018963968} \cdot \mathsf{fma}\left(\mathsf{fma}\left(\frac{2}{3}, x \cdot x, -2\right), \color{blue}{x \cdot x}, 1\right) \]
    10. lower-*.f64 [99.0%]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(0.016666666666666666, x \cdot x, -0.2\right), x \cdot x, 0.9\right) + 0.1 \cdot \mathsf{fma}\left(\mathsf{fma}\left(0.6666666666666666, x \cdot x, -2\right), \color{blue}{x \cdot x}, 1\right) \]
  8. Applied rewrites [99.0%]

    \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(0.016666666666666666, x \cdot x, -0.2\right), x \cdot x, 0.9\right) + 0.1 \cdot \color{blue}{\mathsf{fma}\left(\mathsf{fma}\left(0.6666666666666666, x \cdot x, -2\right), x \cdot x, 1\right)} \]
  9. Add Preprocessing

Alternative 6: 98.9% accurate, 9.7× speedup

\[\begin{array}{l} \\ \mathsf{fma}\left(\mathsf{fma}\left(0.08333333333333334, x \cdot x, -0.4\right), x \cdot x, 1\right) \end{array} \]
(FPCore (x)
 :precision binary64
 (fma (fma 0.08333333333333334 (* x x) -0.4) (* x x) 1.0))
double code(double x) {
	return fma(fma(0.08333333333333334, (x * x), -0.4), (x * x), 1.0);
}
function code(x)
	return fma(fma(0.08333333333333334, Float64(x * x), -0.4), Float64(x * x), 1.0)
end
code[x_] := N[(N[(0.08333333333333334 * N[(x * x), $MachinePrecision] + -0.4), $MachinePrecision] * N[(x * x), $MachinePrecision] + 1.0), $MachinePrecision]
\begin{array}{l}

\\
\mathsf{fma}\left(\mathsf{fma}\left(0.08333333333333334, x \cdot x, -0.4\right), x \cdot x, 1\right)
\end{array}
Derivation
  1. Initial program 100.0%

    \[\left(0.5 + 0.4 \cdot \cos x\right) + 0.1 \cdot \cos \left(2 \cdot x\right) \]
  2. Add Preprocessing
  3. Taylor expanded in x around 0

    \[\leadsto \color{blue}{\frac{36028797018963969}{36028797018963968} + {x}^{2} \cdot \left(\frac{18014398509481985}{216172782113783808} \cdot {x}^{2} - \frac{3602879701896397}{9007199254740992}\right)} \]
  4. Step-by-step derivation
    1. +-commutative [N/A]

      \[\leadsto \color{blue}{{x}^{2} \cdot \left(\frac{18014398509481985}{216172782113783808} \cdot {x}^{2} - \frac{3602879701896397}{9007199254740992}\right) + \frac{36028797018963969}{36028797018963968}} \]
    2. *-commutative [N/A]

      \[\leadsto \color{blue}{\left(\frac{18014398509481985}{216172782113783808} \cdot {x}^{2} - \frac{3602879701896397}{9007199254740992}\right) \cdot {x}^{2}} + \frac{36028797018963969}{36028797018963968} \]
    3. lower-fma.f64 [N/A]

      \[\leadsto \color{blue}{\mathsf{fma}\left(\frac{18014398509481985}{216172782113783808} \cdot {x}^{2} - \frac{3602879701896397}{9007199254740992}, {x}^{2}, \frac{36028797018963969}{36028797018963968}\right)} \]
    4. sub-neg [N/A]

      \[\leadsto \mathsf{fma}\left(\color{blue}{\frac{18014398509481985}{216172782113783808} \cdot {x}^{2} + \left(\mathsf{neg}\left(\frac{3602879701896397}{9007199254740992}\right)\right)}, {x}^{2}, \frac{36028797018963969}{36028797018963968}\right) \]
    5. metadata-eval [N/A]

      \[\leadsto \mathsf{fma}\left(\frac{18014398509481985}{216172782113783808} \cdot {x}^{2} + \color{blue}{\frac{-3602879701896397}{9007199254740992}}, {x}^{2}, \frac{36028797018963969}{36028797018963968}\right) \]
    6. lower-fma.f64 [N/A]

      \[\leadsto \mathsf{fma}\left(\color{blue}{\mathsf{fma}\left(\frac{18014398509481985}{216172782113783808}, {x}^{2}, \frac{-3602879701896397}{9007199254740992}\right)}, {x}^{2}, \frac{36028797018963969}{36028797018963968}\right) \]
    7. unpow2 [N/A]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(\frac{18014398509481985}{216172782113783808}, \color{blue}{x \cdot x}, \frac{-3602879701896397}{9007199254740992}\right), {x}^{2}, \frac{36028797018963969}{36028797018963968}\right) \]
    8. lower-*.f64 [N/A]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(\frac{18014398509481985}{216172782113783808}, \color{blue}{x \cdot x}, \frac{-3602879701896397}{9007199254740992}\right), {x}^{2}, \frac{36028797018963969}{36028797018963968}\right) \]
    9. unpow2 [N/A]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(\frac{18014398509481985}{216172782113783808}, x \cdot x, \frac{-3602879701896397}{9007199254740992}\right), \color{blue}{x \cdot x}, \frac{36028797018963969}{36028797018963968}\right) \]
    10. lower-*.f64 [99.0%]

      \[\leadsto \mathsf{fma}\left(\mathsf{fma}\left(0.08333333333333334, x \cdot x, -0.4\right), \color{blue}{x \cdot x}, 1\right) \]
  5. Applied rewrites [99.0%]

    \[\leadsto \color{blue}{\mathsf{fma}\left(\mathsf{fma}\left(0.08333333333333334, x \cdot x, -0.4\right), x \cdot x, 1\right)} \]
  6. Add Preprocessing
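This alternative Taylor-expands the whole expression at once: the constant terms sum to 1, the x² coefficients to -0.4, and the x⁴ coefficients to 1/60 + 1/15 = 1/12 (0.083333…). A Python sketch with hypothetical names, showing the quartic is tight near 0 but drifts away from the expansion point:

```python
import math

def original(x):
    return (0.5 + 0.4 * math.cos(x)) + 0.1 * math.cos(2.0 * x)

def quartic(x):
    # fma(fma(1/12, x*x, -0.4), x*x, 1), written with plain arithmetic
    xx = x * x
    return (0.08333333333333334 * xx - 0.4) * xx + 1.0

# Tight near the expansion point ...
assert abs(quartic(0.1) - original(0.1)) < 1e-7
# ... but the error grows away from it
assert abs(quartic(2.0) - original(2.0)) > 0.1
```

The high reported speedup comes from eliminating both cosine calls entirely; the accuracy figure reflects that most sampled inputs over [-1000, 1000] are far from 0, where the polynomial is only a rough fit.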

Alternative 7: 98.6% accurate, 18.5× speedup

\[\begin{array}{l} \\ \mathsf{fma}\left(x \cdot x, -0.4, 1\right) \end{array} \]
(FPCore (x) :precision binary64 (fma (* x x) -0.4 1.0))
double code(double x) {
	return fma((x * x), -0.4, 1.0);
}
function code(x)
	return fma(Float64(x * x), -0.4, 1.0)
end
code[x_] := N[(N[(x * x), $MachinePrecision] * -0.4 + 1.0), $MachinePrecision]
\begin{array}{l}

\\
\mathsf{fma}\left(x \cdot x, -0.4, 1\right)
\end{array}
Derivation
  1. Initial program 100.0%

    \[\left(0.5 + 0.4 \cdot \cos x\right) + 0.1 \cdot \cos \left(2 \cdot x\right) \]
  2. Add Preprocessing
  3. Taylor expanded in x around 0

    \[\leadsto \color{blue}{\frac{36028797018963969}{36028797018963968} + \frac{-3602879701896397}{9007199254740992} \cdot {x}^{2}} \]
  4. Step-by-step derivation
    1. +-commutative [N/A]

      \[\leadsto \color{blue}{\frac{-3602879701896397}{9007199254740992} \cdot {x}^{2} + \frac{36028797018963969}{36028797018963968}} \]
    2. *-commutative [N/A]

      \[\leadsto \color{blue}{{x}^{2} \cdot \frac{-3602879701896397}{9007199254740992}} + \frac{36028797018963969}{36028797018963968} \]
    3. lower-fma.f64 [N/A]

      \[\leadsto \color{blue}{\mathsf{fma}\left({x}^{2}, \frac{-3602879701896397}{9007199254740992}, \frac{36028797018963969}{36028797018963968}\right)} \]
    4. unpow2 [N/A]

      \[\leadsto \mathsf{fma}\left(\color{blue}{x \cdot x}, \frac{-3602879701896397}{9007199254740992}, \frac{36028797018963969}{36028797018963968}\right) \]
    5. lower-*.f64 [98.5%]

      \[\leadsto \mathsf{fma}\left(\color{blue}{x \cdot x}, -0.4, 1\right) \]
  5. Applied rewrites [98.5%]

    \[\leadsto \color{blue}{\mathsf{fma}\left(x \cdot x, -0.4, 1\right)} \]
  6. Add Preprocessing
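Truncating the same expansion at degree 2 leaves 1 - 0.4x², which is cheaper still but accurate only very close to 0. A Python sketch with hypothetical names:

```python
import math

def original(x):
    return (0.5 + 0.4 * math.cos(x)) + 0.1 * math.cos(2.0 * x)

def quadratic(x):
    # fma(x*x, -0.4, 1), written with plain arithmetic
    return x * x * -0.4 + 1.0

# Degree-2 truncation: O(x^4) error, so only very small x stay accurate
assert abs(quadratic(0.01) - original(0.01)) < 1e-7
assert abs(quadratic(0.5) - original(0.5)) > 1e-3
```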

Alternative 8: 98.3% accurate, 222.0× speedup

\[\begin{array}{l} \\ 1 \end{array} \]
(FPCore (x) :precision binary64 1.0)
double code(double x) {
	return 1.0;
}
real(8) function code(x)
    real(8), intent (in) :: x
    code = 1.0d0
end function
public static double code(double x) {
	return 1.0;
}
def code(x):
	return 1.0
function code(x)
	return 1.0
end
function tmp = code(x)
	tmp = 1.0;
end
code[x_] := 1.0
\begin{array}{l}

\\
1
\end{array}
Derivation
  1. Initial program 100.0%

    \[\left(0.5 + 0.4 \cdot \cos x\right) + 0.1 \cdot \cos \left(2 \cdot x\right) \]
  2. Add Preprocessing
  3. Taylor expanded in x around 0

    \[\leadsto \color{blue}{\frac{36028797018963969}{36028797018963968}} \]
  4. Step-by-step derivation
    1. Applied rewrites [98.1%]

      \[\leadsto \color{blue}{1} \]
    2. Add Preprocessing

Reproduce

herbie shell --seed 1
    (FPCore (x)
      :name "0.5 + 0.4 * cos(x) + 0.1 * cos(2 x)"
      :precision binary64
      :pre (and (<= -1000.0 x) (<= x 1000.0))
      (+ (+ 0.5 (* 0.4 (cos x))) (* 0.1 (cos (* 2.0 x)))))