Models of computation
Big-O notation
Big-\(O\), or \(\mathcal O\), notation relates the asymptotic growth of one function to that of another.
Definition 1 Big-O.
Given two functions \(f, g: \mathbb R \to \mathbb R^{\geq 0}\), we say that \[f(x) = O(g(x)), \text{ or } f(x)\in O(g(x)),\] when there exists a constant \(M > 0\) such that for all sufficiently large \(x\): \[ |f(x)| \leq M\cdot g(x).\]
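As a concrete illustration, the definition can be checked numerically on sample points. The sketch below (hypothetical names, not from the original) tests whether \(f(x) = 3x^2 + 5x\) satisfies \(|f(x)| \leq M \cdot g(x)\) with \(g(x) = x^2\), \(M = 4\), for all \(x \geq 5\); this is only a finite spot-check, not a proof, since the definition quantifies over all sufficiently large \(x\).

```python
def bounded_above(f, g, M, x0, samples):
    """Check |f(x)| <= M * g(x) at every sample point x >= x0.

    A finite numerical spot-check of the Big-O condition from
    Definition 1 -- it cannot prove the bound for all x.
    """
    return all(abs(f(x)) <= M * g(x) for x in samples if x >= x0)

f = lambda x: 3 * x**2 + 5 * x
g = lambda x: x**2

# 3x^2 + 5x <= 4x^2 holds exactly when x >= 5, so M = 4, x0 = 5 works:
print(bounded_above(f, g, M=4, x0=5, samples=range(1, 10_000)))  # True

# With M = 3 the bound fails for every x > 0, since 5x > 0:
print(bounded_above(f, g, M=3, x0=5, samples=range(1, 10_000)))  # False
```

Here the algebra confirms the check: \(3x^2 + 5x \leq 4x^2\) is equivalent to \(5x \leq x^2\), i.e. \(x \geq 5\), so \(f(x) = O(x^2)\) with witness \(M = 4\).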
Quiz time!
Proof of asdf with Definition 1.