If you are interested in a back-of-the-envelope order-of-magnitude estimate, you might consider looking at how $\binom{n}{k}$ behaves when $k=k(n)$ has a certain size. The idea I have in mind is to break $\sum_{k=0}^m\binom{n}{k}$ into sums over intervals of $k$ falling in a given regime: for example, the terms where $k=\Theta(n)$, the terms where $k=\Theta(n^{1/2})$, etc. In general, Stirling's approximation gives:
$\binom{n}{k}\approx\frac{n^ke^k}{k^k\sqrt{2\pi k}}\,A$
where $A:=\frac{n_k}{n^k}=\prod_{i=0}^{k-1}\left(1-\frac{i}{n}\right)$ and $n_k=n(n-1)\cdots(n-k+1)$ is the falling factorial. In particular, it's nicer to work with $B:=\ln(A) = \sum_{i=0}^{k-1} \ln\left(1-\frac{i}{n}\right)$.
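As a quick sanity check on the Stirling form above, here is a small sketch (the function name `stirling_binom` and the sample values of $n$ and $k$ are mine, chosen only for illustration):

```python
import math

def stirling_binom(n, k):
    # Stirling's approximation k! ~ sqrt(2 pi k) (k/e)^k gives
    # binom(n, k) ~ n^k e^k / (k^k sqrt(2 pi k)) * A,
    # where A = prod_{i=0}^{k-1} (1 - i/n) = n_k / n^k.
    A = math.prod(1 - i / n for i in range(k))
    return n**k * math.e**k / (k**k * math.sqrt(2 * math.pi * k)) * A

n, k = 1000, 50
print(stirling_binom(n, k) / math.comb(n, k))  # close to 1
```

The ratio to the exact value differs from $1$ by roughly $\frac{1}{12k}$, the leading correction in the Stirling series.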
Now the idea is that each logarithm in $B$ can be Taylor expanded up to "sufficient" order, depending on the size of $k$ compared to $n$. For example, if $k=o(n)$, then $B\approx \sum_{i=0}^{k-1}-\frac{i}{n}\approx -\frac{k^2}{2n}$, so you get $A=e^{-\frac{k^2}{2n}(1+o(1))}$. In fact, you can do better than this if you expand $B$ to higher order. In particular, if $k=o(n^{2/3})$, then $B=\sum_{i=0}^{k-1}\left(-\frac{i}{n}+O\!\left(\frac{i^2}{n^2}\right)\right)=-\frac{k^2}{2n}+O\!\left(\frac{k^3}{n^2}\right)=-\frac{k^2}{2n}+o(1)$, which gives $A=e^{-\frac{k^2}{2n}}(1+o(1))$, where now the $o(1)$ is no longer in the exponent. For other sizes of $k$, the exact same procedure works as long as you expand $B$ to sufficiently high order.
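To see the $k=o(n^{2/3})$ regime numerically, here is a short sketch (the particular sizes $n=10^6$ and the sample values of $k$ are hypothetical choices for illustration):

```python
import math

# For k well below n^{2/3}, the product A = prod_{i<k} (1 - i/n)
# should match e^{-k^2/(2n)} up to a 1 + o(1) factor.
n = 10**6  # so n^{2/3} = 10^4
for k in (100, 1000, 5000):
    A = math.prod(1 - i / n for i in range(k))
    approx = math.exp(-k**2 / (2 * n))
    print(k, A / approx)
```

The ratio drifts away from $1$ as $k$ approaches $n^{2/3}$, consistent with the error term $O(k^3/n^2)$ in the expansion of $B$.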