Differential Equations Week 7

By Tyler Clarke in Calculus on 2025-6-30

Welcome back! It's been one hell of a week, in the best way possible. We are haz Laplace! Last week, we introduced Laplace transforms; this week, we're actually using them to solve equations.

Briefly: Laplace Features

The good ol' Laplace transform has a few features we need to cover. First is that `L(e^{ct} f(t)) = F(s - c)`, where `L(f(t)) = F(s)`. This is fairly easy to prove, but I'm not going to. This feature is incredibly useful for solving some difficult problems: for instance, `L(e^{3t}cos(2t))` is just `F(s - 3)` where `F(s) = L(cos(2t))`.
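
If you want to sanity-check this with a computer, here's a minimal sketch using Python's sympy (my own choice of tool, not something the course uses) that verifies the shift feature on the `e^{3t}cos(2t)` example:

```python
# Minimal sympy sketch: check that L(e^{3t} cos(2t)) equals F(s - 3),
# where F(s) = L(cos(2t)). sympy is an assumption here; any CAS works.
import sympy as sp

t, s = sp.symbols('t s', positive=True)

F = sp.laplace_transform(sp.cos(2*t), t, s, noconds=True)             # F(s) = s/(s^2 + 4)
shifted = sp.laplace_transform(sp.exp(3*t)*sp.cos(2*t), t, s, noconds=True)

print(sp.simplify(shifted - F.subs(s, s - 3)))  # expect 0
```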

Also of note is that the Laplace transform of a derivative is trivial: if `F(s) = L(f(t))`, then `L(f'(t)) = s F(s) - f(0)`. Easy! This extends nicely to higher derivatives: applying the same formula to `f'`, `L(f''(t)) = s L(f'(t)) - f'(0)`, which expands nicely to `L(f''(t)) = s (s L(f(t)) - f(0)) - f'(0) = s^2 L(f(t)) - s f(0) - f'(0)`. This process can be repeated for arbitrarily high derivatives.
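
Here's the same kind of hedged sympy check for the second-derivative formula, using the arbitrary choice `f(t) = sin(2t)`:

```python
# Minimal sympy sketch: check L(f''(t)) = s^2 F(s) - s f(0) - f'(0) for f(t) = sin(2t).
import sympy as sp

t, s = sp.symbols('t s', positive=True)
f = sp.sin(2*t)

F = sp.laplace_transform(f, t, s, noconds=True)                        # F(s) = 2/(s^2 + 4)
lhs = sp.laplace_transform(sp.diff(f, t, 2), t, s, noconds=True)       # L(f''(t))
rhs = s**2*F - s*f.subs(t, 0) - sp.diff(f, t).subs(t, 0)               # s^2 F(s) - s f(0) - f'(0)

print(sp.simplify(lhs - rhs))  # expect 0
```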

Finally, the Laplace transform of `t^n f(t)` is `(-1)^n F^{(n)}(s)`: negative one to the nth power times the nth derivative of the Laplace transform of `f(t)`.
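
And one more quick sympy sketch, checking the `t^n` rule for the arbitrary choice `n = 2`, `f(t) = e^{-t}`:

```python
# Minimal sympy sketch: check L(t^2 e^{-t}) = (-1)^2 F''(s) with F(s) = L(e^{-t}).
import sympy as sp

t, s = sp.symbols('t s', positive=True)
n, f = 2, sp.exp(-t)

F = sp.laplace_transform(f, t, s, noconds=True)                        # F(s) = 1/(s + 1)
lhs = sp.laplace_transform(t**n * f, t, s, noconds=True)               # L(t^n f(t))
rhs = (-1)**n * sp.diff(F, s, n)                                       # (-1)^n F^{(n)}(s)

print(sp.simplify(lhs - rhs))  # expect 0
```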

All of these except the last are pretty easy to prove yourself. If you're interested in the full proofs, I highly recommend reading the textbook.

Invertin'!

Before we can actually solve, we need to cover one last thing: how do we get the function out of Laplace-space? This is hard. In fact, it's sufficiently hard that the easiest way is to use a lookup table of well-known Laplace transforms, pick the one that matches, and use it in reverse. For instance, the inverse Laplace transform `L^{-1}(frac 1 s)` is easily found to be `1`. It's time to build a rolling table!

We'll add to that as we go along. For a full table, consult page 313 in the textbook.

Note that in many cases, there's some algebraic teasing you can do to the Laplace transform to make it invertible. For instance, linearity of the Laplace transform means that `L^{-1}(frac {a + s} {s^2 + a^2})` is equivalent to `L^{-1}(frac a {s^2 + a^2}) + L^{-1}(frac s {s^2 + a^2})`, which is incredibly useful: the compact form doesn't match any table entry, but both expanded pieces do, and the result is just `f(t) = cos(at) + sin(at)`. Easy-ish!
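
If you'd like to see that split confirmed mechanically, here's a minimal sympy sketch (note that sympy tacks a `Heaviside(t)` factor onto inverse transforms, which is harmless since we only care about `t >= 0`):

```python
# Minimal sympy sketch: invert (a + s)/(s^2 + a^2) and confirm it splits into
# sin(at) + cos(at) by linearity.
import sympy as sp

t, s, a = sp.symbols('t s a', positive=True)

f = sp.inverse_laplace_transform((a + s)/(s**2 + a**2), s, t)
print(sp.simplify(f))  # expect (sin(a*t) + cos(a*t))*Heaviside(t)
```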

Let's do a serious example. The textbook kindly provides us the problem `L^{-1}(frac 2 {(s + 2)^4} + frac 3 {s^2 + 16} + 5 frac {s + 1} {s^2 + 2s + 5})`. This looks intimidating, but it's easier than it seems. Let's solve it piece by piece. First up is `L^{-1} (frac 2 {(s + 2)^4})`. The closest match is the identity `L(t^n e^{at}) = frac {n!} {(s - a)^{n+1}}`, which would give us `n=3` and `a=-2`: `3! = 6`, and we need a numerator of `2`, so we'll have the constant coefficient `frac 2 6 = frac 1 3`. Thus: `L^{-1} (frac 2 {(s + 2)^4}) = frac 1 3 t^3 e^{-2t}`.

Next problem. `L^{-1}(frac 3 {s^2 + 16})` is best matched by the identity `L(sin(at)) = frac a {s^2 + a^2}`, where `a=4`; this gives us the constant coefficient `frac 3 4`, to a final result `L^{-1}(frac 3 {s^2 + 16}) = frac 3 4 sin(4t)`.

Last but not least: `L^{-1} (5 frac {s + 1} {s^2 + 2s + 5})`. This most closely matches the identity `L(e^{at} cos(bt)) = frac {s - a} {(s - a)^2 + b^2}`: `a` is obviously `-1`, so we expand this out to get `frac {s + 1} {s^2 + 2s + 1 + b^2}`: for this to match, `b^2 = 4`, so `b = 2`. A hop, skip, and a substitution later, and we have `L^{-1} (5 frac {s + 1} {s^2 + 2s + 5}) = 5 e^{-t} cos(2t)`.

Adding all these together, we get `L^{-1}(frac 2 {(s + 2)^4} + frac 3 {s^2 + 16} + 5 frac {s + 1} {s^2 + 2s + 5}) = frac 1 3 t^3 e^{-2t} + frac 3 4 sin(4t) + 5 e^{-t} cos(2t)`. Surprisingly easy!
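
If you'd rather have a computer grind through the whole inversion, here's a hedged sympy sketch of the same problem:

```python
# Minimal sympy sketch: invert the whole textbook expression in one go and compare
# against the hand-computed answer.
import sympy as sp

t, s = sp.symbols('t s', positive=True)

F = 2/(s + 2)**4 + 3/(s**2 + 16) + 5*(s + 1)/(s**2 + 2*s + 5)
f = sp.inverse_laplace_transform(F, s, t)

print(sp.simplify(f))
# expect t**3*exp(-2*t)/3 + 3*sin(4*t)/4 + 5*exp(-t)*cos(2*t), times Heaviside(t)
```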

These are the inversion substitutions we've covered so far:

`L^{-1}(frac 1 s) = 1`
`L^{-1}(frac {n!} {(s - a)^{n+1}}) = t^n e^{at}`
`L^{-1}(frac a {s^2 + a^2}) = sin(at)`
`L^{-1}(frac s {s^2 + a^2}) = cos(at)`
`L^{-1}(frac {s - a} {(s - a)^2 + b^2}) = e^{at} cos(bt)`

Solving!

Solving is actually surprisingly easy. The process, for any differential equation, is to take the Laplace transform of both sides, solve for `L(y)`, and then invert `L(y)` to get `y`. This makes it possible to represent difficult differential problems as simple algebra problems.

Let's start with an example. The textbook kindly gives us `y' + 2y = sin(4t), y(0) = 1`. We'll start by taking the Laplace transform of both sides: we end up with something like `s L(y) - y(0) + 2L(y) = frac 4 {s^2 + 16}`. Our goal is to isolate `L(y)`, which is pretty easy to do algebraically: we get `L(y) = frac {frac 4 {s^2 + 16} + y(0)} {s + 2}`. Inverting is easier when the transform is broken into simple pieces, so we expand to get `L(y) = frac 4 {(s^2 + 16)(s + 2)} + frac 1 {s + 2}` (note that I substituted `y(0) = 1`).

This is going to require a partial fraction expansion. We can use `frac 4 {(s^2 + 16)(s + 2)} = frac A {s + 2} + frac {sB + C} {s^2 + 16}`; matching coefficients gives `A = frac 1 5`, `B = -frac 1 5`, and `C = frac 2 5`, so `frac 4 {(s^2 + 16)(s + 2)} = frac 1 5 (frac 1 {s + 2} + frac {2 - s} {s^2 + 16})`. Recombining and rearranging gives us `L(y) = frac 6 5 frac 1 {s + 2} - frac 1 5 frac {s} {s^2 + 16} + frac 1 10 frac {4} {s^2 + 16}`. This is actually pretty easy to solve with the identities `L(e^{-2t}) = frac 1 {s + 2}`, `L(cos(4t)) = frac s {s^2 + 16}`, and `L(sin(4t)) = frac 4 {s^2 + 16}`. We get the result `y = frac 6 5 e^{-2t} - frac 1 5 cos(4t) + frac 1 10 sin(4t)`. Nice!
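
To see the whole recipe run end to end, here's a minimal sympy sketch of this exact problem; the symbol `Y` standing in for `L(y)` is just my notation, not anything standard:

```python
# Minimal sympy sketch of the recipe: transform y' + 2y = sin(4t) with y(0) = 1,
# solve for Y = L(y), partial-fraction it, then invert.
import sympy as sp

t, s = sp.symbols('t s', positive=True)
Y = sp.symbols('Y')  # stands in for L(y)

# L(y') = s*Y - y(0) and L(sin(4t)) = 4/(s^2 + 16)
transformed = sp.Eq((s*Y - 1) + 2*Y, 4/(s**2 + 16))
Ysol = sp.solve(transformed, Y)[0]                     # isolate L(y)
y = sp.inverse_laplace_transform(sp.apart(Ysol, s), s, t)

print(sp.simplify(y))
# expect 6*exp(-2*t)/5 - cos(4*t)/5 + sin(4*t)/10, times Heaviside(t)
```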

Briefly: Step Functions and Time Shifting

The unit step function, or Heaviside function, is a piecewise function defined as `h(t) = 0; t < 0` and `h(t) = 1; t >= 0`: if we're using electric circuits as an analogy, the Heaviside function throws the switch at precisely time 0. There are a couple other step functions and variants: for instance, the indicator function, which is `1` between `0` and `d`, and `0` otherwise.

Many useful situations involve a shift of `c` units across the `t` axis. The typical way to do this is to multiply by a shifted Heaviside function `h_c` (`0` before `c`, `1` after it), and subtract `c` from the function's argument. That's a mouthful, but it's really quite simple: given `f(t)` and some shift `c`, the shifted version is `h_c(t) f(t - c)`. The Laplace transform of this is actually very easy: `L(h_c(t) f(t - c)) = e^{-cs} F(s)`. I won't bother proving why, but it's interesting and I highly recommend reading the textbook about it.
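
Here's one last hedged sympy sketch checking the time-shift rule, with the arbitrary choices `f(t) = sin(t)` and `c = 3` (this assumes a reasonably recent sympy that knows how to transform Heaviside products):

```python
# Minimal sympy sketch: check L(h_c(t) f(t - c)) = e^{-cs} F(s) for f(t) = sin(t), c = 3.
import sympy as sp

t, s = sp.symbols('t s', positive=True)
f, c = sp.sin(t), 3  # arbitrary choices for illustration

F = sp.laplace_transform(f, t, s, noconds=True)                        # F(s) = 1/(s^2 + 1)
shifted = sp.laplace_transform(sp.Heaviside(t - c)*f.subs(t, t - c), t, s, noconds=True)

print(sp.simplify(shifted - sp.exp(-c*s)*F))  # expect 0
```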