Deadly Boring Math
The Mostest Bestestest FOSS Math Journal on the Internet[citation needed]


Differential Quiz 5

By Tyler Clarke in Calculus on 2025-7-16

The fifth and final quiz is tomorrow in the normal studio section. If you're happy with your previous quiz grades, you don't need to attend! One is dropped. I'm not quite so lucky as to be able to skip, hence this post.

WS5.7.1: Solving with Laplace (easy)

Let's start with an easy one. We're given `y'' - y = -20 delta (t - 3), y(0) = 4, y'(0) = 4`, and asked to solve for `y`. Implicitly, we need to solve with Laplace transforms: any other technique would get unpleasant, fast. Taking the Laplace transform of both sides is pretty easy: `(s^2 - 1)L(y) - 4s - 4 = -20e^{-3s}`. Rearranging and solving yields `L(y) = frac { 4 + 4s - 20e^{-3s} } {s^2 - 1}`. This is not hard to take an inverse Laplace transform of; note that we can separate it out to `L(y) = 4 frac 1 {s^2 - 1} + 4 frac {s} {s^2 - 1} - 20e^{-3s} frac 1 { s^2 - 1 }`. This is convenient! We know from the Laplace table (thankfully, memorizing this isn't necessary) that `L(sinh (at)) = frac a {s^2 - a^2}`, `L(cosh(at)) = frac s {s^2 - a^2}`, and `L(u_c (t) f(t - c)) = e^{-cs} L(f(t))`, so this separated form really nicely reverse-Laplaces down to `y = 4 sinh(t) + 4 cosh(t) - 20u_{3}(t) sinh(t - 3)`.
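
This won't be on the quiz, but it's a nice confidence-builder: a quick numerical sanity check (a plain-Python sketch, no libraries; the helper names are mine). The key idea is that the `-20 delta(t - 3)` forcing does nothing except instantaneously kick `y'` down by 20 at `t = 3`, so we can integrate `y'' - y = 0` with RK4, apply the kick by hand, and compare against the closed form.

```python
import math

# y'' - y = 0 as the first-order system (y, y')' = (y', y)
def deriv(t, y):
    return [y[1], y[0]]

def rk4(t0, t1, state, h=1e-3):
    # classical fourth-order Runge-Kutta from t0 to t1
    n = max(1, round((t1 - t0) / h))
    dt = (t1 - t0) / n
    t = t0
    for _ in range(n):
        k1 = deriv(t, state)
        k2 = deriv(t + dt/2, [state[i] + dt/2 * k1[i] for i in (0, 1)])
        k3 = deriv(t + dt/2, [state[i] + dt/2 * k2[i] for i in (0, 1)])
        k4 = deriv(t + dt, [state[i] + dt * k3[i] for i in (0, 1)])
        state = [state[i] + dt/6 * (k1[i] + 2*k2[i] + 2*k3[i] + k4[i]) for i in (0, 1)]
        t += dt
    return state

def simulate(t_end):
    state = rk4(0.0, min(3.0, t_end), [4.0, 4.0])  # y(0) = 4, y'(0) = 4
    if t_end > 3.0:
        state[1] -= 20.0  # -20 delta(t - 3): y' jumps down by 20 at t = 3
        state = rk4(3.0, t_end, state)
    return state[0]

def exact(t):
    # the closed-form answer from the Laplace solve
    y = 4*math.sinh(t) + 4*math.cosh(t)
    if t >= 3:
        y -= 20*math.sinh(t - 3)
    return y

for t in (2.0, 5.0):
    assert abs(simulate(t) - exact(t)) < 1e-6 * abs(exact(t))
```

The two checkpoints straddle the impulse, so both the homogeneous part and the jump get exercised.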

Note that hyperbolic trig isn't actually necessary to solve this problem; `4 sinh(t) + 4 cosh(t) = 4e^t`, for instance, so everything can be expanded into plain exponentials. The worksheet has a different (but probably equivalent) result which does not use cosh or sinh.

WS5.7.2: Solving with Laplace (gettin' harder all the tii-iime)

This is in much the same vein, but far more complicated. We're given `y'' + y = delta(t - 2 pi) frac {1 + cos(t)} {1 + t^2}, y(0) = 0, y'(0) = 0`, and asked to solve. Note that, for the left-hand side, we easily get `L(y'' + y) = (s^2 + 1)L(y)`. What of the right-hand side? It turns out there's a little-covered (completely ignored in the textbook, as far as I can tell) property of the delta function: `L(delta (t - t_0) f(t)) = e^{-s t_0} f(t_0)`. In this case, `t_0 = 2 pi`, so `(s^2 + 1)L(y) = e^{-2 pi s} frac {2} {1 + 4 pi^2}`. Rearrange: `L(y) = frac {e^{-2 pi s}} {s^2 + 1} frac {2} {1 + 4 pi^2}`. Note that much of this is constant! Using the rule that `L(u_c(t)f(t - c)) = e^{-cs} L(f(t))` and `L(sin(at)) = frac a {s^2 + a^2}`, we can invert this to get `y = frac {2} {1 + 4 pi^2} u_{2pi}(t) sin(t - 2pi)`. (Since sine is `2 pi`-periodic, `sin(t - 2 pi) = sin(t)`, if you prefer that form.)
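
If that `L(delta(t - t_0) f(t)) = e^{-s t_0} f(t_0)` property feels like black magic, it's easy to check numerically: replace the delta with a narrow Gaussian bump and evaluate the Laplace integral directly. A rough plain-Python sketch (`eps` controls how narrow the bump is; these names are mine):

```python
import math

def f(t):
    # the coefficient multiplying the delta in this problem
    return (1 + math.cos(t)) / (1 + t*t)

def laplace_with_delta_bump(s, t0, eps=0.01, n=4000):
    # approximate delta(t - t0) by a narrow Gaussian and compute
    # the integral of delta_eps(t - t0) * f(t) * e^{-s t} dt (trapezoid rule)
    a, b = t0 - 8*eps, t0 + 8*eps
    h = (b - a) / n
    total = 0.0
    for i in range(n + 1):
        t = a + i*h
        bump = math.exp(-((t - t0)**2) / (2*eps*eps)) / (eps * math.sqrt(2*math.pi))
        w = 0.5 if i in (0, n) else 1.0
        total += w * bump * f(t) * math.exp(-s*t)
    return total * h

s, t0 = 1.0, 2*math.pi
exact = math.exp(-s*t0) * f(t0)   # the sifting rule: e^{-s t0} f(t0)
assert abs(laplace_with_delta_bump(s, t0) - exact) < 1e-3 * exact
```

As `eps` shrinks, the approximation converges on `e^{-s t_0} f(t_0)`, which is exactly the rule used above.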

WS5.8.1: How Very Convoluted

We're asked to find the Laplace transform of `int_0^t e^{-(t - tau)} sin(tau) d tau`. How very convenient; this is exactly the right form for a convolution integral! Remember that `L(int_0^t f(t - tau) g(tau) d tau) = L(f(t)) L(g(t))`. In this case, `f(t) = e^{-t}`, and `g(t) = sin(t)`. `L(f(t)) = L(e^{-t}) = frac 1 {s + 1}`, and `L(g(t)) = L(sin(t)) = frac 1 {s^2 + 1}`, so our final result is just `frac 1 {(s + 1)(s^2 + 1)}`.
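
The convolution theorem itself can be checked numerically too, if you're suspicious. A plain-Python sketch (my own helper names): build the inner integral `int_0^t e^{-(t - tau)} sin(tau) d tau` with a running trapezoid sum, accumulate the outer Laplace integral the same way, and compare to `frac 1 {(s+1)(s^2+1)}` at a sample `s`.

```python
import math

def laplace_of_convolution(s, T=12.0, n=12000):
    # h(t) = int_0^t e^{-(t-tau)} sin(tau) dtau = e^{-t} * int_0^t e^tau sin(tau) dtau;
    # sweep t once, accumulating the inner integral and the outer
    # Laplace integral int_0^T e^{-s t} h(t) dt with trapezoid sums.
    dt = T / n
    inner = 0.0        # running value of int_0^t e^tau sin(tau) dtau
    F = 0.0            # running value of the outer Laplace integral
    prev_outer = 0.0   # outer integrand at the previous grid point (h(0) = 0)
    for i in range(1, n + 1):
        t_prev, t = (i - 1)*dt, i*dt
        inner += dt * (math.exp(t_prev)*math.sin(t_prev) + math.exp(t)*math.sin(t)) / 2
        outer = math.exp(-s*t) * (math.exp(-t) * inner)
        F += dt * (prev_outer + outer) / 2
        prev_outer = outer
    return F

s = 2.0
expected = 1 / ((s + 1) * (s**2 + 1))   # should be 1/15 at s = 2
assert abs(laplace_of_convolution(s) - expected) < 1e-4
```

The truncation at `T = 12` is fine here because the `e^{-st}` factor crushes the tail.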

WS5.8.3: That, But In Reverse

Now we're given `frac s {(s + 1)(s^2 + 4)}`, and asked to find the inverse Laplace transform. This is not very difficult. If we represent it as the product `frac 1 {s + 1} frac s {s^2 + 4}`, we can separate to use convolution: `L(f(t)) = frac 1 {s + 1}`, `L(g(t)) = frac s {s^2 + 4}`, meaning `f(t) = e^{-t}` and `g(t) = cos(2t)` (these are basic rules from the Laplace sheet). Substituting into the convolution integral gives us `int_0^t e^{-(t - tau)} cos(2 tau) d tau`.

Sadly, we have to simplify this. I really, really, really don't want to do the double-IBP necessary to reduce this; fortunately, the problem is well within Wolfram's capabilities! (Partial fractions on `frac s {(s + 1)(s^2 + 4)}` also works, and skips the integral entirely.) We get the fairly nice `frac {2sin(2t) + cos(2t) - e^{-t}} {5}`.
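
No need to trust Wolfram blindly, though. A quick quadrature check (plain Python, trapezoid rule; the names are mine) compares the convolution integral against the simplified form `frac {2 sin(2t) + cos(2t) - e^{-t}} 5` at a few sample times:

```python
import math

def conv_quadrature(t, n=2000):
    # trapezoid-rule evaluation of int_0^t e^{-(t - tau)} cos(2 tau) dtau
    h = t / n
    total = 0.0
    for i in range(n + 1):
        tau = i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * math.exp(-(t - tau)) * math.cos(2 * tau)
    return total * h

def closed_form(t):
    # the simplified result: (2 sin 2t + cos 2t - e^{-t}) / 5
    return (2*math.sin(2*t) + math.cos(2*t) - math.exp(-t)) / 5

for t in (0.5, 1.0, 2.0, 4.0):
    assert abs(conv_quadrature(t) - closed_form(t)) < 1e-4
```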

WS7.1.1.a: Findin' Some Critical Points

We're given the system `x'(t) = 1 + 5y, y'(t) = 1 - 6x^2`, and asked to find its critical points - the points where both `x'(t)` and `y'(t)` are equal to 0. The trick is to find the nullclines: these are lines along which each derivative is individually 0. The intersection points are the critical points. For `x'`, there's only one nullcline: `y = - frac 1 5`; for `y'` there are two: `x = +- sqrt(frac 1 6)`. These are vertical and horizontal, so finding the intersection points is easy, and gives us the critical points `(- sqrt (frac 1 6), - frac 1 5), (sqrt(frac 1 6), - frac 1 5)`. Easy.

WS7.1.1.b: Even More Critical Points

This time it's a bit different. We're given the system `x' = 2x - x^2 - xy`, `y' = 3y - 2y^2 - 3xy`, and asked to find critical points. How can we do this? We start by zero-substituting as normal, getting the system `2x - x^2 - xy = 0`, `3y - 2y^2 - 3xy = 0`. We need to be able to find zeroes easily, so let's use an age-old trick: factoring! We end up with `x(2 - x - y) = 0`, `y(3 - 2y - 3x) = 0`. This gives us a pretty obvious `(0, 0)` solution at the origin. What about solutions not at the origin? `2 - x - y = 0, 3 - 2y - 3x = 0` is conveniently linear: we can solve this with Gaussian elimination!

The augmented matrix looks something like this: `[[1, 1 | 2], [3, 2 | 3]]`. I won't go through the process; our end result is `(x, y) = (-1, 3)`. Great! That's our only solution not lying on one of the zero-axes. We could have more solutions on those axes, though! For instance, when `x = 0`, we fully satisfy the first equation, and the second one looks like `0 = 3y - 2y^2` - we've already ruled out the `y=0` case here, so we only want the `2y = 3` part; `y = frac 3 2`. That yields the critical point `(0, frac 3 2)`. Now let's check for `y=0`: this fully satisfies the second equation, and gives us `0 = 2x - x^2`. `x = 2`. Hence, we have four critical points, `(0, 0)`, `(-1, 3)`, `(0, frac 3 2)`, and `(2, 0)`. Nice! It's very important that you consider all the possibilities for zeroes; missing one could be disastrous.
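
Plugging the candidates back in is cheap insurance. A plain-Python sketch that replays the elimination step and then checks all four points against the original system:

```python
def xprime(x, y): return 2*x - x*x - x*y
def yprime(x, y): return 3*y - 2*y*y - 3*x*y

# the linear case from the factoring: 2 - x - y = 0, 3 - 2y - 3x = 0.
# augmented matrix [[1, 1 | 2], [3, 2 | 3]]; R2 <- R2 - 3*R1 gives -y = -3.
y = 3.0
x = 2.0 - y   # back-substitute into x + y = 2
assert (x, y) == (-1.0, 3.0)

# plug all four critical-point candidates back into the original system
for (cx, cy) in [(0, 0), (-1, 3), (0, 1.5), (2, 0)]:
    assert xprime(cx, cy) == 0 and yprime(cx, cy) == 0
```

Every candidate survives, so no case was botched.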

WS7.2.1

This one isn't as hard as it is tedious. We're given a system `x' = x - y^2, y' = x - 2y + x^2`, and asked to find its critical points, linearize near each one, and classify them.

First off, critical points! We have two equations: `x - y^2 = 0` and `x - 2y + x^2 = 0`. The origin `(0, 0)` is obviously a critical point here. Along both the `x=0` and the `y=0` line, there are no critical points but the origin; this can be shown by substituting into the first equation. Let's find some more! `x' = 0` along the horizontal parabola `x = y^2`, and `y' = 0` along the shifted parabola `y = frac { x + x^2 } 2`. Intersecting these gives us `y = y^2 frac {1 + y^2} 2`; dividing through by `y` (the `y = 0` case is already handled), `2 = y + y^3`, i.e. `y^3 + y - 2 = 0`. There is only one real solution to this: `y = 1`. We resubstitute that into `x = y^2` to get the point `(1, 1)`. These are the only two critical points!
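
Why is `y = 1` the only real root of `y^3 + y - 2`? The cubic factors as `(y - 1)(y^2 + y + 2)`, and the leftover quadratic has negative discriminant, so it contributes no real roots. A quick confirmation (checking the factoring identity at sample points, plain Python):

```python
# (y - 1)(y^2 + y + 2) should expand to exactly y^3 + y - 2;
# checking at more points than the degree pins down the identity
for y in (-2.0, -0.5, 0.0, 1.0, 3.0, 7.0):
    assert (y - 1) * (y*y + y + 2) == y**3 + y - 2

# y^2 + y + 2 has discriminant b^2 - 4ac = 1 - 8 < 0: no real roots,
# so y = 1 is the only real solution of y^3 + y - 2 = 0
assert 1**2 - 4*1*2 < 0
```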

Now that we have `(0, 0)` and `(1, 1)`, we need to linearize. For a nonlinear system in the form `x' = F(x, y), y' = G(x, y)`, the linearization near a critical point `(x_0, y_0)` is `[x', y'] = [[F_{x}(x_0, y_0), F_{y}(x_0, y_0)], [G_{x}(x_0, y_0), G_{y}(x_0, y_0)]] cdot [x - x_0, y - y_0]`. In this case, `F(x, y) = x - y^2`, `G(x, y) = x - 2y + x^2`. The partial derivative matrix is thus `[[1, -2y], [1 + 2x, -2]]`, and so our linearization at the origin is `[x', y'] = [[1, 0], [1, -2]] [x, y]`. Note that this is the original system, minus the nonlinear component!

Our linearization at `(1, 1)` is `[x', y'] = [[1, -2], [3, -2]] [x - 1, y - 1]`.

Now we need to classify the critical points. The simplest way to do this is with the eigenvalues of the linearization matrix:

The eigenvalues of the system at `(0, 0)` are easily found by the characteristic polynomial method to be `1` and `-2` (the matrix is lower triangular, so they're just the diagonal entries). Because these are real with opposite signs, we're looking at a saddle point.

The eigenvalues of the system at `(1, 1)` are `frac {-1 +- sqrt(15)i} {2}`, so we're in the complex case; the real part is `< 0`, so we're looking at an asymptotically stable spiral.
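
This classification step is mechanical enough to script. A small plain-Python sketch (the helper names are mine) that finds 2x2 eigenvalues via the trace/determinant quadratic and sorts out the cases that show up on this quiz:

```python
import cmath

def eigenvalues_2x2(a, b, c, d):
    # roots of lambda^2 - (a + d) lambda + (ad - bc) = 0 via the quadratic formula
    tr, det = a + d, a*d - b*c
    disc = cmath.sqrt(tr*tr - 4*det)
    return (tr + disc) / 2, (tr - disc) / 2

def classify(a, b, c, d):
    # only covers the generic cases (distinct nonzero real parts, etc.)
    l1, l2 = eigenvalues_2x2(a, b, c, d)
    if abs(l1.imag) > 1e-12:                    # complex conjugate pair
        if l1.real < 0: return "asymptotically stable spiral"
        if l1.real > 0: return "unstable spiral"
        return "center"
    r1, r2 = l1.real, l2.real
    if r1 * r2 < 0: return "saddle point"
    return "stable node" if max(r1, r2) < 0 else "unstable node"

# linearization at (0, 0): [[1, 0], [1, -2]]
assert classify(1, 0, 1, -2) == "saddle point"
# linearization at (1, 1): [[1, -2], [3, -2]], eigenvalues (-1 +- i sqrt(15)) / 2
assert classify(1, -2, 3, -2) == "asymptotically stable spiral"
```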

Brief Aside: Validating WS7.2.1

In the previous problem, we simply assumed that the system was almost linear around the given points. This obviously worked out, but it's not necessarily a safe assumption! To guarantee that it behaves linearly near each critical point, we need the nonlinear remainder `g(x, y)` (what's left after subtracting off the linear part) to satisfy `lim_{ (x, y) -> (x_0, y_0) } frac {g(x, y)} {sqrt((x - x_0)^2 + (y - y_0)^2)} = 0`. Calculating this directly is hard; fortunately, for polynomial systems like these (where the remainder is all higher-order terms), there's a really easy way to check: look at the Jacobian. If the determinant of the Jacobian matrix at the critical point is nonzero, then we're almost linear!

SA59: Another Laplace Solution

This one isn't actually very difficult, but it's a nice practice piece for Laplace transforms. We're given `y'' - 3y' + 2y = u_{10} (t), y(0) = 0, y'(0) = 0`, and asked to solve for `y` in terms of unit step functions.

Step one is, of course, to take the Laplace transform of both sides. At this point, I'm assuming anyone reading this already knows how to take Laplace transforms, and won't be surprised that we get `(s^2 - 3s + 2)L(y) = frac { e^{-10s} } s`. Rearranging gives us `L(y) = frac { e^{-10s} } { s(s^2 - 3s + 2) }`. The problem statement kindly tells us that `frac 1 {s(s - 1)(s - 2)} = frac 1 {2s} - frac 1 {s - 1} + frac 1 {2(s - 2)}`, so our inverse Laplace works out to `y = u_{10}(t) (frac 1 2 - e^{t - 10} + frac 1 2 e^{2t - 20})`. Since this is already in terms of the unit step function, we don't need to do any more work!
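
Partial fraction decompositions are easy to fumble, so it's worth a ten-second numeric check of the given one at a few sample values of `s` (plain Python):

```python
# check 1/(s(s-1)(s-2)) == 1/(2s) - 1/(s-1) + 1/(2(s-2))
# at sample s values away from the poles s = 0, 1, 2
for s in (0.5, 3.0, -1.0, 10.0):
    lhs = 1 / (s * (s - 1) * (s - 2))
    rhs = 1/(2*s) - 1/(s - 1) + 1/(2*(s - 2))
    assert abs(lhs - rhs) < 1e-12 * abs(lhs)
```

(Cover-up gives the same coefficients instantly: e.g. the residue at `s = 0` is `1/((0-1)(0-2)) = 1/2`.)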

SA63: Some Critical Points

Easy, quick, good practice. Given the system `x' = x(3 - y), y' = y(x - 4)`, find the critical points. We want all the situations where `x(3 - y) = 0` and `y(x - 4) = 0`. One such situation is clearly `(0, 0)`. For `y = 3`, we have `x = 4`. Hence: `(0, 0), (4, 3)`.

SA64: Linearization

We're given a system `x' = x(3 - y), y' = y(x - 4)`, and asked to find the linear system near the nonorigin critical point. If this looks familiar, it's because it is; we already found the critical points in the immediately preceding question. Our critical point is `(4, 3)`. Remember the Jacobian thing? Our Jacobian is `[[3 - y, -x], [y, x - 4]]`; evaluated at the critical point, it is `[[0, -4], [3, 0]]`. The determinant is nonzero, so the system is almost linear there and the linearization is valid. Plugging in yields `[x', y'] = [[0, -4], [3, 0]] cdot [x - 4, y - 3]`.
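
The Jacobian here is simple enough to do by hand, but a finite-difference check is a nice safety net (central differences, plain Python; the function names are mine):

```python
def F(x, y): return x * (3 - y)
def G(x, y): return y * (x - 4)

def jacobian(x, y, h=1e-6):
    # central finite differences for the 2x2 Jacobian of (F, G)
    return [
        [(F(x + h, y) - F(x - h, y)) / (2*h), (F(x, y + h) - F(x, y - h)) / (2*h)],
        [(G(x + h, y) - G(x - h, y)) / (2*h), (G(x, y + h) - G(x, y - h)) / (2*h)],
    ]

J = jacobian(4, 3)
expected = [[0, -4], [3, 0]]   # [[3 - y, -x], [y, x - 4]] evaluated at (4, 3)
for i in range(2):
    for j in range(2):
        assert abs(J[i][j] - expected[i][j]) < 1e-6

# nonzero determinant, so the almost-linear check passes too
assert J[0][0]*J[1][1] - J[0][1]*J[1][0] != 0
```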

Final Notes

Make sure to memorize the classification behaviors above! There will probably be classification questions. Good luck on the quiz!