You have reached the end of this book, but you are only at the beginning of the journey of scientific discovery. There are a lot of cool things left for you to learn about. Below are some recommendations of subjects you might be interested in.
Electrostatics is the study of the electric force $\vec{F}_e$ and the associated electric potential energy $U_e$. You will also learn about the electric field $\vec{E}$ and the electric potential $V$.
Magnetism is the study of the magnetic force $\vec{F}_b$ and the magnetic field $\vec{B}$, which is caused by electric currents flowing in wires. The current $I$ measures the amount of electric charge passing through a cross-section of the wire per second. By virtue of its motion through space, each electron contributes to the strength of the magnetic field surrounding the wire.
The beauty of electromagnetism is that the entire theory can be described in just four equations:
\[
\begin{aligned}
\nabla \cdot \vec{E} &= \frac{\rho}{\varepsilon_0} && \textrm{Gauss's law} \\
\nabla \cdot \vec{B} &= 0 && \textrm{Gauss's law for magnetism} \\
\nabla \times \vec{E} &= -\frac{\partial \vec{B}}{\partial t} && \textrm{Faraday's law of induction} \\
\nabla \times \vec{B} &= \mu_0\vec{J} + \mu_0 \varepsilon_0 \frac{\partial \vec{E}}{\partial t} && \textrm{Ampère's circuital law}
\end{aligned}
\]
Together, these are known as Maxwell's equations.
You may be wondering about that triangle thing, $\nabla$. The symbol $\nabla$ (nabla) is the vector derivative operator. Guess what: you can also do calculus with vectors.
In vector calculus you will learn about path integrals, surface integrals and volume integrals of vector quantities. You will also learn about vector derivatives and two vector equivalents of the Fundamental Theorem of Calculus:

* Stokes' Theorem:
\[ \iint_{\Sigma} (\nabla \times \vec{F}) \cdot d\vec{\Sigma} = \int_{\partial\Sigma} \vec{F} \cdot d\vec{r}, \]
which relates the integral of the curl $\textrm{curl}\,\vec{F} \equiv \nabla \times \vec{F}$ of the field $\vec{F}$ over the surface $\Sigma$ to the circulation of $\vec{F}$ along the boundary of the surface $\partial\Sigma$.
* Gauss' Divergence Theorem:
\[ \iiint_{\mathrm{V}} \nabla \cdot \vec{F}\; d\mathrm{V} = \iint_{\partial \mathrm{V}} \vec{F} \cdot d\vec{\Sigma}, \]
which relates the integral of the divergence $\textrm{div}\,\vec{F} \equiv \nabla \cdot \vec{F}$ of the field $\vec{F}$ over the volume $\mathrm{V}$ to the flux of $\vec{F}$ through the volume boundary $\partial\mathrm{V}$.
Both of the above theorems relate the total of some derivative quantity over some region $R$ to the quantity on the boundary of the region $R$, which we denote as $\partial R$. The Fundamental Theorem of Calculus can also be interpreted in the same manner: \[ \int_I F^\prime(x) \; dx = \int_a^b F^\prime(x) \; dx = F_{\partial I} = F(b) - F(a), \] where $I=[a,b]$ is the interval from $a$ to $b$ on the real line and the two points $a$ and $b$ form its boundary $\partial I$.
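As a small sanity check of this pattern, we can verify the divergence theorem for the field $\vec{F}=(x,y,z)$ and the unit cube $\mathrm{V}=[0,1]^3$ (both chosen purely for illustration):
\[
\nabla \cdot \vec{F}
= \frac{\partial x}{\partial x} + \frac{\partial y}{\partial y} + \frac{\partial z}{\partial z}
= 3
\qquad\Rightarrow\qquad
\iiint_{\mathrm{V}} \nabla \cdot \vec{F}\; d\mathrm{V} = 3 \times 1 = 3.
\]
On the face $x=1$ the outward flux is $\iint \vec{F}\cdot\hat{x}\; dA = 1$, on the face $x=0$ it is $0$, and the $y$ and $z$ pairs of faces behave the same way, so the total flux through the boundary $\partial\mathrm{V}$ is $1+0+1+0+1+0=3$, in agreement with the volume integral.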
Only physicists and engineers have to take this course.
Of wider interest is the course which studies calculus with functions which have more than one input variable. Consider as an example the function $f(x,y)$ which has two input variables $x$ and $y$. You can plot this function as a surface, where the height $z$ of the function above the point $(x,y)$ is given by the function value $z=f(x,y)$.
There is no new math to learn in multivariable calculus: it is the same stuff as Calculus I (derivatives) and Calculus II (integrals), but with more variables. For a function $f(x,y)$ there will be an “$x$-derivative” $\frac{\partial}{\partial x}$ and a “$y$-derivative” $\frac{\partial}{\partial y}$. The operator $\nabla$ is a combination of both the $x$ and $y$ derivatives: $\nabla f(x,y) = [ \frac{\partial f}{\partial x}, \frac{\partial f}{\partial y}]$. Note that $\nabla$ acts on a function $f(x,y)$ to produce a vector. This is known as the gradient vector, which tells you the “slope” of the function. More specifically, it tells you the direction of maximum increase of the function. If you think of $z=f(x,y)$ as the height of a mountain at the $(x,y)$ coordinates on a map, then the gradient vector $\nabla f(x,y)$ always points uphill.
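To make this concrete, here is a small worked example (the function is chosen purely for illustration):
\[
f(x,y) = x^2 + y^2
\qquad\Rightarrow\qquad
\nabla f(x,y) = \left[ \frac{\partial f}{\partial x},\ \frac{\partial f}{\partial y} \right] = [\,2x,\ 2y\,].
\]
At the point $(1,2)$ the gradient is $\nabla f(1,2) = [2,4]$, a vector pointing directly away from the origin, which is indeed the direction in which the surface $z = x^2 + y^2$ rises most steeply.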
If you understood derivatives and integrals well, then you should definitely take this course (usually called Calculus III) as it is perhaps the easiest science course you will ever take.
Probability distributions are a fundamental tool for modelling non-deterministic behaviour. A discrete random variable $X$ is associated with a probability mass function $p_X(x) \equiv \textrm{Pr}\{ X = x \}$, which assigns a “probability mass” to each of the possible outcomes $x \in \mathcal{X}$. For example, if $X$ represents a fair die, then the possible outcomes are $\mathcal{X}=\{ 1, 2, 3, 4, 5, 6 \}$ and the probability mass function has the values $p_X(x)=\frac{1}{6}$, $\forall x \in \mathcal{X}$.
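Continuing the fair-die example, here is what a typical calculation with a probability mass function looks like (the expected value $\mathbb{E}[X]$ is the long-run average of $X$):
\[
\textrm{Pr}\{ X \textrm{ is even} \}
= p_X(2) + p_X(4) + p_X(6)
= \tfrac{3}{6} = \tfrac{1}{2},
\qquad
\mathbb{E}[X] = \sum_{x \in \mathcal{X}} x\, p_X(x) = \tfrac{1+2+3+4+5+6}{6} = 3.5.
\]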
Probability theory is used all over the place: statistics, machine learning, quantum mechanics, gambling, risk analysis, etc.
Mathematics is a very broad field. There are all kinds of topics to learn about: some of them fun, some of them useful, some of them boring, and some which have been known historically to drive people insane. Like, literally.
I recently found a very interesting book which covers many topics of general interest and serves as a great overview of the various areas of mathematics. I highly recommend that you take a look at this book if you are interested in math. It is written for a general audience, so it is very accessible.
[BOOK] Richard Elwes. Mathematics 1001: Absolutely Everything That Matters About Mathematics in 1001 Bite-Sized Explanations, Firefly Books, 2010, ISBN 1554077192.
If you want to learn more about physics, I highly recommend the Feynman lectures on physics. This three-tome collection covers all of undergraduate physics, with countless links to more advanced topics:
[BOOK] Richard P. Feynman, Robert B. Leighton, Matthew Sands. The Feynman Lectures on Physics including Feynman's Tips on Physics: The Definitive and Extended Edition, Addison Wesley, 2005, ISBN 0805390456.
While on the Feynman note, I also want to recommend his other book, a collection of stories from his life.
[BOOK] Richard P. Feynman. Surely You're Joking, Mr. Feynman! (Adventures of a Curious Character), W. W. Norton & Company, 1997, ISBN 0393316041.
In this book we learned about Newtonian mechanics, that is, mechanics starting from Newton's laws. There is a much more general framework known as Lagrangian mechanics which can be used to analyze more complex mechanical systems. The following is an excellent book on the subject.
[BOOK] Herbert Goldstein, Charles P. Poole Jr., John L. Safko. Classical Mechanics, Addison-Wesley, Third edition, 2001, ISBN 0201657023.
Quantum mechanics describes the physics of all things small: photons, electrons and atoms. An absolutely approachable and readable introduction to the subject is Richard Feynman's QED book.
[BOOK] Richard P. Feynman. QED: The Strange Theory of Light and Matter, Princeton University Press, 2006, ISBN 0691125759.
For a deeper understanding of quantum mechanics, I recommend the book by Sakurai. If you understand linear algebra, then you can understand quantum mechanics.
[BOOK] Jun John Sakurai. Modern Quantum Mechanics, Second Edition, Addison-Wesley, 2010, ISBN 0805382917.
If you want to read Sakurai, it would be a good idea to first learn about Lagrangian mechanics from Goldstein. Goldstein followed by Sakurai is an excellent combo.
Claude Shannon developed a mathematical framework for studying the problems of information storage and information transmission. Using statistical notions such as entropy, we can quantify the information content of data sources and the information transmitting abilities of noisy communication channels.
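As a small taste of how entropy quantifies information, here is the definition together with its value for the fair die from the probability example above:
\[
H(X) = -\sum_{x \in \mathcal{X}} p_X(x) \log_2 p_X(x),
\qquad
H(\textrm{fair die}) = -\sum_{x=1}^{6} \tfrac{1}{6} \log_2 \tfrac{1}{6} = \log_2 6 \approx 2.58 \textrm{ bits},
\]
so each roll of a fair die produces about $2.58$ bits of information.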
We can arrive at an operational interpretation of the information-carrying capacity of a noisy communication channel in terms of our ability to convert it into a noiseless channel. Channels with more noise have a smaller capacity for carrying information. Consider a channel which allows us to send data at a rate of 1 MB/sec, but on which half of the packets sent get lost due to the effects of noise. It is not true that the capacity of such a channel is 1 MB/sec, because we also have to account for the need to retransmit lost packets. To correctly characterize the information-carrying capacity of a channel, we must consider the rate of the end-to-end code which converts many uses of the noisy channel into an effectively noiseless communication channel.
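For instance, if we model each lost packet as an erasure that the receiver can detect (an assumption made here only to keep the arithmetic simple), the useful rate of the channel in the example above is
\[
C = (1 - p_{\textrm{loss}}) \times (\textrm{raw rate}) = \left(1 - \tfrac{1}{2}\right) \times 1\ \textrm{MB/sec} = 0.5\ \textrm{MB/sec},
\]
which is the best rate an end-to-end code can achieve, not the raw 1 MB/sec.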
Channel coding is one of the fundamental problems studied in information theory. The book by Cover and Thomas is an excellent textbook on the subject, which I highly recommend.
[BOOK] Thomas M. Cover, Joy A. Thomas. Elements of Information Theory, Wiley, 2006, ISBN 0471241954.
With this book, I tried to equip you with as many tools as I could, so that the remainder of your science studies will be enjoyable and pain-free. Remember to always take it easy. Play with math and never take things too seriously. Grades don't matter. Big paycheques don't matter. Never settle for a boring job just because it is well paid. Try to work only on projects you care about.
I want you to be confident in your ability to handle math, physics and other complicated stuff that life will throw at you. You have the tools to do anything you want. Choose your own adventure. And if the banks come-a-knocking one day, offering you a big paycheque for the application of your analytical skills to their avaricious schemes, send them-a-walking.