
## Wednesday, February 22, 2017

### Lost in Translation

Here's what I got so far...

"Change happens."Calculus I:

"How to Speak Computer: A Beginner's Guide for Humans."Discrete Math:

"How To Solve a lot of Problems That Look Like Each Other, All at Once, in No Time."Linear Algebra:

"Create order. Prevent chaos."Intro to Databases:

### Backwards, Forwards, and All At Once

I took an online calculus MOOC some years ago. The toughest thing about it for me wasn't necessarily the calculus. I got the concept of limits fine enough, until it required factoring polynomials in order to simplify the limit. So I learned there were some algebra basics I had forgotten after years of not using them, but I managed to study on the side just enough to stick with it.
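To give a sense of the kind of thing that tripped me up (a standard textbook example, not one from that course specifically): direct substitution gives 0/0, and you have to factor the polynomial before the limit simplifies.

```latex
\lim_{x \to 2} \frac{x^2 - 4}{x - 2}
  = \lim_{x \to 2} \frac{(x - 2)(x + 2)}{x - 2}
  = \lim_{x \to 2} (x + 2)
  = 4
```

Without the factoring step, there's nowhere to go; that's exactly where the forgotten algebra mattered.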

The class also introduced trigonometry in both limits and derivatives. Trig drew a complete blank for me. Either I had never studied it, didn't pay attention in high school, or had simply long forgotten it. At that point I decided to quit, because I had enrolled in the course with the intention of getting the most out of it. Even if I somehow managed to get through it, it was clear to me that it wouldn't stick. When I enrolled back in school years down the road and started off in Algebra, I knew I could have started higher, but I was there because I really wanted to get a firm grasp of all the important basics that would matter in any future classes I took or self-study I pursued. And I'm glad I did it.

This attempt to learn the basics of math and science seems to be very much about studying some topic, realizing you have a basic knowledge gap, and going back to shore up your fundamentals before or while moving forward. This theme only repeats itself, even as I (slowly) move up the ladder.

After that MOOC class I read through a couple (or a few) introductory calculus books on my own over time. I'm not sure how much I absorbed, but I tested it in part by trying to look through books I was interested in reading later. Somewhere along the way, I decided that two of the books I really wanted not only to get through, but eventually to read with ease and understand intuitively, were *Mathematics in Nature* and *A Mathematical Nature Walk*, both by John A. Adam. It sounds pseudo hippy-dippy corny as can be, but what I wanted to do was read the books, get a really good grasp of what he was saying, then spend some time in the woods, by the beach, or wherever I see more greenery than concrete, and see all the things he wrote about (this is still an eventual goal of mine). Then I'd follow up with Adam's *X and the City: Modeling Aspects of Urban Life* and take a stroll through a nearby city to see if all the same things somehow looked different.

When I tried to read *Mathematics in Nature* (that must have been some two or three years ago), it was clear I wasn't yet up to the task. But that book, along with a handful of others, did reaffirm the notion that if I was going to build a foundation for appreciating the world around me through a basic understanding of the laws of nature that govern it, it would be important to really get comfortable with calculus and trigonometry.

Just last week, I attempted to get through *In Praise of Simple Physics*. Turns out, early on, the author mentions the book is really for readers who have already gotten through a class in freshman calculus. I tried anyway, and only made it through something like one-half to two-thirds of the book. But conceptually, I understood what I knew, and got a better sense of the things I lacked. For one, my current Calculus I professor had thankfully introduced implicit differentiation early in the class. So I quickly came to see that implicit differentiation was one of the things that had stopped me from understanding calculus in other books I tried to read before. I always used to wonder where that extra term came from; it felt like it just appeared out of nowhere for no apparent reason. Now the process was somewhat intuitive.
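For anyone wondering what that "extra term" is: take the standard textbook example of a circle (not an example from the book itself). Differentiating both sides with respect to x, the chain rule produces a dy/dx factor on every term containing y, because y is itself a function of x:

```latex
x^2 + y^2 = 25
\quad\Rightarrow\quad
2x + 2y\,\frac{dy}{dx} = 0
\quad\Rightarrow\quad
\frac{dy}{dx} = -\frac{x}{y}
```

The mysterious term is just the chain rule at work; once you see that, it stops appearing out of nowhere.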

This week, I took on *The Theoretical Minimum*, which I had also tried to read years before. It is already remarkably clearer than my first time around; I could at least understand what was going on. I also learned something new here. Whenever I used to see the symbols for partial differentiation, I never knew what the heck I was looking at. Now the author's explanation seemed straightforward and intuitive. The book even starts off touching on concepts related to discrete math, which I'm also currently taking.
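The idea behind those symbols, in a generic example of my own choosing: a partial derivative differentiates with respect to one variable while treating the others as constants.

```latex
f(x, y) = x^2 y + y^3
\quad\Rightarrow\quad
\frac{\partial f}{\partial x} = 2xy,
\qquad
\frac{\partial f}{\partial y} = x^2 + 3y^2
```

Holding y fixed, x²y is just a constant times x²; holding x fixed, it's a constant times y. Nothing deeper than ordinary differentiation with blinders on.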

In chapter 6, roughly midway through *The Theoretical Minimum*, the author introduces the Principle of Least Action. Here, he uses the Euler-Lagrange equations. Somehow, I can mechanically follow most of the mathematics and sort of get what's going on conceptually, without really being able to put it all together. These are equations I could not yet work out myself. I stopped after the next chapter, when I realized future portions of the book build on concepts shared in chapter 6.
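For reference, the standard form of the Euler-Lagrange equation, applied to the textbook case of a single particle (my own illustrative example, not the book's exact presentation): with the Lagrangian L = kinetic minus potential energy, the equation reproduces Newton's second law.

```latex
\frac{d}{dt}\left(\frac{\partial L}{\partial \dot{q}}\right)
  - \frac{\partial L}{\partial q} = 0
\qquad
L = \tfrac{1}{2} m \dot{x}^2 - V(x)
\;\;\Rightarrow\;\;
m\ddot{x} = -\frac{dV}{dx}
```

The left side needs both partial differentiation and the chain rule, which is probably why the chapter felt like a test of everything that came before it.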

I learned something from reading both of these books. One, there's been some progression for me. Two, integrals are important (I have a reason to look forward to learning about them in detail). Also, while math may be the bedrock of physics, and physics may be considered the starting point for understanding the theory behind physical science in general, at least for me, understanding the mathematics doesn't necessarily mean understanding the basics of physics. Merging the two seems more like drawing a picture in your head, filtering through all the mathematical gadgets and gizmos in your toolkit to see what might fit, creating new ones if you need to, and rearranging it all in a way that makes sense and is truly (or at least functionally) representative of the phenomenon.

This makes me think that deriving equations for physical phenomena isn't always as intuitive as I'd assumed. Part of doing the hard work of exhaustive and detailed reasoning is making sure you aren't overcome by faulty assumptions. I get the impression that intuition may get one started in getting from A to B, and help smooth out rough patches in between, but the end result hardly ever exactly matches what first came to mind. More likely, the creators learn along the way of trying to figure it all out. Or at least it's safer to assume so.

I don't know that I'll take any physics classes in school (time and money are of concern). But I do know that I plan to use what I learn in these math classes to study more about the physical world, starting with concepts in physics before I move on to other things. From reading around, I found that calculus and linear algebra both certainly play a role in physics. So I went looking for books that cover both, hopefully in the context of physics, and at a lower level than the two previous books I read over the last couple of weeks.

Based on the synopsis and table of contents, *No Bullshit Guide To Math and Physics* seemed to fit the criteria I was looking for. It shores up pre-calculus fundamentals before going on to basic physics, vectors, differential and integral calculus, and mechanics. It seems conceptually thorough while still catering to the beginner. Hopefully it will prove to be a good complement to my current classes, and a decent starting point for similar books I plan to read afterward.

Onward, ho.
