Motivation

If you've never heard why someone would study the "Theory of Computation", or how it differs from something perhaps more tangible like programming or software engineering, you might come to the conclusion that this is "all just theory"...*waves hand around in the air in a useless circular manner*...but I'd beg you not to jump to this thought too soon! It's not that the latter is practical while the former is just theory, but rather that what we call "programming" or "software engineering" is usually very specific to a certain set of technologies of a specific era, or even a specific hardware implementation, whereas the "theory" contains concepts that apply agnostic of the hardware you are dealing with, agnostic of the time period you are living in, agnostic of the specific programming language you are using, and, in my case, even agnostic of the field to which you are applying this knowledge!
Computing is a very general concept that can be applied across many fields and domains of expertise; it is much more a "way of thinking", a collection of unique logical tools and methods for solving problems in whatever domain you work in, whether software, linguistics, mathematics, electrical engineering, philosophy, botany, physics, or, in my case, the foundations of biology! It is kind of unfortunate given the name, but "computing" is not about computers per se, but rather about the nature of problem solving: the limits on how efficiently a problem can be solved, and even whether our problem has any way of being solved to begin with (which, you'll come to learn, is very different from merely checking whether an answer is correct)! The nature of time, memory, space, efficiency, and representation are all topics explored in this area.
Background

Having just recently finished a Formal Grammars & Automata Theory course at the local university, where we used this as our textbook, I have to say that this book was probably the main reason I was able to keep up and not fall behind in the course. It doesn't read like the dry definition-lemma-proof books you might have encountered in some fields of math, but rather like a hybrid: the formal material is interleaved with lots of helpful commentary walking through the thought processes behind solving these problems and working through these proofs in plain old regular English, with useful diagrams where needed. One thing I especially appreciated was the author stating a theorem or making a claim, following it up with a helpful discussion of the "proof idea" behind that claim, then giving the proof in a combination of English + symbols, and finally, if needed, a very formal "algorithmic" version with the steps clearly enumerated line-by-line in concise fashion.
An example of that can be seen here: [image]
This is a really good pedagogical technique that I wish more authors would follow, as students are given a choice as to "what level" of explanation and rigor best appeals to them. Here's another example, this time with an accompanying picture: [image]
For the self-study people out there thinking about prerequisites, it would probably be wise to have some prior practice and experience with "proofs"; regardless, the author does a great job reviewing common mathematical notation and basic proof techniques at the beginning of the book, material that would usually correspond to a common undergraduate first course in discrete mathematics. Things like relations, functions, graphs, strings, languages, boolean logic, and proof by construction / contradiction / induction are reviewed up front and accompanied by ample practice problems (with solutions!) so that you can strengthen your foundation if needed before moving on to the meat of the book. As a disclaimer: although I'm just a data scientist with a former background in neuroscience, I have invested a non-trivial amount of both informal and official coursework time in logic, set theory, and generally "pure math" areas, and I was also concurrently going through another official and very complementary logic course on Undecidability & Incompleteness, so take my review with that in mind.
Other Helpful Material

When studying a subject I always recommend learning from multiple sources, because you can never guarantee that you won't "get stuck" on some concept or section of a book, and I've found that having multiple perspectives and different people explaining things to me can help. As such, I supplemented my reading with another textbook called What Can Be Computed? (my review here), which takes a more *hands-on*, practical approach to the theory of computing, with lots of exercises in Python.
Two other books I was reading in parallel that I recommend are The Annotated Turing and Gödel's Theorem: An Incomplete Guide to Its Use and Abuse. The former I highly recommend for its excellent historical treatment of Alan Turing's original 36-page paper establishing what is now known as the Turing Machine, and the latter covers the mathematical & philosophical topic that arguably spurred the development of the former (for a short & neat explanation of this for the layman, see here). I usually like accompanying my studies with historical background like this, not only to help motivate me, but also to give me a bigger "forest-over-trees" picture and context as to why these ideas were formed to begin with, how they fit into the grander historical narrative, and why I should even care. It's also somewhat fun for my socially-inclined primate brain to know the people and controversies behind such ideas, which provides a useful scaffold and cognitive framework for remembering this material better and for longer (if you especially enjoy this angle, see the book Exact Thinking in Demented Times: The Vienna Circle and the Epic Quest for the Foundations of Science).
In addition to the aforementioned books, I would also recommend going through a free lecture series that covers the Theory of Computing and solves a lot of practice problems with you (the latter being key to becoming fluent with this material!). Hopefully all that linked material proves helpful on your learning journey :)
Content

This book is broken up into 3 main parts: Automata & Languages, Computability Theory, and Complexity Theory. Again, each section contains lots of worked-out examples with solutions, accompanied by helpful commentary.
The first part was my favorite and the main reason I was taking this course. In it the author covers the theory of regular languages, going over finite automata, non-determinism, regular expressions, and non-regular languages, while introducing a new and useful proof technique called the Pumping Lemma, which allows you to disprove the regularity of a language. For those coming from a linguistics background, you might find the intersection of this material and the Chomsky hierarchy incredibly fascinating. The author then follows this up with material on context-free languages, discussing context-free grammars, pushdown automata, non-context-free languages, and deterministic context-free languages. This material is highly relevant to those interested in the more practical problems of parsing natural languages, designing your own programming language, or understanding how a compiler works. Somewhat of an aside, but I really wish this book had at least introduced or covered a little bit of context-sensitive grammars, since that was my primary interest and motivation for taking this course to begin with (due to their relationship with cellular automata, Lindenmayer systems, artificial life, and theoretical biology).
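To make this concrete, here's a minimal sketch (my own illustration, not from the book) of a deterministic finite automaton in Python, in the spirit of the exercises in What Can Be Computed?: it decides the regular language of binary strings containing an even number of 0s.

```python
# A tiny DFA, encoded as a transition table over two states that track
# the parity of 0s read so far. 'even' is both the start and accept state.

def run_dfa(s: str) -> bool:
    transitions = {
        ('even', '0'): 'odd',  ('even', '1'): 'even',
        ('odd',  '0'): 'even', ('odd',  '1'): 'odd',
    }
    state = 'even'
    for ch in s:
        state = transitions[(state, ch)]  # one table lookup per symbol
    return state == 'even'

assert run_dfa('0110')       # two 0s -> accept
assert not run_dfa('0010')   # three 0s -> reject
```

The Pumping Lemma, in turn, explains why no finite table like this can decide a language such as {0^n 1^n}: finitely many states can't count arbitrarily high, which is exactly where the context-free machinery (pushdown automata, with their stack) picks up.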
In the second part of the book, the content shifts toward arguably more historical and philosophical material under the header of 'Computability Theory'. Here we are introduced to the famous Church-Turing Thesis and to Turing machines and a few of their variants (for an interesting article on "time-traveling Turing machines" and closed-timelike curves in computation, see the writing of quantum computing theorist Scott Aaronson). This section also covers the famous Halting/Decidability problem, as well as important proof techniques like diagonalization and reducibility. Some advanced topics are also covered, with a small introduction to information theory and minimal-length descriptions, the compressibility of strings, randomness, etc. Although we didn't officially cover it in my course, I found this latter section quite interesting (and especially relevant to biology!).
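As a flavor of how mechanical these machines really are, here's a hedged sketch (my own, with made-up state names rather than the book's notation) of a Turing machine simulator in Python deciding {0^n 1^n}, a language that is non-regular yet perfectly decidable: the machine repeatedly crosses off one 0 and one matching 1.

```python
# Transition table: (state, symbol) -> (next state, symbol to write, head move).
# '_' is the blank symbol; any missing entry means an implicit reject.
DELTA = {
    ('s',  '0'): ('r1', 'X', +1),   # cross off a leading 0
    ('s',  'Y'): ('chk', 'Y', +1),  # no 0s left: verify only Ys remain
    ('s',  '_'): ('acc', '_',  0),  # empty input: accept
    ('r1', '0'): ('r1', '0', +1),   # scan right for a matching 1
    ('r1', 'Y'): ('r1', 'Y', +1),
    ('r1', '1'): ('l1', 'Y', -1),   # cross it off, head back left
    ('l1', '0'): ('l1', '0', -1),
    ('l1', 'Y'): ('l1', 'Y', -1),
    ('l1', 'X'): ('s',  'X', +1),   # back at the frontier: repeat
    ('chk', 'Y'): ('chk', 'Y', +1),
    ('chk', '_'): ('acc', '_',  0), # every 0 matched a 1: accept
}

def tm_accepts(w: str) -> bool:
    tape = dict(enumerate(w))       # sparse tape of cell -> symbol
    state, head = 's', 0
    while state != 'acc':
        key = (state, tape.get(head, '_'))
        if key not in DELTA:        # no rule applies: reject
            return False
        state, write, move = DELTA[key]
        tape[head] = write
        head += move
    return True

assert tm_accepts('0011') and tm_accepts('')
assert not tm_accepts('0010') and not tm_accepts('10')
```

The Halting problem then says that no such simulator can be extended into a total procedure that correctly predicts, for every machine and input, whether the while-loop above will ever terminate; diagonalization is what makes that claim precise.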
The last section of the book switches gears to what arguably makes up the bulk of theoretical computer science research today: 'Complexity Theory'. Whereas the second part dealt with whether some problems can be solved at all, this third part deals with how efficiently a problem can be solved given a certain amount of time, space, memory, etc., ranking problems into complexity classes. Here we are first introduced to measuring complexity with Big-O notation, then to the famous P vs NP problem, NP-completeness, the Cook-Levin Theorem, SAT, and additional NP-complete problems like independent set, vertex cover, Hamiltonian path, and subset-sum (all of which are helpful in the reduction-style proofs used to establish NP-completeness). And although we never got this far in my official course, the book continues on to space complexity (Savitch's theorem, NL-completeness), intractability (hierarchy theorems, relativization, and circuit complexity), as well as more advanced topics like approximation algorithms, probabilistic algorithms, interactive proof systems, parallel computation, and cryptography, all of which I assume the author included in case the book is used in graduate-level Masters or PhD courses on specific topics.
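To illustrate the asymmetry at the heart of P vs NP, here's a small Python sketch (my own, not the book's) of why SAT belongs to NP: checking a proposed satisfying assignment (the "certificate") takes time linear in the formula size, even though finding one may be exponentially hard as far as anyone knows.

```python
# A CNF formula is a list of clauses; each clause is a list of nonzero
# ints, where k means variable k and -k means its negation.

def verify_sat(clauses: list[list[int]], assignment: dict[int, bool]) -> bool:
    # The formula is satisfied iff every clause contains at least one
    # literal made true by the assignment.
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in clauses
    )

# (x1 or ~x2) and (x2 or x3) and (~x1 or ~x3)
formula = [[1, -2], [2, 3], [-1, -3]]
assert verify_sat(formula, {1: True, 2: True, 3: False})
assert not verify_sat(formula, {1: True, 2: False, 3: True})
```

This "easy to check, (apparently) hard to find" structure is exactly what reductions exploit: to show a problem NP-complete, you translate SAT (or another known NP-complete problem) into it in polynomial time.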
Conclusion

Overall, this book introduces highly fascinating material at the intersection of multiple disciplines: computer science, logic, math, philosophy, electrical engineering, linguistics, etc. The amount of accompanying explanatory discussion, along with the ample worked-out examples, makes this book highly recommended in my opinion (and suitable for self-study, especially if you follow along with one of the free lecture materials I linked to above).
My only suggestion for improvement is to include some introductory material in the first section on context-sensitive grammars! I would've loved to have an official, well-backed source that I could reference and learn this from. And although it might seem a tad too advanced, I also think a brief introduction to what is known as the Curry-Howard(-Lambek) correspondence, which observes the fundamental relationships between computation / programming languages, formal logic / type theory, and category theory / topology, would be a highly relevant and significant perspective, one that would allow a student coming from another field to see all the fascinating connections linking everything together, or alternatively help a student branch out to the other fields to which the theory of computation can be applied. This "rosetta stone" would be a marvelous way to bring this book and field to higher prominence and relevance today.
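For a tiny taste of what that correspondence looks like in practice, here's a toy sketch (my own, under the propositions-as-types reading, with hypothetical names): an implication A → B corresponds to a function type, applying a function is modus ponens, and composing functions is the transitivity of implication.

```python
# Propositions-as-types in miniature: a value of type Callable[[A], B]
# plays the role of "a proof that A implies B".

from typing import Callable, TypeVar

A, B, C = TypeVar('A'), TypeVar('B'), TypeVar('C')

def modus_ponens(proof_ab: Callable[[A], B], proof_a: A) -> B:
    # From A -> B and A, conclude B: function application.
    return proof_ab(proof_a)

def syllogism(ab: Callable[[A], B], bc: Callable[[B], C]) -> Callable[[A], C]:
    # From A -> B and B -> C, conclude A -> C: function composition.
    return lambda a: bc(ab(a))

# str -> int composed with int -> bool yields str -> bool.
assert syllogism(len, lambda n: n > 0)("proof") is True
```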