Tuesday, January 24, 2012

The Nuts and Bolts of Proofs by Antonella Cupillari

     In my quest to better understand bioinformatics, I have found myself on the topic of algorithms.  After thumbing through Cormen's classic text and a couple of other resources online, I decided that a better grasp of mathematical proofs would give me the tools needed to tackle algorithms.  I picked up The Nuts and Bolts of Proofs by Antonella Cupillari and worked my way through it in about a month.  The topics covered were: direct proof, proof by contrapositive, equivalence theorems, use of counterexamples, mathematical induction, uniqueness and existence theorems, and equality of sets and numbers.
      Cupillari does an excellent job explaining how to construct proofs, and leaves the reader with plenty of exercises after each chapter, as well as a hefty section of exercises without solutions and a collection of proofs.  I felt she did a good job covering the concepts of proofs, using no math beyond high school algebra to place the emphasis on constructing arguments with mathematical logic.  This allowed me to start thinking in proofs sooner, without being bogged down by mathematical theory.  Having completed the chapter exercises, and with no other training in mathematical proofs, I feel very comfortable constructing the proofs that make up a large portion of the exercises in many algorithms textbooks.  If you are interested in learning proofs and aren't afraid to put in a little work, I would definitely recommend this book!
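As a small taste of the style the book teaches, here is a sketch of a proof by mathematical induction (the claim is a standard textbook example of mine, not one taken from the book):

```latex
\textbf{Claim.} For every integer $n \ge 1$,
$\sum_{i=1}^{n} i = \frac{n(n+1)}{2}$.

\textbf{Proof (by induction on $n$).}
\emph{Base case:} for $n = 1$, the sum is $1 = \frac{1 \cdot 2}{2}$.
\emph{Inductive step:} assume the claim holds for some $n = k$, so that
$\sum_{i=1}^{k} i = \frac{k(k+1)}{2}$. Then
\[
\sum_{i=1}^{k+1} i = \frac{k(k+1)}{2} + (k+1) = \frac{(k+1)(k+2)}{2},
\]
which is exactly the claim for $n = k+1$. By induction, the claim holds
for all $n \ge 1$. \qed
```

The real work in exercises like this is identifying the base case and making sure the inductive step actually uses the hypothesis, which is exactly what the book drills.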

Stanford's Machine Learning Course, ml-class.org

     After two months of Octave and online videos, I finished Stanford's online course on machine learning.  I am very pleased with the course, and wish it could continue indefinitely.  The course consisted of online videos, content quizzes, and programming assignments in Octave.  Topics covered included linear regression, neural networks, support vector machines, and anomaly detection, presented in a way consistent with practical applications of these algorithms.  The programming exercises were done in Octave, GNU's free alternative to Matlab, and stressed vectorization for efficient computation.  This enabled students to implement a computationally expensive neural network and train it with backpropagation for optical character recognition.
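To give a flavor of the course's first algorithm, here is a minimal pure-Python sketch of batch gradient descent for one-variable linear regression (the function name, data, and parameters are my own illustration; the course's assignments do this in Octave with vectorized matrix operations instead of loops):

```python
# Batch gradient descent for one-variable linear regression:
# fit y ≈ theta0 + theta1 * x by repeatedly stepping down the gradient
# of the mean squared error cost J = (1/2m) * sum(errors^2).

def gradient_descent(xs, ys, alpha=0.01, iterations=5000):
    m = len(xs)
    theta0, theta1 = 0.0, 0.0
    for _ in range(iterations):
        # Prediction errors under the current parameters.
        errors = [theta0 + theta1 * x - y for x, y in zip(xs, ys)]
        # Partial derivatives of the cost with respect to each parameter.
        grad0 = sum(errors) / m
        grad1 = sum(e * x for e, x in zip(errors, xs)) / m
        # Simultaneous update of both parameters, as stressed in the lectures.
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# Data drawn from the line y = 2x + 1; the fit should recover it closely.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]
t0, t1 = gradient_descent(xs, ys)
```

Replacing these explicit loops with matrix operations is the vectorization step the course hammers on; it is what makes the same idea fast enough to train a neural network.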
     I learned a lot about linear algebra in the course, and developed an appreciation for its broad uses in numerical problems.  Learning more about linear algebra is a priority, and I plan to take a course on it during graduate school or sooner; for right now, a handle on matrix multiplication and 3D coordinates will have to do.  Beyond basic linear algebra, the course was not very math heavy, requiring no proofs or formalism.  That being said, some of the vectorization required for the homework assignments was a little tricky, but nothing beyond a little pen-and-paper work to figure out.
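For reference, the matrix multiplication underlying all of that vectorized code is itself only a few lines (a plain-Python sketch of my own; Octave's `*` operator does this natively and far faster):

```python
def mat_mul(a, b):
    """Multiply matrix a (n x k) by matrix b (k x m), given as lists of rows."""
    assert len(a[0]) == len(b), "inner dimensions must agree"
    return [[sum(a[i][t] * b[t][j] for t in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

# Example product of two 2x2 matrices.
a = [[1, 2],
     [3, 4]]
b = [[5, 6],
     [7, 8]]
product = mat_mul(a, b)  # [[19, 22], [43, 50]]
```

Each output entry is the dot product of a row of the first matrix with a column of the second, which is exactly the pen-and-paper work that makes the vectorized homework solutions click.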
     If you are looking for an introduction to machine learning in general, or want to learn more about a specific algorithm covered in the course, I would recommend going to ml-class.org and watching some of the videos.  I am glad I took this course, and look forward to using the knowledge I gained in my research.