Sunday, May 07, 2006

first year of graduate school: lessons learned

A few days ago I went to the final class of my first year of graduate school. I just finished (well, a few things are still due, but there are no more classes) the first year of the Robotics PhD program at Carnegie Mellon University.

My first semester I took Appearance Modeling (a graphics/vision course) and Machine Learning. This past semester I took Advanced Perception (aka vision 2) and Kinematics, Dynamics & Control (a robot/physics course). In addition to these courses, I have been regularly attending CMU's Computer Vision misc-read reading group. I have also had the opportunity to collaborate with many other graduate students on course projects.

The course projects that I worked on are:

Demultiplexing Interreflections with Jean-Francois
Modeling Text Corpora with Latent Dirichlet Allocation with Jon
Learning to Walk without a Leash with Geoff and Mark
Detecting Objects with Multiple Segmentations and Latent Dirichlet Allocation with Jon

I probably learned the most new concepts from the Machine Learning community. Graphical models are definitely very trendy in Computer Vision in 2006. Almost everyone wants to be Bayesian about random variables. After this first year of graduate school, I've expanded my vocabulary to include terms such as SVMs, kernel methods, spectral clustering, manifold learning, graphical models, texton-based texture modeling, boosting, MCMC, EM, density estimation, pLSA, variational inference, Gibbs sampling, RKHS...
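
To make at least one of those terms concrete, here's a toy Gibbs sampling sketch (my own illustrative Python example, not code from any of the courses): for a zero-mean, unit-variance bivariate Gaussian with correlation rho, each coordinate's conditional given the other is itself Gaussian, so alternately drawing from the two conditionals produces samples from the joint.

    import numpy as np

    def gibbs_bivariate_gaussian(rho=0.8, n_samples=5000, seed=0):
        # Gibbs sampling for a zero-mean bivariate Gaussian with
        # correlation rho: x | y ~ N(rho*y, 1 - rho^2) and vice versa,
        # so we simply alternate between the two Gaussian conditionals.
        rng = np.random.default_rng(seed)
        cond_std = np.sqrt(1.0 - rho ** 2)  # std dev of each conditional
        x = y = 0.0                         # arbitrary starting point
        samples = np.empty((n_samples, 2))
        for i in range(n_samples):
            x = rng.normal(rho * y, cond_std)  # draw x given current y
            y = rng.normal(rho * x, cond_std)  # draw y given new x
            samples[i] = (x, y)
        return samples

    # The empirical correlation should approach rho (modulo burn-in).
    samples = gibbs_bivariate_gaussian(rho=0.8)
    print(np.corrcoef(samples[:, 0], samples[:, 1])[0, 1])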

However, the one key piece of advice that I keep hearing over and over is the following:
Do not blindly throw Machine Learning algorithms at a vision problem in order to beat the performance of an existing algorithm.

I don't think I'll stray from Machine Learning; however, it is very important to understand what a Machine Learning algorithm is actually doing and when it works or fails. On another note, next semester I'm taking Carlos Guestrin's Probabilistic Graphical Models course.
