On Thursday night, I saw "The Machine" -- a Pink Floyd tribute band -- play at B.B. King's in New York City with my mom. Yes, with my mom; she's quite the avid Pink Floyd fan. 'Twas an excellent show; I got to hear every song from Animals, plus Money, Hey You, Time, Us and Them, Wish You Were Here, Shine On You Crazy Diamond, and many more.
Today it was raining in Patchogue, and I realized that I truly miss snow. I also miss Shadyside.
I've been working a lot lately and I will have some fun text segmentation results to show next week.
I hope everybody has a safe New Year's Eve tonight!!!
Saturday, December 31, 2005
Wednesday, December 21, 2005
on LI for the Holiday Season
Today was my first time driving to Patchogue, LI from Pittsburgh, PA. I only hit a little bit of GW Bridge traffic, and the trip took 7.5 hours. I also found out that I got an A in Machine Learning. I still have to wait for my grade in Appearance Modeling (but it should be an A).
*Update* I did get my second A in Appearance Modeling.
Happy Holidays to everyone!
Here is a link to the Google Maps version of my Patchogue Jogging Path.
Snow action + Long Drive to LI
I went downhill skiing yesterday (first time in several years) at Seven Springs and it was really really fun! It was also my first time night skiing. I can't wait to go again; however, next time I might rent a snowboard.
In addition, here are some pics from last week's adventure in Western New York.
Here's a picture of my brother, Matt.
Here is a trail map for the park that we went hiking in.
I will be driving to Long Island for the Holiday Season in a few hours. I'll be missing the warmth and love of Shadyside.
Monday, December 19, 2005
"Once you make a decision, the universe conspires to make it happen."
"Once you make a decision, the universe conspires to make it happen."
-Ralph Waldo Emerson
I just got back from the Union Grill with some first-year Robograds. We had our Machine Learning final examination this morning, and it was so much fun! I'm sure I did very well on the exam, because I'm very much into Machine Learning. However, one concept whose significance I failed to appreciate was PAC learning. What else is there to say: I am an anti-PAC-learning kind of person.
I have a busy day ahead of me. First on the list is a visit to EMS for some snow gear. I should also mention that I've recently been playing more guitar (mastering overlapping modes) and was just introduced to a band called Sonata Arctica.
Sunday, December 18, 2005
Geoff's Holiday Party Picture
Friday, December 16, 2005
{Eigen,Dead,Snow}
Eigenvalues and eigenvectors put me in a very special place. I wish more people would grok the beauty behind the vector space approach to mathematics.
Tonight I am going to see Dark Star Orchestra perform at Mr. Small's theatre. They are a Grateful Dead cover band and I've heard many great things about them. I'm wearing my tie-dye shirt right now and I'm quite excited. I walked to campus today while listening to YEM and whistling Guyute. I'll be very glad if I get a {Cassidy,UJB,Dark Star,HOTW,Eyes,Scarlet Begonias} tonight.
I will post pictures of my snow-filled adventures with my brother Matt very soon. I also plan on going skiing on Tuesday, and I'm hella-excited. And if you're wondering, 'hella-excited' is indeed a technical term denoting an 8.7/10.0 on the excitement/anticipation scale.
The role of fiction: I am Michael Valentine Smith
Fiction plays a significant role in my everyday life. The reason I prefer books over movies is that literature is more picturesque. Each image induced by a scenario depicted in a novel is painted with the internal brush, and I believe that such a mental exercise is healthy.
Generally the plot of a novel is somehow deeply related to the writer's own life, but one must realize that when I read a novel I'm not simply seeing the same Mona Lisa that the author painted. In the same way that all observations are theory-laden, when I read novel X and you read novel X, we are seeing somewhat different pictures. There will always be certain concepts related to human life portrayed in the novel that make one reflect upon his/her own life experiences, and such an intimate connection between past experiences and the current immersion in the novel is not repeatable (with respect to other agents reading the novel).
It actually goes deeper than that. The world that each one of us lives in has been shaped by our life experiences. However, somebody will still ask me a question once in a while that is so deeply rooted in my own experiences that I cannot help but reply with an "I don't know." In reality there's a good chance that I know the answer (known to myself). I simply choose not to attempt to project the answer from my own inner world onto their own little world. Sometimes concepts are lost in translation, and until I feel that a particular concept will make the my-world to your-world leap unscathed, I will refrain from any such translations.
Don't think for one second that my musings into the world of literature are mini-journeys independent of the grand problem of object recognition. I'm simply trying to convey the point that there is something about past experience that is deeply related to current experience.
On another note, I'm currently reading Heinlein's Stranger in a Strange Land. I have become more like Michael Valentine Smith and he has become a little bit more like me.
Wednesday, December 07, 2005
We spend a good portion of our lives asleep
I didn't get much sleep last night; however, waking up was remarkably easy this morning. I'll have plenty of time to catch up on sleep when I'm old. I think I'll try to keep my days long and my nights short. I might take a nap this afternoon, but it will surely be followed by an intense session at the gym and/or a run.
During the cold winter months we - mere mortals - spend lots of time thinking about warmth. Actually, it is more than just 'thinking' about warmth. Like the attraction between heavenly bodies, we gravitate towards warmth. Such an inverse-square relationship renders the force between us and a source of heat negligible when the distance is large; however, at close distances this force is inescapable. I shouldn't say inescapable, because I should stay true to my heavenly-body analogy. This brutal quest for warmth implies a periodic relationship with our heat sources (consider the elliptical trajectories of the planets in our solar system). Periodicity is the driving element of life. Can you name an aspect of life that is not cyclical?
I think I'll take all the warmth I can get (and my apartment is pretty toasty most of the time) while gravitating towards a friendly source of fire. I don't believe that the universe will ever die a heat death, but I'd rather have a warm demise than a cold one.
On another note, I'm going to visit my brother this weekend up in Buffalo, NY. I saw a movie last night, and one character definitely reminded me of my bro. It was the CIA-father in Meet the Fockers.
Currently listening to: Fire
Tuesday, December 06, 2005
A Mitchell, a Moore, and an LDA Hacker
If you don't know what this title refers to, then I'll quickly remind you. Tom Mitchell and Andrew W. Moore are the two (high-caliber) professors who are teaching the Machine Learning course I'm taking this semester.
First of all, I'd like to mention that Tom Mitchell is teaching a class titled "Advanced Statistical Language Processing" next semester.
The course description goes as follows:
This is an advanced, research-oriented course on statistical natural language processing. Students and the instructor will work together to understand, implement, and extend state-of-the-art machine learning algorithms for information extraction, named entity extraction, co-reference resolution, and related natural language processing tasks. The course will involve two primary activities: reading and discussing current research papers in this area, and developing a novel approach to continuous learning for natural language processing. More specifically, as a class we will work together to build a computer system that runs 24 hours/day, 7 days/week, performing two tasks: (1) extracting factual content from unstructured and semi-structured web pages, and (2) continuously learning to improve its competence at information extraction. We will begin the course with a simple prototype system of this type. During the course, students will populate it with a variety of statistical learning methods that enable it to extract information from the web, and to continuously learn to improve its capabilities.
Doesn't that sound awesome!
Secondly, Andrew Moore gave the last two lectures on reinforcement learning. What I like about this theory is that it literally places action in perception (remember Alva Noe's book titled "Action in Perception"). I think it is always exciting to see Andrew Moore talk about something he is passionate about, and these last two lectures were high quality.
Last, but not least, I got my Machine Learning project back today. 100/100. It was a great project and I feel like Jon and I deserved it. I'm generally much more proud of a high grade on a longer project than on an exam, and that is why I'm spreading my joy. I sent an email to Jon (since he is away at NIPS [lucky, lucky, lucky] this week) and he said something like, "I went to a Jordan tutorial on hierarchical Dirichlet processes." Then he summarized it all in three words: "It was intense." One day I will cross paths with this Michael Jordan fellow and I will be ready for the intensity that ensues.
Hey You
Hey you, out there in the cold
Getting lonely, getting old
Can you feel me?
Hey you, standing in the aisles
With itchy feet and fading smiles
Can you feel me?
Hey you, don’t help them to bury the light
Don’t give in without a fight.
Hey you, out there on your own
Sitting naked by the phone
Would you touch me?
Hey you, with your ear against the wall
Waiting for someone to call out
Would you touch me?
Hey you, would you help me to carry the stone?
Open your heart, I’m coming home.
Hey You - Pink Floyd
Saturday, December 03, 2005
FIRST LEGO League Robotics Challenge Judging
I just returned from a fun-filled morning at NREC where I was a programming judge for the FIRST LEGO League Robotics Challenge. In this challenge, kids (9-14 years of age) had to build robots using LEGOs and program them using a graphical programming interface. My job was to talk to each team for 10 minutes while asking them questions and scoring them based on things such as {sophistication of approach, efficiency of approach, teamwork, planning the programming task, etc}.
It made me very happy to see young kids excited about science and technology. Playing with LEGOs is a very hands-on approach to engineering, and I saw kids from many different age groups show off their creativity. I was also particularly impressed by the large number of girls in the challenge and the fact that many of the most outstanding programs were developed by them.
I'm pretty sure that I want to do this again next year and I might even become a part of a longer educational program during the summer.
Friday, December 02, 2005
Navigating two worlds
Today I want to talk about synonyms and photometric invariance while comparing and contrasting the world of language and the visual world. My primary objective is to build a vision system that can learn to recognize objects in an unsupervised or semi-supervised fashion. I want to stress the fact that I'm much more interested in machine learning these days than I ever was.
I have recently been introduced to unsupervised techniques in the field of statistical language modeling, and the following discussion will revolve around the differences between the man-made world of text and the natural world of images.
Here, when I mention text I am referring to a legitimate configuration of English words. It is important to realize that in the world of language, there are two very different uses for words. In one case, words are mere vessels for the transportation of a high-level concept. Here, there is nothing special about a particular choice of words and many different configurations of words map to the same high-level semantic interpretation. On the other hand, a poetic use of language strives to convey a high-level meaning with a carefully selected configuration of words.
In the visual domain we can also treat images as having many purposes. In the first case images could capture 'a' configuration of the world and in the second case they could capture 'the' configuration of the world. Allow me to explain. 'A' configuration of the world represents a possible configuration of objects where there is nothing particularly interesting about that specific configuration. For example, when depicting 'a' configuration of hikers camping on a mountaintop, the color of the tent doesn't alter the high-level fact that there is a tent, and the presence of snow on the mountain doesn't alter the mountain. On the other hand, when using images to capture 'the' configuration of the world, the color of the tent and the presence of the snow do matter. 'The' configuration would represent some high-level concept such as 'Julie and Tim camping on Mount Sefton in March.' In the 'a' and 'the' configurations nothing was stated about the sky (cloudy, sunny, sunset, sunrise), thus both images could contain different skies while being true to their 'a' or 'the' purposes.
Although understanding the world of English text is easier than understanding the visual world, there are many similarities. Statistical co-occurrence is the key idea behind unsupervised topic discovery and parts-of-speech tagging, while it is also a necessary notion when trying to understand images. When local structures (letters, words, image patches) co-occur, we can use induction to explain this phenomenon. In some sense understanding data is not much more than mere compression of the data. Here I don't refer to compression as a way of reducing data set size so that the initial data set can be reconstructed in some L2-norm sense. I'm referring to a compression (a projection onto a lower-dimensional space) such that the reconstruction preserves the high-level {semantic,visual} attributes that are relevant. Consider the 'the' configuration of the hikers mentioned in the paragraph above. A good compression would preserve {the identities of the hikers, the presence of snow, the color of the tent} but it would discard anything about the sky if it was not relevant.
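To make the 'compression as understanding' idea a little more concrete, here is a minimal sketch (not my actual pipeline): assuming a made-up four-document corpus and scikit-learn, it projects a toy term-document count matrix onto a two-dimensional space with truncated SVD and checks that documents about the same subject land near each other.

# Hedged sketch: compression as projection onto a lower-dimensional space.
# The corpus, the two components, and the library choice are all illustrative.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import TruncatedSVD

docs = [
    "hikers pitch a tent on the snowy mountain",
    "the tent and the snow cover the mountain camp",
    "the guitarist plays a long solo on stage",
    "the band plays guitar and drums on stage",
]

counts = CountVectorizer().fit_transform(docs)  # term counts per document
embedding = TruncatedSVD(n_components=2).fit_transform(counts)

# Documents 0 and 1 (camping) should land closer to each other than to
# documents 2 and 3 (music), even though no labels were ever supplied.
print(np.round(embedding, 2))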
Within a few days I'll be posting my LDA results on unsupervised topic discovery in text. I will then quickly delineate some of the new directions I've been taking with respect to unsupervised segmentation of text (which was artificially concatenated so as to eliminate the spaces) and how these results can be applied to the visual domain, where object boundaries are what we want to find.
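In the meantime, here is a hedged sketch of what unsupervised topic discovery with LDA looks like, using scikit-learn's LatentDirichletAllocation on a made-up corpus; the documents, the choice of two topics, and the printed word lists are illustrative assumptions, not the results I'll be posting.

# Toy LDA run: discover two topics from word co-occurrence alone.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "eigenvalues and eigenvectors span the vector space",
    "a basis for the vector space and its eigenvalues",
    "the tent and the snow on the mountain trail",
    "hikers follow the snowy mountain trail to camp",
]

vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

# Print the top four words of each discovered topic.
vocab = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top_words = [vocab[i] for i in topic.argsort()[::-1][:4]]
    print("topic", k, ":", ", ".join(top_words))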