PhD in Natural Language Processing
2012 (1 year, not completed)
University of Sydney, ə-lab
I spent a year in the School of Information Technology's schwa lab under the supervision of Dr. James Curran. I briefly researched opinion mining, publishing one paper, then delved into the difficult problem of word sense disambiguation, in particular techniques for improving performance on the closely related problem of word sense induction: the automatic extraction and construction of sense inventories from text corpora. Both problems boil down to discerning which sense of a word applies in a given context, e.g. the word bass in the two contexts:
Big dirty stinkin' bass
A fine bass fresh from the salty sea
Distinguishing coarse senses, between things as different as a frequency of sound and a marine animal, can be done with good accuracy. The difficulty lies in differentiating closely related senses, such as wood in a forest versus wood in a cutting board: the contextual differences are not so great and confusion begins to occur. The importance of resolving these senses accurately is demonstrated by the frequent failures of machine translation systems.
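The basic idea behind context-based disambiguation can be sketched in a few lines of Python. This is a toy illustration rather than my research code: a simplified Lesk-style approach that scores each candidate sense by how many words its gloss shares with the surrounding context. The sense labels and glosses below are hypothetical.

```python
# A toy sketch of context-overlap disambiguation (a simplified Lesk-style
# approach); the sense labels and glosses are invented for illustration.

SENSES = {
    "bass (fish)": "a marine or freshwater fish caught in the sea",
    "bass (sound)": "a low frequency sound or musical instrument tone",
}

def disambiguate(context: str, senses: dict) -> str:
    """Pick the sense whose gloss shares the most words with the context."""
    context_words = set(context.lower().split())
    return max(
        senses,
        key=lambda sense: len(context_words & set(senses[sense].split())),
    )

print(disambiguate("a fine bass fresh from the salty sea", SENSES))
# -> bass (fish): the gloss shares "sea" with the context
```

Real systems replace the hand-written glosses with sense representations induced from corpora, which is exactly where word sense induction comes in.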
My work centered on using datasets of unprecedented size in word sense induction to improve on state-of-the-art accuracy in WSI systems. I worked with an enormous dataset (~50GB of text) scraped from DMOZ, the Open Directory Project, an inventory of category-labelled websites from across the internet.
I suspended my PhD a little over a year after joining the lab to try freelance software engineering, then effectively left by not resuming it (losing my scholarship) at the beginning of 2013. I am very happy I had the privilege to experience research, be a part of the lab and its lovely characters, and explore a problem as interesting as word sense disambiguation, one stretching to the depths of meaning itself. The skills, knowledge and wisdom I developed conducting research have been a great asset to me ever since, and I will forever have a passion for discovery and learning.
2011, with first class Honors
University of Sydney
My honors thesis was in Astroinformatics, a developing and exciting field applying modern computer science to astronomy. I was supervised by Dr. Tara Murphy at the University of Sydney, and my work contributed to the VAST project, a computational astronomy initiative concerned with the detection of astronomical transients. I received an overall grade of 88 and a thesis grade of 92.
Astronomical transients are sources in the sky that change over time, worthy of study because they can reveal much about the mechanics of the universe. The data take the form of light curves, which in machine learning terms are time series, so I framed my research task as a time series classification problem.
My work contributed to the VAST project a comprehensive set of classification experiments using a variety of feature extraction techniques. All experiments had working software and were reproducible with a single command. I completed a detailed review of the time series classification literature, and augmented and applied some of its leading techniques to the problem.
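The framing above can be sketched in a few lines of Python. This is a toy illustration, not the actual VAST pipeline: extract simple summary features from each light curve, then classify with a one-nearest-neighbour rule. The example light curves and class labels are invented.

```python
import statistics

# A toy sketch of light-curve classification as time series classification:
# summarise each flux time series with a few features, then use 1-NN.
# The light curves and labels below are invented for illustration.

def features(light_curve):
    """Summary features of a flux time series: mean, spread, amplitude."""
    return (
        statistics.mean(light_curve),
        statistics.pstdev(light_curve),
        max(light_curve) - min(light_curve),
    )

def classify(light_curve, training_set):
    """Label from the nearest training example in feature space."""
    fv = features(light_curve)
    def distance(item):
        tv = features(item[0])
        return sum((a - b) ** 2 for a, b in zip(fv, tv))
    return min(training_set, key=distance)[1]

training = [
    ([10.0, 10.1, 9.9, 10.0], "steady"),    # flat light curve
    ([1.0, 8.0, 2.0, 9.0], "transient"),    # strongly variable
]
print(classify([0.5, 7.5, 1.5, 9.5], training))  # -> transient
```

The interesting work in practice lies in the feature extraction step; my experiments varied it across the techniques surveyed in the literature review.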
Bachelor of Information Technology (Adv) / Bachelor of Science
2007 - 2011
University of Sydney
I completed the Bachelor of IT in 2011 with a distinction average across IT subjects. My major in the Bachelor of Science was mathematics, spanning a variety of topics, though I was mostly interested in discrete maths and information theory.
2000 - 2006
Blue Mountains Grammar School
I completed the HSC at BMGS, taking 4-unit maths, chemistry, physics, English (Advanced) and modern history. I received a UAI of 96.65 (the UAI was the predecessor of the ATAR, on essentially the same scale).
At school I enjoyed debating and chess, and, thanks to one Dr. Chris Huxley, got my first ever taste of programming, coding Python in his after-school class.