Gholamreza (Reza) Haffari

  Senior Lecturer (equivalent to a North American Associate Professor)
  Director of the Graduate Diploma in Data Science
  Faculty of Information Technology
  Monash University
  Clayton, VIC 3800, Australia



Email: gholamreza.haffariATmonash.edu
Office: 131A, Building 63
Tel: +61 3 9905-8106
Fax: +61 3 9905-5159




My research spans two broad areas: Natural Language Processing and Machine Learning. Enabling computers to understand human language has been a central goal of AI. Human language is an intricate system; each sentence has its own grammatical structure, inter-connected references, and set of possible meanings. The field of Natural Language Processing (NLP) aims to build computational models of language in order to make predictions from real-world textual data. Example applications of NLP include machine translation, information extraction, and question answering. Tools developed for these problems are increasingly becoming part of daily life, from speech and dialogue systems on mobile devices to structured search on the web to real-time translation. NLP is a rich intersection of formal modelling, applied algorithms, and scalable data systems, and has served as an important application domain for related fields such as Machine Learning (ML).

My research aligns with the User Modelling and Language Technology and the Data Science research areas in our faculty. In the past, I have organised The Machine Learning Book Reading Group. Our faculty runs Research Seminars and the Dean's Seminar Series; please feel free to join.

For PhD applications, please read this before you contact me. To international undergraduate students: please email me only if you have your own funding for an internship, as I do not have any funding to support interns.

Research Areas

  • Deep learning methods, particularly why they work and how to use them well.
  • Structured prediction, particularly predicting complex linguistic structure.
  • Discourse, semantics, syntax, and morphology, particularly for machine translation.
  • Advanced data structures (e.g., succinct suffix trees) for large-scale NLP problems, such as language modelling.
  • Learning with limited amounts of supervision and large amounts of un-annotated data; learning across different domains of data.
  • Learning and inference in probabilistic graphical models, particularly for NLP problems.
  • Non-parametric Bayesian models, particularly for NLP.
  • Reinforcement learning, Markov decision processes, and multi-armed bandits.
  • Dialogue systems, particularly using deep learning approaches.
  • Learning programs from data, particularly with deep learning.

Highlights

  • April 2018: Congratulations to Daniel (Postdoc), Sameen (PhD student), Ming (PhD student), and Poorya (PhD student) for the four papers accepted to ACL 2018.

  • March 2018: Thanks to Google for supporting our team with a Google Faculty Research Award.

  • Feb 2018: I will be co-organising a workshop on Deep Learning Approaches for Low-Resource NLP at ACL 2018; please consider submitting your work.
  • Feb 2018: Two long papers accepted to NAACL-HLT 2018. Congratulations to student co-authors Quan Tran and Poorya Zaremoodi.

  • Oct 2017: Congratulations to Ehsan Shareghi on completing his PhD thesis. Ehsan will soon join the University of Cambridge as a Postdoctoral Fellow.

  • Sep 2017: Congratulations to Sunil Aryal for submitting his PhD thesis. Sunil has taken up a Lecturer position at Federation University.

  • July 2017: Technical Program Chair of the Australian Language Technology Association (ALTA) workshop, to be held at Queensland University of Technology (Dec 2017). Please consider submitting a paper to ALTA 2017.

  • July-August 2017: Attended ACL 2017 (Vancouver) and IJCAI 2017 (Melbourne).

  • March 2017: Area Chair for Machine Learning at the 8th International Joint Conference on Natural Language Processing (IJCNLP 2017). Please consider submitting your work to the conference.

  • Dec 2016: Excited to co-lead a team for a workshop on neural MT at CMU this summer, as part of the JSALT workshop series operated by JHU.
  • March 2016: Local Co-Chair of the Australian Language Technology Association (ALTA) workshop, to be held at Monash (Dec 2016). Please consider submitting a paper.

  • 2016: Attended NAACL 2016 (San Diego) and ICML 2016 (New York).

  • Dec 2015: Attended NIPS 2015 in Montreal; lots of interesting papers to read and follow up on.

  • Sep 2015: Congratulations to my PhD student, Ajay Ganesh, on a great PhD thesis and graduation. Ajay has joined Andrew McCallum's group at UMass-Amherst as a postdoc.

  • July-Aug 2015: Excited to be part of the team for the Continuous Wide-band Machine Translation workshop held at the University of Washington, Seattle.

Grants and Awards

  • Google Faculty Research Award; US$95k (2018-2019).

  • ARC Discovery Project; A$450k (2016-2019).

  • Nvidia GPU Grant (2017).

  • Transport Accident Commission - Victoria; A$174k (2016-2017).

  • Best (Student) Paper Award, Australasian Language Technology Association Workshop (2016).

  • Victoria Police; A$65k (2016-2017).

  • Government of Victoria; A$65k (2014-2015).

  • National ICT Australia (NICTA); A$49k (2014-2015).

  • Strategic Grant Seed Fund; Monash University; A$45k (2013-2014).

  • Dean's Award for Excellence in Research by an Early Career Researcher; Monash University; A$10k (2012).

Education