Gholamreza (Reza) Haffari

Senior Lecturer (equivalent to a North American Associate Professor)
Director of the Graduate Diploma in Data Science
Faculty of Information Technology
Monash University
Clayton, VIC 3800, Australia

Email: gholamreza.haffariATmonash.edu
Office: 131A, Building 63
Tel: +61 3 9905-8106
Fax: +61 3 9905-5159

My research spans two broad areas: Natural Language Processing and Machine Learning. Enabling computers to understand human language has long been a central goal of AI. Human language is an intricate system: each sentence has its own grammatical structure, inter-connected references, and set of possible meanings. The field of Natural Language Processing (NLP) aims to build computational models of language in order to make predictions based on real-world textual data. Example applications of NLP include machine translation, information extraction, and question answering. Tools developed for these problems are increasingly becoming part of daily life, from speech and dialogue systems on mobile devices, to structured search on the web, to real-time translation. NLP is a rich intersection of formal modeling, applied algorithms, and scalable data systems, and has served as an important application domain for related fields such as Machine Learning (ML).

My research aligns with the Machine Learning flagship in our faculty. In the past, I have organised the following reading groups in our faculty: Deep Learning Reading Group, The Machine Learning Book Reading Group, and Natural Language Processing Reading Group.

Prospective students: For PhD applications, please read this before contacting me. International undergraduate students: please email me only if you have your own funding for an internship, as I do not have any funding to support interns.

Research Areas

  • Deep learning methods, particularly why they work and how to use them well.
  • Structured prediction, particularly predicting complex linguistic structure.
  • Discourse, semantics, syntax, and morphology; particularly for machine translation.
  • Advanced data structures (e.g., succinct suffix trees) for large-scale NLP problems, such as language modelling.
  • Learning with limited amounts of supervision and large amounts of un-annotated data; learning across different domains of data.
  • Learning and inference in probabilistic graphical models, particularly for NLP problems.
  • Non-parametric Bayesian models, particularly for NLP.
  • Reinforcement learning, Markov decision processes, and multi-armed bandits.
  • Dialogue systems, particularly using deep learning approaches.
  • Learning programs from data, particularly with deep learning.


News

  • Oct 2017: Congratulations to Ehsan Shareghi on completing his PhD thesis. Ehsan will soon join the University of Cambridge as a Postdoctoral Fellow.

  • Sep 2017: Congratulations to Sunil Aryal on submitting his PhD thesis. Sunil has taken up a Lecturer position at Federation University.

  • July 2017: Technical Program Chair of the Australian Language Technology Association (ALTA) workshop, which will be held at Queensland University of Technology (Dec 2017). Please consider submitting a paper to ALTA 2017.

  • July-August 2017: Attended ACL 2017 and IJCAI 2017.

  • March 2017: Area Chair for Machine Learning at the 8th International Joint Conference on Natural Language Processing (IJCNLP 2017). Please consider submitting your work to the conference.

  • Dec 2016: Excited to co-lead a team for a workshop on neural MT at CMU this summer, as part of the JSALT workshop series operated by JHU.
  • March 2016: Local Co-Chair of the Australian Language Technology Association (ALTA) workshop, which will be held at Monash (Dec 2016). Please consider submitting a paper.

  • 2016: Attended NAACL 2016 (San Diego) and ICML 2016 (New York).

  • Dec 2015: Attended NIPS 2015 in Montreal; lots of interesting papers to read and follow up on.

  • Sep 2015: Congratulations to my PhD student, Ajay Ganesh, on a great PhD thesis and his graduation. Ajay has joined Andrew McCallum's group at UMass Amherst as a postdoc.

  • July-Aug 2015: Excited to be part of the team for the Continuous Wide-band Machine Translation workshop held at the University of Washington, Seattle.

Grants and Awards

  • Neural Models for Structured Transduction. Nvidia GPU Grant, 2017.

  • Learning Deep Semantics for Automatic Translation between Human Languages. 2016 – 2019; ARC Discovery Project; $450k.

  • Best (Student) Paper Award, Australian Language Technology Association (ALTA) Workshop, 2016.

  • Improving the quality of primary health care for TAC clients. 2016-2017; Transport Accident Commission (TAC) Victoria; $174k.

  • Data Analysis of Victoria Police Incident and Injuries Data. 2016; Victoria Police; $65k.

  • Large-scale Qualitative Data Analysis. 2014-2015; Government of Victoria; $65k.

  • Scalable Semi-Supervised Learning for Structured Prediction. 2014-2015; NICTA; $49k.

  • Strategic Grant Seed Fund, Monash University. 2013-2014; $45k.

  • Dean's Award for Excellence in Research by an Early Career Researcher. 2012; Monash University; $10k.