About Me


I am an assistant professor in the Department of Computer Science at Rutgers University. Previously, I was a Postdoctoral Research Associate at the Computer Science & Artificial Intelligence Laboratory (CSAIL) of the Massachusetts Institute of Technology, working with Prof. Dina Katabi and Prof. Tommi Jaakkola. I obtained my Ph.D. degree from the CSE department of the Hong Kong University of Science and Technology, where my supervisor was Prof. Dit-Yan Yeung. I was also a visiting scholar in Prof. Eric Xing's group in the Machine Learning Department of Carnegie Mellon University. I am a Microsoft Fellow and a recipient of the Baidu Research Fellowship. Before my Ph.D., I received my B.S. degree from Shanghai Jiao Tong University in 2013 under the supervision of Prof. Wu-Jun Li.

Email: hoguewang AT gmail.com / hw488 AT cs.rutgers.edu / hogue.wang AT rutgers.edu / hwang87 AT mit.edu

Recruiting: I am recruiting PhD students starting in Fall 2023, as well as interns. Send me an email if you are interested in working with me at Rutgers.

[Twitter] [Facebook] [Github] [LinkedIn]

Research Interest


My research focuses on statistical machine learning and deep learning. Currently, I mainly work on Bayesian deep learning, probabilistic graphical models, and collaborative filtering, as well as their applications in healthcare, recommender systems, computer vision, natural language processing, data mining, and social network analysis.

News

  • Our paper on Bayesian deep learning for federated learning, "FedNP: Towards non-IID federated learning via federated neural propagation", is accepted at AAAI (11/18/22).

  • Our paper on causal and counterfactual recommender systems, "Explicit counterfactual data augmentation for recommendation", is accepted at WSDM (10/18/22).

  • Our papers on Bayesian deep learning, continuously streaming domain adaptation, and spatio-temporal forecasting, "Extrapolative continuous-time Bayesian neural network for fast training-free test-time adaptation" and "Earthformer: Exploring space-time transformers for earth system forecasting" are accepted at NeurIPS (09/14/22).

  • Our paper on multi-domain imbalanced learning and deep learning for health, "Artificial intelligence-enabled detection and assessment of Parkinson’s disease using nocturnal breathing signals", is accepted at Nature Medicine (08/02/22).

  • Our papers on multi-domain imbalanced learning and relational forecasting, "On multi-domain long-tailed recognition, generalization and beyond" and "Social ODE: Multi-agent trajectory forecasting with neural ordinary differential equations" are accepted at ECCV (07/03/22).

  • Our paper on domain adaptation Transformer, "Domain adaptation for time series forecasting via attention sharing", is accepted at ICML (05/13/22).

  • Our paper on Bayesian deep learning and interpretable ML for healthcare, "'My nose is running.' 'Are you also coughing?': Building a medical diagnosis agent with interpretable inquiry logics", is accepted at IJCAI (04/20/22).

  • Our three papers on causality, interpretable ML and Bayesian deep learning, "Causal transportability for visual recognition", "OrphicX: A causality-inspired latent variable model for interpreting graph neural networks", and "Bayesian invariant risk minimization", are accepted at CVPR (03/03/22).

  • Our paper, "Graph-relational domain adaptation", is accepted at ICLR (1/20/22).

  • We are organizing the ICLR 2022 Workshop on "PAIR^2Struct: Privacy, Accountability, Interpretability, Robustness, Reasoning on Structured Data" (12/06/21).

  • Our two papers on uncertainty estimation, "Context uncertainty in contextual bandits with applications to recommender systems" and "Training-free uncertainty estimation for dense regression: Sensitivity as a surrogate", are accepted at AAAI (12/01/21).

  • Grateful to receive NSF grant IIS-2127918 as PI, "RI: Small: Enabling Interpretable AI via Bayesian Deep Learning" (08/25/21).

  • Our two papers, "Adversarial attacks are reversible with natural supervision" and "Paint Transformer: Feed forward neural painting with stroke prediction", are accepted at ICCV (07/22/21).

  • Our three papers, "STRODE: Stochastic boundary ordinary differential equation", "Correcting exposure bias for link recommendation", and "Delving into deep imbalanced regression", are accepted at ICML (05/08/21).

  • Received Amazon Faculty Research Award to work on Domain Adaptation, Recommender Systems, Forecasting, and Bayesian Deep Learning. (04/28/21).

  • Our paper on causal learning and deep learning, "Generative interventions for causal learning", is accepted at CVPR (02/28/21).

  • Our paper on AI and Bayesian deep learning for health, "Assessment of medication self-administration using artificial intelligence", is accepted at Nature Medicine (10/30/20).

  • Our Bayesian deep learning survey paper, "A survey on Bayesian deep learning", is accepted and published at ACM Computing Surveys (10/01/20).

  • Our work BodyCompass was covered by: MIT News, Engadget, Yahoo, Technology Networks, Sleep Review, TechTimes, and other media outlets (09/30/20).

  • Our work on COVID-19 patient monitoring was covered by: CSAIL news, TechCrunch, Engadget, and other media outlets (09/30/20).

  • Our two papers, "Continuously indexed domain adaptation" and "Deep graph random process for relational-thinking-based speech recognition", are accepted at ICML (06/06/20).

  • We released a new TensorFlow implementation for our KDD 2015 paper "Collaborative deep learning for recommender systems" (06/06/20).

  • Our paper, "Learning guided electron microscopy with active acquisition", is accepted at MICCAI (06/06/20).

  • A new project page for an ongoing survey on Bayesian Deep Learning is set up (08/30/19).

  • A new project page for our NPN paper is set up with both Matlab and PyTorch code (06/30/19).

  • We are organizing the ICML 2019 Workshop on "Learning and Reasoning with Graph-Structured Representations" (03/15/19).

  • We are organizing the CVPR 2019 Workshop on "Towards Causal, Explainable and Universal Medical Visual Diagnosis" (03/11/19).

  • Our paper, "Rethinking knowledge graph propagation for zero-shot learning", is accepted at CVPR (02/24/19).

  • Our work on Deep Bayesian Networks is reported by MIT News (01/25/19).

  • Our paper, "ProbGAN: Towards probabilistic GAN with theoretical guarantees", is accepted at ICLR (12/22/18).

  • Our two papers, "Bidirectional inference networks: A class of deep Bayesian networks for health profiling" and "Recurrent Poisson process unit for speech recognition", are accepted at AAAI (11/01/18).

  • Our paper, "Deep learning for precipitation nowcasting: A benchmark and a new model", is accepted at NIPS (09/05/17).

  • A new project page for our CDL paper is set up with a brief list of CDL variants (06/12/17).

  • Our paper, "Relational deep learning: A deep latent variable model for link prediction", is accepted at AAAI (11/11/16).

  • Our survey/review paper on Bayesian deep learning, "Towards Bayesian deep learning: a framework and some existing methods", is accepted in TKDE (08/22/16).

  • Two of our papers, "Natural parameter networks: a class of probabilistic neural networks" and "Collaborative recurrent autoencoder: recommend while learning to fill in the blanks", are accepted at NIPS (08/15/16).

  • Gave the talk "Bayesian deep learning for integrated intelligence: bridging the gap between perception and inference" at the Chinese University of Hong Kong (06/17/16). [slides]

  • Gave the talk "Bayesian deep learning for integrated intelligence: bridging the gap between perception and inference" at the Baidu NLP Seminar (06/13/16). [slides]

  • We gave a talk about Bayesian Deep Learning at ACML (11/22/15). [slides]

  • Gave talks about Bayesian Deep Learning at MSRA (09/11/15) and Baidu (11/05/15). [slides]

  • Our paper "Convolutional LSTM network: A machine learning approach for precipitation nowcasting" is accepted at NIPS (09/04/15). [pdf]

  • Our paper "Collaborative deep learning for recommender systems" is accepted at SIGKDD (05/13/15). [pdf]

  • Gave a talk about "Relational Stacked Denoising Autoencoder for Tag Recommendation" at the HKUST-EPFL Workshop on Data Science (12/02/14). [slides]

  • This homepage was set up (11/18/14).
