Publications
All Publications
I have worked in several domains, including computer networks, caching, task scheduling, data management, and cloud computing. The list below, however, covers only my work in machine learning, which has attracted most of my attention in recent years.
For the full publication list, please refer to:
Selected Publications
[1] Bounoua, M., Franzese, G. and Michiardi, P., Score-based O-INFORMATION estimation. ICML (2024).
[2] Franzese, G., Bounoua, M. and Michiardi, P., MINDE: Mutual information neural diffusion estimation. ICLR (2024).
[3] Bounoua, M., Franzese, G. and Michiardi, P., Multi-modal latent diffusion. Entropy (2024).
[4] Tran, B.-H., Franzese, G., Michiardi, P. and Filippone, M., One-line-of-code data mollification improves optimization of likelihood-based generative models. NeurIPS (2023).
[5] Franzese, G., Corallo, G., Rossi, S., Heinonen, M., Filippone, M. and Michiardi, P., Continuous-time functional diffusion processes. NeurIPS (2023).
[6] Franzese, G., Rossi, S., Yang, L., Finamore, A., Rossi, D., Filippone, M. and Michiardi, P., How much is enough? A study on diffusion times in score-based generative models. Entropy 25, 4 (2023), 633.
[7] Franzese, G., Milios, D., Filippone, M. and Michiardi, P., Revisiting the effects of stochasticity for Hamiltonian samplers. ICML (2022), 6744–6778.
[8] Tran, B.-H., Rossi, S., Milios, D., Michiardi, P., Bonilla, E.V. and Filippone, M., Model selection for Bayesian autoencoders. NeurIPS 34 (2021), 19730–19742.
[9] Mita, G., Filippone, M. and Michiardi, P., An identifiable double VAE for disentangled representations. ICML (2021), 7769–7779.
[10] Tran, G.-L., Milios, D., Michiardi, P. and Filippone, M., Sparse within sparse Gaussian processes using neighbor information. ICML (2021), 10369–10378.
[11] Mita, G., Papotti, P., Filippone, M. and Michiardi, P., LIBRE: Learning interpretable Boolean rule ensembles. AISTATS (2020), 245–255.
[12] Rossi, S., Michiardi, P. and Filippone, M., Good initializations of variational Bayes for deep models. ICML (2019), 5487–5497.
[13] Milios, D., Camoriano, R., Michiardi, P., Rosasco, L. and Filippone, M., Dirichlet-based Gaussian processes for large-scale calibrated classification. NeurIPS 31 (2018).
[14] Cutajar, K., Bonilla, E.V., Michiardi, P. and Filippone, M., Random feature expansions for deep Gaussian processes. ICML (2017), 884–893.