Uncertainty quantification of predictive models is crucial in decision-making problems. To this end, quantile regression can be used to construct prediction intervals, but its validity relies on asymptotic and regularity conditions and depends on the underlying model. Conformal prediction (CP) is a theoretically grounded framework for constructing prediction intervals with a finite-sample, distribution-free marginal coverage guarantee, which holds under the assumption that training and test points are exchangeable. The presence of missing values in real data brings additional challenges to uncertainty quantification. Despite an abundant literature on missing data, to the best of our knowledge, there is no work studying the quantification of predictive uncertainty in the presence of missing values. In this work, we first show that, for almost all imputations, a universally consistent quantile regression algorithm trained on the imputed data is Bayes optimal for the pinball risk. In the finite-sample regime, we show that, for almost all imputations and missingness mechanisms, the imputed data set is exchangeable; thus, the properties of CP still hold and the marginal coverage guarantee is met. Nevertheless, we emphasize that the average coverage varies with the pattern of missing values: CP tends to construct prediction intervals that under-cover the response conditionally on a given missing pattern. After theoretically studying the case of a linear model, we propose a methodology to achieve approximate coverage guarantees conditional on the $2^d$ possible patterns of missing values, where $d$ is the data dimension. We assess its improved performance in synthetic experiments.
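For concreteness, the display below sketches the standard objects the abstract refers to, with notation of our own that is not fixed in the text: the pinball loss underlying quantile regression, the marginal coverage guarantee of CP under exchangeability, and the mask-conditional coverage target, where $M_{n+1} \in \{0,1\}^d$ denotes the missing-value pattern of the test point and $\widehat{C}_\alpha$ the prediction interval at level $1-\alpha$.
\begin{align*}
&\text{(pinball loss at level } \beta\text{)} & \ell_\beta(y, q) &= \beta\,(y - q)_+ + (1 - \beta)\,(q - y)_+ ,\\
&\text{(marginal coverage of CP)} & \mathbb{P}\big(Y_{n+1} \in \widehat{C}_\alpha(X_{n+1})\big) &\ge 1 - \alpha ,\\
&\text{(mask-conditional target, } m \in \{0,1\}^d\text{)} & \mathbb{P}\big(Y_{n+1} \in \widehat{C}_\alpha(X_{n+1}) \,\big|\, M_{n+1} = m\big) &\ge 1 - \alpha .
\end{align*}
The second line is the guarantee shown to carry over to imputed data, while the third line, required simultaneously for all $2^d$ patterns $m$, is the stronger notion that the proposed methodology targets approximately.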