Probabilistic graphical models are important tools for analyzing complex systems involving latent variables. They allow one to model the conditional independencies holding in a joint distribution. The sum-product algorithm, also called belief propagation in general graphs and forward-backward in Markov chains and Hidden Markov Models (HMMs), exploits the structure of the graph to reduce the complexity of inference from exponential in the number of variables to linear in the number of variables and exponential in the treewidth of the graph. Exact computations are therefore tractable for graphs of reasonable treewidth. In most cases these computations involve real-valued functions; however, a variety of mathematical objects with the right algebraic properties with respect to the sum and product operators can be used to compute various quantities of interest. Polynomials, for instance, have previously been used, in particular in HMMs, to obtain the distribution and moments of the number of pattern occurrences in a sequence. In this talk, we will start with a review of these methods before exploring two additional extensions of the sum-product algorithm over polynomials. First, we will introduce a constrained segment-based HMM with polynomial transition probabilities that relaxes the constraint on the prior over the segmentation space (in particular on the number of homogeneous segments) in sequence segmentation. Second, we will propose a method, based on an adaptation of the product operation, for computing the derivatives of the likelihood in a Bayesian network up to a chosen order. This latter method will be applied to sequence segmentation and to two-point linkage in statistical genetics, and it will be compared with and/or combined with existing methods.
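
To illustrate the polynomial extension mentioned above, here is a minimal sketch (not taken from the talk) of the forward recursion of an HMM carried out over polynomials rather than real numbers: each visit to a designated "pattern" state multiplies the forward message by a formal variable z, so the coefficient of z^k in the final quantity is the joint probability of the observations and of k pattern occurrences. The two-state model, its parameters, and the helper names (poly_add, poly_mul, mark) are illustrative assumptions, not material from the talk.

# Forward algorithm of a toy HMM run over the polynomial semiring.
# Polynomials are represented as lists of coefficients in the formal variable z.
# All model parameters are illustrative values.

def poly_add(p, q):
    """Coefficient-wise sum of two coefficient lists."""
    n = max(len(p), len(q))
    return [(p[i] if i < len(p) else 0.0) + (q[i] if i < len(q) else 0.0)
            for i in range(n)]

def poly_mul(p, q):
    """Convolution product of two coefficient lists."""
    r = [0.0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

pi = [0.6, 0.4]                # initial distribution
A = [[0.7, 0.3], [0.4, 0.6]]   # transition probabilities
B = [[0.9, 0.1], [0.2, 0.8]]   # emission probabilities
obs = [0, 1, 1, 0]             # observed sequence

def mark(state):
    """Return 1 as a polynomial, times z when the 'pattern' (a visit to state 1) occurs."""
    return [0.0, 1.0] if state == 1 else [1.0]

# Forward recursion: alpha[s] is a polynomial in z instead of a real number.
alpha = [poly_mul([pi[s] * B[s][obs[0]]], mark(s)) for s in range(2)]
for t in range(1, len(obs)):
    alpha = [
        poly_mul(
            poly_mul(
                poly_add(poly_mul(alpha[0], [A[0][s]]),   # sum over predecessor states
                         poly_mul(alpha[1], [A[1][s]])),
                [B[s][obs[t]]]),
            mark(s))
        for s in range(2)]

joint = poly_add(alpha[0], alpha[1])
# joint[k] = P(observations, state 1 visited k times); sum(joint) = P(observations).
print(joint, sum(joint))

Running the usual real-valued forward algorithm corresponds to setting z = 1; keeping the coefficients separate is what yields the full distribution of the count, and differentiating the polynomial at z = 1 gives its moments.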