Extraction of structure, in particular of group symmetries, is increasingly crucial to understanding and building intelligent models. In parallel, some information-theoretic models of complexity-constrained learning have been argued to induce invariance extraction. Here, we formalise and extend the study of group symmetries through the information lens by identifying a duality between probabilistic symmetries and information parsimony. Namely, we characterise group symmetries through the full-information-preservation case of Information Bottleneck-like compressions. More precisely, we require the compression to be optimal under the constraint of preserving the divergence from a given exponential family, yielding a novel generalisation of the Information Bottleneck framework. Through appropriate choices of exponential families, we characterise (in the discrete, full-support case) channel invariance, channel equivariance and distribution invariance under permutation. Allowing non-zero distortion then leads to principled definitions of ``soft symmetries'' as the exact symmetries of a compressed representation of the data. In simple synthetic experiments, we demonstrate that our method successively recovers, at increasingly compressed ``resolutions'', nested but increasingly perturbed equivariances, with new equivariances emerging at bifurcation points of the distortion parameter. Our framework provides the area of probabilistic symmetry discovery with a theoretical clarification of its link to information parsimony, and with a basis on which new computational tools can potentially be built.
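As a point of orientation, the following is a minimal sketch of the objective described above; the notation is illustrative and this is only a plausible formalisation of the abstract's wording, not a definition fixed by it. The classical Information Bottleneck compresses $X$ into $T$ while retaining information about a relevance variable $Y$,
\[
\min_{p(t \mid x)} \; I(X;T) - \beta\, I(T;Y),
\]
whereas the generalisation described here replaces the relevance term by the requirement that the compression preserve the divergence from a chosen exponential family $\mathcal{E}$,
\[
\min_{p(t \mid x)} \; I(X;T) \quad \text{subject to} \quad D(p_T \,\|\, \mathcal{E}) = D(p_X \,\|\, \mathcal{E}),
\qquad
D(p \,\|\, \mathcal{E}) := \inf_{q \in \mathcal{E}} D_{\mathrm{KL}}(p \,\|\, q).
\]
Under this reading, exact group symmetries correspond to the full-information-preservation case of the constrained compression, and allowing non-zero distortion yields the ``soft symmetries'' discussed above.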