In previous work, a probabilistic generalization of formal concepts was developed that is robust to noise and capable of recovering formal concepts. In this paper, we show that probabilistic formal concepts carry a deeper meaning than the recovery of formal concepts. Probabilistic formal concepts model the “natural” concepts studied in the cognitive sciences and the “natural” classes studied in “natural” classification. The hypernetwork of probabilistic formal concepts reflects the hierarchical structure of complex patterns: a hierarchy of secondary, increasingly complex features of the kind discovered by deep learning. Because this hierarchy is obtained by logical-probabilistic methods, it is not only “natural” but also explanatory, since it can describe its classes in logical-probabilistic terms. Thus, the hierarchy of probabilistic formal concepts discovered on complex images yields a logical-probabilistic form of deep learning. The vertices of the hypersimplices of the hypernetwork of probabilistic formal concepts reflect the content of “natural” concepts and classes, since they are inextricably linked to the underlying features. These vertices determine the meanings of “natural” concepts and classes, which are not reducible to the features that form them.
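As background for the classical notion being generalized: a formal concept of a binary context is a pair (extent, intent) that is closed under the Galois connection between objects and attributes. The following minimal Python sketch enumerates the formal concepts of a toy context; the context itself is a hypothetical illustration, not data from the paper.

```python
from itertools import combinations

# Toy formal context mapping objects to their attribute sets
# (hypothetical example for illustration only).
context = {
    "g1": {"a", "b"},
    "g2": {"a", "c"},
    "g3": {"a", "b", "c"},
}
attributes = set().union(*context.values())

def extent(attrs):
    """Objects possessing every attribute in attrs."""
    return {g for g, atts in context.items() if attrs <= atts}

def intent(objs):
    """Attributes shared by every object in objs."""
    if not objs:
        return set(attributes)
    return set.intersection(*(context[g] for g in objs))

def formal_concepts():
    """All (extent, intent) pairs closed under the Galois connection."""
    concepts = set()
    for r in range(len(attributes) + 1):
        for attrs in combinations(sorted(attributes), r):
            e = extent(set(attrs))        # close the attribute set downward
            i = intent(e)                 # then upward to get the full intent
            concepts.add((frozenset(e), frozenset(i)))
    return concepts

for e, i in sorted(formal_concepts(), key=lambda c: len(c[0])):
    print(sorted(e), sorted(i))
```

For this toy context the enumeration produces four concepts, ordered by extent size into the usual concept lattice; a probabilistic generalization, as discussed in the paper, aims to recover such concepts even when the object–attribute data are perturbed by noise.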