Learning with the Minimum Description Length Principle [electronic resource] / by Kenji Yamanishi.

By: Yamanishi, Kenji [author]
Contributor(s): SpringerLink (Online service)
Material type: Text
Publisher: Singapore : Springer Nature Singapore : Imprint: Springer, 2023
Edition: 1st ed. 2023
Description: XX, 339 p. 51 illus., 48 illus. in color. online resource
Content type: text
Media type: computer
Carrier type: online resource
ISBN: 9789819917907
Subject(s): Data structures (Computer science) | Information theory | Machine learning | Data Structures and Information Theory | Machine Learning
Additional physical formats: Printed edition: No title; Printed edition: No title; Printed edition: No title
DDC classification: 005.73 | 003.54
LoC classification: QA76.9.D35 | Q350-390
Online resources: E-book; Text
Contents:
Information and Coding -- Parameter Estimation -- Model Selection -- Latent Variable Model Selection -- Sequential Prediction -- MDL Change Detection -- Continuous Model Selection -- Extension of Stochastic Complexity -- Mathematical Preliminaries.
In: Springer Nature eBook. Summary: This book introduces readers to the minimum description length (MDL) principle and its applications in learning. The MDL is a fundamental principle for inductive inference, used in many applications including statistical modeling, pattern recognition, and machine learning. At its core, the MDL is based on the premise that "the shortest code length leads to the best strategy for learning anything from data." The MDL provides a broad and unifying view of statistical inferences such as estimation, prediction, and testing and, of course, machine learning. The content covers the theoretical foundations of the MDL and broad practical areas such as detecting changes and anomalies, problems involving latent variable models, and high-dimensional statistical inference, among others. The book offers an easy-to-follow guide to the MDL principle, together with other information criteria, explaining the differences between their standpoints. Written in a systematic, concise, and comprehensive style, this book is suitable for researchers and graduate students of machine learning, statistics, information theory, and computer science.
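The premise quoted in the summary can be made concrete with a small sketch (not taken from the book): to choose between candidate models, compute the total code length each one assigns to the data (parameter cost plus data cost, here a crude two-part code for coin flips) and pick the shortest. The model names and the 0.5·log2(n) parameter cost are illustrative assumptions, not the book's formulation.

```python
import math

def code_length_fair(bits):
    # Fixed model: a fair coin costs exactly 1 bit per outcome
    # and has no parameter to encode.
    return float(len(bits))

def code_length_biased(bits):
    # Two-part code: first encode the fitted bias (about 0.5*log2(n) bits,
    # a standard rough cost per real parameter), then encode the data
    # under the fitted Bernoulli model.
    n = len(bits)
    k = sum(bits)
    p = k / n
    if p in (0.0, 1.0):
        data_bits = 0.0  # degenerate fit encodes the data for free
    else:
        data_bits = -(k * math.log2(p) + (n - k) * math.log2(1 - p))
    return 0.5 * math.log2(n) + data_bits

def mdl_select(bits):
    # MDL picks the model with the shortest total description length.
    lengths = {"fair": code_length_fair(bits),
               "biased": code_length_biased(bits)}
    return min(lengths, key=lengths.get), lengths

# A heavily biased sample: the fitted model pays its parameter cost
# but saves far more on the data, so MDL selects "biased".
best, lengths = mdl_select([1] * 28 + [0] * 4)
```

On a balanced sample the fitted model saves nothing on the data, so its parameter cost makes it lose to the fair-coin code; this trade-off between model cost and data cost is the mechanism the summary alludes to.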
Holdings
Item type: Electronic Book
Current library: Electronic Library
Collection: Electronic Books Collection
Copy number: 1
Status: Not for loan

Multi-user access



UABC ; Perpetual access

Powered by Koha