Advancements in Knowledge Distillation: Towards New Horizons of Intelligent Systems [electronic resource] / edited by Witold Pedrycz, Shyi-Ming Chen.

Contributor(s): Pedrycz, Witold [editor] | Chen, Shyi-Ming [editor] | SpringerLink (Online service)
Material type: Text
Series: Studies in Computational Intelligence ; 1100
Publisher: Cham : Springer International Publishing : Imprint: Springer, 2023
Edition: 1st ed. 2023
Description: VIII, 232 p. 70 illus., 51 illus. in color. online resource
Content type: text
Media type: computer
Carrier type: online resource
ISBN: 9783031320958
Subject(s): Computational intelligence | Artificial intelligence
Additional physical formats: Printed edition: No title; Printed edition: No title; Printed edition: No title
DDC classification: 006.3
LoC classification: Q342
Online resources: Electronic book
Contents:
Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation -- A Geometric Perspective on Feature-Based Distillation -- Knowledge Distillation Across Vision and Language -- Knowledge Distillation in Granular Fuzzy Models by Solving Fuzzy Relation Equations -- Ensemble Knowledge Distillation for Edge Intelligence in Medical Applications -- Self-Distillation with the New Paradigm in Multi-Task Learning -- Knowledge Distillation for Autonomous Intelligent Unmanned System.
In: Springer Nature eBook
Summary: The book provides timely coverage of the paradigm of knowledge distillation, an efficient approach to model compression. Knowledge distillation is positioned within the general setting of transfer learning, in which a lightweight student model is learned effectively from a large teacher model. The book covers a variety of training schemes, teacher-student architectures, and distillation algorithms, along with a wealth of topics including recent developments in vision and language learning, relational architectures, multi-task learning, and representative applications to image processing, computer vision, edge intelligence, and autonomous systems. It is relevant to a broad audience, including researchers and practitioners active in machine learning and pursuing fundamental and applied research on advanced learning paradigms.
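For readers unfamiliar with the paradigm the summary refers to, below is a minimal, illustrative sketch of the classic response-based distillation loss, in which softened teacher and student logits are matched and combined with the usual hard-label term. It is not drawn from the book; the function name, temperature, and weighting are assumptions chosen for illustration only.

# Minimal sketch of a response-based knowledge distillation loss
# (softened logits plus hard-label cross-entropy). Hyperparameters
# T and alpha are illustrative assumptions, not values from the book.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft term: KL divergence between temperature-softened distributions,
    # scaled by T^2 as is customary so gradients keep a comparable magnitude.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard term: ordinary cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Toy usage with random logits for a batch of 8 examples and 10 classes.
if __name__ == "__main__":
    student = torch.randn(8, 10)
    teacher = torch.randn(8, 10)
    labels = torch.randint(0, 10, (8,))
    print(distillation_loss(student, teacher, labels).item())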
Holdings:
Item type: Electronic Book | Current library: Electronic Library | Collection: Electronic Books Collection | Copy number: 1 | Status: Not for loan

Multi-user access

UABC ; Perpetuity