Machine Learning for Model Order Reduction [electronic resource] / by Khaled Salah Mohamed.

By: Mohamed, Khaled Salah [author]
Contributor(s): SpringerLink (Online service)
Material type: Text
Publisher: Cham : Springer International Publishing : Imprint: Springer, 2018
Edition: 1st ed. 2018
Description: XI, 93 p. online resource
Content type: text
Media type: computer
Carrier type: online resource
ISBN: 9783319757148
Subject(s): Electronic circuits | Microprocessors | Electronics | Microelectronics | Circuits and Systems | Processor Architectures | Electronics and Microelectronics, Instrumentation
Additional physical formats: Printed edition (no title); Printed edition (no title); Printed edition (no title)
DDC classification: 621.3815
LoC classification: TK7888.4
Online resources: Electronic book
Contents:
Chapter 1: Introduction -- Chapter 2: Bio-Inspired Machine Learning Algorithm: Genetic Algorithm -- Chapter 3: Thermo-Inspired Machine Learning Algorithm: Simulated Annealing -- Chapter 4: Nature-Inspired Machine Learning Algorithm: Particle Swarm Optimization, Artificial Bee Colony -- Chapter 5: Control-Inspired Machine Learning Algorithm: Fuzzy Logic Optimization -- Chapter 6: Brain-Inspired Machine Learning Algorithm: Neural Network Optimization -- Chapter 7: Comparisons, Hybrid Solutions, Hardware Architectures and New Directions -- Chapter 8: Conclusions.
In: Springer Nature eBook
Summary: This book discusses machine learning for model order reduction, which can be used in modern VLSI design to predict the behavior of an electronic circuit via mathematical models. The author describes techniques to significantly reduce the time required for simulations involving large-scale ordinary differential equations, which can otherwise take several days or even weeks. This approach, called model order reduction (MOR), reduces the complexity of the original large system and generates a reduced-order model (ROM) to represent it. Readers will gain in-depth knowledge of machine learning and model order reduction concepts, the tradeoffs involved in using various algorithms, and how to apply the presented techniques to circuit simulations and numerical analysis. Introduces machine learning algorithms at the architecture and algorithm levels of abstraction; describes new, hybrid solutions for model order reduction; presents machine learning algorithms in depth, but simply; uses real, industrial applications to verify the algorithms.
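The summary describes model order reduction only at a high level. As a point of reference, the sketch below shows one classical projection-based MOR technique (proper orthogonal decomposition with Galerkin projection) in Python; the system matrices, dimensions, and input signal are illustrative assumptions and are not drawn from the book, which focuses on machine-learning-driven approaches such as those listed in the contents above.

import numpy as np

# Minimal sketch of projection-based model order reduction (POD/Galerkin),
# assuming a linear ODE system  x'(t) = A x(t) + B u(t)  with many states.
# All matrices and sizes below are hypothetical, for illustration only.

rng = np.random.default_rng(0)
n, r = 500, 10                              # full order n, reduced order r

A = -np.eye(n) + 0.01 * rng.standard_normal((n, n))   # stable full system matrix
B = rng.standard_normal((n, 1))

# Collect snapshots of the full state under a unit step input.
dt, steps = 0.01, 200
x = np.zeros((n, 1))
snapshots = []
for _ in range(steps):
    x = x + dt * (A @ x + B)                # explicit Euler on the full model
    snapshots.append(x.copy())
X = np.hstack(snapshots)

# POD basis: leading left singular vectors of the snapshot matrix.
U, _, _ = np.linalg.svd(X, full_matrices=False)
V = U[:, :r]                                # projection basis, n x r

# Galerkin-projected reduced-order model (ROM): r equations instead of n.
A_r = V.T @ A @ V
B_r = V.T @ B

# Simulate the ROM and lift back to the full state space for comparison.
xr = np.zeros((r, 1))
for _ in range(steps):
    xr = xr + dt * (A_r @ xr + B_r)
x_approx = V @ xr
print("relative error at final time:", np.linalg.norm(x_approx - x) / np.linalg.norm(x))

The reduced system has r equations instead of n; that dimension reduction is the "reduced-order model" the summary refers to.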
Holdings:
Item type: Electronic book
Current library: Electronic Library
Collection: Electronic Books Collection
Copy number: 1
Status: Not for loan

Multi-user access

UABC ; Temporary ; 01/01/2021-12/31/2023.
