Information Loss in Deterministic Signal Processing Systems [electronic resource] / by Bernhard C. Geiger, Gernot Kubin.

By: Geiger, Bernhard C. [author.]
Contributor(s): Kubin, Gernot [author.] | SpringerLink (Online service)
Material type: Text
Series: Understanding Complex Systems
Publisher: Cham : Springer International Publishing : Imprint: Springer, 2018
Edition: 1st ed. 2018
Description: XIII, 145 p. 16 illus., 9 illus. in color. online resource
Content type: text
Media type: computer
Carrier type: online resource
ISBN: 9783319595337
Subject(s): Computational complexity | Signal processing | Image processing | Speech processing systems | Statistical physics | Dynamical systems | Complexity | Signal, Image and Speech Processing | Complex Systems
Additional physical formats: Printed edition: No title; Printed edition: No title; Printed edition: No title
DDC classification: 620
LOC classification: QA267.7
Online resources: E-book
Contents:
Introduction -- Part I: Random Variables -- Piecewise Bijective Functions and Continuous Inputs -- General Input Distributions -- Dimensionality-Reducing Functions -- Relevant Information Loss -- Part II: Stationary Stochastic Processes -- Discrete-Valued Processes -- Piecewise Bijective Functions and Continuous Inputs -- Dimensionality-Reducing Functions -- Relevant Information Loss Rate -- Conclusion and Outlook.
In: Springer Nature eBook
Summary: This book introduces readers to essential tools for the measurement and analysis of information loss in signal processing systems. Employing a new information-theoretic systems theory, the book analyzes various systems in the signal processing engineer's toolbox: polynomials, quantizers, rectifiers, linear filters with and without quantization effects, principal components analysis, multirate systems, etc. The user benefit of signal processing is further highlighted with the concept of relevant information loss. Signal or data processing operates on the physical representation of information so that users can easily access and extract that information. However, a fundamental theorem of information theory, the data processing inequality, states that deterministic processing always involves information loss. The measures of information loss introduced here form the basis of a new information-theoretic systems theory, which complements the currently prevailing approaches based on second-order statistics, such as the mean-squared error or error energy. This theory not only provides a deeper understanding but also extends the design space for the applied engineer with a wide range of methods rooted in information theory, adding to existing methods based on energy or quadratic representations.
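The information loss the abstract refers to can be illustrated with a minimal sketch (not taken from the book): for a deterministic map Y = f(X) of a discrete random variable X, the information loss equals H(X) - H(Y), which coincides with the conditional entropy H(X|Y). A non-injective function such as squaring, which merges sign pairs much like a rectifier, loses exactly one bit on a symmetric input.

```python
import math
from collections import Counter

def entropy(pmf):
    """Shannon entropy in bits of a probability mass function (dict: value -> prob)."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def information_loss(pmf_x, f):
    """Information loss H(X) - H(Y) for deterministic Y = f(X) on a discrete X.
    For a deterministic f this equals the conditional entropy H(X|Y)."""
    pmf_y = Counter()
    for x, p in pmf_x.items():
        pmf_y[f(x)] += p          # push the probability mass through f
    return entropy(pmf_x) - entropy(pmf_y)

# X uniform on {-2, -1, 1, 2}; squaring merges the +/- pairs.
pmf_x = {-2: 0.25, -1: 0.25, 1: 0.25, 2: 0.25}
print(information_loss(pmf_x, lambda x: x * x))  # 1.0 bit: the sign of X is lost
```

A bijective (piecewise invertible with a single branch) map such as x + 1 gives zero loss on the same input, matching the intuition that only non-injective processing destroys information.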
Holdings:
Item type: Electronic Book
Current library: Electronic Library
Collection: Electronic Books Collection
Copy number: 1
Status: Not for loan

Multi-user access


UABC ; Temporary ; 01/01/2021-12/31/2023.
