Computational Complexity and Feasibility of Data Processing and Interval Computations
Author: V. Kreinovich
Publisher: Springer Science & Business Media
ISBN: 1475727933
Category : Mathematics
Languages : en
Pages : 460
Book Description
Targeted audience
• Specialists in numerical computations, especially in numerical optimization, who are interested in designing algorithms with automatic result verification, and who would therefore be interested in knowing how general their algorithms can in principle be.
• Mathematicians and computer scientists who are interested in the theory of computing and computational complexity, especially the computational complexity of numerical computations.
• Students in applied mathematics and computer science who are interested in the computational complexity of different numerical methods and in learning general techniques for estimating this computational complexity.
The book is written with all explanations and definitions added, so that it can be used as a graduate-level textbook.
What this book is about
Data processing. In many real-life situations, we are interested in the value of a physical quantity y that is difficult (or even impossible) to measure directly. For example, it is impossible to directly measure the amount of oil in an oil field or the distance to a star. Since we cannot measure such quantities directly, we measure them indirectly, by measuring some other quantities xi and using the known relation between y and the xi's to reconstruct y. The algorithm that transforms the results of measuring the xi into an estimate ỹ for y is called data processing.
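To make the setting above concrete, here is a minimal Python sketch (not from the book; the relation f, the measured values, and the error bounds are made-up examples): a known relation f combines measured values into a point estimate for y, and, when guaranteed error bounds are known, a brute-force enclosure of y can be computed, which is the kind of interval computation whose complexity the book studies.

```python
# Minimal illustration (not from the book): data processing y = f(x1, x2)
# from measured values, plus a guaranteed enclosure under interval uncertainty.
from itertools import product

def f(x1, x2):
    # Hypothetical known relation between y and the measured quantities.
    return x1 * x2 + 2.0 * x1

x_meas = [3.0, 1.5]    # measurement results (made-up numbers)
deltas = [0.1, 0.05]   # guaranteed bounds on the measurement errors (made-up)

# Point estimate produced by data processing.
y_est = f(*x_meas)

# Enclosure of y: evaluate f at all corners of the box
# [x1 - d1, x1 + d1] x [x2 - d2, x2 + d2].  This f is increasing in both
# variables on this box, so the corner values bracket the exact range;
# in general, guaranteed enclosures require interval methods.
corners = product(*[(x - d, x + d) for x, d in zip(x_meas, deltas)])
values = [f(*c) for c in corners]
print(f"estimate = {y_est}, guaranteed enclosure = [{min(values)}, {max(values)}]")
```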
Beyond Traditional Probabilistic Data Processing Techniques: Interval, Fuzzy etc. Methods and Their Applications
Author: Olga Kosheleva
Publisher: Springer Nature
ISBN: 3030310418
Category : Computers
Languages : en
Pages : 638
Book Description
Data processing has become essential to modern civilization. The original data for this processing comes from measurements or from experts, and both sources are subject to uncertainty. Traditionally, probabilistic methods have been used to process uncertainty. However, in many practical situations, we do not know the corresponding probabilities: in measurements, we often only know the upper bound on the measurement errors; this is known as interval uncertainty. In turn, expert estimates often include imprecise (fuzzy) words from natural language such as "small"; this is known as fuzzy uncertainty. In this book, leading specialists on interval, fuzzy, probabilistic uncertainty and their combination describe state-of-the-art developments in their research areas. Accordingly, the book offers a valuable guide for researchers and practitioners interested in data processing under uncertainty, and an introduction to the latest trends and techniques in this area, suitable for graduate students.
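As a rough illustration of the two kinds of uncertainty mentioned in this description (a minimal sketch, not from the book; all numbers and the membership function are made-up assumptions): an interval gives guaranteed bounds on a measured value, while a fuzzy term such as "small" assigns each value a degree of membership, and each alpha-cut of that fuzzy term is again an interval.

```python
# Minimal sketch (not from the book): interval vs. fuzzy uncertainty.
# All numbers and the membership function are made-up assumptions.

def small(x, peak=0.0, spread=2.0):
    """Triangular membership function for a hypothetical fuzzy term 'small'."""
    return max(0.0, 1.0 - abs(x - peak) / spread)

# Interval uncertainty: a measured value with a known error bound.
measured, bound = 1.0, 0.1
interval = (measured - bound, measured + bound)        # [0.9, 1.1]

# Fuzzy uncertainty: the alpha-cut {x : small(x) >= alpha} is the interval
# [peak - (1 - alpha) * spread, peak + (1 - alpha) * spread].
alpha = 0.5
alpha_cut = (-(1 - alpha) * 2.0, (1 - alpha) * 2.0)    # [-1.0, 1.0]

print("interval:", interval)
print("alpha-cut of 'small' at alpha = 0.5:", alpha_cut)
print("membership of 0.3 in 'small':", small(0.3))     # 0.85
```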
Coping with Complexity: Model Reduction and Data Analysis
Author: Alexander N. Gorban
Publisher: Springer Science & Business Media
ISBN: 3642149413
Category : Mathematics
Languages : en
Pages : 356
Book Description
This volume contains extended versions of selected talks given at the international research workshop "Coping with Complexity: Model Reduction and Data Analysis", Ambleside, UK, August 31 – September 4, 2009. The book is deliberately broad in scope and aims at promoting new ideas and methodological perspectives. The topics of the chapters range from theoretical analysis of complex and multiscale mathematical models to applications in, e.g., fluid dynamics and chemical kinetics.
Computing Statistics under Interval and Fuzzy Uncertainty
Author: Hung T. Nguyen
Publisher: Springer Science & Business Media
ISBN: 3642249043
Category : Mathematics
Languages : en
Pages : 412
Book Description
In many practical situations, we are interested in statistics characterizing a population of objects: e.g. in the mean height of people from a certain area. Most algorithms for estimating such statistics assume that the sample values are exact. In practice, sample values come from measurements, and measurements are never absolutely accurate. Sometimes, we know the exact probability distribution of the measurement inaccuracy, but often, we only know the upper bound on this inaccuracy. In this case, we have interval uncertainty: e.g. if the measured value is 1.0, and inaccuracy is bounded by 0.1, then the actual (unknown) value of the quantity can be anywhere between 1.0 - 0.1 = 0.9 and 1.0 + 0.1 = 1.1. In other cases, the values are expert estimates, and we only have fuzzy information about the estimation inaccuracy. This book shows how to compute statistics under such interval and fuzzy uncertainty. The resulting methods are applied to computer science (optimal scheduling of different processors), to information technology (maintaining privacy), to computer engineering (design of computer chips), and to data processing in geosciences, radar imaging, and structural mechanics.
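As a small illustration of the kind of problem this description refers to (a minimal sketch, not from the book; the sample values and the error bound are made up): the range of the sample mean under interval uncertainty is easy to compute, since the mean is increasing in every sample value; harder statistics require the methods the book describes.

```python
# Minimal sketch (not from the book): range of the sample mean under
# interval uncertainty.  The mean is increasing in every sample value, so its
# smallest value is attained at all lower endpoints and its largest at all
# upper endpoints.  Sample values and the error bound are made-up numbers.

measured = [1.0, 2.3, 0.8, 1.7]   # measured values
bound = 0.1                        # guaranteed bound on each measurement error

mean_lo = sum(x - bound for x in measured) / len(measured)
mean_hi = sum(x + bound for x in measured) / len(measured)
print(f"sample mean lies in [{mean_lo:.3f}, {mean_hi:.3f}]")
```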
Applied and Computational Matrix Analysis
Author: Natália Bebiano
Publisher: Springer
ISBN: 331949984X
Category : Mathematics
Languages : en
Pages : 346
Book Description
This volume presents recent advances in the field of matrix analysis based on contributions at the MAT-TRIAD 2015 conference. Topics covered include interval linear algebra and computational complexity, Birkhoff polynomial basis, tensors, graphs, linear pencils, K-theory, and statistical inference, showing the ubiquity of matrices in different mathematical areas. With a particular focus on matrix and operator theory, statistical models and computation, the International Conference on Matrix Analysis and its Applications 2015, held in Coimbra, Portugal, was the sixth in a series of conferences. Applied and Computational Matrix Analysis will appeal to graduate students and researchers in theoretical and applied mathematics, physics and engineering who are seeking an overview of recent problems and methods in matrix analysis.
Intelligent Data Analysis: Developing New Methodologies Through Pattern Discovery and Recovery
Author: Wang, Hsiao-Fan
Publisher: IGI Global
ISBN: 159904983X
Category : Education
Languages : en
Pages : 366
Book Description
Pattern recognition has a long history of applications to data analysis in business, military, and socio-economic activities. While the aim of pattern recognition is to discover the pattern of a data set, the size of the data set is closely related to the methodology one adopts for analysis. Intelligent Data Analysis: Developing New Methodologies Through Pattern Discovery and Recovery tackles such data sets and covers a variety of issues in relation to intelligent data analysis, so that patterns from frequent or rare events in spatial or temporal spaces can be revealed. This book brings together current research, results, problems, and applications from both theoretical and practical approaches.
Interval / Probabilistic Uncertainty and Non-classical Logics
Author: Van-Nam Huynh
Publisher: Springer Science & Business Media
ISBN: 3540776648
Category : Mathematics
Languages : en
Pages : 381
Book Description
This book contains the proceedings of the first International Workshop on Interval/Probabilistic Uncertainty and Non-Classical Logics, Ishikawa, Japan, March 25-28, 2008. The workshop brought together researchers working on interval and probabilistic uncertainty and on non-classical logics. It is hoped that this workshop will lead to a boost in the much-needed collaboration between the uncertainty analysis and non-classical logic communities, and thus to better processing of uncertainty.
Algebraic Approach to Data Processing
Author: Julio C. Urenda
Publisher: Springer Nature
ISBN: 3031167805
Category : Computers
Languages : en
Pages : 246
Book Description
The book explores a new general approach to selecting—and designing—data processing techniques. Symmetry and invariance ideas behind this algebraic approach have been successful in physics, where many new theories are formulated in symmetry terms. The book explains this approach and expands it to new application areas ranging from engineering, medicine, and education to the social sciences. In many cases, this approach leads to optimal techniques and optimal solutions. That the same data processing techniques help us better analyze wooden structures, lung dysfunctions, and deep learning algorithms is a good indication that these techniques can be used in many other applications as well. The book is recommended to researchers and practitioners who need to select a data processing technique—or who want to design a new technique when the existing techniques do not work. It is also recommended to students who want to learn state-of-the-art data processing.
Knowledge Processing with Interval and Soft Computing
Author: Chenyi Hu
Publisher: Springer Science & Business Media
ISBN: 1848003269
Category : Computers
Languages : en
Pages : 241
Book Description
Interval computing combined with fuzzy logic has become an emerging tool in studying artificial intelligence and knowledge processing (AIKP) applications, since it models uncertainties frequently arising in the field. This book provides introductions to both interval and fuzzy computing in a very accessible style. Application algorithms covered in this book include quantitative and qualitative data mining with interval-valued datasets, decision-making systems with interval-valued parameters, interval-valued Nash games, and interval-weighted graphs. Successful applications, e.g., in finance and economics, are also included. This book can serve as a handbook or a text for readers interested in applying interval and soft computing to AIKP.
From Intervals to –?
Author: Vladik Kreinovich
Publisher: Springer Nature
ISBN: 3031205693
Category : Technology & Engineering
Languages : en
Pages : 125
Book Description
This book is about methodological aspects of uncertainty propagation in data processing. Uncertainty propagation is an important problem: while computer algorithms efficiently process data related to many aspects of our lives, most of these algorithms implicitly assume that the numbers they process are exact. In reality, these numbers come from measurements, and measurements are never 100% exact. Because of this, it makes no sense to translate 61 kg into pounds and get the result—as computers do—with 13-digit accuracy. In many cases—e.g., in celestial mechanics—the state of a system can be described by a few numbers: the values of the corresponding physical quantities. In such cases, for each of these quantities, we know (at least) the upper bound on the measurement error. This bound is either provided by the manufacturer of the measuring instrument or estimated by the user who calibrates this instrument. However, in many other cases, the description of the system is more complex than a few numbers: we need a function to describe a physical field (e.g., an electromagnetic field); we need a vector in Hilbert space to describe a quantum state; we need a pseudo-Riemannian space to describe physical space-time, etc. To describe and process uncertainty in all such cases, this book proposes a general methodology—a methodology that includes intervals as a particular case. The book is recommended to students and researchers interested in challenging aspects of uncertainty analysis and to practitioners who need to handle uncertainty in such unusual situations.
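A small illustration of the kilograms-to-pounds point made in this description (a minimal sketch, not from the book; the ±0.5 kg error bound is a made-up assumption): once a guaranteed error bound on the measured 61 kg is propagated through the conversion, a 13-digit point result is clearly meaningless.

```python
# Minimal sketch (not from the book): why propagating a measurement-error
# bound matters.  The +/- 0.5 kg bound below is a made-up assumption; the
# 61 kg -> pounds example comes from the description above.

KG_TO_LB = 2.2046226218487757   # standard conversion factor

mass_kg = 61.0      # measured value
delta_kg = 0.5      # assumed guaranteed bound on the measurement error

lo_lb = (mass_kg - delta_kg) * KG_TO_LB
hi_lb = (mass_kg + delta_kg) * KG_TO_LB

# A 13-digit point value is meaningless when the true value can be anywhere here:
print(f"point conversion:  {mass_kg * KG_TO_LB:.13f} lb")
print(f"guaranteed range:  [{lo_lb:.1f}, {hi_lb:.1f}] lb")
```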