Markov and Bayesian Estimation and Classification


Bayesian Separation and Recovery of Convolutively Mixed Autoregressive Sources

Authors:

Simon J Godsill,
Christophe Andrieu,

Paper number 2223

Abstract:

In this paper we address the problem of the separation and recovery of convolutively mixed autoregressive processes in a Bayesian framework. Solving this problem requires the ability to integrate and/or optimize complicated posterior distributions. We thus propose efficient stochastic algorithms based on Markov chain Monte Carlo (MCMC) methods. We present three algorithms. The first is a classical Gibbs sampler that generates samples from the posterior distribution. The other two are stochastic optimization algorithms that optimize either the marginal distribution of the sources, or the marginal distribution of the parameters of the sources and mixing filters, conditional upon the observations. Simulations are presented.

IC992223.PDF (From Author) IC992223.PDF (Rasterized)
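
Editorial note: the two-block "sample each group of unknowns from its full conditional" pattern of the Gibbs sampler mentioned above can be illustrated on a much simpler conjugate model. The sketch below (unknown Gaussian mean and variance; the priors and names are illustrative assumptions, not the authors' source-separation algorithm) shows only that alternating pattern; in the paper the analogous blocks would be the sources and the model parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: unknown mean mu and variance v of a Gaussian.
x = rng.normal(2.0, 1.5, size=200)
n, xbar = x.size, x.mean()

# Conjugate priors: mu ~ N(m0, s0^2), v ~ Inv-Gamma(a0, b0).
m0, s02 = 0.0, 100.0
a0, b0 = 2.0, 2.0

mu, v = 0.0, 1.0
samples = []
for it in range(2000):
    # Block 1: mu | v, x  (Gaussian full conditional)
    prec = 1.0 / s02 + n / v
    mean = (m0 / s02 + n * xbar / v) / prec
    mu = rng.normal(mean, np.sqrt(1.0 / prec))
    # Block 2: v | mu, x  (inverse-gamma full conditional)
    a = a0 + n / 2.0
    b = b0 + 0.5 * np.sum((x - mu) ** 2)
    v = 1.0 / rng.gamma(a, 1.0 / b)
    samples.append((mu, v))

burn = 500
mus, vs = np.array(samples[burn:]).T
print("posterior mean of mu ~", mus.mean(), " of v ~", vs.mean())
```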


Estimation of Nonstationary Hidden Markov Models by MCMC Sampling

Authors:

Petar M Djurić,
Joon-Hwa Chun,

Paper number 2255

Abstract:

Hidden Markov models are very important for the analysis of signals and systems. In the past two decades they have attracted the attention of the speech processing community, and recently they have become the favorite models of biologists. A major weakness of conventional hidden Markov models is their inflexibility in modeling state duration. In this paper, we analyze nonstationary hidden Markov models whose state transition probabilities are functions of time, thereby indirectly modeling state durations by a given probability mass function. The objective of our work is to estimate all the unknowns of the nonstationary hidden Markov model, namely its parameters and state sequence. To that end, we construct a Markov chain Monte Carlo sampling scheme in which all the posterior probability distributions of the unknowns are easy to sample from. Extensive simulation results show that the estimation procedure yields excellent results.

IC992255.PDF (Scanned)
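
Editorial note: one common way to picture "transition probabilities that are functions of time" encoding a duration distribution is through the duration survival ratio; the exact parameterization used in the paper may differ. A minimal sketch, assuming an illustrative duration pmf on {1, ..., 10} and a two-state chain:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative duration pmf on {1, ..., 10}.
dur_pmf = np.array([0.05, 0.10, 0.20, 0.25, 0.15, 0.10, 0.07, 0.04, 0.03, 0.01])
dur_pmf /= dur_pmf.sum()

def stay_prob(d):
    """P(stay in state | already spent d steps there) = P(D > d) / P(D >= d)."""
    tail_ge = dur_pmf[d - 1:].sum()   # P(D >= d)
    tail_gt = dur_pmf[d:].sum()       # P(D > d)
    return 0.0 if tail_ge == 0 else tail_gt / tail_ge

# Simulate a 2-state chain whose self-transition probability depends on the
# time already spent in the current state (duration-dependent transitions).
state, d, seq = 0, 1, []
for t in range(50):
    seq.append(state)
    if rng.random() < stay_prob(d):
        d += 1                        # stay, duration grows
    else:
        state, d = 1 - state, 1       # switch, duration resets
print(seq)
```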


A Bayesian Multiscale Framework for Poisson Inverse Problems

Authors:

Robert D Nowak,
Eric D Kolaczyk,

Paper number 1978

Abstract:

This paper describes a maximum a posteriori (MAP) estimation method for linear inverse problems involving Poisson data based on a novel multiscale framework. The framework itself is founded on a carefully designed multiscale prior probability distribution placed on the "splits" in the multiscale partition of the underlying intensity, and it admits a remarkably simple MAP estimation procedure using an expectation-maximization (EM) algorithm. Unlike many other approaches to this problem, the EM update equations for our algorithm have simple, closed-form expressions. Additionally, our class of priors has the interesting feature that the "non-informative" member yields the traditional maximum likelihood solution; other choices are made to reflect prior belief as to the smoothness of the unknown intensity.

IC991978.PDF (Scanned)
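
Editorial note: the multiscale "splits" can be pictured as a dyadic tree in which each parent count divides between its two children with a binomial proportion (a standard property of Poisson counts). The sketch below only computes that analysis step on toy data; the prior placed on the splits and the closed-form EM updates are the paper's contribution and are not reproduced here.

```python
import numpy as np

def multiscale_splits(counts):
    """Dyadic aggregation of Poisson counts plus the 'split' fraction at each
    parent node (left child / parent; by convention 0.5 when the parent is 0)."""
    levels, splits = [np.asarray(counts, dtype=float)], []
    while levels[-1].size > 1:
        c = levels[-1]
        parents = c[0::2] + c[1::2]
        frac = np.where(parents > 0, c[0::2] / np.maximum(parents, 1.0), 0.5)
        levels.append(parents)
        splits.append(frac)   # splits[k][j]: share of parent j's count in its left child
    return levels, splits

# Toy piecewise-constant intensity and Poisson counts on 8 bins (a power of 2).
rng = np.random.default_rng(2)
intensity = np.array([1.0, 1.0, 5.0, 5.0, 5.0, 1.0, 1.0, 1.0])
counts = rng.poisson(intensity)

levels, splits = multiscale_splits(counts)
print("counts:", counts)
print("total count (coarsest node):", levels[-1])
print("split fractions per level:", [np.round(s, 2) for s in splits])
```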


Bayesian Framework for Unsupervised Classification with Application to Target Tracking

Authors:

Rangasami L Kashyap,
Srinivas Sista,

Paper number 2196

Abstract:

We have given a solution to the problem of unsupervised classification of multidimensional data. Our approach is based on Bayesian estimation, which regards the number of classes, the data partition, and the parameter vectors that describe the density of each class as unknowns. We compute their MAP estimates simultaneously by maximizing their joint posterior probability density given the data. The concept of the partition as a variable to be estimated is a unique feature of our method. This formulation also solves the problem of validating clusters obtained from various methods. Our method can incorporate any additional information about a class when assigning its probability density, and it can utilize any available training samples that arise from different classes. We provide a descent algorithm that starts with an arbitrary partition of the data and iteratively computes the MAP estimates. The proposed method is applied to target tracking data. The results obtained demonstrate the power of the Bayesian approach for unsupervised classification.

IC992196.PDF (From Author) IC992196.PDF (Rasterized)
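
Editorial note: the flavor of "a descent algorithm that starts with an arbitrary partition" can be conveyed by a generic hard-assignment descent on spherical Gaussian classes with a fixed number of classes. This is an illustrative toy only, not the paper's criterion, which also treats the number of classes and the partition itself as unknowns.

```python
import numpy as np

def map_partition_descent(x, k, n_iter=50, seed=0):
    """Alternate (1) per-class parameter estimation from the current partition and
    (2) reassignment of each point to the class maximizing its log-density."""
    rng = np.random.default_rng(seed)
    labels = rng.integers(k, size=len(x))
    for _ in range(n_iter):
        mus = np.array([x[labels == j].mean(axis=0) if np.any(labels == j)
                        else x[rng.integers(len(x))] for j in range(k)])
        # Reassign each point to the closest class mean (max spherical-Gaussian density).
        d2 = ((x[:, None, :] - mus[None, :, :]) ** 2).sum(-1)
        new = d2.argmin(axis=1)
        if np.array_equal(new, labels):
            break
        labels = new
    return labels, mus

rng = np.random.default_rng(3)
data = np.vstack([rng.normal([0, 0], 0.5, (100, 2)),
                  rng.normal([4, 4], 0.5, (100, 2))])
labels, mus = map_partition_descent(data, k=2)
print("class means:\n", mus)
```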


Structure And Parameter Learning Via Entropy Minimization, With Applications To Mixture And Hidden Markov Models

Authors:

Matthew E Brand,

Paper number 1802

Abstract:

We develop a computationally efficient framework for finding compact and highly accurate hidden-variable models via entropy minimization. The main results are: 1) An entropic prior that favors small, unambiguous, maximally structured models. 2) A prior-balancing manipulation of Bayes' rule that allows one to gradually introduce or remove constraints in the course of iterative re-estimation. #1 and #2 combined give the information-theoretic Helmholtz free energy of the model and the means to manipulate it. 3) Maximum a posteriori (MAP) estimators such that entropy optimization and deterministic annealing can be performed wholly within expectation-maximization (EM). 4) Trimming tests that identify excess parameters whose removal will increase the posterior, thereby simplifying the model and preventing over-fitting. The end result is a fast and exact hill-climbing algorithm that mixes continuous and combinatorial optimization and evades sub-optimal equilibria.

IC991802.PDF (From Author)
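
Editorial note: for multinomial parameters an entropic prior is often written P(theta) proportional to exp(-H(theta)) = prod_i theta_i^theta_i; the abstract does not state the exact form used, so the sketch below is only an illustration of such a prior together with a naive trimming test (delete a component if doing so, after renormalization, raises the posterior), not the paper's estimators.

```python
import numpy as np

def log_posterior(theta, counts):
    """Multinomial log-likelihood plus entropic log-prior sum_i theta_i*log(theta_i)."""
    theta, counts = np.asarray(theta, float), np.asarray(counts, float)
    nz = theta > 0
    loglik = np.sum(counts[nz] * np.log(theta[nz]))
    logprior = np.sum(theta[nz] * np.log(theta[nz]))   # = -H(theta)
    return loglik + logprior

def trim(theta, counts):
    """Greedily delete components whose removal (with renormalization) raises the
    posterior; only components with zero counts are candidates in this toy test."""
    theta, counts = np.asarray(theta, float), np.asarray(counts, float)
    keep = np.ones(len(theta), dtype=bool)
    improved = True
    while improved and keep.sum() > 1:
        improved = False
        base = log_posterior(theta[keep] / theta[keep].sum(), counts[keep])
        for i in np.where(keep)[0]:
            if counts[i] > 0:
                continue                  # deleting it would lose observed data
            trial = keep.copy()
            trial[i] = False
            cand = log_posterior(theta[trial] / theta[trial].sum(), counts[trial])
            if cand > base:
                keep, improved = trial, True
                break
    return keep

theta = np.array([0.48, 0.47, 0.05])      # third component barely used ...
counts = np.array([52, 48, 0])            # ... and unsupported by the data
print("keep mask:", trim(theta, counts))
```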


Marginal MAP Estimation Using Markov Chain Monte Carlo

Authors:

Christian P Robert, Statistical Laboratory, CREST, INSEE (France)
Arnaud Doucet,
Simon J Godsill,

Paper number 1916

Abstract:

Markov chain Monte Carlo (MCMC) methods are powerful simulation-based techniques for sampling from high-dimensional and/or non-standard probability distributions. These methods have recently become very popular in the statistical and signal processing communities as they allow highly complex inference problems in detection and estimation to be addressed. However, MCMC is not currently well adapted to the problem of marginal maximum a posteriori (MMAP) estimation. In this paper, we present a simple and novel MCMC strategy, called State Augmentation for Marginal Estimation (SAME), that allows MMAP estimates to be obtained for Bayesian models. The methodology is very general and we illustrate the simplicity and utility of the approach by examples in MAP parameter estimation for Hidden Markov models (HMMs) and for missing data interpolation in autoregressive time series.

IC991916.PDF (From Author) IC991916.PDF (Rasterized)
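
Editorial note: the augmentation idea can be checked on a toy model. Assuming (purely for illustration) y | x ~ N(x, 1), x | theta ~ N(theta, 1) and theta ~ N(0, 10), replicating the latent x into K copies, so that the target is proportional to the product of K copies of the joint posterior, gives a theta-marginal proportional to p(theta | y)^K, which concentrates around the marginal MAP value as K grows. The sketch below shows only this mechanism on the toy model, not the authors' published algorithm.

```python
import numpy as np

rng = np.random.default_rng(4)
y = 3.0                        # a single observation
s0sq = 10.0                    # prior variance of theta (prior mean 0)

# Exact marginal MAP for this toy: y | theta ~ N(theta, 2), theta ~ N(0, 10).
theta_mmap = (y / 2.0) / (1.0 / s0sq + 1.0 / 2.0)

theta, trace = 0.0, []
for it in range(1, 3001):
    K = 1 + it // 300                     # slowly grow the number of replicas
    # Replicas of the latent variable: x_k | theta, y ~ N((y + theta)/2, 1/2).
    xs = rng.normal((y + theta) / 2.0, np.sqrt(0.5), size=K)
    # theta | x_1..x_K, with the prior also replicated K times in the target.
    prec = K / s0sq + K
    theta = rng.normal(xs.sum() / prec, np.sqrt(1.0 / prec))
    trace.append(theta)

print("exact marginal MAP:", theta_mmap)
print("estimate from final draws:", np.mean(trace[-200:]))
```

In this Gaussian toy the marginal MAP coincides with the posterior mean, so the example only checks the mechanics; the benefit of the approach appears in the non-Gaussian, multimodal problems targeted by the paper.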


Gibbs Sampling Approach For Generation Of Truncated Multivariate Gaussian Random Variables

Authors:

Jayesh H Kotecha,
Petar M Djurić,

Paper number 2263

Abstract:

In many Monte Carlo simulations, it is important to generate samples from given densities. Recently, researchers in statistical signal processing and related disciplines have shown increased interest in a generator of random vectors with truncated multivariate normal probability density functions (pdfs). A straightforward method for their generation is to draw samples from the multivariate normal density and reject the ones that fall outside the acceptance region. This method, known as rejection sampling, can be very inefficient, especially for high dimensions and/or relatively small supports of the random vectors. In this paper we propose an approach for the generation of vectors with truncated Gaussian densities based on Gibbs sampling, which is simple to use and does not reject any of the generated vectors.

IC992263.PDF (From Author) IC992263.PDF (Rasterized)
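
Editorial sketch of the construction described above: each full conditional of a box-truncated multivariate normal is a univariate truncated normal, so the coordinates can be cycled through and drawn directly (here with SciPy's truncnorm; the helper name and test values are illustrative, not the authors' code).

```python
import numpy as np
from scipy.stats import truncnorm

def gibbs_truncated_mvn(mean, cov, lower, upper, n_samples=1000, seed=0):
    """Gibbs sampler for N(mean, cov) truncated to the box [lower, upper]."""
    rng = np.random.default_rng(seed)
    mean, lower, upper = map(np.asarray, (mean, lower, upper))
    prec = np.linalg.inv(cov)                 # precision matrix
    d = mean.size
    x = np.clip(mean, lower, upper)           # feasible starting point
    out = np.empty((n_samples, d))
    for t in range(n_samples):
        for i in range(d):
            # Conditional of x_i given the rest: N(m, s2) with
            # s2 = 1/prec[i,i], m = mean_i - s2 * sum_{j!=i} prec[i,j]*(x_j - mean_j).
            s2 = 1.0 / prec[i, i]
            r = np.delete(prec[i], i) @ (np.delete(x, i) - np.delete(mean, i))
            m = mean[i] - s2 * r
            a, b = (lower[i] - m) / np.sqrt(s2), (upper[i] - m) / np.sqrt(s2)
            x[i] = truncnorm.rvs(a, b, loc=m, scale=np.sqrt(s2), random_state=rng)
        out[t] = x
    return out

samples = gibbs_truncated_mvn(mean=[0.0, 0.0],
                              cov=[[1.0, 0.8], [0.8, 1.0]],
                              lower=[0.0, 0.0], upper=[1.0, 2.0])
print(samples.mean(axis=0))
```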


Efficient Computation of the Bayesian Cramer-Rao Bound on Estimating Parameters of Markov Models

Authors:

Joseph Tabrikian, Duke University (USA)
Jeffrey L Krolik, Duke University (USA)

Paper number 2367

Abstract:

This paper presents a novel method for calculating the Hybrid Cramer-Rao lower bound (HCRLB) when the statistical model for the data has a Markovian nature. The method applies to non-linear/non-Gaussian as well as linear/Gaussian models. The approach reduces the required expectation over the unknown random parameters to several one-dimensional integrals computed recursively, thus avoiding a computationally intensive multi-dimensional integration. The method is applied to the problem of refractivity estimation using radar clutter from the sea surface, where the backscatter cross section is assumed to be a Markov process in range. The HCRLB is evaluated and compared to the performance of the corresponding maximum a posteriori estimator. Simulation results indicate that the HCRLB provides a tight lower bound in this application.

IC992367.PDF (From Author) IC992367.PDF (Rasterized)
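
Editorial note: the hybrid information matrix underlying the HCRLB is conventionally written as below (a standard textbook form for data y, random parameters x with known prior, and deterministic parameters theta; the paper's contribution is the recursive evaluation of this expectation for Markovian models, which is not reproduced here).

```latex
% u = [x^T, theta^T]^T collects the random and deterministic parameters.
J(\theta) = \mathbb{E}_{p(\mathbf{y},\mathbf{x};\theta)}
   \left[ -\nabla^{2}_{\mathbf{u}} \,\ln p(\mathbf{y},\mathbf{x};\theta) \right],
\qquad
\mathbb{E}\!\left[ (\hat{\mathbf{u}}-\mathbf{u})(\hat{\mathbf{u}}-\mathbf{u})^{\mathsf{T}} \right]
   \succeq J(\theta)^{-1} \quad \text{(the HCRLB).}
```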
