
Abstract

In this paper, we compare several detection algorithms that are based on spectral matched (subspace) filters. Nonlinear (kernel) versions of these spectral matched (subspace) detectors are also discussed and their performance is compared with the linear versions. These kernel-based detectors exploit the nonlinear correlations between the spectral bands that are ignored by the conventional detectors. Several well-known matched detectors, such as the matched subspace detector, orthogonal subspace detector, spectral matched filter, and adaptive subspace detector (adaptive cosine estimator), are extended to their corresponding kernel versions by using the idea of kernel-based learning theory. In kernel-based detection algorithms the data is implicitly mapped into a high-dimensional kernel feature space by a nonlinear mapping which is associated with a kernel function. The detection algorithm is then derived in the feature space and kernelized in terms of the kernel functions in order to avoid explicit computation in the high-dimensional feature space. Experimental results based on simulated toy examples and real hyperspectral imagery show that the kernel versions of these detectors outperform the conventional linear detectors.

1  Introduction

Detecting signals of interest, particularly with wide signal variability, in noisy environments has long been a challenging issue in various fields of signal processing. Among a number of previously developed detectors, the well-known matched subspace detector (MSD) [1], orthogonal subspace detector (OSD) [1, 2], spectral matched filter (SMF) [3, 4], and adaptive subspace detector (ASD), also known as the adaptive cosine estimator (ACE) [5, 6], have been widely used to detect a desired signal (target).

Matched signal detectors, such as the spectral matched filter and matched subspace detectors (whether adaptive or non-adaptive), only exploit second-order correlations, thus completely ignoring nonlinear (higher-order) spectral inter-band correlations that could be crucial for discriminating between target and background. In this paper, our aim is to introduce nonlinear versions of the MSD, OSD, SMF and ASD detectors which effectively exploit the higher-order spectral inter-band correlations in a high (possibly infinite) dimensional feature space associated with a certain nonlinear mapping via kernel-based learning methods [7]. A nonlinear mapping of the input data into a high-dimensional feature space is often expected to increase the data separability and reduce the complexity of the corresponding data structure. The nonlinear versions of a number of signal processing techniques such as principal component analysis (PCA) [8], Fisher discriminant analysis [9], linear classifiers [10], and kernel-based anomaly detection [11] have already been defined in a kernel space.
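To make the second-order limitation concrete, here is a minimal NumPy sketch of a conventional linear SMF. It uses one common mean-centered, normalized form of the detector (the paper's exact formulation appears in Section 5); note that the background is summarized entirely by its mean and covariance, i.e., second-order statistics only.

```python
import numpy as np

def spectral_matched_filter(X, s):
    """Conventional linear spectral matched filter (SMF) sketch.

    X : (N, p) array of pixel spectra (rows are pixels).
    s : (p,) target spectral signature.
    Returns one detector value per pixel.
    """
    # The detector sees only second-order background statistics:
    # the mean and the covariance of the spectra.
    mu = X.mean(axis=0)
    C = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # regularized
    w = np.linalg.solve(C, s - mu)       # C^{-1} (s - mu)
    denom = (s - mu) @ w                 # (s - mu)^T C^{-1} (s - mu)
    return (X - mu) @ w / denom          # normalized filter output
```

Any structure not captured by the mean and covariance, i.e., any higher-order inter-band dependence, is invisible to this detector; the kernelized versions introduced below are designed to recover it.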

This paper is organized as follows. Section 2 provides the background to kernel-based learning methods and the kernel trick. Section 3 introduces a linear matched subspace detector and its kernel version. The orthogonal subspace detector is defined in Section 4, as well as its kernel version. In Section 5 we describe the conventional spectral matched filter and its kernel version in the feature space and reformulate the expression in terms of the kernel function using the kernel trick. Finally, in Section 6 the adaptive subspace detector and its kernel version are introduced. A performance comparison between the conventional and the kernel versions of these algorithms is provided in Section 7, and conclusions are given in Section 8.

2  Kernel-based Learning and Kernel Trick

Suppose that the input hyperspectral data is represented by the data space ($\mathcal{X} \subseteq \mathbb{R}^l$) and $\mathcal{F}$ is a feature space associated with $\mathcal{X}$ by a nonlinear mapping function $\Phi$,

$$\Phi : \mathcal{X} \rightarrow \mathcal{F}, \qquad \mathbf{x} \mapsto \Phi(\mathbf{x}), \tag{1}$$

where $\mathbf{x}$ is an input vector in $\mathcal{X}$ which is mapped into a potentially much higher (possibly infinite) dimensional feature space. Due to the high dimensionality of the feature space $\mathcal{F}$, it is computationally not feasible to implement any algorithm directly in the feature space. However, kernel-based learning algorithms use an effective kernel trick, given by Eq. (2), to implement dot products in the feature space by employing kernel functions [7]. The idea in kernel-based techniques is to obtain a nonlinear version of an algorithm defined in the input space by implicitly redefining it in the feature space and then converting it in terms of dot products. The kernel trick is then used to implicitly compute the dot products in $\mathcal{F}$ without mapping the input vectors into $\mathcal{F}$; therefore, in the kernel methods, the mapping $\Phi$ does not need to be identified.
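As a self-contained illustration of the kernel trick (our own toy example, not taken from the paper), consider a second-degree polynomial kernel, for which the explicit feature map is small enough to write down. The two routes below compute the same dot product, but the kernel route never forms $\Phi(\mathbf{x})$:

```python
import numpy as np

def phi_poly2(x):
    """Explicit feature map for k(x, y) = (x . y)^2 in 2-D:
    Phi(x) = (x1^2, x2^2, sqrt(2) * x1 * x2)."""
    return np.array([x[0]**2, x[1]**2, np.sqrt(2) * x[0] * x[1]])

def k_poly2(x, y):
    """Polynomial kernel: the same dot product, without mapping."""
    return (x @ y) ** 2

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])

# Both routes give the same value; only the first forms Phi explicitly.
print(phi_poly2(x) @ phi_poly2(y))  # 16.0
print(k_poly2(x, y))                # 16.0
```

For the Gaussian RBF kernel used in this paper the feature space is infinite-dimensional, so only the kernel route is available.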

The kernel representation for the dot products in $\mathcal{F}$ is expressed as

$$k(\mathbf{x}_i, \mathbf{x}_j) = \langle \Phi(\mathbf{x}_i), \Phi(\mathbf{x}_j) \rangle = \Phi(\mathbf{x}_i) \cdot \Phi(\mathbf{x}_j), \tag{2}$$

where $k(\cdot, \cdot)$ is a kernel function in terms of the original data. There are a large number of Mercer kernels that have the kernel trick property; see [7] for detailed information about the properties of different kernels and kernel-based learning. Our choice of kernel in this paper is the Gaussian RBF kernel, $k(\mathbf{x}, \mathbf{y}) = \exp\!\left(-\|\mathbf{x} - \mathbf{y}\|^2 / c\right)$, and the nonlinear function associated with this kernel generates a feature space of infinite dimensionality.
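Because this feature space is infinite-dimensional, every kernelized detector in the sections that follow works with a Gram (kernel) matrix of pairwise kernel values rather than with mapped vectors. A minimal sketch of its computation (our own code; `c` is the assumed RBF width parameter):

```python
import numpy as np

def rbf_kernel_matrix(X, Y, c=1.0):
    """Gram matrix K[i, j] = exp(-||X[i] - Y[j]||^2 / c) for the
    Gaussian RBF kernel; c is the kernel width parameter."""
    # Squared distances via ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2 * X @ Y.T
    return np.exp(-np.maximum(sq, 0.0) / c)

# Example: kernel matrix of 5 random 10-band "spectra" with themselves.
X = np.random.rand(5, 10)
K = rbf_kernel_matrix(X, X)
print(K.shape, np.allclose(np.diag(K), 1.0))  # (5, 5) True
```

The expansion $\|\mathbf{a}-\mathbf{b}\|^2 = \|\mathbf{a}\|^2 + \|\mathbf{b}\|^2 - 2\,\mathbf{a}\cdot\mathbf{b}$ avoids an explicit loop over pixel pairs.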
