Abstract

In this paper, we compare several detection algorithms that are based on spectral matched (subspace) filters. Nonlinear (kernel) versions of these spectral matched (subspace) detectors are also discussed and their performance is compared with the linear versions. These kernel-based detectors exploit the nonlinear correlations between the spectral bands that are ignored by the conventional detectors. Several well-known matched detectors, such as the matched subspace detector, orthogonal subspace detector, spectral matched filter, and adaptive subspace detector (adaptive cosine estimator), are extended to their corresponding kernel versions by using the idea of kernel-based learning theory. In kernel-based detection algorithms the data is implicitly mapped into a high-dimensional kernel feature space by a nonlinear mapping which is associated with a kernel function. The detection algorithm is then derived in the feature space and kernelized in terms of the kernel functions in order to avoid explicit computation in the high-dimensional feature space. Experimental results based on simulated toy examples and real hyperspectral imagery show that the kernel versions of these detectors outperform the conventional linear detectors.

 

1  Introduction

Detecting signals of interest, particularly with wide signal variability, in noisy environments has long been a challenging issue in various fields of signal processing. Among a number of previously developed detectors, the well-known matched subspace detector (MSD) [1], orthogonal subspace detector (OSD) [1, 2], spectral matched filter (SMF) [3, 4], and adaptive subspace detector (ASD), also known as the adaptive cosine estimator (ACE) [5, 6], have been widely used to detect a desired signal (target).

Matched signal detectors, such as the spectral matched filter and matched subspace detectors (whether adaptive or non-adaptive), only exploit second-order correlations, thus completely ignoring nonlinear (higher-order) spectral inter-band correlations that could be crucial to discriminate between target and background. In this paper, our aim is to introduce nonlinear versions of the MSD, OSD, SMF, and ASD detectors which effectively exploit the higher-order spectral inter-band correlations in a high (possibly infinite) dimensional feature space associated with a certain nonlinear mapping via kernel-based learning methods [7]. A nonlinear mapping of the input data into a high-dimensional feature space is often expected to increase the data separability and reduce the complexity of the corresponding data structure. The nonlinear versions of a number of signal processing techniques such as principal component analysis (PCA) [8], Fisher discriminant analysis [9], linear classifiers [10], and kernel-based anomaly detection [11] have already been defined in a kernel space.
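To make the second-order limitation concrete, here is a minimal sketch of the linear spectral matched filter in Python/NumPy. It assumes the common formulation in which the detector output is $y(\mathbf{x}) = \mathbf{s}^T \mathbf{C}^{-1} \mathbf{x} \,/\, (\mathbf{s}^T \mathbf{C}^{-1} \mathbf{s})$, with the background mean and covariance $\mathbf{C}$ estimated from the data; the function name and the regularization constant `eps` are illustrative choices, not from the paper.

```python
import numpy as np

def spectral_matched_filter(X, s, eps=1e-6):
    """Linear spectral matched filter (illustrative sketch).

    X : (n_pixels, n_bands) array of observed spectra.
    s : (n_bands,) target spectral signature.
    Returns the SMF statistic for every pixel.
    """
    # Background statistics: mean and covariance only. These are
    # second-order quantities, which is exactly the limitation noted
    # above -- higher-order inter-band correlations never enter.
    mu = X.mean(axis=0)
    Xc = X - mu
    C = np.cov(Xc, rowvar=False) + eps * np.eye(X.shape[1])
    C_inv = np.linalg.inv(C)
    s_c = s - mu
    # y(x) = s^T C^{-1} (x - mu) / (s^T C^{-1} s)
    return (Xc @ C_inv @ s_c) / (s_c @ C_inv @ s_c)
```

Everything this detector knows about the background is contained in `mu` and `C`; the kernel versions developed below replace such dot-product computations with kernel evaluations in a feature space.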

This paper is organized as follows. Section 2 provides the background to kernel-based learning methods and the kernel trick. Section 3 introduces the linear matched subspace detector and its kernel version. The orthogonal subspace detector is defined in Section 4, as well as its kernel version. In Section 5 we describe the conventional spectral matched filter and its kernel version in the feature space, and reformulate the expression in terms of the kernel function using the kernel trick. Finally, in Section 6 the adaptive subspace detector and its kernel version are introduced. A performance comparison between the conventional and the kernel versions of these algorithms is provided in Section 7, and conclusions are given in Section 8.

 

2  Kernel-based Learning and Kernel Trick

Suppose that the input hyperspectral data is represented by the data space ($\mathcal{X} \subseteq \mathbb{R}^\ell$) and $\mathcal{F}$ is a feature space associated with $\mathcal{X}$ by a nonlinear mapping function $\Phi$,

$$\Phi : \mathcal{X} \rightarrow \mathcal{F}, \qquad \mathbf{x} \mapsto \Phi(\mathbf{x}), \tag{1}$$

 

where $\mathbf{x}$ is an input vector in $\mathcal{X}$ which is mapped into a potentially much higher (could be infinite) dimensional feature space. Due to the high dimensionality of the feature space $\mathcal{F}$, it is computationally not feasible to implement any algorithm directly in the feature space. However, kernel-based learning algorithms use an effective kernel trick, given by Eq. (2), to implement dot products in the feature space by employing kernel functions [7]. The idea in kernel-based techniques is to obtain a nonlinear version of an algorithm defined in the input space by implicitly redefining it in the feature space and then converting it in terms of dot products. The kernel trick is then used to implicitly compute the dot products in $\mathcal{F}$ without mapping the input vectors into $\mathcal{F}$; therefore, in the kernel methods, the mapping $\Phi$ does not need to be identified.
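As a toy illustration of the kernel trick, consider the degree-2 homogeneous polynomial kernel (used here purely for illustration; the paper's choice is the Gaussian RBF kernel introduced below). For 2-D inputs this kernel has the explicit feature map $\Phi(\mathbf{x}) = (x_1^2, \sqrt{2}\,x_1 x_2, x_2^2)$, so the feature-space dot product can be computed two ways, with and without ever forming $\Phi(\mathbf{x})$:

```python
import numpy as np

def phi(x):
    """Explicit feature map for the degree-2 homogeneous polynomial
    kernel on 2-D inputs: phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2)."""
    return np.array([x[0]**2, np.sqrt(2) * x[0] * x[1], x[1]**2])

def poly_kernel(x, y):
    """k(x, y) = (x . y)^2 evaluates <phi(x), phi(y)> directly in the
    input space -- the kernel trick."""
    return (x @ y) ** 2

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])

print(phi(x) @ phi(y))    # 16.0 -- explicit mapping, then dot product
print(poly_kernel(x, y))  # 16.0 -- kernel trick, no mapping needed
```

For the Gaussian RBF kernel the feature space is infinite dimensional, so the explicit route is not even available and the kernel evaluation is the only practical option.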

The kernel representation for the dot products in $\mathcal{F}$ is expressed as

$$k(\mathbf{x}_i, \mathbf{x}_j) = \langle \Phi(\mathbf{x}_i), \Phi(\mathbf{x}_j) \rangle, \tag{2}$$

where $k(\cdot, \cdot)$ is a kernel function in terms of the original data. There are a large number of Mercer kernels that have the kernel trick property; see [7] for detailed information about the properties of different kernels and kernel-based learning. Our choice of kernel in this paper is the Gaussian RBF kernel, and the nonlinear function associated with this kernel generates a feature space of infinite dimensionality.
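A minimal sketch of this kernel follows, assuming the standard Gaussian RBF form $k(\mathbf{x}, \mathbf{y}) = \exp(-\lVert \mathbf{x} - \mathbf{y} \rVert^2 / c)$; the bandwidth parameter `c` is a free choice here, since its value is not specified at this point in the text.

```python
import numpy as np

def rbf_kernel_matrix(X, Y, c=1.0):
    """Gram matrix K[i, j] = exp(-||X[i] - Y[j]||^2 / c) for the
    Gaussian RBF kernel. c is an assumed bandwidth parameter.

    X : (n, n_bands) and Y : (m, n_bands) arrays of spectra.
    """
    # Squared distances via ||x - y||^2 = ||x||^2 - 2 x.y + ||y||^2.
    sq = (X**2).sum(1)[:, None] - 2.0 * (X @ Y.T) + (Y**2).sum(1)[None, :]
    return np.exp(-np.maximum(sq, 0.0) / c)  # clamp tiny negative rounding

# Example: Gram matrix of five random 10-band "spectra" with themselves.
X = np.random.rand(5, 10)
K = rbf_kernel_matrix(X, X)
assert np.allclose(np.diag(K), 1.0)  # k(x, x) = 1 for the RBF kernel
```

The kernelized detectors in the following sections are expressed in terms of kernel evaluations of this kind, so the infinite-dimensional feature space is never materialized.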
