HACETTEPE UNIVERSITY

Department of Electrical and Electronics Engineering

Course Syllabus

 

 

 

 

ELE 673: Pattern Recognition

 

Credits: 3

 

Instructor:     Assist. Prof. S. Esen Yuksel, E-mail: eyuksel@ee.hacettepe.edu.tr

 

Lecture Hours: TBA

 

Description: This course provides an introduction to the theory and applications of pattern recognition. Topics include parametric and nonparametric classification, feature extraction, and clustering, as well as current medical, defense, and industrial applications.

 

Textbook:   Pattern Classification, R. O. Duda, P. E. Hart, and D. G. Stork, Wiley, New York, 2nd Edition, 2001.

 

Recommended text:

1-      Pattern Recognition and Machine Learning, C. M. Bishop, Springer, 2006.

2-      Data Mining: Practical Machine Learning Tools and Techniques, I. H. Witten, E. Frank, and M. A. Hall, Morgan Kaufmann Publishers, 3rd Edition, 2011.

 

Grading: Homework (20%), Project (20%), Midterm (30%), Final (30%)

 

There will be 4-5 homework assignments and one project. For the project, you are required to choose your own topic at the beginning of the semester.

 

Attendance: Students are strongly encouraged to attend all classes.

 

Project:

 

In this course you will do a substantial project on a topic related to pattern recognition. This project can be: (1) a very extensive literature search and summary on a particular topic, (2) a good implementation and evaluation of a known result, or (3) a small (but nontrivial) amount of original research. You may work on these projects individually or in groups of at most two, though if you work in a group, my expectations will be higher when I grade your project.

 

You are free to use any programming language and any open-source toolkit. You may write the code yourself or use any code that is available in the public domain. If you use somebody else's code, you are required to cite its source properly and to know the details of the algorithms that the code implements.
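For illustration only, a minimal baseline of the kind a project might start from is sketched below in Python; the choice of Python, the scikit-learn toolkit, and the iris dataset are assumptions made for this example, not course requirements, and any language or toolkit remains acceptable.

# Illustrative sketch, not a required method: a k-nearest-neighbor baseline
# (k-NN is covered in Section IV of the course outline) built with the
# open-source scikit-learn toolkit. The toolkit and dataset are assumptions
# made for the sake of example; cite whatever library or public-domain code
# you actually use, as described above.
from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Load a small benchmark dataset and hold out 30% of it for testing.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Train a k-NN classifier (k = 5) and report accuracy on the held-out data.
clf = KNeighborsClassifier(n_neighbors=5)
clf.fit(X_train, y_train)
print("Test accuracy:", accuracy_score(y_test, clf.predict(X_test)))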

 

You are required to submit a project proposal, submit a final report written in conference paper format, and give presentations during the midterm and final weeks. When preparing your report, please use the IEEE conference format. The tentative schedule of the project is as follows:

 

I am using this course to select the students I would like to work with, and to give you a chance to see whether you would like to work with me. It is best to take this class if you are contemplating working with me on your thesis, or with another professor on a project related to pattern recognition. Such collaborations can be excellent sources for identifying your project topic.

 

 

 

 

COURSE OUTLINE

I. INTRODUCTION (1 week)

I.1 Mathematical Foundation

 

II. BAYES DECISION THEORY (1 week)

II.1 Bayes Classifier for the Continuous Case

II.2 The Gaussian Two-Class Classifier

II.3 Bayes Classifier for the Discrete Case

II.4 Error Probability and Receiver Operating Characteristics

 

III. MAXIMUM-LIKELIHOOD AND BAYESIAN PARAMETER ESTIMATION (1 week)

III.1 Maximum Likelihood Estimation

III.2 Application to Bayesian Classification

III.3 Learning the Mean of a Gaussian Density Function

 

IV. NONPARAMETRIC TECHNIQUES (2 weeks)

IV.1 Probability Density Estimation

IV.2 Parzen Windows Estimation

IV.3 k Nearest Neighbor Estimation

IV.4 Nearest Neighbor Rule

IV.5 k Nearest Neighbor Rule

 

V. LINEAR DISCRIMINANT FUNCTIONS (3 weeks)

V.1 Linear Discriminant Functions and Decision Surfaces

V.2 The Two-Category Case

V.3 Generalized Linear Discriminant Functions

V.4 Relaxation Procedure

V.5 Minimum Squared-Error Procedure

V.6 Ho-Kashyap Procedure

V.7 Linear Programming Procedure

V.8 Support Vector Machines

 

VI. UNSUPERVISED LEARNING & CLUSTERING (2 weeks)

VI.1 Mixture Densities & Identifiability

VI.2 Maximum Likelihood Estimates

VI.3 Applications to Normal Mixtures

VI.4 Unsupervised Bayesian Learning

VI.5 Clustering Techniques & Criteria

 

VII. ALGORITHM-INDEPENDENT MACHINE LEARNING (1 week)

VII.1 Bias and Variance

VII.2 Bagging and Boosting

 

VIII. HYPERSPECTRAL IMAGE PROCESSING

VIII.1 Endmember detection

VIII.2 Abundance estimation

VIII.3 Segmentation and classification

 

 

 

Academic Integrity: Copying, communicating, or using disallowed materials during an exam or homework is cheating. Students caught cheating on a midterm or final exam will be reported to the campus disciplinary committee. Students caught cheating on a homework will automatically receive a grade of -100 on that homework and possibly a zero on all homework assignments. Any sort of plagiarism will not be tolerated. This means no copying, rewording, or paraphrasing of others' work, and no giving of false or inaccurate information. For more details about plagiarism, please see http://www.plagiarism.org/