
Ahmet Enis Cetin, Ph.D.
Research Professor
Department of Electrical and Computer Engineering
Contact
Building & Room:
3009 ERF
Address:
842 W. Taylor St., Chicago, IL 60607
About
Professional Achievements
Fellow of IEEE
Member, Turkish Academy of Sciences, 2015-present
Editorial Board Member, IEEE Signal Processing Magazine, 2013-2016
Special Issue Editor, Signal Processing for Ambient Assisted Living System, IEEE Signal Processing Magazine, 2016
Associate Editor, IEEE Transactions on Circuits and Systems for Video Technology, 2014-2016
Editor-in-Chief, Signal, Image and Video Processing (Springer Nature; SCI impact factor 1.8), March 2013-present
Editorial Board Member, Signal Processing (EURASIP: European Association for Signal Processing), 2006-2010
Selected Publications
Badawi, D., Pan, H., Cetin, S. C., and Cetin, A. E., "Computationally Efficient Spatio-Temporal Dynamic Texture Recognition for Volatile Organic Compound (VOC) Leakage Detection in Industrial Plants," IEEE Journal of Selected Topics in Signal Processing, 2020.
Pan, H., Badawi, D., Zhang, X., and Cetin, A. E., "Additive neural network for forest fire detection," Signal, Image and Video Processing, pp. 1-8, 2019.
Muneeb, U., Koyuncu, E., Keshtkarjahromi, Y., Seferoglu, H., Erden, M. F., and Cetin, A. E., "Robust and Computationally-Efficient Anomaly Detection Using Powers-Of-Two Networks," in Proc. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 2992-2996, 2020.
Shahdloo, M., Ilicak, E., Tofighi, M., Saritas, E. U., Çetin, A. E., and Çukur, T., "Projection onto epigraph sets for rapid self-tuning compressed sensing MRI," IEEE Transactions on Medical Imaging, 38(7), pp. 1677-1689, 2019.
Kapu, H., Saraswat, K., Ozturk, Y., and Cetin, A. E., "Resting heart rate estimation using PIR sensors," Infrared Physics & Technology, 85, pp. 56-61, 2017.
Töreyin, B. U., Dedeoğlu, Y., Güdükbay, U., and Cetin, A. E., "Computer vision based method for real-time fire and flame detection," Pattern Recognition Letters, 27(1), pp. 49-58, 2006.
Cetin, A. E., Gerek, O. N., and Yardimci, Y., "Equiripple FIR filter design by the FFT algorithm," IEEE Signal Processing Magazine, 14(2), pp. 60-64, 1997.
Jabloun, F., Cetin, A. E., and Erzin, E., "Teager energy based feature parameters for speech recognition in car noise," IEEE Signal Processing Letters, 6(10), pp. 259-261, 1999.
Erzin, E., and Cetin, A. E., "Interframe differential vector coding of line spectrum frequencies," in Proc. IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), vol. 2, pp. 25-28, 1993.
Notable Honors
Best Paper Award, International Conference on Progress in Cultural Heritage Preservation, 2012
Education
Ph.D., Systems Engineering, University of Pennsylvania, 1987
M.S.E., Electrical Engineering, University of Pennsylvania, 1986
B.Sc., Electrical Engineering, Middle East Technical University (METU), Ankara, Turkey, 1984
Research Currently in Progress
Dr. Cetin's research interests are in the areas of inverse problems, biomedical image processing, computational camera design, ambient assisted living sensors and systems, agricultural systems, computer vision, and environmental monitoring systems.
Intellectual Property
Ahiska, B., Davey, M.K. and Cetin, A.E., Grandeye Ltd, 2011. Automatically expanding the zoom capability of a wide-angle video camera. U.S. Patent 7,990,422.
Cetin, A.E., Davey, M.K., Cuce, H.I., Castellari, A.E. and Mulayim, A., Grandeye Ltd, 2011. Method of compression for wide angle digital video. U.S. Patent 7,894,531.
Cetin, A.E. and Toreyin, B.U., Delacom Detection Systems LLC, 2013. Method, device and system for determining the presence of volatile organic compounds (VOC) in video. U.S. Patent 8,432,451.
Teaching
I will be teaching ECE 491: Introduction to Artificial Neural Networks.
Description: 3 credit hours. This course provides an introduction to artificial neural networks and deep learning. Topics include biophysical and mathematical models of neurons; the perceptron and its relation to the LMS algorithm; parallel computing and GPUs; and convolutional neural networks, recurrent neural networks (LSTM and gated recurrent units), and residual networks.
Prerequisites: MATH 310, ECE 310 or equivalent, and basic computer programming skills.
Overview:
This is an introductory course on artificial neural networks for senior undergraduate and beginning graduate students. Prerequisites are linear algebra, calculus, and basic computer programming. There will be 8 labs, 1 midterm exam, and 1 final exam. The laboratory part of the course will cover practical applications; students will use deep learning libraries such as Keras and TensorFlow and will learn how to train neural networks on GPUs.
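To give a flavor of the kind of lab exercise involved, the following is a minimal sketch of training a small network with the Keras API in TensorFlow. The digit-classification task (MNIST) and the particular layer sizes are illustrative assumptions, not course material.

# Minimal illustrative sketch: train a small dense network with Keras/TensorFlow.
# The dataset (MNIST) and architecture are assumptions chosen for brevity.
import tensorflow as tf

# Load a standard benchmark dataset bundled with Keras and scale pixels to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A small fully connected network; a convolutional network follows the same pattern.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# fit() runs on a GPU automatically when TensorFlow detects one.
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))

In the labs, the same compile/fit workflow carries over to the convolutional and recurrent architectures covered later in the course.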
Weekly Topics:
1. Introduction
2. Biophysical and Mathematical Models of Neurons
3. Early Artificial Neural Network (ANN) Structures: Perceptrons
4. The relation between perceptrons, adaptive FIR filters, the Least Mean Squares (LMS) algorithm, and gradient descent (see the sketch after this list)
5. Learning: Supervised, Unsupervised, Reinforcement Learning
6. Training Single Layer ANNs
7. Parallel computing and Graphics Processing Units (GPUs)
8. Training Multilayer ANNs: Back Propagation (BP), Empirical risk minimization and deep learning, batch normalization
9. Optimization methods for training deep models and regularization
10. Sequence modeling, Recurrent and Recursive Neural Networks (RNNs), Long Short-Term Memory (LSTM) Networks
11. Autoencoders: Denoising, Contractive, Stacked ANNs
12. Convolutional NNs and their deep versions
13. Applications to computer vision and image processing
14. Applications to Speech Recognition and Natural Language Processing
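As a companion to topic 4, here is a minimal sketch of the LMS update for a single linear neuron, showing that the adaptive-filter/perceptron weight update is a stochastic gradient-descent step on the instantaneous squared error. The data below is synthetic and purely illustrative.

# Minimal illustrative sketch of the LMS algorithm for a single linear neuron.
# The synthetic data and step size are assumptions, not course parameters.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                  # 200 input vectors, 3 features
w_true = np.array([0.5, -1.0, 2.0])            # unknown weights to be learned
d = X @ w_true + 0.1 * rng.normal(size=200)    # desired responses with noise

w = np.zeros(3)                                # adaptive weights (FIR taps / neuron weights)
mu = 0.01                                      # LMS step size (learning rate)

for n in range(len(d)):
    x_n = X[n]
    e = d[n] - w @ x_n                         # instantaneous error
    w = w + mu * e * x_n                       # LMS update = stochastic gradient step

print("estimated weights:", w)

The same update rule, applied through a nonlinearity and stacked in layers, leads to the backpropagation training of multilayer networks in topic 8.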
Recommended Textbooks:
• Ian Goodfellow, Y. Bengio, and A. Courville, Deep Learning, MIT Press, 2016. http://www.deeplearningbook.org/
This is an online textbook by pioneers of deep learning and generative modeling.
• Daniel Graupe, Deep Learning Neural Networks: Design and Case Studies, World Scientific Press, 2016.
• J. M. Zurada, Introduction to Artificial Neural Systems, West Publishing Co., St. Paul, MN, 1992.
• S. Haykin, Neural Networks: A Comprehensive Foundation, 2nd ed., Macmillan, New York, 1999.