My CV and Resume

The PDF version of the CV is here. (Last updated: July 28, 2012)



OBJECTIVE

To gain employment with a company or institute where my technical and entrepreneurial skills, experience, creativity, and knowledge, especially in the areas of mathematics, machine learning, data mining, and programming, can be utilized to the fullest.


RESEARCH INTERESTS

Statistical machine learning, classification, regression, unsupervised learning, information-theoretic learning (ITL) [1, 2] and probabilistic graphical models. Recent applications include neuroscience [3, 4], computer vision [5, 6], data fusion [7, 8, 9, 10] and data mining.


EDUCATION

University of Florida, Gainesville
Ph.D. Electrical and Computer Engineering (2011)
M.S. Electrical and Computer Engineering (2006)

Chulalongkorn University, Bangkok, Thailand
B.S. Electrical and Computer Engineering (2001)


PROFESSIONAL EXPERIENCE

Integrated Brain Imaging Center (IBIC), University of Washington Medical Center, WA. (2011 to present)
Postdoctoral associate and machine learning scientist

Department of Electrical and Computer Engineering, University of Florida, FL. (2004 to 2011)
Graduate research assistant

USDA-ARS-Southwest Watershed Research Center (SWRC), AZ. (Fall Intern 2006)
Machine learning scientist and software developer
Automatic 3D LiDAR point-cloud processing, filtering and recognition
Adaptive signal processing on hydrological data

Integrated-circuit Design and Applications Research (IDAR), Bangkok, Thailand. (2001 to 2003)
Embedded system R&D Engineer
Developed hardware and firmware for high-accuracy digital measuring instruments

Intronics Co, Ltd., Bangkok, Thailand. (Summer Intern 2000)
Embedded system R&D Engineer
Developed hardware and firmware for testing electronic devices in manufacturing processes


My machine learning projects/hobbies Wikipage –

Free math/science/machine learning tutorial channel “ChaLearn Academy”:
Facebook page:


TECHNICAL SKILLS

Programming: Python, Java; proficient in MATLAB®
Mathematics and statistics: numerical/meta-heuristic/biologically-inspired optimization, Bayesian theory, information theory, graph/social network theory
Remote sensing: airborne/terrestrial 3D LiDAR laser scanning, multi-spectral image, GIS software/tools


RELEVANT COURSEWORK

Machine Learning and Computer Vision: Advanced Topics in Machine Learning, Information-Theoretic Learning (ITL), Mathematics for Intelligent Systems, Computer Vision and Image Processing, Pattern Recognition.

Mathematics and Statistics: Numerical Optimization, Numerical Linear Algebra, Meta-Heuristic Optimization, Introduction to Probability Theory I and II, Spatial Statistics.


TEACHING EXPERIENCE

Instructor (taught the entire course), University of Florida
Spring 2010 Pattern Recognition (graduate-level): theory and applications of pattern recognition.

Teaching assistant and guest lecturer, University of Florida
Spring 2006 to 2009 Pattern Recognition (graduate-level): theory and applications of pattern recognition.
Fall 2007 Airborne Sensors and Instrumentation (graduate-level): fundamental remote sensing applications, data processing using pattern recognition approaches.

Teaching assistant and guest lecturer, Chulalongkorn University
Spring 2001 Introduction to Microprocessors (undergraduate level)

Math tutor:
Special program for high school mathematics Olympiad at Triam Udom Suksa (1996)
Special program for US high school mathematics competitions in Gainesville, FL (2011)
Free tutorial website “ChaLearn Academy” to motivate learning in mathematics, science and machine learning (2012 to present)


HONORS AND AWARDS

2005 Member of the Eta Kappa Nu (HKN) honor society
2003 Received a Distinguished Alumni Award from Chulalongkorn University
2001 Invention broadcast to the public on several nationwide TV programs
2001 Awarded a grant for second prize in the Thailand Innovation Award
2000 Received a Prince of Thailand research scholarship from the Engineering Institute of Thailand
1996 Attained the highest score in Physics in the National Examination
1996 Received a Silver Medal in the Thailand Mathematics Olympiad


RESEARCH LABORATORIES

Computational NeuroEngineering Laboratory (CNEL)
Adaptive Signal Processing Laboratory (ASPL)
Geosensing Engineering and Mapping Center (GEM)
National Center for Airborne Laser Mapping (NCALM)

PROFESSIONAL SERVICE – Reviewer (* indicates assistant reviewer)

IEEE Signal Processing Letters (SPL)
IEEE Transactions on Image Processing Journal (TIP)
IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)*
The International Conference on Artificial Neural Networks (ICANN 2010, 2011)
IEEE International Conference on Systems, Man, and Cybernetics (SMC 2009)
IEEE International Conference on Computer Vision (ICCV 2009)
IEEE Transactions on Signal Processing
IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2004, 2008, 2009, 2010)
IEEE International Conference on Image Processing (ICIP 2005 – 2007)
IEEE Geoscience and Remote Sensing Society (IGARSS 2006)


REFERENCES

[1]   K. Kampa, E. Hasanbelliu, and J. C. Principe, “Closed-form Cauchy-Schwarz PDF divergence for mixture of Gaussians,” in Proc. of the 2011 International Joint Conference on Neural Networks (IJCNN), 2011.

[2]   E. Hasanbelliu, K. Kampa, J. Principe, and J. T. Cobb, “Surprise metric for online learning,” in Proc. SPIE Defense and Security Symposium, vol. xxxx-xx, Orlando, FL, Apr. 2011.

[3]   K. Kampa, C. Chou, S. Mehta, R. Tungaraza, W. Chaovalitwongse, and T. Grabowski, “Enhancement of fMRI pattern classification with mutual information,” Radiology Research Symposium, vol. x, pp. xxx–xxx, 2012.

[4]   C. Chou, K. Kampa, S. Mehta, R. Tungaraza, W. Chaovalitwongse, and T. Grabowski, “Information theoretic based feature selection for multi-voxel pattern analysis of fMRI data,” Brain Informatics, vol. x, pp. xxx–xxx, 2012.

[5]   K. Kampa, D. Putthividhya, J. Principe, and A. Rangarajan, “Data-driven tree-structured Bayesian network for image segmentation,” in IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), vol. x, 2012, pp. xxx–xxx.

[6]   K. Kampa, D. Putthividhya, and J. Principe, “Irregular tree-structured Bayesian network for image segmentation,” in Proceedings of the 2011 International Workshop on Machine Learning for Signal Processing (MLSP 2011), vol. x, 2011, pp. xxx–xxx.

[7]   K. Kampa, K. Slatton, and J. Cobb, “Dynamic trees for sensor fusion,” in IEEE International Conference on Systems, Man and Cybernetics (SMC), 2009, pp. 2751–2756.

[8]   K. Kampa, J. C. Principe, and K. C. Slatton, “Dynamic factor graphs: A novel framework for multiple features data fusion,” in IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2010.

[9]   K. Kampa, J. Principe, J. T. Cobb, and A. Rangarajan, “Deformable Bayesian networks for data clustering and fusion,” in Proc. SPIE Defense and Security Symposium, vol. 8017-25, Orlando, FL, Apr. 2011.

[10]   K. Kampa, E. Hasanbelliu, J. Cobb, J. Principe, and K. Slatton, “Deformable Bayesian network: A robust framework for underwater sensor fusion,” IEEE Journal of Oceanic Engineering, vol. 37, no. 2, pp. 166–184, 2012.


RESEARCH PROJECTS

  • Functional brain parcellation using Bayesian networks: In my Ph.D. work, I proposed the data-driven tree-structured Bayesian network (DDT) as a hierarchical probabilistic framework for unsupervised clustering. In this experiment, I apply the framework to functional brain parcellation by first learning the hierarchical tree structure from the data, then performing statistical inference on every node in the tree by iteratively alternating between message passing and the expectation-maximization algorithm. [on-going]
  • Distributed information in the brain: Select only the voxels with a small distinguishability value (V < 0.3) and compare the accuracy when classifying 1) with one independent voxel at a time and 2) with a group of voxels assumed to work jointly. The results suggest a synergy among the low-V voxels, which yield much better accuracy when used in combination, implying that the information is distributed across the brain. [on-going]
  • Distinguishability measure on fMRI data: The distinguishability measure (V), derived from a pair-wise statistical test, measures how informative a voxel is when used to classify a set of tasks. The experimental results show a high correlation between V, mutual information, and single-voxel accuracy. The measure can be employed for feature selection and gives good accuracy (>90%). Unlike mutual information, V is a normalized measure, making it convenient to compare across subjects and experimental settings. Furthermore, computing V is less expensive than computing mutual information. [in preparation]
  • Feature selection on fMRI data using simulated annealing: The feature (voxel) selection problem is cast as an integer programming problem and solved using a meta-heuristic optimization framework, in particular simulated annealing (SA). The results show that when the number of features is moderate (<1,000), SA is a good candidate given a good search strategy. When the number of features is very high (>40,000), SA yields a poorer feature set and can be impractically slow, demanding a better search strategy. [in preparation]
  • Class-specific mutual information for fMRI data: Mutual information is a widely used tool to determine “good” features based on the correlation between classes and features on average. However, the measure is not very informative about which feature maximally responds to which particular stimulus category. This problem can be ameliorated using the proposed class-specific mutual information. [3]
  • Mutual information-based feature selection for fMRI data: Classification of stimulus category using features (voxels) selected by mutual information. We use a haemodynamic model to extract the coefficients β from each stimulus-presentation block, then compare the accuracy of several widely adopted classifiers. Our proposed preprocessing pipeline yields state-of-the-art accuracy of >90%. [4]
  • Image segmentation using multi-scale-evidence data-driven tree-structured Bayesian network (meDDT): Unlike the DDT framework, which has evidence only at the leaves of the tree, meDDT has evidence at every node in the tree. Having evidence at multiple scales of the tree captures the multi-scale properties of the image, yielding better clustering results than DDT. [5]
  • Image segmentation using data-driven tree-structured Bayesian network (DDT): An image is over-segmented into superpixels, groups of pixels sharing similar properties, which become nodes in a Bayesian network. A hierarchical tree structure is learned from the relationships among nodes in the image. A feature vector is extracted from each superpixel and treated as the evidence for each node in the tree. The expectation-maximization framework and message-passing algorithm are derived and applied alternately in order to infer the class probability at each leaf node of the tree. [6]
  • Robust evaluation methodology for unsupervised image segmentation using a generalized precision-recall framework derived from the Cauchy-Schwarz divergence. [6]
  • Closed-form expression for the Cauchy-Schwarz divergence of Gaussian mixture models: The Kullback-Leibler divergence (DKL) is widely used as a divergence measure between distributions, yet it does not admit a closed-form expression for distributions parameterized as Gaussian mixture models. I demonstrated that a closed-form expression for such models can be obtained using the Cauchy-Schwarz divergence (DCS). Experimental results showed the proposed model outperformed DKL in both accuracy and run-time for static hand posture recognition using a pdf distance-based classifier on Gaussian mixture models. [1]
  • K-nearest neighbor (k-nn) random walk for CPT initialization: With a good choice of data structure, a k-nn random walk on an arbitrary graph can be computed efficiently. The k-nn random walk rooted at a seed point can be viewed as a conditional probability, which provides a fast and simple way to create a conditional probability table (CPT) in a Bayesian network. [9]
  • Bayesian surprise metric with Gaussian mixture models (GMM): Inspired by the human brain, “surprise” can be modeled using the Kullback-Leibler divergence between the prior distribution and the posterior distribution. We proposed using a GMM as the base distribution and derived the surprise metric for it. The framework is applied to outlier detection in a data fusion framework for unmanned vehicles. [2]
  • Dynamic Factor Graphs: An information fusion framework that generalizes factor graphs. [8]
  • Dynamic tree Bayesian network (DTBN) for data fusion: A data fusion framework incorporating multiple high-dimensional features using DTBN, a probabilistic graphical model whose structure can be adapted to fit the data. DTBN combines model selection and statistical inference in the same framework. [7, 10]
  • Feature extraction of sand waves in the hurricane surf zone using wavelet coefficients. [Kennedy et al., 2008]
  • Interpolating missing data using Bayesian networks: Use Bayesian networks to estimate the missing digital elevation model (DEM) obtained from LiDAR. [Cobb et al., 2006]
  • Information-theoretic multiscale segmentation for 3D LiDAR point clouds: The project aims to segment a high-quality ground surface, i.e., one preserving fine details of the terrain with minimal noise. Multiscale sliding morphological windows are applied to the point cloud to populate point-cloud sets, each with a different probability of being a ground point. The final high-quality ground surface is then reconstructed by recruiting points based on thresholds derived from Bayes decision theory on a Gaussian mixture model. [Kampa et al., 2004]
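The synergy claimed in the distributed-information experiment above can be illustrated with a toy XOR example (synthetic binary features, not the actual fMRI data or the V measure): each feature alone classifies at chance level, while the pair classifies perfectly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for two low-V "voxels": binary features whose XOR
# defines the class. Each feature alone carries no information about
# the label; together they determine it completely.
X = rng.integers(0, 2, size=(1000, 2))
y = X[:, 0] ^ X[:, 1]

def best_single_feature_accuracy(x, y):
    """Best accuracy achievable from one binary feature alone."""
    return max(np.mean(x == y), np.mean((1 - x) == y))

acc_f0 = best_single_feature_accuracy(X[:, 0], y)
acc_f1 = best_single_feature_accuracy(X[:, 1], y)
acc_joint = np.mean((X[:, 0] ^ X[:, 1]) == y)  # joint (XOR) rule is exact
```

Here both single-feature accuracies hover around 0.5 while the joint rule is exact, mirroring the qualitative finding that low-V voxels can be informative in combination.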
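The simulated annealing feature search described above can be sketched on a toy separable objective (a stand-in for the actual cross-validated classifier score on fMRI voxels); the objective, temperature schedule, and constants below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy objective over binary feature masks: reward the first k "informative"
# features, penalize mask size. In the real problem this would be a
# classifier's validation score.
n_features, k_informative = 30, 5

def score(mask):
    return mask[:k_informative].sum() - 0.1 * mask.sum()

def simulated_annealing(n_iter=5000, t0=1.0, cooling=0.999):
    mask = rng.integers(0, 2, n_features)  # random initial subset
    cur_score = score(mask)
    best, best_score, t = mask.copy(), cur_score, t0
    for _ in range(n_iter):
        cand = mask.copy()
        cand[rng.integers(n_features)] ^= 1  # flip one feature in/out
        delta = score(cand) - cur_score
        # Accept improvements always; worse moves with Boltzmann probability.
        if delta > 0 or rng.random() < np.exp(delta / t):
            mask, cur_score = cand, score(cand)
            if cur_score > best_score:
                best, best_score = mask.copy(), cur_score
        t *= cooling  # geometric cooling schedule
    return best, best_score

best_mask, best_val = simulated_annealing()
```

With a good search strategy (here, single-bit flips) the chain concentrates on the informative subset; on truly high-dimensional problems the proposal design dominates, as noted above.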
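The mutual-information voxel ranking above can be sketched with a plain histogram estimator on synthetic binary data (not the β coefficients from the haemodynamic model); the noise level is an arbitrary illustration.

```python
import numpy as np

def mutual_information(x, y):
    """I(X;Y) in bits for two discrete (integer-coded) variables."""
    joint = np.zeros((x.max() + 1, y.max() + 1))
    for xi, yi in zip(x, y):
        joint[xi, yi] += 1
    joint /= joint.sum()                       # empirical joint distribution
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0                             # avoid log(0)
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(2)
y = rng.integers(0, 2, 2000)                   # stimulus class label
informative = y ^ (rng.random(2000) < 0.1)     # agrees with class ~90% of trials
noise = rng.integers(0, 2, 2000)               # unrelated "voxel"

mi_informative = mutual_information(informative, y)
mi_noise = mutual_information(noise, y)
```

Ranking features by this score and keeping the top ones is the selection step; the informative feature scores near 1 − H(0.1) ≈ 0.53 bits while the noise feature scores near zero.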
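The DDT inference above alternates expectation-maximization with message passing; the message-passing primitive itself can be sketched on a minimal chain-structured network (a toy three-node binary model, not the DDT itself), checking the result against brute-force enumeration.

```python
import numpy as np

# Chain x1 -> x2 -> x3, each variable binary.
p1 = np.array([0.6, 0.4])                 # p(x1)
A = np.array([[0.9, 0.1], [0.2, 0.8]])    # p(x2 | x1), rows indexed by x1
B = np.array([[0.7, 0.3], [0.4, 0.6]])    # p(x3 | x2), rows indexed by x2

# Forward messages: marginalize out upstream variables one step at a time.
m12 = p1 @ A     # message into x2, equal to p(x2)
m23 = m12 @ B    # message into x3, equal to p(x3)

# Brute-force check: enumerate all 8 joint configurations.
joint = np.einsum('i,ij,jk->ijk', p1, A, B)   # joint[i,j,k] = p(x1,x2,x3)
p3_brute = joint.sum(axis=(0, 1))
```

On trees the same idea generalizes: each node forwards a summed product of its incoming messages, so exact marginals cost linear time instead of exponential enumeration.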
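The closed-form Cauchy-Schwarz divergence for Gaussian mixtures described above can be sketched in one dimension using the identity ∫N(x; m1, v1)N(x; m2, v2)dx = N(m1; m2, v1 + v2); the mixture parameters below are arbitrary examples, and the definition follows the D_CS = -log(∫pq / sqrt(∫p² ∫q²)) form.

```python
import numpy as np

def gauss(x, mu, var):
    """Value of the 1-D Gaussian density N(x; mu, var)."""
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def gmm_cross_term(w1, mu1, var1, w2, mu2, var2):
    """Closed form for the integral of p(x)q(x) when p, q are 1-D GMMs."""
    total = 0.0
    for wi, mi, vi in zip(w1, mu1, var1):
        for wj, mj, vj in zip(w2, mu2, var2):
            total += wi * wj * gauss(mi, mj, vi + vj)
    return total

def cauchy_schwarz_divergence(p, q):
    """D_CS(p, q): nonnegative, zero iff p = q (Cauchy-Schwarz inequality)."""
    pq = gmm_cross_term(*p, *q)
    pp = gmm_cross_term(*p, *p)
    qq = gmm_cross_term(*q, *q)
    return -np.log(pq / np.sqrt(pp * qq))

# Two example 1-D GMMs as (weights, means, variances)
p = ([0.5, 0.5], [0.0, 3.0], [1.0, 1.0])
q = ([0.3, 0.7], [1.0, 4.0], [0.5, 2.0])

d_self = cauchy_schwarz_divergence(p, p)
d_pq = cauchy_schwarz_divergence(p, q)
```

Every term is analytic in the mixture parameters, which is exactly what DKL lacks for GMMs and what makes the distance-based classifier in [1] fast.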
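The surprise metric above is defined on GMMs; as a minimal illustration of the underlying idea, the sketch below uses single 1-D Gaussians, for which the KL divergence has a simple closed form (the GMM case in [2] has no such closed form). Taking surprise as KL(posterior || prior) is one common convention, adopted here as an assumption.

```python
import numpy as np

def kl_gaussian(mu0, var0, mu1, var1):
    """Closed-form KL( N(mu0, var0) || N(mu1, var1) ) for 1-D Gaussians."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def surprise(prior, posterior):
    """Bayesian surprise: how far the new datum moved the belief."""
    return kl_gaussian(posterior[0], posterior[1], prior[0], prior[1])

prior = (0.0, 1.0)              # belief (mean, variance) before the datum
posterior_mild = (0.1, 0.9)     # belief barely moved: an expected datum
posterior_outlier = (3.0, 0.5)  # belief moved a lot: a surprising outlier

s_mild = surprise(prior, posterior_mild)
s_outlier = surprise(prior, posterior_outlier)
```

An outlier detector then simply flags observations whose surprise exceeds a threshold, since ordinary data barely shift the posterior.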
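A minimal sketch of the morphological-window idea above, on a synthetic 1-D elevation profile rather than a real 3D point cloud; the single window size and fixed threshold are illustrative stand-ins for the multiscale windows and the Bayes-decision threshold on a Gaussian mixture.

```python
import numpy as np
from scipy.ndimage import grey_opening

# Synthetic elevation profile: gently sloping ground plus two
# above-ground "objects" (e.g., vegetation) as positive bumps.
x = np.arange(200)
ground = 0.01 * x                 # true ground surface
dem = ground.copy()
dem[50:55] += 5.0                 # small object
dem[120:140] += 8.0               # larger object

# Morphological opening (erosion then dilation) with a window wider than
# the objects removes them while approximately preserving the ground.
opened = grey_opening(dem, size=41)

# Points far above the opened surface are labeled non-ground.
non_ground = dem - opened > 1.0
```

Repeating this with several window sizes yields the multiscale point-cloud sets described above, each scale catching objects of a different footprint.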


CLASS PROJECTS

  • Derive a mean-field variational approximation for Deformable Gaussian Bayesian Networks. (Fall 2010)
  • Model selection using variational Bayes GMM (VBGMM) vs Bayesian information criteria (BIC) for GMM. (Fall 2010)
  • Denoising binary images using Discriminative Random Fields (DRFs) (Spring 2010)
  • Alternative derivation of Mean Field Dynamic Trees (MFDTs) approximate inference algorithm for image modeling (Fall 2008)
  • Use information-theoretic learning (ITL) for adaptive signal processing (Fall 2008)
  • Use information-theoretic clustering as a good initialization for DTBN (Fall 2008)
  • Analysis of simulated annealing algorithm: how to initialize and obtain an optimal temperature schedule [class paper] (Fall 2007)
  • Heuristic optimization on minimizing cost for phonebook-client assignment problem [finished with the ranking #1] (Fall 2007)
  • Multiscale characteristic of soil surface roughness modeling using 2D wavelets (Spring 2007)
  • Develop software for automatic microscopic ground feature multiscale filtering on ground-based LiDAR (internship with USDA 2006)
  • Improve the accuracy of surface interpolation by combining statistical models and physical models in Bayesian framework (Spring 2006)
  • Dimensionality reduction and feature selection framework using Shannon’s mutual information and conditional entropy (2005)
  • Comparison of algorithms for land mine detection based on the naïve Bayes classifier, AdaBoost, neural networks, wavelets, k-means and PCA [best class paper award] (2004)
  • Soda can brand classification from side-view can image using mixture of Gaussian and vector quantization (LBG-VQ) in RGB color space (2004)


TALKS AND PRESENTATIONS

“Overview of machine learning for multi-voxel pattern analysis (MVPA): Gaussian Naive Bayes, LDA, logistic regression and SVM”, MVPA interest group, Integrated Brain Imaging Center, University of Washington.

“Data-driven tree-structured Bayesian network for unsupervised image segmentation”, University of Florida

“Probabilistic Graphical Models: Hidden Markov Models, Bayesian networks, Markov Random Fields and Conditional Random Fields”, University of Florida
“Basic Optimization Methods for Machine Learning”, University of Florida
“Generative and Discriminative Models for Classification”, University of Florida
“Linear Discriminant Analysis (LDA) and Quadratic Discriminant Analysis (QDA)”, University of Florida
“Bayesian and Beyond”, University of Florida

“Dimensionality Reduction for Machine Learning”, University of Florida
“Graphical Models: Message Passing Algorithms in Discrete and Continuous Cases”, University of Florida
“Graphical Models: Modeling by Bayesian Networks”, University of Florida
“Hidden Markov Model and Its Applications”, University of Florida
“Principal Component Analysis: An Introduction to Dimensionality Reduction and Manifold Learning”, University of Florida

“Statistical Graphical Models and Inference Algorithms”, University of Florida
“Conditional Entropy and Mutual Information for Feature Extraction and Clustering”, University of Florida

“Vegetation Filter on 3D LiDAR Data”, University of Florida

“Bayesian Networks: Interpolating missing ALSM data”, University of Florida

“Machine Learning and LiDAR Processing Algorithms”, Remote Sensing Workshop, Chulalongkorn University, Bangkok, Thailand
“Stem Cell Research Meets Machine Learning”, Mahidol University, Bangkok, Thailand
“Mathematics, AI, Science, and Your Future!!!”, Triam Udom Suksa High School, Bangkok, Thailand
“Machine Learning and Pattern Recognition in Remote Sensing Applications”, University of Florida
“MATLAB® Programming Techniques for Electrical and Computer Engineering Students”, HKN Tutorial Session, University of Florida



PUBLICATIONS

K. Kampa, E. Hasanbelliu, J. Cobb, J. Principe, and K. Slatton, “Deformable Bayesian network: A robust framework for underwater sensor fusion,” IEEE Journal of Oceanic Engineering, vol. 37, no. 2, pp. 166–184, 2012.

K. Kampa and K. Clint Slatton, “Information-Theoretic Hierarchical Segmentation of Airborne Laser Swath Mapping Data,” IEEE Transactions on Geoscience and Remote Sensing, (in preparation).

Andrew B. Kennedy, K. Clint Slatton, Tian-Jian Hsu, Michael J. Starek, K. Kampa, “Ephemeral sand waves in the hurricane surf zone,” International Journal of Marine Geology, Geochemistry and Geophysics, vol. 250, May 2008, pp. 276–280.

Andrew B. Kennedy, K. Clint Slatton, Michael Starek, K. Kampa, and Hyun-Chong Cho, “Hurricane Response of Nearshore Borrow Pits from Airborne Bathymetric LiDAR,” Journal of Coastal Engineering.

K. Clint Slatton, Hyun-chong Cho, Kittipat Kampa, Bidhya Yadav, “Automated Detection and Quantitative Measurements of Forest Streams Using High-Resolution ALSM Data,” Geophysical Research Letters, Special issue: New perspectives on the earth from airborne laser swath mapping, 2007, (in revision).

K. Clint Slatton, H. Lee, K. Kampa, H. Jhee, “Segmentation of ALSM Point Data and the Prediction of Subcanopy Sunlight Distribution,” Proc. IEEE International Geoscience and Remote Sensing Symposium (IGARSS), invited, Jul., 2005.


C. Chou, K. Kampa, S. Mehta, R. Tungaraza, W. Chaovalitwongse, and T. Grabowski, “Information theoretic based feature selection for multi-voxel pattern analysis of fMRI data,” Brain Informatics, vol. x, pp. xxx–xxx, 2012.

K. Kampa, D. Putthividhya, J. C. Principe and A. Rangarajan, “Data-driven Tree-structured Bayesian Network for Image Segmentation,” IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2012.

K. Kampa, D. Putthividhya and J. C. Principe, “Irregular Tree-Structured Bayesian Network for Image Segmentation,” In Proc. of the 2011 International Workshop on Machine Learning for Signal Processing (MLSP 2011).

K. Kampa, E. Hasanbelliu and J. C. Principe, “Closed-form Cauchy-Schwarz pdf Divergence for Mixture of Gaussians,” In Proc. of the 2011 International Joint Conference on Neural Networks (IJCNN 2011). (oral presentation)

K. Kampa, J. Principe, J. Tory Cobb and A. Rangarajan, “Deformable Bayesian Networks for Data Clustering and Fusion,” Proc. SPIE 2011 Defense and Security Symposium, vol. 1, no. 8017-25, 2011. (oral presentation)

E. Hasanbelliu, K. Kampa, J. Principe and J. Tory Cobb, “Surprise metric for online learning,” Proc. SPIE 2011 Defense and Security Symposium, vol. —-, no. —–, 2011. (oral presentation)

K. Kampa, H. Lee, K. C. Slatton, A. Rangarajan, J. C. Principe, “A New Framework for Segmentation of Aerial Images Using Dynamic Trees,” IEEE Geoscience and Remote Sensing Society (IGARSS), 2010. (oral presentation; not attended)

K. Kampa, Jose C. Principe and K. C. Slatton, “Dynamic Factor Graphs: A Novel Framework for Multiple Features Data Fusion,” IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2010.

K. Kampa, K. C. Slatton and J. T. Cobb, “Dynamic Trees for Sensor Fusion,” IEEE International Conference on Systems, Man, and Cybernetics (SMC), 2009. (oral presentation)

K. Kampa and K. C. Slatton, “An Adaptive Multiscale Filter for Segmenting Vegetation in ALSM Data,” Proc. IEEE International Geoscience and Remote Sensing Symposium (IGARSS), vol. 6, Sep. 2004, pp. 3837–3840.

Andrew B. Kennedy, Clint Slatton, Tian-Jian Hsu, Michael Starek, and K. Kampa, “Large Scale Sand Waves in the Hurricane Surf Zone,” International Conference on Coastal Engineering (ICCE), 2008, Hamburg, Germany.

Hyun-chong Cho, K. Kampa, K. Clint Slatton, “Morphological Segmentation of LiDAR Digital Elevation Models to Extract Stream Channels in Forested Terrain,” Proc. IEEE 2007 International Geoscience and Remote Sensing Symposium (IGARSS), Barcelona, Spain, 23-27 Jul. 2007, pp. 3182–3185, doi: 10.1109/IGARSS.2007.4423521.

J. Tory Cobb, K. Kampa, K. Clint Slatton, “Using Bayesian Networks to Estimate Missing Airborne Laser Swath Mapping (ALSM) Data,” Proc. SPIE 2006 Defense and Security Symposium, vol. 6234, no. 6234-04, 2006.

* K. Clint Slatton, H. Lee, K. Kampa, H. Jhee, “Segmentation of ALSM Point Data and the Prediction of Subcanopy Sunlight Distribution,” Proc. IEEE 2005 International Geoscience and Remote Sensing Symposium (IGARSS), invited, vol. 1, Jul. 2005, pp. 525–528.

K.C. Slatton, K. Kampa, and H. Lee, “Estimating Leaf Area Index in Forests Using Airborne Laser Swath Mapping Data,” Eos Trans. AGU, 85(47), Fall Meet. Suppl., Abstract H13C-0445, Dec., 2004.

K.C. Slatton, H. Lee, K. Kampa, K. Nagarajan, V. Aggarwal, “Estimation of Optimal Transportation Paths and Optical Gaps in Forested Terrain,” 24th Army Science Conference, November 29 – December 2, 2004.


OTHER PROFESSIONAL ACTIVITIES

Assisted in revising and editing Chapter 2 of the upcoming book “Information Theoretic Learning: Renyi’s Entropy and Kernel Perspectives” by Jose Principe (ISBN-13: 978-1441915696).
