- An Introduction to Feature Extraction. Isabelle Guyon, André Elisseeff. 1-25 [doi]
- Learning Machines. Norbert Jankowski, Krzysztof Grabczewski. 29-64 [doi]
- Assessment Methods. Gérard Dreyfus, Isabelle Guyon. 65-88 [doi]
- Filter Methods. Wlodzislaw Duch. 89-117 [doi]
- Search Strategies. Juha Reunanen. 119-136 [doi]
- Embedded Methods. Thomas Navin Lal, Olivier Chapelle, Jason Weston, André Elisseeff. 137-165 [doi]
- Information-Theoretic Methods. Kari Torkkola. 167-185 [doi]
- Ensemble Learning. Eugene Tuv. 187-204 [doi]
- Fuzzy Neural Networks. Madan M. Gupta, Noriyasu Homma, Zeng-Guang Hou. 205-233 [doi]
- Design and Analysis of the NIPS2003 Challenge. Isabelle Guyon, Steve R. Gunn, Asa Ben-Hur, Gideon Dror. 237-263 [doi]
- High Dimensional Classification with Bayesian Neural Networks and Dirichlet Diffusion Trees. Radford M. Neal, Jianguo Zhang. 265-296 [doi]
- Ensembles of Regularized Least Squares Classifiers for High-Dimensional Problems. Kari Torkkola, Eugene Tuv. 297-313 [doi]
- Combining SVMs with Various Feature Selection Strategies. Yi-Wei Chen, Chih-Jen Lin. 315-324 [doi]
- Feature Selection with Transductive Support Vector Machines. Zhili Wu, Chun Hung Li. 325-341 [doi]
- Variable Selection using Correlation and Single Variable Classifier Methods: Applications. Amir Reza Saffari Azar Alamdari. 343-358 [doi]
- Tree-Based Ensembles with Dynamic Soft Feature Selection. Alexander Borisov, Victor Eruhimov, Eugene Tuv. 359-374 [doi]
- Sparse, Flexible and Efficient Modeling using L1 Regularization. Saharon Rosset, Ji Zhu. 375-394 [doi]
- Margin Based Feature Selection and Infogain with Standard Classifiers. Ran Gilad-Bachrach, Amir Navot. 395-401 [doi]
- Bayesian Support Vector Machines for Feature Ranking and Selection. Wei Chu, S. Sathiya Keerthi, Chong Jin Ong, Zoubin Ghahramani. 403-418 [doi]
- Nonlinear Feature Selection with the Potential Support Vector Machine. Sepp Hochreiter, Klaus Obermayer. 419-438 [doi]
- Combining a Filter Method with SVMs. Thomas Navin Lal, Olivier Chapelle, Bernhard Schölkopf. 439-445 [doi]
- Feature Selection via Sensitivity Analysis with Direct Kernel PLS. Mark J. Embrechts, Robert A. Bress, Robert H. Kewley. 447-462 [doi]
- Information Gain, Correlation and Support Vector Machines. Danny Roobaert, Grigoris I. Karakoulas, Nitesh V. Chawla. 463-470 [doi]
- Mining for Complex Models Comprising Feature Selection and Classification. Krzysztof Grabczewski, Norbert Jankowski. 471-488 [doi]
- Combining Information-Based Supervised and Unsupervised Feature Selection. Sang-Kyun Lee, Seung-Joon Yi, Byoung-Tak Zhang. 489-498 [doi]
- An Enhanced Selective Naïve Bayes Method with Optimal Discretization. Marc Boullé. 499-507 [doi]
- An Input Variable Importance Definition based on Empirical Data Probability Distribution. Vincent Lemaire, Fabrice Clérot. 509-516 [doi]
- Spectral Dimensionality Reduction. Yoshua Bengio, Olivier Delalleau, Nicolas Le Roux, Jean-François Paiement, Pascal Vincent, Marie Ouimet. 519-550 [doi]
- Constructing Orthogonal Latent Features for Arbitrary Loss. Michinari Momma, Kristin P. Bennett. 551-583 [doi]
- Large Margin Principles for Feature Selection. Ran Gilad-Bachrach, Amir Navot, Naftali Tishby. 585-606 [doi]
- Feature Extraction for Classification of Proteomic Mass Spectra: A Comparative Study. Ilya Levner, Vadim Bulitko, Guohui Lin. 607-624 [doi]
- Sequence Motifs: Highly Predictive Features of Protein Function. Asa Ben-Hur, Douglas L. Brutlag. 625-645 [doi]