Algorithmic Learning Theory: 14th International Conference, ALT 2003, Sapporo, Japan, October 17-19, 2003, Proceedings

Bibliographic record details
Corporate author: SpringerLink (Online service)
Other authors: Gavaldà, Ricard (Editor, http://id.loc.gov/vocabulary/relators/edt), Jantke, Klaus P. (Editor, http://id.loc.gov/vocabulary/relators/edt), Takimoto, Eiji (Editor, http://id.loc.gov/vocabulary/relators/edt)
Format: Electronic resource; e-book
Language: English
Published: Berlin, Heidelberg : Springer Berlin Heidelberg : Imprint: Springer, 2003.
Edition: 1st ed. 2003.
Series: Lecture Notes in Artificial Intelligence ; 2842
Available online: Full Text via HEAL-Link
Table of contents:
  • Invited Papers
  • Abduction and the Dualization Problem
  • Signal Extraction and Knowledge Discovery Based on Statistical Modeling
  • Association Computation for Information Access
  • Efficient Data Representations That Preserve Information
  • Can Learning in the Limit Be Done Efficiently?
  • Inductive Inference
  • Intrinsic Complexity of Uniform Learning
  • On Ordinal VC-Dimension and Some Notions of Complexity
  • Learning of Erasing Primitive Formal Systems from Positive Examples
  • Changing the Inference Type - Keeping the Hypothesis Space
  • Learning and Information Extraction
  • Robust Inference of Relevant Attributes
  • Efficient Learning of Ordered and Unordered Tree Patterns with Contractible Variables
  • Learning with Queries
  • On the Learnability of Erasing Pattern Languages in the Query Model
  • Learning of Finite Unions of Tree Patterns with Repeated Internal Structured Variables from Queries
  • Learning with Non-linear Optimization
  • Kernel Trick Embedded Gaussian Mixture Model
  • Efficiently Learning the Metric with Side-Information
  • Learning Continuous Latent Variable Models with Bregman Divergences
  • A Stochastic Gradient Descent Algorithm for Structural Risk Minimisation
  • Learning from Random Examples
  • On the Complexity of Training a Single Perceptron with Programmable Synaptic Delays
  • Learning a Subclass of Regular Patterns in Polynomial Time
  • Identification with Probability One of Stochastic Deterministic Linear Languages
  • Online Prediction
  • Criterion of Calibration for Transductive Confidence Machine with Limited Feedback
  • Well-Calibrated Predictions from Online Compression Models
  • Transductive Confidence Machine Is Universal
  • On the Existence and Convergence of Computable Universal Priors.