ESANN1998

6th European Symposium on Artificial Neural Networks
Bruges, Belgium, April 22-23-24, 1998


Content of the proceedings

WARNING: you need Adobe Acrobat Reader 7.0 or later to view the PDF files below



Special session: Radial basis networks


ES1998-305

What are the main factors involved in the design of a Radial Basis Function Network?

I. Rojas, M. Anguita, E. Ros, H. Pomares, O. Valenzuela, A. Prieto

Abstract
not yet available

Scanned document [PDF]

ES1998-301

A supervised radial basis function neural network

S. Wu, C. Van den Broeck

Abstract
not yet available

Scanned document [PDF]

ES1998-303

A comparison between weighted radial basis functions and wavelet networks

M. Sgarbi, V. Colla, L. Reyneri

Abstract
not yet available

Scanned document [PDF]

ES1998-304

An incremental local radial basis function network

A. Esposito, M. Marinaro, S. Scarpetta

Abstract
not yet available

Scanned document [PDF]

ES1998-306

A Tikhonov approach to calculate regularisation matrices

C. Angulo, A. Catala

Abstract
not yet available

Scanned document [PDF]



Models and architectures


ES1998-44

Analyses on the temporal patterns of spikes of auditory neurons by a neural network and tree-based models

T. Takahashi

Abstract
not yet available

Scanned document [PDF]

ES1998-11

Output jitter diverges to infinity, converges to zero or remains constant

J. Feng, B. Tirozzi, D. Brown

Abstract
not yet available

Scanned document [PDF]

ES1998-24

Wavelet interpolation networks

C. Bernard, S. Mallat, J.-J. Slotine

Abstract
We describe a new approach to real-time learning of unknown functions based on an interpolating wavelet estimation. We choose a subfamily of a wavelet basis relying on nested hierarchical allocation and update our estimate of the unknown function in real time. Such an interpolation process can be used for real-time applications like neural network adaptive control, where learning an unknown function very fast is critical.

Manuscript from author [PDF]
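
The hierarchical allocation idea can be pictured with a toy estimator. The sketch below (Python with NumPy) keeps piecewise-linear "hat" functions on nested dyadic grids, allocates the coarsest free node near each incoming sample, and absorbs the current prediction residual into that node's coefficient; the grid depth, the allocation rule and the residual update are illustrative assumptions, not the authors' algorithm.

import numpy as np

class InterpolatingWaveletSketch:
    """Toy online estimator on [0, 1] built from nested dyadic hat functions."""

    def __init__(self, max_level=6):
        self.max_level = max_level
        self.coeffs = {}                      # (level, node index) -> coefficient

    def _hat(self, x, level, k):
        # Piecewise-linear bump centred at k / 2**level, support width 2 / 2**level.
        centre, width = k / 2**level, 1.0 / 2**level
        return max(0.0, 1.0 - abs(x - centre) / width)

    def predict(self, x):
        return sum(c * self._hat(x, lvl, k) for (lvl, k), c in self.coeffs.items())

    def update(self, x, y):
        # Nested allocation: use the coarsest level whose nearest node is still free;
        # once every level is occupied near x, keep refining the finest one.
        for lvl in range(self.max_level + 1):
            k = int(round(x * 2**lvl))
            if (lvl, k) not in self.coeffs:
                break
        self.coeffs[(lvl, k)] = self.coeffs.get((lvl, k), 0.0) + (y - self.predict(x))

est = InterpolatingWaveletSketch()
f = lambda x: np.sin(2 * np.pi * x)
for x in np.random.default_rng(0).random(2000):
    est.update(x, f(x))
print(est.predict(0.3), f(0.3))               # the two values should be close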

ES1998-36

Generating arbitrary rhythmic patterns with purely inhibitory neural networks

Z. Yang, F. França

Abstract
not yet available

Scanned document [PDF]



Special session: Neural networks for control


ES1998-409

Brain-like intelligent control: from neural nets to larger-scale systems

P. Werbos

Abstract
not yet available

Scanned document [PDF]

ES1998-403

Control of a subsonic electropneumatic acoustic generator with dynamic recurrent neural networks

J.-P. Draye, L. Blondel, G. Cheron

Abstract
not yet available

Scanned document [PDF]

ES1998-406

Lazy learning for control design

G. Bontempi, M. Birattari, H. Bersini

Abstract
This paper presents two local methods for the control of discrete-time unknown nonlinear dynamical systems when only a limited amount of input-output data is available. The modeling procedure adopts lazy learning, a query-based approach to local modeling inspired by memory-based approximators. In the first method, the lazy technique returns the forward and inverse models of the system, which are used to compute the control action to take. The second is an indirect method inspired by adaptive control, where the self-tuning identification module is replaced by a lazy approximator. Simulation examples of the control of nonlinear systems starting from observed data are given.

Manuscript from author [PDF]
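
A rough sketch of the first (inverse-model) method as described in the abstract, built on a made-up scalar plant, a k-nearest-neighbour query and a local linear least-squares fit, all of which are assumptions for illustration rather than the authors' formulation:

import numpy as np

rng = np.random.default_rng(0)

def plant(x, u):
    # Hypothetical unknown scalar plant, used only to generate data.
    return 0.8 * x + 0.4 * np.tanh(u)

# Collect a limited amount of input-output data by exciting the plant randomly.
X, U, Xn = [], [], []
x = 0.0
for _ in range(400):
    u = rng.uniform(-2.0, 2.0)
    xn = plant(x, u)
    X.append(x)
    U.append(u)
    Xn.append(xn)
    x = xn
X, U, Xn = map(np.array, (X, U, Xn))

def lazy_inverse_control(x_now, x_ref, k=20):
    """Query-based local inverse model: fit u ~ [1, x, x_next] on the k stored
    transitions closest to the query (x_now, x_ref)."""
    d = np.hypot(X - x_now, Xn - x_ref)
    idx = np.argsort(d)[:k]
    A = np.column_stack([np.ones(k), X[idx], Xn[idx]])
    coef, *_ = np.linalg.lstsq(A, U[idx], rcond=None)
    return coef @ np.array([1.0, x_now, x_ref])

# Drive the plant toward a set-point with the lazy inverse model.
x, x_ref = 0.0, 0.5
for _ in range(10):
    x = plant(x, lazy_inverse_control(x, x_ref))
print("final state:", round(float(x), 3), "target:", x_ref)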

ES1998-402

Neural networks for the solution of information-distributed optimal control problems

M. Baglietto, T. Parisini, R. Zoppoli

Abstract
not yet available

Scanned document [PDF]

ES1998-404

Parsimonious learning feed-forward control

T. de Vries, L. Idema, W. Velthuis

Abstract
We introduce the Learning Feed-Forward Control configuration. This configuration contains a B-spline neural network, which suffers from the curse of dimensionality. We propose a method to avoid the occurrence of this problem.

Manuscript from author [PDF]

ES1998-408

Fast orienting movements to visual targets: neural field model of dynamic gaze control

A. Schierwagen, H. Werner

Abstract
not yet available

Scanned document [PDF]

ES1998-401

Improved generalization ability of neurocontrollers by imposing NLq stability constraints

J. Suykens, J. Vandewalle

Abstract
not yet available

Scanned document [PDF]

ES1998-31

A RNN based control architecture for generating periodic action sequences

T. Kolb, W. Ilg, J. Wille

Abstract
not yet available

Scanned document [PDF]



Learning I


ES1998-7

NAR time-series prediction: a Bayesian framework and an experiment

M. Crucianu, Z. Uhry, J.-P. Asselin de Beauville, R. Bone

Abstract
We extend the Bayesian framework to Multi-Layer Perceptron models of Non-linear Auto-Regressive time-series. The approach is evaluated on an artificial time-series and some common simplifications are discussed.

Manuscript from author [PDF]
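
The NAR set-up itself is easy to picture: a multilayer perceptron maps the p previous values to the next one. The sketch below uses scikit-learn and stands in for the Gaussian weight prior of the Bayesian framework with a plain L2 penalty (alpha); the series, the lag order and the network size are illustrative assumptions, and none of the paper's Bayesian machinery is reproduced.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
T, p = 600, 3

# Artificial nonlinear autoregressive series (purely illustrative).
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.9 * np.sin(3.0 * x[t - 1]) + 0.05 * rng.standard_normal()

# Lagged design matrix: predict x[t] from the p previous values.
X = np.column_stack([x[i:T - p + i] for i in range(p)])
y = x[p:]

# L2 penalty (alpha) as a crude stand-in for a Gaussian prior on the weights.
model = MLPRegressor(hidden_layer_sizes=(8,), alpha=1e-3,
                     max_iter=3000, random_state=0)
model.fit(X[:-100], y[:-100])
resid = model.predict(X[-100:]) - y[-100:]
print("held-out RMSE:", float(np.sqrt(np.mean(resid ** 2))))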

ES1998-8

Application of a neural net in classification and knowledge discovery

K. Schaedler, F. Wysotzki

Abstract
not yet available

Scanned document [PDF]

ES1998-14

Extending the CMAC model: adaptive input quantization

G. P. Klebus

Abstract
not yet available

Scanned document [PDF]

ES1998-17

One or two hidden layers perceptrons

M. Fernandez, C. Hernandez

Abstract
not yet available

Scanned document [PDF]

ES1998-18

On the error function of interval arithmetic backpropagation

M. Fernandez, C. Hernandez

Abstract
not yet available

Scanned document [PDF]

ES1998-302

A neural network for the identification of the dynamic behaviour of a wheelchair

L. Boquete, R. Barea, M. Mazo, I. Aranda

Abstract
not yet available

Scanned document [PDF]



Models and architectures


ES1998-9

What is observable in a class of neurodynamics?

J. Feng, D. Brown

Abstract
not yet available

Scanned document [PDF]

ES1998-12

Polyhedral mixture of linear experts for many-to-one mapping inversion

A. Karniel, R. Meir, G.F. Inbar

Abstract
Feed-forward control schemes require an inverse mapping of the controlled system. In adaptive systems, as well as in biological modeling, this inverse mapping is learned from examples. Biological motor control is highly redundant, as are many robotic systems, implying that the inverse problem is ill posed. In this work a new architecture and algorithm for learning multiple inverses are proposed: the polyhedral mixture of linear experts (PMLE). The PMLE keeps all the possible solutions available to the controller in real time. The PMLE is a modified mixture-of-experts architecture, where each expert is linear and more than a single expert may be assigned to the same input region. The learning is implemented by the hinging hyperplanes algorithm. The proposed architecture is described and its operation is illustrated for some simple cases.

Manuscript from author [PDF]
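
The hinging-hyperplanes building block mentioned in the abstract is easy to write down: a hinge is the maximum (or minimum) of two affine functions, fitted by alternating least squares over the two active regions. The sketch below fits a single hinge to toy one-dimensional data; the data, the single-hinge model and the initialisation are assumptions for illustration and do not reproduce the PMLE architecture.

import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(-2.0, 2.0, 300)
y = np.maximum(1.0 + 0.5 * x, -0.5 + 2.0 * x) + 0.05 * rng.standard_normal(300)

Z = np.column_stack([np.ones_like(x), x])      # affine features [1, x]
a = np.array([0.0, 1.0])                       # initial hyperplane a
b = np.array([0.0, -1.0])                      # initial hyperplane b

for _ in range(20):                            # alternate: assign points, refit planes
    on_a = Z @ a >= Z @ b                      # which plane is active where
    if on_a.any():
        a, *_ = np.linalg.lstsq(Z[on_a], y[on_a], rcond=None)
    if (~on_a).any():
        b, *_ = np.linalg.lstsq(Z[~on_a], y[~on_a], rcond=None)

print("plane a:", np.round(a, 2), "plane b:", np.round(b, 2))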

ES1998-4

On-off intermittency in small neural networks with synaptic noise

A. Krawiecki, R.A. Kosinski

Abstract
not yet available

Scanned document [PDF]



Special session: Self-organising maps for data analysis


ES1998-201

Recurrent SOM with local linear models in time series prediction

T. Koskela, M. Varsta, J. Heikkonen, K. Kaski

Abstract
The Recurrent Self-Organizing Map (RSOM) is studied in three different time-series prediction cases. RSOM is used to cluster the series into local data sets, for which corresponding local linear models are estimated. RSOM includes a recurrent difference vector in each unit, which allows storing context from past input vectors. A multilayer perceptron (MLP) network and an autoregressive (AR) model are used to compare the prediction results. In the studied cases, RSOM shows promising results.

Manuscript from author [PDF]
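
A rough sketch of the two-stage idea, with a drastically simplified RSOM (a leaky difference vector per unit, winner-take-all, no neighbourhood function) and one local linear model per unit; the series, the unit count, the leak factor and the learning rate are assumptions rather than the paper's settings.

import numpy as np

rng = np.random.default_rng(3)
T, p = 1000, 3
s = np.sin(0.3 * np.arange(T)) + 0.1 * rng.standard_normal(T)

# Lag vectors x_t = [s_{t-p}, ..., s_{t-1}] with target s_t.
X = np.column_stack([s[i:T - p + i] for i in range(p)])
y = s[p:]

# Simplified RSOM: each unit keeps a leaky difference vector as temporal context.
n_units, alpha, lr = 5, 0.3, 0.05
W = rng.standard_normal((n_units, p)) * 0.1
labels = np.zeros(len(X), dtype=int)
for epoch in range(5):
    Y = np.zeros((n_units, p))                  # recurrent difference vectors
    for t, xt in enumerate(X):
        Y = (1 - alpha) * Y + alpha * (xt - W)  # leaky integration of x - w_i
        win = int(np.argmin(np.linalg.norm(Y, axis=1)))
        W[win] += lr * Y[win]                   # move the winner along its context
        labels[t] = win

# One local linear model per unit, fitted on the windows that unit won.
models = {}
for u in range(n_units):
    m = labels == u
    if m.sum() > p + 1:
        Z = np.column_stack([np.ones(m.sum()), X[m]])
        models[u], *_ = np.linalg.lstsq(Z, y[m], rcond=None)

keep = [t for t in range(len(X)) if labels[t] in models]
pred = np.array([models[labels[t]] @ np.r_[1.0, X[t]] for t in keep])
print("in-sample RMSE:", float(np.sqrt(np.mean((pred - y[keep]) ** 2))))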

ES1998-202

Self-organization and convergence of the one-dimensional Kohonen algorithm

A. Sadeghi

Abstract
not yet available

Scanned document [PDF]

ES1998-203

Finding structure in text archives

A. Rauber, D. Merkl

Abstract
not yet available

Scanned document [PDF]

ES1998-204

Methods for interpreting a self-organized map in data analysis

S. Kaski, J. Nikkila, T. Kohonen

Abstract
not yet available

Scanned document [PDF]

ES1998-209

Magnification control in neural maps

T. Villmann, M. Herrmann

Abstract
not yet available

Scanned document [PDF]

ES1998-205

Self-organizing ANNs for planetary surface composition research

E. Merényi

Abstract
The mineralogic composition of planetary surfaces is often mapped from remotely sensed spectral images. Advanced hyperspectral sensors today provide more detailed and more voluminous measurements than traditional classification algorithms can efficiently exploit. ANNs, and specifically Self-Organizing Maps, have been used at the Lunar and Planetary Laboratory, University of Arizona, to address these challenges.

Manuscript from author [PDF]

ES1998-206

A new dynamic LVQ-based classifier and its application to handwritten character recognition

S. Bermejo, J. Cabestany, M. Payeras

Abstract
not yet available

Scanned document [PDF]

ES1998-207

The self-organising map, robustness, self-organising criticality and power laws

J.A. Flanagan

Abstract
not yet available

Scanned document [PDF]

ES1998-208

Invariant feature maps for analysis of orientations in image data

S. McGlinchey, C. Fyfe

Abstract
not yet available

Scanned document [PDF]

ES1998-210

Forecasting time-series by Kohonen classification

A. Lendasse, M. Verleysen, E. de Bodt, M. Cottrell, P. Grégoire

Abstract
In this paper, we propose a generic non-linear approach to time-series forecasting. The main feature of this approach is the use of simple statistical forecasting in small regions of an input space that is adequately chosen and quantized. The partition of the space is achieved by the Kohonen algorithm. The method is then applied to a widely known time series from the Santa Fe competition, and the results are compared with the best ones published for this series.

Manuscript from author [PDF]
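
The flavour of the method can be sketched with made-up data: a small one-dimensional Kohonen map quantises the lag vectors, and the forecast for each region is simply the average next value observed there. The map size, the learning schedule and the series below are assumptions for illustration, not the settings used in the paper.

import numpy as np

rng = np.random.default_rng(4)
T, p, n_nodes = 1200, 4, 10
s = np.sin(0.2 * np.arange(T)) + 0.1 * rng.standard_normal(T)
X = np.column_stack([s[i:T - p + i] for i in range(p)])   # lag vectors
y = s[p:]                                                  # value to forecast

# Train a one-dimensional Kohonen map on the lag vectors.
W = rng.standard_normal((n_nodes, p)) * 0.1
for it in range(5000):
    lr = 0.3 * (1 - it / 5000)
    sigma = 2.0 * (1 - it / 5000) + 0.1
    x = X[rng.integers(len(X))]
    win = np.argmin(np.linalg.norm(W - x, axis=1))
    h = np.exp(-((np.arange(n_nodes) - win) ** 2) / (2 * sigma ** 2))
    W += lr * h[:, None] * (x - W)

# "Simple statistical forecasting" per region: the mean of y over that region.
wins = np.argmin(np.linalg.norm(X[:, None, :] - W[None], axis=2), axis=1)
region_mean = np.array([y[wins == k].mean() if (wins == k).any() else 0.0
                        for k in range(n_nodes)])
print("forecast RMSE:", float(np.sqrt(np.mean((region_mean[wins] - y) ** 2))))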



Special session: ANN for speech processing


ES1998-456

Introduction to speech recognition using neural networks

C. Wellekens

Abstract
As an introduction to a session dedicated to neural networks in speech processing, this paper describes the basic problems faced in automatic speech recognition (ASR). Representation of speech, classification problems, speech unit models, training procedures and criteria are discussed. Why and how neural networks lead to challenging results in ASR is explained.

Manuscript from author [PDF]

ES1998-453

Self-organization in mixture densities of HMM based speech recognition

M. Kurimo

Abstract
not yet available

Scanned document [PDF]

ES1998-452

Speech recognition with a new hybrid architecture combining neural networks and continuous HMM

D. Willett, G. Rigoll

Abstract
not yet available

Scanned document [PDF]

ES1998-454

Hierarchies of neural networks for connectionist speech recognition

J. Fritsch, A. Waibel

Abstract
not yet available

Scanned document [PDF]



Learning II


ES1998-1

Training a sigmoidal network is difficult

B. Hammer

Abstract
In this paper we show that the loading problem for a 3-node architecture with sigmoidal activation is NP-hard if the input dimension varies, if the classification is performed with a certain accuracy, and if the output weights are restricted.

Manuscript from author [PDF]

ES1998-6

Weight saliency regularization in augmented networks

P. Edwards, A. Murray

Abstract
not yet available

Scanned document [PDF]

ES1998-16

To stop learning using the evidence

D. Perrotta

Abstract
not yet available

Scanned document [PDF]

ES1998-21

Selecting among candidate basis functions by crosscorrelations

A. Poncet, A. Deiss, S. Holles

Abstract
not yet available

Scanned document [PDF]

ES1998-29

A multistage on-line learning rule for multilayer neural network

P. Thomas, G. Bloch, C. Humbert

Abstract
not yet available

Scanned document [PDF]

ES1998-37

Parameter-estimation-based learning for feedforward neural networks: convergence and robustness analysis

A. Alessandri, M. Maggiore, M. Sanguineti

Abstract
not yet available

Scanned document [PDF]



Special session: Cellular neural networks


ES1998-354

The CNN computer - a tutorial

T. Roska

Abstract
not yet available

Scanned document [PDF]

ES1998-352

On the robust design of uncoupled CNNs

B. Mirzai, D. Lim, G.S. Moschytz

Abstract
not yet available

Scanned document [PDF]

ES1998-355

To stop learning using the evidence

Y. Moreau, J. Vandewalle

Abstract
not yet available

Scanned document [PDF]

ES1998-351

Cellular neural networks: from chaos generation to complexity modelling

P. Arena, L. Fortuna

Abstract
not yet available

Scanned document [PDF]

ES1998-353

Ultrasound medical image processing using cellular neural networks

I. Aizenberg, N. Aizenberg, E. Gotko, J. Vandewalle

Abstract
not yet available

Scanned document [PDF]



Statistical methods


ES1998-28

Separation of sources in a class of post-nonlinear mixtures

C.G. Puntonet, M.R. Alvarez, A. Prieto, B. Prieto

Abstract
not yet available

Scanned document [PDF]

ES1998-50

Improving neural network estimation in presence of non i.i.d. noise

S. Hosseini, C. Jutten

Abstract
not yet available

Scanned document [PDF]



Learning and models


ES1998-19

A self-organising neural network for modelling cortical development

M. Spratling, G. Hayes

Abstract
This paper presents a novel self-organising neural network. It has been developed for use as a simplified model of cortical development. Unlike many other models of topological map formation, all synaptic weights start at zero strength (so that synaptogenesis might be modelled). In addition, the algorithm works with the same format of encoding for both inputs to and outputs from the network (so that the transfer and recoding of information between cortical regions might be modelled).

Manuscript from author [PDF]

ES1998-20

Learning sensory-motor cortical mappings without training

M. Spratling, G. Hayes

Abstract
This paper shows how the relationship between two arrays of artificial neurons, representing different cortical regions, can be learned. The algorithm enables each neural network to self-organise into a topological map of the domain it represents at the same time as the relationship between these maps is found. Unlike previous methods, learning is achieved without a separate training phase; the algorithm which learns the mapping is also the one that performs the mapping.

Manuscript from author [PDF]

ES1998-32

Perception and action selection by anticipation of sensorimotor consequences

T. Seiler, V. Stephan, H.-M. Gross

Abstract
not yet available

Scanned document [PDF]

ES1998-33

Neural networks for financial forecast

G. Rotundo, B. Tirozzi, M. Valente

Abstract
not yet available

Scanned document [PDF]

ES1998-39

A neural approach to a sensor fusion problem

V. Colla, M. Sgarbi, L.M. Reyneri, A.M. Sabatini

Abstract
not yet available

Scanned document [PDF]

ES1998-46

Canonical correlation analysis using artificial neural networks

Pei Ling Lai, C. Fyfe

Abstract
not yet available

Scanned document [PDF]



Special session: ANN for the processing of facial information


ES1998-254

ANN for facial information processing: a review of recent approaches

R. Raducanu, M. Grana, A. D'Anjou, F.X. Albizuri

Abstract
not yet available

Scanned document [PDF]

ES1998-251

Hybrid Hidden Markov model / neural network models for speechreading

A. Rogozan, P. Deleglise

Abstract
This paper describes a new approach to visual speech recognition (also called speechreading) using hybrid HMM/NN models. First, we use the Self-Organising Map (SOM) to merge phonemes that appear visually similar into visemes. Then we develop a hybrid speechreading system with two communicating components, HMM and NN, to take advantage of the qualities of both. The first component is a classical continuous HMM, while the second one is a Time Delay Neural Network (TDNN) or a Jordan partially recurrent Neural Network (JNN). At the beginning of the recognition process, the HMM component segments and labels the visual data. In the case of visemes which are often confused by the HMM but rarely by the NN, we use the NN component to label the corresponding boundaries. For the other visemes, the final response is given by the HMM component. Finally, we evaluate the hybrid system on a continuous spelling task and show that it outperforms both an HMM system and an NN one.

Scanned document [PDF]
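
The decision rule described in the abstract can be summed up in a few lines: keep the HMM label for a segment unless it belongs to a set of visemes the HMM tends to confuse, in which case the NN label for that segment is taken instead. The viseme names and the confusable set below are invented purely for illustration.

# Toy sketch of the HMM/NN fusion rule (hypothetical viseme labels).
CONFUSABLE = {"p/b/m", "f/v"}          # visemes the HMM is assumed to confuse

def fuse(hmm_labels, nn_labels):
    """Per-segment decision: NN label for confusable visemes, HMM label otherwise."""
    return [nn if hmm in CONFUSABLE else hmm
            for hmm, nn in zip(hmm_labels, nn_labels)]

print(fuse(["a/e", "p/b/m", "f/v", "o"], ["a/e", "t/d/n", "f/v", "o"]))
# -> ['a/e', 't/d/n', 'f/v', 'o']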

ES1998-252

Face recognition: pre-processing techniques for linear autoassociators

E. Drege, F. Yang, M. Paindavoine, H. Abdi

Abstract
not yet available

Scanned document [PDF]

ES1998-253

Facial image retrieval using sequential classifiers

S. Gutta, H. Wechsler

Abstract
not yet available

Scanned document [PDF]

ES1998-40

Grouping complex face parts by nonlinear oscillations

S. Oka, M. Kitabata, Y. Ajioka, Y. Takefuji

Abstract
not yet available

Scanned document [PDF]



Optimization and associative networks


ES1998-13

On a Hopfield net arising in the modelling and control of over-saturated signalized intersections

F. Maghrebi

Abstract
not yet available

Scanned document [PDF]

ES1998-48

Construction of an interactive and competitive artificial neural network for the solution of path planning problems

E. Mulder, H.A.K. Mastebroek

Abstract
not yet available

Scanned document [PDF]

ES1998-38

Learning associative mappings from few examples

J.A. Walter

Abstract
not yet available

Scanned document [PDF]
