- Path: bloom-beacon.mit.edu!hookup!news.moneng.mei.com!howland.reston.ans.net!xlink.net!ira.uka.de!prechelt
- From: prechelt@ira.uka.de (Lutz Prechelt)
- Newsgroups: comp.ai.neural-nets,comp.answers,news.answers
- Subject: FAQ in comp.ai.neural-nets -- monthly posting
- Supersedes: <nn.posting_762405482@i41s14.ira.uka.de>
- Followup-To: comp.ai.neural-nets
- Date: 28 Mar 1994 02:17:47 GMT
- Organization: University of Karlsruhe, Germany
- Lines: 2420
- Approved: news-answers-request@MIT.Edu
- Expires: 2 May 1994 02:18:03 GMT
- Message-ID: <nn.posting_764821083@i41s14.ira.uka.de>
- Reply-To: prechelt@ira.uka.de (Lutz Prechelt)
- NNTP-Posting-Host: i41s25.ira.uka.de
- Keywords: questions, answers, terminology, bibliography
- Originator: prechelt@i41s25
- Xref: bloom-beacon.mit.edu comp.ai.neural-nets:7859 comp.answers:4333 news.answers:16911
-
- Archive-name: neural-net-faq
- Last-modified: 94/03/21
-
- (FAQ means "Frequently Asked Questions")
-
- ------------------------------------------------------------------------
- Anybody who is willing to contribute any question or
- information, please email me; if it is relevant,
- I will incorporate it. But: PLEASE format your contribution
- appropriately so that I can just drop it in.
-
- The monthly posting goes out on the 28th of every month.
- ------------------------------------------------------------------------
-
- This is a monthly posting to the Usenet newsgroup comp.ai.neural-nets
- (and comp.answers, where it should be findable at ANY time).
- Its purpose is to provide basic information for individuals who are
- new to the field of neural networks or are just beginning to read this
- group. It is meant to help avoid lengthy discussions of questions that
- usually arise for beginners of one kind or the other.
-
- >>>>> SO, PLEASE, SEARCH THIS POSTING FIRST IF YOU HAVE A QUESTION <<<<<
- and
- >>>>> DON'T POST ANSWERS TO FAQs: POINT THE ASKER TO THIS POSTING <<<<<
-
- This posting is archived in the periodic posting archive on
- host rtfm.mit.edu (and on some other hosts as well).
- Look in the anonymous ftp directory "/pub/usenet/news.answers",
- the filename is as given in the 'Archive-name:' header above.
- If you do not have anonymous ftp access, you can access the archives
- by mail server as well. Send an E-mail message to
- mail-server@rtfm.mit.edu with "help" and "index" in the body on
- separate lines for more information.
-
- For those of you who read this posting anywhere other than in
- comp.ai.neural-nets: To read comp.ai.neural-nets (or post articles to it)
- you need Usenet News access. Try the commands 'xrn', 'rn', 'nn', or 'trn'
- on your Unix machine, 'news' on your VMS machine, or ask a local guru.
-
- The monthly posting is not meant to discuss any topic exhaustively.
-
- Disclaimer: This posting is provided 'as is'.
- No warranty whatsoever is expressed or implied,
- in particular, no warranty that the information contained herein
- is correct or useful in any way, although both are intended.
-
- >> To find the answer to question number <x> (if present at all), search
- >> for the string "-A<x>.)" (so the answer to question 12 is at "-A12.)")
-
- And now, in the end, we begin:
-
- ============================== Questions ==============================
-
- (the short forms and non-continuous numbering are intended)
- 1.) What is this newsgroup for ? How shall it be used ?
- 2.) What is a neural network (NN) ?
- 3.) What can you do with a Neural Network and what not ?
- 4.) Who is concerned with Neural Networks ?
-
- 6.) What does 'backprop' mean ?
- 7.) How many learning methods for NNs exist ? Which ?
- 8.) What about Genetic Algorithms ?
- 9.) What about Fuzzy Logic ?
-
- 10.) Good introductory literature about Neural Networks ?
- 11.) Any journals and magazines about Neural Networks ?
- 12.) The most important conferences concerned with Neural Networks ?
- 13.) Neural Network Associations ?
- 14.) Other sources of information about NNs ?
-
- 15.) Freely available software packages for NN simulation ?
- 16.) Commercial software packages for NN simulation ?
- 17.) Neural Network hardware ?
-
- 19.) Databases for experimentation with NNs ?
-
- ============================== Answers ==============================
-
- ------------------------------------------------------------------------
-
- -A1.) What is this newsgroup for ?
-
- The newsgroup comp.ai.neural-nets is intended as a forum for people who want
- to use or explore the capabilities of Artificial Neural Networks or
- Neural-Network-like structures.
-
- There should be the following types of articles in this newsgroup:
-
- 1. Requests
-
- Requests are articles of the form
- "I am looking for X"
- where X is something public like a book, an article, a piece of software.
- The most important thing about such a request is to be as specific as possible!
-
- If multiple different answers can be expected, the person making the
- request should be prepared to summarize the answers he/she gets
- and should announce that intention with a phrase like
- "Please reply by email, I'll summarize to the group"
- at the end of the posting.
-
- The Subject line of the posting should then be something like
- "Request: X"
-
- 2. Questions
-
- As opposed to requests, questions ask for a larger piece of information or
- a more or less detailed explanation of something.
- To avoid lots of redundant traffic it is important that the poster
- provides with the question all information s/he already has about the
- subject and states the actual question as precisely and narrowly as
- possible.
- The poster should be prepared to summarize the answers s/he gets
- and should announce that intention with a phrase like
- "Please reply by email, I'll summarize to the group"
- at the end of the posting.
-
- The Subject line of the posting should be something like
- "Question: this-and-that"
- or have the form of a question (i.e., end with a question mark)
-
- 3. Answers
-
- These are reactions to questions or requests.
- As a rule of thumb articles of type "answer" should be rare.
- Ideally, in most cases either the answer is too specific to be of general
- interest (and should thus be e-mailed to the poster) or a summary
- was announced with the question or request (and answers should
- thus be e-mailed to the poster).
-
- The subject lines of answers are automatically adjusted by the
- news software.
- Note that sometimes longer threads of discussion evolve from an answer
- to a question or request. In this case posters should change the
- subject line suitably as soon as the topic drifts too far from the
- one announced in the original subject line. You can still carry along
- the old subject in parentheses in the form
- "Subject: <...new subject...> (was: <...old subject...>)"
-
- 4. Summaries
-
- Whenever the answers to a request or question can be assumed
- to be of some general interest, the poster of the request or question
- should summarize the answers he/she received.
- Such a summary should be announced in the original posting of the question
- or request with a phrase like
- "Please answer by email, I'll summarize"
-
- In such a case, people who answer a question should NOT post their
- answer to the newsgroup but instead mail it to the poster of the question,
- who collects and reviews the answers.
- About 5 to 20 days after the original posting, its poster should
- compile the summary of answers and post it to the newsgroup.
-
- Some care should be invested in a summary:
- a) simple concatenation of all the answers is not enough:
- instead, redundancies, irrelevancies, verbosities, and errors
- should be filtered out (as well as possible)
- b) the answers should be separated clearly
- c) the contributors of the individual answers should be identifiable
- (unless they requested to remain anonymous [yes, that happens])
- d) the summary should start with the "quintessence" of the answers,
- as seen by the original poster
- e) A summary should, when posted, clearly be indicated to be one
- by giving it a Subject line starting with "SUMMARY:"
-
- Note that a good summary is pure gold for the rest of the newsgroup
- community, so summary work will be most appreciated by all of us.
- (Good summaries are more valuable than any moderator ! :-> )
-
- 5. Announcements
-
- Some articles never need any public reaction.
- These are called announcements (for instance for a workshop,
- conference or the availability of some technical report or
- software system).
-
- Announcements should be clearly indicated to be such by giving
- them a subject line of the form
- "Announcement: this-and-that"
-
- 6. Reports
-
- Sometimes people spontaneously want to report something to the
- newsgroup. This might be special experiences with some software,
- results of one's own experiments or conceptual work, or especially
- interesting information from somewhere else.
-
- Reports should be clearly indicated to be such by giving
- them a subject line of the form
- "Report: this-and-that"
-
- 7. Discussions
-
- An especially valuable feature of Usenet is of course the possibility of
- discussing a certain topic with hundreds of potential participants.
- All traffic in the newsgroup that cannot be subsumed under one of
- the above categories should belong to a discussion.
-
- If somebody explicitly wants to start a discussion, he/she can do so
- by giving the posting a subject line of the form
- "Subject: Discussion: this-and-that"
-
- It is quite difficult to keep a discussion from drifting into chaos,
- but, unfortunately, as many other newsgroups show, there seems
- to be no sure way to avoid this.
- On the other hand, comp.ai.neural-nets has not had many problems
- with this effect in the past, so let's just go and hope... :->
-
- ------------------------------------------------------------------------
-
- -A2.) What is a neural network (NN) ?
-
- [anybody there to write something better?
- buzzwords: artificial vs. natural/biological; units and
- connections; value passing; inputs and outputs; storage in structure
- and weights; only local information; highly parallel operation ]
-
- First of all, when we are talking about a neural network, we *should*
- really say "artificial neural network" (ANN), because that is
- what we mean most of the time. Biological neural networks are much
- more complicated in their elementary structures than the mathematical
- models we use for ANNs.
-
- A vague description is as follows:
-
- An ANN is a network of many very simple processors ("units"), each
- possibly having a (small amount of) local memory. The units are
- connected by unidirectional communication channels ("connections"),
- which carry numeric (as opposed to symbolic) data. The units operate
- only on their local data and on the inputs they receive via the
- connections.
-
- The design motivation is what distinguishes neural networks from other
- mathematical techniques:
-
- A neural network is a processing device, either an algorithm, or actual
- hardware, whose design was motivated by the design and functioning of human
- brains and components thereof.
-
- Most neural networks have some sort of "training" rule
- whereby the weights of connections are adjusted on the basis of
- presented patterns.
- In other words, neural networks "learn" from examples,
- just like children learn to recognize dogs from examples of dogs,
- and exhibit some structural capability for generalization.
-
- Neural networks normally have great potential for parallelism, since
- the computations of the components are independent of each other.
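- 
- As a concrete (if extremely simplified) illustration, here is a sketch
- of a single unit, written in Python; the sigmoid activation and all the
- numbers are illustrative choices, not part of any particular model:
- 
-   import math
- 
-   def unit_output(inputs, weights, bias):
-       """Weighted sum of the incoming values, squashed by a sigmoid."""
-       net = sum(w * x for w, x in zip(weights, inputs)) + bias
-       return 1.0 / (1.0 + math.exp(-net))   # output in (0, 1)
- 
-   # a unit with two incoming connections; the weights are its local memory
-   print(unit_output([0.5, -1.0], [0.8, 0.3], bias=0.1))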
-
- ------------------------------------------------------------------------
-
- -A3.) What can you do with a Neural Network and what not ?
-
- [preliminary]
-
- In principle, NNs can compute any computable function, i.e. they can
- do everything a normal digital computer can do.
- In particular, anything that can be represented as a mapping between
- vector spaces can be approximated to arbitrary precision by feedforward
- NNs (which are the most often used type).
-
- In practice, NNs are especially useful for mapping problems
- which are tolerant of a high error rate, have lots of example data
- available, but to which hard and fast rules can not easily be applied.
- NNs are, at least today, difficult to apply successfully to problems
- that concern manipulation of symbols and memory.
-
- ------------------------------------------------------------------------
-
- -A4.) Who is concerned with Neural Networks ?
-
- Neural Networks are interesting for quite a lot of very dissimilar people:
-
- - Computer scientists want to find out about the properties of
- non-symbolic information processing with neural nets and about learning
- systems in general.
- - Engineers of many kinds want to exploit the capabilities of
- neural networks in many areas (e.g. signal processing) to solve
- their application problems.
- - Cognitive scientists view neural networks as a possible apparatus to
- describe models of thinking and consciousness (high-level brain function).
- - Neuro-physiologists use neural networks to describe and explore
- medium-level brain function (e.g. memory, sensory systems, motor control).
- - Physicists use neural networks to model phenomena in statistical
- mechanics and for a lot of other tasks.
- - Biologists use Neural Networks to interpret nucleotide sequences.
- - Philosophers and some other people may also be interested in
- Neural Networks for various reasons.
-
- ------------------------------------------------------------------------
-
- -A6.) What does 'backprop' mean ?
-
- [anybody to write something similarly short,
- but easier to understand for a beginner ? ]
-
- It is an abbreviation for 'backpropagation of error', which is the
- most widely used learning method for neural networks today.
- Although it has many disadvantages, which could be summarized in the
- sentence
- "You hardly ever know what you are actually doing
- when using backpropagation" :-)
- it has been quite successful in practical applications and is
- relatively easy to apply.
-
- It is used for training layered (i.e., nodes are grouped
- in layers) feedforward (i.e., the arcs joining nodes are
- unidirectional, and there are no cycles) nets.
-
- Back-propagation needs a teacher that knows the correct output for any
- input ("supervised learning") and uses gradient descent on the error
- (as provided by the teacher) to train the weights. The activation
- function is (usually) a sigmoidal (i.e., bounded above and below, but
- differentiable) function of a weighted sum of the node's inputs.
-
- The use of a gradient descent algorithm to train the weights makes
- training slow; but since the trained net is a feedforward one, it is
- quite fast during the recall phase.
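- 
- To make this concrete, here is a toy sketch of vanilla backpropagation
- (in Python, which is just one convenient choice; the network size,
- learning rate, and epoch count are illustrative, not prescriptive)
- training a tiny 2-2-1 feedforward net on the XOR problem:
- 
-   import math, random
- 
-   def sig(x): return 1.0 / (1.0 + math.exp(-x))     # sigmoidal activation
- 
-   random.seed(1)
-   # w1[i] = [weight from x0, weight from x1, bias] for hidden unit i
-   w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
-   w2 = [random.uniform(-1, 1) for _ in range(3)]    # hidden->output + bias
- 
-   data = [([0,0],0), ([0,1],1), ([1,0],1), ([1,1],0)]   # XOR
-   lr = 0.5
-   for epoch in range(20000):
-       for x, t in data:
-           # forward pass
-           h = [sig(n[0]*x[0] + n[1]*x[1] + n[2]) for n in w1]
-           y = sig(w2[0]*h[0] + w2[1]*h[1] + w2[2])
-           # backward pass: delta = error signal * sigmoid derivative
-           dy = (t - y) * y * (1 - y)
-           dh = [dy * w2[i] * h[i] * (1 - h[i]) for i in range(2)]
-           # gradient descent on the weights, as provided by the "teacher" t
-           for i in range(2):
-               w2[i] += lr * dy * h[i]
-               for j in range(2):
-                   w1[i][j] += lr * dh[i] * x[j]
-               w1[i][2] += lr * dh[i]
-           w2[2] += lr * dy
- 
-   for x, t in data:
-       h = [sig(n[0]*x[0] + n[1]*x[1] + n[2]) for n in w1]
-       print(x, "->", round(sig(w2[0]*h[0] + w2[1]*h[1] + w2[2]), 2),
-             "(target:", t, ")")
- 
- (Depending on the random initialization, such a small net can get stuck
- in a local minimum; a different seed or more epochs may be needed.)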
-
- Literature:
- Rumelhart, D. E. and McClelland, J. L. (1986):
- Parallel Distributed Processing: Explorations in the
- Microstructure of Cognition (volume 1, pp 318-362).
- The MIT Press.
- (this is the classic one) or one of the dozens of other books
- or articles on backpropagation :->
-
- ------------------------------------------------------------------------
-
- -A7.) How many learning methods for NNs exist ? Which ?
-
- There are many, many learning methods for NNs by now. Nobody knows
- exactly how many.
- New ones (at least variations of existing ones) are invented every
- week. Below is a collection of some of the best-known methods;
- it does not claim to be complete.
-
- The main categorization of these methods is the distinction between
- supervised and unsupervised learning:
-
- - In supervised learning, there is a "teacher" who in the learning
- phase "tells" the net how well it performs ("reinforcement learning")
- or what the correct behavior would have been ("fully supervised learning").
-
- - In unsupervised learning the net is autonomous: it just looks at
- the data it is presented with, finds out about some of the
- properties of the data set and learns to reflect these properties
- in its output. Exactly which properties the network can learn to
- recognise depends on the particular network model and learning
- method. (A toy example of unsupervised learning follows the list below.)
-
- Many of these learning methods are closely connected with a certain
- (class of) network topology.
-
- Now here is the list, just giving some names:
-
- 1. UNSUPERVISED LEARNING (i.e. without a "teacher"):
- 1). Feedback Nets:
- a). Additive Grossberg (AG)
- b). Shunting Grossberg (SG)
- c). Binary Adaptive Resonance Theory (ART1)
- d). Analog Adaptive Resonance Theory (ART2, ART2a)
- e). Discrete Hopfield (DH)
- f). Continuous Hopfield (CH)
- g). Discrete Bidirectional Associative Memory (BAM)
- h). Temporal Associative Memory (TAM)
- i). Adaptive Bidirectional Associative Memory (ABAM)
- j). Kohonen Self-organizing Map/Topology-preserving map (SOM/TPM)
- k). Competitive learning
- 2). Feedforward-only Nets:
- a). Learning Matrix (LM)
- b). Driver-Reinforcement Learning (DR)
- c). Linear Associative Memory (LAM)
- d). Optimal Linear Associative Memory (OLAM)
- e). Sparse Distributed Associative Memory (SDM)
- f). Fuzzy Associative Memory (FAM)
- g). Counterpropagation (CPN)
-
- 2. SUPERVISED LEARNING (i.e. with a "teacher"):
- 1). Feedback Nets:
- a). Brain-State-in-a-Box (BSB)
- b). Fuzzy Cognitive Map (FCM)
- c). Boltzmann Machine (BM)
- d). Mean Field Annealing (MFT)
- e). Recurrent Cascade Correlation (RCC)
- f). Learning Vector Quantization (LVQ)
- g). Backpropagation through time (BPTT)
- h). Real-time recurrent learning (RTRL)
- i). Recurrent Extended Kalman Filter (EKF)
- 2). Feedforward-only Nets:
- a). Perceptron
- b). Adaline, Madaline
- c). Backpropagation (BP)
- d). Cauchy Machine (CM)
- e). Adaptive Heuristic Critic (AHC)
- f). Time Delay Neural Network (TDNN)
- g). Associative Reward Penalty (ARP)
- h). Avalanche Matched Filter (AMF)
- i). Backpercolation (Perc)
- j). Artmap
- k). Adaptive Logic Network (ALN)
- l). Cascade Correlation (CasCor)
- m). Extended Kalman Filter (EKF)
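- 
- As a toy illustration of the unsupervised side of this distinction, here
- is a sketch (in Python; the cluster data, unit count, and learning rate
- are purely illustrative) of simple competitive learning, item 1.k) above:
- the "winning" unit is pulled a small step toward each presented input,
- so the units discover clusters in the data without any teacher:
- 
-   import random
- 
-   random.seed(0)
-   data = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8), (0.8, 0.9)]  # two clusters
-   units = [[random.random(), random.random()] for _ in range(2)]
- 
-   def dist2(u, x):   # squared Euclidean distance
-       return sum((ui - xi) ** 2 for ui, xi in zip(u, x))
- 
-   lr = 0.1
-   for _ in range(200):
-       x = random.choice(data)                          # present a pattern
-       winner = min(units, key=lambda u: dist2(u, x))   # competition
-       for i in range(2):                               # move the winner only
-           winner[i] += lr * (x[i] - winner[i])
- 
-   print(units)   # each unit should end up near one cluster centre
- 
- (With an unlucky initialization one unit may capture both clusters,
- the classic "dead unit" problem of winner-take-all learning.)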
-
- ------------------------------------------------------------------------
-
- -A8.) What about Genetic Algorithms ?
-
- There are a number of definitions of GA (Genetic Algorithm).
- A possible one is
-
- A GA is an optimization program
- that starts with
- a population of encoded procedures, (Creation of Life :-> )
- mutates them stochastically, (Get cancer or so :-> )
- and uses a selection process (Darwinism)
- to prefer the mutants with high fitness
- and perhaps a recombination process (Make babies :-> )
- to combine properties of (preferably) the successful mutants.
-
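- Here is a toy sketch of that definition (in Python; the bit-string
- encoding, the "count the 1-bits" fitness function, and all parameters
- are purely illustrative assumptions, not part of any standard GA):
- 
-   import random
- 
-   random.seed(0)
-   LEN, POP, GENS = 20, 30, 60
-   def fitness(s): return sum(s)            # toy fitness: number of 1-bits
- 
-   pop = [[random.randint(0, 1) for _ in range(LEN)] for _ in range(POP)]
-   for _ in range(GENS):
-       pop.sort(key=fitness, reverse=True)
-       survivors = pop[:POP // 2]           # selection (Darwinism)
-       children = []
-       while len(survivors) + len(children) < POP:
-           a, b = random.sample(survivors, 2)
-           cut = random.randrange(1, LEN)
-           child = a[:cut] + b[cut:]        # recombination (make babies)
-           if random.random() < 0.3:        # stochastic mutation
-               i = random.randrange(LEN)
-               child[i] ^= 1
-           children.append(child)
-       pop = survivors + children
- 
-   print("best fitness:", max(map(fitness, pop)), "out of", LEN)
- 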
- There is a newsgroup that is dedicated to the field of evolutionary
- computation called comp.ai.genetic.
- It has a detailed FAQ posting which, for instance, explains the terms
- "Genetic Algorithm", "Evolutionary Programming", "Evolution Strategy",
- "Classifier System", and "Genetic Programming".
- That FAQ also contains lots of pointers to relevant literature, software,
- other sources of information, et cetera et cetera.
- Please see the comp.ai.genetic FAQ for further information.
-
- ------------------------------------------------------------------------
-
- -A9.) What about Fuzzy Logic ?
-
- [preliminary]
- [Who will write an introduction?]
-
- Fuzzy Logic is an area of research based on the work of L.A. Zadeh.
- It is a departure from classical two-valued sets and logic that uses
- "soft" linguistic (e.g. large, hot, tall) system variables and a
- continuous range of truth values in the interval [0,1], rather than
- strict binary (True or False) decisions and assignments.
-
- Fuzzy logic is used where a system is difficult to model exactly (but
- an inexact model is available), is controlled by a human operator or
- expert, or where ambiguity or vagueness is common. A typical fuzzy
- system consists of a rule base, membership functions, and an inference
- procedure.
-
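- As a toy sketch of these ideas (in Python; the particular membership
- shapes, thresholds, and the min-for-AND inference rule below are common
- but illustrative choices, not the only possible ones):
- 
-   def tall(height_cm):    # membership in the fuzzy set "tall"
-       return max(0.0, min(1.0, (height_cm - 160) / 30.0))
- 
-   def heavy(weight_kg):   # membership in the fuzzy set "heavy"
-       return max(0.0, min(1.0, (weight_kg - 60) / 40.0))
- 
-   # One fuzzy rule: IF tall AND heavy THEN large-frame
-   # (AND realized as min, a common choice of inference procedure)
-   def large_frame(h, w):
-       return min(tall(h), heavy(w))
- 
-   print(large_frame(175, 80))   # 0.5 -- partially true, not just 0 or 1
- 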
- Most Fuzzy Logic discussion takes place in the newsgroup comp.ai.fuzzy,
- but there is also some work (and discussion) about combining fuzzy
- logic with Neural Network approaches in comp.ai.neural-nets.
-
- For more details see (for example):
-
- Klir, G.J. and Folger, T.A., Fuzzy Sets, Uncertainty, and
- Information, Prentice-Hall, Englewood
- Cliffs, N.J., 1988.
-
- Kosko, B., Neural Networks and Fuzzy Systems, Prentice Hall,
- Englewood Cliffs, NJ, 1992.
-
- ------------------------------------------------------------------------
-
- -A10.) Good introductory literature about Neural Networks ?
-
- 0.) The best (subjectively, of course -- please don't flame me):
-
- Hecht-Nielsen, R. (1990). Neurocomputing. Addison Wesley.
- Comments: "A good book", "comprises a nice historical overview and a chapter
- about NN hardware. Well structured prose. Makes important concepts clear."
-
- Hertz, J., Krogh, A., and Palmer, R. (1991). Introduction to the Theory of
- Neural Computation. Addison-Wesley: Redwood City, California.
- ISBN 0-201-50395-6 (hardbound) and 0-201-51560-1 (paperbound)
- Comments: "My first impression is that this one is by far the best book on
- the topic. And it's below $30 for the paperback."; "Well written, theoretical
- (but not overwhelming)"; "It provides a good balance of model development,
- computational algorithms, and applications. The mathematical derivations
- are especially well done"; "Nice mathematical analysis on the mechanism of
- different learning algorithms"; "It is NOT for the mathematical beginner.
- If you don't have a good grasp of higher level math, this book can
- be really tough to get through."
-
-
- 1.) Books for the beginner:
-
- Aleksander, I. and Morton, H. (1990). An Introduction to Neural Computing.
- Chapman and Hall. (ISBN 0-412-37780-2).
- Comments: "This book seems to be intended for the first year of university
- education."
-
- Beale, R. and Jackson, T. (1990). Neural Computing, an Introduction.
- Adam Hilger, IOP Publishing Ltd : Bristol. (ISBN 0-85274-262-2).
- Comments: "It's clearly written. Lots of hints as to how to get the
- adaptive models covered to work (not always well explained in the
- original sources). Consistent mathematical terminology. Covers
- perceptrons, error-backpropagation, Kohonen self-org model, Hopfield
- type models, ART, and associative memories."
-
- Dayhoff, J. E. (1990). Neural Network Architectures: An Introduction.
- Van Nostrand Reinhold: New York.
- Comments: "Like Wasserman's book, Dayhoff's book is also very easy to
- understand".
-
- Haykin, S. (1994). Neural Networks, a Comprehensive Foundation.
- Macmillan, New York, NY.
-
- McClelland, J. L. and Rumelhart, D. E. (1988).
- Explorations in Parallel Distributed Processing: Computational Models of
- Cognition and Perception (software manual). The MIT Press.
- Comments: "Written in a tutorial style, and includes 2 diskettes of NN
- simulation programs that can be compiled on MS-DOS or Unix (and they do
- too !)"; "The programs are pretty reasonable as an introduction to some
- of the things that NNs can do."; "There are *two* editions of this book.
- One comes with disks for the IBM PC, the other comes with disks for the
- Macintosh".
-
- McCord Nelson, M. and Illingworth, W.T. (1990). A Practical Guide to Neural
- Nets. Addison-Wesley Publishing Company, Inc. (ISBN 0-201-52376-0).
- Comments: "No formulas at all( ==> no good)"; "It does not have much
- detailed model development (very few equations), but it does present many
- areas of application. It includes a chapter on current areas of research.
- A variety of commercial applications is discussed in chapter 1. It also
- includes a program diskette with a fancy graphical interface (unlike the
- PDP diskette)".
-
- Muller, B. and Reinhardt, J. (1990). Neural Networks, An Introduction.
- Springer-Verlag: Berlin Heidelberg New York (ISBN: 3-540-52380-4 and
- 0-387-52380-4).
- Comments: The book was developed out of a course on neural-network
- models with computer demonstrations that was taught by the authors
- to Physics students. The book comes together with a PC-diskette.
- The book is divided into three parts:
- 1) Models of Neural Networks; describing several architectures
- and learning rules, including the mathematics.
- 2) Statistical Physics of Neural Networks; a "hard-core" physics
- section developing formal theories of stochastic neural networks.
- 3) Computer Codes; explanation about the demonstration programs.
- The first part gives a nice introduction to neural networks, together
- with the formulas. Combined with the demonstration programs, a 'feel'
- for neural networks can be developed.
-
- Orchard, G.A. & Phillips, W.A. (1991). Neural Computation: A
- Beginner's Guide. Lawrence Erlbaum Associates: London.
- Comments: "Short user-friendly introduction to the area, with a
- non-technical flavour. Apparently accompanies a software package, but I
- haven't seen that yet".
-
- Wasserman, P. D. (1989). Neural Computing: Theory & Practice.
- Van Nostrand Reinhold: New York. (ISBN 0-442-20743-3)
- Comments: "Wasserman flatly enumerates some common architectures from an
- engineer's perspective ('how it works') without ever addressing the underlying
- fundamentals ('why it works') - important basic concepts such as clustering,
- principal components or gradient descent are not treated. It's also full of
- errors, and unhelpful diagrams drawn with what appears to be PCB board layout
- software from the '70s. For anyone who wants to do active research in the
- field I consider it quite inadequate"; "Okay, but too shallow"; "Quite
- easy to understand";
- "The best bedtime reading for Neural Networks. I have given
- this book to numerous colleagues who want to know NN basics, but who never
- plan to implement anything. An excellent book to give your manager."
-
- Wasserman, P.D. (1993). Advanced Methods in Neural Computing.
- Van Nostrand Reinhold: New York (ISBN: 0-442-00461-3).
- Comments: Several neural network topics are discussed e.g.
- Probabilistic Neural Networks, Backpropagation and beyond,
- neural control, Radial Basis Function Networks,
- Neural Engineering. Furthermore, several subjects
- related to neural networks are mentioned e.g.
- genetic algorithms, fuzzy logic, chaos. Just the functionality
- of these subjects is described; enough to get you started.
- Lots of references are given to more elaborate descriptions.
- Easy to read, no extensive mathematical background necessary.
-
-
- 2.) The classics:
-
- Kohonen, T. (1984). Self-organization and Associative Memory. Springer-Verlag:
- New York. (2nd Edition: 1988; 3rd edition: 1989).
- Comments: "The section on Pattern mathematics is excellent."
-
- Rumelhart, D. E. and McClelland, J. L. (1986). Parallel Distributed
- Processing: Explorations in the Microstructure of Cognition (volumes 1 & 2).
- The MIT Press.
- Comments: "As a computer scientist I found the two Rumelhart and McClelland
- books really heavy going and definitely not the sort of thing to read if you
- are a beginner."; "It's quite readable, and affordable (about $65 for both
- volumes)."; "THE Connectionist bible.".
-
-
- 3.) Introductory journal articles:
-
- Hinton, G. E. (1989). Connectionist learning procedures.
- Artificial Intelligence, Vol. 40, pp. 185--234.
- Comments: "One of the better neural networks overview papers, although the
- distinction between network topology and learning algorithm is not always
- very clear. Could very well be used as an introduction to neural networks."
-
- Knight, K. (1990). Connectionist Ideas and Algorithms. Communications of
- the ACM. November 1990. Vol.33 nr.11, pp 59-74.
- Comments:"A good article, while it is for most people easy to find a copy of
- this journal."
-
- Kohonen, T. (1988). An Introduction to Neural Computing. Neural Networks,
- vol. 1, no. 1. pp. 3-16.
- Comments: "A general review".
-
-
- 4.) Not-quite-so-introductory literature:
-
- Anderson, J. A. and Rosenfeld, E. (Eds). (1988). Neurocomputing:
- Foundations of Research. The MIT Press: Cambridge, MA.
- Comments: "An expensive book, but excellent for reference. It is a
- collection of reprints of most of the major papers in the field.";
-
- Anderson, J. A., Pellionisz, A. and Rosenfeld, E. (Eds). (1990).
- Neurocomputing 2: Directions for Research. The MIT Press: Cambridge, MA.
- Comments: "The sequel to their well-known Neurocomputing book."
-
- Caudill, M. and Butler, C. (1990). Naturally Intelligent Systems.
- MIT Press: Cambridge, Massachusetts. (ISBN 0-262-03156-6).
- Comments: "I guess one of the best books I read"; "May not be suited for
- people who want to do some research in the area".
-
- Khanna, T. (1990). Foundations of Neural Networks. Addison-Wesley: New York.
- Comments: "Not so bad (with a page of erroneous formulas (if I remember
- well), and #hidden layers isn't well described)."; "Khanna's intention
- in writing his book with math analysis should be commended but he
- made several mistakes in the math part".
-
- Kung, S.Y. (1993). Digital Neural Networks, Prentice Hall,
- Englewood Cliffs, NJ.
-
- Levine, D. S. (1990). Introduction to Neural and Cognitive Modeling.
- Lawrence Erlbaum: Hillsdale, N.J.
- Comments: "Highly recommended".
-
- Lippmann, R. P. (April 1987). An introduction to computing with neural nets.
- IEEE Acoustics, Speech, and Signal Processing Magazine. vol. 2,
- no. 4, pp 4-22.
- Comments: "Much acclaimed as an overview of neural networks, but rather
- inaccurate on several points. The categorization into binary and continuous-
- valued input neural networks is rather arbitrary, and may be confusing for
- the inexperienced reader. Not all networks discussed are of equal importance."
-
- Maren, A., Harston, C. and Pap, R., (1990). Handbook of Neural Computing
- Applications. Academic Press. ISBN: 0-12-471260-6. (451 pages)
- Comments: "They cover a broad area"; "Introductory with suggested
- applications implementation".
-
- Pao, Y. H. (1989). Adaptive Pattern Recognition and Neural Networks
- Addison-Wesley Publishing Company, Inc. (ISBN 0-201-12584-6)
- Comments: "An excellent book that ties together classical approaches
- to pattern recognition with Neural Nets. Most other NN books do not
- even mention conventional approaches."
-
- Rumelhart, D. E., Hinton, G. E. and Williams, R. J. (1986). Learning
- representations by back-propagating errors. Nature, vol 323 (9 October),
- pp. 533-536.
- Comments: "Gives a very good potted explanation of backprop NN's. It gives
- sufficient detail to write your own NN simulation."
-
- Simpson, P. K. (1990). Artificial Neural Systems: Foundations, Paradigms,
- Applications and Implementations. Pergamon Press: New York.
- Comments: "Contains a very useful 37 page bibliography. A large number of
- paradigms are presented. On the negative side the book is very shallow.
- Best used as a complement to other books".
-
- Zeidenberg. M. (1990). Neural Networks in Artificial Intelligence.
- Ellis Horwood, Ltd., Chichester.
- Comments: "Gives the AI point of view".
-
- Zornetzer, S. F., Davis, J. L. and Lau, C. (1990). An Introduction to
- Neural and Electronic Networks. Academic Press. (ISBN 0-12-781881-2)
- Comments: "Covers quite a broad range of topics (collection of
- articles/papers )."; "Provides a primer-like introduction and overview for
- a broad audience, and employs a strong interdisciplinary emphasis".
-
- ------------------------------------------------------------------------
-
- -A11.) Any journals and magazines about Neural Networks ?
-
-
- [to be added: comments on speed of reviewing and publishing,
- whether they accept TeX format or ASCII by e-mail, etc.]
-
- A. Dedicated Neural Network Journals:
- =====================================
-
- Title: Neural Networks
- Publish: Pergamon Press
- Address: Pergamon Journals Inc., Fairview Park, Elmsford,
- New York 10523, USA and Pergamon Journals Ltd.
- Headington Hill Hall, Oxford OX3 0BW, England
- Freq.: 6 issues/year (vol. 1 in 1988)
- Cost/Yr: Free with INNS membership ($45?), Individual $65, Institution $175
- ISSN #: 0893-6080
- Remark: Official Journal of International Neural Network Society (INNS).
- Contains Original Contributions, Invited Review Articles, Letters
- to Editor, Invited Book Reviews, Editorials, Announcements and INNS
- News, Software Surveys. This is probably the most popular NN journal.
- (Note: Remarks supplied by Mike Plonski "plonski@aero.org")
- -------
- Title: Neural Computation
- Publish: MIT Press
- Address: MIT Press Journals, 55 Hayward Street Cambridge,
- MA 02142-9949, USA, Phone: (617) 253-2889
- Freq.: Quarterly (vol. 1 in 1989)
- Cost/Yr: Individual $45, Institution $90, Students $35; Add $9 Outside USA
- ISSN #: 0899-7667
- Remark: Combination of Reviews (10,000 words), Views (4,000 words)
- and Letters (2,000 words). I have found this journal to be of
- outstanding quality.
- (Note: Remarks supplied by Mike Plonski "plonski@aero.org")
- -----
- Title: IEEE Transaction on Neural Networks
- Publish: Institute of Electrical and Electronics Engineers (IEEE)
- Address: IEEE Service Center, 445 Hoes Lane, P.O. Box 1331, Piscataway, NJ,
- 08855-1331 USA. Tel: (201) 981-0060
- Cost/Yr: $10 for Members belonging to participating IEEE societies
- Freq.: Quarterly (vol. 1 in March 1990)
- Remark: Devoted to the science and technology of neural networks
- which disclose significant technical knowledge, exploratory
- developments and applications of neural networks from biology to
- software to hardware. Emphasis is on artificial neural networks.
- Specific aspects include self organizing systems, neurobiological
- connections, network dynamics and architecture, speech recognition,
- electronic and photonic implementation, robotics and controls.
- Includes Letters concerning new research results.
- (Note: Remarks are from journal announcement)
- -----
- Title: International Journal of Neural Systems
- Publish: World Scientific Publishing
- Address: USA: World Scientific Publishing Co., 687 Hartwell Street, Teaneck,
- NJ 07666. Tel: (201) 837-8858; Europe: World Scientific Publishing
- Co. Pte. Ltd., 73 Lynton Mead, Totteridge, London N20-8DH, England.
- Tel: (01) 4462461; Other: World Scientific Publishing Co. Pte. Ltd.,
- Farrer Road, P.O. Box 128, Singapore 9128. Tel: 2786188
- Freq.: Quarterly (Vol. 1 in 1990?)
- Cost/Yr: Individual $42, Institution $88 (plus $9-$17 for postage)
- ISSN #: 0129-0657 (IJNS)
- Remark: The International Journal of Neural Systems is a quarterly journal
- which covers information processing in natural and artificial neural
- systems. It publishes original contributions on all aspects of this
- broad subject which involves physics, biology, psychology, computer
- science and engineering. Contributions include research papers,
- reviews and short communications. The journal presents a fresh
- undogmatic attitude towards this multidisciplinary field with the
- aim to be a forum for novel ideas and improved understanding of
- collective and cooperative phenomena with computational capabilities.
- (Note: Remarks supplied by B. Lautrup (editor),
- "LAUTRUP%nbivax.nbi.dk@CUNYVM.CUNY.EDU" )
- Review is reported to be very slow.
- ------
- Title: Neural Network News
- Publish: AIWeek Inc.
- Address: Neural Network News, 2555 Cumberland Parkway, Suite 299, Atlanta, GA
- 30339 USA. Tel: (404) 434-2187
- Freq.: Monthly (beginning September 1989)
- Cost/Yr: USA and Canada $249, Elsewhere $299
- Remark: Commercial Newsletter
- ------
- Title: Network: Computation in Neural Systems
- Publish: IOP Publishing Ltd
- Address: Europe: IOP Publishing Ltd, Techno House, Redcliffe Way, Bristol
- BS1 6NX, UK; IN USA: American Institute of Physics, Subscriber
- Services 500 Sunnyside Blvd., Woodbury, NY 11797-2999
- Freq.: Quarterly (1st issue 1990)
- Cost/Yr: USA: $180, Europe: 110 pounds
- Remark: Description: "a forum for integrating theoretical and experimental
- findings across relevant interdisciplinary boundaries." Contents:
- submitted articles are reviewed by two technical referees for
- interdisciplinary format and accessibility. Also Viewpoints and
- Reviews commissioned by the editors, abstracts (with reviews) of
- articles published in other journals, and book reviews.
- Comment: While the price discourages me (my comments are based upon
- a free sample copy), I think that the journal succeeds very well. The
- highest density of interesting articles I have found in any journal.
- (Note: Remarks supplied by brandt kehoe "kehoe@csufres.CSUFresno.EDU")
- ------
- Title: Connection Science: Journal of Neural Computing,
- Artificial Intelligence and Cognitive Research
- Publish: Carfax Publishing
- Address: Europe: Carfax Publishing Company, P. O. Box 25, Abingdon,
- Oxfordshire OX14 3UE, UK. USA: Carfax Publishing Company,
- 85 Ash Street, Hopkinton, MA 01748
- Freq.: Quarterly (vol. 1 in 1989)
- Cost/Yr: Individual $82, Institution $184, Institution (U.K.) 74 pounds
- -----
- Title: International Journal of Neural Networks
- Publish: Learned Information
- Freq.: Quarterly (vol. 1 in 1989)
- Cost/Yr: 90 pounds
- ISSN #: 0954-9889
- Remark: The journal contains articles, a conference report (at least the
- issue I have), news and a calendar.
- (Note: remark provided by J.R.M. Smits "anjos@sci.kun.nl")
- -----
- Title: Concepts in NeuroScience
- Publish: World Scientific Publishing
- Address: Same Address (?) as for International Journal of Neural Systems
- Freq.: Twice per year (vol. 1 in 1989)
- Remark: Mainly Review Articles(?)
- (Note: remarks by Osamu Saito "saito@nttica.NTT.JP")
- -----
- Title: International Journal of Neurocomputing
- Publish: Elsevier Science Publishers
- Freq.: Quarterly (vol. 1 in 1989)
- Remark: Review has been reported to be fast (less than 3 months)
- -----
- Title: Neurocomputers
- Publish: Gallifrey Publishing
- Address: Gallifrey Publishing, PO Box 155, Vicksburg, Michigan, 49097, USA
- Tel: (616) 649-3772
- Freq. Monthly (1st issue 1987?)
- ISSN #: 0893-1585
- Editor: Derek F. Stubbs
- Cost/Yr: $32 (USA, Canada), $48 (elsewhere)
- Remark: I only have one copy so I cannot give you much detail about
- the contents. It is a very small one (12 pages) but it has a lot
- of (short) information in it about e.g. conferences, books,
- (new) ideas etc. I don't think it is very expensive but I'm not sure.
- (Note: remark provided by J.R.M. Smits "anjos@sci.kun.nl")
- ------
- Title: JNNS Newsletter (Newsletter of the Japan Neural Network Society)
- Publish: The Japan Neural Network Society
- Freq.: Quarterly (vol. 1 in 1989)
- Remark: (IN JAPANESE LANGUAGE) Official Newsletter of the Japan Neural
- Network Society(JNNS)
- (Note: remarks by Osamu Saito "saito@nttica.NTT.JP")
- -------
- Title: Neural Networks Today
- Remark: I found this title on a bulletin board in October of last year,
- in a message from Tim Pattison, timpatt@augean.OZ
- (Note: remark provided by J.R.M. Smits "anjos@sci.kun.nl")
- -----
- Title: Computer Simulations in Brain Science
- -----
- Title: International Journal of Neuroscience
- -----
- Title: Neural Network Computation
- Remark: Possibly the same as "Neural Computation"
- -----
- Title: Neural Computing and Applications
- Freq.: Quarterly
- Publish: Springer Verlag
- Cost/yr: 120 Pounds
- Remark: Is the journal of the Neural Computing Applications Forum.
- Publishes original research and other information
- in the field of practical applications of neural computing.
-
- B. NN Related Journals
- ======================
-
- Title: Complex Systems
- Publish: Complex Systems Publications
- Address: Complex Systems Publications, Inc., P.O. Box 6149, Champaign,
- IL 61821-8149, USA
- Freq.: 6 times per year (1st volume is 1987)
- ISSN #: 0891-2513
- Cost/Yr: Individual $75, Institution $225
- Remark: The journal COMPLEX SYSTEMS is devoted to the rapid publication
- of research on the science, mathematics, and engineering of systems with
- simple components but complex overall behavior. Send mail to
- "jcs@complex.ccsr.uiuc.edu" for additional info.
- (Remark is from announcement on Net)
- -----
- Title: Biological Cybernetics (Kybernetik)
- Publish: Springer Verlag
- Remark: Monthly (vol. 1 in 1961)
- -----
- Title: Various IEEE Transactions and Magazines
- Publish: IEEE
- Remark: Primarily see IEEE Trans. on System, Man and Cybernetics; Various
- Special Issues: April 1990 IEEE Control Systems Magazine.; May 1989
- IEEE Trans. Circuits and Systems.; July 1988 IEEE Trans. Acoust.
- Speech Signal Process.
- -----
- Title: The Journal of Experimental and Theoretical Artificial Intelligence
- Publish: Taylor & Francis, Ltd.
- Address: London, New York, Philadelphia
- Freq.: ? (1st issue Jan 1989)
- Remark: For submission information, please contact either of the editors:
- Eric Dietrich Chris Fields
- PACSS - Department of Philosophy Box 30001/3CRL
- SUNY Binghamton New Mexico State University
- Binghamton, NY 13901 Las Cruces, NM 88003-0001
- dietrich@bingvaxu.cc.binghamton.edu cfields@nmsu.edu
- -----
- Title: The Behavioral and Brain Sciences
- Publish: Cambridge University Press
- Remark: (Expensive as hell, I'm sure.)
- This is a delightful journal that encourages discussion on a
- variety of controversial topics. I have especially enjoyed reading
- some papers in there by Dana Ballard and Stephen Grossberg (separate
- papers, not collaborations) a few years back. They have a really neat
- concept: they get a paper, then invite a number of noted scientists
- in the field to praise it or trash it. They print these commentaries,
- and give the author(s) a chance to make a rebuttal or concurrence.
- Sometimes, as I'm sure you can imagine, things get pretty lively. I'm
- reasonably sure they are still at it--I think I saw them make a call
- for reviewers a few months ago. Their reviewers are called something
- like Behavioral and Brain Associates, and I believe they have to be
- nominated by current associates, and should be fairly well established
- in the field. That's probably more than I really know about it but
- maybe if you post it someone who knows more about it will correct any
- errors I have made. The main thing is that I liked the articles I
- read. (Note: remarks by Don Wunsch <dwunsch@blake.acs.washington.edu>)
- -----
- Title: International Journal of Applied Intelligence
- Publish: Kluwer Academic Publishers
- Remark: first issue in 1990(?)
- -----
- Title: Bulletin of Mathematical Biology
- -----
- Title: Intelligence
- -----
- Title: Journal of Mathematical Biology
- -----
- Title: Journal of Complex Systems
- -----
- Title: AI Expert
- Publish: Miller Freeman Publishing Co., for subscription call ++415-267-7672.
- Remark: Regularly includes ANN related articles, product
- announcements, and application reports.
- Listings of ANN programs are available on AI Expert affiliated BBS's
- -----
- Title: International Journal of Modern Physics C
- Publish: World Scientific Publ. Co.
- Farrer Rd. P.O.Box 128, Singapore 9128
- or: 687 Hartwell St., Teaneck, N.J. 07666 U.S.A
- or: 73 Lynton Mead, Totteridge, London N20 8DH, England
- Freq: published quarterly
- Eds: G. Fox, H. Herrmann and K. Kaneko
- -----
- Title: Machine Learning
- Publish: Kluwer Academic Publishers
- Address: Kluwer Academic Publishers
- P.O. Box 358
- Accord Station
- Hingham, MA 02018-0358 USA
- Freq.: Monthly (8 issues per year; increasing to 12 in 1993)
- Cost/Yr: Individual $140 (1992); Member of AAAI or CSCSI $88
- Remark: Description: Machine Learning is an international forum for
- research on computational approaches to learning. The journal
- publishes articles reporting substantive research results on a
- wide range of learning methods applied to a variety of task
- domains. The ideal paper will make a theoretical contribution
- supported by a computer implementation.
- The journal has published many key papers in learning theory,
- reinforcement learning, and decision tree methods. Recently
- it has published a special issue on connectionist approaches
- to symbolic reasoning. The journal regularly publishes
- issues devoted to genetic algorithms as well.
- -----
- Title: Journal of Physics A: Mathematical and General
- Publish: Inst. of Physics, Bristol
- Freq: 24 issues per year.
- Remark: Statistical mechanics aspects of neural networks
- (mostly Hopfield models).
-
- -----
- Title: Physical Review A: Atomic, Molecular and Optical Physics
- Publish: The American Physical Society (Am. Inst. of Physics)
- Freq: Monthly
- Remark: Statistical mechanics of neural networks.
-
-
- C. Journals loosely related to NNs
- ==================================
-
- JOURNAL OF COMPLEXITY
- (Must rank alongside Wolfram's Complex Systems)
-
- IEEE ASSP Magazine
- (April 1987 had the Lippmann intro. which everyone likes to cite)
-
- ARTIFICIAL INTELLIGENCE
- (Vol 40, September 1989 had the survey paper by Hinton)
-
- COGNITIVE SCIENCE
- (the Boltzmann machine paper by Ackley et al appeared here in Vol 9, 1985)
-
- COGNITION
- (Vol 28, March 1988 contained the Fodor and Pylyshyn critique of connectionism)
-
- COGNITIVE PSYCHOLOGY
- (no comment!)
-
- JOURNAL OF MATHEMATICAL PSYCHOLOGY
- (several good book reviews)
-
- ------------------------------------------------------------------------
-
- -A12.) The most important conferences concerned with Neural Networks ?
-
- [to be added: has taken place how often yet; most emphasized topics;
- where to get proceedings/calls-for-papers etc. ]
-
- A. Dedicated Neural Network Conferences:
- 1. Neural Information Processing Systems (NIPS)
- Annually since 1988 in Denver, Colorado; late November or early December;
- (Interdisciplinary conference with computer science, physics, engineering,
- biology, medicine, cognitive science topics. Covers all aspects of NNs)
- 2. International Joint Conference on Neural Networks (IJCNN)
- co-sponsored by INNS and IEEE
- 3. Annual Conference on Neural Networks (ACNN)
- 4. International Conference on Artificial Neural Networks (ICANN)
- Annually in Europe. First was 1991.
- Major conference of European Neur. Netw. Soc. (ENNS)
- 5. Artificial Neural Networks in Engineering (ANNIE)
- Annually since 1991 in St. Louis, Missouri; held in November.
- (Topics: NN architectures, pattern recognition, neuro-control,
- neuro-engineering systems.
- Contact: ANNIE; Engineering Management Department;
- 223 Engineering Management Building;
- University of Missouri-Rolla; Rolla, MO 65401;
- FAX: (314) 341-6567)
-
- B. Other Conferences
- 1. International Joint Conference on Artificial Intelligence (IJCAI)
- 2. Intern. Conf. on Acoustics, Speech and Signal Processing (ICASSP)
- 3. Annual Conference of the Cognitive Science Society
- 4. [Vision Conferences?]
-
- C. Pointers to Conferences
- 1. The journal "Neural Networks" has a long list of conferences,
- workshops and meetings in each issue.
- This is quite interdisciplinary.
- 2. There is a regular posting on comp.ai.neural-nets from Paultje Bakker:
- "Upcoming Neural Network Conferences", which lists names, dates,
- locations, contacts, and deadlines.
-
- ------------------------------------------------------------------------
-
- -A13.) Neural Network Associations ?
-
- [Is this data still correct ? Who will send me some update ?]
-
- 1. International Neural Network Society (INNS).
- INNS membership includes subscription to "Neural Networks",
- the official journal of the society.
- Membership is $55 for non-students and $45 for students per year.
- Address: INNS Membership, P.O. Box 491166, Ft. Washington, MD 20749.
-
- 2. International Student Society for Neural Networks (ISSNNets).
- Membership is $5 per year.
- Address: ISSNNet, Inc., P.O. Box 15661, Boston, MA 02215 USA
-
- 3. Women In Neural Network Research and Technology (WINNERS).
- Address: WINNERS, c/o Judith Dayhoff, 11141 Georgia Ave., Suite 206,
- Wheaton, MD 20902. Telephone: 301-933-9000.
-
- 4. European Neural Network Society (ENNS)
-
- 5. Japanese Neural Network Society (JNNS)
- Address: Japanese Neural Network Society
- Department of Engineering, Tamagawa University,
- 6-1-1, Tamagawa Gakuen, Machida City, Tokyo,
- 194 JAPAN
- Phone: +81 427 28 3457, Fax: +81 427 28 3597
-
- 6. Association des Connexionnistes en THese (ACTH)
- (the French Student Association for Neural Networks)
- Membership is 100 FF per year
- Activities : newsletter, conference (every year), list of members...
- Address : ACTH - Le Castelnau R2
- 23 avenue de la Galline
- 34170 Castelnau-le-Lez
- FRANCE
- Contact : jdmuller@vnet.ibm.com
-
- 7. Neurosciences et Sciences de l'Ingenieur (NSI)
- Biology & Computer Science
- Activity : conference (every year)
- Address : NSI - TIRF / INPG
- 46 avenue Felix Viallet
- 38031 Grenoble Cedex
- FRANCE
-
-
- ------------------------------------------------------------------------
-
- -A14.) Other sources of information about NNs ?
-
- 1. Neuron Digest
- Internet Mailing List. From the welcome blurb:
- "Neuron-Digest is a list (in digest form) dealing with all aspects
- of neural networks (and any type of network or neuromorphic system)"
- Moderated by Peter Marvit.
- To subscribe, send email to neuron-request@cattell.psych.upenn.edu
- comp.ai.neural-nets readers also find the messages in that newsgroup
- in the form of digests.
-
- 2. Usenet groups comp.ai.neural-nets (Oha ! :-> )
- and comp.theory.self-org-sys
- There is a periodic posting on comp.ai.neural-nets sent by
- srctran@world.std.com (Gregory Aharonian) about Neural Network
- patents.
-
- 3. Central Neural System Electronic Bulletin Board
- Modem: 409-589-3338; Sysop: Wesley R. Elsberry;
- P.O. Box 4201, College Station, TX 77843; welsberr@orca.tamu.edu
- Many MS-DOS PD and shareware simulations, source code, benchmarks,
- demonstration packages, information files; some Unix, Macintosh,
- Amiga related files. Also available are files on AI, AI Expert
- listings 1986-1991, fuzzy logic, genetic algorithms, artificial
- life, evolutionary biology, and many Project Gutenberg and Wiretap
- etexts. No user fees have ever been charged. Home of the
- NEURAL_NET Echo, available through FidoNet, RBBS-Net, and other
- EchoMail compatible bulletin board systems.
-
- 4. Neural ftp archive site ftp.funet.fi
- Administers a large collection of neural network papers and
- software at the Finnish University Network file archive site
- ftp.funet.fi in directory /pub/sci/neural
- Contains all the public domain software and papers that they
- have been able to find.
- All of these files have been transferred from FTP sites in the U.S.
- and are mirrored about every 3 months at fastest.
- Contact: neural-adm@ftp.funet.fi
-
- 5. USENET newsgroup comp.org.issnnet
- Forum for discussion of academic/student-related issues in NNs, as
- well as information on ISSNNet (see A13) and its activities.
-
- 6. AI CD-ROM
- Network Cybernetics Corporation produces the "AI CD-ROM". It is
- an ISO-9660 format CD-ROM and contains a large assortment of
- software related to artificial intelligence, artificial life, virtual
- reality, and other topics. Programs for OS/2, MS-DOS, Macintosh, UNIX,
- and other operating systems are included. Research papers, tutorials,
- and other text files are included in ASCII, RTF, and other universal
- formats. The files have been collected from AI bulletin boards,
- Internet archive sites, university computer departments, and
- other government and civilian AI research organizations. Network
- Cybernetics Corporation intends to release annual revisions to the
- AI CD-ROM to keep it up to date with current developments in the field.
- The AI CD-ROM includes collections of files that address many
- specific AI/AL topics including:
- [... some stuff deleted...]
- - Neural Networks: Source code and executables for many different
- platforms including Unix, DOS, and Macintosh. ANN development tools,
- example networks, sample data, and tutorials are included. A complete
- collection of Neural Digest is included as well.
- [... lots of stuff deleted...]
- The AI CD-ROM may be ordered directly by check, money order, bank
- draft, or credit card from:
- Network Cybernetics Corporation
- 4201 Wingren Road Suite 202
- Irving, TX 75062-2763
- Tel 214/650-2002
- Fax 214/650-1929
- The cost is $129 per disc + shipping ($5/disc domestic or $10/disc foreign)
- (See the comp.ai FAQ for further details)
-
- 7. http://www.eeb.ele.tue.nl
- In World-Wide-Web (WWW, for example via the xmosaic program) you
- can read neural network information by opening the universal resource
- locator (URL) http://www.eeb.ele.tue.nl
- It contains a hypertext version of this FAQ and other NN-related
- information.
-
- 8. Neurosciences Internet Resource Guide
- This document aims to be a guide to existing, free, Internet-accessible
- resources helpful to neuroscientists of all stripes.
- An **ASCII text version** (86K) is available in the
- Clearinghouse of Subject-Oriented Internet Resource Guides as
- follows:
- anonymous FTP:
- host: una.hh.lib.umich.edu
- path: /inetdirsstacks
- file: neurosci:cormbonario
- gopher:
- via U. Minnesota list of gophers
- menu: North America/USA/Michigan/Clearinghouse.../
- All Guides/Neurosciences
- WWW:
- gopher://una.hh.lib.umich.edu/00/inetdirsstacks/
- neurosci:cormbonario
- We are also creating a **hypertext version** of the guide:
- WWW:
- http://http2.sils.umich.edu/Public/nirg/nirg1.html
-
- ------------------------------------------------------------------------
-
- -A15.) Freely available software packages for NN simulation ?
-
-
- [This is a bit chaotic and needs reorganization.
- A bit more information about what the various programs can do,
- on which platform they run, and how big they are would also be nice.
- And some important packages are still missing (?)
- Who volunteers for that ?]
-
- 1. Rochester Connectionist Simulator
- A quite versatile simulator program for arbitrary types of
- neural nets. Comes with a backprop package and an X11/Sunview
- interface.
- anonymous FTP from cs.rochester.edu (192.5.53.209)
- directory : pub/simulator
- files: README (8 KB)
- (documentation:) rcs_v4.2.justdoc.tar.Z (1.6 MB)
- (source code:) rcs_v4.2.justsrc.tar.Z (1.4 MB)
-
- 2. UCLA-SFINX
- ftp 131.179.16.6 (retina.cs.ucla.edu)
- Name: sfinxftp
- Password: joshua
- directory: pub/
- files : README
- sfinx_v2.0.tar.Z
- Email info request : sfinx@retina.cs.ucla.edu
-
- 3. NeurDS
- request from mcclanahan%cookie.dec.com@decwrl.dec.com
- A simulator for DEC systems supporting VT100 terminals.
- OR
- anonymous ftp gatekeeper.dec.com [16.1.0.2]
- directory: pub/DEC
- file: NeurDS031.tar.Z (please check; it may be NeurDSO31.tar.Z)
-
- 4. PlaNet5.7 (also known as SunNet)
- ftp 133.15.240.3 (tutserver.tut.ac.jp)
- pub/misc/PlaNet5.7.tar.Z
- or
- ftp 128.138.240.1 (boulder.colorado.edu)
- pub/generic-sources/PlaNet5.7.tar.Z (also the old PlaNet5.6.tar.Z)
- A popular connectionist simulator with versions to
- run under X Windows, and non-graphics terminals
- created by Yoshiro Miyata (Chukyo Univ., Japan).
- 60-page User's Guide in Postscript.
- Send any questions to miyata@sccs.chukyo-u.ac.jp
-
- 5. GENESIS
- GENESIS 1.4.1 (GEneral NEural SImulation System) is a general purpose
- simulation platform which was developed to support the simulation of
- neural systems ranging from complex models of single neurons to
- simulations of large networks made up of more abstract neuronal
- components. Most current GENESIS applications involve realistic
- simulations of biological neural systems. Although the software can
- also model more abstract networks, other simulators are more suitable
- for backpropagation and similar connectionist modeling.
- May be obtained via FTP from genesis.cns.caltech.edu [131.215.137.64].
- Use 'telnet' to genesis.cns.caltech.edu beforehand and log in
- as the user "genesis" (no password required). If you answer all the
- questions asked of you an 'ftp' account will automatically be created
- for you. You can then 'ftp' back to the machine and download the
- software (ca. 3 MB). Contact: genesis@cns.caltech.edu.
-
- 6. Mactivation
- anonymous ftp from bruno.cs.colorado.edu [128.138.243.151]
- directory: /pub/cs/misc
- file: Mactivation-3.3.sea.hqx
-
- 7. <defunct>
-
- 8. Cascade Correlation Simulator
- A simulator based on Scott Fahlman's Cascade Correlation algorithm.
- Anonymous ftp from ftp.cs.cmu.edu [128.2.206.173]
- directory /afs/cs/project/connect/code
- file cascor1a.shar (206 KB)
- There is also a version of recurrent cascade correlation in the same
- directory in file rcc1.c (107 KB).
-
- 9. Quickprop
- A variation of the back-propagation algorithm developed by
- Scott Fahlman. A simulator is available in the same directory
- as the cascade correlation simulator above in file
- nevprop116.shar (137 KB)
- (see also the description of NEVPROP below)
-
- 10. DartNet
- DartNet is a Macintosh-based backpropagation simulator, developed
- at Dartmouth by Jamshed Bharucha and Sean Nolan as a pedagogical tool.
- It makes use of the Mac's graphical interface, and provides a number
- of tools for building, editing, training, testing and examining
- networks. This program is available by anonymous ftp from
- dartvax.dartmouth.edu [129.170.16.4] as
- /pub/mac/dartnet.sit.hqx (124 KB).
-
- 11. SNNS
- "Stuttgart Neural Network Simulator" from the University
- of Stuttgart, Germany.
- A luxurious simulator for many types of nets; with X11 interface:
- Graphical 2D and 3D topology editor/visualizer, training visualisation,
- etc.
- Currently supports backpropagation (vanilla, online, with momentum
- term and flat spot elimination, batch, time delay), counterpropagation,
- quickprop, backpercolation 1, generalized radial basis functions (RBF),
- RProp, ART1, ART2, ARTMAP, Cascade Correlation, Recurrent Cascade
- Correlation, Dynamic LVQ, Backpropagation through time (for recurrent
- networks), batch backpropagation through time (for recurrent networks),
- Quickpropagation through time (for recurrent networks),
- and is user-extendable.
-
- ftp: ftp.informatik.uni-stuttgart.de [129.69.211.2]
- directory /pub/SNNS
- file SNNSv3.0.tar.Z OR SNNSv3.0.tar.Za[a-d] ( 826 KB)
- manual SNNSv2.1.Manual.ps.Z (1270 KB)
- SNNSv2.1.Readme (8052 Bytes)
-
- 12. Aspirin/MIGRAINES
- Aspirin/MIGRAINES 6.0 consists of a code generator that builds neural network
- simulations by reading a network description (written in a language
- called "Aspirin") and generating a C simulation. An interface
- (called "MIGRAINES") is provided to export data from the neural
- network to visualization tools.
- The system has been ported to a large number of platforms.
- The goal of Aspirin is to provide a common extendible front-end language
- and parser for different network paradigms.
- The MIGRAINES interface is a terminal based interface
- that allows you to open Unix pipes to data in the neural
- network. This replaces the NeWS1.1 graphical interface
- in version 4.0 of the Aspirin/MIGRAINES software. The
- new interface is not as simple to use as the version 4.0
- interface but is much more portable and flexible.
- The MIGRAINES interface allows users to output
- neural network weight and node vectors to disk or to
- other Unix processes. Users can display the data using
- either public or commercial graphics/analysis tools.
- Example filters are included that convert data exported through
- MIGRAINES to formats readable by Gnuplot 3.0, Matlab, Mathematica,
- and xgobi.
- The software is available from two FTP sites:
- CMU's simulator collection on "pt.cs.cmu.edu" (128.2.254.155)
- in /afs/cs/project/connect/code/am6.tar.Z
- and UCLA's cognitive science machine "ftp.cognet.ucla.edu" (128.97.50.19)
- in alexis/am6.tar.Z
- The compressed tar file is a little less than 2 megabytes.
-
- 13. Adaptive Logic Network kit
- Available from menaik.cs.ualberta.ca. This package differs from
- the traditional nets in that it uses logic functions rather than
- floating-point arithmetic; for many tasks, ALNs can show gains of many
- orders of magnitude in training and execution speed (see the sketch
- at the end of this entry).
- Anonymous ftp from menaik.cs.ualberta.ca [129.128.4.241]
- README /pub/atree/atree.readme (7 KB)
- unix source code and examples: /pub/atree/atree2.tar.Z (145 KB)
- Postscript documentation: /pub/atree/atree2.ps.Z ( 76 KB)
- MS-Windows 3.x executable: /pub/atree/a27exe.exe (412 KB)
- MS-Windows 3.x source code: /pub/atree/atre27.exe (572 KB)
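-
- To give a feel for the idea (a toy sketch, not code from the atree
- package): an ALN computes a boolean function with a tree of two-input
- logic gates, and training selects each node's function (AND, OR, or
- passing one child through) to fit the examples. Once trained,
- evaluation is pure integer logic, which is where the speed comes from:
-
-     #include <stdio.h>
-
-     /* Toy ALN evaluation. Inner nodes 1..INNER hold a gate type;
-        node i has children 2i and 2i+1; indices above INNER are
-        boolean inputs. A real ALN adapts the gate types during
-        training; here they are fixed by hand. */
-     enum gate { AND, OR, LEFT, RIGHT };
-     #define INNER 3
-     static enum gate node[INNER + 1] = { AND, OR, AND, AND };
-
-     static int eval(int i, const int *x)
-     {
-         if (i > INNER)                 /* leaf: read an input bit */
-             return x[i - INNER - 1];
-         int l = eval(2 * i, x), r = eval(2 * i + 1, x);
-         switch (node[i]) {
-         case AND:  return l & r;
-         case OR:   return l | r;
-         case LEFT: return l;
-         default:   return r;
-         }
-     }
-
-     int main(void)
-     {
-         int x[4] = { 1, 0, 0, 1 };           /* four input bits */
-         printf("%d\n", eval(1, x));   /* (x0&x1) | (x2&x3) = 0 */
-         return 0;
-     }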
-
- 14. NeuralShell
- Available from FTP site quanta.eng.ohio-state.edu
- (128.146.35.1) in directory "pub/NeuralShell", filename
- "NeuralShell.tar".
-
- 15. PDP
- The PDP simulator package is available via anonymous FTP at
- nic.funet.fi (128.214.6.100) in /pub/sci/neural/sims/pdp.tar.Z (0.2 MB)
- The simulator is also available with the book
- "Explorations in Parallel Distributed Processing: A Handbook of
- Models, Programs, and Exercises" by McClelland and Rumelhart.
- MIT Press, 1988.
- Comment: "This book is often referred to as PDP vol III which is a very
- misleading practice! The book comes with software on an IBM disk but
- includes a makefile for compiling on UNIX systems. The version of
- PDP available at nic.funet.fi seems identical to the one with the book
- except for a bug in bp.c which occurs when you try to run a script of
- PDP commands using the DO command. This can be found and fixed easily."
-
- 16. Xerion
- Xerion is available via anonymous ftp from
- ftp.cs.toronto.edu in the directory /pub/xerion.
- xerion-3.1.ps.Z (153 kB) and xerion-3.1.tar.Z (1322 kB) plus
- several concrete simulators built with xerion (about 40 kB each).
- Xerion runs on SGI and Sun machines and uses X Windows for graphics.
- The software contains modules that implement Back Propagation,
- Recurrent Back Propagation, Boltzmann Machine, Mean Field Theory,
- Free Energy Manipulation, Hard and Soft Competitive Learning, and
- Kohonen Networks. Sample networks built for each of the modules are
- also included.
- Contact: xerion@ai.toronto.edu
-
- 17. Neocognitron simulator
- An implementation is available for anonymous ftp at
- [128.194.15.32] tamsun.tamu.edu as
- /pub/neocognitron.Z.tar or
- [129.12.21.7] unix.hensa.ac.uk as
- /pub/uunet/pub/ai/neural/neocognitron.tar.Z
- The simulator is written in C and comes with a list of references
- which are necessary to read to understand the specifics of the
- implementation. The unsupervised version is coded without (!)
- C-cell inhibition.
-
- 18. Multi-Module Neural Computing Environment (MUME)
- MUME is a simulation environment for multi-module neural computing. It
- provides an object oriented facility for the simulation and training
- of multiple nets with various architectures and learning algorithms.
- MUME includes a library of network architectures including feedforward,
- simple recurrent, and continuously running recurrent neural networks.
- Each architecture is supported by a variety of learning algorithms.
- MUME can be used for large scale neural network simulations as it provides
- support for learning in multi-net environments. It also provides pre- and
- post-processing facilities.
- The modules are provided in a library. Several "front-ends" or clients are
- also available; X-Window support is provided by the editor/visualization
- tool Xmume.
- MUME can be used to include non-neural computing modules (decision
- trees, ...) in applications.
- Version 0.73 of MUME has been deposited for anonymous ftp on
- mickey.sedal.su.oz.au (129.78.24.170) in directory /mume.
- Contact:
- Marwan Jabri, SEDAL, Sydney University Electrical Engineering,
- NSW 2006 Australia, marwan@sedal.su.oz.au
-
- 19. LVQ_PAK, SOM_PAK
- These are packages for Learning Vector Quantization and
- Self-Organizing Maps, respectively.
- They have been built by the LVQ/SOM Programming Team of the
- Helsinki University of Technology, Laboratory of Computer and
- Information Science, Rakentajanaukio 2 C, SF-02150 Espoo, FINLAND
- There are versions for Unix and MS-DOS available from
- cochlea.hut.fi (130.233.168.48) in
- /pub/lvq_pak/lvq_pak-2.1.tar.Z (340 kB, Unix)
- /pub/lvq_pak/lvq_p2r1.exe (310 kB, MS-DOS self-extract archive)
- /pub/som_pak/som_pak-1.1.tar.Z (246 kB, Unix)
- /pub/som_pak/som_p1r1.exe (215 kB, MS-DOS self-extract archive)
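-
- The core of LVQ1 is simple to state: find the codebook vector nearest
- to a training sample, then move it toward the sample if their class
- labels agree and away from it otherwise. A minimal sketch
- (illustrative only, not LVQ_PAK source):
-
-     #include <stddef.h>
-
-     /* One LVQ1 training step. codebook: n_vec vectors of length
-        dim, stored row by row; cb_class: their class labels;
-        x, x_class: the training sample; alpha: learning rate. */
-     void lvq1_step(double *codebook, const int *cb_class,
-                    size_t n_vec, size_t dim,
-                    const double *x, int x_class, double alpha)
-     {
-         size_t best = 0;
-         double best_d = -1.0;
-         for (size_t i = 0; i < n_vec; i++) {  /* nearest vector */
-             double d = 0.0;
-             for (size_t j = 0; j < dim; j++) {
-                 double diff = x[j] - codebook[i * dim + j];
-                 d += diff * diff;
-             }
-             if (best_d < 0.0 || d < best_d) { best_d = d; best = i; }
-         }
-         double sign = (cb_class[best] == x_class) ? 1.0 : -1.0;
-         for (size_t j = 0; j < dim; j++)      /* attract or repel */
-             codebook[best * dim + j] +=
-                 sign * alpha * (x[j] - codebook[best * dim + j]);
-     }
-
- A self-organizing map replaces the class test by a neighborhood
- function: the winning unit and its map neighbors are all moved toward
- the sample.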
-
- 20. SESAME
- (Software Environment for the Simulation of Adaptive Modular Systems)
- SESAME is a prototype software implementation with these features:
- * An object-oriented building-block approach.
- * A large set of C++ classes useful for neural nets,
- neurocontrol and pattern recognition. No C++ class can be used
- stand-alone, though!
- * C++ classes include CartPole, nondynamic two-robot arms, Lunar Lander,
- Backpropagation, Feature Maps, Radial Basis Functions, TimeWindows,
- Fuzzy Set Coding, Potential Fields, Pandemonium, and diverse utility
- building blocks.
- * A kernel which is the framework for the C++ classes and allows run-time
- manipulation, construction, and integration of arbitrary complex and
- hybrid experiments.
- * Currently no graphic interface for construction, only for visualization.
- * Platform is SUN4, XWindows
- Unfortunately, no reasonably good introduction has been written yet.
- We hope to have something soon. For now we provide papers (e.g. NIPS-92),
- a reference manual (>220 pages), source code (ca. 35,000 lines of
- code), and a SUN4 executable, by ftp only.
- Sesame and its description are available for anonymous ftp on
- ftp.gmd.de [129.26.8.90] in the directories
- gmd/as/sesame and gmd/as/paper
- Questions please to sesame-request@gmd.de
- There is only very limited support available; currently we cannot handle
- many users.
-
- 21. Nevada Backpropagation (NevProp)
- NevProp is a free, easy-to-use feedforward backpropagation
- (multilayer perceptron) program. It uses an interactive
- character-based interface, and is distributed as C source code that
- should compile and run on most platforms. (Precompiled executables are
- available for Macintosh and DOS.) The original version was Quickprop
- 1.0 by Scott Fahlman, as translated from Common Lisp by Terry Regier.
- We added early-stopped training based on a held-out subset of data, c
- index (ROC curve area) calculation, the ability to force gradient
- descent (per-epoch or per-pattern), and additional options.
- *** FEATURES: NevProp version 1.16...
- o UNLIMITED (except by machine memory) number of input PATTERNS;
- o UNLIMITED number of input, hidden, and output UNITS;
- o Arbitrary CONNECTIONS among the various layers' units;
- o Clock-time or user-specified RANDOM SEED for initial random weights;
- o Choice of regular GRADIENT DESCENT or QUICKPROP;
- o Choice of PER-EPOCH or PER-PATTERN (stochastic) weight updating;
- o GENERALIZATION to a test dataset;
- o AUTOMATICALLY STOPPED TRAINING based on generalization;
- o RETENTION of best-generalizing weights and predictions;
- o Simple but useful GRAPHIC display to show smoothness of generalization;
- o SAVING of results to a file while working interactively;
- o SAVING of weights file and reloading for continued training;
- o PREDICTION-only on datasets by applying an existing weights file;
- o In addition to RMS error, the concordance, or c index is displayed.
- The c index (area under the ROC curve) shows the correctness of the
- RELATIVE ordering of predictions AMONG the cases; i.e., it is a
- measure of discriminative power of the model.
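- The c index can be computed straight from this definition: over all
- pairs of one positive and one negative case, count the fraction in
- which the positive case received the higher prediction, with ties
- counting one half. A small sketch in C (not NevProp source code):
-
-     /* c index (ROC curve area) by its pairwise definition.
-        pred: model outputs; target: 0/1 class labels.
-        O(n*n); fine for moderate dataset sizes. */
-     double c_index(const double *pred, const int *target, int n)
-     {
-         double concordant = 0.0;
-         long pairs = 0;
-         for (int i = 0; i < n; i++) {
-             if (!target[i]) continue;           /* positives */
-             for (int j = 0; j < n; j++) {
-                 if (target[j]) continue;        /* negatives */
-                 pairs++;
-                 if (pred[i] > pred[j])       concordant += 1.0;
-                 else if (pred[i] == pred[j]) concordant += 0.5;
-             }
-         }
-         return pairs ? concordant / pairs : 0.0;
-     }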
- *** AVAILABILITY:
- The most updated version of NevProp will be made available
- by anonymous ftp from the University of Nevada, Reno:
- "ftp.scs.unr.edu" in the directory "pub/goodman/nevpropdir".
- *** PLANS FOR NEXT RELEASE:
- Version 2 to be released in Spring of 1994 -- some of the new features:
- more flexible file formatting (including access to external data files;
- option to prerandomize data order; randomized stochastic gradient descent;
- option to rescale predictor (input) variables); linear output units as an
- alternative to sigmoidal units for use with continuous-valued dependent
- variables (output targets); cross-entropy (maximum likelihood) criterion
- function as an alternative to square error for use with categorical
- dependent variables (classification/symbolic/nominal targets); and
- interactive interrupt to change settings on-the-fly.
- (If you'd like to beta test prerelease version, contact goodman@unr.edu)
- *** SUPPORT:
- Limited support is available from Phil Goodman (goodman@unr.edu),
- University of Nevada Center for Biomedical Research.
-
- 22. Fuzzy ARTmap
- Available for anonymous ftp from park.bu.edu [128.176.121.56]
- as /pub/fuzzy-artmap.tar.Z (44 kB)
- (This is just a small example program.)
-
- 23. PYGMALION
- This is a prototype that stems from an ESPRIT project. It implements
- back-propagation, self-organising maps, and Hopfield nets.
- On imag.imag.fr [129.88.32.1] in directory archive/pygmalion:
- pygmalion.tar.Z (1534 kb)
-
- 24. Basis-of-AI-backprop:
- Here are some of the details of a set of back-propagation programs I
- have been working on. Earlier versions have been posted in
- comp.sources.misc and people around the world have used them and liked
- them. This package is free for ordinary users but shareware for
- businesses and government agencies ($200/copy, though for that price you
- get the professional version as well). I do support this package via email.
- Some of the highlights are:
- * in C for UNIX and DOS and DOS binaries
- * gradient descent, delta-bar-delta and quickprop
- * extra fast 16-bit fixed point weight version as well as a conventional
- floating point version (see the note at the end of this entry)
- * recurrent networks
- * numerous sample problems
- To get this version simply ftp to ftp.mcs.com where you will land in the
- directory /work/public/mcsnet.users. Then cd to drt and read readme.1st.
- The expanded professional version is $30/copy for ordinary
- individuals including academics and $200/copy for businesses and
- government agencies. Prices and contents subject to change without
- notice. Some of the highlights are an improved user interface, more
- activation functions, networks can be read into your own programs,
- dynamic node creation, weight decay, SuperSAB
- Contact: Don Tveter; 5228 N. Nashville Ave.; Chicago, Illinois 60656
- drt@mcs.com
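-
- A note on the "fast 16-bit fixed point" highlight above: the speed
- comes from doing the weight arithmetic in 16-bit integers instead of
- floating point. The package's exact number format is not documented
- here; a generic sketch of such arithmetic (10 fractional bits, so 1.0
- is stored as 1024) looks like this:
-
-     #include <stdint.h>
-     #include <stdio.h>
-
-     /* Generic 16-bit fixed-point sketch; not the package's own
-        format. FRAC_BITS fractional bits, so 1.0 == 1 << FRAC_BITS. */
-     #define FRAC_BITS 10
-     typedef int16_t fix16;
-
-     static fix16  from_double(double x) { return (fix16)(x * (1 << FRAC_BITS)); }
-     static double to_double(fix16 x)    { return (double)x / (1 << FRAC_BITS); }
-
-     static fix16 fix_mul(fix16 a, fix16 b)
-     {
-         /* widen to 32 bits so the product cannot overflow, then
-            shift back down to the fixed-point scale */
-         return (fix16)(((int32_t)a * b) >> FRAC_BITS);
-     }
-
-     int main(void)
-     {
-         fix16 w = from_double(0.75), x = from_double(-0.5);
-         printf("%f\n", to_double(fix_mul(w, x)));   /* -0.375 */
-         return 0;
-     }
-
- On a PC without a math coprocessor, integer multiplies like this are
- much faster than software floating point.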
-
- 25. Matrix Backpropagation
- MBP (Matrix Back Propagation) is an efficient implementation of the
- back-propagation algorithm for current-generation workstations. The
- algorithm includes a per-epoch adaptive technique for gradient
- descent. All the computations are done through matrix multiplications
- and make use of highly optimized C code. The goal is to reach almost
- peak performance on RISCs with superscalar capabilities and fast
- caches. On some machines (and with large networks) a 30-40x speed-up
- can be measured with respect to conventional implementations.
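-
- The technique is easy to picture: the forward pass of one layer for a
- whole batch of patterns is exactly one matrix-matrix product, so a
- single, highly tuned multiply routine does nearly all the work. A
- generic sketch of the idea (not MBP source code):
-
-     #include <math.h>
-
-     /* Forward pass of one layer for P patterns at once:
-        out[p][j] = sigmoid( sum_i in[p][i] * w[i][j] ),
-        i.e. a (P x I) times (I x J) matrix multiplication. */
-     void layer_forward(int P, int I, int J,
-                        const double *in,   /* P x I, row-major */
-                        const double *w,    /* I x J, row-major */
-                        double *out)        /* P x J, row-major */
-     {
-         for (int p = 0; p < P; p++)
-             for (int j = 0; j < J; j++) {
-                 double s = 0.0;
-                 for (int i = 0; i < I; i++)
-                     s += in[p * I + i] * w[i * J + j];
-                 out[p * J + j] = 1.0 / (1.0 + exp(-s));
-             }
-     }
-
- Blocking such loops for the cache, or handing them to a tuned matrix
- routine, is where speed-ups of the reported order come from.
-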
- The software is available by anonymous ftp from
- risc6000.dibe.unige.it:/pub/ [130.251.89.154]
- as MBPv1.1.tar.Z (unix version) and MBPv11.zip (DOS version). The
- documentation is included in the distribution as the postscript file
- mbpv11.ps. For more information, contact Davide Anguita
- <anguita@dibe.unige.it> or <anguita@icsi.berkeley.edu>.
-
- 26. WinNN
- WinNN is a shareware Neural Networks (NN) package for Windows 3.1.
- WinNN combines a very user-friendly interface with a powerful
- computational engine. WinNN is intended as a tool for beginners
- and more advanced neural network users alike; it provides an
- alternative to more expensive and harder-to-use packages. WinNN
- implements feed-forward multi-layer NNs and uses a modified
- fast back-propagation for training.
- Extensive on-line help. Has various neuron functions.
- Allows on-the-fly testing of network performance and generalization.
- All training parameters can be easily modified while WinNN is training.
- Results can be saved on disk or copied to the clipboard.
- Supports plotting of the outputs and weight distribution.
- Available for ftp from wuarchive.wustl.edu in pub/MSDOS_UPLOADS/win3
- and pub/MSDOS_UPLOADS/win; the file name is WINNN09.ZIP (542 kB).
-
- For some of these simulators there are user mailing lists. Get the
- packages and look into their documentation for further info.
-
- If you are using a small computer (PC, Mac, etc.) you may want to have
- a look at the Central Neural System Electronic Bulletin Board
- (see Answer 14)
- Modem: 509-627-6CNS; Sysop: Wesley R. Elsberry;
- P.O. Box 1187, Richland, WA 99352; welsberr@sandbox.kenn.wa.us
- There are lots of small simulator packages there, collected in
- the CNS ANNSIM file set.
- There is an ftp mirror site for the CNS ANNSIM file set at
- me.uta.edu (129.107.2.20) in the /pub/neural directory. Most ANN
- offerings are in /pub/neural/annsim.
-
- ------------------------------------------------------------------------
-
- -A16.) Commercial software packages for NN simulation ?
-
- [preliminary]
- [who will write some short comment on each of the most
- important packages ?]
-
- Issue number 1 of each volume of the journal "Neural Networks" has a list
- of some dozens of commercial suppliers of Neural Network things:
- Software, Hardware, Support, Programming, Design and Service.
-
- 1. nn/xnn
- Name: nn/xnn
- Company: Neureka ANS
- Address: Klaus Hansens vei 31B
- 5037 Solheimsviken
- NORWAY
- Phone: +47-55544163 / +47-55201548
- Email: arnemo@eik.ii.uib.no
- Basic capabilities:
- Neural network development tool. nn is a language for specification of
- neural network simulators. Produces C-code and executables for the
- specified models, therefore ideal for application development. xnn is
- a graphical front-end to nn and the simulation code produced by nn.
- Gives graphical representations in a number of formats of any
- variables during simulation run-time. Comes with a number of
- pre-implemented models, including: Backprop (several variants), Self
- Organizing Maps, LVQ1, LVQ2, Radial Basis Function Networks,
- Generalized Regression Neural Networks, Jordan nets, Elman nets,
- Hopfield, etc.
- Operating system: nn: UNIX or MS-DOS, xnn: UNIX/X-windows
- System requirements: 10 Mb HD, 2 Mb RAM
- Approx. price: USD 2000,-
-
-
- 2. BrainMaker
- Name: BrainMaker, BrainMaker Pro
- Company: California Scientific Software
- Address: 10024 Newtown rd, Nevada City, CA, 95959 USA
- Phone,Fax: 916 478 9040, 916 478 9041
- Email: calsci!mittmann@gvgpsa.gvg.tek.com (flakey connection)
- Basic capabilities: train backprop neural nets
- Operating system: DOS, Windows, Mac
- System requirements:
- Uses XMS or EMS for large models (PCs only): Pro version
- Approx. price: $195, $795
-
- BrainMaker Pro 3.0 (DOS/Windows) $795
- Genetic Training add-on $250
- BrainMaker 3.0 (DOS/Windows/Mac) $195
- Network Toolkit add-on $150
- BrainMaker 2.5 Student version (quantity sales only, about $38 each)
-
- BrainMaker Pro C30 Accelerator Board
- w/ 5Mb memory $9750
- w/32Mb memory $13,000
-
- Intel iNNTS NN Development System $11,800
- Intel EMB Multi-Chip Board $9750
- Intel 80170 chip set $940
-
- Introduction To Neural Networks book $30
-
- California Scientific Software can be reached at:
- Phone: 916 478 9040 Fax: 916 478 9041 Tech Support: 916 478 9035
- Mail: 10024 Newtown Rd, Nevada City, CA, 95959, USA
- All software has a 30-day money-back guarantee and unlimited free technical
- support.
- BrainMaker package includes:
- The book Introduction to Neural Networks
- BrainMaker Users Guide and reference manual
- 300 pages, fully indexed, with tutorials and sample Neural Networks
- Netmaker
- Netmaker makes building and training Neural Networks easy, by
- importing and automatically creating BrainMaker's Neural Network
- files. Netmaker imports Lotus, Excel, dBase, and ASCII files.
- BrainMaker
- Full menu and dialog box interface, runs Backprop at 750,000 cps on a
- 33 MHz 486.
-
- Feature BrainMaker Professional Benefit
-
- User Interface
- Pull-down Menus, Dialog Boxes { { easy to learn and use; all parameters
- saved in a file you can edit.
- Programmable Output Files { { exports data in your format to
- spreadsheets, graphics packages, etc.
- Editing in BrainMaker { { quickly edit data, display, network
- connections, and more.
- Network Progress Display { monitors training with a simple
- graphic display.
- Fact Annotation { { attaches your comments to examples
- for display and printing.
- Printer Support { { HP LaserJet, DeskJet, InkJet,
- IBM Proprinter, Epson, etc.
- NetPlotter T { see how the input correlates with
- your output.
- Graphics Built In { shows trends, cycles, network
- responses, statistics, etc.;
- see it on screen, plotter, or printer.
- Dynamic Data Exchange { puts your network in other windows
- programs
-
- Performance
- Binary Mode T { uses binary files for greater speed.
- Batch Mode { add networks to your existing
- programs, train while you're away.
- EMS and XMS Memory { up to 8192 independent variables.
- Save Network Periodically { { saves results to a file in case of
- power failure.
- Fastest Algorithms { { 750,000 connections-per-second
- (486/50).
- Neurons per Layer 512 32,000 more inputs: model complex data
- with ease.
- Number of Layers 8 8 extra hidden layers can help tackle
- bigger problems.
-
- Training
- Specify Parameters by Layer { fine-tunes performance inside the network.
- Recurrence Networks { Puts feedback in your network,
- automates historical input.
- Prune Connections and Neurons { improves accuracy by trimming away
- excess "fat".
- Add Hidden Neurons In Training { { finds best size network quickly;
- fully automated with Professional.
- Custom Neuron Functions { { optimizes training to suit any need.
- Testing While Training { { trains for best performance on new
- data.
- Stop training when... { lets you decide when network has
- learned well.
- Heavy Weights { helps networks train.
- Hypersonic Training T { trains faster with this proprietary
- algorithm.
-
- Analysis, Advanced Functions
- Sensitivity Analysis { shows you which inputs determined
- your results.
- Neuron Sensitivity { shows you the total effect of one
- input on your results.
- Global Network Analysis { shows how the network reacts to
- your inputs overall.
- Contour Analysis { shows peaks and valleys of the output
- when two inputs change
- Data Correlator { finds important data and optimum
- time delays.
- Error Statistics Report { { check your network error rate during
- training.
- Print or Edit Weight Matrices { { examine, customize network internals.
- Competitor { ranks horses, teams, stocks, etc.
- in finish order.
- Run Time System { C source code - make programs with
- your network for resale.
- Chip Support { { Intel, American Neurologics,
- Micro Devices.
- Genetic Training Option G trains variations of your design
- and shows you which was the best.
-
- Network Data Management Functions
- NetMaker { { spreadsheet-like data manipulation
- and network file creation.
- NetChecker { { checks your files for errors and
- inconsistencies.
- Shuffle { { mixes up the order of examples for
- better training.
- Binary T { converts files to binary for quicker
- training.
- MinMax { { finds min / max / standard deviation
- of data for fine-tuned results.
- Data Importation { { reads data from Lotus, dBASE,
- Excel, ASCII, binary.
- Financial Data { reads MetaStock and Computrack data.
- Data Manipulation { { finds indicators, oscillators,
- running averages, etc.
- Cyclic Analysis { checks data for periodic or cyclic
- behavior.
- Data Types { { uses symbolic, text, picture,
- and numeric data.
-
- Documentation & User Support
- User's Guide { { an application development guide
- and quick-start booklet.
- Introduction to Neural Networks { { 324 pp, gets you up to date in this
- exciting field.
-
-
- 3. SAS Software/ Neural Net add-on
- Name: SAS Software
- Company: SAS Institute, Inc.
- Address: SAS Campus Drive, Cary, NC 27513, USA
- Phone,Fax: (919) 677-8000
- Email: saswss@unx.sas.com (Neural net inquiries only)
-
- Basic capabilities:
- Feedforward nets with numerous training methods
- and loss functions, plus statistical analogs of
- counterpropagation and various unsupervised
- architectures
- Operating system: Lots
- System requirements: Lots
- Uses XMS or EMS for large models (PCs only): Runs under Windows, OS/2
- Approx. price: Free neural net software, but you have to license
- SAS/Base software and preferably the SAS/OR, SAS/ETS,
- and/or SAS/STAT products.
- Comments: Oriented toward data analysis and statistical applications
-
-
- 4. NeuralWorks
- Name: NeuralWorks Professional II Plus (from NeuralWare)
- Company: NeuralWare Inc.
- Address: Pittsburgh, PA 15276-9910
- Phone: (412) 787-8222
- FAX: (412) 787-8220
-
- Distributor for Europe:
- Scientific Computers GmbH.
- Franzstr. 107, 52064 Aachen
- Germany
- Tel. (49) +241-26041
- Fax. (49) +241-44983
- Email. info@scientific.de
-
- Basic capabilities:
- supports over 30 different nets: backprop, ART-1, Kohonen,
- modular neural networks, general regression, fuzzy ARTMAP,
- probabilistic nets, self-organizing maps, LVQ, Boltzmann,
- BSB, SPR, etc.
- Extendable with optional packages:
- ExplainNet, Flashcode (compiles a net into C code for runtime use),
- user-defined I/O in C possible. ExplainNet (to eliminate
- extra inputs), pruning, savebest, graph instruments like
- correlation, Hinton diagrams, RMS error graphs, etc.
- Operating system : PC,Sun,IBM RS6000,Apple Macintosh,SGI,Dec,HP.
- System requirements: varies. PC:2MB extended memory+6MB Harddisk space.
- Uses windows compatible memory driver (extended).
- Uses extended memory.
- Approx. price : call (depends on platform)
- Comments : award winning documentation, one of the market
- leaders in NN software.
-
-
- 5. MATLAB Neural Network Toolbox (for use with Matlab 4.x)
- Contact: The MathWorks, Inc. Phone: 508-653-1415
- 24 Prime Park Way FAX: 508-653-2997
- Natick, MA 01760 email: info@mathworks.com
- (Comment by Richard Andrew Miles Outerbridge, RAMO@UVPHYS.PHYS.UVIC.CA:)
- Matlab is spreading like hotcakes (and the educational discounts
- are very impressive). The newest release of Matlab (4.0) answers
- the question "if you could only program in one language what would it be?".
- The neural network toolkit is worth getting for the manual alone. Matlab is
- available with lots of other toolkits (signal processing, optimization, etc.)
- but I don't use them much - the main package is more than enough. The nice
- thing about the Matlab approach is that you can easily interface the neural
- network stuff with anything else you are doing.
-
- 6. Propagator
- Contact: ARD Corporation,
- 9151 Rumsey Road, Columbia, MD 21045, USA
- propagator@ard.com
- Easy to use neural network training package. A true GUI implementation of
- backpropagation networks with five layers (32,000 nodes per layer).
- Features dynamic performance graphs, training with a validation set,
- and C/C++ source code generation.
- For Sun (Solaris 1.x & 2.x, $499),
- PC (Windows 3.x, $199)
- Mac (System 7.x, $199)
- Floating point coprocessor required, Educational Discount,
- Money-Back Guarantee, Multi-User Discount
- Windows Demo on:
- nic.funet.fi /pub/msdos/windows/demo
- oak.oakland.edu /pub/msdos/neural_nets
- gatordem.zip pkzip 2.04g archive file
- gatordem.txt readme text file
-
- 7. NeuroForecaster
- Name: NeuroForecaster(TM)/Genetica 3.1
- Contact: Accel Infotech (S) Pte Ltd; 648 Geylang Road;
- Republic of Singapore 1438; Phone: +65-7446863; Fax: +65-7492467
- accel@solomon.technet.sg
- For IBM PC 386/486 or compatibles with mouse; requires MS Windows 3.1,
- MS-DOS 5.0 or above, 4 MB RAM, 5 MB available hard disk space, a 3.5 inch
- floppy drive, and a VGA monitor or above; math coprocessor recommended.
- Neuroforecaster 3.1 for Windows is priced at US$999 per single user
- license. For a limited period only Genetica is bundled free-of-charge.
- Please email us (accel@solomon.technet.sg) for order form.
- More information about NeuroForecaster(TM)/Genetica may be found in
- ftp.nus.sg incoming/accel.
- NeuroForecaster is a user-friendly neural network program specifically
- designed for building sophisticated and powerful forecasting and
- decision-support systems (Time-Series Forecasting, Cross-Sectional
- Classification, Indicator Analysis)
- Features:
- * GENETICA Net Builder Option for automatic network creation and optimization
- * 12 Neuro-Fuzzy Network Models
- * Multitasking & Background Training Mode
- * Unlimited Network Capacity
- * Rescaled Range Analysis & Hurst Exponent to Unveil Hidden Market Cycles
- & Check for Predictability
- * Correlation Analysis to Compute Correlation Factors to Analyze the
- Significance of Indicators
- * Weight Histogram to Monitor the Progress of Learning
- * Accumulated Error Analysis to Analyze the Strength of Input Indicators
- Its user-friendly interface allows the users to build applications quickly,
- easily and interactively, analyze the data visually and see the results
- immediately.
- The following example applications are included in the package:
- * Credit Rating - for generating the credit rating of bank loan applications
- * Stock market 6 monthly returns forecast
- * Stock selection based on company ratios
- * US$ to Deutschmark exchange rate forecast
- * US$ to Yen exchange rate forecast
- * US$ to SGD exchange rate forecast
- * Property price valuation
- * XOR - a classical problem to show the results are better than others
- * Chaos - Prediction of Mackey-Glass chaotic time series
- * SineWave - For demonstrating the power of Rescaled Range Analysis and
- significance of window size
- Techniques Implemented:
- * GENETICA Net Builder Option - network creation & optimization based on
- Darwinian evolution theory
- * Backprop Neural Networks - the most widely-used training algorithm
- * Fastprop Neural Networks - speeds up training of large problems even on slow
- machines
- * Radial Basis Function Networks - best for pattern classification problems
- * Neuro-Fuzzy Network - combines the power of neuro and fuzzy computing
- technologies
- * Rescaled Range Analysis - computes Hurst exponents to unveil hidden cycles &
- check for predictability (see the sketch after this list)
- * Correlation Analysis - to identify significant input indicators
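-
- For the curious, the rescaled range statistic behind the Hurst
- exponent is simple to compute for one window: take the range of the
- cumulative deviations from the window mean and divide by the standard
- deviation; E[R/S] grows like n^H. A rough sketch (one window only;
- a real implementation, and certainly NeuroForecaster's, regresses
- log(R/S) against log(n) over many window sizes):
-
-     #include <math.h>
-
-     /* Rescaled range R/S of one window x[0..n-1] (sketch). */
-     double rescaled_range(const double *x, int n)
-     {
-         double mean = 0.0, sd = 0.0, sum = 0.0, lo = 0.0, hi = 0.0;
-         for (int i = 0; i < n; i++) mean += x[i];
-         mean /= n;
-         for (int i = 0; i < n; i++)
-             sd += (x[i] - mean) * (x[i] - mean);
-         sd = sqrt(sd / n);
-         for (int i = 0; i < n; i++) {   /* cumulative deviations */
-             sum += x[i] - mean;
-             if (sum < lo) lo = sum;
-             if (sum > hi) hi = sum;
-         }
-         return sd > 0.0 ? (hi - lo) / sd : 0.0;
-     }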
-
- ------------------------------------------------------------------------
-
- -A17.) Neural Network hardware ?
-
- [preliminary]
- [who will write some short comment on the most important
- HW-packages and chips ?]
-
- Issue number 1 of each volume of the journal "Neural Networks" has a list
- of some dozens of suppliers of Neural Network support:
- Software, Hardware, Support, Programming, Design and Service.
-
- Here is a list of companies contributed by xli@computing-maths.cardiff.ac.uk:
-
- 1. HNC, INC.
- 5501 Oberlin Drive
- San Diego
- California 92121
- (619) 546-8877
- and a second address at
- 7799 Leesburg Pike, Suite 900
- Falls Church, Virginia
- 22043
- (703) 847-6808
- Note: Australian Dist.: Unitronics
- Tel : (09) 4701443
- Contact: Martin Keye
- HNC markets:
- 'Image Document Entry Processing Terminal' - it recognises
- handwritten documents and converts the info to ASCII.
- 'ExploreNet 3000' - a NN demonstrator
- 'Anza/DP Plus' - a neural net board with 25 MFlops or 12.5 M peak
- interconnects per second.
-
- 2. SAIC (Science Applications International Corporation)
- 10260 Campus Point Drive
- MS 71, San Diego
- CA 92121
- (619) 546 6148
- Fax: (619) 546 6736
-
- 3. Micro Devices
- 30 Skyline Drive
- Lake Mary
- FL 32746-6201
- (407) 333-4379
- MicroDevices makes MD1220 - 'Neural Bit Slice'
- Each of the products mentioned so far has a very different usage.
- Although this sounds similar to Intel's product, the
- architectures are not.
-
- 4. Intel Corp
- 2250 Mission College Blvd
- Santa Clara, Ca 95052-8125
- Attn ETANN, Mail Stop SC9-40
- (408) 765-9235
- Intel is making an experimental chip:
- 80170NW - Electrically Trainable Analog Neural Network (ETANN)
- It has 64 'neurons' on it - almost fully internally connected -
- and the chip can be put in a hierarchical architecture to do 2 billion
- interconnects per second.
- Support software has already been made by
- California Scientific Software
- 10141 Evening Star Dr #6
- Grass Valley, CA 95945-9051
- (916) 477-7481
- Their product is called 'BrainMaker'.
-
- 5. NeuralWare, Inc
- Penn Center West
- Bldg IV Suite 227
- Pittsburgh
- PA 15276
- They only sell software/simulator but for many platforms.
-
- 6. Tubb Research Limited
- 7a Lavant Street
- Petersfield
- Hampshire
- GU32 2EL
- United Kingdom
- Tel: +44 730 60256
-
- 7. Adaptive Solutions Inc
- 1400 NW Compton Drive
- Suite 340
- Beaverton, OR 97006
- U. S. A.
- Tel: 503 - 690 - 1236 FAX: 503 - 690 - 1249
-
- 8. NeuroDynamX, Inc.
- 4730 Walnut St., Suite 101B
- Boulder, CO 80301
- Voice: (303) 442-3539 Fax: (303) 442-2854
- Internet: techsupport@ndx.com
- NDX sells a number of neural network hardware products:
- NDX Neural Accelerators: a line of i860-based accelerator cards for
- the PC that give up to 45 million connections per second for use
- with the DynaMind neural network software.
- iNNTS: Intel's 80170NX (ETANN) Neural Network Training System. NDX's president
- was one of the co-designers of this chip.
-
-
- And here is an incomplete list of Neurocomputers
- (provided by jon@kongle.idt.unit.no (Jon Gunnar Solheim)):
-
- Overview of known neural computers, each with its newest known reference.
-
- Digital -- Special Computers:
-
- AAP-2
- Takumi Watanabe, Yoshi Sugiyama, Toshio Kondo, and Yoshihiro Kitamura.
- Neural network simulation on a massively parallel cellular array
- processor: AAP-2.
- In International Joint Conference on Neural Networks, 1989.
-
- ANNA
- B.E. Boser, E. Sackinger, J. Bromley, Y. LeCun, and L.D. Jackel.
- Hardware Requirements for Neural Network Pattern Classifiers.
- In IEEE Micro, 12(1), pages 32-40, February 1992.
-
- Analog Neural Computer
- Paul Mueller et al.
- Design and performance of a prototype analog neural computer.
- In Neurocomputing, 4(6):311-323, 1992.
-
- APx -- Array Processor Accelerator
- F. Pazienti.
- Neural networks simulation with array processors.
- In Advanced Computer Technology, Reliable Systems and Applications;
- Proceedings of the 5th Annual Computer Conference, pages 547-551.
- IEEE Comput. Soc. Press, May 1991. ISBN: 0-8186-2141-9.
-
- ASP -- Associative String Processor
- A. Krikelis.
- A novel massively associative processing architecture for the
- implementation of artificial neural networks.
- In 1991 International Conference on Acoustics, Speech and
- Signal Processing, volume 2, pages 1057-1060. IEEE Comput. Soc. Press,
- May 1991.
-
- BSP400
- Jan N.H. Heemskerk, Jacob M.J. Murre, Jaap Hoekstra, Leon H.J.G.
- Kemna, and Patrick T.W. Hudson.
- The BSP400: A modular neurocomputer assembled from 400 low-cost
- microprocessors.
- In International Conference on Artificial Neural Networks. Elsevier
- Science, 1991.
-
- BLAST
- J.G. Elias, M.D. Fisher, and C.M. Monemi.
- A multiprocessor machine for large-scale neural network simulation.
- In IJCNN91-Seattle: International Joint Conference on Neural
- Networks, volume 1, pages 469-474. IEEE Comput. Soc. Press, July 1991.
- ISBN: 0-7883-0164-1.
-
- CNAPS Neurocomputer
- H. McCartor.
- Back Propagation Implementation on the Adaptive Solutions CNAPS
- Neurocomputer.
- In Advances in Neural Information Processing Systems, 3, 1991.
-
- MA16 -- Neural Signal Processor
- U. Ramacher, J. Beichter, and N. Bruls.
- Architecture of a general-purpose neural signal processor.
- In IJCNN91-Seattle: International Joint Conference on Neural
- Networks, volume 1, pages 443-446. IEEE Comput. Soc. Press, July 1991.
- ISBN: 0-7083-0164-1.
-
- Mindshape
- Jan N.H. Heemskerk, Jacob M.J. Murre, Arend Melissant, Mirko Pelgrom,
- and Patrick T.W. Hudson.
- Mindshape: a neurocomputer concept based on a fractal architecture.
- In International Conference on Artificial Neural Networks. Elsevier
- Science, 1992.
-
- mod 2
- Michael L. Mumford, David K. Andes, and Lynn R. Kern.
- The mod 2 neurocomputer system design.
- In IEEE Transactions on Neural Networks, 3(3):423-433, 1992.
-
- NERV
- R. Hauser, H. Horner, R. Maenner, and M. Makhaniok.
- Architectural Considerations for NERV - a General Purpose Neural
- Network Simulation System.
- In Workshop on Parallel Processing: Logic, Organization and
- Technology -- WOPPLOT 89, pages 183-195. Springer Verlag, March 1989.
- ISBN: 3-5405-5027-5.
-
- NP -- Neural Processor
- D.A. Orrey, D.J. Myers, and J.M. Vincent.
- A high performance digital processor for implementing large artificial
- neural networks.
- In Proceedings of the IEEE 1991 Custom Integrated Circuits
- Conference, pages 16.3/1-4. IEEE Comput. Soc. Press, May 1991.
- ISBN: 0-7883-0015-7.
-
- RAP -- Ring Array Processor
- N. Morgan, J. Beck, P. Kohn, J. Bilmes, E. Allman, and J. Beer.
- The ring array processor: A multiprocessing peripheral for connectionist
- applications.
- In Journal of Parallel and Distributed Computing, pages
- 248-259, April 1992.
-
- RENNS -- REconfigurable Neural Networks Server
- O. Landsverk, J. Greipsland, J.A. Mathisen, J.G. Solheim, and L. Utne.
- RENNS - a Reconfigurable Computer System for Simulating Artificial
- Neural Network Algorithms.
- In Parallel and Distributed Computing Systems, Proceedings of the
- ISMM 5th International Conference, pages 251-256. The International
- Society for Mini and Microcomputers - ISMM, October 1992.
- ISBN: 1-8808-4302-1.
-
- SMART -- Sparse Matrix Adaptive and Recursive Transforms
- P. Bessiere, A. Chams, A. Guerin, J. Herault, C. Jutten, and J.C. Lawson.
- From Hardware to Software: Designing a "Neurostation".
- In VLSI design of Neural Networks, pages 311-335, June 1990.
-
- SNAP -- Scalable Neurocomputer Array Processor
- E. Wojciechowski.
- SNAP: A parallel processor for implementing real time neural networks.
- In Proceedings of the IEEE 1991 National Aerospace and Electronics
- Conference; NAECON-91, volume 2, pages 736-742. IEEE Comput. Soc. Press,
- May 1991.
-
- Toroidal Neural Network Processor
- S. Jones, K. Sammut, C. Nielsen, and J. Staunstrup.
- Toroidal Neural Network: Architecture and Processor Granularity
- Issues.
- In VLSI design of Neural Networks, pages 229-254, June 1990.
-
- SMART and SuperNode
- P. Bessiere, A. Chams, and P. Chol.
- MENTAL: A virtual machine approach to artificial neural networks
- programming.
- In NERVES, ESPRIT B.R.A. project no 3049, 1991.
- (The report is archived on neuroprose.)
-
-
- Standard Computers:
-
- EMMA-2
- R. Battiti, L.M. Briano, R. Cecinati, A.M. Colla, and P. Guido.
- An application oriented development environment for Neural Net models on
- the multiprocessor Emma-2.
- In Silicon Architectures for Neural Nets; Proceedings for the IFIP
- WG.10.5 Workshop, pages 31-43. North Holland, November 1991.
- ISBN: 0-4448-9113-7.
-
- iPSC/860 Hypercube
- D. Jackson and D. Hammerstrom.
- Distributing Back Propagation Networks Over the Intel iPSC/860
- Hypercube.
- In IJCNN91-Seattle: International Joint Conference on Neural
- Networks, volume 1, pages 569-574. IEEE Comput. Soc. Press, July 1991.
- ISBN: 0-7083-0164-1.
-
- SCAP -- Systolic/Cellular Array Processor
- Wei-Ling L., V.K. Prasanna, and K.W. Przytula.
- Algorithmic Mapping of Neural Network Models onto Parallel SIMD
- Machines.
- In IEEE Transactions on Computers, 40(12), pages 1390-1401,
- December 1991. ISSN: 0018-9340.
-
- ------------------------------------------------------------------------
-
- -A19.) Databases for experimentation with NNs ?
-
- [are there any more ?]
-
- 1. The neural-bench Benchmark collection
- accessible via anonymous FTP on
- "ftp.cs.cmu.edu" [128.2.206.173]
- in directory
- "/afs/cs/project/connect/bench"
- In case of problems email contact is "neural-bench@cs.cmu.edu".
- The data sets in this repository include the 'nettalk' data,
- 'two spirals', protein structure prediction, vowel recognition,
- sonar signal classification, and a few others.
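-
- The 'two spirals' task, for example, asks a net to classify points
- lying on two interlocked spirals; the widely used version has 97
- points per spiral. A sketch of the usual construction (check the
- repository's own generator for the authoritative data):
-
-     #include <math.h>
-     #include <stdio.h>
-
-     /* Emit two interlocked spirals, 97 points each; the second
-        spiral is the point reflection of the first (sketch). */
-     int main(void)
-     {
-         const double pi = 3.14159265358979;
-         for (int i = 0; i < 97; i++) {
-             double angle  = i * pi / 16.0;
-             double radius = 6.5 * (104 - i) / 104.0;
-             double x = radius * sin(angle);
-             double y = radius * cos(angle);
-             printf("% .4f % .4f 0\n",  x,  y);
-             printf("% .4f % .4f 1\n", -x, -y);
-         }
-         return 0;
-     }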
-
- 2. UCI machine learning database
- accessible via anonymous FTP on
- "ics.uci.edu" [128.195.1.1]
- in directory
- "/pub/machine-learning-databases"
-
- 3. NIST special databases of the National Institute Of Standards
- And Technology:
- NIST special database 2:
- Structured Forms Reference Set (SFRS)
-
- The NIST database of structured forms contains 5,590 full page images
- of simulated tax forms completed using machine print. THERE IS NO REAL
- TAX DATA IN THIS DATABASE. The structured forms used in this database
- are 12 different forms from the 1988 IRS 1040 Package X. These
- include Forms 1040, 2106, 2441, 4562, and 6251 together with Schedules
- A, B, C, D, E, F and SE. Eight of these forms contain two pages or
- form faces making a total of 20 form faces represented in the
- database. Each image is stored in bi-level black and white raster
- format. The images in this database appear to be real forms prepared
- by individuals but the images have been automatically derived and
- synthesized using a computer and contain no "real" tax data. The entry
- field values on the forms have been automatically generated by a
- computer in order to make the data available without the danger of
- distributing privileged tax information. In addition to the images
- the database includes 5,590 answer files, one for each image. Each
- answer file contains an ASCII representation of the data found in the
- entry fields on the corresponding image. Image format documentation
- and example software are also provided. The uncompressed database
- totals approximately 5.9 gigabytes of data.
-
- NIST special database 3:
- Binary Images of Handwritten Segmented Characters (HWSC)
-
- Contains 313,389 isolated character images segmented from the
- 2,100 full-page images distributed with "NIST Special Database 1".
- 223,125 digits, 44,951 upper-case, and 45,313 lower-case character
- images. Each character image has been centered in a separate
- 128 by 128 pixel region; the error rate of the segmentation and
- assigned classification is less than 0.1%.
- The uncompressed database totals approximately 2.75 gigabytes of
- image data and includes image format documentation and example software.
-
-
- NIST special database 4:
- 8-Bit Gray Scale Images of Fingerprint Image Groups (FIGS)
-
- The NIST database of fingerprint images contains 2000 8-bit gray scale
- fingerprint image pairs. Each image is 512 by 512 pixels with 32 rows
- of white space at the bottom and classified using one of the five
- following classes: A=Arch, L=Left Loop, R=Right Loop, T=Tented Arch,
- W=Whirl. The database is evenly distributed over each of the five
- classifications with 400 fingerprint pairs from each class. The images
- are compressed using a modified JPEG lossless compression algorithm
- and require approximately 636 Megabytes of storage compressed and 1.1
- Gigabytes uncompressed (1.6 : 1 compression ratio). The database also
- includes format documentation and example software.
-
- A shorter overview:
- Special Database 1 - NIST Binary Images of Printed Digits, Alphas, and Text
- Special Database 2 - NIST Structured Forms Reference Set of Binary Images
- Special Database 3 - NIST Binary Images of Handwritten Segmented Characters
- Special Database 4 - NIST 8-bit Gray Scale Images of Fingerprint Image Groups
- Special Database 6 - NIST Structured Forms Reference Set 2 of Binary Images
- Special Database 7 - NIST Test Data 1: Binary Images of Handprinted Segmented
- Characters
- Special Software 1 - NIST Scoring Package Release 1.0
-
- Special Database 1 - $895.00
- Special Database 2 - $250.00
- Special Database 3 - $895.00
- Special Database 4 - $250.00
- Special Database 6 - $250.00
- Special Database 7 - $1,000.00
- Special Software 1 - $1,150.00
-
- The system requirements for all databases are a 5.25" CD-ROM drive
- with software to read ISO-9660 format.
-
- Contact: Darrin L. Dimmick
- dld@magi.ncsl.nist.gov (301)975-4147
-
- If you wish to order the database, please contact:
- Standard Reference Data
- National Institute of Standards and Technology
- 221/A323
- Gaithersburg, MD 20899
- (301)975-2208 or (301)926-0416 (FAX)
-
- 4. CEDAR CD-ROM 1: Database of Handwritten
- Cities, States, ZIP Codes, Digits, and Alphabetic Characters
-
- The Center Of Excellence for Document Analysis and Recognition (CEDAR)
- at the State University of New York at Buffalo announces the
- availability of CEDAR CDROM 1 (USPS Office of Advanced Technology).
- The database contains handwritten words and ZIP Codes
- in high resolution grayscale (300 ppi 8-bit) as well as
- binary handwritten digits and alphabetic characters (300 ppi
- 1-bit). This database is intended to encourage research in
- off-line handwriting recognition by providing access to
- handwriting samples digitized from envelopes in a working
- post office.
- Specifications of the database include:
- + 300 ppi 8-bit grayscale handwritten words (cities,
- states, ZIP Codes)
- o 5632 city words
- o 4938 state words
- o 9454 ZIP Codes
- + 300 ppi binary handwritten characters and digits:
- o 27,837 mixed alphas and numerics segmented
- from address blocks
- o 21,179 digits segmented from ZIP Codes
- + every image supplied with a manually determined
- truth value
- + extracted from live mail in a working U.S. Post
- Office
- + word images in the test set supplied with
- dictionaries of postal words that simulate partial
- recognition of the corresponding ZIP Code.
- + digit images included in the test set that simulate
- automatic ZIP Code segmentation. Results on these
- data can be projected to overall ZIP Code recognition
- performance.
- + image format documentation and software included
- System requirements are a 5.25" CD-ROM drive with software to read
- ISO-9660 format.
- For any further information, including how to order the
- database, please contact:
- Jonathan J. Hull, Associate Director, CEDAR, 226 Bell Hall
- State University of New York at Buffalo, Buffalo, NY 14260
- hull@cs.buffalo.edu (email)
-
- 5. AI-CD-ROM (see above under "other sources of information about NNs")
-
- 6. Time series archive
- Various datasets of time series (to be used for prediction learning
- problems) are available for anonymous ftp at
- ftp.santafe.edu in pub/Time-Series.
- For example:
- - fluctuations in a far-infrared laser
- - Physiological data of patients with sleep apnea
- - High frequency currency exchange rate data
- - Intensity of a white dwarf star
- - J.S. Bach's final (unfinished) fugue from "Die Kunst der Fuge"
- Some of the datasets were used in a prediction contest and are described
- in detail in the book "Time series prediction: Forecasting the future
- and understanding the past", edited by Weigend/Gershenfield, Proceedings
- Volume XV in the Santa Fe Institute Studies in the Sciences of Complexity
- series of Addison Wesley (1994).
-
- ------------------------------------------------------------------------
-
-
-
- That's all folks.
-
- ========================================================================
-
- Acknowledgements: Thanks to all the people who helped to get the stuff
- above into the posting. I cannot name them all, because
- I would make far too many errors then. :->
-
- No ? Not good ? You want individual credit ?
- OK, OK. I'll try to name them all. But: no guarantee....
-
- THANKS FOR HELP TO:
- (in alphabetical order of email addresses, I hope)
-
- Allen Bonde <ab04@harvey.gte.com>
- Accel Infotech Spore Pte Ltd <accel@solomon.technet.sg>
- Alexander Linden <al@jargon.gmd.de>
- S.Taimi Ames <ames@reed.edu>
- Axel Mulder <amulder@move.kines.sfu.ca>
- anderson@atc.boeing.com
- Davide Anguita <anguita@ICSI.Berkeley.EDU>
- Avraam Pouliakis <apou@leon.nrcps.ariadne-t.gr>
- Kim L. Blackwell <avrama@helix.nih.gov>
- Paul Bakker <bakker@cs.uq.oz.au>
- Jamshed Bharucha <bharucha@casbs.Stanford.EDU>
- Yijun Cai <caiy@mercury.cs.uregina.ca>
- L. Leon Campbell <campbell@brahms.udel.edu>
- Yaron Danon <danony@goya.its.rpi.edu>
- David Ewing <dave@ndx.com>
- David DeMers <demers@cs.ucsd.edu>
- Denni Rognvaldsson <denni@thep.lu.se>
- Donald Tveter <drt@mcs.com>
- Frank Schnorrenberg <fs0997@easttexas.tamu.edu>
- Gary Lawrence Murphy <garym@maya.isis.org>
- gaudiano@park.bu.edu
- Lee Giles <giles@research.nj.nec.com>
- Glen Clark <opto!glen@gatech.edu>
- Phil Goodman <goodman@unr.edu>
- guy@minster.york.ac.uk
- Joerg Heitkoetter <heitkoet@lusty.informatik.uni-dortmund.de>
- Ralf Hohenstein <hohenst@math.uni-muenster.de>
- Jean-Denis Muller <jdmuller@vnet.ibm.com>
- Jeff Harpster <uu0979!jeff@uu9.psi.com>
- Jonathan Kamens <jik@MIT.Edu>
- JJ Merelo <jmerelo@casip.ugr.es>
- Jon Gunnar Solheim <jon@kongle.idt.unit.no>
- Josef Nelissen <jonas@beor.informatik.rwth-aachen.de>
- Kjetil.Noervaag@idt.unit.no
- Luke Koops <koops@gaul.csd.uwo.ca>
- William Mackeown <mackeown@compsci.bristol.ac.uk>
- Peter Marvit <marvit@cattell.psych.upenn.edu>
- masud@worldbank.org
- Yoshiro Miyata <miyata@sccs.chukyo-u.ac.jp>
- Madhav Moganti <mmogati@cs.umr.edu>
- Jyrki Alakuijala <more@ee.oulu.fi>
- mrs@kithrup.com
- Maciek Sitnik <msitnik@plearn.edu.pl>
- R. Steven Rainwater <ncc@ncc.jvnc.net>
- Michael Plonski <plonski@aero.org>
- Lutz Prechelt <prechelt@ira.uka.de> [creator of FAQ]
- Richard Andrew Miles Outerbridge <ramo@uvphys.phys.uvic.ca>
- Richard Cornelius <richc@rsf.atd.ucar.edu>
- Rob Cunningham <rkc@xn.ll.mit.edu>
- Robert.Kocjancic@IJS.si
- Osamu Saito <saito@nttica.ntt.jp>
- Sheryl Cormicle <sherylc@umich.edu>
- Ted Stockwell <ted@aps1.spa.umn.edu>
- Thomas G. Dietterich <tgd@research.cs.orst.edu>
- Thomas.Vogel@cl.cam.ac.uk
- Ulrich Wendl <uli@unido.informatik.uni-dortmund.de>
- Matthew P Wiener <weemba@sagi.wistar.upenn.edu>
- Wesley Elsberry <welsberr@orca.tamu.edu>
-
- Bye
-
- Lutz
-
- --
- Lutz Prechelt (email: prechelt@ira.uka.de) | Whenever you
- Institut fuer Programmstrukturen und Datenorganisation | complicate things,
- Universitaet Karlsruhe; 76128 Karlsruhe; Germany | they get
- (Voice: ++49/721/608-4068, FAX: ++49/721/694092) | less simple.
-