- Path: sparky!uunet!think.com!ames!agate!ucbvax!CATTELL.PSYCH.UPENN.EDU!neuron-request
- From: neuron-request@CATTELL.PSYCH.UPENN.EDU ("Neuron-Digest Moderator")
- Newsgroups: comp.ai.neural-nets
- Subject: Neuron Digest V10 #24 (jobs + discussion + software)
- Message-ID: <16793.725498849@cattell.psych.upenn.edu>
- Date: 27 Dec 92 23:27:29 GMT
- Sender: daemon@ucbvax.BERKELEY.EDU
- Reply-To: "Neuron-Request" <neuron-request@cattell.psych.upenn.edu>
- Distribution: world
- Organization: University of Pennsylvania
- Lines: 530
-
- Neuron Digest Sunday, 27 Dec 1992
- Volume 10 : Issue 24
-
- Today's Topics:
- Doctoral Program in Philosophy-Psychology-Neuroscience
- some information needed
- Job Opportunity
- Follow-up on product guide
- Very Fast Simulated Reannealing (VFSR) v6.35 in Netlib
- NIPS workshop summary
-
-
- Send submissions, questions, address maintenance, and requests for old
- issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
- available from cattell.psych.upenn.edu (130.91.68.31). Back issues
- requested by mail will eventually be sent, but may take a while.
-
- ----------------------------------------------------------------------
-
- Subject: Doctoral Program in Philosophy-Psychology-Neuroscience
- From: Andy Clark <andycl@syma.sussex.ac.uk>
- Date: Tue, 15 Dec 92 16:43:25 +0000
-
-
-
- First Announcement of a New Doctoral Programme in
-
-
- PHILOSOPHY-NEUROSCIENCE-PSYCHOLOGY
-
- at
-
- Washington University in St. Louis
-
-
-
- The Philosophy-Neuroscience-Psychology (PNP) program offers a unique
- opportunity to combine advanced philosophical studies with in-depth work
- in Neuroscience or Psychology. In addition to meeting the usual
- requirements for a Doctorate in Philosophy, students will spend one year
- working in Neuroscience or Psychology. The Neuroscience option will draw
- on the resources of the Washington University School of Medicine which is
- an internationally acknowledged center of excellence in neuroscientific
- research. The initiative will also employ several new PNP-related
- Philosophy faculty and post-doctoral fellows.
-
-
- Students admitted to the PNP program will embark upon a five-year course
- of study designed to fulfill all the requirements for the Ph.D. in
- philosophy, including an academic year studying neuroscience at
- Washington University's School of Medicine or psychology in the
- Department of Psychology. Finally, each PNP student will write a
- dissertation jointly directed by a philosopher and a faculty member from
- either the medical school or the psychology department.
-
- THE FACULTY
-
- Roger F. Gibson, Ph.D., Missouri, Professor and Chair:
- Philosophy of Language, Epistemology, Quine
-
- Robert B. Barrett, Ph.D., Johns Hopkins, Professor:
- Pragmatism, Renaissance Science, Philosophy of Social
- Science, Analytic Philosophy.
-
- Andy Clark, Ph.D., Stirling, Visiting Professor (1993-6) and
- Acting Director of PNP:
- Philosophy of Cognitive Science, Philosophy of Mind,
- Philosophy of Language, Connectionism.
-
- J. Claude Evans, Ph.D., SUNY-Stony Brook, Associate Professor:
- Modern Philosophy, Contemporary Continental Philosophy,
- Phenomenology, Analytic Philosophy, Social and Political Theory.
-
- Marilyn A. Friedman, Ph.D., Western Ontario, Associate
- Professor: Ethics, Social Philosophy, Feminist Theory.
-
- William H. Gass, Ph.D., Cornell, Distinguished University
- Professor of the Humanities: Philosophy of Literature,
- Photography, Architecture.
-
- Lucian W. Krukowski, Ph.D., Washington University, Professor:
- 20th Century Aesthetics, Philosophy of Art, 18th and 19th Century
- Philosophy, Kant, Hegel, Schopenhauer.
-
- Josefa Toribio Mateas, Ph.D., Complutense University,
- Assistant Professor: Philosophy of Language, Philosophy
- of Mind.
-
- Larry May, Ph.D., New School for Social Research, Professor:
- Social and Political Philosophy, Philosophy of Law, Moral and
- Legal Responsibility.
-
- Stanley L. Paulson, Ph.D., Wisconsin, J.D., Harvard, Professor:
- Philosophy of Law.
-
- Mark Rollins, Ph.D., Columbia, Assistant Professor:
- Philosophy of Mind, Epistemology, Philosophy of Science,
- Neuroscience.
-
- Jerome P. Schiller, Ph.D., Harvard, Professor: Ancient
- Philosophy, Plato, Aristotle.
-
- Joyce Trebilcot, Ph.D., California at Santa Barbara, Associate
- Professor: Feminist Philosophy.
-
- Joseph S. Ullian, Ph.D., Harvard, Professor: Logic, Philosophy of
- Mathematics, Philosophy of Language.
-
- Richard A. Watson, Ph.D., Iowa, Professor: Modern Philosophy,
- Descartes, Historical Sciences.
-
- Carl P. Wellman, Ph.D., Harvard, Hortense and Tobias Lewin
- Professor in the Humanities: Ethics, Philosophy of Law,
- Legal and Moral Rights.
-
- EMERITI
-
- Richard H. Popkin, Ph.D., Columbia: History of Ideas,
- Jewish Intellectual History.
-
- Alfred J. Stenner, Ph.D., Michigan State: Philosophy of
- Science, Epistemology, Philosophy of Language.
-
- FINANCIAL SUPPORT
-
- Students admitted to the Philosophy-Neuroscience-Psychology (PNP) program
- are eligible for five years of full financial support at competitive
- rates, contingent on satisfactory academic progress.
-
- APPLICATIONS
-
- Application for admission to the Graduate School should be made to:
- Chair, Graduate Admissions
- Department of Philosophy
- Washington University
- Campus Box 1073
- One Brookings Drive
- St. Louis, MO 63130-4899
-
- Washington University encourages and gives full consideration to all
- applicants for admission and financial aid without regard to race, color,
- national origin, handicap, sex, or religious creed. Services for
- students with hearing, visual, orthopedic, learning, or other
- disabilities are coordinated through the office of the Assistant Dean for
- Special Services.
-
-
- ------------------------------
-
- Subject: some information needed
- From: Antonio Villani <ANTONIO%IVRUNIV.bitnet@ICINECA.CINECA.IT>
- Organization: "Information Center - Verona University - Italy"
- Date: Tue, 15 Dec 92 16:56:46 -0100
-
- I'm looking for any kind of information about 'avalanche network' and
- 'neural networks for prediction' applied to dynamic signal processing.
- Can someone help me? Thanks in advance
-
- Antonio Villani
- antonio@ivruniv.bitnet
-
-
- ------------------------------
-
- Subject: Job Opportunity
- From: Marwan Jabri <marwan@ee.su.oz.au>
- Date: Thu, 17 Dec 92 10:19:55 +1100
-
- The University of Sydney
- Department of Electrical Engineering
- Systems Engineering and Design Automation Laboratory
-
- Girling Watson Research Fellowship
- Reference No. 51/12
-
-
- Applications are invited for a Girling Watson Research Fellowship at
- Sydney University Electrical Engineering. The applicant should have
- strong research and development experience, preferably with a background
- in one or more of the following areas: machine intelligence and
- connectionist architectures, microelectronics, pattern recognition and
- classification.
-
- The Fellow will work with the Systems Engineering and Design Automation
- Laboratory (SEDAL), one of the largest laboratories at Sydney University
- Electrical Engineering. The Fellow will join a group of 18 people (8 staff
- and 10 postgraduate students). SEDAL currently has projects on pattern
- recognition for implantable devices, VLSI implementation of connectionist
- architectures, time series prediction, knowledge integration and
- continuous learning, and VLSI computer aided design. The Research Fellow
- position is aimed at:
-
- o contributing to the research program
- o helping with the supervision of postgraduate students
- o supporting some management aspects of SEDAL
- o providing occasional teaching support
-
- Applicants should have either a PhD or equivalent industry research and
- development experience. The appointment is available for a period of
- three years, subject to satisfactory progress.
-
- Salary is in the Research Fellow range: A$39,463 to A$48,688.
-
-
- Applications quoting the reference number 51/12 can be sent to:
-
- The Staff Office
- The University of Sydney
- NSW 2006
- AUSTRALIA
-
-
-
- For further information contact
-
- Dr. M. Jabri,
- Tel: (+61-2) 692-2240,
- Fax: (+61-2) 660-1228,
- Email: marwan@sedal.su.oz.au
-
-
- ------------------------------
-
- Subject: Follow-up on product guide
- From: eric@sunlight.llnl.gov (Eric Keto)
- Date: Thu, 17 Dec 92 17:01:36 -0800
-
- >> Thanks for posting my question about neural net product reviews. I
- >> received a response with the information that there is a recent
- >> review in the July-August 1992 PC AI magazine.
- >
- >Could you write up a little note about what you found and send it to
- >neuron@cattell... I'm sure others would like to know also.
-
- OK, I finally got this magazine.
-
- Here is your note:
-
- In the July-August 1992 issue of PC-AI magazine there is a "4th Annual
- Product Guide" which includes "information on products in a number of AI
- areas" including neural nets. The list of products is quite long, 13
- pages of tiny type, and the descriptions are quite brief: product name,
- vendor, 20 words or so of description, requirements, and price. This is
- certainly not a critical review, but it is an extensive list.
-
- Eric Keto (eric@sunlight.llnl.gov)
-
-
- ------------------------------
-
- Subject: Very Fast Simulated Reannealing (VFSR) v6.35 in Netlib
- From: Lester Ingber <ingber@alumni.cco.caltech.edu>
- Date: Fri, 18 Dec 92 05:47:07 -0800
-
- Very Fast Simulated Reannealing (VFSR) v6.35
-
- Netlib requested an early update, and VFSR v6.35 is now in Netlib
- and soon will be updated in Statlib. The code is stable, and is
- being used widely. The changes to date typically correct typos and
- account for some problems encountered on particular machines.
-
- NETLIB
- Interactive:
- ftp research.att.com
- [login as netlib, your_login_name as password]
- cd opt
- binary
- get vfsr.Z
- Email:
- mail netlib@research.att.com
- send vfsr from opt
-
- STATLIB
- Interactive:
- ftp lib.stat.cmu.edu
- [login as statlib, your_login_name as password]
- cd general
- get vfsr
- Email:
- mail statlib@lib.stat.cmu.edu
- send vfsr from general
-
- EXCERPT FROM README
- 2. Background and Context
-
- VFSR was developed in 1987 to deal with the necessity of
- performing adaptive global optimization on multivariate nonlinear
- stochastic systems[2]. VFSR was recoded and applied to several
- complex systems, in combat analysis[3], finance[4], and
- neuroscience[5]. A comparison has shown VFSR to be superior to a
- standard genetic algorithm simulation on a suite of standard test
- problems[6], and VFSR has been examined in the context of a
- review of methods of simulated annealing[7]. A project comparing
- standard Boltzmann annealing with "fast" Cauchy annealing with
- VFSR has concluded that VFSR is a superior algorithm[8]. A paper
- has indicated how this technique can be enhanced by combining it
- with some other powerful algorithms[9].
-
-
- || Prof. Lester Ingber [10ATT]0-700-L-INGBER ||
- || Lester Ingber Research Fax: 0-700-4-INGBER ||
- || P.O. Box 857 Voice Mail: 1-800-VMAIL-LI ||
- || McLean, VA 22101 EMail: ingber@alumni.caltech.edu ||
-
-
- ------------------------------
-
- Subject: NIPS workshop summary
- From: "Scott A. Markel x2683" <sam@sarnoff.com>
- Date: Wed, 16 Dec 92 15:57:30 -0500
-
-
- [[ Editor's Note: Since I did not go to NIPS, I greatly appreciated this
- summary. I hope (and urge) that readers will contribute their own
- summaries of future conferences and workshops. -PM ]]
-
- NIPS 92 Workshop Summary
- ========================
-
- Computational Issues in Neural Network Training
- ===============================================
-
- Main focus: Optimization algorithms used in training neural networks
- - ----------
-
- Organizers: Scott Markel and Roger Crane
- - ----------
-
- This was a one-day workshop exploring the use of optimization algorithms, such
- as back-propagation, conjugate gradient, and sequential quadratic programming,
- in neural network training. Approximately 20-25 people participated in the
- workshop. About two-thirds of the participants used some flavor of
- back-propagation as their algorithm of choice, with the other third using
- conjugate gradient, sequential quadratic programming, or something else. I
- would guess that participants were split about 60-40 between industry and the
- academic community.
-
-
- The workshop consisted of lots of discussion and the following presentations:
-
- Introduction
- - ------------
- Scott Markel (David Sarnoff Research Center - smarkel@sarnoff.com)
-
- I opened by saying that Roger and I are mathematicians and started
- looking at neural network training problems when neural net researchers
- were experiencing difficulties with back-propagation. We think there are
- some wonderfully advanced and robust implementations of classical
- algorithms developed by the mathematical optimization community that are
- not being exploited by the neural network community. This is due largely
- to a lack of interaction between the two communities. This workshop was
- set up to address that issue. In July we organized a similar workshop
- for applied mathematicians at SIAM '92 in Los Angeles.
-
-
- Optimization Overview
- - ---------------------
- Roger Crane (David Sarnoff Research Center - rcrane@sarnoff.com)
-
- Roger gave a very brief, but broad, historical overview of optimization
- algorithm research and development in the mathematical community. He
- showed a timeline starting with gradient descent in the 1950s and
- progressing to sequential quadratic programming (SQP) in the 1970s and
- 1980s. SQP is the current state-of-the-art algorithm for
- constrained optimization. It is a second-order method that solves a
- sequence of quadratic approximation subproblems. SQP is quite frugal with
- function evaluations and handles both linear and nonlinear constraints.
- Roger stressed the robustness of the algorithms found in commercial packages
- (e.g., the NAG library) and that reinventing the wheel was usually not a good
- idea, since many subtleties will be missed. A good reference for
- this material is
-
- Practical Optimization
- Gill, P. E., Murray, W., and Wright, M. H.
- Academic Press: London and New York
- 1981
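-
- [[ Editor's note: as a concrete illustration of how SQP solves a sequence
- of quadratic approximation subproblems, a small sketch of one
- equality-constrained SQP step on a toy problem follows. It is not from
- the workshop and not a NAG routine; the problem, the names, and the use
- of the plain Hessian of f are all illustrative assumptions. ]]
-
- import numpy as np
-
- # Toy problem: minimize f(w) = w1^2 + 2*w2^2 subject to c(w) = w1 + w2 - 1 = 0.
- # Each SQP iteration minimizes a quadratic model of f subject to the
- # linearized constraint, which reduces to the KKT linear system below.
- def sqp_step(w):
-     g = np.array([2.0 * w[0], 4.0 * w[1]])        # gradient of f
-     H = np.diag([2.0, 4.0])                        # Hessian of the quadratic model
-     c = np.array([w[0] + w[1] - 1.0])              # constraint value
-     A = np.array([[1.0, 1.0]])                     # constraint Jacobian
-     K = np.block([[H, A.T], [A, np.zeros((1, 1))]])
-     rhs = np.concatenate([-g, -c])
-     sol = np.linalg.solve(K, rhs)                  # [step; new multiplier]
-     return w + sol[:2]
-
- w = np.zeros(2)
- for _ in range(5):
-     w = sqp_step(w)
- print(w)    # converges to the constrained minimizer (2/3, 1/3)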
-
- Roger's overview generated a lot of discussion. Most of it centered
- on the fact that second-order methods involve using the Hessian, or
- an approximation to it, and that this is impractical for large problems
- (> 500-1000 parameters). Participants also commented that the
- mathematical optimization community has not yet fully realized this and
- that stochastic optimization techniques are needed for these large
- problems. All classical methods are inherently deterministic and work
- only for "batch" training.
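-
- [[ Editor's note: to make the scaling objection concrete, a small
- illustrative sketch follows (a toy quadratic objective, not anything
- shown at the workshop). A full second-order step must store and factor
- an n x n Hessian, while a gradient step needs only the n-vector g. ]]
-
- import numpy as np
-
- n = 1000                              # around the size participants called impractical
- rng = np.random.default_rng(0)
-
- # Toy quadratic f(w) = 0.5*w'Aw - b'w, so the Hessian is A and the gradient is Aw - b.
- A = rng.standard_normal((n, n))
- A = A @ A.T + n * np.eye(n)           # make it symmetric positive definite
- b = rng.standard_normal(n)
- w = np.zeros(n)
-
- g = A @ w - b                         # first-order information: n numbers
- newton_step = np.linalg.solve(A, -g)  # needs all n*n Hessian entries, O(n^3) solve
- gradient_step = -1e-3 * g             # first-order step: no Hessian at all
-
- print("Hessian entries stored :", A.size)   # 1,000,000
- print("gradient entries stored:", g.size)   # 1,000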
-
-
- SQP on a Test Problem
- - ---------------------
- Scott Markel (David Sarnoff Research Center - smarkel@sarnoff.com)
-
- I followed Roger's presentation with a short set of slides showing actual
- convergence of a neural network training problem where SQP was the
- training algorithm. Most of the workshop participants had not seen this
- kind of convergence before. Yann Le Cun noted that with such sharp
- convergence, generalization would probably be pretty bad. I noted that
- sharp convergence was necessary if one was trying to do something like
- count local minima, where generalization is not an issue.
-
-
- In Defense of Gradient Descent
- - ------------------------------
- Barak Pearlmutter (Oregon Graduate Institute - bap@merlot.cse.ogi.edu)
-
- By this point back-propagation and its many flavors had been well
- defended by the audience. Barak's presentation captured the main
- points in a clarifying manner. He gave examples of real neural network
- applications with thousands, millions, and billions of connections.
- This underscored the need for stochastic optimization techniques. Barak
- also made some general remarks about the characteristics of error
- surfaces. Some earlier work by Barak on gradient descent and
- second-order momentum can be found in the NIPS-4 proceedings (p. 887). A strong
- plea was made by Barak, and echoed by the other participants, for fair
- comparisons between training methods. Fair comparisons are rare, but
- much needed.
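-
- [[ Editor's note: for readers unfamiliar with the stochastic, incremental
- style of training being defended here, a minimal sketch of mini-batch
- gradient descent with momentum on a toy least-squares problem follows.
- The data, batch size, and step sizes are arbitrary choices of mine, not
- anything Barak presented. ]]
-
- import numpy as np
-
- rng = np.random.default_rng(1)
- X = rng.standard_normal((10000, 50))             # toy "training set"
- w_true = rng.standard_normal(50)
- y = X @ w_true + 0.1 * rng.standard_normal(10000)
-
- w = np.zeros(50)
- velocity = np.zeros(50)
- lr, momentum, batch = 0.01, 0.9, 32
-
- for step in range(2000):
-     idx = rng.integers(0, len(X), size=batch)    # sample a mini-batch
-     grad = X[idx].T @ (X[idx] @ w - y[idx]) / batch
-     velocity = momentum * velocity - lr * grad   # momentum smooths noisy gradients
-     w += velocity
-
- print("parameter error:", np.linalg.norm(w - w_true))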
-
-
- Very Fast Simulated Reannealing
- - -------------------------------
- Bruce Rosen (University of Texas at San Antonio - rosen@ringer.cs.utsa.edu)
-
- This presentation focused on a new optimization technique called Very
- Fast Simulated Reannealing (VFSR), which is faster than Boltzmann
- Annealing (BA) and Fast (Cauchy) Annealing (FA). Unlike
- back-propagation, which Bruce considers mostly a method for pattern
- association/classification/generalization, simulated annealing methods
- are perhaps best used for function optimization. He presented some
- results on this work, showing a comparison of Very Fast Simulated
- Reannealing to a genetic algorithm (GA) for function optimization, and
- some recent work on
- function optimization with BA, FA, and VFSR.
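-
- [[ Editor's note: for readers who have not seen VFSR, here is a rough
- sketch, based on Ingber's published description rather than on the
- netlib code, of the annealing-temperature schedules involved and of
- VFSR's parameter-generating step; the function names are mine. ]]
-
- import numpy as np
-
- # Annealing temperature as a function of the annealing-time index k.
- def boltzmann_T(T0, k):                  # Boltzmann annealing (BA)
-     return T0 / np.log(k + 1.0)
-
- def cauchy_T(T0, k):                     # fast (Cauchy) annealing (FA)
-     return T0 / (k + 1.0)
-
- def vfsr_T(T0, k, c, D):                 # VFSR: exponentially fast in k**(1/D)
-     return T0 * np.exp(-c * k ** (1.0 / D))
-
- # VFSR generates a new trial value for one parameter constrained to [lo, hi]:
- def vfsr_generate(x, T, lo, hi, rng):
-     u = rng.uniform()
-     y = np.sign(u - 0.5) * T * ((1.0 + 1.0 / T) ** abs(2.0 * u - 1.0) - 1.0)
-     return np.clip(x + y * (hi - lo), lo, hi)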
-
- Bruce's (and Lester Ingber's) code is available from netlib -
-
- Interactive:
- ftp research.att.com
- [login as netlib, your_login_name as password]
- cd opt
- binary
- get vfsr.Z
- Email:
- mail netlib@research.att.com
- send vfsr from opt
-
- Contact Bruce (rosen@ringer.cs.utsa.edu) or Lester
- (ingber@alumni.cco.caltech.edu) for further information.
-
-
- General Comments
- - ----------------
- Yann Le Cun (AT&T Bell Labs - yann@neural.att.com)
-
- I asked Yann to summarize some of the comments he and others had been
- making during the morning session. Even though we didn't give him much
- time to prepare, he nicely outlined the main points. These included
-
- - - large problems require stochastic methods
- - - the mathematical community hasn't yet addressed the needs of the neural
- network community
- - neural network researchers are using second-order information in a variety of
- ways, but are definitely exploring uncharted territory
- - - symmetric sigmoids are necessary; [0,1] sigmoids cause scaling problems
- (Roger commented that classical methods would accommodate this)
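-
- [[ Editor's note: a quick numerical illustration of the last point (mine,
- not from the workshop): on zero-mean inputs a [0,1] logistic unit emits
- outputs with mean near 0.5, so every downstream unit sees a systematic
- offset, whereas a symmetric sigmoid such as tanh stays roughly zero-mean. ]]
-
- import numpy as np
-
- rng = np.random.default_rng(3)
- x = rng.standard_normal(100000)            # zero-mean pre-activations
-
- logistic = 1.0 / (1.0 + np.exp(-x))        # [0,1] sigmoid
- symmetric = np.tanh(x)                     # symmetric sigmoid in [-1,1]
-
- print("mean logistic output :", logistic.mean())   # about 0.5: a built-in offset
- print("mean tanh output     :", symmetric.mean())  # about 0.0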
-
-
- Cascade Correlation and Greedy Learning
- - ---------------------------------------
- Scott Fahlman (Carnegie Mellon University - scott.fahlman@cs.cmu.edu)
-
- Scott's presentation started with a description of QuickProp. This
- algorithm was developed in an attempt to address the slowness of
- back-propagation. QuickProp uses second-order information in the manner
- of a modified Newton method. This was yet another example of neural
- network researchers seeing no alternative but to do their own algorithm
- development.
- Scott then described Cascade Correlation. CasCor and CasCor2 are greedy
- learning algorithms. They build the network, putting each new node in
- its own layer, in response to the remaining error. The newest node is
- trained to deal with the largest remaining error component. Papers on
- QuickProp, CasCor, and Recurrent CasCor can be found in the neuroprose
- archive (see fahlman.quickprop-tr.ps.Z, fahlman.cascor-tr.ps.Z, and
- fahlman.rcc.ps.Z).
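-
- [[ Editor's note: for reference, here is a minimal sketch of the core
- QuickProp update rule as described in Fahlman's tech report
- (fahlman.quickprop-tr.ps.Z). It omits the bolt-on gradient term and
- weight decay of the full algorithm; the parameter names are mine. ]]
-
- import numpy as np
-
- def quickprop_step(grad, prev_grad, prev_step, lr=0.1, mu=1.75):
-     """One simplified QuickProp update for a vector of weights.
-
-     grad, prev_grad : dE/dw at the current and previous epoch
-     prev_step       : the previous weight change
-     lr              : learning rate for plain gradient-descent steps
-     mu              : maximum growth factor, limiting step blow-up
-     """
-     step = np.zeros_like(grad)
-     moving = prev_step != 0.0                    # weights that moved last epoch
-     denom = prev_grad - grad
-     safe = moving & (np.abs(denom) > 1e-12)
-     # Secant estimate of the minimum of a parabola fitted along each weight axis.
-     step[safe] = grad[safe] / denom[safe] * prev_step[safe]
-     # Limit each step to mu times the previous step (the "maximum growth factor").
-     limit = mu * np.abs(prev_step)
-     step = np.clip(step, -limit, limit)
-     # Weights that were not moving fall back to an ordinary gradient-descent step.
-     step[~moving] = -lr * grad[~moving]
-     return step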
-
-
- Comments on Training Issues
- - ---------------------------
- Gary Kuhn (Siemens Corporate Research - gmk@learning.siemens.com)
-
- Gary presented
-
- 1. a procedure for training with stochastic conjugate gradient.
- (G. Kuhn and N. Herzberg, Some Variations on Training of Recurrent Networks,
- in R. Mammone & Y. Zeevi, eds, Neural Networks: Theory and Applications,
- New York, Academic Press, 1991, p 233-244.)
-
- 2. a sensitivity analysis that led to a change in the architecture of a speech
- recognizer and to further, joint optimization of the classifier and its
- input features. (G. Kuhn, Joint Optimization of Classifier and Feature
- Space in Speech Recognition, IJCNN '92, IV:709-714.)
-
- He related Scott Fahlman's interest in sensitivity to Yann Le Cun's emphasis on
- trainability, by showing how a sensitivity analysis led to improved
- trainability.
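-
- [[ Editor's note: item 1 above mentions stochastic conjugate gradient.
- The sketch below is not Kuhn and Herzberg's procedure, just the generic
- Polak-Ribiere direction update such trainers build on; a real
- implementation would pair it with a step-size rule applied to mini-batch
- gradients. ]]
-
- import numpy as np
-
- def polak_ribiere_direction(grad, prev_grad, prev_dir):
-     """Conjugate-gradient search direction from successive gradients."""
-     beta = grad @ (grad - prev_grad) / max(prev_grad @ prev_grad, 1e-12)
-     beta = max(beta, 0.0)                 # PR+ restart safeguard
-     return -grad + beta * prev_dir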
-
-
- Active Exemplar Selection
- - -------------------------
- Mark Plutowski (University of California - San Diego - pluto@cs.ucsd.edu)
-
- Mark gave a quick recap of his NIPS poster on choosing a concise subset
- for training. Fitting these exemplars results in the entire set being
- fit as well as desired. This method has only been used on noise-free
- problems, but looks promising. Scott Fahlman expressed the opinion that
- exploiting the training data was the remaining frontier in neural network
- research.
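-
- [[ Editor's note: the sketch below is a generic greedy selection loop in
- the spirit of the poster, not Mark's actual criterion: keep adding the
- worst-fit example to the training subset, refit, and stop once the whole
- (noise-free) set is fit to tolerance. The least-squares model is purely
- illustrative. ]]
-
- import numpy as np
-
- rng = np.random.default_rng(2)
- X = rng.standard_normal((500, 5))
- y = X @ rng.standard_normal(5)                  # noise-free targets
-
- subset = [0]
- for _ in range(len(X)):
-     w, *_ = np.linalg.lstsq(X[subset], y[subset], rcond=None)
-     residual = np.abs(X @ w - y)                # error over the entire set
-     if residual.max() < 1e-6:
-         break
-     subset.append(int(residual.argmax()))       # add the worst-fit exemplar
-
- print("exemplars needed:", len(subset), "of", len(X))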
-
-
- Final Summary
- - -------------
- Incremental, stochastic methods are required for training large networks.
- Robust, readily available implementations of classical algorithms can be
- used for training modest-sized networks and are especially effective
- research tools for investigating mathematical issues, e.g. estimating the
- number of local minima.
-
-
- ------------------------------
-
- End of Neuron Digest [Volume 10 Issue 24]
- *****************************************
-