Path: sparky!uunet!munnari.oz.au!spool.mu.edu!agate!doc.ic.ac.uk!warwick!uknet!yorkohm!minster!george
From: george@minster.york.ac.uk
Newsgroups: comp.ai.neural-nets
Subject: Recurrent networks
Message-ID: <728134656.26714@minster.york.ac.uk>
Date: 27 Jan 93 11:37:36 GMT
Organization: Department of Computer Science, University of York, England
Lines: 21

I've been trying to get neural networks to learn sequences of the
form DIFFERENT->SAME->DIFFERENT, e.g.

A->B->C and X->B->Y

is a simple example. However, with a recurrent network (feeding
the hidden layer back as context), the output for both pattern
sequences is constrained as follows (where O() is the output
layer and H() the hidden layer):

O(H(A)) and O(H(X)) both must equal B

This seems to force the hidden representations for A and X to be
identical. On the next step, when B is given as input, the context
fed back, i.e. the hidden representation H(A) or H(X), will not
distinguish the network's state, so it cannot output C or Y as
appropriate, i.e.

O(H(B, H(A))) = O(H(B, H(X)))

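To make the collision concrete, here is a minimal sketch of the
forward pass in this notation: an Elman-style network with tanh
units, one-hot symbol codes, and hypothetical weight matrices
W_in, W_rec, W_out (all assumptions for illustration, not a
definitive implementation):

  import numpy as np

  rng = np.random.default_rng(0)
  n_sym, n_hid = 5, 3                      # symbols A, B, C, X, Y; small hidden layer
  W_in  = rng.normal(size=(n_hid, n_sym))  # input -> hidden
  W_rec = rng.normal(size=(n_hid, n_hid))  # hidden -> hidden (fed-back context)
  W_out = rng.normal(size=(n_sym, n_hid))  # hidden -> output

  def H(x, context):
      # hidden layer: current input plus the fed-back hidden state
      return np.tanh(W_in @ x + W_rec @ context)

  def O(h):
      # output layer
      return np.tanh(W_out @ h)

  A, B, C, X, Y = np.eye(n_sym)  # one-hot symbol codes (an assumed encoding)

  h0 = np.zeros(n_hid)
  hA = H(A, h0)      # context after seeing A
  hX = hA.copy()     # suppose training collapsed H(X) onto H(A),
                     # since O(H(A)) and O(H(X)) must both equal B

  # With identical contexts, step two is indistinguishable:
  print(np.allclose(O(H(B, hA)), O(H(B, hX))))   # True: can't output C vs Y
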
How can this problem be overcome?

Thanks - George Bolt
