Please use this identifier to cite or link to this item: http://localhost:8080/xmlui/handle/123456789/131123
Full metadata record
DC Field    Value
dc.contributor.author    Hubert Cardot
dc.date.accessioned    2017-04-30T13:31:39Z
dc.date.available    2017-04-30T13:31:39Z
dc.date.issued    2011
dc.identifier.isbn    978-953-307-685-0
dc.identifier.uri    http://hdl.handle.net/123456789/131123
dc.description.abstract    Recurrent Neural Networks (RNNs) are a general class of artificial neural networks in which the connections are not exclusively feed-forward: connections between units may form directed cycles, providing an implicit internal memory. This makes RNNs well suited to problems involving signals that evolve through time, since their internal memory lets them take time into account naturally. Valuable approximation results have been obtained for dynamical systems.
dc.language.iso    eng
dc.publisher    InTech
dc.relation.isbasedon    10.5772/631
dc.relation.uri    http://www.intechopen.com/books/recurrent-neural-networks-for-temporal-data-processing
dc.rights.uri    CC BY-NC-SA (Attribution-NonCommercial-ShareAlike)
dc.source    InTech
dc.subject.classification    Computer and Information Science
dc.subject.classification    Numerical Analysis and Scientific Computing
dc.title    Recurrent Neural Networks for Temporal Data Processing
dc.type    Electronic textbook
dc.classification    Natural Sciences
Theme: Textbook - Natural Sciences
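
The abstract above describes recurrent connections that form directed cycles and act as an implicit internal memory for temporal data. As a purely illustrative aid (not taken from the book itself), the following is a minimal sketch of a single Elman-style recurrent cell in NumPy; the layer sizes, initialization, and synthetic input sequence are assumptions for demonstration only.

# Minimal Elman-style recurrent cell, for illustration only.
# Dimensions and initialization are assumed, not taken from the book.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 3, 8, 2                    # assumed sizes

W_xh = rng.normal(0, 0.1, (n_hidden, n_in))        # input -> hidden
W_hh = rng.normal(0, 0.1, (n_hidden, n_hidden))    # hidden -> hidden (the directed cycle)
W_hy = rng.normal(0, 0.1, (n_out, n_hidden))       # hidden -> output
b_h = np.zeros(n_hidden)
b_y = np.zeros(n_out)

def step(x_t, h_prev):
    """One time step: the hidden state h carries the implicit internal memory."""
    h_t = np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)
    y_t = W_hy @ h_t + b_y
    return h_t, y_t

# Run over a short synthetic sequence: each output depends on past inputs via h.
h = np.zeros(n_hidden)
for x_t in rng.normal(size=(5, n_in)):
    h, y = step(x_t, h)

Because the hidden state is fed back at every step, the output at time t depends on the whole input history, which is the property the abstract refers to when it says RNNs naturally take time into account.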

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.