Implementation of an Associative Memory using a Restricted Hopfield Network

Tet Yeap
University of Ottawa
Article Fingerprint

ResearchID: 6W1ZI


Abstract

A trainable analog restricted Hopfield Network is presented in this paper. It consists of two layers of nodes, visible and hidden, connected by weighted directional paths forming a bipartite graph with no intra-layer connections. An energy (Lyapunov) function is derived to show that the proposed network converges to stable states. The network can be trained using either the modified SPSA or the BPTT algorithm, both of which ensure that all weights remain symmetric. Simulation results show that the presence of hidden nodes increases the network's memory capacity. Using EXOR as an example, the network can be trained to act as a dynamic classifier. Using the characters A, U, T, and S as training patterns, the network was trained to serve as an associative memory. Simulation results show that the network can perfectly re-create images from noisy inputs, with higher noise tolerance than either the standard Hopfield Network or the Restricted Boltzmann Machine. The results also illustrate the importance of feedback iteration when an associative memory is used to re-create images from noisy inputs.
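The architecture described in the abstract can be illustrated with a minimal sketch: a bipartite network in which visible and hidden layers share one symmetric weight matrix and are updated alternately, so that a Lyapunov-style energy never increases. The layer sizes, random weights, and sign-update rule below are illustrative assumptions for exposition only; the paper's actual networks are analog and are trained with modified SPSA or BPTT, which is not shown here.

```python
import numpy as np

# Illustrative sketch of a restricted (bipartite) Hopfield-style network.
# Visible layer v and hidden layer h are coupled only through one weight
# matrix W (no intra-layer connections); using W in one direction and W.T in
# the other makes the coupling symmetric, as the abstract requires.
# NOTE: sizes, random weights, and the binary sign update are assumptions,
# not the paper's trained analog network.

rng = np.random.default_rng(0)

n_visible, n_hidden = 8, 4
W = rng.standard_normal((n_visible, n_hidden))

def energy(v, h):
    # Bipartite energy E(v, h) = -v^T W h; each full-layer sign update
    # minimizes E given the other layer, so E is non-increasing.
    return -v @ W @ h

def recall(v, n_iters=20):
    # Alternate updates between layers; because the weights are symmetric,
    # the energy descends and the state settles toward a stable point.
    h = np.sign(W.T @ v)
    for _ in range(n_iters):
        v = np.sign(W @ h)
        h = np.sign(W.T @ v)
    return v, h

v0 = np.sign(rng.standard_normal(n_visible))   # noisy probe pattern
v_stable, h_stable = recall(v0)

# Energy after iteration can be no higher than at the starting pair.
assert energy(v_stable, h_stable) <= energy(v0, np.sign(W.T @ v0))
```

The feedback iteration in `recall` mirrors the abstract's point that repeated visible-hidden updates, not a single pass, are what pull a noisy probe into a stored stable state.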

References

20 References
  1. John J. Hopfield (1982). Neural networks and physical systems with emergent collective computational abilities.
  2. John J. Hopfield, David Tank (1985). "Neural" computation of decisions in optimization problems.
  3. John J. Hopfield, David Tank (1986). Computing with neural circuits: A model.
  4. Robert J. McEliece, Edward C. Posner, Eugene R. Rodemich, Santosh Venkatesh (1987). The capacity of the Hopfield associative memory.
  5. Amos J. Storkey, Romain Valabregue (1999). The basins of attraction of a new Hopfield learning rule.
  6. Elizabeth Gardner (1987). Maximum storage capacity in neural networks.
  7. Elizabeth Gardner (1988). The space of interactions in neural network models.
  8. Elizabeth Gardner, Bernard Derrida (1988). Optimal storage properties of neural network models.
  9. David H. Ackley, Geoffrey Hinton, Terrence Sejnowski (1985). A learning algorithm for Boltzmann machines.
  10. Ruslan Salakhutdinov, Geoffrey Hinton (2009). An efficient learning procedure for deep Boltzmann machines.
  11. Geoffrey Hinton (2012). A practical guide to training restricted Boltzmann machines.
  12. Ilya Sutskever, Geoffrey Hinton (2009). Temporal-kernel recurrent neural networks.
  13. Ruslan Salakhutdinov, Andriy Mnih, Geoffrey Hinton (2007). Restricted Boltzmann machines for collaborative filtering.
  14. James C. Spall (1992). Multivariate stochastic approximation using a simultaneous perturbation gradient approximation.
  15. James C. Spall (1998). An overview of the simultaneous perturbation method for efficient optimization.
  16. Paul Werbos (1990). Backpropagation through time: what it does and how to do it.
  17. Simon Haykin (2010). Neural Networks and Learning Machines.
  18. Marvin Minsky, Seymour Papert (1969). Perceptrons. Cambridge, MA: MIT Press.
  19. Yann LeCun, Bernhard Boser, John Denker, Donnie Henderson, Richard Howard, Wayne Hubbard, Lawrence Jackel (1989). Backpropagation applied to handwritten zip code recognition.
  20. Yann LeCun, Yoshua Bengio (1995). Convolutional networks for images, speech, and time series.

Funding

No external funding was declared for this work.

Conflict of Interest

The authors declare no conflict of interest.

Ethical Approval

No ethics committee approval was required for this article type.

Data Availability

Not applicable for this article.

How to Cite This Article

Tet Yeap. 2021. "Implementation of an Associative Memory using a Restricted Hopfield Network". Global Journal of Research in Engineering - F: Electrical & Electronic (GJRE-F), Volume 21, Issue F2.


Journal Specifications

Crossref Journal DOI 10.17406/gjre

Print ISSN 0975-5861

e-ISSN 2249-4596

Classification

GJRE-F Classification: FOR Code 290903
Version of record

v1.2

Issue date

May 19, 2021

Language
en

Article Metrics

Total Views: 2220
Total Downloads: 940


