From aba50b4913f926d41ae9cdbdb21360c5d8e43047 Mon Sep 17 00:00:00 2001
From: johannes bilk <johannes.bilk-2@exp2.physik.uni-giessen.de>
Date: Mon, 24 Jul 2023 12:58:26 +0200
Subject: [PATCH] added ‘hopfield’ to readme; nothing written on it yet.
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/README.md b/README.md
index 98ed780..4e44d75 100644
--- a/README.md
+++ b/README.md
@@ -120,6 +120,7 @@ They generate dummy data, where one can set different parameters.
     - [regression loss](https://datamonje.com/regression-loss-functions/)
     - [regularization](http://www.chioka.in/differences-between-l1-and-l2-as-loss-function-and-regularization/)
     - [activation functions](https://towardsdatascience.com/creating-neural-networks-from-scratch-in-python-6f02b5dd911)
+    - [hopfield layer](https://ml-jku.github.io/hopfield-layers/)
     - [rnn - theory](https://www.freecodecamp.org/news/the-ultimate-guide-to-recurrent-neural-networks-in-python/)
     - [rnn - implementation](https://towardsdatascience.com/recurrent-neural-networks-rnns-3f06d7653a85)
     - [rnn - implementation](https://medium.com/@VersuS_/coding-a-recurrent-neural-network-rnn-from-scratch-using-pytorch-a6c9fc8ed4a7)
-- 
GitLab
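
Note (not part of the patch): the newly linked hopfield-layers page describes modern Hopfield networks, whose retrieval step is a softmax-weighted recombination of stored patterns, roughly ξ_new = X · softmax(β Xᵀ ξ). Below is a minimal, hypothetical NumPy sketch of that update rule; the function names (`softmax`, `hopfield_update`) and the example values are invented here for illustration and do not exist in this repository.

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over a 1-D array
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def hopfield_update(patterns, state, beta=1.0):
    """One retrieval step: pull `state` towards the stored patterns.

    patterns : (d, N) array, N stored patterns of dimension d
    state    : (d,) query/state vector
    beta     : inverse temperature; larger values sharpen retrieval
    """
    weights = softmax(beta * patterns.T @ state)  # similarity to each stored pattern
    return patterns @ weights                     # softmax-weighted recombination

# Example: retrieve a stored pattern from a noisy query
rng = np.random.default_rng(0)
patterns = rng.standard_normal((16, 4))               # 4 stored patterns of dimension 16
query = patterns[:, 2] + 0.3 * rng.standard_normal(16)  # noisy version of pattern 2
retrieved = hopfield_update(patterns, query, beta=4.0)
```

With a large beta the update converges towards a single stored pattern; with a small beta it returns a blend of several patterns (a metastable state), which is the behaviour the linked page discusses.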