Welcome!

Hello, Starbag, and welcome to Wikipedia! Thank you for your contributions. I hope you like the place and decide to stay. Here are a few links to pages you might find helpful:

You may also want to complete the Wikipedia Adventure, an interactive tour that will help you learn the basics of editing Wikipedia. You can visit the Teahouse to ask questions or seek help.

Please remember to sign your messages on talk pages by typing four tildes (~~~~); this will automatically insert your username and the date. If you need help, check out Wikipedia:Questions, ask me on my talk page, or ask for help on your talk page, and a volunteer should respond shortly. Again, welcome! I dream of horses If you reply here, please ping me by adding {{U|I dream of horses}} to your message (talk to me) (My edits) @ 17:42, 18 January 2019 (UTC)

The current version of the LSTM article is quite well written. You can add more details, but I will not let you rewrite all the main parts of the article or add your opinion. For example, LSTM is not an RNN; LSTM is a unit of an RNN. This is a very common misconception, or at least inappropriate terminology, and it should be avoided because it doesn't convey what an LSTM really is.

Do you have a source that LSTM is just a unit, not a full network? It is certainly used as a network. I dream of horses Starbag (talk) 20:00, 15 February 2019 (UTC)
Yes, I'm your source, because I've studied LSTMs. Vanilla RNNs have units. LSTM is just a sophisticated unit that can be used by an RNN. Another example of such a unit is the GRU. You can call an RNN with LSTM units an LSTM network; indeed, I already do so in the current version of the article. But LSTM should not be introduced as an RNN, because that is not what LSTM is. Also, please stop adding information about the history to the introduction of the article. There is already a section for that. The intro should describe the main idea of the article, and the current article already does that. Introducing the gates and the cell memory is necessary to explain what an LSTM is.
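The unit-vs-network distinction above can be sketched in code: the LSTM cell is a single step function with gates and a cell memory, and unrolling that same cell over a sequence is what people call an LSTM network. This is a minimal NumPy sketch, not taken from the article; the stacked 4H weight layout and all names are illustrative choices.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell(x, h_prev, c_prev, W, U, b):
    """One LSTM *unit* step: three gates plus a cell memory.

    Illustrative layout: W is (4H, D) input weights, U is (4H, H)
    recurrent weights, b is (4H,), with the four gate blocks stacked.
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])         # input gate
    f = sigmoid(z[H:2 * H])     # forget gate
    o = sigmoid(z[2 * H:3 * H]) # output gate
    g = np.tanh(z[3 * H:4 * H]) # candidate cell update
    c = f * c_prev + i * g      # cell memory update
    h = o * np.tanh(c)          # hidden state
    return h, c

def lstm_network(xs, W, U, b, H):
    """An LSTM *network*: the same unit unrolled over a sequence."""
    h = np.zeros(H)
    c = np.zeros(H)
    for x in xs:                # recurrence over timesteps
        h, c = lstm_cell(x, h, c, W, U, b)
    return h

# Tiny example: sequence of T vectors of dimension D, hidden size H.
D, H, T = 3, 4, 5
rng = np.random.default_rng(0)
W = rng.standard_normal((4 * H, D))
U = rng.standard_normal((4 * H, H))
b = np.zeros(4 * H)
xs = rng.standard_normal((T, D))
h_final = lstm_network(xs, W, U, b, H)
print(h_final.shape)  # (4,)
```

The point of the sketch is that `lstm_cell` alone is the unit the original paper introduced, while `lstm_network` is just the ordinary RNN recurrence applied to that unit.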
I've studied LSTMs too, and I don't agree. Where is your reference? Right now the intro is for experts only and has no references at all - what's good about that? Please think about it. But OK, I'll make only small steps for now, adding a reference: in 2017, Facebook performed some 4.5 billion automatic translations every day using LSTM. Starbag (talk) 20:13, 15 February 2019 (UTC)
Not sure why I was pinged when Nbro should've been, but since I was, I'll point out that you can't use yourself as a source, for this very reason. On Wikipedia this is called "original research"; off-wiki it's called "circular sourcing", and in either case it is Not To Be Done. I dream of horses If you reply here, please ping me by adding {{U|I dream of horses}} to your message (talk to me) (My edits) @ 20:17, 15 February 2019 (UTC)
In fact, I'm not really citing myself or any of my works. I've just written parts of this article based on my knowledge of the field. I'm a master's student in AI.
I would like to note that you should cite the original articles, not articles that merely cite other articles.
You added the information that LSTM can be trained using neuroevolution and cited a source, but the source is invalid. You should fix it.
Fixed. Nbro, you write: "I would like to note that you should cite the original articles and not the articles that cite other articles." But look at Wikipedia:Verifiability, which says: "Base articles largely on reliable secondary sources. While primary sources are appropriate in some cases, relying on them can be problematic. For more information, see the Primary, secondary, and tertiary sources section of the NOR policy, and the Misuse of primary sources section of the BLP policy." Starbag (talk) 21:25, 15 February 2019 (UTC)
Nbro, you were wrong: I looked up the original LSTM paper, which states that LSTM is a recurrent neural network architecture. It is not just a single unit. I added the reference to back this up, in line with what you wrote: "you should cite the original articles." I also added more references, because there was no reference whatsoever in the intro - only the template "citation needed". Starbag (talk) 22:04, 15 February 2019 (UTC)
You can call an RNN which uses LSTM units an LSTM architecture, yes. This is not completely inconsistent with what I said and had written: if you refer to the RNN which uses LSTM units, you can call it an LSTM network (as I had already done before you added your changes). In fact, I didn't revert your last edit, because I think it is still tolerable. However, note that the only novel concept the "LSTM" paper introduced, with respect to the plain RNN, is the LSTM unit. This is why I didn't want to explain LSTM in those terms.