Around 2015 and 2016 we saw sequence-to-sequence (#seq2seq) models applied to data-to-text #NLG for the first time. Trained end-to-end, these models were very exciting because they raised the prospect of reducing the amount of hand-crafted #GrammarEngineering needed to build a #NaturalLanguageGeneration system.
#seq2seq #nlg #grammarengineering #naturallanguagegeneration