GPT Generative Pre-trained Transformer.
arXiv:2010.04389v1 (paper, 2020)
NLP natural language processing
NLG natural language generation, text generation (informally)
The goal of text generation is to make machines express themselves in human language.
It is one of the most important yet challenging tasks in NLP.
Since 2014, various neural encoder-decoder models pioneered by Seq2Seq have been proposed to achieve the goal by learning to map input text to output text. However, the input text alone often provides limited knowledge to generate the desired output, so the performance of text generation is still far from satisfactory in many real-world scenarios. To address this issue, researchers have considered knowledge-enhanced text generation: incorporating various forms of knowledge beyond the input text into the generation models.
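A minimal sketch of the idea in plain Python, assuming a toy knowledge base and hypothetical retrieve_knowledge / build_model_input helpers (not from the survey): retrieved facts are simply prepended to the model input before generation.

```python
# Sketch of knowledge-enhanced generation: the input text alone may be
# insufficient, so retrieved knowledge is prepended to the model input
# before decoding. Helper names and markers here are illustrative.

def retrieve_knowledge(input_text, knowledge_base):
    """Toy retrieval: return knowledge-base facts sharing a word with the input."""
    words = set(input_text.lower().split())
    return [fact for fact in knowledge_base if words & set(fact.lower().split())]

def build_model_input(input_text, facts):
    """Concatenate retrieved facts with the input text, separated by markers."""
    return " [KNOWLEDGE] ".join(facts) + " [INPUT] " + input_text

knowledge_base = [
    "Seq2Seq models map an input sequence to an output sequence.",
    "RDF triples encode facts as subject-predicate-object.",
]
facts = retrieve_knowledge("what is a seq2seq model", knowledge_base)
print(build_model_input("what is a seq2seq model", facts))
```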
In this survey, we present a comprehensive review of the research on knowledge-enhanced text generation over the past five years. The main content includes two parts:
This survey can serve a broad audience of researchers and practitioners in academia and industry.
triplet / triple (plural: triplets, triples) RDF relates entities in the subject-predicate-object format, where the subject and object are related to one another by the predicate. The triple is a minimal representation of information without losing the context.
NLP triplet extraction
ref: linkedin/triplets-concept-extraction-from-english-sentence
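A minimal sketch of one common extraction approach (subject-verb-object patterns over a dependency parse), assuming spaCy and its en_core_web_sm model are installed; neither the survey nor the linked article prescribes this exact method.

```python
# Minimal SVO triplet extraction via dependency parsing.
# Assumes: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")

def extract_triples(text):
    """Yield (subject, predicate, object) tuples from simple SVO sentences."""
    doc = nlp(text)
    for token in doc:
        if token.dep_ in ("nsubj", "nsubjpass"):            # subject of a verb
            verb = token.head
            objs = [c for c in verb.children
                    if c.dep_ in ("dobj", "attr", "oprd")]  # object/attribute
            for obj in objs:
                yield (token.text, verb.lemma_, obj.text)

print(list(extract_triples("Marie Curie discovered radium.")))
# A parse along the lines of [('Curie', 'discover', 'radium')] is expected.
```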
treebank a text corpus in which each sentence has a syntactic structure added to it.
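For a concrete example, NLTK ships a small sample of the Penn Treebank; the snippet below prints one hand-annotated parse tree (assumes nltk is installed and the treebank sample downloaded).

```python
# Each sentence in a treebank carries a hand-annotated parse tree.
# Assumes: pip install nltk
import nltk
from nltk.corpus import treebank

nltk.download("treebank", quiet=True)

sent = treebank.parsed_sents("wsj_0001.mrg")[0]  # first annotated sentence
print(sent)           # bracketed syntactic structure, e.g. (S (NP-SBJ ...) (VP ...))
print(sent.leaves())  # the raw tokens of the sentence
```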
Definitional question answering
answering questions of the type “Who is X?” and “What is X?”
We use text triplets. We further choose relevant triplets based on a manually built list of terms that are found in definitions in general.
ref: springer
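A minimal sketch of that filtering step, with an illustrative term list and toy triples; the actual list from the referenced paper is not reproduced here.

```python
# Filter triples for definitional QA: keep only triples about the subject
# whose predicate appears in a manually built list of terms commonly found
# in definitions. Term list and triples below are illustrative.
DEFINITION_TERMS = {"be", "mean", "define", "refer to", "know as", "call"}

triples = [
    ("GPT", "be", "a generative pre-trained transformer"),
    ("GPT", "release", "an API"),
    ("treebank", "be", "a syntactically annotated corpus"),
]

def relevant_triples(subject, triples):
    """Return triples about the subject whose predicate signals a definition."""
    return [t for t in triples
            if t[0].lower() == subject.lower() and t[1] in DEFINITION_TERMS]

print(relevant_triples("GPT", triples))
# -> [('GPT', 'be', 'a generative pre-trained transformer')]
```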
Seq2Seq sequence-to-sequence; generates an expected output sequence from a given input sequence.
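A minimal, untrained encoder-decoder sketch in PyTorch with toy sizes, just to make the input-to-output mapping concrete; real Seq2Seq systems add training, attention, and stopping criteria.

```python
# Minimal Seq2Seq sketch: the encoder compresses the input sequence into a
# hidden state; the decoder generates output tokens one at a time from it.
import torch
import torch.nn as nn

VOCAB, EMB, HID = 100, 32, 64

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)

    def forward(self, src):                  # src: (batch, src_len)
        _, hidden = self.rnn(self.emb(src))  # hidden: (1, batch, HID)
        return hidden

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, tok, hidden):          # tok: (batch, 1)
        output, hidden = self.rnn(self.emb(tok), hidden)
        return self.out(output), hidden      # logits over the next token

encoder, decoder = Encoder(), Decoder()
src = torch.randint(0, VOCAB, (1, 5))        # one toy input sequence
hidden = encoder(src)
tok = torch.zeros(1, 1, dtype=torch.long)    # assume token id 0 is <sos>
generated = []
for _ in range(5):                           # greedy decoding, fixed length
    logits, hidden = decoder(tok, hidden)
    tok = logits.argmax(dim=-1)              # pick most likely next token
    generated.append(tok.item())
print(generated)
```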