Transformer model for hierarchical text
I have an NLP task/idea in mind where the input and output text are structured purely hierarchically, like multi-level bullet-point lists or a table of contents.
The question is: is there any research on this particular type of text for transformer models? I am especially interested in ways to encode position so that it represents the multi-level structure (a rough sketch of what I have in mind is below). Furthermore, do you know of any datasets containing such text samples (e.g. table of contents; multi-level notes; mind-maps; …)?
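
To make concrete what I mean by "encoding the multi-level position": each item would get one index per depth level (its sibling index at that level), and the per-level embeddings would be combined into a single positional vector. This is just my own illustration, not taken from any paper or library; all names and shapes (`HierarchicalPositionalEncoding`, `max_depth`, `max_index`) are made up for the sketch.

```python
# Rough sketch of a per-level positional encoding for hierarchical text.
# Every token carries the path of its bullet: sibling indices from the root
# down to its own level, padded with 0 for unused levels (simplified here).
import torch
import torch.nn as nn

class HierarchicalPositionalEncoding(nn.Module):
    def __init__(self, d_model: int, max_depth: int = 8, max_index: int = 64):
        super().__init__()
        # one learned embedding table per depth level
        self.level_embeddings = nn.ModuleList(
            nn.Embedding(max_index, d_model) for _ in range(max_depth)
        )

    def forward(self, paths: torch.LongTensor) -> torch.Tensor:
        # paths: (batch, seq_len, max_depth) sibling indices,
        # e.g. [2, 0, 1] = 3rd top-level bullet -> its 1st child -> that child's 2nd child
        out = torch.zeros(*paths.shape[:2],
                          self.level_embeddings[0].embedding_dim,
                          device=paths.device)
        for level, emb in enumerate(self.level_embeddings):
            out = out + emb(paths[..., level])  # sum the per-level embeddings
        return out

# usage: add the result to the token embeddings before the transformer layers
pe = HierarchicalPositionalEncoding(d_model=512)
paths = torch.tensor([[[0, 0, 0], [0, 1, 0], [1, 0, 0]]])  # three bullets
pos = pe(paths)  # (1, 3, 512)
```

This is obviously simplistic (summing levels, absolute indices only), so I am looking for research that has studied this kind of structural positional encoding more rigorously, plus suitable datasets.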