Memory reconsolidation for natural language processing
AUTHOR(S)
Tu, Kun
SOURCE
Springer Netherlands
ABSTRACT
We propose a model of memory reconsolidation that outputs new sentences with additional meaning after refining information from input sentences and integrating it with related prior experience. Our model uses available technology to first disambiguate the meanings of words and then extracts information from the sentences into a structure that extends semantic networks. Within long-term memory we introduce an action-relationship database, reminiscent of the way symbols are associated in the brain, and propose an adaptive mechanism for linking these actions with different scenarios. The model then fills in the implicit context of the input and, based on the statistical action-relationship database, predicts relevant activities that could occur in that context. Both the more complete scenario and the statistical relationships of the activities are then reconsolidated into memory. Experiments show that our model improves upon ConceptNet, the existing reasoning tool developed by the MIT Media Lab.
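The abstract's central mechanism pairs scenarios with statistically weighted actions so that observed sentences both reconsolidate the store and let the model predict likely activities for a context. A minimal sketch of that idea follows; all names (`ActionRelationshipDB`, `observe`, `predict`) are hypothetical illustrations, not the paper's actual data structures:

```python
from collections import defaultdict

class ActionRelationshipDB:
    """Toy statistical action-relationship store.

    Assumption: the paper's database is approximated here as simple
    scenario -> action co-occurrence counts.
    """

    def __init__(self):
        # co-occurrence counts: scenario -> action -> count
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, scenario, action):
        """Reconsolidate one observed (scenario, action) pair into memory."""
        self.counts[scenario][action] += 1

    def predict(self, scenario, top_n=3):
        """Return the actions most frequently seen in this scenario."""
        actions = self.counts[scenario]
        return sorted(actions, key=actions.get, reverse=True)[:top_n]

db = ActionRelationshipDB()
for action in ["order food", "order food", "read menu", "order food"]:
    db.observe("restaurant", action)
print(db.predict("restaurant", top_n=1))  # → ['order food']
```

Given an input sentence mapped to the "restaurant" scenario, the model could then surface "order food" as a relevant implicit activity, and each new observation updates the statistics, mirroring the reconsolidation step the abstract describes.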
ARTICLE ACCESS
http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=2777198
RELATED DOCUMENTS
- New trends in natural language processing: statistical natural language processing.
- Levenshtein distance for information extraction in databases and for natural language processing.
- Retrieval Does Not Induce Reconsolidation of Inhibitory Avoidance Memory
- Natural language processing and the representation of clinical data.
- Representing Information in Patient Reports Using Natural Language Processing and the Extensible Markup Language