Monitoring Term Drift Based on Semantic Consistency in an Evolving Vector Field

February 05, 2015 · Entered Twilight · 🏛 IEEE International Joint Conference on Neural Networks

🌅 TWILIGHT: Old Age
Predates the code-sharing era: a pioneer of its time

"Last commit was 10.0 years ago (β‰₯5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: LICENSE, README.md, src, trackBmus.py

Authors: Peter Wittek, Sándor Darányi, Efstratios Kontopoulos, Theodoros Moysiadis, Ioannis Kompatsiaris
arXiv ID: 1502.01753
Category: cs.CL: Computation & Language
Cross-listed: cs.LG, cs.NE, stat.ML
Citations: 14
Venue: IEEE International Joint Conference on Neural Networks
Repository: https://github.com/peterwittek/concept_drifts ⭐ 3
Last Checked: 2 months ago
Abstract
Building on the Aristotelian distinction between potentiality and actuality, which allows the study of energy and dynamics in language, we propose a field approach to lexical analysis. Relying on the distributional hypothesis to statistically model word meaning, we use evolving fields as a metaphor to express time-dependent changes in a vector space model, combining random indexing with evolving self-organizing maps (ESOM). To monitor semantic drift within the observation period, an experiment was carried out on the term space of a collection of 12.8 million Amazon book reviews. For evaluation, the semantic consistency of ESOM term clusters was compared with their respective neighbourhoods in WordNet, and contrasted with distances among term vectors obtained by random indexing. We found that, at the 0.05 level of significance, the terms in the clusters showed a high level of semantic consistency. Tracking the drift of distributional patterns in the term space across time periods, we found that consistency decreased, but not at a statistically significant level. Our method is highly scalable, with interpretations in philosophy.
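The abstract's pipeline first builds term vectors by random indexing before clustering them with an ESOM. As a rough illustration of the random-indexing half only, here is a minimal sketch: each word gets a sparse random ternary index vector, and a term's context vector is the sum of the index vectors of its window neighbours. The dimensionality, window size, sparsity, and toy sentences are illustrative choices, not the paper's settings, and the function names are ours, not from the authors' repository.

```python
import numpy as np

def build_index_vectors(vocab, dim=300, nonzero=10, seed=0):
    """Give each vocabulary word a sparse random 'index vector':
    mostly zeros, with a few +1/-1 entries at random positions."""
    rng = np.random.default_rng(seed)
    index = {}
    for word in sorted(vocab):  # sorted for reproducibility
        v = np.zeros(dim)
        positions = rng.choice(dim, size=nonzero, replace=False)
        v[positions] = rng.choice([-1.0, 1.0], size=nonzero)
        index[word] = v
    return index

def random_indexing(sentences, dim=300, window=2, seed=0):
    """Context vector of a term = sum of the index vectors of the
    words co-occurring with it inside a sliding window."""
    vocab = {w for sent in sentences for w in sent}
    index = build_index_vectors(vocab, dim=dim, seed=seed)
    context = {w: np.zeros(dim) for w in vocab}
    for sent in sentences:
        for i, w in enumerate(sent):
            lo, hi = max(0, i - window), min(len(sent), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    context[w] += index[sent[j]]
    return context

def cosine(a, b):
    """Cosine similarity, guarded against zero vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```

Terms that occur in the same contexts end up with similar context vectors; drift monitoring then amounts to rebuilding these vectors per time slice and comparing where a term lands, which is the role the ESOM plays in the paper.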
Community shame:
Not yet rated

📜 Similar Papers

In the same crypt – Computation & Language

🌅 Old Age

Attention Is All You Need

Ashish Vaswani, Noam Shazeer, ... (+6 more)

cs.CL πŸ› NeurIPS πŸ“š 166.0K cites 8 years ago