A parallel architecture for meaning comparison
ABSTRACT In this paper we present a fine-grained parallel architecture that performs meaning comparison using vector cosine similarity (dot product). Meaning comparison assigns a similarity value to two objects (e.g. text documents) based on how similar their meanings, represented as two vectors, are to each other. The novelty of our design is its fine-grained parallelism, which is not exploited in available hardware-based dot product processor designs and cannot be achieved in traditional server-class processors such as the Intel Xeon. We compare the performance of our design against that of available hardware-based dot product processors, as well as a server-class processor running optimized software code that performs the same computation. We show that our hardware design achieves a speedup of 62,000 times over an available hardware design, and a speedup of 8,866 times with 33% (1.5 times) lower power consumption compared to software running on an Intel Xeon processor, for 1024 basis vectors. Our design can significantly reduce the number of servers required for similarity comparison in a distributed search engine, and can therefore reduce energy consumption, capital investment, operational costs and floor area in search engine data centers. The design can also be deployed in other applications that require fast dot product computation.
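The core computation the architecture accelerates is the cosine similarity (normalized dot product) between two meaning vectors. As a point of reference only, a minimal software sketch of that computation is shown below, using hypothetical 1024-dimensional document vectors; the actual hardware pipeline and the Xeon baseline code are not described in the abstract.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two meaning vectors: dot product of their normalized forms."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Two toy document vectors over 1024 basis vectors
# (the dimensionality used in the paper's evaluation).
rng = np.random.default_rng(0)
doc_a = rng.random(1024)
doc_b = rng.random(1024)
print(cosine_similarity(doc_a, doc_b))
```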
- Conference Proceeding: Semantic Vector Products: Some Initial Investigations (Dominic Widdows)
ABSTRACT: Semantic vector models have proven their worth in a number of natural language applications whose goals can be accomplished by modelling individual semantic concepts and measuring similarities between them. By comparison, the area of semantic compositionality in these models has so far remained underdeveloped. This will be a crucial hurdle for semantic vector models: in order to play a fuller part in the modelling of human language, these models will need some way of modelling the way in which single concepts are put together to form more complex conceptual structures. This paper explores some of the opportunities for using vector product operations to model compositional phenomena in natural language. These vector operations are all well-known and used in mathematics and physics, particularly in quantum mechanics. Instead of designing new vector composition operators, this paper gathers a list of existing operators, and a list of typical composition operations in natural language, and describes two small experiments that begin to investigate the use of certain vector operators to model certain language phenomena. Though preliminary, our results are encouraging. It is our hope that these results, and the gathering of other untested semantic and vector compositional challenges into a single paper, will stimulate further research in this area. Proceedings of the Second AAAI Symposium on Quantum Interaction; 01/2008
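As an illustration of the kind of vector product operations this abstract refers to, the sketch below shows two standard composition operators, the tensor (outer) product and circular convolution. The specific operators and experiments evaluated in that paper are not listed here, so the choice of operators below is an assumption for illustration only.

```python
import numpy as np

def tensor_product(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    # Outer (tensor) product: keeps all pairwise feature interactions.
    return np.outer(a, b)

def circular_convolution(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    # Circular convolution: compresses the composition back to the
    # original dimensionality (computed via the FFT for convenience).
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

a = np.array([1.0, 0.0, 2.0, 1.0])
b = np.array([0.5, 1.0, 0.0, 1.0])
print(tensor_product(a, b).shape)   # (4, 4) matrix of interactions
print(circular_convolution(a, b))   # length-4 composed vector
```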
Conference Proceeding: Parallel Computation of Similarity Measures Using an FPGA-Based Processor Array. 22nd International Conference on Advanced Information Networking and Applications (AINA 2008), GinoWan, Okinawa, Japan, March 25-28, 2008; 04/2008
ABSTRACT: This paper proposes a framework for representing the meaning of phrases and sentences in vector space. Central to our approach is vector composition, which we operationalize in terms of additive and multiplicative functions. Under this framework, we introduce a wide range of composition models which we evaluate empirically on a sentence similarity task. Experimental results demonstrate that the multiplicative models are superior to the additive alternatives when compared against human judgments. 01/2008
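To illustrate the additive and multiplicative composition functions described in this abstract, here is a minimal sketch using hypothetical toy word vectors; the actual vector space, dimensionality, and evaluation data used in that work are not given here.

```python
import numpy as np

def additive(u: np.ndarray, v: np.ndarray) -> np.ndarray:
    # Additive model: the phrase vector is the sum of the word vectors.
    return u + v

def multiplicative(u: np.ndarray, v: np.ndarray) -> np.ndarray:
    # Multiplicative model: element-wise product of the word vectors.
    return u * v

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical word vectors, for illustration only.
horse = np.array([0.2, 0.8, 0.1, 0.4])
ran   = np.array([0.6, 0.3, 0.7, 0.1])

phrase_add = additive(horse, ran)
phrase_mul = multiplicative(horse, ran)
print(cosine(phrase_add, phrase_mul))  # how differently the two models compose
```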