
A semantic representation using neural networks at ETL

Dr. Takahashi of the ETL laboratories proposed a method to represent the semantic content of Japanese words or sentences as real-valued vectors, using neural networks.

These vectors, called semantic representation vectors (SRVs), all have a fixed size. They are obtained with recursive auto-associative memory (RAAM) neural networks trained on a corpus, so that similar words or sentences are assigned similar SRVs. Moreover, the transformation from a sentence to an SRV is reversible, so distinct sentences receive distinct SRVs.
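The RAAM idea can be sketched as a small autoencoder: an encoder compresses the concatenation of two child vectors into one fixed-size vector, and a decoder is trained to reconstruct the children from it, so the compressed vector can serve as an SRV for the pair. The following is a minimal numpy illustration, not Takahashi's actual network; the dimension, learning rate, and toy word vectors are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8  # assumed SRV dimension, for illustration only

# Encoder compresses two child vectors (2*D values) into one SRV;
# the decoder tries to reconstruct the two children from the SRV.
W_enc = rng.normal(scale=0.1, size=(D, 2 * D))
W_dec = rng.normal(scale=0.1, size=(2 * D, D))

def encode(left, right):
    return np.tanh(W_enc @ np.concatenate([left, right]))

def decode(srv):
    out = np.tanh(W_dec @ srv)
    return out[:D], out[D:]

def train_step(left, right, lr=0.05):
    """One auto-associative step: gradient descent on the squared
    reconstruction error. A real training run would iterate over
    a corpus of parsed words and sentences."""
    global W_enc, W_dec
    x = np.concatenate([left, right])
    h = np.tanh(W_enc @ x)          # encode
    y = np.tanh(W_dec @ h)          # decode
    err = y - x
    d_y = err * (1 - y ** 2)        # backprop through tanh
    grad_dec = np.outer(d_y, h)
    d_h = (W_dec.T @ d_y) * (1 - h ** 2)
    grad_enc = np.outer(d_h, x)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc
    return 0.5 * np.sum(err ** 2)

# Toy word vectors standing in for lexical inputs (hypothetical data).
cat = rng.normal(scale=0.5, size=D)
runs = rng.normal(scale=0.5, size=D)

losses = [train_step(cat, runs) for _ in range(500)]

srv = encode(cat, runs)      # fixed-size SRV for the pair
left, right = decode(srv)    # approximate reconstruction of the children
```

Because the same encoder can be applied to SRVs themselves, nested phrases of any depth collapse into one fixed-size vector, while the decoder makes the mapping approximately reversible.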

The semantic representation is thus obtained in a vector space whose basis vectors are not defined by a predefined list of concepts but are automatically adjusted by the neural networks during training.



Jean-Philippe Vert
Sun Dec 6 11:05:42 MET 1998