Iku OHAMA Takuya KIDA Hiroki ARIMURA
Latent variable models for relational data enable us to extract the co-cluster structure underlying observed relational data. The Infinite Relational Model (IRM) is a well-known relational model for discovering co-cluster structures with an unknown number of clusters. The IRM and several related models commonly assume that the link probability between two objects depends only on their cluster assignment. However, relational models based on this assumption often lead us to extract many non-informative and unexpected clusters. This is because the cluster structures underlying real-world relationships are often blurred by biases of individual objects. To overcome this problem, we propose a multi-layered framework, which extracts a clear de-blurred co-cluster structure in the presence of object biases. Then, we propose the Multi-Layered Infinite Relational Model (MLIRM) which is a special instance of the proposed framework incorporating the IRM as a co-clustering model. Furthermore, we reveal that some relational models can be regarded as special cases of the MLIRM. We derive an efficient collapsed Gibbs sampler to perform posterior inference for the MLIRM. Experiments conducted using real-world datasets have confirmed that the proposed model successfully extracts clear and interpretable cluster structures from real-world relational data.
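The IRM assumption named in the abstract above — that the link probability between two objects depends only on their cluster labels — can be illustrated with a small generative sketch. This is not the paper's MLIRM; it is a minimal, assumed rendering of the standard IRM: cluster assignments drawn from a Chinese Restaurant Process, a Beta-distributed link probability per cluster pair, and Bernoulli links. The hyperparameter names (`alpha`, `a`, `b`) are illustrative choices.

```python
import random

def crp(n, alpha, rng):
    """Draw cluster assignments for n objects from a Chinese Restaurant Process."""
    assign, sizes = [], []
    for _ in range(n):
        # An object joins cluster k with probability proportional to its size;
        # a new cluster opens with probability proportional to alpha.
        weights = sizes + [alpha]
        r = rng.random() * sum(weights)
        k, acc = 0, 0.0
        for k, w in enumerate(weights):
            acc += w
            if r < acc:
                break
        if k == len(sizes):
            sizes.append(0)
        sizes[k] += 1
        assign.append(k)
    return assign

def irm_generate(n_rows, n_cols, alpha=1.0, a=1.0, b=1.0, seed=0):
    """Generate a binary relation under the core IRM assumption:
    the link probability depends only on the pair of cluster labels."""
    rng = random.Random(seed)
    z_row = crp(n_rows, alpha, rng)
    z_col = crp(n_cols, alpha, rng)
    theta = {}  # one Beta(a, b)-distributed link probability per cluster pair
    links = [[0] * n_cols for _ in range(n_rows)]
    for i in range(n_rows):
        for j in range(n_cols):
            key = (z_row[i], z_col[j])
            if key not in theta:
                theta[key] = rng.betavariate(a, b)
            links[i][j] = 1 if rng.random() < theta[key] else 0
    return z_row, z_col, links
```

Because every pair `(i, j)` with the same cluster labels shares one `theta` entry, any per-object bias in real data has nowhere to go except into extra, non-informative clusters — the failure mode both abstracts describe.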
Iku OHAMA Hiromi IIDA Takuya KIDA Hiroki ARIMURA
Latent variable models for relational data enable us to extract the co-cluster structures underlying observed relational data. The Infinite Relational Model (IRM) is a well-known relational model for discovering co-cluster structures with an unknown number of clusters. The IRM assumes that the link probability between two objects (e.g., a customer and an item) depends only on their cluster assignments. However, relational models based on this assumption often lead us to extract many non-informative and unexpected clusters. This is because the underlying co-cluster structures in real-world relationships are often destroyed by structured noise that blurs the cluster structure stochastically depending on the pair of related objects. To overcome this problem, in this paper, we propose an extended IRM that simultaneously estimates a denoised, clear co-cluster structure and a structured noise component. In other words, our proposed model jointly estimates the cluster assignment and noise level for each object. We also present the posterior probabilities needed to run collapsed Gibbs sampling for inferring the model. Experiments on real-world datasets show that our model extracts a clear co-cluster structure. Moreover, we confirm that the estimated noise levels enable us to extract representative objects for each cluster.
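One plausible reading of the structured-noise idea in the abstract above — offered only as an assumed sketch, since the abstract does not specify the noise mechanism — is that each observed link is flipped with a probability that depends on the noise levels of the two objects involved. The combination rule (product of the two noise levels) is a hypothetical choice for illustration.

```python
import random

def blur_links(links, noise_row, noise_col, seed=0):
    """Blur a clean relation with per-object structured noise.

    Hypothetical reading of the abstract: entry (i, j) of the clean
    binary relation is flipped with probability noise_row[i] * noise_col[j]
    (the product rule is an illustrative assumption, not the paper's model).
    """
    rng = random.Random(seed)
    noisy = []
    for i, row in enumerate(links):
        out = []
        for j, v in enumerate(row):
            flip_p = noise_row[i] * noise_col[j]  # assumed combination rule
            out.append(1 - v if rng.random() < flip_p else v)
        noisy.append(out)
    return noisy
```

Under this reading, objects with low estimated noise levels are the ones whose links most faithfully reflect their cluster's block pattern, which matches the abstract's claim that noise levels identify representative objects per cluster.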
Zenshiro KAWASAKI Keiji SHIBATA Masato TAJIMA
This paper presents an extension of the database query language SQL to include queries against a database with natural language annotations. The proposed scheme is based on the Concept Coupling Model, a language model for handling natural language sentence structures. Integrating this language model with the conventional relational data model provides a unified environment for manipulating information sources comprising relational tables and natural language texts.