Personalized Knowledge Graph Summarization with LLM

Supervised by Professor Yujun Yan; still in progress~ Based on my interest in graph neural networks, I proactively reached out to Professor Yujun Yan to learn more about this area.

Working

This project addresses the following problem: given a user's interaction history on a knowledge graph, generate a personalized, compact summary graph that satisfies the user's information needs as well as possible.

First, we surveyed current state-of-the-art knowledge graph reasoning algorithms and knowledge graph datasets, ultimately selecting GreaseLM and CommonsenseQA.

Next, since no real user history on the knowledge graph is available yet, we constructed synthetic historical data using random walks and sentence embeddings from a language model (a rough sketch of this idea is shown below).
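The snippet below is a minimal sketch, not the project's actual code: it simulates user "history" by running uniform random walks on a toy knowledge graph and embedding each walk with a sentence encoder. The toy graph, walk length, and the encoder name ("all-MiniLM-L6-v2") are illustrative assumptions; in practice the graph would come from the GreaseLM/ConceptNet setup.

```python
import random
import networkx as nx
from sentence_transformers import SentenceTransformer

# Toy knowledge graph standing in for the real one.
kg = nx.Graph()
kg.add_edges_from([
    ("dog", "animal"), ("dog", "pet"), ("pet", "home"),
    ("animal", "zoo"), ("home", "family"),
])

def random_walk(graph, start, length=5):
    """Return the nodes visited by a simple uniform random walk."""
    walk = [start]
    for _ in range(length - 1):
        neighbors = list(graph.neighbors(walk[-1]))
        if not neighbors:
            break
        walk.append(random.choice(neighbors))
    return walk

# Treat each walk as one piece of user history and embed it as a sentence.
walks = [random_walk(kg, random.choice(list(kg.nodes))) for _ in range(10)]
encoder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed encoder choice
history_embeddings = encoder.encode([" ".join(w) for w in walks])
print(history_embeddings.shape)  # (10, 384) for this encoder
```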

We are now in the process of designing algorithms and experiments, hoping to incorporate LLMs for text analysis and processing.

We use PyG, NetworkX, wandb, etc. to build our experimental pipeline efficiently (a minimal example is sketched below).
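As a hedged illustration of how PyG and wandb fit together in such a pipeline, here is a tiny training loop on a toy graph. The graph, model size, and wandb project name are placeholders, not the project's real configuration.

```python
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv
import wandb

# Toy graph: 4 nodes, 3 undirected edges, binary node labels.
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],
                           [1, 0, 2, 1, 3, 2]], dtype=torch.long)
data = Data(x=torch.randn(4, 8), edge_index=edge_index, y=torch.tensor([0, 1, 0, 1]))

class GCN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(8, 16)
        self.conv2 = GCNConv(16, 2)

    def forward(self, data):
        h = F.relu(self.conv1(data.x, data.edge_index))
        return self.conv2(h, data.edge_index)

wandb.init(project="pkg-summarization-demo")  # placeholder project name
model = GCN()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

for epoch in range(50):
    optimizer.zero_grad()
    loss = F.cross_entropy(model(data), data.y)
    loss.backward()
    optimizer.step()
    wandb.log({"epoch": epoch, "loss": loss.item()})  # track metrics in wandb
```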

What I get

The biggest gain from this project has been the improvement in my source-code reading skills and my knowledge of Graph Learning. Since I needed to design experiments based on GreaseLM, I read the relevant source code thoroughly and built my modifications on top of it, which significantly improved my ability to read source code. Additionally, during the literature review phase, I gained a deeper understanding of Graph Learning. I'm now a proficient user of PyG, NetworkX, wandb, etc.~

Currently, I am very eager to combine graphs and LLMs to build AGI (a naive dream :).