Towards Learning Powerful Deep Graph Neural Networks and Embeddings
Authors
Verma, Saurabh
Published Date
2020-06
Type
Thesis or Dissertation
Abstract
Learning powerful data embeddings has recently become the core of machine learning algorithms, especially in the natural language processing and computer vision domains. In the graph domain, the applications of learning graph embeddings are vast, with distinct use cases across domains such as bioinformatics, chemoinformatics, social networks, and recommendation systems. To date, the graph remains the most fundamental data structure that can represent many forms of real-world datasets. However, owing to its rich but complex structure, the graph presents a significant challenge in forging powerful graph embeddings. Even standard deep learning techniques such as Recurrent Neural Networks (RNNs) or Convolutional Neural Networks (CNNs) cannot operate on data that lies beyond a 1D sequence (e.g., of words) or a 2D pixel grid (e.g., of images), and therefore cannot generalize to arbitrary graph structures. Recently, Graph Neural Networks (GNNs) have been proposed to alleviate these limitations, but the current state of the art is far from mature in both theory and applications. To that end, this thesis aims to develop powerful graph embedding models for solving a wide variety of real-world problems on graphs. We study some of the major approaches for devising graph embeddings, namely graph kernel or spectrum approaches and GNNs. We expose and tackle some of their fundamental weaknesses and contribute several novel state-of-the-art graph embedding models. These models achieve superior performance over existing methods on many real-world graph problems, such as node classification, graph classification, and link prediction, and come with desirable theoretical guarantees. We first study the capabilities of graph kernel or spectrum approaches to yield powerful graph embeddings in terms of uniqueness, stability, sparsity, and computational efficiency. Second, we propose the Graph Capsule Neural Network, which yields powerful graph embeddings by capturing more of the information encoded in the graph structure than existing GNNs. Third, we devise the first universal and transferable GNN, making transfer learning possible in the graph domain; with this GNN, graph embeddings can be shared and transferred across different models and domains, reaping the benefits of transfer learning. Lastly, there is a dearth of theoretical exploration of GNN models, for example of their generalization properties. We take the first step towards developing a deeper theoretical understanding of GNN models by analyzing their stability and deriving their generalization guarantees. To the best of our knowledge, we are the first to study stability bounds on graph learning in a semi-supervised setting and to derive related generalization bounds for GNN models. In summary, this thesis contributes several state-of-the-art graph embedding models and novel graph theory, specifically (i) a powerful graph embedding called the Family of Graph Spectral Distances (FGSD); (ii) a highly informative GNN called the Graph Capsule Neural Network (GCAPS); (iii) a universal and transferable GNN called the Deep Universal and Transferable Graph Neural Network (DUGNN); and (iv) stability theory and generalization guarantees for GNNs.
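As a concrete illustration of the kind of model the abstract discusses, the sketch below shows a generic message-passing GNN layer: each node averages its neighbors' features and passes the result through a learned linear map, and pooling the resulting node embeddings yields a graph-level embedding. This is a minimal sketch assuming PyTorch; it is not the FGSD, GCAPS, or DUGNN models contributed by the thesis, and the class and variable names (SimpleGraphConv, adj, x) are hypothetical.

    # Minimal message-passing GNN layer (illustrative sketch only; assumes PyTorch).
    # Not the thesis's FGSD, GCAPS, or DUGNN models; names are hypothetical.
    import torch
    import torch.nn as nn

    class SimpleGraphConv(nn.Module):
        """One propagation step: average neighbor features, then apply a linear map."""
        def __init__(self, in_dim, out_dim):
            super().__init__()
            self.linear = nn.Linear(in_dim, out_dim)

        def forward(self, adj, x):
            # adj: (N, N) adjacency matrix; x: (N, in_dim) node features.
            # Add self-loops and row-normalize so each node averages over its neighborhood.
            a_hat = adj + torch.eye(adj.size(0))
            deg = a_hat.sum(dim=1, keepdim=True)
            h = (a_hat / deg) @ x
            return torch.relu(self.linear(h))

    # Toy usage: a 4-node graph with random 8-dimensional node features.
    adj = torch.tensor([[0., 1., 0., 0.],
                        [1., 0., 1., 1.],
                        [0., 1., 0., 1.],
                        [0., 1., 1., 0.]])
    x = torch.randn(4, 8)
    layer = SimpleGraphConv(8, 16)
    node_embeddings = layer(adj, x)            # (4, 16) per-node embeddings
    graph_embedding = node_embeddings.mean(0)  # (16,) whole-graph embedding via mean pooling

Stacking several such layers lets each node aggregate information from progressively larger neighborhoods, which is the basic mechanism that GNN-based embedding models build on.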
Description
University of Minnesota Ph.D. dissertation. June 2020. Major: Computer Science. Advisor: Zhi-Li Zhang. 1 computer file (PDF); xiv, 144 pages.
Suggested citation
Verma, Saurabh. (2020). Towards Learning Powerful Deep Graph Neural Networks and Embeddings. Retrieved from the University Digital Conservancy, https://hdl.handle.net/11299/216106.