Learning nonlinear functions and graph topologies


Published Date

2021-08

Type

Thesis or Dissertation

Abstract

Network science has successfully permeated a wide gamut of disciplines. A key first step in applying network science concepts to a complex system of interest is obtaining a graph representation of that system that accurately captures the dependencies among its constituent nodes. This is a challenging task, especially when accounting for nodal interactions that are, in general, nonlinear or time-varying. A domain where such interactions arise, and one of particular interest to this thesis, is identifying the topology of brain networks from functional magnetic resonance imaging (fMRI) time series. To handle complex dependencies, the present thesis introduces novel nonlinear counterparts of linear association measures such as partial correlation and Granger causality, along with a nonlinear graph change-detection scheme to unveil temporal variations. Choosing the type of nonlinearity to model is instrumental, as it dictates the form of the nodal dependencies under consideration. To address this challenge in a data-driven fashion, the thesis introduces multi-kernel learning approaches to the realm of graph topology identification. The merits of the proposed framework are illustrated on real and synthetic resting-state fMRI data.

Key to the novel approaches for capturing complex dependencies are methods for learning nonlinear functions. In this context, the second part of this thesis deals with Gaussian process (GP) based online learning. Two highly desirable yet conflicting properties are identified, namely adaptability to a wide range of operational environments and scalability; their successful joint pursuit has remained elusive in the GP literature. To this end, a novel framework leveraging random-feature kernel approximants and ensemble learning techniques is introduced. Algorithms for supervised learning are developed, and associated performance analyses are provided. Finally, a novel unsupervised learning scheme extending the ubiquitous GP latent variable model is developed, capitalizing on the same key concepts. Tests on benchmark datasets highlight the benefits of the proposed approaches over state-of-the-art methods.
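The linear partial-correlation measure that the abstract mentions as a baseline can be sketched in a few lines: nonzero partial correlations correspond to nonzero off-diagonal entries of the precision (inverse covariance) matrix. The NumPy example below is illustrative only, not the thesis's own code; the threshold and synthetic data are arbitrary choices for demonstration.

```python
import numpy as np

def partial_correlation_graph(X, threshold=0.2):
    """X: (T, N) array of N nodal time series; returns a binary adjacency matrix
    with an edge wherever the magnitude of the partial correlation exceeds threshold."""
    prec = np.linalg.inv(np.cov(X, rowvar=False))   # precision matrix estimate
    d = np.sqrt(np.diag(prec))
    pcorr = -prec / np.outer(d, d)                  # partial correlations from precision entries
    np.fill_diagonal(pcorr, 0.0)
    return (np.abs(pcorr) > threshold).astype(int)

# Synthetic example: node 2 is driven by nodes 0 and 1; node 3 is independent noise.
rng = np.random.default_rng(1)
T = 5000
x0, x1 = rng.normal(size=T), rng.normal(size=T)
x2 = 0.8 * x0 + 0.8 * x1 + 0.3 * rng.normal(size=T)
x3 = rng.normal(size=T)
A = partial_correlation_graph(np.column_stack([x0, x1, x2, x3]))
```

Note that conditioning on the common "child" node 2 also induces a partial correlation between nodes 0 and 1, so the recovered undirected graph links all of nodes 0, 1, and 2 while leaving node 3 isolated; this is the expected Gaussian-graphical-model behavior, and it is precisely this linear notion of dependence that the thesis generalizes to nonlinear settings.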
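The "random feature kernel approximants" the abstract refers to are commonly realized via the random Fourier feature construction for shift-invariant kernels: a kernel matrix is approximated by an inner product of finite-dimensional random feature maps, which is what enables scalable GP computations. The sketch below (illustrative sizes and lengthscale; not the thesis's implementation) approximates an RBF kernel this way.

```python
import numpy as np

def random_fourier_features(X, num_features, lengthscale, rng):
    """Map inputs X of shape (n, d) to random Fourier features so that
    Z @ Z.T approximates the RBF kernel exp(-||x - y||^2 / (2 * lengthscale^2))."""
    n, d = X.shape
    W = rng.normal(scale=1.0 / lengthscale, size=(d, num_features))  # spectral samples
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)             # random phases
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
Z = random_fourier_features(X, num_features=2000, lengthscale=1.0, rng=rng)

K_approx = Z @ Z.T                                        # low-rank kernel approximation
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)       # pairwise squared distances
K_exact = np.exp(-sq / 2.0)                               # exact RBF kernel, lengthscale 1
err = np.abs(K_approx - K_exact).max()
```

Working with the 2000-dimensional features `Z` instead of the full kernel matrix is what makes online GP updates scale with the number of features rather than the number of samples, which is the scalability ingredient the abstract pairs with ensembling for adaptability.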

Description

University of Minnesota Ph.D. dissertation. August 2021. Major: Electrical/Computer Engineering. Advisor: Georgios B. Giannakis. 1 computer file (PDF); vii, 89 pages.

Suggested citation

Karanikolas, Vasileios Georgios. (2021). Learning nonlinear functions and graph topologies. Retrieved from the University Digital Conservancy, https://hdl.handle.net/11299/225035.
