Project description
The rapid advancement of foundation models (FMs), such as large language models and vision transformers, has transformed many areas of artificial intelligence (AI). These models, characterized by their scale and pre-training on vast datasets, generalize well across a wide range of tasks. Integrating them into federated learning (FL), a distributed machine learning approach in which clients train a shared model without exposing their raw data, is therefore a promising direction. In particular, FL can mitigate challenges caused by non-IID (not independent and identically distributed) and biased client data by leveraging the capabilities of FMs, improving performance across tasks and domains.

More specifically, FMs can enhance FL in several ways:
- FMs provide a strong starting point for FL by offering pre-trained models that clients can fine-tune efficiently.
- FMs can act as generators, synthesizing diverse data to enrich the training sets used in FL.
- FMs can serve as teachers through knowledge distillation, helping to counter suboptimal performance of FL models (see the distillation sketch below).
- FMs introduce a new paradigm for knowledge sharing in FL: instead of exchanging high-dimensional model parameters, clients can exchange compact representations such as soft prompts obtained via prompt tuning, offering a more efficient and flexible sharing mechanism (see the prompt-aggregation sketch below).

The aim of this thesis is to explore the intersection of FL and FMs, with a primary focus on developing a novel aggregation method and a new paradigm for knowledge sharing among clients in FL.
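As an illustration of the teacher role mentioned above, the sketch below shows a standard knowledge-distillation loss in PyTorch, where a client's local model is trained against temperature-softened outputs of a frozen FM teacher alongside the usual cross-entropy. This is a minimal sketch under assumed settings; the function name and hyperparameters are illustrative, not part of the proposal.

```python
# Minimal sketch of FM-as-teacher distillation on a client (illustrative only):
# the local loss mixes hard-label cross-entropy with a KL term on
# temperature-softened logits produced by a frozen foundation-model teacher.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """alpha weights the hard-label term against the soft teacher term."""
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # the usual T^2 factor keeps the gradient scale comparable
    return alpha * hard + (1.0 - alpha) * soft
```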
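To make the parameter-efficient sharing paradigm concrete, the sketch below applies FedAvg-style weighted averaging to soft prompt tensors only, with the foundation-model backbone kept frozen on every client. All names (PromptedClassifier, aggregate_prompts) and the toy backbone are assumptions made for illustration; the proposal's own aggregation method is left open.

```python
# Minimal sketch (assumptions, not the proposal's method): clients fine-tune only a
# soft prompt on top of a frozen foundation-model backbone, and the server performs
# FedAvg-style weighted averaging over the prompt tensors alone.
import torch
import torch.nn as nn

class PromptedClassifier(nn.Module):
    """Frozen backbone plus a trainable soft prompt; only the prompt is communicated."""
    def __init__(self, backbone: nn.Module, prompt_len: int, dim: int, n_classes: int):
        super().__init__()
        self.backbone = backbone
        for p in self.backbone.parameters():   # freeze the foundation model
            p.requires_grad = False
        self.prompt = nn.Parameter(0.02 * torch.randn(prompt_len, dim))
        self.head = nn.Linear(dim, n_classes)  # small local head, kept per client here

    def forward(self, x):                      # x: (batch, seq_len, dim) token embeddings
        prompt = self.prompt.unsqueeze(0).expand(x.size(0), -1, -1)
        h = self.backbone(torch.cat([prompt, x], dim=1))
        return self.head(h.mean(dim=1))        # pool and classify

def aggregate_prompts(client_prompts, client_sizes):
    """Weighted average of client prompts (FedAvg restricted to prompt parameters)."""
    total = float(sum(client_sizes))
    return torch.stack(
        [p * (n / total) for p, n in zip(client_prompts, client_sizes)]
    ).sum(dim=0)

# One simulated round with two clients that hold different amounts of data.
backbone = nn.TransformerEncoderLayer(d_model=32, nhead=4, batch_first=True)
clients = [PromptedClassifier(backbone, prompt_len=8, dim=32, n_classes=5) for _ in range(2)]
# ... each client would run a few local gradient steps on its own data here ...
global_prompt = aggregate_prompts([c.prompt.data for c in clients], client_sizes=[100, 300])
for c in clients:
    c.prompt.data.copy_(global_prompt)         # broadcast the aggregated prompt
```

Only the prompt tensor (here 8 x 32 values) crosses the network each round, which is what makes this sharing mechanism far cheaper than exchanging full model parameters.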