Project description
Since the advent of Federated Learning in recent years, industries, institutions, and the research community have been able to train multiple models on data that stays local and then aggregate the model parameters to form a global model. This family of methods has several advantages, including reducing the burden on a single server instance and preserving privacy, which is valuable when working with highly sensitive data such as Electronic Health Records (EHR). Several strategies have been proposed for federated optimization; the simplest has the server aggregate parameters (e.g. neural network weights) by computing their average. Past research has shown that, despite their effectiveness, the simpler federated optimization strategies are sensitive to the computational resources available at each local node and to the statistical heterogeneity of the data (since most machine learning methods assume the data are identically distributed). Many extensions to the simpler schemes have therefore been proposed that measure and communicate the local heterogeneity, which for some applications could itself breach data privacy. The aim of this master's thesis is to study, understand, test and develop innovative alternative methods based on implicit ways of using the statistical heterogeneity of each local node.
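The averaging step mentioned above can be illustrated with a minimal sketch. This is not code from the project; it is a FedAvg-style aggregation in which the server averages client parameter vectors weighted by local sample counts (the function name and toy values are illustrative assumptions):

```python
import numpy as np

def federated_average(client_params, client_sizes):
    """Aggregate client parameter vectors into a global model by
    weighted averaging, as in the simple strategy described above.

    client_params: list of 1-D numpy arrays, one per client
    client_sizes:  number of local training samples per client,
                   used as the averaging weights
    """
    weights = np.asarray(client_sizes, dtype=float)
    weights /= weights.sum()                       # normalise to sum to 1
    stacked = np.stack(client_params)              # shape: (n_clients, n_params)
    return np.tensordot(weights, stacked, axes=1)  # weighted mean over clients

# Toy round: three clients with different data volumes
params = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
sizes = [10, 10, 20]
global_params = federated_average(params, sizes)
print(global_params)  # → [3.5 4.5]
```

In a real federated round this aggregation would run on the server after each client trains locally; the data themselves never leave the clients, only the parameters do.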