diff --git a/_config.yml b/_config.yml
index 811f91b..b28ac2f 100644
--- a/_config.yml
+++ b/_config.yml
@@ -14,8 +14,8 @@
 linkedin_username: MLSec
 linkedin_id: 77059723
 address: |
-  University of Cagliari
-  Cagliari (Italy)
+  University of Cagliari & University of Genoa
+
 # Set your Google Analytics tracking ID (set up yours at http://www.google.com/analytics/)
 # google_analytics: UA-XXXXXXXX-X
diff --git a/_events/_new_event.md b/_events/_new_event.md
deleted file mode 100644
index 165a32a..0000000
--- a/_events/_new_event.md
+++ /dev/null
@@ -1,9 +0,0 @@
----
-type: raw_event
-date: 2023-12-07T17:00:00+1:00
-name: New Seminar!
-description: 'Franziska Boenisch'
-hide_from_announcments: false
-thumbnail: /_images/next.png
-
----
diff --git a/_events/boenisch.md b/_events/boenisch.md
new file mode 100644
index 0000000..b797d68
--- /dev/null
+++ b/_events/boenisch.md
@@ -0,0 +1,11 @@
+---
+type: events
+date: 2023-12-07T17:00:00+1:00
+speaker: Franziska Boenisch
+affiliation: CISPA
+title: "Can individuals trust privacy mechanisms for machine learning? A case study of federated learning"
+bio: "Franziska is a tenure-track faculty member at the CISPA Helmholtz Center for Information Security, where she co-leads the SprintML lab. Before that, she was a Postdoctoral Fellow at the University of Toronto and the Vector Institute in Toronto, advised by Prof. Nicolas Papernot. Her current research centers on private and trustworthy machine learning with a focus on decentralized applications. Franziska obtained her Ph.D. from the Computer Science Department at Freie Universität Berlin, where she pioneered the notion of individualized privacy in machine learning. During her Ph.D., Franziska was a research associate at the Fraunhofer Institute for Applied and Integrated Security (AISEC), Germany. She received a Fraunhofer TALENTA grant for outstanding female early-career researchers and the German Industrial Research Foundation prize for her research on machine learning privacy."
+abstract: "What is the trusted computing base for privacy? This talk will answer this question from the perspective of individual users. I will first focus on a case study of federated learning (FL). My work shows that vanilla FL currently does not provide meaningful privacy for individual users who cannot trust the central server orchestrating the FL protocol. This is because gradients of the shared model directly leak individual training data points. The resulting leakage can be amplified by a malicious attacker through small, targeted manipulations of the model weights. My work thus shows that the protection that vanilla FL claims to offer is but a thin facade: data may never \"leave\" personal devices explicitly, but it certainly does so implicitly through gradients. Then, I will show that the leakage is still exploitable even in what is considered the most private instantiation of FL: a protocol that combines secure aggregation with differential privacy. This highlights that individuals unable to trust the central server should instead rely on verifiable mechanisms to obtain privacy. I will conclude my talk with an outlook on how such verifiable mechanisms can be designed in the future, as well as how my work generally advances the ability to audit privacy mechanisms."
+youtube:
+zoom: https://us02web.zoom.us/meeting/register/tZcqcu6orjIjEtbYTZTIdikT4rCZM1F3zk4h
+---
diff --git a/_includes/announcements.html b/_includes/announcements.html
index 0dea340..27580c0 100644
--- a/_includes/announcements.html
+++ b/_includes/announcements.html
@@ -16,7 +16,7 @@

 Updates
 {% for n in all_events_sorted limit:7 %}
   {% if n.type %}
-    New event! {{ n.description }} is coming to our seminar on {{ n.date | date_to_string: "ordinal", "US" }}!
+    New event! {{ n.speaker }} is coming to our seminar on {{ n.date | date_to_string: "ordinal", "US" }}!
   {% else %}
     {{ n.content }}
   {% endif %}
diff --git a/_includes/footer.html b/_includes/footer.html
index 7dbb50a..c9961f7 100644
--- a/_includes/footer.html
+++ b/_includes/footer.html
@@ -8,13 +8,15 @@

     {{ site.address | newline_to_br }}
+    {% if site.schoolurl %}
     {{site.schoolurl}}
-
-
+
     {% endif %}
-
+Thanks to: Elsa Project
+
+
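
The new _events/boenisch.md establishes the front-matter schema that announcements.html now reads (speaker instead of the old description field). As a rough sketch only, a future event file would presumably follow the same fields; every value below is a placeholder, not real data:

---
# Placeholder values illustrating the schema introduced by _events/boenisch.md
type: events
date: 2024-01-01T17:00:00+1:00
speaker: Jane Doe
affiliation: Example University
title: "Placeholder talk title"
bio: "Placeholder speaker bio."
abstract: "Placeholder talk abstract."
youtube:
zoom: https://example.org/register
---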