abstract = {The Lottery Ticket Hypothesis continues to have a profound practical impact on the quest for small-scale deep neural networks that solve modern deep learning tasks at competitive performance. These lottery tickets are identified by pruning large randomly initialized neural networks with architectures that are as diverse as their applications. Yet, theoretical insights that attest to their existence have been mostly focused on deep fully-connected feed-forward networks with ReLU activation functions. We prove that modern architectures consisting of convolutional and residual layers, which can be equipped with almost arbitrary activation functions, can also contain lottery tickets with high probability.},
spotlight={true},
img={proof_network_overview.png},
}

@inproceedings{
fischer2022plant,
title={Plant 'n' Seek: Can You Find the Winning Ticket?},
author={Jonas Fischer and Rebekka Burkholz},
booktitle={The Tenth International Conference on Learning Representations},
year={2022},
url={https://openreview.net/forum?id=9n9c8sf0xm},
pdf={https://openreview.net/pdf?id=9n9c8sf0xm},
abstract={The lottery ticket hypothesis has sparked the rapid development of pruning algorithms that aim to reduce the computational costs associated with deep learning during training and model deployment. Currently, such algorithms are primarily evaluated on imaging data, for which we lack ground truth information and thus the understanding of how sparse lottery tickets could be. To fill this gap, we develop a framework that allows us to plant and hide winning tickets with desirable properties in randomly initialized neural networks. To analyze the ability of state-of-the-art pruning to identify tickets of extreme sparsity, we design and hide such tickets solving four challenging tasks. In extensive experiments, we observe similar trends as in imaging studies, indicating that our framework can provide transferable insights into realistic problems. Additionally, we can now see beyond such relative trends and highlight limitations of current pruning methods. Based on our results, we conclude that the current limitations in ticket sparsity are likely of algorithmic rather than fundamental nature. We anticipate that comparisons to planted tickets will facilitate future developments of efficient pruning algorithms.},
abstract={The lottery ticket hypothesis conjectures the existence of sparse subnetworks of large randomly initialized deep neural networks that can be successfully trained in isolation. Recent work has experimentally observed that some of these tickets can be practically reused across a variety of tasks, hinting at some form of universality. We formalize this concept and theoretically prove that not only do such universal tickets exist but they also do not require further training. Our proofs introduce a couple of technical innovations related to pruning for strong lottery tickets, including extensions of subset sum results and a strategy to leverage higher amounts of depth. Our explicit sparse constructions of universal function families might be of independent interest, as they highlight representational benefits induced by univariate convolutional architectures.},
description: "Große KI-Modelle wie ChatGPT brauchen riesige Rechenzentren und jede Menge Energie und werden fast ausschließlich von Tech-Giganten entwickelt. Welche Vorteile hätte es, Deep Learning zu demokratisieren? Und wie können kleinere KI-Modelle dazu beitragen, die Abhängigkeit von großen Tech-Konzernen zu reduzieren? Wie man Deep Learning demokratisieren kann, das erforscht Dr. Rebekka Burkholz am CISPA Helmholtz-Zentrum für Informationssicherheit. Im „Forschungsquartett“-Gespräch mit detektor.fm-Redakteurin Esther Stephan erklärt sie, warum das notwendig ist, und wieso kleinere KI-Modelle vielleicht sogar besser sind."
- title: "Maschinelles Lernen und Künstliche Intelligenz"
65
+
date: 2024-08-29
66
+
speaker: "Rebekka Burkholz"
67
+
venue: "CISPA TL;DR"
68
+
description: "Runde 2 unserer Sommer-Konferenz-Reihe: auf der ICML in Wien haben wir uns mit Rebekka Burkholz hingesetzt um über ihre Forschung und das Neueste im Bereich des maschinellen Lernens zu sprechen. Rebekka kam 2021 zum CISPA kam und ist seitdem mit einem ERC-Starting Grant ausgezeichnet worden, um mit ihrer Forschung neuronale Netzwerke effizienter zu machen. Im Podcast sprechen wir darüber, wie sie ihren wissenschaftlichen Hintergrund aus der Physik im Bereich KI anwendet und wie KI in Zukunft die Gesellschaft beeinflussen kann."
- title: "My Leadership Style is 'We-Learn-Together'"
73
+
date: 2024-03-08
74
+
speaker: "Rebekka Burkholz"
75
+
venue: "Scholarly Communication"
76
+
description: "Listen to this interview of Rebekka Burkholz, faculty at the CISPA Helmholtz Center for Information Security. We talk about the composition of research groups and of research papers. Rebekka Burkholz: \"I have the feeling that this meta-reading becomes more important as a person's career progresses. Because early on, a researcher is typically very focused on the details of each paper and they try to understand what this method does and so on — and of course, researchers need to begin that way, really spending the time to attain to expertise in a particular focus. But with time, as a researcher has seen more ideas (and of course, in one particular focus, methods and questions all share some similarity), then the person acquires more and more overview as they continue reading. They are reading, essentially, for the links between findings, for implications of the findings and those links — and in this way, a more experienced reader of the research actually becomes engaged in a sort of literature discussion.\""
- title: "Two ERC Starting Grants for Dr. Rebekka Burkholz & Dr. Julian Loss"
80
+
date: "2023-11-02"
81
+
speaker: "Rebekka Burkholz & Julian Loss"
82
+
venue: "CISPA TL;DR"
83
+
description: "Dr. Rebekka Burkholz and Dr. Julian Loss seem to have liked it on our podcast – they both are returning for their second episode of TL;DR! The two CISPA Faculty are working on completely different things, but they both have been awarded with a prestigious research grant by the European Research Council (ERC) this fall: the ERC Starting Grant. We talk about what it means for them to receive this grant, what their research has in common and how to facilitate interdisciplinary research. Now available at all your favorite podcast platforms!"
- title: "Maschinelles Lernen mit Dr. Rebekka Burkholz"
88
+
date: "2022-12-07"
89
+
speaker: "Rebekka Burkholz"
90
+
venue: "CISPA TL;DR"
91
+
description: "CISPA-Faculty Dr. Rebekka Burkholz spricht in dieser Folge mit uns darüber, was relationales maschinelles Lernen ist und welche Chancen Methoden des maschinellen Lernens in der Diagnostik und Behandlung von Krankheiten eröffnen. Die Mathematikerin gibt zudem Einblicke, was Informatiker:innen und Mathematiker:innen unterscheidet und was aus ihrer Sicht helfen würde, mehr Frauen für eine Karriere in der Forschung zu begeistern."
_data/news.yml (8 additions, 5 deletions)

@@ -1,8 +1,8 @@
 - date: 2. June 2025
-  headline: "Rebekka and Celia are presenting at [NetSci](https://netsci2025.github.io/) in Maastricht."
+  headline: "Rebekka and Celia are presenting at [NetSci](https://netsci2025.github.io/) in Maastricht with a satellite keynote and a lightning talk."

 - date: 1. June 2025
-  headline: Welcome Baraah!
+  headline: "Welcome Baraah!"

 - date: 7. May 2025
   headline: "Congratulations to Rebekka for [receiving tenure](https://cispa.de/en/burkholz-tenured) at CISPA."

@@ -13,6 +13,9 @@
 - date: 24. March 2025
   headline: "Rebekka is at [CPAL](https://cpal.cc/spotlight_track/) presenting three [papers](/publications) as recent spotlights."

+- date: 13. February 2025
+  headline: "Celia is presenting her work on graph rewiring at Cohere Labs ([see talk here](/media/#celia-rubio-madrigal--cohere-labs-feb-13-2025))."
+
 - date: 22. January 2025
   headline: "Two papers
     [(1)](https://openreview.net/forum?id=g6v09VxgFw)

@@ -33,19 +36,19 @@
   headline: "Welcome to Chao, Rahul, and Dong!"

 - date: 14. June 2024
-  headline: "Celia, Advait and Adarsh are presenting at the Helmholtz AI Conference: AI for Science ([HAICON](https://eventclass.it/haic2024/scientific/external-program/session?s=S-05a)) in Düsseldorf."
+  headline: "Celia, Advait and Adarsh are presenting at the Helmholtz AI Conference: AI for Science ([HAICON](https://eventclass.it/haic2024/scientific/external-program/session?s=S-05a)) in Düsseldorf ([see talk here](/media/#celia-rubio-madrigal--haicon-jun-14-2024))."

 - date: 1. May 2024
   headline: "Our paper on [improving GATs](https://openreview.net/forum?id=Sjv5RcqfuH) has been accepted at ICML 2024."

 - date: 1. February 2024
-  headline: Welcome Tom!
+  headline: "Welcome Tom!"

 - date: 16. January 2024
   headline: "Two papers [(1-Spotlight)](https://openreview.net/forum?id=qODvxQ8TXW) [(2)](https://openreview.net/forum?id=wOSYMHfENq) have been accepted at ICLR 2024."

 - date: 1. October 2023
-  headline: Welcome Celia!
+  headline: "Welcome Celia!"

 - date: 21. September 2023
   headline: "Our paper on [balancing GATs](https://openreview.net/forum?id=qY7UqLoora) has been accepted at NeurIPS 2023."
description: "My main goal is to develop efficient deep learning algorithms that are robust to noise, require small sample sizes, and are generally applicable in the sciences. My work is founded in theory with implications for real world applications and is often characterized by a complex network science perspective. My favourite applications and sources of inspiration are currently the biomedical domain, pharmacy, and physics. My group is supported by the ERC starting grant [SPARSE-ML](https://cispa.de/en/erc-burkholz)."
_pages/home.md (5 additions, 1 deletion)

@@ -8,6 +8,10 @@ permalink: /

 # Relational Machine Learning Lab

-We are an ML research group led by [Dr. Rebekka Burkholz](https://sites.google.com/view/rebekkaburkholz). We invite you to explore our research interests and our latest [publications](publications) in top-tier conferences (NeurIPS, ICML, ICLR).
+> **New positions available!**
+> We are currently hiring for PhD and Postdoc positions. Check the [openings](/openings) page and apply now!
+
+We are an ML research group led by [Dr. Rebekka Burkholz](https://sites.google.com/view/rebekkaburkholz). We invite you to explore our research interests and our latest [publications](publications) in top-tier conferences (NeurIPS, ICML, ICLR), and to watch some of our [talks](media/#talks).

 We are part of the [CISPA Helmholtz Center for Information Security](https://cispa.de), at the [Saarland University](https://www.uni-saarland.de) campus in Saarbrücken, Germany.
 Watch below some of our recent **[talks](#talks)** at conferences and seminars, where members of our [group](/team) present highlights from our list of [publications](/publications).
+
+You can also listen to insightful **[podcasts](#podcasts)** featuring Dr. Rebekka Burkholz (some are in German and some in English).

 If you are interested in joining our group or learning more about our research, please reach out to [Dr. Rebekka Burkholz](https://sites.google.com/view/rebekkaburkholz/).

 For further information on pursuing a PhD or postdoc at the CISPA Helmholtz Center for Information Security, see the [CISPA Career Portal](https://career.cispa.de/).