AI4Physics

AI4Physics is a research initiative at the Department of Physics and Astronomy at Uppsala University that aims to support the use of artificial intelligence (AI) and machine learning (ML) in research throughout the department.

AI is leading to an imminent paradigm shift in physics. AI methods are already used to reform data analysis for experiments across physics, to develop and test new models in theoretical physics, and to transform physics education research. The AI4Physics initiative aims to integrate such AI methods in the research and teaching of our Department.

The AI4Physics initiative includes a seminar series given by invited researchers at the forefront of AI/ML developments in physics research. These seminars will be accompanied by lunch mingles, where members of the department give flash talks that showcase the use of AI/ML in their research.

A second part of the initiative is AI learning and training workshops tailored to the needs of faculty at the department. These workshops will be organized once a year, starting at a basic level and progressing to more advanced tools.

Past Events

Speaker: Bruno Loureiro (École Normale Supérieure, Paris)

Title: A statistical physics perspective on machine learning theory

Abstract: The past decade has witnessed a surge in the development and adoption of machine learning algorithms to solve day-to-day computational tasks. Yet, a solid theoretical understanding of even the most basic tools used in practice is still lacking, as traditional statistical learning methods are unfit for the modern regime, in which the number of model parameters is of the same order as the quantity of data, a problem known as the curse of dimensionality. Curiously, this is precisely the regime studied by physicists since the mid-19th century in the context of interacting many-particle systems. This connection, first established in the seminal work of Elizabeth Gardner and Bernard Derrida in the 80s, is the basis of a long and fruitful marriage between these two fields.

In this talk I will motivate and review the connections between Statistical Physics and problems from Machine Learning, in particular concerning recent progress in our understanding of shallow neural networks.

Speaker: Jim Halverson (Northeastern University)

Title: Neural Networks and Field Theory

Abstract: Machine learning techniques are leading to breakthroughs in the physical sciences and everyday life. The vast majority of the progress is powered by neural networks. In this colloquium I'll review the essentials of neural networks and we will quickly discover that their foundations make field theoretic language unavoidable. I will present two central results in ML theory regarding the statistics and dynamics of neural networks, their relation to free field theories, and the ML origin of interactions. I will then pivot and suggest using this formalism for physics, including what it would mean for a neural net ensemble to define a full-blown QFT. Time permitting, I will present progress in this direction, such as the realization of φ4 theory, unitarity, and conformal symmetry.

Slides from the event (pdf, 5 MB).

The seminar was complemented by a tutorial on 22 May.

Title: Getting Your Hands Dirty with Neural Networks and Field Theory

Abstract: In this hands-on code-based tutorial, we will compute correlations in neural network field theory and demonstrate how interactions shut off in a large-N limit. Time permitting, we'll realize global symmetries and exemplify unitary NN-FTs on a lattice.
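The large-N suppression of interactions mentioned in the abstract can be illustrated with a short numerical sketch (our own illustration, not the tutorial's material, with all names and parameter choices hypothetical): over an ensemble of random one-hidden-layer networks with i.i.d. standard-normal parameters, the connected four-point function of the output is O(1/N) in the width N, so the ensemble approaches a free (Gaussian) field at infinite width.

```python
import numpy as np

rng = np.random.default_rng(0)

def connected_4pt(width, n_nets=40_000, x=1.0):
    """Monte Carlo estimate of the connected four-point function
    G4c = E[f^4] - 3 E[f^2]^2 of the output f(x) at a single input,
    over an ensemble of random one-hidden-layer networks
    f(x) = width**-0.5 * sum_i a_i * tanh(w_i * x + b_i)
    with i.i.d. standard-normal parameters a, w, b (an assumed toy setup)."""
    w = rng.standard_normal((n_nets, width))
    b = rng.standard_normal((n_nets, width))
    a = rng.standard_normal((n_nets, width))
    f = (a * np.tanh(w * x + b)).sum(axis=1) / np.sqrt(width)
    return (f ** 4).mean() - 3 * (f ** 2).mean() ** 2

g_small = connected_4pt(width=2)    # interactions O(1/N): clearly nonzero
g_large = connected_4pt(width=100)  # interactions suppressed at large width
print(g_small, g_large)
```

For an exactly Gaussian field the connected four-point function vanishes; here it shrinks like 1/N, consistent with interactions shutting off in the large-N limit.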

This workshop gave a general introduction to machine learning and a broad survey of techniques and models relevant to physics research. Lectures were mixed with practical exercises.

Schedule

9-10
Thursday: Introduction to ML (pdf, 11 MB), Niklas Wahlström (UU/IT). Room: 101195, Heinz-Otto Kreiss
Friday: Introduction to LLM (pdf, 55 MB), Ekta Vats (UU/IT). Room: 80127

10-10.30
Coffee

10.30-11.20
Thursday: Workshop: A playful introduction to machine learning: creating an image classifier, Bor Gregorcic and Giulia Polverini. Room: 101136-B, Evelyn Sokolowski
Friday: Demo: How to do NLP-like research in physics, Yong Sheng Koay and Eliel Camargo-Molina; Performance of Large Multimodal Models on physics tasks involving visual representations (pdf, 2 MB), Giulia Polverini. Room: 80127

11.20-11.40
Thursday: Deep learning courses at IFA, Christian Glaser
Friday: LLM course at IFA, Bor Gregorcic

14-14.30
Coffee

14.30-16
Thursday: Ethel - An AI-Based Virtual Teaching Assistant at ETH Zurich (pdf, 10 MB), Gerd Kortemeyer (ETH Zurich). Room: 101121, Sonja Lyttkens
Friday: Panel discussion: Can we trust AI in physics research? With Ulf Danielsson (UU/IFA), Eliel Camargo-Molina (UU/Gotland), Stephanie Lowry (Örebro) and David Sumpter (UU/IT). Room: 101121, Sonja Lyttkens

Titles and abstracts for talks: Thursday

  • Niklas Wahlström (UU/IT): Introduction to Machine Learning and Deep Neural Networks
    Abstract:
    Machine learning involves creating computer programs that can autonomously learn to solve complex problems using data, without explicit programming for each task. This session will introduce the fundamentals of machine learning and present the key ingredients in the model class that has revolutionized the field in the recent decade: deep neural networks (DNNs). In particular, we will explore the importance of depth in these models and give pointers to different DNN architectures tailored for different types of data.

Niklas Wahlström is an Associate Professor at the Division of Systems and Control, Department of Information Technology, Uppsala University. His research interests lie in physics-informed machine learning and applications of machine learning in physics. He has developed several courses in machine learning, at both Master's and PhD level. Niklas received his PhD from Linköping University and has held visiting positions at ETH Zürich (Switzerland) and Imperial College (UK). Since 2016, he has been affiliated with Uppsala University, first as a postdoctoral researcher and, since 2019, in his present position.

  • Bor Gregorcic and Giulia Polverini (UU/IFA): A playful introduction to machine learning: creating an image classifier

Abstract: In this workshop, you will get hands-on experience with image classifiers (without any coding!).

  • Christian Glaser (UU/IFA): Deep learning courses at IFA
    Abstract: Christian, who has developed and teaches the courses 1FA370: Applied Deep Learning in Physics and Engineering and 1FA006: Advanced Applied Deep Learning in Physics and Engineering, will present the courses’ philosophy, layout and content.
  • Martina Laurenza (UU/IFA): Machine Learning for Background Suppression in High Energy Physics
    Abstract: This tutorial provides an introductory overview of using Machine Learning (ML) for background suppression in High Energy Physics (HEP). Participants will have the opportunity to modify scripts and tune algorithm parameters, observing how these changes affect the discrimination power of the models. The session offers a hands-on first look into applying ML techniques for background suppression in HEP analyses.
  • Gerd Kortemeyer (ETH Zurich): Ethel - An AI-Based Virtual Teaching Assistant at ETH Zurich
    Abstract: When we think of AI today, chatbots often come to mind first. However, in higher education, the potential applications of AI go far beyond bots. Examples include personalized feedback on complex assignments, support for exam grading, targeted practice exercises, assistance in interactive lectures, accessible preparation of teaching materials, and help with programming tasks. With a particular focus on physics education, this talk presents the experiments and practical experiences from the Ethel project at ETH Zurich, which is now running in its third semester and currently serving over 2000 students in 12 courses.
    Gerd Kortemeyer is a member of the rectorate of ETH Zurich and an associate of the ETH AI Center. He is also an Associate Professor Emeritus at Michigan State University. He holds a Ph.D. in physics from Michigan State University, where he taught for two decades. His research focusses on technology-enhanced learning of STEM disciplines; currently, he is advancing the research and development of AI-based tools and workflows for teaching, learning, and assessment.

Codebase

Ethel: A virtual teaching assistant

Using artificial-intelligence tools to make LaTeX content accessible to blind readers

Grading assistance for a handwritten thermodynamics exam using artificial intelligence: An exploratory study

Assessing confidence in AI-assisted grading of physics exams through psychometrics: An exploratory study

Titles and abstracts for talks: Friday

  • Ekta Vats (UU/IT): Introduction to LLM
    Abstract: Large Language Models (LLMs) have revolutionized natural language processing, but their potential extends far beyond text understanding. This session delves into LLMs, exploring their theory and practical applications. We will explore the convergence of language and images, establish basic knowledge of LLMs, and understand how LLMs and their variants can enhance various tasks, such as OCR/HTR, image captioning, visual question answering, image generation, and beyond. The ethical considerations and potential biases inherent in deploying LLMs for vision tasks will be discussed, providing insights into their capabilities and limitations.
    Ekta Vats is an Assistant Professor in Machine Learning, Docent in Computerised Image Processing, and a Beijer Researcher at The Beijer Laboratory for Artificial Intelligence Research. She leads the Uppsala Vision, Language and Learning group whose research mission is to build fundamental AI/ML methods for computer vision and language modeling to address societal challenges. She teaches several AI/ML courses at Uppsala University, and has worked as an AI Scientist at Silo AI (now part of AMD).
  • Yong Sheng Koay and Eliel Camargo-Molina (UU/IFA): How to do NLP-like research in physics
    Abstract: Using our recent experiments with transformers for QFT Lagrangians as an example, we will discuss transformer model selection, data preparation, training workflows, and evaluation choices. The session aims to provide practical guidance for leveraging state-of-the-art transformer models in interdisciplinary research.
  • Giulia Polverini (UU/IFA): Performance of Large Multimodal Models on physics tasks involving visual representations
    Abstract: The recent advancement of chatbots to process not only text but also images significantly broadens their potential applications in physics education. This is particularly relevant given that physics relies heavily on multiple forms of representation, such as graphs and diagrams. To assess the true reasoning and visual abilities of these tools, we evaluated their performance on conceptual multiple-choice physics tests across various topics and languages. In this presentation, I will discuss some of the strengths and limitations identified in our study, insights that are valuable for the thoughtful and effective integration of these AI tools into physics education.
  • Bor Gregorcic (UU/IFA): LLM course at IFA
    Abstract: In 2025, the Division of Physics Education Research has developed an introductory course in AI for educators in the natural sciences. We will present the course philosophy, layout and content.
  • Gerd Kortemeyer (ETH Zurich): Teacher Lunch: Cheat sites and artificial intelligence usage in online introductory physics courses
    Abstract: As a result of the pandemic, many physics courses moved online. Alongside, Internet-based problem-solving sites and forums grew in popularity. With the emergence of large language models, another shift occurred. How has online help-seeking behavior among introductory physics students changed, and what is the effect of different patterns of online resource usage? In a mixed-method approach, we investigated student choices and their impact on assessment components of an online introductory physics course for scientists and engineers. A year ago, we found that students still mostly relied on traditional Internet resources and that their usage strongly influenced the outcome of low-stakes unsupervised quizzes. We empirically found distinct clusters of help-seeking and resource-usage patterns among the students; the impact of students’ cluster membership on the supervised assessment components of the course, however, is nonsignificant. Today, there is evidence that students have shifted almost completely to AI.

Cheat sites and artificial intelligence usage in online introductory physics courses: What is the extent and what effect does it have on assessments?

  • Jos Cooper (ESS): AI (and beyond) in neutron scattering – the state of the art, and current challenges
    Abstract: Neutron scattering is a very well established field, providing unrivalled information about materials since 1930. It is only relatively recently, however, that artificial intelligence started to be used in neutron scattering, with one of the first conferences being held in 2019. The introduction of AI has not revolutionised the field in the way that some areas of astronomy, or structural biology have been. However, applications have been slowly increasing, and it is widely recognised as incredibly important for the future. In this talk I will highlight several projects using AI in neutron scattering and go through some of the challenges which currently exist in the field. I will finish with an outlook, and some predictions about where AI will benefit researchers the most.
    Jos Cooper is the project lead on the ESTIA neutron reflectometer, being built at the European Spallation Source, a large research infrastructure that will be the newest and most powerful neutron source in the world. He has a background in thin-film magnetism, electrochemistry, and neutron reflectometry, beginning with his PhD at Cambridge, continuing at the ISIS neutron and muon source as an instrument scientist, and leading to his current position building the next generation of neutron reflectometer in Lund. He has had an active interest in AI, and in how it can help advance neutron science, since 2018. His ongoing research, in collaboration with Uppsala among others, investigates how to optimize experiments at large-scale facilities to make the best use of these invaluable resources.

Speaker: Jennifer Ngadiuba (Fermilab)

Title: Big Data, Fast Decisions: Real-Time Machine Learning to Accelerate Scientific Discovery

A seminar on machine learning techniques for particle physics experiments, followed by a mingle event with flash talks about research using AI/ML at the Department of Physics and Astronomy.

Abstract: In the era of big data, the ability to make rapid, data-driven decisions is transforming scientific research across disciplines. At the forefront of this revolution is fast machine learning, which enables real-time insights at unprecedented scales. This talk will explore cutting-edge techniques in fast machine learning for high-energy physics, focusing on real-time data processing and decision-making. I will discuss the integration of AI at the edge, recent advancements in algorithms, and how these innovations are accelerating discovery in fundamental physics. From identifying rare events to optimizing complex systems, fast machine learning is pushing the boundaries of what is possible in scientific exploration.


Speaker: Pietro Vischia

Title: Differential Programming at the Frontiers of Computation

In this seminar, proof-of-concept results for the gradient-based optimization of experimental design will be presented. The workshop mixes theoretical lectures with practical exercises in differentiable programming.

Designing the next generation of colliders and detectors involves solving optimization problems in high-dimensional spaces, where the optimal solutions may nest in regions that even a team of expert humans would not explore. Furthermore, the amount of data we need to generate to study physics for the next runs of large HEP machines, and that we will need for future colliders, is staggering, requiring a rethinking of our simulation and reconstruction paradigms. Differentiable programming enables the incorporation of domain knowledge, encoded in simulation software, into gradient-based pipelines, making it possible to optimize a given simulation setting and to perform inference through classically intractable settings.

Differentiable programming revolutionizes computational modeling by enabling the seamless integration of gradient-based optimization into complex systems, fostering advancements across scientific and engineering disciplines. The workshop will mix theoretical lectures and practical exercises, with a focus on the latter. I will illustrate how automatic differentiation provides exact derivatives, calculated numerically with almost no computational overhead, powering all of modern machine learning. Once models are fully differentiable, we will make the conceptual leap to differentiable programming, to "make the world differentiable". Finally, I will illustrate how the computational and environmental challenges of large-scale computing models require shifting to new information-encoding paradigms, delving into introductory notions and exercises on spiking networks powered by neuromorphic computing platforms and on quantum machine learning.
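The claim that automatic differentiation yields exact derivatives with almost no overhead can be seen in a minimal forward-mode sketch (our own illustration, not material from the workshop; the `Dual` class and function names are hypothetical): each value carries its derivative along, and ordinary arithmetic rules propagate it exactly, with no finite-difference approximation.

```python
import math

class Dual:
    """A number paired with its derivative: forward-mode autodiff."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(x):
    # chain rule: (sin u)' = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

def f(x):
    return x * x + sin(x)  # f(x) = x^2 + sin(x), so f'(x) = 2x + cos(x)

x = Dual(1.5, 1.0)  # seed the derivative dx/dx = 1
y = f(x)
print(y.val, y.dot)  # value f(1.5) and the exact derivative f'(1.5)
```

The derivative comes out exact to machine precision in a single forward pass, roughly doubling the cost of the computation rather than re-running it per parameter.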

Speaker: Miguel Marques (Ruhr Universität Bochum)

Title: Predicting and characterizing materials with machine learning


Contact
