Missions of the position

Institution: Université Paris-Saclay, GS Informatique et sciences du numérique
Doctoral school: Sciences et Technologies de l'Information et de la Communication
Research laboratory: Laboratoire des Signaux et Systèmes
Thesis supervisor: Zeno TOFFANO (ORCID 0000-0001-8594-3291)
Thesis start date: 2026-10-01
Application deadline: 2026-07-31T23:59:59

This doctoral project studies strategic interactions between agents that use quantum resources and must make decisions under uncertainty. The objective is to define and analyze a new class of systems, quantum games with probabilistic constraints, whose outcomes are shaped by quantum behavior, by probabilistic and logical constraints, and by noisy decision environments. A central part of this work will be to study the emergence of equilibria in these games, their representation through complementarity and variational problem structures, and their algorithmic computation. The topic connects concepts from quantum game theory, stochastic decision-making, probabilistic inference in quantum logic, and modern equilibrium-analysis methods.

This project lies at the intersection of stochastic optimization and stochastic game theory and is commonly referred to as chance-constrained games. Although the topic was initiated in the early 1960s, it has not attracted sustained attention in the literature. A main reason is that chance constraints typically render the resulting formulations non-convex, making equilibrium analysis and computation challenging. Building on our recent results on chance constraints, substantial progress has been achieved on both the theoretical and algorithmic sides, enabling new tractable approximations and solution methods for problems that were previously out of reach.

In the framework considered here, uncertainty is modeled in a unified manner that covers both classical randomness (e.g., demand, prices, renewable supply) and quantum-induced uncertainty arising from measurement outcomes, noisy dynamics, or incomplete knowledge of the underlying state. Each player's feasible and preferred outcomes are described through chance constraints that impose reliability requirements, such as meeting a payoff threshold or respecting risk limits with high probability, where probability may be classical (from a distribution on random variables) or quantum (from a density operator and a measurement model). In the quantum-probabilistic formulation, events are represented by projectors or, more generally, effects (POVM elements), and their likelihood is evaluated via operator expressions such as Tr(ρE), providing a natural bridge between chance constraints and quantum logic: the relevant events form a non-Boolean structure in which non-commutativity encodes contextuality and incompatibility of measurements. This perspective also accommodates incomplete distributional/state information in a common language, via ambiguity sets over probability measures, density operators, or noise channels, leading to robust and distributionally robust chance-constrained games.
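The Born-rule evaluation of event probabilities described above can be sketched numerically. The state and measurement below are illustrative choices, not taken from the project:

```python
import numpy as np

# Minimal sketch: the probability of a quantum event E (an effect, i.e. a
# POVM element) in a state rho (a density operator) is  p = Tr(rho E).
rho = np.array([[0.75, 0.25],
                [0.25, 0.25]])   # a valid density operator: PSD with trace 1
E = np.array([[1.0, 0.0],
              [0.0, 0.0]])       # projector onto |0>, a special case of an effect

p = np.trace(rho @ E).real
print(p)  # 0.75
```

A chance constraint in the quantum setting then takes the form Tr(ρE) ≥ α for a prescribed reliability level α, which is linear in ρ and hence amenable to semidefinite-programming formulations.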

The central methodological challenge remains the same across these settings: chance constraints introduce non-convex feasible regions, and the presence of operator-valued variables and non-commuting events can further complicate the geometry. The project therefore develops and strengthens tractable reformulations and safe approximations (convex inner/outer approximations, risk surrogates, and, when appropriate, semidefinite/operator relaxations) while preserving game-theoretic structure and equilibrium interpretability. The resulting theory and algorithms are designed to be practically impactful in active application areas, with a primary focus on energy markets, where strategic behavior under uncertainty, reliability requirements, and risk management are central, and where emerging quantum-enabled tools for secure coordination, forecasting, and optimization can be incorporated within the same chance-constrained game framework. The objectives of this project are threefold:

Existence of Nash equilibrium (quantum chance-constrained games with quantum-logic structure): We consider an n-player strategic game in which each player's action set may be finite or infinite and each player's payoff vector is random, with randomness induced by an underlying model that may be classical or quantum. In the quantum setting, uncertainty and correlations are generated by quantum states, measurements, and channels (including entanglement), so the relevant events appearing in chance constraints are naturally represented by quantum-logical propositions, e.g., projection operators (PVMs), more generally effects (POVM elements), and their algebraic relations. We formulate the game as a chance-constrained game by requiring that payoff or feasibility events hold with at least a prescribed probability, where probability is understood in the noncommutative (Born-rule) sense. The objective is to prove existence of Nash equilibria under broad regimes by leveraging quantum logic tools to model and analyze constraints and correlations, including:

Event structure: orthomodular lattices of projections, effect algebras, and the non-distributive nature of quantum propositions (to treat conjunction/disjunction of measurement-dependent events).

Correlation structure: classical dependence vs. quantum dependence via entanglement and nonlocal correlations (captured operationally by joint POVMs, tensor-product structure, and admissible commuting/noncommuting observables).

Model generality: general induced distributions from measurements, and partial specification via sets of compatible states/channels (density operators, CPTP maps).

Game types: cooperative/noncooperative and static/dynamic games, with dynamics represented by quantum operations, instruments, and sequential measurements (where the logical structure of time-ordered propositions matters).
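The non-distributive event structure mentioned in the first item above can be checked directly on small projectors. The following sketch, built on an arbitrarily chosen qubit example, computes lattice joins and meets of projections and shows that P ∧ (Q ∨ R) differs from (P ∧ Q) ∨ (P ∧ R):

```python
import numpy as np

# Sketch of the non-distributivity of the projection lattice:
# join = projector onto the span of the two ranges,
# meet = projector onto the intersection of the two ranges.

def join(P, Q):
    """Projector onto range(P) + range(Q), via an orthonormal basis from the SVD."""
    U, s, _ = np.linalg.svd(np.hstack([P, Q]))
    basis = U[:, s > 1e-10]
    return basis @ basis.T

def meet(P, Q):
    """P AND Q = I - ((I-P) OR (I-Q)), using the orthocomplement identity."""
    I = np.eye(P.shape[0])
    return I - join(I - P, I - Q)

ket0 = np.array([[1.0], [0.0]])
plus = np.array([[1.0], [1.0]]) / np.sqrt(2)
minus = np.array([[1.0], [-1.0]]) / np.sqrt(2)
P, Q, R = (v @ v.T for v in (ket0, plus, minus))

lhs = meet(P, join(Q, R))           # equals P, since Q OR R spans the whole space
rhs = join(meet(P, Q), meet(P, R))  # equals 0, since both meets are trivial
print(np.allclose(lhs, rhs))  # False: distributivity fails
```

This is exactly why conjunctions and disjunctions of measurement-dependent events cannot be manipulated with Boolean rules, and why orthomodular-lattice and effect-algebra tools are needed.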

Computation of Nash equilibrium (optimization with quantum-logic constraints and circuit/measurement parametrizations): Building on the equilibrium characterization, the candidate will develop deterministic and stochastic optimization approaches to compute equilibrium solutions when chance constraints are defined over quantum-logical events. This includes using quantum logic tools both as modeling primitives and as computational handles, for example:

Operator-theoretic formulations: expressing chance constraints in terms of expectations of projection/effect operators and translating feasibility into operator inequalities.

Convex-analytic tools: semidefinite programming (SDP) relaxations for constraints over density operators/POVMs, moment/positivity conditions from operator algebras, and distributionally robust sets over states/channels consistent with data.

Algorithmic strategy parametrizations: representing players' quantum strategies via quantum logic gates/circuits, parametrized POVMs, and local instruments; then applying best-response iterations, projected/variational methods, gradient-based learning, and sample-based (measurement-driven) stochastic approximation.

Noncommutativity-aware approximation schemes: scenario methods and risk approximations adapted to sequential/noncommuting measurements (where joint events depend on the measurement context).
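To illustrate how the last two ingredients combine, the following sketch runs best-response dynamics on a hypothetical 2x2 coordination game with Gaussian payoff noise (all numerical values chosen purely for illustration), where each player maximizes a scenario-based estimate of the probability of meeting a payoff threshold:

```python
import random

# Hypothetical 2x2 coordination game (illustrative, not from the project):
# mean payoffs for player 1 (A) and player 2 (B), indexed [a1][a2].
A = [[2.0, 0.0], [0.0, 1.0]]
B = [[2.0, 0.0], [0.0, 1.0]]
TAU, SIGMA, N = 1.0, 0.5, 2000   # payoff threshold, noise level, number of scenarios

def chance(mean, rng):
    """Scenario (Monte Carlo) estimate of the chance constraint P(mean + noise >= TAU)."""
    return sum(mean + rng.gauss(0.0, SIGMA) >= TAU for _ in range(N)) / N

def best_response(payoffs, other_action, rng):
    # Pick the action maximizing the estimated probability of meeting the threshold.
    return max(range(2), key=lambda a: chance(payoffs[a][other_action], rng))

rng = random.Random(0)
a1, a2 = 1, 0
for _ in range(20):  # best-response dynamics
    a1 = best_response(A, a2, rng)
    a2 = best_response([[B[i][j] for i in range(2)] for j in range(2)], a1, rng)

print(a1, a2)  # the dynamics settle at the (0, 0) profile
```

In this toy case the chance objective is monotone in the mean payoff, so the dynamics recover the Nash equilibrium of the underlying deterministic game; the research challenge addressed by the thesis is precisely the regimes where this structure is lost (non-convexity, noncommuting events, context-dependent joint probabilities).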

Applications (risk-aware strategic interaction with quantum-logical resources): The developed framework will be applied to domains where uncertainty, reliability requirements, and strategic interaction are central, and where quantum structure is either intrinsic or offers new capabilities. In addition to classical applications such as electricity markets under stochastic demand and renewables, we plan to apply the framework to:

Quantum networks and resource markets: strategic allocation of entanglement, repeater scheduling, quantum channel usage, and shared quantum compute resources under chance constraints on fidelity, latency, and outage, where the constraints are naturally written on quantum-logical events (successful entanglement swapping, passing a verification measurement, meeting a fidelity threshold).

Mechanisms with non-classical correlation devices: game-theoretic models in which a correlating device is implemented by shared quantum states and measurements, allowing cooperative or competitive advantage subject to probabilistic (risk) guarantees; quantum logic clarifies what events are being guaranteed and how they combine across contexts.

Robustness to device/state uncertainty: applications in which incomplete calibration (unknown states, noisy gates, imperfect measurements) leads to ambiguity sets over operators; chance constraints and equilibria are then studied with operator-algebraic and SDP-based tools. The candidate will be familiarized with the large and rich literature on stochastic game theory and stochastic optimization, together with the operator-theoretic and logical foundations of quantum probability. The project will start from our recent results and progressively broaden them to the full range of uncertainty models targeted in the first objective of the thesis, including settings where payoffs and constraints are induced by measurements and noisy dynamics. In this unified view, chance constraints remain the central tool, with probabilities evaluated either from classical distributions or, when uncertainty is quantum, through density operators and measurement effects, for instance via expressions of the form
Tr(ρΠ) for a projector Π, or more generally
Tr(ρE) for an effect E.

Alongside stochastic optimization and stochastic game theory, quantum logic will be one of the main tools of the project, complementing them by providing a principled way to represent and manipulate events as projectors or POVM elements and to account for the non-Boolean event structures arising from non-commutativity (e.g., measurement incompatibility and contextuality). The candidate will benefit from our existing results and the experience recently developed on this topic [6,7,8,9,10], and will leverage them to develop tractable approximations, equilibrium concepts, and algorithms that combine chance-constrained methods, robust/distributionally robust modeling, and game-theoretic analysis within a framework that accommodates both classical and quantum notions of uncertainty.

Candidate profile

The candidate must have the following skills:

1. A Master's degree in computer science with solid foundations in mathematics and probability theory, or a Master's degree in applied mathematics with knowledge of convex optimization. Knowledge of mathematical logic and quantum computing is a plus.

2. Programming skills in Python.

Apply on the recruiter's website.
