Publication

Error-Sensitive Dynamic Calibration for Online Optimization with Switching Cost

Oloko, Toye
Talata, Zsolt
Hashemi, Morteza
Abstract
Neural networks (NNs) have achieved state-of-the-art performance in a wide range of applications; however, their deployment in online settings often exposes a lack of robustness under dynamic and uncertain conditions. In contrast, traditional online algorithms offer formal robustness guarantees, yet generally do not attain the predictive accuracy of modern neural network models. To address this trade-off, we propose a novel neural network architecture that incorporates a dynamic calibration layer, designed to enhance robustness in online environments without sacrificing predictive performance. The proposed dynamic calibration layer consists of a differentiable optimization component that adjusts the NN output in real time, enabling end-to-end training via standard backpropagation. We provide theoretical analysis establishing performance bounds relative to an offline optimal benchmark. We then leverage these bounds to constrain the function class used for dynamic calibration, ensuring both practical feasibility and theoretical soundness. Empirical evaluation is conducted on a real-world case study involving data center energy management. Comparative results against existing online and hybrid approaches demonstrate that our architecture consistently outperforms baseline methods, both in average performance and in robustness, as measured by the competitive ratio.
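The abstract describes a predictor network whose output passes through a differentiable calibration layer before an action is taken in an online problem with switching cost. The PyTorch sketch below illustrates that general shape under stated assumptions: the calibration rule (a learnable convex combination of the NN prediction and a robust baseline action) and the quadratic hitting/switching costs are illustrative stand-ins, not the paper's actual formulation or bounds.

# Minimal sketch (assumed, not the paper's method): an NN prediction is blended with a
# robust baseline by a differentiable calibration layer, and the whole pipeline is trained
# end-to-end against a per-round cost that includes a switching penalty.
import torch
import torch.nn as nn


class DynamicCalibrationLayer(nn.Module):
    """Blend the NN prediction with a robust baseline action.

    calibrated = lam * prediction + (1 - lam) * baseline, with lam in (0, 1) learned
    end-to-end. Constraining lam bounds how far the calibrated action can drift from
    the baseline; this stands in for the paper's constrained calibration function class.
    """

    def __init__(self):
        super().__init__()
        self.logit = nn.Parameter(torch.zeros(1))  # lam = sigmoid(logit)

    def forward(self, prediction, baseline):
        lam = torch.sigmoid(self.logit)
        return lam * prediction + (1.0 - lam) * baseline


class CalibratedPredictor(nn.Module):
    def __init__(self, in_dim, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )
        self.calibrate = DynamicCalibrationLayer()

    def forward(self, context, baseline):
        return self.calibrate(self.net(context), baseline)


def online_cost(action, target, prev_action, switch_weight=0.5):
    """Per-round cost: quadratic hitting cost plus a switching cost on consecutive actions."""
    hit = (action - target) ** 2
    switch = switch_weight * (action - prev_action) ** 2
    return (hit + switch).mean()


if __name__ == "__main__":
    torch.manual_seed(0)
    model = CalibratedPredictor(in_dim=4)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    prev = torch.zeros(8, 1)
    for t in range(100):  # toy online stream
        ctx = torch.randn(8, 4)
        target = ctx.sum(dim=1, keepdim=True)   # synthetic per-round target
        baseline = prev.detach()                # robust choice: stay near the last action
        action = model(ctx, baseline)
        loss = online_cost(action, target, prev.detach())
        opt.zero_grad()
        loss.backward()
        opt.step()
        prev = action.detach()
    print(f"final per-round cost: {loss.item():.3f}")

Because the calibration layer is an ordinary differentiable module, gradients flow through it to the predictor, which is what makes the end-to-end training described in the abstract possible.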
Description
These are the slides from a presentation given at the Allerton Conference, held at the University of Illinois Urbana-Champaign on 09/19/2025.
Date
2025-09-19
Publisher
University of Kansas
Keywords
Online Convex Optimization, Error-Sensitive, Dynamic Calibration