In this study, a formal framework designed to manage the interaction between a human operator and a team of unmanned aerial vehicles (UAVs) is experimentally tested. The goal is to enhance human performance by controlling the interaction between agents based on online monitoring of the operator’s mental workload (MW) and performance. The proposed solution estimates MW with a classifier applied to cardiac features. The classifier output is introduced as an observation of the human MW state in a Partially Observable Markov Decision Process (POMDP), which models the human-system interaction dynamics and aims to control the interaction so as to optimize the human agent’s performance. Based on the current belief state about the operator’s MW and performance, together with the mission phase, the POMDP policy determines which task, if any, should be suggested to the operator, assuming the UAVs are capable of supporting the human agent. The framework was evaluated in an experiment in which 13 participants performed two search and rescue missions (one with and one without adaptation) under varying workload levels. In accordance with the literature, when the adaptive approach was used, participants reported significantly lower MW, physical and temporal demand, frustration, and effort, and their flying score also improved significantly. These findings demonstrate how such POMDP-based adaptive interaction control can improve performance while reducing operator workload.
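To make the belief-state mechanism concrete, the sketch below illustrates a minimal discrete POMDP belief update in Python, where the cardiac-feature classifier's output plays the role of the observation. The two-state workload space, the transition matrix T, and the observation (confusion) matrix Z are illustrative assumptions, not the paper's actual model, which additionally tracks operator performance and mission phase.

```python
# Minimal sketch (not the authors' implementation): a Bayesian belief update
# over discrete operator mental-workload states, using the cardiac-feature
# classifier's output as the POMDP observation. All matrices are placeholders.
import numpy as np

STATES = ["low_MW", "high_MW"]                  # hypothetical operator MW states
OBS = ["classified_low", "classified_high"]     # classifier outputs

# T[s, s'] = P(s' | s, a): assumed workload dynamics under one chosen action
T = np.array([[0.8, 0.2],
              [0.3, 0.7]])

# Z[s', o] = P(o | s'): assumed classifier confusion model (observation function)
Z = np.array([[0.85, 0.15],
              [0.20, 0.80]])

def belief_update(belief, obs_idx):
    """One POMDP belief update: predict with T, correct with Z, then normalize."""
    predicted = T.T @ belief                    # prediction step
    corrected = Z[:, obs_idx] * predicted       # weight by observation likelihood
    return corrected / corrected.sum()

# Example: start uncertain, then observe "classified_high" twice in a row
b = np.array([0.5, 0.5])
for o in [1, 1]:
    b = belief_update(b, o)
print(dict(zip(STATES, b.round(3))))            # posterior belief over MW states
```

In the full framework, the resulting belief, combined with the estimated performance and the current mission phase, would feed the policy that decides which task, if any, to suggest to the operator.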