Lockheed Martin's chief technology officer, Craig Martell, said the future of warfare will center on human-machine collaboration rather than fully autonomous systems. Speaking at Axios' AI+DC Summit, Martell emphasized that humans must train with AI systems to understand their limitations before deployment. He stressed personal accountability for decisions to deploy AI-powered military platforms, stating: "if it gets it wrong, my fault."
Martell's comments come as military forces increasingly adopt autonomous weapons amid intensifying debate over accountability and trust in these systems. As the Defense Department's former chief digital and AI officer, Martell brings firsthand insight into how military AI systems are being implemented. His perspective highlights the challenge of balancing technological advancement with human oversight in combat situations.
The Army recently received its first autonomous Black Hawk helicopter, developed with Lockheed Martin subsidiary Sikorsky, which can complete missions independently or under remote supervision. The aircraft is currently undergoing testing as part of a broader military push toward autonomous systems. The delivery reflects the growing centrality of drone warfare and unmanned vehicles in modern combat operations.
Martell envisions future scenarios in which pilots fly alongside swarms of autonomous aircraft that provide protection and support. This human-machine teaming approach aims to leverage AI capabilities while keeping decision-making authority with humans, an alternative to fully autonomous weapons systems that operate without human intervention or oversight.