The Air Force has quietly assembled an “AI Action Team” to help leaders think through ethical challenges, develop literacy in the force and identify new applications for the tech, the service’s senior enlisted leader said this month.
Among those new uses, CMSAF David Wolfe said, was promotion board screening and ranking. Speaking May 8 at an event hosted by the Military Officers Association of America, Wolfe suggested that smart computing could help the service improve in what he called a longtime area of weakness: employing the best personnel.
“We don’t really do talent management in the Air Force; we do replacement management,” he said. “And that’s on us, to try to get way better at that.”
The administrative and policy requirements that come alongside aligning people with jobs have become “way too long of a list,” Wolfe said.
“I get why we do that, but what ends up happening quite frequently is, it might not be the right person at the right place, and we have definitely got to do better,” he said.
Last December, within a week of stepping into his position as the service's senior enlisted airman, he began assembling the AI Action Team.
The move, Wolfe said, was spurred by an audience question at a service forum.
“We put, initially, about 30 people on the team — the best from around the Air Force, officer [and] enlisted. Talent is all we want,” he said, adding that candidates with knowledge of current AI developments were prioritized.
“We’ve grown to about 100 people on this team now, and we’re getting ready to roll out what I think will be some meaningful training … to get everybody to a baseline of AI literacy.”
That work, he said, will help the service better explore practical uses for AI as well as ethical and safety guardrails.
“We take our already awesome people and we just level them up and make them even more capable than they already are by automating processes that they don’t have to put a bunch of time and effort into anymore,” he said.
Current experiments with automating portions of the Air Force officer promotion board process aim to find those efficiencies, he said.
The service is “not letting AI pick [officers], but automating the processes that happen in the background so that when the human looks at it, it’s easy to see, easy to discern and gives us a better chance of making a really good decision as we start to really dive in,” Wolfe said.
Army officials announced last fall that the service was integrating AI into promotion boards to screen out non-competitive candidates and reduce the number of decisions that had to be made by humans.
And Navy officials said in late April that they were expanding a pilot program that recommended next jobs for sailors based on their skills and experience.
As the services contend with a Pentagon mandate to accelerate use and integration of AI tools, leaders have expressed eagerness to use technology to move faster, while acknowledging concerns about trusting it fully.
When the Army announced earlier this year it would use AI tools to update doctrine, a leader of the effort compared AI to a “resourceful and motivated young officer who might not know all the information, but they can certainly assist you in cutting some corners.”
Speaking May 9 at the same MOAA forum, Master Chief Petty Officer of the Navy John Perryman said his service had seen success with an AI orientation class that about 500 sailors had taken so far.
“Large language models and things like that, with a little bit of training, almost anybody can use,” he said. “And the benefits of getting more people using that technology, you know, we’re just beginning to scratch the surface of that.”