Maj. Gen. William Hank Taylor, commanding general of the 8th Army, told Business Insider he is using AI tools like ChatGPT to make decisions that can impact thousands of soldiers.
He added that “Chat and I” have become “really close lately.” According to the Major General, he has been using AI to build models to “help all of us,” especially for predicting next steps based on weekly reports.
“Raising Concerns”? People who think like that should not be in any position of power, or be using potentially dangerous tools.
“That is an excellent suggestion, Dave. I totally agree that launching the nukes is the best option in the current situation.”
The good news is that nuclear strikes have to be authorized by the President. The bad news is that I’m pretty sure he’s running his decisions through The Validation Machine too.
Turds can float to the top if they’re full of hot air.
This is what happens when you give boomers technology
That’s all you’ve got? I can’t find his age, but based on his military career I’m guessing he is Gen X, not a boomer. Besides, the age of someone using tools like this is kind of irrelevant here. It’s his rank and power, and the fact that he is probably typing information that isn’t cleared into a system that is basically recording everything he asks or discusses with it. Add to that the fact that he is actually listening to the ramblings of a program when he has no idea what it is using to formulate its answers, and it’s even scarier. This has serious implications way beyond “boomer bad…”. It means that if the owners/programmers of ChatGPT can effectively recognize when someone in a position of power is using their tool, they could manipulate the responses to their benefit, or they could just screw with things. Either way it’s a dangerous situation, especially when the people involved have control of military forces.
It ain’t just Boomers. Gen X, Millennials, Gen Z, and Gen Alpha are all using GAI LLMs to the detriment of society as a whole.
From the limited studies on this that I’m aware of, so-called “Gen Z” is far more prone to using LLMs than my “generation” (“Gen X”).
FAR more prone.
I’m not surprised Gen Z and younger are using it the most.
Another “anyone older than me is a Boomer” dumbass. Let me guess, you’re a teenager? Nothing wrong with that, just assuming from the judgy comment.
I’m sure there are plenty of use cases for AI in something as wildly complex as what a general faces. But…
He added that “Chat and I” have become “really close lately.”
We’d have to see video to understand what he really meant there, but the quote is hella concerning. Was that a jokey kinda thing? Was he dead serious about leaning on an LLM? No way to tell from a chopped up quote, no body language or tone to read.
tl;dr: This really isn’t a story unless one is determined to read what one wants into it.
Real Terminator prequel vibes here.
Well, I guess the US military will be losing more conflicts even faster than they have historically. Seeing as LLMs can only recycle what’s already known, they’ll only hallucinate tactics, strategies, and analyses that already exist.
And we know from WWI, WWII, Korea, Vietnam, Afghanistan, etc. etc. etc. just how well fighting the last war in the current war works.
Well yeah, this will literally give control of the military to OpenAI.
I think the speed at which people seem to develop anthropomorphic relationships with a piece of software programmed to agree with them is the thing that scares me most about LLMs.
Also, y’know, this is the Express, so take this article with kilos of salt.
That knowledge is not new, but the speed at which people flock to chatbots is very impressive.
Like, without ever-present ad campaigns, even the people closest to me started using them.
It’s… scary.