Connecting veterans with resources and each other


The good news is: there are a thousand resources for veterans out there, went the reasoning. The bad news is there are a thousand resources for veterans out there.

This was the rationale behind the launch, in the early summer of 2017, of a multi-platform digital resource for military veterans — intended to vet (pun intended) those resources, then make them more easily accessible to men and women leaving military service.

Many of those former military people, studies showed, were finding themselves “lost in transition.”

After interviewing then-Senator Johnny Isakson (R-GA), chair of the Senate Veterans Affairs Committee, for ConnectingVets.com.

To help serve this audience of those who had served, CBS Radio (which some months later was merged into the firm Entercom) assembled a team of more than a dozen digital journalists, most of them veterans themselves, some the spouses and family members of vets.

Preparing to interview Senator Jon Tester (D-MT), then-ranking member of the Senate Veterans Affairs Committee.

I was fortunate to be brought onto the project as managing editor of ConnectingVets.com, the products of which (videos, articles, podcasts, and live radio programs) can be found on Facebook, YouTube, Instagram and Twitter.

One of my first responsibilities was recording interviews for the project with each of the leading members of Congress dealing with issues impacting U. S. veterans.

Talking with Congressman Phil Roe (R-TN), then-chair of the House Veterans Affairs Committee.

The first lesson I took from those conversations: at a time of hyper-partisanship on Capitol Hill, when lawmakers gather to discuss issues impacting military veterans, there is far less rancor than during debate over most other topics.

With then-Congressman Tim Walz (D-MN). At the time, the retired command sergeant major of his state’s National Guard was ranking member of the House Veterans Affairs Committee.

Members from both major parties seemed willing to entertain each other’s proposals — and appeared commonly interested in ensuring that the Department of Veterans Affairs and other government agencies provide vets with quality care and services.

In the last year of my tenure with the team, we were able to present ConnectingVets reporting and resources to a national broadcast audience — as I created, produced and hosted daily Eye on Veterans reports and a two-hour weekend program on CBS News Radio.

On the Air at 13 … and some of the fun that followed


On September 23, 1971, there I was: prepubescent, wearing braces — and on the radio. If you think I looked goofy, wait until you hear how I sounded:

https://soundcloud.com/chas-henry-media/ktao-first-radio-broadcast-1971-excerpt?utm_source=clipboard&utm_medium=text&utm_campaign=social_sharing

A few years later, I was in the Marine Corps, stationed alongside Kaneohe Bay, Hawai’i. During nights and weekends — when I wasn’t deployed aboard ships in the Western Pacific or Indian Ocean — I worked part-time as a disc jockey in Honolulu. (My younger sister tells me I was a lot more fun back then.) Here I was — October 8, 1978 — as “Charlie O’Henry” on Top 40 powerhouse KKUA.

In 1981, the Marine Corps decided I’d been allowed enough (too much?) time in paradise. On May 30 of that year, I tried to depart the island airwaves with a bit of mystery and panache.

Desert Storm


Because my own memories of it remain vivid, it is difficult to fathom the number of years that have passed since the 1991 Gulf War.


As we attacked through fields of burning oil wells on our way toward Kuwait City, the sky — even at midday — was pitch black. (Photo/George Spear)

It is sobering, too, to consider the fighting that has transpired across Southwest Asia since Operations Desert Shield and Desert Storm. Where does Gulf War I fit in the context of those subsequent conflicts?

On the 25th anniversary of Operation Desert Storm, I collected reflections from a number of fellow Gulf War veterans — among them former Secretary of State Colin Powell. In 1991, he chaired the Joint Chiefs of Staff.

First, most agreed, the actions in 1991 provided Americans evidence that their military forces had achieved a level of effectiveness superior to a low ebb experienced in the aftermath of the Vietnam war.

Powell described Desert Storm as significantly different from wars before or since in that the diplomatic goals it supported were so limited — as was the space and time in which fighting took place.

Others with whom I spoke pointed out that — because of real-time coverage on CNN — many Americans felt they experienced Operation Desert Storm in a more first-hand fashion than they had previous wars.


In January 2016, recalling the Gulf War with retired General, and former Secretary of State, Colin Powell.  In 1991 — as chairman of the Joint Chiefs of Staff — he was senior military advisor to then-President George H. W. Bush. (Photo/Pat Piper)

Then there was the introduction of new technologies on the battlefield. The Gulf War was the first conflict in which Global Positioning Systems played a significant role — allowing ground forces quick, precise navigation over desert terrain devoid of such features as hills, forests, rivers and lakes.

Powell told me that the military had so few GPS systems in supply warehouses at that time that officers were sent to Radio Shack stores across the U. S. — to buy every additional unit they could find.

In 2016, I gathered these and more observations about the Gulf War into a series of radio reports for Westwood One News. Each is no longer than two minutes.  Click here to listen.

Veterans on Veterans Day


Early into my work with Westwood One News,  I was afforded the opportunity to spend considerable time with a variety of military veterans — sharing their reflections on Veterans Day and the veteran experience in a series of 11 reports, each about a minute long.

Participants ranged from college students to national leaders — discussing how they view their service and the day set aside to honor it. Individuals in the series mulled the nature of military service, how it is viewed by those who have served in uniform and those who have not — and the challenges faced when transitioning from it back to life in civil society. Many offered advice on how to make the most of benefits available to those who have served. One had created a web site aggregating very practical ways non-veterans can turn Veterans Day into a day of meaningful service to those they would like to honor.

Click here to listen to the series as it was broadcast on Veterans Day 2015.

Marine Corps Marathon Hall of Fame Induction


When I was first posted as a Marine to the Washington, D. C. area, in 1989, a friend recommended I volunteer with a group that helped publicize the Marine Corps Marathon.

It was still a time in the event’s history when not all of the Corps’ top leaders thought the considerable effort required to put it on was worthwhile.


Colonel Joseph Murray — then-commander of Marine Corps Base, Quantico — passes along the etched Tiffany crystal pyramid that comes with Hall of Fame membership. (Photo/Cpl. Timothy Turner)

In those early years, I produced TV public service announcements to encourage spectators to come out and cheer on the field of runners. After retiring from military service in 1996, my work as a broadcast journalist kept me associated with the event.

Over the years, I have interviewed dozens and dozens of interesting participants — and broadcast live, from-the-course reports on radio stations WTOP and WNEW. Most recently, I’ve co-hosted NBC Sports Washington race day morning broadcasts from the starting line.  On October 23, 2015, the organizers of the event kindly inducted me as the 38th member of the Marine Corps Marathon Hall of Fame.

Enjoying a race day morning with co-anchor Jill Sorenson.



Memories of running the Marine Corps Marathon in 2001. It was a poignant, as well as challenging, experience. We ran past the Pentagon, which had been the target of a terrorist attack just weeks before. As we passed it, we cheered — and were cheered on by — construction workers already rebuilding.

Robot Wars: A Documentary about Drones


In future wars, will human soldiers be replaced by weapons that think for themselves?

Lots of remotely controlled systems are already on the battlefield.  In 2012, I spoke with scientists, analysts — and the nation’s top military officer — about how remote engagement and autonomous systems might be changing the American way of war.  Here is the CBS Radio documentary that resulted from those conversations.

When we humans go to war, our least favorite way is hand to hand, face to face.

“It speaks to human nature,” says Massachusetts Institute of Technology Professor Missy Cummings, a former Navy fighter pilot. “We don’t really like to kill, and if we are going to kill, we like to do it from far away.”

Over centuries, that has led to the creation of weapons that allowed us to separate ourselves from our adversaries — first by yards, then miles. Now, technology allows attacks from half a world away.

Until a decade ago, most of the remote engagement capability was owned by the U. S. or Israel. Not anymore.

Unmanned platforms – in the air, on the ground, and on or under the water — are becoming less and less expensive. So are the sensors that help guide them. And nanotechnology is making them smaller.

Today, U. S. soldiers in Afghanistan launch throw-bots into the air by hand, and mini-helicopters deliver frontline supplies by remote control. Adding artificial intelligence to the mix, we are now seeing some platforms operating without even remote human control. An unmanned aircraft flown by an onboard computer recently refueled another unmanned plane – in the air – as it, too, flew completely on its own.

These tools of remote engagement are already changing modern battlefields. And some people worry we may not be giving enough thought to how much they’re going to change things.

Simon Ramo has been thinking about this sort of thing for a long time. At 99 years old, he knows something about national security. Remember the defense firm TRW? He’s the R.

“A huge revolution in cost, in loss of lives, takes place,” says Ramo, “if you go to the partnership of man and machine — and let the robots do the dying.”

Such a partnership, he says, does more than save life and limb. It also saves the huge expense of maintaining a big military presence overseas.

Peter Singer of the Brookings Institution agrees that remote engagement allows modern military forces to “go out and blow things up, but not have to send people into harm’s way.”

But he says robot wars are much more complex than that.

“Every other previous revolution in war has been about a weapon that changed the how,” says Singer. “That is, a machine or system where it either went further, faster, or had a bigger boom.”

Robots, he says, fundamentally change who goes out to fight very human wars.


Chas discusses the rise of drones on the battlefield, and potential dangers of autonomous weaponry, with then-Joint Chiefs Chair General Martin Dempsey.

“It doesn’t change the nature of war,” says General Martin Dempsey, chairman of the Joint Chiefs of Staff. “But it does in some ways affect the character of war.”

The nature of war, says Dempsey, is a contest of human will. The character, on the other hand: “What do you intend? How do you behave with it? And then what’s the outcome you produce?”

“This is not a system which we’ve just simply turned loose,” says the general. “It’s very precisely managed, and the decisions made are made by human beings, not by algorithms.”

What capability are those humans managing? Battlefield commanders say — most importantly: an ability to provide persistent surveillance and the intelligence that comes from it.

“When you have an aircraft that can fly over an evolving battlefield, and in an unblinking way observe the battlefield,” says Air Force Lieutenant General Frank Gorenc, “they have the ability to describe to manned aircraft that are coming in, that can provide the firepower, much more accurate data.”

Commanders whose unmanned systems roam on the ground or in and under water gain similar benefits. That’s why many say “don’t call them drones.” In military terminology, drones are dumbed-down vehicles capable of following only a predetermined path. In the air, pilots in smart planes used drones as targets. So while most people around the world have come to call them drones, the people operating them prefer the term unmanned systems.

Well, some of them. General Gorenc says even if there is no one in the driver’s seat, it takes a lot of humans to keep the systems working. “There’s hardly anything unmanned about it,” he says, “even in the most cursory of analysis. So it takes significant resources to do that mission.” A mission that is possible because as the vehicles have developed, so too have the sensors providing them an understanding of precisely where they are at any given time, and optics that have improved the images they collect and send back.

Besides loitering for hours or days over places commanders want to keep an eye on, what can these systems do? We will likely see more unmanned craft delivering supplies — meaning air crews or truck convoys will be put in less danger.

Dempsey says it is possible, too, that a wounded soldier could soon be bundled inside a remotely piloted aircraft for evacuation to a field hospital.

“Logistics resupply and casualty evac could certainly be a place where we could leverage technology and remote platforms,” he says.

And of course, as Georgetown University Professor Daniel Byman notes, some unmanned systems — most notably the Predator drone — can kill.

“It’s that persistent intelligence capability, to me,” says Byman, “that enables the targeting of individuals — where before you wouldn’t — in part because of the risk to the pilot, but also in part because you weren’t sure what else you might hit. And now you can be, not a hundred percent confident, but more confident than you were.”

There has been controversy about the two ways those drones deal death — by targeted or signature strikes.

“A targeted strike is based on a positive identification of a particular individual or particular group of individuals,” says Christopher Swift of the University of Virginia’s Center for National Security Law, “whether they’re moving in a convoy, or whether they’re at a fixed location, or whether they’re out on the battlefield.”

Signature strikes, on the other hand, use sensors to watch for trends of behavior that seem suspicious, then launch an attack when it appears — to a computer algorithm — that the series of behaviors points to bad guys doing, or getting ready to do, bad things.

Signature strikes bring with them a greater risk of killing or wounding people seen as innocents. And death by remote control can be perceived as callous, prompting a backlash.

While recently in Yemen, Swift talked with a number of tribal leaders about the unmanned system attack that killed terrorist provocateur Anwar al-Aulaqi.

“They were more concerned about the drone strike on his 16-year-old son,” says Swift, “because they saw him as a minor, rather than as a militant, and there was some sympathy for him” — even though Swift says many of the same people thought the boy’s father got what he deserved.

Some civil liberties groups challenge the legality of both targeted and signature strikes. But Swift says he believes that “international law is not a restraint on our ability to do it. It’s a series of guidelines that tell us the things we should avoid in order to do these kinds of operations better.”

A key aspect of better, says Swift, is ensuring that remote engagement is always paired with human contact.

“You can’t get to the human dimension of managing these political and social relationships at a local level,” he says, “and understanding how local people see their own security issues if we’re just fighting these wars using drones, if we’re fighting from over the horizon.”

Not everyone acquiring unmanned craft will be concerned about tactical nuance. Reports in early October, for instance, indicated that Hezbollah fighters may have begun using an unmanned surveillance craft — flying it over sensitive sites in Israel.

Who is selling to customers on U. S. and Israeli “no sale” lists? China is in the game.

“They have imported, and actually stolen, a lot from Russia,” says Siemon Wezeman, who researches proliferation of unmanned systems at the Stockholm International Peace Research Institute. “They are now really on the way of developing technology which is getting on par with what you would expect from Western European countries.”


In Sweden, Chas visited with Siemon Wezeman of the Stockholm International Peace Research Institute. Wezeman had created a comprehensive list of where military-related drone technology was being used around the world, and how.

And Wezeman says more and more nations and groups are shopping for the technology.

“You see in the last few years even poor and underdeveloped countries in Africa getting involved in acquiring them, and in some cases even thinking about producing them.”

According to Wezeman, the majority of presently-available unmanned aerial vehicles (UAVs) are the sort used for surveillance. “Most of them still are unarmed. There are very few armed UAVs in service. But the development is in the direction of armed UAVs.”

In some ways, remote controlled war could prove a more effective tactic for small groups of bad guys, says National War College Professor Mike Mazarr — offering personal opinions on the topic, not necessarily those of the Defense Department.

“I think very often the U. S. is going to be trying to use them to achieve big national-level goals that are very challenging and difficult,” says Mazarr. “And other actors are going to be trying to achieve much more limited, discrete goals — to keep us from doing certain things.”

The use of any robots scares some people who worry about machines making potentially disastrous mistakes. Advocates of the technology offer the reminder that to err is human.

“Who makes more mistakes: humans or machines?” asks Byman. “The answer, of course, is: it depends. But often machines can avoid mistakes that humans would otherwise make.”

“It may take a human to do a final check on an engine, or turning the last centimeters on a screw,” says Dean Cheng, an analyst at the Heritage Foundation, “but getting the screws to that mechanic could well become a robotic function. And it would be faster, and probably more accurate.”

Robotic accuracy could bring improved safety to even manned aircraft when it comes to taking off and landing.

Retired Rear Admiral Bill Shannon, who until recently oversaw unmanned aircraft initiatives in the U. S. Navy, says, when onboard robotic systems interact with GPS and other sensor data, planes automatically “know their geodetic position over the ground. They land with precision, repeatable precision, regardless of reference to the visual horizon.”

Cummings adds that the U. S. Air Force, at first, insisted that human operators control the take-offs and landings of its remote aircraft. Those operators turned out to be more accident-prone than robotic systems. “From Day One,” she notes, “all the Army’s UAVs had auto land and take-off capability. And as a consequence they haven’t lost nearly as many due to human error in these areas.”

Still, after watching failures in some other supposedly smart systems — automated trading software on Wall Street, for instance — many say they fear movement toward unmanned systems that think for themselves.

“If you optimize [these systems] to work very quickly,” says Byman, “to try to take shots that we’d otherwise miss — you’ll make more mistakes. If you optimize them to be very careful, you’ll miss opportunities. So there are going to be costs either way.”

The U. S. Army is funding research at Georgia Tech into whether it is possible to create an “artificial conscience” that could be installed in robots operating independently on a battlefield.

“There’s nothing in artificial intelligence or robotics that could discriminate between a combatant and a civilian,” says Noel Sharkey, a professor at the University of Sheffield, in the UK. “It would be impossible to tell the difference between a little girl pointing an ice cream at a robot, or someone pointing a rifle at it.”

“As you begin to consider the application of lethal force,” Dempsey adds, “I think you have to pause, and understand how to keep the man in the loop in those systems.”

So what if a battlefield robot does go haywire? Who is responsible?

“How do you do legal accountability when you don’t have someone in the machine?” worries Singer. “Or what about when it’s not the human that’s making the mistake, but you have a software glitch? Who do you hold responsible for these incidents of ‘unmanned slaughter,’ so to speak?”

“It could be the commander who sent if off,” speculates Sharkey. “It could be the manufacturer, it could be the programmer who programmed the mission. The robot could take a bullet in its computer and go berserk. So there’s no way of really determining who’s accountable, and that’s very important for the laws of war.”

That is why Cummings thinks we will not soon see the fielding of lethal autonomous systems. “Wherever you require knowledge,” she observes, “decisions being made that require a judgment, require the use of experience – computers are not good at that, and will likely not be good at that for a long time.”

Those who chafe at what they call a lack of imagination in the use of robots, though, say that should not stop or slow the integration of such systems in areas where they can do better than humans.

“There are some generals who assume that the role of robots is to help the human being that they assume is still going to be there,” says Ramo. “We’re talking about warfare being changed so that you should quit thinking about the soldier. He shouldn’t be there in the first place.”

Critics also say robots should not necessarily look like people — pointing to a robot being created to fight fires onboard Navy ships. It walks around on two legs, about the height of a sailor carrying a fire hose.

Shannon says problems sometimes result when people who built manned systems try to create something similar, just minus the human. He encountered the phenomenon with designers determining what visual information would be available to those piloting unmanned aircraft from the ground.

“They don’t need to give the operator the pilot’s view,” he says. “They can give them, for example, a God’s-eye view of the air vehicle and the sensors interacting with the environment — as opposed to a very, very narrow view of what a pilot might see as they look out their windscreen.”

Shannon says he would frequently look for innovative design ideas from people not tied to systems built around human pilots. “Often I see it when I get someone who’s come from outside of aviation,” he says — someone with experience “for example, creating that environment in the gaming industry.”

The brave new world of robot wars could well require the nation to field a new type of warrior, as well.

“The person who is physically capable and mentally capable of engaging in high-risk dogfights,” notes Byman, “may be very different from the person who is a very good drone pilot.”

Cummings anticipates some in the military will find it difficult to accept such a shift. “Fundamentally, it raises that question about value of self,” she says. “’If that computer can do it, what does that make me?'”

In the end, robots thrown into war efforts are put there for one reason: to win. Would it be possible to win a war by remote control?

“You could put together an elaborate strategy,” muses Mazarr, “that would affect the society, the economy, the national willpower of a country that, I could certainly imagine — depending on what was at stake, the legitimacy of its government, a variety of other things — of absolutely winning a war in these ways.”

The nation’s top military officer is not so sure. “It’s almost inconceivable to me,” says Dempsey, “that we would ever be able to wage war remotely. And I’m not sure we should aspire to that. There are some ethical issues there, I think.”

Another ethical consideration is raised by those who worry that remote engagement seems “bloodless” to those employing it.

“It always creates the risk that you’ll use it too quickly,” notes Byman. “Because it’s relatively low cost, and relatively low risk from an American point of view, [it’s possible] that you’ll be likely to use it before thinking it through. Use it even though some of the long term consequences might be negative.”

“You could increasingly be in a world where states are constantly attacking each other,” suggests Mazarr — “in effect, in ways that some people brush off and say, ‘well, that’s just economic warfare,’ or ‘it’s just harassment,’ but others increasingly see as actually a form of conflict.”

Finally, it is worth noting that the sensor information, so important to controlling unmanned systems, flows through data networks — webs susceptible, at least in theory, to being hacked.

“When you’re in the creation of the partnership of human beings and robots, you’re into cyber warfare,” says Ramo, “and you’ve got to be better than your enemies at that, or your robotic operations will not do you very much good.”

Susceptibility to being attacked with remote systems leads Mazarr to ask if the U. S. — with its highly interlinked, interdependent economy — might do better to try to limit the use of remote controlled systems, rather than expanding their use.

“Given the likely proliferation of these kind of things to more and more actors,” he says, “given the vulnerability of the U. S. homeland, given the difficulty we have as a society in taking the actions necessary to make ourselves resilient against these kind of attacks — would it be better to move in the direction of an international regime to control, or limit, or eliminate the use of some of these things?”

Jody Williams thinks so. In 1997 she was awarded the Nobel Peace Prize for a campaign that created an anti-landmine treaty. “I know we can do the same thing with killer robots,” says Williams. “I know we can stop them before they ever hit the battlefield.” She’s working with the group Human Rights Watch in an effort to do so.