“Shield AI Collaborates with Sentient Vision Systems to Offer AI-Enabled Wide Area Motion Imagery Capability” –News Release

The unmanned aircraft's sensor payload varies based on the Coast Guard's desired mission and search conditions: MWIR 3.5 is a mid-wave infrared sensor providing thermal imaging capability for use at night or during periods of low visibility; EO-900 is a high-definition telescopic electro-optical (EO) imager for zooming in on targets at greater distances; and ViDAR is a visual detection and ranging wide-area optical search system that provides a comprehensive autonomous detection solution for EO video. Courtesy Photo.

I am passing along this vendor’s news release. It references two systems we have discussed in the past, ViDAR and the V-BAT UAS. The Coast Guard has a history with these systems.

 Shield AI Collaborates with Sentient Vision Systems to Offer AI-Enabled Wide Area Motion Imagery Capability

SAN DIEGO (Aug. 10, 2023) – Shield AI, an American defense technology company building the world’s best AI pilot, and Sentient Vision Systems (Sentient), an Australia-based leader in AI-enabled passive wide area search, are pleased to announce a strategic collaboration aimed at delivering a wide area motion imagery (WAMI) solution for the Department of Defense (DoD), the Australian Defence Force (ADF), and other international customers.

The companies will jointly develop and integrate a ViDAR-enabled, wide-area-search capability onto Shield AI’s V-BAT unmanned aircraft, which will enable Shield AI’s V-BAT to intelligently classify, track, and read-and-react to targets in dynamic missions. Shield AI plans to fly the capability on V-BAT next year.

“This work with our Australian partner, Sentient, is a unique opportunity to fuse the innovation prowess of two companies from allied countries on opposite sides of the world. Together, we are shaping the future of defense technology,” said Brandon Tseng, Shield AI’s President, Co-founder, and former U.S. Navy SEAL.

ViDAR is Sentient’s AI system, which uses an Electro-Optic or Infrared (EO/IR) sensor to detect and classify targets in the imagery stream that would be invisible to a human operator or to a conventional radar. With these enhanced capabilities, V-BAT will be even more proficient in executing the most challenging missions, offering a level of capability that significantly bolsters threat deterrence, thereby reinforcing international peace and security.
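To give a sense of what wide-area optical search involves, here is a minimal sketch of the general pattern: tile a high-resolution frame and run a detector on every tile so small targets are not lost to downscaling. It is an illustration under assumed parameters, not Sentient’s actual ViDAR algorithm; the tile size, the toy brightness-based detector, and the function names are all hypothetical.

```python
# Illustrative sketch only -- NOT Sentient's ViDAR algorithm.
# Shows the general pattern of wide-area optical search: split each
# high-resolution frame into tiles and run a small-object detector on
# every tile, so faint targets are not lost when the image is downscaled.
import numpy as np

TILE = 512  # tile edge length in pixels (hypothetical value)

def detect_in_tile(tile: np.ndarray) -> list[tuple[int, int]]:
    """Placeholder detector: flag pixels far brighter than the tile's
    local background (a stand-in for a trained EO/IR detector)."""
    threshold = tile.mean() + 5 * tile.std()
    ys, xs = np.where(tile > threshold)
    return list(zip(xs.tolist(), ys.tolist()))

def search_frame(frame: np.ndarray) -> list[tuple[int, int]]:
    """Scan a full frame tile by tile; return detections in frame coordinates."""
    detections = []
    h, w = frame.shape
    for y0 in range(0, h, TILE):
        for x0 in range(0, w, TILE):
            tile = frame[y0:y0 + TILE, x0:x0 + TILE]
            for x, y in detect_in_tile(tile):
                detections.append((x0 + x, y0 + y))
    return detections

# Example: a synthetic 2048x2048 "sea" frame with one injected bright target.
frame = np.random.normal(100, 5, (2048, 2048))
frame[1500, 700] = 200  # injected target
print(search_frame(frame)[:5])
```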

“Sentient is excited and proud to be working with Shield AI on this truly breakthrough solution,” said Mark Palmer, Sentient’s Chief Technology Officer. “We look forward to combining the AI expertise and operational understanding of our two great teams to deliver superior ISR capabilities for today’s rapidly changing defense and security environment.”

About Shield AI 
Founded in 2015, Shield AI is a venture-backed defense technology company whose mission is to protect service members and civilians with intelligent systems. In pursuit of this mission, Shield AI is building the world’s best AI pilot. Its AI pilot, Hivemind, has flown a fighter jet (F-16), a vertical takeoff and landing drone (V-BAT), and a quadcopter (Nova). The company has offices in San Diego, Dallas, Washington DC and abroad. Shield AI’s products and people are currently in the field actively supporting operations with the U.S. Department of Defense and U.S. allies. For more information, visit www.shield.ai. Follow Shield AI on LinkedIn, Twitter, and Instagram.

Media contact: media@shield.ai

About Sentient Vision Systems
Sentient Vision Systems specializes in passive, modular optical sensors for persistent, wide area motion imagery. Sentient’s artificial intelligence (AI)-enabled edge solutions better detect and identify small objects in real time, on land and at sea. With more than 20 years of development in moving target indication, AI computer vision, and machine learning, Sentient has deployed thousands of systems worldwide in the fields of intelligence, surveillance, and search and rescue, enhancing situational awareness, accelerating informed decisions, and saving numerous lives. For more information, visit www.sentientvision.com. Follow Sentient on LinkedIn, Twitter and YouTube.

Media contact: media@sentientvision.com

“As it helps combat unlawful fishing internationally, NGA is ‘posturing’ for an AI-driven future” –DefenseScoop

DefenseScoop reports that the National Geospatial-Intelligence Agency (NGA) is using AI to sort data for an application called Enhanced Domain Awareness (EDA), which provides maritime domain awareness information to the U.S. and partner nations to help them deal with illegal, unreported, and unregulated (IUU) fishing.

“Just to give you a sense of the scale and how much we’re talking about — there’s around 470 U.S.-owned space-based remote sensing systems that are available today. And that’s going to expand to around 1,400 by 2030, we expect. So globally, there’s about a seven-fold increase in those systems. So the limiting factor isn’t how much of the Earth we can observe or how often — it’s how quickly we can derive insights from that data. And so that’s where, of course, AI and automation comes in. It helps us increase the speed and our capability to react to military and humanitarian response efforts,” he explained.

This grew out of a SOUTHCOM effort and isn’t limited to IUU.

“When you log into EDA, whether it is on the U.S. side or the partner-nation side, many of the projects that you’ll see in there revolve around different priority lines of effort,” Kurey explained. For Southcom, besides IUU “you’ll find other things like counter-narcotics missions, and things like that. But it’s all encompassing, and you’ll find information and data and projects that I mentioned before will support a tactical environment, operational environment, or strategic messaging,” he said.

The system is being extended to other users.

As the platform continues to mature and blossom, now other combatant commands — including U.S. Northern Command, European Command and Indo-Pacific Command — are beginning to explore how they can integrate it into their own initiatives for data organization and support.

“Predicting illegal fishing activity is tip of the iceberg for mature AI technology” –BAE

BAE Systems technology applies machine learning analytics to automate low-level detection of activities of interest, such as fishing, from available data streams.

Below is a company press release, but it is an interesting one, with relevance to Coast Guard missions. The Obangame Express Exercise is one the Coast Guard has participated in previously. More info on the exercise here and here.

The old “finding a needle in a haystack” analogy doesn’t begin to articulate the challenge associated with illegal fishing detection and identification. While a ship may be larger than a needle, the ocean is certainly larger than your biggest haystack. Add the need to not only find the ship, but determine its recent activities, anticipate future movements, and compare them with all other ships in the area — and do it in near real-time using open source data feeds.

At the Obangame Express event, which is the largest multinational maritime exercise in Western Africa, BAE Systems’ Multi-INT Analytics for Pattern Learning and Exploitation (MAPLE) as a Service, MaaS for short, was integrated with SeaVision, the U.S. Navy’s premier tool for unclassified interagency and coalition maritime data sharing. SeaVision is a maritime situational awareness tool that ingests maritime vessel position data from various government and commercial sources and simultaneously displays them on the same screen in a web browser.
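To illustrate the kind of data such a tool works with, here is a minimal sketch of normalizing vessel position reports into a single common record; the record fields, the feed format, and the from_ais helper are hypothetical assumptions, not SeaVision’s actual schema.

```python
# Illustrative sketch only -- not SeaVision's actual data model.
# Normalizes vessel position reports from a (hypothetical) feed into one
# common record so reports from many sources can share a single display.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class VesselPosition:
    mmsi: str          # vessel identifier
    lat: float
    lon: float
    speed_kn: float
    timestamp: datetime
    source: str        # which feed the report came from

def from_ais(msg: dict) -> VesselPosition:
    """Map a (hypothetical) AIS-style message to the common record."""
    return VesselPosition(
        mmsi=str(msg["mmsi"]),
        lat=msg["latitude"],
        lon=msg["longitude"],
        speed_kn=msg["sog"],  # speed over ground, knots
        timestamp=datetime.fromtimestamp(msg["epoch"], tz=timezone.utc),
        source="AIS",
    )

# Example report from one feed:
print(from_ais({"mmsi": 503123456, "latitude": -12.5, "longitude": 130.9,
                "sog": 4.2, "epoch": 1_700_000_000}))
```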

“Military organizations use illegal fishing as a model application due to the unclassified nature of the available data,” said Neil Bomberger, chief scientist at BAE Systems’ FAST Labs™ research and development organization. “Successful detection of illegal fishing activity helps address a serious challenge and highlights another use case for our mature artificial intelligence technology.”

Giving depth to data

While manual analysis of individual vessel tracks is possible, it gets exponentially more challenging and time-consuming for large numbers of vessels. BAE Systems technology applies machine learning analytics to automate low-level detection of activities of interest, such as fishing, from available data streams. This enables analysts to quickly answer time-sensitive questions, prioritize manual data analysis activities, identify higher-level trends, and focus on decision-making instead of manual data analysis.
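To make the idea of automated “low-level detection” concrete, here is a minimal sketch of a rule-based check over a vessel’s track, flagging windows where slow speed and frequent course changes resemble fishing; the features, thresholds, and function name are illustrative assumptions, not BAE Systems’ MAPLE analytics.

```python
# Illustrative sketch only -- not BAE Systems' MAPLE/MaaS analytics.
# Flags track windows whose low speed and frequent course changes
# resemble fishing behavior, so an analyst can triage them first.
SLOW_KNOTS = 5.0        # hypothetical speed ceiling for fishing-like movement
TURNY_DEG_PER_FIX = 20  # hypothetical mean course change suggesting loitering

def looks_like_fishing(speeds_kn: list[float], courses_deg: list[float]) -> bool:
    """Simple heuristic over a window of position fixes for one vessel."""
    if len(speeds_kn) < 2:
        return False
    mean_speed = sum(speeds_kn) / len(speeds_kn)
    turns = [abs(b - a) % 360 for a, b in zip(courses_deg, courses_deg[1:])]
    turns = [t if t <= 180 else 360 - t for t in turns]  # smallest-angle difference
    mean_turn = sum(turns) / len(turns)
    return mean_speed < SLOW_KNOTS and mean_turn > TURNY_DEG_PER_FIX

# Two example windows: a meandering slow track vs. a straight transit.
print(looks_like_fishing([3.1, 2.8, 3.5, 2.9], [10, 80, 30, 120]))     # True
print(looks_like_fishing([14.0, 14.2, 13.9, 14.1], [90, 91, 90, 92]))  # False
```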

During the event, BAE Systems’ MaaS technology processed streaming data and automatically detected vessel behavior events that SeaVision displayed as an additional data layer to support user-friendly and timely analysis. The technology provides full visibility into the data to allow the users to check whether the detected behavior warrants further investigation. This helps build trust in the automation and supports additional analysis.
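One plausible way such a detected behavior event could be packaged for display as an additional map layer is as a GeoJSON feature, sketched below; the property names and helper function are assumptions for illustration, not the actual MaaS-to-SeaVision interface.

```python
# Illustrative sketch only -- the real MaaS/SeaVision interface is not shown here.
# Packages a detected behavior event as a GeoJSON feature, a common way to
# hand geospatial detections to a web map as an extra data layer.
import json

def behavior_event_feature(mmsi: str, lat: float, lon: float,
                           behavior: str, confidence: float) -> dict:
    return {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},  # GeoJSON order: lon, lat
        "properties": {
            "mmsi": mmsi,
            "behavior": behavior,       # e.g. "possible_fishing"
            "confidence": confidence,   # 0..1, from the upstream detector
        },
    }

layer = {"type": "FeatureCollection",
         "features": [behavior_event_feature("503123456", -12.5, 130.9,
                                              "possible_fishing", 0.87)]}
print(json.dumps(layer, indent=2))
```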

Decades in the making

BAE Systems’ FAST Labs maritime sensemaking capabilities are rooted in artificial intelligence and machine learning algorithms. Backed by nearly two decades of development, their behavior recognition and pattern analysis capabilities continue to show significant utility in real-world environments.

The cloud-based artificial intelligence technology was matured via work on the Geospatial Cloud Analytics (GCA) program. In the months since the successful event, the FAST Labs organization has continued to develop and mature its autonomy portfolio. Elements of its autonomy technology have proven successful in multiple domains including air, land, and sea.

“This successful event delivers on the promise of mature artificial intelligence technology – easy to integrate, incorporating trust, and providing fast and actionable information in a real-world scenario,” continued Bomberger. “The event showcased how our artificial intelligence technology can be deployed in a cloud environment, integrated with a government tool, and used to address relevant maritime activities.”

“Australia improving rescue efforts with artificial intelligence” –Indo-Pacific Defense Forum

RAAF C-27J conducts machine learning.

The Indo-Pacific Defense Forum is reporting that Australia is attempting to apply artificial intelligence (AI) to the visual search part of the search and rescue (SAR) problem.

“Our vision was to give any aircraft and other defense platforms, including unmanned aerial systems, a low-cost, improvised SAR capability,” Wing Commander Michael Gan, who leads AI development for RAAF’s Plan Jericho, said in a news release from Australia’s Department of Defence. Plan Jericho, which was launched in 2015, is an RAAF 10-year blueprint to become one of the world’s most technologically advanced air forces.

It is a collaborative effort of the RAAF Air Mobility Group’s No. 35 Squadron, the Royal Australian Navy’s Warfare Innovation Branch and the University of Tasmania’s Australian Maritime College.

“There is a lot of discussion about AI in [the Department of] Defence, but the sheer processing power of machine learning applied to SAR has the potential to save lives and transform SAR,” Lt. Harry Hubbert of the Navy’s Warfare Innovation Branch, who developed algorithms for AI-Search, said in the news release.
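As a rough illustration of what machine-assisted visual search can mean in a SAR context, the sketch below flags image regions whose color stands out sharply from the surrounding sea; it is a toy example under assumed values, not the AI-Search algorithm described in the article.

```python
# Illustrative sketch only -- not the RAAF/Plan Jericho "AI-Search" algorithm.
# A very simple machine-assisted visual search: flag pixels whose color
# stands out strongly from the surrounding sea, the kind of cue
# (e.g. an orange life raft) a human scanner can easily miss in hours of video.
import numpy as np

def anomaly_mask(frame_rgb: np.ndarray, sigma: float = 6.0) -> np.ndarray:
    """Return a boolean mask of pixels far from the frame's mean color."""
    flat = frame_rgb.reshape(-1, 3).astype(float)
    mean = flat.mean(axis=0)
    std = flat.std(axis=0) + 1e-6
    dist = np.abs((frame_rgb.astype(float) - mean) / std).sum(axis=2)
    return dist > sigma * 3  # summed z-score across the three channels

# Synthetic "sea" frame with a small orange patch injected as the target.
frame = np.full((480, 640, 3), (20, 60, 90), dtype=np.uint8)
frame += np.random.randint(0, 10, frame.shape, dtype=np.uint8)
frame[200:205, 300:305] = (255, 120, 0)  # injected target
ys, xs = np.where(anomaly_mask(frame))
print(f"{len(xs)} candidate pixels, e.g. around ({xs[0]}, {ys[0]})")
```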

I have to wonder if this is related to ViDAR, which has been included on the Coast Guard’s ScanEagle UAVs, and whether it could be applied to Minotaur.

“Coast Guard turns to DOD’s new AI center for maintenance help” –DefenseSystems.Com

DefenseSystems reports the Coast Guard is working with DOD’s Joint Artificial Intelligence Center (JAIC).

“We’re looking [to] see how [to] do a better job doing predictive maintenance for aircraft, helicopters in particular,”

So far, JAIC has released to Special Operations Command the first version of an algorithm to help with H-60 Black Hawk maintenance; it will then head to the Army, Air Force, and Navy. JAIC is also working on solutions to help firefighters predict a fire’s movements and intensity and to aid humanitarian assistance and disaster relief efforts, like the response to California’s wildfires.
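For readers unfamiliar with the term, here is a minimal sketch of what predictive maintenance looks like in practice: fit a trend to a component’s health indicator and estimate when it will cross an alert limit. The readings, the alert threshold, and the use of a simple linear fit are all hypothetical illustrations, not the JAIC’s H-60 algorithm.

```python
# Illustrative sketch only -- not the JAIC's actual H-60 maintenance algorithm.
# Predictive maintenance in miniature: fit a trend to a component's health
# indicator (here, vibration) and estimate how many flight hours remain
# before it crosses a maintenance alert limit.
import numpy as np

flight_hours = np.array([0, 25, 50, 75, 100, 125], dtype=float)
vibration_ips = np.array([0.12, 0.14, 0.15, 0.18, 0.20, 0.23])  # hypothetical readings
ALERT_LIMIT_IPS = 0.35  # hypothetical alert threshold (inches per second)

# Least-squares linear trend: vibration ~= slope * hours + intercept
slope, intercept = np.polyfit(flight_hours, vibration_ips, deg=1)
hours_at_limit = (ALERT_LIMIT_IPS - intercept) / slope
remaining = hours_at_limit - flight_hours[-1]

print(f"Projected to reach the alert limit at ~{hours_at_limit:.0f} flight hours "
      f"({remaining:.0f} hours of margin remaining).")
```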