SS: A Deeper Dive

Sentient Systems

Our Sentient Systems team fuses data from multiple sensors to provide situation awareness and understanding for a wide variety of existing and emerging applications. We develop state-of-the-art detection, tracking, characterization, and attribution algorithms that “see through the clutter” to find the “information” (objects, activities, events of interest, and actors) and understand actions, intent, and the likelihood of future events.

Applying AI to Sensor Processing and Data Fusion

We research and develop a mix of novel and state-of-the-art AI/ML approaches fused with traditional signal processing and statistical techniques to process and understand radar, acoustic, optical, and other sensor data. We use supervised and unsupervised methods, including deep learning, generative adversarial networks, feature detection and analysis, and context reasoning, to detect, track, and characterize objects of interest. We closely integrate these advanced techniques with traditional signal processing techniques (e.g., Kalman filtering and Space Time Adaptive Processing) and statistical methods (e.g., random forests and Bayesian approaches). These fused approaches have improved system speed and accuracy by orders of magnitude.
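To illustrate one of the traditional building blocks named above, here is a minimal sketch of a scalar Kalman update, the recursive estimator that fuses a prior state estimate with each new noisy measurement. The scenario (a stationary target observed by a single range sensor) and all names are illustrative assumptions, not any specific project implementation.

```python
import random

def kalman_update(x, p, z, r):
    """One scalar Kalman update: fuse prior (x, p) with measurement z of variance r."""
    k = p / (p + r)          # Kalman gain: weight placed on the new measurement
    x_new = x + k * (z - x)  # corrected state estimate
    p_new = (1 - k) * p      # reduced estimate variance
    return x_new, p_new

# Illustrative scenario: estimate a stationary target at 10.0 m
# from repeated noisy range measurements (variance 1.0).
random.seed(0)
x, p = 0.0, 1e6                        # start from a vague prior
for _ in range(50):
    z = 10.0 + random.gauss(0.0, 1.0)  # simulated noisy sensor reading
    x, p = kalman_update(x, p, z, 1.0)

print(round(x, 2), round(p, 4))
```

After 50 updates the variance shrinks toward r/n = 0.02, showing how the filter accumulates confidence as measurements arrive.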

Projects include:

  • STAPiNN: Space Time Adaptive Processing in NNs
  • INSIDER: Inertial Navigation System Inspection and Detection of Evolving Roles
  • CARIBOU: Context- and AI-Based Reasoning for Identification onBOard UUVs
  • DIFFUSE: Deep Inference and Fusion Framework Utilizing Supporting Evidence

Fusing Physics- and Non-physics-based Information

We leverage a wide variety of probabilistic and machine learning modeling approaches to fuse physics-based sensor data (e.g., radar, acoustic, imagery, video) with non-physics-based data (e.g., HUMINT, news, context information, semantic models, multimedia) to produce all-source situation awareness and explanation. We select, test, and implement the best methods for a given domain and available data. Our novel fusion approaches for handling overlapping and conflicting sensor data and models adaptively provide the best possible state estimates and predictions.
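One simple instance of fusing overlapping, differently trusted sources is precision-weighted Gaussian fusion, where each report is weighted by its inverse variance so that confident sources dominate. The example reports (radar, imagery, HUMINT) and numbers below are hypothetical, chosen only to show the mechanics.

```python
def fuse_gaussian(estimates):
    """Fuse independent Gaussian estimates (mean, variance) by precision weighting.

    Each source is weighted by its inverse variance, so precise sources
    dominate and the fused variance is tighter than any single input.
    """
    total_precision = sum(1.0 / var for _, var in estimates)
    fused_mean = sum(mean / var for mean, var in estimates) / total_precision
    return fused_mean, 1.0 / total_precision

# Hypothetical reports on one target's 1-D position:
reports = [
    (102.0, 4.0),    # radar track: precise
    (98.0, 9.0),     # imagery cue: less precise
    (110.0, 100.0),  # HUMINT report: vague and conflicting
]
mean, var = fuse_gaussian(reports)
print(round(mean, 2), round(var, 2))
```

Note that the vague HUMINT report barely moves the fused estimate, while still contributing: conflict is handled by down-weighting, not by discarding a source outright.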

Projects include:

  • DICE: DAC Integration and Contextual Explanation
  • PANORAMA: Predictive Analytics for NOrmalcy Reasoning and AnoMaly Analysis
  • MAGPIE: Maritime Gray-Zone Patterns, Indicators, and Effectors
  • RISOMS: Rapid Identification of Space Objects using Multisensor Signatures

Optimizing Sensor and Information Collection

Boston Fusion develops methods to estimate the uncertainty of sensor and data fusion results and to select the sensors and operating modes that best reduce that uncertainty. This includes developing and demonstrating new information theory, optimal sensor scheduling algorithms, and novel user displays.
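One common information-theoretic criterion for this kind of sensor selection is to choose the sensor whose observation minimizes the expected entropy of the posterior over the target state. The toy two-state problem, sensor models, and names below are assumptions for illustration only, not a description of any specific algorithm above.

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def expected_posterior_entropy(prior, likelihood):
    """Expected posterior entropy after observing one sensor.

    likelihood[z][s] = P(observation z | state s). A lower value means
    the sensor is expected to be more informative about the state.
    """
    h = 0.0
    for like_z in likelihood:
        pz = sum(l * p for l, p in zip(like_z, prior))  # P(z)
        if pz == 0:
            continue
        posterior = [l * p / pz for l, p in zip(like_z, prior)]
        h += pz * entropy(posterior)
    return h

# Hypothetical two-state problem: target present vs. absent, uniform prior.
prior = [0.5, 0.5]
sensors = {
    "sharp": [[0.9, 0.2],   # P(detect | present), P(detect | absent)
              [0.1, 0.8]],  # P(no detect | present), P(no detect | absent)
    "noisy": [[0.6, 0.5],
              [0.4, 0.5]],
}
scores = {name: expected_posterior_entropy(prior, lik)
          for name, lik in sensors.items()}
best = min(scores, key=scores.get)
print(best)
```

Under this criterion the scheduler tasks the "sharp" sensor, whose observations are expected to cut the state uncertainty the most.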

Projects include:

  • (UC)2: Uncertainty Characterization Using Copulas
  • SEE-DATA: Sensor Exploitation for Evidence and Discovery for Anticipatory Threat Analysis
  • UMIMMI: Universal Multivariate Information Measures for Multisensor Inference
  • SANDMan: Situation Aware Network Deception Management
  • ADEN: Anomaly Detection Engine for Networks