
DIU CSO — Endpoint Accuracy 2: CUAS Close-In Kinetic Defeat Enhancement#

Technical Brief — Part 1: Aided Target Recognition for CROWS#

Vendor Argus Defense
Contact production@argusdefense.us
Date 2026-05-12
CAGE / SAM Registration in progress; CAGE acknowledgement expected before prototype award
Submission addresses Part 1 only (Aided Target Recognition for CROWS). Parts 2, 3, and 4 are out of scope for this submission.
Repository (technical artifact) https://github.com/PoggyBobby/argus-aitr
Hosted demo Link to be added with the application form
Recorded walkthrough 3-5 min link to be added with the application form

1. Executive summary#

ArgusAiTR is a non-intrusive Aided Target Recognition overlay for CROWS-class Remote Weapon Stations. It taps the RWS video output, runs passive detection, multi-object tracking, monocular ranging, and kinematic threat classification on commodity edge compute, and renders overlays plus slew-to-cue back to the operator. CROWS firmware is unchanged. If the AiTR pipeline drops, the operator retains full standard RWS capability — verifiably, via a single click or automatic FPS/error-based degradation.

The submitted technical artifact is a working software prototype. The pipeline runs end-to-end: ingest → stabilize → detect → track → range → classify → operator HMI → audit. Every CSO-required behavior — passive operation, day/night-path architecture, robustness to adverse conditions, mandatory human-in-the-loop, graceful fallback — is implemented and exercisable from the operator console. All 111 tests pass on the development machine (Apple Silicon).

Argus is, today, a one-person company partnered with an AI development assistant (Claude Code). This proposal is explicit about that baseline and explicit about the post-award scaling plan: hire one ML engineer and one systems engineer; engage Kongsberg US (CROWS OEM) for surrogate access; source supplementary EO/IR sensors (FLIR/Teledyne); port to NVIDIA Jetson AGX Orin; fine-tune on Argus-licensed UAS imagery; integrate, validate, and deliver the live-fire range demonstration in Month 3. Honesty here is a deliberate posture. The reviewer who sees a one-person team claiming five-engineer capacity rightly discounts; the reviewer who sees a one-person team with a working system and a credible scaling plan can evaluate on merit.


2. Problem understanding#

The CSO specifies an AiTR system that augments CROWS against Group 1 and Group 2 UAS. Concretely, the CSO requires:

  1. Passive detection, classification, ranging, tracking at ~600 m detection / ~100 m engagement, against UAS at < 30 m/s.
  2. Day-and-night operation, with night preferred.
  3. Robustness to weapon firing shake, muzzle flash, and high-frequency platform jitter.
  4. Cluttered-scene performance (natural and man-made).
  5. Seamless RWS integration with open system architecture.
  6. Improved target discrimination to minimize collateral damage.
  7. Operator workload reduction through automated target handling.
  8. Graceful fallback to standard RWS on AiTR malfunction or degradation.
  9. Human-in-the-loop required.
  10. DoD AI Ethical Principles compliance.
  11. Clear IP rights for all proposed technology.

ArgusAiTR addresses each of these. Pointers to specific code modules appear in §4 and in the appendix table.

The harder-to-state requirements — "minimize false positives in cluttered conditions," "operate effectively even in cluttered background conditions both natural and man-made" — are the genuine engineering challenges. The PoC artifact does not solve them in 48 hours from one person. It solves the architectural problem (how the system must be shaped so those problems can be solved by Month 3) and demonstrates the algorithmic core on representative scenarios. The proposal makes that distinction explicit and back-loads the remaining work into the post-award plan with named milestones and named partners.


3. Technical approach#

3.1 Architecture (non-intrusive overlay)#

                       ┌──────────────────────────────────────┐
                       │ PerturbationSim (test/demo harness    │
                       │ injecting jitter, muzzle flash,       │
                       │ vibration, dropout)                   │
                       └──────────────┬───────────────────────┘
 VideoSource ─► Stabilizer ─► Detector ─► Tracker ─► Ranger ─► Classifier ─► HMI
   │                                                                          │
   ├─ FileSource (PoC)                                                        │
   ├─ RTSPSource (stub for production)                                        ▼
   ├─ ThermalSource (stub for production)                                OperatorAction
   └─ CROWSAdapter (mock now / GVA post-award)                          (HITL)
                                                                         AuditLogger
                                                                         (JSONL; CoT/TAK
                                                                          and DDS sinks
                                                                          stubbed for
                                                                          post-award)

Why this shape: The CSO calls for "seamless integration with current RWS architectures" and the ability to "revert to standard operational capabilities without any performance loss." A non-intrusive overlay achieves both with the minimum integration risk and the fastest path through the 3-month post-award window. The system reads CROWS' existing video output, processes it on Argus edge compute, and renders overlays plus slew-to-cue back. CROWS firmware is unchanged. The same architectural surfaces accept deeper, co-located integration as Part-2 follow-on.

3.2 Module-by-module (implemented vs designed-and-stubbed)#

The table below is the honest implementation status. "Implemented" means the code is in the repository and exercised by the test suite. "Stub" means the integration surface is documented and the class is instantiable, but the operation raises NotImplementedError with a clear "post-award" message.

| Stage | Implementation in PoC | Production path |
| --- | --- | --- |
| Video ingest | FileSource (cv2.VideoCapture; looping) | RTSPSource (stub): CROWS HD-SDI → IP bridge → RTSP |
| Thermal ingest | None | ThermalSource (stub): cooled MWIR + EO/IR fusion |
| Stabilization | IdentityStabilizer default; FeatureStabilizer (Shi-Tomasi + LK + partial-affine) available | Trajectory-smoothed feature stabilizer + IMU fusion |
| Detection | YOLODetector wrapping Ultralytics YOLOv8n on COCO weights, filtered to {airplane, bird, kite} as UAS proxies and relabeled uniformly | RT-DETR (Apache-2.0) fine-tuned on Argus-licensed UAS imagery, or commercially licensed YOLOv8; same Detector ABC |
| Tracking | IoUTracker: IoU + Hungarian assignment, tentative-to-confirmed lifecycle, per-track velocity from bbox centers | ByteTrack/BoT-SORT for dense scenes; same Tracker ABC; IoUTracker stays as failsafe |
| Ranging | MonocularRanger: size priors + known focal length, confidence scaled by bbox size and prior tightness | Adds platform-motion triangulation (slewing-turret parallax) + optional limited-duration laser ranging (permitted by the CSO) |
| Threat classification | ThreatHeuristic: transparent kinematic rules with reasoning strings | Supervised classifier with RWS-context and environment features |
| CROWS adapter | MockCROWSAdapter (logs slew commands to JSONL); GVAAdapter stub | GVAAdapter over Generic Vehicle Architecture (STANAG 4754) |
| Telemetry sink | JSONFileSink; TAKTelemetrySink stub | TAK CoT, DDS, MQTT, encrypted edge-network sinks (Part 4 alignment) |
| Operator HMI | FastAPI + MJPEG + SSE + vanilla-JS console, served at localhost:8000 | Same web console + optional Qt embedded-panel build for vehicle integration |
| Audit | AuditLogger writing JSONL across pipeline + HMI events | Same, plus durable storage with rotation and signed-line provenance |
| Pipeline orchestrator | Threaded ingest → ... → publish loop with FPS cap, rolling FPS window, degraded/offline state machine | Same; deployed on Jetson AGX Orin with TensorRT-optimized inference |
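The MonocularRanger stage reduces to the pinhole-camera relation: range = focal length (px) × real size (m) ÷ apparent size (px). A minimal sketch under that model (the function name and the 0.35 m size prior are illustrative, not the repository's API):

```python
def estimate_range_m(bbox_width_px: float, focal_length_px: float,
                     prior_width_m: float = 0.35) -> float:
    """Pinhole-model range from a class size prior.

    prior_width_m is an assumed physical width for the detected class
    (e.g. roughly 0.35 m for a small quadcopter). Tighter priors and
    larger bboxes justify higher ranging confidence.
    """
    if bbox_width_px <= 0 or focal_length_px <= 0:
        raise ValueError("bbox width and focal length must be positive")
    return focal_length_px * prior_width_m / bbox_width_px

# A 0.35 m quadcopter subtending 7 px through a 1200 px focal length:
# 1200 * 0.35 / 7 = 60 m
```

The production path's platform-motion triangulation would replace the size prior with parallax from known turret slew, removing the prior dependence entirely.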

3.3 Open System / interoperability#

  • Pydantic contracts at every stage boundary (Frame, Detection, TrackedTarget, OperatorAction, SystemHealth).
  • ABCs for VideoSource, Detector, Tracker, Stabilizer, CROWSAdapter, TelemetrySink — every component is swappable without changing the orchestrator.
  • Stubs for the post-award integration surfaces (RTSP, thermal, GVA, TAK) so misuse fails loudly with a documented message.
  • On-the-wire data is JSON; no closed proprietary formats.

docs/integration-contract.md is the normative integration plan and records the intended production wire formats.
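The contract-plus-ABC pattern can be sketched in a few lines (stdlib dataclasses stand in here for the repository's Pydantic models; field names and the stub's message are illustrative):

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass(frozen=True)
class Detection:
    """Stage-boundary contract: a typed record, JSON-serializable on the wire."""
    x1: float
    y1: float
    x2: float
    y2: float
    label: str
    confidence: float


class Detector(ABC):
    """Any backend (YOLOv8n today, RT-DETR post-award) slots in behind this."""
    @abstractmethod
    def detect(self, frame) -> list[Detection]: ...


class RTDETRDetectorStub(Detector):
    """Post-award integration surface: instantiable, but misuse fails loudly."""
    def detect(self, frame) -> list[Detection]:
        raise NotImplementedError(
            "RT-DETR backend is a post-award deliverable (Month 1-2)")
```

Because the orchestrator holds only the Detector type, swapping backends never touches the pipeline loop.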

3.4 Robustness (the "adverse conditions" requirement)#

The CSO names "shake from weapon firing, muzzle flash, and high frequency jitter from the base vehicle platform" as required-to-handle. The PoC ships a Perturbation Simulator (src/argus_aitr/stabilize/perturb.py) that injects these conditions on demand from the operator HMI. Four named profiles:

  • clean — pass-through baseline.
  • vehicle_idle — sub-pixel sinusoidal vibration only.
  • engagement — jitter (translation + rotation) + muzzle flash + vibration.
  • severe — engagement + occasional dropped frames.

The muzzle flash injection is a radial saturation lobe at the lower-frame center — modeling the saturation pattern of a flash through daylight optics, not a global wash. During the demo, switching to engagement and severe is the moment that directly demonstrates the CSO clause. The tracker holds through the simulated disturbance because the architecture is correct, not because the model is heroic — and the proposal does not claim more than that.
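The profile-to-disturbance mapping can be sketched as a per-frame translation generator (frequencies and amplitudes here are illustrative guesses, not the simulator's actual tuning):

```python
import math
import random


def jitter_offset(t: float, profile: str) -> tuple:
    """Per-frame (dx, dy) translation in pixels for a named profile.

    clean: pass-through. vehicle_idle: sub-pixel sinusoidal vibration.
    engagement/severe: vibration plus random high-frequency jitter.
    All numeric parameters are illustrative assumptions.
    """
    if profile == "clean":
        return (0.0, 0.0)
    # sub-pixel vibration at two nearby frequencies (assumed 28 / 31 Hz)
    vx = 0.4 * math.sin(2 * math.pi * 28.0 * t)
    vy = 0.4 * math.sin(2 * math.pi * 31.0 * t)
    if profile == "vehicle_idle":
        return (vx, vy)
    if profile in ("engagement", "severe"):
        # add bounded random jitter on top of the vibration floor
        return (vx + random.uniform(-3.0, 3.0),
                vy + random.uniform(-3.0, 3.0))
    raise ValueError(f"unknown profile: {profile}")
```

Dropped frames (the severe profile) and the muzzle-flash lobe would be layered separately on top of this translation term.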

3.5 Cluttered scenes and false-positive minimization#

False positives are the failure mode the CSO is most pointed about. The PoC's defenses are:

  1. Detector confidence floor (configurable; 0.20 default). Below-floor detections are not surfaced.
  2. Tracker confirmation lifecycle — single-frame detections are not emitted; the tracker requires N hits to confirm. One-frame noise cannot reach the operator.
  3. Kinematic threat heuristic — a target with no closing-speed evidence is unknown. Static or receding targets are non-threat. Only fast-closing targets with credible time-to-arrival are probable or high.
  4. Reasoning strings on every classification — every threat verdict carries a human-readable explanation (e.g., "closing 12.4 m/s; predicted time-to-arrival 7.2 s") that the operator and the auditor can sanity-check.
  5. Operator HITL gate — even a high verdict does nothing without an explicit operator click. The system advises; it does not engage.

Post-award, the supervised classifier upgrade adds environment features and RWS context for hardened cluttered-scene performance.
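The kinematic gate and its reasoning strings have roughly this shape (thresholds here are illustrative, not the repository's configured values):

```python
from typing import Optional, Tuple


def classify_threat(closing_speed_mps: Optional[float],
                    range_m: Optional[float]) -> Tuple[str, str]:
    """Kinematic-only threat verdict plus a human-readable reasoning string.

    The 10 s / 30 s time-to-arrival thresholds are illustrative.
    """
    if closing_speed_mps is None or range_m is None:
        return "unknown", "insufficient kinematic evidence"
    if closing_speed_mps <= 0:
        return "non-threat", f"static or receding ({closing_speed_mps:.1f} m/s)"
    tta_s = range_m / closing_speed_mps  # predicted time-to-arrival
    if tta_s < 10.0:
        verdict = "high"
    elif tta_s < 30.0:
        verdict = "probable"
    else:
        verdict = "unknown"
    return verdict, (f"closing {closing_speed_mps:.1f} m/s; "
                     f"predicted time-to-arrival {tta_s:.1f} s")
```

The reasoning string rides with the verdict into the HMI and the audit log, so the operator and the auditor see why, not just what.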

3.6 Graceful fallback#

The pipeline publishes a SystemHealth snapshot every tick with aitr_status ∈ {nominal, degraded, offline}:

  • nominal — pipeline FPS above floor, no consecutive errors.
  • degraded — FPS below configured floor (5 fps default) or consecutive detector errors. The HMI auto-displays a prominent AiTR DEGRADED — REVERT TO MANUAL banner.
  • offline — operator clicked Disable AiTR (or upstream system flagged a failure). HMI shows AiTR OFFLINE — STANDARD RWS VIEW.

In both degraded and offline states, the operator continues with the underlying video feed; AiTR overlays disappear; no slew commands can be issued by AiTR. Every transition is recorded in the audit log.
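The health fold itself is small; a sketch (the 5 fps floor is the documented default, while the error-streak limit and function name are illustrative assumptions):

```python
FPS_FLOOR = 5.0            # documented default floor
MAX_CONSECUTIVE_ERRORS = 3  # assumed value; configurable in the repo


def aitr_status(rolling_fps: float, consecutive_errors: int,
                operator_disabled: bool) -> str:
    """Fold pipeline health into one of {nominal, degraded, offline}.

    Offline wins outright: the operator is always authoritative.
    Otherwise the FPS floor and detector-error streak drive degradation.
    """
    if operator_disabled:
        return "offline"
    if rolling_fps < FPS_FLOOR or consecutive_errors >= MAX_CONSECUTIVE_ERRORS:
        return "degraded"
    return "nominal"
```

Each status change would emit one audit event, which is what makes the transitions reviewable after the fact.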

3.7 HITL hard gate (mandatory)#

The CSO mandates HITL. ArgusAiTR enforces it architecturally:

  • The only path that generates an "engage" event is the POST /op/engage_request endpoint, which is only reachable via the operator console's confirmation modal.
  • The mock CROWS adapter has no autonomous slew path. Every send_slew_command call is operator-attributed and audited.
  • The production GVA adapter (stub today) inherits the same CROWSAdapter contract — there is no place in the codebase where commands originate without an operator action.

A reviewer can verify this property by grepping the codebase for send_slew_command and confirming the only caller is the operator-action handler.


4. Demonstration#

4.1 The 48-hour artifact (submitted with this application)#

A public GitHub repository (argus-aitr) with:

  • The full source for the pipeline, HMI, and integration surfaces.
  • A one-command launcher (scripts/run_demo.py) that brings up the operator console on localhost.
  • A synthetic offline test clip generator so the demo works without internet.
  • 111 passing tests on Apple Silicon (Python 3.11).
  • All proposal documents in docs/ (architecture, ethics, roadmap, licensing, integration contract, operator handbook).
  • A short recorded walkthrough (3-5 min) demonstrating the operator workflow, adverse-conditions toggle, failsafe banner, and audit log.

A hosted demo on a Hugging Face Space (free CPU tier) gives reviewers a low-friction way to poke at the running system without installing anything.

4.2 Phase 2 in-person pitch (if selected)#

Same demonstrator, run live from an Argus laptop. The session covers:

  • Live operator-console walkthrough on real UAS footage.
  • Q&A on architecture, integration plan, partnership pipeline, IP posture.
  • Schedule alignment with DIU on the post-award window.

4.3 Post-award 3-month plan (live-fire validation)#

See docs/roadmap.md for the month-by-month breakdown. Summary:

  • Month 1. Hire ML + systems engineers. Engage Kongsberg US for CROWS surrogate access. Source supplementary EO + cooled MWIR. Port pipeline to Jetson AGX Orin with TensorRT.
  • Month 2. Acquire Argus-licensed UAS imagery; fine-tune RT-DETR (Apache-2.0). Promote IoU tracker → ByteTrack for dense scenes. Add platform-motion triangulation. Indoor → outdoor range testing through 600 m envelope.
  • Month 3. Scheduled government range event; operator workflow rehearsals. Live UAS engagement: Group 1 quadcopter at 100 m, with adverse conditions injected and multi-target scenarios.

5. AI Ethics#

docs/ai-ethics.md contains the full mapping with code-location citations. Summary:

| Principle | ArgusAiTR posture | Implementation |
| --- | --- | --- |
| Responsible | Operator owns every engagement decision; AiTR advises. | HITL gate on POST /op/engage_request + modal confirm + no autonomous slew path. |
| Equitable | Threat assessment uses kinematics, not identity or appearance. | classify/threat_heuristic.py rules; no biometric or demographic data ingested. |
| Traceable | Append-only audit log records every detection, classification, operator action, and CROWS command. | audit/log.py fans events to JSONL; reasoning strings on verdicts. |
| Reliable | Failsafe degradation; test suite covers failure paths. | pipeline.py degraded/offline state machine; 111 tests including failure-mode tests. |
| Governable | Single-click /aitr/disable; operator always authoritative. | HMI toggle + audit log of disable/enable transitions. |
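The traceability row reduces to a tiny record format; a sketch of one audit line (field names are illustrative, not the repository's schema):

```python
import json


def audit_line(event: str, **fields) -> str:
    """One append-only JSONL audit record.

    Classification records carry their reasoning string, so an auditor
    can replay why a verdict was rendered, not just that it was.
    """
    return json.dumps({"event": event, **fields}, sort_keys=True)

# e.g. audit_line("classification", track=4, verdict="probable",
#                 reasoning="closing 9.8 m/s; predicted time-to-arrival 18.3 s")
```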

6. IP and licensing#

docs/licensing.md has the full dependency inventory. Summary:

  • Argus-authored code: Apache-2.0 (LICENSE). Covers pipeline, HMI, audit framework, perturbation simulator, integration contracts, all stubs.
  • OSS dependencies: Catalogued with licenses. Mostly BSD/MIT/Apache.
  • One AGPL-3.0 dependency (Ultralytics YOLOv8): used for convenient model loading and inference in the PoC. For production deployment Argus has two clean migration paths, planned for Month 1–2 post-award:
      • Replace with RT-DETR (Apache-2.0): same Detector ABC; a one-file change.
      • Purchase an Ultralytics commercial license.
  • Data rights: No fine-tuning in the PoC; no training-data dependency in this artifact. Demo video clips, if any, are CC-licensed with attribution in samples/README.md. Post-award training data is Argus-licensed.
  • Supply chain: No restricted-country contributions to Argus-authored code; no use of components originating from restricted jurisdictions per EAR/ITAR; domestic-controlled hosting.
  • Argus is not a research-only organization, reseller, or integrator — per the CSO's stated exclusions. Argus is a product company building AiTR overlay software with the intent to deliver and field it.

7. Team and scaling plan#

Today. Argus Defense is a one-person company. The PI is the founder and the only engineer. Claude Code (Anthropic's AI development assistant) is the development force-multiplier and the named technical contributor on every commit.

This is the honest baseline. The proposal does not claim Argus has the engineering depth of a five-person team today. The submitted artifact demonstrates that the PI plus AI partnership has the architectural judgment and execution speed to ship a working, tested AiTR pipeline in 48 hours from a blank directory. That is the calibration the rest of the proposal is built on.

Post-award. On contract award, Argus will hire:

  • 1 ML engineer — owns the UAS-detector fine-tuning, dataset curation, evaluation harness. RT-DETR / YOLO experience required.
  • 1 systems engineer — owns the CROWS surrogate integration, Jetson port, GVA adapter, supplementary-sensor integration.
  • 1 integration / test engineer (Month 2–3) — owns the live-fire range preparation, operator workflow validation, V&V documentation.

Partnership pipeline.

| Partner | Role |
| --- | --- |
| Kongsberg US (M153 OEM) | CROWS surrogate access; interface specs; integration review |
| NSWC / Army C5ISR Center | Government-side range access; SME consultation; FMS channel |
| FLIR / Teledyne | Cooled MWIR sensor sourcing |
| Edmund / MOOG | EO telephoto + gimbal integration |
| NVIDIA | Jetson AGX Orin + DGSC partnership |

Argus has identified these as the credible partner set. The roadmap does not depend on any single partnership closing — multiple paths exist for CROWS access (Kongsberg, FMS, NSWC) and for sensors (multiple vendors). The first 30 days of the post-award window are gate-checks on partnership progress with explicit fallback paths if any single channel does not close in time.


8. Schedule#

| Phase | Window | Deliverable |
| --- | --- | --- |
| Phase 1 application (this submission) | Now | Public repo + hosted demo + recorded walkthrough + this brief + one-pager |
| Phase 2 pitch | DIU schedule | In-person live demo + Q&A |
| Award & kickoff | DIU schedule | Contract execution; project kickoff |
| Month 1 (post-award) | Month 1 | Hires + Kongsberg engagement + supplementary sensors + Jetson port |
| Month 2 (post-award) | Month 2 | Fine-tuned UAS detector + ByteTrack + range testing |
| Month 3 (post-award) | Month 3 | Live-fire range demonstration + final report + transition recommendation |

9. Compliance checklist#

| CSO requirement | Where |
| --- | --- |
| Submission indicates Part(s) addressed on first page | Cover; top of this brief |
| Open System Architecture | docs/architecture.md + Pydantic contracts + ABCs |
| DoD AI Ethical Principles compliance | docs/ai-ethics.md + audit log |
| Clear ownership / licensing / data rights | docs/licensing.md + LICENSE |
| Not a research-only org, reseller, or integrator | §7 above |
| CAGE / SAM registration | In progress; confirmed before prototype agreement |
| Passive detection / classification / ranging / tracking | §3.2 |
| ~600 m detection / ~100 m engagement (architecturally addressed) | Sensor + range upgrades in docs/roadmap.md; intrinsics override exposed in FileSource for varied optics |
| Group 1 & 2 UAS at < 30 m/s | Architecture-agnostic; fine-tuning scope in docs/roadmap.md |
| Cluttered backgrounds (natural + man-made) | Partial: confidence floor + kinematic gate + tracker confirmation; supervised classifier post-award |
| Adverse conditions (shake, muzzle flash, jitter) | stabilize/perturb.py (demonstrated); production stabilizer in docs/roadmap.md Month 1 |
| Operator workload reduction | HMI surfaces ranked targets, reasoning strings, one-click HITL actions |
| Improved target discrimination | Partial: kinematic gate today; supervised classifier post-award |
| HITL hard gate | §3.7 |
| Graceful fallback on degradation | §3.6 |
| Seamless RWS integration | Non-intrusive overlay posture; docs/integration-contract.md |

10. Closing#

Argus Defense submits this proposal with a working artifact, an honest baseline, a credible post-award plan, and a working partnership pipeline. The submission is calibrated to be defensible — every claim points to either code in the repository or to a named milestone in the roadmap. No claim is vaporware.

The 48-hour PoC artifact answers two reviewer questions immediately: "Can this team architect the system correctly?" and "Can this team execute under pressure?" The roadmap answers the next two: "What partnerships and hires close the remaining gap?" and "How long until you can take this to a range?"

Argus would value the opportunity to deliver against this CSO. We look forward to the Phase 2 pitch.

— Argus Defense, 2026-05-12