Loren Data's SAM Daily™

fbodaily.com
SAMDAILY.US - ISSUE OF OCTOBER 01, 2025 SAM #8710
SOURCES SOUGHT

D -- AFOTEC AI Technology Showcase & Demonstration

Notice Date
9/29/2025 9:20:46 AM
 
Notice Type
Sources Sought
 
NAICS
513210 —
 
Contracting Office
FA7046 HQ AFOTEC A 7K KIRTLAND AFB NM 87117-0001 USA
 
ZIP Code
87117-0001
 
Solicitation Number
A29AFOTEC
 
Response Due
10/25/2025 7:30:00 AM
 
Archive Date
11/09/2025
 
Point of Contact
Austin Stolpestad, Phone: 5058467015
 
E-Mail Address
austin.stolpestad.1@us.af.mil
 
Small Business Set-Aside
None. No set-aside used.
 
Description
1.0 Purpose and Introduction

The Air Force Operational Test and Evaluation Center (AFOTEC) is seeking information from industry partners on their ability to conduct on-site AI Technology Showcases and demonstrations for AFOTEC personnel. The purpose of this RFI is to identify partners with mature, relevant AI platforms and a willingness to demonstrate their capabilities in a hands-on, educational setting to our test teams, analysts, and leadership. This is for market research purposes only and does not constitute a Request for Proposal (RFP).

2.0 Background and Problem Statement

AFOTEC's Test & Evaluation (T&E) lifecycle is currently characterized by labor-intensive document review, fragmented data sources, inaccessible corporate knowledge, and manual reporting processes. To maintain our analytical edge, AFOTEC is seeking to adopt AI-powered capabilities to automate workflows, synthesize information, and enhance data-driven decision-making. We need to expose our workforce to the art of the possible and understand how commercially available technologies can solve our specific T&E challenges.

3.0 Event Concept: AI Technology Showcase & Workshops

AFOTEC envisions a series of 3-4 hour Industry Day events (week of November 3-7) where selected vendors can showcase their AI platforms and capabilities directly to our personnel. The goal is to move beyond high-level briefings and provide tangible, use-case-driven demonstrations. A successful proposal would detail a workshop agenda that includes technology showcases and demonstrations focused on the capability areas described below.

4.0 Requested Information: Capability Showcase Areas

Respondents are requested to provide information (not to exceed 5 pages total) describing their ability to conduct an Industry Day showcase that demonstrates solutions for the five capability areas below. For each area, please describe your proposed approach for the showcase and/or demonstrations.

Capability Area 1: Automated Test Design and Requirements Traceability
Objective: To replace labor-intensive document review with AI-enabled synthesis, creating a dynamic digital thread from system requirements to test outcomes.
Requested Information: What specific AI technologies or platforms in your portfolio address this objective? How would you demonstrate your solution's ability to ingest various requirements documents (e.g., ICDs, CDDs) and automatically extract key entities and map their relationships?

Capability Area 2: Integrated T&E Lifecycle Automation
Objective: To provide a unified digital backbone that automates the end-to-end T&E lifecycle, including document generation, data ingestion pipelines, and dynamic reporting.
Requested Information: What technologies would you showcase to demonstrate an integrated workflow automation engine? How would you demonstrate the automation of a typical document staffing and approval process, including notifications and audit trails?

Capability Area 3: AI-Assisted Test Measures Development
Objective: To turn measures development from a subjective art into a repeatable, data-grounded process using NLP to assist analysts.
Requested Information: What specific NLP or semantic analysis tools would you showcase to address this objective? How would you demonstrate your solution's ability to analyze source documents and automatically recommend candidate Measures of Effectiveness (MOEs), Performance (MOPs), and Suitability (MOSs)?

Capability Area 4: Automated Document Classification and Compliance Enforcement
Objective: To streamline the handling of our growing document sets by automatically tagging, categorizing, and routing files based on their content and sensitivity in accordance with user-defined rules.
Requested Information: What technologies in your portfolio support automated, content-based document classification? How would you demonstrate your system's ability to enforce complex rule sets, such as Security Classification Guidelines (SCGs), on a set of sample documents?

Capability Area 5: Automated Documentation Review and Knowledge Retrieval
Objective: To enhance the quality of test documentation and provide analysts with an intelligent "search" capability across all of AFOTEC's corporate knowledge.
Requested Information: What technologies would you showcase to demonstrate a natural language search capability over a large, unstructured document repository? How would you demonstrate your system's ability to provide direct, cited answers to questions and ensure the factual accuracy of its responses?

Tangible Requirements for AI Adoption at AFOTEC

1. A Centralized, Authoritative T&E Knowledge Repository
Problem Addressed: The memo's most critical finding: "No searchable AFOTEC repository," rampant version control issues, reliance on single individuals (the historian), and catastrophic data loss incidents.
Requirement: An enterprise-wide system that serves as the single, authoritative source of truth for all programmatic artifacts, including test plans, data, models, briefs, and final reports. This system must enforce version control and have a robust data retention policy to prevent accidental or administrative data loss, codifying knowledge management as a system-dependent process.

2. An AI-Powered, Natural Language Search and Retrieval Capability
Problem Addressed: The inability of personnel to find lessons learned, past test designs, or technical details without manually contacting individuals or searching through disconnected drives.
Requirement: A capability built on Retrieval Augmented Generation (RAG) that allows any user to ask plain-language questions across the entire knowledge repository and receive direct, fact-checked answers with citations to the source documents. This capability must include a "hallucination grader" to ensure the factual consistency and trustworthiness required for T&E.

3. An Automated Data Ingestion and Processing Pipeline
Problem Addressed: The time-consuming, resource-intensive, and error-prone manual process of gathering performance metric data from disparate systems, as highlighted by the desire for "analysis tools that can apply machine learning and automation."
Requirement: An automated data pipeline, similar to the Metric Reporting Automation use case, that can connect to various data sources, extract targeted information, perform calculations, and populate standardized dashboards. This would eliminate the manual data wrangling that currently consumes significant analyst time.

4. A Scalable Document Digitization Capability
Problem Addressed: The significant constraint on Digital Engineering (MBSE) and analysis caused by critical data being "trapped within a vast repository of scanned documents," requiring thousands of labor-hours for manual extraction.
Requirement: A capability that uses domain-specific AI to automatically identify and extract structured data from unstructured scanned documents such as engineering drawings, parts lists, and legacy technical reports. This system must be able to process thousands of documents to unlock data for integration into PLM and MBSE systems.

5. An Integrated Workflow and Task Management Engine
Problem Addressed: The documented inefficiency and user frustration with cumbersome tools like AFTOPS and the temporary ACAP staffing solution, which lack proper notifications, tasking, and audit trails.
Requirement: A unified workflow engine that automates the entire document lifecycle (generation, review, approval) with clear tasking, automated notifications, and a searchable audit trail. This system must provide a single view of all tasks, from program milestones to individual analyst assignments, mirroring the success of the AWESOME project at DOT&E.

6. A Model-Based Document Generation Capability
Problem Addressed: The limited adoption of MBSE due to the challenge of linking technical models (e.g., in Cameo) to the generation of standard test documentation, resulting in isolated models with little immediate value.
Requirement: A capability to serve as the "connective tissue" between MBSE tools and the T&E process. The system must be able to ingest data and requirements directly from models via APIs and automatically generate formatted, compliant test artifacts (Test Plans, DMAPs, etc.), thereby driving MBSE adoption by providing a clear return on investment.
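For respondents unfamiliar with the requirements-traceability objective in Capability Area 1, the idea can be sketched in miniature. This is not AFOTEC's method or any vendor's product: the REQ-### identifier convention, the sample statements, and the test-case names are all illustrative assumptions, and a production system would use NLP entity extraction across real ICDs/CDDs rather than a regular expression.

```python
import re

# Hypothetical excerpt standing in for an ICD/CDD; the REQ-### identifier
# convention and the statements themselves are illustrative assumptions.
REQUIREMENTS_TEXT = """\
REQ-001 The system shall ingest requirements documents in PDF format.
REQ-002 The system shall extract key entities and map their relationships.
Background narrative that is not a requirement statement.
REQ-003 The system shall export a requirements traceability matrix.
"""

# Invented test outcomes linking test cases back to requirement IDs.
TEST_RESULTS = [
    ("TC-10", "REQ-001", "pass"),
    ("TC-11", "REQ-002", "pass"),
    ("TC-12", "REQ-002", "fail"),
]

def extract_requirements(text):
    """Pull 'shall' statements tagged with a REQ-### identifier."""
    pattern = re.compile(r"^(REQ-\d+)\s+(.*\bshall\b.*)$", re.MULTILINE)
    return dict(pattern.findall(text))

def build_trace_matrix(requirements, results):
    """Map each requirement to the test cases that exercised it;
    requirements left with an empty list are coverage gaps."""
    matrix = {req_id: [] for req_id in requirements}
    for test_case, req_id, outcome in results:
        if req_id in matrix:
            matrix[req_id].append((test_case, outcome))
    return matrix

requirements = extract_requirements(REQUIREMENTS_TEXT)
matrix = build_trace_matrix(requirements, TEST_RESULTS)
```

The "digital thread" the notice describes is essentially this matrix maintained automatically and at scale: every requirement linked forward to its test outcomes, with untested requirements surfacing immediately as gaps.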
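The RAG-with-citations and "hallucination grader" requirement (Tangible Requirement 2) can likewise be illustrated with a toy sketch. A real system would use embedding-based semantic retrieval and an LLM-based grader; here naive keyword overlap and a word-containment check stand in for both, and the corpus documents, their names, and the sample question are invented for illustration.

```python
# Toy corpus standing in for a knowledge repository; document names
# and contents are invented examples, not real AFOTEC artifacts.
CORPUS = {
    "test_plan_2023.docx": "The radar test used three sorties at Kirtland AFB.",
    "lessons_learned.pdf": "Cold-weather sorties required extra calibration time.",
}

def retrieve(question, corpus, k=1):
    """Rank documents by naive word overlap with the question
    (a production system would use embedding-based semantic search)."""
    q_words = set(question.lower().split())
    ranked = sorted(
        corpus.items(),
        key=lambda item: len(q_words & set(item[1].lower().split())),
        reverse=True,
    )
    return ranked[:k]

def grounded(answer, source_text):
    """Crude stand-in for a 'hallucination grader': every content word
    of the answer must literally appear in the cited passage."""
    stop_words = {"the", "a", "an", "at", "of", "in"}
    answer_words = {w.strip(".,").lower() for w in answer.split()} - stop_words
    source_words = {w.strip(".,").lower() for w in source_text.split()}
    return answer_words <= source_words

# Retrieve the best-matching document, then grade a candidate answer
# against the cited passage before returning it to the user.
doc_name, passage = retrieve("How many sorties did the radar test use?", CORPUS)[0]
answer = "The radar test used three sorties."
```

The design point the notice is driving at is the pairing: retrieval supplies a citable passage, and the grader refuses any answer it cannot ground in that passage, which is what makes the responses trustworthy enough for T&E use.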
 
Web Link
SAM.gov Permalink
(https://sam.gov/workspace/contract/opp/eeceb212babd433b833c55924f219dd8/view)
 
Place of Performance
Address: Kirtland AFB, NM, USA
Country: USA
 
Record
SN07607828-F 20251001/250929230045 (samdaily.us)
 
Source
SAM.gov Link to This Notice
(may not be valid after Archive Date)
