DARPA: SBIR Opportunity: Third Party Verification of COTS Compliance with Requirements

Suspense Date: 8 June 2021

Category

Opportunity

DoD Communities of Interest

Cyber

Subject

SBIR Opportunity: Third Party Verification of COTS Compliance with Requirements

Due Date

8 June 2021

Government Organization

Defense Advanced Research Projects Agency (DARPA)

Description
The Defense Advanced Research Projects Agency (DARPA) Small Business Programs Office (SBPO) is issuing an SBIR/STTR Opportunity (SBO) inviting submissions of innovative research concepts in the technical domain(s) of Electronics, Information Systems. In particular, DARPA is interested in understanding the feasibility of Third Party Verification of COTS Software Compliance with Requirements. This SBO will open for proposals on May 06, 2021 and close at 12:00 p.m. ET on June 8, 2021.

TOPIC OVERVIEW
a. Objective
Develop and test lightweight techniques by which third parties can conduct rapid postmarket product verification and validate vendor software compliance with usage requirements.

b. Description
DARPA seeks to develop a solution to identify vulnerabilities and defects in the software and firmware of high-risk, cyber-physical systems such as unmanned sensors, unmanned ground vehicles, Internet of Things (IoT) devices, or medical devices that have the potential for impact at a national scale. The developed techniques should produce direct evidence, with some degree of confidence, that the system under analysis faithfully implements its requirements. Systems of interest include but are not limited to resource-limited embedded systems. General-purpose computing devices (e.g., smartphones) are not of interest at this time. Inputs to the proposed solution should be limited to device requirements and software binaries. Highly automated solutions are sought, although human-in-the-loop solutions are acceptable. In either case, producing the evidence should take no more than a small number of days.
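
To make the binary-only input constraint concrete, the following is a minimal sketch assuming the open-source Python binary-analysis framework angr, one possible tool rather than one named by the topic; the firmware filename and the address of the prohibited routine are hypothetical placeholders. It illustrates one form such direct evidence could take: showing that no execution path from the entry point reaches functionality forbidden by a usage requirement.

    # Minimal sketch: binary-only check that a forbidden routine is unreachable.
    # Assumes angr is installed; the binary path and address are hypothetical.
    import angr

    BINARY = "device_firmware.elf"      # hypothetical firmware image
    FORBIDDEN_ADDR = 0x401230           # hypothetical address of a banned routine

    proj = angr.Project(BINARY, auto_load_libs=False)

    # Lift the binary into angr's intermediate representation and explore
    # symbolically from the program entry point.
    state = proj.factory.entry_state()
    simgr = proj.factory.simulation_manager(state)
    simgr.explore(find=FORBIDDEN_ADDR)

    if simgr.found:
        print("Requirement violated: a path reaches the forbidden routine.")
    else:
        print("No violating path found in the explored state space.")

In practice the explored state space is bounded, so a negative result here is evidence with limited confidence rather than a proof, consistent with the "some degree of confidence" language above.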

c. Phase I
As a proof of concept, at the end of Phase I, performers should demonstrate the ability to lift software binaries from a relevant platform into an analyzable form and the ability to analyze the software for a relevant property. The proof of concept may include human-in-the-loop analysis but should outline which portions can be automated in Phase II. Phase I will also include a study on the class of requirements that can be addressed and, importantly, a characterization of the class of requirements that cannot be addressed. Proposers that can provide documentation that they have produced the proof of concept and can provide the results of the requirements study will be considered for a Direct to Phase II (DP2) award.
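
As an illustration of the Phase I deliverable of an analyzable representation, here is a minimal sketch, again assuming angr and a hypothetical firmware image, that lifts a binary into a recovered control-flow graph, one common form against which later property analyses could be run.

    # Minimal sketch: recover an analyzable representation (a CFG) from a binary.
    # Assumes angr is installed; the firmware path is a hypothetical placeholder.
    import angr

    proj = angr.Project("device_firmware.elf", auto_load_libs=False)
    cfg = proj.analyses.CFGFast()       # static control-flow graph recovery

    print(f"Recovered {len(cfg.kb.functions)} functions")
    for func in cfg.kb.functions.values():
        print(hex(func.addr), func.name)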

Schedule/Milestones/Deliverables Phase I fixed payable milestones for this program should include:
· Month 1: Kickoff meeting materials describing the technical approach and expected outcomes
· Month 3: Report on the production of an analyzable representation of the target software
· Month 5: Interim report describing properties to be analyzed and the analysis approach
· Month 7: Demonstration of the analysis
· Month 10: Final Phase I Report summarizing approach; characterization of analyzable properties; description of processor architectures supported; results; comparison with alternative state-of-the-art methodologies

d. Phase II
Phase II performers will develop, verify, and validate techniques that provide evidence that software behaviors comply with usage requirements. The approach must be generalizable to multiple device families. The performer must substantiate that the false negative and false positive rates are less than 5%. Justification must be provided for which classes of systems and requirements this approach addresses and which classes of systems and requirements are unsuited for this approach.
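
For reference, the error-rate targets reduce to simple ratios over a labeled evaluation set: the false positive rate is FP / (FP + TN) and the false negative rate is FN / (FN + TP). A minimal sketch with hypothetical counts:

    # Minimal sketch of the 5% error-rate check; the counts are illustrative only.
    def error_rates(tp: int, fp: int, tn: int, fn: int) -> tuple[float, float]:
        """Return (false_positive_rate, false_negative_rate)."""
        return fp / (fp + tn), fn / (fn + tp)

    fpr, fnr = error_rates(tp=97, fp=4, tn=96, fn=3)   # hypothetical evaluation counts
    print(f"FPR={fpr:.1%}, FNR={fnr:.1%}, meets 5% target: {fpr < 0.05 and fnr < 0.05}")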

Schedule/Milestones/Deliverables Phase II fixed milestones for this program should include:
· Month 2: Kickoff meeting materials, including details on technical approach and execution plan
· Month 6: Interim report on assurance generation
· Month 9: Interim report describing the analysis, including the practical bound of analysis for system size and complexity
· Month 12: Interim report quantifying system performance, comparing with alternative state-of-the-art approaches or other conventional methods, and documenting lessons learned
· Month 18: Demonstration of end-to-end capabilities on a representative device
· Month 24: Final Phase II report documenting final prototype architectures and algorithms; methods; results; comparisons with alternative methods; and quantification of accuracy, robustness, and generalizability

e. Dual Use Applications (Phase III)
IoT devices and other types of embedded computing devices are proliferating and increasingly perform crucial functions in many situations. Yet, it remains difficult for end users to ascertain the fitness of these devices for the purposes for which they are being used.

The reliability, safety, and security of such devices are often judged on nothing more than the developer's reputation. The military is increasingly utilizing mass-produced devices to support operations, again with only the developer's reputation and promises attesting to the fitness of these devices for the use cases in which they are deployed. The ability to determine whether a device meets the requirements of the use case in which it will be used is valuable for supporting mission-critical capabilities both in the DoD and in the private sector.

Website

https://beta.sam.gov/opp/fa8c40ef210a436987a2a3b963e5de8b/view