Quiz: Module 6, Lesson 2 - Mission Control: Applying Elite Systems Engineering to AI Projects
Instructions: Answer the following questions based on the lecture notes for M6, L2.
Part 1: Multiple Choice
1. What is the core question that "Verification" answers in the NASA Systems Engineering lifecycle?
   a) "Did we build the right thing?"
   b) "Did we build the thing right?"
   c) "Did the stakeholders like the thing?"
   d) "Did we build the thing on time?"
2. Which of the following is NOT one of the four DORA metrics for measuring engineering performance?
   a) Deployment Frequency
   b) Lead Time for Changes
   c) Number of Lines of Code Written
   d) Change Failure Rate
3. In the NASA project lifecycle, what is the very first step before any technical work begins?
   a) Design Solution Definition
   b) Product Implementation
   c) Technical Requirements Definition
   d) Stakeholder Expectations Definition
Part 2: Short Answer
4. A team builds an AI-powered fraud detection system. It meets all the technical requirements: it runs fast, integrates with the bank's systems perfectly, and correctly identifies 99.9% of fraudulent transactions in the test dataset. However, when deployed, it generates a high number of false positives, freezing the accounts of legitimate customers and causing widespread frustration. Is this a failure of Verification or Validation? Explain why.
Answer Key
1. b) "Did we build the thing right?"
2. c) Number of Lines of Code Written
3. d) Stakeholder Expectations Definition
4. (Sample Answer): This is a failure of Validation. The system passed Verification because it was built correctly to the technical specifications (it met the 99.9% detection rate on the test data). It failed Validation because the team did not build the right thing: the system violated the stakeholders' (and customers') implicit expectation that legitimate transactions would not be blocked. The system works technically, but it does not work for users in the real world, making it a validation failure.
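The base-rate arithmetic behind question 4 can be sketched with hypothetical numbers (the fraud prevalence and false-positive rate below are illustrative assumptions, not figures from the lesson): even a detector that catches 99.9% of fraud can flag mostly legitimate customers when fraud is rare.

```python
# Hypothetical scenario: a verified-to-spec fraud detector that still
# fails validation because most of the accounts it flags are legitimate.

def flagged_breakdown(transactions, fraud_rate, recall, false_positive_rate):
    """Return (true_positives, false_positives) counts for a fraud detector."""
    fraud = transactions * fraud_rate
    legit = transactions - fraud
    tp = fraud * recall                 # fraudulent transactions correctly flagged
    fp = legit * false_positive_rate    # legitimate transactions wrongly flagged
    return tp, fp

tp, fp = flagged_breakdown(
    transactions=1_000_000,
    fraud_rate=0.001,            # assume 0.1% of transactions are fraudulent
    recall=0.999,                # the 99.9% detection rate the spec verified
    false_positive_rate=0.01,    # assume just 1% of legitimate traffic is misflagged
)
precision = tp / (tp + fp)
print(f"Flagged accounts that are actually fraud: {precision:.1%}")
```

Under these assumptions only about 9% of flagged accounts involve real fraud, so roughly nine out of ten frozen accounts belong to legitimate customers: the verification target is met while the stakeholder expectation is not.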