Overview
Title
Submission for OMB Review; Comment Request
Agencies
Department of Defense
ELI5 AI
The U.S. Army wants to see if a new game can help them understand how people think about systems, which is useful for jobs like fixing cyber problems and building things. They are asking people to play the game and say what they think, but there are questions about how much it will cost, why they only picked certain players, and how they will keep everyone’s information safe.
Summary AI
The Department of Defense, through the U.S. Army Research Institute for the Behavioral and Social Sciences (ARI), is seeking public comments on a new information collection proposal. This proposal involves evaluating a game designed to assess systems thinking abilities, which are important for jobs in fields like cybersecurity and engineering. Participants, who are freelance workers from Amazon's Mechanical Turk platform, will play the game, answer evaluation questions, and provide demographic information. Feedback will help refine the game's usability and clarity, and findings will be summarized in a technical report. Public comments on this proposal are open until March 29, 2021.
Abstract
The Department of Defense has submitted to OMB for clearance the following proposal for collection of information under the provisions of the Paperwork Reduction Act.
Analysis AI
This Federal Register document presents a notice from the Department of Defense announcing a new information collection proposal from the U.S. Army Research Institute for the Behavioral and Social Sciences (ARI). The proposal seeks feedback on a game-based assessment tool designed to measure systems thinking abilities, which are crucial for success in technical career fields such as cybersecurity and engineering. The assessment will involve participants recruited from Amazon's Mechanical Turk (MTurk), who will complete the game, answer evaluation questions, and provide demographic information. The deadline for public comments is March 29, 2021.
Summary of Key Points
The survey aims to evaluate whether a game designed to measure systems thinking performs as intended and is relevant to specific job roles. Respondents are freelance workers on MTurk who will voluntarily take part in the study and receive compensation. The data gathered will help refine the game's clarity and usability.
Issues and Concerns
Several significant issues arise from this document:
Cost Transparency: There is no mention of the total cost involved in conducting this survey, including payments to the MTurk participants. This omission makes it difficult for the public to evaluate whether the expenditure might be excessive or unjustified.
Participant Selection Method: The document does not adequately justify the choice of MTurk, a platform known for a particular demographic profile, and sole reliance on this platform might skew the diversity of participants. This could affect the reliability and validity of the feedback collected.
Complex Language: The document contains technical jargon that could be challenging for the general public to comprehend. This lack of clarity could impede transparency and the public's ability to engage meaningfully with the proposal.
Data Protection Concerns: The document lacks details regarding the safeguarding of collected demographic information. Given the sensitivity of personal data, this raises potential privacy concerns.
Broad Public Impact
For the general public, the document outlines a procedural yet important step in developing assessments that can indirectly affect fields that depend heavily on systems thinking capabilities. Public engagement and feedback, however, might be limited by the aforementioned issues, especially those concerning transparency and participant diversity.
Impact on Stakeholders
Participants on MTurk: Freelancers on MTurk will benefit from this proposal through compensation for their participation. However, the selection of MTurk as the platform could raise questions about equitable participant representation.
Defense Technology Sectors: Sectors reliant on robust systems thinking abilities, such as cybersecurity and engineering, may see potential benefits from improved assessment tools. However, the reliability of the tool remains uncertain without addressing diversity and methodology concerns.
General Public: Members of the public who might seek to offer input could be hindered by the document’s complexity and lack of budget transparency, reducing their ability to gauge the proposal’s implications or to fully trust its outcomes.
In summary, while the proposal aims to improve assessment methods for vital career fields, the current approach raises several concerns that could affect its effectiveness and public perception. Greater transparency about costs, participant selection, and data protection, along with plainer language, could significantly enhance public trust and engagement.
Issues
• The document does not specify the total cost involved in conducting the information collection, including payments to participants on MTurk, potentially hindering assessment of whether the spending could be considered wasteful.
• The choice of freelance workers from Amazon's Mechanical Turk (MTurk) favors that platform over others, yet the justification for selecting MTurk participants is not clearly stated.
• The method of recruiting participants solely through the MTurk website might limit the diversity of participants, potentially affecting the validity of the assessment evaluation.
• The document includes complex language and technical terminology that may not be easily understood by the general public, potentially limiting transparency.
• The document does not specify how the collected data will be protected, a notable gap given that demographic information is being gathered.