{"id":261165,"date":"2025-04-19T16:14:43","date_gmt":"2025-04-19T07:14:43","guid":{"rendered":"https:\/\/designcopy.net\/dems-sound-alarm-over-doge-risky-federal-ai-practices\/"},"modified":"2026-04-06T16:15:55","modified_gmt":"2026-04-06T07:15:55","slug":"dems-sound-alarm-over-doge-risky-federal-ai-practices","status":"publish","type":"post","link":"https:\/\/designcopy.net\/ko\/dems-sound-alarm-over-doge-risky-federal-ai-practices\/","title":{"rendered":"Dems Sound Alarm Over DOGE\u2019s Risky Federal AI Practices"},"content":{"rendered":"<p>Why is <strong>Elon Musk\u2019s efficiency squad<\/strong> plugging <strong>federal data<\/strong> into <strong>unvetted AI systems<\/strong>? That\u2019s what dozens of Democrats want to know as they demand answers from OMB Director Vought about DOGE\u2019s <strong>unauthorized AI activities<\/strong>.<\/p>\n<p>The <strong>Department of Government Efficiency<\/strong>, led by the same guy who runs xAI, is apparently feeding <strong>sensitive government information<\/strong> into AI tools without proper approval. Nothing sketchy about that, right?<\/p>\n<p>Lawmakers are freaking out over reports that DOGE affiliates have been shoving federal data into unapproved AI systems. The problem? This sensitive info could end up training future commercial AI models. Your tax data could be teaching computers how to think. Sleep tight.<\/p>\n<p>The <strong>security risks<\/strong> are no joke. Once data gets fed into these systems, the operator effectively controls it for good. That\u2019s a huge <strong>breach of public trust<\/strong>. 
Not to mention potential violations of the <strong>Privacy Act<\/strong>, E-Government Act, and FISMA.<\/p>\n<p>Reports suggest Education Department data has already been fed into AI tools, and there was even a plan to scan OPM emails. The GSAi chatbot built on <a data-wpel-link=\"external\" href=\"https:\/\/fedscoop.com\/doge-ai-unauthorized-lawmakers-letter\/\" rel=\"nofollow noopener external noreferrer\" target=\"_blank\">commercial large language models<\/a> has also raised significant concerns among lawmakers.<\/p>\n<p>Let\u2019s be real \u2013 <strong>generative AI<\/strong> is still pretty dumb. These models make tons of mistakes and carry serious biases. They\u2019re definitely not ready for government decision-making without proper vetting. The FTC has already warned these tools can perpetuate illegal discrimination. Oops.<\/p>\n<p>Rep. Stansbury isn\u2019t waiting around. She\u2019s introduced a <strong>Resolution of Inquiry<\/strong> demanding documents about DOGE\u2019s AI use, including which systems they\u2019re using and what federal data they\u2019re feeding them.<\/p>\n<p>She wants details on authorization paperwork, privacy assessments, and data sources.<\/p>\n<p>The whole mess might be breaking multiple laws. DOGE appears to be using tools like Inventry.ai, which lacks FedRAMP approval \u2013 a big no-no for federal agencies. 
With a deadline of <a data-wpel-link=\"external\" href=\"https:\/\/www.nextgov.com\/artificial-intelligence\/2025\/04\/house-dems-demand-doge-immediately-terminate-unauthorized-ai-use\/404657\/\" rel=\"nofollow noopener external noreferrer\" target=\"_blank\">April 25<\/a> set for OMB to respond to the lawmakers\u2019 concerns, pressure is mounting for transparency and accountability.<\/p>\n<p>Rep. Connolly called it \u201creckless AI misuse\u201d that disregards <strong>data privacy<\/strong> and <strong>cybersecurity standards<\/strong>.<\/p>\n<p>Meanwhile, the GAO has been pushing for <strong>stronger AI accountability<\/strong> across government. Their recent report found many agencies have inaccurate AI inventories.<\/p>\n<p>Thirty-five recommendations later, we\u2019re still waiting for proper oversight.<\/p>\n<p><!-- designcopy-schema-start --><br \/>\n<script type=\"application\/ld+json\">\n{\n  \"@context\": \"https:\/\/schema.org\",\n  \"@type\": \"Article\",\n  \"headline\": \"Dems Sound Alarm Over DOGE\u2019s Risky Federal AI Practices\",\n  \"description\": \"Why is  Elon Musk\u2019s efficiency squad  plugging  federal data  into  unvetted AI systems ? 
That\u2019s what dozens of Democrats want to know as they demand answers fr\",\n  \"author\": {\n    \"@type\": \"Person\",\n    \"name\": \"DesignCopy\"\n  },\n  \"datePublished\": \"2025-04-19T16:14:43\",\n  \"dateModified\": \"2026-03-07T13:56:42\",\n  \"image\": {\n    \"@type\": \"ImageObject\",\n    \"url\": \"https:\/\/designcopy.net\/wp-content\/uploads\/logo.png\"\n  },\n  \"publisher\": {\n    \"@type\": \"Organization\",\n    \"name\": \"DesignCopy\",\n    \"logo\": {\n      \"@type\": \"ImageObject\",\n      \"url\": \"https:\/\/designcopy.net\/wp-content\/uploads\/logo.png\"\n    }\n  },\n  \"mainEntityOfPage\": {\n    \"@type\": \"WebPage\",\n    \"@id\": \"https:\/\/designcopy.net\/en\/dems-sound-alarm-over-doge-risky-federal-ai-practices\/\"\n  }\n}\n<\/script><br \/>\n<script type=\"application\/ld+json\">\n{\n  \"@context\": \"https:\/\/schema.org\",\n  \"@type\": \"WebPage\",\n  \"name\": \"Dems Sound Alarm Over DOGE\u2019s Risky Federal AI Practices\",\n  \"url\": \"https:\/\/designcopy.net\/en\/dems-sound-alarm-over-doge-risky-federal-ai-practices\/\",\n  \"speakable\": {\n    \"@type\": \"SpeakableSpecification\",\n    \"cssSelector\": [\n      \"h1\",\n      \"h2\",\n      \"p\"\n    ]\n  }\n}\n<\/script><br \/>\n<!-- designcopy-schema-end --><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Democratic lawmakers expose alarming federal data breaches as DOGE feeds sensitive government information into unauthorized AI systems. 
Federal security hangs in the balance.<\/p>","protected":false},"author":1,"featured_media":261164,"comment_status":"closed","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"_et_pb_use_builder":"","_et_pb_old_content":"","_et_gb_content_width":"","footnotes":""},"categories":[268],"tags":[1519],"class_list":["post-261165","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-cybersecurity-ai","tag-ai-regulation","et-has-post-format-content","et_post_format-et-post-format-standard"],"_links":{"self":[{"href":"https:\/\/designcopy.net\/ko\/wp-json\/wp\/v2\/posts\/261165","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/designcopy.net\/ko\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/designcopy.net\/ko\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/designcopy.net\/ko\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/designcopy.net\/ko\/wp-json\/wp\/v2\/comments?post=261165"}],"version-history":[{"count":5,"href":"https:\/\/designcopy.net\/ko\/wp-json\/wp\/v2\/posts\/261165\/revisions"}],"predecessor-version":[{"id":264910,"href":"https:\/\/designcopy.net\/ko\/wp-json\/wp\/v2\/posts\/261165\/revisions\/264910"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/designcopy.net\/ko\/wp-json\/wp\/v2\/media\/261164"}],"wp:attachment":[{"href":"https:\/\/designcopy.net\/ko\/wp-json\/wp\/v2\/media?parent=261165"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/designcopy.net\/ko\/wp-json\/wp\/v2\/categories?post=261165"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/designcopy.net\/ko\/wp-json\/wp\/v2\/tags?post=261165"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}