Update, 8/30, 2:50 EST: The United States Army's Deputy of Cybersecurity confirmed to BuzzFeed the existence of a computer security flaw that allows users without proper security clearance to gain unauthorized access. They say the best fix is to make soldiers aware of proper conduct, instead of fixing the technology itself.

The U.S. Army has been aware for years of a major security flaw in the system soldiers use to access computers — and has done nothing to fix it, two sources, including an officer who alerted superiors to the risk, told BuzzFeed.

Update: Roy Lundgren, the Army's Deputy of Cybersecurity, confirmed to BuzzFeed that the security failure exists and has the potential to give users unauthorized access. [Full statement below.]

Today countless computers, and the soldiers who use them, remain vulnerable to a simple hack, which can be executed by someone with little or no security expertise.

The officer, who reported the flaw two years ago, was told to keep quiet, despite evidence of its widespread exploitation. Another soldier, who went to his superiors and even Congress, got no results.

The hack allows users with access to shared Army computers to assume the identities of other personnel, gaining their security clearances in the process, by exploiting issues with the computers' long and buggy log-out process, according to the sources familiar with the flaw.

The officer, an Army lieutenant, spoke on the condition he not be named; he is referred to here as "Mark." He discovered the flaw in October 2011, when he was playing around on his military computer during one of his 18-hour shifts. Being "of the hacker mind-set and being really, really bored," as he puts it, he wanted to see if there were any holes in it.

That's when he discovered the major, and obvious, computer security flaw.

"Oh shit," Mark said to himself when he figured it out. "This isn't good."

He described to BuzzFeed calling in his superiors — two middle-ranking officers, one in military intelligence and the other in computer communications.

As Mark described it, their eyes grew wide.

But, according to Mark, they told him there was nothing they could do. It would cost too much to fix it, they told him. It would require redoing too many contracts. "The term they used is that it would be 'impractical' to try and fix it," he says.

Instead, they made him sign the Army's version of a nondisclosure agreement. If he told anyone else about what he found, he could face prison time, he said.

"I'm showing you this so you can fix this," Mark recounts telling the officers. "This is obviously a huge problem. I'm probably not the only asshole who figured out how to do this."

Update: "If an issue is reported to our cybersecurity directorate, we would normally contact the system owner and ask them for an assessment," the Army told BuzzFeed, not commenting on the response to this specific report. "Often the risk is known and mitigating factors are already being applied and/or the organization has developed a plan of action to correct the issue."

At least one other soldier besides Mark has tried to formally report the security flaw, going to his military superiors as well as Congress and the Pentagon. This soldier's efforts, too, were met with inaction and silence.

Mark made a second attempt to report the security flaw when a new officer replaced one of his superiors. But again, nothing came of it.

"At that point I could try to talk with one of the division-level guys, but I know from personal experience that he is one of the people who plays the game," he said. "I wondered if it would raise a red flag about me if I tried to keep addressing the flaw."

Update: When asked about Mark's non-disclosure form, the Army did not comment. The proper procedure, it said, is for soldiers to report issues within their chain of command and to the supporting professional IT/cybersecurity staff.

Big private tech companies like Google, Facebook, and Microsoft routinely seek out and sometimes pay people like Mark who expose security flaws. Some have set up bounty systems giving any member of the public who finds and reports a bug up to $20,000.

The military has no such system. If reporting to a superior goes nowhere, then in reality, there is little recourse for soldiers who discover computer security problems. They could report a bug to the Department of Defense inspector general, which handles complaints about fraud, waste, and abuse. But that's not an obvious avenue for computer issues. Moreover, if their superiors found out, they could face retaliation.

One refrain in the wake of the National Security Agency leaks is that Edward Snowden should have reported his concerns up the chain of command rather than leaking documents to the press. But the internal reporting system is seriously broken in the military. All too often when a soldier reports misconduct or illegal activity, it is swept under the rug.

Perhaps the most egregious recent example of such a mind-set is the tragically late response to reports of widespread sexual assault in the service. Women's reports weren't just ignored — the victims were subjected to retaliation including but not limited to being barred from medical treatment, having their information made public, and being discharged from the military. Recent pressure on the issue led to an updated version of the Military Whistleblower Protection Act, first created in 1988. The fact that it had to be updated to specifically include people reporting sexual assault speaks to its inadequacy.

Retaliation against internal whistle-blowers is a fact of military life. Between October 2012 and April 2013, the Department of Defense's inspector general's office received 695 complaints about "whistleblower reprisal, restriction of service members from contacting an IG or member of Congress, procedurally improper mental health referrals and senior official misconduct." And those are only the cases that were reported.

Mark's case suggests serious issues with the military's security reporting infrastructure too, even when the issue at hand is ideologically neutral.