AI Denial of Care – Humana Also Uses AI Tool with 90% Error Rate to Deny Care, Lawsuit Claims

Humana, one of the largest health insurance providers in the country, allegedly uses an AI model with an error rate of up to 90 percent to override doctors' medical judgment and deny care to seniors on its Medicare Advantage plans.

Lawsuit Against Humana

According to the lawsuit filed on Tuesday, Humana’s use of the AI model constitutes a “fraudulent scheme” that leaves senior beneficiaries either with crushing medical debts or without the necessary care covered by their plans. Meanwhile, the large insurance company is purportedly reaping a “financial gain.”

The Lawsuit and Compensation Claim

The lawsuit was filed Tuesday in U.S. federal court in the Western District of Kentucky by two people who were covered by Humana Medicare Advantage plans and say they were unjustly denied necessary, covered care, harming their health and finances. The lawsuit seeks class action status for an unknown number of other beneficiaries nationwide who may be in similar situations. Humana offers Medicare Advantage plans to 5.1 million people in the United States.

The AI Model

Both this complaint and the earlier, similar one against UnitedHealth allege that the insurers use the flawed model to pinpoint the exact date to cut off payment for post-acute care covered by Medicare Advantage plans, such as stays in skilled nursing facilities and inpatient rehabilitation centers. The AI-powered model determines these dates by comparing the patient's diagnosis, age, living situation, and physical condition with similar patients in a database of 6 million patients. From that comparison, the model generates predictions for the patient's medical needs, length of stay, and discharge date.
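The complaint describes the model only at this level of detail; its internals are not public. Purely as an illustration of that description, the sketch below shows what a nearest-neighbor-style prediction over a patient database could look like. Every field, weight, and function name here is a hypothetical stand-in, not anything taken from nH Predict.

```python
# Illustrative sketch only: the lawsuit says the model compares a new patient's
# diagnosis, age, living situation, and physical condition against ~6 million
# prior patients to predict length of stay and a discharge date. The real
# model's internals are not public; all names and weights below are assumptions.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class PatientRecord:
    diagnosis_code: str
    age: int
    lives_alone: bool
    mobility_score: int       # hypothetical scale, e.g. 0 (bedridden) to 10 (fully mobile)
    length_of_stay_days: int  # observed stay for historical records

def similarity(a: PatientRecord, b: PatientRecord) -> float:
    """Crude similarity: matching diagnosis plus closeness in age, mobility, and living situation."""
    score = 3.0 if a.diagnosis_code == b.diagnosis_code else 0.0
    score += max(0.0, 1.0 - abs(a.age - b.age) / 20.0)
    score += max(0.0, 1.0 - abs(a.mobility_score - b.mobility_score) / 10.0)
    score += 0.5 if a.lives_alone == b.lives_alone else 0.0
    return score

def predict_discharge(new_patient: PatientRecord,
                      history: list[PatientRecord],
                      admit_date: date,
                      k: int = 5) -> date:
    """Average the stay lengths of the k most similar historical patients."""
    nearest = sorted(history, key=lambda r: similarity(new_patient, r), reverse=True)[:k]
    predicted_days = round(sum(r.length_of_stay_days for r in nearest) / len(nearest))
    return admit_date + timedelta(days=predicted_days)
```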

Claims and Medical Guidelines

However, the plaintiffs allege that the model does not fully account for all of a patient's circumstances, doctors' recommendations, or the patient's actual condition, and that its predictions are unfair and inflexible. For example, under Medicare Advantage plans, patients who spend three days in the hospital are entitled to up to 100 days of covered care in a nursing home. With the model in use, however, patients rarely stay in nursing homes for more than 14 days before claim denials begin.

Impact on Patients and Appeal Outcomes

While few people appeal coverage denials at all, more than 90 percent of those who appealed the AI-based denials had them reversed, according to the lawsuits.

Nonetheless, the insurers continue to use the model, and NaviHealth staff are required to adhere to its AI-driven predictions, keeping patients' post-acute care within 1 percent of the days the model estimates. NaviHealth employees who do not comply face penalties or termination. The lawsuit filed on Tuesday claims that Humana banks on patients' impaired conditions and their lack of knowledge and resources to appeal the erroneous AI-backed decisions.
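To give a sense of how tight that alleged 1 percent tolerance is, the snippet below applies a hypothetical check of the kind the complaint describes; the function and the numbers are illustrative assumptions, not anything taken from NaviHealth's systems.

```python
# Hypothetical reading of the alleged target: actual stay must fall within
# 1 percent of the model's predicted days.
def within_target(actual_days: int, predicted_days: int, tolerance: float = 0.01) -> bool:
    return abs(actual_days - predicted_days) <= tolerance * predicted_days

# Example: for a 14-day prediction, 1 percent is about 0.14 days,
# so even a single extra day falls outside the target.
print(within_target(actual_days=15, predicted_days=14))  # False
print(within_target(actual_days=14, predicted_days=14))  # True
```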

Plaintiffs’ Cases

One plaintiff in the lawsuit filed on Tuesday is Joan Burrows from Minnesota. On November 23, 2021, Burrows, then 86 years old, was hospitalized after falling at home and breaking her leg. Doctors put her leg in a cast and ordered that she put no weight on it for six weeks. On November 26, she was transferred to a rehabilitation center for six weeks. But just two weeks later, Humana began denying coverage. Burrows and her family filed appeals, but Humana denied the appeals, claiming Burrows was able to return home despite being bedridden and using a catheter.

Her family had no choice but to pay out of pocket. They tried to transfer her to a less expensive facility, but the care there was inadequate and her health declined further. Because of the poor quality of care, the family decided to move her home on December 22, even though she still could not put weight on her injured leg or use the bathroom by herself and still had a catheter.

The Second Plaintiff's Case

The other plaintiff is Susan Hagedorn from North Carolina. On September 10, 2022, Hagedorn was hospitalized with a urinary tract infection, sepsis, and inflammation of the spine. She remained in the hospital until October 26, when she was transferred to a skilled nursing facility. At transfer she had eleven discharge diagnoses, including sepsis, acute kidney failure, kidney stones, nausea, vomiting, urinary tract infection, spinal swelling, and a spinal abscess. In the nursing facility she was in severe pain, was on the maximum allowable dose of the painkiller oxycodone, and developed pneumonia.

On November 28, she returned to the hospital for an appointment, at which point her blood pressure had risen, and she was sent to the emergency room. There, doctors discovered that her condition had significantly deteriorated.

Meanwhile, a day earlier, on November 27, Humana had decided to deny coverage for part of her stay in the skilled nursing facility, refusing to pay for the period from November 14 to November 28. Humana stated that Hagedorn no longer needed the level of care the facility provided and that she should return home. The family paid $24,000 out of pocket for her care, and to this day Hagedorn remains in a skilled nursing facility.

Allegations and Relief Sought

The patients claim that Humana and UnitedHealth know the nH Predict model is "highly inaccurate" but use it anyway to avoid paying for covered care and to boost their profits. They describe the denials as "systematic, unlawful, harmful, and oppressive."

The lawsuit against Humana accuses the company of breach of contract, unfair dealing, unjust enrichment, and bad-faith insurance violations in several states. It seeks damages for financial losses and emotional distress, restitution and/or compensation, and an order barring Humana from using the AI-based model to deny claims.

Source: https://arstechnica.com/science/2023/12/humana-also-using-ai-tool-with-90-error-rate-to-deny-care-lawsuit-claims/

