HSPN ELEVATE: The Common Thread Linking Documentation, Satisfaction, and Revenue Protection

This article is sponsored by nVoq. It is based on a Hospice News discussion with Jason Banks, vice president of post-acute business development at nVoq, which took place on September 7, 2023, during the Hospice News ELEVATE Conference. The article below has been edited for length and clarity.

Hospice News: Today we will talk about the common threads linking clinical documentation, satisfaction and revenue protection. Let's dig into the problem behind the renewed focus on quality in hospice.

Jason Banks: In addition to the clinician shortage we've talked about, a lot of hospices are going to start to see increased scrutiny in every area of the business, and CMS is pretty clear about this. Whether that's short lengths of stay, longer lengths of stay, general inpatient (GIP) level-of-care qualification, medical eligibility or medical necessity requirements, every area of hospice is under the microscope now.


This seems to go in waves. We had a lot of this in home health, and now it’s hospice’s turn to be scrutinized. As Dexter pointed out earlier, I think it’s interesting that hospice organizations seem to just not win, right? Your GIP length of stay is too short, your GIP length of stay is too long. It’s like Goldilocks and the Three Bears, unfortunately.

I have my own opinions about it. I think the requirements for GIP are way too stringent. Having run four inpatient units here in Chicago myself, I saw times when an exacerbation of symptoms took longer than a four- or five-day period to get under control. There are also tons of placement issues that nobody ever talks about, right? How do you get somebody like me, who's never taken care of mom or dad, to feel comfortable going into the home environment to provide 24/7 care, with maybe a half hour's worth of instruction, and trust me to take care of mom or dad in the home? It's just not a viable option.

A lot of times we unfortunately had to eat a lot of the reimbursement, because we had to switch patients to a routine level of care, even if they were in our inpatient unit, while our social work team worked on placement or caregiver issues. It's tremendously painful.


I think there are easy solutions to this, which CMS has started to look into. One of those is to provide respite care in the home. I don't know why we're not doing this today. It makes total sense to provide respite care in the home. It does not make sense to transport fragile, terminally ill patients out of their homes in order to provide respite care for the caregiver.

There are also things I think CMS could do very quickly to improve the quality of care. One of them is better coordination between the auditors that oversee hospice and those that oversee senior living. As I talked about earlier today, if you don't own both entities, there's friction there. Some of the friction comes in: "Okay, I've got a hospice patient inside of a skilled nursing facility (SNF)." "Well, don't provide full rails, because they're entrapment hazards inside of a SNF." "Well, they're hospice patients. They're going to fall, and if they fall, they're going to die." You have this imbalance: we're trying to find the right balance from a SNF perspective and not get dinged on an audit, versus what's right for the patient and family at the end of life. Those two don't match, and nobody's really talking about that.

Another one is psychotropic medications. They are very common in hospice. By the way, mood-altering medications are needed; you would want them yourself. Yet in senior living environments, facilities get dinged for them. We get told, "Don't bring the psychotropic medications into the facility, even if they're hospice patients." I think there are simple fixes that can be employed. Then the more difficult ones are things like extending GIP to a longer time cycle and moving respite care to the home. I also feel like there should be some intermediary between routine care and continuous care. It shouldn't be all or nothing. Why does it have to be an eight-hour time period? Why can't it be six hours or four hours at a different level of reimbursement?

There are things that I think CMS can look at and fix fairly quickly that would alleviate some of the burden put on these hospice providers. The net of it is that the folks running hospice and palliative care have so much on their plate regulatory-wise.

It’s an administrative burden.

One example of that is the addendum for non-covered services. If you are a hospice worth its salt, you were already providing services for diagnoses that weren't the primary diagnosis for that hospice patient all along. We didn't need to be told to do that, and we certainly didn't need a long administrative process to administer the addendum for non-covered services. Why not make it an all-inclusive benefit where we are reimbursed not just for the primary hospice diagnosis, but for exacerbating diagnoses as well? Things like complex wound care. Even though that may not be the primary driver for hospice care, why aren't hospices reimbursed at the appropriate level to take care of all these other comorbid conditions? By the way, we were eating that cost anyway. We were doing it anyway. There was no hospice patient with an underlying disease that we knew about that we weren't treating as a part of our symptom management.

It seems to me it was a problem that didn't need to be solved at the time. That's where nVoq comes in from a speech recognition perspective and says, "Hey, clinician. Here's an area you missed from a documentation perspective, where you could get dinged by CMS for not documenting this element or for not documenting it this way." We teach things like defensive documentation, which is unfortunate that we have to teach, but we do.

Walk us through why there's a renewed focus on quality in 2024, and how that ties into what you were saying about clinical documentation.

When we first provided speech recognition to clinicians in hospice, we thought the efficiency gained in documentation would mean a higher level of documentation quality, and it did. What clients also started to ask us to do was provide some visual cues to clinicians at the time of documentation that would further enhance that quality.

One example is the certification of terminal illness (CTI). We can tell the hospice physician who's writing the CTI, "Hey, Doc, you want to make sure that you include the PPS score, the prognostic statement, the disease progression, all of the elements that CMS requires in order to pass survey readiness from a medical eligibility perspective." If the doc misses one of those, it reminds them. It pops up and says, "Hey, you missed the PPS score. You missed the prognostic statement. You missed disease progression. You missed something that CMS requires in order to determine the medical eligibility of that patient."

We started to build out a feature that we call Note Assist. It's the first time that our AI and machine learning listen to the clinician in real time, pull data elements out and say, "Yes, this meets the requirement. No, this doesn't meet the requirement." With that, we started to find multiple use cases for it, not just hospice CTIs: IDG notes, recertification notes, even down to psychosocial assessments and spiritual care assessments. Agencies have a number of internal components they require as a part of their documentation, and we have built those into the system, which enables the clinician to get the documentation right the first time, every single time.

What you find is that clinicians thrive at the bedside care of patients and families. They love doing it. They don't love doing documentation, but they do want to do a good job at it. They're just so overwhelmed at the time of doing it, which is normally at the kitchen table at seven or eight o'clock at night while they're trying to clean up from dinner, and the quality ultimately suffers. The more we can push that documentation upstream, the more immediately we see differences. The documentation gets done way closer to the visit time. It doesn't go back and forth between QA and the clinician nearly as much as it used to. Ultimately, that results in clinician satisfaction and clinician retention. We measure all of those things.

Documentation errors are definitely one of the top reasons regulatory attention arrives at a hospice's doorstep, and we've heard that making documentation foolproof helps clients meet those quality challenges. I wanted to ask you a little bit more about the speech recognition tool you're talking about. What comes to mind for me is trying to use it on your cell phone and how often it gets things right versus wrong, but go ahead and tell us a little bit more about it.

That is a common thread. We hear a lot from organizations that say, "Hey, we've tried to use speech recognition before and it just bombed out on us," and we hear that across the board. We say, "Well, what did you use?" They say, "Well, whatever is available on our phone or our tablet device." Normally, the first frustration point they have is that it's not accurate. It's not recognizing the medical terminology, or the dialect or accent of the individual.

What they find when they use us is that they're immediately over 95% accurate when they first start, and it gets better from there. We build out all of our medical dictionaries, we build out all our medication formularies, and we update them quarterly. It's highly accurate right out of the box, so clinicians are not going to get frustrated with the tool, which would ultimately be counterproductive to why they try it in the first place. They wind up not sticking with commercially available solutions because those just aren't accurate, and they don't have time to train the system on medical terminology. They just wind up fizzling out and stop using the platform altogether. We find that time and time again.

Then we work with clinicians on both an individual and a group basis, and we start to do things like add proper nouns to the dictionary. For instance, you might have local physicians or local facilities in your particular geography that need to get added to the dictionary. We show the provider organization how to add those to the dictionary so that they show up right every time. It really increases the accuracy. That's one of the biggest components to adoption: this has got to be very close to 100% accuracy right out of the gate. If clinicians start using a tool that's not accurate, or it's buggy, or it has downtime or any of that, they will stop using it. It will add to their frustration, which is exactly the opposite of the effect you're trying to achieve.

How have you seen speech recognition impact quality at nVoq?

It's amazing, because we have a way to go into an EMR and tell an organization, here are the notes where you have deficiencies. The vast majority of hospices that we've had the good fortune of working with have a significant deficiency in documentation, and we measure that quality right off the bat.

What we do is go in and start reading all of their hospice CTIs, recertification narratives, admission notes and psychosocial notes, and we look for the key data elements that CMS requires. Without any hesitation, I can tell you that about 30% of all documentation is deficient, deficient to a point where the organization is at very high risk of two different things. One is survey readiness. The other is a claw back of revenue. For those of you who are billing or are experts in hospice billing, as you know, CMS can pay a claim and then claw it back later based on a wide variety of factors: medical eligibility, lack of documentation. At a bare minimum, you've got a lot of rework to do to prove that the documentation matches the actual reality of the patient.

Again, we’re trying to push all of those things as far upstream as possible, telling the clinician, “Hey, these are the things that CMS requires that you document. You don’t want to get it back from QA. We don’t want you to get it back from QA. We’re going to give you this cue card of what to document in the medical record that will pass muster when it gets to CMS auditors, or payment regulatory bodies, or your own internal QA, so that you’re not having to do the documentation over again.”

We usually see about a 15% improvement in documentation quality simply by using speech recognition alone. Then we see another 20% to 30% bump in quality by using Note Assist, the feature on top of it. We start to measure those data elements. Again, this is really intended for organizations getting to a size where it becomes difficult to manage documentation quality.

I know for us, when I ran the hospice here in Chicago, we had 500 admissions a month. Our census was about 550, so you could calculate our average length of stay: about 35 days. That's a highly acute population. We had enormous manpower, enormous horsepower, looking at everything from our admission documentation to our hospice CTI documentation. Even if you're an organization that has applied a lot of that manpower, there's a significant cost. In many circumstances, the people reviewing the hospice documentation were RNs or above. They were advanced practice nurses, and some of them were doctors, actually reviewing the medical eligibility documentation requirements and the CTI. It's a very costly endeavor.
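As an aside, the average length of stay quoted here can be sanity-checked with Little's law (average census equals admission rate times average length of stay). The sketch below is an illustration only; the 30.4 days-per-month constant is an assumption, not a figure from the discussion.

```python
# Little's law: average census = admission rate x average length of stay.
# The 500 admissions/month and ~550 census figures come from the discussion;
# days_per_month = 30.4 is an assumed average calendar month.

admissions_per_month = 500
average_census = 550
days_per_month = 30.4

admission_rate = admissions_per_month / days_per_month    # patients admitted per day
average_length_of_stay = average_census / admission_rate  # days

print(round(average_length_of_stay, 1))  # about 33 days, in line with the ~35 quoted
```

The small gap between the computed ~33 days and the quoted ~35 is expected for a back-of-the-envelope figure given in conversation.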

This is where we look at how we can leverage AI and machine learning to replace some of those administrative functions. I get asked a lot, "What is nVoq's position on generative AI?" I tell them, "We don't have a position on it," meaning we're not believers that we will ever, or should ever, replace clinical judgment. We are here to help the clinician get more efficient at what they're doing and provide a higher quality of documentation, but at no point do I ever see us actually doing documentation for the clinician. That needs to come from the clinician. It's sacred.

We have to make them more efficient, we have to make them more effective in what they do, but ultimately, it still relies on the clinician. Then the administrative functions, we can provide AI around, we can provide machine learning around, to help assist with that: cut down the cost, improve the quality. We’re never going to– gosh, I hope we’re never going to have robots in the home that are taking care of patients and families.

Moving on to the changes that you've seen tied to that clinician satisfaction piece, dig into that a little bit more.

Yes. We measure clinician satisfaction in a multitude of different ways, and it manifests itself in a multitude of different ways. Obviously, you have key indicators that tell you a clinician is more satisfied: they have less after-hours documentation, and they're getting their notes done much quicker. There are also lagging indicators of success.

We actually survey clinicians and gather their sentiment on how they feel about using speech recognition. We also start to measure things like retention rates and tie those to the reasons clinicians are either staying with or leaving the organization. I can tell you, we did extensive work on exit surveys when I ran the hospice organization here in Chicago. We asked every clinician who left voluntarily why they were leaving, but we had a fatal flaw: we didn't peel the onion enough.

Most of the clinicians, the overwhelming majority, left the organization because of work-life balance. That was the number one reason clinicians were leaving. If we had been able to peel that onion some more, we would have asked, "What do you mean by work-life balance? Is it drive time that's burdening you? Is it after-hours documentation time? Is it just the mechanics of the actual visit?" What we would have found is borne out by a very large independent study, I think the first ever done on hospice nurses in the hospice industry, conducted by the Amity Group. It has about 1,000 responses, and it just came out. Overwhelmingly, the clinicians said the number one issue, the reason they had either left their organization or considered leaving the organization or the profession, was additional documentation time. That was really eye-opening.

We spend a lot of time measuring how we can impact the leading indicators of clinician dissatisfaction and how we can strip those out. Some of our RN case managers were carrying 20, 22, 23 patients on their caseloads. If you think about the acuity level of a 35-day length of stay, that means that every other day, a patient on their caseload is dying. As we all know, the admission and the death are the most time-consuming pieces of the hospice life cycle.

I think nVoq speech recognition is a piece of the puzzle, it’s not the entire puzzle. There are things that can be applied in terms of caseload. There are things that can be applied in terms of routing of your clinicians so that they’re more centrally located from a living environment to where they’re going to be serving out in the community. There’s lots of different things that can be done.

We played with a lot of different models when we were trying to figure out how to improve that work-life balance. We had an admission care navigator model where the admission nurse would do both the admission visit and the 24-hour follow-up visit, so as to not disrupt the RN case manager. Then, theoretically, that hospice case would fall right in line with whatever visit frequency the RN case manager could put on his or her schedule. Sometimes it worked, sometimes it didn't. We also tried seven-day work weeks, where we split caseloads between two nurses to create a seven-day coverage model versus the traditional five-day coverage model.

Again, I think there are lots of opportunities on the provider side. There are also lots of opportunities on the CMS side to accommodate some of this.

That's an interesting point about how technology plays into retention. Now I want to move on to the revenue piece. How does this relate to the bottom line at a hospice organization?

We did a large study with Amedisys, presented earlier this year at a post-acute long-term care conference in Tampa, where we looked at the ability of speech recognition to drive down Medicare claim denials. We found that 15% of Amedisys' Medicare claims were at risk of denial because the hospice CTI was missing one or more of those key data elements we talked about earlier: the PPS score, the prognostic statement, the disease progression, the ADLs. We took them from 15% of claims at risk of denial to 3% in about nine months. For an organization like Amedisys, that's millions and millions of dollars.

For any other organization working with us, we think we can make a dramatic impact by reducing their QA spend, improving their first-time claim pass-through via the quality of the documentation, and overall protecting them from claw backs of revenue. In addition, and we've started to measure this over time, we are seeing a higher retention rate among clinicians who use this, and some of the agencies we're working with are starting to use those clinicians in testimonial videos as recruiting tools. They're recording clinicians who use our platform and putting those videos out on social media, saying this is a way we invest in our clinicians to make their work-life balance better. Ultimately, we're starting to see some of the data come in about increased applicants and increased capacity at those organizations. It's really interesting.

The area that we still have not yet measured, but I think it will bear some fruit, is the impact on the patients and families themselves. If you think about it, this is a change of practice. For the first time ever, hospice patients and families are able to hear what goes into the medical record. For many years, I sat in patients’ homes typing a note. They had no idea what I was doing, probably wondering if I’m playing Tetris or something, and thinking to themselves, “This guy is sitting here for a half an hour, 45 minutes. It’s weird.” They’re starting to hear what’s going into the medical record.

If you know anything about hospice CAHPS, there's one question on the survey that dictates how the family will answer every other question: how did I feel about the care that was being provided to me? We're getting comments back from nurses, some social workers and chaplains saying, "I get comments from the family saying, thank you. Thank you for letting me know what's actually going into our medical record."

nVoq Incorporated provides a HIPAA and PCI-DSS compliant, cloud-based speech recognition platform supporting a wide variety of healthcare delivery scenarios including post-acute care with an emphasis on home health care and hospice. To learn more visit: https://www.nvoq.com/.
