Five tips keep you in the black and off the feds' radar screen.

1. Compare yourself. First determine your own average visits per episode. Then benchmark that figure against both your own earlier-period numbers and your peers' numbers for the same period, Vance recommends. (A quick calculation sketch follows this list.)

2. Compare apples to apples. It's important to know what data the report you choose compares, Vance cautions. If you try to compare your numbers against others that don't include the same information, you'll never get an accurate picture of your performance.

3. Investigate. If your average visits per episode is off from the benchmark, it's time to "drill down," Vance urges. Being above or below the number by even two or three visits can indicate clinical or financial problems that need addressing, she cautions.

4. Train staff. Once you know where your numbers lie, you can better train your staff to use transdisciplinary teams to boost patient outcomes in the most efficient way, Vance exhorts.

5. Eye productivity. Once you get a handle on using visit benchmarks to guide clinical and operational practice, you can also use them to gauge whether you have enough--or too much--staff to meet your needs. You can see how many visits per episode are done and get a sense of how many episodes a clinician can manage, Vance recommends.
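To see the arithmetic behind tips 1 through 3, here is a minimal sketch, assuming each episode is recorded as a simple ID-and-visit-count pair; the data, field layout, and the two-to-three-visit flag are illustrative only, not figures from any agency or report.

```python
# Minimal sketch: all figures below are illustrative, not real agency data.

def avg_visits_per_episode(episodes):
    """Average visits per episode for a list of (episode_id, visit_count) records."""
    if not episodes:
        return 0.0
    return sum(visits for _, visits in episodes) / len(episodes)

current_period = [("E1", 18), ("E2", 22), ("E3", 16), ("E4", 20)]  # your agency, this period
prior_period = [("E5", 16), ("E6", 18), ("E7", 17)]                # your agency, last period
peer_benchmark = 16.19  # e.g., a statewide cost-report figure for the same period

current_avg = avg_visits_per_episode(current_period)
prior_avg = avg_visits_per_episode(prior_period)

# Tip 1: compare against your own history and your peers for the same period.
print(f"Current period:          {current_avg:.2f} visits/episode")
print(f"Change vs. prior period: {current_avg - prior_avg:+.2f}")
print(f"Gap vs. peer benchmark:  {current_avg - peer_benchmark:+.2f}")

# Tip 3: a gap of even two or three visits is a flag to investigate, not proof of a problem.
if abs(current_avg - peer_benchmark) >= 2:
    print("Off the benchmark by 2+ visits -- drill down by discipline and clinician.")
```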
It's time to put on your detective hat and follow the benchmarking clues to better costs and outcomes.
Comparing yourself to your peers sends a strong signal about what your home health agency needs to work on to boost profit and improve patient outcomes and clinical practices, advises consultant Karen Vance with BKD in Springfield, MO.
And one of the most important numbers to monitor is the average number of visits per episode, Vance recommends. "You want to watch [this number] very carefully," Vance urged in an Eli Research teleconference on clinical management and benchmarking earlier this year. "It is the one number over which you have the most influence and control. [And] it gives you most directly what your average cost is."
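To put that in dollar terms: an agency averaging, say, 19 visits per episode at a hypothetical average cost of $100 per visit is spending roughly $1,900 per episode, so trimming even one unneeded visit per episode saves about $100 on every episode across the caseload (these figures are purely illustrative).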
Beat them to the punch: Besides furnishing you with critical operational, clinical and financial information, the benchmark may keep you off authorities' radar screens. Medical reviewers, federal investigators and even surveyors use visit utilization to target providers for further scrutiny and investigation, experts warn.
Here's more on how to put Vance's tips for average visits per episode benchmarking to work under the prospective payment system:
Where to find it: HHAs can obtain benchmarking reports from a variety of companies (including a report offered by Eli Research and BKD).
Some reports offer data only from agencies that subscribe to a service, while others use cost report data from all HHAs. Reports that contain cost report data will be more inclusive, but also will contain older data due to the delay in the collection and release of cost report information.
Examples: Missouri Medicare agencies averaged 16.19 visits per episode in 2003, according to a report compiled by BKD from cost report data for that year. In comparison, 152 agencies using The Analyzer software system from VantaHealth Technologies averaged 19.00 visits per episode in 2005, according to a presentation by Roberson, Muck & Associates at the recent annual meeting of the Association for Home & Hospice Care of North Carolina in Durham, NC.
The Analyzer data didn't include low utilization payment adjustments (LUPAs), while the BKD information included all episode adjustments. Be aware of crucial differences like these when you make your own comparisons, Vance urges.
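As a rough illustration of how much that one difference can move the number, here is a hypothetical calculation, assuming the four-or-fewer-visit LUPA cutoff and made-up episode counts:

```python
# Hypothetical visit counts per episode; here a LUPA episode is one with 4 or fewer visits.
visit_counts = [2, 3, 18, 21, 16, 19, 4, 22, 17]

all_episodes_avg = sum(visit_counts) / len(visit_counts)

non_lupa = [v for v in visit_counts if v > 4]   # drop the LUPA episodes
non_lupa_avg = sum(non_lupa) / len(non_lupa)

print(f"All episodes included: {all_episodes_avg:.2f} visits/episode")  # about 13.56
print(f"LUPAs excluded:        {non_lupa_avg:.2f} visits/episode")      # about 18.83
# The same agency looks roughly five visits "heavier" once LUPAs drop out, so make sure
# your report and the benchmark treat those adjustments the same way.
```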
Remember: A high or low number doesn't necessarily mean there is a problem. Rather, it is a red flag to investigate and find out whether a problem exists, Vance explains. "It tells me where to look first."
To get to the root of the potential problem, look at average visits per episode by discipline. For example, one common problem scenario is low therapy numbers combined with high aide numbers, Vance warns. That could mean patients aren't learning to function independently and aides are picking up the slack instead.
Another potential problem is therapy numbers that are the same for every patient even when patients' conditions vary greatly, Vance offers.
Once you have disciplines benchmarked, you can delve even further and look at individual clinicians' average visit numbers. Clinicians whose visits fall outside of the norm without justification are candidates for individual coaching, Vance recommends.
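Here is a minimal sketch of that drill-down, assuming visit records tagged with episode, discipline and clinician; the names, data and the two-visit flagging threshold are all illustrative assumptions.

```python
from collections import defaultdict

# Hypothetical visit records: (episode_id, discipline, clinician).
visits = [
    ("E1", "PT", "Smith"), ("E1", "Aide", "Lopez"), ("E1", "Aide", "Lopez"),
    ("E2", "PT", "Smith"), ("E2", "PT", "Smith"), ("E2", "Aide", "Lopez"),
    ("E3", "PT", "Jones"), ("E3", "PT", "Jones"), ("E3", "PT", "Jones"),
    ("E3", "PT", "Jones"), ("E3", "PT", "Jones"), ("E3", "Aide", "Patel"),
]
total_episodes = len({episode for episode, _, _ in visits})

# Step 1: average visits per episode by discipline -- the first place to look.
by_discipline = defaultdict(int)
for _, discipline, _ in visits:
    by_discipline[discipline] += 1
for discipline, count in sorted(by_discipline.items()):
    print(f"{discipline:5s} {count / total_episodes:.2f} visits/episode")

# Step 2: within one discipline, average each clinician over the episodes they served
# and flag anyone far from the discipline norm for a closer look (not automatic blame).
clinician_visits = defaultdict(int)
clinician_episodes = defaultdict(set)
for episode, discipline, clinician in visits:
    if discipline == "PT":
        clinician_visits[clinician] += 1
        clinician_episodes[clinician].add(episode)

pt_episodes = set().union(*clinician_episodes.values())
pt_avg = by_discipline["PT"] / len(pt_episodes)
for clinician, count in clinician_visits.items():
    avg = count / len(clinician_episodes[clinician])
    flag = "  <-- outside the norm, ask why" if abs(avg - pt_avg) >= 2 else ""
    print(f"PT  {clinician}: {avg:.2f} visits/episode{flag}")
```

In practice these counts would come from your billing or point-of-care system rather than hard-coded lists, but the comparison logic is the same.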
Clinicians should remember the goal is to make patients as independent as possible so they can function at home once they are no longer eligible for home care. And a home health plan of care should aim to keep patients out of the hospital.
Warning: Vance advises strongly against setting out a cookie-cutter plan with a pre-set number of visits based on the patient's home health resource group (HHRG) score. Instead, encourage staff to use their clinical judgment to give patients the care they need in the most appropriate number of visits.
Note: For information on Eli's 2006 Home Health Operations Dashboard, which includes the agency, local, regional and national benchmarks discussed in this article, call 1-888-779-3718 x326 or email dashboard@eliresearch.com.