The quality of patient data at America’s hospitals and caregiver organizations is clearly an important issue, but also a notoriously tricky one. Data errors lead to additional cost, lost revenue, and declines in patient satisfaction, not to mention headline risk and potential harm to patient outcomes. It seems obvious that every institution wants to achieve 100 percent accuracy and maintain a culture that values data quality. The story on the front lines, however, isn’t so simple.
Speed versus Accuracy
Since most duplicate records and other errors are generated at the point of data entry, it’s tempting to assume the problem can be solved through better training and coaching of front-line staff. It’s important to note that these professionals already work hard to ensure data accuracy while keeping intake times short. Think about the actual process that plays out regularly at the “front of the house” during intake. In an emergency room setting, speed is often of the essence; priority is appropriately placed on delivering patient care as promptly as possible.
Less dramatic but related scenarios play out regularly in lower-stress medical environments as well. With waiting rooms full of impatient patients and the need to keep doctors in front of a steady stream of appointments, it isn’t realistic for a clerk to linger over a questionable record while others wait to be taken care of.
In other words, data integrity is a noble aspiration, but in the complicated everyday world of healthcare, even well-trained personnel need technological support to achieve it.
Keeping Quality at the Core
Another path to better data quality is to implement healthcare technology that guards against duplicate records. This is not an either/or decision; intake employees need detection software equipped to flag exception items. This helps them retain a keen eye for detail while also creating a backstop during fast-paced, high-intensity moments.
Leading healthcare core systems include a module designed to detect duplicate records. Although these modules perform reasonably well, they are not nearly as robust at detection as solutions built expressly for that purpose. ARGO has conducted numerous analyses in which it identified significant volumes of duplicate records that passed undetected through these basic modules. Since these databases grow over time, the cumulative impact on back-office costs and patient experience can be striking.
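To make the idea of duplicate detection concrete, here is a minimal sketch of the kind of fuzzy matching such a module might perform. This is an illustrative example only, not ARGO’s or any vendor’s actual method: it flags two records as probable duplicates when the dates of birth match exactly and the normalized names are similar. Purpose-built solutions use far more sophisticated matching across many fields, which is why basic modules miss duplicates that slip past simple rules like this one.

```python
from difflib import SequenceMatcher


def normalize(name: str) -> str:
    """Lowercase a name and strip punctuation/whitespace for comparison."""
    return "".join(ch for ch in name.lower() if ch.isalnum())


def likely_duplicate(rec_a: dict, rec_b: dict, threshold: float = 0.85) -> bool:
    """Flag two patient records as probable duplicates when the dates of
    birth match exactly and the normalized names are similar enough."""
    if rec_a["dob"] != rec_b["dob"]:
        return False
    similarity = SequenceMatcher(
        None, normalize(rec_a["name"]), normalize(rec_b["name"])
    ).ratio()
    return similarity >= threshold


# Hypothetical intake records: the first two are likely the same patient.
records = [
    {"name": "John Q. Smith", "dob": "1970-04-12"},
    {"name": "Jon Smith", "dob": "1970-04-12"},
    {"name": "Jane Smith", "dob": "1982-09-01"},
]

# Compare every pair and collect the index pairs that look like duplicates.
flagged = [
    (i, j)
    for i in range(len(records))
    for j in range(i + 1, len(records))
    if likely_duplicate(records[i], records[j])
]
print(flagged)  # [(0, 1)]
```

A rule this simple would miss duplicates with a mistyped date of birth, a maiden name, or a transposed field, which is exactly the gap between a basic built-in module and a dedicated detection solution.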
Fortunately, ARGO’s IntegrityID solution is designed so that integration with the existing core system is not necessary. Its own duplicate detection routine proceeds along a parallel path, enhancing the efficiency of the overall process.
Every medical facility strives to do the best job it can with the healthcare technology at its disposal, with the dual goals of better patient experience and financial performance. The question becomes how focused the front desk is (and realistically should be) on preventing duplicate entries up front, and how well existing software catches such issues. Following a dispassionate review of the operational landscape and viable alternatives, providers may find that an additional software solution is the best answer.
For more information, download our IntegrityID solution briefs.