Hospital information management is, in essence, knowledge management combined with data management systems. It enables users to share information that meets technical, social, managerial, and economic requirements. Information management began in the 1960s, when Medicare and Medicaid largely drove the health care system. At that time, computers were enormous, and hospitals often shared a mainframe to manage their accounting systems. In the 1970s, it became evident that computers were the future. Computers grew smaller, making it possible for each department within the hospital to have its own machine. Specific departments, such as the laboratory and radiology, could keep their information on their own computers.
The 1980s brought about diagnosis-related groups, a classification system that tied the services a patient received directly to reimbursement. “For the first time, hospitals needed to pull significant information from both clinical and financial systems in order to be reimbursed. At the same time, personal computers, widespread, non-traditional software applications, and networking solutions entered the market.” As a result, applications were created to enable financial and clinical systems to communicate with each other.
By the 1990s, health care was driven by competition and consolidation. Hospitals had access to broad, distributed computing systems and robust networks, and data and reporting became integrated. The 2000s brought the beginnings of outcome-based reimbursement, and technological advances allowed for the installation of bedside clinical applications. As the years passed, systems grew more complex and the volume of data collected became enormous; today, information management is driven by accountable care organizations and value-based purchasing initiatives.