Using big data technologies to optimize workflows in hospitals
Several time-critical workflows in a hospital, related to the diagnosis and treatment of patients, have a strong impact on efficiency and on the quality of care. Data analysis is an effective way to identify weaknesses in these workflows and to develop improved protocols.
Some illnesses, such as stroke (the sudden death of brain cells due to lack of oxygen), progress very quickly, so early identification and appropriate management in the first hours are vital. Another well-known issue is that nurses can spend up to 30% of their working time searching for equipment in a hospital. Given that labour costs make up the vast majority of healthcare spending, this wasted time represents a significant share of total European healthcare spending.
One of the areas of research within the BigMedilytics project aims to demonstrate how workflows within a hospital can be optimized to help improve operational efficiency. Such efficiency improvements can help deliver on the quadruple aim: enhancing the patient experience, improving health outcomes, lowering the cost of care, and improving the work life of care providers.
The project demonstrates the role of Big Data in the optimization of workflows through the execution of four pilots. The first two pilots focus on using real-time location data in combination with other data streams (e.g. electronic medical records) to efficiently treat patients suffering from hyper-acute conditions such as stroke and sepsis, where every lost minute can severely affect patient survival and the length of rehabilitation.
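To illustrate the general idea of combining a real-time location stream with EMR events, the following is a minimal sketch in Python. All patient identifiers, event names, and timestamps here are illustrative assumptions, not data or APIs from the BigMedilytics pilots; it simply shows how two time-stamped streams can be joined to measure a delay such as the door-to-CT time for a suspected stroke.

```python
from datetime import datetime

# Hypothetical real-time location pings: (patient, location, timestamp).
location_events = [
    ("patient-17", "emergency_dept", datetime(2024, 1, 5, 9, 0)),
    ("patient-17", "ct_scanner", datetime(2024, 1, 5, 9, 25)),
    ("patient-17", "stroke_unit", datetime(2024, 1, 5, 10, 5)),
]

# Hypothetical EMR events: (patient, event label, timestamp).
emr_events = [
    ("patient-17", "stroke_suspected", datetime(2024, 1, 5, 9, 5)),
    ("patient-17", "thrombolysis_started", datetime(2024, 1, 5, 9, 50)),
]

def minutes_between(patient, emr_label, location, emr, locations):
    """Minutes from an EMR event to the patient's next arrival at a location."""
    start = next(t for p, label, t in emr
                 if p == patient and label == emr_label)
    arrival = next(t for p, loc, t in locations
                   if p == patient and loc == location and t >= start)
    return (arrival - start).total_seconds() / 60

# Door-to-CT delay: suspected stroke until arrival at the CT scanner.
door_to_ct = minutes_between("patient-17", "stroke_suspected", "ct_scanner",
                             emr_events, location_events)
print(door_to_ct)  # 20.0
```

Computed over many patients, such delay metrics are what make bottlenecks in a time-critical workflow visible and measurable.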
The third pilot focuses on locating mobile assets to make equipment searches more efficient, while the fourth pilot optimizes workflows in the radiology department, aiming to reduce the time to diagnosis while increasing its quality.