The Truven Health Analytics 50 Top Cardiovascular Hospitals study identifies the hospitals that perform best on a scorecard of performance measures. This year, based on comparisons between the winners and a peer group of similar high-volume hospitals that were not winners, the study found that if all cardiovascular providers performed at the level of this year's winners, approximately 9,500 additional patients could survive, more than $1 billion could be saved, and almost 3,000 additional bypass and angioplasty patients could be complication-free. This is based on an analysis of Medicare patients; if the same standards were applied to all inpatients, the impact would be even greater.
As the healthcare industry shifts focus from volume to value, standardization is needed to accurately benchmark labor resource utilization. This is the premise of a survey conducted by HealthLeaders Media and sponsored by Kronos.
What constitutes direct patient care? Hands-on patient assessment, administering medications, and performing procedures clearly top the list. But can other activities be considered direct care too—even those not conducted in a patient’s presence?
Download the free report for statistics and analysis of the survey questions below, and much more:
- Which of the following actions are considered direct patient care in your organization?
- Which of the following actions are considered indirect patient care in your organization?
- Which of the following actions are considered neither direct nor indirect care but are categorized separately as non-patient care in your organization?
While we must continue to emphasize to all members of the care team that they are the front line in preventing errors, taking a systems or holistic approach will greatly assist in making adverse events rarer. Many companies that provide incident reporting, analysis, and review systems can aid in implementing the latter.
In October 2013, S&P Dow Jones Indices (S&P DJI) launched the S&P Healthcare Claims Indices (the indices). This new index series is designed to provide an independent, timely measure of the changes in healthcare expenditures and utilization for individuals enrolled in commercial health insurance plans in the United States.
S&P DJI developed these new indices in conjunction with healthcare professionals at Health Index Advisors (HIA), a joint venture between the premier actuarial and consulting firms Aon Inc. and Milliman Inc. S&P DJI combined its knowledge and experience in developing leading indices with HIA’s experience in the healthcare market to develop the first index series of its kind, based on actual healthcare claims data. These indices seek to increase transparency in the healthcare market and enable the analysis and tracking of changes in healthcare expenditures.
Published By: McKesson
Published Date: May 27, 2015
The shift to value-based care creates a sharp increase in healthcare organizations and networks’ need for data collection, aggregation and analysis. This white paper outlines the challenges involved with performing population-level analyses, developing cost accounting and profitability analyses across care settings, evaluating care episodes and integrating quality data. It explores the limitations of targeted software solutions to provide cross-enterprise insights. Finally, it provides advice for healthcare executives regarding how to approach gathering quality and cost-related data and how to leverage technology and analytical expertise to drive risk-based contract success.
Today’s leading-edge organizations differentiate themselves through analytics, furthering their competitive advantage by extracting value from all their data sources. Other companies are looking to become data-driven by modernizing their data management deployments. These strategies come with challenges, such as managing large and growing volumes of data. Today’s digital world is already creating data at an explosive rate, and the next wave is on the horizon, driven by the emergence of IoT data sources. The physical data warehouses of the past were great for collecting data from across the enterprise for analysis, but the storage and compute resources needed to support them cannot keep pace with this explosive growth. In addition, the manual, cumbersome tasks of patching, updating, and upgrading pose risks to data due to human error. To reduce risk, cost, complexity, and time to value, many organizations are taking their data warehouses to the cloud. Whether hosted lo
Data is growing at amazing rates and will continue this rapid rate of growth. New techniques in data processing and analytics, including AI, machine learning, and deep learning, allow specially designed applications not only to analyze data but to learn from the analysis and make predictions.
Computer systems consisting of multi-core CPUs or GPUs using parallel processing and extremely fast networks are required to process the data. However, legacy storage solutions are based on architectures that are decades old, unscalable, and not well suited to the massive concurrency required by machine learning. Legacy storage is becoming a bottleneck in processing big data, and a new storage technology is needed to meet data analytics performance needs.
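The massive concurrency described above can be illustrated with a minimal sketch: partition a data set into chunks and aggregate the chunks on a pool of workers, then reduce the partial results. The data, function names, and worker count here are hypothetical; a CPU-bound production workload would typically use a process pool (or a GPU framework) rather than threads.

```python
# Minimal sketch of chunked parallel aggregation (illustrative names).
from concurrent.futures import ThreadPoolExecutor


def partial_sum(chunk):
    """Aggregate one partition of the data set."""
    return sum(chunk)


def parallel_sum(data, workers=4):
    """Split data into one chunk per worker and reduce the partial results."""
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))


print(parallel_sum(list(range(1000))))  # same result as sum(range(1000))
```

The pattern is the same one that scales across cores and nodes: the storage system must feed every worker's chunk concurrently, which is exactly where legacy, serially-oriented storage becomes the bottleneck.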
In today’s IT infrastructure, data security can no longer be treated as an afterthought, because billions of dollars are lost each year to computer intrusions and data exposures. This issue is compounded by the aggressive build-out for cloud computing. Big data and machine learning applications that perform tasks such as fraud and intrusion detection, trend detection, and click-stream and social media analysis all require forward-thinking solutions and enough compute power to deliver the performance required in a rapidly evolving digital marketplace. Companies increasingly need to drive the speed of business up, and organizations need to support their customers with real-time data. The task of managing sensitive information while capturing, analyzing, and acting upon massive volumes of data every hour of every day has become critical.
These challenges have dramatically changed the way that IT systems are architected, provisioned, and run compared to the past few decades. Most compani
With the growing size and importance of information stored in today’s databases, accessing and using the right information at the right time has become increasingly critical. Real-time access and analysis of operational data is key to making faster and better business decisions, providing enterprises with unique competitive advantages. Running analytics on operational data has been difficult because operational data is stored in row format, which is best for online transaction processing (OLTP) databases, while column format is much better for analytics processing. Therefore, companies normally maintain both an operational database with data in row format and a separate data warehouse with data in column format, which leads to reliance on “stale data” for business decisions. With Oracle’s Database In-Memory and Oracle servers based on the SPARC S7 and SPARC M7 processors, companies can now store data in memory in both row and column formats, and run analytics on their operatio
Databases have long served as the lifeline of the business. Therefore, it is no surprise that performance has always been top of mind. Whether it is a traditional row-formatted database handling millions of transactions a day or a columnar database running advanced analytics to uncover deep insights about the business, the goal is to service all requests as quickly as possible. This is especially true as organizations look to gain an edge on their competition by analyzing data from their transactional (OLTP) database to make more informed business decisions. The traditional model (see Figure 1) for doing this leverages two separate sets of resources, with an ETL process required to transfer the data from the OLTP database to a data warehouse for analysis. Two obvious problems exist with this implementation: first, I/O bottlenecks can quickly arise because the databases reside on disk, and second, analysis is constantly being done on stale data.
In-memory databases have helped address p
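The row-versus-column trade-off behind this architecture can be sketched in a few lines. A row store keeps each record together, which suits OLTP point lookups; a column store keeps each field together, so an analytic aggregate touches only the one column it needs. The table and field names below are illustrative, not from any particular database.

```python
# Sketch: the same small "orders" table in row format and column format.

# Row format: one record per entry (natural for OLTP lookups/updates).
rows = [
    {"id": 1, "region": "east", "amount": 120.0},
    {"id": 2, "region": "west", "amount": 80.0},
    {"id": 3, "region": "east", "amount": 200.0},
]

# Column format: one list per field (natural for scans and aggregates).
columns = {
    "id": [1, 2, 3],
    "region": ["east", "west", "east"],
    "amount": [120.0, 80.0, 200.0],
}

# OLTP-style point lookup reads one complete record from the row store.
order = next(r for r in rows if r["id"] == 2)

# An analytic aggregate scans only the single relevant column.
total_amount = sum(columns["amount"])

print(order["region"], total_amount)
```

A dual-format in-memory approach keeps both layouts current for the same data, which is what removes the ETL step and the stale-data problem described above.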
A range of application security tools was developed to support the efforts to secure the enterprise from the threat posed by insecure applications. But in the ever-changing landscape of application security, how does an organization choose the right set of tools to mitigate the risks their applications pose to their environment? Equally important, how, when, and by whom are these tools used most effectively?
Today, when you make decisions about information technology (IT) security priorities, you must often strike a careful balance between business risk, impact, and likelihood of incidents, and the costs of prevention or cleanup. Historically, the most well-understood variable in this equation was the methods that hackers used to disrupt or invade the system.
The Business Case for Data Protection, conducted by Ponemon Institute and sponsored by Ounce Labs, is the first study to determine what senior executives think about the value proposition of corporate data protection efforts within their organizations. In times of shrinking budgets, it is important for those individuals charged with managing a data protection program to understand how key decision makers in organizations perceive the importance of safeguarding sensitive and confidential information.
Published By: Asure HR
Published Date: Nov 18, 2013
Improving workforce productivity is near the top of every organization’s “To-Do List”, but a surprisingly low percentage of companies feel confident that they are performing well in their workforce optimization efforts.
This white paper provides an in-depth analysis of the top 8 ways you can improve your workforce productivity using the latest advances in HR technology.
Published By: Zynapse
Published Date: Sep 10, 2010
UNSPSC enables preference item management, better spend analysis, supply standardization and information control.
Whether you are deliberating on the need for a common product and classification standard for your company, or are an advanced UNSPSC adopter, we hope that "Adopting UNSPSC" will answer some of your questions and perhaps help you in some way to improve your purchasing and supply management processes.
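A UNSPSC code is an eight-digit number whose digit pairs encode a four-level hierarchy (segment, family, class, commodity), which is what makes spend analysis and supply standardization possible. A minimal sketch of decoding that hierarchy, with an arbitrary example code:

```python
# Sketch: split an 8-digit UNSPSC code into its four two-digit levels.
def parse_unspsc(code: str) -> dict:
    """Return the segment/family/class/commodity levels of a UNSPSC code."""
    if len(code) != 8 or not code.isdigit():
        raise ValueError("UNSPSC codes are 8 numeric digits")
    return {
        "segment": code[0:2],
        "family": code[2:4],
        "class": code[4:6],
        "commodity": code[6:8],
    }


print(parse_unspsc("44121706"))
```

Grouping purchases at the segment or family level rather than by free-text descriptions is what enables the consistent spend analysis the standard is adopted for.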
Published By: PayScale
Published Date: Sep 12, 2012
The U.S. economy has forced some difficult decisions on employers within the past few years. How has this affected employee pay within your industry and location?
With millions of employee incumbents from industries across the U.S., the PayScale Index is one of the most comprehensive and current trend analysis reports available.