Search term: processing
Results 1 - 25 of 411
Published By: KPMG     Published Date: Nov 21, 2018
Many BPO providers have already invested heavily in robotic process automation (RPA), providing an opportunity to renegotiate your BPO contract to reflect the cost savings they are making as a result. But there’s more: contract renegotiation also creates an opportunity to leverage providers’ investment to accelerate automation in your own business. Read this to discover:
• the benefits of automating business processes and the impressive level of cost savings this can deliver
• the impact of automation on the BPO market and the implications this has for BPO contracts
• a five-step process for assessing and renegotiating your outsourcing contract
• three options for starting your automation journey.
Tags : 
    
KPMG
Published By: Amazon Web Services     Published Date: Nov 14, 2018
Amazon Redshift Spectrum—a single service that can be used in conjunction with other Amazon services and products, as well as external tools—is revolutionizing the way data is stored and queried, allowing for more complex analyses and better decision making. Spectrum allows users to query very large datasets on S3 without having to load them into Amazon Redshift. This helps address the Scalability Dilemma—with Spectrum, data storage can keep growing on S3 and still be processed. By utilizing its own compute power and memory, Spectrum handles the hard work that would normally be done by Amazon Redshift. With this service, users can now scale to accommodate larger amounts of data than the cluster would have been capable of processing with its own resources. This e-book aims to provide you with expert tips on how to use Amazon Redshift Spectrum to increase performance and potentially reduce the cost of your queries.
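To make the pattern concrete, here is a minimal sketch of querying S3-resident data through Spectrum from Python. This is an illustration, not content from the e-book: the cluster endpoint, credentials, IAM role ARN, schema, table, and S3 location are all placeholders.

```python
# Sketch: register and query an external table over S3 via Redshift Spectrum.
# All connection details, the IAM role, and the S3 location are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="my-cluster.example.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="dev", user="admin", password="...")
conn.autocommit = True  # Spectrum DDL cannot run inside a transaction block
cur = conn.cursor()

# External schema backed by the AWS Glue Data Catalog.
cur.execute("""
    CREATE EXTERNAL SCHEMA IF NOT EXISTS spectrum
    FROM DATA CATALOG DATABASE 'spectrumdb'
    IAM_ROLE 'arn:aws:iam::123456789012:role/MySpectrumRole'
    CREATE EXTERNAL DATABASE IF NOT EXISTS;
""")

# External table whose files stay on S3; nothing is loaded into the cluster.
cur.execute("""
    CREATE EXTERNAL TABLE spectrum.sales (
        sale_id   INT,
        amount    DECIMAL(8,2),
        sale_date DATE)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    STORED AS TEXTFILE
    LOCATION 's3://my-bucket/sales/';
""")

# Spectrum scans S3 with its own compute; only results reach the cluster.
cur.execute("SELECT sale_date, SUM(amount) FROM spectrum.sales GROUP BY sale_date;")
print(cur.fetchall())
```

Autocommit is enabled up front because Redshift rejects external-table DDL inside a transaction block.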
Tags : 
    
Amazon Web Services
Published By: Magnetrol     Published Date: Nov 05, 2018
U.S. Department of Energy surveys show that minor adjustments in process management can incrementally improve efficiency in commercial and heavy industries, including pulp & paper, chemical, petroleum refining, mining and food processing, where as much as 60% of total energy consumption goes to the production of steam. The information-packed Steam Generation & Condensate Recovery Process Optimization kit from Magnetrol explains how effective instrumentation solutions can:
Tags : 
    
Magnetrol
Published By: Talend     Published Date: Nov 02, 2018
Ready to embrace the multi-cloud future? This new TDWI Checklist Report is the cloud primer you’ve been waiting for. The most successful companies are embracing cloud data integration to help them leverage more data. Businesses increasingly need to understand what data integration is and does, while scaling their data processing performance at lower cost. This whitepaper demonstrates how to reduce risk and disruption while implementing multi-cloud data integration and self-service data access.
Tags : 
    
Talend
Published By: Group M_IBM Q418     Published Date: Oct 23, 2018
The General Data Protection Regulation (GDPR) seeks to create a harmonized data protection framework across the European Union, and aims to give EU citizens back control of their personal data by imposing stricter requirements on those hosting and processing this data, anywhere in the world. IBM is committed to putting data responsibility first and providing solutions that are secure to the core for all customers. As such, IBM Cloud has fully adopted the EU Data Protection Code of Conduct for Cloud Service Providers, meaning we agree to meet the entirety of its stringent requirements.
Tags : 
    
Group M_IBM Q418
Published By: Cognizant     Published Date: Oct 23, 2018
A group of emerging technologies is rapidly creating numerous opportunities for life sciences companies to improve productivity, enhance patient care and ensure regulatory compliance. These technologies include robotic process automation (RPA), artificial intelligence (AI), machine learning (ML), blockchain, the Internet of Things (IoT), 3-D printing and augmented reality/virtual reality (AR/VR). This whitepaper presents a preview of five pivotal technology trends remaking the life sciences industry: AI and automation, human augmentation, edge analytics/processing, data ownership and protection, and the intermingling of products and services.
Tags : 
cognizant, life sciences, patient care
    
Cognizant
Published By: Group M_IBM Q418     Published Date: Oct 15, 2018
The enterprise data warehouse (EDW) has been a cornerstone of enterprise data strategies for over 20 years. EDW systems have traditionally been built on relatively costly hardware infrastructures, and ever-growing data volumes and increasingly complex processing have raised the cost of EDW software and hardware licenses while straining the performance needed for analytic insights. Organizations can now use EDW offloading and optimization techniques to reduce the costs of storing, processing and analyzing large volumes of data.

Getting data governance right is critical to your business success. That means ensuring your data is clean, of excellent quality, and of verifiable lineage. Such governance principles can be applied in Hadoop-like environments. Hadoop is designed to store, process and analyze large volumes of data at significantly lower cost than a data warehouse, but to get the return on investment, you must infuse data governance processes as part of offloading.
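To illustrate the offload pattern described above (as a generic sketch, not IBM's specific tooling): extract a warehouse table over JDBC, land it as Parquet on Hadoop, and query the offloaded copy there. The JDBC URL, credentials, table names, and paths below are all hypothetical.

```python
# Sketch of EDW offload: copy a warehouse table to Hadoop as Parquet,
# then query it with Spark SQL. All connection details, table names,
# and paths are placeholders, not a specific product's workflow.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("edw-offload").getOrCreate()

# Extract: read the source table from the warehouse over JDBC.
sales = (spark.read.format("jdbc")
         .option("url", "jdbc:postgresql://edw.example.com:5432/warehouse")
         .option("dbtable", "analytics.sales_history")
         .option("user", "etl_user")
         .option("password", "...")
         .load())

# Land: write to HDFS as Parquet, partitioned for cheap, scalable scans.
sales.write.mode("overwrite").partitionBy("sale_year").parquet(
    "hdfs:///data/offload/sales_history")

# Query: analysts hit the offloaded copy instead of the costly EDW.
spark.read.parquet("hdfs:///data/offload/sales_history") \
     .createOrReplaceTempView("sales_history")
spark.sql("""
    SELECT sale_year, COUNT(*) AS orders
    FROM sales_history
    GROUP BY sale_year
""").show()
```

Partitioning by a column such as the hypothetical sale_year keeps subsequent scans on Hadoop narrow and cheap, which is where the offload savings come from.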
Tags : 
    
Group M_IBM Q418
Published By: Amazon Web Services     Published Date: Oct 12, 2018
Safeguarding your data is more important than ever. In today’s data-driven business landscape, companies are using their data to innovate, inform product improvements, and personalize services for their customers. The sheer volume of data collected for these purposes keeps growing, but the solutions available to organizations for processing and analyzing it become more efficient and intuitive every day. Reaching the right customers at the right time with the right offers has never been easier. With this newfound agility, however, come new vulnerabilities. With so much riding on the integrity of your data and the services that make it secure and available, it’s crucial to have a plan in place for unexpected events that can wipe out your physical IT environment or otherwise compromise data access. The potential for natural disasters, malicious software attacks, and other unforeseen events necessitates that companies implement a robust disaster recovery (DR) strategy.
Tags : 
    
Amazon Web Services
Published By: Workday     Published Date: Oct 11, 2018
Before Workday, Panera Bread’s payroll processes were manual, inefficient, and error-prone, and payroll nightmares and compliance risks were a regular occurrence. Complex systems and costly integrations made it impossible for the company to keep up with its rapid growth or gain valuable insights into global labor expenses. See the infographic to learn why unifying HR, payroll, time tracking, and absence management in a single system allows Panera to use one consistent, flexible, and scalable system across the U.S. and Canada.
Tags : 
    
Workday
Published By: Ricoh     Published Date: Oct 02, 2018
Your business is changing. As a finance leader, you know that accounting is a labour-intensive, costly process in which systems often don’t allow for expedient exception handling, and in which days are fraught with the difficulty of matching invoices against other databases for reconciliation. Like most companies, you know where you want to go but may not have the infrastructure or internal expertise to handle electronic fund transfers, credit card payments or cheque processing: all the pieces required to make your vision for an efficient, integrated operation a reality.
Tags : 
    
Ricoh
Published By: NetApp     Published Date: Sep 26, 2018
Get practical advice from IT professionals on how to successfully deploy all-flash arrays for Oracle, SAP and SQL Server workloads in a SAN environment. You'll explore topics such as transaction processing speed, storage management, future requirements planning, workload migration and more.
Tags : 
netapp, flash, san, workload
    
NetApp
Published By: TIBCO Software EMEA     Published Date: Sep 12, 2018
By processing real-time data from machine sensors using artificial intelligence and machine learning, it’s possible to predict critical events and take preventive action to avoid problems. TIBCO helps manufacturers around the world predict issues with greater accuracy, reduce downtime, increase quality, and improve yield. Read about our top data science best practices for becoming a smart manufacturer.
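As a generic illustration of the idea (not TIBCO's product or method): a classifier trained on historical sensor readings labeled with past failures can score incoming readings for failure risk. A minimal sketch with synthetic data and scikit-learn, where all features and thresholds are invented:

```python
# Generic predictive-maintenance sketch: train a classifier on historical
# sensor readings labeled with past failures, then score a new reading.
# Synthetic data and invented features -- an illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Columns: temperature, vibration, pressure. Failures skew hot and shaky.
X = rng.normal(loc=[70.0, 0.3, 101.0], scale=[5.0, 0.1, 2.0], size=(5000, 3))
risk = 0.05 * (X[:, 0] - 70) + 8.0 * (X[:, 1] - 0.3)
y = (risk + rng.normal(scale=0.2, size=5000) > 0.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")

# Score a fresh reading; a high probability would trigger preventive action.
reading = np.array([[82.0, 0.55, 100.2]])
print(f"failure probability: {model.predict_proba(reading)[0, 1]:.2f}")
```

In a production setting the same scoring step would sit in the streaming path, so each sensor reading is evaluated as it arrives rather than in batch.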
Tags : 
inter-company connectivity, real-time tracking, automate analytic models, efficient analytics, collaboration
    
TIBCO Software EMEA
Published By: Amazon Web Services     Published Date: Sep 05, 2018
Big data alone does not guarantee better business decisions. Often that data needs to be moved and transformed so Insight Platforms can discern useful business intelligence. To deliver those results faster than traditional Extract, Transform, and Load (ETL) technologies, use Matillion ETL for Amazon Redshift. This cloud-native ETL/ELT offering, built specifically for Amazon Redshift, simplifies the process of loading and transforming data and can help reduce your development time. This white paper will focus on approaches that can help you maximize your investment in Amazon Redshift. Learn how the scalable, cloud-native architecture and fast, secure integrations can benefit your organization, and discover ways this cost-effective solution is designed with cloud computing in mind. In addition, we will explore how Matillion ETL and Amazon Redshift make it possible for you to automate data transformation directly in the data warehouse to deliver analytics and business intelligence (BI).
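As a hedged sketch of the ELT pattern the paper describes (load first, transform inside Redshift), the transformation step below is plain set-based SQL run in the warehouse itself; the connection details, schemas, and table names are placeholders, not Matillion's actual interface:

```python
# ELT sketch: data already loaded into Redshift is transformed in place
# with set-based SQL, using the warehouse's own compute. Connection
# details and table names are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="my-cluster.example.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="dev", user="admin", password="...")
conn.autocommit = True
cur = conn.cursor()

# Transform inside the warehouse: cleanse and aggregate raw events
# into an analytics-ready table in one pushed-down statement.
cur.execute("""
    CREATE TABLE analytics.daily_revenue AS
    SELECT TRUNC(event_ts)             AS day,
           SUM(amount)                 AS revenue,
           COUNT(DISTINCT customer_id) AS buyers
    FROM staging.raw_orders
    WHERE amount IS NOT NULL
    GROUP BY TRUNC(event_ts);
""")
```

The point of the pattern is that no rows leave the warehouse during transformation, which is what makes ELT faster than shipping data through an external ETL server.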
Tags : 
    
Amazon Web Services
Published By: SAS     Published Date: Aug 28, 2018
When designed well, a data lake is an effective data-driven design pattern for capturing a wide range of data types, both old and new, at large scale. By definition, a data lake is optimized for the quick ingestion of raw, detailed source data plus on-the-fly processing of such data for exploration, analytics and operations. Even so, traditional, latent data practices are possible, too. Organizations are adopting the data lake design pattern (whether on Hadoop or a relational database) because lakes provision the kind of raw data that users need for data exploration and discovery-oriented forms of advanced analytics. A data lake can also be a consolidation point for both new and traditional data, thereby enabling analytics correlations across all data. To help users prepare, this TDWI Best Practices Report defines data lake types, then discusses their emerging best practices, enabling technologies and real-world applications. The report’s survey quantifies user trends and readiness for data lakes.
Tags : 
    
SAS
Published By: Workday     Published Date: Aug 20, 2018
Understand how we protect your data, systems, and people. We built a technology platform specifically for the cloud. From our UI to our processing to our persistence layers, we take a singular approach to security.
Tags : 
data, systems, business, innovation, optimization, workday
    
Workday
Published By: Workday     Published Date: Aug 20, 2018
Business has changed. Today, organisations require finance to provide more information to support strategic decision making. For finance, that means an increased need to transform the function: shifting resources from transaction processing and control toward better business partnership, all at a lower cost.
Tags : 
finance, data, market, accounting, recruiting, operational, systems
    
Workday
Published By: SAS     Published Date: Aug 17, 2018
This piece, a collaboration between SAS and Intel, demonstrates the value of modernizing your analytics infrastructure by running SAS® software on Intel processors. Readers will learn:
• the benefits of applying a consistent analytic vision across all functions within the organization to make more insight-driven decisions
• how IT plays a pivotal role in modernizing analytics infrastructures
• the competitive advantages of modern analytics.
Tags : 
    
SAS
Published By: TIBCO Software APAC     Published Date: Aug 13, 2018
Big data has raised the bar for data virtualization products. To keep pace, TIBCO® Data Virtualization added a massively parallel processing engine that supports big-data scale workloads. Read this whitepaper to learn how it works.
Tags : 
    
TIBCO Software APAC
Published By: Oracle     Published Date: Aug 09, 2018
The purpose of IT backup and recovery systems is to avoid data loss and recover quickly, thereby minimizing downtime costs. Traditional storage-centric data protection architectures, such as Purpose-Built Backup Appliances (PBBAs) and the conventional backup and restore processing supporting them, are prone to failure on recovery. This is because the processes, both automated and manual, are too numerous, too complex, and too difficult to test adequately. In turn, this leads to unacceptable levels of failure for today’s mission-critical applications, and a poor foundation for digital transformation initiatives. Governments are taking notice: heightened regulatory compliance requirements have implications for data recovery processes and are an unwelcome but timely catalyst for companies to get their recovery houses in order. Onerous malware such as ransomware, and other cyber attacks, increase the imperative for organizations to have highly granular recovery mechanisms in place that allow fast, precise restoration.
Tags : 
    
Oracle
Published By: DocuSign UK     Published Date: Aug 08, 2018
"Many financial services firms have automated the vast majority of key processes and customer experiences. However, the “last mile” of most transactions – completing the agreement– far too often relies on the same inefficient pen-and-paper processes of yesteryear. Digitising agreements using DocuSign lets you keep processes digital from end to end. Completing transactions no longer requires documents to be printed and shipped, and re-keyed on the back end. Read the whitepaper to learn how leading financial services organisations use straight-through processing by automating the last mile of business transactions to: - Speed processes by 80% or more, often going from days or weeks to just minutes - Reduce NIGO by anywhere from 55% to 93% - Achieve a 300% average ROI "
Tags : 
    
DocuSign UK
Published By: Microsoft     Published Date: Jul 20, 2018
Microsoft provides a solution to easily run small segments of code in the cloud with Azure Functions. Azure Functions provides solutions for processing data, integrating systems, and building simple APIs and microservices. The book starts with intermediate-level recipes on serverless computing, along with use cases covering the benefits and key features of Azure Functions. Then, we’ll deep dive into the core aspects of Azure Functions, such as the services it provides, how you can develop and write Azure Functions, and how to monitor and troubleshoot them. Moving on, you’ll get practical recipes on integrating DevOps with Azure Functions and providing continuous deployment with Visual Studio Team Services. The book also provides hands-on steps and tutorials based on real-world serverless use cases to guide you through configuring and setting up your serverless environments with ease. Finally, you’ll see how to manage Azure Functions, providing enterprise-level security and compliance to your serverless applications.
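For a small taste of the programming model the book covers, here is a minimal HTTP-triggered Azure Function in Python (the v1 programming model); the greeting logic and function name are invented for illustration:

```python
# Minimal HTTP-triggered Azure Function (Python v1 programming model).
# The greeting logic is illustrative; the trigger binding is declared
# in an accompanying function.json file.
import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    # Read the name from the query string or the JSON body.
    name = req.params.get("name")
    if not name:
        try:
            name = req.get_json().get("name")
        except ValueError:
            name = None

    if name:
        return func.HttpResponse(f"Hello, {name}!", status_code=200)
    return func.HttpResponse("Pass a 'name' in the query string or body.",
                             status_code=400)
```

In a real deployment this file sits alongside a function.json that declares the HTTP trigger and output binding; the Functions runtime wires the request into `main` automatically.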
Tags : 
    
Microsoft