hadoop

Results 51 - 75 of 150
Published By: Group M_IBM Q418     Published Date: Oct 15, 2018
The enterprise data warehouse (EDW) has been the cornerstone of enterprise data strategies for over 20 years. EDW systems have traditionally been built on relatively costly hardware infrastructures. But ever-growing data volume and increasingly complex processing have raised the cost of EDW software and hardware licenses while impacting the performance needed for analytic insights. Organizations can now use EDW offloading and optimization techniques to reduce the costs of storing, processing and analyzing large volumes of data. Getting data governance right is critical to your business success. That means ensuring your data is clean, of excellent quality, and of verifiable lineage. Such governance principles can be applied in Hadoop-like environments. Hadoop is designed to store, process and analyze large volumes of data at significantly lower cost than a data warehouse. But to get the return on investment, you must infuse data governance processes as part of offloading.
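As a rough sketch of what EDW offloading can look like in practice, the PySpark snippet below copies a cold warehouse table into Hadoop as Parquet. All connection details, table names, and paths are hypothetical placeholders, not details from the paper.

```python
# Minimal EDW-offload sketch: move a rarely-queried fact table from the
# warehouse into Hadoop, where storage and processing are cheaper.
# The JDBC URL, credentials, table, and paths are invented placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("edw-offload").getOrCreate()

# Pull the cold table from the EDW over JDBC.
cold_facts = (spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://edw-host:5432/warehouse")
    .option("dbtable", "sales_history")
    .option("user", "etl_user")
    .option("password", "change-me")
    .load())

# Land it in HDFS as partitioned Parquet for Hadoop analytic engines.
(cold_facts.write
    .mode("overwrite")
    .partitionBy("sale_year")
    .parquet("hdfs:///datalake/offload/sales_history"))
```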
Published By: Splunk     Published Date: Sep 17, 2012
Splunk has become a critical real-time monitoring and analytics tool for Bronto. Read more about Bronto Software's hard and soft ROIs from using Splunk rather than extending their Hadoop deployment.
Tags : 
roi, enterprise management associates, real time monitoring, analytics, hadoop, email delivery system, enterprise business solutions
    
Published By: Teradata     Published Date: May 01, 2015
Creating value from enterprise data undoubtedly creates competitive advantage. Making sense of the data pouring into the data lake, accelerating the value of that data, and managing it effectively is a game-changer. Michael Lang explores how to achieve this success in “Data Preparation in the Hadoop Data Lake.” Enterprises experiencing success with data preparation acknowledge its three essential competencies: structuring, exploring, and transforming. Teradata Loom offers a new approach, enabling enterprises to get value from the data lake with an interactive method for preparing big data incrementally and iteratively. As the first complete data management solution for Hadoop, Teradata Loom enables enterprises to benefit from better and faster insights from a continuous data science workflow, improving productivity and business value. To learn more about how Teradata Loom can help improve productivity in the Hadoop data lake, download this report now.
Tags : 
data management, productivity, hadoop, interactive, enterprise
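To make the three competencies concrete, here is a minimal PySpark sketch of structuring, exploring, and transforming data in a Hadoop data lake. The paths and column names are invented for illustration; Teradata Loom itself offers its own interactive interface rather than this code.

```python
# Illustrative data-preparation sketch: structure, explore, transform.
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("lake-prep").getOrCreate()

# 1. Structuring: impose a schema on raw delimited files in the lake.
raw = spark.read.csv("hdfs:///lake/raw/clicks", header=True, inferSchema=True)

# 2. Exploring: profile the data before trusting it.
raw.printSchema()
raw.describe("duration_ms").show()

# 3. Transforming: clean and reshape incrementally into a curated zone.
curated = (raw.dropna(subset=["user_id"])
              .withColumn("event_date", F.to_date("event_ts")))
curated.write.mode("overwrite").parquet("hdfs:///lake/curated/clicks")
```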
    
Published By: Intel     Published Date: Jun 22, 2015
Until recently, we used the Intel® Distribution for Apache Hadoop Software (IDH) to support our original three business intelligence (BI) big data use cases, and it delivered results worth millions of dollars to Intel.
Published By: BMC Software     Published Date: Jul 22, 2015
This CIO eBook explores how to deploy Hadoop applications faster and easier with a workload automation solution that simplifies and automates Hadoop batch processing and connected enterprise workflows. Read the eBook to learn:
• The role—and challenges—of Hadoop in Big Data application development
• Six considerations for a Hadoop proof-of-concept initiative
• How to connect Hadoop to other enterprise data sources and applications
Tags : 
big data, hadoop applications, hadoop development
    
Published By: BMC Software     Published Date: Jul 22, 2015
In this white paper, you’ll discover an enterprise approach to Big Data that leverages workload automation to:
- Integrate Hadoop workflows into your enterprise processes to deliver new applications faster
- Resolve issues faster with predictive analytics, automated alerts, and early problem detection
- Achieve compliance and governance adherence
Tags : 
big data, business processes, enterprise systems, hadoop, compliance
    
Published By: BMC Software     Published Date: Jul 22, 2015
Integrate your Big Data initiatives into your company-wide business processes. Learn how Control-M for Hadoop can accelerate application development and simplify enterprise integration. Topics discussed include:
* How an enterprise scheduler for Hadoop helps you avoid creating further islands of automation
* How to ensure that your Big Data initiatives deliver the expected added value
* How to meet the administrative challenges and demands of your Big Data initiatives and successfully master potential conflicts
Published By: BMC Software     Published Date: Jul 22, 2015
Managing Hadoop batch processing may consume a significant portion of application developers’ time and effort, which drives up application development times and costs. This paper from BMC discusses the obstacles IT organizations face in developing and managing Hadoop jobs and workflows and how a workload automation solution can remove these barriers.
Tags : 
hadoop, batch jobs, workflow, scheduling tools, automation
    
Published By: BMC Software     Published Date: Jul 22, 2015
In this paper, learn how a proven workload automation solution can help you shorten development time and get Hadoop applications into production more quickly. By replacing scripting with the standard functions provided by a workload automation system, you can also deliver an application that is simpler, more reliable, and easier to debug once in production.
Tags : 
big data, production, workload automation, hadoop, debugging, scripting
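To illustrate what replacing scripts with a scheduler's standard functions can look like, the sketch below expresses a two-step Hadoop batch flow in Apache Airflow, used here purely as a generic stand-in; BMC's Control-M has its own interface, and the commands, jar, and paths are hypothetical.

```python
# Sketch: a Hadoop batch flow defined in a workload automation tool instead of
# hand-rolled shell scripts. Retries, dependencies, and alerting come from the
# scheduler. Airflow is a stand-in; jar names and paths are invented.
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="nightly_hadoop_batch",
    start_date=datetime(2015, 7, 1),
    schedule_interval="@daily",
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    ingest = BashOperator(
        task_id="ingest",
        bash_command="hdfs dfs -put /staging/events.log /data/raw/",
    )
    aggregate = BashOperator(
        task_id="aggregate",
        bash_command="hadoop jar /opt/jobs/aggregate.jar /data/raw /data/out",
    )
    # The dependency is declared once, not encoded in script exit-code checks.
    ingest >> aggregate
```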
    
Published By: Hortonworks     Published Date: Apr 05, 2016
Download this whitepaper to learn how Hortonworks Data Platform (HDP), built on Apache Hadoop, offers the ability to capture all structured and emerging types of data, keep it longer, and apply traditional and new analytic engines to drive business value, all in an economically feasible fashion. In particular, organizations are breathing new life into enterprise data warehouse (EDW)-centric data architectures by integrating HDP to take advantage of its capabilities and economics.
Published By: Hortonworks     Published Date: Apr 05, 2016
This white paper will help you evaluate your ability to protect your data in an Apache Hadoop ecosystem. Read on to learn ten signs that you might need to improve security and data governance in order to manage risk while getting more value out of your Apache Hadoop environment.
Published By: BlueData     Published Date: Mar 13, 2018
In a benchmark study, Intel compared the performance of Big Data workloads running on a bare-metal deployment versus running in Docker containers with the BlueData software platform. This landmark benchmark study used unmodified Apache Hadoop workloads.
Tags : 
big data, big data analytics, hadoop, apache spark, docker
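The study's exact workloads are not reproduced here, but the examples jar that ships with Hadoop includes the classic TeraGen/TeraSort/TeraValidate benchmark trio, which a small harness along these lines could drive. The jar location varies by distribution, and the row count is arbitrary.

```python
# Sketch: run the stock TeraGen/TeraSort/TeraValidate benchmark sequence.
# The jar path is a common default and may differ on your distribution;
# this is not necessarily the workload set Intel used in the study.
import subprocess

EXAMPLES_JAR = "/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar"
ROWS = str(10_000_000)  # 100-byte rows, roughly 1 GB of input

for args in (["teragen", ROWS, "/bench/in"],
             ["terasort", "/bench/in", "/bench/out"],
             ["teravalidate", "/bench/out", "/bench/report"]):
    subprocess.run(["hadoop", "jar", EXAMPLES_JAR] + args, check=True)
```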
    
Published By: StreamSets     Published Date: Sep 24, 2018
The advent of Apache Hadoop™ has led many organizations to replatform their existing architectures to reduce data management costs and find new ways to unlock the value of their data. One area that benefits from replatforming is the data warehouse. According to research firm Gartner, “starting in 2018, data warehouse managers will benefit from hybrid architectures that eliminate data silos by blending current best practices with ‘big data’ and other emerging technology types.” There’s undoubtedly a lot to gain by modernizing data warehouse architectures to leverage new technologies; however, the replatforming process itself can be harder than it first appears. Hadoop projects often take longer than they should to deliver the promised benefits, and many problems can be avoided if you know what to watch out for from the onset.
Tags : 
replatforming, age, data, lake, apache, hadoop
    
Published By: Pentaho     Published Date: Mar 08, 2016
If you’re evaluating big data integration platforms, you know that with the increasing number of tools and technologies out there, it can be difficult to separate meaningful information from the hype and identify the right technology to solve your unique big data problem. This analyst research provides a concise overview of big data integration technologies and reviews key things to consider when creating an integrated big data environment that blends new technologies with existing BI systems to meet your business goals. Read the Buyer’s Guide to Big Data Integration by CITO Research to learn:
• What tools are most useful for working with Big Data, Hadoop, and existing transactional databases
• How to create an effective “data supply chain”
• How to succeed with complex data on-boarding, using automation for more reliable data ingestion
• The best ways to connect, transport, and transform data for data exploration, analytics, and compliance
Tags : 
data, buyer guide, integration, technology, platform, research
    
Published By: Pentaho     Published Date: Apr 28, 2016
Although the phrase “next-generation platforms and analytics” can evoke images of machine learning, big data, Hadoop, and the Internet of things, most organizations are somewhere in between the technology vision and today’s reality of BI and dashboards. Next-generation platforms and analytics often mean simply pushing past reports and dashboards to more advanced forms of analytics, such as predictive analytics. Next-generation analytics might move your organization from visualization to big data visualization; from slicing and dicing data to predictive analytics; or to using more than just structured data for analysis.
Tags : 
pentaho, best practices, hadoop, next generation analytics, platforms, infrastructure, data, analytics in organizations
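As a tiny illustration of the step from descriptive dashboards to predictive analytics, the sketch below fits a simple churn classifier with scikit-learn; the dataset and feature names are invented for the example.

```python
# Minimal predictive-analytics sketch: predict churn instead of reporting it.
# The Parquet file and its columns are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

df = pd.read_parquet("customers.parquet")
X = df[["tenure_months", "monthly_spend", "support_tickets"]]
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))
```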
    
Published By: Teradata     Published Date: Feb 26, 2013
This report explores the evolution of big data analytics and its maturity within the enterprise. It discusses the approaches and economics of using a Discovery platform and Apache Hadoop within the same unified analytical architecture.
Tags : 
big data analytics, experiences with teradata, apache hadoop, analytics, discovery platform, teradata
    
Published By: Teradata     Published Date: Feb 26, 2013
Business Insights Using Hadoop with SQL Analytics: harness the value of big data analytics. This report discusses a unified data architecture solution that provides valuable insight from new and existing data.
Tags : 
power of analytics, harnessing the value of big data, business intelligence, teradata, unified data architecture
    
Published By: Dell EMC     Published Date: Jun 29, 2016
Traditional DAS or Scale-out NAS for Hadoop Analytics? Here are our top 8 reasons to choose a Scale-Out Data Lake on EMC Isilon for Hadoop Analytics.
Tags : 
emc isilon, storage, best practices, data
    
Published By: Dell EMC     Published Date: Jun 29, 2016
IDC believes that EMC Isilon is indeed an easy-to-operate, highly scalable and efficient Enterprise Data Lake Platform. IDC validated that a shared storage model based on the Data Lake can in fact provide enterprise-grade service levels while performing better than dedicated commodity off-the-shelf storage for Hadoop workloads.
Tags : 
storage, data, enterprise, best practices, platform
    
Published By: BlueData     Published Date: Aug 19, 2015
Many organizations seeking to get started with Hadoop have resorted to the public cloud to avoid the complexities of deployment. Yet organizations also want the simplicity of cloud deployment combined with the efficiencies of multi-tenant operations on premises. Download this white paper to see why simplifying on-premises deployment matters and how doing so can greatly improve manageability and security.
Tags : 
big data, hadoop, hadoop-as-a-service, multi-tenant, and infrastructure
    
Published By: BlueData     Published Date: Aug 19, 2015
Big Data is on virtually every enterprise’s to-do list these days. Recognizing both its potential and competitive advantage, companies are aligning a vast array of resources to access and analyze this strategic asset. However, despite best intentions, the majority of these Big Data initiatives are either extremely slow in their implementation or are not yielding the results and benefits that enterprises expect. Download this white paper to learn how to close the Big Data intention-deployment gap and see how you can turn your infrastructure into a flexible, easy-to-use platform that delivers in-depth analytics.
Tags : 
big data, big data intention-deployment, in-depth analytics, hadoop
    
Published By: BlueData     Published Date: Aug 19, 2015
As companies seek to better understand their customers, their opportunities, and themselves, they are embracing new technologies such as Hadoop and NoSQL to better manage and manipulate their data. Yet a complete big data solution has many moving parts, and those parts are continuously evolving. Download this white paper to learn how to make all the moving parts work smoothly together, easing frustration for business users and freeing up your IT team’s time for other issues.
Tags : 
big data infrastructure, hadoop deployment, spark, analytics software, big data
    
Published By: BlueData     Published Date: Aug 19, 2015
Over the past few years, “Big Data” has evolved from an interesting technology topic into a source of major competitive advantage; an IDG survey found that 60% of enterprises plan to spend an average of $8 million on Big Data initiatives. However, somewhere between intention and investment on one side and execution and production on the other, Big Data initiatives are falling into a gap. Download this white paper to find out how to change the equation on Big Data spending and learn what successful companies are doing to get results from their Big Data applications.
Tags : 
bigdata, hadoop, big data spending, big data projects, it commitments
    
Published By: Snowflake     Published Date: Jan 25, 2018
Compared with implementing and managing Hadoop or a traditional on-premises data warehouse, a data warehouse built for the cloud can deliver a multitude of unique benefits. The question is, can enterprises get the processing potential of Hadoop and the best of traditional data warehousing, and still benefit from related emerging technologies? Read this eBook to see how modern cloud data warehousing presents a dramatically simpler but more powerful approach than both Hadoop and traditional on-premises or “cloud-washed” data warehouse solutions.
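For a concrete feel of the cloud data warehouse model, here is a minimal sketch that runs a query through the snowflake-connector-python package; the account, credentials, and table are placeholders.

```python
# Minimal query sketch against a cloud data warehouse via the Snowflake
# Python connector. Account, credentials, and schema are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",
    user="analyst",
    password="change-me",
    warehouse="ANALYTICS_WH",
    database="SALES",
)
cur = conn.cursor()
try:
    cur.execute("SELECT region, SUM(amount) FROM orders GROUP BY region")
    for region, total in cur:
        print(region, total)
finally:
    cur.close()
    conn.close()
```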
Published By: Cohesity     Published Date: Aug 09, 2019
As organizations continue to look for ways to increase business agility, the need for a modern database architecture that can rapidly respond to the needs of the business is more apparent than ever. While the RDBMS still serves as a lifeline for many organizations, the adoption of technologies such as NoSQL and Hadoop is enabling organizations to address database performance and scalability requirements while also satisfying the goals of embracing hybrid cloud and becoming more data-driven. And with organizations relying so heavily on these new technologies to yield rapid insights that positively impact the business, the need to evaluate how those new technologies are managed and protected is essential. Hadoop and NoSQL workloads are now pervasive in production environments and require “production-class” data protection, yet few data protection solutions offer such capabilities today.
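One primitive that production-class Hadoop protection builds on is the HDFS snapshot. The sketch below drives the standard hdfs CLI from Python; the directory is hypothetical, and products such as Cohesity layer policy, cataloging, and recovery on top of primitives like this.

```python
# Sketch: point-in-time HDFS snapshots via the standard CLI.
# The target directory is hypothetical; an admin must first run
#   hdfs dfsadmin -allowSnapshot /data/warehouse
import subprocess
from datetime import datetime

DATA_DIR = "/data/warehouse"

def take_snapshot(path: str) -> str:
    name = "backup-" + datetime.now().strftime("%Y%m%d-%H%M%S")
    subprocess.run(["hdfs", "dfs", "-createSnapshot", path, name], check=True)
    return name

print(take_snapshot(DATA_DIR))
```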