ingest

Results 1 - 21 of 21
Published By: Amazon Web Services     Published Date: Jun 20, 2018
Data and analytics have become an indispensable part of gaining and keeping a competitive edge. But many legacy data warehouses introduce a new challenge for organizations trying to manage large data sets: only a fraction of their data is ever made available for analysis. We call this the “dark data” problem: companies know there is value in the data they have collected, but their existing data warehouse is too complex, too slow, and simply too expensive to use. A modern data warehouse is designed to support rapid data growth and interactive analytics over a variety of relational, non-relational, and streaming data types through a single, easy-to-use interface. It provides a common architectural platform for combining new big data technologies with existing data warehouse methods, enabling organizations to derive deeper business insights. Key elements of a modern data warehouse:
• Data ingestion: take advantage of relational, non-relational, and streaming data sources
• Federated query
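The ingestion element above typically means bulk-loading files from object storage into the warehouse. Below is a minimal sketch of that step in Python, assuming an Amazon Redshift-style COPY command; the cluster endpoint, bucket, table, and IAM role are hypothetical placeholders.

```python
# Minimal sketch: bulk ingestion into a cloud data warehouse via COPY.
# All connection details and object names are hypothetical.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="loader", password="***",
)
with conn, conn.cursor() as cur:
    # COPY pulls files from object storage in parallel, which is the
    # usual high-throughput ingestion path for a modern warehouse.
    cur.execute("""
        COPY clickstream_events
        FROM 's3://example-bucket/events/2018/06/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftLoader'
        FORMAT AS JSON 'auto';
    """)
```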
Tags : 
    
Amazon Web Services
Published By: Adobe     Published Date: Nov 07, 2013
Today’s leading DMPs are ingesting a wide range of owned and licensed data streams for insights and segmentation and are pushing data into a growing number of external targeting platforms, helping marketers deliver more relevant and consistent marketing communications.
Tags : 
adobe, the forrester wave, forrester dmp wave, audience management, data management platforms, multi-touchpoint targeting, multi-touchpoint execution, dmp vendor offerings, seamless data ingestion, message delivery, strong vendors, product evaluations, selecting right partner, audience insights, coordinated targeting
    
Adobe
Published By: Ogilvy - IBM UK     Published Date: Aug 08, 2011
This solution brief introduces the Smart Archive strategy from IBM, a comprehensive approach that combines IBM software, systems and service capabilities to help you drive costs down and ensure that critical content is properly retained and protected.
Tags : 
ibm uk, ibm smart archive, ibm smart archive strategy, hardware, software, services, optimize, unify content, ingest, infrastructure, integrate, integration, compliance, analytics, ediscovery
    
Ogilvy - IBM UK
Published By: Pure Storage     Published Date: Apr 18, 2018
In today’s world, it’s critical to have infrastructure that supports both massive data ingest and rapid analytics evolution. At Pure Storage, we built the ultimate data hub for AI, engineered to accelerate every stage of the data pipeline. Download this infographic for more information.
Tags : 
    
Pure Storage
Published By: Hitachi Vantara     Published Date: Mar 20, 2018
ESG Lab performed hands-on evaluation and testing of the Hitachi Content Platform portfolio, consisting of Hitachi Content Platform (HCP), Hitachi Content Platform Anywhere (HCP Anywhere) online file sharing, Hitachi Data Ingestor (HDI), and Hitachi Content Intelligence (HCI) data aggregation and analysis. Testing focused on integration of the platforms, global access to content, public and private cloud tiering, data quality and analysis, and the ease of deployment and management of the solution.
Tags : 
    
Hitachi Vantara
Published By: IBM     Published Date: Apr 14, 2017
Any organization wishing to process big data from newly identified data sources needs first to determine the characteristics of the data and then define the requirements that must be met to ingest, profile, clean, transform and integrate this data to ready it for analysis. Having done that, it may well turn out that existing tools cannot cater for the data variety, data volume and data velocity that these new data sources bring. If that is the case, new technology will clearly need to be considered to meet the needs of the business going forward.
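As one illustration of the profiling step described above, here is a minimal sketch in Python using pandas; the sample file name and the choice of profile metrics are hypothetical.

```python
# Minimal sketch: determine the characteristics of a new data source
# before committing to an ingestion design. File name is hypothetical.
import pandas as pd

sample = pd.read_csv("new_source_sample.csv", nrows=100_000)

# Types, null rates, and cardinality guide the later clean/transform/
# integrate stages and surface variety and quality issues early.
profile = pd.DataFrame({
    "dtype": sample.dtypes.astype(str),
    "null_pct": sample.isna().mean().round(3),
    "distinct": sample.nunique(),
})
print(profile)
```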
Tags : 
data integration, big data, data sources, business needs, technological advancements, scaling data
    
IBM
Published By: IBM     Published Date: Apr 18, 2017
The data integration tool market was worth approximately $2.8 billion in constant currency at the end of 2015, an increase of 10.5% from the end of 2014. The discipline of data integration comprises the practices, architectural techniques and tools that ingest, transform, combine and provision data across the spectrum of information types in the enterprise and beyond — to meet the data consumption requirements of all applications and business processes. The biggest changes in the market from 2015 are the increased demand for data virtualization, the growing use of data integration tools to combine "data lakes" with existing integration solutions, and the overall expectation that data integration will become cloud- and on-premises-agnostic.
Tags : 
data integration, data security, data optimization, data virtualization, database security, data analytics, data innovation
    
IBM
Published By: IBM     Published Date: Jan 20, 2017
Government agencies are taking advantage of new capabilities like mobile and cloud to deliver better services to their citizens. Many agencies are going paperless, streamlining how they interact with citizens and providing services faster and more efficiently. This short video shows real examples of how government agencies are applying new capabilities like cognitive computing and analytics to improve how they ingest, manage, store and interact with content.
Tags : 
ibm, ecm, analytics, smarter content, ecm for government
    
IBM
Published By: Pentaho     Published Date: Mar 08, 2016
If you’re evaluating big data integration platforms, you know that with the increasing number of tools and technologies out there, it can be difficult to separate meaningful information from the hype and identify the right technology to solve your unique big data problem. This analyst research provides a concise overview of big data integration technologies and reviews key things to consider when creating an integrated big data environment that blends new technologies with existing BI systems to meet your business goals. Read the Buyer’s Guide to Big Data Integration by CITO Research to learn:
• What tools are most useful for working with big data, Hadoop, and existing transactional databases
• How to create an effective “data supply chain”
• How to succeed with complex data on-boarding, using automation for more reliable data ingestion (see the sketch below)
• The best ways to connect, transport, and transform data for data exploration, analytics and compliance
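On the automated on-boarding point, the following is a minimal sketch of a validation gate in front of ingestion, assuming CSV input; the expected schema, file paths, and quality rules are all hypothetical.

```python
# Minimal sketch: automated on-boarding with a validation gate, so bad
# files are rejected and bad rows quarantined rather than loaded silently.
import pandas as pd

EXPECTED_COLUMNS = {"order_id", "customer_id", "amount", "ts"}  # hypothetical schema

def onboard(path: str) -> pd.DataFrame:
    df = pd.read_csv(path)
    missing = EXPECTED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"rejecting {path}: missing columns {missing}")
    # Quarantine rows that fail basic quality checks.
    bad = df["amount"].isna() | (df["amount"] < 0)
    df[bad].to_csv(path + ".quarantine.csv", index=False)
    return df[~bad]
```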
Tags : 
data, buyer guide, integration, technology, platform, research
    
Pentaho
Published By: Dell EMC     Published Date: Mar 18, 2016
The EMC Isilon Scale-out Data Lake is an ideal platform for multi-protocol ingest of data. This is a crucial function in Big Data environments, in which it is necessary to quickly and reliably ingest data into the Data Lake using protocols closest to the workload generating the data. With OneFS it is possible to ingest data via NFSv3, NFSv4, SMB2.0, SMB3.0 as well as via HDFS. This makes the platform very friendly for complex Big Data workflows.
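To illustrate what multi-protocol ingest into a single namespace looks like in practice, here is a minimal sketch that writes the same records through a POSIX (NFS or SMB) mount and over HDFS; the mount point, cluster hostname, and paths are hypothetical, and pyarrow stands in as a convenient HDFS client.

```python
# Minimal sketch: the same namespace reached over two ingest protocols.
import pyarrow.fs as pafs

payload = b'{"sensor": "a1", "reading": 42}\n'

# Path 1: POSIX write through an NFS (or SMB) mount of the cluster.
with open("/mnt/onefs/landing/sensor.jsonl", "ab") as f:
    f.write(payload)

# Path 2: the same directory tree addressed over HDFS
# (hostname and port are hypothetical; requires libhdfs at runtime).
hdfs = pafs.HadoopFileSystem(host="onefs.example.com", port=8020)
with hdfs.open_output_stream("/landing/sensor_hdfs.jsonl") as f:
    f.write(payload)
```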
Tags : 
emc, emc isilon, data lake, storage, network, big data
    
Dell EMC
Published By: MemSQL     Published Date: Nov 15, 2017
Pairing Apache Kafka with a Real-Time Database. Learn how to:
• Scope data pipelines all the way from ingest to applications and analytics
• Build data pipelines using a new SQL command: CREATE PIPELINE (see the sketch below)
• Achieve exactly-once semantics with native pipelines
• Overcome the top challenges of real-time data management
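For context on CREATE PIPELINE, here is a minimal sketch that issues the command from Python over MemSQL's MySQL-compatible protocol; the host, Kafka endpoint, topic, and table names are hypothetical.

```python
# Minimal sketch: create and start a Kafka-fed pipeline in MemSQL.
# Connection details and object names are hypothetical.
import pymysql

conn = pymysql.connect(host="memsql.example.com", user="root", database="demo")
with conn.cursor() as cur:
    # The pipeline tails a Kafka topic and loads records continuously.
    cur.execute("""
        CREATE PIPELINE clicks_pipeline AS
        LOAD DATA KAFKA 'kafka.example.com:9092/clicks'
        INTO TABLE clicks
        FIELDS TERMINATED BY ',';
    """)
    cur.execute("START PIPELINE clicks_pipeline;")
```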
Tags : 
digital transformation, applications, data, pipelines, management
    
MemSQL
Published By: Snowflake     Published Date: Jan 25, 2018
To thrive in today’s world of data, knowing how to manage and derive value from semi-structured data like JSON is crucial to delivering valuable insight to your organization. One of the key differentiators in Snowflake is the ability to natively ingest semi-structured data such as JSON, store it efficiently, and then access it quickly using simple extensions to standard SQL. This eBook will give you a modern approach to producing analytics from JSON data using SQL, easily and affordably.
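As a minimal sketch of those SQL extensions, the following Python snippet lands a JSON document in a VARIANT column and queries nested fields with Snowflake's path syntax; the account, credentials, table, and JSON shape are hypothetical.

```python
# Minimal sketch: native JSON ingestion and querying in Snowflake.
# Account, credentials, and data are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account", user="analyst", password="***",
    warehouse="ANALYTICS_WH", database="DEMO", schema="PUBLIC",
)
cur = conn.cursor()
# Land raw JSON in a single VARIANT column...
cur.execute("CREATE OR REPLACE TABLE raw_events (v VARIANT)")
cur.execute("""INSERT INTO raw_events
               SELECT PARSE_JSON('{"user": {"id": 7}, "action": "click"}')""")
# ...then query nested fields with path syntax and casts.
cur.execute("SELECT v:user.id::INT, v:action::STRING FROM raw_events")
print(cur.fetchall())
```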
Tags : 
    
Snowflake
Published By: Amazon Web Services     Published Date: Apr 27, 2018
Until recently, businesses seeking real-time information about their customers, products, or applications struggled to get it. Streaming data, such as website clickstreams, application logs, and IoT device telemetry, could be ingested but not analyzed in real time for any kind of immediate action. For years, analytics were understood to be a snapshot of the past, but never a window into the present. Reports could show us yesterday’s sales figures, but not what customers are buying right now. Then, along came the cloud. With the emergence of cloud computing, and new technologies leveraging its inherent scalability and agility, streaming data can now be processed in memory, and more significantly, analyzed as it arrives, in real time. Millions to hundreds of millions of events (such as video streams or application alerts) can be collected and analyzed per hour to deliver insights that can be acted upon in an instant. From financial services to manufacturing, this rev
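As a minimal sketch of the ingestion side of such a stream, the snippet below publishes a clickstream event to Amazon Kinesis with boto3; the stream name, region, and payload are hypothetical.

```python
# Minimal sketch: publish a clickstream event to a Kinesis stream.
# Stream name, region, and event contents are hypothetical.
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

event = {"page": "/checkout", "user_id": 42, "ts": "2018-04-27T12:00:00Z"}
# Each record becomes available to downstream consumers within seconds,
# which is what makes "a window into the present" possible.
kinesis.put_record(
    StreamName="clickstream",
    Data=json.dumps(event).encode("utf-8"),
    PartitionKey=str(event["user_id"]),
)
```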
Tags : 
    
Amazon Web Services
Published By: VSS Monitoring     Published Date: Jun 17, 2014
Read this Solution Guide to learn how to optimize monitoring and security for 40G and 100G networks. While network owners are migrating to 40G and 100G infrastructures, most tools available today cannot ingest high-speed traffic: they lack both the physical interfaces and the processing power to do so.
Tags : 
vss monitoring, bandwidth, network, infrastructure, 40g, 100g, interface
    
VSS Monitoring
Published By: SAS     Published Date: Mar 06, 2018
When designed well, a data lake is an effective data-driven design pattern for capturing a wide range of data types, both old and new, at large scale. By definition, a data lake is optimized for the quick ingestion of raw, detailed source data plus on-the-fly processing of such data for exploration, analytics, and operations. Even so, traditional, latent data practices are possible, too. Organizations are adopting the data lake design pattern (whether on Hadoop or a relational database) because lakes provision the kind of raw data that users need for data exploration and discovery-oriented forms of advanced analytics. A data lake can also be a consolidation point for both new and traditional data, thereby enabling analytics correlations across all data. With the right end-user tools, a data lake can enable the self-service data practices that both technical and business users need. These practices wring business value from big data, other new data sources, and burgeoning enterprise data.
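To make “quick ingestion of raw data plus on-the-fly processing” concrete, here is a minimal sketch of landing a raw file in a lake and applying a schema only on read; the paths are hypothetical, and pyarrow stands in for whatever processing engine the lake uses.

```python
# Minimal sketch: raw ingestion now, schema on read later.
# All paths are hypothetical.
import shutil
import pyarrow.csv as pacsv

# Ingest: copy the raw file into the lake untouched (no upfront modeling).
shutil.copy("exports/orders_2018-03-06.csv", "/data/lake/raw/orders/")

# Explore: apply a schema on the fly only when the data is read.
table = pacsv.read_csv("/data/lake/raw/orders/orders_2018-03-06.csv")
print(table.schema)
```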
Tags : 
    
SAS
Published By: IBM     Published Date: Apr 06, 2016
As big data environments ingest more data, organizations will face significant risks and threats to the repositories containing this data. Failure to balance data security and quality reduces confidence in decision making. Read this e-Book for tips on securing big data environments.
Tags : 
ibm, big data, data security, risk management
    
IBM
Published By: IBM     Published Date: Jul 15, 2016
As big data environments ingest more data, organizations will face significant risks and threats to the repositories containing this data. Failure to balance data security and quality reduces confidence in decision making. Read this e-Book for tips on securing big data environments.
Tags : 
ibm, data, security, big data
    
IBM
Published By: SAS     Published Date: Oct 18, 2017
When designed well, a data lake is an effective data-driven design pattern for capturing a wide range of data types, both old and new, at large scale. By definition, a data lake is optimized for the quick ingestion of raw, detailed source data plus on-the-fly processing of such data for exploration, analytics and operations. Even so, traditional, latent data practices are possible, too. Organizations are adopting the data lake design pattern (whether on Hadoop or a relational database) because lakes provision the kind of raw data that users need for data exploration and discovery-oriented forms of advanced analytics. A data lake can also be a consolidation point for both new and traditional data, thereby enabling analytics correlations across all data. To help users prepare, this TDWI Best Practices Report defines data lake types, then discusses their emerging best practices, enabling technologies and real-world applications. The report’s survey quantifies user trends and readiness f
Tags : 
    
SAS
Published By: IBM     Published Date: Aug 04, 2016
IBM BigInsights is ready to help Quest Diagnostics Inc. ingest, normalize and analyze huge data sets, delivering new insight into clinical outcomes for physicians, hospitals, and millions of patients.
Tags : 
ibm, analytics, myaa, quest diagnostics, case study, data analytics, data insight, big data
    
IBM
Published By: Amazon Web Services     Published Date: May 18, 2018
We’ve become a world of instant information. We carry mobile devices that answer questions in seconds and we track our morning runs from screens on our wrists. News spreads immediately across our social feeds, and traffic alerts direct us away from road closures. As consumers, we have come to expect answers now, in real time. Until recently, businesses seeking real-time information about their customers, products, or applications struggled to get it. Streaming data, such as website clickstreams, application logs, and IoT device telemetry, could be ingested but not analyzed in real time for any kind of immediate action. For years, analytics were understood to be a snapshot of the past, but never a window into the present. Reports could show us yesterday’s sales figures, but not what customers are buying right now. Then, along came the cloud. With the emergence of cloud computing, and new technologies leveraging its inherent scalability and agility, streaming data can now be processed in memory and, more significantly, analyzed as it arrives, in real time.
Tags : 
    
Amazon Web Services
Published By: IBM     Published Date: Jan 20, 2017
Government agencies are taking advantage of new capabilities like mobile and cloud to deliver better services to their citizens. Many agencies are going paperless, streamlining how they interact with citizens and providing services faster and more efficiently. This short video shows real examples of how government agencies are applying new capabilities like cognitive computing and analytics to improve how they ingest, manage, store and interact with content.
Tags : 
    
IBM