data storage systems

Results 26 - 50 of 75
Published By: Pure Storage     Published Date: Dec 05, 2018
Deep learning opens up new worlds of possibility in artificial intelligence, enabled by advances in computational capacity, the explosion in data, and the advent of deep neural networks. But data is evolving quickly and legacy storage systems are not keeping up. Read this MIT Technology Review custom paper to learn how advanced AI applications require a modern all-flash storage infrastructure built specifically for high-powered analytics, helping to accelerate business outcomes for data-driven organizations.
Tags : 
    
Pure Storage
Published By: HP     Published Date: Jul 29, 2008
The data residing on your storage systems and media, known as data-at-rest, presents serious security concerns. Regulations and mandates around the world are putting the burden on companies and government entities to protect the private information they store. Increasingly, companies are required to publicly disclose breaches that put individuals' private data at risk, whether that individual is a customer, employee, shareholder, partner, or other stakeholder.
Tags : 
data security, database security, securing data, customer data, consumer data, pci, cardholder data, mission critical
    
HP
Published By: F5 Networks Inc     Published Date: Jan 22, 2009
F5 and Data Domain have combined their respective solutions in a partnership designed to help customers deploy and realize the benefits of tiered storage. By combining F5’s tiered storage policy engine with Data Domain deduplication storage systems, mutual customers can realize the benefits of tiered storage and, importantly, see dramatic reductions in storage costs.
Tags : 
data domain, curb storage costs, file servers, tiered storage, deduplication storage systems
    
F5 Networks Inc
Published By: VMware/Intel Server Refresh and Cost Savings     Published Date: Jan 27, 2009
Frustrated by the cost of maintaining ever-larger data centers, or of building new ones, many companies are exploring virtualization. Virtualization lets your IT staff turn your data center into an internal cloud of computing resources controlled by a single virtual data center operating system (VDC-OS).
Tags : 
vmware, virtualization, server virtualization, data center, fc san, fibre, network-attached storage, nas, nic, high availability, vmware consolidated backup, vcb, vdc os, tco, total cost of ownership, vstorage, distributed power management, dpm, management information systems, mis
    
VMware/Intel Server Refresh and Cost Savings
Published By: VMware/Intel Server Refresh and Cost Savings     Published Date: Oct 23, 2007
Today's use of virtualization technology allows IT professionals to automatically manage the resources of the physical server to efficiently support multiple operating systems, each supporting different applications. This IDC Technology Assessment presents IDC's view of how virtualization technologies are impacting, and will continue to impact, operating environments and the operating environment market in the near and long term.
Tags : 
vmware, operating environments, virtualization, x86, server virtualization, data center, fc san, fibre, network-attached storage, nas, nic, high availability, vmware consolidated backup, vcb, vdc os, tco, total cost of ownership, vstorage, distributed power management, dpm
    
VMware/Intel Server Refresh and Cost Savings
Published By: VMware/Intel Server Refresh and Cost Savings     Published Date: Feb 16, 2007
VMware virtualization enables customers to reduce their server TCO and quickly delivers significant ROI. This paper describes commonly used TCO models and looks at several case studies that apply those models to virtualization projects. Learn more.
Tags : 
vmware, virtualization, tco, cpu, it infrastructure, server virtualization, data center, fc san, fibre, network-attached storage, nas, nic, high availability, vmware consolidated backup, vcb, vdc os, tco, total cost of ownership, vstorage, distributed power management
    
VMware/Intel Server Refresh and Cost Savings
Published By: AWS     Published Date: Nov 02, 2017
Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store massive amounts of data in a central location, so it’s readily available to be categorized, processed, analyzed, and consumed by diverse groups within an organization. Since data, structured and unstructured, can be stored as-is, there’s no need to convert it to a predefined schema, and you no longer need to know in advance what questions you want to ask of your data.
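The schema-on-read idea behind a data lake can be illustrated with a minimal sketch. This is a toy model, not AWS's implementation: the in-memory dictionary stands in for an object store, and the file names and fields are hypothetical. The point is that heterogeneous data lands as-is, and each consumer applies its own schema only at read time.

```python
import csv
import io
import json

lake = {}  # stands in for an object store such as S3 (illustrative only)

def store(key, raw_bytes):
    """Land data in the lake exactly as received: no upfront schema."""
    lake[key] = raw_bytes

def read_as_table(key, parser):
    """Apply a schema at read time, chosen by the consumer."""
    return parser(lake[key])

# Structured and unstructured data land side by side, unconverted.
store("logs/app.json", json.dumps([{"user": "a", "ms": 120}]).encode())
store("exports/sales.csv", b"region,amount\neu,100\nus,250\n")

# Different consumers impose different schemas on the same lake.
rows = read_as_table("logs/app.json", lambda b: json.loads(b))
sales = read_as_table("exports/sales.csv",
                      lambda b: list(csv.DictReader(io.StringIO(b.decode()))))

print(rows[0]["ms"])       # prints 120
print(sales[1]["amount"])  # prints 250
```

Contrast this with a traditional warehouse, where both sources would have to be converted to one predefined schema before loading.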
Tags : 
    
AWS
Published By: NetApp     Published Date: Sep 30, 2013
As the rapid rise in the number of mobile devices and users creates an explosion in data and virtual machine instances, datacenter transformation becomes imperative for many enterprises. It is essential that enterprises consolidate resources and cut both capital and operating costs while still supporting distributed applications. This brief white paper delves into a Q&A with Eric Sheppard, research director of IDC’s Storage Software program, on integrated systems and whether you should buy compute, network, and storage resources together. Read on to discover: what integrated systems are and their benefits; the differences between an integrated platform and integrated infrastructure; how datacenters are leveraging these new systems today; and more.
Tags : 
storage infrastructure, clustered storage, technology, scalability, application, storage solution, non-disruptive operations
    
NetApp
Published By: Pure Storage     Published Date: Feb 25, 2014
This white paper assumes familiarity with Oracle database administration, as well as basic Linux system and storage administration tasks required for a typical database installation, such as creating partitions and file systems. The Pure Storage Flash Array is ready to run your Oracle database with no changes to your existing configuration, and you can realize significant improvements in performance and ease of manageability.
Tags : 
pure storage, oracle, database, flash for oracle, flash for database
    
Pure Storage
Published By: NexGen     Published Date: Feb 09, 2015
94% of IT professionals consider it valuable to manage data based on its business value, yet only 32% are doing it, because it's hard. Are you looking to be part of the 32% without the difficulties? There is a substantial gap between those who want to use data to its full potential and those who are actually doing so. Storage systems have never been designed with the end goal of helping customers prioritize their data based on their own priorities and the value of that data. Narrowing that gap may not be as hard or expensive as many companies think. Solving these challenges was one of the founding principles behind NexGen's Hybrid Flash Array. Download now to learn how IT professionals can use NexGen storage architecture and software to simplify the data priority assignment and management process.
Tags : 
data, data value, nexgen, flash, hybrid
    
NexGen
Published By: Cisco     Published Date: Feb 13, 2009
This white paper details why continuous data protection for files is important today, when lost data can significantly affect productivity, customer satisfaction, and, ultimately, revenue.
Tags : 
cisco, data protection, network storage systems, data recovery, data loss
    
Cisco
Published By: IBM     Published Date: Jul 25, 2012
As has been the trend over the last decade, organizations must continue to deal with growing data storage requirements using the same or fewer resources. The growing adoption of storage-as-a-service, business intelligence, and big data results in ever more service level agreements that are difficult to fulfill without IT administrators spending ever-longer hours in the data center. Many organizations now expect their capital expense growth for storage to be unstoppable, and see operating expense levers, such as purchasing storage systems that are easy to manage, as the only way to control data storage-related costs.
Tags : 
infrastructure, technology, cloud, storage, virtualization, data management
    
IBM
Published By: CDW     Published Date: Aug 04, 2016
As data volumes grow, you need more than just storage space. Let us help you orchestrate a solution that brings you the scalability and agility you need to move your organization forward. Storage needs are changing rapidly, and legacy appliances and processes just can’t keep up. Old systems are running slowly and filling up fast. At CDW, we can help you evolve your storage with a smart solution that’s ready for what lies ahead.
Tags : 
data, technology, storage, best practices, best solutions
    
CDW
Published By: CDW     Published Date: Sep 21, 2015
Years of IT infrastructure advancements have helped to drive out vast amounts of costs within the datacenter. Technologies like server and storage virtualization, data deduplication, and flash-based storage systems (to name just a few) have contributed to improvements in utilization rates, performance, and resiliency for most organizations. Unfortunately, organizations still struggle with deeply rooted operational inefficiencies related to IT departments with silos of technology and expertise that lead to higher complexity, limited scalability, and suboptimal levels of agility. The recent tectonic shifts caused by the rise of 3rd Platform applications that focus on social, mobile, cloud, and big data environments have amplified the pains associated with these structural inefficiencies.
Tags : 
quantify, infrastructure, performance, resiliency, data deduplication, datacenter, virtualization
    
CDW
Published By: AWS     Published Date: Oct 26, 2018
Today’s organisations are tasked with analysing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organisations are finding that in order to deliver analytic insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store enormous amounts of data in a central location, so it’s readily available to be categorised, processed, analysed, and consumed by diverse groups within an organisation. Since data, structured and unstructured, can be stored as-is, there’s no need to convert it to a predefined schema, and you no longer need to know in advance what questions you want to ask of your data.
Tags : 
data, lake, amazon, web, services, aws
    
AWS
Published By: Texas Memory Systems     Published Date: Feb 25, 2012
With flash-based systems now commonly offering TBs of capacity and rapidly becoming more affordable, it may be time to look at using solid state storage with high-bandwidth applications. Read this white paper to find out more.
Tags : 
solid state storage, bandwidth, high bandwidth, applications, data warehousing, tbs, tb capacity, storage, ramsan, texas memory systems, high availability, performance, solid state, enterprise, racks, data storage, digital signal, low latency, enterprise data storage, ssds
    
Texas Memory Systems
Published By: Texas Memory Systems     Published Date: Feb 25, 2012
Most companies have only put limited effort toward the system architecture that surrounds the flash memory modules. In this white paper, Storage Switzerland focuses on the importance of architecture design in solid state storage systems.
Tags : 
ssd, architecture, design, zero-latency, retrieval, solid state storage, bandwidth, high bandwidth, flash controllers, applications, data warehousing, tbs, tb capacity, storage, ramsan, texas memory systems, high availability, performance, solid state, enterprise
    
Texas Memory Systems
Published By: Texas Memory Systems     Published Date: Feb 25, 2012
It is common knowledge that PCIe SSD is faster, right? Maybe not, says George Crump from Storage Switzerland. Read on to find out why.
Tags : 
pcie, pcie ssd, ssd, architecture, design, zero-latency, retrieval, solid state storage, bandwidth, high bandwidth, flash controllers, applications, data warehousing, tbs, tb capacity, storage, ramsan, texas memory systems, high availability, performance
    
Texas Memory Systems
Published By: Texas Memory Systems     Published Date: Feb 25, 2012
In this Part II of Enhancing Server and Desktop Virtualization with SSD, Storage Switzerland focuses on the next challenge: integrating SSD into a virtual server or desktop infrastructure and effectively using the technology in this environment.
Tags : 
integrated ssd, ssd, virtual server, infrastructure, desktop infrastructure, server virtualization, desktop virtualization, architecture, design, zero-latency, retrieval, solid state storage, bandwidth, high bandwidth, flash controllers, applications, data warehousing, tbs, tb capacity, storage
    
Texas Memory Systems
Published By: Texas Memory Systems     Published Date: Feb 25, 2012
For data centers looking to get the maximum performance from Flash-based storage supporting their virtual infrastructure, the RamSan-720 should be given strong consideration.
Tags : 
ramsan-720, integrated ssd, ssd, virtual server, infrastructure, desktop infrastructure, server virtualization, desktop virtualization, architecture, design, zero-latency, retrieval, solid state storage, bandwidth, high bandwidth, flash controllers, applications, data warehousing, tbs, tb capacity
    
Texas Memory Systems
Published By: Texas Memory Systems     Published Date: Feb 25, 2012
In this final piece on server virtualization and SSD, Storage Switzerland discusses the power of performance, specifically the dramatic impact that latency-free storage can make on the virtual environment. Read on to find out more!
Tags : 
desktop virtualization, architecture, design, zero-latency, retrieval, solid state storage, bandwidth, high bandwidth, flash controllers, applications, data warehousing, tbs, tb capacity, storage, ramsan, texas memory systems, high availability, performance, solid state, enterprise
    
Texas Memory Systems
Published By: Gigaom     Published Date: Oct 24, 2019
A huge array of BI, analytics, data prep and machine learning platforms exist in the market, and each of those may have a variety of connectors to different databases, file systems and applications, both on-premises and in the cloud. But in today’s world of myriad data sources, simple connectivity is just table stakes. What’s essential is a data access strategy that accounts for the variety of data sources out there, including relational and NoSQL databases, file formats across storage systems — even enterprise SaaS applications — and can make them all consumable by tools and applications built for tabular data. In today’s data-driven business environment, fitting omni-structured data and disparate applications into a consistent data API makes comprehensive integration, and insights, achievable. Want to learn more and map out your data access strategy? Join us for this free 1-hour webinar from GigaOm Research. The webinar features GigaOm analyst Andrew Brust and special guests, Eric
Tags : 
    
Gigaom
Published By: SAS     Published Date: Mar 06, 2018
Imagine getting into your car and saying, “Take me to work,” and then enjoying an automated drive as you read the morning news. We are getting very close to that kind of scenario, and companies like Ford expect to have production vehicles in the latter part of 2020. Driverless cars are just one popular example of machine learning. It’s also used in countless applications such as predicting fraud, identifying terrorists, recommending the right products to customers at the right time, and correctly identifying medical symptoms to prescribe appropriate treatments. The concept of machine learning has been around for decades. What’s new is that it can now be applied to huge quantities of data. Cheaper data storage, distributed processing, more powerful computers and new analytical opportunities have dramatically increased interest in machine learning systems. Other reasons for the increased momentum include: maturing capabilities with methods and algorithms refactored to run in memory; the
Tags : 
    
SAS
Published By: Intel Corp.     Published Date: Aug 08, 2012
This report covers the challenges of first-generation deduplication technology and the advantages of next-generation deduplication products. Next-generation Dedupe 2.0 systems use a common deduplication algorithm across all storage systems, whether they're smaller systems in branch offices or large data center storage facilities. That means no more reconstituting data as it traverses different storage systems, which saves bandwidth and improves performance.
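The core mechanism shared by deduplication systems can be sketched in a few lines. This is a toy illustration of content-addressed, fixed-size chunking, not any vendor's implementation: real products typically use KB-scale, often variable-size (content-defined) chunks, and the in-memory dictionary stands in for the chunk store.

```python
import hashlib

CHUNK = 4  # tiny for demonstration; real systems use KB-scale chunks

chunk_store = {}  # hash -> chunk bytes, each unique chunk stored once

def write(data: bytes):
    """Split data into chunks; return a recipe of chunk hashes."""
    recipe = []
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        h = hashlib.sha256(chunk).hexdigest()
        chunk_store.setdefault(h, chunk)  # skip chunks already present
        recipe.append(h)
    return recipe

def read(recipe):
    """Reassemble the original data from its chunk references."""
    return b"".join(chunk_store[h] for h in recipe)

r1 = write(b"AAAABBBBCCCC")
r2 = write(b"AAAABBBBDDDD")  # shares its first two chunks with r1

print(read(r1) == b"AAAABBBBCCCC")  # prints True
print(len(chunk_store))             # prints 4: 4 unique chunks, not 6
```

Because identical chunks hash to the same key regardless of which system wrote them, a common algorithm lets deduplicated data move between systems without being reconstituted in between.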
Tags : 
advances, deduplication, help, tame, big, data, challenges, first, generation, technology, dedupe, 2.0, algorithm
    
Intel Corp.
Published By: InMage     Published Date: Feb 24, 2009
This paper describes a series of tests run to determine the viability of continuous data protection (CDP) using InMage DR Scout along with Agami Systems AIS 3000 series of unified storage systems.
Tags : 
inmage, dr scout, accenture, cdp, continuous data protection, agami systems ais 3000, backup, data management, disaster recovery, network file systems, nfs, common internet file systems, cifs, raid
    
InMage