Cisco engaged Miercom to conduct a competitive analysis of its Catalyst 2960-X switch versus two
comparable Hewlett-Packard switches from the 2920 and 5120 product families. Miercom executed comprehensive hands-on testing and evaluated the performance of widely deployed features that are critical to the reliable functioning of enterprise networks. The test methodology focused on specific areas in which Cisco believed there were key competitive differentiators between the products.
The report is divided into two main areas: Threat Intelligence, which gives an overview of the latest threat research from Cisco, and Analysis and Observations, where security industry consolidation and the emerging concept of integrated threat defense are discussed.
Nemertes Business Value Analysis independently evaluates technology products and
services to identify the value to enterprise organizations. Through detailed interviews with
technology professionals who use the products or services, Nemertes analyzes and
quantifies the real-world benefits and improvements to the efficiency of their organizations.
Self-service analytics implies that users design and develop their own reports and do their own data analysis with minimal support from IT. With the availability of tools such as those from Qlik, Spotfire, and Tableau, self-service analytics has recently become immensely popular. Besides powerful analytical and visualization capabilities, these tools all support functionality for accessing and integrating data sources. With respect to this aspect of data integration, four phases can be identified in the relatively short history of self-service analytics. This whitepaper describes these four phases in detail and shows how Cisco Data Preparation (CDP) and Cisco Information Server (CIS) for data virtualization can strengthen and enrich the self-service data integration capabilities of reporting and analytics tools.
To better understand the benefits, costs, and risks associated with Cisco TrustSec, Forrester interviewed two companies. Company A is an organization that provides senior housing for over 22,000 residents in the US. This organization used Cisco TrustSec to provide highly secure, segmented mobile network services to its residents. Company B is an international packaging and paper group company experiencing rapid growth. It used Cisco TrustSec as an innovative solution to quickly integrate the infrastructure of new acquisitions into its own and to standardize security architecture across its wide area network (WAN) globally. While the case study describes the experiences of both companies, the financial analysis focuses on the impact that Cisco TrustSec had on one organization.
SAP HANA is a powerful, in-memory computing platform that streamlines business suite applications, analytics, planning, predictive analysis, and sentiment analysis on a single platform, so businesses can operate in real time. The design approach for enterprise-level solutions involving SAP HANA, and the best practices surrounding them, isn’t intrinsically different from the approach to any other enterprise-level solution for technology implementations. This paper is written to address those elements of good solution design and apply them to the SAP landscape, with particular focus on the SAP HANA element.
WebEx elected to pursue a course of investigation and analysis that required a granular picture of its application-centric hardware implementations and corresponding power usage profiles. To accomplish this, WebEx decided to retool its data centers with intelligent power strips capable of remotely reporting power consumption.
A web presence is now essential for customers accessing a business's products and services. More than that, web pages have become the face of most companies to the public. Learn how you can take control of the Internet's effect on your company's web presence through Internet monitoring, analysis, and planning.
Published By: Internap
Published Date: Dec 02, 2014
NoSQL databases are now commonly used to provide a scalable system to store, retrieve, and analyze large amounts of data. Most NoSQL databases are designed to automatically partition data and workloads across multiple servers to enable easier, more cost-effective expansion of data stores than the single-server/scale-up approach of traditional relational databases. Public cloud infrastructure should provide an effective host platform for NoSQL databases given its horizontal scalability, on-demand capacity, configuration flexibility, and metered billing; however, the performance of virtualized public cloud services can suffer relative to bare-metal offerings in I/O-intensive use cases. Benchmark tests comparing latency and throughput of operating a high-performance, in-memory (flash-optimized), key-value store NoSQL database on popular virtualized public cloud services and an automated bare-metal platform show performance advantages of bare-metal over virtualized public cloud.
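The automatic partitioning mentioned above typically routes each key to a server by hashing it. A minimal sketch of that idea (key names and server count are illustrative; production systems add virtual nodes and rebalancing on membership changes):

```python
import hashlib

def partition(key: str, num_servers: int) -> int:
    """Map a key to a server index using a stable hash.

    A hypothetical, simplified version of the hash-based partitioning
    many NoSQL stores use to spread data and workload across servers.
    """
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_servers

# The same key always routes to the same server, so reads find the
# data that writes placed there, with no central lookup table.
server = partition("user:42", 4)
```

Because the mapping depends only on the key, any client can compute the owning server locally, which is what lets these systems scale out horizontally.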
Published By: Internap
Published Date: Mar 30, 2015
Selecting an Infrastructure-as-a-Service (IaaS) provider can be a complex exercise that involves an array of considerations including business needs, budget, and application requirements. Buyers frequently respond to this complexity by filtering vendors based on variables that are more easily comparable, usually product features, location, and price. By contrast, performance, which is a critical factor to ensuring fit with business needs and ultimately satisfaction with the service, is often ignored. Virtual machine (VM) performance can be challenging to assess because it can vary drastically across vendors, instance sizes and prices, as well as in terms of a particular application’s unique requirements.
This paper investigates the effects of DDR4’s Pseudo Open Drain (POD) driver on data bus signaling and describes methodologies for dynamically calculating the DRAM’s internal VrefDQ level required for data eye analysis.
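As background for the VrefDQ calculation the paper addresses, a POD (termination to VDDQ) bus has VOH equal to VDDQ, while VOL is set by the driver/termination voltage divider, so the ideal reference sits midway between them. A minimal sketch under that textbook model (the resistor and rail values are illustrative, not taken from the paper):

```python
def pod_vref(vddq: float, ron: float, rtt: float) -> float:
    """Ideal DQ reference voltage for a Pseudo Open Drain (POD) driver.

    With POD signaling the bus terminates to VDDQ, so VOH == VDDQ and
    only the low level depends on the divider formed by the driver
    pull-down (Ron) and the termination (Rtt):
        VOL  = VDDQ * Ron / (Ron + Rtt)
        Vref = (VOH + VOL) / 2
    """
    vol = vddq * ron / (ron + rtt)
    return (vddq + vol) / 2.0

# Example: VDDQ = 1.2 V, Ron = 34 ohm, Rtt = 48 ohm  ->  Vref ~ 0.85 V
vref = pod_vref(1.2, 34.0, 48.0)
```

Note that Vref is well above VDDQ/2, which is why DDR4 trains the DRAM's internal VrefDQ rather than using a fixed mid-rail reference.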
Over the years, two major approaches to SERDES simulation have emerged and gained popularity: time-domain (or bit-by-bit) and statistical. Both are used to build the eye diagram and bit-error ratio (BER), and each has its benefits and limitations.
This paper, nominated for the DesignCon 2016 Best Paper Award, analyzes the computational procedure specified for Channel Operation Margin (COM) and compares it to traditional statistical eye/BER analysis.
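In the statistical approach referenced above, the bit-error ratio at a sampling point follows from the Gaussian noise model: BER = 0.5 * erfc(Q / sqrt(2)), where Q is the ratio of half the eye opening to the noise sigma. A minimal sketch of that relation (the Q values are illustrative):

```python
import math

def ber_from_q(q: float) -> float:
    """Bit-error ratio for additive Gaussian noise at a given Q factor.

    Q = (eye opening / 2) / sigma. This is the closed-form core of a
    statistical eye/BER analysis; full tools also convolve in jitter
    and crosstalk distributions.
    """
    return 0.5 * math.erfc(q / math.sqrt(2.0))

# Q of roughly 7 corresponds to a BER on the order of 1e-12, a common
# target contour when drawing a statistical eye diagram.
ber = ber_from_q(7.0)
```

Time-domain (bit-by-bit) simulation estimates the same quantity empirically, which is why very low BER targets favor the statistical method: counting errors at 1e-12 directly would require trillions of simulated bits.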
The term “Big Data” has become virtually synonymous with “schema on read” unstructured data analysis and handling techniques like Hadoop. These “schema on read” techniques have been most famously exploited on relatively ephemeral, human-readable data like retail trends, Twitter sentiment, social network mining, log files, etc.
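The "schema on read" idea is that no structure is enforced when data is written; a schema is projected onto the raw records only when they are read. A minimal sketch (the log records and field names are hypothetical):

```python
import json

# Raw log lines written with no enforced schema; records may differ.
raw_lines = [
    '{"user": "a", "action": "view", "ms": 120}',
    '{"user": "b", "action": "click"}',  # "ms" field missing entirely
]

def read_with_schema(lines, schema):
    """Apply a schema only at read time: project each record onto the
    requested fields, filling gaps with per-field defaults."""
    for line in lines:
        rec = json.loads(line)
        yield {field: rec.get(field, default) for field, default in schema.items()}

rows = list(read_with_schema(raw_lines, {"user": None, "ms": 0}))
# rows[1]["ms"] falls back to the default 0 because the raw record lacked it
```

Contrast this with a relational database, which would have rejected the second record at write time ("schema on write") rather than reconciling it at query time.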
Published By: Altiscale
Published Date: Mar 30, 2015
Implementing and scaling Hadoop to analyze large quantities of data is enormously complicated. Unforeseen, very challenging problems are to be expected. However, if you can learn to recognize the problems before a fire starts, you can prevent your hair (and your Hadoop implementation) from igniting.
From the Hadoop experts at Altiscale, here are some of the danger signs and problems you should watch out for, as well as real-world lessons learned for heading them off.
This buyer’s guide provides an in-depth explanation of the factors that impel organizations to look at next-generation security solutions. It also:
-Analyzes the capabilities you should look for (and demand) in your network security solutions
-Arms you with the information you need to be an educated buyer
-Helps you get what you need, and not a set of future capabilities packaged in a “marketecture” that you can’t deploy
Published By: Internap
Published Date: Nov 11, 2015
This benchmark analysis examines the relative performance of major VM components including virtual cores, memory, block storage, and internal network for Internap AgileCLOUD and Amazon Web Services (AWS) EC2/EBS.
Over the years we’ve all heard claims of simple, seemingly magical solutions to solve security problems, including the use of sandboxing technology alone to fight advanced malware and targeted threats.
Prevention is your first line of defense. Make sure your Next-Gen Endpoint Security includes:
Global Threat Intelligence – a team of threat hunters detecting the newest threats and uncovering zero-days to keep you protected 24/7
Signature-based AV Detection – let your Next-Gen Endpoint Security solution do all the AV heavy lifting and consolidate protection onto one agent
Built-in Sandboxing – get static and dynamic analysis of suspicious threats, without having to deploy a third-party sandbox
Proactive Protection – identify and patch vulnerabilities, and analyze and stop suspicious low-prevalence executables before they become real problems