Optimize Storage Economics and Data Management with Machine Learning

Joint solution identifies and moves warm and cold data non-disruptively to cost-efficient object storage

Automated, Intelligent Decision Making is the Key

With the volume of unstructured data growing at a dramatic rate, managing this growth requires a new approach: one that can automatically make intelligent decisions so IT does not have to make them manually. Up to 90% of data is warm or cold, so it must be identified and managed effectively to keep the right data on the right storage at the right time. IT often over-provisions for performance and capacity in order to maintain SLAs, but this is an inefficient use of already limited budgets. Adding to the inefficiency, IT also ends up spending more time troubleshooting, managing performance and capacity, and dealing with disruptive migrations, upgrades, and maintenance.

Cloudian–DataSphere for Intelligent Data Management

Optimize your storage infrastructure and economics with DataSphere from Primary Data, storing warm and cold data in affordable and limitlessly scalable object storage from Cloudian.


Primary Data and Cloudian have teamed up to address the needs of the modern data center by focusing on scale, cost, and automation, without sacrificing the high performance of primary storage. Primary Data’s intelligent data management platform, DataSphere, automatically and non-disruptively moves data to the most appropriate storage type to ensure cost efficiency and meet user-defined business objectives and desired service levels.

Cloudian’s HyperStore and HyperFile storage software deliver new levels of technical and economic efficiency, dramatically reducing storage costs and making them an ideal platform for capacity-intensive data. Cloudian storage has unmatched durability and flexibly scales capacity as needed to support data growth, from terabytes to hundreds of petabytes, across all locations.

DataSphere Puts the Right Data, in the Right Place, at the Right Time

DataSphere employs machine learning software to build intelligence into how an enterprise automates the management of data across its IT infrastructure, both on-premises and in the cloud. The DataSphere machine learning metadata engine adds awareness between applications and infrastructure, virtualizing data by mounting all storage into a single, global namespace. This makes heterogeneous data stores such as Cloudian HyperStore and HyperFile simultaneously available to all applications without requiring any changes to application workloads. DataSphere uses industry-standard, open-source protocols to automate tiering and live data mobility without application disruption, and supports SMB and NFS v3 for data access.

This same technology allows DataSphere to define a global namespace that hosts a heterogeneous storage environment, build high-performance scale-out NAS clusters with load-balanced, parallelized read/write streams directly to arrays, and gather application performance telemetry. This virtualization of the data means that you can add file, block, and object storage to your infrastructure without creating storage silos. Applications see a single file structure no matter where the data is located.
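
To make these access paths concrete, the sketch below shows how an application host might reach the same data set both over a standard NFS mount of the DataSphere global namespace and over Cloudian HyperStore’s S3-compatible API using the boto3 library. The mount point, endpoint URL, credentials, and bucket name are hypothetical placeholders, not values taken from either product.

    # A minimal sketch, assuming a DataSphere-published NFS export and an
    # S3-compatible HyperStore endpoint. All names below are hypothetical.
    #
    # 1) File access through the global namespace (standard NFS v3 mount):
    #    mount -t nfs -o vers=3 datasphere.example.com:/global /mnt/global
    #
    # 2) Object access to Cloudian HyperStore through its S3-compatible API:
    import boto3

    s3 = boto3.client(
        "s3",
        endpoint_url="https://hyperstore.example.com",  # hypothetical endpoint
        aws_access_key_id="ACCESS_KEY",
        aws_secret_access_key="SECRET_KEY",
    )

    # List objects in a bucket that holds tiered warm and cold data.
    for obj in s3.list_objects_v2(Bucket="warm-archive").get("Contents", []):
        print(obj["Key"], obj["Size"])

In this model an application can keep its existing file workflow while capacity-intensive data resides on object storage, which is the behavior the joint solution describes.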

DataSphere Key Features

Machine Learning
Manage billions of files while monitoring application performance and all data, making decisions to optimize the infrastructure every 15 seconds.

Support for Heterogeneous Environments
Easily deploy into mixed-vendor environments with support for the NFS, SMB, and S3 protocols.

Objective Expressions
Easily define the context for data management, guiding the machine learning engine so that it can make the best decisions for the infrastructure, based on your preferred business outcomes.

Metadata Management
Manage your data at any level of granularity — shares, volumes, or files. Metadata management allows DataSphere to make decisions based on age, type, tags, location, owner, size, etc.
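
As a purely hypothetical illustration of how objective expressions and metadata can drive placement decisions, the Python sketch below classifies files by age, size, and tags and picks a target tier. The function name, tier names, and thresholds are invented for this example and are not DataSphere syntax.

    # Hypothetical sketch only; not DataSphere's objective or policy syntax.
    from datetime import datetime, timedelta

    def choose_tier(last_access: datetime, size_bytes: int, tags: set) -> str:
        """Pick a storage tier from simple metadata: age, size, and tags."""
        age = datetime.utcnow() - last_access
        if "pinned-hot" in tags:
            return "primary-nas"            # business objective overrides age and size
        if age > timedelta(days=90):
            return "cloudian-object-store"  # cold data moves to object storage
        if age > timedelta(days=14) and size_bytes > 100 * 1024 * 1024:
            return "cloudian-object-store"  # large warm files also tier down
        return "primary-nas"                # hot data stays on primary storage

    # Example: a 5 MB file untouched for 200 days lands on object storage.
    print(choose_tier(datetime.utcnow() - timedelta(days=200), 5_000_000, set()))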

Cloudian HyperStore and HyperFile

Cloudian’s HyperStore petabyte-scale object storage platform provides an elegant, cost-effective, on-premises solution for capacity-intensive storage and archive. Designed to meet the need for large, secure, highly resilient, and flexible storage infrastructure at low cost, HyperStore seamlessly stores, moves, and protects objects and files across locations, including the public cloud. Cloudian HyperFile is a NAS controller that provides file services for HyperStore.

Cloudian clusters upgrade non-disruptively with newer, higher-capacity nodes, eliminating costly overprovisioning. HyperStore’s automatic data verification and self-healing functions provide reliability and resilience against hardware failures, while its data encryption, both in flight and at rest, safeguards data against threats of deletion or theft via malware.
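
As a rough illustration of protecting data in flight and at rest from the client side, the sketch below writes an object to HyperStore over HTTPS and requests standard S3 server-side encryption. The endpoint, credentials, bucket, and key are hypothetical, and the encryption options actually available depend on how the HyperStore cluster is configured.

    # A minimal sketch, assuming an S3-compatible HyperStore endpoint that
    # honors the standard S3 server-side-encryption request header.
    import boto3

    s3 = boto3.client(
        "s3",
        endpoint_url="https://hyperstore.example.com",  # HTTPS protects data in flight
        aws_access_key_id="ACCESS_KEY",
        aws_secret_access_key="SECRET_KEY",
    )

    # Request server-side encryption so the object is stored encrypted at rest.
    s3.put_object(
        Bucket="warm-archive",
        Key="projects/q3/report.bin",
        Body=b"example payload",
        ServerSideEncryption="AES256",
    )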

KEY SOLUTION BENEFITS

Optimize storage economics by improving infrastructure utilization and performance, reducing the need to over-provision.

Cost-efficient object storage for capacity-intensive application data.

Eliminate application downtime by virtualizing your data into a global namespace, pooling heterogeneous storage to make it easy to consume NAS and S3 with no reconfiguration or refactoring.

Reduce IT workload with machine learning that monitors application performance and manages the placement of the right data, in the right place, at the right time.

Easily grow with your needs, adding capacity and scaling only as needed.

Pay-as-you-grow — DataSphere is a software subscription licensed per instance. Cloudian subscriptions start at ½¢ per GB/month.

Manage data according to business needs instead of storage concepts. Objective-based management uses a model of financial arbitrage to align the infrastructure with those needs.