Guest Blog Post by Colm Keegan from Storage Switzerland

Various industry sources estimate that data is doubling approximately every two years, and the largest share of that growth is coming from unstructured data: user files, images, rich multimedia, machine sensor data and anything else that lives outside of a database application.

Storage Scaling Dilemma

The challenge is that traditional storage systems, which rely on “scale-up” architectures (populating disk drives behind a dual-controller system) to increase capacity, typically don’t scale well to the multi-petabyte (PB) data growth now occurring in most enterprise data centers. On the other hand, while some “scale-out” NAS systems can support multiple PBs of storage within a single filesystem, they are often not a viable option because adding capacity to these systems usually requires adding CPU and memory resources at the same time, resulting in a high total cost of ownership.

Commoditized Storage Scaling

Businesses need a way to cost-effectively store and protect their unstructured data repositories using commodity, off-the-shelf storage resources and/or low-cost cloud storage capacity. In addition, these repositories need to scale massively to support multiple PBs of data and enable businesses to seamlessly share this information across wide geographical distances. Beyond storage scale and economy, these resources should also be easy to integrate with existing business applications. And ideally, they should be performance-optimized for unstructured data files.

Software Driven Capacity

Software-defined storage (SDS) technologies are storage hardware-agnostic solutions that allow businesses to use any form of storage to build out a low-cost storage infrastructure: internal server disk, conventional hard disk drives inside a commodity disk array, or even a generic disk enclosure populated with high-density drives. Likewise, with some SDS offerings, disk resources in the primary data center can be pooled with storage in secondary data center facilities located anywhere in the world, and combined with cloud storage to give businesses a virtually unlimited pool of low-cost capacity.
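The pooling idea above can be reduced to a very small sketch. This is purely illustrative, with hypothetical backend names and capacities; it simply shows how heterogeneous resources present to applications as one aggregate pool.

```python
# Hypothetical SDS capacity pool: three dissimilar backends (server disk,
# a JBOD enclosure, a cloud tier) exposed to applications as one pool.
backends = [
    {"name": "server-disk", "capacity_tb": 48},
    {"name": "jbod-enclosure", "capacity_tb": 320},
    {"name": "cloud-tier", "capacity_tb": 1024},
]

# Applications see a single pool, not three devices.
pool_capacity_tb = sum(b["capacity_tb"] for b in backends)
print(pool_capacity_tb)  # 1392
```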

Plug-and-Play Integration

From an integration perspective, some of these solutions provide seamless integration between existing business applications and cloud storage through native support for the NFS and CIFS protocols. So instead of going through the inconvenience and expense of re-coding applications against cloud storage APIs like REST, OpenStack Swift or Amazon’s S3 protocol, these technologies essentially make a private or hybrid cloud object storage deployment a plug-and-play implementation, while still providing the option to go “native” in the future.
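The contrast between the two access styles can be sketched as follows. This is a toy illustration, not any vendor's API: the in-memory `put_object`/`get_object` functions are hypothetical stand-ins for an S3-style service, and the temp directory stands in for a mounted NFS/CIFS share fronting the object store.

```python
import os
import tempfile

# File-protocol path: the application keeps its ordinary POSIX file
# calls; a gateway would map this write into the object store.
share = tempfile.mkdtemp()  # stand-in for a mounted NFS/CIFS share
path = os.path.join(share, "reports", "q3.pdf")
os.makedirs(os.path.dirname(path), exist_ok=True)
with open(path, "wb") as f:
    f.write(b"report bytes")

# Object-protocol path: a re-coded application would instead address
# data as bucket + key, with no directories or file handles.
object_store = {}  # stand-in for an S3-compatible service

def put_object(bucket, key, body):
    object_store[(bucket, key)] = body

def get_object(bucket, key):
    return object_store[(bucket, key)]

put_object("reports", "q3.pdf", b"report bytes")

# Either route yields the same bytes; the gateway hides the translation.
with open(path, "rb") as f:
    assert f.read() == get_object("reports", "q3.pdf")
```

The point of the sketch is the difference in application-facing semantics: file paths and handles on one side, bucket-plus-key PUT/GET on the other.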

Tapping Into Cloud Apps

But storage integration isn’t limited to on-premises applications; it applies to cloud-based applications as well. Today there is a large ecosystem of Amazon S3-compatible applications that businesses may want to leverage, including backup and recovery, archiving, and file sync and share. Gaining access to these software offerings through an S3-compatible object storage framework gives businesses even more use cases and value from low-cost hybrid cloud storage.

Data Anywhere Access

Now businesses can provision object storage resources on-premises and/or across public cloud infrastructure to give their end-users ubiquitous access to data regardless of physical location. This enables greater data mobility and can enhance collaboration amongst end-users working across all corners of the globe. Furthermore, by replicating data across geographically dispersed object storage systems, businesses can automatically back up data residing in remote and branch offices, further enhancing data resiliency.
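The remote-office replication pattern can be sketched minimally. All names here are hypothetical (this is not the Cloudian HyperStore API); the toy class simply fans each write out to its peer sites, which is the essence of the scheme described above.

```python
# Toy geo-replication: every write at one site is pushed to peer sites.
class ObjectStore:
    """A minimal object store that replicates each PUT to its peers."""

    def __init__(self, site):
        self.site = site
        self.objects = {}
        self.peers = []

    def put(self, key, body, replicate=True):
        self.objects[key] = body
        if replicate:  # fan the write out once, without echoing back
            for peer in self.peers:
                peer.put(key, body, replicate=False)

# A branch office writes locally; the data lands at the central DC too,
# giving the remote site an automatic off-site copy.
branch = ObjectStore("branch-office")
central = ObjectStore("central-dc")
branch.peers.append(central)

branch.put("nightly/backup.tar", b"backup bytes")
assert central.objects["nightly/backup.tar"] == b"backup bytes"
```

Real systems add conflict handling, consistency policies and retry logic, but the resiliency benefit comes from exactly this fan-out: the branch office's data survives loss of the branch site.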

With data-intensive applications like big data analytics and data mining clamoring for high-speed access to information, object storage repositories need to deliver good performance as well. Ideally, the storage solution should be tuned to read, write and store large objects efficiently.

Stem The Data Tide

Businesses today need a seamless way to grow low-cost, abundant hybrid cloud storage resources across the enterprise to meet the unstructured data tsunami flooding their data center environments. In addition to providing virtually unlimited capacity scaling, these resources need to integrate easily into existing application environments and provide high-performance access to large unstructured data objects. Cloudian’s HyperStore solution provides all of these capabilities through a software-defined storage approach, giving businesses the flexibility to choose amongst existing commodity disk assets in the data center and/or low-cost object storage in the cloud to help stem the unstructured data tide.


About Author

Colm Keegan is a 23-year IT veteran whose focus at Storage Switzerland is enterprise storage, backup and disaster recovery solutions.
