Amazon Web Services Introduces Big Data Storage


Bangalore: Amazon Web Services (AWS) has announced a new storage package, called High Storage, that offers more capacity and faster access to large amounts of data. The move clearly signals Amazon's intention to be a major player in the big data storage market.

According to the company, High Storage is an Amazon Elastic Compute Cloud (EC2) instance family primarily designed to run data-intensive analytical jobs such as seismic analysis, log processing and data warehousing. The infrastructure is built around a parallel file system architecture that allows data to be read from multiple disks simultaneously, thereby increasing transfer speed.
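The idea of reading from many disks at once can be illustrated with a small striping sketch. This is a minimal toy model only, not Amazon's implementation: it uses ordinary files in place of physical drives and Python threads in place of parallel disk controllers.

```python
import os
import tempfile
from concurrent.futures import ThreadPoolExecutor

def write_stripes(data, paths, stripe_size):
    """Split data into fixed-size stripes and distribute them
    round-robin across the 'disks' (here, plain files)."""
    stripes = [data[i:i + stripe_size] for i in range(0, len(data), stripe_size)]
    handles = [open(p, "wb") for p in paths]
    for i, s in enumerate(stripes):
        handles[i % len(paths)].write(s)
    for h in handles:
        h.close()
    return len(stripes)

def read_striped(paths, stripe_size, n_stripes):
    """Read every 'disk' in parallel, then interleave the stripes
    back into their original order."""
    def read_all(p):
        with open(p, "rb") as f:
            raw = f.read()
        return [raw[i:i + stripe_size] for i in range(0, len(raw), stripe_size)]
    with ThreadPoolExecutor(max_workers=len(paths)) as pool:
        per_disk = list(pool.map(read_all, paths))
    out = []
    for i in range(n_stripes):
        out.append(per_disk[i % len(paths)][i // len(paths)])
    return b"".join(out)

# Demo: stripe 1 MB of data across four temporary files ("disks").
tmp = tempfile.mkdtemp()
disks = [os.path.join(tmp, f"disk{i}.bin") for i in range(4)]
payload = os.urandom(1024 * 1024)
n = write_stripes(payload, disks, stripe_size=64 * 1024)
assert read_striped(disks, 64 * 1024, n) == payload
```

Because each disk is read concurrently, the sequential bandwidth of the drives adds up, which is exactly why striping data across many spindles speeds up large scans.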

"Instances of this family provide proportionally higher storage density per instance, and are ideally suited for applications that benefit from high sequential I/O performance across very large data sets," AWS states in the online marketing literature for this service.

The company has positioned the service as a complement to its Elastic MapReduce service; Amazon Web Services itself uses High Storage instances to run its Redshift data warehouse service.

The High Storage instance comes with notable specifications: 35 EC2 compute units of capacity along with 117GB of working memory. With 48TB of storage spread across multiple hard drives, reads can be striped across the drives in parallel, delivering strong sequential input/output performance.
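A back-of-the-envelope calculation shows why that drive layout matters. The drive count and per-drive throughput below are illustrative assumptions, not figures published by AWS; only the 48TB total comes from the article.

```python
# Hypothetical figures for illustration only: suppose the 48 TB is built
# from 24 drives of 2 TB each, each sustaining ~100 MB/s sequential reads.
n_drives = 48_000 // 2_000           # 24 drives (assumed split of the 48 TB)
per_drive_mb_s = 100                 # assumed per-drive sequential throughput
aggregate_mb_s = n_drives * per_drive_mb_s
print(aggregate_mb_s)                # 2400 MB/s if reads stripe perfectly

# Time to scan the full 48 TB at that aggregate rate (48 TB ≈ 48,000,000 MB):
scan_seconds = 48_000_000 / aggregate_mb_s
print(round(scan_seconds / 3600, 1))  # roughly 5.6 hours
```

The point of the arithmetic is the scaling: aggregate sequential bandwidth grows with the number of drives, so a full-dataset scan that would take days on one disk finishes in hours, which is what workloads like log processing and data warehousing need.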

The High Storage package is the ninth type of computing instance from Amazon Web Services. It joins a long list of specialized instance families, including those built around GPUs (Graphics Processing Units) and HPC (High-Performance Computing) workloads.

If your company is considering adopting AWS, look into AWS cost-management practices to help you run as efficiently as possible in the cloud.