Top Technology Trends

Date: Tuesday, May 19, 2015

Quantum (NYSE: QTM) is an expert in scale-out storage, archive and data protection, providing solutions for capturing, sharing and preserving digital assets over the entire data lifecycle. Headquartered in California, the company has a current market cap of $410.22 million.

  1. Flash Here - There - and Everywhere

    Flash has emerged as a key technology for storage in the 2010s. Whether you use it locally or network-attached, as a device or in a storage array, with or without software optimization, it can offer tremendous advantages in performance (both latency and IOPS) for use cases from VDI to VM convergence to databases. The challenge is - which type of flash solution, and whose flash, to pick? The market is so chaotic right now that figuring this out can be a real challenge.
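
    As a rough illustration of why flash matters for a use case like VDI, the Python sketch below compares aggregate desktop IOPS demand against per-device capability; the desktop count and the per-device IOPS figures are illustrative rule-of-thumb assumptions, not vendor specifications or benchmarks.

        # Back-of-the-envelope IOPS sizing for a VDI deployment (illustrative figures only).
        DESKTOPS = 1000          # assumed number of virtual desktops
        IOPS_PER_DESKTOP = 25    # assumed steady-state IOPS per desktop (boot storms run far higher)

        # Rough per-device IOPS figures (assumptions, not benchmarks).
        DEVICE_IOPS = {
            "7.2K RPM HDD": 80,
            "15K RPM HDD": 180,
            "SATA SSD": 50_000,
        }

        demand = DESKTOPS * IOPS_PER_DESKTOP
        print(f"Aggregate demand: {demand:,} IOPS")
        for device, iops in DEVICE_IOPS.items():
            # Devices needed for the random-I/O demand alone, ignoring RAID and write penalties.
            print(f"{device:>14}: ~{-(-demand // iops):,} devices")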


  2. Demand for the Easy Button: Simplicity

    Customers have spoken - clearly - to say that data complexity (and storage administration cost) is out of control. In addition to raw data growth, mobile and sensor devices - 20 billion of them by 2020, according to IDC - are all generating new endpoints where data gets created. Data is arriving as an ever-larger number of objects to be indexed, and each object is bigger than it used to be because more granular data is available to collect. Continuously streaming data presents new and unique performance and availability problems. And all data needs to be kept longer - in 2014 Gartner reported that archive retention periods were increasing, with the percentage of customers keeping data more than two years growing from 47 percent in 2011 to 59 percent in 2013. The only thing not growing is the administration budget.


  3. Data Awareness and Automation - The Future of Data Storage

    Beyond greater administration simplicity lies the next step - the easiest administration is to require none at all. The industry has already seen two major announcements this year on a new trend called Data Awareness: automatically collecting intelligence about users and the content they are storing in order to better manage the storage being used. Enterprise IT buyers are insisting that storage processes must not only be automated - they must also be business-aware, with the ability to link data placement and movement to data source, data type, user or organizational demographics, and business process management. Again, all automated - including automation to the cloud.
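
    As a minimal sketch of what such business-aware automation might look like, the hypothetical policy below maps a piece of data's metadata (owner group, content type, age) to a placement tier; the tier names, thresholds and the decide_placement helper are illustrative assumptions, not any vendor's actual API.

        # Hypothetical data-aware placement policy: route data to a tier based on
        # who created it, what it is, and how old it is (all names and thresholds assumed).
        from datetime import datetime, timedelta

        def decide_placement(owner_group: str, content_type: str, last_access: datetime) -> str:
            age = datetime.now() - last_access
            if content_type == "video" and owner_group == "public-safety":
                return "object-archive"      # long retention, rarely re-read
            if age < timedelta(days=30):
                return "flash-tier"          # hot data stays on flash
            if age < timedelta(days=365):
                return "disk-tier"           # warm data on capacity disk
            return "cloud-cold-storage"      # cold data tiers off to the public cloud

        print(decide_placement("engineering", "database", datetime.now() - timedelta(days=3)))
        print(decide_placement("public-safety", "video", datetime.now() - timedelta(days=400)))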


  4. Software-Defined Storage: Return to Best of Breed - with Support Headaches

    Software-defined storage is the latest rage. It's particularly popular among software startups looking to ease the buying and deployment of their new value-add on customers' existing hardware. The question is - is this the model customers really need? Yes, it offers unique, high-value software to the customer earlier than his traditional systems vendor can deliver it. But isn't this reminiscent of the 1980s, when customers bought "best-of-breed" products by category and then were stuck managing integration - and support - for multi-vendor environments? Software-defined storage reintroduces higher integration and support costs for customers at a time when they don't want to spend money on storage administration, which seems like a contradiction in priorities. And what happens when the software vendor gets acquired by a larger systems vendor that isn't the customer's hardware provider?


  5. Clouds Part, Data Costs (and Risks) Shine Through

    There's no question that public cloud has real value for "stateless" services like compute and networking. Few customers use 100 percent of these resources all the time - but sometimes they need more than 100 percent of what they want to buy. This makes the public cloud ROI outstanding in these areas. But data services are more challenging. Data has "gravity": first, it occupies its storage resources permanently; and second, geographic location matters. This means the economics of public storage clouds are only compelling for customers if they don't plan, or need, to regularly download any of their data stored in the cloud back to the private data center. Otherwise, the cost of moving data back and forth over the wire quickly adds up. That's why hybrid cloud architectures will increasingly become the preferred model, enabling customers to access or restore data locally while still taking advantage of the public cloud for infrequently accessed data and disaster recovery.
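
    To make the "back and forth over the wire" point concrete, here is a hedged cost sketch; the $0.09-per-GB egress rate and the 50 TB monthly recall volume are illustrative assumptions about public-cloud pricing of that era, not any provider's actual rate card.

        # Rough public-cloud egress cost for regularly pulling archived data back on-premises.
        EGRESS_RATE_PER_GB = 0.09      # assumed $/GB download charge (illustrative)
        MONTHLY_RECALL_TB = 50         # assumed TB recalled to the data center each month

        monthly_cost = MONTHLY_RECALL_TB * 1000 * EGRESS_RATE_PER_GB
        print(f"Monthly egress: ${monthly_cost:,.0f}")
        print(f"Annual egress:  ${monthly_cost * 12:,.0f}")
        # Data kept (or cached) locally avoids this recurring charge entirely,
        # which is the economic case for a hybrid architecture.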

    Speaking of regularly downloading data from an archive in the cloud: in addition to being expensive, it's also not very fast. Unless a customer is willing to pony up for a direct (and dedicated) network connection to his service provider, network performance is going to be gated by general WAN speeds and by how well the traffic shares the wire with everyone else's. Downloading a PB will take weeks, not hours or days. As companies begin to realize this, the idea of using more than one cloud provider will have greater appeal - just in case one of their vendors suddenly becomes too pricey or painful to deal with.
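
    The "weeks, not hours or days" claim is easy to check with a little arithmetic; the link speeds below are assumptions about typical shared versus dedicated WAN connectivity.

        # How long does it take to move 1 PB over the wire at various effective link speeds?
        PB_BITS = 10**15 * 8                      # 1 PB expressed in bits

        LINKS_BPS = {                             # assumed effective throughputs
            "100 Mbps shared WAN": 100e6,
            "1 Gbps link": 1e9,
            "10 Gbps dedicated link": 10e9,
        }

        for name, bps in LINKS_BPS.items():
            days = PB_BITS / bps / 86_400
            print(f"{name:>22}: {days:,.1f} days")
        # Roughly 926, 93 and 9 days respectively -- and that assumes the link
        # runs flat out with no protocol overhead or competing traffic.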


  6. Object Storage: Starring Role Behind the Scenes

    Over the last few years, next-generation object storage has increasingly been recognized by hyperscale operators like Google, AWS and Facebook as the answer to the shortcomings of traditional RAID in managing and protecting large data volumes. But object storage startups selling solutions to private enterprises have still seemed to struggle for market share. Why is that? Well, object storage - as an architecture - is a key enabler for storing large amounts of data for a long time with great integrity and availability - and no cost of forklift upgrades. It's great for static content, for example. But legacy block and file applications won't talk to it - and because its natural economies of scale tend to kick in at larger capacities, it's expensive for a commercial enterprise to test it as a standalone device. So object storage startups have had challenges in gaining adoption. But don't be confused. While object storage systems haven't taken off as standalone products, object storage technology is taking hold - buried in online archives, content distribution, and managed workflow solutions, and increasingly in converged offerings from traditional storage suppliers. In the coming year, this trend will accelerate, and object storage will increasingly be the underlying foundation of all scale-out storage solutions.
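
    To see why legacy file applications won't simply "talk to" an object store, compare in-place POSIX file access with whole-object PUT/GET access. The sketch below uses a toy in-memory dictionary as a stand-in for a bucket; the file name and object key are hypothetical, and real object stores expose the same pattern through S3-style APIs.

        import os, tempfile

        # 1) POSIX file access: applications read and write *in place* through paths.
        path = os.path.join(tempfile.mkdtemp(), "report.docx")   # stand-in file
        with open(path, "wb") as f:
            f.write(b"quarterly figures")
        with open(path, "r+b") as f:
            f.seek(10)                       # byte-level update in place
            f.write(b"FIGURES")

        # 2) Object access: a flat key namespace with whole-object PUT/GET only.
        object_store = {}                                        # toy stand-in for a bucket
        with open(path, "rb") as f:
            object_store["projects/report.docx"] = f.read()      # PUT = replace the whole object
        data = object_store["projects/report.docx"]              # GET = fetch the whole object
        # There is no seek or partial rewrite: changing one byte means re-PUTting the object.
        # That mismatch is why object storage usually sits behind an archive, gateway or
        # converged system rather than being presented raw to legacy applications.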



  7. Growing Need for Video-Specific Solutions

    Video is sprouting up everywhere. Suddenly everybody is making video and using it as a communication - and compliance - tool. On the communication side, 79 percent of consumer internet traffic will be video by 2018, according to a Cisco study. When it comes to marketing, Forbes Insights research has found that 59 percent of senior executives say they'd rather watch a video than read an advertisement, which will help drive $5 billion in video ads by 2016. On the compliance side, more and more public safety organizations are using video, particularly via on-body cameras (think GoPro for police), to capture live interactions and evidence, useful in both the courtroom and with the public. All of this will create increasing storage challenges in 2015 and beyond. Video files are huge, even compressed, and they don't dedupe. They also tend to be static - and re-usable - so they get stored for a long time. The combination of these factors tends to result in unpredicted data growth. And by the way, get ready for similar issues with streaming sensor data - in some applications, it looks a lot like video. The challenge is that traditional storage solutions don't work well for video, with its immense data sizes and (often) near-zero latency requirements. Customers need to look for ways to segregate this data and apply specialized tools to it - to avoid the risk that it undermines the performance of other storage applications.
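
    As a hedged sizing sketch of why body-camera video tends to produce unpredicted data growth, the figures below (bitrate, shift length, force size, retention) are illustrative assumptions, not agency requirements or product specifications.

        # Rough annual storage for a body-camera program (all inputs are assumptions).
        BITRATE_MBPS = 5          # assumed H.264 bitrate per camera
        SHIFT_HOURS = 8           # assumed recorded hours per officer per shift
        OFFICERS = 500            # assumed force size
        SHIFTS_PER_YEAR = 250     # assumed working shifts per officer per year
        RETENTION_YEARS = 2       # assumed evidence retention period

        gb_per_shift = BITRATE_MBPS * 1e6 / 8 * SHIFT_HOURS * 3600 / 1e9
        annual_tb = gb_per_shift * OFFICERS * SHIFTS_PER_YEAR / 1000
        print(f"~{gb_per_shift:.0f} GB per officer per shift")
        print(f"~{annual_tb:,.0f} TB generated per year")
        print(f"~{annual_tb * RETENTION_YEARS / 1000:.1f} PB retained at steady state")
        # Compressed video also gains nothing from deduplication, so the raw numbers stand.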