MLC poised to dominate SSD device market

Much has been made about the ongoing data storage market struggle between solid state devices and traditional hard disk drives. SSD adoption rates have quickly picked up steam in recent years as the cost of NAND-based flash memory has dropped, allowing more businesses and consumers to benefit from its performance advantages. However, another, less visible battle has been raging within the SSD market as single-level cell (SLC) and multi-level cell (MLC) flash memory vie for supremacy.

To the average consumer, the differences between SLC and MLC technology may appear negligible, but those disparities are extremely important to business owners. SLC flash devices have demonstrated better endurance, a greater range of operating temperatures and faster data read/write speeds than MLC storage options. From a purely performance perspective, SLC seems like the clear choice for prospective SSD adopters, particularly businesses that place a higher premium on operability. However, MLC flash memory is far less expensive to produce than SLC. Considering that perhaps the greatest barrier to wider SSD adoption has traditionally been price, cost is a vital aspect of the technology's viability.
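The cost gap comes down to density: an SLC cell stores one bit, an MLC cell two, and a TLC cell three, so the same silicon yields more capacity as bits per cell increase. A minimal sketch of that arithmetic, using a hypothetical die size for illustration:

```python
# Bits stored per flash cell for each NAND type (these ratios are standard;
# the die size below is a made-up figure for illustration only).
BITS_PER_CELL = {"SLC": 1, "MLC": 2, "TLC": 3}

def capacity_gb(cells: int, cell_type: str) -> float:
    """Usable capacity in gigabytes for a die with `cells` flash cells."""
    return cells * BITS_PER_CELL[cell_type] / 8 / 1e9

cells = 64_000_000_000  # hypothetical die with 64 billion cells
for kind in ("SLC", "MLC", "TLC"):
    print(f"{kind}: {capacity_gb(cells, kind):.0f} GB")
```

The same die delivers twice the capacity as MLC and three times as TLC, which is why cost per gigabyte falls even though endurance and performance suffer.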

Cost trumps performance
Some industry experts have predicted that MLC flash storage will become the SSD market standard in the coming years. Dell vice president Alan Atkinson recently forecast that MLC will completely replace SLC memory within five years.

"SLC is still the most reliable technology out there, that I am aware, and has very capable performance characteristics but just happens to be really, really expensive," Atkinson stated. 

FierceCIO noted that many SSD manufacturers have already embraced MLC for most of their products. This has even been witnessed at the enterprise level, where industry-specific data storage solutions often utilize MLC flash memory in all but a very select range of high-performance products.

ComputerWeekly's Antony Adshead actually predicted the demise of single-level cell storage back in March. He noted that although SLC was the best performing flash memory on the market, the price gap would be too much to overcome. Adshead argued that the real battle within the SSD market is between multi-level cell and triple-level cell (TLC) memory. TLC, the latest NAND option to emerge in the marketplace, has many inherent challenges, including higher power and cooling needs, compared with MLC. However, industry observers such as Adshead predict that the cost of TLC could drop below that of other options, making it a highly desirable commodity in the SSD market. Until then, it appears that MLC will continue its reign as market leader. 

Enhancing web-based business with SSD drives

In recent years, businesses have been clamoring to deploy their operational applications, software and data through off-site cloud services. Without the need for on-premises hardware and maintenance teams, organizations can significantly reduce costs without experiencing a drop in performance. In addition, cloud service providers offer the flexibility to deploy applications when needed, meaning businesses do not have to pay for software licenses they infrequently use.

Interest in software-as-a-service (SaaS) cloud computing has increased substantially as of late. A recent market study released by Gartner found that 77 percent of respondents said they intended to increase their SaaS spending. In addition, 17 percent said they would maintain their current investment rates. Gartner officials reportedly expected such positive results, as interest in cloud services has been on the rise for some time now.

"Seeing such high intent to increase spending isn't a huge surprise as the adoption of the on-demand deployment model has grown for more than a decade, but its popularity has increased significantly within the past five years," Gartner research vice-president Charles Eschinger said.

Facilitating web-based software performance
From a service provider's standpoint, the increasing demand for web-based software solutions has placed additional importance on end user performance needs. If the functionality of a cloud application suffers and users are unable to receive the responsiveness they expect, cloud service providers may face a high level of customer churn as dissatisfied clients drop their services in favor of a competitor's offerings. To prevent this scenario from occurring, cloud service providers can deploy SSD drives in their server rooms to improve the performance of their applications.

As noted in a March TechInsidr blog post, traditional hard disk drives can produce operational bottlenecks that may result in significant latency periods when launching applications. According to the source, Google identified latency as a major factor that could adversely impact user experience and lead to a drop in advertising revenue. The search engine juggernaut determined that a single 400 millisecond delay could cause a 0.59 percent decline in searches. While that may seem like a trivial number, such a swing could result in a substantial loss of revenue considering the massive number of searches entered into Google each day.
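To see why a fraction of a percent matters at that scale, here is a rough illustration. The 0.59 percent figure comes from the source above; the daily query volume and revenue-per-search numbers are assumptions chosen only to show the order of magnitude, not reported figures:

```python
# Illustrative only: converting a 0.59% search decline into hypothetical
# revenue terms. Volume and revenue-per-search are assumed, not sourced.
daily_searches = 5_000_000_000   # assumed daily query volume
revenue_per_search = 0.01        # assumed dollars earned per search
decline = 0.0059                 # -0.59% per the source

lost_searches_per_day = daily_searches * decline
lost_revenue_per_year = lost_searches_per_day * revenue_per_search * 365
print(f"~{lost_searches_per_day:,.0f} fewer searches per day")
print(f"~${lost_revenue_per_year:,.0f} in lost revenue per year")
```

Even with conservative assumptions, a sub-one-percent dip compounds into a nine-figure annual loss, which is why latency gets board-level attention.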

Solid state technology effectively avoids many of these latency issues because it operates on NAND flash-based memory, allowing systems to access applications and files with greater ease and speed. For instance, TechInsidr explained that one $19.5 billion web-based software vendor leveraged SSD drives to meet its performance needs. The two main benefits the company experienced using solid state devices were improved reliability and responsiveness. An operation on that scale necessitates an exceedingly high level of server performance to accommodate its high request volume. SSD drives can help shoulder that workload without critical equipment becoming overwhelmed. In addition, solid state technology can ensure that the end user is being provided with an application service that operates at an optimal level.

SSD drives facilitate the switch to digital film

For decades, celluloid film was the standard format of the movie industry. However, in recent years, more directors and theaters have converted to digital solutions, spurred on by a noticeable upgrade in picture quality and durability provided by the new technology.

"The benefit of digital is, you don't have damaged film," Regal Cinemas vice president Roger Frazee told CNN. "You don't have scratched prints, and it looks as good in week six as it does on day one."

In addition, digital film has removed many of the projection issues that were previously commonplace in movie theaters. Gone are the noticeable burn marks that would appear on screen, notifying audience members that a reel had been changed. Digital projectors are also devoid of the industrial rumble that characterized traditional models; patrons who sat in the back of the theater were no doubt distracted by a noisy film projector.

Driven by the improved audience experience provided by the newer format, most theaters in the United States have opted to upgrade their existing projectors to digital ones. According to the National Association of Theatre Owners, 88 percent of the theaters in the United States have converted to digital formats, CNN reported. 

Because of the expenses involved in a digital equipment upgrade – Regal Cinemas' conversion cost upwards of $75,000 per auditorium – many art house theaters have missed the digital revolution. Without a steady supply of celluloid-based films, these independent theaters are closing up shop all across the country. For example, the Saguaro Theater in Wickenburg, Ariz., is facing the possibility that it may have to shut down its 63-year operation because of the exorbitant cost of a digital upgrade.

The switch to digital hits independent filmmakers
Caught in the middle of the move to digital media are the independent filmmakers and videographers who depend on smaller screens to showcase their movies. Like the nation's cinemas, they too will need to convert to a digital format or risk losing relevancy in the modern film industry. However, the cost of upgrading to a digital camera is far more manageable than that of switching to a digital projector, even for an operation consisting of a single person.

Computer Dealer News noted that independent filmmakers and videographers can significantly benefit from a digital camera upgrade, as they can simplify the shooting and editing process while producing a far better product. For instance, the Blackmagic Cinema Camera is capable of capturing uncompressed video, which allows for unparalleled resolution and image quality. To fully take advantage of this feature, however, filmmakers will need to deploy an SSD drive capable of sustaining the format's demanding data rates. Prospective directors and videographers should take note that not all SSDs on the market can shoulder this workload. To ensure comprehensive uncompressed video compatibility, filmmakers should seek out a vendor who can service their entire storage needs.
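A quick back-of-envelope calculation shows why uncompressed capture demands a fast drive. The frame size, bit depth and frame rate below are assumptions for illustration; the camera's actual RAW format will differ:

```python
# Rough sustained-write requirement for uncompressed video capture.
# All parameters are illustrative assumptions, not camera specifications.
width, height = 1920, 1080
bits_per_pixel = 30   # assumed 10-bit-per-channel RGB
fps = 24

bytes_per_frame = width * height * bits_per_pixel / 8
mb_per_second = bytes_per_frame * fps / 1e6
print(f"~{mb_per_second:.0f} MB/s sustained write required")
```

Under these assumptions the drive must sustain well over 100 MB/s of writes for the entire length of a take, a pace many consumer SSDs cannot hold once their write caches fill.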

Enterprises flock to SSD drives

In recent years, businesses of every size have increasingly explored the benefits of burgeoning technological advances. E-commerce has allowed many small operations to expand their business to reach a growing global marketplace. Meanwhile, organizations representing nearly all sectors have leveraged big data solutions to gain actionable insight from industry and consumer trends and to use that information to improve future operational decisions.

These advanced applications require greater performance capacities to provide businesses and their clients with optimal service. The bottlenecks inherent in traditional HDD technology can be extremely disruptive when attempting to launch a job-critical task, resulting in less effective enterprise operations, poor returns on big data investments and ultimately a dissatisfied client base.

Many businesses seem to be increasingly recognizing the need for greater data storage performance, as more companies have invested in SSD drives. Because solid state technology is based on NAND flash memory and has no internal moving parts, it is not susceptible to many common data reading bottlenecks, meaning operators can launch their applications and boot systems with far less lag time. According to a study released by Research and Markets, these performance enhancements have pushed many businesses around the world to pursue solid state solutions. This was particularly evident among companies that wished to launch their own big data projects. Researchers reported that the increased performance requirements of these initiatives represented new market growth potential for PCIe SSD technology.

Balancing cost, performance and endurance
Although the cost of solid state technology remains a barrier to widespread adoption, a recent study on enterprise SSD usage reported encouraging findings for the potential of future market expansion. Fifty-two percent of surveyed IT professionals said their organizations utilized solid state storage solutions. However, 55 percent cited cost as a major barrier to wider expansion. Meanwhile, 77 percent of respondents stated that the endurance of flash-based memory would dictate the technology's growth potential.

"Since SSDs became a viable option for the enterprise, performance, cost and reliability have been critical points of evaluation. However, as IT purchasers begin to better understand the technology's strengths and weaknesses, we are seeing more importance placed on the balance between endurance and cost," said data storage industry member John Scaramuzzo. "We have always believed that endurance is the key to making SSDs truly viable in the enterprise."

However, in an effort to lower costs, many respondents reported deploying consumer-grade multi-level cell flash drives. Because they are not designed for enterprise workloads, these drives have lower endurance levels, wearing out more quickly than a more appropriate solution. As the cost of enterprise-level SSD drives drops, more businesses will have the opportunity to experience the benefits of a solid state storage solution that is tailored to their workloads and performance needs.

Optimizing data storage with a hybrid environment

With the relatively recent price drop of NAND flash memory, SSD drives have become much more widely utilized in both consumer and enterprise environments. However, those lower costs are still significantly higher than those of traditional hard disk drives. It is no secret that cost remains the greatest barrier to greater solid state adoption rates, especially when viewed in terms of price per gigabyte. Simply put, HDDs are currently a much cheaper alternative.

That lower price point comes with its own share of issues, as traditional storage drives relying on magnetic disks are notoriously prone to bottlenecks, preventing the quick launch and usability of applications and files. For consumers, this amounts to little more than an irritant, leaving users to wait longer for their programs to open. In a business setting, however, these operational hiccups can affect the quality of services provided by a company or employees' ability to meet deadlines.

Typically, the ongoing battle between SSD drives and HDDs has been characterized as an all-or-nothing scenario. Business managers had to choose one solution or the other, weighing performance against cost. However, hybrid storage environments are increasingly becoming a popular option among many companies. TechTarget's Marc Staimer recently outlined the benefits of a setup that balanced the vastly higher performance levels of solid state technology with the cost-effectiveness of traditional storage devices.

"Hybrid systems are designed to enable both high performance for those applications requiring high [input/output operations per second] or high throughput and low-cost capacity for data that doesn't need the performance of SSDs," Staimer wrote. "This predictably produces a very low cost per ggabyte [sic], per IOPS and GBps throughput. Delivering a quality, capable and scalable hybrid system is non-trivial."

Choosing the appropriate storage solution for a task
Hybrid systems are especially useful for organizations that have specific applications or jobs requiring optimal performance levels while also running numerous other tasks that are largely insensitive to storage bottlenecks. This way, managers can load high-priority jobs on an SSD drive while relegating less intensive applications to traditional hard disk drive storage.
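The placement logic described above can be sketched in a few lines. The IOPS threshold and workload names here are hypothetical, chosen purely to illustrate the idea of routing bottleneck-sensitive jobs to flash:

```python
# Minimal sketch of hybrid-tier placement: latency-sensitive workloads go
# to SSD, everything else to cheaper HDD capacity. The threshold and the
# workload figures are hypothetical examples.
SSD_IOPS_THRESHOLD = 5_000  # assumed cutoff for "needs flash"

def assign_tier(required_iops: int) -> str:
    """Pick a storage tier for a workload based on its IOPS requirement."""
    return "ssd" if required_iops >= SSD_IOPS_THRESHOLD else "hdd"

workloads = {"vdi-desktops": 20_000, "email-archive": 300, "oltp-db": 50_000}
placement = {name: assign_tier(iops) for name, iops in workloads.items()}
print(placement)
```

Real hybrid arrays automate this decision continuously, promoting hot data to flash and demoting cold data, but the cost-versus-performance trade-off is the same.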

If deployed properly in the right circumstances, businesses may actually save money on a hybrid environment compared to a purely HDD-based storage system. TechTarget noted that organizations such as Colby-Sawyer College that want to implement virtual desktop infrastructure may find that a single SSD drive could support more virtual machines than numerous HDD devices. Assistant IT director David Blaisdell told the news outlet that deploying solid state solutions for virtualization purposes and using HDDs for others could allow his department to save on overhead costs.

"If we had used traditional spinning disk and bought enough up front for 600 desktops, we would've spent three times what our budget was," Blaisdell said. "We would've probably purchased enough for 200 desktops for phase one, and then would have had to add more for performance."

Facing down seasonal threats with data archiving solutions

Many businesses are so concerned about the threats posed by competitors, mercurial consumers and shifting market conditions that they fail to properly consider the danger presented by Mother Nature. Severe storms can cause substantial damage to legacy hardware, potentially resulting in a catastrophic instance of data loss that could threaten a business' ability to pick up the pieces and carry on. To prevent these circumstances from occurring, managers should deploy a variety of storage redundancies – including data archiving solutions – to establish an extensive defense against data loss and bolster business continuity.

Data archiving solutions and other backup storage options will be especially necessary this summer. According to forecasts from AccuWeather, the summer of 2013 will have an active severe storm season, at least through the early months.

"People can't let their guard down," AccuWeather senior meteorologist Dan Kottlowski said. "It looks like everybody is going to be vulnerable to severe weather this year from the Gulf of Mexico in early April up to the Midwest by late in the spring and early summer."

Forecasters predict weather conditions during this period to be conducive for the development of an above-normal number of tornadoes in the United States. In addition, coastal states will have to remain alert for potential hurricane activity, while many western communities may have to contend with expansive wildfires. 

In fact, Colorado has recently experienced the most destructive wildfire in the state's history, according to The Weather Channel. The fire reportedly stretched across a 22-square mile area, destroying nearly 500 homes in the process. Many residents have been forced to evacuate, leaving behind their homes and businesses.

Preventing harmful data loss with archiving solutions
According to IT Management, natural disasters such as floods, earthquakes, tornadoes, wildfires and hurricanes all pose a significant threat to the continued prosperity of businesses, both large and small. Data recovery, backup and archiving solutions are necessary to prevent these events from destroying an organization.

"Hard drives break down frequently and natural disasters can wipe out facilities and equipment," the source stated. It continued, "That's why it's essential for companies to configure their backup systems and archiving strategies carefully."

IT consultant and TechTarget contributor Jon Gaasedelen recently argued for the deployment of data archiving solutions to mitigate the damage caused by unforeseen disasters including severe weather incidents. Although Gaasedelen conceded that the archiving process could potentially disrupt business operations if carried out ineffectively, he explained that high-quality software could offset potential problems with its efficient archiving capabilities.

Data storage sea change appears on the horizon

For multiple decades, hard disk drive devices have been the norm for data storage solutions. Outside of certain sectors such as industrial operations, companies have leveraged traditional HDDs because of their relatively cost-effective technology, resulting in fewer resources being spent per gigabyte of storage space. However, the cost of NAND-based flash technology has dropped substantially in recent years, making the once expensive solid state drive a viable solution for many enterprises. According to research compiled by Computerworld, the cost-per-gigabyte of SSDs dropped 66 percent over the last three years.
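For context, a 66 percent drop over three years implies roughly a 30 percent decline every year if the rate were steady. A small sketch of that calculation (the real price curve was not this smooth, so this is only an approximation):

```python
# What a 66% price-per-gigabyte drop over three years implies as a steady
# annual rate of decline. Illustrative; actual NAND pricing was uneven.
total_drop = 0.66
years = 3

annual_factor = (1 - total_drop) ** (1 / years)  # fraction remaining per year
annual_decline = 1 - annual_factor
print(f"~{annual_decline:.1%} decline per year")
```

Compounding is what makes the trend so disruptive: a steady ~30 percent annual decline shrinks prices faster each year in absolute terms than a simple 22-percent-per-year straight-line drop would suggest.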

This is encouraging news for businesses looking to improve their operational performance since SSD drives can significantly reduce latency issues resulting from input/output bottlenecks and improve the launch speed of critical applications. Unlike traditional HDD devices, SSDs are not dependent on the movement of sensitive magnetic drive heads and platters, meaning they can read and write data at a much quicker pace. Now that the technology has become more affordable, many industry experts expect a surge in SSD adoption rates in the coming years. Morgan Stanley analyst Katy Huberty predicted that revenue from the global enterprise SSD storage market could potentially hit $20 billion over time.

SSD applications in the data center environment
The benefits provided by SSD drives could be especially enticing to data center operators. Regardless of their function, whether used for data storage purposes or to host public cloud services, the success of a data center hinges on the ability to quickly and reliably access and utilize applications and files. Latency issues can result in dissatisfied clients and high customer churn rates as performance quality dwindles. That is what makes solid state technology a viable candidate for data center applications. As noted on the industry observer blog TechInsidr, HDDs remain the standard for data center storage, but the benefits of SSD technology and lower cost of entry may be too enticing for server room managers to ignore for long.

"[SSD] solutions are especially useful in the datacenter for cloud computing customers, who often process several hundred requests per second," the site stated. "Traditional spinning platter hard drives were not meant to handle the rigors of such a workload. Even in 2012, hard drives are often still the bottleneck and can bog down performance in a meaningful way. By putting NAND in the datacenter, it speeds up the data retrieval process and improves performance in a big way."

TechInsidr predicted that, as more businesses become aware of the performance benefits provided by SSD drives and the technology drops in price, adoption rates will grow significantly, potentially resulting in SSDs seeing standard usage across all data centers by 2018.

Benefits of using SSD drives in an industrial environment

Industrial operations depend on the integration of sophisticated software tools more than ever before. Complex machinery now runs with the assistance of networked devices that can improve their performance and function. In addition, managers rely on enterprise resource planning software to gain a holistic view of the company and monitor activity within various departments. If these software tools were to become unresponsive or prone to latency issues, the operability of an industrial enterprise could be jeopardized. That is why leaders from the industrial sector have consistently deployed solid state technology to meet their storage needs. SSD drives can provide industrial enterprises with a reliable storage platform that ensures continued software performance.

According to StorageSearch, industrial operations have utilized solid state storage since the 1970s, but widespread use did not occur until the past two decades. Industrial environments require more durable and reliable equipment than traditional enterprise scenarios, and these needs extend to data storage solutions as well. Hard disk drives are largely ill-equipped to operate effectively under industrial conditions where they may face extreme temperatures, massive vibrations and the constant risk of physical damage. This is largely due to the technology's dependency on sensitive internal moving parts to read and write data.

According to SSD industry analyst Jim Handy, the tracks on a terabyte HDD are approximately 100 nanometers apart, meaning that these devices are highly susceptible to vibration and shock. He cited an exercise in which an engineer shouted next to a disk array running on HDD technology. Researchers observed substantial latency surges when this occurred. 

"In the data center [HDD vibration sensitivity] may not pose significant problems, especially if you don't make a habit of shouting at the disk arrays, but in other environments, especially in industrial applications, this exercise makes it clear why many engineers choose to use an SSD rather than an HDD, even if they don't need the SSD's speed," Handy wrote. "I like to give the example of a jet fighter that shakes so much that the pilot may even worry about losing the fillings in his teeth."

Solid state technology ideal for industrial use
SSD drives, meanwhile, are not susceptible to disruptive vibrations as they run on NAND flash memory without any moving parts. In addition, they have been found to be more reliable in environments with extreme temperatures. According to StorageSearch, SSD drives can operate under temperatures spanning from -40 degrees Fahrenheit to 185 F. This makes solid state technology the obvious storage solution for enterprises that run under unusual temperature conditions. 

Enterprise managers interested in adopting solid state technology as a data storage solution should consider DIGISTOR's industrial-strength SSD drives. In addition to being specifically designed to handle the rigors of operating in an industrial environment, these devices are capable of providing lightning-fast data transfer rates. With models supporting various generations of SATA interface, DIGISTOR's solid state products can deliver transfer rates up to 6 Gbps. Contact DIGISTOR today to begin benefiting from the superior performance and durability of SSD technology.

Home theater market poised for better days

The quick ascent of video streaming and digital download services has led some entertainment industry members to prognosticate the downfall of physical media. However, recent trends suggest that the demand for disc-based film media such as Blu-ray is as fervent as ever, especially in the United States. 

According to PricewaterhouseCoopers' "2013-2017 Global Entertainment and Media Outlook" report, the global entertainment industry is expected to net $2 trillion in annual revenue by 2017, Billboard reported. Currently, the U.S. entertainment and media market generates less than $500 billion each year, but that figure is expected to grow to approximately $632 billion in 2017, when it will account for 29.4 percent of the global industry. The U.S. film industry is expected to surpass the $100 billion mark in 2016, a historical high point for Hollywood. 

The home video market will contribute to this new era of growth. Although researchers predicted that revenue from digital media will outpace that of disc-based formats in North America by 2017, DVDs and Blu-ray discs will remain the market leaders worldwide. One of the factors contributing to projected market growth is the continued emergence of key regions with increasingly sophisticated consumers, including China, Russia, Brazil and India. As the economies of these nations grow, more of their citizens will pursue the creature comforts that Americans have enjoyed for years, including the high-definition playback provided by Blu-ray media and the experience of an advanced home theater system.

Time is right for home theater growth
According to Home Theater Review contributor Jerry Del Colliano, these encouraging trends, combined with a recent upswing in the overall economy and the real estate market in particular, have created conditions that are ideal for creating or improving a home theater. One of the factors that Del Colliano cited in support of his stance was the increasing availability of high-quality equipment at reasonable prices, thanks to the emergence of online vendors.

"High-performance internet-direct [audio-visual] companies have set a new standard for value in the consumer electronics world while offering excellent customer service, trade-up programs and more," Del Colliano wrote.

For instance, DIGISTOR's line of slot load Blu-ray hardware and software packages can enhance the home theater experience by facilitating home video playback. By installing these tools in a PC or Mac, users can copy their one-of-a-kind videos to Blu-ray media, ready to be viewed on the big screen. With quality equipment, consumers can enjoy a unique movie theater experience in the comfort of their own homes.

Small businesses, towns struggle with data backup

Data backup solutions are an essential component of any successful organization. At any given time, a primary storage device could experience a hardware failure that results in the loss of mission-critical information. All electronic equipment has a finite lifespan, meaning that eventually the devices that businesses depend on to store data such as revenue figures, billing information and payroll will one day cease to operate. It is essential that organizations, regardless of their size, have a backup solution in place to maintain business continuity. 

Both small businesses and small towns struggle with implementing a successful data backup plan, however. According to a recently released report by Veeam Software, many small and medium-sized businesses have experienced difficulties juggling the cost and complexity of their data backup and recovery systems, eWeek reported. Eighty-five percent reported that their current backup solution was too expensive. Respondents cited several areas where those costs were being felt. Fifty-one percent of survey participants identified high management expenses while 48 percent reported paying exorbitant licensing fees for their backup solutions.

Small town governments lack backup solutions
Similar concerns have been expressed by officials in small towns across the country as well. The Patriot Ledger's David Riley reported that many small towns in Massachusetts have failed to adequately back up sensitive files on local government servers. A 2010 Division of Local Services survey of small towns within the state found that approximately 75 percent of respondents reported having no data recovery plan in place. Without a proper system to back up and restore lost data, these departments could experience major setbacks.

"Losing this information might be costly," Riley wrote. "Allowing private information to get into the wrong hands could lead to lawsuits or compensation for victims. The loss of important data also could hamper town operations or fail to meet requirements of state public records rules, which require municipalities to retain certain records in digital formats."

Consumers, meanwhile, require simple and easy-to-use applications that do not require the oversight of an expensive disaster recovery expert for their data backup needs. An elegant answer to this dilemma would be to use a data archiving solution to create backup discs of important files such as home movies or family photos. Unlike a cloud-based alternative, this option would not require users to sign any service agreement contracts or trust a third-party entity with control of their information. Furthermore, Blu-ray discs are a highly durable form of media, meaning consumers do not need to be concerned about hardware degradation or failure. With this data archiving solution, individuals can rest easy knowing their irreplaceable documents will be secure in the event of an emergency.