The emergence of solid state technology as a viable data storage solution for many businesses has caused a stir in the industry as of late. While hard disk drives have been widely used over the years, the performance bottlenecks caused by the limitations of their internal design have driven many organizations to consider an option that can handle more intensive workloads with ease. The NAND flash memory inside SSD drives has proven to be an effective tonic for these performance issues, but pricing concerns had previously limited the technology's reach. For years, industrial companies served as the lone bastion of solid state devices, as their performance needs far outweighed any concerns about investment costs. Recently, however, the cost of flash memory has dropped considerably, allowing new industries to consider deploying SSD drives as the data storage solution for critical or demanding applications.
The emergence of the data center
Another trend that has factored significantly into the growth of the SSD market is the continued rise of the data center. With the advent of powerful and resource-intensive technological advancements such as cloud services and big data, businesses have come to require far greater computing capabilities. To achieve this, companies have constructed massive data centers to house numerous servers that can handle the workload. However, integrating traditional HDD storage devices would only slow these machines down. Again, SSD drives provided a way for businesses to optimize their application performance.
The increasing need for high-performance data storage has driven the SSD and NAND markets to new heights in recent years. A report issued by IHS iSuppli predicted that global NAND revenue would increase 14 percent in 2013, reaching $23.1 billion by the end of the year, New Electronics reported. Analysts from TechNavio agreed with this prognostication, as they too expected the global NAND flash market to grow considerably in the near future. According to the firm's researchers, the market will expand at a compound annual growth rate of 8.7 percent through 2016.
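To put that growth rate in concrete terms, here is a minimal sketch of what an 8.7 percent compound annual growth rate implies. Note the assumptions: the $23.1 billion figure comes from IHS while the 8.7 percent CAGR comes from TechNavio, so combining them is illustrative only, not a projection either firm published.

```python
def project(base, rate, years):
    """Compound a base value at a fixed annual growth rate."""
    return base * (1 + rate) ** years

base_2013 = 23.1  # billions USD, the 2013 estimate cited above
for year in range(2013, 2017):
    value = project(base_2013, 0.087, year - 2013)
    print(f"{year}: ${value:.1f}B")
```

Run as written, the sketch compounds the 2013 base out to 2016, showing how even a single-digit growth rate adds several billion dollars to the market over a few years.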
Despite this expected period of growth, other analysts have cautioned that the availability of NAND flash memory powering solid state technology may tighten in the coming months. According to DigiTimes, the demand for products leveraging this technology as well as the widespread use of SSDs in data centers and server rooms may significantly limit supplies. This could result in increased prices for NAND flash memory and numerous products across the board as soon as the third quarter of 2013. For businesses interested in enjoying the performance benefits of SSD drives, making the jump to solid state storage sooner rather than later may be the best course of action.
Recently, media outlets across the world have been abuzz with news that the National Security Agency has operated a clandestine surveillance program for several years, gathering data from popular web clients and sites. According to The New Yorker, the program, code-named "Stellar Wind," collected massive quantities of information regarding American citizens' internet browsing habits, phone calls and digital communications. This included sender and recipient names, subject lines and the content of emails, as well as websites visited by individuals. The scale of this project has led many businesses and consumers alike to consider alternative data archiving solutions that are not susceptible to unauthorized access.
Although the project officially ended in 2011, another initiative, Operation Prism, continued in its place. While the latter's ostensible purpose was to intercept communications between foreign enemies of the United States, many innocent civilians were caught in the crossfire, having their sensitive information collected in the process as well. The scale of these data-collecting projects was massive: a single NSA tracking program had processed one trillion metadata records by the beginning of this year.
Private entities support civilian espionage campaigns
The Guardian reported that the agency had collected the call records of millions of American Verizon customers alone. In addition, the NSA accessed and stored the internet protocol addresses of the emails they intercepted, allowing government officials to pinpoint the location of a particular user. According to CNET, many of these privacy breaches were made possible by agreements between the federal government and various telecommunications firms to allow NSA agents to tap fiber-optic cables, data networks and gateway switches. Although many of the corporations that participated in these campaigns have stressed that they only complied with requests for government access to consumer information when legally obligated to do so, their willingness to hand over sensitive data has unnerved many Americans.
The disclosure of this collusion between government and private entities has highlighted the need for robust data backup and security measures to protect sensitive information from being accessed. It has yet to be determined what precisely the NSA has done with the massive quantities of data it has collected from American citizens. Operation Prism also raises questions regarding what other information the government may have attempted to access. For example, while monitoring internet activity, the NSA could have gained access to cloud storage services, allowing agents to view sensitive files that had been offloaded to a public cloud network. In addition, the U.S. government is known to have malware capabilities on par with the most sophisticated cybercriminals in the world, so agents could conceivably access a citizen's computer hard drive. To protect sensitive information from invasive searches, small businesses and consumers alike can utilize data archiving solutions to store their files on Blu-ray media. This way, if a user's network or machine is breached, his or her information will be safely stored offline, away from harm.
We would like to share this article posted today on readwrite.com. CEO Drew Houston says Dropbox is “replacing the hard drive.” We are huge fans of Dropbox and use it regularly; however, this article makes some great points about the potential setbacks of relying solely on the cloud as a data storage solution.
For years, Apple has led the way with many technological advancements. For instance, the iPod turned digital downloads into a market-viable form of music distribution. Later, Apple's iPad firmly established the age of mobile devices as consumers became interested in tablet hardware. The tech company has demonstrated an unusually keen acumen for recognizing emerging trends and pouncing on them for major profit. Given recent developments, it would appear that Apple's latest big bet is on SSD drives.
According to TechRadar, Apple has chosen PCI Express SSD technology as the data storage standard for the newest iteration of its MacBook Air laptop. In the rapidly booming thin-and-light laptop market, the MacBook Air has quickly established itself as the leader by a large margin. Data gathered by the NPD Group found that Apple's product accounted for 56 percent of U.S. sales in this category, an analyst with the firm told CNET, with numerous PC companies splitting the remaining 44 percent of the market.
Entrusting the data storage of the most popular thin-and-light laptop on the market to SSD demonstrates how confident Apple's leaders feel about the future of the technology. According to specs collected by TechRadar, the newest MacBook Air model will have data read/write speeds of 750 megabytes per second. This level of performance far outstrips what traditional hard disk drives can offer. Coupled with Intel's new "Haswell" generation CPU, the MacBook Air's SSD drives will allow users to quickly launch applications and access files. TechRadar noted that because SSD technology is not faced with the same physical limitations as magnetic drives, it can operate much more quickly in regard to both sequential throughput and random drive performance.
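A quick back-of-the-envelope calculation shows what 750 MB/s means in practice. This sketch compares the MacBook Air figure cited by TechRadar against an assumed 120 MB/s sequential rate for a typical laptop hard drive; the HDD number is our own illustrative assumption, not from the article.

```python
def transfer_seconds(size_gb, throughput_mb_s):
    """Time to sequentially read a file of size_gb gigabytes
    at the given throughput (decimal units: 1 GB = 1000 MB)."""
    return size_gb * 1000 / throughput_mb_s

ssd = 750   # MB/s, the MacBook Air spec cited above
hdd = 120   # MB/s, an assumed typical laptop HDD rate

for size in (1, 10, 50):
    print(f"{size:>3} GB: SSD {transfer_seconds(size, ssd):6.1f}s, "
          f"HDD {transfer_seconds(size, hdd):6.1f}s")
```

Under these assumptions, a 50 GB sequential read that ties up an HDD for roughly seven minutes finishes on the SSD in just over a minute, which is the gap users feel when launching large applications.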
While data storage has traditionally been dominated by whatever device could provide the most capacity at the lowest cost, access and speed are becoming more prioritized by businesses and consumers alike. Having a massive collection of files that take ages to launch is beginning to lose its appeal. With a bellwether company like Apple putting its support firmly behind solid state technology, it would appear that the future of data storage belongs to SSD drives.
Although the number of businesses leveraging cloud services for their data storage needs has increased over the past few years, concerns have lingered about the security of that information. Most of these fears have been driven by the thought that when data is placed into the cloud, the original owner has lost all tangible control over it. The recent revealing of the NSA's PRISM project has only further exacerbated those concerns, leading many security experts to question just how secure cloud data storage really is and consider alternative data archiving solutions.
Cloud-based services hosted in the United States or European Union such as Google Docs and Dropbox have been widely used by businesses across the globe, but now company heads are beginning to doubt the security of that data, Information Daily reported. The Foreign Intelligence Surveillance Act could potentially allow the American government to request access to a business's stored cloud data without notifying company officials, and these rights can be extended to monitor EU cloud services as well. In the wake of the PRISM fiasco, Switzerland has emerged as a desirable location for cloud services, as the country is not a member of the EU and is not beholden to U.S. laws. According to the director of a Swiss offshore hosting company, his organization's revenue streams have grown 45 percent since these fears arose.
The fallout from PRISM
Some European officials have suggested that the continent take control of its cloud computing capabilities. SiliconRepublic reported that European Commission vice president Neelie Kroes recently recommended that EU members construct a European cloud network to ensure the security of sensitive information. At a recent meeting of the European Cloud Partnership Board, Kroes warned that fallout from the PRISM project could result in EU companies dropping the services of American cloud providers, according to ZDNet.
"Why would you pay someone else to hold your commercial or other secrets, if you suspect or know they are being shared against your wishes?" Kroes stated. "Front or back door – it doesn't matter – any smart person doesn't want the information shared at all." She continued, "If European cloud customers cannot trust the United States government or their assurances, then maybe they won't trust US cloud providers either. That is my guess. And if I am right then there are multi-billion euro consequences for American companies."
With the security of data stored in the cloud in doubt, individuals and businesses may want to consider alternative offline solutions. For instance, optical storage can provide users with a data storage option that cannot be remotely accessed by an unauthorized user. Using a Blu-ray burner and media, individuals can back up their sensitive information without fear of that data getting out into the open.
The advent of cloud computing has ushered in many new services for businesses and consumers to utilize. One of the most widely deployed cloud services is data storage. One perceived benefit of cloud storage is its reliability. Instead of depending on the operability of on-site storage devices, consumers and businesses can entrust third-party cloud service providers with maintaining their data.
Consumers may believe that the information they store on various cloud storage clients will always be accessible, but that is simply not the case. Just like on-premises devices, cloud servers can experience hardware failure that may result in information being lost forever. Server Density's David Mytton outlined how seemingly trivial hardware and software issues can escalate and result in cloud customers losing their data. For example, maintenance errors and power failures can take servers offline and potentially damage the equipment.
Furthermore, there is no guarantee that a cloud vendor will continue to be in business or provide the same level of service for years to come. In order to ensure that unique and irreplaceable files are not lost, consumers should deploy supplemental data archiving solutions when utilizing cloud storage.
The effects of cloud provider closures
Cloud storage is still in its infancy and, like any emerging market, has seen its fair share of turmoil as once-promising vendors get bought out by larger competitors or shutter their operations. For example, Snapjoy recently announced that it would shortly suspend all services. Forbes contributor Ewan Spence reported that the cloud-based photo library vendor initially began scaling back services when it was purchased by Dropbox last December. At the time, the company barred new users from uploading their photos to the cloud network but insisted service would continue to be provided to existing customers.
Just months later, however, Snapjoy announced that it would be ceasing operations and informed users that they had one month to download their files to a physical drive before they were deleted. According to Spence, this development highlights the current transitory state of cloud storage. He argued that consumers should treat cloud storage services as a temporary solution and employ their own data archiving solutions to ensure important files are not lost.
A reliable data archiving solution can provide consumers with a backup option in the event that their home videos, family photos or other irreplaceable media files are lost due to an issue with their cloud storage service. This way, users can establish a measure of agency over the fate of their data.
While sifting through the various options on the market today, many consumers choose their data storage devices based on capacity alone. However, the number of gigabytes or even terabytes' worth of data that a drive is able to hold does not accurately determine its value. Increasingly, performance is becoming just as prized as storage capacity. This is particularly true in the business sector, in which a sluggish application or file launch could have detrimental repercussions for an organization. Because of this concern, many professionals are investing in solid state technology for their data storage needs. Although other options tend to be cheaper, SSD drives offer unparalleled speed and reliability.
SSD vs. Tape
Many companies have traditionally opted for tape as their data backup format of choice. There were a number of advantages to this medium, according to StorageCraft's Steven Snyder. For instance, the best tape cartridges can store up to 5 terabytes of uncompressed data. Although at one time that was an impressive figure, capacity advancements have stagnated with the technology while both hard disk drives and SSDs have made massive storage leaps in recent years.
Another concern for tape deployment is the technology's physical limitations. While some tape drives can approach write speeds of 500 megabytes per second, their performance ceiling is limited by the physics governing their moving parts and magnetic ribbons. The NAND flash memory that powers SSD drives, meanwhile, has no moving parts to hamper its performance, and as noted by Snyder, top speeds are increasing on a regular basis. Perhaps the greatest limitation presented by tape drives, however, is their cost. A single device will easily consume more than $1,000 of a company's operating budget, and could run several thousand more. To provide their entire system with a tape-based backup solution, businesses would have to make a massive investment that could interfere with other planned expenditures.
SSD vs. HDD
SSD's main competitor in the current marketplace is hard disk drive technology. HDDs provide users with a respectable storage capacity at a relatively low cost. Like tape, however, HDD performance is hampered by the movement of its internal parts. In this case, the amount of time needed to access or copy data with an HDD is dependent on the speed of the device's magnetic spinning disc. This limitation can result in substantial performance bottlenecks when attempting to launch an application or open a file. Again, SSDs are not beholden to such concerns, as they operate on NAND-based flash memory. ZDNet reported that two data storage experts recently suggested that the performance benefits of SSD technology would soon position it as the primary storage solution for businesses, particularly those that have high performance needs. Spinning disk drives, meanwhile, will largely be used as a cheap method to archive data. Balancing speed and cost, SSD drives present the best package for businesses looking to enhance their data storage solutions.
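The mechanical penalty described above can be quantified. Before an HDD can read a random block, the platter must on average spin half a revolution under the head, so average rotational latency follows directly from spindle speed. The sketch below computes this for common spindle speeds; the sub-0.1 ms SSD comparison figure in the comment is a widely cited ballpark, not a number from the article.

```python
def avg_rotational_latency_ms(rpm):
    """Average rotational latency of a spinning disk:
    half a platter revolution, expressed in milliseconds."""
    return (60_000 / rpm) / 2  # 60,000 ms per minute / RPM, halved

for rpm in (5400, 7200, 15000):
    print(f"{rpm:>5} RPM: {avg_rotational_latency_ms(rpm):.2f} ms")

# SSD random reads, by contrast, are typically well under 0.1 ms,
# since no seek or rotation is involved -- this gap, repeated over
# thousands of small reads, is the bottleneck described above.
```

A 7200 RPM drive thus pays roughly 4 ms of rotational delay per random access before seek time is even counted, which is why application launches dominated by small random reads feel so much snappier on flash.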
Even among consumers, computer systems are becoming more complex each year. The proliferation of mobile devices such as smartphones and tablets, along with supplemental computers like netbooks, has changed the consumer computing paradigm. For years, households had a single desktop PC at their disposal, but today's consumer environments contain a wider array of devices to account for. The emergence of complex computing systems has created new data storage challenges for numerous consumers. Copying data from one computer to have access to it on a different device can be an arduous and time-consuming process. To streamline these tasks, some companies have even recently deployed consumer-level personal cloud systems.
Engadget reported that Samsung recently announced a software suite that would essentially provide consumers with a personal cloud storage solution. After installing the proprietary program, users can share their files across a range of Samsung devices. Supporting five accounts and up to six devices, the software service allows entire families of Samsung devotees to access large amounts of data within their own personal environments.
When using a cloud-based storage option, consumers should have data archiving solutions in place in case their primary servers malfunction or go offline. Consumers should also consider the possibility that a large shared network enables another user to delete important documents or irreplaceable media files. The Unofficial Apple Weblog noted that in some cloud networks, deleting data from an account can remove those files from every synced device in the connected environment. This can be especially troublesome for families operating a personal cloud system, where parents and children have vastly different priorities regarding data files.
The importance of redundancy when backing up data
Veteran technology writer Todd Weiss argued in a recent CITEworld post that redundancy was a key component of an effective data storage plan. With so many backup options available to consumers, it would be foolish to not take advantage of multiple data archiving solutions.
"Redundancy is so important to me that I even have a separate desktop PC sitting right next to my main work PC so that if the first machine breaks, I can swiftly move to the standby PC," Weiss wrote. "In addition, I email copies of important work files to special email accounts where I can access them whenever I need them, wherever I am. On top of all of that, I also have a Lenovo laptop that I can call into use."
Disc-based data archiving solutions offer a reliable backup option for consumers worried about retaining access to their important files. Unlike external hard drives or cloud storage systems, these tools are not dependent on the continuing operability of a single device. Users can copy their documents, movies, music and photos onto as many discs as they wish, ensuring that these files will always be available.
When looking at the various SSD options on the market, prospective buyers may look to storage space as their guiding principle. However, given that SSD's main benefit is its superior read/write speed, weighing performance against storage capacity should be an essential component of the decision-making process. Beyond raw capacity, not all SSD drives are created equal. The NAND flash memory that solid state technology operates on comes in three basic forms: single-level cell (SLC), multi-level cell (MLC) and triple-level cell (TLC). Which type of SSD drive a business ultimately opts to deploy depends on its needs and financial circumstances.
Breaking down the three options
SLC technology is, without question, the most powerful form of NAND flash memory on the market today. Embedded Computing Design noted that SLC flash memory has only two states per cell: a high or a low. This simplicity facilitates the data retrieval process, resulting in speedier performance. According to Centon Electronics' breakdown of the various forms of SSD flash, SLC bests the other two options in a number of categories, including performance, write speeds, lifespan and energy consumption. The tradeoff for this functionality is a higher price point than the other two options.
MLC cannot compete with SLC on pure performance. While SLC flash memory will typically support approximately 100,000 program/erase cycles per cell, MLC generally tops out around 10,000. However, it is also a considerably less expensive alternative to SLC: because each MLC cell stores two bits rather than one, the same plane of silicon yields more capacity, driving down the cost per gigabyte. While businesses may covet the operability of SLC technology, budgetary restrictions may inhibit its use.
At the low end of the SSD scale is TLC flash memory. With slower read/write speeds than SLC and MLC, this technology represents the lowest-performing option available to prospective buyers. In addition, TLC supports the lowest number of program/erase cycles of the group, meaning owners will have to replace their drives sooner. It is also the least expensive of the three: each cell stores three bits, or eight storage states, yielding the most capacity per unit of silicon.
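The tradeoffs above can be summarized in a short endurance sketch. The SLC and MLC cycle counts are the rough figures cited in this article; the TLC cycle count, the drive capacity and the daily write workload are our own illustrative assumptions, and the lifetime formula deliberately ignores real-world factors such as write amplification and wear-leveling overhead.

```python
def naive_lifetime_years(capacity_gb, pe_cycles, daily_writes_gb):
    """Naive drive lifetime: total writable data divided by daily
    writes, ignoring write amplification and wear leveling."""
    return capacity_gb * pe_cycles / daily_writes_gb / 365

# bits per cell and approximate P/E cycles per cell
nand_types = {
    "SLC": (1, 100_000),  # figure cited in the article
    "MLC": (2, 10_000),   # figure cited in the article
    "TLC": (3, 3_000),    # assumed typical figure, not from the article
}

capacity_gb = 256      # assumed drive size
daily_writes_gb = 50   # assumed heavy workload

for name, (bits, pe_cycles) in nand_types.items():
    states = 2 ** bits  # storage states per cell
    years = naive_lifetime_years(capacity_gb, pe_cycles, daily_writes_gb)
    print(f"{name}: {bits} bit(s)/cell, {states} states, ~{years:,.0f} years")
```

Even this crude model shows the pattern the article describes: each extra bit per cell doubles the number of states the cell must distinguish while cutting endurance by an order of magnitude, which is why SLC suits industrial workloads and TLC suits lighter consumer use.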
Deciding on the best form of SSD drive for deployment depends on the user's needs and resources. An industrial enterprise may want to use SLC because of its high level performance. For many businesses, MLC will suffice, as it balances cost with operability. TLC is likely most appropriate for consumer usage as it cannot provide the same level of performance or longevity as the alternatives.
Cybercrime has become a major concern in recent years. Hackers have increasingly targeted any organization that might house data files which can be leveraged for financial gain. Although entities as large as federal databases and corporate banks have been victims of harmful data breaches, small businesses and individuals remain the favorite targets for cybercriminals. The latest threat report released by cybersecurity firm Symantec found that small and medium-sized businesses (SMBs) accounted for 31 percent of all targeted attacks, according to Network World.
Many small business owners might believe that because the information they possess is of significantly less value than that of a larger enterprise, hackers would not waste their time and resources probing their networks. However, it is precisely SMBs' lack of sophistication that continues to make them potential targets for breaches. Smaller businesses tend to implement fewer cybersecurity defenses in their networked environments, meaning hackers have far fewer barriers to overcome to gain access to their data. Although the payout from these breaches is considerably smaller, the risk entailed is lower as well.
Another concern is that cybercriminals are deploying greater numbers of effective viruses and other malware. Many of these threats have stealth capabilities, so users may not even realize that their files have been compromised until months or even years after a breach has occurred. Furthermore, new malware strains are appearing at such a pace that many anti-virus programs simply cannot update their threat detection lists fast enough to contend with recently released dangers. A recent Panda Labs study found that 74,000 unique forms of malware appear on the Internet each day, according to SPAMfighter.
The threat of file corruption
Once a virus gains access to a system, it can quickly propagate itself, spreading across the environment and infecting numerous files. In addition to the massive performance drop that may occur, users might notice that their file data has become corrupted by virulent malware strains. If this happens, the files in question will be unusable, and whatever contents they may have held – financial records, family photos, home videos or work orders, for example – will be lost forever. That is, unless the victimized SMB or individual has backed up those files with a data archiving solution.
Since there is no way to predict when a malware attack will occur, businesses and consumers alike need to have their sensitive, irreplaceable and mission-critical files backed up in the event that the originals become corrupted. A traditional hard disk drive is not ideal in these circumstances, because if it shares a connection with the infected workstation, it too is vulnerable to attack. Disc-based data archiving solutions, however, are immune to such infections once the data is written, meaning users can rest easy that their important documents are secure.