NAND Flash Scarcity – the Roots and Effects of the Issue

Those who have kept an eye on the SSD market over the past 12 months will have noticed a rise in prices due to NAND flash shortages. NAND flash memory is the technology behind power-efficient solid-state drives (SSDs) and other storage found in personal computers and mobile devices.

This shortage is affecting those of us in the electronics industry in a variety of ways. Not only are prices rising, but more companies are now trying to fill the void by producing more SSDs, while others are hard at work creating alternatives.

The shortage we are experiencing is due to several factors, including:

  • A difficult transition from 2D to denser 3D technology on the manufacturing side
  • Continued high demand for flash for use in smartphones, in particular the increased storage offered by the iPhone 7
  • Heightened demand from manufacturers desiring flash storage for datacenter hardware
  • Sustained demand for PCs and notebooks, with average flash adoption in notebooks expected to exceed 30%
  • Production troubles at two of the largest NAND producers

That final point merits a few more words. One of the largest factors contributing to the NAND shortage is Toshiba’s current financial trouble. The second-largest supplier of flash memory in the global market, and the first company to produce NAND flash memory, Toshiba has struggled with the production of 3D NAND. Toshiba’s troubles are not, however, confined to manufacturing. The electronics giant acquired a company building nuclear power plants in the United States, a woeful project that has resulted in accounting scandals, legal actions, and billions of dollars in debt. The upshot: Toshiba is now selling off its semiconductor/NAND memory division. We expect that bidders will include Micron Technology, SK Hynix, Broadcom Ltd, and Western Digital.

Another, more highly publicized issue has to do with the largest supplier of flash memory in the global market: Samsung. The recall of Samsung’s Galaxy Note 7 smartphones a few months ago has been a factor in the global scarcity, as scores of devices had to be returned and replaced. Each returned device took a flash memory unit off the market (at least temporarily).

The net effect of this shortage is that prices have increased for PC manufacturers. As SSD performance reaches mainstream consumer awareness, buyers increasingly expect these drives in personal laptops. Nevertheless, SSDs are not usually sold at the same capacities as standard hard disk drives (HDDs). Laptops sold with SSDs are typically in the range of 128 to 256 GB, while a laptop with an HDD commonly offers much more, anywhere from 500 GB to 1 TB. And the price differential tends to be quite significant. That will likely remain the case while the scarcity lasts.

Nevertheless, some manufacturers are optimistic. Samsung is now expected to begin operating a new plant in Pyeongtaek in July to further expand its 3D NAND production capacity. Micron will start producing 64-layer 3D NAND chips in the second quarter, with mass shipments ready in the second half of the year. The company promises “meaningful output” by the end of its fiscal year in December.

We won’t hold our breath, but while manufacturers continue to scramble, and alternative storage technologies emerge, we’ll keep you updated. Keep your eye on this blog for further developments.

Use Case: Body Worn Camera Manufacturer requires a Data Storage Solution that solves for both Speed and Reliability

DIGISTOR was approached by a large, successful manufacturer of body worn camera equipment in 2013 concerning the launch of a new camera targeting the law enforcement community.

The camera had a beautiful industrial design, and was loaded with several new features including very high resolution video.

The software developed integrated seamlessly with a full chain of custody solution, ensuring that the digital evidence would be admissible in a court of law.

The Challenge

But, there was one problem. The microSD card originally selected for the camera continually became corrupted, thus losing valuable evidence and making this new body camera all but useless.

Although hundreds of thousands of dollars had been invested in the camera’s hardware and software development, very little investigation had gone into the data storage solution the video would ultimately be written to.

The manufacturer turned to DIGISTOR for help.

The Solution

DIGISTOR worked with the manufacturer’s engineering team to understand the full picture, and identified two critical application requirements:

  • Speed. The customer had a critical high speed write requirement that the SD card had to achieve under all circumstances.
  • Reliability. It was crucial that not only the video was protected from corruption, but that the manufacturer’s customers could have a firm understanding of the life expectancy of each card.

Early on in the design process, the manufacturer focused heavily on speed as the number one requirement.

Working closely with the DIGISTOR firmware engineers, the manufacturer was able to achieve the performance needed for video capture of high resolution video.

Moving on to the reliability requirements, the engineers quickly realized the two bigger issues were the lack of consistency on longevity of the microSD cards, and an unacceptable failure rate.

The DIGISTOR engineers’ test results showed that corrupted tables were locking up the SD cards and preventing recovery of potentially crucial video evidence. The engineers took the following approach to support the manufacturer’s identification of the best microSD card solution:

  • DIGISTOR provided an application analysis card which the manufacturer ran in a real life application scenario for a 2-week period.
  • DIGISTOR analyzed the data captured to determine how the application was accessing the SD cards, which also showed the write/erase counts.
  • The data analysis also showed incompatible access patterns within the customer software which could be altered to help overall reliability.
  • DIGISTOR was able to perform a Failure Analysis (FA) on the failing cards that showed how the manufacturer’s application was writing to the SD card and where the issues were occurring.
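As an aside, the kind of per-block write-count analysis described in these steps can be sketched in a few lines. This is an illustrative sketch only, not DIGISTOR's actual analysis tooling; the `wear_report` helper and its 2x-average hot-spot threshold are assumptions chosen for the example.

```python
from collections import Counter

def wear_report(block_writes):
    """Summarize per-block write counts from an application's write trace
    and flag 'hot' blocks whose cells would be over-stressed if the
    firmware cannot redistribute the wear."""
    counts = Counter(block_writes)
    average = sum(counts.values()) / len(counts)
    # Flag blocks written more than twice the average (threshold is arbitrary).
    hot_spots = {block: n for block, n in counts.items() if n > 2 * average}
    return counts, hot_spots

# A toy trace in which the application hammers block 7:
counts, hot = wear_report([7, 7, 7, 7, 7, 7, 1, 2, 3])
print(hot)  # {7: 6}; block 7 is written 6 times against an average of 2.25
```

On a real card the trace would come from the application-analysis capture described above, but the principle is the same: an uneven count distribution points to access patterns the firmware must compensate for.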

The Results

With a full understanding of how the video application accessed the SD card, and of how the software was over-stressing memory cells through uneven write/erase cycles caused by incompatible access patterns in the application itself, the DIGISTOR engineering team found that the standard wear-leveling algorithm was not activating properly, causing corruption within the SD card.
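To see why uneven write/erase cycles matter, here is a minimal illustration of the idea behind wear-leveling. It is a toy model, not the card's real firmware algorithm; a real controller also handles logical-to-physical mapping, garbage collection, and static data migration.

```python
class WearLeveler:
    """Toy wear-leveler: direct each program/erase (P/E) cycle at the
    least-worn physical block so wear stays even across the device."""

    def __init__(self, num_blocks):
        self.erase_counts = [0] * num_blocks  # P/E cycles consumed per block

    def next_block(self):
        # Choose the block with the fewest erases (ties go to the lowest index).
        return min(range(len(self.erase_counts)), key=self.erase_counts.__getitem__)

    def write(self):
        block = self.next_block()
        self.erase_counts[block] += 1
        return block

wl = WearLeveler(num_blocks=4)
for _ in range(100):
    wl.write()
print(wl.erase_counts)  # [25, 25, 25, 25]: wear spread evenly
```

When this leveling fails to activate, a handful of blocks absorb nearly all of the P/E cycles and wear out far ahead of the rest of the card.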

DIGISTOR was able to modify standard firmware to meet the requirement of the video application.

DIGISTOR recommended the manufacturer make changes to the software, which improved the overall performance of the SD card and the BWC application. The manufacturer was able to achieve both the performance and reliability needed for a successful new camera launch.

Today, the manufacturer continues to grow share in the body worn camera market and achieve a solid ROI on their secure data platform.

Use Case: Systems Integrator requires an SSD drive for the Professional Broadcasting Market with unparalleled Quality and Performance

DIGISTOR partnered with a leading video system storage integrator to develop a solid state drive designed for the Blackmagic Cinema Cameras, capable of capturing 4K uncompressed raw video.

While multiple Blackmagic-compatible SSDs were already on the market, many of these off-the-shelf solutions did not provide the performance and reliability required by video professionals. In particular, there were major issues with dropped frames and with the durability of the SSDs when deployed in the field.

The Challenge

DIGISTOR completed a comprehensive analysis of the Blackmagic Cinema Cameras’ usage of SSD storage media, with the goal of better understanding the actual data access patterns.

Very quickly, DIGISTOR was able to identify critical requirements that needed to be addressed in order to develop an SSD solution that would successfully meet the storage media needs of Blackmagic Cinema Camera customers.

  • Speed. It was critical that the SSD be able to handle a consistently high speed data throughput under any circumstance.
  • Reliability. The problem of dropped frames was prevalent and unacceptable.
  • Durability. The drive needed a form factor design that nested firmly inside the Blackmagic cameras.
  • Locked Firmware. Firmware changes without advance notice were causing drives to fail in the field.

The Solution

DIGISTOR worked for months on a variety of configurations. Once they had a solid working platform, they focused on optimizing the firmware for this specific application, with the goal of creating the most reliable SSD for video capture on the market.

DIGISTOR engineering enabled their integration partner to consistently achieve sustained high speed throughput, allowing for reliable and consistent results.

The Results

DIGISTOR has become a leading supplier of SSDs to the professional video market.

DIGISTOR has shipped thousands of drives through our integration partner into many markets supporting professional video cameras, including:

  • 2014 World Cup events
  • Churches throughout the USA and South America
  • Broadcasting companies deployed throughout the EU

SSD prices on the rise due to NAND flash shortages

The NAND flash supply shortage that has endured this year is expected to continue throughout the fourth quarter, and all signs point to ongoing supply issues well into 2017.

According to TrendForce, strong smartphone demand is the main reason for the NAND flash shortage.  However, higher than anticipated SSD adoption rates in the industrial, enterprise, and consumer markets have also contributed to the severe shortages.  We have already seen factory lead times increase nearly two-fold over the past few months, and price increases affecting certain SSD product lines are not far behind.

Have you felt the impact of flash?

We’ve seen a more dynamic change in the storage of data on flash in the past few years than ever before. NAND flash used to be “trusted” (using that term loosely) in only two areas: laptops, and non-critical entertainment such as storage for cameras. Before DIGISTOR was big in the flash storage arena, we would scrutinize every detail of a 2.5” SSD for our desktop PCs, comparing how much data per day we copied, used, and wrote to be sure we wouldn’t run out of NAND P/E cycles before a standard HDD would wear out.
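That back-of-the-envelope endurance math looks roughly like this. The figures are illustrative assumptions (a nominal 10,000 P/E cycles for MLC NAND and an assumed 2x write amplification), and the model deliberately ignores other failure mechanisms such as retention loss:

```python
def drive_life_years(capacity_gb, pe_cycles, host_gb_per_day, write_amp=2.0):
    """Rough SSD endurance estimate: total NAND writes the drive can absorb,
    divided by daily host writes inflated by write amplification.
    Ignores retention, temperature, and other failure mechanisms."""
    total_nand_writes_gb = capacity_gb * pe_cycles   # raw NAND endurance budget
    nand_gb_per_day = host_gb_per_day * write_amp    # NAND sees more than the host writes
    return total_nand_writes_gb / nand_gb_per_day / 365

# e.g. a 256 GB MLC drive (~10,000 P/E cycles) absorbing 20 GB of host
# writes per day; by this crude model, P/E exhaustion is decades away:
years = drive_life_years(256, 10_000, 20)
```

The point of the exercise, then as now, is that for typical desktop write volumes, P/E-cycle exhaustion is rarely the limiting factor.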

Over time we came to trust the technology, and even to work closely with several global chipset manufacturers that let us in on the deep inner workings of how the NAND is used and how the chipset can, in fact, protect your data. Now we’ve scaled far beyond standard SSD storage for PCs and Cinema SSDs. The more we improve industrial-grade high-speed flash storage, the more critical applications appear that benefit from the improved speed and ruggedized design of solid-state storage.

One area that was fairly surprising is the way cloud storage uses flash to improve the response time of any given search for your data. We usually assume the bottleneck is our ISP, meaning our remote files can sit on big spinning disk drives out in a server farm where SSDs would be of little use. However, have you noticed that search results now appear as you type on Google and Facebook? That is the result of parts of the cloud being built on the high-speed dependability of solid-state storage.

The area that was more expected, but slower growing, is the IoT. The Internet of Things category has been looking for its day in the sun for some time. Now, something amazing is happening due to improvements in controller technology, NAND flash, and testing for industrial applications. These little devices can finally be truly smart thanks to the capacity, speed, and reliability of integrated solid-state storage and improved SoC ICs, all packed into a small footprint.

The more IoT devices can prove themselves as necessities in our lives, the more innovation will grow from the next generation of devices. In order to do this though, the main requirement is usability. A close second is reliability. IoT devices are usually low power, and always running. Perhaps you don’t even use a particular device every day, but when you need to use it, it must be reliably functioning. There is no room for off-the-shelf consumer flash storage when you have a critical control system in place, perhaps monitoring the security of your home, or granting access to your front door.

We all got used to flash with cheap SD storage for our point-and-shoot cameras, and maybe we have had some experience with SSDs in our laptops. Now that flash storage is used for more critical components in our digital life, there’s no other choice but to be sure your storage of choice is high quality tested NAND and built by a trusted manufacturer. DIGISTOR is always willing to help you in this endeavor. If you’re working on any project, large or small, IoT or enterprise storage, and want to take storage concerns out of the equation, just contact us and we’ll be able to help.

How does your Body Worn Camera store and manage crucial video evidence?

Police agencies around the country are ramping up their Body Worn Camera (BWC) policies and procedures. High-profile police assault videos have continued to surface on the internet, creating a public outcry for reform, training, and accountability. Federal grants to help fund police agencies’ BWC deployments are rapidly expanding. These subsidies have drawn a whirlwind of new manufacturers into the BWC business looking to take advantage of federal grants and a growing market. BWCs offer a variety of features including high-definition video capture, night vision, and ruggedized housings.

BWCs record video evidence on an SD card similar to what you would use in a standard camera. That is part of the issue: most agencies and BWC manufacturers use off-the-shelf consumer SD cards to acquire and store video evidence without even realizing it. These cards are at very high risk of corruption. Consumer cards do not offer extended temperature support and are at risk of failure in both hot and cold climates; a standard SD card can fail after being left behind in a hot car for just a short period. Most standard SD cards do not offer a power-fail feature or robust video acquisition firmware to prevent corruption or overwrite issues. The testing and components used by consumer card manufacturers are not up to the standard needed for a secure and robust BWC video storage solution.

The real question is how best to protect crucial video evidence on body-worn cameras and ensure it gets from the camera to the courtroom. Agencies, BWC manufacturers, and surveillance companies can solve a lot of support and security issues by asking their current BWC supplier specific questions about the storage onboard the camera. The first step is not to get caught up on speed; these cards are all very fast. That said, all SD cards are not equal, so it is important to design a card for your particular application. Here are a few important questions to ask your BWC provider to ensure you have the best possible secure storage for your BWC application:

  • What operating and storage temperature ranges does your card support?
  • Does your SD card have a locked Bill of Materials (BOM)?
  • Does your SD card use TLC, MLC, or SLC flash?
  • Can you lock your SD card to one specific camera for security purposes?
  • Can your SD card protect against accidental deletions?
  • Does your SD card have a power-fail feature?

Again, be sure to look deeper than basic specifications such as read/write speeds; sometimes the fastest card is not the best card for your application. Understanding your overall application and how it accesses the flash within the SD card itself is much more important. Understanding your overall BWC application demands and the lifetime expectancy of your BWC camera is crucial both to protecting key video evidence and to protecting the agency’s investment. Securing the video until it is transferred to a more secure and permanent location is the biggest question facing the BWC community, and they don’t even know it yet.

A new look to continue beyond 15 years of service

We are excited to introduce you to DIGISTOR’s updated website and blog.

We’ve come a long way since we got our start in 2001, when our family started this business to provide digital archiving and data storage products to industrial and global OEM customers. Over the past 15 years, DIGISTOR has become a leading innovator, manufacturer, and distributor of industrial-grade flash storage products, secure storage products, and digital-video solutions, and today we serve customers around the world in industries such as Law Enforcement, Media and Entertainment, Medical, Professional Video, Security/Surveillance, and Military.

Our new website is designed to better serve our customers, and over the coming months you will see several new product announcements and additional website enhancements, such as improvements to our self-service customer center and additional online B2B capabilities. Our updated blog will continue to publish interesting articles, news, and stories relevant to our industry.

We always welcome your comments, feedback or ideas so please be sure to connect with us on social media.  We will have a lot to share in the coming weeks and months ahead.

Visit our Facebook page.

Tweet us.

Follow us on LinkedIn.

Evaluating Storage System Security

Storing digital data successfully requires a balance of availability, cost, performance, and reliability. With the emergence of low-power, petabyte-scale archival storage and flash-based systems, it is getting increasingly difficult to quantify performance, reliability, and space-efficiency trade-offs, especially when coupled with storage-security factors. Storage performance is measured by latency, throughput (bandwidth), and IOPS, with throughput typically presented as sustained (long) and peak (short) transfer rates; these measurements become non-uniform and harder to compare when storage security is employed.

Although much work has been done on defining, testing, and implementing mechanisms to safeguard data storage in long-term archival storage systems, data security verification in our cloud-based, mobile-driven, virtually containerized, software-defined remote storage world remains a unique and ongoing challenge.

Data security can be ensured in a variety of ways depending on the level of security desired, the performance required, and the level of user inconvenience tolerated. Most storage systems rely on encrypting data over the wire or on-disk, typically using pre-computed checksums and secure hashes, but with no standardized parameters or protocol for comparing network or on-disk performance and integrity while in actual use.
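As a concrete example of the checksum/secure-hash approach, here is a short sketch using Python's standard `hashlib`. The streaming helper is illustrative, not any particular product's implementation:

```python
import hashlib

def sha256_file(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 in 1 MiB chunks so its integrity can
    be verified later without holding the whole file in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Record sha256_file(path) at write time; any later mismatch means the
# stored data (or the stored hash itself) has been altered or corrupted.
```

The standardization gap noted above is exactly about what surrounds such a hash: where it is stored, how it is protected, and how verification cost is measured in live use.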

In today’s multi-tenant virtualized container storage environments, containers depend on a different approach to virtualization: rather than emulating hardware (CPU/memory/network/storage) and running a guest OS on top of it, containerization separates users and processes from each other. Multi-tenant security is especially important given the heavy reliance on always-on mobile data access from containerized cloud storage, where the top 10 mobile security issues identified in 2015 by OWASP were:

  • Insecure data storage
  • Weak server-side controls
  • Insufficient transport layer protection
  • Client-side injection
  • Poor authorization and authentication
  • Improper session handling
  • Security decisions via untrusted inputs
  • Side-channel data leakage
  • Broken cryptography
  • Sensitive information disclosure

Docker, one of the most prevalent container technologies deployed today, has recently addressed container user-security concerns by separating daily container operation privileges from root privileges on the server host, thus minimizing the risk of cross-tenant user namespace and root server/data access.

The Center for Internet Security recently released a series of internet security benchmark resources that, although from an independent authority rather than a standards body, are based on recommended, industry-accepted FISMA, PCI, HIPAA, and other system-hardening standards, and that help mitigate security risk for virtualized container storage implementations. Although a number of new technology products are being introduced specifically for virtual container data security, what does ‘secure’ really mean in the container context, i.e., secure container access, valid container data, native security of the application(s) in the container, etc.? Most container data volumes today are tied to a specific virtual server, and if the container fails or is moved from that server to another, the connection to the data volume is lost (no persistent storage), regardless of the security parameters employed. For virtual container data to be truly secure, a fully distributed, reliable, secure read/write container file system must be employed to ensure secure, resilient cloud deployments. Ideally, this can be achieved with a container-native cloud deployment on bare metal, without the use of virtual machines, making the container’s data lifecycle and application scalability independent of the container’s host while minimizing the future cost and complexity of provisioning and managing virtual machine server hosts. That, coupled with a hardware-secured, write-once data storage tier, can truly ensure long-term data storage security whether or not encryption is used.

Additionally, and most importantly, cloud data storage encryption keys, although addressed by the Key Management Interoperability Protocol (KMIP) referenced in the SNIA Cloud Data Management Interface (CDMI) proposed standard, require much wider adoption: most crypto key management today happens either at the specific storage device level, with a single point of key-access failure, or as a cloud provider-managed option. Lose the key(s), lose the data, no matter how securely it is managed or replicated!


Some data storage security basics:

  • Physical security is essential.
  • Develop internal storage security standards (authentication/authorization/access control methods, configuration templates, encryption requirements, security architecture, zoning, etc.).
  • Document, maintain and enforce security policies that cover availability, confidentiality and integrity for storage-specific areas.
  • Ensure basic access controls are in place to determine your policies; change insecure access permissions.
  • Disable unnecessary storage services related to NFS (mountd, statd, and lockd).
  • Limit and control network-based permissions for network volumes and shares.
  • Ensure proper authentication and credential verification is taking place at one or more layers above storage devices (within the host operating system, applications and databases).
  • Operating system, application and database-centric storage safeguards are inadequate. Consider vendor-specific and/or storage security add-ons.
  • Ensure audit logging is taking place for storage security accountability.
  • Perform semi-annual information audits of physical location inventory and critical information assets.
  • Separate storage administration and maintenance accounts with strong passwords for both accountability and to minimize potential compromised-account damage.
  • Encrypting data in transit helps, but should not be relied on exclusively.
  • Carefully consider software-based storage encryption solutions for critical systems (key management).
  • Evaluate and consider hardware-based drive encryption on the client side.
  • Carefully select a unified encryption key management platform that includes centralized key lifecycle management.
  • Deploy Boolean-based file/stream access control expressions (ACEs) in container environments to simplify granting permissions to users/groups across data files/directories while providing an additional level of data protection in multi-tenant environments.
  • Evaluate OASIS and XACML policy-based schemas for secure access control.
  • Evaluate and consider write-once data storage technology for long-term archival storage tiers.

Is Hybrid Data Storage a Solution for you?

As the 2016 New Year unfolds, the demand for secure data storage will increase at every level within the IT stack. According to the 2015 Cyber Defense Report, 70% of organizations were compromised by a successful data breach within the last 12 months. With a zero-trust data protection mantra, new pervasive data security solutions will emerge that touch applications, endpoints, networks, and storage collectively. Encryption alone, when keys are managed by employees in both on-premise and cloud environments, is not an adequate cyber-attack deterrent, while control over data location and redundancy is key to maintaining compliance, data privacy, and security across global, heterogeneous infrastructures.

To keep up with the burgeoning big-data deluge, organizations continue to move larger workloads into unified/virtualized environments, both on-premise and cloud. Many have already successfully deployed a variety of high-performance hybrid data storage solutions in the data center. In a recently released survey by ActualTech Media, many of these enterprises had begun incorporating flash-based storage in their data centers: 41% use on-premise HDD only; 9% use off-premise/cloud only; and 50% of respondents already use some type of on-premise flash-based storage (3% all-flash, 47% a hybrid flash/HDD mix). For all the significant benefits virtualization brings to the IT infrastructure, one factor has inhibited wide-scale virtualization of legacy applications: performance.


Bandwidth, IOPS, and latency are the standard storage performance metrics, with latency typically measured in milliseconds and flash drive specs within fractions of a millisecond. As data storage is usually the IT infrastructure's latency bottleneck, minimizing latency is a key objective for faster I/O completion and faster transaction processing. Because latency has a direct impact on VM performance in virtualized environments, adoption of solid-state storage incorporating flash-caching hardware and software is enabling very low latencies while simultaneously helping to minimize network bandwidth bottlenecks. Flash SSD advantages include higher IOPS, reduced cooling, reduced power, and lower failure rates than standard HDDs. Although flash SSD storage costs are rapidly declining, they are still roughly 2x higher than HDD per terabyte (depending on TCO variables); a combined hybrid SSD/HDD automated tiered storage solution offers compelling metrics that IT professionals are finding both acceptable and in-budget. SSD-based data storage technology provides true business value by enabling faster access to information and real-time analytics. A hybrid SSD/HDD solution enables IT to balance cost and performance to meet their unique application and SLA requirements.
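The relationship between these three metrics can be sketched with Little's Law: the number of I/Os in flight equals throughput times per-I/O latency. A quick, illustrative calculation (the device figures are hypothetical):

```python
def iops(queue_depth, latency_s):
    """Little's Law rearranged: sustained IOPS = I/Os in flight / per-I/O latency."""
    return queue_depth / latency_s

def bandwidth_mb_s(iops_value, io_size_kb):
    """Bandwidth follows directly from IOPS and transfer size."""
    return iops_value * io_size_kb / 1024

# A flash device completing 4 KB I/Os in 0.1 ms at queue depth 32:
rate = iops(32, 0.0001)         # ~320,000 IOPS
mb_s = bandwidth_mb_s(rate, 4)  # ~1,250 MB/s
```

The same math explains why a 10 ms spinning-disk seek caps a single HDD at a few hundred IOPS no matter how fast its sequential bandwidth is.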

Which flash-based SSD solution is truly right for your environment? There are many factors to consider when comparing industrial-grade and commercial-grade flash storage devices. Industrial-grade devices use SLC (Single Level Cell) NAND as the storage medium, versus commercial-grade MLC (Multi Level Cell). Based on voltage levels, an SLC cell stores a single bit (0 or 1), whereas an MLC cell stores two bits, representing one of four values (00, 01, 10, or 11). SLC NAND offers roughly an order of magnitude more endurance cycles than MLC NAND, better data retention, and extreme-temperature functionality.


  • SLC (Single Level Cell)
    • highest performance, high cost, enterprise grade NAND
    • 90,000-100,000 program/erase cycles per cell (highest endurance)
    • lowest density (1 bit per cell, lower is better for endurance)
    • lower power consumption
    • faster write speeds
    • much higher cost (3x higher than MLC)
    • best fit for industrial grade devices, embedded systems, critical applications
  • eMLC (Enterprise Multi Level Cell)
    • good performance, aimed at enterprise use
    • 20,000-30,000 program/erase cycles per cell
    • higher density (2 bits per cell)
    • lower endurance limit than SLC, higher than MLC
    • lower cost
    • good fit for light enterprise use & high-end consumer products with more disk writes than consumer-grade MLC
  • MLC (Multi Level Cell)
    • average performance, consumer grade NAND
    • 10,000 program/erase cycles per cell
    • higher density (2 or more bits per cell)
    • lower endurance limit than SLC
    • lower cost (3x lower than SLC)
    • good fit for consumer products (not for critical applications that require frequent data writes)
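The bits-per-cell arithmetic behind these grades is simple: an n-bit cell must distinguish 2**n voltage states, which is why density trades directly against endurance. A small sketch using the nominal figures from the list above:

```python
# Nominal P/E figures taken from the grade list above (illustrative values).
NAND_GRADES = {
    "SLC":  {"bits_per_cell": 1, "pe_cycles": 100_000},
    "eMLC": {"bits_per_cell": 2, "pe_cycles": 30_000},
    "MLC":  {"bits_per_cell": 2, "pe_cycles": 10_000},
}

def voltage_states(grade):
    """An n-bit cell must reliably distinguish 2**n voltage states."""
    return 2 ** NAND_GRADES[grade]["bits_per_cell"]

def endurance_ratio(a, b):
    """How many more P/E cycles grade a offers relative to grade b."""
    return NAND_GRADES[a]["pe_cycles"] / NAND_GRADES[b]["pe_cycles"]

print(voltage_states("SLC"), voltage_states("MLC"))  # 2 4
print(endurance_ratio("SLC", "MLC"))                 # 10.0
```

The tighter voltage margins of the denser grades are what make each additional bit per cell cheaper in capacity but costlier in endurance.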

The pros and cons of HDD compared to SSD can be pared down to a handful of variables: availability, capacity, durability, encryption, environment (humidity/temperature), fragmentation, heat/BTUs produced, MTBF/failure rate, noise, physical form factor, power requirements, price, shock/vibration, speed, warranty, and write-protection capabilities. Write protection at the SSD and HDD firmware level, not just the physical data and file system level, is one of the key differentiators when comparing secure SSD/HDD storage technology solutions. Only a small number of manufacturers offer such functionality, and it presently commands a price premium. HDDs are vulnerable to magnetic pulse and X-ray, making automated replication to alternate HDDs, storage arrays, and locations a necessity, driving up cost while remaining ultimately susceptible to data loss. SSDs are impervious to these effects, making them not only a viable tier-0 high-performance data cache solution but potentially a new long-term active-archive storage tier. New ISO and NIST secure storage regulatory compliance can also be a factor when evaluating which flash-based solution will best fit your requirements, as can DOD 5220, EU-DPD, HIPAA, FedRAMP, IRIG 106, NIST FIPS/FISMA/SP-800, NSA 130-2, PCI DSS, and many others.

For more in-depth technical comparisons and product information, give DIGISTOR a call today at 800-816-1886 or email us at


Is data-centric security essential in modern storage solutions?

Data storage security has quickly become both a hot topic and a new budget line item for CTOs and CIOs in 2015, both here in the US and around the world. An organization’s data is often its most valued asset, and keeping it stored safely is increasingly both a commercial and a legal imperative. Managing not only how data is stored but how it is securely accessed and communicated across a wide range of media and services is the fundamental building block of information assurance.

Regulatory compliance has driven a variety of storage practices over the years to guarantee information assurance, but one of the most sweeping new international reforms comes from the pending EU General Data Protection Regulation (GDPR) being adopted by all 28 EU member states. Substantial changes in scope to embrace the globalization of cloud computing, social networks, and data breaches bring in new levels of enforcement and heavy fines that will forever shake up EU data protection practices and privacy guidelines.

Often the security associated with data storage systems and supporting infrastructure has been overlooked due to a basic misunderstanding of the inherent risks to data storage ecosystems, leaving data exposed to compromise from a wide variety of events. The new NIST-sponsored Cyber-Physical Systems (CPS) framework was initiated to define key characteristics to better manage the development and implementation of the physical, computational, and data storage components of both the Industrial Internet and the Internet of Things (IoT) across multiple smart application domains, including energy, healthcare, law enforcement, manufacturing, and transportation.

The brand-new ISO/IEC 27040:2015 defines data storage-centric security as the application of physical, technical, and administrative controls to protect storage systems and infrastructure against unauthorized disclosure, modification, or destruction. These controls can be compensatory, corrective, detective, deterrent, preventive, or recovery in nature.
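Those six control natures from ISO/IEC 27040:2015 can be modeled as a small taxonomy. The sketch below encodes them as an enum and classifies a few example storage controls; the mapping of controls to categories is a hypothetical illustration, not language from the standard, and a real classification should follow your own risk assessment.

```python
from enum import Enum

class ControlType(Enum):
    """The six control natures named in ISO/IEC 27040:2015."""
    COMPENSATORY = "compensatory"
    CORRECTIVE = "corrective"
    DETECTIVE = "detective"
    DETERRENT = "deterrent"
    PREVENTIVE = "preventive"
    RECOVERY = "recovery"

# Hypothetical mapping of common storage controls to a primary nature;
# many controls span more than one category in practice.
STORAGE_CONTROLS = {
    "drive-level write protection": ControlType.PREVENTIVE,
    "access/audit logging": ControlType.DETECTIVE,
    "off-site replication": ControlType.RECOVERY,
    "visible security signage": ControlType.DETERRENT,
}

preventive = [name for name, t in STORAGE_CONTROLS.items()
              if t is ControlType.PREVENTIVE]
print(preventive)
```

Framing controls this way makes gap analysis mechanical: group your deployed controls by category and any empty category is a candidate gap in the storage security posture.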

The rapid adoption of complex software-defined storage (SDS), i.e., the uniting of compute, networking, storage, and virtualization into a hyper-converged storage solution, became a top data center trend impacting both data security and data recovery strategies in 2015. Although SDS simplifies rapid provisioning, implementation, and redundancy while providing significant savings in cost, power, and space, data storage-centric security remains a significant gap in the SDS infrastructure.

Due to superior accessibility, capacity on demand, flexibility, and lower overall IT costs compared to legacy online compute and data storage methodologies, cloud computing has quickly become a mainstay worldwide. Yet, just like traditional online compute/storage methodologies, cloud computing has its own set of unique data security issues. Mitigating risks before and throughout a cloud adoption is the number-one imperative for CIOs, CISOs, and DPOs as they transition applications and data to the cloud. The decision to move to the cloud depends on the sensitivity of the data and application, the service-level agreement, the overall cloud security infrastructure, and ultimately on whether the business value offsets the risks.

According to a recently released 2016 Trend Micro security report, despite the need for a Data Protection Officer (DPO) or Chief Information Security Officer (CISO), fewer than 50% of enterprise organizations will have one, or a budget for one, by the end of 2016. With the EU GDPR directive, coupled with the ISO 27040 data security standard, mandating a significantly higher degree of data protection, a DPO/CISO position designated solely to ensure the integrity of data within and outside the enterprise is a wise investment. With this higher degree of awareness, legislation, and technology around data storage-centric security, we will begin to see a proactive shift in the enterprise policies, practices, and strategies that will bring effective protection to the storage infrastructure.

Public safety is now a concern of every commercial enterprise, municipality, school, and university. High-resolution video surveillance and law enforcement body-worn cameras (BWCs) are generating more long-term video storage requirements than ever before. Enterprise IT must be able to balance a budget for both the cameras and a secure infrastructure that enables easy, yet secure, data access. A wide variety of new BWC, chain-of-custody, evidence-management, and surveillance technology solutions are blossoming as new local, state, and federal budget resources become available in 2016.

In the first quarter of 2015, IDC reported that 28.3 exabytes (28.3 billion gigabytes) of data storage capacity were shipped worldwide. The largest share (23%) of this spending went to server-based storage and hyperscale (SDS-architecture) cloud infrastructures, while traditional external storage arrays fell significantly, replaced by all-flash and hybrid flash (NAND/HDD) arrays. Less than 0.05% of all these storage products shipped employed self-encrypting drive (SED) technology, while almost 90% of all flash arrays shipped were SED capable. SEDs offer FIPS 140-2 compliant security without the overhead of a software-based encryption schema, coupled with self-describing encryption-key management capability, making them a valued component in secure data storage infrastructure.
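The "overhead" argument for SEDs can be made concrete with a toy model: with software encryption the host CPU must transform every byte before it is written, whereas with an SED the cipher runs in the drive controller and the host simply hands the data over. The sketch below (a byte-wise XOR stands in for a real cipher such as AES-XTS; the timings are illustrative, not a benchmark) compares the two host-side paths.

```python
import time

def host_side_overhead(data: bytes, transform) -> float:
    """Time one pass of a host-side transform over the data (seconds)."""
    start = time.perf_counter()
    transform(data)
    return time.perf_counter() - start

payload = bytes(1024 * 1024)  # 1 MiB of zeros as a stand-in workload

# Software encryption path: the host CPU touches every byte before the write.
# (A toy byte-wise XOR stands in for a real cipher such as AES-XTS.)
sw_time = host_side_overhead(payload, lambda d: bytes(b ^ 0x5A for b in d))

# SED path: encryption happens inside the drive controller, so the
# host-side "transform" is effectively a no-op pass-through.
sed_time = host_side_overhead(payload, lambda d: d)

print(f"software path: {sw_time:.4f}s, SED path: {sed_time:.6f}s")
```

The absolute numbers are meaningless (pure-Python XOR is far slower than hardware-accelerated AES), but the shape of the result holds: the software path scales with the number of bytes written, while the SED path costs the host essentially nothing per byte.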

Over the next several months throughout 2016, we will delve more deeply into the practical application of specific secure storage technologies: why and how to put security directly into the physical storage device, the advantages and disadvantages of specific data storage technologies, cost analysis, and more. Stay tuned.