Evaluating Storage System Security

Storing digital data successfully requires balancing availability, cost, performance and reliability. With the emergence of low-power, petabyte-scale archival storage and flash-based systems, it is increasingly difficult to quantify the performance, reliability and space-efficiency trade-offs, especially when storage-security factors are added. Storage performance is measured by latency, throughput (bandwidth) and IOPS, with throughput typically quoted as both sustained (long-duration) and peak (short-burst) transfer rates; once storage security is employed, those measurements become far less uniform and much harder to compare.
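
The three metrics are arithmetically related, which is worth remembering when vendor figures are quoted in isolation. Below is a minimal Python sketch of those relationships; the drive figures are illustrative assumptions, not measurements of any particular product:

    # Throughput = IOPS x I/O size, and (by Little's Law)
    # average latency = outstanding I/Os / IOPS.

    def throughput_mb_s(iops: float, io_size_kb: float) -> float:
        """Sustained throughput implied by an IOPS rating at a given I/O size."""
        return iops * io_size_kb / 1024

    def avg_latency_ms(iops: float, queue_depth: int = 1) -> float:
        """Average time each I/O spends in flight at a given queue depth."""
        return queue_depth / iops * 1000

    # A hypothetical drive rated at 80,000 IOPS with 4 KB I/Os, queue depth 8:
    print(f"{throughput_mb_s(80_000, 4):.0f} MB/s")  # ~312 MB/s
    print(f"{avg_latency_ms(80_000, 8):.2f} ms")     # ~0.10 ms per I/O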

Although much work has been done on defining, testing and implementing mechanisms to safeguard data in long-term archival storage systems, verifying data security in our cloud-based, mobile-driven, containerized, software-defined remote-storage world remains a unique and ongoing challenge.

Data security can be ensured in a variety of ways, depending on the level of security desired, the performance required and the tolerance for user inconvenience. Most storage systems rely on encrypting data over the wire or encrypting it on disk, typically verified with pre-computed checksums and secure hashes; there is, however, no standardized parameter set or protocol for comparing network or on-disk performance and integrity while the system is in actual use.
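
As a concrete, if simplified, illustration of the checksum approach, here is a short Python sketch that computes and verifies a SHA-256 digest for a stored object; the file path is hypothetical:

    # Hash at write time, store the digest, then re-hash and compare on read.
    import hashlib

    def digest(path: str) -> str:
        """SHA-256 of a file, read in 1 MB chunks to handle large objects."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def verify(path: str, stored_digest: str) -> bool:
        """True only if the object is bit-for-bit identical to what was stored."""
        return digest(path) == stored_digest

    # At ingest:  stored = digest("archive/object.bin")
    # On audit:   assert verify("archive/object.bin", stored)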

In today’s multi-tenant virtualized container storage environments, containers take a different approach to virtualization: rather than emulating hardware and running a guest OS on top of virtual CPU, memory, network and storage, containerization isolates users and processes from one another on a shared kernel. Multi-tenant security is especially important given the heavy reliance on around-the-clock mobile access to containerized cloud storage, where the top-10 mobile security issues identified in 2015 by OWASP (www.owasp.org) were:

  • Insecure data storage
  • Weak server-side controls
  • Insufficient transport layer protection
  • Client-side injection
  • Poor authorization and authentication
  • Improper session handling
  • Security decisions via untrusted inputs
  • Side-channel data leakage
  • Broken cryptography
  • Sensitive information disclosure

Docker, one of the most widely deployed container technologies in use today, has only recently addressed container user-security concerns by separating daily container operation privileges from root privileges on the server host, thus minimizing the risk of cross-tenant user-namespace abuse and root access to the server and its data.

The Center for Internet Security recently released a series of internet security benchmark resources (https://benchmarks.cisecurity.org) that, although CIS is an independent authority and not a standards body, are based on industry-accepted FISMA, PCI, HIPAA and other system-hardening standards, and that help mitigate security risk in virtualized container storage implementations. A number of new products are being introduced that focus specifically on virtual container data security, but what does ‘secure’ really mean in the container context: secure container access, valid container data, native security of the application(s) in the container, or all of the above?

Most container data volumes today are tied to a specific virtual server; if the container fails or moves from that server to another, the connection to the data volume is lost (no persistent storage), regardless of the security parameters employed. For virtual container data to be truly secure, a fully distributed, reliable, secure read/write container file system must be employed to ensure secure, resilient cloud deployments. Ideally, this is achieved with a container-native cloud deployment on bare metal, without virtual machines, making the container’s data lifecycle and application scalability independent of the container’s host while minimizing the future cost and complexity of provisioning and managing virtual-machine server hosts. Coupled with a hardware-secured, write-once data storage tier, this can ensure long-term data storage security whether or not encryption is used.

Most important of all, cloud storage encryption-key management, although addressed by SNIA’s Cloud Data Management Interface (CDMI) and the OASIS Key Management Interoperability Protocol (KMIP), still needs much wider adoption: today most crypto-key management either sits at the individual storage device (a single point of key-access failure) or is offered as a cloud provider-managed option. Lose the key(s), lose the data, no matter how securely they are managed or replicated!
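
To make that point concrete, here is a hedged Python sketch using the third-party cryptography package (illustrative only, not a CDMI or KMIP implementation): once the key is gone, no amount of replication brings the data back.

    # pip install cryptography
    from cryptography.fernet import Fernet, InvalidToken

    key = Fernet.generate_key()          # in production, held in a KMS/KMIP server
    ciphertext = Fernet(key).encrypt(b"archived record")

    # With the key, recovery is trivial:
    assert Fernet(key).decrypt(ciphertext) == b"archived record"

    # Without it, every replica of the ciphertext is equally useless:
    try:
        Fernet(Fernet.generate_key()).decrypt(ciphertext)
    except InvalidToken:
        print("key lost -> data unrecoverable, on every replica")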

Some data storage security basics:

  • Physical security is essential.
  • Develop internal storage security standards (authentication/authorization/access-control methods, configuration templates, encryption requirements, security architecture, zoning, etc.).
  • Document, maintain and enforce security policies that cover availability, confidentiality and integrity for storage-specific areas.
  • Ensure basic access controls are in place to enforce your policies; change insecure access permissions.
  • Disable unnecessary NFS-related storage services (mountd, statd and lockd).
  • Limit and control network-based permissions for network volumes and shares.
  • Ensure proper authentication and credential verification is taking place at one or more layers above storage devices (within the host operating system, applications and databases).
  • Operating system, application and database-centric storage safeguards alone are inadequate; consider vendor-specific and/or third-party storage security add-ons.
  • Ensure audit logging is taking place for storage security accountability.
  • Perform semi-annual information audits of physical location inventory and critical information assets.
  • Use separate storage administration and maintenance accounts with strong passwords, both for accountability and to limit the damage from a compromised account.
  • Encrypting data in transit helps, but should not be relied on exclusively.
  • Carefully consider software-based storage encryption solutions for critical systems (key management).
  • Evaluate and consider hardware-based drive encryption on the client side.
  • Carefully select a unified encryption key management platform that includes centralized key lifecycle management.
  • Deploy Boolean-based file/stream access control expressions (ACEs) in container environments to simplify granting permissions to users/groups across data files/directories while providing an additional layer of data protection in multi-tenant environments (see the sketch after this list).
  • Evaluate OASIS XACML policy-based schemas for secure access control.
  • Evaluate and consider write-once data storage technology for long-term archival storage tiers.
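
To illustrate the Boolean ACE idea from the list above, here is a hypothetical Python evaluator for expressions such as 'user:alice | group:admins'; the grammar is invented for illustration and is not any vendor's actual syntax:

    import re

    def allowed(ace: str, user: str, groups: set) -> bool:
        """Evaluate an ACE like 'user:alice | (group:eng & !group:guests)'."""
        def term(match):
            kind, _, name = match.group().partition(":")
            if kind == "user":
                return str(user == name)
            if kind == "group":
                return str(name in groups)
            raise ValueError(f"unknown term {match.group()!r}")

        # Replace each user:/group: term with True/False, then evaluate the
        # remaining Boolean operators (a real evaluator would parse properly).
        expr = re.sub(r"[a-z]+:[\w-]+", term, ace)
        expr = expr.replace("&", " and ").replace("|", " or ").replace("!", " not ")
        return eval(expr)

    print(allowed("user:alice | group:admins", "bob", {"admins"}))     # True
    print(allowed("user:alice & !group:guests", "alice", {"guests"}))  # False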

Is Hybrid Data Storage a Solution for you?

As the 2016 New Year unfolds, the demand for secure data storage will increase at every level of the IT stack. According to the 2015 Cyber Defense Report, 70% of organizations were compromised by a successful data breach within the previous 12 months. With a zero-trust data protection mantra, new pervasive data security solutions will emerge that touch applications, endpoints, networks and storage collectively. Encryption technology alone, when keys are managed by employees in both on-premise and cloud environments, is not an adequate cyber-attack deterrent, while control over data location and redundancy remains key to maintaining compliance, data privacy and security across global, heterogeneous infrastructures.

To keep up with the burgeoning big-data deluge, organizations continue to move larger workloads into unified/virtualized environments, both on-premise and in the cloud. Many have already deployed a variety of high-performance hybrid data storage solutions in the data center. In a recently released ActualTech Media survey, many of these enterprises had begun incorporating flash-based storage in their data centers: 41% use on-premise HDD only and 9% use off-premise/cloud only, while 50% of respondents already use some type of on-premise flash-based storage (3% all-flash, 47% a hybrid flash/HDD mix). For all the benefits virtualization brings to the IT infrastructure, one factor has inhibited wide-scale virtualization of legacy applications: performance.

Bandwidth, IOPS and latency are the standard storage performance metrics, with latency typically measured in milliseconds and flash drives specified in fractions of a millisecond. Because data storage is usually the IT infrastructure's latency bottleneck, minimizing latency is a key objective for faster I/O completions and faster transaction processing. Latency has a direct impact on VM performance in virtualized environments, so the adoption of solid-state storage incorporating flash-caching hardware and software is enabling very low latencies while simultaneously helping to minimize network bandwidth bottlenecks. Flash SSD advantages include higher IOPS, reduced cooling and power draw, and lower failure rates than standard HDD. Although flash SSD costs are declining rapidly, they remain roughly 2x higher than HDD per terabyte (depending on TCO variables); a combined hybrid SSD/HDD automated tiered storage solution therefore offers compelling metrics that IT professionals are finding both acceptable and in-budget. SSD-based storage provides true business value by enabling faster access to information and real-time analytics, and a hybrid SSD/HDD solution lets IT balance cost and performance against unique application and SLA requirements.
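
A quick back-of-envelope sketch in Python shows why even a modest flash tier moves the latency needle; the latency figures here are illustrative assumptions, not product specifications:

    def effective_latency_ms(hit_ratio, ssd_ms=0.1, hdd_ms=8.0):
        """Average I/O latency when hot data is served from the flash tier."""
        return hit_ratio * ssd_ms + (1 - hit_ratio) * hdd_ms

    for hit in (0.0, 0.5, 0.9, 0.99):
        print(f"{hit:.0%} flash hits -> {effective_latency_ms(hit):.2f} ms avg")
    # 0% -> 8.00 ms; 90% -> 0.89 ms; 99% -> 0.18 ms: most of the benefit
    # arrives once the cache captures the hot working set.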

Which flash-based SSD solution is right for your environment? There are many factors to consider when comparing industrial-grade and commercial-grade flash storage devices. Industrial-grade devices use SLC (Single Level Cell) NAND as the storage medium, versus the MLC (Multi Level Cell) NAND of commercial-grade devices. Distinguished by voltage level, an SLC cell records a single bit (0 or 1, on or off), where an MLC cell stores two bits of data, i.e. one of four values (00, 01, 10 or 11). SLC NAND offers 20-30x the endurance cycles of MLC NAND, better data-retention life and extreme-temperature operation.
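
The cell arithmetic is simple, as this small sketch shows: n bits per cell requires 2^n distinguishable voltage levels (the TLC NAND discussed later in this series follows the same rule):

    for name, bits in [("SLC", 1), ("MLC", 2), ("TLC", 3)]:
        levels = 2 ** bits
        print(f"{name}: {bits} bit(s)/cell -> {levels} voltage levels per cell")
    # SLC: 2 levels; MLC: 4 levels; TLC: 8 levels -- each added bit halves
    # the voltage margin available to distinguish neighboring levels.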

  • SLC (Single Level Cell)
    • highest performance, high cost, enterprise grade NAND
    • 90,000-100,000 program/erase cycles per cell (highest endurance)
    • lowest density (1 bit per cell, lower is better for endurance)
    • lower power consumption
    • faster write speeds
    • much higher cost (3x higher than MLC)
    • best fit for industrial grade devices, embedded systems, critical applications
  • eMLC (Enterprise Multi Level Cell)
    • good performance, aimed at enterprise use
    • 20,000-30,000 program/erase cycles per cell
    • higher density (2 bits per cell)
    • lower endurance limit than SLC, higher than MLC
    • lower cost
    • good fit for light enterprise use & high-end consumer products with more disk writes than consumer-grade MLC
  • MLC (Multi Level Cell)
    • average performance, consumer grade NAND
    • 10,000 program/erase cycles per cell
    • higher density (2 or more bits per cell)
    • lower endurance limit than SLC
    • lower cost (3x lower than SLC)
    • good fit for consumer products (not for critical applications that require frequent data updates)
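
Using the program/erase figures above, a rough endurance estimate is capacity × P/E cycles ÷ write amplification. The sketch below assumes a write-amplification factor of 2, a workload-dependent value chosen purely for illustration:

    def lifetime_writes_tb(capacity_gb, pe_cycles, write_amplification=2.0):
        """Approximate total host writes (TB) before the NAND wears out."""
        return capacity_gb * pe_cycles / write_amplification / 1000

    for name, pe in [("SLC", 100_000), ("eMLC", 30_000), ("MLC", 10_000)]:
        print(f"64 GB {name}: ~{lifetime_writes_tb(64, pe):,.0f} TB of writes")
    # SLC ~3,200 TB; eMLC ~960 TB; MLC ~320 TB -- a 10x endurance spread
    # from the same capacity, driven entirely by the cell type.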

The pros and cons of HDD versus SSD can be pared down to a handful of variables: availability, capacity, durability, encryption, environment (humidity/temperature), fragmentation, heat (BTUs produced), MTBF/failure rate, noise, physical form factor, power requirements, price, shock/vibration, speed, warranty and write-protection capabilities. Write protection at the SSD and HDD firmware level, not just at the physical data and file-system level, is one of the key differentiators when comparing secure SSD/HDD storage solutions; only a small number of manufacturers offer such functionality, and it currently commands a price premium. HDDs are vulnerable to magnetic pulses and x-rays, making automated replication to alternate HDDs, storage arrays and locations a necessity, which drives up cost while leaving the data ultimately susceptible to loss. SSDs are impervious to those effects, making flash not only a viable tier-0 high-performance data cache but potentially a new long-term active-archive storage tier. New ISO and NIST secure-storage regulatory compliance can also be a factor in evaluating which flash-based solution best fits your requirements, along with DOD 5220, EU-DPD, HIPAA, FedRAMP, IRIG 106, NIST FIPS/FISMA/SP-800, NSA 130-2, PCI DSS and many others.

For more in-depth technical comparisons and product information, give Digistor a call today at 800-816-1886 or email us at sales@digistor.com.

Is data-centric security essential in modern storage solutions?

Data storage security has quickly become both a hot topic and a new budget line item for CTOs and CIOs in 2015, both here in the US and around the world. An organization's data is often its most valued asset, and storing it safely is increasingly both a commercial and a legal imperative. Managing not only how data is stored but how it is securely accessed and communicated across a wide range of media and services is the fundamental building block of information assurance.

Regulatory compliance has driven a variety of storage practices over the years to guarantee information assurance, but one of the most sweeping new international reforms is the pending EU General Data Protection Regulation (GDPR), being adopted by all 28 EU member states. Substantial changes in scope to embrace the globalization of cloud computing, social networks and data breaches bring new levels of enforcement and heavy fines that will forever shake up EU data protection practices and privacy guidelines.

The security of data storage systems and their supporting infrastructure is often overlooked because the inherent risks to storage ecosystems are misunderstood, leaving data exposed to compromise from a wide variety of events. The new NIST-sponsored Cyber-Physical Systems (CPS) framework was initiated to define key characteristics for better managing the development and implementation of Industrial Internet and Internet of Things (IoT) physical, computational and data storage components across multiple smart application domains, including energy, healthcare, law enforcement, manufacturing and transportation.

The brand-new ISO/IEC 27040:2015 standard defines storage-centric security as the application of physical, technical and administrative controls to protect storage systems and infrastructure against unauthorized disclosure, modification or destruction. These controls can be compensatory, corrective, detective, deterrent, preventive or recovery-oriented in nature.

The rapid adoption of complex software-defined storage (SDS), i.e. the uniting of compute, networking, storage and virtualization into a hyper-converged solution, became a top data center trend impacting both data security and data recovery strategies in 2015. Although SDS simplifies rapid provisioning, eases implementation and redundancy, and provides significant savings in cost, power and space, storage-centric security remains a significant gap in the SDS infrastructure.

Thanks to superior accessibility, capacity on demand, flexibility and lower overall IT costs compared to legacy online compute and storage methodologies, cloud computing has quickly become a worldwide mainstay. Yet, just like traditional online compute/storage, cloud computing has its own set of unique data security issues. Mitigating risks before and throughout a cloud adoption is the number-one imperative for CIOs, CISOs and DPOs as they transition applications and data to the cloud. The decision to move depends on the sensitivity of the data and application, the service-level agreement and the overall cloud security infrastructure, and ultimately on whether the business value offsets the risks.

According to a recently released 2016 Trend Micro security report, despite the need for a Data Protection Officer (DPO) or Chief Information Security Officer (CISO), fewer than 50% of enterprise organizations will have one, or a budget for one, by the end of 2016. With the EU GDPR, coupled with the ISO 27040 storage security standard, mandating a significantly higher degree of data protection, a DPO/CISO role dedicated solely to ensuring the integrity of data within and outside the enterprise is a wise investment. With this higher degree of awareness, legislation and technology around storage-centric security, we will begin to see a proactive shift in enterprise policies, practices and strategies that brings effective protection to the storage infrastructure.

Public safety is now a concern of every commercial enterprise, municipality, school and university. High-resolution video surveillance and law enforcement body-worn cameras (BWC) are generating more long-term video storage requirements than ever before. Enterprise IT must be able to balance a budget for both cameras and a secure infrastructure that enables easy, yet secure, data access. A wide variety of new BWC, chain-of-custody, evidence management and surveillance technology solutions are blossoming as new local, state and federal budget resources are being made available in 2016.

In the first quarter of 2015, IDC reported that 28.3 exabytes (roughly 28 billion gigabytes) of data storage capacity shipped worldwide. The largest share (23%) of this spending went to server-based storage and hyperscale (SDS-architecture) cloud infrastructures, while traditional external storage arrays fell significantly, displaced by all-flash and hybrid flash (NAND/HDD) arrays. Less than 0.05% of the storage products shipped employed Self-Encrypting Drive (SED) technology, yet almost 90% of the flash arrays shipped were SED-capable. SEDs offer FIPS 140-2-compliant security without the overhead of a software-based encryption schema, coupled with self-describing encryption-key management, making them a valued component of the secure data storage infrastructure.

Over the coming months of 2016, we will delve more deeply into the practical application of specific secure storage technologies: why and how to put security directly into the physical storage device, the advantages and disadvantages of specific data storage technologies, cost analysis and more. Stay tuned.

Anyone can build an SD card, but not all SD cards are created equal

SD cards and microSD cards of all varieties permeate the consumer market. Prices are continually dropping, and the cards are mass-produced in huge factories in Asia. With the cost of materials so low, you may start to wonder: is there a way to ensure you receive a high-quality SD card for your automotive, medical, body-worn camera or other demanding product?

Do you really want consumer SD or microSD cards for these projects? Not if storage is a critical component of your solution. You're dealing with narrow requirements, and you don't have room for mistakes and equipment malfunctions. Home users of electronics can work around the occasional dud or random defective piece, but a professional industry is a different ballgame: what is merely an annoyance to a casual user can make the difference between success and failure, and the odds of getting a bad card aren't small. You need to get things right the first time.

That's why industrial SD and microSD cards aren't an option for those working in certain industries: they're a necessity. Not all SD cards are created equal, and comparing an industrial SD card with one bought off the Walmart shelf is like comparing a grocery-store hot dog with fine Paris cuisine. The cards you buy off store shelves are produced with minimal quality control, and there's no consistency even among cards from the same manufacturer. You might test a hundred and find one you really like, but there's no guarantee that the next one you buy with the same labeling is made of the same components or has anything like the same capability, and no guarantee of what will happen when you put it under stress.

What about DIGISTOR industrial SD cards? They come with a price tag, but there's nothing arbitrary about the price: you get what you pay for. In this case, that means high-quality MLC/SLC NAND, a wide temperature range, and Consistency spelled with a capital C. It means you don't have to worry about the bill of materials (BOM) changing on you, and that's huge. If you build an application around one of our SD or microSD cards, you can be confident those cards will keep working in future builds of the application: no unpleasant surprises, no need for frequent retests.

Using an industrial SD or microSD card means you get quality and you get control. You get to utilize the resources within the controller. You have control over the BOM. Nothing can change without your say-so, and you know exactly what the capabilities of your cards are in every instance.

We’d be happy to share more with you about our industrial quality SD cards—just call us for more info. DIGISTOR stands for reliability and quality, and SD cards for your industrial application is one case where those two things really do matter.

Bringing Your Data Home

You had your picture archive safe on Flickr, your documents on Dropbox, and a running archive of your devices on Apple's iCloud. But when something happens to one of these services, like the two-day Dropbox downtime, you wonder whether keeping your archives in cloud storage really is the best way to go. Cloud storage, no matter how respected the provider, is prone to downtime, and having your precious files suddenly disappear is not something you can take with equanimity.

How to Make a Smooth Switch From Cloud Storage to Home Data Archive Options
There's something about having all that data available at home, in an archive of Blu-ray discs or a storage drive; even if all of today's big web companies go bankrupt, you've nothing to worry about. But what is the best data storage, and how do you make the switch? Isn't it too much work to be feasible? Bringing your data home may not be a half-hour job, but if you do your planning first, it can be a smooth, easy run rather than the huge headache it otherwise threatens to become.

Your first task is researching which type of storage device to use. Over the years you’ve probably accumulated more than a small amount of data, so your archive solution will need to have high capacity. You also want it to be reliable, long lasting, and you want to be able to add to it periodically. Should you buy a nice high-capacity hard disk drive, or is shelling out the bucks for a state of the art solid state drive the way to go?

The answer is neither! Hard disk drives and solid state drives are both wonderful in their places, but for a home archive you can't do better than to go with Blu-ray discs. Unlike hard disk drives, which have lots of moving parts that are prone to breakage, a Blu-ray disc is simply a 'page' of written information: cold storage, if you will. And unlike solid state drives, where data could deteriorate if not accessed, the data on your Blu-ray discs can be left in a drawer for years and read only when you want what you've archived.

Blu-ray discs are affordable, and they won’t take up much room. Over the years you could accumulate a collection of these discs, which can be stored conveniently in a small cabinet or magazine.

You can buy a quality external Blu-ray burner for a very reasonable price, and if you get it from us at Digistor, it'll come with a program called Rewind™, software that makes archiving super-easy on Windows or OS X. You'll need to buy your actual discs as well, of course; a set of ten 25GB or 50GB discs is a good place to start.
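
If you want to size that first order, a couple of lines of Python will do the planning arithmetic; the 180 GB archive size is just an example:

    import math

    def discs_needed(archive_gb, disc_gb=25.0, copies=2):
        """Discs to buy; two copies covers the off-site set suggested below."""
        return math.ceil(archive_gb / disc_gb) * copies

    print(discs_needed(180))        # 180 GB on 25 GB discs, two copies -> 16
    print(discs_needed(180, 50.0))  # the same archive on 50 GB discs -> 8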

When you've settled on your storage device and ordered your equipment, the next thing to do is figure out how to reclaim your data from cloud storage. Some cloud storage solutions make export super-easy; from others, it is a pain, but it's better to do it now than five years from now, when you'll have even more to deal with! If you're looking at long download times, you may want to start the process in the evening and let it run overnight. Make sure you have room on your computer for everything you'll be downloading; if you don't, set up an external hard disk for temporary storage. You can always do it in parts, downloading one disc's worth of archive material at a time.

Ready? Push that download button, and watch that data materialize out of thin air and come to solid existence on your home PC. When it’s all there, plug in your Blu-ray burner, stick a disc in and open Rewind™. Making a running archive of your data could scarcely be easier.  Choose a name for your archive, select your files, click ‘Archive It!’, and let the burn begin!

Then there is nothing left to do but organize your Blu-ray stash and file it somewhere safe and out of the way. Ideally, you'd make two identical archives: one for home, one for an alternate location. Disaster doesn't happen often, but when it does, it's well to be prepared.

For an extra safeguard, you can always keep your files in your old web repository as well. Cloud solutions are wonderful in their place as a way to access specific data from a wide variety of locations. They're also wonderful as a quick backup of small files in case of natural disasters such as tornadoes and fires. But for an all-purpose general archive of all your data, pictures and information, nothing beats a well-organized home-based storage center, like your new mini-cabinet of Blu-ray discs.

What You’re Paying for When You Buy SSD Drives Designed for Professional Video Shoots

Sure, you can get an SSD that looks as though it ought to fit your video camera fairly cheaply on eBay or off the shelf. So what makes a "professional video" SSD, well, professional?

To begin with, not all SSD drives are compatible with a high-end video camera like those from Blackmagic Design.

Some don't fit the camera: a few millimeters' difference in drive height can either keep the drive from seating at all or leave it slipping around once it's in place. Most newly released SSDs aren't designed with cameras in mind and are built to be as thin as possible (typically 7mm); in a camera's 9.5mm slot, that extra space can cause rattling and additional wear on the SATA connection.

Others have firmware that just doesn't work with your camera, interrupting your workflow with an inability to record or causing you to drop frames every time you try to shoot an important video.

That's why brands like Blackmagic supply their customers with a list of approved SSDs that have been tested and found to work.
These are higher-end SSDs that have been rigorously tested to ensure you can depend on them—and we’re proud that our DIGISTOR Professional Video SSD series is included on that list.

But our drives aren't just one name among many on that list. We've built them to be something special.

What is it that sets DIGISTOR Professional Video SSD Drives apart?
DIGISTOR Professional Video SSDs aren’t just compatible with your Blackmagic camera; they’re made to function with the camera as if they were born together. You can take your DIGISTOR Professional Video SSD Drive straight out of the box, stick it in your camera, and expect it to work immediately. Contrast that with the formatting, reformatting, and extensive fiddling you can expect if you use another SSD drive and you’ll already start to appreciate the synergy we’ve worked for.

Additionally, here’s an SSD series that’s all about video. (In fact, it’s the first and only!)

See Also: Top 5 things cinematographers love about our Professional Video SSDs

DIGISTOR Professional Video SSDs aren't just general-purpose drives co-opted for filming; they're designed for shooting in 2.5K RAW and 2.5K and 4K ProRes, along with our special 1TB SSD designed for the 4K RAW and ProRes (HQ) 422 format. Extensively tested with Blackmagic Cinema and Production Cameras, our SSDs do more than support the equipment preferred by professional filmmakers. Powerful, reliable and durable, DIGISTOR Professional Video SSDs aim to make a difference in your filming experience.

Bottom line? Made-for-PC or bottom-shelf SSDs may save you a few dollars up front, but there's a chance you could be throwing the entire cost away (not to mention the price of lost work!) if one fails to meet your needs.

Industrial SSDs on the Frontiers of Science: Using SSDs at the International Space Station

It’s not only high-end business and heavy-duty applications that rely on the power of Industrial Solid State Drives (SSDs) these days. Besides powering most of our earthly communications and industry, Industrial SSDs are also pushing the frontiers of science beyond the limits of our atmosphere. They have become the storage medium of choice at the International Space Station, allowing reliable, high-volume data collection like never before.

Data storage in space comes with its own set of special challenges. Whatever storage medium is used needs not only to be compact, taking up a minimum of space, but also light, as every ounce on the journey to space counts. Power limitations also demand low power consumption.

Finally, any storage system used should offer high reliability, an extreme-temperature operating range, the ability to function without gravity, and the ability to withstand a high dose of radiation and remain uncorrupted.

Industrial-strength SSD systems measure up well on most of those criteria; radiation alone is the potential problem area. Down here on Earth, our atmosphere and magnetic field shield us from most debilitating cosmic radiation. Out there in space, equipment is going (figuratively) naked.

Off-the-Shelf and Into Space
NAND flash memory tends to be vulnerable to radiation; ionizing effects can do a number on the individual cells that hold the data bits, resulting in voltage shifts and data corruption. But NASA scientists have discovered that while some memory chips fail dramatically under radiation, others have the capacity to perform reliably.

This means that high quality industrial SSDs can be used after a rigorous test-and-retest procedure in which the highest performers are selected.

That's why the International Space Station (ISS) now has the capacity to send a large volume of data and video images down to us here, shouldering past the old limits of knowledge and understanding in a way that's never before been possible.

And it's only getting better. While the switch from older operating technologies to SSDs began several years ago, just last week astronaut Scott Kelly switched out the old-fashioned Video Cassette Recorders (VCRs) from the starboard end of the ISS's Columbus module and replaced them with new solid state drive recorders.

As the transition to SSDs continues, we can expect to see a much larger volume of higher quality images and information beamed down to us directly from the outer frontiers of scientific exploration.

What Went Wrong With TLC NAND

When Samsung pushed the envelope and introduced its TLC NAND flash memory for general use, it had the makings of a landmark innovation. TLC (triple-level cell) NAND is cheaper to manufacture than either SLC or MLC NAND because it fits more data into the same NAND cell: three bits per cell, rather than the one bit or two bits that single-level (SLC) and multi-level (MLC) NAND put away in one cell space. You'd think TLC NAND would take over the market in short order; why waste resources manufacturing more expensive SLC or MLC NAND?

When introduced, the new TLC NAND solid state drives seemed to have conquered all of TLC's previous difficulties with some state-of-the-art firmware. Read speeds looked pretty (the Samsung SSD 840's 500MB/s is nothing to sneeze at), and reliability seemed a non-issue.

But mounting excitement over the potentially cost-effective storage innovation waned as performance problems were discovered.

In fact, it wasn't long before users began reporting a new and extremely debilitating problem: those pretty read speeds and that near-100% reliability only held for new, freshly written data. Data that had been sitting on the drive for, say, all of eight weeks would have deteriorated to the point that it could only be read at much slower speeds.

Meaning that by the time data had been sitting static on your drive for six months or a year, those previously high read speeds would have slowed to a snail's pace.

It turns out that this problem is inherent in the TLC design. Voltage drift happens in every NAND drive over time, but in SLC and MLC NAND the drift is small and consistent, and can be accounted for in the reading algorithms. When you lock three data bits in a cell, though, data deterioration speeds up immensely. What's worse, there's no longer a generalized algorithm that can take all the shifting into account, so old data is simply blurred.
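
The arithmetic behind that blurring is straightforward, as this illustrative sketch shows; the cell's usable voltage window is an assumed round number, but the ratios are what matter:

    def margin_mv(window_mv, bits_per_cell):
        """Voltage margin separating adjacent levels in one cell."""
        return window_mv / (2 ** bits_per_cell)

    for name, bits in [("SLC", 1), ("MLC", 2), ("TLC", 3)]:
        print(f"{name}: ~{margin_mv(3200, bits):.0f} mV between levels")
    # SLC ~1600 mV; MLC ~800 mV; TLC ~400 mV -- the same physical drift
    # consumes four times more of TLC's margin than SLC's.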

Samsung has introduced two firmware updates in an attempt to smooth over the problem. The first, a fancy algorithm meant to take account of the voltage drift and factor it in where necessary, completely failed to solve the issue.

The second, while more successful, is a somewhat unpleasant workaround: the drive is set to rewrite all data regularly, so nothing ever gets old. It does manage to get around the problem, since if all data is new data, it will all be readable and quickly accessible. However, since every NAND SSD has a finite number of program/erase cycles, this isn't an ideal fix.
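
A rough sketch of the trade-off: periodic refresh spends a finite program/erase budget even when the user writes nothing. The ~1,000-cycle endurance and 30-day refresh interval below are assumptions for illustration:

    def years_until_worn(pe_cycles, refresh_days):
        """Years before refresh alone consumes every program/erase cycle."""
        return pe_cycles * refresh_days / 365

    print(f"{years_until_worn(1_000, 30):.0f} years of refresh headroom")  # ~82
    # Survivable in isolation -- but every refresh cycle is endurance taken
    # away from real workloads, on the NAND with the least endurance to spare.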

What does this all boil down to?
Simply that TLC NAND is not the future of data storage, and it doesn't even have a good seat in the present. If your data matters in the long term, you'll want to go with a higher-quality NAND: MLC for your basic SSD needs, or SLC for industrial use or super-sensitive data storage. There's no way around it.

Your microSD Card: Is Cost All That Matters?

They look the same and have the same capacity.  For consumers shopping for a microSD card, the only difference might appear to be their price. Does it really matter which microSD card you buy? Should you go with the cheapest option?

Turns out, there’s more to that little plastic chip than you can see on the outside.
It may come as a surprise, but choosing a microSD card is actually an important decision that affects the performance of your phone or other mobile device. Pair a cheapo microSD with your top notch smartphone and you’re setting yourself up for a whole range of problems, ranging from memory failures and corrupted files to crashing your entire phone OS.

A big part of the problem, as Xiaomi Vice President of International Hugo Barra explained in an interview with Engadget's Richard Lai, is "the fact that there are a lot of fake cards out there."

The market abounds with SanDisk that isn't really SanDisk, Kingston that isn't really Kingston, and other "major manufacturer" cards that are just cheap fakes. It's not hard to print a name-brand sticker and put it on a generic microSD. Barra elaborated on the problem:

“You think you’re buying like a Kingston or a SanDisk but you’re actually not, and they’re extremely poor quality, they’re slow, they sometimes just stop working, and it gives people huge number of issues, apps crashing all the time, users losing data, a lot of basically complaints and customer frustration. It’s gonna be a while before you finally accept that maybe the reason why it’s not performing is because you put in an SD card.”

Because of the prevalence of sub-quality microSD cards on the market, the problem has many smartphone manufacturers choosing not to put microSD slots in their high-end, flagship phones at all. There’s just too much potential for disaster and the problems that come up usually get blamed on the innocent phone, rather than the tiny memory card hiding inside of it, jamming up the works.

But despite the concern from big brands, microSD slots are still a big selling point for shoppers.
If you do value the ability to increase your smartphone’s memory beyond its out-of-the-box capacity, make sure the card you’re slipping inside is a quality product – not a phone-debilitating enemy in disguise.

It's important not only to choose a brand you can trust, but also to buy from a reliable distributor; preferably an official partner of a top-quality brand like Panasonic or Kingston. Our official partnership with Panasonic is just one reason DIGISTOR is a preferred retailer for high-end, quality microSD cards you can trust. If you'd like to learn more about our quality control and microSD card compatibility, call one of our sales associates for more information.

Top 5 things cinematographers love about our Professional Video SSDs

Cinematographers and filmmakers love our Professional Video SSD drives. After much discussion of how and why we developed a series of SSD drives focused on video capture, primarily aimed at certification for Blackmagic recording hardware, we have found the top 5 things that get cinematographers excited about our drives.

1. It’s the first and only SSD series made exclusively for video capture.
DIGISTOR set out to build an SSD lineup focused specifically on video capture rather than designing an SSD for computer use. Where other brands build SSDs for a wide variety of computers, we focus on what works well with your video recording hardware.

From the locked bill of materials and firmware, to the NAND flash, controller, and even the physical size of our SSDs, our Professional Video Series was built from the ground up to meet the unique needs of professional filmmakers.

2. Our Professional Video Series SSDs are sized for a snug fit.
Your Blackmagic camera has a 9.5mm slot; shouldn't your SSD be designed to sit securely inside?

While the industry continues to move towards more compact drives, the Professional Video Series remains a standard 9.5mm to avoid shaking, rattling, flexing your connector and possibly dropping frames.

3. Our locked BOM ensures long-term compatibility with Blackmagic.
Once approved by Blackmagic, we lock our bill of materials — both physical and firmware. Before any changes are made, our SSDs are again extensively tested for re-certification by Blackmagic.

For our customers, that means every DIGISTOR drive that has been listed as Blackmagic compatible will remain so, regardless of movements in the market for new NAND or updated controllers.

4. Your workflow, made more efficient by editing straight from the disk.
Stuck on site or don’t have the time to transfer your footage to local storage?

Made to go directly from your Blackmagic recording hardware into a HyperDeck Studio or Blackmagic MultiDock, or to connect via a Thunderbolt adapter, our powerful SSDs let you transition smoothly from shooting to on-site editing.

5. Our Professional Video Series comes ready to shoot.
Unlike off-the-shelf SSDs that require formatting before use, our Professional Video Series SSDs are made to support recording hardware right out of the box. They come pre-formatted with exFAT, so you can go straight to shooting.
