SSD prices on the rise due to NAND flash shortages

The NAND flash supply shortage that has endured this year is expected to continue throughout the fourth quarter, and all signs point to ongoing supply issues well into 2017.

According to TrendForce, strong smartphone demand is the main reason for the NAND flash shortage. However, higher-than-anticipated SSD adoption rates in the industrial, enterprise, and consumer markets have also contributed to the severe shortages. We have already seen factory lead times nearly double over the past few months, and price increases affecting certain SSD product lines are not far behind.

Have you felt the impact of flash?

The past few years have brought more dramatic change in how data is stored on flash than ever before. NAND flash used to be “trusted” (using that term loosely) in only two areas: laptops, and non-critical entertainment such as storage for cameras. Before DIGISTOR was big in the flash storage arena, we would scrutinize every detail of a 2.5” SSD for our desktop PCs, comparing how much data per day we copied, used, and wrote to be sure we wouldn’t run out of NAND P/E cycles before a standard HDD would wear out.
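To make that kind of back-of-the-envelope check concrete, here is a minimal sketch of the arithmetic. The capacity, P/E cycle rating, daily-write figure, and write-amplification factor below are illustrative assumptions, not the specs of any particular drive.

```python
# Rough SSD lifetime estimate from NAND P/E cycles.
# All figures are illustrative assumptions, not vendor specifications.

def ssd_lifetime_years(capacity_gb, pe_cycles, daily_writes_gb, write_amplification=2.0):
    """Estimate years until the rated P/E cycles are exhausted.

    Total endurance (GB) = capacity * P/E cycles / write amplification.
    """
    total_writable_gb = capacity_gb * pe_cycles / write_amplification
    return total_writable_gb / (daily_writes_gb * 365)

# Example: a 256 GB MLC drive rated at 3,000 P/E cycles,
# written at 50 GB/day with a write-amplification factor of 2.
print(f"{ssd_lifetime_years(256, 3000, 50):.1f} years")  # roughly 21 years
```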

Over time we came to trust the technology, and even to work closely with several global chipset manufacturers that let us in on the deep inner workings of how the NAND is used and how the chipset can, in fact, protect your data. Now we’ve scaled far beyond standard SSD storage for PCs and cinema SSDs. The more we improve industrial-grade, high-speed flash storage, the more critical applications appear that benefit from the improved speed and ruggedized design of solid-state storage.

One area that was fairly surprising is the way cloud storage uses flash to improve the response time of any given search for your data. Usually we assume that the bottleneck is our ISP, so our remote files might as well sit on big spinning disk drives out in a server farm where SSDs would be of little use. However, have you noticed that search results on Google and Facebook now appear as you type? That is the result of parts of the cloud being built on the high-speed dependability of solid-state storage.

The area that was more expected, but slower growing, is the IoT. The Internet of Things category has been looking for its day in the sun for some time. Now, something amazing is happening thanks to improvements in controller technology, NAND flash, and testing for industrial applications. These little devices can finally be truly smart because of the capacity, speed, and reliability of integrated solid-state storage and improved SoC ICs, all packed into a small footprint.

The more IoT devices prove themselves as necessities in our lives, the more innovation will grow from the next generation of devices. To get there, though, the main requirement is usability; a close second is reliability. IoT devices are usually low power and always running. Perhaps you don’t use a particular device every day, but when you need it, it must work reliably. There is no room for off-the-shelf consumer flash storage in a critical control system, perhaps one monitoring the security of your home or granting access to your front door.

We all got used to flash with cheap SD storage for our point-and-shoot cameras, and maybe we have had some experience with SSDs in our laptops. Now that flash storage backs more critical components of our digital life, there’s no other choice but to be sure your storage of choice uses high-quality, tested NAND and is built by a trusted manufacturer. DIGISTOR is always willing to help you in this endeavor. If you’re working on any project, large or small, IoT or enterprise storage, and want to take storage concerns out of the equation, just contact us and we’ll be able to help.

How does your Body Worn Camera store and manage crucial video evidence?

Police agencies around the country are ramping up their Body Worn Camera (BWC) policies and procedures. High-profile police assault videos have continued to surface on the internet, creating a public outcry for reform, training, and accountability. Federal grants to help agencies fund BWC deployments are expanding rapidly. These subsidies have drawn a whirlwind of new manufacturers into the BWC business, looking to take advantage of federal grants and a growing market. BWCs offer a variety of features, including high-definition video capture, night vision, and ruggedized housings.

BWCs record video evidence on an SD card similar to what you would use in a standard camera. That is part of the issue: most agencies and BWC manufacturers use off-the-shelf consumer SD cards to acquire and store video evidence, often without realizing it. These cards are at very high risk of corruption. Consumer cards do not offer extended temperature support and are at risk of failure in both hot and cold climates; a standard SD card can fail after being left behind in a hot car for just a short period. Most standard SD cards also lack a power-fail feature and robust video acquisition firmware to prevent corruption or overwrite issues. The testing and components used by consumer card manufacturers are simply not up to the standard needed for a secure and robust BWC video storage solution.

The real question is how best to protect crucial video evidence on Body-Worn Cameras and ensure it gets from the camera to the courtroom. Agencies, BWC manufacturers, and surveillance companies can solve a lot of support and security issues by asking their current BWC supplier specific questions about the storage on board their cameras. The first step is not to get caught up on speed; these cards are all very fast. That said, not all SD cards are equal, so it is important to design a card for your particular application. Here are a few important questions to ask your BWC provider to ensure you have the best possible secure storage for your BWC application:

What operating and storage temperature ranges, both hot and cold, does your card support?

Does your SD card have a locked Bill of Materials (BOM)?

Does your SD card use TLC, MLC, or SLC flash?

Can you lock your SD card to one specific camera for security purposes?

Can your SD card protect against accidental deletions?

Do you have a power-fail feature?

Again, be sure to look deeper than basic specifications such as read/write speeds; sometimes the fastest card is not the best card for your application. Understanding your overall application and how it accesses the flash within the SD card itself is far more important. Understanding your overall BWC application demands and the life expectancy of your BWC camera is crucial both to protecting key video evidence and to protecting the agency’s investment. Securing the video until it is transferred to a more secure and permanent location is the biggest question facing the BWC community, and many don’t even know it yet.

A new look to continue beyond 15 years of service

We are excited to introduce you to DIGISTOR’s updated website and blog.

We’ve come a long way since we got our start in 2001, when our family started this business to provide digital archiving and data storage products to industrial and global OEM customers. Over the past 15 years, DIGISTOR has become a leading innovator, manufacturer, and distributor of industrial-grade flash storage products, secure storage products, and digital-video solutions, and today we serve customers around the world in industries such as Law Enforcement, Media and Entertainment, Medical, Professional Video, Security/Surveillance, and Military.

Our new website is designed to better serve our customers, and over the coming months you will see several new product announcements and additional website enhancements, such as improvements to our self-service customer center and additional online B2B capabilities. Our updated blog will continue to publish interesting articles, news, and stories relevant to our industry.

We always welcome your comments, feedback, and ideas, so please be sure to connect with us on social media. We will have a lot to share in the weeks and months ahead.

Visit our Facebook page

Tweet us

Follow us on LinkedIn

Evaluating Storage System Security

Storing digital data successfully requires a balance of availability, cost, performance, and reliability. With the emergence of low-power, petabyte-scale archival storage and flash-based systems, it is getting increasingly difficult to quantify performance, reliability, and space-efficiency trade-offs, especially when coupled with storage-security factors. Storage performance is measured by latency, throughput (bandwidth), and IOPS, with throughput typically presented as sustained (long-duration) and peak (short-burst) transfer rates; once storage security is employed, those measurements become far less uniform and harder to compare.
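As a quick illustration of how those three metrics relate, the sketch below uses the usual rules of thumb: sustainable IOPS at a given queue depth is bounded by average latency, and throughput is roughly IOPS times the I/O size. The queue depth, latency, and block size are arbitrary example values, not measurements of any specific device.

```python
# Back-of-the-envelope relationships between latency, IOPS, and throughput.
# Queue depth, latency, and block size are arbitrary example values.

def max_iops(avg_latency_ms, queue_depth=1):
    """Upper bound on IOPS: queue_depth operations complete every avg_latency."""
    return queue_depth / (avg_latency_ms / 1000.0)

def throughput_mb_s(iops, block_size_kb):
    """Throughput (MB/s) = IOPS * I/O size."""
    return iops * block_size_kb / 1024.0

iops = max_iops(avg_latency_ms=0.2, queue_depth=32)   # e.g. a flash device
print(f"{iops:,.0f} IOPS, {throughput_mb_s(iops, 4):,.0f} MB/s at 4 KB I/O")
```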

Although much work has been done on defining, testing, and implementing mechanisms to safeguard data storage in long-term archival storage systems, data security verification in our cloud-based, mobile-driven, containerized, software-defined remote storage world remains a unique and ongoing challenge.

Data security can be ensured in a variety of ways depending on the level of security desired, the performance required, and the tolerance for user inconvenience. Most storage systems rely on encrypting data over the wire or on-disk, typically alongside pre-computed checksums and secure hashes, but there is no standardized parameter set or protocol for comparing network or on-disk performance and integrity while the data is in actual use.
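As a minimal illustration of the pre-computed checksum approach mentioned above, the sketch below computes a SHA-256 digest for a stored object at write time and re-verifies it at read or audit time. The file path is a hypothetical placeholder, and the sketch deliberately says nothing about in-use performance, which is exactly the measurement gap the paragraph points out.

```python
# Pre-computed secure hash for at-rest integrity checking (minimal sketch).
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream the file in 1 MiB chunks so large objects never load fully into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# At write time: store the digest alongside the object (path is hypothetical).
stored = sha256_of("archive/object.bin")

# At read/audit time: recompute and compare.
if sha256_of("archive/object.bin") != stored:
    raise RuntimeError("integrity check failed: object was modified or corrupted")
```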

In today’s multi-tenant virtualized container storage environments, containers depend on a different approach to virtualization: rather than emulating hardware (CPU/memory/network/storage) with a guest OS running on top, containerization separates users and processes from one another. Multi-tenant security is especially important given the heavy reliance on always-on mobile data access to containerized cloud storage, where the top ten security issues identified in 2015 by OWASP (www.owasp.org) were:

  • Insecure data storage
  • Weak server-side controls
  • Insufficient transport layer protection
  • Client-side injection
  • Poor authorization and authentication
  • Improper session handling
  • Security decisions via untrusted inputs
  • Side-channel data leakage
  • Broken cryptography
  • Sensitive information disclosure

Docker, one of the most widely deployed container technologies in use today, has only recently addressed container user-security concerns by separating day-to-day container operation privileges from root privileges on the server host, thus minimizing the risk of cross-tenant user-namespace abuse and root access to the server and its data.
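For reference, Docker exposes this separation as user-namespace remapping, configured through the "userns-remap" key in the daemon configuration file. The sketch below simply checks whether a host has it turned on; the configuration path and key are Docker's, while the audit script itself is only an illustration, not part of Docker.

```python
# Check whether the local Docker daemon has user-namespace remapping enabled
# (the "userns-remap" key in /etc/docker/daemon.json). Minimal audit sketch.
import json
from pathlib import Path

DAEMON_JSON = Path("/etc/docker/daemon.json")

def userns_remap_enabled():
    if not DAEMON_JSON.exists():
        return False
    config = json.loads(DAEMON_JSON.read_text())
    return bool(config.get("userns-remap"))

if userns_remap_enabled():
    print("container UIDs are remapped away from host root")
else:
    print("containers share the host user namespace; root in a container is root on the host")
```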

The Center for Internet Security recently released a series of internet security benchmarks (https://benchmarks.cisecurity.org) that, although the CIS is an independent authority rather than a standards body, are based on recommended, industry-accepted FISMA, PCI, HIPAA, and other system-hardening standards and help mitigate security risk for virtualized container storage infrastructure implementations. Although a number of new products are being introduced that focus specifically on virtual container data security, what does “secure” really mean in the container context, i.e., secure container access, valid container data, native security of the application(s) in the container, etc.? Most container data volumes today are tied to a specific virtual server, and if the container fails or is moved from that server to another, the connection to the data volume is lost (no persistent storage), regardless of the security parameters employed.

For virtual container data to be truly secure, a fully distributed, reliable, secure read/write container file system must be employed to ensure secure, resilient cloud deployments. Ideally, this is achieved with a container-native cloud deployment on bare metal, without the use of virtual machines, making the container’s data lifecycle and application scalability independent of the container’s host while minimizing the future cost and complexity of provisioning and managing virtual machine server hosts. That, coupled with a hardware-secured, write-once data storage device tier, can truly ensure long-term data storage security whether or not encryption is used.

Additionally, and most importantly, cloud data storage encryption key management, although addressed within the facets of the SNIA Cloud Data Management Interface (CDMI) and the Key Management Interoperability Protocol (KMIP) proposed standard, needs much wider adoption: today most crypto key management happens either at the individual storage device level, with a single point of key-access failure, or as a cloud-provider-managed option. Lose the key(s), lose the data, no matter how securely it is managed or replicated!
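To make the “lose the keys, lose the data” point concrete, here is a minimal envelope-encryption sketch using the third-party `cryptography` package: a per-object data key encrypts the payload, and a key-encryption key (KEK) wraps the data key, so the availability of the KEK ultimately decides whether the data is recoverable. It illustrates the general pattern under those assumptions; it is not the CDMI/KMIP protocol itself.

```python
# Envelope encryption in miniature: a KEK wraps the data key that protects the data.
# Uses the third-party "cryptography" package; purely illustrative, not KMIP.
from cryptography.fernet import Fernet

kek = Fernet(Fernet.generate_key())          # key-encryption key (held by the key manager)
data_key_bytes = Fernet.generate_key()       # per-object data key
data_key = Fernet(data_key_bytes)

ciphertext = data_key.encrypt(b"long-term archive record")
wrapped_key = kek.encrypt(data_key_bytes)    # only the wrapped key is stored with the data

# Recovery requires the KEK: unwrap the data key, then decrypt the object.
recovered_key = Fernet(kek.decrypt(wrapped_key))
assert recovered_key.decrypt(ciphertext) == b"long-term archive record"
# If the KEK is lost, wrapped_key cannot be unwrapped and the ciphertext is unrecoverable.
```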

(Figure: clients acting in the role of using a data storage interface.)

Some data storage security basics:

  • Physical security is essential.
  • Develop internal storage security standards (authentication/authorization/access control methods, configuration templates, encryption requirements, security architecture, zoning, etc.).
  • Document, maintain and enforce security policies that cover availability, confidentiality and integrity for storage-specific areas.
  • Ensure basic access controls are in place in line with your policies; change insecure access permissions (see the sketch after this list).
  • Unload unnecessary NFS-related storage services (mountd, statd, and lockd) where they are not required.
  • Limit and control network-based permissions for network volumes and shares.
  • Ensure proper authentication and credential verification is taking place at one or more layers above storage devices (within the host operating system, applications and databases).
  • Operating system, application, and database-centric storage safeguards alone are inadequate; consider vendor-specific and/or third-party storage security add-ons.
  • Ensure audit logging is taking place for storage security accountability.
  • Perform semi-annual information audits of physical location inventory and critical information assets.
  • Separate storage administration and maintenance accounts with strong passwords for both accountability and to minimize potential compromised-account damage.
  • Encrypting data in transit helps, but should not be relied on exclusively.
  • Carefully consider software-based storage encryption solutions for critical systems (key mgt.).
  • Evaluate and consider hardware-based drive encryption on the client side.
  • Carefully select a unified encryption key management platform that includes centralized key lifecycle management.
  • Deploy Boolean-based file/stream access control expressions (ACEs) in container environments to simplify permission granting to users/groups across data files/directories while providing an additional layer of data protection in multi-tenant environments.
  • Evaluate OASIS and XACML policy-based schemas for secure access control.
  • Evaluate and consider write-once data storage technology for long-term archival storage tiers.
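As one small example of the “insecure access permissions” item above, the sketch below walks a storage mount point and flags world-writable files. The mount path is a hypothetical placeholder and the check is deliberately simplistic; a real audit would also cover ownership, group permissions, and network share ACLs.

```python
# Flag world-writable files under a storage mount point (placeholder path).
# A deliberately simple check for the "insecure access permissions" item above.
import os
import stat

MOUNT_POINT = "/mnt/shared_storage"   # hypothetical example path

for root, _dirs, files in os.walk(MOUNT_POINT):
    for name in files:
        path = os.path.join(root, name)
        mode = os.stat(path).st_mode
        if mode & stat.S_IWOTH:        # world-writable bit is set
            print(f"world-writable: {path} ({stat.filemode(mode)})")
```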

Is Hybrid Data Storage a Solution for You?

As the 2016 New Year unfolds, the demand for secure data storage will increase at every level of the IT stack. According to the 2015 Cyber Defense Report, 70% of organizations were compromised by a successful data breach within the last 12 months. With a zero-trust data protection mantra, new pervasive data security solutions will emerge that touch applications, endpoints, networks, and storage collectively. Encryption technology alone, when keys are managed by employees in both on-premise and cloud environments, is not an adequate cyber-attack deterrent, while control over data location and redundancy is key to maintaining compliance, data privacy, and security across global, heterogeneous infrastructures.

To keep up with the burgeoning big-data deluge, organizations continue to move larger workloads into unified/virtualized environments, both on-premise and in the cloud. Many have already successfully deployed a variety of high-performance hybrid data storage solutions in the data center. In a recently released survey by ActualTech Media, many of these enterprises report incorporating flash-based storage in their data centers: 41% use on-premise HDD only, 9% use off-premise/cloud storage only, and 50% of respondents already use some type of on-premise flash-based storage (3% all-flash, 47% a hybrid flash/HDD mix). With all the significant benefits virtualization brings to the IT infrastructure, one factor has inhibited wide-scale legacy application virtualization, and that is performance.

Bandwidth, IOPS, and latency are the standard storage performance metrics; latency is typically measured in milliseconds, with flash drives specified at fractions of a millisecond. As data storage is usually the IT infrastructure’s latency bottleneck, minimizing latency is a key objective for faster I/O completion and faster transaction processing. Because latency has a direct impact on VM performance in virtualized environments, the adoption of solid-state storage incorporating flash-caching hardware and software is enabling very low latencies while simultaneously helping to minimize network bandwidth bottlenecks. Flash SSD advantages include higher IOPS, reduced cooling, reduced power, and lower failure rates than standard HDD. Although flash SSD storage costs are declining rapidly, they are still roughly 2x higher than HDD per terabyte (depending on TCO variables); even so, a combined hybrid SSD/HDD automated tiered storage solution offers compelling metrics that IT professionals are finding both acceptable and in-budget. SSD-based data storage technology provides true business value by enabling faster access to information and real-time analytics. A hybrid SSD/HDD solution enables IT to balance cost and performance to meet their unique application and SLA requirements.
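A hybrid tier works by keeping frequently accessed (“hot”) blocks on flash and demoting cold blocks to HDD. The toy policy below sketches that placement decision; the access-count threshold and sampling-window idea are arbitrary illustrative assumptions, not how any particular array implements tiering.

```python
# Toy hot/cold placement decision for a hybrid SSD/HDD tier.
# The access-count threshold is an arbitrary illustrative value.
from collections import Counter

HOT_THRESHOLD = 100          # accesses per sampling window

access_counts = Counter()    # block_id -> accesses in the current window

def record_access(block_id):
    access_counts[block_id] += 1

def placement(block_id):
    """Return which tier a block should live on for the next window."""
    return "ssd" if access_counts[block_id] >= HOT_THRESHOLD else "hdd"

# Example: block 42 is read in a tight loop, block 7 only once.
for _ in range(150):
    record_access(42)
record_access(7)
print(placement(42), placement(7))   # ssd hdd
```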

Which flash-based SSD solution is truly right for your environment? There are many factors to consider when comparing industrial-grade versus commercial-grade flash storage devices. Industrial-grade devices use SLC (Single Level Cell) NAND as the storage medium, versus the MLC (Multi Level Cell) NAND used in commercial-grade devices. Based on cell voltage, SLC records only a single bit (0 or 1, on or off), whereas MLC distinguishes four voltage levels (00, 01, 10, or 11) to store two bits per cell. SLC NAND offers 20-30x the endurance cycles of MLC NAND, better data retention, and functionality at temperature extremes.

  • SLC (Single Level Cell)
    • highest performance, high cost, enterprise grade NAND
    • 90,000-100,000 program/erase cycles per cell (highest endurance)
    • lowest density (1 bit per cell, lower is better for endurance)
    • lower power consumption
    • faster write speeds
    • much higher cost (3x higher than MLC)
    • best fit for industrial grade devices, embedded systems, critical applications
  • eMLC (Enterprise Multi Level Cell)
    • good performance, aimed at enterprise use
    • 20,000-30,000 program/erase cycles per cell
    • higher density (2 bits per cell)
    • lower endurance limit than SLC, higher than MLC
    • lower cost
    • good fit for light enterprise use and high-end consumer products with more disk writes than consumer-grade MLC can handle
  • MLC (Multi Level Cell)
    • average performance, consumer grade NAND
    • 10,000 program/erase cycles per cell
    • higher density (2 or more bits per cell)
    • lower endurance limit than SLC
    • lower cost (3x lower than SLC)
    • good fit for consumer products (not for critical applications that require frequent data updates)

The pros and cons of HDD compared to SSD can be pared down to a handful of variables, i.e., availability, capacity, durability, encryption, environment (humidity/temperature), fragmentation, heat/BTUs produced, MTBF/failure rate, noise, physical form factor, power requirements, price, shock/vibration, speed, warranty, and write-protection capabilities. Write-protection at the SSD and HDD firmware level, not just the physical data and file system level, is one of the key differentiators when comparing secure SSD/HDD storage technology solutions. Only a small number of manufacturers offer such functionality, and price is presently a premium consideration. HDDs are vulnerable to magnetic pulse and X-ray, making automated replication to alternate HDDs, storage arrays, and locations a necessity, driving up cost while still leaving data ultimately susceptible to loss. SSDs are impervious to these effects, making them not only a viable tier-0 high-performance data cache solution but potentially a new long-term active-archive storage tier. New ISO and NIST secure storage regulatory compliance can also be a factor when evaluating which flash-based solution will best fit your requirements, as can DOD 5220, EU-DPD, HIPAA, FedRAMP, IRIG 106, NIST FIPS/FISMA/SP-800, NSA 130-2, PCI/DSS, and many others.

For more in-depth technical comparisons and product information, give Digistor a call today at 800-816-1886 or email us at sales@digistor.com.

Is data-centric security essential in modern storage solutions?

Data storage security has quickly become both a hot topic and a new budget line item for CTOs/CIOs in 2015, both here in the US and around the world. An organization’s data is most often its most valued asset, and keeping it stored safely is increasingly both a commercial and a legal imperative. Managing not only how data is stored but how to securely access and communicate it across a wide range of media and services is the fundamental building block of information assurance.

Regulatory compliance has driven a variety of storage practices over the years to guarantee information assurance, but one of the most sweeping new international reforms comes from the pending EU General Data Protection Regulation (GDPR) being adopted by all 28 EU member states. Substantial changes in scope to embrace the globalization of cloud computing, social networks, and data breaches bring new levels of enforcement and heavy fines that will forever shake up EU data protection practices and privacy guidelines.

Often the security associated with data storage systems and supporting infrastructure has been overlooked due to a basic misunderstanding of the inherent risks to data storage ecosystems, leaving data open to compromise from a wide variety of events. The new NIST-sponsored Cyber-Physical Systems (CPS) framework was initiated to define key characteristics for better managing the development and implementation of the physical, computational, and data storage components of both the Industrial Internet and the Internet of Things (IoT) across multiple smart application domains, including energy, healthcare, law enforcement, manufacturing, and transportation.

The brand-new ISO/IEC 27040:2015 standard defines data storage-centric security as the application of physical, technical, and administrative controls to protect storage systems and infrastructure against unauthorized disclosure, modification, or destruction. These controls can be compensatory, corrective, detective, deterrent, preventive, or recovery-oriented in nature.

The rapid adoption of complex software-defined storage (SDS), i.e., the uniting of compute, networking, storage, and virtualization into a hyper-converged storage solution, became a top data center trend impacting both data security and data recovery strategies in 2015. Although SDS simplifies rapid provisioning, eases implementation, adds redundancy, and provides significant savings in cost, power, and space, data storage-centric security remains a significant gap in the SDS infrastructure.

Due to superior accessibility, capacity on demand, flexibility, and lower overall IT costs compared to legacy online compute and data storage methodologies, cloud computing has quickly become a mainstay worldwide. Yet, just like traditional online compute/storage methodologies, cloud computing has its own set of unique data security issues. Mitigating risks before and throughout a cloud adoption is the number one imperative among CIOs/CISOs/DPOs as they transition applications and data to the cloud. The decision to move to the cloud depends on the sensitivity of the data and applications, the service-level agreement, and the overall cloud security infrastructure, and ultimately on whether the business value offsets the risks.

According to a recently released 2016 Trend Micro security report, despite the need for a Data Protection Officer (DPO) or Chief Information Security Officer (CISO), fewer than 50% of enterprise organizations will have one, or a budget for one, by the end of 2016. With the EU GDPR, coupled with the ISO 27040 data security standard, mandating a significantly higher degree of data protection, a DPO/CISO role dedicated solely to ensuring the integrity of data within and outside the enterprise is a wise investment. With this higher degree of awareness, legislation, and technology around data storage-centric security, we will begin to see a proactive shift in enterprise policies, practices, and strategies that brings effective protection to the storage infrastructure.

Public safety is now a concern of every commercial enterprise, municipality, school and university. High-resolution video surveillance and law enforcement body-worn cameras (BWC) are generating more long-term video storage requirements than ever before. Enterprise IT must be able to balance a budget for both cameras and a secure infrastructure that enables easy, yet secure, data access. A wide variety of new BWC, chain-of-custody, evidence management and surveillance technology solutions are blossoming as new local, state and federal budget resources are being made available in 2016.

In the first quarter of 2015, IDC reported that 28.3 exabytes (28.3 billion gigabytes) of data storage capacity were shipped worldwide. The largest share (23%) of this spending was on server-based storage and hyperscale (SDS-architecture) cloud infrastructures, while traditional external storage arrays fell significantly and were replaced by all-flash and hybrid flash (NAND/HDD) arrays. Less than 0.05% of all these storage products shipped employed Self-Encrypting Drive (SED) technology, while almost 90% of all flash arrays shipped were SED capable. SEDs offer FIPS 140-2 compliant security without the overhead of a software-based encryption scheme, coupled with self-describing encryption key management capability, making them a valued component of the secure data storage infrastructure.

Over the next several months throughout 2016, we will delve more deeply into the practical application of specific secure storage technologies, why and how to put security directly into the physical storage technology, the advantages and disadvantages of specific data storage technologies, cost analysis, and more. Stay tuned.

Anyone can build an SD card, but not all SD cards are created equal

SD cards and microSD cards of all varieties permeate the consumer market. The price is continually dropping, and they’re mass-produced in huge factories in Asia. The cost of materials is so low that you may start to wonder: is there any way to ensure you receive a high-quality SD card for your automotive, medical, body-worn camera, or other demanding product?

Do you really want consumer SD or microSD cards for these projects? Not if storage is a critical component of your solution. You’re dealing with narrow requirements, and you don’t have room for mistakes and equipment malfunctions. Home users of electronic equipment like SD cards can work around the occasional dud or random defective piece, but working in a professional industry is a different ballgame. What to a casual user is simply an annoyance can make the difference between success and failure, and the stakes aren’t small. You need to get things right the first time.

That’s why industrial SD and microSD cards aren’t an option for those working in certain industries—they’re a necessity. Not all SD cards are created equal, and comparing an industrial SD card with one bought off the Walmart shelf is like comparing a grocery store hot dog with fine Paris cuisine. The cards you buy off store shelves are produced with minimal quality control, and there’s no consistency even among cards from the same manufacturer. You might test a hundred and find one you really like, but there’s no guarantee that the next one you buy with the same labeling is made of the same components or has anything like the same capability. There’s no telling what will happen when you put it under stress.

What about DIGISTOR industrial SD cards? These come with a price tag, but there’s nothing arbitrary about the price: you get what you pay for. In this case, it’s high-quality MLC/SLC NAND, a wide temperature range, and Consistency spelled with a capital C. It means you don’t have to worry about the bill of materials (BOM) changing on you—and that’s huge. If you build an application around one of our SD or microSD cards, you can be confident that the cards will keep working in future instances of the application: no unpleasant surprises, no need for frequent retests.

Using an industrial SD or microSD card means you get quality and you get control. You get to utilize the resources within the controller. You have control over the BOM. Nothing can change without your say-so, and you know exactly what the capabilities of your cards are in every instance.

We’d be happy to share more with you about our industrial-quality SD cards—just call us for more info. DIGISTOR stands for reliability and quality, and SD cards for industrial applications are one case where those two things really do matter.

Bringing Your Data Home

You had your picture archive safe on Flickr, your documents on Dropbox, and a running archive of your devices on Apple’s iCloud. But when something happens to one of these services—like the two-day Dropbox downtime—you wonder if keeping your archives in cloud storage really is the best way to go. Cloud storage, no matter how respected the provider, is prone to downtime. And having your precious files suddenly disappear is not something you can take with equanimity.

How to Make a Smooth Switch From Cloud Storage to Home Data Archive Options
There’s something about having all that data available at home, in an archive of Blu-ray discs or a storage drive; even if all of today’s big web companies go bankrupt, you’ve nothing to worry about. But what is the best storage medium, and how do you make the switch? Isn’t it too much work to be feasible? Bringing your data home may not be a half-hour job, but if you do your planning first, it can be a smooth, easy run and not the huge headache that otherwise threatens.

Your first task is researching which type of storage device to use. Over the years you’ve probably accumulated more than a small amount of data, so your archive solution will need to have high capacity. You also want it to be reliable and long-lasting, and you want to be able to add to it periodically. Should you buy a nice high-capacity hard disk drive, or is shelling out the bucks for a state-of-the-art solid-state drive the way to go?

The answer is—neither! Hard disk drives and solid-state drives are both wonderful in their places, but for a home archive you can’t do better than Blu-ray discs. Unlike hard disk drives, which have lots of moving parts that are prone to breakage, a Blu-ray disc is simply a ‘page’ of written information—cold storage, if you will. Unlike solid-state drives, where data can slowly deteriorate if the drive sits unused, the data on your Blu-ray discs can be left in a drawer for years and only read when you want what you’ve archived.

Blu-ray discs are affordable, and they won’t take up much room. Over the years you could accumulate a collection of these discs, which can be stored conveniently in a small cabinet or magazine.

You can buy a quality external Blu-ray burner for a very reasonable price, and if you get it from us at Digistor, it’ll come with a program called Rewind™—software that makes archiving super-easy on Windows or OS X. You’ll need to buy your actual discs as well, of course—a set of ten 25GB or 50GB discs is a good place to start.

When you’ve settled on your storage device and ordered your equipment, the next thing to do is figure out how to reclaim your data from cloud storage. Some cloud storage solutions make export super-easy; from others, it is a pain, but it’s better to do it now than five years from now, when you’ll have even more to deal with! If you’re looking at long download times, you may want to set up the process in the evening and let it run overnight. Make sure you have room on your computer for everything you’ll be downloading. If you don’t, set up an external hard disk for temporary storage. You can always do it in parts, downloading one disc’s worth of archive material at a time.

Ready? Push that download button, and watch that data materialize out of thin air and come to solid existence on your home PC. When it’s all there, plug in your Blu-ray burner, stick a disc in, and open Rewind™. Making a running archive of your data could scarcely be easier. Choose a name for your archive, select your files, click ‘Archive It!’, and let the burn begin!

Then there is nothing left to do but organize your Blu-ray stash and file it somewhere safe and out of the way. Ideally, you’d make two identical archives, one for home and one for an alternate location. Disaster doesn’t strike often, but when it does, it’s well to be prepared.

For an extra safeguard, you can always keep your files up in your old web repository as well. Cloud solutions are wonderful in their place as a way to access specific data from a wide variety of locations. They’re also wonderful as a quick backup of small files in case of natural disasters such as tornadoes and fires. But for an all-purpose general archive of all your data, pictures, and information, nothing beats a well-organized home-based storage center, like your new mini-cabinet of Blu-ray discs.

What You’re Paying for When You Buy SSD Drives Designed for Professional Video Shoots

Sure, you can get an SSD that looks as though it ought to fit your video camera for fairly cheap on eBay or off the shelf. So what makes a “professional video” SSD, well, professional?

To begin with, not all SSDs are compatible with a high-end video camera like those from Blackmagic Design.

Some don’t fit the camera: a standard 7mm SSD can differ enough from the slot to either keep the drive from going in at all or let it slip around once it’s in place. Most newly released SSDs aren’t designed with cameras in mind and are built to be as thin as possible, and the extra space inside the camera can cause rattling and additional wear on the SATA connection.

Others have firmware that just doesn’t work with your camera, interrupting your workflow with an inability to record or dropped frames every time you try to shoot an important video.

That’s why brands like Blackmagic supply their customers with a list of approved SSDs that have been tested and found to work. These are higher-end SSDs that have been rigorously tested to ensure you can depend on them—and we’re proud that our DIGISTOR Professional Video SSD series is included on that list.

But they aren’t just another entry on that list. We’ve built them to be something special.

What is it that sets DIGISTOR Professional Video SSD Drives apart?
DIGISTOR Professional Video SSDs aren’t just compatible with your Blackmagic camera; they’re made to function with the camera as if they were born together. You can take your DIGISTOR Professional Video SSD straight out of the box, stick it in your camera, and expect it to work immediately. Contrast that with the formatting, reformatting, and extensive fiddling you can expect if you use another SSD, and you’ll start to appreciate the synergy we’ve worked for.

Additionally, here’s an SSD series that’s all about video. (In fact, it’s the first and only!)

See Also: Top 5 things cinematographers love about our Professional Video SSDs

DIGISTOR Professional Video SSDs aren’t just drives co-opted for filming needs; they’re designed for filming in 2.5K RAW and 2.5K and 4K ProRes, along with our special 1TB SSD designed for the 4K RAW and ProRes (HQ) 422 formats. Extensively tested with Blackmagic Cinema and Production Cameras, our SSDs do more than support the equipment preferred by professional filmmakers. Powerful, reliable, and durable, DIGISTOR Professional Video SSDs aim to make a difference in your filming experience.

Bottom line? Made-for-PC or bottom-shelf SSDs may save you a few dollars up front, but there’s a chance you could be throwing the entire cost away (not to mention the price of lost work!) if one fails to meet your needs.
