Storage Vendors - Technology Impact of the Last 10 Years
Arcserve
Sam Roguine, backup, DR and ransomware prevention evangelist
Attacks affecting critical infrastructure made ransomware lethal
The past decade has seen cyberattacks emerge as a major threat to businesses and their data. For companies in critical sectors like oil and gas, among others, it has become clear that the impact of a cyberattack can reach far beyond data loss. Cybercriminals have targeted the systems most essential to a company’s bottom line and demanded a payout in return for a decryption key. To maximize their chances, these criminals have focused on critical systems and on industries essential to keeping society up and running. Companies have therefore had to expand their data protection and security protocols.
Minimizing downtime
In critical industries, one of the main concerns is keeping downtime to a minimum to prevent widespread impact. As a first step, companies should define recovery point and recovery time objectives for each system and application in their network. Categorizing applications by risk and determining which would have the biggest negative impact if not recovered quickly enables IT teams to chart a recovery hierarchy.
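As a rough illustration of that tiering exercise, the sketch below (in Python) ranks a few hypothetical applications into recovery tiers by RTO, RPO, and business impact; every name, number, and threshold is invented for the example rather than drawn from Arcserve's guidance.

    # Hypothetical tiering sketch: tighter RTO/RPO and higher business impact
    # recover first. Values below are illustrative placeholders only.
    from dataclasses import dataclass

    @dataclass
    class App:
        name: str
        rto_hours: float   # maximum tolerable downtime
        rpo_hours: float   # maximum tolerable data-loss window
        impact: int        # 1 (minor inconvenience) .. 5 (safety/revenue critical)

    apps = [
        App("scada-historian", rto_hours=1, rpo_hours=0.25, impact=5),
        App("billing", rto_hours=8, rpo_hours=4, impact=4),
        App("intranet-wiki", rto_hours=72, rpo_hours=24, impact=1),
    ]

    def tier(app: App) -> str:
        """Assign a recovery tier from impact and recovery objectives."""
        if app.impact >= 4 and app.rto_hours <= 4:
            return "Tier 1 - restore immediately"
        if app.rto_hours <= 24:
            return "Tier 2 - restore within a day"
        return "Tier 3 - restore after critical systems"

    # Chart the hierarchy: most urgent systems first.
    for app in sorted(apps, key=lambda a: (a.rto_hours, -a.impact)):
        print(f"{app.name:18s} RTO={app.rto_hours}h RPO={app.rpo_hours}h -> {tier(app)}")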
While losing data wasn’t always deemed as serious as an attack that disrupts critical operations like power and gas, the importance of safeguarding data has become clear. It may contain information vital to the recovery process or be subject to regulatory compliance requirements, another major disruptor of recent years. Many industries turned to storing backups separately from the main network to help ensure they remained clean.
By integrating cybersecurity with data protection, companies were able to make the recovery process smoother and streamline IT efforts. A two-pronged approach also reduced the time between detection of an attack or breach and kickstarting backup and recovery protocols, making attacks less damaging.
Critical infrastructure workers became critical
Over the past decade, cyberattacks haven’t just been an IT issue. They’ve also had a significant business impact, and members of every department have become involved in the planning process. Clear communication is particularly important, as attacks in these sectors could be life-threatening.
In 2020, MITRE shared an update to its ATT&CK knowledge framework that specifically addresses the tactics cybercriminals use when attacking ICS – and tips for how to defend against them. Many IT and security professionals working in critical infrastructure now refer to this framework when building attack response plans and securing key systems. Businesses have also recognized the importance of regularly updating and testing plans so employees know how to respond to a real crisis.
As digital transformation continues to accelerate and attack surfaces expand, the upcoming decade, just like the previous, is sure to bring a variety of new cyber threats. Cyberattacks affecting critical infrastructure will only increase as cybercriminals realize just how damaging they can be, meaning IT pros working in these critical industries will need to be prepared.
FujiFilm
Rich Gadomski, head of tape evangelism
The past 10 years have been marked by explosive data
growth, and the tape industry has experienced a renaissance thanks to
significant advancements in capacity, reliability, performance, and
functionality that have led to new applications and key industry adoption.
Capacity
- In terms of capacity, the decade started for LTO
with LTO-5 at 1.5TB native capacity and culminated most recently with LTO-8
at 12TB and LTO-9 soon to be delivered at 18TB.
- Enterprise tape formats started the decade at 1TB
native and are currently at 20TB native.
- Barium Ferrite magnetic particles became a key
enabler for multi-terabyte tapes and were demonstrated by IBM and FujiFilm
in 2015 to have the potential to achieve 220TB on a single tape cartridge.
This signaled that tape technology had no fundamental areal density
limitations for the foreseeable future.
- By the end of the decade, IBM and FujiFilm
demonstrated the ability to achieve a record areal density of 317Gb per
square inch using the next generation of magnetic particles, Strontium
Ferrite, with potential cartridge capacity of 580TB.
Reliability and Performance
- During the decade, tape achieved the highest
reliability rating as measured by bit error rate at 1×10^19,
even better than enterprise HDD at 1×10^16.
- Data transfer rates for tape also improved from
140MB/s in 2010 to an impressive 400MB/s.
- LTFS provided an open tape file system with media
partitions for faster “disk-like” access and ease of interchangeability,
making LTO a de facto standard in the M&E industry.
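To make that “disk-like” access concrete: once a cartridge is mounted through an LTFS implementation, ordinary file operations work against it. The mount point and file names below are hypothetical, and the sketch assumes the tape has already been formatted and mounted.

    # Minimal sketch assuming an LTFS-mounted cartridge at a hypothetical path.
    import shutil
    from pathlib import Path

    LTFS_MOUNT = Path("/mnt/ltfs")             # hypothetical mount point
    archive_dir = LTFS_MOUNT / "archive"
    archive_dir.mkdir(parents=True, exist_ok=True)

    # Copy a finished asset onto tape with ordinary file operations.
    shutil.copy("final_cut_v3.mov", archive_dir / "final_cut_v3.mov")

    # Browse the cartridge contents the same way you would browse a disk.
    for entry in archive_dir.iterdir():
        print(entry.name, entry.stat().st_size)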
New Applications and Key Industry Adoption
- Storing objects on tape became a reality with object
archive software solutions offering S3 compatibility; objects can now move
to and from tape libraries in their native object format (see the sketch
after this list).
- The concept of active archiving grew in popularity
with tape as a key component complementing flash, HDD and cloud for
cost-effectively maintaining online archives.
- Tape was recognized for its ease of removability and
portability, providing air gap protection in the escalating war against
cybercrime.
- Major U.S. hyperscalers began to rely on tape during
the decade for both backup and deep archive applications. In one
well-publicized example, Google restored Gmail data lost in a February 2011
outage from its tape backups. Microsoft adopted tape for Azure later in the
decade. Tape became firmly established as a competitive advantage for these
and other hyperscalers based on its scalability, long archival life, lowest
TCO, low energy consumption, and air gap security.
- With this steady technological advancement over the
last decade, tape has been recognized for its complementary value to flash,
HDD and cloud in tiered storage strategies for managing data in the
zettabyte age.
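Picking up the object-archive bullet above, here is a minimal sketch of writing and recalling an object through an S3-compatible tape archive; the endpoint, bucket, keys, and credentials are hypothetical placeholders rather than any particular product's interface.

    # Sketch only: any S3-compatible object archive fronting a tape library
    # would be addressed with standard S3 client calls like these.
    import boto3

    s3 = boto3.client(
        "s3",
        endpoint_url="https://tape-archive.example.com",  # hypothetical endpoint
        aws_access_key_id="ARCHIVE_KEY",                  # placeholder credentials
        aws_secret_access_key="ARCHIVE_SECRET",
    )

    # Write the object; the archive software migrates it to tape in its
    # native object format.
    s3.upload_file("frame_0001.exr", "media-archive", "show42/frame_0001.exr")

    # Read it back later; the archive recalls it from tape before serving it.
    s3.download_file("media-archive", "show42/frame_0001.exr", "restored_frame_0001.exr")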
Quantum
Eric Bassier, senior director, products
The last decade has delivered tremendous storage
advancements and innovations. Flash has become widely adopted and
commercialized, and NVMe development now allows the industry to unlock the
performance of flash storage. Infrastructures have evolved from direct-attached
to network-attached and now hyperconverged. We saw the development and broad
adoption of public cloud infrastructure-as-a-service and software-as-a-service
bring new elasticity levels to compute and storage resources. And data continued
to grow at exponential rates, with 90% of all the world’s data being generated
in recent decades.
Unstructured data, in particular video, digital images,
and other ‘rich’ data generated by machines and devices, is growing at rates of
30-60% per year, and will soon represent 80% of all the planet’s data.
It’s this unstructured data that’s driving a revolution
in how businesses are thinking about their storage infrastructure. Not only is
it growing exponentially, it lives everywhere. It’s generated ‘at the edge’ – in
a lab, on a city street, on the manufacturing floor, in space – and then needs
to be processed, analyzed, and consumed. In most industries, this data is
central to an organization’s business or mission.
If the last decade was about storage innovation and
management, this coming decade is about data innovation and management.
Organizations will use AI analytics, index data and add context to it, and layer
on metadata to enrich all of these valuable files and objects. We’ll see
tremendous advancements and adoption of AI.
Many organizations will start to treat data as the
valuable asset it is. It will be curated, well organized, and tagged in a way
that makes sense to business users. Much of this data will need to be kept
indefinitely – stored, protected, and accessible to users and applications. Data
must remain at our fingertips.
The biggest emerging challenges of the next decade are how to store and
process data, discern what’s valuable, and unlock the business value to
drive the next discovery, innovation, or business breakthrough.
Spectra Logic
Matt Starr, CTO
The Rise of Ransomware
From the early days of computer hacking, threat actors
have always attempted to steal data from an organization or business – which is
one of the primary purposes of computer viruses. Over the last 10 years, the
goal has changed from stealing data to kidnapping the data. Today,
cybercriminals use ransomware to hold an organization’s data hostage with the
threat of destroying the data or selling it on the dark web if their monetary
demands are not met in a timely fashion.
The rise of ransomware puts a premium on data and data
protection. Over the last decade, as threat actors have become more
sophisticated, they have realized that money can be made by electronically
infiltrating an organization’s IT department and encrypting data that might
include trade secrets, IP, customer lists, etc. while threatening to sell or
destroy that critical data unless a ransom is paid – usually in bitcoin to
prevent tracking by authorities. The larger the company, the more valuable
the haul in many cases. And because an organization’s data is worth more to
that organization than to anyone else, it will likely pay more to get it back
than any buyer on the black market would.
IT workflows and systems have become faster and more
resilient – enabling organizations to improve their IT infrastructures and
better serve their internal and external customers. These complex networks,
however, also leave organizations more vulnerable to ransomware encryption:
once the systems on the network are attacked, they are often encrypted at the
speed of disk. Only devices that store data out of the network stream, such
as tape, are safe from ransomware’s impact.
The Old Mantra that ‘Tape is Dead’ is Dead
For the last decade, just as in decades before that,
the industry has been pronouncing the death of tape storage. Today, however,
tape is back in growth mode according to most analysts assessing the market.
Tape, with recent R&D innovations and deployments, now sits poised for another
10-plus years of success. Unlike its nearest competitor, the spinning archive
disk, tape’s R&D hurdles are well understood; the physics are known and don’t
need to be invented. Even most of the cloud storage providers know that tape has
to be a part of the ecosystem from a cost and reliability standpoint. As cloud
vendors develop cold tiers of storage at a fraction of a cent per month per
gigabyte, each forcing the other to a lower and lower price, tape, with its low
total cost of ownership, high reliability, scalable capacity and low power and
cooling costs, is the logical solution. And tape’s traditional usage for archive
in industries such as oil and gas, M&E, government, scientific research, and
more, is seeing increased deployment. With the advent of ransomware, tape is one
of the few storage platforms that provides an air gap from ransomware: it sits
outside the network stream, so it cannot be infected. This means that, as data
on disk is being encrypted, the data backed up and archived to tape remains
safe and ready to enable recovery of an organization’s IT platform.
Cloud Storage Comes into its Own
Over the last decade, as cloud has gone from a
discussion to a reality, more and more companies look to the cloud for their “IT
solution”, whatever that solution may be. In some cases, it is moving from a
Capex to an Opex model, and in other cases, it is the idea that cloud must be
cheaper than on-premises storage due to scale. Customers who have jumped
headlong into cloud storage have mostly discovered the bitter pill that charges
and other hidden fees drive the cost of cloud storage well beyond their
expectations and their original budgets. This is not to say that storing data
in the cloud is wrong or bad; it is that so many companies have jumped on the
cloud bandwagon without doing the same due diligence required for any onsite
architectural change. Like getting caught up in a wave of excitement without
much thought, some users ran to the cloud with their data in hand and are only
now learning the true cost of that move. Customers who have used cloud storage,
still have it, or have left it now understand the nuances of a cloud contract
and cloud pricing model (including the inevitable and undesirable ‘cloud
lock-in scenario’) versus a capital purchase. Those users will be more cautious and careful
over the next decade as they enhance their infrastructures, perhaps with a
hybrid cloud approach that offers the best of both worlds – the scalability and
accessibility advantages of cloud with the control, affordability and
flexibility of on-premises storage.
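To make the hidden-fee point concrete, a small, purely illustrative calculation follows; the capacity, rates, and restore fraction are assumed placeholders, not any provider's actual pricing.

    # Illustrative arithmetic only: retrieval and egress fees can rival the
    # headline per-GB storage price. All rates below are assumptions.
    capacity_gb       = 500_000   # 500TB archived
    storage_rate      = 0.004     # $/GB-month, assumed cold-tier rate
    egress_rate       = 0.09      # $/GB pulled back out of the cloud, assumed
    retrieval_rate    = 0.02      # $/GB recall fee, assumed
    restored_fraction = 0.20      # suppose 20% of the archive is restored this year

    storage_cost = capacity_gb * storage_rate * 12
    restore_cost = capacity_gb * restored_fraction * (egress_rate + retrieval_rate)

    print(f"Annual headline storage cost: ${storage_cost:,.0f}")
    print(f"One year of restores (20%):   ${restore_cost:,.0f}")
    print(f"Total:                        ${storage_cost + restore_cost:,.0f}")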