Data storage

A flash drive can be rewritten nearly a limitless number of times and is unaffected by electromagnetic interference (making it ideal for carrying through airport security). Because of this, flash drives have entirely replaced floppy disks for portable storage. With their large storage capacity and low cost, flash drives are now on the verge of replacing CDs and DVDs.


Data Storage

Data storage is the containment of any type of information in a particular location. Though today it is typically used to describe storing applications, files, and other computing resources, it has existed as long as humans have. Data has been commonly stored and managed by memorizing, carving, writing, recording sound and video, printing type, taping, programming, creating files, and powering servers.

It was estimated that the world would create 44 zettabytes of data in 2020; that’s 687 billion times more data than was contained in all the scrolls of the Great Library of Alexandria, the largest library of the ancient world. And that number grows every year. Storing, managing, and securing all that data requires enormous computing power and physical storage devices such as hard drives, flash memory, solid-state drives, and data tapes, whether on laptops, mobile devices, or servers in a cloud or data center. It also makes issues such as data storage integrity, reliability, and compatibility extremely important; nothing less than preserving the record of our civilization is at stake.

Data storage devices

Direct-attached storage (DAS), sometimes called direct area storage, is as the name implies: storage in the immediate area, directly connected to the computer accessing it. Often, that computer is the only machine connected to it. DAS can provide decent local backup services, but sharing is limited. DAS devices include floppy disks, optical discs (compact discs and DVDs), hard disk drives (HDDs), flash drives, and solid-state drives (SSDs).

Network-based storage allows more than one computer to access it through a network, making it better for data sharing and collaboration. Its off-site storage capability also makes it better suited for backups and data protection. Two common network-based storage setups are network-attached storage (NAS) and storage area network (SAN).

NAS is often a single device made up of redundant storage containers or a redundant array of independent disks (RAID). SAN storage can be a network of multiple devices of various types, including SSD and flash storage, hybrid storage, hybrid cloud storage, backup software and appliances, and cloud storage. The key difference is that NAS presents file-level storage over a standard Ethernet network, while a SAN provides block-level storage over a dedicated high-speed network.

A Brief History of Data Storage


Punch cards were the first effort at data storage in a machine language. They were used to communicate information to equipment before computers were developed. The punched holes originally represented a sequence of instructions for pieces of equipment, such as textile looms and player pianos; the holes acted as on/off switches. Basile Bouchon developed the punch card as a control for looms in 1725.

In 1837, a little over 100 years later, Charles Babbage proposed the Analytical Engine, a mechanical general-purpose calculator that used punch cards for instructions and responses. Herman Hollerith later developed this idea, having the holes represent not just a sequence of instructions, but stored data a machine could read.

He developed a punch card data processing system for the 1890 U.S. Census and founded the Tabulating Machine Company in 1896. By 1950, punch cards had become an integral part of American industry and government; the warning “Do not fold, spindle, or mutilate” originated with them. Punch cards were still used regularly until the mid-1980s, and lingered afterward in applications such as standardized test forms and voting ballots.

In the 1960s, “magnetic storage” gradually replaced punch cards as the primary means for data storage. Magnetic tape was first patented in 1928, by Fritz Pfleumer. (Cassette tapes were often used for homemade “personal computers,” in the 1970s and 80s.) In 1965, Mohawk Data Sciences offered a magnetic tape encoder, described as a punch card replacement. By 1990, the combination of affordable personal computers and “magnetic disk storage” had made punch cards nearly obsolete.

In the past, the terms “data storage” and “memory” were often used interchangeably. At present, however, data storage is an umbrella term that includes memory. Data storage is often considered long term, while memory is frequently described as short term.

In 1948, Professor Frederic Williams and colleagues developed the first random-access memory (RAM) for storing frequently used programming instructions, in turn increasing the overall speed of the computer. Williams used an array of cathode-ray tubes (a form of vacuum tube) to act as on/off switches and digitally store 1,024 bits of information.

Data in RAM (sometimes called volatile memory) is temporary and when a computer loses power, the data is lost, and often frustratingly irretrievable. ROM (Read Only Memory), on the other hand, is permanently written and remains available after a computer has lost power.

In the late 1940s, magnetic core memory was developed and patented, and over the following decade became the primary way early computers wrote, read, and stored data. The system used a grid of current-carrying wires (address and sense wires), with doughnut-shaped magnets (called ferrite cores) circling the points where the wires intersected. Address lines polarized a ferrite core’s magnetic field one way or the other, creating a switch that represents a zero or one (on/off). The arrangement of address and sense wires feeding through the ferrite cores allows each core to store one bit of data. The bits are then grouped into units, called words, which form a single memory address when accessed together.
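The grouping of one-bit cores into addressable words can be pictured with a small simulation. This is an illustrative model only (the class and method names are invented, and no attempt is made to mimic the electrical behavior of real core memory):

```python
# Minimal sketch of word-addressed bit storage, loosely modeling how
# core memory grouped individual bits (cores) into addressable words.
class CoreMemoryModel:
    def __init__(self, num_words, word_size=16):
        self.word_size = word_size
        # each word is a row of bits (0/1), one per "core"
        self.words = [[0] * word_size for _ in range(num_words)]

    def write_bit(self, address, bit_index, value):
        # polarizing one core one way or the other: store a 0 or a 1
        self.words[address][bit_index] = 1 if value else 0

    def read_word(self, address):
        # accessing all cores of a word together yields its value
        bits = self.words[address]
        return int("".join(str(b) for b in bits), 2)

mem = CoreMemoryModel(num_words=8, word_size=8)
mem.write_bit(3, 7, 1)   # set the least-significant bit of word 3
print(mem.read_word(3))  # prints 1
```

Each core stores a single on/off state, but software only ever sees whole words, which is why `read_word` gathers all the bits of one address at once.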

In 1953, MIT purchased the patent and developed the first computer to use this technology, the Whirlwind. Magnetic core memories, being faster and more efficient than punch cards, became popular very quickly. However, manufacturing them was difficult and time-consuming: it involved delicate work in which assemblers with steady hands, working under microscopes, tediously threaded thin wires through very small holes.

Twistor magnetic memory was invented in 1957 by Andrew Bobeck. It creates computer memories using very fine magnetic wires interwoven with current-carrying wires. It is similar to core memory, but the wrapped magnetic wires replace the circular magnets, and each intersection on the network represents one bit of data. The magnetic wires were specifically designed to allow magnetization only along specific sections of their length, so only designated areas of the Twistor could be magnetized and change polarization (on/off).

Bell Labs promoted Twistor technology, describing it as superior to magnetic core memory: the system weighed less, required less current, and was predicted to be much cheaper to produce. The Twistor concept led Bobeck to develop another short-lived magnetic memory technology in the 1970s, known as bubble memory: a thin magnetic film that stores data in small magnetized areas that look like bubbles.

Data storage

Private cloud storage is a service model for provisioning storage to users in an organization. This service model offers storage on-demand, with the same private cloud capabilities: on-demand access, resource pooling, elasticity and metering.

Data Backup in Depth: Concepts, Techniques, and Storage Technologies

In an increasingly digitized business landscape, data backup is vital for the survival of an organization. You can get hacked or ransomed, and lose your data to thieves who’ll sell your trade secrets to the highest bidder. Injected malware can corrupt your hard-earned information. Disgruntled employees or other insider threats can delete your valuable digital assets. Can you recover from data loss?

Data backup is a practice that combines techniques and solutions for efficient and cost-effective backup. Your data is copied to one or more locations, at pre-determined frequencies, and at different capacities. You can set up a flexible data backup operation using your own architecture, or make use of available Backup as a Service (BaaS) solutions, combining them with local storage. Today, there are plenty of corporate storage TCO tools to help you calculate costs, avoid data loss, and prevent data breaches.

What Is a Data Backup?

Data backup is the practice of copying data from a primary to a secondary location, to protect it in case of a disaster, accident, or malicious action. Data is the lifeblood of modern organizations, and losing it can cause massive damage and disrupt business operations. This is why backing up your data is critical for all businesses, large and small.

Typically, backup data includes all the data necessary for the workloads your server is running: documents, media files, configuration files, machine images, operating systems, and registry files. Essentially, any data you want to preserve can be stored as backup data.

  • Backup solutions and tools—while it is possible to back up data manually, to ensure systems are backed up regularly and consistently, most organizations use a technology solution to back up their data.
  • Backup administrator—every organization should designate an employee responsible for backups. That employee should ensure backup systems are set up correctly, test them periodically and ensure that critical data is actually backed up.
  • Backup scope and schedule—an organization must decide on a backup policy, specifying which files and systems are important enough to be backed up, and how frequently data should be backed up.
  • Recovery Point Objective (RPO)—RPO is the amount of data an organization is willing to lose if a disaster occurs, and is determined by the frequency of backup. If systems are backed up once per day, the RPO is 24 hours. The lower the RPO, the more data storage, compute and network resources are required to achieve frequent backups.
  • Recovery Time Objective (RTO)—RTO is the time it takes for an organization to restore data or systems from backup and resume normal operations. For large data volumes and/or backups stored off-premises, copying data and restoring systems can take time, and robust technical solutions are needed to ensure a low RTO.
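The relationship between backup frequency and RPO can be sketched in a few lines. This is a simplified illustration assuming a strictly periodic backup schedule; the function names are invented for this example:

```python
# Sketch: relating backup frequency to Recovery Point Objective (RPO).
# Assumes backups run on a fixed periodic schedule.
def worst_case_rpo_hours(backups_per_day):
    # If backups run N times per day, the most data you can lose is
    # everything written since the last backup: 24/N hours' worth.
    return 24 / backups_per_day

def meets_rpo(backups_per_day, rpo_target_hours):
    return worst_case_rpo_hours(backups_per_day) <= rpo_target_hours

print(worst_case_rpo_hours(1))  # daily backups -> 24-hour RPO, prints 24.0
print(meets_rpo(4, 12))         # backups every 6 hours meet a 12-hour RPO, prints True
```

In practice the achievable RPO also depends on how long each backup takes to complete and transfer, which is why lower RPO targets demand more storage, compute, and network resources.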

How does it work?

Like on-premise storage networks, cloud storage uses servers to save data; however, the data is sent to servers at an off-site location. Most of the servers you use are virtual machines hosted on a physical server. As your storage needs increase, the provider creates new virtual servers to meet demand.

Typically, you connect to the storage cloud either through the internet or a dedicated private connection, using a web portal, website, or a mobile app. The server with which you connect forwards your data to a pool of servers located in one or more data centers, depending on the size of the cloud provider’s operation.

As part of the service, providers typically store the same data on multiple machines for redundancy. This way, if a server is taken down for maintenance or suffers an outage, you can still access your data.
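That redundancy scheme can be illustrated with a toy replicated store: the same object is written to several servers, and reads fall back to any surviving replica. This is a conceptual sketch with invented names, not how any particular cloud provider implements replication:

```python
# Sketch of replica-based redundancy: the same object is written to
# several servers so a single outage doesn't make it unreachable.
class ReplicatedStore:
    def __init__(self, num_servers, replicas=3):
        self.servers = [dict() for _ in range(num_servers)]
        self.replicas = replicas
        self.down = set()  # indices of servers currently offline

    def _locations(self, key):
        # pick replica locations deterministically from the key
        return [(hash(key) + i) % len(self.servers)
                for i in range(self.replicas)]

    def put(self, key, value):
        for idx in self._locations(key):
            self.servers[idx][key] = value

    def get(self, key):
        # read from the first replica whose server is still up
        for idx in self._locations(key):
            if idx not in self.down and key in self.servers[idx]:
                return self.servers[idx][key]
        raise KeyError(key)

store = ReplicatedStore(num_servers=5, replicas=3)
store.put("report.pdf", b"...contents...")
store.down.add(hash("report.pdf") % 5)  # take one replica's server offline
print(store.get("report.pdf") == b"...contents...")  # prints True
```

Real systems add consistency protocols, repair, and geographic placement on top, but the core idea is the same: more than one copy, on more than one machine.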

  • Public storage clouds: In this model, you connect over the internet to a storage cloud that’s maintained by a cloud provider and used by other companies. Providers typically make services accessible from just about any device, including smartphones and desktops, and let you scale up and down as needed.
  • Private cloud storage: Private cloud storage setups typically replicate the cloud model, but they reside within your network, leveraging a physical server to create instances of virtual servers to increase capacity. You can choose to take full control of an on-premise private cloud or engage a cloud storage provider to build a dedicated private cloud that you can access with a private connection. Organizations that might prefer private cloud storage include banks or retail companies due to the private nature of the data they process and store.
  • Hybrid cloud storage: This model combines elements of private and public clouds, giving organizations a choice of which data to store in which cloud. For instance, highly regulated data subject to strict archiving and replication requirements is usually more suited to a private cloud environment, whereas less sensitive data (such as email that doesn’t contain business secrets) can be stored in the public cloud. Some organizations use hybrid clouds to supplement their internal storage networks with public cloud storage.

Pros and cons

  • Off-site management: Your cloud provider assumes responsibility for maintaining and protecting the stored data. This frees your staff from tasks associated with storage, such as procurement, installation, administration, and maintenance. As such, your staff can focus on other priorities.
  • Quick implementation: Using a cloud service accelerates the process of setting up and adding to your storage capabilities. With cloud storage, you can provision the service and start using it within hours or days, depending on how much capacity is involved.
  • Cost-effective: As mentioned, you pay for the capacity you use. This allows your organization to treat cloud storage costs as an ongoing operating expense instead of a capital expense with the associated upfront investments and tax implications.
  • Scalability: Growth constraints are one of the most severe limitations of on-premise storage. With cloud storage, you can scale up as much as you need. Capacity is virtually unlimited.
  • Business continuity: Storing data offsite supports business continuity in the event that a natural disaster or terrorist attack cuts access to your premises.
  • Security: Security concerns are common with cloud-based services. Cloud storage providers try to secure their infrastructure with up-to-date technologies and practices, but occasional breaches have occurred, creating discomfort with users.
  • Administrative control: Being able to view your data, access it, and move it at will is another common concern with cloud resources. Offloading maintenance and management to a third party offers advantages but also can limit your control over your data.
  • Latency: Delays in data transmission to and from the cloud can occur as a result of traffic congestion, especially when you use shared public internet connections. However, companies can minimize latency by increasing connection bandwidth.
  • Regulatory compliance: Certain industries, such as healthcare and finance, have to comply with strict data privacy and archival regulations, which may prevent companies from using cloud storage for certain types of files, such as medical and investment records. If you can, choose a cloud storage provider that supports compliance with any industry regulations impacting your business.

The 28 Best Enterprise Data Storage Companies for 2022


Solutions Review’s listing of the best enterprise data storage companies is an annual sneak peek of the solution providers included in our Buyer’s Guide and Solutions Directory. Information was gathered via online materials and reports, conversations with vendor representatives, and examinations of product demonstrations and free trials.

Enterprise data storage is a centralized repository for information, which commonly offers data management, protection, and sharing functions. Because enterprises handle massive amounts of business-critical data, they benefit most from storage systems that are highly scalable, offer broad connectivity, and support multiple platforms. There are multiple approaches to data storage to choose from, including storage area networks (SANs), network-attached storage (NAS), direct-attached storage (DAS), and cloud storage. The importance of data storage is underlined by the exponential generation of new data and the proliferation of Internet of Things (IoT) devices.

Newer approaches and technologies currently disrupting the market include hyperconverged storage and flash technologies such as Non-Volatile Memory Express (NVMe). This trend stems from the increased horizontal scalability and reduced latency these methods offer. Storage for containers is also becoming a stronger selling point, as is enterprise storage based on composable and disaggregated infrastructure concepts, which disaggregate individual resources at the hardware level and then compose them at the software level using APIs.

Selecting the best enterprise data storage company to work with can be a daunting task, and we’re here to help. That’s why our editors have compiled this list of the best enterprise data storage companies to consider if you’re looking for a new solution.


Amazon Web Services (AWS) offers a range of IT infrastructure services to enterprises. In addition to storage, the provider’s solutions and products include cloud computing, compute, networking, content delivery, databases, analytics, application services, backup, and archive. AWS provides a variety of cloud storage solutions, such as Amazon Elastic Block Store (Amazon EBS), Amazon Simple Storage Service (Amazon S3), and AWS Backup, among others. Users can select from object, block, and file storage services, as well as cloud data migration options, when choosing their solution. The vendor’s various platforms also support both application and archival compliance requirements.

Caringo is a provider of object-based technology for accessing, storing, and distributing unstructured or file-based data. Its flagship product, Caringo Swarm, provides private cloud storage that enables users to deploy storage clusters without being locked into proprietary hardware. In addition to data storage, the provider offers enterprise IT, medical, high-performance computing, and media and entertainment solutions. Caringo’s storage platform is offered in private, public, and hybrid cloud environments. Users can also scale on-premise with any mix of x86 hardware.

Cloudian is an independent provider of object storage systems, offering S3 compatibility along with a partner ecosystem. The vendor’s flagship solution, HyperStore, provides scalability, flexibility, and economics within the data center. Additionally, Cloudian’s data fabric architecture allows enterprises to store, find, and protect object and file data across sites. These processes can take place both on-prem and in public clouds within a single, unified platform. In 2020, Cloudian HyperStore was recognized as a Gartner Peer Insights Customers’ Choice for Distributed File Systems and Object Storage.

Cohesity consolidates secondary storage silos onto a hyperconverged, web-scale data platform that supports both public and private clouds. The vendor’s storage solution enables users to streamline backup and data protection, then converge file and object services, test/dev instances, and analytics functions into a global data store. Cohesity delivers a single platform, a single GUI, and an app ecosystem, as well as machine learning capabilities. The provider offers two hyperconverged platforms, the C3000 and C4000, as well as its distributed file system, Cohesity SpanFS. In 2020, Cohesity raised $250 million in Series E funding. Additionally, it was named a “Leader” in the GigaOm report on unstructured data management solutions.


A 3-2-1 backup strategy is a method for ensuring that your data is adequately duplicated and reliably recoverable. In this strategy, three copies of your data are kept on at least two different storage media, and at least one copy is stored remotely.
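The rule is simple enough to check programmatically. The sketch below is illustrative (the function and field names are invented), but it captures all three conditions:

```python
# Sketch: checking a backup plan against the 3-2-1 rule.
# Each copy is described by its storage medium and whether it is off-site.
def satisfies_3_2_1(copies):
    total = len(copies)                          # at least 3 copies
    media = {c["medium"] for c in copies}        # on at least 2 media types
    offsite = any(c["offsite"] for c in copies)  # at least 1 copy off-site
    return total >= 3 and len(media) >= 2 and offsite

plan = [
    {"medium": "local_disk", "offsite": False},  # primary data
    {"medium": "nas", "offsite": False},         # on-site backup
    {"medium": "cloud", "offsite": True},        # remote backup
]
print(satisfies_3_2_1(plan))  # prints True
```

Dropping the cloud copy from this plan would fail the check twice: only two copies would remain, and none would be off-site.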


Data Center Storage Companies

These data storage companies are market leaders and wield big influence. They’re the go-to companies for businesses looking to deploy storage area network (SAN) or network-attached storage (NAS) technologies and, increasingly, hybrid cloud storage solutions.

No discussion of the enterprise data storage market is complete without mentioning Dell EMC. Since the blockbuster $67 billion merger of server and PC maker Dell with data storage giant EMC in 2016, the combined company has lived up to EMC’s legacy by remaining atop the external enterprise storage systems market (essentially the arrays that make up a SAN and/or NAS; many models today can pull double duty), according to technology analyst firm IDC. Product lines of note include Isilon NAS storage, EMC Unity hybrid-flash arrays for block and file storage, SC series arrays, and the enduring VMAX family of products.

Hewlett Packard Enterprise and its Chinese joint venture, the New H3C Group, have surpassed Dell EMC in the overall market for enterprise storage systems, but the company still has a lot of ground to cover to catch up to its rival in the traditional storage array segment. Notable product lines include HPE 3PAR StoreServ midrange arrays, entry-level HPE StoreEasy Storage NAS systems, and flash-enabled MSA Storage.

Data Storage Companies: Up and Coming Storage Vendors

Cobalt Iron

Data backup, to be sure, is of absolutely critical importance; backup is the biggest application in the data center for a reason. But that critical importance is what makes backup so challenging: it must be done right. Cobalt Iron’s Compass solution uses automation to ease the constant, challenging workload of managing data backup.

As part of its mission to enable better backup, Cobalt Iron offers the much-sought-after “single pane of glass”: a single dashboard to manage backup. This console can manage the many enterprise backup tasks at any level, including global policy administration. And if there’s a problem with your system, this single console gives you the data you need to troubleshoot it more easily. Compass can be found in many global data centers, including AWS, Google Cloud, IBM Cloud, Alibaba, and Microsoft Azure.

Pavilion Data Systems

You may not have seen the acronym NVMe-oF, but it’s reasonable to assume we will see more of this term in the years to come. It stands for NVMe over Fabric, a solution designed to enable faster speeds in today’s data storage environments. Pavilion Data Systems is a leader in this area, a pioneer in developing this blazingly fast technology.

Yes, this is impressive: the Pavilion Data Platform offers write performance at speeds up to 90 GB per second. Write latency can be as low as 40 microseconds; in other words, remarkably close to instantaneous. The company incorporates Swarm, a decentralized data storage and distribution technology; with Swarm, the Pavilion system can rebuild a solid-state drive in just a few minutes. In a data storage world that must move ever faster, these speed boosts really help.


StorOne

Software-defined storage has gone from being a vague buzzword to a genuinely adopted technology over the last few years, though the term “software-defined” continues to be used in any number of ways. Riding this wave, StorOne offers a software-defined data storage solution that is as flexible as it needs to be to handle any number of storage tasks.

It can be used for (of course) virtual storage, or to handle cloud storage and other secondary storage, hybrid arrays or those ultra-fast all-flash arrays. If your company uses file, block or object storage, StorOne can handle it.
