In the world of archives, tape is the undisputed king. Although technological advances such as DNA and glass storage offer a glimpse into the future, there is currently no alternative that can compete with tape on reliability, longevity and cost.
Businesses still face a number of challenges in managing and storing data over the long term, whether in the cloud or on-premises.
TechRadar Pro spoke with David Trachy, senior manager of emerging markets at storage company Spectra Logic, to find out how hybrid perpetual storage could solve some of the most difficult data problems facing businesses.
What does the future hold for flash? What impact will this have on the storage industry?
The fastest-growing technology in the storage market is still NAND flash. It has the durability and speed characteristics that appeal to both the consumer and enterprise segments, but key to the future of the flash market is the quest for more capacity. Although the transition from planar (2D) to 3D NAND seemed very promising at the time, squeezing out further capacity by adding bits per cell is proving unviable, because adding bits simultaneously reduces the number of times a cell can be programmed, hurting flash endurance over the long term. Another option for increasing flash capacity is to shrink the cell size, but given that 19 nanometers (nm) is as small as the industry plans to produce, and the roadmap already stands at 20 nm, this too looks like a dead end.
The greatest opportunity for increasing flash capacity is to increase the number of layers on the chip; however, there are complex issues involved in building stacks more than 100 layers high. For this and other reasons, no supplier is talking about going beyond 136 layers in a single stack. We therefore predict that future increases in flash capacity will come primarily from string stacking, a technique in which multiple multi-layer flash dies are combined to form a flash chip with more layers. This will mean smaller cost reductions for flash going forward. Systems and cloud service providers are also taking advantage of zone-based interfaces (which allow data to be physically placed in zones that match its performance requirements) to get longer life, better performance and greater capacity out of their flash resources.
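To illustrate the basic idea behind a zone-based interface, here is a minimal Python sketch (the zone size, lifetime classes and API are hypothetical illustrations, not any vendor's actual interface) of grouping writes into zones by expected data lifetime, so that a whole zone can later be reclaimed without copying long-lived data:

```python
from dataclasses import dataclass, field

ZONE_SIZE = 256 * 1024 * 1024  # hypothetical zone size: 256 MiB

@dataclass
class Zone:
    lifetime_class: str                      # e.g. "hot", "warm", "cold"
    used: int = 0
    objects: list = field(default_factory=list)

    def has_room(self, size: int) -> bool:
        return self.used + size <= ZONE_SIZE

class ZonedPlacer:
    """Route each write to a zone holding data of similar lifetime,
    so whole zones can be erased as a unit when their data expires."""

    def __init__(self):
        self.open_zones = {}   # lifetime class -> currently open Zone
        self.full_zones = []

    def write(self, key: str, size: int, lifetime_class: str) -> None:
        zone = self.open_zones.get(lifetime_class)
        if zone is None or not zone.has_room(size):
            if zone is not None:
                self.full_zones.append(zone)
            zone = Zone(lifetime_class)
            self.open_zones[lifetime_class] = zone
        zone.objects.append((key, size))
        zone.used += size

placer = ZonedPlacer()
placer.write("session-log-0001", 8 * 1024 * 1024, "hot")    # short-lived data
placer.write("backup-2021-06", 64 * 1024 * 1024, "cold")    # long-lived data
```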
Which market forces have affected magnetic disk the most? What lies ahead for disk?
Disk drive shipments over the last four quarters fell by approximately 20 percent to 255 million units, compared with 328 million a year earlier. This decline is due to flash technology encroaching on markets where disk was once the only choice. For example, most laptops now use flash storage, and the latest generation of gaming consoles is all flash-based. Despite the deterioration of the 2.5-inch disk class, the 3.5-inch nearline disk class has seen year-over-year growth in both capacity and unit shipments. It now generates more than 50% of disk revenue and is sold primarily to large IT shops and cloud service providers. Developing a single product with a few variations has allowed disk manufacturers to concentrate their resources, letting them remain profitable even as much of their legacy business declines.
With several ongoing advances and a long LTO roadmap, it seems that tape is showing no signs of disappearing. What are the most important examples of tape innovation, and what's next for tape?
Tape is definitely here to stay. It is the perfect tool for long-term archiving. And thanks to its air gap, tape has undoubtedly helped thousands of companies survive ransomware attacks. The world’s largest organizations – including cloud service providers – use tape. In fact, tape is experiencing a resurgence, because there is no storage medium in use in the world today that offers higher density at a lower cost than tape. Period.
Although tape’s traditional business of backing up primary disk systems has declined year over year (as IT backup has shifted to disk-based technologies), the need for tape in the long-term archive market continues to grow. Tape technology is well suited to this space because it offers a small environmental footprint in both floor space and power; high data integrity over long periods; virtually unlimited scalability; and a much lower price per gigabyte than any other storage medium.
Linear Tape Open (LTO) technology has been, and will remain, the primary tape technology. The LTO consortium guarantees interoperability across both LTO tape drive and media manufacturers. In 2018, the eighth generation of the technology was introduced, offering 12 TB of native (uncompressed) capacity per cartridge. Later in 2021, the ninth generation, LTO-9, is expected to arrive with a native capacity of 18 TB, a 50% increase over LTO-8. The LTO consortium also publishes a very solid roadmap for future products all the way out to LTO-12, with a capacity point of 144 TB on a single piece of media.
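As a rough worked example using only the native-capacity figures quoted above (18 TB for LTO-9 and 144 TB for LTO-12, three generations later), the roadmap implies roughly a doubling of capacity per generation:

```python
# Implied per-generation growth from the quoted native capacities.
lto9_tb, lto12_tb = 18, 144
generations = 12 - 9
growth = (lto12_tb / lto9_tb) ** (1 / generations)   # = 2.0
print(f"Implied growth per generation: {growth:.2f}x")
for gen in range(10, 13):
    print(f"LTO-{gen}: ~{lto9_tb * growth ** (gen - 9):.0f} TB native")
```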
A historical problem with tape has been the perception that it is “difficult to manage.” Hierarchical Storage Management (HSM) attempted to solve tape’s complexity by presenting a standard network file interface to the application and letting the HSM manage the tape system. To make tape dramatically easier to manage, an interface is needed that tolerates long retrieval times and can specify that any number of data objects be retrieved at a time. A new de facto interface has emerged that, with the support of tape system vendors, would significantly expand the number of applications that can use tape. The S3 interface would be presented to the application, with all data stored on tape mapped to an offline storage class. All the details of tape management are hidden from the application, while the tape system can not only manage itself but also provide advanced features such as making multiple copies, handling foreign tapes, and rewriting media – all done transparently to the application. Numerous S3 applications could use a tape system supporting this interface without any changes to the application. One future product with this capability has already been announced, and another is said to be coming in 2021.
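To make that concrete, here is a minimal sketch of how an existing S3 application might interact with a tape system sitting behind an S3-compatible endpoint. The endpoint URL, bucket and object key are hypothetical placeholders, and the restore-then-poll step is only meant to illustrate the long-retrieval-time semantics described above, not any specific vendor's product:

```python
import boto3

# Hypothetical S3-compatible endpoint exposed by a tape-backed object store.
s3 = boto3.client("s3", endpoint_url="https://tape-gateway.example.internal")

BUCKET = "archive-bucket"           # hypothetical bucket name
KEY = "projects/2021/raw/run-001"   # hypothetical object key

# Writes look like any other S3 PUT; copy counts, media management and
# placement on tape happen behind the interface.
with open("run-001.dat", "rb") as body:
    s3.put_object(Bucket=BUCKET, Key=KEY, Body=body)

# Reads from tape are asynchronous: request a restore, then poll until the
# object has been staged back to a readable tier.
s3.restore_object(Bucket=BUCKET, Key=KEY, RestoreRequest={"Days": 7})

head = s3.head_object(Bucket=BUCKET, Key=KEY)
if 'ongoing-request="false"' in head.get("Restore", ""):
    data = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read()
```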
What are the deciding factors for organizations when choosing between cloud services and on-premises storage, and what predictions can you share on this topic?
There has been a lot of talk lately, even from cloud service providers, about new hybrid systems (primarily hybrid perpetual storage) that allow the use of cloud and/or on-premises processing capabilities while providing long-term storage of the raw and processed data for that processing, wherever the processing takes place. The two storage tiers are defined as the Project Tier and the Perpetual Tier. Project storage always resides where the data is active and being processed, either in the cloud or on-premises. With the new generation of storage solutions, however, organizations now have a choice, regardless of where the Project Tier is located, of whether the Perpetual Tier (inactive data) should sit in the cloud or on-premises.
The first decision an organization must make when deciding on the location of both the Project and Perpetual Tiers is to determine where processing will be performed – either in the cloud or on-premises. Many factors must be weighed in making this decision, such as total cost of ownership, organizational expertise, and whether the business favors capital or operating expenditure. When analyzing the pros and cons of a cloud or on-premises Perpetual Tier solution, organizations should ask themselves several questions, such as: 1) How much data is being stored? 2) How long must the data exist? 3) How often and how much data must be retrieved? 4) How fast must the data be restored? 5) How committed is my organization, long-term, to a particular cloud vendor? 6) Do we have the facilities and staff needed to maintain an on-premises solution?
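Purely as an illustration of how those questions feed a total-cost-of-ownership comparison, the trade-off can be sketched as below; every price and volume is a caller-supplied parameter rather than an assumed figure:

```python
def cloud_perpetual_tco(tb_stored, years, restores_per_year, tb_per_restore,
                        storage_price_tb_month, egress_price_tb):
    """Questions 1 and 2 drive the storage term; 3 and 4 drive retrieval/egress."""
    storage = tb_stored * storage_price_tb_month * 12 * years
    retrieval = restores_per_year * tb_per_restore * egress_price_tb * years
    return storage + retrieval

def onprem_perpetual_tco(tb_stored, years, system_price_per_tb,
                         annual_ops_price_per_tb):
    """Question 6 (facilities and staff) is captured in the annual operations term."""
    capex = tb_stored * system_price_per_tb
    opex = tb_stored * annual_ops_price_per_tb * years
    return capex + opex
```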
Once the decision has been made to process in the cloud, on-premises, or a combination of the two, the next decision is whether to place the Perpetual Tier in the cloud or on-premises. To run processing in a cloud service, the Project Tier data must reside in that cloud provider’s online storage.
The ideal scenario may be for customers to be able to run the Project Tier either on-premises or in the cloud while keeping the Perpetual Tier on-premises. This would require a next-generation storage system. Consider a future on-premises storage system to which all raw data is sent instead of the cloud, and which performs two actions on that data as it is received. First, it “syncs” the data to the cloud so that cloud processing can take place on it; second, it makes an archive copy of the data to local disk or tape. In addition, the system could be programmed to automatically delete the data from the cloud after a preset time, or the customer could delete it manually once processing is complete.
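A minimal sketch of that workflow, assuming a hypothetical on-premises system that syncs to an S3-compatible cloud target; the bucket name, archive path and retention window are illustrative placeholders, not an actual product interface:

```python
import datetime
import shutil
import boto3

s3 = boto3.client("s3")                  # cloud target used for processing
CLOUD_BUCKET = "project-tier-bucket"     # hypothetical bucket name
ARCHIVE_DIR = "/perpetual/archive"       # hypothetical local disk/tape staging path
RETENTION = datetime.timedelta(days=30)  # preset time before the cloud copy is removed

def ingest(local_path: str, key: str) -> dict:
    """On receipt of raw data: sync a copy to the cloud for processing and
    keep an archive copy in the on-premises Perpetual Tier."""
    s3.upload_file(local_path, CLOUD_BUCKET, key)                       # 1) sync to cloud
    shutil.copy2(local_path, f"{ARCHIVE_DIR}/{key.replace('/', '_')}")  # 2) local archive copy
    return {"key": key, "expires": datetime.datetime.utcnow() + RETENTION}

def expire(records: list) -> None:
    """Automatically remove cloud copies whose retention window has passed."""
    now = datetime.datetime.utcnow()
    for rec in records:
        if rec["expires"] <= now:
            s3.delete_object(Bucket=CLOUD_BUCKET, Key=rec["key"])
```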
What can you tell us about future storage technologies, and what will it take for them to mature?
The storage industry is a $50 billion-a-year market and will continue to attract venture capital investment in new technologies. Many of these efforts have promised to significantly improve one or more of the basic attributes of storage, such as cost per capacity, latency, bandwidth and longevity. To be clear, over the last 20 years only a small proportion of venture capital investment has gone into developing fundamental storage devices; the majority has gone into developing storage systems that use existing storage devices as part of their solution. System development of this kind is a better fit for venture capital, since it is mainly software-based and requires relatively little capital to reach production. It is also lower risk and faster to market because it does not depend on scientific breakthroughs in materials, optics or quantum physics.
Much of the advanced basic research on breakthrough storage devices is purely proof of concept, funded by universities, governments or the private equity market. For example, an announcement was made about storing data in five dimensions on a piece of glass or quartz crystal capable of holding 360 TB of data literally forever. Advanced development work continues on attempts to store data in holograms, a technology that has long been longer on promise than on results. Another group is researching data storage using DNA, and one company recently received $40 million in funding for the idea of storing data by continuously bouncing it between satellites in low-Earth orbit.
Quantum-level development involves storing data by controlling the “spin” of electrons. While these and other efforts could revolutionize data storage, it is hard to believe that any of them is mature enough at the moment to make a significant impact on the digital universe, at least not before 2030. Historically, many storage technologies have shown promise at the prototype stage but have not been able to move into production with products that match the cost, durability, performance and, most importantly, reliability of the technologies already on the market. That said, given the cloud providers, some of these technologies may find an easier path to market.