NAS and SAN: The Waiter and the Chef
Complementary Technologies, Not Competing Solutions
Jun. 27, 2005 06:00 AM
Each year, one of the most eagerly awaited events for food aficionados is the publication of the new Zagat Guides for restaurants. Within the pages of various editions are listings and reviews of hundreds of the top dining experiences around the world – each designed to delight the palate and rejuvenate the soul. Earning top marks in the Guide is not only a source of pride; it’s essential to the success of these top-tier establishments.
While there are a great many factors that go into a great dining experience, essentially they fall into one of two categories: the quality of the food and the quality of the service. Both must be exceptional in order to make the top of the list. If the food is great and the service is poor, or service is great but food is humdrum, the restaurant falls down the list.
Today’s storage world works much the same way. Much is being made of network attached storage (NAS) and storage area networks (SANs) as options for the high-volume data storage needs of modern enterprises. Yet when you look closely, these are not competing solutions, but rather complementary technologies that are best suited to different tasks.
Just as you probably don’t want the chef waiting on your table, or your waiter cooking your duck a l’orange, it’s important to make sure your storage technologies are doing what they do best, and not trying to cover a function best left to the other. Let’s take a look at the best functionality of each (the technologies, not the restaurant staff), and see how they fit into an overall information lifecycle management (ILM) strategy.
NAS – The Waiter
In the storage world, NAS serves the function of the waiter. It works best for file-level data access, acting as a gateway between the SAN and workgroups or users. In other words, it brings the information out from the kitchen and delivers it to the appropriate table. This is a function it performs very well.
NAS is attractive because it is generally plug-and-play, with a low cost of acquisition and management. There’s no need to carve out logical unit numbers (LUNs) the way you do with a SAN, since the RAID array, tape, hard disk, or other device is attached directly to each server or group of servers. This approach gets it up and running quickly. NAS is also very agile, serving up data quickly as needed since there’s a one-to-one relationship between the network and the storage unit.
From a technical standpoint, NAS uses IP-based protocols such as NFS and CIFS to serve files to clients. In effect, it acts like a giant network file server, providing access to a much larger pool of files.
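As a toy illustration of that file-serving pattern – a client asking for a file by name over IP and getting its contents back – here is a minimal Python sketch. The protocol, file names, and contents are all invented for illustration; a real NAS speaks NFS or CIFS, not this:

```python
import socket
import threading

# Stand-in for the exported share; names and contents are invented.
FILES = {"menu.txt": b"duck a l'orange\n"}

def serve_one(server: socket.socket) -> None:
    """Accept one client, read a requested file name, send the contents."""
    conn, _ = server.accept()
    with conn:
        name = conn.recv(1024).decode().strip()
        conn.sendall(FILES.get(name, b""))

def fetch(port: int, name: str) -> bytes:
    """Client side: ask the 'NAS' for a file by name over TCP."""
    with socket.create_connection(("127.0.0.1", port)) as c:
        c.sendall(name.encode())
        c.shutdown(socket.SHUT_WR)  # signal end of request
        return c.recv(4096)

server = socket.socket()
server.bind(("127.0.0.1", 0))  # pick any free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve_one, args=(server,), daemon=True).start()
print(fetch(port, "menu.txt").decode())
```

The point of the sketch is that the client deals only in file names; where the bytes physically live is the server’s problem.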
Where enterprises run into trouble with NAS is when they try to make it their primary method of high-volume bulk storage. Usually they are comfortable with the NAS design they already have in place and continue to add to it. That strategy seems logical on the surface, but in practice it doesn’t work as well as you might hope.
The problem is that while NAS has some scalability, that scalability is not linear. At some point the curve flattens out and NAS is no longer capable of handling the workload. Depending on the size of the organization and the topology of the network, having individual NAS servers for various workgroups also tends to work against its native simplicity, requiring more resources rather than fewer to manage the organization’s storage needs.
In a small organization, NAS can serve both functions – just as one person can cook and serve the food in a small restaurant. But as the enterprise grows and becomes more sophisticated, the needs change and it’s time for a separation of responsibilities.
SAN – The Chef
Where NAS is more of a device-oriented strategy, a SAN is really an architecture, or method, of providing storage. It incorporates a wide variety of storage devices and storage spaces that sit at a higher level than a typical NAS device. Rather than serving files directly to clients, a SAN serves up data blocks to servers over a Fibre Channel connection. A server taps into the SAN when a request comes in, then provides the files out of those data blocks.
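To make the file-versus-block distinction concrete, here is a small Python sketch of block-level access, using a local temp file as a stand-in for a LUN. The block size and contents are illustrative:

```python
import tempfile

BLOCK_SIZE = 512  # classic sector size; real LUNs vary

def read_block(dev_path: str, lba: int) -> bytes:
    """Block-level access: address storage by logical block address,
    with no notion of files or directories -- that's the server's job."""
    with open(dev_path, "rb") as dev:
        dev.seek(lba * BLOCK_SIZE)
        return dev.read(BLOCK_SIZE)

# A local temp file stands in for a LUN holding three 512-byte blocks.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"A" * BLOCK_SIZE + b"B" * BLOCK_SIZE + b"C" * BLOCK_SIZE)
    lun = f.name

print(read_block(lun, 1)[:4])  # the second block: b'BBBB'
```

The caller asks for block number 1, not for a file name – turning those raw blocks into files a user can open is exactly the job the article assigns to the server (and, at the edge, to NAS).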
SANs are designed to help improve throughput and file sharing by centralizing data rather than dividing it by workgroups. This arrangement also helps speed and simplify critical backups in large organizations. In short, it is the lynchpin in an effective ILM strategy.
Going back to our restaurant analogy, the SAN is the kitchen where all of the food is prepared. It doesn’t matter if the diners order beef, fish, poultry, or even vegetarian. Everything needed to give them what they want is there, and it is routed out on demand via the waiters. Using NAS for the same task would require separate kitchens for each type of dish, or one kitchen for the tables covered by each waiter. And the waiters would have to take the order, then go in back to cook the meal – not the function of that employee. With that in mind, NAS solutions don’t fit every storage need. The convenience of replicating a NAS solution throughout the entire enterprise is outweighed by the fact that it isn’t designed for high-volume, centralized storage.
Putting a SAN solution in place takes the burden off of local servers, speeding delivery of information to the user by eliminating the need for servers to search their own disks (or extensions thereof) for data. The LAN is no longer congested with storage-related IP traffic; the storage fabric handles that transport instead. Data storage becomes more of a virtual function, a pass-through from the server to a mass storage arrangement that has been optimized for this single function. Separating storage from servers also simplifies storage administration; instead of having to manage multiple LAN or WAN storage arrangements, IT resources can administer a single, centralized, dedicated resource.
SAN makes storage more efficient as well. In a typical network, one server might be maxed out on storage space while another has several gigabytes of space available. A SAN pools all the storage together so each server has equal access to the total amount of space available within the organization. It also provides the ability to manage that storage centrally, which alone could justify the investment. This approach also helps reduce file redundancy, since files need only be stored in one place (the SAN) rather than on multiple servers throughout the enterprise.
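The pooling idea can be sketched in a few lines of Python. The StoragePool class, server names, and capacities below are purely illustrative, not any vendor’s API:

```python
class StoragePool:
    """Toy model of SAN-style pooled capacity (numbers are illustrative)."""

    def __init__(self, total_gb: int):
        self.free_gb = total_gb
        self.allocations: dict[str, int] = {}

    def provision(self, server: str, gb: int) -> bool:
        """Grant capacity from the shared pool if any remains, no matter
        which server asks -- there are no server-local disks to max out."""
        if gb > self.free_gb:
            return False
        self.free_gb -= gb
        self.allocations[server] = self.allocations.get(server, 0) + gb
        return True

pool = StoragePool(total_gb=1000)
pool.provision("web-01", 200)
pool.provision("db-01", 500)
print(pool.free_gb)  # 300
```

Contrast this with the NAS-per-workgroup arrangement, where web-01 could run out of space while db-01 sits on unused disks.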
Another advantage of SAN is that it makes the “black box” concept for storage work. It places an umbrella over the system, allowing you to mix and match manufacturers rather than accepting a monolithic storage solution that locks you into a single manufacturer (and that manufacturer’s pricing structure). With SANs, the economics of competition come into play, allowing you to seek out the best product (and the best deal) as new needs arise. That is true storage virtualization, and it allows companies to continually analyze their IT storage portfolio to maximize their storage investment.
This concept also ensures that you are able to protect the investments you’ve already made rather than having to scrap one SAN in favor of another. You can add or replace storage units as necessary and easily fit them into your overall SAN strategy.
The downside of SAN is that it’s not as good as NAS at working with multiple file platforms. For those used to working with NAS, there may also be some sticker shock, as a SAN can be far more costly. In addition, setting up a SAN is much more complex than installing NAS on the back end of your servers. Some SAN customers never realize the full benefits of the SAN, instead using it as a basic backup and storage device. Those customers should make sure that if they are paying for an eight-course meal, they are not merely nibbling on the appetizer and going home. You have to make sure you have the time and resources available to make the jump in order to realize the business benefits.
Which to Choose
Given these factors, the question facing many organizations is whether to stay with (or add) NAS to their networks or turn over all the cards and bring in a SAN. The answer, in my opinion, is yes.
A blended solution of NAS and SAN offers the greatest flexibility and performance advantage for most organizations. The more heterogeneous your server environment, the more important NAS becomes for smooth operation between servers. And the higher the volume of data firing around your enterprise, the more important SAN becomes for working with it effectively.
Having NAS in place simplifies access to the SAN. In fact, NAS is the ideal gateway to a SAN, taking the data blocks provided by the SAN and routing them to the proper servers in the form of files. At the same time, having a SAN in place allows NAS to work more efficiently by removing the burden of mass storage of less critical data. Important files can be stored locally on the NAS device, while those thousands of joke e-mails tying up space on the Microsoft Exchange server can be offloaded to the SAN.
Getting to ILM
Establishing the right combination of storage is critical to achieving the goals of information lifecycle management. The whole purpose of ILM is to allow organizations to prioritize data and establish a hierarchy for information based on its value rather than treating all data as equal. Combining NAS and SAN makes it happen.
SAN provides a foundation for ILM by allowing you to segment storage into LUNs. Once storage is segmented, backups can be scheduled at different intervals by order of importance rather than backing up everything on the network every night. It also allows you to move from physical tape to virtual tape drives – systems that use your current tape backup software to save data onto hard drives.
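That tiered scheduling idea can be sketched in Python. The tier names, intervals, and LUN names are all hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Lun:
    name: str
    tier: str  # "critical", "standard", or "archive" (illustrative tiers)

# Hypothetical policy: more important tiers are backed up more often.
BACKUP_INTERVAL_HOURS = {"critical": 4, "standard": 24, "archive": 168}

def backups_due(luns, hours_since_last):
    """Return the LUNs whose tier's interval has elapsed since their
    last backup, rather than backing everything up every night."""
    return [l.name for l in luns
            if hours_since_last[l.name] >= BACKUP_INTERVAL_HOURS[l.tier]]

luns = [Lun("erp-db", "critical"),
        Lun("home-dirs", "standard"),
        Lun("old-mail", "archive")]
elapsed = {"erp-db": 5, "home-dirs": 12, "old-mail": 200}
print(backups_due(luns, elapsed))  # ['erp-db', 'old-mail']
```

Only the LUNs whose interval has lapsed get queued, which is the whole point of prioritizing data by value rather than treating it all as equal.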
Virtual tape drives are proving far more reliable than tape, which studies have shown fails roughly 70% of the time during backup. They also make restoration far less time-consuming, since the backup is disk-based and can be read nonlinearly rather than spooled from the start of a tape.
NAS provides accessibility to the stored data, helping users get to critical data on the SAN quickly. It makes it easier to find data in a given LUN and provides the cross-platform access required by various applications. Together, the two technologies help the organization manage its data better, driving down costs while freeing up resources for other tasks.
Cooking Up Success
Just as a restaurant needs both top chefs and attentive waiters to earn top marks from Zagat, your enterprise will benefit from a hybrid NAS and SAN solution. Together they will keep your storage optimized, your business more productive, and your enterprise data safer than an either/or approach would.