Recent changes in the areas of both software and hardware are combining to revolutionize networking.
by Marcio Saito
Revolution is defined as a sudden and fundamental change, and having seen the effects of one revolution does not always allow us to foresee those of the next. Revolutions can cause new waves of innovation and shifts in power and control, changes that are unnerving and exciting at the same time.
One might say that the computer world is experiencing revolutions in both hardware and software. What might happen in the networking arena when they intersect and amplify one another? Of particular interest is how these changes will affect the functionality of the solutions available to end users, now that it is suddenly possible to build a new generation of network appliances that are more powerful and incorporate richer functionality than traditional routers and other networking equipment. The potential impact of Linux on networking is even more significant than the impact it has already had on the server market.
During the 1970s and 1980s, computer networks were predominantly host-based, with mainframes or midrange systems providing most of the processing power. Users were directly connected to the host using simple, non-intelligent terminals.
In the early 1990s, computer network protocols converged on a few standards and were adopted for use with desktop computers. Local area networks became a requirement for almost any type of business. While network services were distributed among many servers, network connectivity migrated from host computers to specialized hardware designed and built to perform internetworking functions. These specialized network appliances (routers, switches and access servers) allowed more reliable, cost-effective and efficient networking. Network connectivity (provided by a router) and network services (running on the servers) came to be seen as separate entities. This is how most people understand networking today, but cheap hardware and open-source software are beginning to change this.
The PC revolution that consolidated in the late 1980s extended computing power to almost every office desktop. Fueled by intense competition and the establishment of industry-standard hardware and software, PCs became more affordable and more powerful. The Internet then triggered an explosion in the demand for home computers. PC manufacturers were able to leverage those volumes, and prices of PC-related hardware dropped sharply. An article in the May 1995 issue of PC Magazine said, ``As of spring, the touchstone price is $1,999 (US) for a Pentium/75 multimedia system with 8MB RAM, a 700MB hard disk, and a 15-inch monitor.'' In January 1997, when the first PCs for under $1,000 were offered, the same magazine wrote, ``So what does $999 get you? You can buy a 120MHz or 133MHz system, for less than $1,000.'' As we start a new decade, consumer PC prices have dropped further. Today, we can buy a desktop computer with a CPU running at over 500MHz, 128MB of RAM and many built-in peripherals for a few hundred dollars. That is about the same price you would pay for a typical access router with much less impressive hardware specifications.
Because the architectures of servers and desktops are similar, manufacturers can take advantage of the low cost of components to build inexpensive server systems. Cost was one of the important factors that once favored dedicated routers over server-based solutions, but this is no longer necessarily the case. Standard hardware components are becoming so inexpensive that it is almost impossible for the manufacturer of a proprietary hardware device to be competitive. This trend is reaching a threshold where the addition of a catalyst could trigger a paradigm shift.
Linux may be that catalyst. Distributed without restrictions on use and installation, it is always provided with source code. It is not necessarily free (zero cost), but anyone can change or improve it to meet specific requirements. Frequently, those requirements are shared by others, and the changes or improvements are fed back to the community.
According to the latest IDC numbers (August 2000), Linux was the second most popular server OS in 1999, with 24% of new server licenses, and it is the fastest growing one (the Windows platforms together have 36%). Already the favorite for Internet-related applications, it is growing quickly in the enterprise market for corporate applications. Because of its roots in the Internet, Linux developed strong networking support, better than that of many commercial operating systems. Because it is open source and receives the contributions of a huge developer community, Linux is also more flexible and evolves much faster. The Linux operating system has features, security and robustness comparable to those of specialized internetworking operating systems. Put it together with commodity hardware and the result is a very powerful network platform.
It's obvious that Linux is changing the server market landscape: 24% of the market is substantial, regardless of your platform preference. While it remains to be seen whether Linux will change the client/desktop market, its impact on the networking market is certain.
In this early stage of the revolution, there is still a need for technology integrators to make these benefits widely available. Some technical users are doing the integration themselves: they get communication boards, integrate them with standard PC hardware, and build their own Linux-based network boxes. Internet service providers, for example, instead of buying a PPP remote access server to provide dial-up Internet access, use Linux servers with multiport serial boards connected to modem banks to perform the same function. Some technology integrators, however, are already delivering a successful new generation of network appliance products. For example, the Cobalt Qube is an all-in-one Internet gateway for small- and medium-size businesses that can be fitted with a routing board for WAN connectivity. The whole solution integrates all the Internet functionality needed, including network services and connectivity, is very easy to set up and manage, and costs about the same as the less functional access router it replaces.
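To give a feel for how little glue the do-it-yourself ISP setup mentioned above requires, here is a minimal sketch of a single dial-in line using the standard mgetty and pppd tools. The serial device, line speed and IP addresses below are placeholders; a real installation would repeat the pattern for each port on the multiport board and add its own authentication and name-service details.

   # /etc/inittab -- spawn mgetty on one modem line (one entry per port)
   S0:2345:respawn:/sbin/mgetty -s 57600 ttyS0

   # /etc/mgetty+sendfax/login.config -- hand PPP callers directly to pppd
   /AutoPPP/ -  a_ppp  /usr/sbin/pppd auth -chap +pap login

   # /etc/ppp/options -- settings shared by all dial-in lines
   asyncmap 0
   crtscts
   lock
   modem
   proxyarp

   # /etc/ppp/options.ttyS0 -- per-line local:remote address pair (placeholders)
   192.168.1.1:192.168.1.100

The point is not the particular options but the fact that everything a dedicated access server does in firmware is expressed here in a few ordinary configuration files, which can be inspected, extended or automated like anything else on the system.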
But this new class of network device is not simply a more affordable replacement for the traditional router. It has the added advantages of expandability and flexibility. Just as the router was well adapted to the network of the past, the new network appliance is better adapted to the realities of the future.
Users end up getting new products that are cheaper, better and easier to use; products that replace the router or access server, incorporate new networking services, and can be easily customized for each application. It is a different kind of product.
To fully understand the impact of Linux in this mix, it's necessary to consider the latest IDC numbers (available at http://www.idc.com/itforecaster/itf20000808.stm). The total client and server OS market was about $17 billion in 1999. Windows generated almost $8 billion in revenues, while Linux generated less than $100 million. Considering that Linux now has a substantial share of the market, those numbers are shocking. From the user standpoint, the value of a solution is the same regardless of the OS being used, so one would expect revenues to be roughly proportional to market share. If Microsoft is making $8 billion on Windows, where are the Linux revenues? Because Linux combines an open architecture with the business models implied by open source, its ``revenues'' translate almost directly into savings for end users--savings that can be used to pay for integration and services that produce better solutions for each user.
This gives some idea of the impact and the shift in control and power that Linux brings to the table. But our focus is networking, rather than general-purpose operating systems. According to the latest Data Communications market forecast, the size of the network equipment market in 1999 was $70 billion in the U.S. and $120 billion worldwide. All of this money is going to the equipment manufacturers, holders of proprietary software and hardware technology such as Cisco and Nortel.
So, when open architectures replace proprietary boxes, a lot of money will change hands. In the networking market this is happening in software and hardware at the same time, so the impact is amplified.
In the past, proprietary solutions were used because they were cost-effective compared to server-based solutions. Linux and standard hardware free the market, allowing technology integrators to produce better solutions and remain competitive without having to drive large volumes themselves (the volumes are already built into open-source software and commodity hardware). The need for better solutions drives the change; open source and commodity hardware enable it.
These changes have important consequences. Today, users depend on proprietary networking boxes for features and functionality. These boxes work well for connectivity, but they cannot have services added or be customized in terms of functionality. Economics drives users to separate network connectivity from network services and to choose solutions that are often more cumbersome and difficult to manage. In the near future, however, additional functionality will be incorporated into the connectivity product, and using an open platform will make incorporating new hardware or software technologies simpler and quicker. Users and technology integrators will not depend on a sole technology provider, and control will shift toward the end user.
As with any change, at first it is not easy to perceive all the benefits of a new approach. For networking, all the elements are in place and changes are coming. New possibilities will be discovered along the way, and those discoveries will lead to a new, more powerful approach to networking.
Marcio Saito (marcio@cyclades.com) is director of technology for Cyclades Corporation.