

Is .NGO the First Defensive TLD Registration Announcement?

03 Aug 2011

Registry behind .org wants .ngo.

This week Public Interest Registry (PIR) announced it will apply for the .ngo top level domain name when ICANN opens up its application process next year.

Could this be the first example of a “defensive” TLD registration?

PIR runs the .org registry, and .ngo for non-governmental organizations is definitely competitive to .org. Others had expressed plans for .ngo as well, and PIR certainly doesn’t want another group running a TLD so similar to .org.

That’s not to say .ngo doesn’t make business sense on its own for PIR. It plans to operate the TLD with registration restrictions, which .org doesn’t have. It will also be a new TLD, which means it won’t be encumbered by any of the limitations of the .org gTLD.

Still, I suspect a primary reason for applying for this TLD — and particularly for pre-announcing it — is to ward off any competition.


A NEW TOP LEVEL DOMAIN TO COMMENCE OPERATIONS

December 5, 2011

ICANN has approved the delegation and operations of the new country code Top Level Domain (ccTLD), .sx, to a registry located in the equally new country to which it is assigned – Sint Maarten.

Sint Maarten is part of the Kingdom of the Netherlands, encompassing the southern half of the island of Saint Martin. Now a country in its own right, Sint Maarten was known prior to October 2010 as the Island Territory of Sint Maarten, one of five island territories in the former Netherlands Antilles.

The .an extension was previously used for the region, but as the Netherlands Antilles was dissolved in 2010, the .an domain is being phased out.

With the dissolution, the five island territories of the former Netherlands Antilles were reorganized into two new countries, Curacao and Sint Maarten, and three special municipalities of the Netherlands, Bonaire, Sint Eustatius and Saba. A ccTLD has been created for each grouping – CW (Curacao), SX (Sint Maarten) and BQ (Bonaire, Sint Eustatius and Saba). The latter extension is yet to be delegated.

Before the new .sx domain is available for general registration, two sunrise periods will occur, allowing trademark holders to apply for domain names first and helping to minimise instances of cyber-squatting.

After the sunrise periods, anyone will be able to register a .sx domain name – residency or a permanent address in Sint Maarten is not required.

It’s not uncommon for countries to allow unrestricted registrations of their ccTLDs, and while opening up to the world can be beneficial in some ways, countries such as Australia that only allow Australian entities to register a .com.au name tend to foster increased confidence in their extensions – particularly in local markets.

The .sx domain name will be managed by SX Registry SA. The Registry is based on the island of Sint Maarten, but technical operations will occur in Canada and Luxembourg.


ICANN started a new process of TLD naming policy to take a significant step forward on the introduction of new generic top-level domains.

Date: 06/26/2008

During the 32nd International Public ICANN Meeting in Paris in 2008, ICANN started a new process of TLD naming policy to take a “significant step forward on the introduction of new generic top-level domains.” This program envisioned the availability of many new or already proposed domains, as well as a new application and implementation process. Observers believed that the new rules could result in hundreds of new gTLDs being registered. Proposed TLDs included .free, .music, .shop, .berlin, .wien, .nyc and .istanbul.

On 13 June 2012 ICANN announced nearly 2,000 applications for top-level domains, which began to be delegated in 2013. Donuts Inc. invested $57 million in more than 300 applications, while Famous Four Media applied for 61 new domains. The first seven – .bike, .clothing, .guru, .holdings, .plumbing, .singles and .ventures – are active. More were released in 2014 and 2015.

The introduction of several generic top-level domains over the years had not stopped the demand for more gTLDs; ICANN received many proposals for the establishment of new top-level domains. Proponents argued for a variety of models, ranging from adoption of policies for unrestricted gTLDs to chartered gTLDs for specialized uses by specialized organizations.

In 2008, a new initiative foresaw a stringent application process for new domains, adhering to a restricted naming policy for open gTLDs, community-based domains, and internationalized domain names (IDNs). According to a guidebook published by ICANN, a community-based gTLD is “a gTLD that is operated for the benefit of a defined community consisting of a restricted population.” All other domains fall under the category of open gTLDs; an open gTLD “is one that can be used for any purpose consistent with the requirements of the application and evaluation criteria, and with the registry agreement. An open gTLD may or may not have a formal relationship with an exclusive registrant or user population. It may or may not employ eligibility or use restrictions.”

The establishment of new gTLDs under this program required the management of registrar relationships, the operation of a domain registry, and demonstration of technical (as well as financial) capacity for such operations.

A fourth version of the draft applicant guidebook (DAG4) was published in May 2011. On June 20, 2011, ICANN’s board voted to end most restrictions on the creation of generic top-level domain names (gTLDs); at that time, 22 gTLDs were available. Companies and organizations would be able to choose essentially arbitrary top-level Internet domains, and the use of non-Latin characters (such as Cyrillic, Arabic, and Chinese) would also be allowed in gTLDs. ICANN began accepting applications for new gTLDs on January 12, 2012, at an initial application price of $185,000, and expected the first batch of new gTLDs to be operational by September 2013. A survey by registrar Melbourne IT considered entertainment and financial services brands most likely to apply for new gTLDs for their brands.

ICANN expected the new rules to significantly change the face of the internet. Peter Dengate Thrush, chairman of ICANN’s board of directors, stated after the vote: “Today’s decision will usher in a new internet age. We have provided a platform for the next generation of creativity and inspiration. Unless there is a good reason to restrain it, innovation should be allowed to run free.” Industry analysts predicted 500–1,000 new gTLDs, mostly reflecting names of companies and products, but also cities, and generic names like bank and sport. According to Theo Hnarakis, chief executive of Melbourne IT, the decision would “allow corporations to better take control of their brands. For example, apple or ipad would take customers right to those products.” In agreement, Nick Wood, Managing Director of Valideus, suggested: “Your own gTLD demonstrates confidence and vision and may accelerate your brand and its value. An internet address at the Top Level is far better than registration at the ‘low rent’ Second Level.” However, some companies, like Pepsi, ruled out a branded gTLD.


Neustar Announces .Biz Price Increase

July 31, 2011

Neustar has informed its registrars and ICANN that it will increase its annual prices for .biz domain names effective February 1, 2012. The new price will be $7.85, or 55 cents more than the current $7.30 per year. That’s about 7.5%.
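Checking the math on that: $7.85 minus $7.30 is $0.55, and 0.55 divided by 7.30 comes to about 0.075 – so roughly a 7.5% increase, as stated.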

This wholesale price does not include ICANN’s 18 cent “tax” per domain year registered.

Neustar’s notice to ICANN does not include a reason for the price increase.

Price increases from leading registries Neustar, VeriSign, and Afilias might come under more scrutiny as they begin offering registry services to third parties applying for new top level domain names. If they offer those services for $2 per domain, it may become harder to justify the prices they charge in their incumbent TLDs.

But I doubt ICANN will pay attention. It just renewed VeriSign’s .net contract with 10% annual price increases…without justification for those increases.


Network Solutions sold to Web.com for $405M and stock

Aug 3, 2011

Network Solutions has been scooped up by Web.com Group Inc., which will pay $405 million in cash and issue 18 million shares of common stock to buy the Herndon-based web domain registrar.

Web.com stock (NASDAQ: WWWW) closed at $8.66 a share Wednesday before the deal was announced, putting the total value of the deal at $560 million.
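That figure checks out: 18 million shares at $8.66 is about $155.9 million in stock, which added to the $405 million in cash gives roughly $560.9 million.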

The buyout – which awaits shareholder and regulatory approval – comes a year after the Jacksonville, Fla.-based Web.com bought a Network Solutions competitor, Register.com, for $135 million.

After the acquisition, the combined Web.com and Network Solutions operation is expected to see revenue “in the mid-$450 million range” this year, with 1,900 employees, 3 million paying subscribers and 9 million domain names under management, according to a news release.

It’s the latest turn in Network Solutions’ tumultuous 16-year history of ownership changes. It was first bought by SAIC in 1995 and went public two years later under the ticker symbol NSOL, raising $67 million. VeriSign in 2000 bought the company in a stock deal worth $21 billion, then the largest Internet buyout in history. Three years and a tech crash later, VeriSign ended the ill-fated marriage and sold Network Solutions to the Najafi Companies, a private equity firm, for $100 million. Najafi had better luck – selling the domain registrar to General Atlantic LLC in 2007 at a reported price tag of $800 million.

The acquisition is slated to be completed in the fall. At the close, General Atlantic and other current Network Solutions shareholders are expected to own about 37 percent of Web.com. Anton Levy, a managing director at General Atlantic, will join the Web.com board.


Netscape Navigator – Netscape Navigator becomes the world’s most popular browser

Date: 01/01/1994

Netscape Navigator is a discontinued proprietary web browser, and the original browser of the Netscape line, from versions 1 to 4.08, and 9.x. It was the flagship product of Netscape Communications Corporation and was the dominant web browser in terms of usage share in the 1990s, but by 2002 its usage had almost disappeared. This was primarily due to the increased usage of Microsoft’s Internet Explorer web browser software, and partly because the Netscape Corporation (later purchased by AOL) did not sustain Netscape Navigator’s technical innovation after the late 1990s.

The business demise of Netscape was a central premise of Microsoft’s antitrust trial, wherein the Court ruled that Microsoft Corporation’s bundling of Internet Explorer with the Windows operating system was a monopolistic and illegal business practice. The decision came too late for Netscape, however, as Internet Explorer had by then become the dominant web browser in Windows.

The Netscape Navigator web browser was succeeded by the Netscape Communicator suite in 1997. Netscape Communicator’s 4.x source code was the base for the Netscape-developed Mozilla Application Suite, which was later renamed SeaMonkey. Netscape’s Mozilla Suite also served as the base for a browser-only spinoff called Mozilla Firefox.

The Netscape Navigator name returned in 2007 when AOL announced version 9 of the Netscape series of browsers, Netscape Navigator 9. On 28 December 2007, AOL canceled its development but continued supporting the web browser with security updates until 1 March 2008. AOL allows downloading of archived versions of the Netscape Navigator web browser family. AOL maintains the Netscape website as an Internet portal.

Origin

Netscape Navigator was inspired by the success of the Mosaic web browser, which was co-written by Marc Andreessen, a part-time employee of the National Center for Supercomputing Applications and a student at the University of Illinois. After Andreessen graduated in 1993, he moved to California and there met Jim Clark, the recently departed founder of Silicon Graphics. Clark believed that the Mosaic browser had great commercial possibilities and provided the seed money. Soon Mosaic Communications Corporation was in business in Mountain View, California, with Andreessen as a vice-president. Since the University of Illinois was unhappy with the company’s use of the Mosaic name, the company changed its name to Netscape Communications (a name thought up by product manager Greg Sands) and named its flagship web browser Netscape Navigator.

Netscape announced in its first press release (13 October 1994) that it would make Navigator available without charge to all non-commercial users, and beta versions of 1.0 and 1.1 were indeed freely downloadable in November 1994 and March 1995, with the full version 1.0 available in December 1994. Netscape’s initial corporate policy regarding Navigator is interesting, as it claimed that it would make Navigator freely available for non-commercial use in accordance with the notion that Internet software should be distributed for free.

However, within two months of that press release, Netscape apparently reversed its policy on who could freely obtain and use version 1.0 by only mentioning that educational and non-profit institutions could use version 1.0 at no charge.

The reversal was complete with the availability of version 1.1 beta on 6 March 1995, in which a press release states that the final 1.1 release would be available at no cost only for academic and non-profit organizational use. Gone was the notion expressed in the first press release that Navigator would be freely available in the spirit of Internet software.

Security experts and cryptographers found that all released Netscape versions had major security problems: the browser could be crashed with long URLs, and its 40-bit encryption keys offered only weak protection.

The first few releases of the product were made available in “commercial” and “evaluation” versions; for example, version “1.0” and version “1.0N”. The “N” evaluation versions were completely identical to the commercial versions; the letter was there to remind people to pay for the browser once they felt they had tried it long enough and were satisfied with it. This distinction was formally dropped within a year of the initial release, and the full version of the browser continued to be made available for free online, with boxed versions available on floppy disks (and later CDs) in stores along with a period of phone support. During this era, “Internet Starter Kit” books were popular, and usually included a floppy disk or CD containing internet software; these kits were a popular means of obtaining Netscape’s and other browsers. Email support was initially free, and remained so for a year or two until the volume of support requests grew too high.

During development, the Netscape browser was known by the code name Mozilla, which became the name of a Godzilla-like cartoon dragon mascot used prominently on the company’s web site. The Mozilla name was also used as the User-Agent in HTTP requests by the browser. Other web browsers claimed to be compatible with Netscape’s extensions to HTML, and therefore used the same name in their User-Agent identifiers so that web servers would send them the same pages as were sent to Netscape browsers. Mozilla is now a generic name for matters related to the open source successor to Netscape Communicator.
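As a rough illustration of that convention – a sketch in Python, not anything from Netscape’s own code – the snippet below sends an HTTP request whose User-Agent leads with the “Mozilla” token, the way competing browsers identified themselves so servers would send them the same pages they sent Navigator. The URL and the token string are placeholders.

    import urllib.request

    # A Netscape-style User-Agent: the leading "Mozilla/" token is what
    # servers keyed on when deciding which version of a page to serve.
    headers = {"User-Agent": "Mozilla/4.0 (compatible; SomeOtherBrowser 1.0)"}

    req = urllib.request.Request("http://example.com/", headers=headers)
    with urllib.request.urlopen(req) as resp:
        print(resp.status, len(resp.read()), "bytes")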

Rise of Netscape

When the consumer Internet revolution arrived in the mid-to-late 1990s, Netscape was well-positioned to take advantage of it. With a good mix of features and an attractive licensing scheme that allowed free use for non-commercial purposes, the Netscape browser soon became the de facto standard, particularly on the Windows platform. Internet service providers and computer magazine publishers helped make Navigator readily available.

An important innovation that Netscape introduced in 1994 was the on-the-fly display of web pages, where text and graphics appeared on the screen as the web page downloaded. Earlier web browsers would not display a page until all graphics on it had been loaded over the network connection; this often made a user stare at a blank page for as long as several minutes. With Netscape, people using dial-up connections could begin reading the text of a web page within seconds of entering a web address, even before the rest of the text and graphics had finished downloading. This made the web much more tolerable to the average user.
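To make the contrast concrete, here is a minimal sketch in Python – purely illustrative, and nothing like Navigator’s actual implementation – of the two strategies: buffering the entire page before showing anything versus handing each chunk to the renderer as it arrives.

    import urllib.request

    URL = "http://example.com/"  # placeholder address

    def render(fragment: bytes) -> None:
        # Stand-in for parsing and painting: print text as it arrives.
        print(fragment.decode("utf-8", errors="replace"), end="")

    # Pre-Netscape approach: block until the whole page has downloaded.
    with urllib.request.urlopen(URL) as resp:
        render(resp.read())

    # Navigator-style approach: render incrementally, chunk by chunk,
    # so a dial-up user starts reading within seconds.
    with urllib.request.urlopen(URL) as resp:
        while chunk := resp.read(1024):
            render(chunk)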

Through the late 1990s, Netscape made sure that Navigator remained the technical leader among web browsers. Important new features included cookies, frames, proxy auto-config, and JavaScript (in version 2.0). Although those and other innovations eventually became open standards of the W3C and ECMA and were emulated by other browsers, they were often viewed as controversial. Netscape, according to critics, was more interested in bending the web to its own de facto “standards” (bypassing standards committees and thus marginalizing the commercial competition) than it was in fixing bugs in its products. Consumer rights advocates were particularly critical of cookies and of commercial web sites using them to invade individual privacy.

In the marketplace, however, these concerns made little difference. Netscape Navigator remained the market leader with more than 50% usage share. The browser software was available for a wide range of operating systems, including Windows (3.1, 95, 98, NT), Macintosh, Linux, OS/2, and many versions of Unix including DEC, Sun Solaris, BSD/OS, IRIX, AIX, and HP-UX, and looked and worked nearly identically on every one of them. Netscape began to experiment with prototypes of a web-based system, known internally as “Constellation”, which would allow a user to access and edit his or her files anywhere across a network no matter what computer or operating system he or she happened to be using.

Industry observers confidently forecast the dawn of a new era of connected computing. The underlying operating system, it was believed, would become an unimportant consideration; future applications would run within a web browser. This was seen by Netscape as a clear opportunity to entrench Navigator at the heart of the next generation of computing, and thus gain the opportunity to expand into all manner of other software and service markets.

Decline

With the success of Netscape showing the importance of the web (more people were using the Internet due in part to the ease of using Netscape), Internet browsing began to be seen as a potentially profitable market. Following Netscape’s lead, Microsoft started a campaign to enter the web browser software market. Microsoft licensed the Mosaic source code from Spyglass, Inc. (which in turn licensed code from the University of Illinois) and used this basic code to create Internet Explorer (IE).

The competition between Microsoft and Netscape dominated the Browser Wars. Internet Explorer 1.0 (shipped in the Internet Jumpstart Kit in Microsoft Plus! for Windows 95) and IE 2.0 (the first cross-platform version of the web browser, supporting both Windows and Mac OS) were widely considered inferior and primitive when compared to contemporary versions of Netscape Navigator. With the release of IE 3.0 (1996) Microsoft was able to catch up with Netscape competitively, with IE 4.0 (1997) further improving in terms of market share. IE 5.0 (1999) improved stability and took significant market share from Netscape Navigator for the first time.

There were two versions of Netscape Navigator 3.0; the Standard Edition and the Gold Edition. The latter consisted of the Navigator browser with e-mail, news readers, and a WYSIWYG web page compositor; however, these extra functions enlarged and slowed the software, rendering it prone to crashing.

This Gold Edition was renamed Netscape Communicator starting with version 4.0; the name change diluted its name-recognition and confused users. Netscape CEO James L. Barksdale insisted on the name change because Communicator was a general-purpose client application, which contained the Navigator browser.

The aging Netscape Communicator 4.x was slower than Internet Explorer 5.0. Typical web pages had become heavily illustrated, often JavaScript-intensive, and encoded with HTML features designed for specific purposes but now employed as global layout tools (HTML tables, the most obvious example of this, were especially difficult for Communicator to render). The Netscape browser, once a solid product, became crash-prone and buggy; for example, some versions re-downloaded an entire web page to re-render it when the browser window was re-sized (a nuisance to dial-up users), and the browser would usually crash when the page contained simple Cascading Style Sheets, as proper support for CSS never made it into Communicator 4.x. At the time that Communicator 4.0 was being developed, Netscape had a competing technology called JavaScript Style Sheets (JSSS). Near the end of the development cycle, it became obvious that CSS would prevail, so Netscape quickly implemented a CSS-to-JSSS converter, which then processed CSS as JSSS (this is why turning JavaScript off also disabled CSS). Moreover, Netscape Communicator’s browser interface design appeared dated in comparison to Internet Explorer and interface changes in Microsoft’s and Apple’s operating systems.

At decade’s end, Netscape’s web browser had lost dominance on the Windows platform, and the August 1997 Microsoft financial agreement to invest $150 million in Apple required that Apple make Internet Explorer the default web browser in new Mac OS distributions. The latest IE release for the Mac at that time was Internet Explorer 3.0 for Macintosh, but Internet Explorer 4 was released later that year.

Microsoft succeeded in having ISPs and PC vendors distribute Internet Explorer to their customers instead of Netscape Navigator, mostly due to Microsoft using its leverage from Windows OEM licenses, and partly aided by Microsoft’s investment in making IE brandable, such that a customized version of IE could be offered. Also, web developers used proprietary, browser-specific extensions in web pages. Both Microsoft and Netscape did this, having added many proprietary HTML tags to their browsers, which forced users to choose between two competing and almost incompatible web browsers.

In March 1998, Netscape released most of the development code base for Netscape Communicator under an open source license. Only pre-alpha versions of Netscape 5 were released before the open source community decided to scrap the Netscape Navigator codebase entirely and build a new web browser around the Gecko layout engine, which Netscape had been developing but had not yet incorporated into a released product. The community-developed open source project was named Mozilla, after Netscape Navigator’s original code name. America Online bought Netscape; Netscape programmers took a pre-beta-quality form of the Mozilla codebase, gave it a new GUI, and released it as Netscape 6. This did nothing to win back users, who continued to migrate to Internet Explorer. After a long public beta test, Mozilla 1.0 was released on 5 June 2002, with Netscape 7 following later that year. The same codebase, notably the Gecko layout engine, became the basis of independent applications, including Firefox and Thunderbird.

On 28 December 2007, the Netscape developers announced that AOL had canceled development of Netscape Navigator, leaving it unsupported as of 1 March 2008. Despite this, archived and unsupported versions of the browser remain available for download. Firefox would go on to win back market share from Internet Explorer in the next round of the browser wars.

Legacy

Netscape’s contributions to the web include JavaScript, which was submitted as a new standard to Ecma International. The resulting ECMAScript specification allowed JavaScript to be supported by multiple web browsers and used as a cross-browser scripting language, long after Netscape Navigator itself had dropped in popularity. Another example is the FRAME tag, which is widely supported today and has been incorporated into official web standards such as the “HTML 4.01 Frameset” specification.

In a 2007 PC World column, the original Netscape Navigator was considered the “best tech product of all time” due to its impact on the Internet.


NEC Corporation – NEC.com was registered

Date: 10/27/1986

On October 27, 1986, NEC Corporation registered the nec.com domain name, making it the 30th .com domain ever to be registered.

NEC Corporation is a Japanese multinational provider of information technology (IT) services and products, headquartered in Minato, Tokyo, Japan. NEC provides IT and network solutions to business enterprises, communications services providers and to government agencies, and has also been the biggest PC vendor in Japan since the 1980s. The company was known as the Nippon Electric Company, Limited, before rebranding in 1983 as just NEC. Its NEC Semiconductors business unit was one of the worldwide top 20 semiconductor sales leaders before merging with Renesas Electronics. NEC is a member of the Sumitomo Group.

Company History

NEC Corporation is one of the world’s leading makers of computers, computer peripherals, and telecommunications equipment and owns majority control of NEC Electronics Corporation, one of the leading semiconductor makers in the world. NEC is considered one of Japan’s sogo denki, or general electric companies, a group that is typically said to also include Fujitsu Limited; Hitachi, Ltd.; Mitsubishi Electric Corporation; and Toshiba Corporation. Like the other members of the Japanese high-tech “Big Five,” NEC was hit hard at the turn of the millennium by a global downturn in demand in the corporate sector for electronics products. NEC has subsequently been undertaking an ongoing and massive restructuring, including an increasing emphasis on systems integration services, software, and Internet-related services. These operations are housed within the company’s IT Solutions business segment, which also includes mainframe computers, network servers, supercomputers, workstations, and computer peripherals. The Networking Solutions segment comprises optical, broadband, and wireless networking equipment and services as well as cellular phones and other communications devices. Computers, printers and other peripheral devices, Internet services, and network equipment aimed at the consumer market are handled through the Personal Solutions segment. About 17 percent of NEC’s net sales originate outside of Japan.

Early History Involving Western Electric Company

The Nippon Electric Company, Limited, as NEC Corporation was originally known, was first organized in 1899 as a limited partnership between Japanese investors and the Western Electric Company. Western Electric recognized that Japan, which was undergoing an ambitious industrialization, would soon be building a telephone network. With a solid monopoly in North America as the manufacturing arm of the Bell system, Western Electric sought to establish a strong market presence in Japan, as it had done in Europe. NEC went public the following year, with Western Electric a 54 percent owner. In need of a plant, NEC took over the Miyoshi Electrical Manufacturing Company in central Tokyo.

Under the management of Kunihiko Iwadare and with substantial direction from Western Electric, NEC was at first little more than a distributor of imported telephone equipment from Western Electric and General Electric. Iwadare, however, set NEC to producing magneto-type telephone sets and secured substantial orders from the Ministry of Communications for the government-sponsored telephone-network-expansion program. With steadily increasing, and guaranteed, business from the government, NEC was able to plan further expansion. In September 1900 NEC purchased from Mitsui a site at Mita Shikokumachi, where a second NEC factory was completed in December 1902.

In an attempt to heighten NEC’s competitiveness with rival Oki Shokai, Iwadare ordered his apprentices at Western Electric to study that company’s accounting and production-control systems. Takeshiro Maeda, a former Ministry of Communications official, recommended that NEC emphasize the consumer market, because he regarded the government sales as uncompetitive and limited. Still, government sales were the company’s major vehicle for growth, particularly with Japan’s expansion into Manchuria after the 1904-05 Russo-Japanese War.

Japan’s Ministry of Communications engineered an aggressive telecommunications program, linking the islands of Japan with commercial, military, and government offices in Korea and Manchuria. As was Bell in the United States, NEC was permitted a “natural,” though imperfect, monopoly over cable communications in Japan and its territories. NEC opened offices in Seoul in 1908 and Port Arthur (now Lüshun), China, in 1909.

A serious economic recession in Japan in 1913 forced the government to retrench sponsorship of its second telephone expansion program. Struggling to survive, NEC quickly turned back to importing–this time of such household appliances as the electric fan, a device never seen before in Japan. As quickly as it had fallen, the Japanese economy recovered in 1916, and the expansion program was placed back on schedule. Intelligent planning effectively insulated NEC from the effects of a second serious recession in 1922; NEC even continued to grow during that time. In the meantime, Western Electric’s stake in NEC was transferred in 1918 to the company’s international division, International Western Electric Company, Incorporated (IWE).

Relationship with Sumitomo Beginning in 1920s

In the early 1920s, IWE wanted to create a joint venture with NEC to produce electrical cables. NEC, however, lacked the industrial capacity to be an equal partner, and recommended the inclusion of a third party, Sumitomo Densen Seizosho, the cable-manufacturing division of the Sumitomo group. A three-way agreement was concluded, marking the beginning of an important role for Sumitomo in NEC’s operations.

On September 1, 1923, a violent earthquake severely damaged Tokyo and Yokohama, killing 140,000 people and leaving 3.4 million homeless. The Great Kanto Earthquake also destroyed four NEC factories and 80,000 telephone sets. Still, the government maintained its commitment to a modern telephone network and supported NEC’s development of automatic switching devices.

NEC began to work on radios and transmitting devices in 1924. As with the telephone project, the Japanese government sponsored the establishment of a radio network, the Nippon Hoso Kyokai, which began operation with Western Electric equipment from NEC. By May 1930, however, NEC had built its own transmitter, a 500-watt station at Okayama.

In 1925 American Telephone & Telegraph sold International Western Electric to International Telephone & Telegraph, which renamed the division International Standard Electric Corporation (ISE). Partially as a result, Yasujiro Niwa, a director who had joined NEC in 1924, felt NEC should lessen its dependence on technologies developed by foreign affiliates. In order to strengthen NEC’s research and development, Niwa inaugurated a policy of recruiting the best graduates from top universities. By 1928 NEC engineers had completed their own wire photo device.

The Japanese economy, which had been in a slump since 1927, fell into crisis after the Wall Street crash of 1929. With a rapidly contracting economy, the government was forced year after year to scale back its telecommunications projects. While it restricted imports of electrical equipment, the government also encouraged greater competition in the domestic market. Decreased subsidization and a shrinking market share reversed many of NEC’s gains during the previous decade.

The deployment of Japanese troops in Manchuria in 1931 created a strong wave of nationalism in Japan. Legislation was passed that forced ISE to transfer about 15 percent of its ownership in NEC to Sumitomo Densen. Under the directorship of Sumitomo’s Takesaburo Akiyama (Iwadare had retired in 1929), NEC began to work more closely with the Japanese military. A right-wing officers corps was at the time successfully engineering a rise to power and diverting money to military and industrial projects, particularly after Japan’s declaration of war against China in 1937. NEC’s sales grew sevenfold between 1931 and 1937, and by 1938 the company’s Mita and Tamagawa plants had been placed under military control.

Under pressure from the militarists, ISE was obliged to transfer a second block of NEC shares to Sumitomo; by 1941, ISE’s stake had fallen to 19.7 percent. Later that year, however, when Japan went to war against the Allied powers, ISE’s remaining share of NEC was confiscated as enemy property.

During the war, NEC worked on microwave communications and radar and sonar systems for the army and navy. The company took control of its prewar Chinese affiliate, China Electric, as well as a Javanese radio-research facility belonging to the Dutch East Indies Post, Telegraph and Telephone Service. In February 1943, Sumitomo took full control of NEC and renamed it Sumitomo Communication Industries. The newly named company’s production centers were removed to 15 different locations to minimize damage from American bombings. Despite this, Sumitomo Communication’s major plants at Ueno, Okayama, and Tamagawa were destroyed during the spring of 1945; by the end of the war in August, the company had ceased production altogether.

Struggling to Recover Following World War II

The Allied occupation authority ordered the dissolution of Japan’s giant zaibatsu (conglomerate) enterprises such as Sumitomo in November that year. Sumitomo Communications elected to readopt the name Nippon Electric, and ownership of the company reverted to a government liquidation corporation. At the same time, the authority ordered a purge of industrialists who had cooperated with the military during the war, and Takeshi Kajii, wartime president of NEC, was removed from the company.

NEC’s new president, Toshihide Watanabe, faced the nearly impossible task of rehabilitating a company paralyzed by war damage, with 27,000 employees and no demand for its products. Although it was helped by the mass resignation of 12,000 workers, NEC was soon constrained by new labor legislation sponsored by the occupation authority. This legislation resulted in the formation of a powerful labor union that frequently came into conflict with NEC management. Although NEC was able to open its major factories by January 1946, workers demanding higher wages went on strike for 45 days only 18 months later.

The Japanese government helped NEC and other companies to remain viable through the award of public works projects. Uneasy about becoming dependent on these programs, however, Watanabe ordered the reapplication of NEC’s military technologies for commercial use. Submarine sonar equipment was thus converted into fish detectors, and military two-way radios were redesigned into all-band commercial radio receivers.

Still, NEC fell drastically short of its postwar recovery goals. In April 1949 the company closed its Ogaki, Seto, and Takasaki plants and its laboratory at Ikuta, and laid off 2,700 employees. The union responded by striking, yielding only after 106 days.

Next on Watanabe’s agenda was the establishment of patent protection for NEC’s technologies. During the war, all patented designs had become a “common national asset,” placed in the public domain. Eager to reestablish its link with ISE, NEC needed first to ensure that both companies’ technologies would be legally protected. This accomplished, NEC and ISE signed new cooperative agreements in July 1950.

Diversifying and Expanding Internationally in the 1950s and 1960s

With Japan’s new strategic importance in light of the Korean War, and with the advent of commercial radio broadcasting and subsequent telephone expansions, NEC had several new opportunities for growth. The company made great progress in television and microwave communication technologies and in 1953 created a separate consumer-appliance subsidiary called the New Nippon Electric Company. The company had begun research and development on transistors in 1950; it entered the computer field four years later, and in 1960 began developing integrated circuits. By 1956 NEC had diversified so successfully that a major reorganization became necessary and additional plant space in Sagamihara and Fuchu was put on line. NEC also established foreign offices in Taiwan, India, and Thailand in 1961. Watanabe, believing that NEC should more aggressively establish an international reputation, created a marketing subsidiary in the United States in 1963 called Nippon Electric New York, Inc. (which later became NEC America, Inc.). In addition, the company changed its logo, dropping the simple igeta diamond and “NEC” for a more distinctive script. In November of the following year, Watanabe resigned as president and became chairman of the board.

The company’s new president, Koji Kobayashi, took office with the realization that because the Japanese telephone market would soon become saturated, NEC would have to diversify more aggressively into new and peripheral electronics product lines to maintain its high growth rate. In preparation for this, he introduced modern management methods, including a zero-defects quality-control policy, a concept borrowed from the Martin Aircraft Company. Over the next two years, Kobayashi split NEC’s five divisions into 14, paving the way for a more decentralized management system that gave individual division heads greater autonomy and responsibility. With the continued introduction of more advanced television-broadcasting equipment and telephone switching devices, and taking advantage of the stronger position Watanabe and Kobayashi had created, NEC opened factories in Mexico and Brazil in 1968, Australia in 1969, and Korea in 1970. Affiliates were opened in Iran in 1971 and Malaysia in 1973. With a diminishing need for technical-assistance programs, NEC moved toward greater independence from ITT. That company’s interest in NEC (held through ISE) was reduced to 9.3 percent by 1970, and eliminated completely by 1978. Similarly, NEC shares retained after the war by Sumitomo-affiliated companies were gradually sold off, an action that reduced the Sumitomo group’s interest in NEC from 38 percent in 1961 to 28 percent in 1982.

NEC’s competitive advantage in labor costs eroded continually from the mid-1960s, when worker scarcity became apparent, until the early 1980s. This, together with President Richard Nixon’s decision to remove the U.S. dollar from the gold standard in 1971 and the effects of the Arab oil embargo of 1973, profoundly compromised NEC’s competitive standing. The company was forced into a seven-month retrenchment program in 1974, losing precious momentum in its competition with European and American firms.

Pursuit of C&C Vision Beginning in Late 1970s

In an effort to promote Japanese electronics companies, the Japanese government pushed through a series of partnership agreements among the Big Six computer makers: NEC, Fujitsu, Hitachi, Mitsubishi, Oki, and Toshiba. NEC and Toshiba formed a joint venture, which gave both companies an opportunity to pool their resources and eliminate redundant research. However, a subsequent attempt by NEC to enter the personal computer market failed miserably. Still, NEC, choosing to work with Honeywell instead of building IBM compatibles, invested heavily in its computer operations.

Later in the 1970s, NEC’s computer activities suffered from the fall of Honeywell’s computer fortunes. NEC recovered by relying more on its ability to develop systems in-house. The company was further spurred on by the visionary Kobayashi’s concept of “C&C,” his prediction of the future integration of computers and communications. This prescient vision, which was initially scoffed at, was first announced by NEC at INTELCOM 77. By 1984 NEC had sold more than one million personal computers in Japan. By 1990 the company, whose Japanese personal computers used a proprietary NEC operating system, held a commanding 56 percent share of the Japanese market, as well as a top five position in the United States, where it sold PC clones.

Kobayashi, in the meantime, was promoted to chairman and CEO, and succeeded as president first by Tadao Tanaka in 1976, and then Tadahiro Sekimoto in 1980. Under Kobayashi and Tanaka, NEC tripled its sales volume in the ten years to 1980. A greater proportion of those sales than ever before was derived from foreign markets, and between 1981 and 1983 NEC’s stock was listed on several European stock exchanges. In 1982 an NEC plant in Scotland began to manufacture memory devices, then in 1987 NEC Technologies (UK) Ltd. was established in the United Kingdom to manufacture printers and other products for the European market. In 1984 NEC, Honeywell, and France’s Groupe Bull entered into an agreement involving the manufacture and distribution of NEC mainframe computers; the deal also provided for cross-licensing of patents and copyrights among the three companies. One year earlier, Nippon Electric changed its English-language name to NEC Corporation.

Meanwhile, in the United States NEC formed NEC Electronics, Inc. in 1981 to be the company’s manufacturing and marketing arm for semiconductors in the United States. This subsidiary in 1984 opened a $100 million plant in Roseville, California, to manufacture electron devices. In 1989 another U.S. subsidiary, NEC Technologies, Inc., was established to handle the company’s computer peripheral operations in the United States.

Increased International Profile in the Early to Mid-1990s

By 1989, NEC’s sales had reached ¥3.13 trillion ($21.3 billion). The company’s focus on C&C had led it to top five positions in computer chips, computers, and telecommunications equipment. Like IBM, NEC was also vertically integrated, which added to its strength. Although NEC, like other Japanese electronics companies, suffered from the Japanese recession and strong yen of the early 1990s and from increased competition in Japan from U.S. companies, its aggressive pursuit of overseas opportunities helped the company maintain its leading and varied positions.

In Europe, NEC began selling its IBM-compatible PowerMate line in 1991. In late 1993 the relationship between NEC and Groupe Bull was strengthened with an additional NEC investment of ¥7 billion ($64.5 million) in the troubled state-owned computer manufacturer. By 1996 NEC had a 17 percent stake in Groupe Bull. In 1995 NEC spent $170 million to gain a 19.9 percent stake in Packard Bell Electronics, Inc., the leading U.S. marketer of home computers. In February 1996, NEC, Groupe Bull, and Packard Bell entered into a complex three-way arrangement. NEC invested an additional $283 million in Packard Bell, while Packard Bell acquired the assets of Groupe Bull’s PC subsidiary, Zenith Data Systems. In June of that same year, NEC and Packard Bell merged their PC businesses outside of China and Japan into a new firm called Packard Bell NEC Inc., with NEC investing another $300 million for a larger stake in Packard Bell. Packard Bell NEC immediately became the world’s fourth largest PC maker, trailing only Compaq, IBM, and Apple.

As the 1990s progressed, NEC increasingly looked to parts of Asia outside Japan for manufacturing and sales opportunities, particularly in semiconductors, transmission systems, cellular phones, and PCs. During fiscal 1996, for example, NEC entered into several joint ventures in China for the production and marketing of PBXs, PCs, and digital microwave communications systems and in Indonesia for the manufacture of semiconductors. In May 1997 NEC took a 30 percent stake in a $1 billion joint venture to construct the largest semiconductor factory in China.

In 1994 NEC announced the development of the SX-4 series of supercomputers, touted as the world’s fastest. U.S.-based competitor Cray Research Inc. later filed a complaint with the U.S. Commerce Department accusing NEC of dumping the series in the U.S. market. The Commerce Department in March 1997 ruled in Cray Research’s favor and imposed a 454 percent tariff on NEC’s supercomputers. NEC’s subsequent appeals of this ruling failed.

By 1996, Sekimoto had become chairman of NEC and Hisashi Kaneko was serving as president (Kobayashi died in 1996, when he still held the post of honorary chairman). During the 1990s, these executives had led NEC to increase the share of its sales derived outside Japan from 20 percent in 1990 to 28 percent in 1996. Nonetheless, NEC also kept its sights on its home market; NEC’s share of the domestic PC market had fallen to about 50 percent by 1996, leading to a plan to sell IBM-compatible computers in Japan for the first time. In October 1997 NEC began selling PCs in Japan with Intel microprocessors and the Microsoft Windows operating system. The move to belatedly adopt what had become the international PC standard was made to support NEC’s drive to increase its share of the global PC market.

Struggling and Restructuring: Late 1990s and Early 2000s

Through additional investments of $285 million in 1997 and $225 million in 1998 NEC gained majority control of Packard Bell NEC, which became a subsidiary of the Japanese firm. NEC and Groupe Bull had now infused more than $2 billion into Packard Bell, but the U.S. firm continued to hemorrhage, posting losses of more than $1 billion in 1997 and 1998. The company and its U.S.-based manufacturing simply could not compete with lower-cost contract manufacturers based in Asia and elsewhere. Late in 1999, with Packard Bell NEC on its way to posting another loss, NEC pulled the plug. Packard Bell’s California plant was closed, and NEC decided to abandon the retail PC market in the United States. The Packard Bell brand disappeared from the U.S. scene. NEC began focusing its U.S. PC efforts on the corporate market, where it sold computers under the NEC brand.

Meanwhile, NEC was being buffeted by a host of additional problems. The prolonged economic downturn in Japan depressed demand for high-tech products, including personal computers and consumer electronics. At the same time, fierce international competition among both electronics and chip makers was cutting drastically into profit margins on consumer electronics and semiconductors. Compounding matters was the economic crisis that erupted in Asia in mid-1997. As a result, net income for fiscal 1998 plummeted 55 percent, and the following year NEC, further battered by a sharp increase in the value of the yen, fell into the red, posting a net loss of ¥157.9 billion ($1.34 billion), its largest loss to that time. In October 1998, Sekimoto resigned from the chairmanship following the revelation of NEC’s involvement in a defense procurement scandal. Executives of a partly owned NEC subsidiary were charged with overbilling Japan’s Defense Agency and with bribing officials at the agency to gain business. The executives were later convicted.

At the end of 1999, Hajime Sasaki took over as NEC chairman, and Koji Nishigaki replaced Kaneko as president. Reflecting the profound changes that were needed to turn the company’s fortunes around, Sasaki was the first chairman to have come from NEC’s semiconductor side, rather than the telecommunications operations, and Nishigaki was the first president to come from the computer-systems divisions as well as the first with a background in marketing rather than engineering. The new managers recognized that they would have to make fundamental changes to the way NEC operated.

To improve profitability, they almost immediately announced that the workforce would be reduced by 10 percent, or 15,000 positions, with 6,000 workers laid off overseas and 9,000 job cuts in Japan coming through attrition (layoffs still being anathema in that society). They began a debt-reduction program to improve NEC’s financial structure, which in early 1999 was weighed down by ¥2.38 trillion ($20.13 billion) in liabilities. NEC also shifted its focus from hardware to the Internet and Internet-related software and services, building on its ownership of Biglobe, one of the leading Internet service providers in Japan, boasting 2.7 million members in late 1999, three years after the service’s launch. To support this shift, NEC in April 2000 reorganized its operations into three autonomous in-house companies based on the customers and markets served: NEC Solutions, providing Internet solutions for corporate customers and individuals; NEC Networks, focusing on Internet solutions for network service providers; and NEC Electron Devices, supplying device solutions for manufacturers of Internet-related hardware. Although traditionally a go-it-alone company, NEC, like the other sogo denki, began aggressively pursuing joint ventures with its competitors to spread the costs and risks of developing new products. In one of the first such ventures, NEC joined with Hitachi to form Elpida Memory, Inc., to make the dynamic random-access memory chips (DRAMs) used in personal computers. Ventures were also formed with Mitsubishi Electric in the area of display monitors and with Toshiba in space systems. NEC’s deemphasis of manufacturing also led to the closure of a number of plants located outside of Japan.

Although these and other initiatives helped NEC return to profitability in fiscal 2000 and 2001, the global downturn in the information technology sector, coupled with heightened competition from China and other low-cost countries and the economic fallout from the terrorist attacks of 9/11, sent the company deep into the red again in fiscal 2002; a net loss of ¥312 billion ($2.35 billion) was reported, reflecting ¥370.47 billion ($2.79 billion) in restructuring and other charges. The semiconductor sector suffered the deepest falloff, with the prices of certain commodity chips plunging by nearly 90 percent. In response, NEC Electron Devices closed down a number of plants and eliminated 4,000 jobs, including 2,500 in Japan. An additional 14,000 job cuts were announced in early 2002, along with additional plant closings and the elimination of certain noncore product lines.

In a further cutback of company assets and in an attempt to raise cash for other initiatives, NEC began taking some of its subsidiaries public. Both NEC Soft, Ltd., a developer of software, and NEC Machinery Corporation, a producer of semiconductor manufacturing machinery and factory automation systems, were taken public in 2000. In February 2002 NEC sold about a one-third interest in NEC Mobiling, Ltd., a distributor of mobile phones and developer of software for mobile and wireless communications network systems, to the public. Later that year, a similar one-third interest was sold to the public in NEC Fielding, Ltd., a provider of maintenance services for computers and computer peripheral products. NEC’s most radical such maneuver involved its troubled semiconductor business. In November 2002 all of NEC’s semiconductor operations, except for the DRAM business now residing within Elpida Memory, were placed into a separately operating subsidiary called NEC Electronics Corporation. NEC then reduced its stake in the newly formed company to 70 percent through a July 2003 IPO that raised ¥155.4 billion ($1.31 billion).

Restructuring efforts continued in 2003 under the new leadership of Akinobu Kanasugi, who took over as president from Nishigaki, who was named vice-chairman. Kanasugi had previously been in charge of NEC Solutions. Concurrent with the appointment of the new president, NEC replaced its in-house company structure with a business line structure, with NEC Solutions evolving into an IT Solutions segment (comprising systems integration services, software, and Internet-related services as well as computers and peripherals) and NEC Networks becoming Network Solutions (comprising network integration services as well as telecommunications and broadband Internet equipment). The eventual goal was to merge the information technology and networking groups in order to offer fully integrated “total” IT/networking/telecommunications solutions encompassing software, operational and maintenance services, and equipment. NEC also established a Personal Solutions group to offer a full range of products and services, including Biglobe, to consumers. The transformation of NEC was far from complete, and its success uncertain (the firm had posted its second straight net loss during fiscal 2003), but the company’s restructuring efforts were as aggressive as, if not more aggressive than, those of the other big Japanese electronics firms. NEC seemed determined to remain among the world’s high-tech leaders.


NCR Corporation – NCR.com was registered

Date: 04/30/1987

On April 30, 1987, NCR Corporation registered the ncr.com domain name, making it the 72nd .com domain ever to be registered.

The NCR Corporation (originally National Cash Register) is an American computer hardware, software and electronics company that makes self-service kiosks, point-of-sale terminals, automated teller machines, check processing systems, barcode scanners, and business consumables. It also provides IT maintenance support services. NCR had been based in Dayton, Ohio, starting in 1884, but in June 2009 the company sold most of the Dayton properties and moved its headquarters to metro Atlanta. Currently the headquarters are in unincorporated Gwinnett County, Georgia, near Duluth and Alpharetta, with a future headquarters planned for the end of 2016 at Technology Square, adjacent to the Georgia Institute of Technology in Atlanta, Georgia.

NCR was founded in 1884 and acquired by AT&T in 1991. A restructuring of AT&T in 1996 led to NCR’s re-establishment on 1 January 1997 as a separate company, and involved the spin-off of Lucent Technologies from AT&T. NCR is the only AT&T spin-off company that has retained its original name—all the others have either been purchased or renamed following subsequent mergers.

Company History:

When National Cash Register Company was formed during the last two decades of the 19th century, it had one product: cash registers. Today NCR Corporation, as it is now known, develops and markets a wide range of computer and terminal systems; office automation products; automated teller, data warehousing, and telecommunications services; semiconductor components; software; and business forms and supplies. Among NCR’s claims to fame are its introduction of bar code scanning in 1974, its position as a world leader in paper roll products for cash registers, and the fact that fully 40 percent of the checks issued around the globe are cleared with NCR equipment.

Origins

NCR’s first years were shaped in large part by John Henry Patterson, who was president from 1884 to 1921. Patterson’s early emphasis on sales, his initiation of business practices that became standards for other companies and industries, and his pioneering efforts in industrial welfare made NCR a role model for other companies during the late 1800s and early 1900s.

While running a dry goods operation in Ohio during the early 1880s, Patterson found he was losing money because not all sales were being reported by his clerks. When Patterson learned of a device called a cash register, he ordered two from James and John Ritty, who had recently established a Dayton, Ohio-based company called National Cash Register. In 1882 the Rittys sold part of their company and renamed it the National Manufacturing Company.

Patterson, meanwhile, was reaping such financial rewards from the use of his cash registers that he bought stock in the Rittys’ company. He eventually joined the board of directors and suggested that the company use nationwide marketing techniques to sell its cash registers. Patterson’s ideas met with opposition, and in 1884 he bought additional stock and took control of the company. Once president, Patterson again named the company National Cash Register Company and moved quickly to change NCR’s emphasis from manufacturing to sales. His interest in sales led to the concept of quotas and guaranteed sales territories for agents. Patterson also provided his agents with sales seminars, training brochures, and scripted sales pitches, and required them to wear white shirts and dark suits. All of these practices were new at the time but soon became widespread at other companies.

Cash register sales responded almost immediately to Patterson’s techniques. Sales more than doubled to 1,000 machines a year by 1886, while by 1888 the number of NCR employees had grown from 12 to more than 100. About this time Patterson also began to produce various forms of printed advertising. Beginning in the late 1880s, prospective customers were inundated with weekly or monthly advertising circulars and direct-mail letters describing products. Employees’ publications were introduced to bolster communication and enthusiasm about meeting sales quotas. Output–the first employee newspaper–listed sales, discussed the benefits of cash registers, and printed encouraging words from satisfied customers.

Poor economic conditions in the 1890s affected many companies in the United States, including NCR. Between 1892 and 1897 the company’s production was reduced and employees worked scaled-down weeks. The company also looked more closely at the manufacturing side of business: a system of interchangeable parts for cash register models was introduced, streamlining production and trimming overhead.

In 1894 NCR constructed a new and larger “safety-conscious” facility in Dayton with the aid of bank loans. The following year Patterson hired Thomas J. Watson, who rose quickly through the sales ranks to become a sales manager in New York and later became part of an inner circle of Dayton executives. It was Watson who led the campaign to reduce competition, including a massive advertising blitz as well as an adamant defense of patents. By 1897 NCR’s competition had been reduced to three companies, down from more than 80 a decade before.

In 1900 NCR reported the sale of its 200,000th cash register. It now employed a record 2,269 people. That same year the company was chartered as a New Jersey corporation for the purpose of selling stock. Construction of a ten-building facility began in 1906, and overseas operations, which had been established in the 1880s, were growing as well. In a company publication, NCR boasted that its sales force extended from Norway and Alaska to New Zealand and China, with nearly 1,000 agents in more than 270 offices.

First Electric Cash Register in 1906

In 1906 a young inventor named Charles F. Kettering gave the company its first electric cash register. Kettering, who had been hired just two years earlier, also developed NCR’s Class 1000 machine, a low-cost redesigned register that remained in production for nearly 40 years with only minor changes. Kettering left the company in 1909 to join the automotive industry.

Spurred by the success of Kettering’s cash register and the Class 1000 machine, sales continued to climb throughout the early 1900s. By 1911 NCR had sold a million machines. The company’s aggressive battle to secure patent rights and fend off competition led the American Cash Register Company to file an antitrust complaint based on the Sherman Antitrust Act, a federal law prohibiting the monopolistic restraint of trade or commerce. In 1912 the government brought NCR to trial and presented 32 cases of alleged trade interference. The following year Patterson, Watson, and 20 other officers were found guilty of trade restraint and unlawful monopoly in three of those cases. (The decision would be reversed two years later by a higher court.) In 1913, however, Watson left the company after a falling out with Patterson.

The Dayton Flood of 1913 brought more attention to NCR. Under Patterson’s leadership, the company responded to the flood by suspending all operations and providing relief shelter in company facilities. Shortly thereafter, during the early stages of World War I, NCR continued to make cash registers while involved in wartime production contracts with the government. By 1919 the company was operating almost solely on a wartime production basis.

The 1920s marked NCR’s gradual entrance into its accounting machine era. NCR already had proved its dominance in the cash register field, having controlled more than 95 percent of that market prior to the outbreak of the war. In 1921 NCR announced its Class 2000 itemizer, which provided 30 totals and embodied what the company believed were the best features of all previous registers. John Henry Patterson passed the reins of the company presidency in 1921 to his son Frederick Beck Patterson, who also assumed the duties of the chairman of the board after his father’s death a year later.

Frederick Patterson exercised voting control over NCR after the death of his father, while comptroller Stanley C. Allyn and director John H. Barringer led the company’s first major diversification drive. NCR’s profits rose from $2.8 million in 1921 to $7.8 million in 1925. Because of its success, the company went public with stock sales for the first time.

The 1920s were good years for office equipment firms. After 1925 competitors made inroads into the cash register market, while NCR failed to introduce new products. Sales flattened for NCR, and by 1928 Remington Rand topped the list of business machine companies, taking in $59 million to second-running NCR’s $49 million. Young IBM was fourth at the time with $19 million reported in sales.

Struggling During the Great Depression

In attempts to hasten the diversification drive, NCR purchased the Ellis Adding-Typewriter Company in January 1929. That same year the company announced the Class 3000, NCR’s first hybrid machine, which represented an advance in the area of payroll, billing, and accounting operations. The promise of the new machine was dampened by the Depression later that year. Sales and earnings plunged while the company began a four-year process of cutting the number of its employees in half. With NCR nearly bankrupt by 1931, New York bankers Dillon, Read and Company, who had set up the 1925 stock sales, were ready to invade the company. In response, NCR’s board of directors sought out Edward Deeds to take control of the company, and Frederick Patterson agreed to step down as chairman in 1931. Patterson remained as president until Deeds assumed that additional post in 1936; it was Deeds who turned things around for NCR.

Joining the company at the beginning of the century, Deeds had been put in charge of engineering and construction for a new factory. By 1910 he had become vice-president. Deeds left NCR for Delco in 1915 and later helped found the Wright Airplane Company with Orville Wright, Charles Kettering, and H.E. Talbott. Deeds’s success by 1931 was evident, as he sat on the corporate boards of 28 companies. Shortly after Deeds took control, the company purchased the Remington Cash Register Company, whose assets strengthened NCR’s position. In 1934 the company moved back into the black. Despite broad price fluctuations, by mid-decade sales were stabilizing and overseas operations were expanding in Great Britain, Europe, Central America, South America, and the Middle East and Far East. By the end of the decade NCR was third in the business machine field behind Remington and fast-climbing IBM. NCR in 1939 earned $12 million less than it had the year prior to the Depression. In 1940 Stanley Allyn assumed the post of president, while Deeds continued to serve as chairman and chief executive.

Effects of World War II and Its Aftermath

World War II had a significant impact on NCR, as well as on other data processing and business machine companies, spurring the conversion from office tabulating equipment to data processing. By the time the United States entered the war in 1941, NCR’s expansion into Central America and South America in the 1930s had gained importance, helping to offset the wartime reduction or elimination of operations in Japan, Germany, and Australia. For the next few years the sale of rebuilt machines was the only business NCR continued in countries directly involved in the war. By 1942 the U.S. War Production Board halted the manufacturing of all cash registers to conserve metal.

Wartime contracts for such items as bomb fuses and rocket motors covered NCR’s overhead during the war, while reconditioning of machines provided modest profits. The company’s in-house electronics research program, established prior to World War II, was utilized by the U.S. Navy during the war years. NCR built a computerlike device to calculate bombing navigational data. It also worked on a secret operation to assist the Navy in breaking the German ENIGMA communication cipher. Dubbed “the Bombe,” the mechanism was actually a high-speed electromechanical decrypting machine; about 120 Bombes were built during the course of the war.

By the war’s end a pent-up market for cash registers and accounting machines resulted in a hiring surge for NCR in Dayton. Business boomed after the war. Between 1946 and 1949 NCR reestablished itself in war-torn areas of the United Kingdom, West Germany, and Japan. Improvements and expansion continued into the early 1950s, with a rebuilt plant in Australia, a new factory in Toronto, and new office buildings in Hawaii and Mexico.

Entering the Computer Business in the 1950s

NCR continued its electronics work after the war and in 1952 secured a defense contract for a bombing navigational computer. That same year the company entered into a stock purchase agreement with Computer Research Corporation, which became its electronics division the following year. Development of a computer designed for scientific work had limited impact, and the company’s role in the computer industry remained conservative in the mid-1950s. But the 1956 introduction of the Post-Tronic, an electronic posting machine for banking, was successful. Sales of the Post-Tronic eventually passed the $100 million mark before the machine passed out of use near the end of the 1960s.

With NCR on the edge of a new era, the aging Deeds retired as chairman in 1957 and was succeeded by Allyn. Robert S. Oelman, who had been instrumental in procuring wartime contracts as a company vice-president, became president. Later that year NCR announced the 304, a general purpose computer based on solid-state technology. A few years later, in 1960, NCR’s first “small” computer–the 390, manufactured by Control Data Corporation (CDC)–made its debut.

In the early 1960s NCR increased its development of computers, as well as peripheral devices and software. In 1962 Oelman became chairman of the board, and R. Stanley Laing was named president two years later. Mid-decade saw NCR continue to operate under a split sales strategy, targeting its old customer line as well as new customers in the data processing market. NCR’s computer-related products were successful, but its innovations still remained conservative; the company’s marketplace continued to revolve around banking and retailing.

By the end of the 1960s NCR often was referred to as one of the “Seven Dwarfs” because of its relative position of inferiority to IBM. Joining NCR in these ranks were General Electric (GE), RCA, Burroughs, UNIVAC, CDC, and Honeywell. With GE and RCA bowing out of the computer field in the early 1970s, the five remaining companies became known as the BUNCH, an acronym made up of the first letter of each name.

NCR announced its third generation of computers in 1968 with the introduction of the Century Series, which included a variety of business applications and allowed NCR to market its wares to a broader customer base. NCR’s failure to take advantage of new conditions calling for terminals and software cost it some market share and resulted in a trend of declining profits from 1969 to 1972.

The first half of the 1970s marked the greatest transition period in the history of NCR as it attempted to move full force into the computer market. The period was marred by a number of setbacks that were worsened by an inflationary economy and poor business climate. Labor costs to produce older technology products were enormous, and the company also had marketing problems. Layoffs followed declining earnings, and the company was hit by a three-month strike at its Dayton plant in late 1971. The strike idled 8,500 production and maintenance employees, sharply reduced equipment deliveries, and cost the company millions of dollars in lost orders.

In 1971 NCR entered into a cooperative agreement with CDC to establish a computer peripherals company. The following year NCR established its microelectronics division. Declining profits continued through 1972, and the company posted its first net loss since 1933.

With revenues on shaky ground, William S. Anderson was named president in 1972 and chairman of the board in 1974. Anderson, who had been successful in heading up NCR’s Far East operations and NCR Japan, was the first president to come from outside the parent company. His success in Japan was due in part to the revamping of the company’s marketing organization there, and as president, Anderson quickly moved to modify NCR’s marketing structure through a similar “vocationalizing” system. The branch manager system, in which a branch manager was responsible for sales from a number of different industries, was replaced by a system whereby a district manager oversaw one major marketing area and marketing personnel were trained to specialize in a single vocational area. Areas of specialization included retail, finance, commercial business, industrial, medical, educational, governmental, and media. In 1974 NCR reported that its computer business was finally out of the red. That same year the company’s name was changed from National Cash Register to NCR Corporation.

Growth in the Late 1970s

NCR began making great strides in the computer field after naming Charles E. Exley, Jr., president in 1976. A 22-year veteran of Burroughs Corporation, Exley oversaw the introduction of a new series of computers and related equipment during the latter part of the decade. NCR’s 1976 announcement of the 8000 series was well received, and improvements were made throughout the remainder of the decade. NCR’s push into computers resulted in strong earnings, while the company began a series of acquisitions of smaller companies that boosted expertise in factory data systems, microcards, and IBM-compatible data systems. In fewer than five years NCR revamped its entire product line. During this time the company withdrew from the mainframe computer arena and moved closer to its traditional core industries such as banking and retailing. In 1979 the company passed the $3 billion revenue mark.

NCR came into the 1980s strong, posting its first double-digit increase in revenues in 1980, but growth stalled in 1981, and earnings dropped. Product lines besieged by bugs in the late 1970s resulted in user lawsuits being filed against NCR in the early 1980s. In 1980 a lawsuit was filed by Glovatorium, a small Oakland, California dry cleaning firm. Glovatorium, a first-time computer user, had purchased an NCR Spirit/8200 system to do routine accounting, but the system failed to work. NCR defended its case on the grounds that contracts with Glovatorium had contained limitations of liability and disclaimers. The California judge ruling in the case in 1981 said NCR had targeted first-time computer users and was under a special obligation to be fair in dealing with the user. Punitive damages totaling $2 million were awarded along with compensatory damages for breach of warranty and intentional misrepresentation. The following year NCR agreed to a $2.6 million settlement with Glovatorium.

In 1983 Exley was named chief executive officer, and in the following year he became board chairman. Under his leadership, NCR underwent a corporate restructuring process, made a push back into personal computers, began reemphasizing fiscal control, and started a long-term plan of repurchasing its own stock. The Tower family of microcomputers, which was introduced in 1982, became one of the keys to NCR’s success in the mid-1980s. By 1986 the company was again posting double-digit increases, while most of the computer industry was suffering from a market recession.

NCR’s revenues had grown to $6 billion by 1988, as the company developed customized products that generated significant indirect sales. Meanwhile NCR’s microelectronics division became a leading producer of semiconductors, and the company surpassed IBM as the largest worldwide supplier of automatic teller machines (ATMs). Personal computers and the Tower microcomputers also saw significant sales gains as emphasis shifted from mainframes to distributed processing.

In 1988 Gilbert P. Williamson was promoted from executive vice-president to president, while Exley remained chairman and CEO. The following year overall sales began to dip, although foreign sales were rising. The company closed out the decade as the last thriving member of the BUNCH that had avoided a merger or sellout of interests.

NCR expected to keep its products on par with the computer industry’s powerhouses. In late 1989 it announced that it was jumping into the market for microcomputers that were based on a powerful new microchip. The announcement helped NCR land an agreement with Businessland, Inc. to begin selling the new line in 1990. According to Exley, NCR entered the 1990s with a goal to “reach all markets.” The company had operations in nine countries, with products sold in more than 120 countries. NCR expected continued success in the ATM and semiconductor markets and expanded sales in technology and information processing markets. The company also expected indirect sales to increase, with a number of NCR-manufactured products being sold bearing other companies’ labels.

NCR looked for benefits from the implementation of “concurrent engineering,” to keep its operations on a par with Japanese competitors through a more timely and less costly manufacture of products. Concurrent engineering eliminated a number of independent steps of production, some of which had been contracted out, and replaced that system with one in which design engineers and manufacturing personnel collaborated in a closer working environment, thereby reducing the time needed to correct glitches. NCR had introduced concurrent engineering in 1987 in its new Atlanta, Georgia plant, and by the 1990s the concept was implemented to some degree in all of NCR’s manufacturing facilities.

The 1990s started with great promise for NCR. As the result of an April agreement with California-based Teradata Corporation to develop parallel-processing computer technologies, NCR received 1.4 million shares of Teradata stock. In May the J.C. Penney retail chain announced that it would buy $45 million worth of workstation systems from NCR; two months later, NCR negotiated a $10 million contract to automate the branch offices of the Fleet Norstar Financial Group.

Hostile Takeover by AT&T in 1991

Then NCR ran into a formidable adversary, the American Telephone & Telegraph Company (AT&T). Seeking to bolster its failing computer division, AT&T issued a bid for NCR in December 1990, placing the purchase price at $90 a share, or $6.1 billion. The bid was met with instant hostility by NCR, and over the next five months the tug-of-war was played out in the media. NCR Chairman Charles Exley publicly expressed his disdain at the thought of helping AT&T become profitable in the computer field and vowed to quit if the takeover were successful. AT&T countered with a proxy fight to unseat the NCR board of directors. Both sides hired high-powered advisers–takeover lawyer Martin Lipton and Chemical Bank for AT&T, and investment bank Goldman Sachs for NCR.

NCR fought hard by taking out full-page newspaper advertisements to turn public opinion its way and by asking the FCC to investigate AT&T’s bid. In the end, AT&T agreed to pay the $110 per share, or $7.4 billion, that NCR was demanding, stipulating, however, that payment be made in AT&T stock. The merger was completed in September 1991. In July NCR announced plans to create a new division to market computer products to telephone companies. NCR’s market position was slowed by the hostile takeover and subsequent adjustment period. Exley retired in February 1992 and Gilbert Williamson, NCR president, succeeded him as CEO. Elton White, executive vice-president, moved into the president’s spot.

Incorporating NCR, with its superior product development capabilities and focused marketing plan, into AT&T, whose computer products were not as sophisticated but whose market was universal, proved to be a challenging task. To counter the market drop, a restructuring of NCR occurred almost immediately. In August 1991, even before the merger was completed, plans to close NCR’s Cambridge, Ohio plant were announced. In November NCR’s Workstation Products Division was split into smaller groups that would function as independent corporations. A number of AT&T employees and products were moved into the division at this time. That same month, AT&T announced that 120 workers would be released from NCR’s Network Products Division in St. Paul, Minnesota.

Despite the internal upheaval caused by the hostile takeover bid, NCR continued to develop new products. A pen-based notepad computer, the NCR System 3125, was introduced in June 1991. The computer was the first of its kind to use an electronic stylus instead of a keyboard. The alliance with Teradata was solidified in December when NCR purchased the company for $520 million in AT&T stock. Ironically, Teradata’s biggest customer had been AT&T. In early 1993, after initially keeping a “hands-off” attitude toward NCR, AT&T installed one of its executives, Jerre Stead, as NCR CEO. Stead’s casual, “open-door” approach clashed with NCR’s conservative corporate culture, and his desire to broaden NCR’s focus and step up the company’s production of PCs was not popular in all quarters. In 1994 NCR was renamed AT&T Global Information Solutions (GIS).

Under AT&T’s management NCR/GIS was not performing up to par, however, and Stead jumped ship in 1995. The company found a replacement in Lars Nyberg, a Swede who had successfully turned around the fortunes of Philips Electronics N.V.’s computer division. Nyberg immediately began to make serious changes, announcing a restructuring that included the layoffs of 20 percent of the company’s workforce. NCR was reportedly losing almost $2 million a day for AT&T, and Nyberg also made the decision to get out of the PC business, in which NCR seemed to have few prospects for long-term success. The company also dropped the unpopular GIS name and became known as NCR once again.

In early 1996 AT&T announced that it would spin off NCR as part of a massive realignment, issuing to its shareholders NCR stock worth nearly $4 billion, or about half of what it had invested in the company four years earlier. NCR became independent in January 1997, and its stock resumed trading on the New York Stock Exchange. Nyberg continued his efforts to restore NCR’s fortunes and reorganized further during the year, cutting another 1,000 jobs and reconfiguring the company’s structure into five large divisions from 130 smaller ones. He also sold three of the company’s manufacturing plants to Solectron, Inc., which would continue to make computer hardware for NCR at the facilities. Two acquisitions were completed as well, those of Compris Technologies, Inc. and Dataworks, companies that made software for the food service and retail sectors. The company posted a small profit in 1997, its first in five years.

NCR’s fortunes were on the upswing in part because of the company’s focus on the relatively new field of data warehousing. Sifting through the vast amounts of data generated when millions of consumers used ATMs or made purchases, businesses could discern patterns that allowed narrow targeting of product pitches to individual customers. NCR had half of the market in this field, and analysts estimated that most Fortune 1000 companies would double the size of their data warehouses within the next several years. NCR was also the top maker of ATMs worldwide, with about 27 percent of the international market. As it continued to fine-tune operations in 1998, the company eliminated 5,200 more jobs and also repurchased $200 million worth of stock. Revenues for the year dropped by one percent but earnings increased more than 15-fold, to $122 million. NCR also acquired half ownership of Stirling Douglas Group, Inc., a maker of software for retail businesses, and announced a partnership with Microsoft to develop advanced data warehousing systems. In early 1999 NCR’s board approved a further $250 million stock buyback. Freed from the stranglehold of AT&T, NCR appeared to be making a remarkably swift recovery and was positioned for further growth with its command of the expanding data warehousing and ATM markets.

Napster – The History of Domain Names

Napster peer-to-peer file sharing

Date: 01/01/1999

Napster was the name given to two music-focused online services. It was originally founded as a pioneering peer-to-peer (P2P) file sharing Internet service that emphasized sharing digital audio files, typically songs, encoded in MP3 format. The original company ran into legal difficulties over copyright infringement, ceased operations and was eventually acquired by Roxio. In its second incarnation Napster became an online music store until it was acquired by Rhapsody from Best Buy on December 1, 2011.

Later companies and projects, such as Gnutella, Freenet, Kazaa, BearShare, and many others, successfully followed its P2P file-sharing example. Other services, like LimeWire, Scour, Grokster, Madster, and eDonkey2000, were brought down or changed due to copyright issues.

Origin

Napster was founded by Shawn Fanning, John Fanning, and Sean Parker. Initially, Napster was envisioned as an independent peer-to-peer file sharing service by Shawn Fanning. The service operated between June 1999 and July 2001. Its technology allowed people to easily share their MP3 files with other participants. Although the original service was shut down by court order, the Napster brand survived after the company’s assets were liquidated and purchased by other companies through bankruptcy proceedings.
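
The defining design choice was a centralized index with decentralized transfers: Napster’s server stored no music, only metadata about which connected peers were sharing which files, while the downloads themselves went directly from peer to peer. The sketch below illustrates that lookup layer in Python; every class, method, and peer address here is invented for illustration and does not reflect Napster’s actual protocol or wire format.

```python
from collections import defaultdict

class CentralIndex:
    """Napster-style central server: an index of which peers share which
    songs. The server holds only metadata; the files stay on the peers."""

    def __init__(self):
        self._index = defaultdict(set)  # song title -> set of peer addresses

    def register(self, peer_addr, shared_titles):
        """A peer connects and announces the titles in its shared folder."""
        for title in shared_titles:
            self._index[title.casefold()].add(peer_addr)

    def unregister(self, peer_addr):
        """A peer disconnects; drop it from every listing."""
        for peers in self._index.values():
            peers.discard(peer_addr)

    def search(self, title):
        """Return peers currently offering a title; the client would then
        connect to one of them directly to transfer the file."""
        return sorted(self._index.get(title.casefold(), set()))

if __name__ == "__main__":
    index = CentralIndex()
    index.register("10.0.0.5:6699", ["I Disappear", "Music"])
    index.register("10.0.0.9:6699", ["Music"])
    print(index.search("Music"))  # ['10.0.0.5:6699', '10.0.0.9:6699']
```

The centralized index is what made searches fast and comprehensive, but it also made the service a single point of control, which mattered both in the litigation described below and when the service was ultimately shut down.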

History

Although there were already networks that facilitated the distribution of files across the Internet, such as IRC, Hotline, and Usenet, Napster specialized in MP3 music files and offered a user-friendly interface. At its peak the Napster service had about 80 million registered users. Napster made it relatively easy for music enthusiasts to download copies of songs that were otherwise difficult to obtain, such as older songs, unreleased recordings, and songs from concert bootleg recordings. Users did not have to be computer programmers to download songs. Some users felt justified in downloading digital copies of recordings they had already purchased in other formats, such as LP and cassette tape, before the compact disc emerged as the dominant format for music recordings.

Many other users simply enjoyed trading and downloading music for free. High-speed networks in college dormitories became overloaded, with as much as 61% of external network traffic consisting of MP3 file transfers. Many colleges blocked its use for this reason, even before concerns about liability for facilitating copyright violations on campus. The ease of downloading individual songs facilitated by Napster and later services is often credited with ushering in the end of the Album Era in popular music, which focused on the release of albums of songs by bands and artists.

Macintosh version

The service and software program were initially Windows-only. However, in 2000, Black Hole Media wrote a Macintosh client called Macster. Macster was later bought by Napster and designated the official Mac Napster client (“Napster for the Mac”), at which point the Macster name was discontinued. Even before the acquisition of Macster, the Macintosh community had a variety of independently developed Napster clients. The most notable were MacStar, an open source client released by Squirrel Software in early 2000, and Rapster, released by Overcaster Family in Brazil. The release of MacStar’s source code paved the way for third-party Napster clients across all computing platforms, giving users advertisement-free music distribution options.

Legal challenges

Heavy metal band Metallica discovered a demo of their song “I Disappear” had been circulating across the network before it was officially released. This led to it being played on several radio stations across the United States and alerted Metallica that their entire back catalogue of studio material was also available. On March 13, 2000, they filed a lawsuit against Napster. A month later, rapper and producer Dr. Dre, who shared a litigator and legal firm with Metallica, filed a similar lawsuit after Napster refused his written request to remove his works from its service. Separately, both Metallica and Dr. Dre later delivered to Napster thousands of usernames of people who they believed were pirating their songs. In March 2001, Napster settled both suits, after being shut down by the Ninth Circuit Court of Appeals in a separate lawsuit from several major record labels (see below). In 2000, Madonna’s single “Music” was leaked onto the web and Napster prior to its commercial release, causing widespread media coverage. Verified Napster use peaked with 26.4 million users worldwide in February 2001.

In 2000, the American musical recording company A&M Records along with several other recording companies, through the Recording Industry Association of America (RIAA), sued Napster (A&M Records, Inc. v. Napster, Inc.) on grounds of contributory and vicarious copyright infringement under the US Digital Millennium Copyright Act (DMCA). Napster was faced with the following allegations from the music industry:

  • That its users were directly violating the plaintiffs’ copyrights.
  • That Napster was responsible for contributory infringement of the plaintiffs’ copyrights.
  • That Napster was responsible for vicarious infringement of the plaintiffs’ copyrights.

Initially, Napster lost the case in the District Court and appealed to the U.S. Court of Appeals for the Ninth Circuit. Although it was clear that Napster could have commercially significant non-infringing uses, the Ninth Circuit upheld the District Court’s decision. Immediately afterward, the District Court ordered Napster to monitor the activities of its network and to restrict access to infringing material when informed of that material’s location. Napster was unable to comply and had to shut down its service in July 2001. The following year, Napster declared bankruptcy and sold its assets to a third party.

Promotional power

Along with the accusations that Napster was hurting the sales of the record industry, there were those who felt just the opposite, that file trading on Napster actually stimulated, rather than hurt, sales. Some evidence may have come in July 2000 when tracks from English rock band Radiohead’s album Kid A found their way to Napster only three months before the album’s release. Unlike Madonna, Dr. Dre or Metallica, Radiohead had never hit the top 20 in the US. Furthermore, Kid A was an album without any singles released, and received relatively little radio airplay. By the time of the album’s release, the album was estimated to have been downloaded for free by millions of people worldwide, and in October 2000 Kid A captured the number one spot on the Billboard 200 sales chart in its debut week. According to Richard Menta of MP3 Newswire, the effect of Napster in this instance was isolated from other elements that could be credited for driving sales, and the album’s unexpected success suggested that Napster was a good promotional tool for music. One of the most successful bands to owe its success to Napster was the band Dispatch. As an independent band, it had no formal promotion or radio play, yet it was able to tour in cities where it had never played and to sell out concerts, thanks to the spread of its music on Napster. In July 2007, the band became the first independent band to headline New York City’s Madison Square Garden, selling the venue out for three consecutive nights. The band members were avid supporters of Napster, promoting it at their shows, playing a Napster show around the time of the Congressional hearings, and attending the hearings themselves. Shawn Fanning, one founder of Napster, is a known Dispatch fan.

Since 2000, many musical artists, particularly those not signed to major labels and without access to traditional mass media outlets such as radio and television, have said that Napster and successive Internet file-sharing networks have helped get their music heard, spread word of mouth, and may have improved their sales in the long term. One such musician to publicly defend Napster as a promotional tool for independent artists was Dj xealot, who became directly involved in the 2000 A&M Records Lawsuit. Chuck D from Public Enemy also came out and publicly supported Napster.

Shutdown

In July 2001, Napster shut down its entire network in order to comply with the injunction. On September 24, 2001, the case was partially settled. Napster agreed to pay music creators and copyright owners a $26 million settlement for past, unauthorized uses of music, as well as an advance against future licensing royalties of $10 million. In order to pay those fees, Napster attempted to convert its free service into a subscription system, and traffic to Napster dropped as a result. A prototype solution was tested in the spring of 2002: the Napster 3.0 Alpha, using the “.nap” secure file format from PlayMedia Systems and audio fingerprinting technology licensed from Relatable. Napster 3.0 was, according to many former Napster employees, ready to deploy, but it had significant trouble obtaining licenses to distribute major-label music. On May 17, 2002, Napster announced that its assets would be acquired by German media firm Bertelsmann for $85 million with the goal of transforming Napster into an online music subscription service. The two companies had been collaborating since the summer of 2000, when Bertelsmann became the first major label to drop its copyright lawsuit against Napster. Pursuant to the terms of the acquisition agreement, on June 3 Napster filed for Chapter 11 protection under United States bankruptcy laws. On September 3, 2002, an American bankruptcy judge blocked the sale to Bertelsmann and forced Napster to liquidate its assets.

2008-2016

Napster’s brand and logos were acquired at bankruptcy auction by Roxio, which used them to re-brand the Pressplay music service as Napster 2.0. In September 2008, Napster was purchased by US electronics retailer Best Buy for US$121 million. On December 1, 2011, pursuant to a deal with Best Buy, Napster merged with Rhapsody, with Best Buy receiving a minority stake in Rhapsody. On July 14, 2016, Rhapsody phased out the Rhapsody brand in favor of Napster and has since branded its service internationally as Napster.

NAP – The History of Domain Names

New Internet architecture with commercial ISPs connected at NAP

Date: 01/01/1995

Internet Exchanges and NAPs (Network Access Points)

By 1993, the NSF decided to defund the NSFNET and do away with the AUP in order to promote commercialization of the Internet. Many commercial Internet networks came online during this time. In fact, the regional networks that were originally supported by the NSF turned into commercial service providers, including UUNet, PSINet, BBN, Intermedia, Netcom, and others.

The NSF’s plan for privatization included the creation of NAPs (network access points), which are Internet exchanges with open access policies that support commercial and international traffic. One can think of the NAPs as being like airports that serve many different airlines. The airlines lease airport space and use its facilities. Likewise, NSPs lease space at NAPs and use their switching facilities to exchange traffic with other parts of the Internet.

Part of NSF’s strategy was that all NSPs that received government funding must connect to all the NAPs. In 1993, the NSF awarded NAP contracts to MFS (Metropolitan Fiber Systems) Communications for a NAP in Washington, D.C., Ameritech for a NAP in Chicago, Pacific Bell for a NAP in San Francisco, and Sprint for a NAP in New York. MFS already operated MAEs (metropolitan area exchanges) in Washington, D.C. (MAE East) and in California’s Silicon Valley (MAE West). A MAE is a fiber-optic loop covering a metropolitan area that provides a connection point for local service providers and businesses.

A NAP is a physical facility with equipment racks, power supplies, cable trays, and facilities for connecting to outside communication systems. The NAP operator installs switching equipment. Originally, NAPs used FDDI and switched Ethernet, but ATM switches or Gigabit Ethernet switches are common today. NSPs install their own routers at the NAP and connect them to the switching facilities. Thus, traffic originating from an ISP crosses the NSP’s router into the NAP’s switching facility to routers owned by other NSPs that are located at the NAP. Geoff Huston’s paper “Interconnection, Peering, and Settlements” provides a complete discussion of NAPs and peering.

Most NAPs now consist of core ATM switches surrounded by routers. Traffic is exchanged across ATM PVCs (permanent virtual circuits). Usually, a NAP provides a default fully meshed set of PVCs that provide circuits to every other NSP router located at the NAP, although an NSP can remove a PVC to block traffic from a particular NSP. Larger NSPs may not want to peer with smaller NSPs because there is no equal exchange of traffic. A rule of thumb is that NSPs with a presence at every NAP peer with one another on an equal basis.
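To make the scale of that default mesh concrete: a full mesh among n NSP routers needs n(n-1)/2 PVCs, with each router terminating n-1 of them, so the circuit count grows quadratically with the number of participants. A minimal Python sketch of the arithmetic (the NSP counts are hypothetical, for illustration only):

  # Back-of-the-envelope arithmetic for a default full mesh of ATM PVCs at a NAP.
  def full_mesh_pvcs(num_routers: int) -> int:
      """One PVC per pair of NSP routers."""
      return num_routers * (num_routers - 1) // 2

  for n in (4, 8, 16, 32):
      print(f"{n:2d} NSP routers -> {full_mesh_pvcs(n):3d} PVCs ({n - 1} per router)")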

NAP operators do not establish peering agreements between NSPs, but only provide the facilities where peering can take place. Peering agreements are bilateral agreements negotiated between NSPs that define how they will exchange traffic at the NAPs. In addition, all IP datagram routing is handled by the NSP’s equipment. However, the NAP provides the switching equipment over which the packets traverse after being routed.

The NSF also funded the creation of the Routing Arbiter service, which provided routing coordination in the form of a route server and a routing arbiter database (RADB). Route servers would handle routing tasks at NAPs while the RADB generated the route server configuration files. RADB was part of a distributed set of databases known as the Internet Routing Registry, a public repository of announced routes and routing policy in a common format. NSPs use information in the registry to configure their backbone routers.
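As a rough illustration of how registry data can drive router configuration, the sketch below derives a per-AS prefix filter from a toy table of announced routes. The field names, function, and data are hypothetical (documentation prefixes and private AS numbers); real IRR objects use the much richer RPSL format:

  # Toy routing registry: each announced route records a prefix and its origin AS.
  routes = [
      {"prefix": "198.51.100.0/24", "origin": "AS64500"},
      {"prefix": "203.0.113.0/24", "origin": "AS64500"},
      {"prefix": "192.0.2.0/24", "origin": "AS64501"},
  ]

  def prefix_filter(origin_as: str) -> list[str]:
      """Prefixes a backbone router would accept from the given origin AS."""
      return [r["prefix"] for r in routes if r["origin"] == origin_as]

  print(prefix_filter("AS64500"))  # ['198.51.100.0/24', '203.0.113.0/24']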

Today, Internet exchanges are only one part of the Internet architecture. Many NSPs establish private peering arrangements, as previously mentioned. A private connection is a direct physical link that avoids forwarding traffic through the NAP’s switching facility, which is often overburdened. NSPs create private connections in two ways. One method is to run a cable between their respective routers at the NAP facility. Another, more costly approach is to lay cable or lease lines between their own facilities.

Internap Network Services Corporation provides an Internet exchange service designed to maximize performance. Its proprietary Assimilator technology provides intelligent routing and route management to extend and enhance BGP4 routing. Assimilator allows Internap’s P-NAP (private network access point) to make intelligent routing decisions, such as choosing the faster of multiple backbones to route data if the destination is multihomed. Internap customer packets are sent immediately to the correct Internet backbone, rather than to a randomly chosen public or private peering point.
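The underlying idea can be sketched in a few lines: when a destination is multihomed, prefer the backbone that currently measures fastest instead of a randomly chosen peering point. This is only a conceptual illustration, not Internap’s Assimilator code; the backbone names and latency figures are invented:

  # Conceptual sketch: pick the lowest-latency backbone for a multihomed destination.
  def pick_backbone(latency_ms: dict[str, float]) -> str:
      return min(latency_ms, key=latency_ms.get)

  measurements = {"backbone-A": 42.0, "backbone-B": 17.5, "backbone-C": 88.3}
  print(pick_backbone(measurements))  # backbone-B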

NANOG – The History of Domain Names

North American Network Operators’ Group (NANOG) established

Date: 01/01/1993

The North American Network Operators’ Group (NANOG) is an educational and operational forum for the coordination and dissemination of technical information related to backbone/enterprise networking technologies and operational practices. It runs meetings, talks, surveys, and an influential mailing list for Internet service providers. The main method of communication is the NANOG mailing list (known informally as nanog-l), a free mailing list to which anyone may subscribe or post.

History

NANOG evolved from the NSFNET “Regional-Techs” meetings, where technical staff from the regional networks met to discuss operational issues of common concern with each other and with the Merit engineering staff. At the February 1994 regional techs meeting in San Diego, the group revised its charter to include a broader base of network service providers, and subsequently adopted NANOG as its new name. NANOG was organized by Merit Network, a non-profit Michigan organization, from 1994 through 2011 when it was transferred to NewNOG.

Organization

NANOG meetings are organized by NewNOG, Inc., a Delaware non-profit organization, which took over responsibility for NANOG from the Merit Network in February 2011. Meetings are hosted by NewNOG and other organizations from the U.S. and Canada. Overall leadership is provided by the NANOG Steering Committee, established in 2005, and a Program Committee.

Meetings

NANOG meetings are held three times each year, and include presentations, tutorials, and BOFs (Birds of a Feather meetings). There are also ‘lightning talks’, where speakers can submit brief presentations (no longer than 10 minutes) on short notice. The meetings are informal, and membership is open. Conference participants typically include senior engineering staff from tier 1 and tier 2 ISPs. Participating researchers present short summaries of their work for operator feedback.

Funding

Funding for NANOG originally came from the National Science Foundation, as part of two projects Merit undertook in partnership with NSF and other organizations: the NSFNET Backbone Service and the Routing Arbiter project. All NANOG funds now come from conference registration fees and donations from vendors, and starting in 2011, membership dues.

Scope

NANOG meetings provide a forum for the exchange of technical information, and promote discussion of implementation issues that require community cooperation. Coordination among network service providers helps ensure the stability of overall service to network users. The group’s charter is available on the official NANOG website.

Topics

The NANOG Program Committee publishes a Call for Presentations as well as proposes topics that address current operational issues. The committee’s criteria for selecting talks are outlined on the Call for Presentations: the talks focus on large-scale backbone operations, ISP coordination, or technologies that are already deployed or soon to be deployed in core Internet backbones and exchange points. Popular topics include traffic engineering, applications of new protocols, routing policy specification, queue management and congestion, routing scalability, caching, and inter-provider security, to name a few.

Myspace – The History of Domain Names

Myspace – Social networking site

Date: 01/01/2003

Myspace is a social networking website offering an interactive, user-submitted network of friends, personal profiles, blogs, groups, photos, music, and videos. It is headquartered in Beverly Hills, California.

Myspace was acquired by News Corporation in July 2005 for $580 million. From 2005 to 2009, Myspace was the largest social networking site in the world, and in June 2006 surpassed Google as the most visited website in the United States. In April 2008, Myspace was overtaken by Facebook in the number of unique worldwide visitors, and was surpassed in the number of unique U.S. visitors in May 2009, though Myspace generated $800 million in revenue during the 2008 fiscal year. Since then, the number of Myspace users has declined steadily in spite of several redesigns. As of October 2016, Myspace was ranked 2,154 by total web traffic, and 1,522 in the United States.

Myspace had a significant influence on pop culture and music and created a gaming platform that launched the successes of Zynga and RockYou, among others. The site also started the trend of creating unique URLs for companies and artists.

In June 2009, Myspace employed approximately 1,600 people. In June 2011, Specific Media Group and Justin Timberlake jointly purchased the company for approximately $35 million. By that point, after several rounds of layoffs, Myspace had reduced its staff to around 200.

History

2003–05: Beginnings

In August 2003, several eUniverse employees with Friendster accounts saw potential in its social networking features. The group decided to mimic the more popular features of the website. Within 10 days, the first version of Myspace was ready for launch, implemented using ColdFusion. A complete infrastructure of finance, human resources, technical expertise, bandwidth, and server capacity was available for the site. The project was overseen by Brad Greenspan (eUniverse’s Founder, Chairman, CEO), who managed Chris DeWolfe (MySpace’s starting CEO), Josh Berman, Tom Anderson (MySpace’s starting president), and a team of programmers and resources provided by eUniverse.

The first Myspace users were eUniverse employees. The company held contests to see who could sign up the most users. eUniverse used its 20 million users and e-mail subscribers to breathe life into Myspace, and move it to the head of the pack of social networking websites. A key architect was tech expert Toan Nguyen, who helped stabilize the Myspace platform when Brad Greenspan asked him to join the team. Co-founder and CTO Aber Whitcomb played an integral role in software architecture, utilizing the then superior development speed of ColdFusion over other dynamic database-driven server-side languages of the time. Despite having over ten times the number of developers, Friendster, which was developed in JavaServer Pages (JSP), could not keep up with Myspace’s pace of development in ColdFusion.

The MySpace.com domain was originally owned by YourZ.com, Inc., which until 2002 intended it for use as an online data storage and sharing site. By 2004, it had been transitioned from a file storage service to a social networking site. A friend, who also worked in the data storage business, reminded Chris DeWolfe that he had earlier bought the domain MySpace.com. DeWolfe suggested they charge a fee for the basic Myspace service. Brad Greenspan nixed the idea, believing that keeping Myspace free was necessary to make it a successful community.

2005–08: Rise and purchase by News Corp.

Myspace quickly gained popularity among teenage and young adult social groups. In February 2005, DeWolfe held talks with Mark Zuckerberg over acquiring Facebook but DeWolfe rejected Zuckerberg’s $75 million asking price.

Some employees of Myspace, including DeWolfe and Berman, were able to purchase equity in the property before MySpace and its parent company eUniverse (now renamed Intermix Media) was bought. In July 2005, in one of the company’s first major Internet purchases, Rupert Murdoch’s News Corporation (the parent company of Fox Broadcasting and other media enterprises) purchased Myspace for US$580 million. News Corporation had beaten out Viacom by offering a higher price for the website, and the purchase was seen as a good investment at the time. Of the $580 million purchase price, approximately $327 million was attributed to the value of Myspace, according to the financial adviser’s fairness opinion. Within a year, Myspace had tripled in value from its purchase price. News Corporation saw the purchase as a way to capitalize on Internet advertising, and drive traffic to other News Corporation properties.

After losing the bidding war for Myspace, Viacom chairman Sumner Redstone stunned the entertainment industry in September 2006 when he fired Tom Freston from the position of CEO. Redstone believed that the failure to acquire MySpace contributed to the 20% drop in Viacom’s stock price in 2006 up to the date of Freston’s ouster. Freston’s successor as CEO, Philippe Dauman, was quoted as saying “never, ever let another competitor beat us to the trophy”. Redstone told interviewer Charlie Rose that losing MySpace had been “humiliating”, adding, “MySpace was sitting there for the taking for $500 million”. (Myspace was eventually sold by News Corp in 2011 for $35 million.)

In January 2006, Fox announced plans to launch a UK version of Myspace in a bid to “tap into the UK music scene”, which they did. They released a version in China and launched similar versions in other countries.

The 100 millionth account was created on August 9, 2006, in the Netherlands.

On November 1, 2007, Myspace and Bebo joined the Google-led OpenSocial alliance, which already included Friendster, Hi5, LinkedIn, Plaxo, Ning and Six Apart. OpenSocial was to promote a common set of standards for software developers to write programs for social networks. Facebook remained independent. Google had been unsuccessful in building its own social networking site Orkut in the U.S. market and was using the alliance to present a counterweight to Facebook.

By late 2007 and into 2008, Myspace was considered the leading social networking site, and consistently beat out main competitor Facebook in traffic. Initially, the emergence of Facebook did little to diminish Myspace’s popularity; at the time, Facebook was targeted only at college students. At its peak, when News Corp attempted to merge it with Yahoo! in 2007, Myspace was valued at $12 billion.

2008–11: Decline and sale by News Corp.

On April 19, 2008, Facebook overtook Myspace in the Alexa rankings. Since then, Myspace has seen a continuing loss of membership. Several explanations have been offered for its decline, including the fact that it stuck to a “portal strategy” of building an audience around entertainment and music, whereas Facebook and Twitter continually launched new features to improve the social-networking experience.

Marvin L. Gittelman suggested that the $900 million three-year advertisement deal with Google, while being a short-term cash windfall, was a handicap in the long run. That deal required Myspace to place even more ads on its already heavily advertised space, which made the site slow, more difficult to use, and less flexible. Myspace could not experiment with its own site without forfeiting revenue, while rival Facebook was rolling out a new clean site design. MySpace CEO Chris DeWolfe reported that he had to push back against Fox Interactive Media’s sales team who monetized the site without regard to user experience.

While Facebook focused on creating a platform that allowed outside developers to build new applications, Myspace built everything in-house. Shawn Gold, Myspace’s former head of marketing and content, said “Myspace went too wide and not deep enough in its product development. We went with a lot of products that were shallow and not the best products in the world”. The products division had introduced many features (communication tools such as instant messaging, a classifieds program, a video player, a music player, a virtual karaoke machine, a self-serve advertising platform, profile-editing tools, security systems, privacy filters, and Myspace book lists, among others). However, the features were often buggy and slow, as there was insufficient testing, measuring, and iterating.

Danah Boyd, a senior researcher at Microsoft Research, noted of social networking websites that Myspace and others were a very peculiar business—one in which companies might serially rise, fall, and disappear, as “influential peers pull others in on the climb up—and signal to flee when it’s time to get out”. The volatility of social networks was exemplified in 2006 when Connecticut Attorney General Richard Blumenthal launched an investigation into children’s exposure to pornography on Myspace; the resulting media frenzy and Myspace’s inability to build an effective spam filter gave the site a reputation as a “vortex of perversion”. Around that time, specialized social media companies such as Twitter formed and began targeting Myspace users, while Facebook rolled out communication tools which were seen as safe in comparison to Myspace. Boyd compared the shift of white, middle-class kids from the “seedy” Myspace to the “supposedly safer haven” of Facebook to the “white flight” from American cities; the perception of Myspace eventually drove advertisers away as well.

In addition, Myspace had particular problems with vandalism, phishing, malware, and spam, which it failed to curtail, making the site seem inhospitable. These have been cited as factors in why users, who as teenagers were Myspace’s strongest audience in 2006 and 2007, migrated to Facebook. Facebook, which started strong with the 18-to-24 group (mostly college students), has been much more successful than Myspace at attracting ever-older users.

Chairman and CEO Rupert Murdoch was said to be frustrated that Myspace never met expectations as a distribution outlet for Fox studio content and missed the US$1 billion mark in total revenues. As a result, DeWolfe and Anderson gradually lost their status within Murdoch’s inner circle of executives, and DeWolfe’s mentor Peter Chernin, the President and COO of News Corp. who was based in Los Angeles, departed the company. Former AOL executive Jonathan Miller, who joined News Corp in charge of the digital media business, had been in the job for three weeks when he shuffled Myspace’s executive team in April 2009. Myspace President Tom Anderson stepped down, while Chris DeWolfe was replaced as Myspace CEO by former Facebook COO Owen Van Natta.

A News Corp. meeting in March 2009 over the direction of Myspace was reportedly the catalyst for that management shakeup, with the Google search deal about to expire and key personnel (Myspace’s COO, SVP of engineering, and SVP of strategy) departing to form a startup. Furthermore, the opening of extravagant new offices around the world was questioned, as rival Facebook did not have similarly expensive expansion plans yet still attracted international users at a rapid rate. The changes to Myspace’s executive ranks were followed in June 2009 by a layoff of 37.5% of its workforce (including 30% of its U.S. employees), reducing employees from 1,600 to 1,000.

In 2009, around the time that Myspace underwent layoffs and a management shakeup, the site “relied on drastic redesigns as Hail Mary passes to get users back”. This may have backfired for Myspace; it has been noted that users generally disliked even the interface tweaks on rival Facebook, which avoided major site redesigns. Myspace has attempted to redefine itself as a social entertainment website, with more of a focus on music, movies, celebrities, and TV, instead of a social networking website. Myspace also developed a linkup with Facebook that would allow musicians and bands to manage their Facebook profiles. CEO Mike Jones was quoted as saying that Myspace was now a “complementary offer” to Facebook Inc., which was “not a rival anymore”.

In March 2011, market research figures released by comScore suggested that Myspace had lost 10 million users between January and February 2011, and that it had fallen from 95 million to 63 million unique users during the previous twelve months. Myspace registered its sharpest audience declines in the month of February 2011, as traffic fell 44% from a year earlier to 37.7 million unique U.S. visitors. Advertisers have been reported as unwilling to commit to long term deals with the site.

In late February 2011, News Corp officially put the site up for sale, with an estimated worth of $50–200 million. Losses from the last quarter of 2010 were $156 million, more than double those of the previous year, which dragged down the otherwise strong results of parent News Corp. The deadline for bids, May 31, 2011, passed without any bid above the reserve price of $100 million being submitted. The rapid deterioration in Myspace’s business during the most recent quarter was said to have deterred many potential suitors.

On June 29, 2011, Myspace announced to label partners and press via email that it had been acquired by Specific Media for an undisclosed sum, rumored to be as low as $35 million. CNN reported that Myspace sold for $35 million, and noted that it was “far less than the $580 million News Corp. paid for Myspace in 2005”. Rupert Murdoch went on to call the Myspace purchase a “huge mistake”. Time Magazine compared News Corporation’s purchase of Myspace to Time Warner’s purchase of AOL: a conglomerate trying to stay ahead of the competition. Many former executives have gone on to further success after departing Myspace.

2013: Relaunch

On September 24, 2012, Timberlake tweeted a link to a video that featured a redesigned Myspace, dubbed the “new Myspace”. Timberlake stated in an interview with the Hollywood Reporter that he believed he was “bridging the gap” between artists and their fan bases.

On January 15, 2013, the new Myspace entered its publicly accessible open beta phase, featuring written editorial content, radio stations, music mixes and videos. Music was streamed through a constant music player located at the bottom of the page, while musicians could track the location of their top fans, who were identified by the number of times they played the artist’s music. Although the unveiling was purposefully scheduled on the same date as the release of Timberlake’s new music single, the event was overshadowed by Facebook’s announcement of its “graph search” function on the same day. Christian Parkes, Myspace’s vice-president of global marketing, explained in a May 2013 interview that the redesign was undertaken with brands in mind: “The site is going through this custom process of evolution … When you think of shifting perception, it just comes down to trying to do everything right: Do everything right by the artist, by the brands that we work with, and make the right partnerships.” Writing for his company blog, digital marketing author Rohit Bhargava identified “5 Things For Marketers To Love About The New Myspace” in a post published shortly after the commencement of the open beta phase. Bhargava was impressed by the redesigned platform in terms of marketing and wrote: “the new Myspace may be more important for brand marketers in 2013 than you think.” His list included the horizontal navigation of the new website and the engagement potential for brands. The Guardian’s Amanda Holpuch was less enthusiastic and concluded “clever design and useful functionalities do not a successful social network make.”

The official launch of the new Myspace occurred on June 12, 2013, and included the launch of a corresponding mobile app for the new Myspace, providing users access to streaming radio stations curated by artists and Myspace, as well as personal radio stations created by users themselves. The app’s social features facilitate connections between users who possess similar interests, and users can also create animated GIF files, which can be shared on Myspace and other social platforms. The app was launched on Apple Inc.’s App Store, while a mobile website was also designed for those users without access to an iOS device. The newly designed platform also included new analytics tools for artists to manage their digital presence from a single location, and, at the time of the launch, the Myspace music catalog consisted of over 50 million songs. As part of the discontinuation of “Classic MySpace” and the launch of the new platform, the user content from the old MySpace was deleted. Myspace explained on its website that it would no longer feature “Blogs, Private Messages, Videos, Comments or Posts, Custom background design and Games,” acknowledging that “this is upsetting to some.” Myspace received a large number of online complaints from users and eventually locked the primary discussion thread. The complaints described the loss of poems and personal notes, photos of dead friends, intimate messages, and games that cost significant amounts of time and money.

In July 2013, Myspace revealed its new hires for editorial content: Joseph Patel, previously a producer at Vice, became the vice president of content and “creative,” while editors Benjamin Meadows-Ingram (Billboard) and Monica Herrera (Rolling Stone) were subsequently recruited by Patel.

As of October 1, 2013, Myspace said it had 36 million users.

2016: Purchase by Time Inc.

On February 11, 2016, it was announced that MySpace and its parent company had been bought by Time Inc.

MX – The History of Domain Names

.mx created

Date: 05/01/2009

.mx is the Internet country code top-level domain (ccTLD) for Mexico, which in 2009 was re-opened to new registrations by NIC México. The .mx ccTLD was rolled out in three steps:

  • Sunrise period, May 1 to July 31, 2009: registrants who had already registered any other .MX second-level domain could register their domain for one year.
  • Waiting period, August 1 to August 31, 2009: used to set up the domains registered in the Sunrise period and to resolve domain name disputes.
  • Initial registration period, September 1 to October 31, 2009: registration on a first-come, first-served basis, for one year only, with a special set of prices.

After the three phases, .mx registrations were opened to the public.
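For illustration, the rollout calendar above can be expressed as a small date lookup. The dates come from the text; the function and names are purely illustrative and have nothing to do with NIC México’s actual systems:

  from datetime import date

  # The three 2009 .mx rollout phases described above (dates from the text).
  PHASES = [
      (date(2009, 5, 1), date(2009, 7, 31), "Sunrise period"),
      (date(2009, 8, 1), date(2009, 8, 31), "Waiting period"),
      (date(2009, 9, 1), date(2009, 10, 31), "Initial registration (first-come, first-served)"),
  ]

  def phase_on(day: date) -> str:
      for start, end, name in PHASES:
          if start <= day <= end:
              return name
      return "General availability" if day > PHASES[-1][1] else "Pre-launch"

  print(phase_on(date(2009, 9, 15)))  # Initial registration (first-come, first-served)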

Second-level domains

Until August 2009, .mx domain registrations consisted of third-level names beneath second-level domains that parallel some of the generic top-level domains:

  • .com.mx: Commercial entities (actually unrestricted, like .com)
  • .net.mx: Network providers (registration limited to qualifying entities)
  • .org.mx: Non-profit organizations (registration limited to qualifying entities)
  • .ngo.mx: Non-profit organizations or Civil society organizations (registration NOT limited to qualifying entities)
  • .edu.mx: Educational institutions (registration limited to qualifying entities)
  • .gob.mx: Federal, State or Municipal Governmental entities only (.gob derives from the Spanish word for government: “Gobierno”)

Currently second level domains can be registered directly under .mx.

On April 30, 2009, second-level domain registrations were 0.06% of the total. A month later the figure had risen to 4.9%. On April 30, 2010, second-level registrations were 21.4% of the total.

Mosaic – The History of Domain Names

Mosaic Web Browser and its Release

Date: 01/01/1993

NCSA Mosaic, or simply Mosaic, is a discontinued early web browser. It has been credited with popularizing the World Wide Web. It was also a client for earlier protocols such as File Transfer Protocol, Network News Transfer Protocol, and Gopher. The browser was named for its support of multiple Internet protocols. Its intuitive interface, reliability, Windows port, and simple installation all contributed to the browser’s popularity, particularly on Microsoft operating systems. Mosaic was also the first browser to display images inline with text instead of displaying images in a separate window. While often described as the first graphical web browser, Mosaic was preceded by WorldWideWeb, the lesser-known Erwise, and ViolaWWW.

Mosaic was developed at the National Center for Supercomputing Applications (NCSA) at the University of Illinois Urbana-Champaign beginning in late 1992. NCSA released the browser in 1993, and officially discontinued development and support on January 7, 1997. However, it can still be downloaded from NCSA.

Netscape Navigator was later developed by Netscape, which employed many of the original Mosaic authors; however, it intentionally shared no code with Mosaic. Netscape Navigator’s code descendant is Mozilla Firefox. Starting in 1995, Mosaic lost much of its market share to Netscape Navigator, and by 1997 it had only a tiny fraction of users left, by which time the project was discontinued. Microsoft licensed the browser to create Internet Explorer in 1995.

Background

David Thompson tested ViolaWWW and showed the application to Marc Andreessen. Andreessen and Eric Bina originally designed and programmed NCSA Mosaic for Unix’s X Window System, calling it xmosaic. In December 1991, the Gore Bill, created and introduced by then-Senator and future Vice President Al Gore, was passed, providing the funding for the Mosaic project. Development began in December 1992. Marc Andreessen announced the project on January 23, 1993. The first alpha release (numbered 0.1a) was published in June 1993, and the first beta release (numbered 0.6b) followed quickly thereafter in September 1993. Version 1.0 for Windows was released on November 11, 1993. NCSA Mosaic for Unix (X Window System) version 2.0 was released on November 10, 1993. A port of Mosaic to the Commodore Amiga was available by October 1993. Ports to Windows and Macintosh had already been released in September. From 1994 to 1997, the National Science Foundation supported the further development of Mosaic.

Marc Andreessen, the leader of the team that developed Mosaic, left NCSA and, with James H. Clark, one of the founders of Silicon Graphics, Inc. (SGI), and four other former students and staff of the University of Illinois, started Mosaic Communications Corporation. Mosaic Communications eventually became Netscape Communications Corporation, producing Netscape Navigator.

1994 saw the first commercial product to incorporate Mosaic: SCO Global Access, a modified version of its Open Desktop version of Unix that served as an Internet gateway.

Spyglass, Inc. licensed the technology and trademarks from NCSA for producing their own web browser but never used any of the NCSA Mosaic source code. Microsoft licensed Spyglass Mosaic in 1995 for US$2 million, modified it, and renamed it Internet Explorer. After a later auditing dispute, Microsoft paid Spyglass $8 million. The 1995 user guide The HTML Sourcebook: The Complete Guide to HTML specifically states, in a section called Coming Attractions, that Internet Explorer “will be based on the Mosaic program”. Versions of Internet Explorer before version 7 stated “Based on NCSA Mosaic” in the About box. Internet Explorer 7 was audited by Microsoft to ensure that it contained no Mosaic code, and thus no longer credits Spyglass or Mosaic.

Licensing

The licensing terms for NCSA Mosaic were generous for a proprietary software program. In general, non-commercial use was free of charge for all versions (with certain limitations). Additionally, the X Window System/Unix version publicly provided source code (source code for the other versions was available after agreements were signed). Despite persistent rumors to the contrary, however, Mosaic was never released as open source software during its brief reign as a major browser; there were always constraints on permissible uses without payment.

As of 1993, license holders included these:

  • Amdahl Corporation
  • Fujitsu Limited (Product: Infomosaic, a Japanese version of Mosaic. Price: Yen 5,000 (approx. US$50))
  • InfoSeek Corporation (Product: No commercial Mosaic. May use Mosaic as part of a commercial database effort)
  • Quadralay Corporation (Consumer version of Mosaic. Also using Mosaic in its online help and information product, GWHIS. Price: US$249)
  • Quarterdeck Office Systems Inc.

  • The Santa Cruz Operation Inc. (Product: Incorporating Mosaic into “SCO Global Access,” a communications package for Unix machines that works with SCO’s Open Server. Runs a graphical e-mail service and accesses newsgroups.)
  • SPRY Inc. (Products: A communication suite: Air Mail, Air News, Air Mosaic, etc. Also producing Internet In a Box with O’Reilly & Associates. Price: US$149–$399 for Air Series.)
  • Spyglass, Inc. (Product: Relicensing to other vendors. Signed deal with Digital Equipment Corp., which would ship Mosaic with all its machines.)

Immediate effect

Other browsers existed during this period, notably Erwise, ViolaWWW, MidasWWW and tkWWW. These browsers, however, did not have the same effect as Mosaic on public use of the Internet.

In the October 1994 issue of Wired Magazine, Gary Wolfe notes in the article titled “The (Second Phase of the) Revolution Has Begun: Don’t look now, but Prodigy, AOL, and CompuServe are all suddenly obsolete – and Mosaic is well on its way to becoming the world’s standard interface”:

When it comes to smashing a paradigm, pleasure is not the most important thing. It is the only thing. If this sounds wrong, consider Mosaic.

Mosaic is the celebrated graphical “browser” that allows users to travel through the world of electronic information using a point-and-click interface. Mosaic’s charming appearance encourages users to load their own documents onto the Net, including color photos, sound bites, video clips, and hypertext “links” to other documents. By following the links – click, and the linked document appears – you can travel through the online world along paths of whim and intuition. Mosaic is not the most direct way to find online information. Nor is it the most powerful. It is merely the most pleasurable way, and in the 18 months since it was released, Mosaic has incited a rush of excitement and commercial energy unprecedented in the history of the Net.

Importance of Mosaic

Mosaic was the web browser that led to the Internet boom of the 1990s. Robert Reid underscores this importance stating, “while still an undergraduate, Marc wrote the Mosaic software … that made the web popularly relevant and touched off the revolution”. Mosaic was not the first web browser for Windows; that distinction belongs to Thomas R. Bruce’s little-known Cello. The Unix version of Mosaic was already famous before the Windows and Mac versions were released. Other than displaying images embedded in the text rather than in a separate window, Mosaic’s original feature set was not greater than that of the browsers on which it was modeled, such as ViolaWWW. But Mosaic was the first browser written and supported by a team of full-time programmers; it was reliable and easy enough for novices to install, and its inline graphics reportedly proved immensely appealing. Mosaic is said to have made the Web accessible to the ordinary person for the first time, and it already had a 53% market share in 1995.

Reid also refers to Matthew K. Gray’s website, Internet Statistics: Growth and Usage of the Web and the Internet, which indicates a dramatic leap in web use around the time of Mosaic’s introduction.

In addition, David Hudson concurs with Reid, noting that:

Marc Andreessen’s realization of Mosaic, based on the work of Berners-Lee and the hypertext theorists before him, is generally recognized as the beginning of the web as it is now known. Mosaic, the first web browser to win over the Net masses, was released in 1993 and made freely accessible to the public. The adjective phenomenal, so often overused in this industry, is genuinely applicable to the… ‘explosion’ in the growth of the web after Mosaic appeared on the scene. Starting with next to nothing, the rates of the web growth (quoted in the press) hovering around tens of thousands of percent over ridiculously short periods of time were no real surprise.

Ultimately, web browsers such as Mosaic became the killer applications of the 1990s. Web browsers were the first tools to bring a graphical interface to the Internet’s burgeoning wealth of distributed information services. A mid-1994 guide lists Mosaic alongside the traditional, text-oriented information search tools of the time (Archie and Veronica, Gopher, and WAIS), but Mosaic quickly subsumed and displaced them all. Joseph Hardin, the director of the NCSA group within which Mosaic was developed, said downloads were up to 50,000 a month in mid-1994.

In November 1992, there were twenty-six websites in the world and each one attracted attention. In its release year of 1993, Mosaic had a What’s New page, and about one new link was being added per day. This was a time when access to the Internet was expanding rapidly outside its previous domain of academia and large industrial research institutions. Yet it was the availability of Mosaic and Mosaic-derived graphical browsers themselves that drove the explosive growth of the Web to over 10,000 sites by August 1995 and millions by 1998. Metcalfe expressed the pivotal role of Mosaic this way:

In the Web’s first generation, Tim Berners-Lee launched the Uniform Resource Locator (URL), Hypertext Transfer Protocol (HTTP), and HTML standards with prototype Unix-based servers and browsers. A few people noticed that the Web might be better than Gopher.

In the second generation, Marc Andreessen and Eric Bina developed NCSA Mosaic at the University of Illinois. Several million then suddenly noticed that the Web might be better than sex.

End of Mosaic

Mosaic’s popularity as a separate browser began to lessen upon the release of Andreessen’s Netscape Navigator in 1994. This was noted at the time in The HTML Sourcebook: The Complete Guide to HTML: “Netscape Communications has designed an all-new WWW browser Netscape, that has significant enhancements over the original Mosaic program.”

By 1998 its user base had almost completely evaporated, being replaced by other web browsers. After NCSA stopped work on Mosaic, development of the NCSA Mosaic for the X Window System source code was continued by several independent groups. These independent development efforts include mMosaic (multicast Mosaic) which ceased development in early 2004, and Mosaic-CK and VMS Mosaic.

VMS Mosaic, a version specifically targeting the OpenVMS operating system, was one of the longest-lived efforts to maintain Mosaic. Using the VMS support already built into the original version (Bjorn S. Nilsson ported Mosaic 1.2 to VMS in the summer of 1993), developers incorporated a substantial part of the HTML engine from mMosaic, another defunct flavor of the browser. As of 3 September 2003, VMS Mosaic supported HTML 4.0, OpenSSL, cookies, and various image formats including GIF, JPEG, PNG, BMP, TGA, TIFF, and JPEG 2000. The browser works on VAX, Alpha, and Itanium platforms.

Another long-lived version of Mosaic – Mosaic-CK, developed by Cameron Kaiser – saw its last release (version 2.7ck9) on July 11, 2010; a maintenance release with minor compatibility fixes (version 2.7ck10) was released on 9 January 2015, followed by another one (2.7ck11) in October 2015. The stated goal of the project is “Lynx with graphics”, and it runs on Mac OS X, Power MachTen, Linux and other compatible Unix-like OSs.

Morrisworm – The History of Domain Names

Morris worm

Date: 01/01/1988

The Morris worm or Internet worm of November 2, 1988 was one of the first computer worms distributed via the Internet. It was the first to gain significant mainstream media attention. It also resulted in the first felony conviction in the US under the 1986 Computer Fraud and Abuse Act. It was written by a graduate student at Cornell University, Robert Tappan Morris, and launched on November 2, 1988 from the computer systems of the Massachusetts Institute of Technology.

Robert Morris, Jr., a graduate student in Computer Science at Cornell, wrote an experimental, self-replicating, self-propagating program called a worm and injected it into the Internet. He chose to release it from MIT, to disguise the fact that the worm came from Cornell. Morris soon discovered that the program was replicating and reinfecting machines at a much faster rate than he had anticipated—there was a bug. Ultimately, many machines at locations around the country either crashed or became “catatonic.” When Morris realized what was happening, he contacted a friend at Harvard to discuss a solution. Eventually, they sent an anonymous message from Harvard over the network, instructing programmers how to kill the worm and prevent reinfection. However, because the network route was clogged, this message did not get through until it was too late. Computers were affected at many sites, including universities, military sites, and medical research facilities. The estimated cost of dealing with the worm at each installation ranged from $200 to more than $53,000. The worm was designed to be undetectable, but a design flaw led it to create far more copies of itself than Morris estimated, resulting in the drastic over-taxing of all the computers on which it was installed. This in turn allowed for its immediate detection and the repair of the flaws that it exploited. The Morris worm was not a destructive worm; it only caused computers to slow and buckle under the weight of unnecessary processing. Nor was the intent of Morris clear: some speculate that the release was either premature or accidental. Nonetheless, the event precipitated two different responses that have since become the focus of much attention and concern over the intervening years. Exploring these two responses reveals something about what “systemic risk” might or might not mean in the context of the Internet and how it relates to other uses of the concept.

Architecture

According to its creator, the Morris worm was not written to cause damage, but to gauge the size of the Internet. The worm was released from MIT in the hope of suggesting that its creator studied there, which Morris did not (though Morris later became a tenured professor at MIT in 2006). It worked by exploiting known vulnerabilities in Unix sendmail, finger, and rsh/rexec, as well as weak passwords. Due to reliance on rsh (normally disabled on untrusted networks), fixes to sendmail and finger, the widespread use of network filtering, and improved awareness of the dangers of weak passwords, the worm would not succeed on a recent, properly configured system.

A supposedly unintended consequence of the code, however, caused it to be more damaging: a computer could be infected multiple times and each additional process would slow the machine down, eventually to the point of being unusable. This would have the same effect as a fork bomb and crash the computer several times. The main body of the worm could only infect DEC VAX machines running 4BSD, and Sun-3 systems. A portable C “grappling hook” component of the worm was used to pull over (download) the main body, and the grappling hook could run on other systems, loading them down and making them peripheral victims.

The mistake

The critical error that transformed the worm from a potentially harmless intellectual exercise into a virulent denial-of-service attack was in the spreading mechanism. The worm could have determined whether to invade a new computer by asking whether there was already a copy running. But just doing this would have made it trivially easy to kill: everyone could simply run a process that would answer “yes” when asked whether there was already a copy, and the worm would stay away. The defense against this was inspired by Michael Rabin’s mantra “Randomization”. To compensate for this possibility, Morris directed the worm to copy itself anyway, 1 time out of 7, even when the response was “yes”. This level of replication proved excessive, and the worm spread rapidly, infecting some computers multiple times. Rabin said that Morris “should have tried it on a simulator first.”
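The effect of the 1-in-7 rule is easy to see in a toy simulation. The sketch below is an abstract model only (illustrative parameters, no networking code): every infected host probes one random host per round, an uninfected target is always infected, and an already-infected target gains a duplicate worm process 1 time in 7, so individual machines steadily accumulate load:

  import random

  # Abstract model of the Morris worm's reinfection rule.
  random.seed(1)
  NUM_HOSTS, ROUNDS = 1000, 30
  processes = [0] * NUM_HOSTS  # worm processes running on each host
  processes[0] = 1             # patient zero

  for _ in range(ROUNDS):
      snapshot = list(processes)  # hosts infected this round act next round
      for host, count in enumerate(snapshot):
          if count == 0:
              continue
          target = random.randrange(NUM_HOSTS)
          if processes[target] == 0:
              processes[target] = 1       # fresh infection
          elif random.random() < 1 / 7:
              processes[target] += 1      # duplicate copy: extra load, no extra reach

  infected = sum(1 for p in processes if p)
  print(f"infected hosts: {infected}, max worm processes on one host: {max(processes)}")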

Effects

The U.S. Government Accountability Office put the cost of the damage at $100,000–10,000,000. Clifford Stoll, who helped fight the worm, wrote in 1989: “I surveyed the network, and found that two thousand computers were infected within fifteen hours. These machines were dead in the water—useless until disinfected. And removing the virus often took two days.” It is usually reported that around 6,000 major UNIX machines were infected by the Morris worm; however, Morris’s colleague Paul Graham claimed, “I was there when this statistic was cooked up, and this was the recipe: someone guessed that there were about 60,000 computers attached to the Internet, and that the worm might have infected ten percent of them.” (Stoll wrote: “Rumors have it that [Morris] worked with a friend or two at Harvard’s computing department (Harvard student Paul Graham sent him mail asking for ‘Any news on the brilliant project’)”.) The Internet was partitioned for several days, as regional networks disconnected from the NSFNet backbone and from each other to prevent recontamination as they cleaned their own networks.

The first and most direct response was that Morris became the first individual to be tried under the new Computer Fraud and Abuse Act of 1986, 18 U.S.C. Section 1030(a)(5)(A).  Morris was tried, convicted and sentenced to three years of probation, 400 hours of community service, a fine of $10,050, and the costs of his supervision.

The case was appealed, and the conviction upheld. The second response was the creation by the Defense Advanced Research Projects Agency (DARPA) of the Computer Emergency Response Team (CERT), in order to coordinate information and responses to computer vulnerabilities and security.

I

Of the first response—the criminal prosecution—a couple of things stand out.  The first is the continuing moral ambivalence about Morris’s intentions. On the one hand, what Morris did, objectively, was to force certain security vulnerabilities to be fixed by writing a program that publicly exploited them.  As the author of one official investigation, Eugene Spafford, pointed out, the code contained no commands that would harm a computer on which it ran, only commands intended to exploit vulnerabilities that allowed the code to copy itself and spread.  On the other hand, his conviction for Fraud and Abuse clearly sends a different message—that this was a criminal act, and as the law had it, one that threatened not just citizens, but the federal government itself.

The practice of publicly demonstrating exploitable vulnerabilities in order to force vendors and system administrators to fix their systems has become established in the academic field of computer science, though it has been largely restricted to the publication of papers that demonstrate how to do it, rather than the release of software that actually does so. This puts Morris, who is now employed at MIT’s AI lab and a successful researcher, squarely in the camp of what is known colloquially as “white hat hackers”—hackers, including both academic and corporate employees, who (demonstrate how to) exploit vulnerabilities in order to make publicly visible security flaws that need fixing. Morris’s worm, from this standpoint, looks more like incompetence as a white hat hacker than the criminal action of a black hat hacker. The moral ambivalence mirrors that around many of the hacker-cum-Silicon Valley success stories that might be cited in this instance.

In terms of the criminality of the Morris worm, one might ask:  is the risk that such actions pose a systemic risk?  The Morris Worm was neither designed to, nor did it cause harm of a particular sort (theft of documents or information, deletion or destructive interference, spam, porn, terrorist intervention, etc.).   Rather, its effect was more general in that it caused the Internet, as a collection of interconnected computers, to do something it was not designed to do, and as a result, slow down or cease to function properly. But is such an effect an issue of “systemic risk?”  In part the answer may rest on what is defined as the system, and in particular whether the system or the risk is perceived as the “emergent” property (i.e. something which emerges through the interaction of parts, but cannot be reduced to, or predicted by those interactions).   In the case of the Internet, the system itself is the emergent part: what we call the Internet is only the effect of tens of millions of devices all operating with standardized software-based protocols that govern how they connect, transfer information and disconnect.  The platform that emerges is flexible, reconfigurable, asynchronous and without any pre-designed structure to it.  Worms and viruses affect this emergent system by affecting software on many particular computers; in the case of the Morris worm, by changing the function of the email management software called sendmail, and the name lookup service called finger.  The particular vulnerabilities that allow a worm or virus to do this (such as “buffer overflow” vulnerabilities) are the proper quarry of the computer security researcher.

The terminology of worms and viruses express different conceptions of propagation in terms of computer programs. Viruses operate on the analogy of biological viruses by inserting a bit of code into a running program which exploits a vulnerability to allow it to replicate, and potentially, to do harm.  A worm by contrast (shortened from tapeworm) is a complete program more like a parasite or symbiont; it reproduces itself via the very vulnerabilities that it exploits.  In both cases, individual (networked) computers are the locus and necessary starting point: because networked operating system software is designed to create myriad forms of connections with other computers, and hence bring a network into being, it can be exploited to spread worms and viruses similar to how infections or rumors spread.  There is no difference, therefore, between the risk to all of the infected computers combined, and the risk to the “system” understood as the network that emerges out of the interconnection of machines. Or to put it differently, security researchers understand the nature of the risk of a virus or worm to be simply the risk posed by a virus to a particular computer multiplied by the number of computers that are affected.  In this sense, worms or viruses do not pose a  systemic risk—i.e. a new risk that emerges out of or on top of aggregate risks to known components.  Rather they are of a piece with the emergence of the network itself.

Contrast this with systemic risk in a case like catastrophe insurance.  In the case of catastrophe insurance it is not necessarily the system that is emergent, but the risk.  Individual policies have well-defined risks, but the risk of a portfolio of policies cannot be predicted by the aggregate risk of all the individual policies.  Similarly, there is nothing analogous to the Internet as an “emergent network” that results from the issuing of policies—though catastrophe insurance policies can clearly be said to interact at some level (especially when events occur).  As such there is a “systemic risk” at play that is different in kind, not only degree, from the risk that pertains to individual policies.  The comparison is uneven, however, since the case of insurance already builds on the concept of population-level risk, and not just that of individuals—there is no obvious point of comparison here to the relationship between individual computers and a network.

Nonetheless, the idea that computer security risks are not “emergent” explains the often indignant response of computer security researchers (and white hat hackers) to vulnerabilities:  there is no mystery, there is no “emergent risk,” the vulnerabilities can and should be fixed by users and vendors by fixing all the individual software components.

There are other uses of “systemic” in the context of computer security. The language of viruses and worms clearly designates a threat to a system understood as an organism.  The language of payloads (a term also used in the biological field) designates the specific threat or violence a virus or worm might visit on a computer and its associated network. At least one paper uses the language of an ecology of “security and policy flaws” in which it is possible for worms and viruses to thrive (p. 16): application design, buffer overflows, privileges, widely deployed applications, the economic costs of developing and testing software, patch deployment, and “monocultures” (i.e. the absence of a diversity of machines and software).

For the most part, the Morris worm, as a problem for hackers, criminals, computer security researchers, and software vendors and users, concerns the possibility of understanding, making visible and controlling vulnerabilities in parts, in order to safeguard an emergent system that forms through their interaction.  Risk, in this sense, is almost always equated with vulnerabilities (known and unknown).

II

By contrast, the other response, the formation of the Computer Emergency Response Team (CERT), ties the event of the Morris worm directly to the rise of thinking about critical infrastructure protection.  The DARPA-funded project, which is headquartered at Carnegie Mellon’s Software Engineering Institute, publishes a variety of announcements, analyses, fixes and warnings concerning software and networking vulnerabilities.  In 1997 it wrote a report for the President’s Commission on Critical Infrastructure Protection cataloguing problems that CERT had continually encountered (similar to the ecology cited earlier, but not so named).[4] A handful of publications and presentations identifying problems of interdependency have been published, and the organization has focused some of its energy on creating tools, mechanisms, textbooks and guidelines for “software assurance”, as well as on topics like “resiliency engineering management” and “vulnerability disclosure.”

The rise of “Critical Infrastructure Interdependency” (CII) research that began with the publication of Rinaldi et al. in 2001 has grown out of and alongside these institutions. The notion of “infrastructure interdependency” is a more apt conceptual analog to “systemic risk” than are worms and viruses (which in this context look more like specific techniques than events in themselves).  CII research suggests that it is not the system that is emergent, but the risk.  Individual systems (or infrastructures) may be emergent, as the Internet is, and even highly designed systems such as the electrical power grid might exhibit emergent features.[6] However, it is the interaction between systems or infrastructures that introduces risks that cannot be predicted from, or reduced to, the component parts.  CII researchers are therefore intensely concerned with the concrete points of connection between systems, a much-discussed example of which is supervisory control and data acquisition (SCADA) software, especially when it is deployed in the context of the Internet as a tool for managing industrial processes.  The “emergent” risk comes from the unpredictable vulnerabilities that obtain when, for instance, electrical power grids are monitored by and dependent upon networked computers.  The language of “cascades” and contagious risk appears in this research with as much regularity as it does in the domain of finance (e.g. the bank contagion literature).
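
A toy model suggests what CII researchers mean by risk that lives in the coupling rather than in either system. The C sketch below is hypothetical (the dependency pattern and node counts are invented): network nodes draw power from grid nodes, grid nodes are supervised over network links (a stand-in for SCADA), and a single initial fault cascades through both systems even though each is sound in isolation.

    #include <stdio.h>

    /* Toy interdependency cascade with an invented coupling pattern:
     * net[i] needs power from grid[i]; grid[i+1] is supervised over
     * net[i]. Neither set of nodes fails on its own; the cascade
     * lives entirely in the cross-system dependencies. */
    #define N 10

    int main(void)
    {
        int grid[N], net[N];
        for (int i = 0; i < N; i++)
            grid[i] = net[i] = 1;   /* 1 = operating, 0 = failed */

        grid[0] = 0;                /* one initial grid failure */

        for (int step = 0; step < N; step++) {
            for (int i = 0; i < N; i++)      /* power dependency */
                if (!grid[i])
                    net[i] = 0;
            for (int i = 0; i + 1 < N; i++)  /* SCADA dependency */
                if (!net[i])
                    grid[i + 1] = 0;
        }

        int g = 0, n = 0;
        for (int i = 0; i < N; i++) {
            g += grid[i];
            n += net[i];
        }
        printf("after cascade: %d/%d grid and %d/%d network nodes up\n",
               g, N, n, N);
        return 0;
    }

Run as written, one fault takes down all twenty nodes; remove either dependency loop and the damage stays confined to the system where the fault began. That confinement-versus-cascade contrast is the difference in kind the interdependency literature is after.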

For its part, the academic field of computer security research seems to remain at a distance from the related concerns of systemic risk in finance, public health, or defense, despite being well funded by the likes of DARPA and well represented by agencies like the National Security Agency (whose National Computer Security Center was for a long period headed by none other than Robert Morris Sr.).  For most computer security researchers, flaws and vulnerabilities are confined to the operating-system level of networked computers, or at most extend to social or policy vulnerabilities, such as physical access to computers, poor management of accounts and privileges, or non-existent monitoring of personal computers connected to the Internet.  They have historically not been much concerned with the interdependency of two networked systems, like the electrical grid and the Internet.  A certain hubris, perhaps, follows from the separation of worms and viruses from their “payload”—and the field is alive with imaginaries about exploitation by evil-doers, rather than collapse or breakdown due to normal design or usage.  The idea of a “normal accident” is almost completely foreign to existing computer security research.  Intent, even in the attenuated form that resulted in Morris’s conviction, is an obsession of this field.  By contrast, breakdowns and the failure of software to function as expected have largely been the subject of software engineering, a field that has been berating itself for a very long time now (40 years, beginning with the famous 1968 NATO report on Software Engineering).  Instead of security, such concerns are largely the domain of software development practices, quality assurance systems, and an ever-increasing number of standards intended to manage the problem of poorly designed, poorly performing software.

By way of conclusion, an interesting point of convergence has emerged just in the past three months.  The recent W32.Stuxnet worm targeted SCADA software made by Siemens, used to conduct process engineering and to control the operation of large-scale plants, like Iran’s new nuclear power plant.  It is not at all clear at this point that this should be called an Internet worm or virus, because it could not have infected the computers it did without someone physically inserting a USB stick carrying the Stuxnet code into a computer running the Siemens SCADA software.  The worm replicated itself through an internally networked system and apparently opens up control to outsiders, but it does not go on to affect the function of the Internet in any way, only the operation of the plant in question.  Here, again, the risk is “emergent,” but only in the sense that the longstanding attempt to create computer-controlled industrial processes has itself produced unpredictable vulnerabilities.  At a technical level such vulnerabilities are no different in kind from those the Morris worm exploited: they result from poor engineering, overlooked weaknesses, or poor security practice.  But at a more “systemic” level they are different in kind, since they affect not only the operation of the computers themselves, but the physical operation of a plant or “interdependent” system.

Mobilephones – The History of Domain Names

Mobile Phones Connect

Date: 01/01/1996

1996 Mobile phones and the Internet

The first mobile phone with Internet connectivity was the Nokia 9000 Communicator, launched in Finland in 1996. The viability of Internet access on mobile phones remained limited until handset prices came down and network providers started to develop systems and services conveniently accessible on phones. NTT DoCoMo in Japan launched the first mobile Internet service, i-mode, in 1999, and this is considered the birth of mobile-phone Internet services. In 2001 the mobile phone email system by Research in Motion for their BlackBerry product was launched in America. To make efficient use of the small screen, tiny keypad and one-handed operation typical of mobile phones, a specific document and networking model was created for mobile devices: the Wireless Application Protocol (WAP). Most mobile device Internet services operate using WAP. The growth of mobile phone services was initially a primarily Asian phenomenon, with Japan, South Korea and Taiwan all soon finding the majority of their Internet users accessing resources by phone rather than by PC. Developing countries followed, with India, South Africa, Kenya, the Philippines and Pakistan all reporting that the majority of their domestic users accessed the Internet from a mobile phone rather than a PC. European and North American use of the Internet was influenced by a large installed base of personal computers, and the growth of mobile phone Internet access there was more gradual, but it had reached national penetration levels of 20–30% in most Western countries. The cross-over occurred in 2008, when more Internet access devices were mobile phones than personal computers. In many parts of the developing world, the ratio is as much as 10 mobile phone users to one PC user.

1997 In 1997, PGMedia filed an anti-trust suit against NSI citing the root zone as an essential facility, and the US National Science Foundation (NSF) was joined to the action. Ultimately, Network Solutions (NSI) was granted immunity from anti-trust litigation, but the litigation created enough pressure to restructure the domain name market.

1997 May The application that runs almost every DNS server on the Internet is called BIND, for Berkeley Internet Name Domain. It was first developed as a graduate student project at the University of California at Berkeley, and maintained through version 4.8.3 by the university’s Computer Systems Research Group (CSRG). The initial BIND development team consisted of Mark Painter, David Riggle, Douglas Terry, and Songnian Zhou. Later work was done by Ralph Campbell and Kevin Dunlap, and others who contributed include Jim Bloom, Smoot Carl-Mitchell, Doug Kingston, Mike Muuss, Craig Partridge, and Mike Schwartz. Application maintenance was done by Mike Karels and O. Kure. Versions 4.9 and 4.9.1 of BIND were released by Digital Equipment Corporation, then the number-two computer company. The lead developer was Paul Vixie, with assistance from Paul Albitz, Phil Almquist, Fuat Baran, Alan Barrett, Bryan Beecher, Andy Cherenson, Robert Elz, Art Harkin, Anant Kumar, Don Lewis, Tom Limoncelli, Berthold Paffrath, Andrew Partan, Win Treese, and Christophe Wolfhugel. After Vixie left to establish Vixie Enterprises, he sponsored the development of BIND Version 4.9.2 and became the application’s principal architect. Versions from 4.9.3 on have been developed and maintained by the Internet Systems Consortium. A major architectural update, Version 8, was co-developed by Bob Halley and Paul Vixie and released in May 1997. Another major architectural rewrite, Version 9, with enhanced security support, was developed and released in the year 2000.

1997 There are two ccTLDs that have been deleted after the corresponding 2-letter code was withdrawn from ISO 3166-1: cs (for Czechoslovakia) and zr (for Zaire). There may be a significant delay between withdrawal from ISO 3166-1 and deletion from the DNS; for example, ZR ceased to be an ISO 3166-1 code in 1997, but the zr ccTLD was not deleted until 2001. Other ccTLDs corresponding to obsolete ISO 3166-1 codes have not yet been deleted. In some cases they may never be deleted, due to the amount of disruption this would cause for a heavily used ccTLD. In particular, the Soviet Union’s ccTLD su remains in use more than a decade after SU was removed from ISO 3166-1.

1997 By the mid-1990s there was discussion of the introduction of more TLDs. Jon Postel, as head of IANA, invited applications from interested parties. In early 1995, Postel created “Draft Postel”, an Internet draft containing the procedures to create new domain name registries and new TLDs. Draft Postel created a number of small committees to approve the new TLDs. Because of the increasing interest, a number of large organizations took over the process under the Internet Society’s umbrella. This second attempt involved setting up a temporary organization called the International Ad Hoc Committee (IAHC). On February 4, 1997, the IAHC issued a report ignoring the Draft Postel recommendations and instead recommended the introduction of seven new TLDs (arts, firm, info, nom, rec, store, and web). However, these proposals were abandoned after the U.S. government intervened.

1998 In addition to his role as the RFC Editor, Jon Postel worked as the manager of IANA until his death in 1998.

Mobi – The History of Domain Names

.Mobi Reaches its Fifth Year

September 27, 2011

Mobile domain reaches five year anniversary.

.Mobi was launched five years ago, officially opening the doors to what would become a roller coaster for domain investors (for example, Flowers.mobi).

Despite radical changes on the mobile front, .mobi as a domain has continued its growth. There are now more than one million active registrations, and .mobi is the sixth largest generic Top-Level Domain in the world (.com, .net, .org, .info and .biz come before it). There are now two-character domains like nv.mobi and 53.mobi. There are now .mobi sites covering the world, in every vertical.

Technology quickly caught up, and mobile web access became less “mobile specific” with the growing prevalence of smartphones.

MLB-Athletics – The History of Domain Names

MLB buys Athletics.com domain name

May 17, 2012

Major League Baseball makes another big domain acquisition.

Just a couple days ago I was revisiting a post I wrote in 2010 about Major League Baseball’s domain names. The only change to the list since I posted was MLB picking up Angels.com.

But this week MLB picked up another one. Jamie Zoch of dotWeekly discovered that the league just bought Athletics.com.

That leaves just five teams that MLB doesn’t own the “exact” domain name for. Of the remaining domains, Rockies.com and Twins.com seem like they would be the most “available” given their current use and owner.

MLB has been paying a couple hundred grand per domain it acquires. It also got a late start with domain names. It wasn’t the original owner of MLB.com. MLB.com was owned by one of the league’s law firms. (You can read the fascinating story of its transfer here.)

All things considered, I bet Major League Baseball applies for .mlb. I don’t know if they’ll use it, but it’s relatively inexpensive and worth picking up just in case. And this just may be one of those cases where a brand needs to apply for a domain to keep someone else from getting it, since MLB could stand for multiple things. Like a law firm.

Milnet – The History of Domain Names

Milnet

Date: 01/01/1983

The MILNET, also called the Military Network or Military Net, was part of the ARPANET, the original name of the Internet. It was used for unclassified United States Department of Defense (DoD) traffic. MILNET was split from the ARPANET in 1983, leaving the ARPANET to be used for academic research and public use. All links between MILNET and ARPANET were severed when MILNET split off; e-mail between the two networks was delivered using gateways.

BBN Technologies built and maintained both MILNET and ARPANET, and the two networks used very similar technology; direct connectivity between them was severed for security reasons. In the 1980s, MILNET was expanded and became the Defense Data Network, a compilation of multiple smaller military networks, each running at a different security level. In the 1990s, MILNET gave birth to NIPRNet, which was used to transmit sensitive but unclassified data between internal users and also to provide those users with Internet access.

In 1983, the U.S. military portion of the ARPANET was broken off as a separate network, the MILNET. MILNET subsequently became the unclassified but military-only NIPRNET, in parallel with the SECRET-level SIPRNET and JWICS for TOP SECRET and above. NIPRNET does have controlled security gateways to the public Internet. The networks based on the ARPANET were government funded and therefore restricted to noncommercial uses such as research; unrelated commercial use was strictly forbidden. This initially restricted connections to military sites and universities.

1984 The Internet began to penetrate Asia in the late 1980s. Japan, which had built the UUCP-based network JUNET in 1984, connected to NSFNET in 1989. It hosted the annual meeting of the Internet Society, INET’92, in Kobe. Singapore developed TECHNET in 1990, and Thailand gained a global Internet connection between Chulalongkorn University and UUNET in 1992.

1984 TCP/IP goes global: Between 1984 and 1988 CERN began installation and operation of TCP/IP to interconnect its major internal computer systems, workstations, PCs and an accelerator control system. CERN continued to operate a limited self-developed system, CERNET, internally and several incompatible (typically proprietary) network protocols externally. There was considerable resistance in Europe towards more widespread use of TCP/IP, and the CERN TCP/IP intranets remained isolated from the Internet until 1989.

1984 As interest in widespread networking grew and new applications for it were developed, the Internet’s technologies spread throughout the rest of the world. The network-agnostic approach in TCP/IP meant that it was easy to use any existing network infrastructure, such as the IPSS X.25 network, to carry Internet traffic. In 1984, University College London replaced its transatlantic satellite links with TCP/IP over IPSS. Many sites unable to link directly to the Internet started to create simple gateways to allow transfer of e-mail, at that time the most important application. Sites which only had intermittent connections used UUCP or FidoNet and relied on the gateways between these networks and the Internet. Some gateway services went beyond simple email peering, such as allowing access to FTP sites via UUCP or e-mail. Finally, the Internet’s remaining centralized routing aspects were removed. The EGP routing protocol was replaced by a new protocol, the Border Gateway Protocol (BGP). This turned the Internet into a meshed topology and moved away from the centric architecture which ARPANET had emphasized. In 1994, Classless Inter-Domain Routing (CIDR) was introduced to support better conservation of address space, allowing the use of route aggregation to decrease the size of routing tables.
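
To make the route-aggregation idea concrete: the four contiguous networks 192.168.0.0/24, 192.168.1.0/24, 192.168.2.0/24 and 192.168.3.0/24 share their first 22 address bits, so under CIDR a router can advertise the single aggregate 192.168.0.0/22 in their place, replacing four routing-table entries with one.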

1984 UUCP and Usenet: By 1981 the number of UUCP hosts had grown to 550, nearly doubling to 940 in 1984.

Milnet Split – The History of Domain Names

MILNET split off from ARPANET

Date: 01/01/1983

MILNET was the name given to the part of ARPANET, the forerunner of the Internet, that was designated for nonclassified U.S. military use. When MILNET was split off from ARPANET in 1983, ARPANET remained for use by the academic research community. However, gateway computers interconnected the networks.

Today, the top level domain (TLD) name of .mil is reserved for use only by U.S. military agencies.

It was used for unclassified United States Department of Defense (DoD) traffic. MILNET was split from the ARPANET in 1983, leaving the ARPANET to be used for academic research and public use. All links between MILNET and ARPANET were severed when MILNET split off; e-mail between the two networks was delivered using gateways.

BBN Technologies built and maintained MILNET and ARPANET. In the 1980s, MILNET was expanded and became the Defense Data Network. This large network was a compilation of multiple smaller military networks, each one running at a different security level. In the 1990s, MILNET gave birth to NIPRNet, which was used to transmit sensitive, but unclassified data between internal users and also provide those users with Internet access.

In 1983 ARPANET split into ARPANET and MILNET, removing the military component from ARPANET.  MILNET, designed for unclassified U.S. Department of Defense traffic, was integrated into the Defense Data Network that had been created the previous year.

MILNET was physically separated from the ARPANET in 1983. The ARPANET remained in service for the academic research community, but direct connectivity between the networks was severed for security reasons. Gateways relayed electronic mail between the two networks. BBN Technologies built and managed both the MILNET and the ARPANET and the two networks used very similar technology. It is also known as “Military Net.”

During the 1980s the MILNET expanded to become the Defense Data Network, a worldwide set of military networks running at different security levels. In the 1990s, MILNET became the NIPRNET.

Merit – The History of Domain Names

Merit Network founded

Date: 01/01/1966

Merit Network, Inc., is a nonprofit member-governed organization providing high-performance computer networking and related services to educational, government, health care, and nonprofit organizations, primarily in Michigan. Created in 1966, Merit operates the longest running regional computer network in the United States.

Organization

Created in 1966 as the Michigan Educational Research Information Triad by Michigan State University (MSU), the University of Michigan (U-M), and Wayne State University (WSU),[2] Merit was formed to investigate resource sharing by connecting the mainframe computers at these three Michigan public research universities. Merit’s initial three-node packet-switched computer network was operational in October 1972, using custom hardware based on DEC PDP-11 minicomputers and software developed by the Merit staff and the staffs at the three universities. Over the next dozen years the initial network grew as new services such as dial-in terminal support, remote job submission, remote printing, and file transfer were added; as gateways to the national and international Tymnet, Telenet, and Datapac networks were established; as support for the X.25 and TCP/IP protocols was added; as additional computers such as WSU’s MVS system and U-M’s Electrical Engineering VAX running UNIX were attached; and as new universities became Merit members.

Merit’s involvement in national networking activities started in the mid-1980s with connections to the national supercomputing centers and work on the 56 kbit/s National Science Foundation Network (NSFNET), the forerunner of today’s Internet. From 1987 until April 1995, Merit re-engineered and managed the NSFNET backbone service. MichNet, Merit’s regional network in Michigan was attached to NSFNET and in the early 1990s Merit began extending “the Internet” throughout Michigan, offering both direct connect and dial-in services, and upgrading the statewide network from 56 kbit/s to 1.5 Mbit/s, and on to 45, 155, 622 Mbit/s, and eventually 1 and 10 Gbit/s. In 2003 Merit began its transition to a facilities based network, using fiber optic facilities that it shares with its members, that it purchases or leases under long term agreements, or that it builds. In addition to network connectivity services, Merit offers a number of related services within Michigan and beyond, including: Internet2 connectivity, VPN, Network monitoring, Voice over IP (VOIP), Cloud storage, E-mail, Domain Name, Network Time, VMware and Zimbra software licensing, Colocation, Michigan Cyber Range cybersecurity courses, and professional development seminars, workshops, classes, conferences, and meetings.

Creating the network: 1966 to 1973

The Michigan Educational Research Information Triad (MERIT) was formed in the fall of 1966 by Michigan State University (MSU), the University of Michigan (U-M), and Wayne State University (WSU). More often known as the Merit Computer Network or simply Merit, it was created to design and implement a computer network connecting the mainframe computers at the universities. In the fall of 1969, after funding for the initial development of the network had been secured, Bertram Herzog was named director of MERIT. Eric Aupperle was hired as senior engineer and was charged with finding hardware to make the network operational. The National Science Foundation (NSF) and the State of Michigan provided the initial funding for the network. In June 1970, the Applied Dynamics Division of Reliance Electric in Saline, Michigan, was contracted to build three Communication Computers, or CCs. Each would consist of a Digital Equipment Corporation (DEC) PDP-11 computer, dataphone interfaces, and interfaces that would attach them directly to the mainframe computers. The cost was to be slightly less than the $300,000 ($1,828,000, adjusted for inflation) originally budgeted. Merit staff wrote the software that ran on the CCs, while staff at each of the universities wrote the mainframe software to interface to the CCs. The first completed connection linked the IBM S/360-67 mainframe computers running the Michigan Terminal System at WSU and U-M, and was publicly demonstrated on December 14, 1971. The MSU node was completed in October 1972, adding a CDC 6500 mainframe running Scope/Hustler. The network was officially dedicated on May 15, 1973.

Expanding the network: 1974 to 1985

In 1974, Herzog returned to teaching in the University of Michigan’s Industrial Engineering Department, and Aupperle was appointed as director. Use of the all-uppercase name “MERIT” was abandoned in favor of the mixed-case “Merit”. The first network connections were host-to-host interactive connections, which allowed person-to-remote-computer or local-computer-to-remote-computer interactions. To these, terminal-to-host connections, batch connections (remote job submission, remote printing, batch file transfer), and interactive file copy were added. And, in addition to connecting to host computers over custom hardware interfaces, the ability to connect to hosts or other networks over groups of asynchronous ports and via X.25 was added. Merit interconnected with Telenet (later SprintNet) in 1976 to give Merit users dial-in access from locations around the United States. Dial-in access within the U.S. and internationally was further expanded via Merit’s interconnections to Tymnet, ADP’s Autonet, and later still the IBM Global Network, as well as Merit’s own expanding network of dial-in sites in Michigan, New York City, and Washington, D.C. In 1978, Western Michigan University (WMU) became the fourth member of Merit (prompting a name change, as the acronym Merit no longer made sense once the group was no longer a triad). To expand the network, the Merit staff developed new hardware interfaces for the Digital PDP-11 based on printed circuit technology. The new system became known as the Primary Communications Processor (PCP), with the earliest PCPs connecting a PDP-10 located at WMU and a DEC VAX running UNIX at U-M’s Electrical Engineering department. A second hardware technology initiative in 1983 produced the smaller Secondary Communication Processors (SCP), based on DEC LSI-11 processors. The first SCP was installed at the Michigan Union in Ann Arbor, creating UMnet, which extended Merit’s network connectivity deeply into the U-M campus. In 1983 Merit’s PCP and SCP software was enhanced to support TCP/IP and Merit interconnected with the ARPANET.

National networking, NSFNET, and the Internet: 1986 to 1995

In 1986 Merit engineered and operated leased lines and satellite links that allowed the University of Michigan to access the supercomputing facilities at Pittsburgh, San Diego, and NCAR. In 1987, Merit, IBM and MCI submitted a winning proposal to NSF to implement a new NSFNET backbone network. The new NSFNET backbone network service began 1 July 1988. It interconnected supercomputing centers around the country at 1.5 megabits per second (T1), 24 times faster than the 56 kilobits-per-second speed of the previous network. The NSFNET backbone grew to link scientists and educators on university campuses nationwide and connect them to their counterparts around the world. The NSFNET project caused substantial growth at Merit, nearly tripling the staff and leading to the establishment of a new 24-hour Network Operations Center at the U-M Computer Center. In September 1990 in anticipation of the NSFNET T3 upgrade and the approaching end of the 5-year NSFNET cooperative agreement, Merit, IBM, and MCI formed Advanced Network and Services (ANS), a new non-profit corporation with a more broadly based Board of Directors than the Michigan-based Merit Network. Under its cooperative agreement with NSF, Merit remained ultimately responsible for the operation of NSFNET, but subcontracted much of the engineering and operations work to ANS. In 1991 the NSFNET backbone service was expanded to additional sites and upgraded to a more robust 45 Mbit/s (T3) based network. The new T3 backbone was named ANSNet and provided the physical infrastructure used by Merit to deliver the NSFNET Backbone Service. On April 30, 1995 the NSFNET project came to an end, when the NSFNET backbone service was decommissioned and replaced by a new Internet architecture with commercial ISPs interconnected at Network Access Points provided by multiple providers across the country.

Bringing the Internet to Michigan: 1985 to 2001

During the 1980s, Merit Network grew to serve eight member universities, with Oakland University joining in 1985 and Central Michigan University, Eastern Michigan University, and Michigan Technological University joining in 1987. In 1990, Merit’s board of directors formally changed the organization’s name to Merit Network, Inc., and created the name MichNet to refer to Merit’s statewide network. The board also approved a staff proposal to allow organizations other than publicly supported universities, referred to as affiliates, to be served by MichNet without prior board approval. 1992 saw major upgrades of the MichNet backbone to use Cisco routers in addition to the PDP-11 and LSI-11 based PCPs and SCPs. This was also the start of relentless upgrades to higher and higher speeds, first from 56 kbit/s to T1 (1.5 Mbit/s) followed by multiple T1s (3.0 to 10.5 Mbit/s), T3 (45 Mbit/s), OC3c (155 Mbit/s), OC12c (622 Mbit/s), and eventually one and ten gigabits (1000 to 10,000 Mbit/s). In 1993 Merit’s first Network Access Server (NAS) using RADIUS (Remote Authentication Dial-In User Service) was deployed. The NASs supported dial-in access separate from the Merit PCPs and SCPs. In 1993 Merit started what would become an eight-year phase out of its aging PCP and SCP technology. By 1998 the only PCPs still in service were supporting Wayne State University’s MTS mainframe host. During their remarkably long twenty-year life cycle the number of PCPs and SCPs in service reached a high of roughly 290 in 1991, supporting a total of about 13,000 asynchronous ports and numerous LAN and WAN gateways. In 1994 the Merit Board endorsed a plan to expand the MichNet shared dial-in service, leading to a rapid expansion of the Internet dial-in service over the next several years.

In 1994 there were 38 shared dial-in sites. By 1996 there were 131 shared dial-in sites, and more than 92% of Michigan residents could reach the Internet with a local phone call. By the end of 2001 there were 10,733 MichNet shared dial-in lines in over 200 Michigan cities, plus New York City, Washington, D.C., and Windsor, Ontario, Canada. As an outgrowth of this work, Merit created the Authentication, Authorization, and Accounting (AAA) Consortium in 1997. During 1994 an expanded K-12 outreach program at Merit helped lead to the formation of six regional K-12 groups known as Hubs. The Hubs and Merit applied for and were awarded funding from the Ratepayer Fund, which, as part of a settlement of an earlier Ameritech of Michigan ratepayer overcharge, had been established by the Michigan Public Service Commission to further the K-12 community’s network connectivity.

Transition to the commercial Internet, Internet2 and the vBNS: 1994 to 2005

In 1994, as the NSFNET project was drawing to a close, Merit organized the meetings for the North American Network Operators’ Group (NANOG). NANOG evolved from the NSFNET “Regional-Techs” meetings, where technical staff from the regional networks met to discuss operational issues of common concern with each other and with the Merit engineering staff. At the February 1994 regional techs meeting in San Diego, the group revised its charter to include a broader base of network service providers, and subsequently adopted NANOG as its new name. In 2000, Merit spun off two for-profit companies: NextHop Technologies, which developed and marketed GateD routing software, and Interlink Networks, which specialized in authentication, authorization, and accounting (AAA) software. Eric Aupperle retired as president in 2001, after 27 years at Merit. He was appointed President Emeritus by the Merit board. Hunt Williams became Merit’s new president.

Creating a facilities-based network, adding new services: 2003 to the present

Mentor – The History of Domain Names

Mentor Graphics – mentor.com was registered

Date: 10/27/1986

On October 27, 1986, Mentor Graphics registered the mentor.com domain name, making it the 30th .com domain ever to be registered.

Mentor Graphics, Inc is a US-based multinational corporation dealing in electronic design automation (EDA) for electrical engineering and electronics. In 2001, it was ranked third in the EDA industry it helped create. Founded in 1981, the company is headquartered in Wilsonville, Oregon, and employs roughly 5,200 people worldwide with annual revenues of around $1 billion.

Company History

Mentor Graphics Corporation is among the world leaders in electronic design automation (EDA), the use of computer software to design and analyze electronic components and systems. Mentor Graphics designs, manufactures, and markets software used in a number of industries, including aerospace, consumer electronics, computer, semiconductor, and telecommunications. Software produced by Mentor Graphics assists engineers in all of these industries in developing complex integrated circuits. Missile guidance systems, microprocessors, and automotive electronics are among the products designed with the help of Mentor Graphics software. Mentor Graphics also offers customers support and training in the use of its EDA systems.

Mentor Graphics was founded in 1981 by a group of young aggressive computer professionals at Tektronix, Inc., the largest electronics manufacturing company in Oregon. The main visionary in the group was Thomas Bruggere, a software engineering manager who had spent several years at Burroughs Corporation before joining Tektronix in 1977. Convinced that he could do better with his own company, Bruggere began assembling a core group of collaborators from among his associates at Tektronix. The group eventually consisted of seven members, who would meet after work to discuss what form a new company should take. Along with Bruggere, the group included Gerard Langeler, head of the Business Unit marketing department at Tektronix, and David Moffenbeier, the company’s manager of operations analysis.

Initially, the company’s pioneering group met in Bruggere’s living room and had only a vague idea of what they would be producing. They decided that the area with the best prospects for success was computer aided engineering, or CAE. Once startup financing was in place, members of the Mentor Graphics team traveled the country interviewing engineers to see what qualities were most important to them in a CAE system. For the company’s initial product, Bruggere and company settled on a workstation that used their own software run on a powerful desktop computer manufactured by Apollo Computer, a Massachusetts-based company also in its infancy. The system was named the IDEA 1000, and represented a substantial improvement over anything already in use in the CAE field.

Once the system was conceived, its production became a race against time. The Mentor Graphics team believed that it was critical to have a working product finished in time to unveil at the June 1982 Design Automation Conference in Las Vegas, the industry’s most important trade show. The IDEA 1000 made a big splash at the conference, and orders for the workstation began to pour in.

Throughout the planning stages, Bruggere and the others expected the company’s principal competition to come from established industry heavyweights like Hewlett-Packard and alma mater Tektronix. However, during Mentor Graphics’ first year of operation, two small companies, Daisy Systems and Valid Logic Systems, emerged in Silicon Valley with CAE products and proved to be Mentor Graphics’ stiffest competition. For several years the computer press generally lumped the three companies together, referring to them collectively by their first initials, DMV. Two things actually distinguished Mentor Graphics from the others. First, Mentor Graphics bought its computers from Apollo, while Daisy and Valid Logic built their own hardware. This allowed Mentor Graphics to concentrate on the software side. Second, Mentor Graphics developed its software from scratch, in contrast to its competitors, whose software was either a hybrid or an adaptation of existing software packages. Because Mentor Graphics took the time following its conference success to develop its own database package rather than rely on the inferior one supplied by another company, Daisy gained a head start in the race for customers. From the fall of 1982 until about 1985, Mentor Graphics and Daisy engaged in a brutal war for domination of the CAE business, with nearly every decision made at Mentor Graphics aimed at gaining market share from its rival.

In 1983 Mentor Graphics made its first acquisition of another company, California Automated Design, Inc. (CADI). CADI was developing software similar to Daisy’s, and the purchase both strengthened Mentor Graphics’ position against its chief rival and nipped another potential competitor in the bud. The results of the acquisition were mixed. Although the purchase gave Mentor Graphics an entrance into a new market segment, the two companies clashed philosophically. The relationship remained strained until 1986, when CADI founder Ning Nan stepped down from his position as vice chairman of Mentor Graphics’ board. A more clear-cut success for Mentor Graphics in 1983 was the introduction of a new product called MSPICE, an interactive analog simulator. The first product of its kind on the CAE market, MSPICE made the process of designing and analyzing the behavior of analog circuits much more efficient.

1983 also marked Mentor Graphics’ move into the international market with the formation of Mentor Graphics (UK) Ltd. Subsidiaries were added in France, Italy, the Netherlands, West Germany, Japan, and Singapore by the following year. By 1984, international sales were accounting for about 20 percent of the company’s total. In September 1984 Mentor Graphics completed the acquisition of Synergy Dataworks, Inc., another young company based in Oregon. Mentor Graphics turned its first profit that year, reporting net income of $8.3 million after losing $221,000 in 1983. In addition, Mentor Graphics’ initial public stock offering took place in January 1984, raising $51 million through the sale of about 3 million shares of common stock.

Mentor Graphics’ decision to use hardware produced outside the company in its workstations paid off handsomely in 1985. That year, archrival Daisy missed the deadline for its next generation of computer. Because Mentor Graphics’ industry-standard workstations built by Apollo were experiencing no such delays in upgrading, Mentor Graphics was able to move into the industry lead for the first time. 1985 did not pass without major problems, however. The U.S. electronics industry suddenly encountered its worst recession in 20 years. One result was a glut in the semiconductor market, and semiconductor manufacturers were responsible for a quarter of Mentor Graphics’ business. Mentor Graphics’ net income for 1985 slipped to $7.99 million. The company was spared from worse devastation by the relative health of the aerospace and telecommunications industries, plus substantial growth in the company’s international sales, which accounted for 37 percent of revenue for 1985.

By 1986, Mentor Graphics was releasing new products at the rate of one a month. The company’s international operations continued to grow briskly, consisting by that time of 260 individuals working out of 17 offices in 13 countries. Their share of Mentor Graphics’ revenue had reached 44 percent. One of the year’s highlights was the debut of the Compute Engine Accelerator, a device capable of breaking through the computer bottlenecks often encountered by engineers during complex, multifaceted CAE operations. That year, Mentor Graphics’ revenue reached $173.5 million. With both Daisy and Valid Logic losing money, Mentor Graphics’ position at the top of the CAE industry was more or less cemented. In the broader design automation arena, Mentor Graphics was fourth largest.

The downturn in the computer industry had ended by 1987, and Mentor Graphics was able to increase its profits by 85 percent for the year. Sales were up to $222 million. 1988 was even better, as revenue passed the $300 million mark, and net income grew by another 65 percent. That year, Mentor Graphics was the most profitable among all design automation firms, earning more per share than such major players as IBM and McDonnell Douglas. In March 1988 Mentor Graphics absorbed the CAE business of Tektronix, paying $5 million for a business into which Tektronix had already sunk $200 million in development costs. By the middle of the year, Mentor Graphics controlled about a fourth of the $900 million market for electronic computer-aided design products, whereas the fading Daisy’s share had dropped to 12 percent. About half the company’s business was overseas by this time. Mentor Graphics was making an especially good showing in Japan, where the company held 60 percent of the market for CAE workstations.

Mentor Graphics’ growth continued through 1989. The company’s net income made another big jump, reaching $44.8 million on sales of $380 million. With everything looking rosy, the company embarked on an ambitious new project that year. Mentor Graphics announced its commitment to develop Release 8.0, a new generation of design automation software with capabilities far exceeding those of any existing product. This dream package was a bundle of 50 integrated programs designed into a framework that would allow a customer to move data freely among the various programs. It was hoped that Release 8.0 would cut months off the time required to design a new computer chip.

Several problems in 1990 combined to halt Mentor Graphics’ dominance of the market. As Mentor Graphics’ engineers continued incorporating new features into Release 8.0, the project became increasingly complex. Work on Release 8.0 fell months behind schedule. The company suffered from a faltering economy and customers who stopped buying Mentor Graphics’ older products knowing that 8.0 was to be released soon. At the same time, new competition sprang up from Cadence Design Systems, a five-year-old company that sold only software rather than entire workstations. Whereas Mentor Graphics’ products were essentially a closed system, incompatible with other software packages, Cadence was producing software that could run on a wide range of workstations and design more complex chips. Between the delays in 8.0 and the emergence of Cadence, Mentor Graphics hit a wall.

The company made several changes to protect its position in the newly heated-up race for EDA preeminence. One was to strengthen its integrated circuit design capability by acquiring Silicon Compiler Systems, which was integrated as a division of Mentor Graphics. The company also adopted Sun Microsystems hardware as a second platform for its products. Toward the end of 1990, Mentor Graphics reorganized its command structure in an effort to get the 8.0 project back on track. The company was divided into three distinct product groups: Concurrent Engineering, headed by Philip Robinson, a former vice president at Tektronix; the Systems group, led by Langeler, whose previous titles of president and chief operating officer were eliminated; and World Trade, under David Moffenbeier, another member of Mentor Graphics’ core founding group. Bruggere remained chairman and CEO.

One of the most important causes of Mentor Graphics’ ills during this period was its reluctance to adapt to certain changes taking place in the electronic design automation industry. Prior to the 1990s, the bulk of Mentor Graphics’ sales came from complete packages of workstations and software. Around 1991, however, most customers already had workstations they were comfortable with and were interested mainly in purchasing software that could run on whatever hardware platform they preferred.

In April 1991, Mentor Graphics reported a quarterly loss for the first time in its history as a public company. A few months later, the company announced a round of layoffs that eliminated 435 jobs, or about 15 percent of its work force. By the end of 1991, Cadence had passed Mentor Graphics in software revenues. Mentor Graphics finished the year by losing $61.6 million on sales of $400 million. The company’s skid continued into 1992. When 8.0 was finally released early in the year, it performed more slowly than expected, and was plagued with bugs. Mentor Graphics’ stock plummeted, diving as low as 5 in October. For 1992, the company’s sales took another major plunge to $351 million, and the company reported a net loss of nearly $51 million.

Mentor Graphics’ struggle to turn itself around continued in 1993. The rivalry between Mentor Graphics and Cadence became fierce, with each company aggressively courting the other’s customers. Cadence won a three-year, multimillion-dollar contract from Tektronix, which had been a loyal Mentor Graphics customer for years. Mentor Graphics countered by forging a cooperative relationship with Harris Corporation, an early Cadence ally. The company underwent further restructuring in an effort to cut costs. Mentor Graphics still lost money in 1993 ($32 million on revenue of $340 million), but some of its business segments showed signs of recovery. A $17 million contract with Motorola contributed to the company’s slightly improving prospects. The process of changing itself more completely into a software company continued.

In March 1994, Bruggere announced that he was stepping down as chairman to pursue other interests. After a short period during which the company’s day-to-day operations were handled by president and chief executive, Walden Rhines, Jon Shirley (a former Microsoft president) was named Mentor Graphics’ new chairman. With adjustments in the company’s approach to its products completed, the leadership at Mentor Graphics hoped that its offerings–once at the cutting edge of electronic design automation–had again caught up with the needs of its customers.