
IBM – The History of Domain Names

IBM.com was registered

Date: 03/19/1986

On March 19th, 1986, IBM registered the ibm.com domain name, making it the ninth Internet .com domain ever to be registered.

International Business Machines Corporation (commonly referred to as IBM) is an American multinational technology company headquartered in Armonk, New York, United States, with operations in over 170 countries. The company originated in 1911 as the Computing-Tabulating-Recording Company (CTR) and was renamed “International Business Machines” in 1924. IBM manufactures and markets computer hardware, middleware and software, and offers hosting and consulting services in areas ranging from mainframe computers to nanotechnology. IBM is also a major research organization, holding the record for most patents generated by a business (as of 2016) for 23 consecutive years. Inventions by IBM include the automated teller machine (ATM), the floppy disk, the hard disk drive, the magnetic stripe card, the relational database, the SQL programming language, the UPC barcode, and dynamic random-access memory (DRAM).

IBM has continually shifted its business mix by exiting commoditizing markets and focusing on higher-value, more profitable markets. This includes spinning off printer manufacturer Lexmark in 1991 and selling off its personal computer (ThinkPad) and x86-based server businesses to Lenovo (2005 and 2014, respectively), and acquiring companies such as PwC Consulting (2002), SPSS (2009), and The Weather Company (2016).  Also in 2014, IBM announced that it would go “fabless”, continuing to design semiconductors but offloading manufacturing to GlobalFoundries. Nicknamed Big Blue, IBM is one of 30 companies included in the Dow Jones Industrial Average and one of the world’s largest employers, with (as of 2016) nearly 380,000 employees. Known as “IBMers”, IBM employees have been awarded five Nobel Prizes, six Turing Awards, ten National Medals of Technology and five National Medals of Science.

History

In the 1880s, technologies emerged that would ultimately form the core of what would become International Business Machines (IBM). Julius E. Pitrat patented the computing scale in 1885; Alexander Dey invented the dial recorder (1888); Herman Hollerith patented the Electric Tabulating Machine;[8] and Willard Bundy invented a time clock to record a worker’s arrival and departure time on a paper tape in 1889. On June 16, 1911, their four companies were consolidated in New York State by Charles Ranlett Flint to form the Computing-Tabulating-Recording Company (CTR), based in Endicott, New York. The four companies had 1,300 employees and offices and plants in Endicott and Binghamton, New York; Dayton, Ohio; Detroit, Michigan; Washington, D.C.; and Toronto. They manufactured machinery for sale and lease, ranging from commercial scales and industrial time recorders to meat and cheese slicers, tabulators and punched cards.

Thomas J. Watson, Sr., fired from the National Cash Register Company by John Henry Patterson, called on Flint and, in 1914, was offered CTR. Watson joined CTR as General Manager and, 11 months later, was made President when court cases relating to his time at NCR were resolved. Having learned Patterson’s pioneering business practices, Watson proceeded to put the stamp of NCR onto CTR’s companies. He implemented sales conventions, “generous sales incentives, a focus on customer service, an insistence on well-groomed, dark-suited salesmen and had an evangelical fervor for instilling company pride and loyalty in every worker”. His favorite slogan, “THINK”, became a mantra for each company’s employees.[14] During Watson’s first four years, revenues more than doubled to $9 million and the company’s operations expanded to Europe, South America, Asia and Australia. “Watson had never liked the clumsy hyphenated title of the CTR” and chose to replace it with the more expansive title “International Business Machines”.

In 1937, IBM’s tabulating equipment enabled organizations to process unprecedented amounts of data, its clients including the U.S. Government, during its first effort to maintain the employment records for 26 million people pursuant to the Social Security Act, and Hitler’s Third Reich, largely through the German subsidiary Dehomag. During the Second World War the company produced small arms for the American war effort (M1 Carbine, and Browning Automatic Rifle).

In 1949, Thomas Watson, Sr., created IBM World Trade Corporation, a subsidiary of IBM focused on foreign operations. In 1952, he stepped down after almost 40 years at the company helm, and his son Thomas Watson, Jr. was named president. In 1956, the company demonstrated the first practical example of artificial intelligence when Arthur L. Samuel of IBM’s Poughkeepsie, New York, laboratory programmed an IBM 704 not merely to play checkers but “learn” from its own experience. In 1957, the FORTRAN scientific programming language was developed. In 1961, IBM developed the SABRE reservation system for American Airlines and introduced the highly successful Selectric typewriter. In 1963, IBM employees and computers helped NASA track the orbital flight of the Mercury astronauts. A year later it moved its corporate headquarters from New York City to Armonk, New York. The latter half of the 1960s saw IBM continue its support of space exploration, participating in the 1965 Gemini flights, 1966 Saturn flights and 1969 lunar mission.

On April 7, 1964, IBM announced the first computer system family, the IBM System/360. Sold between 1964 and 1978, it spanned the complete range of commercial and scientific applications from large to small, allowing companies for the first time to upgrade to models with greater computing capability without having to rewrite their applications. In 1974, IBM engineer George J. Laurer developed the Universal Product Code. IBM and the World Bank first introduced financial swaps to the public in 1981 when they entered into a swap agreement. The IBM PC, originally designated IBM 5150, was introduced in 1981, and it soon became an industry standard. In 1991, IBM sold printer manufacturer Lexmark. In 1993, IBM posted a US$8 billion loss – at the time the biggest in American corporate history. Lou Gerstner was hired as CEO from RJR Nabisco to turn the company around. In 2002, IBM acquired PwC Consulting, and in 2003 it initiated a project to redefine company values, hosting a three-day online discussion of key business issues with 50,000 employees. The result was three values: “Dedication to every client’s success”, “Innovation that matters—for our company and for the world”, and “Trust and personal responsibility in all relationships”.

In 2005, the company sold its personal computer business to Chinese technology company Lenovo and, in 2009, it acquired software company SPSS Inc. Later in 2009, IBM’s Blue Gene supercomputing program was awarded the National Medal of Technology and Innovation by U.S. President Barack Obama. In 2011, IBM gained worldwide attention for its artificial intelligence program Watson, which was exhibited on Jeopardy! where it won against game-show champions Ken Jennings and Brad Rutter. In 2012, IBM announced it had agreed to buy Kenexa, and a year later it also acquired SoftLayer Technologies, a web hosting service, in a deal worth around $2 billion. In 2014, IBM announced it would sell its x86 server division to Lenovo for a fee of $2.1 billion. Also that year, IBM began announcing several major partnerships with other companies, including Apple Inc., Twitter, Facebook, Tencent, Cisco, Under Armour, Box, Microsoft, VMware, CSC, Macy’s, and Sesame Workshop, the parent company of Sesame Street. In 2015, IBM announced two major acquisitions: Merge Healthcare for $1 billion and all digital assets from The Weather Company, including Weather.com and the Weather Channel mobile app. Earlier, in 2013, IBM researchers had created the film A Boy and His Atom, the first molecule movie to tell a story. In 2016, IBM acquired video conferencing service Ustream and formed a new cloud video unit. In April 2016, it posted a 14-year low in quarterly sales. The following month, Groupon sued IBM accusing it of patent infringement, two months after IBM accused Groupon of patent infringement in a separate lawsuit.

Icann Approval – The History of Domain Names

ICANN approved .asia, .cat, .jobs, .mobi, .tel and .travel as TLD

Date: 12/15/2003

ICANN added further TLDs, starting with a set of sponsored top-level domains. The application period for these ran from December 15, 2003 until March 16, 2004, and resulted in ten applications. Of these, ICANN approved asia, cat, jobs, mobi, tel and travel, all of which are now in operation.

2004 February VeriSign filed a lawsuit against ICANN on February 27, 2004, claiming that ICANN had overstepped its authority. In this lawsuit, VeriSign sought to reduce ambiguity about ICANN’s authority. The antitrust component of VeriSign’s claim was dismissed in August 2004. VeriSign’s broader challenge that ICANN overstepped its contractual rights is currently outstanding. A proposed settlement already approved by ICANN’s board would resolve VeriSign’s challenge to ICANN in exchange for the right to increase pricing on .com domains. At the meeting of ICANN in Rome, which took place from March 2 to March 6, 2004, ICANN agreed to ask approval of the US Department of Commerce for the Waiting List Service of VeriSign.

2004 May 17th On May 17, 2004, ICANN published a proposed budget for the year 2004-05. It included proposals to increase the openness and professionalism of its operations, and greatly increased its proposed spending from US $8.27 million to $15.83 million. The increase was to be funded by the introduction of new top-level domains, charges to domain registries, and a fee for some domain name registrations, renewals and transfers (initially USD 0.20 for all domains within a country-code top-level domain, and USD 0.25 for all others). The Council of European National Top Level Domain Registries (CENTR), which represents the Internet registries of 39 countries, rejected the increase, accusing ICANN of a lack of financial prudence and criticizing what it described as ICANN’s “unrealistic political and operational targets”. Despite the criticism, the registry agreement for the top-level domains jobs and travel includes a US $2 fee on every domain the licensed companies sell or renew.

  • 2004 June pro became a gTLD in May 2002, but did not become fully operational until June 2004
  • 2004 July CreditCards.com sold for $2.75 million in July 2004
  • 2005 After a second round of negotiations in 2004, the TLDs eu, asia, travel, jobs, mobi, and cat were introduced in 2005.
  • 2005 tp (the previous ISO 3166-1 code for East Timor): Being phased out in favor of tl since 2005.
  • 2005 The org domain registry allows the registration of selected internationalized domain names (IDNs) as second-level domains. For German, Danish, Hungarian, Icelandic, Korean, Latvian, Lithuanian, Polish, and Swedish IDNs this has been possible since 2005. Spanish IDN registrations have been possible since 2007.

2005 June 30th Verisign, the operator of net after acquiring Network Solutions, held an operations contract that expired on June 30, 2005. ICANN, the organization responsible for domain management, sought proposals from organizations to operate the domain upon expiration of the contract. Verisign regained the contract bid, and secured its control over the net registry for another six years. On June 30, 2011, the contract with Verisign was automatically renewed for another six years due to a clause in the contract with ICANN which states that renewal will be automatic unless Verisign commits an egregious breach.

2005 November In the early 2000s, there had been speculation that the United Nations might attempt a takeover of ICANN, followed by a negative reaction from the US government and worries about a division of the Internet. At the World Summit on the Information Society in Tunisia in November 2005, the participants agreed not to get involved in the day-to-day and technical operations of ICANN. However, they also agreed to set up an international Internet Governance Forum, with a consultative role on the future governance of the Internet. ICANN’s Government Advisory Committee is currently set up to provide advice to ICANN regarding public policy issues and has participation by many of the world’s governments.

2005 December 7th eu (European Union): On September 25, 2000, ICANN decided to allow the use of any two-letter code in the ISO 3166-1 reserve list that is reserved for all purposes. Only EU currently meets this criterion. Following a decision by the EU’s Council of Telecommunications Ministers in March 2002, progress was slow, but a registry (named EURid) was chosen by the European Commission, and criteria for allocation set: ICANN approved eu as a ccTLD, and it opened for registration on 7 December 2005 for the holders of prior rights. Since 7 April 2006, registration is open to all.

2006 Database size, which had been a significant marketing feature through the early 2000s, was similarly displaced by emphasis on relevancy ranking, the methods by which search engines attempt to sort the best results first. Relevancy ranking first became a major issue circa 1996, when it became apparent that it was impractical to review full lists of results. Consequently, algorithms for relevancy ranking have continuously improved. Google’s PageRank method for ordering the results has received the most press, but all major search engines continually refine their ranking methodologies with a view toward improving the ordering of results. As of 2006, search engine rankings are more important than ever, so much so that an industry has developed (“search engine optimizers”, or “SEO”) to help web developers improve their search ranking, and an entire body of case law has developed around matters that affect search engine rankings, such as use of trademarks in metatags. The sale of search rankings by some search engines has also created controversy among librarians and consumer advocates.

2006 28th February On February 28, 2006, ICANN’s board approved a settlement with VeriSign in the lawsuit resulting from SiteFinder that involved allowing VeriSign (the registry) to raise its registration fees by up to 7% a year. This was criticised by some people in the US House of Representatives’ Small Business committee.

2006 March ac (Ascension Island): This code is a vestige of IANA’s decision in 1996 to allow the use of codes reserved in the ISO 3166-1 alpha-2 reserve list for use by the Universal Postal Union. The decision was later reversed, with Ascension Island now the sole outlier. (Three other ccTLDs, gg (Guernsey), im (Isle of Man) and je (Jersey), also fell under this category from 1996 until they received corresponding ISO 3166 codes in March 2006.)

2006 July On July 26, 2006, the United States government renewed the contract with ICANN for performance of the IANA function for an additional one to five years. The context of ICANN’s relationship with the U.S. government was clarified on September 29, 2006 when ICANN signed a new Memorandum of Understanding with the United States Department of Commerce (DOC).

ICANN Newgtlds – The History of Domain Names

ICANN has $750k to advertise new gTLDs

July 6, 2011

ICANN is looking for an advertising agency to help it get the word out about the new generic top-level domains program, but it only has $750,000 to spend.

The organization published a request for proposals last night.

The budget is not much in the advertising world, especially considering that ICANN’s awareness program will have to be global and multilingual to be truly effective.

With such a limited budget, the RFP and accompanying FAQ acknowledge that ICANN will need “creative solutions” from its ad agency.

This is likely to mean a big PR push for advertising-equivalent editorial – lots and lots of news stories about new gTLDs.

To an extent, the word is already out by this measure. My standing Google News and Twitter searches for “ICANN” have been going crazy since the gTLD program was approved two weeks ago.

I think it’s fair to say that the vast majority of the coverage so far has been either neutral or negative, with much of the focus on potential legal, branding and security problems.

That’s pretty much par for the course in the domain name business, of course.

And ICANN does not necessarily need positive spin – it’s trying to raise awareness of the program’s existence, and negative coverage does that job just as well.

There is, as they say, no such thing as bad publicity.

ICANN’s job of promoting the program is already being done to a large extent by the registries, many of which were investing heavily in media outreach before new gTLDs were approved.

GTLD-GoogleRanking – The History of Domain Names

NEW GTLD’S WILL NOT HAVE MORE GOOGLE RANKING POWER

March 19, 2012

The world of search engine optimization (SEO) is full of myths in regard to ranking well on Google – and the latest to be busted relates to new generic Top Level Domains (gTLDs).

Rumour has it that new gTLDs will rank more strongly on Google, with some of those rumours originating with companies involved in promoting the new TLDs or in the application process for them.

Google’s Matt Cutts has poured cold water on these claims.

A software engineer who has been with the company for over a decade, Mr. Cutts heads Google’s web spam team – the commandos who investigate how Google’s closely guarded ranking algorithms are or might be gamed and then take steps to level the playing field.

Mr. Cutts often communicates with the webmaster community through official Google channels, his own blog and regularly pops up in online communities around the web. Sometimes he can be vague or cryptic in his guidance to avoid giving too much away; but on this topic he was very clear.

Responding via his Google+ account to the claims of any newly-minted generic Top Level Domain having additional ranking power, Mr. Cutts states:

“Sorry, but that’s just not true, and as an engineer in the search quality team at Google, I feel the need to debunk this misconception. Google has a lot of experience in returning relevant web pages, regardless of the top-level domain (TLD). Google will attempt to rank new TLDs appropriately, but I don’t expect a new TLD to get any kind of initial preference over .com, and I wouldn’t bet on that happening in the long-term either.”

Further warning about the risk of applying for a new gTLD with top rankings in mind, Mr. Cutts said:

“If you want to register an entirely new TLD for other reasons, that’s your choice, but you shouldn’t register a TLD in the mistaken belief that you’ll get some sort of boost in search engine rankings.”

GTLD Restrictions – The History of Domain Names

Restrictions on Generic Top Level Domain Names

Date: 06/20/2011

On June 20, 2011, ICANN’s board voted to end most restrictions on generic top-level domain names (gTLDs), of which 22 were then available. Companies and organizations will be able to choose essentially arbitrary top-level Internet domains. The use of non-Latin characters (such as Cyrillic, Arabic, Chinese, etc.) will also be allowed in gTLDs. ICANN will begin accepting applications for new gTLDs on January 12, 2012. Entertainment and financial services brands are most likely to apply for new gTLDs for their brands, according to a survey by registrar Melbourne IT. The initial price to apply for a new gTLD will be $185,000, with an annual fee of $25,000. ICANN expects that the first batch of new gTLDs will be operational at the beginning of 2013. ICANN expects the new rules to significantly change the face of the internet. Peter Dengate Thrush, chairman of ICANN’s board of directors, stated after the vote: “Today’s decision will usher in a new internet age. We have provided a platform for the next generation of creativity and inspiration. Unless there is a good reason to restrain it, innovation should be allowed to run free.” Industry analysts predicted 500–1000 new gTLDs, mostly reflecting names of companies and products, but also cities and generic names like bank and sport. According to Theo Hnarakis, chief executive of Melbourne IT, the decision “will allow corporations to better take control of their brands. For example, apple or ipad would take customers right to those products.” However, some companies have ruled out a branded gTLD.


2011 The root name servers are hosted in multiple secure sites with high-bandwidth access to accommodate the traffic load. Initially all of these installations were located in the United States. However, the distribution has shifted and this is no longer the case. Usually each DNS server installation at a given site is physically a cluster of machines with load-balancing routers. A comprehensive list of servers, their locations, and properties is available at http://root-servers.org. As of May 2011 there were 242 root servers worldwide.

2011 Of the remaining applications (post, mail and an alternative tel proposal), post is still under consideration.

2012 April 27th ICANN will begin to release official new TLD application information to the public on approximately April 27th, 2012. Several unofficial lists have been established which track new gTLD applications.

GTLD – The History of Domain Names

Requests For More Generic Top Level Domains

Date: 01/01/2008

The introduction of several generic top-level domains over the years has not stopped the demand for more gTLDs, and ICANN has received many proposals for the establishment of new top-level domains. Proponents have argued for a variety of models, ranging from adoption of policies for unrestricted gTLDs (see above) to chartered gTLDs for specialized uses by specialized organizations. A new initiative started in 2008 foresees a stringent application process for new domains that adhere to a restricted naming policy for open gTLDs, community-based domains, and internationalized domain names (IDNs). According to a guidebook published by ICANN, a community-based gTLD is “a gTLD that is operated for the benefit of a defined community consisting of a restricted population.” All other domains fall under the category ‘open gTLD’, which “is one that can be used for any purpose consistent with the requirements of the application and evaluation criteria, and with the registry agreement. An open gTLD may or may not have a formal relationship with an exclusive registrant or user population. It may or may not employ eligibility or use restrictions.” The establishment of new gTLDs under this program requires the operation of a domain registry and a demonstration of technical and financial capacity for such operations and the management of registrar relationships. A fourth version of the draft applicant guidebook (DAG4) was published in May 2011.

2008 April The former .um ccTLD for the U.S. Minor Outlying Islands was removed in April 2008. Under RFC 1591 rules, .um is eligible as a ccTLD on request by the relevant governmental agency and local Internet user community.

Halal – The History of Domain Names

.Halal top level domain in the works

February 16, 2012

Halalan Tayyiban Corp is planning to apply for the .halal top level domain name.

The company operates a web site HalalFinder.com. The site used to be a marketplace for halal food products, but now appears to be a general marketplace.

The corporation filed a trademark application for DotHalal with the U.S. Patent and Trademark Office for “Domain name registration services”. The application was filed February 12.

DotHalal.com was registered back in 2009, so these plans are not new. It’s not clear how long the message has been up on the web site.

I called the phone number listed on the trademark application and the person who answered confirmed that the company is applying for .halal through ICANN.

The trademark application will, of course, be refused by the trademark office since you can’t trademark a top level domain.

This will be an interesting application to watch. Domains that have anything to do with religion will likely get added scrutiny in the objection process.

Host – The History of Domain Names

Host.TXT (naming computers)

Date: 01/01/1983

Before 1983, each computer on the network retrieved a file called HOSTS.TXT from a computer at SRI. The HOSTS.TXT file mapped names to numerical addresses. A hosts file still exists on most modern operating systems by default and generally contains a mapping of the IP address 127.0.0.1 to “localhost”. Many operating systems use name resolution logic that allows the administrator to configure selection priorities for available name resolution methods.

The hosts file is a computer file used by an operating system to map hostnames to IP addresses. The hosts file is a plain text file, and is conventionally named hosts. Originally, a file named HOSTS.TXT was manually maintained and made available via file sharing by Stanford Research Institute for the ARPANET membership, containing the hostnames and address of hosts as contributed for inclusion by member organizations. The Domain Name System, first described in 1983 and implemented in 1984, automated the publication process and provided instantaneous and dynamic hostname resolution in the rapidly growing network. In modern operating systems, the hosts file remains an alternative name resolution mechanism, configurable often as part of facilities such as the Name Service Switch as either the primary method or as a fallback method.

History

The ARPANET, the predecessor of the Internet, had no distributed host name database. Each network node maintained its own map of the network nodes as needed and assigned them names that were memorable to the users of the system. There was no method for ensuring that all references to a given node in a network were using the same name, nor was there a way to read the hosts file of another computer to automatically obtain a copy.

The small size of the ARPANET kept the administrative overhead small to maintain an accurate hosts file. Network nodes typically had one address and could have many names. As local area TCP/IP computer networks gained popularity, however, the maintenance of hosts files became a larger burden on system administrators as networks and network nodes were being added to the system with increasing frequency. Standardization efforts, such as the format specification of the file HOSTS.TXT in RFC 952, and distribution protocols, e.g., the hostname server described in RFC 953, helped with these problems, but the centralized and monolithic nature of hosts files eventually necessitated the creation of the distributed Domain Name System (DNS).

On some old systems a file named networks is present; it serves a function similar to the hosts file, containing the names of networks.

Purpose

The hosts file is one of several system facilities that assists in addressing network nodes in a computer network. It is a common part of an operating system’s Internet Protocol (IP) implementation, and serves the function of translating human-friendly hostnames into numeric protocol addresses, called IP addresses, that identify and locate a host in an IP network. In some operating systems, the contents of the hosts file is used preferentially to other name resolution methods, such as the Domain Name System (DNS), but many systems implement name service switches, e.g., nsswitch.conf for Linux and Unix, to provide customization. Unlike remote DNS resolvers, the hosts file is under the direct control of the local computer’s administrator.
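On Linux and many Unix systems, the precedence between the hosts file and DNS described above is governed by the hosts line of the Name Service Switch configuration; a minimal illustrative excerpt (the comments are ours, not part of any standard file):

```
# /etc/nsswitch.conf (excerpt)
# Consult the local hosts file ("files") first; fall back to DNS
# only when no matching entry is found.
hosts: files dns
```

Reversing the order to “dns files” would make the resolver query DNS first, which is why the hosts file is normally listed first when administrators want local overrides to win.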

File content

The hosts file contains lines of text consisting of an IP address in the first text field followed by one or more host names. Each field is separated by white space – tabs are often preferred for historical reasons, but spaces are also used. Comment lines may be included; they are indicated by a hash character (#) in the first position of such lines. Entirely blank lines in the file are ignored. For example, a typical hosts file may contain the following:

127.0.0.1  localhost loopback

::1            localhost

This example only contains entries for the loopback addresses of the system and their host names, a typical default content of the hosts file. The example illustrates that an IP address may have multiple host names (localhost and loopback), and that a host name may be mapped to both IPv4 and IPv6 IP addresses.
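The line format described above is simple enough to parse mechanically. The following is a minimal sketch (not any system’s actual resolver) that reads hosts-file text into a name-to-address mapping, honoring comments, blank lines, and the first-match-wins behavior typical of real resolvers; it simplifies by ignoring the IPv4/IPv6 distinction between entries:

```python
def parse_hosts(text):
    """Parse hosts-file text into a dict mapping hostname -> IP address."""
    mapping = {}
    for line in text.splitlines():
        # Strip comments: everything from '#' to end of line is ignored.
        line = line.split("#", 1)[0].strip()
        if not line:
            continue  # skip blank and comment-only lines
        fields = line.split()  # whitespace-separated: IP, then one or more names
        ip, names = fields[0], fields[1:]
        for name in names:
            # First matching entry wins, as in typical resolver behavior.
            mapping.setdefault(name, ip)
    return mapping

sample = """\
# typical default content
127.0.0.1  localhost loopback
::1        localhost
"""
print(parse_hosts(sample))  # -> {'localhost': '127.0.0.1', 'loopback': '127.0.0.1'}
```

Note that with first-match-wins, the IPv6 line’s localhost entry is shadowed by the IPv4 one; a real resolver instead keeps both and chooses by address family.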

Security issues

The hosts file may present an attack vector for malicious software. The file may be modified, for example, by adware, computer viruses, or trojan horse software to redirect traffic from the intended destination to sites hosting malicious or unwanted content. The widespread computer worm Mydoom.B blocked users from visiting sites about computer security and antivirus software and also affected access from the compromised computer to the Microsoft Windows Update website.
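The hijacking tactic described above can be detected by scanning the hosts file for entries that remap well-known domains. Here is a hypothetical sketch of such a check; the watch list and the tampered sample are illustrative, not taken from any real worm or product:

```python
# Illustrative watch list of domains an attacker might redirect or block.
SENSITIVE = {"windowsupdate.microsoft.com", "www.example-antivirus.com"}

def suspicious_entries(text):
    """Return (hostname, ip) pairs where a watched domain is remapped.

    Redirecting a watched domain anywhere at all is suspicious: even
    pointing it at 0.0.0.0 or 127.0.0.1 silently blocks access to it.
    """
    flagged = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if not line:
            continue
        fields = line.split()
        ip, names = fields[0], fields[1:]
        for name in names:
            if name.lower() in SENSITIVE:
                flagged.append((name, ip))
    return flagged

tampered = "127.0.0.1 localhost\n0.0.0.0 windowsupdate.microsoft.com\n"
print(suspicious_entries(tampered))  # -> [('windowsupdate.microsoft.com', '0.0.0.0')]
```

A clean hosts file containing only loopback entries would produce an empty list.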

Hotmail – The History of Domain Names

Hotmail – free web-based e-mail

Date: 01/01/1996

Hotmail is a free e-mail service provided by Microsoft. It was launched in 1996.

Hotmail, later Windows Live Hotmail and previously known as MSN Hotmail, was an email service operated by Microsoft as part of Windows Live. The service was founded by Sabeer Bhatia and Jack Smith and first launched in July 1996 as “HoTMaiL” – the capitalization alluding to HTML. Shortly after the launch, Hotmail was bought by Microsoft for $400 million and renamed “MSN Hotmail”. The later version, “Windows Live Hotmail”, was released in 2007. Windows Live Hotmail offered unlimited storage, patented security measures, Ajax programming and integration with Windows Live Messenger, Hotmail Calendar, SkyDrive and Windows Live Contacts. According to the Internet-based market research company comScore, Windows Live Hotmail was the world’s largest web-based e-mail service in August 2007, with 364 million users, with Yahoo! Mail in second place with 280 million users and Gmail third with 191 million users. Windows Live Hotmail was available in 36 languages. Both Hotmail and its development organization were based in Mountain View, California; before the acquisition by Microsoft, Hotmail had its headquarters in Sunnyvale, California.

On 31 July 2012, Microsoft stated that Hotmail would eventually be replaced by a new service, called Outlook.com.

On April 3, 2013, Hotmail was closed and replaced by Outlook.com.

History

MSN Hotmail

Hotmail was sold to Microsoft in 1997 for $400 million and joined the “MSN group of services”. Hotmail quickly became popular because the service was both international and localized in many places around the world. In February 1999, Hotmail was reported to be the largest webmail service, with up to 30 million active users. From the beginning, Hotmail ran on a mix of the FreeBSD and Solaris operating systems. A development project was launched to migrate Hotmail to Windows 2000. In June 2002, Microsoft announced that the project was finished; a few days later it retracted the announcement, and Hotmail remained dependent on FreeBSD.

In 2001, Hotmail was compromised by hackers who discovered that anyone could log into their own Hotmail account and then access messages in any other account by manipulating the URL with the other account’s user name and a message number. The hole was so simple to exploit that a number of newspapers and websites described the procedure in detail; this in turn led tens of thousands of people to try it, and millions of accounts were manipulated between 7 August 2001 and 31 August 2001. After a period of technological stagnation, the webmail industry received a boost in 2004, when Google released its new service, Gmail. The service had more storage space, higher speed and a flexible interface; in other words, the new competitor set off a wave of innovation that spread throughout the industry. The former heavyweights, Hotmail and Yahoo! Mail, came out with upgraded versions echoing what Gmail had launched: more speed, more security and more advanced features.

Windows Live Hotmail

In November 2005, a beta version of Microsoft's new e-mail service, code-named "Kahuna", was released to a few thousand testers. Those who had not yet had the opportunity to try the beta could request an invitation for access. The new version was rebuilt from the ground up and put great emphasis on three concepts: "faster, simpler, safer". During the development period the number of beta testers grew steadily, and by the end of 2006 it had risen to one million. The plan was for the Hotmail brand to be phased out when Microsoft launched its new e-mail system, Windows Live Mail; this changed, however, when beta testers proved critical of switching from the well-known name to something unknown, and the name therefore became Windows Live Hotmail. After another trial, the product was released in the Netherlands on November 9, 2006. Development of the beta was finished in March 2007, and Windows Live Hotmail was released as a new brand in May 2007. The 260 million existing MSN Hotmail accounts had immediate access to the new system. The interface of the old system could only be reached by users who had registered before Windows Live Hotmail was released and had then chosen not to update their system; the phasing out of the old system was completed in October 2007. In February 2007, Windows Live Hotmail received the "Editors' Choice" award from PC Magazine, and in March 2007 it received the same prize with a score of 4 out of 5 stars. In 2008, the Windows Live Hotmail website announced that the service would be updated with a focus on increased speed, greater storage space, greater ease of use, and a better user experience; the time it takes to log on would, for example, be reduced by 70 percent. Windows Live Hotmail was updated in several stages as a result of feedback from users.
Advertising was moved from the top to the edge, more themes were added, and a larger number of messages could be viewed per page; among the updates suggested by users was the ability to send instant messages to other online users directly from the inbox.

When Firefox was launched, it took a few months for Windows Live Hotmail to complete support for the browser. In October 2008, users with Linux-based web browsers discovered that they could only get "read-only" access to their Hotmail accounts; in November of the same year, full support for Google Chrome was completed. The ability to send instant messages directly from the inbox was an update integrated with Windows Live Messenger; the former "MSN Web Messenger" was to be replaced by a project started in 2007 called "Windows Live Web Messenger". On March 18, 2010, Microsoft came out with what it called "the fourth wave" of updates for Hotmail, offering features such as an unread filter, "Active View", and sorting. The updates also included integration with Windows Live SkyDrive and Windows Live Office, a free version of Microsoft's Office Web Apps. The new, updated version of Windows Live Hotmail began rolling out gradually on June 15, 2010, and the rollout was completed on August 3 of that year. A feature called Exchange ActiveSync, which enabled synchronization between the user's mobile phone, e-mail, Windows Live contacts, and the Windows Live Hotmail calendar, was released to all Hotmail users on August 30, 2010.

Transition to Outlook.com

In May 2015, Microsoft announced it would move the service over to what it described as an Office 365-based infrastructure. This was followed in June 2015 by the introduction, through an opt-in preview, of new features, including new calendar layout options, a filtering service called "Clutter", and new theme designs. Microsoft also introduced the ability for third-party providers such as PayPal and Evernote to include add-ins in the service. Additionally, contact suggestions and updates from emails such as flight reservations were due to be introduced to Office 365 subscribers' accounts and Outlook.com users' accounts from January and March 2016 respectively. With the upgrade, users would no longer be able to use the Windows Live Mail 2012 client, and were encouraged to view Outlook.com through a browser, the Mail app, or the Outlook client.

Although DeltaSync was discontinued as of June 30, 2016, Microsoft's Windows Live Mail 2011 and 2012 continue to work with Hotmail e-mail accounts by using IMAP (or, less effectively, POP) in place of DeltaSync.

Features

Similar to other major webmail services, Outlook.com uses Ajax programming techniques and supports later versions of Internet Explorer, Firefox, Safari, and Google Chrome. Some of its features include keyboard controls giving the ability to navigate around the page without using the mouse, the ability to search the user’s messages including structured query syntax such as “from:ebay”, message filters, folder-based organization of messages, auto-completion of contact addresses when composing, contact grouping, importing and exporting of contacts as CSV files, rich text formatting, rich text signatures, spam filtering and virus scanning, support for multiple addresses, and different language versions.
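The structured query syntax mentioned above splits a search string into field filters and free text. As a rough sketch of how such a query might be interpreted (an illustrative parser, not Outlook.com's actual implementation):

```python
def parse_query(query):
    """Split a webmail-style search query into field filters and free text.

    Tokens of the form "field:value" (e.g. "from:ebay") become filters;
    everything else is treated as plain search text.
    """
    filters = {}
    terms = []
    for token in query.split():
        if ":" in token:
            field, _, value = token.partition(":")
            if field and value:
                filters[field.lower()] = value
                continue
        terms.append(token)
    return filters, terms

# "from:ebay" becomes a filter; the remaining words are free-text terms.
filters, terms = parse_query("from:ebay invoice shipping")
```

A search backend would then restrict candidate messages by the filters before matching the free-text terms.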

One example of a feature no longer present is the ability to create custom domain names.

HP – The History of Domain Names

HP.com was registered

Date: 03/03/1986

On March 3, 1986, HP registered the HP.com domain name, making it the ninth Internet .com domain ever to be registered.

The Hewlett-Packard Company (commonly referred to as HP) was an American multinational information technology company headquartered in Palo Alto, California. It developed and provided a wide variety of hardware components as well as software and related services to consumers, small- and medium-sized businesses (SMBs) and large enterprises, including customers in the government, health and education sectors.  The company was founded in a one-car garage in Palo Alto by William “Bill” Redington Hewlett and David “Dave” Packard, and initially produced a line of electronic test equipment. HP was the world’s leading PC manufacturer from 2007 to Q2 2013, after which Lenovo came to rank ahead of HP. It specialized in developing and manufacturing computing, data storage, and networking hardware, designing software and delivering services. Major product lines included personal computing devices, enterprise and industry standard servers, related storage devices, networking products, software and a diverse range of printers and other imaging products. HP marketed its products to households, small- to medium-sized businesses and enterprises directly as well as via online distribution, consumer-electronics and office-supply retailers, software partners and major technology vendors. HP also had services and consulting business around its products and partner products.

History

In 1984, HP introduced both inkjet and laser printers for the desktop. Along with its scanner product line, these were later developed into successful multifunction products, the most significant being single-unit printer/scanner/copier/fax machines. The print mechanisms in HP's tremendously popular LaserJet line of laser printers depend almost entirely on Canon Inc.'s components (print engines), which in turn use technology developed by Xerox. HP develops the hardware, firmware, and software that convert data into dots for the mechanism to print. HP transitioned from the HP 3000 to the HP 9000 series of minicomputers, with attached storage such as the HP 7935 hard drive holding 404 MiB.
In 1987, the Palo Alto garage where Hewlett and Packard started their business was designated as a California State historical landmark.

In the 1990s, HP expanded their computer product line, which initially had been targeted at university, research, and business users, to reach consumers. HP also grew through acquisitions. It bought Apollo Computer in 1989 and Convex Computer in 1995.

Later in the decade, HP opened hpshopping.com as an independent subsidiary to sell online, direct to consumers; in 2005, the store was renamed "HP Home & Home Office Store." From 1995 to 1998, Hewlett-Packard was a sponsor of the English football team Tottenham Hotspur.

In 1999, all of the businesses not related to computers, storage, and imaging were spun off from HP to form Agilent Technologies. Agilent's spin-off was the largest initial public offering in the history of Silicon Valley. The spin-off created an $8 billion company with about 30,000 employees, manufacturing scientific instruments, semiconductors, optical networking devices, and electronic test equipment for telecom and wireless R&D and production. In July 1999, HP appointed Carly Fiorina as CEO, the first female CEO of a company in the Dow Jones Industrial Average. Fiorina served as CEO during the technology downturn of the early 2000s. During her tenure, the market value of HP halved and the company cut jobs heavily. The HP Board of Directors asked Fiorina to step down in 2005, and she resigned on February 9, 2005.

HP produces lines of printers, scanners, digital cameras, calculators, PDAs, servers, workstation computers, and computers for home and small-business use; many of the computers came from the 2002 merger with Compaq. HP as of 2001 promotes itself as supplying not just hardware and software, but also a full range of services to design, implement, and support IT infrastructure. HP’s Imaging and Printing Group (IPG) was described by the company in 2005 as “the leading imaging and printing systems provider in the world for printer hardware, printing supplies and scanning devices, providing solutions across customer segments from individual consumers to small and medium businesses to large enterprises”.

Hypercard – The History of Domain Names

Apple Computer’s HyperCard created

Date: 01/01/1987

HyperCard is an application program and programming tool for the Apple Macintosh and Apple IIGS computers, and was among the first successful hypermedia systems predating the World Wide Web. It combines database capabilities with a graphical, flexible, user-modifiable interface. HyperCard also features HyperTalk, a programming language for manipulating data and the user interface. This combination of features (simple form layout, database capabilities, and ease of programming) led to widespread use in many different roles. Some HyperCard users employed it as a system for rapid application development of applications and databases, others for building interactive applications with no database requirements, command-and-control systems, and many examples in the demoscene. HyperCard was originally released in 1987 for $49.95 and was included for free with all new Macs sold at the time. It was withdrawn from sale in March 2004, after its final update in 1998. HyperCard was never ported to Mac OS X but ran in the Classic Environment.

HyperCard changed the way people on the Macintosh platform created programs. Furthermore, it changed the way they worked and, for some, the way they learned. A regularly used and successful product after its launch in 1987, HyperCard was used by individuals, home users, businesses, and schools. Rather than having one sole purpose, HyperCard could be used for many things, ranging from communications, presentations, and basic games to keeping an address book or simply programming. Like many other Apple products, HyperCard was ahead of its time. Many have viewed it as a lost opportunity for Apple, especially because of its resemblance to HTML (HyperText Markup Language) on the World Wide Web.

Beginnings

Bill Atkinson started the HyperCard project in 1985. Atkinson had previously been a key member of the Macintosh and Lisa teams at Apple, writing MacPaint and contributing many inventions to the field of computing, including the marching ants and the selection lasso (both seen in many programs today, such as Photoshop and GIMP). Atkinson is also credited with the creation of QuickDraw and LisaGraf. In a series of unfortunate events for Apple, its contract with Microsoft for Applesoft BASIC ended in September 1985. Bill Gates, then CEO of Microsoft, exploited the opportunity and stopped Apple from releasing its own BASIC for the Mac. At the same time, Gates gained from then Apple CEO John Sculley a license to use the Macintosh user interface in future versions of his competing Windows operating system. Ultimately, this meant Apple lost a copyright infringement lawsuit against Microsoft that lasted four years. Atkinson started the HyperCard project to give users and developers on the Macintosh an easier way to create programs. At the time, programs were typically created with assembly code, and an Apple Lisa was also required to develop for the Mac. HyperCard was known as WildCard during its development. The name was later changed to HyperCard to avoid trademark issues. However, the HyperCard application and associated files retain a creator code of WILD, reflecting the development name.

HyperCard worked on the principle of stacks and cards. Cards hold data like a rolodex does, and they are contained in a stack that can be read by anyone who has HyperCard or the HyperCard Player. Cards can display whatever the creator wants, whether it be images, text, or sound. What made HyperCard unique was its capability to do multiple tasks. It could be used as a word processor, database, graphics system, calculator, an address book, and more. What’s more, it was entirely interactive rather than providing a static solution like a word processor does with a document that “just sits there”, as HyperCard’s creator Atkinson said in 1987.
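The stacks-and-cards metaphor described above can be sketched as a simple data model. This is a toy illustration of the metaphor only (all names invented), not HyperCard's actual file format or API:

```python
from dataclasses import dataclass, field

@dataclass
class Card:
    """One card in a stack: free-form named fields, like a rolodex entry."""
    fields: dict

@dataclass
class Stack:
    """A HyperCard-style stack: an ordered collection of cards."""
    name: str
    cards: list = field(default_factory=list)

    def add(self, **fields):
        """Append a new card and return the stack, so calls can be chained."""
        self.cards.append(Card(fields))
        return self

    def find(self, key, value):
        """Return all cards whose named field matches the given value."""
        return [c for c in self.cards if c.fields.get(key) == value]

# An address-book stack, one of the uses mentioned in the text.
rolodex = Stack("Addresses").add(name="Ada", city="London").add(name="Bill", city="Seattle")
```

HyperTalk scripts attached to cards and buttons are what made the real thing interactive; this sketch covers only the storage metaphor.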

Releases

HyperCard was officially released in 1987 to coincide with the MacWorld Conference & Expo in Boston, Massachusetts, to generate publicity and raise awareness. At the time, Apple CEO John Sculley said, "In many ways, HyperCard is just as important as the personal computer itself." HyperCard was bundled free of charge with all new Macintoshes, and Sculley said current owners could purchase it for less than $50. Despite all the attention HyperCard attracted, it was not the first piece of software of its type, as Apple had claimed. Elizabeth Armstrong of STart magazine noted in 1987 that the Atari ST's Zoomracks program came before HyperCard, two years earlier in fact. Zoomracks was very similar to HyperCard in that it used the "cards and stacks" metaphor.

Apple introduced HyperCard for the Apple IIGS (part of Apple's computer lineup from 1987 until 1993) in January 1991. Roger Wagner Publishing had already released a similar product for the IIGS in 1989 called HyperStudio, which also included a programming language similar to HyperTalk that offered more control over the way stacks were created. Apple handed HyperCard to Claris, its software subsidiary, in 1990. While the product was under Claris, several significant improvements were introduced, including System 7 support. In January 1993, the Macintosh press worried that HyperCard would soon be discontinued because of Claris' inability to make a profit from the product and the lack of finances to support the "many millions of HyperCard users who acquired the program for free when it was bundled with every Mac". Little less than a month later, HyperCard was handed back to Apple and a crisis was quickly averted. MacWeek called the original transfer a "misstep", as Claris had targeted the mass market instead of the developers and scriptors who "are HyperCard's real audience". The last HyperCard update was version 2.4, released in April 1998 according to Apple's Technical Notes. MacWorld positively reviewed the update, calling it a "no-brainer" for existing 2.3 users. However, the review also remarked that "it's hard to take HyperCard seriously when, after 11 years, it still sports the black-and-white interface designed for the original 9-inch Mac screen".

HyperCard was desperately in need of a 3.0 update, and the Macintosh community showed demand for such a release.

Demise and Legacy

As early as 1998, Macintosh users were wondering what Apple was doing with HyperCard. Some wanted Apple to release an update to the ever-aging product, while others thought the source code should be released to the community. Apple CEO Steve Jobs called the rumors of HyperCard's cancellation "bullshit" at CAUSE '98. However, only two years later, with no new updates, Jobs reassigned HyperCard engineers to other roles as he decided to abandon the project. In 2002, the International HyperCard Users Group estimated there were 10,000 active HyperCard users worldwide. Even then, the Macintosh faithful still hoped for a HyperCard 3.0. Apple continued selling HyperCard until March 2004, when sales ceased and the product was removed from its website. This move disappointed HyperCard users, especially those reliant on the product like Mr. Mays, who in 2002 ran two Dallas fast-food franchises with an order tracking system he had built himself using HyperCard.

Hypertext – The History of Domain Names

Tim Berners-Lee invented a network-based implementation of the hypertext concept (www.)

Date: 01/01/1989

HyperText is a way to link and access information of various kinds as a web of nodes in which the user can browse at will. It provides a single user-interface to large classes of information (reports, notes, data-bases, computer documentation and on-line help).

Hypertext concepts

The principles of hypertext, and their applicability to the CERN environment, are discussed more fully in the accompanying references, where a glossary of technical terms is also given. Here we give a short presentation of hypertext. A program which provides access to the hypertext world we call a browser. When starting a hypertext browser on your workstation, you will first be presented with a hypertext page which is personal to you: your personal notes, if you like. A hypertext page has pieces of text which refer to other texts. Such references are highlighted and can be selected with a mouse (on dumb terminals, they would appear in a numbered list and selection would be done by entering a number). When you select a reference, the browser presents you with the text which is referenced: you have made the browser follow a hypertext link.

That text itself has links to other texts, and so on. Clicking on a reference such as "GHI" would take you to the minutes of that meeting; there you might get interested in the discussion of the UPS, and click on the highlighted word UPS to find out more about it.

The texts are linked together in a way that one can go from one concept to another to find the information one wants. The network of links is called a web. The web need not be hierarchical, and therefore it is not necessary to "climb up a tree" all the way again before you can go down to a different but related subject. The web is also not complete, since it is hard to imagine that all the possible links would be put in by authors. Yet a small number of links is usually sufficient for getting from anywhere to anywhere else in a small number of hops.
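The claim that a few links per node suffice to get anywhere in a few hops can be checked with a breadth-first search over a web of nodes. A minimal sketch (the node names and links are invented for illustration):

```python
from collections import deque

def hops(web, start, goal):
    """Minimum number of link-follows from start to goal in a web of nodes.

    web maps each node to the list of nodes its links point at.
    Returns None if goal is unreachable from start.
    """
    if start == goal:
        return 0
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        for nxt in web.get(node, ()):
            if nxt == goal:
                return dist + 1
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None

# A tiny, non-hierarchical web: following two links reaches "ups" from "home".
web = {
    "home": ["minutes", "help"],
    "minutes": ["ups"],
    "help": [],
    "ups": ["home"],
}
```

Note the web is deliberately not a tree: "ups" links back to "home", so navigation never requires climbing back up a hierarchy.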

The texts are known as nodes. The process of proceeding from node to node is called navigation. Nodes do not need to be on the same machine: links may point across machine boundaries. Having a world wide web implies some solutions must be found for problems such as different access protocols and different node content formats. These issues are addressed by our proposal.

Nodes can in principle also contain non-text information such as diagrams, pictures, sound, animation etc. The term hypermedia is simply the expansion of the hypertext idea to these other media. Where facilities already exist, we aim to allow graphics interchange, but in this project, we concentrate on the universal readership for text, rather than on graphics.

Applications

The application of a universal hypertext system, once in place, will cover many areas such as document registration, on-line help, project documentation, news schemes and so on. It would be inappropriate for us (rather than those responsible) to suggest specific areas, but experiment online help, accelerator online help, assistance for computer center operators, and the dissemination of information by central services such as the user office and CN and ECP divisions are obvious candidates. WorldWideWeb (or W3 ) intends to cater for these services across the HEP community.

CERN. In 1980, Berners-Lee first started work as a consultant at CERN, originally called the Conseil Européen pour la Recherche Nucleaire, and now the European Particle Physics Laboratory, but still called CERN for old time’s sake. The organization consists of many facilities located in a beautiful area in the Jura mountains on the border between France and Switzerland. It was because CERN was so large and complex, with thousands of researchers and hundreds of systems, that Berners-Lee developed his first hypertext system to keep track of who worked on which project, what software was associated with which program, and which software ran on which computers. Like the development of packet switching, hyperlinks are an idea that seemed to want to be found, with Berners-Lee independently developing his ideas within five years of Ted Nelson and Douglas Engelbart.

Berners-Lee named his first hypertext system Enquire, after an old book he found as a child in his parents house called Enquire Within upon Everything which provided a range of household tips and advice. The book fascinated young Tim with the suggestion that it magically contained the answer to any problem in the world. With the building of the Enquire system in 1980, and then the Web ten years later, Berners-Lee has pretty much successfully dedicated his life to making that childhood book real.

From 1981 to 1984, Berners-Lee left CERN and worked at Image Computer Systems as Technical Design Lead, with responsibility for real-time, graphics, and communications software for an innovative software program that enabled older dot-matrix printers to print a wide range of advanced graphics. He then rejoined CERN full-time in 1984, and almost immediately started trying to get a hypertext project approved for official funding. In March, 1989, he completed a project proposal for a system to communicate information among researchers in the CERN High Energy Physics department, intended to help those having problems sharing information across a wide range of different networks, computers, and countries. The project had two main goals:

Open design. Like Robert Kahn’s design for TCP/IP, the hypertext system should have an open architecture, and be able to run on any computer being used at CERN including Unix, VMS, Macintosh, NextStep, and Windows.

Network distribution. The system should be distributed over a communications network. However, Berners-Lee thought that there might be an intermediary period when most of the research material was carried on individual CDROM’s, which never became necessary.

Cailliau. Robert Cailliau had independently proposed a project to develop a hypertext system at CERN, and joined Berners-Lee as a partner in his efforts to get the web off the ground. He rewrote the project proposal, lobbied management for funding, rounded up programmers, collaborated with Berners-Lee on papers and presentations, and helped run the first WWW conference. Cailliau later became President of the International World Wide Web Conference Committee (IW3C2).

Web development. In the fall of 1990, Berners-Lee took about a month to develop the first web browser on a NeXT computer, including an integrated editor that could create hypertext documents. He deployed the program on his and Cailliau’s computers, and they were both communicating with the world’s first web server at info.cern.ch on December 25, 1990.

The first project Berners-Lee and Cailliau tackled was to put the CERN telephone book on the web site, making the project immediately useful and gaining it rapid acceptance. Some CERN staff started keeping one window open on their computer at all times just to access the telephone web page.

Luckily, CERN had been connected to the ARPANET through the EUnet in 1990. In August 1991, Tim posted a notice on the alt.hypertext newsgroup about where to download their web server and line-mode browser, making it available around the world. Web servers started popping up around the globe almost immediately. An official Usenet newsgroup called comp.infosystems.www was soon established to share information.

Berners-Lee then added support for the FTP protocol to the server, making a wide range of existing FTP directories and Usenet newsgroups immediately accessible through a web page. He also added a telnet server on info.cern.ch, making a simple line browser available to anyone with a telnet client. The first public demonstration of the web server was given at the Hypertext 91 conference. Development of this web server, which came to be called CERN httpd, would continue until July, 1996.

In June 1992, CERN sent Berners-Lee on a three-month trip through the United States. First he visited MIT's Laboratory for Computer Science, then went to an IETF conference in Boston, then visited Xerox PARC in Palo Alto, California. At the end of this trip he visited Ted Nelson, then living on a houseboat in Sausalito. Interestingly, Nelson had experience with film making, Berners-Lee had experience working with lighting and audiovisual equipment in the amateur theater, and Tom Bruce, who created the first PC web browser called Cello, also worked professionally as a stage manager in the theater. Maybe these Internet techies are all really just artists at heart…

In a fateful decision that significantly helped the web to grow, Berners-Lee managed to get CERN to provide a certification on April 30, 1993, that the web technology and program code was in the public domain so that anyone could use and improve it.

Google – The History of Domain Names

Google founded

Date: 01/01/1998

Google is an American multinational technology company specializing in Internet-related services and products that include online advertising technologies, search, cloud computing, software, and hardware. Most of its profits are derived from AdWords, an online advertising service that places advertising near the list of search results.

Google was founded in 1996 by Larry Page and Sergey Brin while they were Ph.D. students at Stanford University, California. Together, they own about 14 percent of its shares and control 56 percent of the stockholder voting power through supervoting stock. They incorporated Google as a privately held company on September 4, 1998. An initial public offering (IPO) took place on August 19, 2004, and Google moved to its new headquarters in Mountain View, California, nicknamed the Googleplex.

In August 2015, Google announced plans to reorganize its interests as a holding company called Alphabet Inc. When this restructuring took place on October 2, 2015, Google became Alphabet’s leading subsidiary, as well as the parent for Google’s Internet interests.

Rapid growth since incorporation has triggered a chain of products, acquisitions and partnerships beyond Google’s core search engine (Google Search). It offers services designed for work and productivity (Google Docs, Sheets and Slides), email (Gmail), scheduling and time management (Google Calendar), cloud storage (Google Drive), social networking (Google+), instant messaging and video chat (Google Allo/Duo/Hangouts), language translation (Google Translate), mapping and turn-by-turn navigation (Google Maps), video-sharing (YouTube), taking notes (Google Keep), organizing and editing photos (Google Photos), and a web browser (Google Chrome). The company leads the development of the Android mobile operating system and the browser-only Chrome OS for a class of netbooks known as Chromebooks and desktop PCs known as Chromeboxes. Google has moved increasingly into hardware; from 2010 to 2015, it partnered with major electronics manufacturers in the production of its Nexus devices, and in October 2016, it launched multiple hardware products (the Google Pixel, Home, Wifi, and Daydream View), with new hardware chief Rick Osterloh stating that “a lot of the innovation that we want to do now ends up requiring controlling the end-to-end user experience”. In 2012, a fiber-optic infrastructure was installed in Kansas City to facilitate a Google Fiber broadband service, and in 2016, the company launched the Google Station initiative to make public “high-quality, secure, easily accessible Wi-Fi” around the world, which had already started, and become a success, in India.

Google has been estimated to run more than one million servers in data centers around the world (as of 2007). It processes over one billion search requests and about 24 petabytes of user-generated data each day (as of 2009). In December 2013, Alexa listed Google.com as the most visited website in the world. Numerous Google sites in other languages figure in the top one hundred, as do several other Google-owned sites such as YouTube and Blogger.

Google has been the second most valuable brand in the world for four consecutive years, with a 2016 valuation of $133 billion. Google’s mission statement from the outset was “to organize the world’s information and make it universally accessible and useful,” and its unofficial slogan was “Don’t be evil”. In October 2015, the motto was replaced in the Alphabet corporate code of conduct by the phrase “Do the right thing”. Google’s commitment to such robust idealism has increasingly been called into doubt by a number of actions and behaviours that appear to contradict it.

History

Google began in January 1996 as a research project by Larry Page and Sergey Brin when they were both PhD students at Stanford University in Stanford, California.

While conventional search engines ranked results by counting how many times the search terms appeared on the page, the two theorized about a better system that analyzed the relationships between websites. They called this new technology PageRank; it determined a website’s relevance by the number of pages, and the importance of those pages, that linked back to the original site.

Page and Brin originally nicknamed their new search engine “BackRub”, because the system checked backlinks to estimate the importance of a site. Eventually, they changed the name to Google, originating from a misspelling of the word “googol”, the number one followed by one hundred zeros, which was picked to signify that the search engine was intended to provide large quantities of information. Originally, Google ran under Stanford University’s website, with the domains google.stanford.edu and z.stanford.edu.
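The idea behind PageRank, that a page's importance depends on how many pages link to it, weighted by their own importance, can be sketched as a short power iteration. This is an illustrative toy (invented graph, simplified handling of dangling pages), not Google's production algorithm:

```python
def pagerank(links, damping=0.85, iterations=100):
    """Iterative PageRank over a dict mapping every page to its outbound links.

    Each round, a page keeps a baseline share (1 - damping) / n and receives
    a damped share of the rank of every page linking to it.
    """
    pages = sorted(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outs in links.items():
            if outs:
                share = damping * rank[page] / len(outs)
                for out in outs:
                    new[out] += share
            else:
                # Dangling page with no links: spread its rank evenly.
                for p in pages:
                    new[p] += damping * rank[page] / n
        rank = new
    return rank

# "a" is linked to by both "b" and "c", so it ends up ranked highest.
ranks = pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]})
```

The ranks always sum to 1, so they can be read as the probability that a random surfer, who mostly follows links but occasionally jumps to a random page, is found on each page.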

The domain name for Google was registered on September 15, 1997, and the company was incorporated on September 4, 1998. It was based in the garage of a friend (Susan Wojcicki) in Menlo Park, California. Craig Silverstein, a fellow PhD student at Stanford, was hired as the first employee.

In May 2011, the number of monthly unique visitors to Google surpassed one billion for the first time, an 8.4 percent increase from May 2010 (931 million). In January 2013, Google announced it had earned US$50 billion in annual revenue for the year of 2012. This marked the first time the company had reached this feat, topping their 2011 total of $38 billion.

The company reported fourth-quarter (December 2014) earnings per share (EPS) of $6.88, $0.20 under projections. Revenue came in at $14.5 billion (16.9% year-over-year growth), also under expectations, by $110 million.

Financing, 1998 and initial public offering, 2004

The first funding for Google was an August 1998 contribution of $100,000 from Andy Bechtolsheim, co-founder of Sun Microsystems, given before Google was incorporated. Early in 1999, while still graduate students, Brin and Page decided that the search engine they had developed was taking up too much time and distracting them from their academic pursuits. They went to Excite CEO George Bell and offered to sell it to him for $1 million. He rejected the offer and later criticized Vinod Khosla, one of Excite’s venture capitalists, after he negotiated Brin and Page down to $750,000. On June 7, 1999, a $25 million round of funding was announced, with major investors including the venture capital firms Kleiner Perkins Caufield & Byers and Sequoia Capital.

Google’s initial public offering (IPO) took place five years later, on August 19, 2004. At that time Larry Page, Sergey Brin, and Eric Schmidt agreed to work together at Google for 20 years, until the year 2024. The company offered 19,605,052 shares at a price of $85 per share. Shares were sold in an online auction format using a system built by Morgan Stanley and Credit Suisse, underwriters for the deal. The $1.67 billion sale gave Google a market capitalization of more than $23 billion. By January 2014, its market capitalization had grown to $397 billion. The vast majority of the 271 million shares remained under the control of Google, and many Google employees became instant paper millionaires. Yahoo!, a competitor of Google, also benefited because it owned 8.4 million shares of Google before the IPO took place.

There were concerns that Google’s IPO would lead to changes in company culture. Reasons ranged from shareholder pressure for employee benefit reductions to the fact that many company executives would become instant paper millionaires. In response to these concerns, co-founders Sergey Brin and Larry Page promised in a report to potential investors that the IPO would not change the company’s culture. In 2005, articles in The New York Times and other sources began suggesting that Google had lost its anti-corporate, “don’t be evil” philosophy. In an effort to maintain its unique culture, Google designated a Chief Culture Officer, who also serves as the Director of Human Resources, with the purpose of developing and maintaining the culture and keeping true to the core values the company was founded on: a flat organization with a collaborative environment. Google has also faced allegations of sexism and ageism from former employees. In 2013, a class action against several Silicon Valley companies, including Google, was filed over alleged “no cold call” agreements that restrained the recruitment of high-tech employees.

The stock performed well after the IPO, with shares hitting $350 for the first time on October 31, 2007, primarily because of strong sales and earnings in the online advertising market. The surge in stock price was fueled mainly by individual investors, as opposed to large institutional investors and mutual funds. GOOG shares later split into GOOG Class C shares and GOOGL Class A shares. The company is listed on the NASDAQ stock exchange under the ticker symbols GOOGL and GOOG, and on the Frankfurt Stock Exchange under the ticker symbol GGQ1. Since the fourth quarter of 2015, these ticker symbols have referred to Alphabet Inc., Google’s holding company.

Growth

In March 1999, the company moved its offices to Palo Alto, California, which is home to several prominent Silicon Valley technology start-ups. The next year, against Page and Brin’s initial opposition toward an advertising-funded search engine, Google began selling advertisements associated with search keywords. In order to maintain an uncluttered page design and increase speed, advertisements were solely text-based. Keywords were sold based on a combination of price bids and click-throughs, with bidding starting at five cents per click.

This model of selling keyword advertising was first pioneered by Goto.com, an Idealab spin-off created by Bill Gross. When the company changed its name to Overture Services, it sued Google over alleged infringements of its pay-per-click and bidding patents. Overture Services would later be bought by Yahoo! and renamed Yahoo! Search Marketing. The case was then settled out of court; Google agreed to issue shares of common stock to Yahoo! in exchange for a perpetual license.

In 2001, Google received a patent for its PageRank mechanism. The patent was officially assigned to Stanford University and lists Lawrence Page as the inventor. In 2003, after outgrowing two other locations, the company leased an office complex from Silicon Graphics at 1600 Amphitheatre Parkway in Mountain View, California. The complex became known as the Googleplex, a play on the word googolplex, the number one followed by a googol of zeros. The Googleplex interiors were designed by Clive Wilkinson Architects. Three years later, Google bought the property from SGI for $319 million. By that time, the name “Google” had found its way into everyday language, causing the verb “google” to be added to the Merriam-Webster Collegiate Dictionary and the Oxford English Dictionary, defined as “to use the Google search engine to obtain information on the Internet”. The first use of “Google” as a verb in pop culture (TV) happened on Buffy the Vampire Slayer in 2002.

The immense popularity of the search engine has led to fans calling themselves “Googlists” and following “Googlism” as a new religion. Devotees have founded The Church of Google, a non-profit online organization and website where they worship the search engine giant. The New York Times discussed the question “Is Google God?” in its opinion section.

2013 onward

Google announced the launch of a new company called Calico on September 19, 2013, which will be led by Apple chairman Arthur Levinson. In the official public statement, Page explained that the “health and well-being” company will focus on “the challenge of ageing and associated diseases”.

Google celebrated its 15-year anniversary on September 27, 2013, and in 2016 it celebrated its 18th birthday with an animated Doodle shown on web browsers around the world, although it has used other dates for its official birthday. The reason for the choice of September 27 remains unclear; a dispute with rival search engine Yahoo! Search in 2005 has been suggested as the cause.

The Alliance for Affordable Internet (A4AI) was launched in October 2013 and Google is part of the coalition of public and private organisations that also includes Facebook, Intel and Microsoft. Led by Sir Tim Berners-Lee, the A4AI seeks to make Internet access more affordable so that access is broadened in the developing world, where only 31% of people are online. Google will help to decrease Internet access prices so that they fall below the UN Broadband Commission’s worldwide target of 5% of monthly income.

The corporation’s consolidated revenue for the third quarter of 2013 was reported in mid-October 2013 as $14.89 billion, a 12 percent increase compared to the previous quarter. Google’s Internet business was responsible for $10.8 billion of this total, helped by an increase in the number of users’ clicks on advertisements.

In November 2013, Google announced plans for a new 1-million-sq-ft (93,000 sq m) office in London, which is due to open in 2016. The new premises will be able to accommodate 4,500 employees and has been identified as one of the biggest ever commercial property acquisitions in Britain.

According to Interbrand’s annual Best Global Brands report, Google has been the second most valuable brand in the world (behind Apple Inc.) in 2013, 2015, and 2016, with a valuation of $133 billion.

In September 2015, Google engineering manager Rachel Potvin revealed details about Google’s software code at an engineering conference. She revealed that the entire Google codebase, which spans every service the company develops, consists of over 2 billion lines of code. All of that code is stored in a single repository available to all 25,000 Google engineers, and it is regularly copied and updated across 10 Google data centers. To keep control, Potvin said, Google has built its own version control system, called “Piper”, and “when you start a new project, you have a wealth of libraries already available to you. Almost everything has already been done.” Engineers can make a single code change and deploy it to all services at the same time. The major exceptions are the PageRank search results algorithm, which is stored separately with access limited to specific employees, and the code for the Android operating system and the Google Chrome browser, which are also stored separately because they run on client devices rather than on Google’s servers.

The Piper system spans 85 TB of data. Google engineers make 25,000 changes to the code each day and, on a weekly basis, change approximately 15 million lines of code across 250,000 files. With that much code, automated bots have to help. “You need to make a concerted effort to maintain code health,” Potvin said, “and this is not just humans maintaining code health, but robots too.” The bots aren’t writing code; they generate much of the data and configuration files needed to run the company’s software. “Not only is the size of the repository increasing,” Potvin explained, “but the rate of change is also increasing. This is an exponential curve.”

In September 2016, Google released its Google Trips app for Android and iOS. The app, which automatically gathers information about upcoming trips users take based on the user’s Gmail messages, offers recommendations for places to go, things to do, and anything interesting to explore during the user’s travel. Google states that “each trip contains key categories of information, including day plans, reservations, things to do, food & drink, and more” in trip cards in the app, and that offline availability is supported.

Also in September 2016, Google began expanding its Google Station initiative, which was previously a project for public Wi-Fi at railway stations in India. Caesar Sengupta, VP for Google’s next billion users, told The Verge that 15,000 people get online for the first time thanks to Google Station, and that 3.5 million people use the service every month. The expansion meant that Google was looking for partners around the world to further develop the initiative, which promised “high-quality, secure, easily accessible Wi-Fi”.

As of October 2016, Google operates 70 offices in more than 40 countries.

Push into hardware

In April 2016, Recode reported that Google had hired Rick Osterloh, Motorola Mobility’s former President, to be in charge of Google’s new hardware division. Later, in October 2016, The Information reported that David Foster, Amazon.com’s former Kindle hardware chief, had joined Google as hardware chief for a new brand of smartphones by Google.

On October 4, 2016, Google held a #MadeByGoogle press event, where it announced its intention to create more hardware, with Rick Osterloh stating that “a lot of the innovation that we want to do now ends up requiring controlling the end-to-end user experience”, and introduced:

The Pixel and Pixel XL smartphones with the Google Assistant, a next-generation contextual voice assistant, built-in.

Google Home, an Amazon Echo-like voice assistant placed in the house that can answer voice queries, play music, find information from apps (calendar, weather etc.), and control third-party smart home appliances (users can tell it to turn on the lights, for example).

Daydream View virtual reality headset that lets Android users with compatible Daydream-ready smartphones put their phones in the headset and enjoy VR content.

Google Wifi, a connected set of Wi-Fi routers to simplify and extend coverage of home Wi-Fi.

GoogleEarth – The History of Domain Names

Google Earth virtual globe

Date: 01/01/2005

Google Earth is a virtual globe, map, and geographical information program, originally called EarthViewer 3D, created by Keyhole, Inc., a Central Intelligence Agency (CIA) funded company acquired by Google in 2004 (see In-Q-Tel). It maps the Earth by superimposing images obtained from satellite imagery, aerial photography, and geographic information system (GIS) data onto a 3D globe. It was originally available under three different licenses, but has since been reduced to just two: Google Earth (a free version with limited functionality) and Google Earth Pro, which is now free (it previously cost $399 a year) and is intended for commercial use. The third original option, Google Earth Plus, has been discontinued.

The product, re-released as Google Earth in 2005, is available for use on personal computers running Windows 2000 and above, Mac OS X 10.3.9 and above, Linux kernel: 2.6 or later (released on June 12, 2006), and FreeBSD. Google Earth is also available as a browser plugin which was released on May 28, 2008. It was also made available for mobile viewers on the iPhone OS on October 28, 2008, as a free download from the App Store, and is available to Android users as a free app in the Google Play store. In addition to releasing an updated Keyhole based client, Google also added the imagery from the Earth database to their web-based mapping software, Google Maps. The release of Google Earth in June 2005 to the public caused a more than tenfold increase in media coverage on virtual globes between 2004 and 2005, driving public interest in geospatial technologies and applications. As of October 2011, Google Earth has been downloaded more than a billion times.

Google Earth displays satellite images of varying resolution of the Earth’s surface, allowing users to see things like cities and houses looking perpendicularly down or at an oblique angle (see also bird’s eye view). The degree of resolution available is based somewhat on points of interest and popularity, but most land (except for some islands) is covered at a resolution of at least 15 meters. Areas such as Melbourne, Victoria, Australia; Las Vegas, Nevada, United States; and Cambridge, Cambridgeshire, United Kingdom include examples of the highest resolution, at 15 cm (6 inches). Google Earth allows users to search for addresses in some countries, enter coordinates, or simply use the mouse to browse to a location.

For large parts of the surface of the Earth, only 2D images are available, taken from almost vertical photography. When such imagery is viewed from an oblique angle there is perspective, in the sense that objects which are horizontally far away appear smaller, as when viewing a large photograph, but it is not quite a 3D view.

For other parts of the surface of the Earth, 3D images of terrain and buildings are available. Google Earth uses digital elevation model (DEM) data collected by NASA’s Shuttle Radar Topography Mission (SRTM). This means one can view almost the entire earth in three dimensions. Since November 2006, the 3D views of many mountains, including Mount Everest, have been improved by the use of supplementary DEM data to fill the gaps in SRTM coverage.
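A DEM such as SRTM’s is conceptually a grid of elevation samples, and a terrain viewer derives heights between samples by interpolation. A minimal sketch of the idea, using a made-up 3×3 grid and bilinear interpolation (this is illustrative, not Google Earth’s actual rendering pipeline):

```python
# Toy DEM: a grid of elevation samples in meters (values are made up).
dem = [
    [10.0, 12.0, 15.0],
    [11.0, 14.0, 18.0],
    [13.0, 17.0, 22.0],
]

def elevation(x, y):
    """Bilinearly interpolate the height at fractional grid coords (x, y)."""
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, 2), min(y0 + 1, 2)  # clamp at the grid edge
    fx, fy = x - x0, y - y0
    # Blend along x on the two bracketing rows, then blend along y.
    top = dem[y0][x0] * (1 - fx) + dem[y0][x1] * fx
    bottom = dem[y1][x0] * (1 - fx) + dem[y1][x1] * fx
    return top * (1 - fy) + bottom * fy
```

At a grid node the interpolated height equals the stored sample; between nodes it is a weighted blend of the four surrounding samples, which is why gaps in SRTM coverage can be filled seamlessly with supplementary DEM data.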

Some people use the application to add their own data, making it available through various sources such as Bulletin Board Systems (BBS) or blogs. Google Earth is able to show various kinds of images overlaid on the surface of the Earth and is also a Web Map Service client. Google Earth supports managing three-dimensional geospatial data through Keyhole Markup Language (KML).
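KML is an XML dialect, so a minimal placemark document can be assembled and read back with standard tooling alone. A sketch in Python (the placemark name and coordinates are illustrative; KML specifies longitude,latitude,altitude order):

```python
import xml.etree.ElementTree as ET

# A minimal KML document with a single placemark. The coordinates here
# are illustrative (roughly the Googleplex in Mountain View).
kml = """<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Googleplex</name>
    <Point>
      <coordinates>-122.0841,37.4220,0</coordinates>
    </Point>
  </Placemark>
</kml>"""

# Parse it back, resolving the KML namespace explicitly.
root = ET.fromstring(kml)
ns = {"k": "http://www.opengis.net/kml/2.2"}
name = root.find("k:Placemark/k:name", ns).text
lon, lat, alt = map(float, root.find(".//k:coordinates", ns).text.split(","))
```

Saved with a `.kml` extension, a file like this is the kind of user-created overlay Google Earth can load directly.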

Google Earth is based on 3D maps, with the capability to show 3D buildings and structures (such as bridges) built from user submissions created with SketchUp, a 3D modeling program. In earlier versions of Google Earth (before version 4), 3D buildings were limited to a few cities and had poorer rendering with no textures. Many buildings and structures from around the world now have detailed 3D models, including (but not limited to) those in the United States, Canada, Mexico, India, Japan, the United Kingdom, Spain, Germany, and Pakistan, and in the cities of Amsterdam and Alexandria. In August 2007, Hamburg became the first city shown entirely in 3D, including textures such as façades. The “Westport3D” model was created by the 3D imaging firm AM3TD using long-distance laser scanning technology and digital photography, and is the first such model of an Irish town to be created. As it was developed initially to aid local government in carrying out its town planning functions, it includes the highest-resolution photo-realistic textures to be found anywhere in Google Earth. Three-dimensional renderings are available for certain buildings and structures around the world via Google’s 3D Warehouse and other websites. In June 2012, Google announced that it would start to replace user-submitted 3D buildings with auto-generated 3D mesh buildings, starting with major cities. Although many cities on Google Earth are fully or partially 3D, more are available in the Earth Gallery, a library of modifications of Google Earth that people have made. The library contains not only modifications for 3D buildings, but also models of earthquakes using the Google Earth model, 3D forests, and much more.

Google has also added a feature that allows users to monitor traffic speeds at loops located every 200 yards in real time. In 2007, Google began offering traffic data in real time, based on information crowdsourced from the GPS-identified locations of cellular phone users. In version 4.3, released on April 15, 2008, Google Street View was fully integrated into the program, allowing it to provide a street-level view in many locations.

On January 31, 2010, the entirety of Google Earth’s ocean floor imagery was updated to new images by SIO, NOAA, US Navy, NGA, and GEBCO. The new images have caused smaller islands, such as some atolls in the Maldives, to be rendered invisible despite their shores being completely outlined.

Uses

Google Earth may be used to perform some day-to-day tasks and for other purposes.

Google Earth can be used to view areas subjected to widespread disasters, if Google supplies up-to-date images. For example, after the January 12, 2010 Haiti earthquake, images of Haiti were made available on January 17.

With Google’s push for the inclusion of Google Earth in the classroom, teachers are adopting it for lesson planning, from teaching students geographical themes (location, culture, characteristics, human interaction, and movement) to creating mashups with other web applications such as Wikipedia.

One can explore and place location bookmarks on the Moon and Mars.

One can get directions using Google Earth, using variables such as street names, cities, and establishments. However, the addresses must be typed into the search field; one cannot simply click on two spots on the map.

Google Earth can function as a hub of knowledge pertaining to the user’s location. By enabling certain options, one can see the location of gas stations, restaurants, museums, and other public establishments in the area. Google Earth can also dot the map with links to images, YouTube videos, and Wikipedia articles relevant to the area being viewed.

One can create custom image overlays for planning trips and hikes on handheld GPS units.

Google Earth can be used to map homes and select a random sample for research in developing countries.

All of these features are also covered by the Google Earth Blog.

Features

Wikipedia and Panoramio integration

In December 2006, Google Earth added a new layer called “Geographic Web” that includes integration with Wikipedia and Panoramio. For Wikipedia, entries are scraped for coordinates via the Coord templates. There is also a community layer from the Wikipedia-World project, which uses more coordinates, displays more types of entries, and supports more languages than the built-in Wikipedia layer. Google announced on May 30, 2007 that it was acquiring Panoramio. In March 2010, Google removed the “Geographic Web” layer; the “Panoramio” layer became part of the main layers and the “Wikipedia” layer was placed under the “More” layer.

Flight simulator

In Google Earth v4.2, a flight simulator was included as a hidden feature; starting with v4.3 it is no longer hidden. The flight simulator could be accessed by holding down the keys Ctrl, Alt, and A. Initially the F-16 Fighting Falcon and the Cirrus SR-22 were the only aircraft available, and they could be used with only a few airports. However, one can start a flight at the “current location” and need not be at an airport; the flight begins facing whatever direction the user faced when starting the simulator. Flight cannot be started from a ground-level view: to start in take-off position, the user must be near the ground (approximately 50–100 m above it); otherwise the flight starts in the air with 40% flaps and gear extended (landing configuration). In addition to keyboard control, the simulator can be controlled with a mouse or joystick, although Google Earth v5.1 and higher crashes when starting the flight simulator with Saitek and certain other joysticks. The user can also fly underwater.

Sky mode

Google Sky is a feature that was introduced in Google Earth 4.2 on August 22, 2007, and allows users to view stars and other celestial bodies. It was produced by Google through a partnership with the Space Telescope Science Institute (STScI) in Baltimore, the science operations center for the Hubble Space Telescope. Dr. Alberto Conti and his co-developer Dr. Carol Christian of STScI planned to add the public images from 2007, as well as color images of all of the archived data from Hubble’s Advanced Camera for Surveys. Newly released Hubble pictures are added to the Google Sky program as soon as they are issued. New features such as multi-wavelength data, positions of major satellites and their orbits, and educational resources are provided to the Google Earth community and also through Christian and Conti’s website for Sky. Also visible in Sky mode are constellations, stars, galaxies, and animations depicting the planets in their orbits. A real-time Google Sky mashup of recent astronomical transients, using the VOEvent protocol, is provided by the VOEventNet collaboration, with maps updated every five minutes.

Google Sky faces competition from Microsoft WorldWide Telescope (which runs only under the Microsoft Windows operating systems) and from Stellarium, a free open source planetarium that runs under Microsoft Windows, OS X, and Linux.

On March 13, 2008, Google made a web-based version of Google Sky available via the internet.

Street View

On April 15, 2008 with version 4.3, Google fully integrated its Street View into Google Earth. In version 6.0, the photo zooming function has been removed because it is incompatible with the new ‘seamless’ navigation.

Google Street View provides 360° panoramic street-level views and allows users to view parts of selected cities and their surrounding metropolitan areas at ground level. When it was launched on May 25, 2007 for Google Maps, only five cities were included. It has since expanded to more than 40 U.S. cities, and includes the suburbs of many, and in some cases, other nearby cities. Recent updates have now implemented Street View in most of the major cities of Canada, Mexico, Denmark, South Africa, Japan, Spain, Norway, Finland, Sweden, France, the UK, Republic of Ireland, the Netherlands, Italy, Switzerland, Portugal, Taiwan, and Singapore.

Google Street View, when operated, displays photos that were previously taken by a camera mounted on an automobile, and can be navigated by using the mouse to click on photograph icons displayed on the screen in the user’s direction of travel. The photos can be viewed in different sizes, from any direction, and from a variety of angles.

Water and ocean

Introduced in version 5.0 (February 2009), the Google Ocean feature allows users to zoom below the surface of the ocean and view the 3D bathymetry beneath the waves. Supporting over 20 content layers, it contains information from leading scientists and oceanographers. On April 14, 2009, Google added underwater terrain data for the Great Lakes. In 2010, Google added underwater terrain data for Lake Baikal.

In June 2011, Google increased the resolution of some deep ocean floor areas from 1-kilometer grids to 100 meters, thanks to a new synthesis of seafloor topography released through Google Earth. The high-resolution features were developed by oceanographers at Columbia University’s Lamont-Doherty Earth Observatory from scientific data collected on research cruises. The sharper focus is available for about 5 percent of the oceans (an area larger than North America). Underwater scenery can be seen of the Hudson Canyon off New York City, the Wini Seamount near Hawaii, and the sharp-edged 10,000-foot-high Mendocino Ridge off the U.S. Pacific Coast. There is a Google 2011 Seafloor Tour for those interested in viewing deep ocean terrain.

Historical Imagery

Introduced in version 5.0, Historical Imagery allows users to travel back in time and study earlier stages of any place. This feature enables research that requires analysis of past records of various places.

Influences

Google Earth can be traced directly back to a small company named Autometric, now part of Boeing. A team at Autometric, led by Robert Cowling, created a visualization product named Edge Whole Earth. Cowling demonstrated Edge to Michael T. Jones, Chris Tanner, and others at SGI in 1996. Several other imagery-based visualization products existed at the time, including Performer-based ones, but Michael T. Jones stated emphatically that he had “never thought of the complexities of rendering an entire globe …” The catch phrase “from outer space to in your face” was coined by Autometric president Dan Gordon, and used to explain his concept for personal/local/global range. Edge also blazed a trail in broadcasting, being used in 1997 on CBS News with Dan Rather, in print for rendering large images draped over terrain for National Geographic, and for special effects in the 1997 feature film Shadow Conspiracy.

Gordon was a huge fan of the ‘Earth’ program described in Neal Stephenson’s sci-fi classic Snow Crash. Indeed, a Google Earth co-founder claimed that Google Earth was modeled after Snow Crash, while another co-founder said it was inspired by the short science education film Powers of Ten. Google Earth was also at least partly inspired by a Silicon Graphics demo called “From Outer Space to in Your Face”, which zoomed from space into the Swiss Alps and then into the Matterhorn. That demo ran on an Onyx 3000 with InfiniteReality4 graphics, which supported clip mapping; Google Earth drew inspiration from that hardware texture-paging capability (although it did not itself use clip mapping) and from Powers of Ten. The first Google Earth implementation, called Earth Viewer, emerged from Intrinsic Graphics as a demonstration of Chris Tanner’s software-based implementation of a clip-mapping texture paging system, and was spun off as Keyhole, Inc.

Google Earth Pro

Google Earth Pro is a business-oriented upgrade to Google Earth with more features than the Plus version. It is the most feature-rich version of Google Earth available to the public, with various additions such as a movie maker and a data importer. In addition to business-friendly features, it has also been found useful for travelers thanks to its map-making tools. Until late January 2015 it was available for $399/year, but Google then made it free to the public. The Pro version includes add-on software such as:

  • Movie making.
  • GIS data importer.
  • Advanced printing modules.
  • Radius and area measurements.

Google Earth Plug-in

The Google Earth API is a free beta service, available for any website that is free to consumers. The plug-in and its JavaScript API let users place a version of Google Earth into web pages. The API enables sophisticated 3D map applications to be built. At its unveiling at Google’s 2008 I/O developer conference, the company showcased potential applications such as a game in which the player controlled a milk truck atop a Google Earth surface.

The Google Earth API has been deprecated as of December 15, 2014, and will remain supported until December 15, 2015. Google Chrome aims to end support for the Netscape Plugin API (which the Google Earth API relies on) by the end of 2016.

Gopher-www – The History of Domain Names

From Gopher to the WWW

Date: 01/01/1987

As the Internet grew through the 1980s and early 1990s, many people realized the increasing need to be able to find and organize files and information. Projects such as Gopher, WAIS, and the FTP Archive list attempted to create ways to organize distributed data. Unfortunately, these projects fell short in being able to accommodate all the existing data types and in being able to grow without bottlenecks. One of the most promising user interface paradigms during this period was hypertext. The technology had been inspired by Vannevar Bush’s “Memex” and developed through Ted Nelson’s research on Project Xanadu and Douglas Engelbart’s research on NLS. Many small self-contained hypertext systems had been created before, such as Apple Computer’s HyperCard (1987). Gopher became the first commonly used hypertext interface to the Internet. While Gopher menu items were examples of hypertext, they were not commonly perceived in that way.

Gosip – The History of Domain Names

GOSIP (without TCP/IP)

Date: 01/01/1990

Government Open Systems Interconnection Profile (GOSIP)

The Federal Information Processing Standard (FIPS) #146: Government Open Systems Interconnection Profile (GOSIP)

National Institute of Standards and Technology (NIST), 1990

In 1990, NIST released a document called The Federal Information Processing Standard (FIPS #146) which outlined the GOSIP. All vendors were asked to comply with government offices implementing this profile starting around August of 1990.

Unfortunately, by the time the government got around to standardizing and implementing GOSIP, the Internet had already fully adopted TCP/IP as its own standard and was using it exclusively. TCP/IP and GOSIP were NOT compatible. The government, caught with a technology that no longer matched the de facto standard used on the Internet, then mandated that the IETF and the IAB make the Internet compatible with GOSIP, even though the OSI GOSIP software was not yet finished and not fully standardized even by 1990, when the FIPS #146 document was published.

If you take a look at the NIST website today, you will see that FIPS 146 / GOSIP is no longer listed, a legacy of a bygone era.

Why do we bring up GOSIP? Because when dealing with old, crusty government and military networking folks, you will hear references to the OSI protocols, FIPS 146 and GOSIP, because it takes a looong, loooooongg time for government programs and protocols to die.

Fortunately, most government offices have now fully embraced the TCP/IP suite.

GTE – The History of Domain Names

GTE Corporation – GTE.com was registered

Date: 11/05/1986

On November 5, 1986, GTE Corporation registered the gte.com domain name, making it the 40th .com domain ever to be registered.

GTE Corporation, formerly General Telephone & Electric Corporation (1955–1982), was the largest independent telephone company in the United States during the days of the Bell System. The company operated from 1926, with roots tracing further back than that, until 2000, when it merged with Bell Atlantic; the combined company took the name Verizon.

Company History

GTE Corporation ranked as the world’s third-largest publicly owned telecommunications company in 1996. With over 20 million telephone access lines in 40 states, the communications conglomerate was America’s leading provider of local telephone services. The $6.6 billion acquisition of Contel Corporation in 1990 nearly doubled GTE’s Mobilnet cellular operations, making it the second-largest provider of cellular telephone services in the United States, with over two million customers. GTE’s strategy for the mid- to late 1990s focused on technological enhancement of wireline and wireless systems, expansion of data services, global expansion, and diversification into video services.

In March 1990 the largest merger in the history of the telecommunications industry united two former U.S. competitors, GTE Corporation and Contel Corporation, under the GTE name. With a market value of $28 billion, the merged company became a telecommunications powerhouse. Designed to take advantage of the two companies’ complementary businesses, the merger strengthened GTE’s assets in two of its three major areas of operations: telephone service and telecommunications products. While the two companies were united under one name, each has a rich history of its own. GTE’s heritage can be traced to 1918, when three Wisconsin public utility accountants pooled $33,500 to purchase the Richland Center Telephone Company, serving 1,466 telephones in the dairy belt of southern Wisconsin. From the outset, John F. O’Connell, Sigurd L. Odegard, and John A. Pratt worked under the guiding principle that better telephone service could be rendered to small communities if a number of exchanges were operated under one managing body.

In 1920 that principle was put into action, and the three accountants formed a corporation, Commonwealth Telephone Company, with Odegard as president, Pratt as vice-president, and O’Connell as secretary. Richland Center Telephone became part of Commonwealth Telephone, which quickly purchased telephone companies in three nearby communities. In 1922 Pratt resigned as vice-president and was replaced by Clarence R. Brown, a former Bell System employee. By the mid-1920s Commonwealth had extended beyond Wisconsin borders and purchased the Belvidere Telephone Company in Illinois. It also diversified into other utilities by acquiring two small Wisconsin electrical companies. Expansion was stepped up in 1926, when Odegard secured an option to purchase Associated Telephone Company of Long Beach, California. Odegard, with the assistance of Marshall E. Sampsell, president of Wisconsin Power and Light Company, and Morris F. LaCroix, a partner in Paine, Webber & Company in Boston, proceeded to devise a plan for a holding company, to be named Associated Telephone Utilities Company. That company was formed in 1926 to acquire Associated Telephone Company and assume the assets of Commonwealth Telephone. Sampsell was elected president of the new company, and Odegard and LaCroix were named vice-presidents. An aggressive acquisition program was quickly launched in eastern, midwestern, and western states, with the company using its own common stock to complete transactions.

During its first six years, Associated Telephone Utilities acquired 340 telephone companies, which were consolidated into 45 companies operating more than 437,000 telephones in 25 states. By the time the stock market bottomed out in October 1929, Associated Telephone Utilities was operating about 500,000 telephones with revenues approaching $17 million. In January 1930 a new subsidiary, Associated Telephone Investment Company, was established. Designed to support its parent’s acquisition program, the new company’s primary business was buying company stock in order to bolster its market value. Within two years the investment company had incurred major losses, and a $1 million loan had to be negotiated. Associated Telephone Investment was dissolved but not before its parent’s financial plight had become irreversible, and in 1933 Associated Telephone Utilities went into receivership. The company was reorganized that same year and resurfaced in 1935 as General Telephone Corporation, operating 12 newly consolidated companies. John Winn, a 26-year veteran of the Bell System, was named president. In 1936 General Telephone created a new subsidiary, General Telephone Directory Company, to publish directories for the parent’s entire service area.

In 1940 LaCroix was elected General Telephone’s first chairman, and Harold Bozell, a former banker for Associated Telephone Utilities, was named president. Like other businesses, the telephone industry was under government restrictions during World War II, and General Telephone was called upon to increase services at military bases and war-production factories. Following the war, General Telephone reactivated an acquisitions program that had been dormant for more than a decade and purchased 118,000 telephone lines between 1946 and 1950. In 1950 General Telephone purchased its first telephone-equipment manufacturing subsidiary, Leich Electric Company, along with the related Leich Sales Corporation. Bozell retired in 1951 and Donald Power, a former executive secretary for Ohio Governor John Bricker, was named president. By the time Power took over, General Telephone’s assets included 15 telephone companies operating in 20 states. During the 1950s Power guided the company in a steady, aggressive acquisition campaign punctuated by two major mergers. In 1955 Theodore Gary & Company, the second-largest independent telephone company, which had 600,000 telephone lines, was merged into General Telephone, which had grown into the largest independent outside the Bell System. The merger gave the company 2.5 million lines. Theodore Gary’s assets included telephone operations in the Dominican Republic, British Columbia, and the Philippines, as well as Automatic Electric, the second-largest telephone equipment manufacturer in the U.S. LaCroix and Power were to retain their positions in the merged company, but a month before the deal was closed, LaCroix died, and Power assumed the additional title of chairman.

In 1959 General Telephone and Sylvania Electric Products merged, and the parent’s name was changed to General Telephone & Electronics Corporation (GT&E). The merger gave Sylvania (a leader in such industries as lighting, television and radio, and chemistry and metallurgy) the needed capital to expand. For General Telephone, the merger meant the added benefit of Sylvania’s extensive research and development capabilities in the field of electronics. Power also orchestrated other acquisitions in the late 1950s, including Peninsular Telephone Company in Florida, with 300,000 lines, and Lenkurt Electric Company, Inc., a leading producer of microwave and data transmission systems. In 1960 the subsidiary GT&E International Incorporated was formed to consolidate manufacturing and marketing activities of Sylvania, Automatic Electric, and Lenkurt outside the United States. The following year, Leslie H. Warner, a former Theodore Gary executive, was named president. Another former Theodore Gary executive, Don Mitchell, was named to the new position of vice-chairman, while Power remained chief executive officer and chairman. During the early 1960s the scope of GT&E’s research, development, and marketing activities was broadened. In 1963 Sylvania began full-scale production of color television picture tubes, and within two years it was supplying color tubes for 18 of the 23 domestic U.S. television manufacturers. About the same time, Automatic Electric began supplying electronic switching equipment for the U.S. defense department’s global communications systems, and GT&E International began producing earth-based stations for both foreign and domestic markets. GT&E’s telephone subsidiaries, meanwhile, began acquiring community-antenna television systems (CATV) franchises in their operating areas.

In 1964 Warner orchestrated a deal that merged Western Utilities Corporation, the nation’s second-largest independent telephone company, with 635,000 telephones, into GT&E. The following year Sylvania introduced the revolutionary four-sided flashcube, enhancing its position as the world’s largest flashbulb producer. Warner assumed the additional title of chief executive officer in 1966, while Power remained chairman. Acquisitions in telephone service continued under Warner during the mid-1960s. Purchases included Quebec Telephone in Canada, Hawaiian Telephone Company, and Northern Ohio Telephone Company and added a total of 622,000 telephone lines to GT&E operations. By 1969 GT&E was serving ten million telephones. In March 1970 GT&E’s New York City headquarters was bombed by a radical antiwar group in protest of the company’s participation in defense work. In December of that year the GT&E board agreed to move the company’s headquarters to Stamford, Connecticut. Power retired in 1971, and Warner was named chairman and chief executive officer. The following year Theodore F. Brophy was named president. After initially proposing to build separate satellite systems, GT&E and its telecommunications rival, American Telephone & Telegraph Co., announced in 1974 joint venture plans for the construction and operation of seven earth-based stations interconnected by two satellites. That same year Sylvania acquired name and distribution rights for Philco television and stereo products. GTE International expanded its activities during the same period, acquiring television manufacturers in Canada and Israel and a telephone manufacturer in Germany.

Warner retired in 1976 and Brophy was named to the additional post of chairman. Brophy, soon after assuming his new position, reorganized the company along five global product lines: communications, lighting, consumer electronics, precision materials, and electrical equipment. GTE International was phased out during the reorganization, and GTE Products Corporation was formed to encompass both domestic and foreign manufacturing and marketing operations. At the same time, GTE Communications Products was formed to oversee operations of Automatic Electric, Lenkurt, Sylvania, and GTE Information Systems.

Thomas A. Vanderslice was elected president and chief operating officer in 1979, and another reorganization soon followed. GTE Products Group was eliminated as an organizational unit and GTE Electrical Products, consisting of lighting, precision materials, and electrical equipment, was formed. Vanderslice also revitalized the GT&E Telephone Operating Group in order to develop competitive strategies for anticipated regulatory changes in the telecommunications industry. GT&E sold its consumer electronics businesses, including the accompanying brand names of Philco and Sylvania in 1980, after watching revenues from television and radio operations decrease precipitously with the success of foreign manufacturers. Following AT&T’s 1982 announcement that it would divest 22 telephone operating companies, GT&E made a number of reorganizational and consolidation moves. In 1982 the company adopted the name GTE Corporation and formed GTE Mobilnet Incorporated, to handle the company’s entrance into the new cellular telephone business. In 1983 GTE sold its electrical equipment, brokerage information services, and cable television equipment businesses. That same year, Automatic Electric and Lenkurt were combined as GTE Network Systems.

GTE became the third-largest long-distance telephone company in 1983 through the acquisition of Southern Pacific Communications Company. At the same time, Southern Pacific Satellite Company was acquired, and the two firms were renamed GTE Sprint Communications Corporation and GTE Spacenet Corporation, respectively. Through an agreement with the Department of Justice, GTE conceded to keep Sprint Communications separate from its other telephone companies and limit other GTE telephone subsidiaries in certain markets. In December 1983 Vanderslice resigned as president and chief operating officer. In 1984 GTE formalized its decision to concentrate on three core businesses: telecommunications, lighting, and precision metals. That same year, the company’s first satellite was launched, and GTE’s cellular telephone service went into operation; GTE’s earnings exceeded $1 billion for the first time.

James (Rocky) L. Johnson, a former senior vice-president, was named president and chief operating officer in 1986. That same year, GTE acquired Airfone Inc., a telephone service provider for commercial aircraft and railroads, and Rotaflex plc, a United Kingdom-based manufacturer of lighting fixtures. Beginning in 1986 GTE spun off several operations to form joint ventures. In 1986 GTE Sprint and United Telecommunication’s long-distance subsidiary, U.S. Telecom, agreed to merge and form US Sprint Communications Company, with each parent retaining a 50 percent interest in the new firm. That same year, GTE transferred its international transmission, overseas central office switching, and business systems operations to a joint venture with Siemens AG of Germany, which took 80 percent ownership of the new firm. The following year, GTE transferred its business systems operations in the United States to a new joint venture, Fujitsu GTE Business Systems, Inc., formed with Fujitsu Limited, which retained 80 percent ownership. Johnson succeeded Brophy as chairman and chief executive officer in 1987 and then relinquished his president’s title the following year to Charles R. Lee, a former senior vice-president. Johnson continued to streamline and consolidate operations, organizing telephone companies around a single national organization headquartered in the Dallas, Texas, area. In 1988 GTE divested its consumer communications products unit as part of a telecommunications strategy to place increasing emphasis on the services sector. The following year GTE sold the majority of its interest in US Sprint to United Telecommunications and its interest in Fujitsu GTE Business Systems to Fujitsu. In 1989 GTE and AT&T formed the joint venture company AG Communication Systems Corporation, designed to bring advanced digital technology to GTE’s switching systems. 
GTE retained 51 percent control over the joint venture, with AT&T pledging to take complete control of the new firm in 15 years.

With an increasing emphasis on telecommunications, in 1989 GTE launched a program to become the first cellular provider offering nationwide service and introduced the nation’s first rural service area, providing cellular service on the Hawaiian island of Kauai. The following year GTE acquired the Providence Journal Company’s cellular properties in five southern states for $710 million and became the second largest cellular-service provider in the United States. In 1990 GTE reorganized its activities around three business groups: telecommunications products and services, telephone operations, and electrical products. That same year, GTE and Contel Corporation announced merger plans that would strengthen GTE’s telecommunications and telephone sectors. Following action or review by more than 20 governmental bodies, in March 1991 the merger of GTE and Contel was approved. Johnson and Lee maintained their positions as chairman and president, respectively, while Contel’s Chairman Charles Wohlstetter became vice-chairman of GTE. Contel’s former president, Donald Weber, agreed to remain with the company during a six-month transition period, before leaving the merged company. Contel Corporation’s earliest predecessor, Telephone Communications Corporation, was founded by Charles Wohlstetter. After working as a Wall Street runner in the 1920s and as a Hollywood screenwriter in the 1930s, Wohlstetter returned to Wall Street in the 1940s and became a financier. In 1960 he made what he would later call a bad investment in an Alaskan oil company that would become the impetus for Contel. To help turn that investment around, Wohlstetter recruited the services of Jack Maguire and Phillip Lucier from a telephone supply company and then raised $1.5 million to form a holding company, Telephone Communications Corporation. Wohlstetter was named chairman of the new corporation, Lucier was named president, and Maguire was named vice-president. 
Some 30 years later, Wohlstetter’s $1.5 million investment had grown into a company that had acquired and consolidated more than 750 smaller companies with total corporate assets hovering around $6 billion.

One of the company’s first acquisitions was Central Western Company, which merged with Telephone Communications in 1961 to form the new parent Continental Telephone Company. The acquisition of Central Western, along with Harfil, Inc., provided the company with customer billing, general accounting, and toll separation services. Continental based its early acquisition strategy on Kriegspiel, a historical war game German generals played at Prussian war colleges. Wohlstetter applied the tenets of the game to telephone company operations and amassed detailed information on each independent telephone company in the United States. When those companies came up for sale, Wohlstetter and Maguire, who were pilots, and Lucier, whose wife was a pilot, would promptly fly off to meet the owners and negotiate purchase agreements. Many of the early acquisitions were made through exchanges of stock, including the 1964 merger with Independent Telephone Company that doubled the company’s size and changed its name in the process to Continental Independent Telephone Corporation. By the close of 1964, Continental had acquired more than 100 companies operating in 30 states. The company adopted another new name, Continental Telephone Corporation, in 1965. Also during 1965 Continental acquired 65 more telephone companies and again doubled its size. By 1966 Continental had acquired more than 500 independent companies, had become the third-largest independent telephone company in the United States, and was one of the youngest companies ever listed on the New York Stock Exchange. By 1970 Continental’s assets had topped $1 billion, and sales volume had risen to $120 million. Lucier died that year and was succeeded as president by Maguire, who moved up from a vice-presidency. Aside from its dominating telephone business, the company’s activities by that time had grown to include cable television systems, directory publishing, equipment leasing, and data services.

With the number of small independents having diminished considerably by 1970, Continental’s pace in acquiring telephone operating companies was reduced. Continental sold its cable television business in 1971, and after a sluggish economy had taken its toll on Continental’s manufacturing and supply subsidiaries, those, too, were sold in 1976. Maguire resigned in 1976 because of health problems and was succeeded as president by James V. Napier, a former executive vice-president. That same year, Continental became the first telephone company outside the Bell system to install a digital telephone switching system, a move that provided improved network operating efficiency, allowed the introduction of new calling features, and started the transition away from operations dominated by rural service areas. In response to the changing regulatory climate of the telephone industry, in 1978 Continental mapped out a diversification strategy into nonregulated businesses. Continental’s first diversification move came in 1979, with the acquisition of Executone, Inc., a New York-based communications equipment maker.

By 1980 Continental had two million telephone access lines in service and had established its first fiber-optic cable, a high-speed, high-capacity telecommunications transmission mode. While Continental continued the process of upgrading its telephone operations, during the early 1980s the company’s focus turned to greater diversification. In July 1980 Continental entered the satellite business through a joint venture with Fairchild Industries, and a communications partnership firm, American Satellite Company, was formed to operate a network of earth-based stations that provided voice and data services. To provide technology services to accommodate its expanding needs, Continental then acquired two consulting and research firms, Network Analysis Corporation and International Computing Company. In 1981 Continental acquired Page Communications Engineers Inc., later renamed Contel Page, which gave Continental expertise in the engineering, installation, and maintenance of satellite-to-earth stations. One year later, Continental hooked up with Fairchild Industries in a second joint venture called Space Communications Company, a provider of tracking and relay data services for such clients as the National Aeronautics and Space Administration. After the Federal Communications Commission opened the door to licenses for 30 cellular phone markets in 1981, Continental plunged into that field as well, acquiring sizable shares of cellular markets in Los Angeles, California; Washington, D.C.; and Minneapolis, Minnesota. Continental also entered the credit card authorization business in 1981, with the purchase of National Bancard Corporation. Two years later, Continental bolstered its interest in that business segment with the purchase of the Chase Merchants Services division of Chase Manhattan Bank. 
In 1982 the corporation changed its name to Continental Telecom Incorporated, adopted a new corporate logo, and inaugurated an advertising campaign around the theme “architects of telecommunications.” Continental’s expansion into the information services sector continued in 1982 with the purchase of STSC Inc., a computer services supplier; and Cado Systems Corporation, a maker of small business computers. That same year company revenues surpassed the $2 billion mark for the first time. In 1984 Continental formed the subsidiary Contel Cellular Inc. to handle the corporation’s growing cellular operations. A year later, Continental culminated its diversification moves and reorganized into four business sectors: telephone and cellular operations; business systems, offering voice and data processing products and services; federal systems, handling various facets of communication and information systems for government agencies; and information systems, offering telecommunications systems and services to large corporations, institutions, and government entities.

As a result of the company’s growing interest in the information services marketplace, in 1985 Continental acquired several computer system and software companies, including Northern Data Systems, Data Equipment Systems Corporation, and Sooner Enterprises, Inc. Continental also purchased Fairchild Industries’s interests in American Satellite Co., later renamed Contel ASC, and Space Communications Company. That same year, Continental sold its directory publishing division, its time-share services business, and its credit card authorization business. In the midst of reorganization in 1985, Napier resigned, and John N. Lemasters, former American Satellite Company president, was named president and chief executive officer. Continental’s telephone operations were repositioned during the mid-1980s through numerous sales and exchanges. Subsidiaries in Nebraska, Colorado, Alaska, the Bahamas, and Barbados were sold, and operations in Michigan were exchanged for similar operations in Indiana and three southern states. The name Contel Corporation was adopted in 1986. That same year, Contel’s new tenant services division set the stage for future growth by acquiring tenant service operations in Atlanta and Seattle. The tenant services division installed and managed customized communications systems in commercial buildings and marketed those systems to the buildings’ tenants. Contel also enhanced its information services division with the acquisition of IPC Communications, Inc., a supplier of a special-purpose telephone system used by financial traders, and expanded its federal systems operations with the purchase of Western Union Corporation’s government systems division, a provider of information handling systems. In September 1986 Contel announced it had agreed to merge with Communications Satellite Corporation (Comsat), but by mid-1987 Contel had called off the deal, citing Comsat’s unstable financial picture. The failed merger sparked the resignation of Lemasters. 
Donald W. Weber, former executive vice-president and head of telephone operations, was named Lemasters’s successor as president and chief executive officer. Contel acquired Comsat’s international private-line business and its very-small-aperture terminal (VSAT) satellite business in 1987, as well as Equatorial Communications Company, a provider of private satellite data networks. That same year, Contel agreed to sell Executone, its troubled telephone interconnect business, and Texocom, Contel’s equipment supply business. In the late 1980s Contel continued to narrow its focus in the information systems sectors. In 1988 it sold its computer-based business, Contel Business Systems, and a year later disposed of Contel Credit Corporation. Contel Federal Systems continued to grow during that same period, and in 1988 it acquired two Eaton Corporation subsidiaries: Information Management Systems and Data Systems Services. Two years later Contel purchased Telos Corporation, with expertise in government-preferred computer software. Contel’s tenant services and cellular businesses also got a boost in 1988 with the acquisition of RealCom Communications Corporation, an IBM tenant services subsidiary, and Southland Mobilcom Inc.’s interests in the Mobile, Alabama, and the Pensacola, Florida, cellular markets. In 1990 Contel completed the biggest acquisition in its history, a $1.3 billion purchase of McCaw Cellular Communications, Inc.’s controlling interests in 13 cellular markets, which added more than six million potential customers and doubled Contel’s cellular potential population market (known in the industry as POPs). While important, that move was eclipsed by the merger with GTE announced later that same year. 
Through that transition, the two former competitors were expected to integrate telephone and mobile-cellular operations and capitalize on business unit similarities in the field of satellite-communications as well as in communications systems and services targeting government entities.

Over half of Contel’s $6.6 billion purchase price, $3.9 billion, was assumed debt. When Charles Lee succeeded James (Rocky) L. Johnson to become CEO in 1992, his first order of business was reduction of that obligation. He sold GTE’s North American Lighting business to a Siemens affiliate for over $1 billion, shaved off local exchange properties in Idaho, Tennessee, Utah, and West Virginia to generate another $1 billion, divested the company’s interest in Sprint in 1992, and sold its GTE Spacenet satellite operations to General Electric in 1994. The long-heralded telecommunications bill, expected to go into effect in 1996, promised to encourage competition among local phone providers, long distance services, and cable television companies. Many leading telecoms prepared for the new competitive realities by aligning themselves with entertainment and information providers. GTE, on the other hand, continued to focus on its core operations, seeking to make them as efficient as possible. In 1992, Lee launched a sweeping reorganization that was characterized by Telephony magazine as “easily one of the nation’s largest re-engineering processes.” Among other goals, his plan sought to double revenues and slash costs by $1 billion per year by focusing on several key areas of operation: technological enhancement of wireline and wireless systems, expansion of data services, global expansion, and diversification into video services. GTE hoped to cross-sell its large base of wireline customers on wireless, data and video services, launching Tele-Go, a user-friendly service that combined cordless and cellular phone features. The company bought broadband spectrum cellular licenses in Atlanta, Seattle, Cincinnati and Denver, and formed a joint venture with SBC Communications to enhance its cellular capabilities in Texas.
In 1995, the company undertook a 15-state test of video conferencing services, as well as a video dialtone (VDT) experiment that proposed to offer cable television programming to 900,000 homes by 1997. GTE also formed a video programming and interactive services joint venture with Ameritech Corporation, BellSouth Corporation, SBC, and The Walt Disney Company in the fall of 1995. Foreign efforts included affiliations with phone companies in Argentina, Mexico, Germany, Japan, Canada, the Dominican Republic, Venezuela and China. The early 1990s reorganization included a 37.5 percent workforce reduction, from 177,500 in 1991 to 111,000 by 1994. Lee’s strategy had begun to bear fruit by the mid-1990s. While the communication conglomerate’s sales remained rather flat, at about $19.8 billion, from 1992 through 1994, its net income increased by 43.7 percent, from $1.74 billion to a record $2.5 billion, during the same period.

GMR – The History of Domain Names

General Motors – GMR.com was registered

Date: 05/08/1986

On May 8, 1986, General Motors became the 16th company to register a .com domain, gmr.com.

General Motors Research Laboratories is the part of General Motors responsible for the creation of the first known operating system (GM-NAA I/O) in 1955, and it contributed to the Dodrill-GMR, the first mechanical heart successfully used during open-heart surgery. In 1962 GM created the General Motors Research Laboratories, based in Santa Barbara, California, to conduct research and development activities on defense systems. This organization was eventually merged into Delco Electronics and renamed Delco Systems Operations.

In 1985 General Motors purchased Hughes Aircraft and merged it with Delco Electronics to form Hughes Electronics Corporation, an independent subsidiary. In 1997 all of the defense businesses of Hughes Electronics (including Delco Systems Operations) were merged with Raytheon, and the commercial portion of Delco Electronics was transferred to GM’s Delphi Automotive Systems business. Delphi became a separate publicly traded company in May 1999, and continued to use the Delco Electronics name for several of its subsidiaries through approximately 2004.

Although Delco Electronics no longer exists as an operating company, GM still retains rights to the Delco name and uses it for some of its subsidiaries including the AC Delco parts division.

The General Motors Research Laboratories

THE GENERAL MOTORS CORPORATION in its institutional advertisement in August 1965 issues of national periodicals featured librarians under the legend “Answer Ma’am” and revealed that among its resources are twenty-two company libraries in the United States with seventy-eight persons on their staffs. The capstone of this library “system” is the General Motors Research Laboratories Library, housed at the beautiful 600-acre Technical Center in suburban Warren, Michigan. This Library (abbreviated as the “GMR Library”) has existed since 1917, has carried out an interloan service for other company divisions since 1927, has prepared a current awareness bulletin since 1933, and has a larger total collection and staff and receives more current periodical titles than many of the public library departments of technology and science responding to the questionnaire by Daniel Pfoutz and Jackson Cohen. Yet it is far from self-sufficient. It borrows about 1,100 items on interlibrary loan a year (equivalent to one-sixth of the total interlibrary loans supplied to all industry in a year by the Massachusetts Institute of Technology libraries), has deposit accounts in several of the libraries mentioned by Ralph Phelps, William Budington, Dwight Gray, and J. Burlin Johnson, and still cannot answer all the demands placed upon it by its three “publics.” These are, first, the Research Laboratories staff of 500 professional scientists and engineers; second, the several hundred engineers and scientists in other staffs and groups at the Technical Center site (some of whom have their own professional library support); and, third, the thirty-three Divisions of the Corporation located all over the world. Using communications terminology, the GMR Library serves as a “switching function” between the needs of a substantial portion of the Corporation’s literature users and the library community.
(For descriptions of the GMR Library and the Corporation’s “library system,” two earlier articles may be consulted.) Further, it serves as a switching function between these users and the documentation services of the Department of Defense, Atomic Energy Commission, and others. This last function is also exercised by certain Division library services having broader access to more highly classified military documents than the “need to know” of the Research Laboratories permits. It seems inevitable that there is a triangle associated with the availability of information to an industrial firm.

At the apex is a small area representing material physically located at a given location; below this, there is an area larger in size representing material elsewhere in the Company; and finally the far larger area in the remainder of the triangle represents the material available on the outside. Interlibrary loan is the catalyst that permits a decentralized industrial library system to work under these circumstances. Although the accurate keeping of statistics suffers under the pressure of business to get the job done, the figures on interlibrary loans (from non-GMR libraries) and interloans (to other parts of the Corporation) are more accurate than most which were kept. Table 1 shows that they have been fairly constant at the respective levels of 1,100 interlibrary loans and nearly 6,000 interloan requests per year during the period 1957 to 1964. As interloans initiate a certain percentage of GMR interlibrary loan requests, they will be considered first. It is to be noted that Table 1 excludes all activities contiguous to the Research Laboratories, as employees of a transportation manufacturer are highly mobile and accustomed to taking advantage of “cafeteria” service of the kind offered by the GMR Library.

Godaddy Auctions – The History of Domain Names

Go Daddy Auctions sells 32,151 domains in November

December 3, 2012

Go Daddy has updated its Auctions Domain Market Report with results from last month.

Last month Go Daddy Auctions sold 32,151 domain names, down just slightly from the month before.

Monthly domain sales in the low 30,000s seem to be the new norm on the site.

The top 10 sales last month were:

MyModa.com $50,000

Funlife.com $50,000 – currently in escrow with Afternic

Brilliant.org $25,000

LowRates.com $25,000 – bought by Sun West Mortgage Company, Inc., seems like a great purchase

Betland.com $20,000

Fastnet.com $15,000 – bought by FastNet International Limited, owner of Fastnet.co.uk

JBW.com $15,000 – bought by JBW Timepieces, which owns JBWtime.com

HeavyDuty.com $13,226

Solaire.net $12,000

Nationline.com $11,375
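As a quick sanity check, the ten reported prices above can be totaled with a short script. The prices are taken directly from the list; the totals computed below are not part of the original report:

```python
# Top 10 reported Go Daddy Auctions sales for November 2012 (USD),
# copied from the list above.
top_sales = {
    "MyModa.com": 50_000,
    "Funlife.com": 50_000,
    "Brilliant.org": 25_000,
    "LowRates.com": 25_000,
    "Betland.com": 20_000,
    "Fastnet.com": 15_000,
    "JBW.com": 15_000,
    "HeavyDuty.com": 13_226,
    "Solaire.net": 12_000,
    "Nationline.com": 11_375,
}

total = sum(top_sales.values())
average = total / len(top_sales)
print(f"Top-10 total: ${total:,}")        # Top-10 total: $236,601
print(f"Top-10 average: ${average:,.2f}")  # Top-10 average: $23,660.10
```

So the ten largest sales account for roughly $237,000 of the month's volume, while the remaining 32,000-plus sales were much smaller transactions.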

The top keyword found was once again “online”.

Godaddy-Auctionsales – The History of Domain Names

Go Daddy Auctions sells 31,481 domains

September 4, 2012

The monthly Go Daddy Auctions Domain Market Report is out, and the headline number is 31,481 domain names sold during August.

That’s lower than July’s 33,373 domains.

The top sale for August was TeachingJobs.com at a strong $89,250. The domain name appears to have been previously developed or used as a lead gen site.

Go Daddy’s report tends to be adjusted after its initial release. July’s original report showed fewer domains sold than are now reported. The higher number is good news — as is the addition of a new top sale from July: TMG.com for $80,000.

Godaddy SedoMLS – The History of Domain Names

GoDaddy joins SedoMLS network

January 30, 2012

The world’s largest domain name registrar has joined the SedoMLS network, giving more exposure to domain owners selling their domains through Sedo.

Domains that are listed on Sedo with a fixed price will automatically show up when a GoDaddy customer searches for the exact domain name. This will be possible because the domains are listed on GoDaddy auctions, and GoDaddy now shows auction listings when someone searches on its main web site.

Much like the similar deal GoDaddy has with AfternicDLS, customers will have to go through the GoDaddy Auctions shopping cart rather than the main GoDaddy checkout process in order to buy a domain name.

Also, the process does not allow for instant transfer, so domains will have to go through the standard escrow process. The upside is that you don’t have to keep your domains at a SedoMLS-participating registrar in order to list them.

Sedo will charge a 20% commission on any sale through the GoDaddy partnership.
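As a worked example of that commission (the 20% rate is from the announcement above; the $10,000 sale price is hypothetical):

```python
def seller_proceeds(sale_price: float, commission_rate: float = 0.20) -> float:
    """Return what the seller nets after Sedo's commission on a
    GoDaddy-partnership sale, at the stated 20% rate."""
    return sale_price * (1 - commission_rate)

# Hypothetical $10,000 fixed-price sale: Sedo keeps $2,000, seller nets $8,000.
print(seller_proceeds(10_000))  # 8000.0
```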

Godaddy – The History of Domain Names

GoDaddy Sold 109,000 Domains

August 4, 2011

Godaddy launched a domain name market analysis section on its site today, which touts the claim that it is selling more domain names at auction than industry veteran Sedo.com.

According to Godaddy: Go Daddy sold more than 109,000 domain names through auctions in 2nd Quarter 2011.

Sedo sold approximately 10,600 domain names in 2nd Quarter 2011.

This includes a list of top sales by month, including June’s sale of UpCloud.com for a whopping $100,000.

The sheer volume of sales is likely due to the fact that Godaddy, one of the largest domain registrars, auctions off domain names that are expiring from its user base, while Sedo does not. The comparison is a bit apples to oranges, yet the numbers are stunning. Namejet.com might be a better company to compare Godaddy with: Namejet auctions many registrar deletions at a starting price of $69, compared to Godaddy’s $10 starting price.

The data on the site also shows some interesting numbers about how many domains are sold via Buy It Now, Offer/Counter-offer and Auction. Again, the figures are likely skewed, with the Auction category receiving more volume because of the deleting names that Godaddy auctions. The Buy Now number is the impressive one. It would be great to see these numbers made a little more granular, to show how many of the auctions were deleting/expired domains vs. actual sellers, and how many of the Buy It Now sales were Premium Listing names sold from home page searches.

The launch of this new section of the Godaddy site and the recent lowering of commissions seem to be clear indicators that Godaddy is aggressively going after the domain aftermarket business. Kudos to Godaddy for sharing this data.

gold biz – The History of Domain Names

Gold.biz attracts 16,000 EUR bid at Sedo

January 2, 2012

Gold.biz has a 16,000 euro bid and is scheduled to close on January 6.

That’s a heck of a price for a .biz domain, regardless of the keyword.

At 16,000 EUR, gold.biz would be the highest .biz domain name sale since g.biz sold for $30,000 in August.

This is one of the highest potential sales for a .biz over the last few years.