What is the Internet?
The Internet is a worldwide, publicly accessible series of interconnected computer networks that transmit data by packet switching using the standard Internet Protocol (IP). It is a “network of networks” that consists of millions of smaller domestic, academic, business, and government networks, which together carry various information and services, such as electronic mail, online chat, file transfer, and the interlinked web pages and other resources of the World Wide Web (WWW).
The Internet and the World Wide Web are not synonymous. The Internet is a collection of interconnected computer networks, linked by copper wires, fiber-optic cables, wireless connections, etc. In contrast, the Web is a collection of interconnected documents and other resources, linked by hyperlinks and URLs. The World Wide Web is one of the services accessible via the Internet, along with various others including e-mail, file sharing, online gaming and others described below.
America Online, Comcast, Earthlink, and other companies are examples of Internet service providers (ISPs). They make it physically possible for you to send and receive data over the Internet: they connect you to their own computers or routers, which are in turn connected to the Internet.
The World Wide Web is an example of an information protocol/service that can be used to send and receive information over the Internet.
The server software for the World Wide Web is called an HTTP server (or, informally, a Web server). Examples are Apache and IIS. The client software for the World Wide Web is called a Web browser. Examples are Netscape, Internet Explorer, Safari, Firefox, and Mozilla. These are particular “brands” of software with a similar function, just as Lotus 1-2-3 and Excel are both spreadsheet software packages.
The Internet is an international network of computers connected by wires such as telephone lines. Schools, businesses, government offices, and many homes use the Internet to communicate with one another. You have access to the Internet when you work in one of this university’s computer labs. You also may have access at home or in your residence hall. If not, you can obtain access once you have three things. First, you need a computer and a modem, a device that allows you to connect your computer with the Internet. Many new computers have built-in modems. Second, you need a browser, a piece of software that allows you to view information on the Internet. Many new computers also come with a browser, usually Internet Explorer. You also can download another popular browser, Netscape Navigator, from the Internet for free. Finally, you need to subscribe to an Internet Service Provider, or ISP, such as America Online or Carolina Online.
One popular component of the Internet is electronic mail, or e-mail, which people at separate locations can use to send messages to one another. In general, each of these people has an e-mail address, which usually looks something like this: mark.canada@uncp.edu. The first part of the address (mark.canada) specifies the individual user, and the rest of the address refers to the server (uncp.edu), which is a computer that can store a lot of information.
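The two-part structure of an address can be shown with a short sketch. The helper function below is a made-up illustration, using the sample address discussed above.

```python
# Split an e-mail address into its two parts: the local part (the
# individual user) and the domain (the server that stores the mail).
def split_address(address):
    local_part, domain = address.rsplit("@", 1)
    return local_part, domain

user, server = split_address("mark.canada@uncp.edu")
print(user)    # the individual user: mark.canada
print(server)  # the server: uncp.edu
```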
In addition to allowing people to send e-mail messages to one another, the Internet also allows organizations and individuals to post information about themselves so that others can see it. For example, many companies post pictures and descriptions on World Wide Web sites. In fact, you can set up your own World Wide Web site by reserving space on a server. To understand how this process works, imagine that you wanted to store some articles you have written at a library so that people could come and read them. First, you would need to obtain permission from the librarians, who would assign you a folder where they would store your articles. Whenever you finished a new article, you would put a name on it and send it to the librarians, who would then place it in your folder. When people wanted to read one of these articles, they would need to know the address of the library, the name of your folder, and the name of the specific article they want to read. When they supplied this information, the librarian would give them the article they want.
The World Wide Web works the same way. First you need to identify an Internet company (librarian) and ask permission to save Web pages (articles) on its server (library). The company (librarian) then assigns you a directory (folder) where it will store your Web pages (articles). As you create each Web page (article), you give it a filename (name) and publish it on the server (send it to the library). When people want to read your Web page (article), they need your Web address, sometimes called a Uniform Resource Locator, or URL. The URL consists of the domain name of the server (address of the library), name of your directory (name of your folder), and the filename of the particular Web page (name of article).
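The three parts of a URL can be pulled apart programmatically. The sketch below uses Python's standard `urllib.parse` module; the URL itself is a hypothetical example.

```python
from urllib.parse import urlparse

# A URL combines the domain name of the server (the library's address),
# the directory (your folder), and the filename (the article's name).
url = "http://www.uncp.edu/home/canada/intro.html"
parts = urlparse(url)

print(parts.netloc)  # domain name of the server: www.uncp.edu
print(parts.path)    # directory plus filename: /home/canada/intro.html
```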
The Internet and its Characteristics
By the late 1990s the Internet had evolved into a complex environment. Originally a military communications network, it is now routinely used for five types of operations: (i) long-distance transactions (e.g. e-commerce, form-filling, remote work, entertainment); (ii) interpersonal communication; (iii) data storage; (iv) research (i.e. data finding); (v) remote data access and downloading.
The Internet is a dynamic and mercurial system endowed with a number of traits.
The Internet Tools and their Characteristics
The evolution of the Internet is punctuated by the introduction and mass acceptance of such key resources and tools as Unix, Email, Usenet newsgroups, Telnet, Listserv mailing list software, the File Transfer Protocol, Internet Relay Chat, WAIS, Gopher and the WWW, and more recently by the AltaVista search engine and the Java language.
The foundations of an operating system called Unix were laid at AT&T Bell Laboratories in 1969. Unix is not a product of Internet culture. It is its catalyst and cornerstone. Internet culture owes Unix a major debt in four areas. These conceptual and procedural debts are: multitasking, community fostering, openness and extensibility, and public access to the source code. Let’s briefly look at each of these debts.
Unix was one of the first operating systems to embody the principle of multitasking (time-sharing). In the most general terms this means that several users could simultaneously operate within a single environment and that the system as a whole coped well with this complicated situation. Unix was the first operating system to demonstrate, in practical terms, robustness and tolerance for the variety of its users’ simultaneous activities.
Email was the first of the Internet’s tools dedicated to the provision of fast, simple and global communication between people. This revolutionary client/server software meant, for the first time, that individuals (both persons and roles) could have their own unique electronic addresses. Within this framework messages were now able to follow their individual recipients anywhere in the world.
The initial format of email communication was that of a one-to-one exchange of electronic messages. This simple function was subsequently augmented by email’s ability to handle various attachments, such as documents with complex formatting, numbers and graphic files. Later, with the use of multi-recipient mailing lists, electronic mail could be used for simple multicasting of messages in the form of one-to-many transmissions.
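Both extensions described above (attachments and one-to-many addressing) are visible in how a modern mail message is built. The sketch below uses Python's standard `email.message` module; all names and addresses are made up.

```python
from email.message import EmailMessage

# A one-to-many message with a small attachment (hypothetical addresses).
msg = EmailMessage()
msg["From"] = "alice@example.org"
msg["To"] = "bob@example.org, carol@example.org"   # multiple recipients
msg["Subject"] = "Meeting notes"
msg.set_content("Notes attached.")

# Attaching a file turns the message into a multipart container.
msg.add_attachment(b"draft agenda", maintype="application",
                   subtype="octet-stream", filename="agenda.txt")

print(msg["To"])
print(msg.get_content_type())  # multipart/mixed once an attachment is added
```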
Usenet (Unix Users Network), the wide-area array of sites collating and swapping UUCP-based messages was pioneered in 1979. Usenet was originally conceived as a surrogate for the Internet (then called ARPANET). It was to be used by people who did not have ready access to the TCP/IP protocol and yet wanted to discuss their various Unix tools. It was only in 1987 that the NNTP (Network News Transfer Protocol) was established in order to enable Usenet to be carried on the Internet (i.e. TCP/IP) networks (Laursen 1997).
The networking tool called Telnet was invented in 1980 (Postel 1980). It allowed people (with adequate access rights) to log in remotely to any networked computer in the world and to employ the usual gamut of computer commands. Thereby files and directories could be established, renamed and deleted; electronic mail read and dispatched; Usenet flame wars indulged in; and statistical packages run against numeric data - all at a distance. Moreover, the results of all these and other operations could be remotely directed to a printer or, via FTP, to another networked computer. In short, Telnet gave us the ability to engage in long-distance man-machine transactions, that is, the ability to work as telecommuters.
File Transfer Protocol
The FTP client/server technology was first introduced in 1985 (Barnes 1997). Its usefulness to Internet culture is three-fold. Firstly, FTP was the first widely accepted tool for the systematic permanent storage and worldwide transmission of substantial electronic information (e.g. programs, text files, image files).
Secondly, FTP archives promoted the use of anonymous login (i.e. limited public access) techniques as a way of coping with the mounting general requests for access to the archived information. That novel technique placed electronic visitors in a strictly circumscribed work environment. There they could browse through data subdirectories, copy relevant files, as well as deposit (within the context of a dedicated area) new digital material. However, the FTP software would not let them wander across other parts of the host, nor did the visitors have the right to change any component part of the accessed electronic archive.
Thirdly, the rapid proliferation in the number of public-access FTP archives all over the world necessitated techniques for keeping an authoritative, up-to-date catalogue of their contents. This was accomplished through the Archie database (Deutsch et al. 1995) and its many mirrors. Archie used an automated process which periodically scanned the entire contents of all known “anonymous FTP” sites and reported its findings back to a central database.
This approach, albeit encumbered by the need to give explicit instructions as to which of the FTP systems needed to be monitored, nevertheless integrated a motley collection of online resources into a single, cohesive, distributed information system.
Web-based Client/Server
Gopher client/server software was used for the first time in 1991 (La Tour nd; Liu, C. et al. 1994). It was a ground-breaking development on two counts. Firstly, it acted as a predictable, unified environment for handling an array of other electronic tools, such as Telnet, FTP and WAIS. Secondly, Gopher acted as electronic glue which seamlessly linked together archipelagos of information tracked by and referenced by other Gopher systems. In short, Gopher was the first tool capable of the creation and mapping of a rich, large-scale, and infinitely extendable information space.
World Wide Web Server
The first prototype of the WWW server was built in 1991 (Cailliau 1995; Berners-Lee nd; Berners-Lee 1998). The WWW server is an invention which has redefined the way the Internet is visualized by its users.
Firstly, the WWW server introduced to the Internet powerful point-and-click hypertext capabilities. The hypertext notions of a home page and of links spanning the entire body of data were first successfully employed on a small, standalone scale in 1986 in the Macintosh software called HyperCard (Goodman 1987). The WWW, however, was the first hypertext technology applied to distributed online information. The invention had been theoretically anticipated by a number of writers: in 1945 by Vannevar Bush of Memex fame, and again in 1965 by Theodor Nelson, who embarked on the never-completed Project Xanadu (Nielsen 1995, Gilster 1997:267). Hypertext itself is not a new idea. It is already implicitly present (albeit in an imperfect, paper-based form) in the first alphabetically ordered dictionaries, such as the Grand dictionnaire historique, compiled in 1674 by Louis Moréri, or John Harris’ Lexicon Technicum, published in 1704 (PWN 1964). It is also evident in the apparatus, such as footnotes, commentaries, appendices and references, of a 19th-century scholarly monograph.
The hypertext principle as employed by the WWW server meant that any part of any text (and subsequently, image) document could act as a portal leading directly to any other nominated segment of any other document anywhere in the world.
Secondly, the WWW server introduced an explicit address for subsets of information. A common and simple addressing methodology (the Uniform Resource Locator [URL] scheme) enabled users to uniquely identify AND access any piece of networked information anywhere in a document, anywhere on one’s own computer, or - with the same ease - anywhere in the world.
Thirdly, the WWW provided a common, simple, effective and extendable language for document markup. The HTML language could be used in three different yet complementary ways: (a) as a tool for establishing the logical structure of a document; (b) as a tool for shaping the size, appearance and layout of lines of text on the page; (c) as a tool for building the internal (i.e. within the same document) and external (to a different document residing on the same or totally different server) hypertext connections.
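All three uses of the markup language can be seen in a single minimal page. The fragment below is a made-up illustration (the external address is a placeholder):

```
<html>
  <head><title>My Articles</title></head>
  <body>
    <h1 id="top">My Articles</h1>                    <!-- (a) logical structure -->
    <p><b>Welcome</b> to my page.</p>                <!-- (b) appearance of the text -->
    <a href="#top">Back to top</a>                   <!-- (c) internal hypertext link -->
    <a href="http://www.example.com/">Elsewhere</a>  <!-- (c) external hypertext link -->
  </body>
</html>
```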
The interlocking features of hypertext, URLs and the markup language laid the foundations for today’s global, blindingly fast and infinitely complex cyberspace. Moreover, the World Wide Web, like Gopher before it, was also a powerful electronic glue which smoothly integrated not only most of the existing Internet tools (Email, Usenet, Telnet, Listservs, FTP, IRC, and Gopher, but, surprisingly, not WAIS), but also the whole body of online information which could be accessed by all those tools. However, the revolutionary strengths of the Web were not immediately obvious to most of the Internet community, who initially regarded the WWW as a mere (and possibly clumsy) variant of the then popular Gopher technology. This situation changed only with the introduction of PC-based Web browsers with user-friendly graphical interfaces.
World Wide Web Browsers
The principle of a client/server division of labour was put to work yet again in the form of a series of WWW browsers such as Mosaic (built in 1993), Lynx (an ASCII, Telnet-based client), Erwise, Viola and Cello, as well as, since 1994, several editions of Netscape and Explorer. Each of these Web browsers, except for Lynx (a deliberately simplified and thus very fast piece of software), provided Internauts with a series of novel capabilities.
These are: (a) the ability to handle multi-format, or multimedia (numbers, text, images, animations, video, sound), data within the framework of a single online document; (b) the ability to configure and modify the appearance of received information in a manner which best suits the preferences of the reader; (c) the ability to use the browser as a WYSIWYG (“what you see is what you get”) tool for crafting and proofreading locally created HTML pages on a user’s PC; (d) the ability to acquire, save and display the full HTML source code of any published web document.
Elements of Internet Architecture
To communicate using the Internet system, a host must implement the layered set of protocols comprising the Internet protocol suite. A host typically must implement at least one protocol from each layer.
The protocol layers used in the Internet architecture are described below.
The Application Layer is the top layer of the Internet protocol suite. The Internet suite does not further subdivide the Application Layer, although some application layer protocols do contain some internal sub-layering. The application layer of the Internet suite essentially combines the functions of the top two layers - Presentation and Application - of the OSI Reference Model [ARCH:8]. The Application Layer in the Internet protocol suite also includes some of the functions relegated to the Session Layer in the OSI Reference Model.
We distinguish two categories of application layer protocols: user protocols that provide service directly to users, and support protocols that provide common system functions. The most common Internet user protocols are Telnet (remote login), FTP (file transfer) and SMTP (electronic mail delivery).
There are a number of other standardized user protocols and many private user protocols. Support protocols, used for host name mapping, booting, and management include SNMP, BOOTP, TFTP, the Domain Name System (DNS) protocol, and a variety of routing protocols.
The Transport Layer provides end-to-end communication services. This layer is roughly equivalent to the Transport Layer in the OSI Reference Model, except that it also incorporates some of OSI’s Session Layer establishment and destruction functions.
There are two primary Transport Layer protocols at present: the Transmission Control Protocol (TCP) and the User Datagram Protocol (UDP).
TCP is a reliable connection-oriented transport service that provides end-to-end reliability, resequencing, and flow control. UDP is a connectionless (datagram) transport service. Other transport protocols have been developed by the research community, and the set of official Internet transport protocols may be expanded in the future.
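The difference between the two services can be sketched in a few lines. The example below sends a single UDP datagram over the loopback interface: note that there is no connection set-up at all, in contrast to TCP, where a connection must first be established. The port is chosen by the operating system; this is an illustrative sketch, not a production pattern.

```python
import socket

# A receiving UDP socket bound to the loopback interface.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))      # let the OS pick a free port
receiver.settimeout(5)
port = receiver.getsockname()[1]

# A sending UDP socket: no connection, just a self-contained datagram.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"hello", ("127.0.0.1", port))

data, addr = receiver.recvfrom(1024)
print(data)  # b'hello'
sender.close()
receiver.close()
```

A TCP exchange would instead use `SOCK_STREAM` sockets and require `listen()`, `connect()` and `accept()` before any data flowed, which is exactly the connection-oriented set-up that buys reliability, resequencing and flow control.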
All Internet transport protocols use the Internet Protocol (IP) to carry data from source host to destination host. IP is a connectionless or datagram internetwork service, providing no end-to-end delivery guarantees. IP datagrams may arrive at the destination host damaged, duplicated, out of order, or not at all. The layers above IP are responsible for reliable delivery service when it is required. The IP protocol includes provision for addressing, type-of-service specification, fragmentation and reassembly, and security.
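Damage detection in IP rests on a 16-bit ones'-complement checksum carried in the datagram header. A minimal Python sketch of the classic Internet checksum algorithm (RFC 1071), shown here purely as an illustration:

```python
def internet_checksum(data: bytes) -> int:
    """Ones'-complement sum of 16-bit words, as used by IP, TCP and UDP."""
    if len(data) % 2:
        data += b"\x00"                           # pad odd-length input
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]     # add the next 16-bit word
        total = (total & 0xFFFF) + (total >> 16)  # fold the carry back in
    return ~total & 0xFFFF                        # ones' complement of the sum

# The worked example from RFC 1071 yields checksum 0x220D.
print(hex(internet_checksum(bytes([0x00, 0x01, 0xF2, 0x03, 0xF4, 0xF5, 0xF6, 0xF7]))))
```

A receiver recomputes the sum over the received header (including the checksum field) and discards the datagram if the result is not all ones, which is how "damaged" arrivals are detected and left to the upper layers to recover.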
The datagram or connectionless nature of IP is a fundamental and characteristic feature of the Internet architecture. The Internet Control Message Protocol (ICMP) is a control protocol that is considered to be an integral part of IP, although it is architecturally layered upon IP - it uses IP to carry its data end-to-end. ICMP provides error reporting, congestion reporting, and first-hop router redirection.
The Internet Group Management Protocol (IGMP) is an Internet layer protocol used for establishing dynamic host groups for IP multicasting.
To communicate on a directly connected network, a host must implement the communication protocol used to interface to that network. We call this a Link Layer protocol. Some older Internet documents refer to this layer as the Network Layer, but it is not the same as the Network Layer in the OSI Reference Model.
This layer contains everything below the Internet Layer and above the Physical Layer (which is the media connectivity, normally electrical or optical, which encodes and transports messages). Its responsibility is the correct delivery of messages, among which it does not differentiate.
Protocols in this Layer are generally outside the scope of Internet standardization; the Internet (intentionally) uses existing standards whenever possible. Thus, Internet Link Layer standards usually address only address resolution and rules for transmitting IP packets over specific Link Layer protocols.
The constituent networks of the Internet system are required to provide only packet (connectionless) transport. According to the IP service specification, datagrams can be delivered out of order, be lost or duplicated, and/or contain errors.
For reasonable performance of the protocols that use IP (e.g., TCP), the loss rate of the network should be very low. In networks providing connection-oriented service, the extra reliability provided by virtual circuits enhances the end-end robustness of the system, but is not necessary for Internet operation.
Constituent networks may generally be divided into two classes: local-area networks (LANs) and wide-area networks (WANs).
In the Internet model, constituent networks are connected together by IP datagram forwarders which are called routers or IP routers. In this document, every use of the term router is equivalent to IP router. Many older Internet documents refer to routers as gateways. Historically, routers have been realized with packet-switching software executing on a general-purpose CPU. However, as custom hardware development becomes cheaper and as higher throughput is required, special purpose hardware is becoming increasingly common. This specification applies to routers regardless of how they are implemented.
A router connects to two or more logical interfaces, represented by IP subnets or unnumbered point-to-point lines. Thus, it has at least one physical interface. Forwarding an IP datagram generally requires the router to choose the address and relevant interface of the next-hop router or (for the final hop) the destination host. This choice, called relaying or forwarding, depends upon a route database within the router. The route database is also called a routing table or forwarding table.
The term “router” derives from the process of building this route database; routing protocols and configuration interact in a process called routing. The routing database should be maintained dynamically to reflect the current topology of the Internet system. A router normally accomplishes this by participating in distributed routing and reachability algorithms with other routers.
Routers provide datagram transport only, and they seek to minimize the state information necessary to sustain this service in the interest of routing flexibility and robustness.
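The forwarding decision described above amounts to a longest-prefix match against the route database. A toy sketch using Python's standard `ipaddress` module; the prefixes and next-hop names are made-up examples:

```python
import ipaddress

# A tiny forwarding table: prefix -> next hop (hypothetical entries).
routes = {
    ipaddress.ip_network("0.0.0.0/0"): "default-gateway",
    ipaddress.ip_network("10.0.0.0/8"): "router-A",
    ipaddress.ip_network("10.1.0.0/16"): "router-B",
}

def next_hop(destination: str) -> str:
    addr = ipaddress.ip_address(destination)
    matches = [net for net in routes if addr in net]
    best = max(matches, key=lambda net: net.prefixlen)  # longest prefix wins
    return routes[best]

print(next_hop("10.1.2.3"))   # the /16 is more specific than the /8
print(next_hop("192.0.2.1"))  # only the default route matches
```

Real routers use specialized data structures (tries, TCAMs) for the same lookup, but the rule - the most specific matching prefix decides the next hop - is identical.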
Packet switching devices may also operate at the Link Layer; such devices are usually called bridges. Network segments that are connected by bridges share the same IP network prefix forming a single IP subnet. These other devices are outside the scope of this document.
Common uses of the Internet
The concept of sending electronic text messages between parties in a way analogous to mailing letters or memos predates the creation of the Internet. Even today it can be important to distinguish between Internet and internal e-mail systems. Internet e-mail may travel and be stored unencrypted on many other networks and machines out of both the sender’s and the recipient’s control. During this time it is quite possible for the content to be read and even tampered with by third parties, if anyone considers it important enough. Purely internal or intranet mail systems, where the information never leaves the corporate or organization’s network, are much more secure, although in any organization there will be IT and other personnel whose job may involve monitoring, and occasionally accessing, the e-mail of other employees not addressed to them.
The World Wide Web
Many people use the terms Internet and World Wide Web (or just the Web) interchangeably, but, as discussed above, the two terms are not synonymous.
The World Wide Web is a huge set of interlinked documents, images and other resources, linked by hyperlinks and URLs. These hyperlinks and URLs allow the web servers and other machines that store originals, and cached copies, of these resources to deliver them as required using HTTP (Hypertext Transfer Protocol). HTTP is only one of the communication protocols used on the Internet. Web services also use HTTP to allow software systems to communicate in order to share and exchange business logic and data.
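What a browser actually sends over the wire is surprisingly small. The sketch below assembles a minimal HTTP/1.1 GET request for a hypothetical page; the host and path are placeholders:

```python
host = "www.example.com"
path = "/index.html"

request = (
    f"GET {path} HTTP/1.1\r\n"   # method, resource, protocol version
    f"Host: {host}\r\n"          # the Host header is mandatory in HTTP/1.1
    "Connection: close\r\n"      # ask the server to close after responding
    "\r\n"                       # a blank line ends the header section
)
print(request)
```

The server replies with a status line (e.g. `HTTP/1.1 200 OK`), its own headers, a blank line, and then the requested resource.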
Software products that can access the resources of the Web are correctly termed user agents. In normal use, web browsers, such as Internet Explorer and Firefox, access web pages and allow users to navigate from one to another via hyperlinks. Web documents may contain almost any combination of computer data including photographs, graphics, sounds, text, video, multimedia and interactive content including games, office applications and scientific demonstrations.
Through keyword-driven Internet research using search engines like Yahoo! and Google, millions of people worldwide have easy, instant access to a vast and diverse amount of online information. Compared to encyclopedias and traditional libraries, the World Wide Web has enabled a sudden and extreme decentralization of information and data.
It is also easier, using the Web, than ever before for individuals and organizations to publish ideas and information to an extremely large audience. Anyone can find ways to publish a web page or build a website for very little initial cost. Publishing and maintaining large, professional websites full of attractive, diverse and up-to-date information is still a difficult and expensive proposition, however.
Many individuals and some companies and groups use “web logs” or blogs, which are largely used as easily updatable online diaries. Some commercial organizations encourage staff to fill them with advice on their areas of specialization in the hope that visitors will be impressed by the expert knowledge and free information, and be attracted to the corporation as a result. One example of this practice is Microsoft, whose product developers publish their personal blogs in order to pique the public’s interest in their work.
Collections of personal web pages published by large service providers remain popular, and have become increasingly sophisticated. Whereas operations such as Angelfire and GeoCities have existed since the early days of the Web, newer offerings from, for example, Facebook and MySpace currently have large followings. These operations often brand themselves as social network services rather than simply as web page hosts. Advertising on popular web pages can be lucrative, and e-commerce or the sale of products and services directly via the Web continues to grow.
In the early days, web pages were usually created as sets of complete and isolated HTML text files stored on a web server. More recently, websites are more often created using content management system (CMS) or wiki software with, initially, very little content. Contributors to these systems, who may be paid staff, members of a club or other organization or members of the public, fill underlying databases with content using editing pages designed for that purpose, while casual visitors view and read this content in its final HTML form. There may or may not be editorial, approval and security systems built into the process of taking newly entered content and making it available to the target visitors.
The Internet allows computer users to connect to other computers and information stores easily, wherever they may be across the world. They may do this with or without the use of security, authentication and encryption technologies, depending on the requirements. This is encouraging new ways of working from home, collaboration and information sharing in many industries. An accountant sitting at home can audit the books of a company based in another country, on a server situated in a third country that is remotely maintained by IT specialists in a fourth. These accounts could have been created by home-working bookkeepers, in other remote locations, based on information e-mailed to them from offices all over the world. Some of these things were possible before the widespread use of the Internet, but the cost of private leased lines would have made many of them infeasible in practice.
An office worker away from his desk, perhaps on the other side of the world on a business trip or a holiday, can open a remote desktop session into his normal office PC using a secure Virtual Private Network (VPN) connection via the Internet. This gives the worker complete access to all of his or her normal files and data, including e-mail and other applications, while away from the office.
This concept is also referred to by some network security people as the Virtual Private Nightmare, because it extends the secure perimeter of a corporate network into its employees’ homes; this has been the source of some notable security breaches, but also provides security for the workers.
The low cost and nearly instantaneous sharing of ideas, knowledge, and skills has made collaborative work dramatically easier. Not only can a group cheaply communicate and test, but the wide reach of the Internet allows such groups to easily form in the first place, even among niche interests. An example of this is the free software movement in software development, which produced GNU and Linux from scratch and has taken over development of Mozilla and OpenOffice.org (formerly known as Netscape Communicator and StarOffice).
Films such as Zeitgeist, Loose Change and Endgame have had extensive coverage on the Internet, while being virtually ignored in the mainstream media. Internet “chat”, whether in the form of IRC “chat rooms” or channels or via instant messaging systems, allows colleagues to stay in touch in a very convenient way while working at their computers during the day. Messages can be sent and viewed even more quickly and conveniently than via e-mail. Extensions to these systems may allow files to be exchanged and “whiteboard” drawings to be shared, as well as voice and video contact between team members.
Version control systems allow collaborating teams to work on shared sets of documents without either accidentally overwriting each other’s work or having members wait until they get “sent” documents to be able to add their thoughts and changes.
A computer file can be e-mailed to customers, colleagues and friends as an attachment. It can be uploaded to a website or FTP server for easy download by others. It can be put into a “shared location” or onto a file server for instant use by colleagues. The load of bulk downloads to many users can be eased by the use of “mirror” servers or peer-to-peer networks.
In any of these cases, access to the file may be controlled by user authentication; the transit of the file over the Internet may be obscured by encryption, and money may change hands before or after access to the file is given. The price can be paid by the remote charging of funds from, for example, a credit card whose details are also passed - hopefully fully encrypted - across the Internet. The origin and authenticity of the file received may be checked by digital signatures or by MD5 or other message digests. These simple features of the Internet, applied on a worldwide basis, are changing the basis for the production, sale, and distribution of anything that can be reduced to a computer file for transmission. This includes all manner of print publications, software products, news, music, film, video, photography, graphics and the other arts. This in turn has caused seismic shifts in each of the existing industries that previously controlled the production and distribution of these products.
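The digest check mentioned above works because any change to a file, however small, produces a completely different digest. A short sketch using Python's standard `hashlib` (the file contents here are a made-up example; MD5 is shown because the text mentions it, though stronger digests such as SHA-256 work the same way):

```python
import hashlib

# The sender publishes the digest of the original file.
data = b"example file contents"
digest = hashlib.md5(data).hexdigest()
print(digest)  # 32 hex characters (128 bits)

# The recipient recomputes the digest; a single changed byte gives a
# completely different result, revealing tampering or corruption.
tampered = b"example file contentz"
print(hashlib.md5(tampered).hexdigest() == digest)  # False
```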
Internet collaboration technology enables business and project teams to share documents, calendars and other information. Such collaboration occurs in a wide variety of areas including scientific research, software development, conference planning, political activism and creative writing.
Many existing radio and television broadcasters provide Internet “feeds” of their live audio and video streams (for example, the BBC). They may also allow time-shift viewing or listening such as Preview, Classic Clips and Listen Again features. These providers have been joined by a range of pure Internet “broadcasters” who never had on-air licenses.
This means that an Internet-connected device, such as a computer or something more specific, can be used to access online media in much the same way as was previously possible only with a television or radio receiver. The range of material is much wider, from pornography to highly specialized technical webcasts. Podcasting is a variation on this theme, where material (usually audio) is first downloaded in full and may then be played back on a computer or transferred to a digital audio player to be listened to on the move. These techniques using simple equipment allow anybody, with little censorship or licensing control, to broadcast audio-visual material on a worldwide basis.
Webcams can be seen as an even lower-budget extension of this phenomenon. While some webcams can give full-frame-rate video, the picture is usually either small or updates slowly. Internet users can watch animals around an African waterhole, ships in the Panama Canal, the traffic at a local roundabout or their own premises, live and in real time. Video chat rooms, video conferencing, and remote controllable webcams are also popular. Many uses can be found for personal webcams in and around the home, with and without two-way sound.
YouTube, sometimes described as an Internet phenomenon because of its vast number of users and how rapidly the site’s popularity has grown, was founded on February 15, 2005. It is now the leading website for free streaming video. It uses a Flash-based web player that streams video files in the FLV format. Users are able to watch videos without signing up; however, users who do sign up can upload an unlimited number of videos and are given their own personal profile. It is currently estimated that there are 64,000,000 videos on YouTube, and that 825,000 new videos are uploaded every day.
Voice telephony (VoIP)
VoIP stands for Voice over IP, where IP refers to the Internet Protocol that underlies all Internet communication. This phenomenon began as an optional two-way voice extension to some of the instant messaging systems that took off around the year 2000. In recent years many VoIP systems have become as easy to use and as convenient as a normal telephone. The benefit is that, as the Internet carries the actual voice traffic, VoIP can be free or cost much less than a normal telephone call, especially over long distances and especially for those with always-on Internet connections such as cable or ADSL.
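The core idea of VoIP — digitized voice carried as ordinary IP packets — can be illustrated with a toy sketch. Real VoIP systems wrap audio frames in RTP packets and negotiate sessions via protocols such as SIP; this minimal Python example, with invented function names, merely shows voice frames travelling as UDP datagrams, each tagged with a sequence number so the receiver can detect loss and reordering.

```python
import socket

def send_audio_frames(frames, host="127.0.0.1", port=5004):
    """Send each short audio frame (e.g. 20 ms of encoded speech) as one
    UDP datagram. Real systems add RTP headers with timestamps; this toy
    sketch prepends only a 2-byte sequence number."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for seq, frame in enumerate(frames):
            packet = seq.to_bytes(2, "big") + frame
            sock.sendto(packet, (host, port))
    finally:
        sock.close()
```

UDP is used rather than TCP because a late voice packet is useless: it is better to drop a frame and conceal the gap than to stall the stream waiting for a retransmission.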
Thus, VoIP is maturing into a viable alternative to traditional telephones. Interoperability between different providers has improved and the ability to call or receive a call from a traditional telephone is available. Simple, inexpensive VoIP modems are now available that eliminate the need for a PC.
Voice quality can still vary from call to call but is often equal to and can even exceed that of traditional calls. Remaining problems for VoIP include emergency telephone number dialling and reliability. Currently, a few VoIP providers offer an emergency service, but it is not universally available. Traditional phones are line-powered and operate during a power failure; VoIP does not do so without a backup power source for the electronics.
Most VoIP providers offer unlimited national calling, but the direction in VoIP is clearly toward global coverage with unlimited minutes for a low monthly fee. VoIP has also become increasingly popular within the gaming world as a form of communication between players. Popular gaming VoIP clients include Ventrilo and TeamSpeak, among others. The PlayStation 3 and Xbox 360 also offer VoIP chat features.
Common methods of home access include dial-up, landline broadband (over coaxial cable, fiber-optic or copper wires), Wi-Fi, satellite, and 3G cell phones. Public places to use the Internet include libraries and Internet cafes, where computers with Internet connections are available. There are also Internet access points in many public places such as airport halls and coffee shops, in some cases just for brief use while standing. Various terms are used, such as “public Internet kiosk”, “public access terminal”, and “Web payphone”. Many hotels now also have public terminals, though these are usually fee-based. These terminals are widely used for purposes such as ticket booking, bank deposits, and online payments. Wi-Fi provides wireless access to computer networks, and therefore can do so to the Internet itself.
Hotspots providing such access include Wi-Fi cafes, where would-be users need to bring their own wireless-enabled devices such as a laptop or PDA. These services may be free to all, free to customers only, or fee-based. A hotspot need not be limited to a confined location: a whole campus or park, or even an entire city, can be enabled. Grassroots efforts have led to wireless community networks.
Commercial Wi-Fi services covering large city areas are in place in London, Vienna, Toronto, San Francisco, Philadelphia, Chicago, and Pittsburgh. The Internet can then be accessed from such places as a park bench. Apart from Wi-Fi, there have been experiments with proprietary mobile wireless networks like Ricochet, various high-speed data services over cellular phone networks, and fixed wireless services. High-end mobile phones such as smartphones generally come with Internet access through the phone network. Web browsers such as Opera are available on these advanced handsets, which can also run a wide variety of other Internet software. More mobile phones have Internet access than PCs, though it is not as widely used. An Internet access provider and protocol matrix differentiates the methods used to get online.
The Internet has also become a large market for companies; some of the biggest companies today have grown by taking advantage of the efficient nature of low-cost advertising and commerce through the Internet, also known as e-commerce. It is the fastest way to spread information to a vast number of people simultaneously. The Internet has also revolutionized shopping: for example, a person can order a CD online and receive it in the mail within a couple of days, or in some cases download it directly. The Internet has also greatly facilitated personalized marketing, which allows a company to market a product to a specific person or group of people better than any other advertising medium.
Examples of personalized marketing include online communities such as MySpace, Friendster, Orkut, Facebook, and others, which thousands of Internet users join to advertise themselves and make friends online. Many of these users are young teens and adolescents, ranging from 13 to 25 years old. When they advertise themselves, they advertise their interests and hobbies, which online marketing companies can use to infer what those users will purchase online and then advertise their clients’ products to those users.
Online Internet Business Models
The outward signs of a robust and thriving business are:
What are the key areas that a profitable web site needs to concentrate on?
Existing business models are of many kinds:
Most successful companies pursue several related but different models concurrently. They defy easy categorization by diversifying revenue streams and becoming hybrids in a cost-efficient way.
Let’s take a look at some of the top e-businesses in the field today:
A company that has emerged unscathed from the recent dot-com bust, with profits soaring almost 400% and revenues doubling in the past year, it has transformed auctions that were once limited to garage sales and flea markets into highly evolved e-marketplaces. Selling just about anything, from antiques and jewelry to computers, automobiles, and even auto insurance, it has 29.7 million registered users today. With an amazing and unique culture, where buyers and sellers of all items are allowed to post their comments online and where credit-card payment facilities are secure and easy, the company projects a trustworthy and reliable image.
Apart from bidding, certain high-quality goods can be sold at prices fixed by the seller. The site also offers professional services for all kinds of business needs. A widespread global reach makes it easy for a buyer in Hong Kong to bid on and buy a product from a seller in Paris, while the regional sites in North America are able to offer hard-to-ship merchandise.
Person-to-person trading and a barter economy have established the company on a secure B2B and B2C platform. Other companies like Yahoo! and Excite have been quick to catch on and incorporate auctions into their sites. Priceline.com, a site that offers airline tickets at a discount, has begun experimenting quite successfully with this business model.
To summarize, their business model can be elucidated thus:
Statistics have revealed that realty sites account for about 9.6% of all online visitors. Homestore.com is a company that has dominated the real estate field, with 3.28 million customers in January 2001, and is listed among the Fortune e-50. It registered growth of 252% at one stage.
Homestore.com’s Internet business model allows prospective buyers to review properties before buying. Is that all? No: they also offer financial advice, online loans, buyer’s guides to homes and household items, home improvement tips, remodeling, and safety and security advice. Useful advice on moving home and tips on resettling have ensured user satisfaction to the core.
Their main revenue came from subscriptions (52%), with the remainder from advertising. As a subscription site, they picked a specific topic that a segment of the population would be passionate about and marketed their services through strategic advertising.
Subscription sites that allow users access to a regularly updated online database of any kind for a fee are fast evolving into healthy and strong e-businesses.
This software and service provider entered the digitized world only in 1998, and metamorphosed into a digital pioneer in the span of two years. Innovative products and services, and the integration of those services, have brought them to the forefront of web innovation today.
Internet business models like the Biz Online Initiative, which delivers simple and complete online services and a host of other tools that customers require in setting up an e-business, have made them a one-stop shop for e-businesses today. Their built-in self-service system for customers, employees, and suppliers improved productivity and accuracy and brought costs down by hundreds of millions of dollars. Consulting services with major firms like Sun Professional Systems have established their reliability with customers.
Their business formula:
Another company using a similar business model is Exodus Communications, an Internet data center operator that offers a range of web hosting services, bandwidth on demand, and security monitoring. Their servers host leading web sites like Yahoo!, eBay, and Merrill Lynch, allowing these firms to deliver content and applications online round the clock without fail. 35% of their revenue comes from a very successful e-business consulting firm they have partnered with, Sapient. They are expanding from 19 data centers to 34 this year.
Cisco develops switches and routers for Local Area Networks (LANs) and Wide Area Networks (WANs), along with the related software. They have become the worldwide leader in networking for the Internet today. 90% of their sales are conducted over the Internet. They offer expertise in planning and executing Internet-enabled solutions.
The company has grown over the past 7 years with 71 acquisitions to its credit, the latest being its investments in an optical equipment company and in speech recognition software makers. Their business model could be termed an acquisition one!
The customer is king here! Amazon pampers its customers, tracks their tastes, and uses this information to create a unique customer experience. This e-tailer cultivates relationships that lead to customers liking and trusting it. This kind of service surpasses the most brilliant technology in use today. Amazon brought in the world of successful one-to-one marketing, a personal touch from another era.
Recently, though, they have suffered heavy losses, suggesting that a successful e-business strategy will survive provided it is based on a solid brick-and-mortar foundation, a la Barnes & Noble, another famous online bookseller. Although barnesandnoble.com and Barnes & Noble are run separately, a customer tends to associate trust and comfort with a known and established brand.
To summarize, exemplary customer service, successful online advertising, and special discounted offers have made Amazon synonymous with books today.
This Fortune e-50 company offers a collection of premium sites for custom ad buys and sponsorships in various fields: business, automobiles, entertainment, technology, travel, and health. They help marketers build brands, increase sales, maximize revenue, and build one-to-one relationships with their customers. They also offer agencies plans to manage online campaigns.
Their direct marketing strategies use customer data to refine marketing messages and increase investment returns. One of their divisions, Abacus, maintains one of the largest databases of buyer behavior, covering about 90 million households in the United States alone. Another division conducts online research to evaluate and understand online campaigns and strategies. The web has proven to be an amazing vehicle for advertising, reaching millions without spending a dime on postage or printing. Stu Heinecke Services, an advertising solutions company, used personalized cartoon direct mail and achieved response rates as high as 100%.
Online ads possess tremendous communication power. Banner ads placed on sites like CNN, Lycos, CompuServe, Pathfinder, and The New York Times showed that:
A site of evolving search engines, free news and information services, online ads, banner ads, sports and news, video and audio, clubs, and auction stores, this “operating system” of the net has become the most popular directory on the web. The value of this successful business model lies in its unique and easy categorization of all pages and subjects: a completely professional-looking web site in all.
But a BPI (Buying Power Index) report reveals that a site’s popularity and online buying don’t necessarily go hand in hand. Other search engines like AltaVista, Excite, and Juno seem to have raked in more profits recently.
Online advertising was the main source of revenue for Yahoo!, but they didn’t really examine what kinds of ads worked online. Immediate success stopped them from evolving and developing other important aspects of e-business. As a result, this year they have been forced to cut budgets and ads, showing that generalized media do not compete well with specialized media. Also, all of Yahoo!’s content is owned by other sites and only licensed for its use.
Among the other successful dot-coms, trends revealed that online e-brokers offer the best economic models among consumer-centered Internet companies. Instead of spending on physical infrastructure, they concentrated on increasing the volume of transactions.
1-800-Flowers.com blended telephone and Internet technologies, and Reflect.com, a beauty customization site, outlasted other higher-profile e-tailers in the business, proving that the basic B2C business model is valid.
Smart thinking, sound business plans, and innovative promotional ideas are an integral part of any e-success. There is no doubt that in the near future the average person anywhere in the world will surf the Internet more often than he or she watches television or uses the telephone. It therefore makes sense for entrepreneurs of all kinds to come up with ways of generating income by marketing their products or services to these surfers. Competition in cyberspace may become even fiercer in the future, and the right business plan is what will eventually ensure long-term success.