The origin of the expression
cloud computing is obscure, but it appears to derive from the practice of using
drawings of stylized clouds to denote networks in diagrams of computing and
communications systems. The term cloud is used as a metaphor for the Internet,
based on the standardized use of a cloud-like shape to denote a network on
telephony schematics and, later, to depict the Internet in computer
network diagrams as an
abstraction of the underlying infrastructure it represents. In the 1990s,
telecommunications companies, which had previously offered primarily dedicated
point-to-point data circuits, began offering virtual private network (VPN) services with comparable quality of
service but at a much lower cost. By switching traffic to balance utilization
as they saw fit, they were able to utilize their overall network bandwidth more
effectively. The cloud symbol was used to denote the demarcation point between
that which was the responsibility of the provider and that which was the
responsibility of the users. Cloud computing extends this boundary to cover
servers as well as the network infrastructure.[10]
The underlying concept of cloud
computing dates back to the 1950s, when large-scale mainframe computers became available in academia and
corporations, accessible via thin clients / terminal computers. Because it was costly to buy a
mainframe, it became important to find ways to get the greatest return on the
investment in them, allowing multiple users to share both physical access
to the computer from multiple terminals and the CPU time itself,
eliminating periods of inactivity. This practice became known in the industry as time-sharing.[11]
As computers became more
prevalent, scientists and technologists explored ways to make large-scale
computing power available to more users through time sharing, experimenting
with algorithms to provide the optimal use of the infrastructure, platform and
applications with prioritized access to the CPU and efficiency for the end
users.[12]
John McCarthy opined in
the 1960s that "computation may someday be organized as a public utility."
Almost all the modern-day characteristics of cloud computing (elastic
provision, provided as a utility, online, illusion of infinite supply), the
comparison to the electricity industry and the use of public, private,
government, and community forms, were thoroughly explored in Douglas Parkhill's 1966
book, The Challenge of the
Computer Utility. Other scholars have shown that cloud computing's roots go
all the way back to the 1950s, when scientist Herb Grosch (the author of Grosch's law)
postulated that the entire world would operate on dumb terminals powered by
about 15 large data centers.[13] Due to the expense of these powerful
computers, many corporations and other entities could avail themselves of
computing capability only through time-sharing; several organizations, such as
GE's GEISCO, the IBM subsidiary The Service Bureau Corporation, Tymshare (founded
in 1966), National CSS (founded in 1967 and bought by Dun & Bradstreet in
1979), Dial Data (bought by Tymshare in 1968), and Bolt, Beranek and
Newman marketed time-sharing as a commercial venture.
The ubiquitous availability of
high capacity networks, low cost computers and storage devices as well as the
widespread adoption of hardware virtualization, service-oriented
architecture, autonomic computing, and
utility computing have led to a tremendous growth in cloud computing.[14][15][16]
After the dot-com bubble, Amazon played a key role in the development of cloud
computing by modernizing their data centers, which,
like most computer networks, were
using as little as 10% of their capacity at any one time, just to leave room
for occasional spikes. Having found that the new cloud architecture resulted in
significant internal efficiency improvements whereby small, fast-moving
"two-pizza teams" could add new features faster and more easily,
Amazon initiated a new product development effort to provide cloud computing to
external customers, and launched Amazon Web Services
(AWS) on a utility computing basis in 2006.[17][18]
In early 2008, Eucalyptus became the first open-source, AWS
API-compatible platform for deploying private clouds. Also in early 2008, OpenNebula,
enhanced in the RESERVOIR European Commission-funded project, became the first
open-source software for deploying private and hybrid clouds, and for the
federation of clouds.[19] In the same year, efforts were focused on
providing quality-of-service guarantees (as required by real-time
interactive applications) to cloud-based infrastructures, in the framework of
the IRMOS European Commission-funded project, resulting in a real-time cloud
environment.[20] By mid-2008, Gartner saw an opportunity for
cloud computing "to shape the relationship among consumers of IT services,
those who use IT services and those who sell them"[21] and observed that "organisations are
switching from company-owned hardware and software assets to per-use
service-based models" so that the "projected shift to computing...
will result in dramatic growth in IT products in some areas and significant
reductions in other areas."[22] In 2012, Dr. Biju John and Dr. Souheil
Khaddaj incorporated semantics into the definition of the cloud: "Cloud computing is
a universal collection of data which extends over the internet in the form of
resources (such as information, hardware, various platforms, services, etc.) and
forms individual units within the virtualization environment. Held together by
infrastructure providers, service providers and the consumer, it is then
semantically accessed by various users." (CLUSE 2012, Bangalore, April
2012)[23]