Friday, 19 September 2008

History of a computer

A Brief History of the Computer (B.C. – 1993 A.D.)
by Jeremy Meyers

Note: Yes, a lot of this is from Grolier's Encyclopedia. Hey, I was young. I didn't know any better. Credit where credit is due. Also, this information is only current as of the early 1990s (1993, to be exact), and no, I'm not planning to add more information anytime soon. I would not recommend scamming this for your own homework, as some of the conclusions are rather humorous today.


Citing This Work You are welcome to use this document as a reference in creating your own paper or research work on the subject. Please don’t just copy this paper verbatim and submit it as your own work, as I put a lot of time and effort into it. Plus, it’s bad karma. If you would like to use this work, please use this citation in your bibliography: Meyers, Jeremy, “A Short History of the Computer” [Online] Available ([current date])

Table of Contents:
1) In The Beginning…
2) Babbage
3) Use of Punched Cards by Hollerith
4) Electronic Digital Computers
5) The Modern "Stored Program" EDC
6) Advances in the 1950s
7) Advances in the 1960s
8) Recent Advances


In The Beginning…
The history of computers starts about 2,000 years ago with the birth of the abacus, a wooden rack holding two horizontal wires with beads strung on them. When these beads are moved around according to programming rules memorized by the user, all regular arithmetic problems can be done. Another important invention from around the same time was the astrolabe, used for navigation.

Blaise Pascal is usually credited with building the first digital computer, in 1642. It added numbers entered with dials and was made to help his father, a tax collector. In 1671, Gottfried Wilhelm von Leibniz invented a computer that was finally built in 1694. It could add and, after changing some things around, multiply. Leibniz invented a special stepped-gear mechanism for introducing the addend digits, and this mechanism is still in use.

The prototypes made by Pascal and Leibniz were not used in many places, and were considered weird until a little more than a century later, when Thomas of Colmar (a.k.a. Charles Xavier Thomas) created the first successful mechanical calculator that could add, subtract, multiply, and divide. A lot of improved desktop calculators by many inventors followed, so that by about 1890 the range of improvements included:
Accumulation of partial results
Storage and automatic reentry of past results (A memory function)
Printing of the results
Each of these required manual installation. These improvements were mainly made for commercial users, and not for the needs of science.


Babbage
While Thomas of Colmar was developing the desktop calculator, a series of very interesting developments in computers was started in Cambridge, England, by Charles Babbage (after whom the computer store Babbage's, now GameStop, was named), a mathematics professor. In 1812, Babbage realized that many long calculations, especially those needed to make mathematical tables, were really a series of predictable actions that were constantly repeated. From this he suspected that it should be possible to do these automatically.
He began to design an automatic mechanical calculating machine, which he called a difference engine. By 1822, he had a working model to demonstrate. With financial help from the British government, Babbage started fabrication of a difference engine in 1823. It was intended to be steam powered and fully automatic, including the printing of the resulting tables, and commanded by a fixed instruction program. The difference engine, although having limited adaptability and applicability, was really a great advance.

Babbage continued to work on it for the next 10 years, but in 1833 he lost interest because he thought he had a better idea: the construction of what would now be called a general-purpose, fully program-controlled, automatic mechanical digital computer. Babbage called this idea an Analytical Engine. The ideas of this design showed a lot of foresight, although this couldn't be appreciated until a full century later. The plans for this engine required a decimal computer operating on numbers of 50 decimal digits (or words) and having a storage capacity (memory) of 1,000 such numbers. The built-in operations were supposed to include everything that a modern general-purpose computer would need, even the all-important conditional control transfer capability that would allow commands to be executed in any order, not just the order in which they were programmed. The Analytical Engine was to use punched cards (similar to those used in a Jacquard loom), which would be read into the machine from several different reading stations. The machine was supposed to operate automatically, by steam power, and require only one attendant.

Babbage's computers were never finished. Various reasons are given for his failure, the most common being the lack of precision machining techniques at the time. Another speculation is that Babbage was working on a solution to a problem that few people in 1840 really needed to solve.

After Babbage, there was a temporary loss of interest in automatic digital computers. Between 1850 and 1900 great advances were made in mathematical physics, and it came to be understood that most observable dynamic phenomena can be characterized by differential equations (which meant that most events occurring in nature can be measured or described in one equation or another), so that easy means for their calculation would be helpful. Moreover, from a practical view, the availability of steam power caused manufacturing (boilers), transportation (steam engines and boats), and commerce to prosper and led to a period of many engineering achievements. The designing of railroads and the making of steamships, textile mills, and bridges required differential calculus to determine such things as:
center of gravity
center of buoyancy
moment of inertia
stress distributions
Even the assessment of the power output of a steam engine needed mathematical integration. A strong need thus developed for a machine that could rapidly perform many repetitive calculations.
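As a concrete illustration (not part of the original essay; the symbols below are introduced only for this sketch), even assessing a steam engine's output means evaluating an integral: the indicated work per cycle is the area enclosed by the pressure-volume loop, and average power follows from the cycle rate:

\[ W_{\text{cycle}} = \oint p \, dV, \qquad P_{\text{avg}} = n \, W_{\text{cycle}} \]

where p is the cylinder pressure, V the cylinder volume, and n the number of working cycles per second. Evaluating that loop integral by hand for every engine design is exactly the kind of repetitive calculation the text describes.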
Use of Punched Cards by Hollerith
In the September issue of Computer
Massively multiplayer online worlds differ from other game services by providing support to users creating their own unique content in the virtual world. This places significant demands on servers, clients, and the network. As virtual worlds evolve to support more users, types of interaction, and realism, these demands will increase by orders of magnitude. In this issue, we feature two articles that address these challenges. We also look at next-generation RFID applications, XML document parsing, interconnection networks, and data-sharing systems.





Cover Feature (September 2008): "Second Life and the New Generation of Virtual Worlds," by Sanjeev Kumar, Jatin Chhugani, Changkyu Kim, Daehyun Kim, Anthony Nguyen, Pradeep Dubey, Christian Bienia, and Youngmin Kim. Unlike online games, metaverses present a single, seamless, persistent world where users can transparently roam around without predefined objectives. An analysis of Second Life illustrates the demands such applications place on clients, servers, and the network and suggests possible optimizations.
News (September 2008): News Features.
The Known World (September 2008): "Click Here to Empty Trash," by David Alan Grier. We are beginning to assess the impact of digital technologies and are starting to devise strategies to handle the changes that computers have wrought upon the environment.
IT Systems Perspectives (September 2008): "Toward the Deep Semantic Web," by James Geller, Soon Ae Chun, and Yoo Jung An. The Semantic Deep Web fuses aspects of the Semantic Web with the use of ontology-aware browsers to extract information from the Deep Web.
Web Technologies (September 2008): "The Social Web: Research and Opportunities," by Ed H. Chi, Palo Alto Research Center. Web 2.0-based technologies advance both collective and individual intelligence.
The Profession (September 2008): "XML Does Real Programmers a Service," by Ian Gorton, Pacific Northwest National Lab. Having ingeniously morphed their external personae to become almost mainstream, the future looks bright for real programmers.
Past issues of Computer, from 1988 to the present, are available for free to IEEE Computer Society members. For online access to Computer articles, members need to sign up for a free Web account. Single-article downloads are available for $19 to nonmembers.
Computer
From Wikipedia, the free encyclopedia



The NASA Columbia Supercomputer
A computer is a machine that manipulates data according to a list of instructions.
The first devices that resemble modern computers date to the mid-20th century (around 1940 - 1945), although the computer concept and various machines similar to computers existed earlier. Early electronic computers were the size of a large room, consuming as much power as several hundred modern personal computers.[1] Modern computers are based on tiny integrated circuits and are millions to billions of times more capable while occupying a fraction of the space.[2] Today, simple computers may be made small enough to fit into a wristwatch and be powered from a watch battery. Personal computers, in various forms, are icons of the Information Age and are what most people think of as "a computer"; however, the most common form of computer in use today is the embedded computer. Embedded computers are small, simple devices that are used to control other devices — for example, they may be found in machines ranging from fighter aircraft to industrial robots, digital cameras, and children's toys.
The ability to store and execute lists of instructions called programs makes computers extremely versatile and distinguishes them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a certain minimum capability is, in principle, capable of performing the same tasks that any other computer can perform. Therefore, computers with capability and complexity ranging from that of a personal digital assistant to a supercomputer are all able to perform the same computational tasks given enough time and storage capacity.
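To make the idea of storing and executing lists of instructions concrete, here is a minimal sketch of a toy stored-program machine in Python. It is not taken from the article; the instruction names, the machine model, and the sample program are invented for illustration. The conditional jump shows how stored instructions need not execute in the order they were written.

def run(program, x):
    # The program is ordinary data: a list of (operation, argument) pairs.
    acc = x   # a single accumulator register
    pc = 0    # program counter: index of the next instruction to execute
    while pc < len(program):
        op, arg = program[pc]
        if op == "add":
            acc += arg
        elif op == "mul":
            acc *= arg
        elif op == "jump_if_neg":   # conditional control transfer
            if acc < 0:
                pc = arg
                continue
        elif op == "halt":
            break
        pc += 1
    return acc

# Example program: negative inputs are negated and scaled; others just get 1 added.
program = [
    ("jump_if_neg", 3),  # 0: if the accumulator is negative, jump to instruction 3
    ("add", 1),          # 1: acc += 1
    ("halt", None),      # 2: stop
    ("mul", -10),        # 3: acc *= -10
]
print(run(program, 5))    # prints 6
print(run(program, -2))   # prints 20

The point is only that the program itself is data the machine reads, which is why the same hardware can carry out arbitrarily different tasks.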
Contents
1 History of computing
2 Stored program architecture
2.1 Programs
2.2 Example
3 How computers work
3.1 Control unit
3.2 Arithmetic/logic unit (ALU)
3.3 Memory
3.4 Input/output (I/O)
3.5 Multitasking
3.6 Multiprocessing
3.7 Networking and the Internet
4 Further topics
4.1 Hardware
4.2 Software
4.3 Programming languages
4.4 Professions and organizations
5 See also
6 Notes
7 References

History of computing
Main article: History of computer hardware

The Jacquard loom was one of the first programmable devices.
It is difficult to identify any one device as the earliest computer, partly because the term "computer" has been subject to varying interpretations over time. Originally, the term "computer" referred to a person who performed numerical calculations (a human computer), often with the aid of a mechanical calculating device.
The history of the modern computer begins with two separate technologies - that of automated calculation and that of programmability.
Examples of early mechanical calculating devices included the abacus, the slide rule and arguably the astrolabe and the Antikythera mechanism (which dates from about 150–100 BC). Hero of Alexandria (c. 10–70 AD) built a mechanical theater which performed a play lasting 10 minutes and was operated by a complex system of ropes and drums that might be considered to be a means of deciding which parts of the mechanism performed which actions and when.

Introduction to Computers

Computer science
From Wikipedia, the free encyclopedia
Computer science (or computing science) is the study of the theoretical foundations of information and computation, and of their implementation and application in computer systems.[1][2][3] Computer science has many sub-fields; some emphasize the computation of specific results (such as computer graphics), while others relate to properties of computational problems (such as computational complexity theory). Still others focus on the challenges in implementing computations. For example, programming language theory studies approaches to describing computations, while computer programming applies specific programming languages to solve specific computational problems. A further subfield, human-computer interaction, focuses on the challenges in making computers and computations useful, usable, and universally accessible to people.
Contents
1 History
2 Major achievements
3 Fields of computer science
3.1 Theory of computation
3.1.1 Mathematical foundations
3.2 Algorithms and data structures
3.3 Programming methodology and languages
3.4 Computer elements and architecture
3.5 Numerical and symbolic computation
4 Relationship with other fields
5 Computer science education
6 See also
7 References
8 Further reading
9 External links
9.1 Webcasts

History
Main article: History of computer science
The early foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks, such as the abacus, have existed since antiquity. Wilhelm Schickard built the first mechanical calculator in 1623.[4] Charles Babbage designed a difference engine in Victorian times,[5] helped by Ada Lovelace.[6] Around 1900, punch-card machines were being sold commercially by the company that would later become part of IBM.[7] However, all of these machines were constrained to perform a single task, or at best some subset of all possible tasks.
During the 1940s, as newer and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors. As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. Computer science began to be established as a distinct academic discipline in the 1960s, with the creation of the first computer science departments and degree programs.[8] Since practical computers became available, many applications of computing have become distinct areas of study in their own right.
Although many initially believed it impossible that computers themselves could actually be a scientific field of study, in the late fifties it gradually became accepted among the greater academic population.[9] The now well-known IBM brand formed part of the computer science revolution during this time. IBM (short for International Business Machines) released the IBM 704 and later the IBM 709 computers, which were widely used during the exploration period of such devices. "Still, working with the IBM [computer] was frustrating...if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again".[9] During the late 1950s, the computer science discipline was very much in its developmental stages, and such issues were commonplace.
Over time, the usability and effectiveness of computing technology have improved significantly. Modern society has seen a significant shift from computers being used solely by experts or professionals to a much more widespread user base. By the 1990s, computers had become accepted as the norm within everyday life. During this time, data entry was a primary use of computers, with many businesses streamlining their practices through computing. This also had the additional benefit of removing the need for the large amounts of documentation and file records that consumed much-needed physical space within offices.

Major achievements


The German military used the Enigma machine during World War II for communications they believed to be secret. The large-scale decryption of Enigma traffic at Bletchley Park was an important factor that contributed to Allied victory in WWII.[10]
Despite its relatively short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society. These include:
Applications within computer science
A formal definition of computation and computability, and proof that there are computationally unsolvable and intractable problems.[11]
The concept of a programming language, a tool for the precise expression of methodological information at various levels of abstraction.[12]
Applications outside of computing
Sparked the Digital Revolution which led to the current Information Age and the Internet.[13]
In cryptography, breaking the Enigma machine was an important factor contributing to the Allied victory in World War II.[10]
Scientific computing enabled advanced study of the mind, and the mapping of the human genome became possible with the Human Genome Project.[13] Distributed computing projects like Folding@home explore protein folding.
Algorithmic trading has increased the efficiency and liquidity of financial markets by using artificial intelligence, machine learning and other statistical and numerical techniques on a large scale.[14]

Fields of computer science
As a discipline, computer science spans a range of topics from theoretical studies of algorithms and the limits of computation to the practical issues of implementing computing systems in hardware and software.[15][16] The Computer Sciences Accreditation Board (CSAB), which is made up of representatives of the Association for Computing Machinery (ACM), the Institute of Electrical and Electronics Engineers Computer Society (IEEE-CS), and the Association for Information Systems (AIS), identifies four areas that it considers crucial to the discipline of computer science: theory of computation, algorithms and data structures, programming methodology and languages, and computer elements and architecture. In addition to these four areas, CSAB also identifies fields such as software engineering, artificial intelligence, computer networking and communication, database systems, parallel computation, distributed computation, computer-human interaction, computer graphics, operating systems, and numerical and symbolic computation as being important areas of computer science.[15]

Theory of computation


P = NP ?

Automata theory
Computability theory
Computational complexity theory
Quantum computing theory

Mathematical foundations


Friday, 20 June 2008


Monitor

There are many ways to classify monitors. The most basic is in terms of color capabilities, which separates monitors into three classes:
Monochrome: Monochrome monitors actually display two colors, one for the background and one for the foreground. The colors can be black and white, green and black, or amber and black.
Gray-scale: A gray-scale monitor is a special type of monochrome monitor capable of displaying different shades of gray.
Color: Color monitors can display anywhere from 16 to over 1 million different colors. Color monitors are sometimes called RGB monitors because they accept three separate signals: red, green, and blue.

Monitor (computer): Another term for display screen. The term monitor, however, usually refers to the entire box, whereas display screen can mean just the screen. In addition, the term monitor often implies graphics capabilities.
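As a rough sketch (not from the original text): the number of colors a display mode can show follows directly from the bits stored per pixel, so the "16 to over 1 million" range above corresponds roughly to 4-bit through 24-bit modes. The bit depths below are illustrative assumptions, not figures from the article.

def color_count(bits_per_pixel: int) -> int:
    # Each pixel stores bits_per_pixel bits, so it can take 2 ** bits_per_pixel distinct values.
    return 2 ** bits_per_pixel

# Illustrative bit depths (assumed for this sketch):
for label, bits in [("monochrome", 1),
                    ("16-color", 4),
                    ("256-color", 8),
                    ("24-bit RGB (8 bits per channel)", 24)]:
    print(f"{label}: {color_count(bits):,} colors")
# 24-bit RGB yields 16,777,216 colors, which is what "over 1 million different colors" usually refers to.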