
Living in the Information Technology Era

CHAPTER ONE

Group Two

Members:

Balongag, John Rj

Baran, Christopher C.

Bautista, Zeno Nuez

Bontigao, Justine

Bugtong, Ronie James

Canete, Matt Bern Stefan

Ciloy, Cyriel

Damiles, Appleshane

Diamante, Riebert

Gomez, Jason

Perin, Clifford E.

1.1 HISTORY OF COMPUTERS


After the U.S. Census crisis of 1880, when tabulation took more than seven years, the government sought a faster solution, leading to the development of punch-card-based computers that filled entire rooms. However, the first true digital computer was conceived much earlier, in the 1830s, by Charles Babbage. His Analytical Engine was a revolutionary design, outlining the core principles of a modern computer as an automatic, programmable machine that could perform complex calculations and print the results.

BRIEFING:

In 1801, in France, Joseph Marie Jacquard invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.

In 1890, Herman Hollerith designs a punch card system to tabulate the 1890 census, accomplishing the task in just three years and saving the government $5 million. He establishes a company that would ultimately become IBM.
In 1936, Alan Turing presents the notion of a universal machine, later called the Turing machine, capable
of computing anything that is computable. The central concept of the modern computer was based on his
ideas.

In 1941, John Vincent Atanasoff and his graduate student, Clifford Berry, design a computer that can solve 29 equations simultaneously. This marks the first time a computer is able to store information in its main memory.

In 1943-1944, two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, build the Electronic Numerical Integrator and Computer (ENIAC). Considered the grandfather of digital computers, it fills a 20-foot by 40-foot room and has 18,000 vacuum tubes.

In 1946, Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

In 1947, William Shockley, John Bardeen, and Walter Brattain of Bell Laboratories invented the transistor.
They discovered how to make an electric switch with solid materials and no need for a vacuum.

Vacuum tubes and transistors both control electricity but in different ways. Unlike large, hot, and
unreliable vacuum tubes that burn out and consume a lot of power, transistors are incredibly small, cool,
and efficient. Because of these fundamental differences, they are not interchangeable; a circuit designed
for one cannot simply use the other.

In 1953, Grace Hopper developed one of the first computer languages, which eventually evolved into COBOL. Designed to be easy to understand, using English words, and portable across different computers, COBOL solved a major problem for businesses. Despite its age, it remains one of the most widely used languages in the world, especially in financial systems.

In 1957, an IBM team led by John Backus developed FORTRAN (FORmula TRANslation), the first major
algorithmic language. It was designed to simplify scientific computations, making computer programming
more accessible.

In 1958, Jack Kilby and Robert Noyce unveiled the integrated circuit, or "computer chip." An IC is a small
silicon chip containing hundreds to millions of tiny components like transistors. These chips can perform
complex functions like computation and data storage, making them a foundational element of modern
electronics. Kilby won the Nobel Prize in Physics in 2000 for this groundbreaking work.

In 1964, Douglas Engelbart shows a prototype of the modern computer, with a mouse and a graphical user
interface (GUI). This marks the evolution of the computer from a specialized machine for scientists and
mathematicians to technology that is more accessible to the general public.

In 1969, Bell Labs developers created UNIX, an operating system that solved compatibility issues by being portable across various platforms. It was later rewritten in the C programming language, a simple, flexible, and foundational language sometimes informally described as the "god of programming languages." C is the basis for many operating systems and complex programs, and knowing it makes many other languages easier to learn.

In 1970, Intel introduced the Intel 1103, the first commercially successful Dynamic Random-Access Memory (DRAM) chip. While its predecessor, the 1101, struggled, the 1103 was the first chip to store a significant amount of information. Because it was cheaper and more power-efficient than the older core memory technology, the 1103 was a huge success and quickly became the new standard for computer memory.
In 1971, Alan Shugart and his team at IBM invented the floppy disk, a portable magnetic storage medium. The floppy disk, a thin, flexible magnetic disk encased in plastic, enabled data to be easily shared and transferred between computers. Because they were cheaper than hard drives, floppy disks became the standard for distributing software, transferring files, and backing up data in the early days of personal computing.

In 1973, Robert Metcalfe, a member of the research staff at Xerox, develops Ethernet for connecting multiple computers and other hardware.

In 1974-1977, a number of personal computers hit the market, including the Scelbi, the Mark-8, the Altair, the IBM 5100, Radio Shack's TRS-80 — affectionately known as the "Trash 80" — and the Commodore PET.

In 1975, the Altair 8800, one of the first personal computers, was featured in Popular Electronics magazine as a kit for consumers. This revolutionary product gave two friends, Paul Allen and Bill Gates, the opportunity to create software for it using the new BASIC language. Their success led them to found Microsoft on April 4, 1975, and the company would go on to become a giant in the computing industry.

In 1976, Steve Jobs and Steve Wozniak start Apple Computers on April Fool's Day and roll out the Apple I,
the first computer with a single-circuit board, according to Stanford University.

In 1977, Radio Shack's initial production run of the TRS-80 was just 3,000. It sold like crazy. For the first
time, non-geeks could write programs and make a computer do what they wished.

In 1977, Jobs and Wozniak incorporate Apple and show the Apple II at the first West Coast Computer Faire.
It offers color graphics and incorporates an audio cassette drive for storage.

In 1978, Accountants rejoice at the introduction of VisiCalc, the first computerized spreadsheet program.

In 1979, WordStar was released by MicroPro International, making word processing a reality. Its creator,
Rob Barnaby, said its key features were adding margins and word wrap, getting rid of the old command
mode, and including a print function, which made it easier to use than previous programs.

In 1981, IBM introduced its first personal computer, the "Acorn," which used Microsoft's MS-DOS
operating system. The "Acorn" featured an Intel chip and two floppy disks and was the first IBM computer
sold through outside distributors like Sears & Roebuck and Computerland, popularizing the term "PC"
(Personal Computer).

In 1983, Apple's Lisa became the first personal computer to feature a Graphical User Interface (GUI),
which used interactive visual components like drop-down menus and icons. Though the Lisa was
commercially unsuccessful, its GUI technology was foundational, later evolving into the successful
Macintosh computer.

In 1985, Microsoft releases Windows, a graphical user interface (GUI) that was a direct competitor to Apple's system.

In 1985, the first dot-com domain name, Symbolics.com, is registered, predating the formal launch of the World Wide Web.

In 1990, Tim Berners-Lee develops HTML, the language used to create web pages, effectively giving birth
to the World Wide Web.
In 1993, the Pentium microprocessor is released, dramatically improving the ability of PCs to handle graphics and music. A microprocessor acts as the computer's central controlling unit, performing all arithmetic and logical operations.

In 1994, PCs evolve into powerful gaming machines with the release of popular titles like "Command &
Conquer."

In 1996, Sergey Brin and Larry Page develop the Google search engine at Stanford University.

In 1999, the term Wi-Fi becomes part of the computing language, and users begin connecting to the Internet without wires.

In 2001, Apple unveils the Mac OS X operating system, which provides protected memory architecture and
pre-emptive multi-tasking, among other benefits. Not to be outdone, Microsoft rolls out Windows XP,
which has a significantly redesigned GUI.

In 2004, Mozilla's Firefox 1.0 challenges Microsoft's Internet Explorer, the dominant Web browser.
Facebook, a social networking site, launches.

In 2005, YouTube, a video sharing service, is founded.

In 2006, Apple introduces the MacBook Pro, its first Intel-based, dual-core mobile computer, as well as an
Intel-based iMac.

In 2007, the iPhone brings many computer functions to the smartphone.

In 2009, Microsoft launches Windows 7, which offers the ability to pin applications to the taskbar and
advances in touch and handwriting recognition, among other features.

In 2010, Apple unveils the iPad, changing the way consumers view media and jumpstarting the dormant
tablet computer segment.

In 2011, Google releases the Chromebook, a laptop that runs the Google Chrome OS.

In 2012, Facebook reaches 1 billion users on October 4.

In 2015, Apple releases the Apple Watch. Microsoft releases Windows 10.

In 2016, The first reprogrammable quantum computer was created. "Until now, there hasn't been any
quantum-computing platform that had the capability to program new algorithms into their system. They're
usually each tailored to attack a particular algorithm," said study lead author Shantanu Debnath, a
quantum physicist and optical engineer at the University of Maryland, College Park.

In 2017, The Defense Advanced Research Projects Agency (DARPA) begins its "Molecular Informatics"
program. This initiative explores using molecules, with their unique properties, as a new way to store and
process data, moving beyond the traditional 0s and 1s of digital computing.

From 2018 to the present:

• Cloud Computing: Businesses are moving to the cloud to store vast amounts of data, saving time
and money while increasing reliability and security.
• The Internet of Things (IoT): This technology connects devices to create "smart" homes and workplaces. IoT, combined with edge computing, is enhancing applications for security and real-time problem-solving.
• Artificial Intelligence (AI): AI and machine learning are becoming an unstoppable force, helping
other technologies create sophisticated software for tasks like natural language processing,
computer vision, and more. Amazon's Alexa is a prime example.
• Virtual Assistance: Technologies like virtual assistants (e.g., Amazon Alexa) and chatbots are
changing how brands engage with customers by providing quick, 24/7 service.
• Augmented Reality (AR): AR places computer-generated images on top of the real world. Unlike
virtual reality, it uses your existing environment to add new information, as seen in Snapchat filters
and smartphone features.
• 3-D Printing: This technology is changing manufacturing by allowing the creation of a wide range
of products, from lightweight casts for broken bones to human body parts and safer vehicles, while
saving time and money.
• Robotic Process Automation (RPA): RPA uses software to automate repetitive business tasks,
freeing up people to focus on more complex work. This technology can automate up to 45% of
activities, including those of high-level professionals.
• Cybersecurity: This field is constantly evolving to defend against new threats from hackers. As long
as there are malicious actors, cybersecurity will continue to develop new methods to protect data.

1.2 GENERATIONS OF COMPUTERS


Computers are electronic devices that manipulate, store, and process data. Evolving from their
beginnings around 1940, they have transformed from simple data processors into versatile
machines used for a wide range of tasks, including typing documents, browsing the web, and
creating multimedia content.
FIVE GENERATIONS OF COMPUTERS
1. Vacuum Tubes and Plugboards (1951-1958)
The first generation of computers used vacuum tubes as their main logic elements; punched cards to input and externally store data; and rotating magnetic drums for internal storage of data. Programs were written in machine language (instructions written as a string of 0s and 1s) or assembly language (a kind of shorthand that another program, called an assembler, would then translate into machine language).
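
To make the machine-versus-assembly distinction concrete, here is a minimal sketch in Python of what an assembler does; the three-instruction set and its opcodes are invented purely for illustration and do not correspond to any real machine.

    # A toy assembler: translates symbolic mnemonics into the strings of
    # 0s and 1s a first-generation programmer would otherwise write by hand.
    # The instruction set below is invented purely for illustration.
    OPCODES = {
        "LOAD": "0001",    # copy a value from memory into the accumulator
        "ADD": "0010",     # add a value from memory to the accumulator
        "STORE": "0011",   # write the accumulator back to memory
    }

    def assemble(program):
        """Translate 'MNEMONIC address' lines into 12-bit machine words."""
        words = []
        for line in program:
            mnemonic, address = line.split()
            # Opcode in the high 4 bits, memory address in the low 8 bits.
            words.append(OPCODES[mnemonic] + format(int(address), "08b"))
        return words

    for word in assemble(["LOAD 7", "ADD 12", "STORE 20"]):
        print(word)   # e.g. 000100000111 for LOAD 7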

The following are the characteristics of the First Generation computers:


• Used vacuum tubes for circuitry
• Electron-emitting metal in vacuum tubes burned out easily
• Used magnetic drums for memory
• Were huge, slow, expensive, and often undependable
• Were expensive to operate
• Were power hungry
• Generated a lot of heat, which often caused malfunctions
• Solved one problem at a time
• Used input based on punched cards
• Displayed their output on printouts
• Used magnetic tapes
• Had limited primary memory
• Were programmed only in machine and assembly language

The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer, delivered to its first client, the U.S. Census Bureau, in 1951.

2. Transistors and Batch Processing (1959-1963)

In the 1940s, AT&T's Bell Laboratories discovered that a class of crystalline mineral materials called semiconductors could be used in the design of a device called a transistor to replace vacuum tubes. Magnetic cores (very small donut-shaped magnets that could be polarized in one of two directions to represent data) strung on wire within the computer became the primary internal storage technology. Magnetic tape and disks began to replace punched cards as external storage devices. High-level programming languages (program instructions that could be written with simple words and mathematical expressions), like FORTRAN and COBOL, made computers more accessible to scientists and businesses.
Second-generation computers moved from cryptic binary machine language to symbolic,
or assembly, languages, which allowed programmers to specify instructions in words.
High-level programming languages were also being developed at this time, such as early
versions of COBOL and FORTRAN. These were also the first computers that stored their
instructions in their memory, which moved from a magnetic drum to magnetic core
technology.
The following are the characteristics of the Second Generation computers:
• Used transistors
• Faster and more reliable than first generation systems
• Were slightly smaller, cheaper, faster
• Generated heat, though less than first-generation systems
• Still relied on punch cards and printouts for input/output
• Allowed assembly and high-level languages
• Stored data in magnetic media
• Were still costly
• Needed air conditioning
• Introduced assembly language and operating system software
The first computers of this generation were developed for the atomic energy industry.
3. Integrated Circuits and Multiprogramming (1964-1979)
Individual transistors were replaced by integrated circuits. Magnetic core memories began to give way to a new form, metal-oxide semiconductor (MOS) memory, which, like integrated circuits, used silicon-based chips. Increased memory capacity and processing
power made possible the development of operating systems — special programs that help
the various elements of the computer to work together to process information.
Programming languages like BASIC were developed, making programming easier to do.
Instead of punched cards and printouts, users interacted with third generation computers
through keyboards and monitors and interfaced with an operating system, which allowed
the device to run many different applications at one time with a central program that
monitored the memory. Computers for the first time became accessible to a mass
audience because they were smaller and cheaper than their predecessors.
The following are the characteristics of the Third Generation computers:
• Used ICs
• Used parallel processing
• Were slightly smaller, cheaper, faster
• Used motherboards
• Data was input using keyboards
• Output was visualized on the monitors
• Used operating systems, thus permitting multitasking
• Simplified programming languages (e.g., BASIC)
4. The Microprocessor, OS, and GUI (1979 to Present)
The microprocessor brought the fourth generation of computers, as thousands of
integrated circuits were built onto a single silicon chip. What in the first generation filled
an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in
1971, located all the components of the computer—from the central processing unit and
memory to input/output controls—on a single chip.
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple
introduced the Macintosh. Microprocessors also moved out of the realm of desktop
computers and into many areas of life as more and more everyday products began to
use microprocessors.
As these small computers became more powerful, they could be linked together to form
networks, which eventually led to the development of the Internet. Fourth generation
computers also saw the development of GUIs, the mouse and handheld devices.
The following are the characteristics of the Fourth Generation computers:
• Used CPUs which contained thousands of transistors
• Were much smaller, fitting on desktops, laps, and palms
• Used a mouse
• Were used in networks
• Were cheap
• Had GUIs
• Were very fast
• Pack over 19 billion transistors into high-end microprocessors

5. The Present and The Future


Fifth generation computing devices, based on artificial intelligence, are still in
development, though there are some applications, such as voice recognition, that are
being used today. The use of parallel processing and superconductors is helping to make
artificial intelligence a reality.
Quantum computation and molecular and nanotechnology will radically change the face
of computers in years to come. The goal of fifth-generation computing is to develop
devices that respond to natural language input and are capable of learning and self-organization.

1.3 FOUR BASIC COMPUTER PERIODS


Information technology has existed as long as humans have, spanning four main ages of communication.
While only the most recent ages—the electronic and electromechanical—directly impact us today,
understanding all four is crucial to appreciating the evolution of modern technology.

PRE-MECHANICAL AGE (3000 B.C. – 1450 A.D.)

The pre-mechanical age, the earliest period of information technology, began with the invention of writing.
Around 3000 B.C., the Sumerians developed cuneiform, a system that used signs to represent sounds, a
major step up from earlier pictographs like petroglyphs. As writing became more common, so did the need
for materials to write on. This led to the creation of paper from the papyrus plant, and later from rags by
the Chinese. This new technology enabled the creation of the first books and libraries, with early examples
being clay tablets in Mesopotamia and papyrus scrolls in Egypt. These libraries served as the first forms of
permanent information storage. In parallel, the first numbering system (1-9) was developed around 100
A.D., though the number zero wasn't invented until 875 A.D. This period also saw the birth of the first
information processor, the abacus, a simple calculator that laid the groundwork for future computing
devices.

Writing Systems: The Sumerians in Mesopotamia developed the first writing system, cuneiform, which used signs to represent spoken sounds. This advanced from earlier pictographs like petroglyphs. The Phoenician alphabet also emerged during this period.

Paper & Pens: The development of alphabets led to the creation of paper from the papyrus plant and later
from rags (by the Chinese). The first writing was on wet clay.
Permanent Storage: The need for permanent records led to the first books and libraries. Early "books"
were clay tablets, while the Egyptians used scrolls made of papyrus. The Greeks later developed bound
books and the first public libraries around 500 B.C.

Numbering & Calculation: The first numbering system (1-9) was created by people in India around 100
A.D., but it took 775 years for the number zero (0) to be invented. The abacus emerged as the first true
calculator, a sign of early information processing.

MECHANICAL AGE (1450 – 1840)

The mechanical age is when we first start to see connections between our current technology and its ancestors. Many new technologies were developed in this era as interest in the field exploded.

1. Johann Gutenberg in Mainz, Germany, invented the movable metal-type printing process in 1450
and sped up the process of composing pages from weeks to a few minutes. The printing press
made written information much more accessible to the general public by reducing the time and
cost that it took to reproduce the written material.
2. In the early 1600s, William Oughtred, an English clergyman, invented the slide rule, a device that allowed the user to multiply and divide by sliding two pieces of precisely machined and scribed wood against each other. The slide rule is an early example of an analog computer — an instrument that measures instead of counts.
3. In the 1820s, English mathematician Charles Babbage invented the Difference Engine to solve
equations and print results, though he was unable to complete a more complex version due to a
lack of funding. He later designed the Analytical Engine in the 1830s. This design was remarkably
similar to modern computers, with a "store" for data and a use of punched cards (an idea from
French weaver Joseph Jacquard) to direct its operations.
4. Blaise Pascal, a French mathematician, invented the Pascaline around 1642 which was a very
popular mechanical computer; it used a series of wheels and cogs to add and subtract numbers.
5. Lady Augusta Ada Byron assisted Charles Babbage by designing the instructions for his machines,
earning her the title of the "first programmer." Despite her contributions, Babbage ultimately had
to abandon his ambitious plan to build the Analytical Engine due to a lack of funding. Although
the machines of this era were massive and limited to single-type calculations by today's standards,
they represented significant and groundbreaking advancements for their time.

ELECTROMECHANICAL AGE (1840 – 1940)

Now we are finally getting close to some technologies that resemble our modern-day technology. The
discovery of ways to harness electricity was the key advance made during this period. Knowledge and
information could now be converted into electrical impulses. These are the beginnings of
telecommunication.

1. The discovery of a reliable method of creating and storing electricity, with the Voltaic battery at the end of the 18th century, made possible a whole new method of communicating information.
2. The Telegraph was created in the early 1800s. It is the first major invention to use electricity for
communication purposes and made it possible to transmit information over great distances with
great speed.
3. Morse code was created by Samuel Morse in 1835. Morse devised a system that broke down information (in this case, the alphabet) into bits (dots and dashes) that could then be transformed into electrical impulses and transmitted over a wire, just as today's digital technologies break down information into zeros and ones. (A small code sketch of this parallel follows the list below.)
4. The Telephone (one of the most popular forms of communication ever) was created by Alexander
Graham Bell in 1876. This was followed by the discovery that electrical waves travel through space
and can produce an effect far from the point at which they originated. These two events led to the
invention of the radio by Marconi in 1894.

5. In 1890, Herman Hollerith invented a machine that used electrical sensing to read punched cards, which automated the process of sorting and counting the U.S. Census. The company he founded eventually became IBM. Later, Howard Aiken, a Harvard graduate student, combined Hollerith's punch card technology with Babbage's designs. With IBM's funding, he created the Mark I, an electromechanical computer that used paper tape for instructions and punch cards for data.
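
As a rough illustration of the parallel drawn in point 3, the short Python sketch below encodes a word as dots and dashes; the Morse table is abridged to just the letters needed for the example.

    # Morse code breaks text into combinations of two symbols (dot and
    # dash), much as digital systems break information into 0s and 1s.
    # Abridged table: only the letters used in the example below.
    MORSE = {"H": "....", "E": ".", "L": ".-..", "O": "---"}

    def to_morse(text):
        """Encode text as Morse code, separating letters with spaces."""
        return " ".join(MORSE[letter] for letter in text.upper())

    print(to_morse("hello"))   # .... . .-.. .-.. ---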

ELECTRONIC AGE (1940 – Present)

The Electronic Numerical Integrator and Computer (ENIAC) was the first high-speed digital computer capable of being reprogrammed to solve a full range of computing problems. It was designed for the U.S. Army to calculate artillery firing tables. The machine was even bigger than the Mark I, taking up 680 square feet and weighing 30 tons, and it mainly used vacuum tubes to do its calculations.

FOUR MAIN STAGES OF DIGITAL COMPUTING

1. The first was the period of vacuum tubes and punch cards, such as the ENIAC and Mark I. Internal
storage was carried out using rotating magnetic drums.

2. The second version replaced vacuum tubes with transistors, punched cards with magnetic tape, and
rotating magnetic drums with magnetic cores for internal storage. During this time, high-level
programming languages such as FORTRAN and COBOL were developed.

3. The third generation replaced transistors with integrated circuits, magnetic tape was utilized in all
computers, and magnetic cores were converted into metal oxide semiconductors. Around this time, a
genuine operating system appeared, as did the advanced programming language BASIC.

4. The fourth and most recent iteration introduced CPUs (central processing units), which integrated
memory, logic, and control circuits on a single chip. The personal computer was created (Apple II). The
graphical user interface (GUI) was designed.

1.4 CLASSIFICATION OF COMPUTERS


Computer systems can be classified on the following bases:

1. On the basis of size.

2. On the basis of functionality.

3. On the basis of data handling.


CLASSIFICATION ON THE BASIS OF SIZE

1. Supercomputers : Supercomputers are the highest-performing systems. A supercomputer is a computer with a high level of performance compared to a general-purpose computer. The performance of a supercomputer is measured in FLOPS (floating-point operations per second) rather than MIPS (million instructions per second). All of the world's 500 fastest supercomputers run Linux-based operating systems. Research is under way in China, the US, the EU, Taiwan, and Japan to build even faster, higher-performing, and more technologically advanced supercomputers. Supercomputers play an important role in the field of computation and are used for intensive computational tasks in various fields, including quantum mechanics, weather forecasting, climate research, oil and gas exploration, molecular modeling, and physical simulations. Throughout history, supercomputers have also been essential in the field of cryptanalysis.

2. Mainframe computers : Commonly called "big iron," mainframes are usually used by large organizations for bulk data processing such as statistics, census data processing, and transaction processing, and they are widely used as servers because they have higher processing capability than other classes of computers. Most mainframe architectures were established in the 1960s; research and development have continued over the years, and the mainframes of today are far better than the earlier ones in size, capacity, and efficiency.

3. Minicomputers : These computers came onto the market in the mid-1960s and were sold at a much lower price than mainframes. They were originally designed for control, instrumentation, human interaction, and communication switching, as distinct from calculation and record keeping, though with evolution they later became very popular for personal use. The term "minicomputer" was coined in the 1960s to describe the smaller computers that became possible with the use of transistors, core memory technologies, minimal instruction sets, and less expensive peripherals such as the ubiquitous Teletype Model 33 ASR. They usually took up one or a few rack cabinets, compared with the large mainframes that could fill a room.

4. Microcomputers : A microcomputer is a small, relatively inexpensive computer with a microprocessor as its CPU. It includes a microprocessor, memory, and minimal I/O circuitry mounted on a single printed circuit board. Its predecessors, mainframes and minicomputers, were comparatively much larger, harder to maintain, and more expensive. Microcomputers formed the foundation for the present-day computers and smart gadgets that we use in day-to-day life.

CLASSIFICATION ON THE BASIS OF FUNCTIONALITY

▪ Servers : Servers are dedicated computers set up to offer services to clients. They are named according to the type of service they offer, e.g., security server, database server.
▪ Workstations : These are computers designed primarily to be used by a single user at a time, though they run multi-user operating systems. They are the machines we use for our day-to-day personal and commercial work.
▪ Information appliances : These are portable devices designed to perform a limited set of tasks such as basic calculations, playing multimedia, and browsing the internet. They are generally referred to as mobile devices. They have very limited memory and flexibility and generally run on an "as-is" basis.
▪ Embedded computers : These are computing devices used inside other machines to serve a limited set of requirements. They execute instructions from non-volatile memory and are not required to be rebooted or reset. The processing units used in such devices serve those basic requirements only and differ from the ones used in personal computers, better known as workstations.

CLASSIFICATION ON THE BASIS OF DATA HANDLING

▪ Analog : An analog computer is a form of computer that uses continuously changeable aspects of physical phenomena, such as electrical, mechanical, or hydraulic quantities, to model the problem being solved. Anything that varies continuously with respect to time can be treated as analog, just as an analog clock measures time by the distance traveled by its hands around the circular dial.

▪ Digital : A computer that performs calculations and logical operations with quantities represented as digits, usually in the binary number system of "0" and "1". A digital computer solves problems by processing information expressed in discrete form: by manipulating combinations of binary digits, it can perform mathematical calculations, organize and analyze data, control industrial and other processes, and simulate dynamic systems such as global weather patterns. (A short sketch after this list illustrates the idea.)

▪ Hybrid : A computer that processes both analog and digital data. A hybrid computer is a digital computer that accepts analog signals, converts them to digital form, and processes them digitally.
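
To illustrate the "Digital" entry above, here is a minimal Python sketch showing quantities as combinations of binary digits, with one arithmetic and one logical operation carried out on them; the numbers are arbitrary.

    # Digital computers represent quantities as combinations of binary
    # digits and compute by manipulating those combinations.
    a, b = 13, 6
    print(format(a, "04b"))       # 1101  -- 13 as four binary digits
    print(format(b, "04b"))       # 0110  --  6 as four binary digits
    print(format(a + b, "05b"))   # 10011 -- arithmetic (13 + 6 = 19)
    print(format(a & b, "04b"))   # 0100  -- a logical (bitwise AND) operation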

1.5 EVOLUTION OF INFORMATION TECHNOLOGY

Technological Evolution

Since World War II, the performance capabilities of computers and telecommunications have been
doubling every few years at constant cost. For example, a decade ago $3,500 could buy a new Apple
II microcomputer. Today, $6,800 — the same amount of purchasing power (adjusted for 10 years of inflation) — can buy a new Macintosh II microcomputer. The Macintosh handles 4 times the information
at 16 times the speed, preprogrammed and reprogrammable memory are both about 20 times larger,
disk storage is about 90 times larger, and the display has 7 times the resolution and 16 times the
number of colors. Comparable figures could be cited for other brands of machines. Equally impressive,
users’ demands for this power have increased as rapidly as it has become available. Over the next two
decades, data processing and information systems will probably be replaced by sophisticated devices
for knowledge creation, capture, transfer, and use. A similar evolution can be forecast for
telecommunications: personal video-recorders, optical fiber networks, intelligent telephones,
information utilities such as videotex, and digital discs will change the nature of media.

Cognition Enhancers
The concept of “cognition enhancers” can help us understand how we can use these
emerging technologies. A cognition enhancer combines the complementary strengths of a
person and an information technology. Two categories of cognition enhancers will have
considerable impact on the workplace: empowering environments and hypermedia.

Empowering Environments

Empowering environments enhance human accomplishment by a division of labor: the machine handles the routine mechanics of a task, while the person is immersed in its higher-order meanings. The workplace is adopting many empowering environments: databases for information management, spreadsheets for modeling, computer-aided design systems for manufacturing. And word processors with embedded spelling checkers, thesauruses, outliners, text analyzers, and graphics tools are driving the evolution of a new field: desktop publishing.

Hypermedia

Hypermedia is a framework for creating an interconnected, web-like representation of symbols (text, graphics, images, software codes) in the computer. This representation is similar to human long-term memory: people store information in networks of symbolic, temporal, and visual images related by association.
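
A web-like, associative structure of this kind can be sketched in a few lines of Python as a graph whose nodes link to related nodes; the node names here are invented for illustration.

    # A toy hypermedia graph: each node (text, image, code fragment)
    # links by association to related nodes, as in long-term memory.
    links = {
        "computer history": ["ENIAC", "punch cards"],
        "ENIAC": ["vacuum tubes"],
        "punch cards": ["Jacquard loom"],
    }

    def follow(node, depth=2, indent=""):
        """Print the nodes reachable from a starting node by association."""
        if depth == 0:
            return
        for neighbor in links.get(node, []):
            print(indent + node + " -> " + neighbor)
            follow(neighbor, depth - 1, indent + "  ")

    follow("computer history")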

Evolution of Software Applications

The advancement of hardware alone would not have been enough to change human lifestyles had it not been supported by software applications. Let us see how software applications evolved over time.

Command-line programs (1980s) - the first generation of software applications included compilers, device drivers, etc., which were mainly command-line programs.

Desktop applications (1990s) - with the popularity of graphical interfaces, GUI-based desktop applications of many types and forms were released: office applications, audio and video players, utility programs, browsers, etc.

Web applications (21st century) - with the web's availability, the next generation of applications was developed with the World Wide Web in mind, so that they could be accessed from any location over the internet. The most popular web applications include email clients like Gmail and Ymail, and social networking platforms like Facebook, Twitter, Instagram, Pinterest, and Quora.

Mobile applications (21st century) - advances in computer technology have made smartphones affordable. The most popular mobile application development platforms are iOS, Android, and Windows, which are also the most popular mobile operating systems.
Evolution of Programming Languages

Software is developed using various programming languages. Programming started with machine language and evolved into new-age programming systems.

1st generation programming language (1GL) - early programming was done in machine language, so machine language is the first generation programming language.

2nd generation programming language (2GL) - also called assembly language programming, which is easier for the computer to understand but difficult for programmers.

3rd generation programming language (3GL) - closer to ordinary English and hence easier for programmers to understand; also called High-Level Languages (HLLs).

4th generation programming language (4GL) - closer to natural language than 3GLs.

5th generation programming language (5GL) - used mainly in artificial intelligence research.
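
The jump from 2GL to 3GL can be made concrete with a small, hypothetical comparison: the same computation written as assembly-style shorthand (shown only as comments, with invented mnemonics) and as a single high-level statement, with Python standing in for a 3GL.

    # 2GL (assembly-style shorthand; the mnemonics are invented):
    #   LOAD  price      ; bring price into the accumulator
    #   MUL   quantity   ; multiply by quantity
    #   STORE total      ; write the result back to memory
    #
    # 3GL (a high-level language; Python stands in here):
    price, quantity = 19, 3
    total = price * quantity
    print(total)   # 57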

1.6 EVOLUTION OF MEDIA


The media has transformed itself based on two things – (1) how information is presented; and (2) how
the connection is established.

Woodcut printing on cloth or on paper was used in the early 15th century.

It was in 1436 when Johannes Gutenberg started working on a printing press which used relief printing
and a molding system. Now, the modern printing press delivers messages in print, such as
newspapers, textbooks, and magazines.

In the 1800s, the telegraph was developed, followed by the telephone, which made two-way communication possible. Messages could now be sent and received in both directions simultaneously.

At the beginning of the 1900s, broadcasting and recorded media were introduced. Radio and
television were used to send sound and video to homes and offices through electromagnetic spectrum
or radio waves. Audio (lower frequency band) or video (higher frequency band) content can be
received depending on the frequency used. Later on, a combination of both audio and video
information made the audience’s viewing experience more exciting. Films and movies became
popular as they catered to larger audiences.

As communication devices evolved and became pervasive, so did information distribution. A photo taken with a smartphone can immediately be uploaded and shared on Facebook, Twitter, or Instagram. Community websites such as [Link], a Philippine counterpart of [Link], let users buy and sell items online, eliminating the need to go to physical stores.

In line with this development, audiences, regardless of their professions, can now interact with one another and are no longer disconnected. News sites can even get news stories from Twitter or other social media sites. According to Claudine Beaumont, a writer for The Telegraph, one good example of this happened on January 15, 2009, when dozens of New Yorkers sent 'tweets' about a plane crash in the city. News about US Airways Flight 1549, which was forced to land in the Hudson River off Manhattan, immediately spread all over the country. All of the plane's engines had shut down when it struck a flock of geese minutes after take-off from New York's LaGuardia Airport.

The figure shows one of the first photos of the incident, taken by Twitter user Jānis Krūms, showing the downed plane with survivors standing on its wings waiting for rescue. It was instantly forwarded across Twitter and used by numerous blogs and news websites, causing the TwitPic service to crash under the volume of views. Twitter users were able to break the news of the incident around 15 minutes before the mainstream media alerted the public to the crash.

This is a typical example of how individuals can now deliver content to everyone, and how connections are no longer controlled by professionals.

WHAT DOES MEDIA DO FOR US?

Media fulfills several basic roles in our society. One obvious role is entertainment. Media can act as a
springboard for our imaginations, a source of fantasy, and an outlet for escapism. In the 19th century,
Victorian readers disillusioned by the grimness of the Industrial Revolution found themselves drawn
into fantastic worlds of fairies and other fictitious beings. In the first decade of the 21st century,
American television viewers could peek in on a conflicted Texas high school football team in Friday
Night Lights; the violence-plagued drug trade in Baltimore in The Wire; a 1960s-Manhattan ad agency
in Mad Men; or the last surviving band of humans in a distant, miserable future in Battlestar Galactica.
Through bringing us stories of all kinds, media has the power to take us away from ourselves.

Media can also provide information and education. Information can come in many forms, and it may
sometimes be difficult to separate from entertainment. Today, newspapers and news-oriented
television and radio programs make available stories from across the globe, allowing readers or
viewers in London to access voices and videos from Baghdad, Tokyo, or Buenos Aires. Books and
magazines provide a more in-depth look at a wide range of subjects. The free online encyclopedia
Wikipedia has articles on topics from presidential nicknames to child prodigies to tongue twisters in
various languages. The Massachusetts Institute of Technology (MIT) has posted free lecture notes,
exams, and audio and video recordings of classes on its OpenCourseWare website, allowing anyone
with an Internet connection access to world-class professors.

Another useful aspect of media is its ability to act as a public forum for the discussion of important
issues. In newspapers or other periodicals, letters to the editor allow readers to respond to journalists
or to voice their opinions on the issues of the day. These letters were an important part of U.S.
newspapers even when the nation was a British colony, and they have served as a means of public
discourse ever since. The Internet is a fundamentally democratic medium that allows everyone who
can get online the ability to express their opinions through, for example, blogging or podcasting—
though whether anyone will hear is another question.

Similarly, media can be used to monitor government, business, and other institutions. Upton Sinclair’s
1906 novel The Jungle exposed the miserable conditions in the turn-of-the-century meatpacking
industry; and in the early 1970s, Washington Post reporters Bob Woodward and Carl Bernstein
uncovered evidence of the Watergate break-in and subsequent cover-up, which eventually led to the
resignation of President Richard Nixon. But purveyors of mass media may be beholden to particular
agendas because of political slant, advertising funds, or ideological bias, thus constraining their ability
to act as a watchdog. The following are some of these agendas:

▪ Entertaining and providing an outlet for the imagination


▪ Educating and informing
▪ Serving as a public forum for the discussion of important issues
▪ Acting as a watchdog for government, business, and other institutions

It’s important to remember, though, that not all media are created equal. While some forms of mass
communication are better suited to entertainment, others make more sense as a venue for spreading
information. In terms of print media, books are durable and able to contain lots of information, but
are relatively slow and expensive to produce; in contrast, newspapers are comparatively cheaper and
quicker to create, making them a better medium for the quick turnover of daily news. Television
provides vastly more visual information than radio and is more dynamic than a static printed page; it
can also be used to broadcast live events to a nationwide audience, as in the annual State of the Union
address given by the U.S. president. However, it is also a one-way medium—that is, it allows for very
little direct person-to-person communication. In contrast, the Internet encourages public discussion
of issues and allows nearly everyone who wants a voice to have one. However, the Internet is also
largely unmoderated. Users may have to wade through thousands of inane comments or misinformed
amateur opinions to find quality information.

The 1960s media theorist Marshall McLuhan took these ideas one step further, famously coining the
phrase “the medium is the message (McLuhan, 1964).” By this, McLuhan meant that every medium
delivers information in a different way and that content is fundamentally shaped by the medium of
transmission. For example, although television news has the advantage of offering video and live
coverage, making a story come alive more vividly, it is also a faster-paced medium. That means more
stories get covered in less depth. A story told on television will probably be flashier, less in-depth, and
with less context than the same story covered in a monthly magazine; therefore, people who get the
majority of their news from television may have a particular view of the world shaped not by the
content of what they watch but its medium. Or, as computer scientist Alan Kay put it, “Each medium
has a special way of representing ideas that emphasize particular ways of thinking and de-emphasize
others (Kay, 1994).” Kay was writing in 1994, when the Internet was just transitioning from an
academic research network to an open public system. A decade and a half later, with the Internet
firmly ensconced in our daily lives, McLuhan’s intellectual descendants are the media analysts who
claim that the Internet is making us better at associative thinking, or more democratic, or shallower.
But McLuhan’s claims don’t leave much space for individual autonomy or resistance. In an essay about
television’s effects on contemporary fiction, writer David Foster Wallace scoffed at the “reactionaries
who regard TV as some malignancy visited on an innocent populace, sapping IQs and compromising
SAT scores while we all sit there on ever fatter bottoms with little mesmerized spirals revolving in our
eyes…. Treating television as evil is just as reductive and silly as treating it like a toaster with pictures
(Wallace, 1997).” Nonetheless, media messages and technologies affect us in countless ways, some
of which probably won’t be sorted out until long in the future.
KEY TAKEAWAYS

Media fulfills several roles in society, including the following:

▪ entertaining and providing an outlet for the imagination,


▪ educating and informing,
▪ serving as a public forum for the discussion of important issues, and
▪ acting as a watchdog for government, business, and other institutions.

▪ Johannes Gutenberg’s invention of the printing press enabled the mass production of media,
which was then industrialized by Friedrich Koenig in the early 1800s. These innovations led to the
daily newspaper, which united the urbanized, industrialized populations of the 19th century.

▪ In the 20th century, radio allowed advertisers to reach a mass audience and helped spur the
consumerism of the 1920s—and the Great Depression of the 1930s. After World War II, television
boomed in the United States and abroad, though its concentration in the hands of three major
networks led to accusations of homogenization. The spread of cable and subsequent deregulation
in the 1980s and 1990s led to more channels, but not necessarily to more diverse ownership.
▪ Transitions from one technology to another have greatly affected the media industry, although it
is difficult to say whether technology caused a cultural shift or resulted from it. The ability to make
technology small and affordable enough to fit into the home is an important aspect of the
popularization of new technologies.

1.7 MEDIA IN THE DIGITAL AGE (Baran)
