Developed in Asia and widely used during the Middle Ages, the abacus can be considered the origin of modern computing devices. An abacus, composed of strings and beads representing numerical values, can be used for arithmetic.
French philosopher Blaise Pascal invented the world's first digital calculator in the 17th century. His machine was based on a system of rotating drums controlled with a ratchet linkage. In honor of his early contributions to computer technology, the programming language Pascal was named after him in the 1970s. A German philosopher and mathematician, Gottfried Wilhelm von Leibniz, later improved Pascal's design, producing a version similar in size to a modern handheld calculator. It never became commercially available, however.
The first significant automated data-processing techniques were applied to making fabric patterns, not calculating numbers. French weaver Joseph-Marie Jacquard introduced a punch-card weaving system at an industrial exposition in Paris in 1801. His system was straightforward: the punched cards controlled the pattern applied to the cloth as it was woven. The introduction of these looms, symbolizing the replacement of people by machines, caused riots.
After proposing in 1822 that it might be possible to compute table entries using a steam engine, Charles Babbage had second thoughts about his idea and went on to design the Analytical Engine, which by 1833 had the basic components of the modern computer. This earned him the title of father of the computer. He was aided greatly by the daughter of famous poet Lord Byron, Augusta Ada King, Countess of Lovelace, who is recognized as the world's first programmer. U.S. inventor and statistician Herman Hollerith put the punched-card system to use for the 1890 census. He discovered that perforated cards could be read electrically by machines. Each perforation could stand for some important piece of information that the machine could sort and manipulate. Hollerith's company merged into the Computing-Tabulating-Recording Company in 1911, which was renamed International Business Machines (IBM) in 1924. IBM is still an IT industry leader today, and it remains on the cutting edge of technology. Some of its newest projects focus on blockchain technology, data analytics, artificial intelligence, and other emerging fields.
In the mid-1940s, punched cards were also used on the Electronic Numerical Integrator and Computer (ENIAC) at the University of Pennsylvania. Its inventors, J. Presper Eckert and John Mauchly, had developed it as the world's first all-electronic, general-purpose computer for the U.S. Army. This computer was enormous and relied on more than 18,000 vacuum tubes. In 1949, they introduced the Binary Automatic Computer (BINAC), which used magnetic tape, and then developed the Universal Automatic Computer (UNIVAC I) for the U.S. census. The latter was the first digital computer to handle both numerical data and alphabetical information quickly and efficiently. In 1954, IBM introduced the 650 EDPM, one of the first mass-produced computers, which was programmed using symbolic notation.
By the late 1950s, the transistor, invented 10 years earlier, had made the second generation of computers possible. Transistors replaced the bulky vacuum tubes and were lighter, smaller, sturdier, and more efficient.
The integrated circuits of the late 1960s introduced the solid-state technology that allowed transistors, diodes, and resistors to be carried on tiny silicon chips. These advances further reduced operating costs and increased speed, capacity, and accuracy. Minicomputers, much smaller than mainframes (large-scale computers) but of comparable power, were developed shortly afterward.
The next important advances included large-scale integration and microprocessing chips. Microchips made even smaller computers possible and reduced costs while increasing capacity. The speed with which a computer processed, calculated, retrieved, and stored data improved significantly. Decreased costs allowed manufacturers to explore new markets.
In the mid-1970s, Steve Wozniak and Steve Jobs started Apple out of their garage. Their vision was to bring computers into every home in America and even the world. Toward that end, they developed a user-friendly computer offered at a reasonable price. User-friendliness was essential, since many people without computer skills would have to adapt to the computer system. Their eventual product, the Macintosh computer, was the first to give on-screen instructions in everyday language and to use a graphical interface successfully. In addition, Apple popularized the mouse, which allows users to point and click on screen icons to enter commands instead of typing them in one by one.
IBM and the manufacturers who copied its designs were quick to enter the personal computer (PC) market once they recognized the tremendous sales potential of the device. The result was a friendly debate among computer users over which is better, Macs or PCs. Regardless of personal preference, the two incompatible systems often led to problems when people tried to share information across formats. Software designers have since developed ways to make file conversions easier and software more interchangeable.
One major trend of the last few decades was the downsizing of computer systems, replacing big mainframe computers with client-server architecture, or networking. Networks allow users greater computing flexibility and increased access to an ever-increasing amount of data.
The second major recent trend has been the rapid growth of the Internet and World Wide Web. Initially developed for the U.S. Department of Defense, the Internet is composed of numerous networks connected to each other around the world. Not surprisingly, this massive network has revolutionized information sharing. It’s used for real-time video conferencing, e-mail services, online research, social networking, e-commerce, online education, entertainment, and many other purposes. The World Wide Web usually refers to the body of information that is available for retrieval online, while the Internet generally refers to the back-end network system plus its various services. In recent years, Internet use on handheld and tablet devices and through wireless networks has revolutionized people's access to technology. As of September 2020, there were more than 1.8 billion Web sites and more than 4.6 billion Internet users, according to InternetLiveStats.com. Approximately 8 percent of users lived in North America.
Hardware companies are continually striving to make faster and better microprocessors and memory chips. Advances in hardware technology have led directly to advances in software applications. As the developer of Windows, Microsoft has been the leader in the software industry. Windows is a user-friendly, visual-based operating system. (An operating system is the interface between the user, the programs stored on the hardware, and the hardware itself.) The disk operating system (DOS) was one of the earliest operating systems; although still used in some settings, it requires more computer knowledge than other operating systems. The Windows and Mac systems allow users to point and click on icons and menus with a mouse to tell the computer what to do, instead of having to type in specific commands by hand, as DOS requires.
Intel and Motorola have been the innovators in microprocessor design, striving for faster and more efficient processors. Such innovations allow computer manufacturers to make smaller, lighter, and quicker computers, laptops, and handheld models. As processors get faster and memory increases, computers can process more sophisticated and complicated software programming.
Two fast-growing trends are cloud computing and mobile computing. Cloud computing allows computer users to store applications and data in the “cloud,” or cyberspace, on the Internet, accessing them only as needed from a compatible tablet, handheld, or notebook computer. The International Data Corporation (IDC), a market research, analysis, and advisory firm, reports that the worldwide public cloud services market reached $233.4 billion in 2019, up from $160 billion in 2018 and $45.7 billion in 2013. The market is projected to grow at a compound annual growth rate of 22.5 percent through 2022. Mobile computing has led to a boom in smartphones and other handheld computers supported by Wi-Fi technology, which allows users to access the Internet and cloud content and programs from anywhere they can receive a Wi-Fi signal. In 2020, 51.5 percent of the global online population accessed the Internet from their mobile phones, according to Statista, an Internet statistics firm. This percentage is expected to grow to 72.6 percent in 2025, according to a report by the World Advertising Research Center, using data from mobile trade body GSMA. These trends are key factors driving the evolution of computing devices and the Internet today.
Other major IT trends include the growing use of the following technologies:
- Blockchain: a distributed ledger, a database that maintains a continuously growing list of records that cannot be altered except by agreement of all parties in the chain.
- Artificial Intelligence: the concept that machines can be programmed to perform functions and tasks in a “smart” manner that mimics human decision-making processes.
- Machine Learning: a method of data analysis, built on artificial intelligence, in which computers study data, identify patterns, and make decisions with minimal or no intervention from humans.
- Quantum Computing: a type of advanced computing in which quantum computers are used to solve challenges of massive size and complexity that cannot be solved by the computing power of traditional computers. “Quantum computers could spur the development of new breakthroughs in science, medications to save lives, machine learning methods to diagnose illnesses sooner, materials to make more efficient devices and structures, financial strategies to live well in retirement, and algorithms to quickly direct resources such as ambulances,” according to IBM. Companies such as Google, Intel, Microsoft, and IBM are making significant financial investments in quantum hardware and software. The research and advisory firm Gartner, Inc. predicts that 20 percent of organizations will be budgeting for quantum computing projects by 2023, up from 1 percent in 2019.
- Biometrics: distinctive physical or behavioral characteristics (such as fingerprints, palms, eyes, and faces) that are used to identify individuals. A biometric system is a set of hardware and software that collects, processes, and assesses these characteristics and compares them against existing records to find a match. CompTIA says that biometrics “will play an important role in improving security by allowing people and devices to authenticate and move seamlessly through our high-tech world.”
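The tamper-evidence property that makes a blockchain ledger hard to alter can be illustrated with a minimal hash chain. The sketch below is a simplified illustration, not a production implementation or any particular platform's design: each record stores the hash of the record before it, so changing any record invalidates every hash that follows.

```python
import hashlib

def block_hash(index, data, prev_hash):
    """Hash a block's contents together with the previous block's hash."""
    return hashlib.sha256(f"{index}|{data}|{prev_hash}".encode()).hexdigest()

def build_chain(records):
    """Link each record to its predecessor by hash, starting from a genesis value."""
    chain = []
    prev_hash = "0" * 64  # genesis placeholder
    for i, data in enumerate(records):
        h = block_hash(i, data, prev_hash)
        chain.append({"index": i, "data": data, "prev_hash": prev_hash, "hash": h})
        prev_hash = h
    return chain

def verify_chain(chain):
    """The chain is valid only if every stored hash still matches its contents."""
    prev_hash = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev_hash:
            return False
        if block["hash"] != block_hash(block["index"], block["data"], block["prev_hash"]):
            return False
        prev_hash = block["hash"]
    return True
```

Tampering with any record (say, editing `chain[0]["data"]`) makes `verify_chain` return `False`, which is why altering a real blockchain requires the agreement of the parties who jointly maintain it.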
Additionally, virtual reality, augmented reality, and mixed reality technologies are moving far beyond the video gaming industry into fields such as health care, hospitality, training, architecture, and law enforcement. Virtual reality (VR) is technology, typically a headset that encompasses the field of vision, that allows users to immerse themselves visually, aurally, and through other sensations in imaginary worlds. Augmented reality (AR) is technology, delivered through a special headset or applications on a smartphone or tablet, that introduces virtual objects into the real world. Mixed reality combines virtual and augmented reality technology so that users can interact with virtual worlds by using real-world objects. The International Data Corporation (IDC), an American market research, analysis, and advisory firm, predicts that worldwide spending on AR/VR products and services will experience a five-year compound annual growth rate of 77 percent from 2019 to 2023.
The number of AR devices worldwide is expected to increase by 140 percent from 2018 to 2022, according to the market research firm IDC. PC Magazine reports that AR revenue should be strongest in the following industries (listed in descending order of revenue) by 2025:
- video games
- health care
- life events
- video entertainment
- real estate
- the military
Many experts believe that AR will eventually become more popular than VR because it has many more real-world uses in the aforementioned areas, as well as in industrial production, training and development, construction, tourism, vehicle navigation, and law enforcement.
Careers in this industry include the following:

- Artificial Intelligence Specialists
- Augmented Reality Developers
- Big Data Developers
- Biometrics Systems Specialists
- Blockchain Developers
- Chief Information Officers
- Clinical Data Managers
- Computer and Office Machine Service Technicians
- Computer Network Administrators
- Computer Programmers
- Computer Support Service Owners
- Computer Support Specialists
- Computer Systems Programmer/Analysts
- Computer Trainers
- Customer Success Managers
- Data Processing Technicians
- Data Scientists
- Data Warehousing Specialists
- Database Specialists
- Digital Agents
- Document Management Specialists
- Electrical Engineering Technologists
- Electrical Engineers
- Electronics Engineering Technicians
- Electronics Engineers
- Electronics Service Technicians
- Embedded Systems Engineers
- Fiber Optics Technicians
- Full Stack Developers/Engineers
- Graphic Designers
- Graphics Programmers
- Hardware Engineers
- Help Desk Representatives
- Information Assurance Analysts
- Information Security Analysts
- Information Technology Consultants
- Information Technology Project Managers
- Internet Consultants
- Machine Learning Engineers
- Microelectronics Technicians
- Personal Privacy Advisors
- Semiconductor Technicians
- Site Reliability Engineers
- Software Application Developers
- Software Designers
- Software Engineers
- Software Quality Assurance Testers
- Solutions Architects
- Systems Setup Specialists
- Technical Support Specialists
- Technical Writers and Editors
- Technology Ethicists
- Wireless Service Technicians