Developed in Asia and widely used during the Middle Ages, the abacus can be considered the origin of modern computing devices. An abacus, composed of strings and beads representing numerical values, can be used for arithmetic.
French philosopher and mathematician Blaise Pascal invented one of the world's first mechanical calculators in the 17th century. His machine was based on a system of rotating drums controlled by a ratchet linkage. In honor of his early contributions to computer technology, the programming language Pascal was named after him in the 1970s. The German philosopher and mathematician Gottfried Wilhelm von Leibniz later improved Pascal's design, creating a version similar in size to a modern pocket calculator. It never became commercially available, however.
The first significant automated data-processing techniques were applied to making fabric patterns, not calculating numbers. French weaver Joseph-Marie Jacquard introduced a punched-card weaving system at an industrial exposition in Paris in 1801. His system was straightforward: the punched cards controlled the pattern applied to the cloth as it was woven. The introduction of these looms, symbolizing the replacement of people by machines, caused riots.
After proposing in 1822 that it might be possible to compute mathematical tables by steam power, Charles Babbage went on to design the analytical engine in 1833, a machine that had the basic components of the modern computer. This earned him the title of father of the computer. He was aided greatly by Augusta Ada King, Countess of Lovelace, daughter of the poet Lord Byron, who is recognized as the world's first programmer. U.S. inventor and statistician Herman Hollerith put the punched-card system to use for the 1890 census. He discovered that machines could read perforated cards electrically; each perforation could stand for a piece of information that the machine could sort and manipulate. Hollerith's venture became part of the Computing-Tabulating-Recording Company in 1911, which was renamed International Business Machines (IBM) in 1924. IBM is still an IT industry leader today, and it remains on the cutting edge of technology. Some of its newest projects focus on blockchain technology, data analytics, artificial intelligence, and other emerging fields.
In the mid-1940s, punched cards were also used on the Electronic Numerical Integrator and Computer (ENIAC) at the University of Pennsylvania. ENIAC's inventors developed the world's first all-electronic, general-purpose computer for the U.S. Army. This computer was enormous and relied on over 18,000 vacuum tubes. In 1949, they introduced the Binary Automatic Computer (BINAC), which used magnetic tape, and then developed the Universal Automatic Computer (UNIVAC I) for the U.S. census. The latter was the first digital computer to handle both numerical data and alphabetical information quickly and efficiently. In 1954, IBM introduced the 650 EDPM, the first mass-produced computer, which was programmed using symbolic notation.
By the late 1950s, the transistor, invented 10 years earlier, had made the second generation of computers possible. Transistors replaced the bulky vacuum tubes and were lighter, smaller, sturdier, and more efficient.
The integrated circuits of the late 1960s introduced the solid-state technology that allowed transistors, diodes, and resistors to be carried on tiny silicon chips. These advances further reduced operating costs and increased speed, capacity, and accuracy. Minicomputers, much smaller than mainframes (large-scale computers) but of comparable power, were developed shortly afterward.
The next important advances included large-scale integration and microprocessing chips. Microchips made even smaller computers possible and reduced costs while increasing capacity. The speed with which a computer processed, calculated, retrieved, and stored data improved significantly. Decreased costs allowed manufacturers to explore new markets.
In the mid-1970s, Steve Wozniak and Steve Jobs started Apple out of their garage. Their vision was to bring computers into every home in America and, eventually, the world. Toward that end, they developed a user-friendly computer offered at a reasonable price. User-friendliness was essential, since many people without computer skills would have to adapt to the system. Their eventual product, the Macintosh, was the first widely successful computer to give on-screen instructions in everyday language and use a graphical interface. In addition, Apple popularized the mouse, which allows users to point and click on screen icons to enter commands instead of typing them in one by one.
IBM and the manufacturers who copied its designs were quick to enter the personal computer (PC) market once they recognized the device's tremendous sales potential. The result was a friendly debate among computer users over which system is better, Macs or PCs. Regardless of personal preference, the two incompatible systems often caused problems when people tried to share information across formats. Software designers have since developed ways to make file conversions easier and software more interchangeable.
One major trend of the last few decades was the downsizing of computer systems, replacing big mainframe computers with client-server architecture, or networking. Networks allow users greater computing flexibility and increased access to an ever-increasing amount of data.
The second major recent trend has been the rapid growth of the Internet and World Wide Web. Initially developed for the U.S. Department of Defense, the Internet is composed of numerous networks connected to each other around the world. Not surprisingly, this massive network has revolutionized information sharing. It’s used for real-time video conferencing, e-mail services, online research, social networking, e-commerce, online education, entertainment, and many other purposes. The World Wide Web usually refers to the body of information that is available for retrieval online, while the Internet generally refers to the back-end network system plus its various services. In recent years, Internet use on handheld and tablet devices and through wireless networks has revolutionized people's access to technology. As of May 2018, there were more than 1.8 billion Web sites, according to InternetLiveStats.com. And as of December 31, 2014, there were more than 3 billion Internet users, according to the Miniwatts Marketing Group. Approximately 10 percent of users lived in North America.
Hardware companies are continually striving to make faster and better microprocessors and memory chips. Advances in hardware technology have led directly to advances in software applications. As the developer of Windows, Microsoft has been the leader in the software industry. Windows is a user-friendly, visual-based operating system. (An operating system is the interface between the user, the programs stored on the hardware, and the hardware itself.) The disk operating system (DOS) was one of the early operating systems; although still in use, it requires more computer knowledge than its successors. The Windows and Mac systems allow users to point and click on icons and menus with a mouse to tell the computer what to do, instead of typing in specific commands by hand, as DOS requires.
Intel and Motorola have been the innovators in microprocessor design, striving for faster and more efficient processors. Such innovations allow computer manufacturers to make smaller, lighter, and quicker computers, laptops, and handheld models. As processors get faster and memory increases, computers can process more sophisticated and complicated software programming.
Two fast-growing trends are cloud computing and mobile computing. Cloud computing allows computer users to store applications and data in the “cloud,” or cyberspace, on the Internet, accessing them only as needed from a compatible tablet, handheld, or notebook computer. The International Data Corporation, a market research, analysis, and advisory firm, reports that the worldwide public cloud services market reached $160 billion in 2018—up from $45.7 billion in 2013. The market is projected to grow at a compound annual growth rate of nearly 22 percent through 2021. Mobile computing has led to a boom in smartphones or handheld computers supported by Wi-Fi technology that allows users to access the Internet and cloud content and programs from anywhere they receive a Wi-Fi signal. In 2015, 52.7 percent of the global online population accessed the Internet from their mobile phones, according to Statista, an Internet statistics firm. This percentage is expected to grow to 63.4 percent in 2019. These trends are key factors driving the evolution of computing devices and the Internet today.
Other major IT trends include the growing use of the following technologies:
- Blockchain: a distributed ledger database that maintains a continuously growing list of records (blocks) that cannot be altered retroactively without the agreement of the parties in the chain;
- Artificial Intelligence: the concept that machines can be programmed to perform functions and tasks in a “smart” manner that mimics human decision-making; and
- Machine Learning: a method of data analysis, built on artificial intelligence, that helps computers study data, identify patterns, and make decisions with minimal or no human intervention.
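The tamper-evidence that makes a blockchain ledger hard to alter comes from linking each record to a cryptographic hash of the one before it. The sketch below is a minimal illustration of that idea only, not any real blockchain platform's API; the function names are invented for this example, and it omits the distributed-consensus step entirely.

```python
import hashlib
import json

def block_hash(record, prev_hash):
    """Deterministically hash a record together with the previous block's hash."""
    payload = json.dumps({"record": record, "prev_hash": prev_hash},
                         sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, record):
    """Append a record, linking it to the hash of the previous block."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"record": record,
                  "prev_hash": prev_hash,
                  "hash": block_hash(record, prev_hash)})

def is_valid(chain):
    """Recompute every hash; any altered record breaks the chain."""
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i > 0 else "0" * 64
        if block["prev_hash"] != expected_prev:
            return False
        if block["hash"] != block_hash(block["record"], block["prev_hash"]):
            return False
    return True

chain = []
add_block(chain, "Alice pays Bob 5")
add_block(chain, "Bob pays Carol 2")
print(is_valid(chain))                       # True
chain[0]["record"] = "Alice pays Bob 500"    # tamper with an old record
print(is_valid(chain))                       # False
```

Because each block's hash covers the previous block's hash, changing any historical record invalidates every block after it, which is why alteration requires agreement of the whole chain rather than a quiet edit.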
Additionally, virtual reality, augmented reality, and mixed reality technologies are moving far beyond the video gaming industry into the health care, hospitality, training, architecture, and law enforcement fields. Virtual reality (VR) technology, typically a headset that encompasses the field of vision, allows users to immerse themselves visually, aurally, and through other sensations in imaginary worlds. Augmented reality (AR) technology, delivered through a special headset or applications on a smartphone or tablet, introduces virtual objects into the real world. Mixed reality combines virtual and augmented reality, allowing users to interact with virtual worlds by using real-world objects. The International Data Corporation (IDC), an American market research, analysis, and advisory firm, predicts that worldwide revenues from the VR/AR market will increase from $11.4 billion in 2017 to nearly $215 billion in 2021. The top uses for VR/AR technology will be in retail showcasing, onsite assembly, and safety and process manufacturing training.
- Biometrics Systems Specialists
- Chief Information Officers
- Clinical Data Managers
- Computer and Office Machine Service Technicians
- Computer Network Administrators
- Computer Programmers
- Computer Support Service Owners
- Computer Support Specialists
- Computer Systems Programmer/Analysts
- Computer Trainers
- Data Processing Technicians
- Data Scientists
- Data Warehousing Specialists
- Database Specialists
- Digital Agents
- Document Management Specialists
- Electrical and Electronics Engineers
- Electrical Engineering Technologists
- Electronics Engineering Technicians
- Electronics Service Technicians
- Embedded Systems Engineers
- Fiber Optics Technicians
- Graphic Designers
- Graphics Programmers
- Hardware Engineers
- Information Assurance Analysts
- Information Security Analysts
- Information Technology Consultants
- Information Technology Project Managers
- Microelectronics Technicians
- Semiconductor Technicians
- Software Application Developers
- Software Designers
- Software Engineers
- Software Quality Assurance Testers
- Solutions Architects
- Systems Setup Specialists
- Technical Support Specialists
- Technical Writers and Editors
- Wireless Service Technicians