The concept of information technology
Information technology (IT) is the set of methods and technologies used in the public, private, and business spheres for the storage, transmission, and processing of data and information by means of computer networks (PCs, servers, mainframes, etc.) and telecommunications equipment (data centers, routers, smartphones, tablets, GPS devices, etc.). In general, hardware, software, and digital communication (ICT) are the three sectors on which IT is built; these technologies are now widely used in social, commercial, and economic contexts around the world.
The term is commonly used as a synonym for computers and computer networks, but also includes other information distribution technologies such as television, telephones and the internet. Several industries are related to information technology, including hardware, software, electronics, semiconductors, telecommunications equipment, e-commerce, web design, and IT services.
Humans have memorized, retrieved, manipulated, and communicated information ever since the Sumerians in Mesopotamia developed cuneiform writing around 3000 BC, but the term information technology in its modern sense first appeared in a 1958 article published in the Harvard Business Review; authors Harold J. Leavitt and Thomas L. Whisler commented that ‘the new technology does not yet have a single established name. We shall call it information technology (IT).’ Their definition comprises three categories: techniques for processing, the application of statistical and mathematical methods to decision-making, and the simulation of higher-order thinking through computer programs.
Depending on the storage and processing technologies used, four distinct stages of IT development can be distinguished: pre-mechanical (3000 BC – 1450 AD), mechanical (1450–1840), electromechanical (1840–1940), and electronic (1940–present). This entry focuses on the most recent period, which began around 1940.
History of computer technology
Devices that aid human memory and numerical calculation have been used for thousands of years. The earliest such aids were probably tally sticks.
The Antikythera mechanism, dating from about the beginning of the 1st century BC, is generally considered the earliest known analog computer and geared mechanism. Comparable geared mechanisms did not emerge in Europe until the 16th century, and it was not until 1645 that the first mechanical calculator capable of performing the four elementary arithmetic operations was developed.
Computers using relays or valves began to appear in the early 1940s. The electromechanical Zuse Z3, completed in 1941, was the world’s first programmable computer and, by modern criteria, one of the first machines that could be considered a complete computing machine. Colossus, developed in England during World War II to decrypt German messages, was the first electronic digital computer. Although programmable, it was not general-purpose, being designed to perform only a single task. It also lacked the ability to store its program in memory: programming was carried out using plugs and switches to alter the internal wiring. The first recognizably modern digital stored-program computer was the Small-Scale Experimental Machine (SSEM), which ran its first program on 21 June 1948.
The development of transistors in the late 1940s at Bell Labs allowed a new generation of computers to be designed with greatly reduced power consumption. The first commercially available stored-program computer, the Ferranti Mark I, contained 4,050 valves and consumed 25 kilowatts of power. By comparison, the first transistorized computer, developed at the University of Manchester and operational from November 1953, consumed only 150 watts in its final version.
Early electronic computers such as Colossus made use of punched tape, a long strip of paper on which data was represented by a series of holes, a technology now obsolete.
Electronic data storage, as used in modern computers, dates back to World War II, when a form of delay-line memory was developed to remove spurious echoes from radar signals; its first practical application was the mercury delay line. The first random-access digital storage device was the Williams tube, based on a standard cathode-ray tube, but the information stored in it, as in delay-line memory, was volatile: it had to be refreshed continuously and was lost once power was removed. The earliest form of non-volatile computer storage was the magnetic drum, invented in 1932 and used in the Ferranti Mark 1, the world’s first commercially available general-purpose electronic computer.
IBM introduced the first hard disk drive in 1956 as a component of its 305 RAMAC computer system. Most digital data today is still stored magnetically on hard disks, or optically on media such as CD-ROMs. Until 2002, most information was stored on analog devices, but that year digital storage capacity exceeded analog capacity for the first time. As of 2007, almost 94% of the data stored worldwide was held digitally: 52% on hard disks, 28% on optical devices, and 11% on digital magnetic tape. It has been estimated that the world’s capacity to store information on electronic devices grew from less than 3 exabytes in 1986 to 295 exabytes in 2007, roughly doubling every three years.
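The doubling time cited above can be checked from the two figures given (about 3 exabytes in 1986 and 295 exabytes in 2007), assuming simple exponential growth over those 21 years:

```python
import math

# Figures cited in the text: ~3 EB in 1986, 295 EB in 2007.
start_eb, end_eb = 3, 295
years = 2007 - 1986  # 21 years

# Under exponential growth, capacity = start * 2**(years / doubling_time),
# so doubling_time = years * ln(2) / ln(end / start).
doubling_time = years * math.log(2) / math.log(end_eb / start_eb)

print(f"Doubling time: {doubling_time:.1f} years")  # Doubling time: 3.2 years
```

This is consistent with the "roughly every three years" estimate in the text.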
Database management systems emerged in the 1960s to address the problem of storing and retrieving large amounts of data accurately and quickly. One of the earliest such systems was IBM’s Information Management System (IMS), which is still widely used more than 40 years later. IMS stores data hierarchically, but in the 1970s Ted Codd proposed an alternative relational storage model based on set theory and predicate logic and on the familiar concepts of tables, rows, and columns. Oracle made the first relational database management system (RDBMS) commercially available in 1980.
All relational database management systems consist of a number of components that together allow the data they store to be accessed simultaneously by many users while maintaining its integrity. A feature common to all databases is that the structure of the data they contain is defined and stored separately from the data itself, in a database schema.
In recent years, the Extensible Markup Language (XML) has become a popular format for data representation. Although XML data can be stored in ordinary file systems, it is commonly held in relational databases to take advantage of their ‘robust implementation, verified by years of both theoretical and practical effort’. As an evolution of the Standard Generalized Markup Language (SGML), XML’s text-based structure offers the advantage of being readable by both machines and humans.
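That dual readability can be illustrated with Python’s standard library: the same XML document a person can read directly is also parseable by a machine. The element names below (catalog, book, title) are invented for illustration, not part of any standard schema:

```python
import xml.etree.ElementTree as ET

# A human-readable XML document; element names are illustrative only.
doc = """<catalog>
    <book id="1"><title>Relational Databases</title></book>
    <book id="2"><title>Markup Languages</title></book>
</catalog>"""

# A machine extracts the same structure programmatically.
root = ET.fromstring(doc)
titles = [book.find("title").text for book in root.findall("book")]
print(titles)  # ['Relational Databases', 'Markup Languages']
```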
The relational database model introduced a declarative query language, the Structured Query Language (SQL), which is independent of any particular programming language and is based on relational algebra.
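A minimal sketch of the relational model and SQL, using Python’s built-in sqlite3 module (the table and column names here are illustrative), also shows the schema being stored separately from the data it describes:

```python
import sqlite3

# Create an in-memory relational database with one table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT, dept TEXT)")
conn.executemany("INSERT INTO employee (name, dept) VALUES (?, ?)",
                 [("Ada", "IT"), ("Grace", "IT"), ("Alan", "Research")])

# The schema is held separately from the data and can be inspected.
schema = conn.execute(
    "SELECT sql FROM sqlite_master WHERE name = 'employee'").fetchone()[0]

# A declarative SQL query, independent of the host programming language.
rows = conn.execute(
    "SELECT name FROM employee WHERE dept = 'IT' ORDER BY name").fetchall()
print(rows)  # [('Ada',), ('Grace',)]
conn.close()
```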
The terms ‘data’ and ‘information’ are not synonymous. Anything stored is data, but it becomes information only when it is organized and presented in a meaningful way. Most of the world’s digital data is unstructured and stored in a variety of physical formats, even within a single organization. Data warehouses began to be developed in the 1980s to integrate these disparate stores. They typically contain data extracted from various sources, including external sources such as the Internet, organized so as to facilitate decision support systems (DSS).
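The data/information distinction can be sketched in a few lines: the raw records below (invented for illustration) are merely stored facts, and only the grouped summary presents them meaningfully, in the spirit of a decision-support report:

```python
from collections import Counter

# Raw stored records (date, region, amount): data, not yet information.
raw_data = [
    ("2023-01-05", "north", 120),
    ("2023-01-06", "south", 80),
    ("2023-01-07", "north", 200),
]

# Organizing and summarizing the records turns them into information.
revenue_by_region = Counter()
for _date, region, amount in raw_data:
    revenue_by_region[region] += amount

print(dict(revenue_by_region))  # {'north': 320, 'south': 80}
```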
Data transmission has three aspects: transmission, propagation, and reception. It can broadly be categorized as broadcasting, in which information is transmitted one way, downstream, or telecommunications, with two-way upstream and downstream channels.
XML has increasingly been used as a means of exchanging data since the early 2000s, particularly for machine-oriented interactions such as those in web-oriented protocols such as SOAP, which describe ‘data-in-transit rather than … data-at-rest’. One of the challenges of this use is converting data from relational databases into XML Document Object Model (DOM) structures.
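One direction of that conversion, serializing relational rows into an XML tree for exchange, can be sketched with the standard library; the table and element names are invented for illustration:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Relational source: a small in-memory table (names illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE product (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO product VALUES (?, ?)",
                 [(1, "router"), (2, "server")])

# Build an XML tree from the rows, one element per row.
root = ET.Element("products")
for pid, name in conn.execute("SELECT id, name FROM product ORDER BY id"):
    item = ET.SubElement(root, "product", id=str(pid))
    item.text = name

xml_payload = ET.tostring(root, encoding="unicode")
print(xml_payload)
conn.close()
```

The resulting string can then be handed to any XML-aware consumer, such as a SOAP endpoint.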
Hilbert and López identify the exponential pace of technological change (a kind of Moore’s law): machines’ application-specific capacity to process information per capita roughly doubled every 14 months between 1986 and 2007; the capacity of the world’s general-purpose computers doubled every 18 months during the same two decades; global telecommunications capacity per capita doubled every 34 months; per capita storage capacity took roughly 40 months to double (about every three years); and the information transmitted per capita doubled every 12.3 years.
In an academic context, the Association for Computing Machinery defines IT as ‘undergraduate degree programs that prepare students to meet the information technology needs of companies, public administration, health care, schools and other types of organizations … IT specialists take responsibility for selecting the appropriate hardware and software products for an organization, integrating those products with organizational needs and infrastructure, and installing, customizing and maintaining those applications for the organization’s computer users.’
Business and employment perspective
In a commercial context, the Information Technology Association of America defined information technology as ‘the study, design, development, application, implementation, support or management of computer-based information systems’. The responsibilities of those working in the field include network administration, software development and installation, and the planning and lifecycle management of an organization’s technology, under which hardware and software are maintained, upgraded, and replaced.
The commercial value of information technology lies in automating business processes, providing information for decision-making, connecting businesses with their customers, and supplying productivity tools that help users work more efficiently.