What is information technology?
Information technology (IT) is the use of computers, storage, networking and other physical devices, infrastructure and processes to create, process, store, secure and exchange all forms of electronic data. Typically, IT is used in the context of business operations, as opposed to the technology used for personal or entertainment purposes. The commercial use of IT encompasses both computer technology and telecommunications.
The term information technology was coined in a 1958 Harvard Business Review article to distinguish between purpose-built machines designed to perform a limited scope of functions and general-purpose computing machines that could be programmed for various tasks. As the IT industry evolved from the mid-20th century, computing capability increased while device cost and energy consumption decreased, a cycle that continues today as new technologies emerge.
Types of information technology
Information technology encompasses a wide range of technologies and systems that are used to store, retrieve, process and transmit data for specific use cases.
Common information technology types include the following:
- Internet and web technologies. This includes the tools and protocols used to access, navigate and interact with information on the internet. Examples include web browsers, websites, web servers, Hypertext Markup Language (HTML), Cascading Style Sheets (CSS), JavaScript, HTTP and other internet-related technologies.
- Cloud computing. This involves the delivery of computing resources and services over the internet on a pay-per-use basis. This can include infrastructure as a service, platform as a service, software as a service and cloud storage options.
- Databases. This includes IT systems and software used to store, organize and retrieve data. Examples include relational database management systems such as MySQL and NoSQL databases such as MongoDB (a minimal SQLite sketch follows this list).
- Artificial intelligence and machine learning. AI and ML-based IT technologies use algorithms and statistical models to enable computers to perform tasks that typically require human intelligence. Examples include speech recognition, image recognition and natural language processing.
- Cybersecurity. This type of IT includes technologies and best practices designed to protect IT systems, networks and data from unauthorized access, cyber attacks and other security threats. Cybersecurity can be enforced through firewalls, antivirus software, encryption, intrusion detection systems and security policies (a password-hashing sketch follows this list).
- Internet of things. This includes the network of interconnected devices and sensors that collect, exchange and analyze data. IoT technologies enable the integration of physical objects into computer systems, providing automation, monitoring and control in various domains.
- IT governance. This involves establishing the policies, standards and rules that ensure an organization's IT systems are operated effectively and in line with its objectives.
- Data analytics and business intelligence. These focus on tools and techniques for extracting insights from large data sets to support decision-making and business operations.
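As a rough illustration of the "store, organize and retrieve" role described under Databases above, the sketch below uses Python's built-in sqlite3 module as a lightweight stand-in for a server-based relational database such as MySQL; the table and column names are invented for the example.

```python
import sqlite3

# An in-memory database keeps the example self-contained; a production system
# would typically use a server-based RDBMS such as MySQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, dept TEXT)")

# Store: insert a few rows.
conn.executemany(
    "INSERT INTO employees (name, dept) VALUES (?, ?)",
    [("Ada", "Engineering"), ("Grace", "Engineering"), ("Alan", "Research")],
)
conn.commit()

# Retrieve: query the organized data back out.
for (name,) in conn.execute(
    "SELECT name FROM employees WHERE dept = ? ORDER BY name", ("Engineering",)
):
    print(name)  # prints "Ada" then "Grace"

conn.close()
```

The parameterized queries (the ? placeholders) also hint at why databases and security overlap: passing values separately from the SQL text is the standard defence against SQL injection.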
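The cybersecurity item above mentions encryption and security policies; one everyday instance is storing only salted password hashes rather than plaintext passwords. This is a minimal sketch using Python's standard hashlib and hmac modules; the helper names and the 200,000-iteration count are illustrative choices, not a prescription.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted PBKDF2 hash so the plaintext never needs to be stored."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password, salt, stored_digest):
    """Recompute the hash for a login attempt and compare in constant time."""
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, stored_digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```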
Technology is the application of conceptual knowledge to achieve practical goals, especially in a reproducible way. The word technology can also mean the products resulting from such efforts, including both tangible tools such as utensils or machines, and intangible ones such as software. Technology plays a critical role in science, engineering, and everyday life.
Relation to science and engineering
Engineering is the process by which technology is developed. It often requires problem-solving under strict constraints.
Technological development is "action-oriented", while scientific knowledge is fundamentally explanatory. Polish philosopher Henryk Skolimowski framed it like so: "science concerns itself with what is, technology with what is to be."
The direction of causality between scientific discovery and technological innovation has been debated by scientists, philosophers and policymakers. Because innovation is often undertaken at the edge of scientific knowledge, most technologies are not derived from scientific knowledge, but instead from engineering, tinkering and chance. For example, in the 1940s and 1950s, when knowledge of turbulent combustion or fluid dynamics was still crude, jet engines were invented through "running the device to destruction, analyzing what broke and repeating the process". Scientific explanations often follow technological developments rather than preceding them. Many discoveries also arose from pure chance, like the discovery of penicillin as a result of accidental lab contamination.
Since the 1960s, the assumption that government funding of basic research would lead to the discovery of marketable technologies has lost credibility.
Probabilist Nassim Taleb argues that national research programs that implement the notions of serendipity and convexity through frequent trial and error are more likely to lead to useful innovations than research that aims to reach specific outcomes.
Despite this, modern technology is increasingly reliant on deep, domain-specific scientific knowledge. In 1975, there was an average of one citation of scientific literature in every three patents granted in the U.S.; by 1989, this increased to an average of one citation per patent. The average was skewed upwards by patents related to the pharmaceutical industry, chemistry, and electronics.
A 2021 analysis shows that patents based on scientific discoveries are on average 26% more valuable than equivalent non-science-based patents.
Emerging technologies
Emerging technologies are novel technologies whose development or practical applications are still largely unrealized. They include nanotechnology, biotechnology, robotics, 3D printing, blockchains, and artificial intelligence.
In 2005, futurist Ray Kurzweil claimed the next technological revolution would rest upon advances in genetics, nanotechnology, and robotics, with robotics being the most impactful of the three technologies.
Genetic engineering will allow far greater control over human biological nature through a process called directed evolution. Some thinkers believe that this may shatter our sense of self, and have urged renewed public debate exploring the issue more thoroughly; others fear that directed evolution could lead to eugenics or extreme social inequality. Nanotechnology will grant us the ability to manipulate matter "at the molecular and atomic scale", which could allow us to reshape ourselves and our environment in fundamental ways.
Nanobots could be used within the human body to destroy cancer cells or form new body parts, blurring the line between biology and technology.
Autonomous robots have undergone rapid progress, and are expected to replace humans at many dangerous tasks, including search and rescue, bomb disposal, firefighting, and war.
Estimates on the advent of artificial general intelligence vary, but half of machine learning experts surveyed in 2018 believe that AI will "accomplish every task better and more cheaply" than humans by 2063, and automate all human jobs by 2140. This expected technological unemployment has led to calls for increased emphasis on computer science education and debates about universal basic income. Political science experts predict that this could lead to a rise in extremism, while others see it as an opportunity to usher in a post-scarcity economy.
Other animal species
The use of basic technology is also a feature of non-human animal species. Tool use was once considered a defining characteristic of the genus Homo. This view was supplanted after the discovery of evidence of tool use among chimpanzees and other primates, dolphins, and crows. For example, researchers have observed wild chimpanzees using basic foraging tools such as pestles and levers, using leaves as sponges, and using tree bark or vines as probes to fish for termites. West African chimpanzees use stone hammers and anvils for cracking nuts, as do the capuchin monkeys of Boa Vista, Brazil. Tool use is not the only form of animal technology; beaver dams, for example, built with wooden sticks or large stones, are a technology with "dramatic" impacts on river habitats and ecosystems.
Futures studies
Futures studies is the systematic and interdisciplinary study of social and technological progress. It aims to quantitatively and qualitatively explore the range of plausible futures and to incorporate human values in the development of new technologies. More generally, futures researchers are interested in improving "the freedom and welfare of humankind".
It relies on a thorough quantitative and qualitative analysis of past and present technological trends, and attempts to rigorously extrapolate them into the future. Science fiction is often used as a source of ideas. Futures research methodologies include survey research, modeling, statistical analysis, and computer simulations.
Philosophy of technology
Philosophy of technology is a branch of philosophy that studies the "practice of designing and creating artifacts" and the "nature of the things so created".
It emerged as a discipline over the past two centuries, and has grown "considerably" since the 1970s.
The humanities philosophy of technology is concerned with the "meaning of technology for, and its impact on, society and culture".
Initially, technology was seen as an extension of the human organism that replicated or amplified bodily and mental faculties. Marx framed it as a tool used by capitalists to oppress the proletariat, but believed that technology would be a fundamentally liberating force once it was "freed from societal deformations". Second-wave philosophers like Ortega later shifted their focus from economics and politics to "daily life and living in a techno-material culture", arguing that technology could oppress "even the members of the bourgeoisie who were its ostensible masters and possessors." Third-stage philosophers like Don Ihde and Albert Borgmann represent a turn toward de-generalization and empiricism, and considered how humans can learn to live with technology.
Early scholarship on technology was split between two arguments: technological determinism, and social construction. Technological determinism is the idea that technologies cause unavoidable social changes. It usually encompasses a related argument, technological autonomy, which asserts that technological progress follows a natural progression and cannot be prevented.
Social constructivists argue that technologies follow no natural progression and are shaped by cultural values, laws, politics, and economic incentives. Modern scholarship has shifted towards an analysis of sociotechnical systems, "assemblages of things, people, practices, and meanings", looking at the value judgments that shape technology.
Cultural critic Neil Postman distinguished tool-using societies from technological societies and from what he called "technopolies", societies that are dominated by an ideology of technological and scientific progress to the detriment of other cultural practices, values, and world views.
Herbert Marcuse and John Zerzan suggest that technological society will inevitably deprive us of our freedom and psychological health.
History
Tools were initially developed by hominids through observation and trial and error.
Around 2 Mya (million years ago), they learned to make the first stone tools by hammering flakes off a pebble, forming a sharp hand axe. This practice was refined 75 kya (thousand years ago) into pressure flaking, enabling much finer work.
The discovery of fire was described by Charles Darwin as "possibly the greatest ever made by man". Archaeological, dietary, and social evidence point to "continuous [human] fire-use" at least 1.5 Mya. Fire, fueled with wood and charcoal, allowed early humans to cook their food to increase its digestibility, improving its nutrient value and broadening the number of foods that could be eaten.
The cooking hypothesis proposes that the ability to cook promoted an increase in hominid brain size, though some researchers find the evidence inconclusive. Archaeological evidence of hearths was dated to 790 kya; researchers believe this is likely to have intensified human socialization and may have contributed to the emergence of language.