The term computing comes from the French informatique, coined by the engineer Philippe Dreyfus in the early 1960s. The word is, in turn, a contraction of information and automatique (automatic).

Computing thus refers to the automatic processing of information through electronic devices and computer systems. A computer system must be able to perform three basic tasks: input (gathering information), processing, and output (transmitting results). The set of these three tasks is known as an algorithm.
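The input–processing–output cycle described above can be sketched in a few lines of Python. This is only an illustrative example, not part of the original article: the function names and the sample data are invented for the sketch, with hard-coded values standing in for real user input.

```python
# Minimal sketch of the three basic tasks of a computer system:
# input (gather information), processing (transform it), output (report results).
# The data below is hypothetical sample data standing in for real user entry.

def gather_input():
    # Input: collect raw data.
    return [3, 7, 12, 5]

def process(values):
    # Processing: compute a result from the gathered data (here, the average).
    return sum(values) / len(values)

def output(result):
    # Output: transmit the result (here, by printing it).
    print(f"Average: {result}")

data = gather_input()
result = process(data)
output(result)
```

Taken together, the three steps form a simple algorithm in the sense the article describes: a fixed sequence of operations that turns input into a result.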


Computer science provides solutions in multiple areas.

Uses of computing

Computer science brings together many of the techniques that humans have developed to enhance their thinking, memory and communication skills. Its area of application has no limits: computing is used in business management, information storage, process control, communications, transportation, medicine and many other sectors.

Informatics also covers the main foundations of computer science, such as programming for software development, computer and hardware architecture, networks such as the Internet, and artificial intelligence. It is even applied in various areas of electronics.

History and evolution

The first fully automatic programmable machine in history is considered to be the Z3 computer, designed by the German scientist Konrad Zuse in 1941. This machine weighed 1,000 kilograms and took three seconds to perform a multiplication or division; addition or subtraction, on the other hand, took 0.7 seconds.


Hardware is an important part of computing.

The evolution of computing in recent decades is not as interesting as that of its users, many of whom went from a state of indifference to one of absolute dependence on the technology. Needless to say, there are nuances in this story: using a computer or a mobile phone does not make us experts, but it is better than standing aside from this phenomenon simply for lack of will to understand its potential.

Computing was a real revolution that began in the late 1970s with personal computers, gained momentum during the 1980s and exploded in the 1990s. We are therefore facing a process that took about two decades to fully flourish, although its level of mass adoption increased dramatically in the early 21st century, when it finally brought together three crucial elements of computing: video games, the Internet and mobile phones. Indeed, portable devices played a key role in the difficult task of convincing the most skeptical to break down their barriers and start enjoying computing.

The importance of computing

Any use of a program to automate our activities can be included in the category of computing, regardless of the device it runs on. Thanks to the spread of mobile phones and, later, tablets, many people dared to take their first steps with email, instant messaging and writing documents in digital format, later moving on to computers, particularly notebooks, and finally taking advantage of benefits that had been available for years.

In the workplace, computer skills can be key to accessing certain job offers, although this is sometimes distorted by demands for formal academic qualifications. This requirement is so common that thousands of training institutions offer official certification upon completion of their courses. Computer science underlies almost all the tasks carried out in most companies, because it allows us to organize and control them in a more orderly and efficient way than pencil and paper.