The computer was designed by its inventors as a universal machine: a machine able to execute any process, provided that process can be described.
This is a very peculiar property. If you can exhaustively describe a system on a sheet of paper, that system can be embodied, implemented, in a computer.
The description of the system is called an algorithm. Its implementation in a computer is a computer program. When the program is started, the system functions; the system is stopped by stopping the program.
Algorithms can be written in any language. But for a computer to understand an algorithm, it must be expressed in a computer language. There are many computer languages (Fortran, Pascal, Lisp, C, C++, Java, DHTML, BASIC, Lingo, etc.). These are textual languages, made of words. There are also visual languages, made of a graphical vocabulary, such as Max, NeMo or BigEye.
Computer programs are also called applications, software, and so on. The browser with which you are reading this text is a program that contains a large number of algorithms. These algorithms have been invented by theoreticians and implemented by programmers (who are often one and the same person).
An algorithm consists of rules, operations, and memory, and, in computers, usually of inputs and outputs as well. The operations transform the values of the memory, the inputs and the outputs. The rules define which operations to execute, depending on conditions on the values of the inputs and the memory.
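These four parts can be seen in a tiny sketch. The thermostat below is a hypothetical example, not drawn from the text: its internal state is the memory, the temperature readings are the inputs, the assignments are the operations, and the conditions decide which operation runs.

```python
def thermostat(readings, target=20):
    """Rules choose which operation to execute, based on input and memory."""
    memory = {"heater_on": False}            # memory: internal state
    outputs = []
    for temperature in readings:             # input: one reading at a time
        if temperature < target:             # rule: condition on the input value
            memory["heater_on"] = True       # operation: transform the memory
        elif temperature > target:           # rule: another condition
            memory["heater_on"] = False      # operation: transform the memory
        outputs.append(memory["heater_on"])  # output: the state after each reading
    return outputs

print(thermostat([18, 19, 21, 22, 19]))
# → [True, True, False, False, True]
```

Notice that the rules say nothing about *how* to heat; they only decide, from inputs and memory, which operation applies next.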
Algorithms are fundamentally deterministic and know no randomness. But randomness can be simulated, or the program can draw on external reality in order to introduce randomness into the algorithm.
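Both routes can be illustrated with Python's standard `random` module. A pseudorandom generator is itself deterministic: given the same seed, it always produces the same sequence. Seeding it from something outside the program, such as the current clock time, is one way of "drawing on external reality".

```python
import random
import time

# Simulated randomness: two generators with the same seed are
# deterministic and produce identical sequences.
a = random.Random(42)
b = random.Random(42)
seq_a = [a.random() for _ in range(5)]
seq_b = [b.random() for _ in range(5)]
print(seq_a == seq_b)  # → True

# Drawing on external reality: seeding from the clock makes the
# sequence unpredictable from the program text alone.
c = random.Random(time.time_ns())
print(c.random())  # a different value on (almost) every run
```

The point is that the "randomness" in the first case is entirely inside the algorithm, while in the second it enters from outside it.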
A program is thus a dynamic object, which an image, a sound, a movie or a sculpture is not. A program acts and reacts, according to its internal states or in relation to its environment: it has a behavior. This dynamic property, along with the universal malleability of the algorithm, is the reason why the computer is a radically new medium of creation and expression.