
Computers

Far too often, I hear people ask questions like "How am I supposed to know where to put ends for statement blocks?" or "Why can't the computer just infer that I want this piece of code to do something else?". These questions reflect something the computer manufacturing industry has done quite successfully: it has sold computers as smart, general-purpose platforms, which is an essential "feature" for about 99% of the computer users out there.

But programmers don't usually get to work under the assumption that a computer is smart. To fully understand programming, we need to start from the fundamentals: a computer is nothing more than a dumb calculator.

This is a very broad topic, but even so, I'll try to keep this article as straightforward as possible.

1. First of all, what exactly is a computer?

Well, okay, fine: the machine on your desk is indeed a computer; in fact, it's more than just a computer. But that is not what a programmer actually worries about (most of the time).

A computer is really nothing more than a calculator. By definition, a computer is a complex system of non-computational units that work together to do… well, computation.

In this sense of the word, many objects satisfy that definition: a rope with knots tied into it, a living cell, even the entire universe itself can be seen as a gigantic computational machine. But here on Earth, having no room for entire galaxies, we humans usually just hook up complex systems of transistors with wires and solder.

Disregarding the components of the computer that handle IO and memory, we're left with two pieces of engineering marvel: the CPU's control unit and its ALU.

At an abstract level, the control unit directs the memory component to fetch the next instruction and hand it over; it then maps that instruction to its defined behavior and executes it (load a value into a register, push a register onto the stack, jump to a different address, etc.).
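To make that loop concrete, here is a toy fetch-decode-execute cycle sketched in Lua. The three-instruction "ISA" (LOAD, ADD, HALT) and the table-based memory are made up purely for illustration; a real CPU chews on binary-encoded instructions, not Lua tables.

-- A toy fetch-decode-execute loop; the instruction set and memory layout are invented.
local memory = {
    {op = "LOAD", reg = "A", value = 2},   -- put 2 into register A
    {op = "LOAD", reg = "B", value = 3},   -- put 3 into register B
    {op = "ADD",  reg = "A", from = "B"},  -- A = A + B (the ALU's job)
    {op = "HALT"},
}

local registers = {A = 0, B = 0}
local pc = 1                               -- program counter

while true do
    local instr = memory[pc]               -- fetch
    pc = pc + 1
    if instr.op == "LOAD" then             -- decode and execute
        registers[instr.reg] = instr.value
    elseif instr.op == "ADD" then
        registers[instr.reg] = registers[instr.reg] + registers[instr.from]
    elseif instr.op == "HALT" then
        break
    end
end

print(registers.A)                         -- 5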

The ALU, on the other hand, handles all computation, including bitwise operations as well as integer arithmetic. To do this, the ALU accepts its data as HI/LO voltage levels (representing the boolean values 1 and 0 respectively) and pushes them through a series of logic gates. Since the ALU works on whole integers at a time, it has pins (think of them as a function's arguments and return value) for at least 3*16 bits on a 16-bit machine, roughly two operands plus a result; on 32-bit systems this becomes 3*32, and so on, depending on the word size of the operands. These compositions of logic gates then implement simple arithmetic: a one-bit ADD, for example, is an XOR gate for the sum plus an AND gate for the carry, SUB is similar with a borrow, and so on. Every instruction is then just a bunch of electrons bouncing across multiple transistors.
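If you want to see roughly what that gate arithmetic looks like, here is a one-bit full adder sketched in Lua, with booleans standing in for the HI/LO levels. The gate functions are just illustrative stand-ins for real hardware, not anything the ALU literally runs.

-- One-bit full adder built from logic gates; true/false stand in for HI/LO.
local function XOR(a, b) return a ~= b end
local function AND(a, b) return a and b end
local function OR(a, b)  return a or b end

local function full_adder(a, b, carry_in)
    local partial   = XOR(a, b)
    local sum       = XOR(partial, carry_in)
    local carry_out = OR(AND(a, b), AND(partial, carry_in))
    return sum, carry_out
end

print(full_adder(true, true, false))  -- false   true   (1 + 1 = 0, carry 1)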

Trivia: the NAND gate is functionally complete, meaning every other logic gate in a modern chip can be (and often is) built out of nothing but NAND gates.
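A quick way to convince yourself of that trivia: this little Lua sketch derives NOT, AND and OR from nothing but a NAND function (again, booleans standing in for voltages).

-- Everything below is built from a single NAND function.
local function NAND(a, b) return not (a and b) end

local function NOT(a)    return NAND(a, a) end
local function AND(a, b) return NOT(NAND(a, b)) end
local function OR(a, b)  return NAND(NOT(a), NOT(b)) end

print(NOT(true), AND(true, false), OR(true, false))  -- false   false   true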

So what does all of this have to do with programming? Well, it just illustrates that computers are basically dumb calculators. Even in higher level languages, it’s impossible for the computer itself to infer what the programmer wants. For example, given the following code in Lua:

function text()
    print "Hello World!"
    if 2+2 == 5 then
        return
    end
-- Why do I need another end?

The computer can't tell that the programmer wants the function block to end after the return statement, because that return might sit inside an if block or a loop, or the function might not return a value at all. Since there are legitimate uses for embedding returns in nested blocks, we must explicitly tell Lua where this function ends. What this means is that for every statement block there must be exactly one corresponding end, as shown below.
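Here is the same function with every block explicitly closed; the second end is the one Lua was waiting for.

function text()
    print "Hello World!"
    if 2+2 == 5 then
        return
    end -- closes the if block
end     -- closes the function block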

At the same time, higher level languages hide most of the low level interactions with the machine from the user in order to simplify the experience. For example, the following C code creates a contiguous space in memory that can be indexed as an array:

#include <stdlib.h>

char* bytes = (char*)malloc(0xff);    // creates an "array" 255 bytes long
bytes[1] = 0x00;
// We need to grow this array (real code should check malloc/realloc for NULL)
bytes = (char*)realloc(bytes, 0x201); // now 513 bytes long
bytes[0x200] = 0x00;

Whereas in a language such as Lua, we never need to worry about the size of our tables:

t = {}  -- named t so we don't shadow Lua's standard table library
t[1] = 0x00
t[0x200] = 0x00 -- no manual reallocation needed; Lua grows the table and manages the memory for us

With that said, a little bit of knowledge and common sense can go a long way.

Posted By Lee
