The trick the computer uses to be so productive is to divide its attention among a number of tasks - and for this it uses interrupts. But what exactly is an interrupt, and how should programmers think about this essentially hardware-based idea?
Hashing is arguably one of the great ideas of computing and it has hidden depths. Extensible hashing and perfect hashing are ideas that are worth exploring.
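As a taste of the basic idea - not the extensible or perfect variants - here is a minimal sketch of a hash table with chaining in Python; the table size and the use of Python's built-in hash are illustrative assumptions, not anything from the article:

    # A minimal chained hash table - illustrative only.
    class HashTable:
        def __init__(self, size=16):              # size chosen for illustration
            self.buckets = [[] for _ in range(size)]

        def _index(self, key):
            return hash(key) % len(self.buckets)  # map a key to a bucket

        def put(self, key, value):
            bucket = self.buckets[self._index(key)]
            for i, (k, _) in enumerate(bucket):
                if k == key:                      # replace an existing entry
                    bucket[i] = (key, value)
                    return
            bucket.append((key, value))           # collisions share a bucket

        def get(self, key):
            for k, v in self.buckets[self._index(key)]:
                if k == key:
                    return v
            raise KeyError(key)

    table = HashTable()
    table.put("apple", 1)
    print(table.get("apple"))   # 1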
The Kinect is currently the hardware that provides developers with the greatest opportunities for innovative programs - both games and "serious" applications. How does it work? How do you use it? What can you use it for?
The search for intelligent machines started long before the computer was invented, and AI has many different strands. So many that it can be difficult to see what it is trying to do or what it is for. We already have an easy way to create intelligent beings from scratch, so why do we need another one?
The instructions that most computers recognize are too simple for humans to be bothered with - and so we invented assembly language. Find out how it works and how it started the whole movement to abstract away from the computer's hardware.
Binary arithmetic is easy - so easy a computer can do it - but what about negative numbers? These are altogether more tricky, and it isn't just a matter of putting a negative sign in front of the number - although that is one way to do it.
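To see the difference, here is a quick Python sketch contrasting the sign-bit (sign-and-magnitude) idea with two's complement; the 8-bit word size is just an assumption for the example:

    BITS = 8  # assumed word size for this example

    def twos_complement(n, bits=BITS):
        # Encode an integer as an unsigned bit pattern of the given width.
        return n & ((1 << bits) - 1)

    def sign_magnitude(n, bits=BITS):
        # The "negative sign in front" idea: the top bit records the sign.
        if n < 0:
            return (1 << (bits - 1)) | (-n)
        return n

    print(format(sign_magnitude(-5), "08b"))    # 10000101
    print(format(twos_complement(-5), "08b"))   # 11111011
    # Two's complement lets ordinary binary addition handle negatives:
    print(format((twos_complement(-5) + 5) & 0xFF, "08b"))  # 00000000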
Do you know binary? There are only 10 possible answers, and even if yours is a 1 it's still fun to consider the wider concepts.
You don't often encounter the BIOS any more, but when you do it is usually something very messy and unpleasant. What is the BIOS and why do we need it?
Buses are everywhere and, yes, when you are looking for one they tend to come in threes! With that joke out of the way, let’s take a look at what a bus is, in general and in particular.
The caching principle is very general, but it is best known for its use in speeding up the CPU. We take a look at the basics of cache memory - how it works and what governs how big it needs to be to do its job.
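The same principle works in software too; here is a small memoizing cache sketch in Python - a dictionary standing in for fast memory - to illustrate the idea, not how a hardware cache is actually built:

    import time

    cache = {}  # a small, fast "memory" sitting in front of slow work

    def slow_square(n):
        time.sleep(0.1)     # stands in for a slow memory access
        return n * n

    def cached_square(n):
        if n not in cache:              # miss: do the slow work once
            cache[n] = slow_square(n)
        return cache[n]                 # hit: the answer comes back fast

    cached_square(12)   # miss - takes about 0.1s
    cached_square(12)   # hit - effectively instant
    print(cache)        # {12: 144}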
You may know about Cellular Automata - if not by name, you may have come across them in John Conway's Game of Life - but why is this whole subject so interesting? We take a look not only at what a CA is, but at why it is so important.
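To give a flavour of how simple the rules are, here is a minimal Python sketch of one generation of Life; the glider pattern is just an illustrative starting point:

    from collections import Counter

    def life_step(cells):
        # One generation of Life; cells is a set of live (x, y) pairs.
        counts = Counter((x + dx, y + dy)
                         for x, y in cells
                         for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                         if (dx, dy) != (0, 0))
        # A cell is live next turn with 3 neighbours, or 2 if already live.
        return {c for c, n in counts.items()
                if n == 3 or (n == 2 and c in cells)}

    glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
    for _ in range(4):        # after 4 steps the glider has moved diagonally
        glider = life_step(glider)
    print(sorted(glider))     # the same shape, shifted by (1, 1)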
Theories of how we should organize databases are thin on the ground. The one exception is the work of E.F. Codd, the originator of the commandment-like "Codd’s Rules". This approach to databases has been codified into SQL - Structured Query Language - and so into most of the databases on the planet, despite what the NoSQL movement might want you to think. So what are Codd's Rules and what is a relational database?
A software Easter Egg is a novelty or message intentionally hidden within a program or application, usually for personal reasons. We take a look at its history and original motivation, and see how things changed when Googlers expanded the tradition.
The distinction between a compiler and an interpreter is one that can cause controversy. One programmer's compiler is another's interpreter and the whole subject gets very murky when you throw in the idea of the Virtual Machine and Just In Time compilation. So what is it all about?
A lightning guide to the basic ideas of computational complexity without the maths or the proofs. It's almost more fun than programming!
Copy protection - or Digital Rights Management (DRM) in general - is something that users mostly hate and the entertainment industry really likes.
One of the most important lossless forms of compression is the LZW dictionary-based method. It turns up in compression utilities such as Unix compress and in GIF and TIFF format files. It is also an important idea in programming and you really do need to know something about how it works - if only to avoid reinventing it from scratch.
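As a taste of the dictionary idea, here is a minimal Python sketch of LZW compression - codes are emitted as plain integers, with none of the bit-packing a real implementation needs:

    def lzw_compress(data):
        # Grow a dictionary of strings seen so far, emitting their codes.
        table = {chr(i): i for i in range(256)}    # start with single bytes
        current, output = "", []
        for ch in data:
            if current + ch in table:
                current += ch                      # extend the current match
            else:
                output.append(table[current])      # emit longest known string
                table[current + ch] = len(table)   # learn a new string
                current = ch
        if current:
            output.append(table[current])
        return output

    print(lzw_compress("TOBEORNOTTOBEORTOBEORNOT"))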
Classic data structures produce classic tutorials. In this edition of Babbage's Bag we investigate the advanced ecology of trees - perfectly balanced trees, AVL trees and B-Trees.
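As a hint of what "balanced" means here, this small Python sketch checks the AVL condition - that the heights of a node's two subtrees never differ by more than one; the tiny example trees are illustrative only:

    class Node:
        def __init__(self, value, left=None, right=None):
            self.value, self.left, self.right = value, left, right

    def height(node):
        if node is None:
            return 0
        return 1 + max(height(node.left), height(node.right))

    def is_avl(node):
        # AVL condition: subtree heights differ by at most 1, everywhere.
        if node is None:
            return True
        return (abs(height(node.left) - height(node.right)) <= 1
                and is_avl(node.left) and is_avl(node.right))

    balanced = Node(2, Node(1), Node(3))
    leaning = Node(1, None, Node(2, None, Node(3)))
    print(is_avl(balanced), is_avl(leaning))   # True False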
Part II of our look at data takes us into more sophisticated structures that are fundamental to computing - stacks, queues, deques and trees. If you don't know about these four then you are going to find programming tough and you will have to reinvent the wheel to solve otherwise simple problems.
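In Python, for example, a list and collections.deque give you the first three almost for free - a quick sketch, not a substitute for understanding the structures:

    from collections import deque

    stack = []                    # stack: last in, first out
    stack.append("a")
    stack.append("b")
    print(stack.pop())            # b - the most recent item

    queue = deque()               # queue: first in, first out
    queue.append("a")
    queue.append("b")
    print(queue.popleft())        # a - the oldest item

    d = deque()                   # deque: add and remove at either end
    d.appendleft("front")
    d.append("back")
    print(d.pop(), d.popleft())   # back front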
Dates and times follow their own regularities, which have nothing at all to do with binary, or even simple decimal, counting. First, clock and watch makers had to find ways of working with hours, minutes and seconds; then programmers had to find much simpler ways of doing the same. Join us on a quick tour of the time and date system and how it can be mastered using the mod function.
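As a taste of the mod trick, here is a quick Python sketch that turns a raw count of seconds into hours, minutes and seconds; the starting value is arbitrary:

    total_seconds = 7384                     # arbitrary example value

    hours = total_seconds // 3600            # whole hours
    minutes = (total_seconds % 3600) // 60   # remainder, in whole minutes
    seconds = total_seconds % 60             # what is left over

    print(f"{hours:02d}:{minutes:02d}:{seconds:02d}")   # 02:03:04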