When you’re strapping on the latest smart watch or ogling an iPhone, you probably aren’t thinking of Moore’s Law, which for 50 years has been used as a blueprint to make computers smaller, cheaper and faster.

Without Moore’s Law it’s quite possible that new types of computers like Microsoft’s HoloLens, a holographic headset that lets users interact with floating images, would not have been developed. For decades, Moore’s Law has been a guiding star for the development of modern electronics, though in recent years its relevance has been subject to debate.

Moore’s Law isn’t a scientific theory, but a set of observations and predictions made by Intel co-founder Gordon Moore in an article first published in Electronics magazine on April 19, 1965, and later revised. His core prediction states that transistor density, or the number of transistors on a given die area, would double every two years, roughly doubling performance each time. Loosely translated, that means in 18 to 24 months you could buy a computer that is significantly faster than what you have today, for the same amount of money.
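The compounding effect of that two-year doubling is easy to underestimate. A minimal sketch of the arithmetic, using the roughly 2,300 transistors of Intel's 1971 4004 chip as a starting point (an example figure, not one cited in the article):

```python
# Illustrative sketch of Moore's Law growth: transistor count
# doubling once every two-year period. The starting count and
# time span are example values, not figures from the article.

def projected_transistors(initial_count: int, years: int,
                          doubling_period: int = 2) -> int:
    """Project a transistor count assuming one doubling per period."""
    return initial_count * 2 ** (years // doubling_period)

# Projecting 20 years forward from ~2,300 transistors (Intel 4004, 1971):
# ten doublings give 2300 * 2**10 = 2,355,200 transistors.
print(projected_transistors(2300, 20))
```

Ten doublings multiply the count by 1,024, which is why a prediction that sounds modest over two years reshapes an industry over two decades.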