Since other people have been busy disproving things, I feel I should do my bit, so:
Just to be picky: the first semi-programmable digital computer (i.e. an electronic box which took in data and spat out different data), the Colossus, wasn't built until 1943. It took up at least one room, and ten were made. Eventually changes in technology (transistors etc.) shrank the hardware, yet even today some people still need computers larger than Colossus (e.g. supercomputers or servers). As the technology allows shrinkage, demands go up, so the kind of machine that only ten people need stays at least the same size.

The same happens with consumer computers. When it came out, Crysis was more than any hardware could handle well, yet today it runs at ridiculously high settings and very high FPS on not much more than mediocre hardware. Despite this, Battlefield 3 won't run at those framerates on its highest settings, because it's designed for the computers of its own time, not Crysis's. Before long, laptops will handle it as well as today's best desktops, but by then there will be some other game or emulator that only runs perfectly on the best hardware. Eventually the gap between desktop and laptop will shrink (e.g. graphene may make it possible to run a CPU extremely hot without breaking it, in which case a laptop won't have to dissipate 2500 W), but until a cheap laptop can calculate the location of every atom, photon and neutrino in the universe in real time from scanning a piece of cake, there will be things desktops do better.
Quote:the century old "Desktop Experience"
OS: Windows 10 64 bit Professional
CPU: AMD Ryzen 5900X
RAM: 16GB
GPU: Radeon Vega 56
