Did Moore's law take a sabbatical in the last 3 years?

d4005

Bronze Level Poster
When I built my current PC in March 2011, I went for the fastest SSDs I could get (Samsung 830s) and the most powerful (but sensibly priced) processor I could get, an i7-2600K.

It's now 3 years and 4 months later, and SSDs don't seem to have moved on. The configurator here at PCS is still offering the same choice I faced 3 years ago: V300 vs Samsung 830 vs Intel 530. OK, they've come down in price in a Moore's-law kind of way, but where are the SSDs offering twice the performance for the price I paid 3 years back? I'm happy that I can fill my new laptop with SSDs for half the price, but I'd rather be paying the same and getting twice the performance.

The same goes for processors, to an extent. Three years ago the i7-2600K wasn't top of the tree (there were Xeons and other i7s above it), but not much more power could be had without spending 2x or even 5x the price. Ignoring the Xeons and the X models, the top-of-the-range K processor is currently the i7-4930K, which only beats my i7-2600K by about 60%. I'd expect more after three and a half years. OK, now that I'm writing this down, the processor upgrade is actually pretty good, but the SSD situation still holds.

I don't know if I should be impressed or only just satisfied with the i7-4910MQ that I'm about to put into a PCS laptop. It's about 10-15% faster than my i7-2600K, but that is at a slower clock speed and mobile. I guess it's something.
 
Last edited:

keynes

Multiverse Poster
I don't know if I should be impressed or only just satisfied with the i7-4910MQ that I'm about to put into a PCS laptop. It's about 10-15% faster than my i7-2600K, but that is at a slower clock speed and mobile. I guess it's something.
what are you using the laptop for?
 

steaky360

Moderator
Moderator
I didn't think Moore's law took account of actual performance; it's simply about the number of transistors on a chip. To be honest, I'd already thought this was accepted as either left behind or soon to be left behind, in the sense that there will be a ceiling on the number of transistors on a chip, given the diminishing returns that adding more provides.

As you say, the newer processors are not necessarily performing all that much better than older ones, and I suspect the same will be true of upcoming ones (Skylake, for example).
 

d4005

Bronze Level Poster
what are you using the laptop for?
It'll be used for video editing, video conversion, building large software applications, and debugging on virtual machines. When I hit my machine hard it really knows about it. I'll quite often give it a batch job of converting a dozen 1080p videos while copying 50GB to a USB drive, with a virus scan sneaking in while I'm not watching. It grinds to a halt with the virus scan; I've got to schedule that better. I only recently learned about CUDA and using the GPU or the onboard Intel graphics unit. That'll make a world of difference if I can figure out how to do it in other programs; MediaCoder seems a bit malwarey. I don't play demanding games, unless Candy Crush counts :surrender:
 

Wozza63

Biblical Poster
I don't think Moore's Law applies to consumer-grade PCs anymore, but I think it still holds in other areas, especially on mobile with the ARM architectures.

For the first time, PCs have hit a point where they are more than powerful enough for 98% of users, so manufacturers are now concentrating on power efficiency and incremental performance steps. Why spend billions on investment when you can just refresh an old chip for a few million?

The good thing is that we can keep our PCs for much longer without them becoming outdated and too slow. GPUs seem to be the only thing that requires upgrading, and even then only for a few demanding games.
 

mantadog

Superhero Level Poster
From Wikipedia

Although this trend has continued for more than half a century, Moore's law should be considered an observation or conjecture and not a physical or natural law. Sources in 2005 expected it to continue until at least 2015 or 2020.[note 1][14] However, the 2010 update to the International Technology Roadmap for Semiconductors predicted that growth will slow at the end of 2013,[15] when transistor counts and densities are to double only every three years.

Moore's law aside, it's not really the number of transistors you can etch onto a given area of silicon that REALLY matters anymore. Power efficiency, heat output and (to a certain extent) the lack of any real competition are holding back ultimate performance. The trend now is towards low-power tablet-style devices; do we really need much more powerful desktop-class CPUs? I have my doubts, to be honest.

With respect to SSDs, you'll find it's not that manufacturers can't make SSDs go 2-4x faster; it's that the SATA interface can't support it. Some (incredibly expensive, £30k-mark) PCIe SSDs can do 3GB/s. Even consumer-level cards that fit into a desktop's PCIe slot can approach 1GB/s for £300-ish; the problem is that you then need to use PCIe lanes for storage, which causes issues with installing your OS and with having an easy life in general. A new SATA standard has just come out that supports 10Gb/s, roughly equal to 1GB/s once you take overheads into account, so you should be able to buy SSDs that are about 75% faster and still connect to the SATA interface some time soon.
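The overhead arithmetic in that last sentence can be sketched in a few lines. A minimal Python illustration, assuming 8b/10b line encoding (SATA's scheme, 10 line bits per data byte) for both link speeds; the figures are back-of-envelope, not from a spec:

```python
# Effective throughput of a serial link after encoding overhead.
def effective_mb_per_s(line_rate_gbps, data_bits=8, line_bits=10):
    """Usable MB/s: line rate x coding efficiency, divided by 8 bits per byte."""
    bits_per_second = line_rate_gbps * 1e9 * (data_bits / line_bits)
    return bits_per_second / 8 / 1e6

sata3 = effective_mb_per_s(6)     # SATA III: 6 Gb/s line rate -> 600 MB/s usable
link10 = effective_mb_per_s(10)   # a 10 Gb/s link -> 1000 MB/s usable
print(sata3, link10)
```

600 vs 1000 MB/s is roughly the "75% faster" figure above; real drives lose a little more to protocol overhead beyond line encoding.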
 

Wozza63

Biblical Poster
The SATA standard cuts both ways. SSDs are bottlenecked by SATA III and would saturate a SATA IV within months, whereas a standard hard drive only needs SATA II and isn't bottlenecked by it. So in one way a faster standard is important, and in another it's unnecessary.
 

mdwh

Enthusiast
I don't think an effect of Moore's Law has ever been observed in hard disk speeds the way it has in CPU speeds. Memory speeds certainly have lagged further and further behind CPU speeds, which is the reason caches became so important ( http://gameprogrammingpatterns.com/data-locality.html ).

Though, do we actually have benchmarks comparing the speeds of SSDs from 3 years ago? I don't think you can go simply by a brand name like "V300"; that's like saying "The fastest CPU I've been able to get hasn't changed from an i7, so things haven't improved".

E.g., see http://www.tomshardware.com/reviews/ssd-recommendation-benchmark,3269-7.html which shows various performances for "Samsung 840 EVO" depending on factors such as disk size.

mantadog makes a good point about interfaces. Hard disk speeds have tended to go up in bumps rather than a continuous curve, due to requiring advances in technology or new interfaces. I mean, SSDs themselves are a tremendous improvement in performance, which many PC users still don't take advantage of.

And what about capacity? If you can now get more, that's an improvement too :) (Especially for laptops, where many people still opt for slower non-SSD hard disks to get more capacity.)

CPU performance improvements have slowed, as you've noted. Partly, Moore's Law on CPUs itself may have slowed. But it's also harder to translate those extra transistors into performance. Moore's Law was never a freebie: CPUs got faster in the past because engineers worked hard, and the job gets harder as things get more complex, smaller and faster. I'd say it's pretty damn impressive that things continue to get faster at all. 14% year-on-year growth, continually, for a mature technology, isn't bad!
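That 14% figure is easy to sanity-check: if the roughly 60% total gain mentioned earlier accrued over 3.5 years, the implied yearly growth factor is 1.60^(1/3.5). A quick check in Python:

```python
# Back out the annual growth rate implied by +60% over 3.5 years.
total_gain = 1.60          # i7-4930K vs i7-2600K, per the opening post
years = 3.5
annual = total_gain ** (1 / years)
print(f"{(annual - 1) * 100:.1f}% per year")   # about 14.4%
```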

And the reduction in size and power consumption is also an important improvement. This isn't new; it's a bit like complaining that the speed of room-sized computers has only advanced by 60% in 3.5 years while ignoring that a computer sitting on a desk now has the performance of a room-sized one. My 550g Asus tablet has an Intel Atom Z3740 that runs faster than the Intel Pentium E2160 in my desktop of some years ago ( http://cpubenchmark.net/cpu_lookup.php?cpu=Intel+Atom+Z3740+@+1.33GHz&id=2059 ); I can carry a full-sized PC in my hand!

"Why spend billions in investment when you can just refresh an old chip for a few million?"

Intel follows the tick-tock model ( http://en.wikipedia.org/wiki/Intel_Tick-Tock ): every 12 to 18 months they alternately shrink the die and introduce a new architecture. In some sense each new architecture is a "refresh" in that they don't start from scratch, but that has been true all the way back to the 8086; the x86 architecture has been continually improved rather than replaced with an entirely new chip. Same with ARM. It's true there's less reason for people to upgrade, with x86 PCs being fast enough for many users. It's been said for years that it's really games that drive the PC industry :)
 

d4005

Bronze Level Poster
Some great info in this thread, thanks guys. I did mostly concede in my original post that CPUs are still charging along at a good rate, especially since we're not increasing our power demands as quickly as we did in the previous decade. I certainly wouldn't upgrade my 3.5-year-old i7-2600K just for a basic performance bump like I've done in the past; this thing is still flying high.

My primary goal this time is to switch from a big heavy box that isn't portable to a laptop. I tried a few hours on my old laptop, a 17" Dell from 2008 with a T7500 @ 2.2GHz, but it was not a good experience, even though I cloned its hard drive to an SSD a few years ago. I definitely need a completely new laptop.

I'm hoping that this new Optimus/Vortex laptop will have a good 5+ year life.

It looks like I might be buying at a bad time in the SSD development timeline, but I won't be talked out of putting 2x 1TB Samsung 840s in it.

So, I'm expecting (even though I'm going from desktop to laptop) to see a 10-15% CPU boost, and a general file system boost by having not only my initial 256GB drive be SSD, but all the drives be SSD. Should be good for performance, cooling, and quietness :)
 

Androcles

Rising Star
You may get more of a boost from the processor than you expect. The number that represents the speed isn't always the main factor in how much better a processor is: as new ranges come out, the internals of the processor improve, so a 3.4GHz processor from the current generation is generally faster than a 3.4GHz processor from a previous one, and future generations at the same clock speed will be better still. It's not just clock speed either; you need to look at lots of other aspects, including but not limited to L3 cache.

Moore's law still applies (although it has been revised downwards over time), but the way it's measured keeps changing with each new generation of chips.
 

grimsbymatt

Enthusiast
I read that the latest i7s have pretty much the same cores as the 2600K in pretty much the same configuration. The main benefits of Haswell are reduced power consumption and better on-chip graphics. Or so I read.
 

mishra

Rising Star
The good thing is that we can keep our PCs for much longer without them becoming outdated and too slow. GPUs seem to be the only thing that requires upgrading, and even then only for a few demanding games.

Totally agree, that's exactly how it seems nowadays. Yes, they introduce new CPUs, but upgrading them as often no longer seems necessary. I'm still using the "old" rig I built 4-5 years ago; the only part I kept swapping was the graphics card so I could play modern games. Having said that, I think I've finally reached the point where getting a better GPU is pointless, as it would be bottlenecked by my ancient CPU! So I may invest in a new PC come next year or so.

It's crazy. In the past, the difference between a 286 and a 386 was light years, and the same goes for, say, the Pentium and the Pentium II. Nowadays you can only tell the latest CPU from its year-old equivalent in benchmarks.
 

Wozza63

Biblical Poster
Totally agree, that's exactly how it seems nowadays. Yes, they introduce new CPUs, but upgrading them as often no longer seems necessary. I'm still using the "old" rig I built 4-5 years ago; the only part I kept swapping was the graphics card so I could play modern games. Having said that, I think I've finally reached the point where getting a better GPU is pointless, as it would be bottlenecked by my ancient CPU! So I may invest in a new PC come next year or so.

It's crazy. In the past, the difference between a 286 and a 386 was light years, and the same goes for, say, the Pentium and the Pentium II. Nowadays you can only tell the latest CPU from its year-old equivalent in benchmarks.

Your CPU is still perfectly fine at that clock speed. I'd use Game Booster, though, for a bit of extra juice, and 8GB of RAM at some point :D

The reason I bought an 8-core CPU was so that for years to come the only thing needing an upgrade is the graphics card. A single 570 is still pretty powerful and perfectly capable of medium-to-high gaming, but I'll upgrade eventually if Nvidia ever releases the 800 series for desktops; I need the VRAM more than anything to run multiple monitors smoothly (not necessarily for gaming).
 
Last edited:

mdwh

Enthusiast
It's crazy. In the past, the difference between a 286 and a 386 was light years, and the same goes for, say, the Pentium and the Pentium II. Nowadays you can only tell the latest CPU from its year-old equivalent in benchmarks.
Note, though, that those CPUs had more than a year between them (the 286 was released in 1982, the 386 in 1985; the Pentium in 1993, the Pentium II in 1997), so you should be comparing the latest to 3-4 years ago rather than a year ago. True, the gap still feels smaller.
 

Wozza63

Biblical Poster
Note, though, that those CPUs had more than a year between them (the 286 was released in 1982, the 386 in 1985; the Pentium in 1993, the Pentium II in 1997), so you should be comparing the latest to 3-4 years ago rather than a year ago. True, the gap still feels smaller.

Well, the Sandy Bridge CPUs (i5-2500, i7-2600, etc.) were released at the beginning of 2011 and the new Devil's Canyon CPUs are only just being released; that's 3.5 years and not much change at all. AMD have barely improved at the high end since the Bulldozer range, although their APUs are becoming more and more impressive in the amount of GPU and CPU power they pack in.

But do we really need performance upgrades for CPUs? Not really. Have you ever seen your CPU maxed out at 100% in anything other than a benchmark? I can't think of a time when I have, so I think the next step is definitely to reduce power consumption and TDP by a large amount, so that within a few years these processors can be in our portable devices, such as tablets and phones.
 

d4005

Bronze Level Poster
Well, the Sandy Bridge CPUs (i5-2500, i7-2600, etc.) were released at the beginning of 2011 and the new Devil's Canyon CPUs are only just being released; that's 3.5 years and not much change at all.
Yeah, I'd say I was lucky to upgrade last time right at the beginning of the era of the awesome i7-2600K, but it wasn't luck: there were reviews of it months before release and I held out until I could get one. It seemed like forever.

But do we really need performance upgrades for CPUs? Not really. Have you ever seen your CPU maxed out at 100% in anything other than a benchmark? I can't think of a time when I have, so I think the next step is definitely to reduce power consumption and TDP by a large amount, so that within a few years these processors can be in our portable devices, such as tablets and phones.
The only time I max out my i7-2600K on my desktop is doing video conversions, where no matter what CPU you're running, it's going to max out; a faster one just finishes the job earlier. I think once I learn about hardware-assisted rendering (CUDA), I can convert up to 4x quicker by making the graphics card do the work. If I were to upgrade my desktop, I'd just dump the HDD for a 1TB SSD and maybe get a better graphics card. The CPU is still well up there in performance terms.
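In practice, the GPU offload mentioned here usually means handing encoding to ffmpeg's NVENC path rather than writing CUDA directly. A sketch in Python that only builds the command line, assuming an ffmpeg build with NVIDIA support (the `-hwaccel cuda` and `h264_nvenc` options are only present in such builds; check yours with `ffmpeg -encoders`):

```python
from pathlib import Path

def nvenc_command(src: Path, dst_dir: Path) -> list[str]:
    """Build (but don't run) an ffmpeg command that offloads H.264 encoding to the GPU."""
    dst = dst_dir / (src.stem + ".mp4")
    return [
        "ffmpeg",
        "-hwaccel", "cuda",        # decode on the GPU where possible
        "-i", str(src),
        "-c:v", "h264_nvenc",      # NVENC hardware encoder instead of CPU x264
        str(dst),
    ]

cmd = nvenc_command(Path("clip01.mkv"), Path("converted"))
# Run each batch job with: subprocess.run(cmd, check=True)
```

Looping that over a dozen 1080p files is the batch workflow described earlier in the thread, with the CPU left mostly free.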
 

GeorgeHillier

Prolific Poster
Looking at your signature, I would personally get a lower-clocked CPU and a better GPU. The tiny extra increase in clock speed won't do much and will make it run even hotter; that money could be spent on a better GPU. And do you really need 2x 1TB SSDs? That seems like overkill. I wouldn't buy anything above a 250GB SSD until prices are lower, never mind 2x 1TB, which must take up a good few hundred pounds of your budget!

As for Moore's law, it doesn't really apply to consumer-grade products anymore; they're powerful enough, and the focus now is on energy efficiency and the like. I think Moore's law became less relevant a few years back, because once you have a certain number of transistors you can't fit any more in without overheating, no matter what you do.
 

d4005

Bronze Level Poster
Looking at your signature, I would personally get a lower-clocked CPU and a better GPU. The tiny extra increase in clock speed won't do much and will make it run even hotter; that money could be spent on a better GPU. And do you really need 2x 1TB SSDs? That seems like overkill. I wouldn't buy anything above a 250GB SSD until prices are lower, never mind 2x 1TB, which must take up a good few hundred pounds of your budget!
I don't think I need a better GPU; I really don't run anything taxing for the display, no 3D rendering or demanding games. As for the 2x 1TB SSDs, it's true I could continue with a 256GB SSD and a 2TB HDD, but I've been doing that for the last 3 years and the HDD drags the performance of the whole thing down. I'm sick of it. Comparing SSD prices now to 3-4 years ago, I already think they're cheap enough. I'll be using this new laptop at least 12 hours a day, 5 days a week, and I don't want anything in there slowing me down, and HDDs most definitely were.

As this will be a business expense (I'm self-employed in the software field), the taxman effectively pays for half of it anyway [rollinglaugh]
 

keynes

Multiverse Poster
As this will be a business expense (I'm self-employed in the software field), the taxman effectively pays for half of it anyway
You can claim it as an expense to pay less tax on your profits, but how does that halve the price?
 

d4005

Bronze Level Poster
You can claim it as an expense to pay less tax on your profits, but how does that halve the price?
Example 1: A regular employee earning 100,000 and paying tax at 40% has 60,000 net income. If they buy a computer for 2,000 net, 2,400 with VAT, their remaining net income is 60,000 - 2,400 = 57,600.

Example 2: A self-employed person earning 100,000, paying tax at 40% and with no other deductible expenses, also has 60,000 net income, with 40,000 owed to the Inland Revenue. If they buy the same 2,400 computer, they can first claim the VAT back. Then, as a business expense, it reduces the taxable income from 100,000 to 98,000, so there's less tax to pay. The end result is that only 1,200 of the price is paid out of net income; the other 800 (40%) comes out of the 40k that was destined for the IR, and the VAT was already reclaimed. Net result: the self-employed person keeps 60,000 - 1,200 = 58,800 and has effectively paid only half the price of the computer. The other 40% of the VAT-free amount went to the PC provider instead of the IR.

That's pretty much how it works. The VAT claimed back is of course added to the taxable amount for the year, but it almost works out at paying only half.
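Laid out as arithmetic, using the same hypothetical round numbers (20% VAT, a 40% marginal rate; this illustrates the reasoning above, not tax advice):

```python
price_inc_vat = 2400.0
vat_rate = 0.20      # assumed VAT rate
tax_rate = 0.40      # assumed marginal tax rate

net_price = price_inc_vat / (1 + vat_rate)   # VAT reclaimed -> 2000
tax_saved = net_price * tax_rate             # expense cuts taxable profit -> 800
out_of_pocket = net_price - tax_saved        # true cost -> 1200
print(net_price, tax_saved, out_of_pocket)
```

Exactly half the 2,400 sticker price, as the post says, with the remaining 1,200 coming out of taxed income.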
 
Last edited: