sergioosh

Y580 pushing to the limits!


Hi guys,

I've seen Y580s with 3610QM and 3630QM processors. Both are 45 W processors. Does that mean the Y580 supports 45 W processors? If so, would it be possible to use even faster CPUs like the 3720QM, 3740QM, 3820QM and 3840QM (and use their full power)? All of them are also 45 W. The 3920XM and 3940XM are probably out of the question because they're 55 W... and bloody expensive. For now, let's assume there are no HDD overheating issues :)

I'm willing to take the risk provided you don't know of any obstacles...


I know the Y580 uses a single-fan cooling design like the Y500, and if its thermal performance is anything like the Y500's, I'd be worried about putting in a hotter, more powerful CPU. The Y500 already has trouble keeping the 3630QM's temperature in check when running CPU-heavy games and benchmarks with full Turbo Boost. It can easily go past 90 °C and even 100 °C (thermal shutdown is at 105 °C), which is why many owners run it with Turbo Boost disabled. It seems like a big waste to spend a few hundred on a faster CPU only to find that it throttles or can't reach its advertised Turbo bins, making it no faster than what you had before. But if the Y580 has a better cooling system that can handle the heat of a higher-clocked CPU, I guess there's nothing stopping you from doing the upgrade.


What if I told you I currently have an i5-3210M? :)

I'm willing to take this risk as long as there are no surprises, like I put it in, start it up, and it doesn't work straight away...


I bought my Y400 manufacturer-refurbished, and they were selling other Y400s with the processors you mention, covered by a full warranty. I reckon it's OK, seeing that TDP is more a specification for the cooling system than anything else.

What if I told you I currently have an i5-3210M? :)

I'm willing to take this risk as long as there are no surprises, like I put it in, start it up, and it doesn't work straight away...

In that case, I'd do some research on your Y580's motherboard specifications to see if it can actually run 45 W CPUs at full speed. The Y500 SKUs with the i5 and the low-power i7-3632QM actually have a different motherboard which can only supply up to 35 W. When a 45 W i7 is used, it throttles to match the speed of, at most, the 3632QM. So if you installed, for example, a 3840QM, you would be losing 700 MHz and only getting the performance of the much cheaper 3632QM. I don't know if the Y580 works this way too, but it would definitely serve you well to find out.
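To get a feel for why a power cap costs clock speed, here's a toy model: dynamic CPU power grows roughly with f·V², and voltage itself rises with frequency, so power scales roughly with f³ in the turbo range. The exponent and the clock/TDP numbers below are illustrative assumptions, not Y500/Y580 measurements:

```python
# Toy model: dynamic CPU power ~ f * V^2, and V rises roughly linearly
# with f, so power grows roughly with f^3 in the turbo range.
# All numbers below are illustrative assumptions, not measured values.

def capped_frequency(f_max_ghz: float, p_max_w: float, p_cap_w: float) -> float:
    """Estimate the sustainable clock under a power cap, assuming P ~ f^3."""
    if p_cap_w >= p_max_w:
        return f_max_ghz
    return f_max_ghz * (p_cap_w / p_max_w) ** (1 / 3)

# A 3840QM-like part: 3.8 GHz max turbo at its 45 W rating.
full = capped_frequency(3.8, 45.0, 45.0)
capped = capped_frequency(3.8, 45.0, 35.0)  # a hypothetical 35 W board limit
print(f"45 W: {full:.2f} GHz, 35 W: {capped:.2f} GHz")
```

With these toy numbers the cap costs about 300 MHz of sustained clock; the loss described above is larger because the firmware clamps the multiplier to a fixed lower bin rather than scaling it smoothly.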

I bought my Y400 manufacturer-refurbished, and they were selling other Y400s with the processors you mention, covered by a full warranty. I reckon it's OK, seeing that TDP is more a specification for the cooling system than anything else.

That's not completely true in the case of the Y400/Y500. See above.

I used to have an eMachines that did exactly what you're describing: it ran an i3-350M at 25 W instead of 35 W. CPU-Z had no problem reporting this, though, and listed the TDP as 25 W. CPU-Z lists 45 W for my machine, which is what led me to my conclusion. Also, keep in mind that a significant portion of the heat will come from the 650M, which probably draws more than the i7 under heavy GPGPU/GPU loads, meaning the fans already have to move significantly more heat. Looking at the motherboard design, the CPU and GPU have the same number of heatpipes, so I reckon moving 45 W of heat from the CPU is more than doable.

I got quite a good deal on a 3840QM, so I went with it. I cleaned everything up and repasted both the CPU and GPU with Noctua NT-H1. Under a Prime95 torture test the temperature hit 76 °C and stayed there. The CPU sat constantly at 3.3 GHz with all cores loaded.

One interesting thing is that when I tested the memory throughput with winsat, it went from 16 GB/s to 24.5 GB/s. If this tool is accurate, that's a huge difference.
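For a rough cross-check of numbers like these without winsat, you can time large buffer copies; pure Python adds overhead, so this undershoots the real figure, but a big before/after jump should still show up. A sketch:

```python
import time

def copy_bandwidth_gbps(size_mb: int = 256, repeats: int = 5) -> float:
    """Rough memory bandwidth estimate from timing large buffer copies."""
    buf = bytearray(size_mb * 1024 * 1024)
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        dst = bytes(buf)  # one full read plus one full write of the buffer
        best = min(best, time.perf_counter() - t0)
        del dst
    # Factor of 2: each copy reads the source and writes the destination.
    return 2 * size_mb / 1024 / best

print(f"~{copy_bandwidth_gbps():.1f} GB/s")
```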

I'm not sure how to test whether the computer is using 100% of the processor... I tried running a single Prime95 worker on one logical core, and the CPU clock jumped between 3.3 and 3.8 GHz in CPU-Z. Any tips?
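The clock bouncing between 3.3 and 3.8 GHz is normal: Windows migrates the thread between cores, and CPU-Z samples whichever core it happens to be watching. A crude timing-based check is to run the same fixed workload once alone and once on every thread, and compare per-task times; a sketch (interpreting the ratio assumes the 3840QM's 3.8 GHz single-core / 3.3 GHz all-core turbo bins):

```python
import time
from concurrent.futures import ProcessPoolExecutor

def burn(n: int = 2_000_000) -> float:
    """Fixed CPU-bound workload; returns its wall-clock time in seconds."""
    t0 = time.perf_counter()
    x = 0
    for i in range(n):
        x += i * i
    return time.perf_counter() - t0

if __name__ == "__main__":
    solo = burn()  # one busy thread -> should hit the 1-core turbo bin
    with ProcessPoolExecutor(max_workers=8) as ex:  # load all 8 threads
        loaded = max(ex.map(burn, [2_000_000] * 8))
    print(f"1 worker: {solo:.3f}s, slowest of 8: {loaded:.3f}s")
```

If the solo run is only modestly faster per task (3.8/3.3 plus some Hyper-Threading penalty in the loaded case), the single-core turbo is working. Pinning the Prime95 process to one core via Task Manager's affinity setting also stops the clock reading from bouncing.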

EDIT:

At idle it's 41-43 °C. If I make some space below the laptop, it's between 39 and 41. I never checked what it was with the i5, so I can't really compare.

I'd love to test it with some CPU-heavy games, but unfortunately my Nvidia GPU seems to be dying (unrelated to the CPU swap... it happened after I bought the CPU but well before I installed it)... I ALMOST had the fastest Y580 possible :/


A few results. The GPU was set to 1250/3000 at 1.125 V.

PassMark: 3721 (PassMark Software - Display Baseline ID# 115444)

3DMark 11: 3327 (NVIDIA GeForce GTX 660M video card benchmark result - Intel Core i7-3840QM, LENOVO Product Name)

Obviously I see no difference in the computer's responsiveness or in gaming. It seems this CPU is total overkill for this GPU. On the other hand, it's working perfectly fine with no overheating issues. The highest temperature I ever reached was 84 °C on core 1. That core generally runs a few degrees hotter than the others: the highest difference under load I noted was 10 degrees, with a 6-7 degree average! Does anybody else see temperature differences between cores? Maybe an air bubble over that core?
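To put numbers on the core-to-core spread, monitoring tools like HWiNFO can log per-core temperatures to CSV, and the deltas are easy to compute from there. A sketch assuming a simple, hypothetical log format with one column per core:

```python
import csv
import io
import statistics

def core_deltas(csv_text: str) -> tuple[float, float]:
    """Max and mean spread between hottest and coolest core per sample."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    spreads = [max(map(float, r)) - min(map(float, r)) for r in rows[1:]]
    return max(spreads), statistics.mean(spreads)

# Hypothetical log: header row, then one temperature sample per line.
log = "core0,core1,core2,core3\n78,84,76,77\n80,86,78,79\n74,84,75,76\n"
max_spread, mean_spread = core_deltas(log)
print(f"max spread: {max_spread:.1f}, mean spread: {mean_spread:.1f}")
```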

BTW if you ever consider buying cooling pad:

- there's no temperature difference at idle

- under load there is a temperature difference of a few degrees when the laptop is on the cooling pad, but little to no difference whether the pad's fan is on or off. Conclusion: just raising your laptop off the desk already helps!

- there's a very significant CPU temperature difference if you put the laptop on the cooling pad with the fan(s) on WITHOUT the bottom cover. Well, see the graph below. The * means there was some throttling, so those results aren't very trustworthy.

- it also helps cool down the HDD, which as we now know runs hot in the Y580 under load. I can't find the screenshot I took, but I remember the HDD temperature dropped from over 50 °C to a little over 40 °C.

post-15132-14494995893864_thumb.jpg


What games do you play? If you play Minesweeper or Solitaire, sure, you'll see no improvement, but if you play BF3 or other CPU-heavy games you should see a notable improvement, especially in minimum FPS.


I haven't played many games recently (work, work...). I've only tried Hitman: Absolution so far and didn't see much difference. There was a noticeable difference when I used the overvolted BIOS and overclocked the GPU to 1250/3000. I also wanted to try StarCraft 2, but Optimus wouldn't let me use the GeForce O.o' I've already sorted it out, but haven't had time to test it.

What other games are CPU-heavy? GTA 4 maybe?


A lot of modern games are CPU-heavy. BF3, Skyrim, Arma II, Far Cry 3, Crysis 3, Borderlands 2 and most UE3 games, Total War: Shogun 2, Civ 5, PlanetSide 2, Assassin's Creed 3, Guild Wars II, Flight Simulator X, SimCity 2013, I could go on and on. BF3 64-player multiplayer is the most brutal of the bunch, the only game I've played so far that easily uses up 50-60% of my i7 on a regular basis. A dual-core just doesn't cut it nowadays for cutting-edge games, especially something with a relatively low clockspeed like the mobile i5.


@octiceps I saw a test some guy ran on a tech forum comparing the mobile processors, and the weird thing was that the i5 was on par with the i7 in most of the games (like BF etc.). Why? His explanation was that most games are still written to use only 2 cores instead of 4, so when he used the i7 those games weren't optimized to use the quad's full power and were effectively running on two of the i7's cores. Of course there are games like Metro that suck up all the juice from your notebook, but some still work better with the i5: less heat generated, less power used.

@octiceps I saw a test some guy ran on a tech forum comparing the mobile processors, and the weird thing was that the i5 was on par with the i7 in most of the games (like BF etc.). Why? His explanation was that most games are still written to use only 2 cores instead of 4, so when he used the i7 those games weren't optimized to use the quad's full power and were effectively running on two of the i7's cores. Of course there are games like Metro that suck up all the juice from your notebook, but some still work better with the i5: less heat generated, less power used.

That's a lie, unless he was playing BF3 singleplayer, which does not care whether you have a dual-core or a hexa-core. BF3 multiplayer, on the other hand, is the most CPU-intensive game on the market and can easily utilize up to 8 threads. It's one of the very few games (Skyrim and Borderlands 2 are like this too) that actually see considerable improvement from the increased core counts and cache sizes of the Intel hexa-cores if you've got a high-end GPU setup. Take a look at this screenshot and then tell me again that a dual-core is enough for BF3. The game is using almost 70% of my i7, and all 8 threads are loaded considerably. If this were an i5, it would be maxing it out. Furthermore, you can see that even my i7 is not enough for this game, as evidenced by my low FPS and low GPU usage, which is almost equal to my CPU usage, clearly indicating a CPU bottleneck.

WBBQCMi.jpg

Metro 2033 is almost entirely GPU-limited and a fast dual-core, something like a desktop i3 or equivalent, wouldn't be a bottleneck in it. Last Light is more demanding CPU-wise and a quad-core would increase FPS a little more over a dual-core, but not to the extent that it would in BF3.


I'm limited to the games I own, and that's none of the ones you listed :) I did play Skyrim on my Y580, which I borrowed from my cousin, but I don't have it anymore. Anyway, Skyrim was very smooth on maxed settings on my i5 (3210M), so I guess it isn't that heavy on the CPU. I also played Far Cry 3 on my laptop, and from what I remember there was room for improvement... I'll put it on my Steam wishlist :)

I see there is a 48-hour trial of BF3. I'll download it and check it out over the weekend. We'll see how it fares.

My guess is that the GPU will be the bottleneck in my case, so there probably won't be much difference in FPS between singleplayer and multiplayer. Then again, I'm not sure the trial includes singleplayer.

Just when I fixed the Optimus problem so I could test SC2, my mouse died... I think a game with hundreds of units (like Nexus Wars) will also stress the CPU significantly. Now it'll have to wait till I get a new mouse :/

I'm limited to the games I own, and that's none of the ones you listed :) I did play Skyrim on my Y580, which I borrowed from my cousin, but I don't have it anymore. Anyway, Skyrim was very smooth on maxed settings on my i5 (3210M), so I guess it isn't that heavy on the CPU. I also played Far Cry 3 on my laptop, and from what I remember there was room for improvement... I'll put it on my Steam wishlist :)

I see there is a 48-hour trial of BF3. I'll download it and check it out over the weekend. We'll see how it fares.

My guess is that the GPU will be the bottleneck in my case, so there probably won't be much difference in FPS between singleplayer and multiplayer. Then again, I'm not sure the trial includes singleplayer.

Just when I fixed the Optimus problem so I could test SC2, my mouse died... I think a game with hundreds of units (like Nexus Wars) will also stress the CPU significantly. Now it'll have to wait till I get a new mouse :/


I don't get it: then what is this: https://onlinepass.battlefield.com/index.php ?

Does that mean when you purchase BF3 you only get 48 hours of multiplayer and it's a subscription after that??

How come your memory is only 2250? Doesn't the GT 650M also have GDDR5?

I've never heard of SLI itself putting extra load on the CPU. Well, that depends on whether there's an actual SLI bridge between those two cards. I highly doubt there isn't one.

That's a lie, unless he was playing BF3 singleplayer, which does not care whether you have a dual-core or a hexa-core. BF3 multiplayer, on the other hand, is the most CPU-intensive game on the market and can easily utilize up to 8 threads. It's one of the very few games (Skyrim and Borderlands 2 are like this too) that actually see considerable improvement from the increased core counts and cache sizes of the Intel hexa-cores if you've got a high-end GPU setup. Take a look at this screenshot and then tell me again that a dual-core is enough for BF3. The game is using almost 70% of my i7, and all 8 threads are loaded considerably. If this were an i5, it would be maxing it out. Furthermore, you can see that even my i7 is not enough for this game, as evidenced by my low FPS and low GPU usage, which is almost equal to my CPU usage, clearly indicating a CPU bottleneck.

Metro 2033 is almost entirely GPU-limited and a fast dual-core, something like a desktop i3 or equivalent, wouldn't be a bottleneck in it. Last Light is more demanding CPU-wise and a quad-core would increase FPS a little more over a dual-core, but not to the extent that it would in BF3.

I understand what you're saying; I was just pointing out the test that guy made, and there were replies saying he was indeed right. Maybe the games were singleplayer. Anyway, I don't play a lot of games, only Dota 2 and CS:GO, and both run smoothly at max settings. I have the i5-3230M. So far so good. I chose the i5 over the i7 not because of the price, but because of the lower heat output compared to the i7. The energy required to power the i5 is also lower. On another note, they wouldn't build a notebook with SLI and an i5 if it weren't capable of running the SLI at full power. Before I bought this laptop I watched a lot of reviews, both professional and amateur, on tech sites/YouTube etc., and going for the i5 doesn't really put you that far behind.


@octiceps Wow, that thing about BF3 really is utterly inexcusable. Let's hope all this PS4 business turns out as good as it seems.

That article clearly says the CPU has to be capable of handling twice as many GPUs. How is that different from a single GPU that's twice as powerful? What I mean is that there's no additional CPU load just because the system uses SLI. The system simply has more GPU power overall, whether that's one stronger GPU or two weaker ones.

Well, anyway, I think it will be impossible for me to ever be bottlenecked by the CPU now :)

@Florin I partially disagree with you. The i7's maximum heat output is greater, but that doesn't mean it will consume more energy overall. There are a few points to consider:

- a faster processor will complete its task sooner

- the i7 has a better performance-per-watt ratio

- I'm not entirely sure about this, but from when I was using my i5 I don't think it scaled down as well as my current CPU does. I can do everything except gaming at 4x1.2 GHz...

On the other hand, it's true that an i5 is in general more than enough for pretty much everything you throw at it, especially with a single 650M or 660M. Also, at peak load it will generate less heat, so you're less likely to run into problems like I have with the HDD running at excessive temperatures.

Still, the name of this topic is "pushing to the limits", so it's more about the i7 than the i5 here :)


@octiceps I will leave it at that; I don't want to start a polemic, because I do not agree with you. To stay on topic with the OP and the thread: guys, how do you know when you need to raise the voltage? What is the method for finding the right balance?

@octiceps I will leave it at that; I don't want to start a polemic, because I do not agree with you. To stay on topic with the OP and the thread: guys, how do you know when you need to raise the voltage? What is the method for finding the right balance?

What's wrong with having an educated, civilized discussion? At no point did I resort to personal attacks. As I understand it, you don't agree with me because the games you play are relatively non-demanding so you don't think that there's a point to having an i7, whereas I disagree because some of the games I play could use more CPU power and I've got empirical evidence to back it up.

The basic premise of overvolting is to stabilize an unstable overclock by adding more voltage. Once you're stable, increase the core clock speed more until you crash again, then increase the voltage once more. Rinse and repeat until you hit a limit, which is usually temperatures. Keep in mind that the offset on the voltage slider in Nvidia Inspector is incorrect. The base starts at +112.5 mV, so that is where 0 is. The next increment up, +125 mV, adds 12.5 mV to stock, and so on and so forth.

When I did my overvolting, I started with my highest overclock on stock voltage, 1120/2250, and increased the core by 10 MHz each time while stress testing with Unigine Heaven on max settings. Each time it crashed, I added another 12.5 mV voltage increment in Inspector and retested. If it passed, I added another 10 MHz and kept going. I stopped when I got to 1260/2250 @ 1.137 V because the GPU started reaching throttling temperatures and I didn't want to add any more voltage for fear of damaging anything. Basic physics says that adding voltage increases heat drastically, and that's exactly what happened to me.
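The procedure above boils down to a simple loop: try a higher clock, and on a crash trade one voltage notch for stability, until you hit the ceiling. A sketch with the step sizes from the post; `is_stable` stands in for a real Unigine Heaven pass, and the toy stability model at the bottom is made up for illustration, not measured hardware behavior:

```python
# Sketch of the step-up tuning procedure described above. Step sizes match
# the post (10 MHz core, 12.5 mV voltage); is_stable() is a hypothetical
# stand-in for an actual stress-test pass.

CLOCK_STEP_MHZ = 10
VOLT_STEP_MV = 12.5

def tune(start_mhz: int, start_mv: float, max_mv: float, is_stable) -> tuple[int, float]:
    mhz, mv = start_mhz, start_mv
    while True:
        if is_stable(mhz + CLOCK_STEP_MHZ, mv):
            mhz += CLOCK_STEP_MHZ          # passed: push the core further
        elif mv + VOLT_STEP_MV <= max_mv:
            mv += VOLT_STEP_MV             # crashed: add one voltage notch
        else:
            return mhz, mv                 # hit the voltage/thermal ceiling

# Toy stability model: each extra 12.5 mV buys ~20 MHz of headroom.
fake_stable = lambda mhz, mv: mhz <= 1120 + (mv - 1112.5) / 12.5 * 20
print(tune(1120, 1112.5, 1137.5, fake_stable))
```

The real stopping condition is whichever comes first: the voltage ceiling you set for yourself or GPU temperatures reaching the throttle point.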


There is no point upgrading your CPU. I don't notice any difference going from an i5 to an i7 except in ArmA 2 & 3 and PCSX2.

And PlanetSide 2 also runs fine on an i5 here.

Save your money for a new laptop. Really, save yourself the trouble. I won't upgrade my CPU ever again.


OK, it seems I will have a chance to test BF3 after all. EA just released it as part of the Humble Bundle! I don't want this post to sound like an ad, so look up the details yourself :) There are still 9 days left.

