
15 hours ago, MioIV said:

It depends on your needs. If you're planning to use the 2570p as a portable laptop, you have two options: take the most powerful 2-core/4-thread CPU (e.g. the 3540M) or the most power-saving 4-core/8-thread one (the 3612QM). Either choice will bring a noticeable performance improvement over the base i3/i5 CPU. I suggest taking a 2/4 CPU, since it consumes less power in lightweight scenarios like web browsing or text work (but for multithreaded apps a 4/8 is the better choice). If you will use it at home and are ready to make some cooling improvements, you can look at the 3740QM.
The graphics part (HD 4000) is too slow and lacks support nowadays, and the cooling system of the 2570p is not designed for a 4/8 CPU under heavy loads. If you're happy with the current performance, it makes sense to just leave it as is.

Thank you for the detailed information. I read on the first page that a good CPU with a 35 W TDP and cooler temperatures could be the 3632QM. From your info I understand that this CPU is a compromise between the 3612QM and the 3740QM. What do you think?

17 minutes ago, fero985 said:

Thank you for the detailed information. I read on the first page that a good CPU with a 35 W TDP and cooler temperatures could be the 3632QM. From your info I understand that this CPU is a compromise between the 3612QM and the 3740QM. What do you think?

The 3632QM is only 100 MHz faster than the 3612QM, so they are in the same class. The 3740QM has an extra 600 MHz, which will be a noticeable performance boost, but it will also run extremely hot at full load. For choosing between the 3612QM and 3632QM you can look up power-efficiency comparisons; I haven't dug into that much.


I have upgraded the laptop to a quad-core i7-3610QM but noticed my battery now discharges at a much faster rate. Previously, with the dual-core 3520M, the battery lasted 5.5 hours before reaching 15 percent; now it shows only 2 hours when browsing the internet or reading documents on screen. Can I improve the battery life? I would like to keep the battery healthy and not always use the laptop on AC power!

25 minutes ago, SteliyanStakl said:

I have upgraded the laptop to a quad-core i7-3610QM but noticed my battery now discharges at a much faster rate. Previously, with the dual-core 3520M, the battery lasted 5.5 hours before reaching 15 percent; now it shows only 2 hours when browsing the internet or reading documents on screen. Can I improve the battery life? I would like to keep the battery healthy and not always use the laptop on AC power!

Idle consumption of a quad-core i7 should be comparable to a dual-core, so it seems that something is keeping the CPU busy.

It seems you will need to do some research into what is using CPU in the background, which could be something as ridiculous as the Google animation on the Chrome start page.

Common heavy offenders are website ads or background Windows 10 processes, but the latter is only fixed by switching to Linux...
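If you want to see what is chewing CPU in the background, a rough sketch like the one below can help; it assumes the third-party psutil package is installed (Task Manager's CPU column or top will tell you roughly the same thing):

```python
# Rough sketch: list the processes using the most CPU over a short window.
# Assumes the third-party "psutil" package is installed (pip install psutil).
import psutil
import time

# Prime the per-process CPU counters, then sample again after a short interval.
for p in psutil.process_iter():
    try:
        p.cpu_percent(None)
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

time.sleep(5)

samples = []
for p in psutil.process_iter(["name"]):
    try:
        samples.append((p.cpu_percent(None), p.info["name"]))
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

# Show the ten busiest processes; anything sitting above ~1-2% at "idle"
# is a candidate for the battery drain.
for cpu, name in sorted(samples, reverse=True)[:10]:
    print(f"{cpu:5.1f}%  {name}")
```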

 

And of course you could underclock; I do this often on battery. 1200 MHz is, I think, the minimum for most Ivy Bridge CPUs, and 4 cores at that frequency are fine for office work.

The nice thing about this: SpeedStep dynamically lowers the voltage at lower frequencies, so underclocking is extra efficient.
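For reference, on Linux the frequency cap can be set through the cpufreq sysfs files; a minimal sketch, assuming the standard scaling_max_freq interface is present and the script runs as root (values are in kHz):

```python
# Minimal sketch: cap every core at 1200 MHz through the cpufreq sysfs interface.
# Assumes a standard Linux cpufreq setup (scaling_max_freq present) and root rights.
import glob

MAX_KHZ = 1200000  # 1200 MHz, roughly the floor on most mobile Ivy Bridge parts

for path in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_max_freq"):
    with open(path, "w") as f:
        f.write(str(MAX_KHZ))
    print("capped", path)
```

Writing the CPU's stock maximum back into the same files undoes the cap.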

On 7/26/2020 at 12:23 AM, invait53 said:

My advice is the Wi-Fi 6 MPE-AX3000H. If you want to use that module's Bluetooth, you must do the mod. That Wi-Fi card works with Windows 10 only.

Do you own that card? Is it stable so far?

On 1/5/2021 at 3:35 PM, batyanko said:

Idle consumption of a quad-core i7 should be comparable to a dual-core

Not really; the two extra cores consume about 1.5~3 extra watts at idle.

 

On 1/5/2021 at 2:57 PM, SteliyanStakl said:

Can I improve the battery life?

Limit the maximum CPU performance % in Power Settings; it will cap the maximum frequency. AFAIK you can also set the PLx power limits using Intel XTU, which is better if you need good single-core performance, but I haven't tested it.
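For what it's worth, that Power Settings slider ("maximum processor state") can also be scripted with powercfg; a sketch using the documented SUB_PROCESSOR / PROCTHROTTLEMAX aliases, run from an elevated prompt (the percentages are just an example):

```python
# Sketch: cap the maximum processor state at 60% on battery, 100% on AC, via powercfg.
# Uses the documented SCHEME_CURRENT / SUB_PROCESSOR / PROCTHROTTLEMAX aliases;
# needs an elevated (administrator) prompt on Windows.
import subprocess

def run(args):
    print(">", " ".join(args))
    subprocess.run(args, check=True)

run(["powercfg", "/setdcvalueindex", "SCHEME_CURRENT", "SUB_PROCESSOR", "PROCTHROTTLEMAX", "60"])
run(["powercfg", "/setacvalueindex", "SCHEME_CURRENT", "SUB_PROCESSOR", "PROCTHROTTLEMAX", "100"])
run(["powercfg", "/setactive", "SCHEME_CURRENT"])  # re-apply the scheme so the change takes effect
```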

On 1/7/2021 at 1:13 AM, MioIV said:

Not really; the two extra cores consume about 1.5~3 extra watts at idle.

That was interesting to test.

 

A 3720QM and a 3320M, both with a 2.60 GHz nominal frequency.

Test bench:

Dell Latitude E6430, 2x8 GB DDR3-1600

Linux Mint 20 at minimum brightness

Radios off

Battery off

 

Idle watts shown on the Kill A Watt thingie:

3720QM - 7.0 W

3320M - 6.8-6.9 W

I would call that a ~0.15 W difference, rather than 1.5-3.0 W. Hardly a factor in energy consumption.

Though it is another question how often you will actually sit at those idle watts in Windows 10. Pretty rarely, I guess.

 

So here is a test under modest load: watching an .avi in VLC, with core loads averaging around 3%.

3720QM - ~14.5 W

3320M - ~13.8 W

Here the difference grows to about 0.7 W. It seems the i7 gets less efficient because all four cores are being woken up while there is still not much to do.

Another interesting note: underclocking the CPU and iGPU doesn't seem to matter here; the minimum frequencies already appear to be the preferred and sufficient mode for this load, even without forcing them.

 

Finally, I decided to pit 100% load on 2 cores at 2400 MHz against 4 cores at 1200 MHz, to test the popular opinion that a quad-core at low frequencies is more efficient than a dual-core at higher frequencies.

So I ran a stress test with 4 threads at 2400 MHz on the 3320M and with 8 threads at 1200 MHz on the 3720QM.

In a crude way, that should be the same amount of (totally useless) work done per unit of time.

(Note: the radios here are on and WiFi is connected; I realized that too late. Subtract, say, 0.8 W to compare with the readings above.)

3720QM - ~20.3 W

3320M - ~21.2 W

Here the 3720QM already comes out more efficient, by around 0.9 W.
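If anyone wants to repeat that last comparison, a crude load generator like the sketch below is enough; it is only a stand-in for a proper stress tool (stress-ng does the same job), and the 2400 vs 1200 MHz cap has to be set separately, e.g. through cpufreq as sketched earlier.

```python
# Crude sketch: peg N logical CPUs with useless arithmetic for a fixed time.
# Run e.g. "python burn.py 4 120" on the dual-core (4 threads) and
# "python burn.py 8 120" on the quad-core (8 threads); cap the frequency separately.
import multiprocessing as mp
import sys
import time

def burn(seconds):
    """Spin on pointless integer math until the time is up."""
    end = time.monotonic() + seconds
    x = 1
    while time.monotonic() < end:
        x = (x * 1664525 + 1013904223) % 4294967296  # arbitrary busywork

if __name__ == "__main__":
    workers = int(sys.argv[1]) if len(sys.argv) > 1 else 4
    seconds = float(sys.argv[2]) if len(sys.argv) > 2 else 120.0
    procs = [mp.Process(target=burn, args=(seconds,)) for _ in range(workers)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```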

 

IMO the third example is the most realistic level of load for everyday work, keeping in mind that modern websites are quite energy-hungry and browsers make good use of multi-threading.

So my best bet for energy efficiency would be an underclocked quad-core i7, say anywhere between 1200 and 2600 MHz.

And yes, running on battery is one use case where Linux has a clear advantage over Windows 10; Linux tends to do very little background work in comparison.

 

On 1/7/2021 at 1:13 AM, MioIV said:

Limit the maximum CPU performance % in Power Settings; it will cap the maximum frequency. AFAIK you can also set the PLx power limits using Intel XTU, which is better if you need good single-core performance, but I haven't tested it.

So yes, that should do it. Make sure to underclock the iGPU too; it can be quite the power hog.
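On the Linux side the HD 4000's ceiling can be lowered through the i915 sysfs knobs; a sketch, assuming the iGPU shows up as card0 and the script runs as root (check /sys/class/drm/ if the index differs on your machine):

```python
# Sketch: cap the Intel iGPU (i915 driver) maximum frequency. Values are in MHz.
# Assumes the iGPU is /sys/class/drm/card0 and the script runs as root.
GPU_MAX = "/sys/class/drm/card0/gt_max_freq_mhz"

with open(GPU_MAX, "w") as f:
    f.write("650")  # HD 4000 base clock; the stock ceiling on the i7 parts is higher
print("iGPU capped at 650 MHz")
```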

 

Limitations: these results may be non-representative due to chip-to-chip binning.

My impression from other i5s is consistent with what I observed above, but I admit this is the only i7 I have examined in detail.

 

Photos of le scientific measurement equipment:

https://photos.app.goo.gl/G1ChgoxWdchgiEwX9

On 1/13/2021 at 10:20 PM, batyanko said:

That was interesting to test.

 

A 3720QM and a 3320M, both with a 2.60 GHz nominal frequency.

Test bench:

Dell Latitude E6430, 2x8 GB DDR3-1600

Linux Mint 20 at minimum brightness

Radios off

Battery off

 

Idle watts shown on the Kill A Watt thingie:

3720QM - 7.0 W

3320M - 6.8-6.9 W

I would call that a ~0.15 W difference, rather than 1.5-3.0 W. Hardly a factor in energy consumption.

Hmmm, that's strange. I've used BatteryInfoView to trace power consumption a year ago. All off, display off, power saver profile. The summarized minimal idle power consumption on mine 2570p with it's stock 3210m cpu has been around 5.5W, and under the same conditions but with 3740qm it didn't fall below 7.8W.

With light loads like writing code (Node.js, autocomplete, type checking, etc.) or web browsing, my laptop runs around 4 hours on the 4-core CPU, while on the 2-core it can run 6. Two hours of difference! So I decided to switch back to the 3210M, since I need portability first.
 

On 1/13/2021 at 10:20 PM, batyanko said:

Here the 3720QM already comes out more efficient, by around 0.9 W.

 

Well, that was to be expected: lower frequencies => lower Vcore.

 

On 1/13/2021 at 10:20 PM, batyanko said:

Photos of le scientific measurement equipment:

https://photos.app.goo.gl/G1ChgoxWdchgiEwX9


Really scientific!

On 1/5/2021 at 2:35 PM, batyanko said:

Idle consumption of a quad-core i7 should be comparable to a dual-core, so it seems that something is keeping the CPU busy.

It seems you will need to do some research into what is using CPU in the background, which could be something as ridiculous as the Google animation on the Chrome start page.

Common heavy offenders are website ads or background Windows 10 processes, but the latter is only fixed by switching to Linux...

 

And of course you could underclock; I do this often on battery. 1200 MHz is, I think, the minimum for most Ivy Bridge CPUs, and 4 cores at that frequency are fine for office work.

The nice thing about this: SpeedStep dynamically lowers the voltage at lower frequencies, so underclocking is extra efficient.

Your benchmarks were interesting, but I observed the same drop in battery life when I switched from a dual-core to a quad-core. I don't have my dual-core anymore to test again. I think that in your benchmarks the problem shows up under full load: the dual-core will be at full load at 2.4 GHz (let's say), but in Windows the quad-core will also be at 2.4 GHz at full load. And when I say full load, I also mean medium load (it is rare to use a laptop 100% idle; watching a film can be considered idle, but browsing the internet in greedy Chrome, not so much).

I would be interested to see a benchmark of system consumption at 2.4 GHz (let's say) for both the dual-core and the quad-core. For sure the quad-core will draw roughly twice the CPU power.

Nevertheless, the quad-core should perform about twice as fast as the dual-core, thus finishing the task twice as early, but I doubt the Windows CPU governor is efficient enough to power the cores down as quickly as they stop working. This would also explain my impression of the battery draining far more quickly on the quad-core than on the dual-core: under medium or full load all the cores turn on (consuming about twice the power), and they don't turn off as quickly as they turn on.

Edit: I saw that you are on Linux Mint, so maybe the governor is tweaked there. But the reasoning should still hold on Windows (i.e. comparing consumption at 2.4 GHz with all cores loaded: 2c/4t for the dual-core versus 4c/8t for the quad-core).
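Checking which governor Mint is actually using is quick; a small sketch for the Linux side (on Windows the equivalent knobs live under the active power plan's processor power management section):

```python
# Sketch: print each core's cpufreq governor and its current/allowed frequencies (Linux).
import glob

def read(path):
    try:
        with open(path) as f:
            return f.read().strip()
    except OSError:
        return "n/a"

for cpu in sorted(glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq")):
    name = cpu.split("/")[-2]  # e.g. "cpu0"
    print(name,
          read(cpu + "/scaling_governor"),
          read(cpu + "/scaling_cur_freq"), "kHz now,",
          read(cpu + "/scaling_min_freq"), "-", read(cpu + "/scaling_max_freq"), "kHz allowed")
```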

Edited by juandante

