
15 hours ago, MioIV said:

It depends on your needs. If you're planning to use the 2570p as a portable laptop, then you have two options: take the most powerful 2-core/4-thread CPU (e.g. the 3540M) or the most power-saving 4-core/8-thread one (the 3612QM). Either choice will bring a noticeable performance improvement over the base i3/i5. I suggest taking a 2/4 CPU, since it will consume less power in lightweight scenarios like web browsing or text work (but for multithreaded apps a 4/8 is the better choice). If you will use it at home and are ready to make some cooling improvements, then you can have a look at the 3740QM.
The graphics part (HD 4000) is too slow and lacks support nowadays, and the cooling system of the 2570p is not designed for a 4/8 CPU under heavy loads. If you're fine with the current performance, then it makes sense to just leave it as is.

Thank you for the detailed information. I read on the first page that a good CPU with a 35 W TDP and "cooler" temperatures could be the 3632QM. From your info I understand that this CPU is a compromise between the 3612QM and the 3740QM. What do you think?


17 minutes ago, fero985 said:

Thank you for the detailed information. I read on the first page that a good CPU with a 35 W TDP and "cooler" temperatures could be the 3632QM. From your info I understand that this CPU is a compromise between the 3612QM and the 3740QM. What do you think?

The 3632QM is only 100 MHz faster than the 3612QM, so they're in the same grade. The 3740QM has an extra 600 MHz, which will be a noticeable performance boost, but it will also heat up like hell at full load. For choosing between the 3612QM and the 3632QM you can search for power-efficiency comparisons; I haven't dug into that much.


  • 1 month later...
  • 2 weeks later...

I have upgraded the laptop to a quad-core i7-3610QM, but I noticed my battery now discharges at a much faster rate. Previously, when it was equipped with a dual-core 3520M, the battery lasted 5.5 hours before reaching 15 percent; now it shows only 2 hours while browsing the internet or reading documents on screen. Can I improve the battery performance? I would like to keep the battery healthy and not always use the laptop on AC power.


25 minutes ago, SteliyanStakl said:

I have upgraded the laptop to a quad-core i7-3610QM, but I noticed my battery now discharges at a much faster rate. Previously, when it was equipped with a dual-core 3520M, the battery lasted 5.5 hours before reaching 15 percent; now it shows only 2 hours while browsing the internet or reading documents on screen. Can I improve the battery performance? I would like to keep the battery healthy and not always use the laptop on AC power.

Idle consumption of a quad-core i7 should be comparable to a dual-core, so it seems that something is creating load for the CPU.

It seems you will need to do some research on what uses CPU in the background, which could be something as ridiculous as the Google animation on Chrome's start page.

Common heavy offenders are website ads or background Windows 10 processes, though the latter is only really fixed by switching to Linux...

 

And of course you could underclock; I do this often on battery. 1200 MHz is, I think, the minimum for most Ivy Bridge chips, and 4 cores at that frequency are fine for office work.

The nice thing about this is that SpeedStep dynamically lowers the voltage at lower frequencies, so underclocking is extra efficient.
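In case it helps, this is roughly how I cap the frequency on Linux — a minimal sketch assuming the standard cpufreq sysfs interface and root rights; cpupower or TLP would do the same job:

#!/usr/bin/env python3
# cap_freq.py - cap every core's maximum frequency via the cpufreq sysfs interface
# (values are in kHz; needs root)
import glob
import sys

def cap_max_freq(khz: int) -> None:
    for path in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_max_freq"):
        with open(path, "w") as f:
            f.write(str(khz))

if __name__ == "__main__":
    # default to 1.2 GHz, the practical minimum for most Ivy Bridge chips
    cap_max_freq(int(sys.argv[1]) if len(sys.argv) > 1 else 1_200_000)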


On 1/5/2021 at 3:35 PM, batyanko said:

Idle consumption of a quad-core i7 should be comparable to a dual-core

Not really; the two extra cores consume about 1.5-3 extra watts when idle.

 

On 1/5/2021 at 2:57 PM, SteliyanStakl said:

Can I improve the battery performance

Limit the max CPU performance % in Power Settings; that caps the maximum frequency. AFAIK you can also set the PLx power limits using Intel XTU, which would be better if you need good single-core performance, but I haven't tested it.
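If you prefer to script it instead of clicking through Power Options, the same cap can be set with powercfg — a rough sketch assuming the standard Windows 10 power-setting aliases, run from an elevated prompt:

# set_proc_max.py - cap the "maximum processor state" on battery via powercfg
import subprocess

def set_max_processor_state_on_battery(percent: int) -> None:
    # PROCTHROTTLEMAX = "Maximum processor state" under Processor power management
    subprocess.run(["powercfg", "/setdcvalueindex", "SCHEME_CURRENT",
                    "SUB_PROCESSOR", "PROCTHROTTLEMAX", str(percent)], check=True)
    subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)

if __name__ == "__main__":
    set_max_processor_state_on_battery(60)  # ~60% keeps an Ivy Bridge quad well below turbo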


On 1/7/2021 at 1:13 AM, MioIV said:

Not really; the two extra cores consume about 1.5-3 extra watts when idle.

That was interesting to test.

 

3720qm and a 3320m, both at 2.60 GHz nominal freqs.

Test bench:

Dell Latitude E6430, 2x8GB DDR3-1600

Linux Mint 20 at min. brightness

radio off

Battery off

 

Idle Watts shown on Kill-a-watt thingie:

3720qm - 7.0 W

3320m - 6.8 - 6.9 W

I would call that about a 0.15 W difference, rather than 1.5-3.0 W. Hardly a factor in energy consumption.

Though it's another question how often you'll actually sit at those watts in Windows 10. Pretty rarely, I guess.
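(For anyone without a wall meter: on Linux you can get a rough software cross-check of CPU package power from the RAPL counters — a small sketch assuming /sys/class/powercap/intel-rapl:0 exists; note this reports package power only, not the whole-system draw a Kill-a-watt shows.)

# rapl_watts.py - average CPU package power from the RAPL energy counter (Linux)
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package energy in microjoules

def package_watts(interval: float = 5.0) -> float:
    with open(RAPL) as f:
        e0 = int(f.read())
    time.sleep(interval)
    with open(RAPL) as f:
        e1 = int(f.read())
    return (e1 - e0) / 1e6 / interval  # uJ -> J -> W (ignores counter wrap-around)

if __name__ == "__main__":
    print(f"average package power: {package_watts():.2f} W")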

 

So here is a test of a modest load: watching an .avi in VLC, with core loads averaging around 3%.

3720qm - ~14.5 W

3320m - ~13.8 W

Here the difference grows to about 0.7 W. It seems the i7 gets less efficient because all the cores are woken up while there is still not much to do.

Another interesting note: underclocking the CPU and iGPU doesn't seem to matter here; those minimum frequencies already seem to be the preferred and sufficient mode even without forcing them.

 

Finally I decided to pit 100% load on 2 cores at 2400 MHz against 4 cores at 1200 MHz, to test the popular opinion that a quad-core at low frequencies is more efficient than a dual-core at higher frequencies.

So I ran a stress test of 4 threads at 2400 MHz on the 3320m, and of 8 threads at 1200 MHz on the 3720qm.

That should, in a crude way, be the same amount of (totally useless) work done per unit of time.

(Note: the radios were on here and WiFi was connected - I realized that too late... subtract, say, 0.8 W to compare with the readings above.)

3720qm - ~20.3 W

3320m - ~21.2 W

Here the 3720qm already seems to be more efficient, by around 0.9 W.
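If anyone wants to repeat this, the load itself is trivial to script — a crude sketch of the busy-loop workers I mean (it assumes the max frequency has already been capped via cpufreq or the BIOS; 4 workers for the 3320m, 8 for the 3720qm):

# busy_load.py - run N busy-loop workers for a fixed duration and watch the watt meter
import multiprocessing as mp
import sys
import time

def burn(seconds: float) -> None:
    end = time.time() + seconds
    x = 0
    while time.time() < end:
        x += 1  # totally useless work, just keeps the core pegged at 100%

if __name__ == "__main__":
    workers = int(sys.argv[1]) if len(sys.argv) > 1 else 4
    duration = 120.0
    procs = [mp.Process(target=burn, args=(duration,)) for _ in range(workers)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()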

 

IMO the third example is the most realistic level of load for everyday work, keeping in mind that modern websites are quite energy-hungry and browsers make good use of multi-threading.

So my best bet for energy efficiency would be an underclocked quad-core i7, say, anywhere between 1200 and 2600 MHz.

And yes, running on battery is one use case where Linux has a clear advantage over Windows 10; Linux tends to do very little background work compared with Windows 10.

 

On 1/7/2021 at 1:13 AM, MioIV said:

Limit the max CPU performance % in Power Settings; that caps the maximum frequency. AFAIK you can also set the PLx power limits using Intel XTU, which would be better if you need good single-core performance, but I haven't tested it.

So yes, that should do it. Make sure to underclock the iGPU too; it can be quite the power hog.
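On Linux the HD 4000 can be capped through the i915 sysfs knobs — a small sketch, assuming a kernel that exposes gt_max_freq_mhz/gt_boost_freq_mhz under card0 and root rights (on Windows you'd have to rely on the driver or XTU instead):

# cap_igpu.py - cap the Intel iGPU maximum frequency via the i915 sysfs interface (MHz)
def cap_gpu_max_freq(mhz: int, card: str = "card0") -> None:
    base = f"/sys/class/drm/{card}/"
    # lower the boost limit before the max limit so the two values stay consistent
    for knob in ("gt_boost_freq_mhz", "gt_max_freq_mhz"):
        with open(base + knob, "w") as f:
            f.write(str(mhz))

if __name__ == "__main__":
    cap_gpu_max_freq(650)  # the HD 4000 idles around 350 MHz, so 650 MHz is a modest cap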

 

Limitations: these results may be non-representative due to binning.

My impression of other i5s is consistent with what I observed above, but I admit this is the only i7 I have observed in detail.

 

Photos of le scientific measurement equipment:

https://photos.app.goo.gl/G1ChgoxWdchgiEwX9


On 1/13/2021 at 10:20 PM, batyanko said:

That was interesting to test.

 

3720qm and a 3320m, both at 2.60 GHz nominal freqs.

Test bench:

Dell Latitude E6430, 2x8GB DDR3-1600

Linux Mint 20 at min. brightness

radio off

Battery off

 

Idle Watts shown on Kill-a-watt thingie:

3720qm - 7.0 W

3320m - 6.8 - 6.9 W

I would call that about a 0.15 W difference, rather than 1.5-3.0 W. Hardly a factor in energy consumption.

Hmmm, that's strange. I used BatteryInfoView to trace power consumption a year ago. Everything off, display off, power-saver profile. The minimum idle power consumption on my 2570p with its stock 3210M was around 5.5 W, and under the same conditions with the 3740QM it didn't fall below 7.8 W.

With light loads like writing code (nodejs, autocomplete, type checking, etc.) or web browsing, my laptop runs around 4 hours on the 4-core CPU, while on the 2-core it can run 6. Two hours! So I decided to switch back to the 3210M, since I need portability first.
 

On 1/13/2021 at 10:20 PM, batyanko said:

Here the 3720qm already seems to be more efficient, by around 0.9 W.

 

Well, that's to be expected: lower frequencies => lower Vcore.

 

On 1/13/2021 at 10:20 PM, batyanko said:

Photos of le scientific measurement equipment:

https://photos.app.goo.gl/G1ChgoxWdchgiEwX9


Really scientific!


On 1/5/2021 at 2:35 PM, batyanko said:

Idle consumption of a quad-core i7 should be comparable to a dual-core, so it seems that something is creating load for the CPU.

It seems you will need to do some research on what uses CPU in the background, which could be something as ridiculous as the Google animation on Chrome's start page.

Common heavy offenders are website ads or background Windows 10 processes, though the latter is only really fixed by switching to Linux...

And of course you could underclock; I do this often on battery. 1200 MHz is, I think, the minimum for most Ivy Bridge chips, and 4 cores at that frequency are fine for office work.

The nice thing about this is that SpeedStep dynamically lowers the voltage at lower frequencies, so underclocking is extra efficient.

Your benchmarks were interesting, but I observed the same drop in battery life when I switched from dual-core to quad-core. I don't have my dual-core anymore to test again. I think that in your benchmarks the problem shows up under load: the dual-core will be at full load at, let's say, 2.4 GHz, but the quad-core will also be at 2.4 GHz at full load in Windows. And when I say full load, I also mean medium load; it is rare to use a laptop 100% idle. Watching a film can be considered idle, but browsing the internet in greedy Chrome, not so much.

I would be interested to see a benchmark of system consumption at, let's say, 2.4 GHz for both the dual-core and the quad-core. Surely the quad-core will consume about twice the CPU power.

Granted, the quad-core should finish the task about twice as fast, but I doubt the Windows CPU governor is efficient enough to turn the cores off as quickly as they stop working. This would explain why I also had the feeling of the battery draining far more quickly on the quad-core than on the dual-core: under medium or full load, all the cores turn on (consuming roughly twice the power), and they don't turn off as quickly as they turn on.

Edit: I saw that you are on Linux Mint, so maybe the governor is tweaked there. But the reasoning should still hold on Windows: compare consumption at, say, 2.4 GHz with all cores active, i.e. 2c/4t for the dual-core versus 4c/8t for the quad-core.
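To put crude numbers on that race-to-idle argument: energy is power × time, so the quad-core only wins if it actually drops back to idle once the work is done. A toy model (every wattage and the "lingering" time here are hypothetical illustration values, not measurements):

# race_to_idle.py - toy model for "finishes faster but doesn't park the cores quickly"
def energy_wh(load_w: float, idle_w: float, busy_s: float, linger_s: float,
              window_s: float = 3600.0) -> float:
    # Energy over a fixed window: load power while busy and while lingering, idle for the rest.
    awake = busy_s + linger_s
    return (load_w * awake + idle_w * (window_s - awake)) / 3600.0

# Hypothetical figures for the same task at 2.4 GHz: the quad finishes in half the time
# but draws more power while active, and the governor keeps it awake for a while after.
dual = energy_wh(load_w=21.0, idle_w=6.9, busy_s=600, linger_s=10)
quad = energy_wh(load_w=33.0, idle_w=7.0, busy_s=300, linger_s=120)
print(f"dual-core: {dual:.2f} Wh   quad-core: {quad:.2f} Wh over a one-hour window")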


Quote

I am planning a build with the 2570p housing. 

The first goal is to find a decent display to replace the stock LTN panel.

 

What would be the absolute best panel that could fit with some modifications?

 

Thank you! 

 

 

Will 2560p hinges fit in the 2570p model too?


1 hour ago, Atheros said:

 

 

Will 2560p hinges fit in the 2570p model too?

Of course they will fit; it is essentially the same laptop. The only difference between the two is the light sensor at the top of the screen, which was removed on the 2570p.


3 minutes ago, juandante said:

Of course they will fit; it is essentially the same laptop. The only difference between the two is the light sensor at the top of the screen, which was removed on the 2570p.

 

Do they have the same hole size?


On 2/8/2020 at 2:52 PM, Sonney said:

My 2570p has been successfully modified with the X230's LG IPS panel.
The first thing I did was remove all the magnesium guiding ridges for the antenna wires, and the LCD cable compartment screw mounting posts that were in the panel's way. And because the LCD cable compartment can't be screwed down anymore, it's now held in place with epoxy.
Removing the panel's screw mounting points was the riskiest part, but the metal was soft enough to be cut with long scissors. That required replacing the LCD tape, otherwise the screen layers would come apart. I used Kapton tape covered with electrical tape all along the edges.
The LCD cable turned out to be flexible enough to reach the panel's connector. The X230 panel is much thinner, so there was enough room for it.
The last thing to do was adapt the LCD bezel to the new screen. Because the X230 panel is taller (due to the location of its circuit board), I had to cut 4 millimeters from the top of the LCD bezel. The different screen height exposed the bottom LCD tape, so I replaced it with black tape. I also had to remove the LCD bezel's bottom latches; it's now held in place with double-sided tape.
If you've got any questions, feel free to ask.
Is this the first IPS 2570p ever?

 

On 2/8/2020 at 2:52 PM, Sonney said:

(Photos attached: photo_2020-02-08_13-59-25.jpg, photo_2020-02-08_14-28-33.jpg, photo_2020-02-08_14-18-25.jpg, photo_2020-02-08_14-18-26 (2).jpg)

 

 

Incredible job! A question about the bottom part of the new screen (the black strip with the connector and the logic board): could it have been folded behind the panel, while keeping the connector attached, so that the screen wouldn't sit taller? Or could it have been fixed in a less permanent way than gluing it to the chassis?
Thanks for sharing your attempt!


  • 3 weeks later...

https://www.win-raid.com/t834f25-USB-Drivers-original-and-modded.html

 

Intel USB 3.0/3.1 Drivers & Software Set v5.0.4.43_v2 WHQL for Win7

 

This version, intended for series 8 chipsets and up, works for the 2570p on Win7; it's stable, side effects unknown. The last officially supported version for the Q77 chipset is v1.0.10.255 WHQL.

 

 


  • 2 weeks later...
On 12/5/2012 at 8:04 AM, Tech Inferno Fan said:

#1

Performance upgrade: external graphics (eGPU)

DIY eGPU to attach a desktop videocard via the 2570P's 5Gbps expresscard slot.

A NVidia GTX460/560/660/670 is a straight plug-n-play implementation on a 2570P when using Windows 7, though Win8.x is problematic (src: here). Win8.x users may consider doing a UEFI installation instead, which resolved error 12 and gave hotplug capabilities as noted here.

An AMD card, GTX650, GTX750, GTX9xx or older NVidia cards require a DSDT override and (maybe) DIY eGPU Setup 1.x interposer software to eradicate error 12 that prevents their functionality. See the 2570P DSDT override details if using one of these cards.

Implemented on a 2570P at x1.2Opt speed (5Gbps + compression) using a NVidia GTX670 (jacobsson), GTX660Ti (Tech Inferno Fan), GTX560Ti (bjorm), GTX660 (dewos), GTX560Ti (hatoblue), GTX650Ti (phillofoc)

Above: simplified HP 2570P eGPU installation process instructions courtesy of T|U user badbadbad


Performance upgrade: storage

Extra hotswappable 9.5mm (2.5") HDD/SSD added through the optical-drive bay (SimoxTav) or newmodeus.
US$18 optical drive faceplate replacement part
2570P RAID-0 SSD guide (jacobsson) - get superfast bootup and 1 GB/s sequential reads by configuring two SATA-III SSDs as a RAID-0 volume
US$4 eSATAp cable - connect your SATA HDD/SSD to the 2570P's eSATAp (combo SATA + USB 3.0) port for superfast bootable storage.
FAQ about the optical drive space saver: can its faceplate be used on other ODDs/caddies? Answer: no/yes.

#2
 


The 2570p also provides an mSATA interface, so users can add their own SSD to improve overall performance.

 

 

I have a few questions:

1. Regarding quote #1: so if I use an nVidia GTX460/560/660/670, I can just plug it into the eGPU adapter, install the drivers, and I don't need any other mods on the system, like a DSDT override? Just plug and play?

Bonus question #1: I see on AliExpress that there are a few versions of eGPU adapters, like v8, v8.5, v9 (are those it?). Is there any difference between the models, and which one is best to use with the 2570p?

 

2. About the second PCIe Mini Card slot that was meant for the WWAN card: I don't need a WWAN card. Is there any other use for that slot, and what is it?

Bonus question #2: there is some kind of slot for a 56k dial-up modem; is that slot useless for anything except a dial-up modem?

 

3. I've read on AliExpress that some optical-bay HDD caddies have problems delivering full SATA II speeds. Any experience with that? Is there a specific brand that should be bought, or are they all the same?


@svabo
1. Well,
 

Quote

is a straight plug-n-play implementation

I'm not sure about the adapters, but most likely they don't differ much.

2. Patch the BIOS, solder the signal caps, and use the slot's mSATA capability. To use any other USB-based card in that slot you'll also need to remove the whitelist.
Yes, it's only for the modem.

3. The HDD caddy is a plain passive adapter plus a bunch of plastic; there isn't much in it that could degrade the SATA connection.


11 minutes ago, MioIV said:

@svabo
1. Well,
 

I'm not sure about the adapters, but most likely they don't differ much.

2. Patch the BIOS, solder the signal caps, and use the slot's mSATA capability. To use any other USB-based card in that slot you'll also need to remove the whitelist.
Yes, it's only for the modem.

3. The HDD caddy is a plain passive adapter plus a bunch of plastic; there isn't much in it that could degrade the SATA connection.

MioIV, thanks for the reply.

About #1: do you have, or have you ever tested, that kind of setup with those "plug & play" cards, or are you just confirming what is said in the first post of the topic?

About #2: so if I solder the signal caps and turn the PCIe Mini Card slot into an mSATA slot, I can use an mSATA SSD in it? If yes, do you know the performance of the slot itself (is it SATA, SATA II or SATA III speed)?


@svabo
I haven't tested it myself, unfortunately; I'm just summarizing what was written earlier in this topic. I recommend reading the related posts here to get all the details before making any decision.
 

Quote

I can use an mSATA SSD in it

You'll NEED to patch the BIOS before mSATA becomes usable, because the corresponding SATA port is disabled by default. It will run at SATA II speed, and it will not be bootable (it can't be selected as a boot device at power-on).


48 minutes ago, MioIV said:

@svabo
I haven't tested it myself, unfortunately; I'm just summarizing what was written earlier in this topic. I recommend reading the related posts here to get all the details before making any decision.

Thanks, but I've read the related posts and couldn't find anything about those "plug & play" cards or about not needing a DSDT override or any other modification.


4 hours ago, svabo said:

Thanks, but I've read the related posts and couldn't find anything about those "plug & play" cards or about not needing a DSDT override or any other modification.

Is it not enough for you?


  • 3 weeks later...

I have two 8 GB ADATA memory sticks (a 2x8 GB kit) in my HP 2570p and just noticed that the memory runs at 1300 MHz in dual-channel mode instead of the advertised 1600 MHz. The i7-3610QM supports 1600 MHz according to the manufacturer. What is wrong with my configuration?


36 minutes ago, stoyancho said:

I have two 8 GB ADATA memory sticks (a 2x8 GB kit) in my HP 2570p and just noticed that the memory runs at 1300 MHz in dual-channel mode instead of the advertised 1600 MHz. The i7-3610QM supports 1600 MHz according to the manufacturer. What is wrong with my configuration?

 

Try resetting the BIOS to defaults, plugging in one stick at a time, etc.

 

Sometimes the BIOS doesn't make good sense of the SPD memory info when setting timings and memory frequency, so you need to give it another chance.

 

A bit surprising to me though; I never noticed such problems with Ivy Bridge. Mostly with an AM3 ASRock mobo (it was so bad it chose timings that crashed the system), but there at least I could set the timings and frequency manually.
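If you want to see what the BIOS actually programmed before reseating sticks, dmidecode reports both the rated and the configured speed (on Windows, CPU-Z shows the same) — a small sketch for Linux, run as root:

# ram_speed.py - show rated vs. configured DDR3 speed per DIMM via dmidecode (Linux, needs root)
import subprocess

def memory_speeds() -> list[tuple[str, str]]:
    out = subprocess.run(["dmidecode", "-t", "memory"],
                         capture_output=True, text=True, check=True).stdout
    rated, configured = [], []
    for raw in out.splitlines():
        line = raw.strip()
        if line.startswith("Speed:"):
            rated.append(line.split(":", 1)[1].strip())
        elif line.startswith(("Configured Memory Speed:", "Configured Clock Speed:")):
            configured.append(line.split(":", 1)[1].strip())
    return list(zip(rated, configured))

if __name__ == "__main__":
    for i, (r, c) in enumerate(memory_speeds()):
        print(f"DIMM {i}: rated {r}, running at {c}")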

