
15 hours ago, MioIV said:

It depends on your needs. If you plan to use the 2570p as a portable laptop, you have two options: the most powerful 2-core/4-thread CPU (e.g. the 3540M) or the most power-saving 4-core/8-thread one (the 3612QM). Either choice will bring a noticeable performance improvement over the base i3/i5 CPU. I suggest a 2/4 CPU, since it consumes less power in lightweight scenarios like web browsing and text work (though for multithreaded apps a 4/8 is the better choice). If you'll use it at home and are ready to make some cooling improvements, then have a look at the 3740QM.
The graphics (HD 4000) are too slow and lack support nowadays, and the 2570p's cooling system was not designed for a 4/8 CPU under heavy loads. If you're happy with the current performance, it makes sense to just leave it as is.

Thank you for the detailed information. I read on the first page that a good CPU with a 35 W TDP and cooler temperatures could be the 3632QM. From your info I understand this CPU is a compromise between the 3612QM and the 3740QM. What do you think?

17 minutes ago, fero985 said:

Thank you for the detailed information. I read on the first page that a good CPU with a 35 W TDP and cooler temperatures could be the 3632QM. From your info I understand this CPU is a compromise between the 3612QM and the 3740QM. What do you think?

The 3632QM is only 100 MHz faster than the 3612QM, so they're in the same class. The 3740QM has an extra 600 MHz, which is a noticeable performance boost, but it will also run hot as hell at full load. To choose between the 3612 and the 3632, look for power-efficiency comparisons; I haven't dug into that much.


I upgraded the laptop to the quad-core i7-3610QM but noticed the battery now discharges at a much faster rate. Previously, with the dual-core 3520M, the battery lasted 5.5 hours before reaching 15 percent; now it shows only 2 hours while browsing the internet or reading documents on screen. Can I improve the battery runtime? I'd like to keep the battery healthy and not always run the laptop on AC power!

25 minutes ago, SteliyanStakl said:

I upgraded the laptop to the quad-core i7-3610QM but noticed the battery now discharges at a much faster rate. Previously, with the dual-core 3520M, the battery lasted 5.5 hours before reaching 15 percent; now it shows only 2 hours while browsing the internet or reading documents on screen. Can I improve the battery runtime? I'd like to keep the battery healthy and not always run the laptop on AC power!

Idle consumption of a quad-core i7 should be comparable to a dual-core, so it seems something is creating load on the CPU.

You'll need to do some research into what's using CPU in the background; it could be something as trivial as the Google animation on Chrome's start page.

Common heavy offenders are website ads and background Windows 10 processes, though the latter is only really fixed by switching to Linux...

 

And of course you could underclock; I do this often on battery. 1200 MHz is, I think, the minimum for most Ivy Bridge chips, and 4 cores at that frequency are fine for office work.

The nice thing is that SpeedStep dynamically lowers the voltage at lower frequencies, so underclocking is extra efficient.
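
If you'd rather script that cap than click through settings, here's a minimal sketch, assuming Linux with the cpufreq sysfs interface (values are in kHz and writing needs root); on Windows, see the Power Options route discussed later in the thread:

```python
#!/usr/bin/env python3
"""Cap every core's maximum frequency at 1.2 GHz via cpufreq sysfs."""
import glob

MAX_KHZ = "1200000"  # 1.2 GHz, the Ivy Bridge minimum mentioned above

for path in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_max_freq"):
    with open(path, "w") as f:  # needs root
        f.write(MAX_KHZ)
    print(path, "->", MAX_KHZ, "kHz")
```

Capping scaling_max_freq only limits the frequency; SpeedStep still drops the voltage along with it, which is where the extra savings come from.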

On 7/26/2020 at 12:23 AM, invait53 said:

My advice is the Wi-Fi 6 MPE-AX3000H. If you want to use that module's Bluetooth, you must do the mod. That Wi-Fi card works with Windows 10 only.

Do you own that card? Is it stable so far?

On 1/5/2021 at 3:35 PM, batyanko said:

Idle consumption of a quad-core i7 should be comparable to a dual-core

Not really; the two extra cores consume about 1.5-3 extra watts at idle.

 

On 1/5/2021 at 2:57 PM, SteliyanStakl said:

Can I improve the battery performance

Limit the max CPU performance % in Power Settings; that caps the maximum frequency. AFAIK you can also set the PLx power limits with Intel XTU, which is better if you need good single-core performance, but I haven't tested it.
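
For reference, here's a sketch of that first suggestion, assuming Windows and an elevated prompt. SUB_PROCESSOR and PROCTHROTTLEMAX are documented powercfg aliases for the "Maximum processor state" knob; the 60% value is just an example:

```python
"""Cap the on-battery "Maximum processor state" via powercfg."""
import subprocess

MAX_STATE_PCT = "60"  # example value: roughly 60% of maximum frequency

# /setdcvalueindex changes the on-battery (DC) value only;
# use /setacvalueindex for the plugged-in value.
subprocess.run(["powercfg", "/setdcvalueindex", "scheme_current",
                "sub_processor", "PROCTHROTTLEMAX", MAX_STATE_PCT], check=True)
subprocess.run(["powercfg", "/setactive", "scheme_current"], check=True)
```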

On 1/7/2021 at 1:13 AM, MioIV said:

Not really; the two extra cores consume about 1.5-3 extra watts at idle.

That was interesting to test.

 

A 3720QM and a 3320M, both at a 2.60 GHz nominal frequency.

Test bench:

Dell Latitude E6430, 2x8GB DDR3-1600

Linux Mint 20 at min. brightness

radios off

battery disconnected

 

Idle watts shown on the Kill A Watt thingie:

3720QM - 7.0 W

3320M - 6.8-6.9 W

I would call that about a 0.15 W difference, rather than 1.5-3.0 W. Hardly a factor in energy consumption.

Though it's another question how often you'll actually sit at those idle watts in Windows 10. Pretty rarely, I guess.

 

So here is a test of a modest load: watching an .avi in VLC, with core loads averaging around 3%.

3720QM - ~14.5 W

3320M - ~13.8 W

Here the difference grows to about 0.7 W. It seems the i7 gets less efficient because all cores are woken up while there's still not much to do.

Another interesting note: underclocking the CPU and iGPU doesn't seem to matter here; the minimum frequencies seem to be the preferred and sufficient mode even without forcing them.

 

Finally, I decided to pit 100% load on 2 cores at 2400 MHz against 4 cores at 1200 MHz, to test the popular opinion that a quad-core at low frequencies is more efficient than a dual-core at higher frequencies.

So I ran a stress test of 4 threads at 2400 MHz on the 3320M, and of 8 threads at 1200 MHz on the 3720QM.

In a crude way that should be the same amount of (totally useless) work done per unit of time: 2 cores x 2400 MHz = 4 cores x 1200 MHz = 4800 core-MHz.

(Note: the radios were on and WiFi was connected here; I realized that too late... subtract, say, 0.8 W to compare with the readings above.)

3720QM - ~20.3 W

3320M - ~21.2 W

Here the 3720QM already comes out ahead by about 0.9 W.

 

IMO the third example is the most realistic level of load for everyday work, keeping in mind that modern websites are quite energy-hungry and browsers make good use of multithreading.

So my best bet for energy efficiency would be an underclocked quad-core i7, say anywhere between 1200 and 2600 MHz.

And yes, running on battery is one use case where Linux has a clear advantage over Windows 10: Linux tends to do very little background work compared with Windows 10.

 

On 1/7/2021 at 1:13 AM, MioIV said:

Limit the max CPU performance % in Power Settings; that caps the maximum frequency. AFAIK you can also set the PLx power limits with Intel XTU, which is better if you need good single-core performance, but I haven't tested it.

So yes, that should do it. Make sure to underclock the iGPU too; it can be quite the power hog.
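
On Linux the i915 driver exposes the iGPU frequency limits in sysfs, so the same trick works there. A minimal sketch, assuming card0 is the HD 4000 and using 650 MHz purely as an example cap (write as root):

```python
#!/usr/bin/env python3
"""Cap the Intel iGPU's maximum frequency via the i915 sysfs knobs."""
MAX_MHZ = "650"  # example cap; gt_RP1_freq_mhz shows the efficient point

# gt_boost_freq_mhz only exists on newer kernels, so skip it if missing.
for knob in ("gt_max_freq_mhz", "gt_boost_freq_mhz"):
    try:
        with open(f"/sys/class/drm/card0/{knob}", "w") as f:  # needs root
            f.write(MAX_MHZ)
    except FileNotFoundError:
        pass
```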

 

Limitations: these results may be non-representative due to chip binning.

My impression of other i5s is consistent with what I observed above, but I admit this is the only i7 I've observed in detail.

 

Photos of le scientific measurement equipment:

https://photos.app.goo.gl/G1ChgoxWdchgiEwX9

On 1/13/2021 at 10:20 PM, batyanko said:

That was interesting to test.

 

A 3720QM and a 3320M, both at a 2.60 GHz nominal frequency.

Test bench:

Dell Latitude E6430, 2x8GB DDR3-1600

Linux Mint 20 at min. brightness

radios off

battery disconnected

 

Idle watts shown on the Kill A Watt thingie:

3720QM - 7.0 W

3320M - 6.8-6.9 W

I would call that about a 0.15 W difference, rather than 1.5-3.0 W. Hardly a factor in energy consumption.

Hmmm, that's strange. I used BatteryInfoView to trace power consumption a year ago. Everything off, display off, power saver profile. The summarized minimal idle power consumption on my 2570p with its stock 3210M CPU was around 5.5 W, and under the same conditions with the 3740QM it didn't fall below 7.8 W.

With light loads like code writing (nodejs, autocomplete, type checking, etc.) or web browsing, my laptop runs around 4 hours with the 4-core CPU, while with the 2-core it can run 6. Two hours! So I decided to switch back to the 3210M, since I need portability first.
 

On 1/13/2021 at 10:20 PM, batyanko said:

Here the 3720QM already comes out ahead by about 0.9 W.

 

Well, that was to be expected: lower frequencies => lower Vcore.

 

On 1/13/2021 at 10:20 PM, batyanko said:

Photos of le scientific measurement equipment:

https://photos.app.goo.gl/G1ChgoxWdchgiEwX9


Really scientific!

On 1/5/2021 at 2:35 PM, batyanko said:

Idle consumption of a quad-core i7 should be comparable to a dual-core, so it seems something is creating load on the CPU.

You'll need to do some research into what's using CPU in the background; it could be something as trivial as the Google animation on Chrome's start page.

Common heavy offenders are website ads and background Windows 10 processes, though the latter is only really fixed by switching to Linux...

 

And of course you could underclock; I do this often on battery. 1200 MHz is, I think, the minimum for most Ivy Bridge chips, and 4 cores at that frequency are fine for office work.

The nice thing is that SpeedStep dynamically lowers the voltage at lower frequencies, so underclocking is extra efficient.

Your benchmarks were interesting, but I observed the same drop in battery life when I switched from dual-core to quad-core. I don't have my dual-core anymore to test again. I think that in your benchmarks the problem shows up at full load: the dual-core will be at full load at, say, 2.4 GHz, but the quad-core will also be at 2.4 GHz at full load in Windows. And by full load I also mean medium load (it is rare to use a laptop 100% idle; watching a film can be considered idle, but browsing the internet in greedy Chrome, not so much).

I would be interested to see a benchmark of system consumption at, say, 2.4 GHz for both the dual-core and the quad-core. For sure the quad-core will consume about twice the CPU power.

Nevertheless, the quad-core should finish the task about twice as fast, but I doubt the Windows CPU governor is efficient enough to park the cores as quickly as they stop working. This would explain why I also had the feeling of the battery draining far more quickly on the quad-core: at medium or full load all the cores turn on (consuming twice the power), and they don't turn off as quickly as they turn on.

Edit: I see that you are on Linux Mint, so maybe the governor is tweaked there. But the reasoning should still hold on Windows (i.e., consumption at, say, 2.4 GHz with all cores active: 2c/4t for the dual-core versus 4c/8t for the quad-core).
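
For anyone wanting to check the governor part of that guess on the Linux side, it's visible in sysfs; a quick sketch (reading needs no privileges):

```python
#!/usr/bin/env python3
"""Print each core's cpufreq governor, e.g. to see whether Mint is on
"ondemand"/"powersave" and schedules idle cores differently than Windows."""
import glob

paths = sorted(glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_governor"))
for path in paths:
    cpu = path.split("/")[5]  # e.g. "cpu0"
    with open(path) as f:
        print(cpu, "->", f.read().strip())
```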

Edited by juandante

Quote

I am planning a build with the 2570p housing.

The first goal is to find a decent display to replace the LTN panel.

 

What would be the absolute best panel that could fit with some modifications?

 

Thank you! 

 

 

Will the 2560p hinges fit the 2570p model too?

Edited by Atheros

1 hour ago, Atheros said:

 

 

Will the 2560p hinges fit the 2570p model too?

Of course they will fit; it is the exact same laptop. The only difference between the two is the ambient light sensor above the screen, which was removed on the 2570p.

Edited by juandante

3 minutes ago, juandante said:

Of course they will fit; it is the exact same laptop. The only difference between the two is the ambient light sensor above the screen, which was removed on the 2570p.

 

Do they have the same hole size?

On 2/8/2020 at 2:52 PM, Sonney said:

My 2570p successfully modified with the X230's LG IPS panel.
The first thing I did was remove all the magnesium guiding rails for the antenna wires and the LCD cable compartment screw mounts that were in the LCD's way. And because the LCD cable compartment can't be screwed down anymore, it's now held in place with epoxy.
Removing the panel's screw mounting points was the riskiest part, but the metal was soft enough to cut with long scissors. That required replacing the LCD tape, otherwise the screen layers would come apart. I used Kapton tape covered with electrical tape all along the edges.
The LCD cable turned out to be flexible enough to reach the LCD's connector. The X230's panel is much thinner, so there was enough room for it.
The last thing was adapting the LCD bezel to the new screen. Because the X230's panel is taller (due to the location of the circuit board), I had to cut 4 millimeters from the top of the LCD bezel. The different screen height exposed the bottom LCD tape, so I replaced it with black tape. I also had to remove the LCD bezel's bottom latches; it's now held in place with double-sided tape.
If you've got any questions, feel free to ask.
Is this the first IPS 2570p ever?

 

On 2/8/2020 at 2:52 PM, Sonney said:

photo_2020-02-08_13-59-25.jpg

photo_2020-02-08_14-28-33.jpg

photo_2020-02-08_14-18-25.jpg

photo_2020-02-08_14-18-26 (2).jpg

 

 

Incredible job! A question about the bottom part of the new screen (the black part housing the connector and the logic board): could it have been folded behind the panel, keeping the connector attached, so that the screen wouldn't sit taller? Or could it have been fixed in a less permanent way than gluing it to the chassis?
Thanks for sharing your attempt!


https://www.win-raid.com/t834f25-USB-Drivers-original-and-modded.html

 

Intel USB 3.0/3.1 Drivers & Software Set v5.0.4.43_v2 WHQL for Win7

 

This version, intended for series 8 chipsets and up, works on the 2570p under Win7: stable, side effects unknown. The last officially supported version for the Q77 chipset is v1.0.10.255 WHQL.

 

 

Edited by istinnstudio
deleted

On 12/5/2012 at 8:04 AM, Tech Inferno Fan said:

#1

Performance upgrade: external graphics (eGPU)

DIY eGPU to attach a desktop videocard via the 2570P's 5Gbps expresscard slot.

An NVidia GTX 460/560/660/670 is a straight plug-n-play implementation on a 2570P when using Windows 7, though Win8.x is problematic (src: here). Win8.x users may consider doing a UEFI installation instead, which resolved error 12 and gave hotplug capabilities, as noted here.

An AMD card, a GTX 650, GTX 750, GTX 9xx, or older NVidia cards require a DSDT override and (maybe) the DIY eGPU Setup 1.x interposer software to eradicate the error 12 that prevents them from working. See the 2570P DSDT override details if using one of these cards.

Implemented on a 2570P at x1.2Opt speed (5Gbps + compression) using a NVidia GTX670 (jacobsson), GTX660Ti (Tech Inferno Fan), GTX560Ti (bjorm), GTX660 (dewos), GTX560Ti (hatoblue), GTX650Ti (phillofoc)

Above: simplified HP 2570P eGPU installation process instructions courtesy of T|U user badbadbad


Performance upgrade: storage

Extra hotswappable 9.5mm (2.5") HDD/SSD added through the optical-drive bay (SimoxTav) or newmodeus.
US$18 optical drive faceplate replacement part
2570P RAID-0 SSD guide (jacobsson) - get superfast bootup and 1 GB/s sequential reads by configuring two SATA III SSDs as a RAID-0 volume
US$4 eSATAp cable - connect your SATA HDD/SSD to the 2570P's eSATAp (combo SATA + USB 3.0) port for superfast bootable storage.
FAQ about the optical drive space saver: can its faceplate be used on other ODDs/caddies? Answer: no/yes.

#2
 

002520027.jpg

The 2570p also provides an mSATA interface; users can add their own SSD to improve overall performance.

 

 

I have a few questions:

1. Regarding quote #1: if I use an NVidia GTX 460/560/660/670, can I just plug it into the eGPU adapter, install the drivers, and skip any other mods on the system, like the DSDT override? Just plug and play?

Bonus question #1: I see on AliExpress that there are a few versions of the eGPU adapters, like v8, v8.5, v9 (are those all of them?). Is there any difference between the models, and which one is best to use with the 2570p?

 

2. That second PCIe Mini Card slot meant for the WWAN card: I don't need a WWAN card. Is there any other use for that slot, and what?

Bonus question #2: there is some kind of slot for a 56k dial-up modem. Is that slot useless except for the dial-up modem?

 

3. I've read on AliExpress that some optical-bay HDD caddies have problems delivering full SATA II speeds. Any experience with that? Is there a specific brand that should be bought, or are they all the same?

Edited by svabo
spelling errors


@svabo
1. Well,
 

Quote

is a straight plug-n-play implementation

I'm not sure about the adapters, but most likely they don't differ much.

2. Patch the BIOS, solder the signal caps, and use the slot's mSATA capability. To use any other USB device in that slot you'll also need to remove the whitelist.
Yes, that slot is only for the modem.

3. The HDD caddy is a plain passive adapter plus a bunch of plastic; there's not much in it that could degrade the SATA connection.

11 minutes ago, MioIV said:

@svabo
1. Well,
 

I'm not sure about the adapters, but most likely they don't differ much.

2. Patch the BIOS, solder the signal caps, and use the slot's mSATA capability. To use any other USB device in that slot you'll also need to remove the whitelist.
Yes, that slot is only for the modem.

3. The HDD caddy is a plain passive adapter plus a bunch of plastic; there's not much in it that could degrade the SATA connection.

MioIV, thanks for the reply.

About #1: do you have, or have you ever tested, that kind of setup with those "plug & play" cards, or are you just confirming what's said in the first post of the topic?

About #2: so if I solder the signal caps and turn the PCIe Mini Card slot into an mSATA slot, I can use an mSATA SSD in it? If yes, do you know the performance of the slot itself (SATA I, SATA II or SATA III speed)?

Posted (edited)

@svabo
I haven't tested it, unfortunately; I'm just summarizing what was written in this topic before. I recommend reading the related posts here to get all the details before making any decision.
 

Quote

I can use mSATA SSD in it

You'll NEED to patch the BIOS before mSATA becomes usable, because the corresponding SATA port is disabled by default. It will run at SATA II speed, and it will not be bootable (it can't be selected as a boot device at power-on).

Edited by MioIV

48 minutes ago, MioIV said:

@svabo
I haven't tested it, unfortunately; I'm just summarizing what was written in this topic before. I recommend reading the related posts here to get all the details before making any decision.

Thanks, but I've read the related posts and couldn't find anything about those "plug & play" cards or about not needing a DSDT override or any other modification.

4 hours ago, svabo said:

Thanks, but I've read the related posts and couldn't find anything about those "plug & play" cards or about not needing a DSDT override or any other modification.

Is it not enough for you?



Is a GTX 980 usable with this computer through the ExpressCard slot?

 

Will I see any performance increase if I put a 3740QM in place of my 3620QM?


I have two 8 GB ADATA memory sticks (2x8 GB) in my HP 2570p and just noticed that the memory runs at 1300 MHz in dual-channel mode instead of the advertised 1600 MHz. The i7-3610QM supports 1600 MHz according to the manufacturer. What is wrong with my configuration?

36 minutes ago, stoyancho said:

I have two 8 GB ADATA memory sticks (2x8 GB) in my HP 2570p and just noticed that the memory runs at 1300 MHz in dual-channel mode instead of the advertised 1600 MHz. The i7-3610QM supports 1600 MHz according to the manufacturer. What is wrong with my configuration?

 

Try resetting the BIOS defaults, plugging in one stick at a time, etc.

 

Sometimes the BIOS doesn't make good sense of the SPD memory info when setting timings and memory frequency, so you need to give it another chance.

 

A bit surprising to me though; I never noticed such problems with Ivy Bridge. Mostly with an AM3 ASRock mobo (it was so bad it chose timings that crashed the system), but at least there I could set the timings and frequency manually.
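
To see what the BIOS actually configured versus what the sticks are rated for, here's a quick sketch, assuming Windows (wmic is deprecated but still ships with Windows 10; the ConfiguredClockSpeed field may be absent on older builds):

```python
"""Dump rated vs. configured DRAM speed from WMI (Win32_PhysicalMemory)."""
import subprocess

out = subprocess.run(
    ["wmic", "memorychip", "get", "Manufacturer,Speed,ConfiguredClockSpeed"],
    capture_output=True, text=True, check=True)
print(out.stdout)  # Speed = rated MT/s; ConfiguredClockSpeed = actual
```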




  • Similar Content

    • By Tech Inferno Fan
      We've had a stack of recurring questions from users with problems getting an mPCIe eGPU working. This includes GPU-Z not reporting clock details, error 10/43, or the card not being detected at all. Overall it's more troublesome getting mPCIe working than, say, ExpressCard or Thunderbolt.
       
      Here's some common problems and some troubleshooting steps to correct them.
       
      Getting a black bootup screen, resolving error 10/43 or ACPI_BIOS_ERROR Windows bootup messages
       
      Here the BIOS doesn't know what to do when it sees an eGPU, so the solution is to not let the BIOS see it. Do that by setting the delays on the eGPU adapter (CTD/PTD on the EXP GDC, or CLKRUN/PERST# on the PE4L/PE4C). Boot with the eGPU adapter in the wifi slot into Setup 1.30 or Windows. Is the eGPU detected?
       
      I'll add that should error 43 continue AND you have an NVidia dGPU as well as an NVidia eGPU, it's likely because the mobile and desktop NVidia drivers are loaded simultaneously. Uninstall ALL your NVidia drivers, use "DDU" to clean the NVidia registry entries, and do a 'clean' install of the latest NVidia desktop driver.
       
      mPCIe port that hosted the wifi card disappears when connecting an eGPU in its place
       
      Use Setup 1.30's PCIe Ports->enable feature to re-enable the missing port.
       
      eGPU does not get detected
       
      Overcome mPCIe whitelisting by booting with the wifi card and then hotswapping in the eGPU. That way the BIOS will have enabled the mPCIe port.
       
      1. Boot with the wifi card into Windows, sleep the system, swap the wifi card for the mPCIe eGPU adapter (ensuring the eGPU is powered on), and resume the system. Do a Device Manager scan in Windows. Is the eGPU detected?
       
      2. Boot with the wifi card into Setup 1.30, then *carefully* hotplug the eGPU adapter in place of the wifi card. Hit F5 to rescan the PCIe bus. Is the eGPU detected?
       
      If this enables detection, then avoid the tedious hotswapping by seeking an unwhitelisted modified BIOS for your system OR test Setup 1.30's PCI ports->undo_whitelisting feature.
       
      eGPU still not detected - set the PSU to be permanently on
       
      The latest EXP GDC and BPlus eGPU adapters try to manage the PSU, powering it on only after they detect a signal. This can cause a race condition where the eGPU isn't ready to go when the CLKRUN signal is asserted.
       
      Avoid this by jumpering the PSU so it's permanently on rather than being managed. Depending on the PSU you are using, refer to the following docs on how to do that:
       
      http://forum.techinferno.com/enclosures-adapters/8441-%5Bguide%5D-switching-atx-psu-using-paperclip-trick-swex.html
      http://forum.techinferno.com/enclosures-adapters/9426-220w-dell-da-2-ac-adapter-discussion.html
       
      eGPU still not detected - a non-standard mPCIe implementation by your vendor?
       
      PERST# mPCIe pin 22 may need to be isolated due to a non-standard implementation by your notebook vendor: http://forum.techinferno.com/enclosures-adapters/10812-pe4x-series-understanding-clkreq-perst-delay.html#post142689
       
      eGPU still not detected - faulty hardware?
       
      If you still don't get detection then test the video card and eGPU adapter in another machine to confirm neither is faulty.
       
      NVidia driver stops responding
       
      EXP GDC, PE4H 2.4 and PE4L 1.5 all use a socketed cable and are therefore not truly Gen2-compliant devices. This error indicates there were transmission errors.
       
      The solution is either to get a better, Gen2-compliant eGPU adapter such as the PE4C V3.0 or PE4L 2.1b (both with soldered cables), or to downgrade your link from Gen2 to Gen1 using BIOS options or Setup 1.30.
       
      Other troubleshooting help resources?
       
      See DIY eGPU Troubleshooting FAQ.
       
    • By ReverseEffect
      3dMark11 Performance Preset Benchmark: http://www.3dmark.com/3dm11/11262792
       
      Required items:
      1.) Lenovo u310 (I have a Core i3 - Ivy Bridge, 8GB RAM)
      2.) 65CN99WW unwhitelisted.
      3.) eGPU (I used an EVGA GTX 750 Ti from another computer I had).
      4.) EXP GDC mPCIe Edition adapter (got from eBay - banggood seller).
      5.) ATX power supply (I used a 600W PSU from another computer I had).
      6.) USB wireless.
      7.) External monitor, keyboard, and mouse.
       
      Steps:
      1.) Obtain and install an unwhitelisted BIOS. If you are unable to obtain one, I think it might be possible to bypass the whitelist with Tech Inferno Fan's Setup 1.x (may need confirmation, as I haven't used it myself yet).
      2.) Shutdown computer and remove all USB devices, ethernet cables, power cables, card reader cards.
      3.) Remove mPCIe wireless card and detach antennas.
       
       
      4.) Attach EXP GDC external mPCIe cable to the former wireless slot and screw down.
       
       
      5.) Attach HDMI end of the mPCIe cable adapter to the EXP GDC device.
       
       
      6.) Attach graphics card to the EXP GDC device (I moved my laptop off the desk and onto the side shelf to make room on the desk for the monitor/keyboard/mouse).
       
       
      7.) Using the power cable adapters that came with the EXP GDC device, I hooked in my ATX power supply's 20-pin and CPU 4-pin cables, then hooked the other end (8-pin) into the EXP GDC device. My EVGA 750 Ti also required an additional PCIe power cable (6-pin) at the top of the card.
       
       
       
       
       
      8.) Then I attached my misc devices (HDMI monitor, USB keyboard/mouse/wireless adapter), hooked in my PSU, and powered it on (below is an image of the final product, with the HDMI cable moved out of the way).
       

       
      9.) Power on your computer, let it install the standard VGA drivers, and then install your own drivers (I didn't have to go into the BIOS for any graphics settings, which it doesn't have anyway, nor did I have to disable the iGPU in Device Manager before the card was added).
       
      Extra Info:
      I found that most games will play on medium settings at about 45 FPS with this particular card.
      BDO: Upscale on - Anti Aliasing on - SSAO off - med settings.
      Skyrim: Med-High settings.
      Fallout 4: Med settings.
       
      (EDIT 5/19/2016) > Images added.
       
    • By TheLoser1124
      Hello, a couple of days ago I got a new GPU, but when I installed it in my computer I was unable to use it, and now I know why. In Device Manager I opened my GPU's Events tab, and when I viewed all events I noticed an error: "event 411 kernel PnP" with Problem Status 0xC01E0438. I believe this is why my GPU hasn't been working on my PC. Any info on how to fix this problem would be greatly appreciated. I'm using an EVGA NVIDIA GeForce GTX 1660.
    • By TheLoser1124
      I'm having a problem where my PC says my eGPU is not usable. It is detected in Device Manager without a yellow triangle next to it, but I can't use it in games and the NVIDIA Control Panel doesn't recognize it either. I'm using an EVGA NVIDIA GeForce GTX 1660 on Windows 10. I tried DDU and reinstalling the drivers, and now I can't access the NVIDIA Control Panel at all. The GPU isn't recognized by any other apps, and I went on *********** and was unable to find an answer. Any help on how to fix this problem would be greatly appreciated.
    • By Radstark
      Title sums it up.
       
      TL;DR: we have a Clevo that runs a desktop CPU, one with those huge 82 Wh batteries. We remove the GPU and let it use the CPU's integrated graphics. How much time for the battery to go from 100 to 0? Is it comparable to an ultrabook's?
       
      I'm theorizing a mobile setup with a static eGPU and an upgradable CPU. Given a hypothetical user who needs fast processing on the go and long battery life while retaining a high degree of mobility, but at home wants a powerful machine to run most games, I guess that would be their best bet. It would surely be more convenient to keep everything on the same disk. And even though the thing would be quite heavy to carry around, changing the CPU would be more cost-efficient than changing the entire laptop. (I'm not sure I'm right here, and I'm also not sure whether the motherboard in a Clevo can be replaced when a new CPU needs a different socket, which is another reason I'm asking here.)
       
      If my guesses above aren't correct, then an ultrabook with Thunderbolt and without a dedicated GPU would be the better choice. If they are, then we would be carrying more weight in exchange for a more cost-efficient setup, which I think is a fair tradeoff.
       
      I'm also aware of the heating problems these laptops suffer from, at least compared to a desktop setup. Would they be solved by moving the GPU out of the chassis and plugging it in via a Thunderbolt eGPU dock instead?
       
      What do you think? Is it doable? If not, why?