Tech Inferno Fan

12" Dell E6230 + HD7870/GTX660@x4Gbps+c-EC2 (PE4L 2.1b) + Win7 [nando4]


(11-21-2012) EXCLUSIVE!! An i5-3320M 2.6 12.5" Dell E6230 + GTX660 and HD7870 was performance tested at all the bandwidth levels available to mPCIe/expresscard systems: x1.2Opt, x1 2.0 (x2 1.0), x1.1Opt (aka x1.Opt) and x1 1.0. Compared to last year's i5-2540M 2.6 + GTX560Ti@x1.2Opt, here we see moderate eGPU performance improvements along with a huge 3dmark11 boost. I was impressed by the HD4000 iGPU performance compared to the last-gen HD3000 - it can now provide gaming-on-the-go.

Implementation: i5-3320M 2.6 12.5" Dell E6230 + NVidia GTX660 @x1.2Opt + HD7870 @x1 2.0
 

Notebook
 
  • AU$600 12.5" Dell E6230 i5-3320M 2.6
    HD4000 8GB DDR3 320GB (A.02 bios)
  • Series-7 QM77 expresscard 2.0 slot
  • Windows 7/64 Pro + NVidia 306.9 + 
    AMD 12.11b9
DIY eGPU parts
 
  • PE4L-EC060A 2.1b: US$81(US)/ US$92(ROW) shipped
  • US$240 Asus GTX660 (2GB, 1020/1502) and 
    US$240 Gigabyte HD7870 (2GB, 1100/1200)
  • US$5 salvaged 12V/17A "550W" ATX PSU

TOTAL=US$326 (GTX560Ti=US$216 or GTX660Ti=US$376 below)
Benchmarks (highest OCed GTX660) 3dmark 06/vant.gpu/11.gpu = 19842/21071/6108, RE5.dx9 = 160.3, dmcv4.dx10_s4 = 159.4
Key: LCD ext = external LCD on the eGPU, int = internal LCD. 8.0GB RAM throughout.
Columns: DX9 = 3dmk6^, RE5 var|fixed, FFXIV, Mafia2^! (1080p); DX10 = 3dmark vant.g, dmcv4 scene4; DX11 = 3dmk11 (720p), Unigine Heaven (1080p), Dirt2#^; Ports/SPD = port and link-speed test tool.

LCD | GPU$   | link    | 3dmk6^ | RE5 var|fixed | FFXIV     | Mafia2^!   | vant.g | dmcv4  | 3dmk11 | Heaven | Dirt2#^              | Ports/SPD
ext | HD7870 | x1.2    | 19653  | 157.4|87.7    | 4249 4511 | 59.5       | 21798  | 186.2  | 6427   | 1736   | 32.2/70.7            | QM77 PCIe-S
ext | GTX660 | x1.2Opt | 19673& | 159.6&|91.1&  | 3851 4248 | 59.1 &59.3 | 20331& | 159.4& | 5900&  | 1759&  | 53.5/74.9 &55.7/76.9 | QM77 CUDA-Z
ext | GTX660 | x1 2.0  | 10978  | 137.0|77.6    | 3751 4206 | 49.5       | 20185  | 150.3  | 5753   | 1739   | 51.9/74.8            | QM77 CUDA-Z
ext | GTX660 | x1.1Opt | 19416  | 138.2|75.8    | 3849 4242 | 54.3       | 17464  | 112.9  | 5399   | 1449   | 30.1/50.3            | QM77 CUDA-Z
ext | HD7870 | x1.1    | 18432  | 106.9|53.6    | 3962 4467 | 51.2       | 18935  | 118.2  | 5833   | 1396   | 20.4/34.2            | QM77 PCIe-S
ext | GTX660 | x1 1.0  | 6301   | 85.2|44.6     | 3183 3738 | 31.1       | 17316  | 111.5  | 5381   | 1458   | 29.2/49.0            | QM77 CUDA-Z
int | HD7870 | x1.2    | -      | 112.7|61.9    | - 4243    | -          | -      | 132.0  | -      | 1241   | -                    | QM77 PCIe-S
int | GTX660 | x1.2Opt | 16472  | 79.7|74.2     | 2636 3916 | 47.3       | 16051  | 82.1   | 5493   | 1461   | 37.4/46.7            | QM77 CUDA-Z
int | HD7870 | x1.1    | -      | 42.2|29.5     | - 2825    | -          | -      | 49.6   | -      | 789    | -                    | QM77 PCIe-S
int | GTX660 | x1.1Opt | 10471  | 39.9|37.3     | 1351 2851 | 23.5       | 10231  | 41.1   | 4623   | 947    | 19.5/23.6            | QM77 CUDA-Z
int | HD4000 | (iGPU)  | 5811   | 42.4|34.7     | 699 1382  | 14.6       | 3026   | 43.3   | -      | 333    | 18.9/25.6            | -
$ = in the x1 1.0 and x1 2.0 NVidia modes (not Opt) the iGPU is disabled, achieved by powering the eGPU before starting the system. HD7870 int mode was achieved by running the app on the external LCD, making it windowed, then dragging it to the internal LCD. Unfortunately LucidLogix Virtu isn't working on the Series-7 Dell E6230, which would allow full-screen apps to be rendered by the eGPU but displayed transparently on the LCD attached to the iGPU.
! = two back-to-back runs, using the result from the faster second run
& = GTX660 eGPU overclocked from 1020/6008@0.962V to 1106/6400@0.982V for this result
# = min/average FPS, London multi-car track with all settings HIGH except post-process=MED. Command used: "DiRT2.exe benchmark example_benchmark.xml", output saved to Documents/My Games/DiRT 2/Demo/benchmarks
^ = 1080P or 1280x1024 'internal LCD mode' output via the notebook HDMI port to an external LCD
Photos: US$325 GTX660 + PE4L-EC060A 2.1b + PSU; x1.2Opt configuration check; E6230: GTX660@x1.2Opt using PE4L 2.1b; E6230: GTX660@x1.2Opt cabling using PE4L 2.1b; E6230: GTX660@x1.2Opt running RE5 bench

Idea reference: DIY eGPU experiences [version 2.0]

Software Setup 

Some unique features of the Dell E6230

  • It has a dynamic TOLUD. Initially TOLUD was 3.5GB; however, upon the bios detecting the powered eGPU on bootup it changed TOLUD to 3.25GB. If you had performed a DSDT override with TOLUD=3.5GB to get the eGPU functional, as I did, then once the bios changes to TOLUD=3.25GB you'll see a bootup BSOD. The only fix was to reinstall Win7 from scratch.
     
  • The eGPU becomes the primary video device if detected by the bios on bootup. Even the maximum 6.9s PCI Reset Delay on the PE4L isn't sufficient to prevent the bios from seeing the eGPU. The bios then sets the eGPU as the primary video device and disables the iGPU, resulting in only x1 2.0 performance rather than x1 2.0 with pci-e compression (x1.2Opt). This was useful for my purposes as it allows x1 1.0 and x1 2.0 testing without the pci-e compression engaged. The workaround for x1.2Opt performance is to manually start the eGPU after bios boot as described below.
     
  • The expresscard slot defaults to a Gen2 (pci-e 2.0) link speed. No bios option exists to downgrade the link to Gen1 (pci-e 1.x) speed in case you are using the older pci-e 1.x-specced PE4H 2.4 or PE4L 1.5; you'd need Setup 1.1x to do that. For best performance it is recommended to use the Gen2-capable PE4L-EC060A 2.1b DIY eGPU hardware instead.


Starting the DIY eGPU on a Dell E6230

Once TOLUD has been set to 3.25GB, the DIY eGPU implementation on a Dell E6230 with a PE4L-EC060A 2.1b is plug-and-play. Here's how:
  1. DO THIS ONCE! Attach the EC2C end of the PE4L to the E6230, power on the eGPU, then power on the E6230. This will change TOLUD in the bios from the default of 3.5GB to 3.25GB. You'll know it has succeeded when the PCI BUS entry in Device Manager->View Resources by connection shows CFA00000 instead of DFA00000. This change appears permanently set - going into the bios and choosing 'Load Defaults' does not change TOLUD back to 3.5GB. 
     
  2. Manually start the eGPU after the bios has booted to get full x1.2Opt performance. Do this by powering on your E6230, halting the Win7 load with the F8 key, attaching the EC2C end of the PE4L if it's not already attached, powering on the eGPU with the SWEX switch, and then continuing the Win7 load. 
     
  3. If it's the first time Win7 sees the eGPU, it will be detected as a Standard VGA adapter. There are no error 12 or disabled-USB-port issues with the E6230. Proceed to load the latest NVidia desktop video driver. Disable NVidia High Definition Audio to maximize video bandwidth. 
     
  4. OPTIONAL: use the sleep-resume method. Boot Win7, sleep Win7, attach the eGPU via the expresscard slot and power it on, then resume Win7. If an LCD is attached to the eGPU, make it the Main Display in Display Properties to output accelerated graphics to it. NOTE: this method does not allow Optimus to output accelerated graphics to the internal LCD - the eGPU must be detected on bootup for that. Note: you must set the PCI Reset Delay slider to the 0 second (disabled) position for this to work successfully.
     
  5. OPTIONAL for eGPU overclocking: Install MSI Afterburner.
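The DFA00000 -> CFA00000 change in step 1 is just the TOLUD boundary moving down. A quick sketch (plain Python arithmetic, no E6230-specific tooling) shows the two addresses line up with the 3.5GB and 3.25GB figures quoted above:

```python
# Relate the Device Manager "PCI bus" start addresses from step 1 to the
# TOLUD values the E6230 bios uses. Pure arithmetic, no hardware access.
def to_gib(addr: int) -> float:
    """Convert a physical address to GiB."""
    return addr / 2**30

default_tolud = 0xDFA00000  # shown before the eGPU is ever detected (TOLUD=3.5GB)
lowered_tolud = 0xCFA00000  # shown after the bios lowers TOLUD to 3.25GB

print(f"default: {to_gib(default_tolud):.2f} GiB")  # ~3.49 GiB
print(f"lowered: {to_gib(lowered_tolud):.2f} GiB")  # ~3.24 GiB
# Lowering TOLUD enlarges the 32-bit PCI window by exactly 0.25 GiB,
# which is what makes room for the eGPU's resources.
print(f"delta:   {to_gib(default_tolud - lowered_tolud):.2f} GiB")  # 0.25 GiB
```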


Observations


 

  • The most interesting result here is seeing how good the x1 NVidia pci-e compression is. Previously it has been suggested that x1.Opt is equivalent to x1 2.0, and x1.2Opt equivalent to x2 2.0. The DX9 benchmarks support this idea, even putting Optimus pci-e compressed performance (x1.1Opt) ahead of non-compressed performance with double the bandwidth (x1 2.0). However, when we look at the DX10 and DX11 bench results we see x1.1Opt giving pretty much the same results as x1 1.0, and x1.2Opt the same results as x1 2.0. In those cases the physical bandwidth gives more performance than the pci-e compressed bandwidth.

    By induction, these results tell us that the x2 2.0-capable TH05 Thunderbolt-to-pcie adapter will give better DX10/DX11 NVidia performance than the x1.2Opt expresscard solutions, in both synthetic benchmarks and real-life gaming.

    What is interesting is that the HD7870 mostly surpasses x1.2Opt performance. If the pci-e compression is disabled and we compare the NVidia and AMD cards on equal terms (x1 2.0 or x1 1.0), the AMD card is the superior performer. Gone too is the half-duplexing issue with AMD cards that gave lousy performance on older Core2Duo and 1st-gen i-core systems.
     
  • This GTX660 can be powered by a basic 12V/15A ATX PSU available anywhere. It only requires a single 6-pin pci-e power connector and has a maximum TDP of 140W. It also means any fat XBOX 360 PSU (150W, 175W, 203W) could be modified into a portable PSU to power it.
     
  • GPU-Z will report a x1 2.0 link speed upon starting but will revert to x1 1.0 a few seconds later *if* the GTX660 eGPU isn't under load. Changing from Adaptive to Performance in the NVidia Control Panel has no effect. Running a game on the eGPU in the background does however maintain a x1 2.0 link. The HD7870 likewise requires a background process putting it under load to switch into x1 2.0 link mode.
     
  • x1.2Opt's extra bandwidth now makes it realistically possible to run an eGPU using internal LCD mode only. x1.2Opt-internal shows twice the FPS of x1.1Opt-internal in gaming benchmark results; clearly x1.1Opt-internal was choking on the limited bandwidth. This extra bandwidth means a SB x1.2Opt-internal implementation will outperform a 1st-gen i-core x1.1Opt-external implementation (when you factor in the faster CPU). It will cost less too when you factor in the savings from not buying an external LCD. Though I'd still recommend getting an inexpensive s/h external LCD to run highest FPS.
     
  • The windowed FFXIV benchmark sees FPS double if the Aero interface is disabled prior to the run.
     
  • Gaming/benchmarks are noticeably smoother with x1.2Opt over x1.1Opt, with far fewer microlags. Benchmark histograms show less deviation from the median frame rate.
     
  • A GTX660 is the first affordable mid-range Kepler card, sitting at a US$230 pricepoint. A notably more powerful GTX660Ti (1344 vs 960 cores, +40%) is available at ~$290, plus it has more overclocking headroom due to larger power limits. A GTX560Ti can be had for $180, or $130 as a s/h unit on ebay; it is still a great performer but does require slightly more power to run (~150W TDP). Bargain hunters may want to get a GTX560Ti, and performance enthusiasts pay the extra $50 for a GTX660Ti instead of the GTX660. My total cost was US$325; a GTX560Ti or GTX660Ti would make a US$216 or US$376 package respectively.
     
  • In order to do x1 1.0 testing it was necessary to do a x1 2.0 to x1 1.0 live link downgrade where the eGPU was booted as the primary video device. The next version of Setup 1.1x adds a prompt to do 'link retraining' when changing a port between Gen2 and Gen1; in the case of a live link the user should select No. In the current Setup 1.105, link retraining was done automatically and would hang when applied to a live pci-e link.
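The package prices quoted in the GTX660 observation above are just the card price plus the fixed adapter and PSU costs from the parts list at the top of this post. A quick sketch of that arithmetic, using the prices as quoted in this thread:

```python
# Package cost = card + fixed parts (prices from the parts list above).
ADAPTER = 81  # PE4L-EC060A 2.1b, US shipping
PSU = 5       # salvaged 12V/17A ATX PSU

def package_cost(card: int) -> int:
    return ADAPTER + PSU + card

print(package_cost(240))  # GTX660 (or HD7870)  -> 326
print(package_cost(130))  # s/h GTX560Ti        -> 216
print(package_cost(290))  # GTX660Ti            -> 376
```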

Conclusion for Sandy/Ivy Bridge systems with latest GTX6xx/HD7xxx card


  • for expresscard/mPCIe systems capable of x1.2Opt: AMD has overall the better performance but the margin is minor.
  • for Thunderbolt systems or systems incapable of x1.2Opt: AMD cards > NVidia cards.


Other factors favoring NVidia cards are CUDA and Optimus' internal LCD mode. Unfortunately AMD's equivalent, Enduro, is still being developed and doesn't appear easy to retrofit to eGPU solutions. I did manage to get LucidLogix Virtu to work; see http://forum.techinferno.com/diy-e-gpu-projects/2967-lucidlogix-virtu-internal-lcd-mode-amd-egpus.html#post41056 for details.


Hey Nando, here is my config: Dell E6230, Core i5-3320M, 4GB RAM, BIOS A09 + ASUS GTX 660 DirectCU II + a PE4L.

I tried to enable the eGPU but my TOLUD doesn't change from DFA00000 to CFA00000 automatically. My eGPU does get detected when I connect it - I know because it shows up in Device Manager when I turn it on while Windows loads. Also, when I boot with the eGPU running, neither my internal nor my external display works, but I can "hear" Windows boot up, log in and shut down without the main display.

Can you please help ? Thanks a ton in advance !


Latitudes will dynamically adjust the TOLUD if they see an eGPU on startup. The problem is that they also disable the iGPU. Because of this it is necessary to hotplug the eGPU after BIOS POST so that the BIOS does not disable the iGPU, but this means that the TOLUD will not be automatically lowered.

When booting with the eGPU on and connected on an E6520, I found that one of the eGPU DVI ports would output a video signal whether or not a monitor was connected. If one port doesn't work, try the other. Unfortunately you cannot stop the system from disabling the iGPU, so you will not get Optimus with this method, but your eGPU should be capable of running this way until you get the DSDT override working.


Thank you so much Khenglish, for the prompt reply as well as for making my eGPU work. Hooking up the external display to the 2nd DVI port did the trick. Although I don't think my eGPU is performing to its maximum potential, because when I ran 3DMark06 I got a score of 2448 :( (link).

I don't know what I am doing wrong, but I am going to dive into the forums to fish out a solution. Also, would you suggest using Setup 1.x or a DSDT override? I have very little knowledge about both, but I can figure it out. The only thing I don't want to do is a fresh Windows install :)

I would appreciate any input.

Thanks again !


Further to Khenglish's comments, I've documented that the TOLUD in my E6230 bios changed from 3.5GB to 3.25GB upon detecting the eGPU. This was a permanent change that I couldn't, nor wanted to, reverse.


I'd suggest installing the latest E6230 bios. Furthermore, I had 8GB RAM installed when this TOLUD change occurred. If yours won't change then you can still do a DSDT override, as I did, to get successful eGPU functionality with 4GB+ of RAM installed.


I have updated my bios to A09, and even tried downgrading to A02 since I saw yours was on A02. I have Setup 1.10b5 currently installed and was trying to get a 36-bit compaction, but that didn't seem to work. So I guess I will drop Setup 1.x and go down the DSDT override path. I don't really know low-level memory addressing but will try to follow the instructions and figure this out. Was your RAM a single module or 2x4GB? Just curious. If I fail at the DSDT override then I might just get some extra RAM and hope that it changes the TOLUD automatically.

Thanks !


I was using 2x4GB RAM modules. The TOLUD changed either because of my swapping out RAM or upon detecting the eGPU - I did both at the same time, so I'm not sure what the trigger was. The DSDT override worked just as well, that is, until TOLUD changed, causing Win7 to BSOD on every boot.

You will need to hotplug the eGPU after boot (or use the PCI Reset Delay) so as to get the iGPU active. Without the active iGPU you won't get x1.2Opt performance. The Dell bios will otherwise make the eGPU the primary video device if it's detected on boot.


I am happy to report that I have enabled Optimus using the DSDT override technique after working at it for almost 4 hours. :) My 3DMark06 is 16111. It's weird that my processor clock shows up as 1197MHz. Should I be worried about my CPU running so slow?


Great news - that was how I was running it until TOLUD changed on me and the DSDT override then caused BSODs. Worth keeping an eye out for that.

Is that 3dmark06 score with internal or external LCD? You can see I got a score of 19673, quite a bit more than your 16111.

Try changing your Power Profile to Maximum Performance to see if that improves your 3dmark06 score. Ensure the external LCD is attached via the HDMI port on the video card rather than the HDMI port on the notebook, so output routes out directly rather than via Optimus/internal LCD mode.


Hi, it's been about 2 years since the last post by @Tech Inferno Fan.

I get error 12 (TOLUD issues) with my spec:

Dell E6230

i5-3320M 2.6; 8GB RAM; Win 8.1 x64 with

EXP GDC v.6 (using Dell DA-2 220W)

MSi GTX 750Ti 2GD5.

Before this, I had successfully attached a GTX 560SE to the EXP GDC via plug-and-play, and my TOLUD matched about 3.25GB, according to Nando4's post. Moreover, my TOLUD still did not change, either before or after attaching the GTX 750Ti.

This is mine: (screenshot attached)

First I tried bios A02, and that did not do me any favors. Then I tried bios A14. Still error 12.

My question: can I use this GTX 750Ti without a DSDT override? Honestly, I'm still confused about doing the DSDT override.

I'm sorry for my English. I'm from Indonesia.

Thank you all .


Thank you so much to @Tech Inferno Fan

[solving Error 12 on Dell Latitude E6230]

[Here my spec:]

i5-3320M 2.6

2x4GB 1600MHz

Win 8.1 x64

Bios A14 (newest)

EXP GDC v.6

MSi GTX 750TI OC version

AOC Monitor 1366x768@60Hz

[prologue]

I had been using an old GTX 560SE via plug-and-play; that card can run without error 12. But the newer card, the GTX 750Ti, needs PCIe compaction via Setup 1.3 or you get error 12.

[installation]

- Install Setup 1.3 on Windows:

- First, extract the setup to a top-level directory, for example C:\egpu

- Then, inside the egpu folder, run setup-disk-image.bat as administrator

- After setup is installed, reboot your computer

- In the boot manager, choose "setup 1.3"

- Go to the setup menu

- Go to chainloader --> change MBR2 into MBR

- Test run the chainloader

- The system will restart; choose "setup 1.3" again and go to the setup menu

- Go to PCIe compaction, choose iGPU and eGPU only

- Go to chainloader --> test run

- In the boot manager, choose Windows 8.1

- Done :)

Note: the nvidia driver was installed just before installing Setup 1.3 and doing the steps above, so don't worry if the system doesn't recognize your GPU at first.

[Testing]

- 3DMark06: 19639 (unfortunately my 3DMark13 install got corrupted, so I cannot give that score)

- Dying Light: medium settings, 31-60 fps

- Far Cry 4: Ultra settings 22-60 fps; Very High settings 30-60 fps

- GTA V: all settings on High, distance scaling and population density at max; to prevent stuttering, FPS was locked to 30 via nvidia inspector. In game I get 27-30 FPS.

[New Problem]

I cannot boot with the eGPU attached, and cannot restart with it attached either. I must power on my laptop and plug in the eGPU only after entering Windows. If I want to restart Windows, I must unplug the eGPU first, then restart.

Can anyone solve this problem ?

Dear all, I have followed the step-by-step instructions and get the same problem as @Sunay. Here is my spec: Dell E6230 - bios A14, i5-3320M 2.6, 8GB RAM, Win7 x64 with

PE4C v3.0 (PSU: CX600M), Nvidia GT210 (trial GPU, borrowed from my friend), Samsung monitor (1440x900, 60Hz).

I think I need to do a DSDT override, so I followed the DSDT steps using DSDT Editor (Linux/Windows). When I reached the end of the DWord entries and went to add the QWord entry, I found the dsdt.aml file already had a QWordMemory statement:


QWordMemory (ResourceProducer, PosDecode, MinFixed, MaxFixed, Cacheable, ReadWrite,
    0x0000000000000000, // Granularity
    0x0000000000010000, // Range Minimum
    0x000000000001FFFF, // Range Maximum
    0x0000000000000000, // Translation Offset
    0x0000000000010000, // Length
    ,, _Y0F, AddressRangeMemory, TypeStatic)



I changed its value to:


QWordMemory (ResourceProducer, PosDecode, MinFixed, MaxFixed, Cacheable, ReadWrite,
    0x0000000000000000, // Granularity
    0x0000000C20000000, // Range Minimum, set to 48.5GB
    0x0000000E0FFFFFFF, // Range Maximum, set to 56.25GB
    0x0000000000000000, // Translation Offset
    0x00000001F0000000, // Length = Range Max - Range Min + 1
    ,, , AddressRangeMemory, TypeStatic)
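The edited QWordMemory values can be sanity-checked with plain arithmetic: the window sits entirely above 4GB (which is why it needs the 36-bit/DSDT-override treatment), and the Length field must equal Range Maximum - Range Minimum + 1. A quick check of the values used above:

```python
# Verify the edited QWordMemory window from the DSDT above.
rng_min = 0x0000000C20000000  # Range Minimum
rng_max = 0x0000000E0FFFFFFF  # Range Maximum
length  = 0x00000001F0000000  # Length

print(rng_min / 2**30)        # 48.5  (GiB, matches the "48.5GB" comment)
print((rng_max + 1) / 2**30)  # 56.25 (GiB, matches the "56.25GB" comment)
print(length / 2**30)         # 7.75  (GiB window for the eGPU's large BARs)
# ACPI requires Length == Max - Min + 1 for a MinFixed/MaxFixed descriptor
print(rng_max - rng_min + 1 == length)  # True
```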



After compiling I got:

1 error:
line 7464 - Error: Invalid leading asterisk (*pnp0c14)

8 warnings, 14 remarks

I have looked at how to fix DSDT errors by @kizwan, but it did not mention the invalid leading asterisk problem.

Since I don't have any programming background, I would be grateful if someone could explain what I have to do about this.

Many thanks


Hi, I just got the TOLUD value to change without editing the DSDT.

Initial TOLUD, before the permanent change: (screenshot)

Permanent TOLUD change: (screenshot)

I modified the steps from Solution #3 of the "error 12 - DIY eGPU Troubleshooting FAQ".

Steps to change the TOLUD value:

1. Set delay switch 1 to value 3 and delay switch 2 to value 2 on the PE4C G3

2. Turn on the laptop with the EC adapter attached (the GPU also turns on, using the boot-together function)

3. Device Manager will detect the GPU as a standard VGA card (with error 12) and several HD Audio devices (in the audio section)

4. Disable the GPU's HD Audio device and disable the standard VGA card

5. Restart the laptop, as per the Win7 request

6. After the restart, open Device Manager; with the VGA (eGPU) device disabled, turn off the eGPU and rescan for hardware changes (eGPU no longer detected)

7. Put the laptop to sleep

8. Turn on the eGPU, then wake the laptop

9. Enable the standard VGA card in Device Manager and restart (as per the Win7 request)

10. Check Device Manager (Device Manager > View > Resources by connection); TOLUD has changed as @Tech Inferno Fan documented

TOLUD has changed, but I still have the error 12 problem.

I am going to follow @gharimanto's step-by-step solution.


@gharimanto's solution works perfectly, without the restart problem or being unable to boot with the eGPU.

I'm using the delay switch to prevent the bios from detecting the eGPU.

3DMark11 result - 4246

Spec

Dell E6230 - bios a14

i5-3320M 2.6; 8GB RAM; Win7 x64 with

PE4C v3.0 (PSU - CX600M)

Nvidia GTX 750.

Samsung Monitor (1440 x 900, 60 hz)



I just posted about GTX 1060 performance with this laptop.

Firestrike: looking at the graphics score, performance decreases about 20% compared to the desktop. (screenshot)

The same holds for Time Spy. (screenshot)

