
eGPU experiences [version 2.0]


Tech Inferno Fan

Recommended Posts

Possibly these are crazy questions, but anyway...

1. Is it possible to connect the eGPU to the laptop's LCD with a DVI cable and, you know, loop the data transfer back? That is, use the internal LCD as an external one.

2. How about upgrading an ExpressCard slot (for example, finding a newer laptop model for parts with a 2.0 slot)? Is this possible, or do I need to replace the whole motherboard to upgrade it?


A pci-e 2.0 capable system could drive the internal LCD and see better FPS than your current pci-e 1.x system. That's the easiest and least cumbersome way of getting what you want. See the pci-e 1.x (external LCD) versus 2.x (internal LCD) performance comparison at http://forum.techinferno.com/diy-e-gpu-projects/2747-%5Bguide%5D-12-dell-e6230-gtx660%40x1-2opt-hd7870%40x1-2-pe4l-ec060a-2-1b.html#post37197 .
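To put rough numbers on the x1 link speeds being compared, here's a quick Python sketch (the 2.5/5.0 GT/s rates and the 8b/10b encoding overhead are standard pci-e 1.x/2.0 figures, not measurements from this thread):

```python
# Effective one-direction bandwidth of a single pci-e lane.
# Gen1 and Gen2 both use 8b/10b encoding, so only 8 of every
# 10 bits on the wire carry data.

def x1_bandwidth_mb_s(gigatransfers_per_s: float) -> float:
    """Usable bandwidth of one lane, in MB/s, before protocol overhead."""
    bits_per_s = gigatransfers_per_s * 1e9      # one bit per transfer per lane
    return bits_per_s * (8 / 10) / 8 / 1e6      # 8b/10b, then bits -> bytes

print(f"x1 pci-e 1.x: {x1_bandwidth_mb_s(2.5):.0f} MB/s")   # ~250 MB/s
print(f"x1 pci-e 2.0: {x1_bandwidth_mb_s(5.0):.0f} MB/s")   # ~500 MB/s
```

Doubling the raw link rate is why the Gen2 system can afford to route frames back to the internal LCD and still come out ahead.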

Bitminers?

Any bitmining folks out there? Surprised those users aren't using eGPUs to do their mining. Way better returns than using an energy-thirsty desktop to drive their Radeons. Seems the financial comptrollers are a bit worried about Bitcoin, since they can't control it.


When I had the 7970, I left it mining litecoin overnight... a desktop hardly takes more power, since the GPU draws the most anyway.


First, a GTX660 vs HD6850 comparison (I don't have access to the 660 atm... on the other hand, such a comparison is a bit pointless, because the 6850 is way weaker than the 660, and it seems to be more limited by pci-e bandwidth):

Crysis 3, mission 6 (from the beginning to the second Ceph AA defence; getting there took me 7 minutes on the 660 and 12 minutes on the HD6850 due to the lower FPS, which makes it hard to play well).

Settings:

Resolution: 1920x1080

All low

GTX660@x1.2Opt:

18897026_crysis-3-gtx660-misja-6.png

Look at the red line; the green one represents gameplay at high settings (textures and the rest) and is really short.

AVG FPS: 45

HD6850@x1.2:

19863389_crysis-3_mission-6_hd6850_fullhd-low.png

AVG FPS: 27

Like before, the CPU usage is way lower than it was with the GTX660. It's really worth considering, because as we know, Nvidia drivers cause a greater CPU load than AMD drivers. It's much more visible on older Core 2 Duo based desktop PCs, where it causes microstuttering, but I think it might affect a Core i5 as well. The problem should not be present with a quad-core CPU.

On low settings, the GTX660 should easily average 60 FPS. I'm not sure about HD6850 performance, but according to benchmarks, 28 FPS is a valid result... for high settings. On low it should be more like 40 FPS, I suppose.

So, we can see that both cards are limited. By what... and now it's starting to get really difficult. I'm not sure which of the factors is more important here. Maybe the performance is CPU-limited, like I think it is on the Welcome to the Jungle level. Of course, when it comes to the "grass moment", the pci-e bandwidth is a real drawback there, giving us drops to 20 FPS instead of a way more playable 30-35. On the GTX660, mission 6 is really playable, even at high settings. On the HD6850 it's not playable even on low settings, which keeps me thinking about how a GCN-based card would perform. It would have to be less bandwidth-limited to maintain good FPS.

p.s. I'm going to buy a GCN-based card like the HD7870/7870 XT/R9 270(X), but given that it's Christmas time, I think I'll have to wait till January, because prices are a bit higher now and shipping might take very long. That's really important, because in Poland, when buying online, I can return a GPU within 10 days of testing without giving any reason. So I'd like to use that privilege for almost-free benchmarks if I don't end up staying with the AMD GPU.


If you're able and willing to return a card after use, why not test a 290X and be done with it :o

Has anyone tried SLI/Crossfire yet? Like two PE4Ls, one ExpressCard and one mPCIe, using a bridge? As I understand it, the bridge lets the GPUs talk through it rather than using pci-e bandwidth. This is different from combining two lanes from different ports into one.

Now, with a DSDT override, I would really like to see the max a 1.x or 2.0 link, or two in SLI (if possible), can handle, say a 7990 or a 690. Wish I had the money :P


And here, catch "The Root of All Evil", part one, from the beginning to destroying the badass Hydro-Electric Generator.

HWInfo log:

19866258_crysis-3_map-canyon.png

Screenshots:

[20 in-game screenshots from the run, timestamped 2013-12-21 19:45 to 19:59]

Placed in a spoiler, because there are 20 screenshots and someone might not have played the game yet... it's my favourite mission in the game, spectacular and mind-blowing, even on low settings like in this test.

I don't have the GTX660 log at the moment, but I remember the mission was fully playable at >30 FPS minimum and about 35 average, played with high textures and med-high settings, so again, way better.

Maybe the FPS was higher because of better CPU utilisation? The CPU usage stays at 60% average on the HD6850, while on the 660 it was more like 80-85%.

If you're able and willing to return a card after use, why not test a 290X and be done with it :o

Because I need a GPU comparable with the GTX660, so I'd rather stay at the level of the HD7870/R9 270. But the new Radeon series is... strange. The R9 270 is sometimes better, sometimes way worse than the 660, and it costs about 10-20% more than I paid for the 660 four months ago. The 270X is a bit more powerful, but the 270 should be easy to overclock to match 270X performance. Then there is a gap, because the next GPU is the 280X, which is comparable to the 680.

There should be an R9 280, priced at $200, to be a good competitor for the 660 Ti or 670.

I'll probably take the 270 or find an older 7870, which has a better GPU clock but a worse memory clock... and that's the only difference.

I could take the 7870 XT too. It's priced at the level of the R9 270X here in Poland, and its performance is better. But it's only available in Club3D and XFX versions, and I'd rather stay with Asus, MSI or Gigabyte ones.


My guess is it's the opposite: the CPU utilization is higher because of the higher FPS, meaning the CPU can handle those frames easily, but the GPU isn't keeping up, so the CPU has to wait...
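A quick sanity check of that, using the rough numbers quoted in this thread and assuming the CPU does a fixed amount of work per frame (an assumption, not something measured here):

```python
# If CPU cost per frame is roughly constant, CPU utilization should
# scale linearly with FPS. Figures below are the approximate ones
# reported earlier in the thread for this mission.
gtx660_fps, gtx660_util = 35.0, 0.82      # ~35 FPS at ~80-85% CPU
hd6850_util = 0.60                        # HD6850 run showed ~60% CPU

ms_per_frame = gtx660_util * 1000.0 / gtx660_fps            # ~23 ms CPU work/frame
predicted_hd6850_fps = hd6850_util * 1000.0 / ms_per_frame
print(f"Predicted HD6850 FPS: {predicted_hd6850_fps:.0f}")  # ~26 FPS
```

If the HD6850 log shows FPS in that ballpark, the lower CPU usage is explained by the lower frame rate alone, with no need for a driver-overhead difference.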


First, I just want to say thanks to all the awesome forum members and mods... for the amazing job you are doing.

So, I have an ASUS K53SD (i5-2450M Sandy Bridge @ 2.5 GHz, 2 cores / 4 threads, 4 GB of RAM which I'll try to upgrade to 8 GB, and both a GeForce 610M and Intel HD 3000). I've spent the last month reading this forum about the project, so I know the steps, but since my laptop doesn't appear among the projects done here, I'd like some assistance from you.

My questions:

First, I don't know whether I have an ExpressCard slot or not, so I think I'll just use my wifi mini PCIe slot with a PE4H + PM3N. Can it be done or not?

Second, I want to plug in a GTX 760 or a GTX 660 Ti and, of course, buy a separate PSU for it. Would either of them work well with the PE4H-PM3N?

The most important question: I don't own an external screen, so I want to use my laptop's LCD; this is why I want to do this build. Can it be done? And how, with steps, please?

Finally, I just want this build at 1366x768, and most importantly only for The Witcher 3, so please help me.

Here are the screenshots from CPU-Z and GPU-Z:

post-21234-14494996772201_thumb.jpg

post-21234-14494996772389_thumb.png

with all due respect


No, you don't have an ExpressCard slot, and the PE4H 2.4 is Gen1 (2.5Gbps). You'll need a PE4L v2.1b or a PE4H v3.2; both are Gen2 and sadly don't have detachable cables! So the PE4L v2.1b is good enough.

This guy was using a 560 Ti with a laptop almost the same as your Asus:

http://forum.techinferno.com/diy-e-gpu-projects/2109-diy-egpu-experiences-%5Bversion-2-0%5D-19.html#post30125

Strangely, in his next post he says he has a 660 Ti: http://forum.techinferno.com/diy-e-gpu-projects/2109-diy-egpu-experiences-%5Bversion-2-0%5D-32.html#post31995

Check whether you have access to a free mPCIe port, or at least to the wifi one. I'm guessing that if it worked for him, the port isn't whitelist-restricted, but I can't be sure.

And to quote nando: If there is no BIOS option to disable your 610M, then yes, you will need DIY eGPU Setup 1.x to disable the 610M. Only then will the NVidia driver engage pci-e compression, netting you x1.2Opt performance.

I think that's also needed for routing to the internal screen.


Thank you, my dear friend, for your reply. So I need to check my BIOS to see if I can disable the 610M. But what are DIY eGPU Setup 1.x and x1.2Opt? And sorry for the trouble I'm giving you.

- - - Updated - - -

Notebook:

- ASUS A53SV

- i5-2410M 2.3GHz

- 4 GB of RAM

- Intel HD Graphics 3000 + Nvidia GeForce GT540M

- Windows 7 x64 Home Premium

eGPU:

- DIY eGPU Setup 1.1x

- PE4L-PM3N 2.1b

- MSI N560GTX-Ti-M2D1GD5/OC

- Nvidia desktop drivers 306.97

Benchmarks:

- 3DMark06: 15616 3DMarks (NVIDIA GeForce GTX 560 Ti, Intel Core i5-2410M, ASUSTeK K53SV)

- 3DMark Vantage coming soon

- 3DMark 11 coming soon

Hello friend, I have an ASUS K53SD, which is like your setup. May I ask you for the steps of how you did it, and how to route the setup's output to my laptop screen?


Thank you, my dear friend, for your reply. So I need to check my BIOS to see if I can disable the 610M. But what are DIY eGPU Setup 1.x and x1.2Opt? And sorry for the trouble I'm giving you.

Setup 1.x is donationware (you pay a donation to nando to receive the program) that does the following:

Setup 1.x is a FreeDOS environment used to configure your DIY eGPU before chainloading to your OS. Required if you:

  1. encounter WinXP/7 error 12: cannot allocate resources, requiring automated PCI reallocation or IGP relocation. Note: Win7 users may want to try a simpler 36-bit root bridge DSDT override instead of using Setup 1.x.
  2. want to use x1E/x2E/x2/x4 higher-performance pci-e links (x1E/x2E on Series-4 or older, x2/x4 on Series-5 or older chipsets)
  3. want to set pci-e 1.0 (2.5GT/s) or pci-e 2.0 (5GT/s) link speed (Series-6 or newer chipset)
  4. require mPCIe anti-whitelisting (HP/Lenovo) to allow a wifi slot to work with a PM3N
  5. want to disable a dGPU in a hybrid graphics system to free up resources to host the eGPU
  6. want to use an NVidia eGPU with Optimus instead of an NVidia dGPU (Series-6 chipsets)
  7. want to initialize an NVidia video card prior to Windows boot to prevent a hang/BSOD on startup or error 43 in Device Manager. Eliminates the need for a standby-attach-resume cycle to overcome this problem.

You might meet 1, 4, 5, 6, and 7 of what Setup 1.x does; more info can be found in the relevant thread: http://forum.techinferno.com/diy-e-gpu-projects/2123-diy-egpu-setup-1-x.html

x1.2Opt means one lane (x1) of pci-e Gen2 with Nvidia Optimus pci-e compression engaged, to save bandwidth and provide better performance (compared to plain x1.2, especially on the internal screen and in DX9 games).

Basically, it's what you'll probably want, since engaging Optimus is also what enables you to reroute the eGPU's output to your laptop's internal screen.
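For intuition on why compression matters so much for the internal screen: every rendered frame has to travel back to the iGPU over the same x1 link. A rough framebuffer-traffic sketch (plain frame-size arithmetic; actual Optimus transfer formats and compression ratios aren't public, so treat this as an illustration):

```python
# Uncompressed frame traffic needed to display eGPU output on the
# internal LCD, versus what a single pci-e lane can carry.
width, height, bytes_per_pixel, fps = 1920, 1080, 4, 60
frame_mb = width * height * bytes_per_pixel / 1e6    # ~8.3 MB per frame
traffic_mb_s = frame_mb * fps                        # ~500 MB/s at 60 FPS

x1_gen2_mb_s = 500   # effective x1 Gen2 bandwidth from the earlier sketch
print(f"Framebuffer traffic: {traffic_mb_s:.0f} MB/s")
print(f"Share of the x1 Gen2 link: {traffic_mb_s / x1_gen2_mb_s:.0%}")
```

Uncompressed 1080p60 would eat essentially the whole lane just shipping frames back, which is why engaging Optimus compression is the difference between internal-LCD mode being usable or not.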


Yooooo man, you are a life saver, thanks! Where can I donate?



And another question (I know I'm being heavy on you, man, sorry): I searched on ebay and found the PE4L-PM060A v2.1b at https://www.ebay.com/itm/PE4L-PM060A-v2-1b-PCIe-to-Mini-Card-adapter-/321250397773?pt=US_Motherboard_Components&hash=item4acc040e4d but I see it has a very small bus slot. Can the 660 Ti or the 760 fit properly on it?

post-21234-14494996774257_thumb.jpg


Yes, the slot is open at the end, so any card will fit it. But since it's small, you usually have to put something underneath to balance the card properly, or build something for it. I myself used paper, folded several times, to lift the back of the card, and I stuck the lower edge of the front bracket into a cardboard box. You can see how to donate to nando for Setup 1.x in the Setup 1.x thread I pointed to.

I suggest you get all the required parts before getting Setup 1.x; nando sends you a link by email pretty fast. Also consider buying the PE4L from bplus directly:

http://www.bplustech.com/Adapter/PE4L-PM060A%20V2.1.html

I got mine from them by DHL in less than a week (international).


Hey, I have an M15x with a 720QM. Can an eGPU work? I don't care if I have to use an external monitor.


Hello, here's what Nando4 said:

The i7-720QM/i7-740QM has no iGPU. For Optimus to engage on an M15x R2, you'd need a dual-core i5/i7 (with an iGPU), and the BIOS would need to enable and route the iGPU to the internal LCD. A quick Google search suggests the M15x R2 only has the dGPU routed to the internal LCD.

Sure, your system might be x2 capable (check if your ports meet the criteria). However, an x2 eGPU implementation is more costly (PE4H 2.4 + mPCIe/EC or mPCIe/mPCIe + Setup 1.x), since it requires more parts and more configuration tinkering, and it is more cumbersome to start up, as you need to access the underside of the system to attach cables. Even then you'd only be running x2 1.0, which is slower than the x1.2Opt you'd get with a Sandy/Ivy Bridge system with an iGPU, using only a single cable.

So I'd recommend once again offloading your machine for a faster, more battery-efficient Sandy/Ivy Bridge eGPU candidate. With the lower eGPU cost, and factoring in the sale of your current system, you might find the final solution cheaper than extending the existing M15x R2 too, and it will outperform it by a considerable margin.


So I'm trying to get a 6950 to work with an x230 Tablet, and all I get is "No drivers are installed for this device." in Device Manager (which, according to some, is Windows 8-speak for Code 12, insufficient resources - though Windows 8 has given me a proper Code 12 in specific cases). Additionally, on boot (or hotplug), Windows tends to be really laggy for a few minutes.

What's really interesting is that the setup works fine with a GTX660 (since the BIOS does dynamic TOLUD, where PCI gets allocated the range starting at CFA00000 instead of the DFA00000 used with no eGPU present). However, with the 6950, the PCI address range still starts at DFA00000.

Setup information:

- OS: Windows 8

- 8GB RAM

- Hardware: possibly the cheapest, hackiest, literally-DIY eGPU connector: DIY eGPU experiences - Page 763 (this may be a cause, especially if the AMD card interprets some signals differently than nVidia cards. But PCIe should be pretty standard...)

Anyways, here are some things I've tried:

- DSDT override (possible in Windows 8 by enabling testsigning in bcdedit). Gives a BSoD even with no modifications (other than one to resolve a syntax error; I used the latest iASL from ACPICA for extraction and compilation). This would probably be the ideal solution, but I've also heard of others trying DSDT mods on the x230 with similar (negative) results. Are there ways to debug this?

- AMD DNA modded drivers. Difficult to find (scrounged up v11.10), and doesn't help. Also had to work around the amazingly annoying Windows 8 driver signature enforcement.

- Both Gen1 and Automatic settings for the ExpressCard link speed. Automatic seems to get the BIOS to detect *something* (in my experience with the GTX660, when the machine "restarts" during BIOS POST, it's indicative of a TOLUD change); however, the PCIe memory range stays at DFA00000.

- Connecting the GTX660 at POST, then hotplugging the 6950 from sleep (in the hopes that the hotplug would work with the GTX660 TOLUD settings). Same "No drivers are installed for this device" error.

And some things I know that I should try, but haven't:

- <4GB DRAM, but I don't have access to any small sticks at the moment.

- Setup 1.x. The free version from way back when didn't work, and I'm reluctant to pay 4 times the cost of the PCIe connector for software which may or may not fix this problem.

Additional questions:

- Does anyone know if there are significant differences in PCIe memory / wiring requirements between the GTX660 and 6950? (also note: I've heard somewhere that the GTX660 has lower-than-usual memory requirements)

- Is there a good way to debug a DSDT override?

- Anyone else had similar experiences with similar hardware?

- Anything else that might be worth trying?

Thanks!


It sounds like the BIOS only checks for Nvidia device IDs to decide whether it should lower the TOLUD. A TOLUD of DFA00000 is too high to get the card working, even with the reallocation script in the Setup program. You must either lower the TOLUD or get the DSDT override working, or the card will never work.

As for the lower-than-usual memory requirements: Nvidia GPUs do not require as much PCI address space as AMD GPUs. AMD needs a solid 256MB block, while Nvidia needs multiple smaller blocks, which are more flexible and take up less space overall. You should be able to get the AMD card working anyway, though.
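To make the TOLUD values concrete, here's the 32-bit MMIO arithmetic those hex addresses imply (the window sizes follow directly from the addresses quoted above; exact chipset reservations near 4GB vary by machine, so this is a sketch):

```python
# 32-bit MMIO window available to PCI devices for each TOLUD value.
# Address space from TOLUD up to 4GB is what device BARs, including
# a GPU's 256MB aperture, must fit into.
TOP = 0x1_0000_0000   # 4 GB boundary

for name, tolud in [("6950 / no eGPU", 0xDFA0_0000),
                    ("GTX660 detected", 0xCFA0_0000)]:
    window_mb = (TOP - tolud) // 2**20
    print(f"TOLUD {tolud:#010x} ({name}): {window_mb} MB below 4GB")
# 0xdfa00000 leaves ~518 MB, 0xcfa00000 leaves ~774 MB. The extra
# 256 MB is exactly the contiguous block an AMD card needs.
```

That lines up with the behaviour described: the BIOS lowers the TOLUD only when it recognises the Nvidia card, so the AMD card never gets a large enough window.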


Bitminers?

Any bitmining folks out there? Surprised those users aren't using eGPUs to do their mining. Way better returns than using an energy-thirsty desktop to drive their Radeons. Seems the financial comptrollers are a bit worried about Bitcoin, since they can't control it.

Me! Although it isn't that great for GPUs anymore. I am running dual 5 GH/s ASICs now, which run circles around any GPU. Now, with 28nm ASICs coming out, my current ones are kind of useless. GPUs aren't overly useful for BTC mining, but they're still great for LTC and other altcoin mining. Pretty much anything scrypt-based is fine for GPUs. And there's no hashrate reduction when mining eGPU versus non-eGPU, which is a nice plus.
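For anyone wondering how badly GPUs lose to ASICs here, expected return is just your share of the network hashrate. A hedged sketch of that arithmetic (the 5 GH/s figure is from the post above; the GPU hashrate and network hashrate are illustrative assumptions for late 2013, not measurements):

```python
# Expected coins per day = (your hashrate / network hashrate)
#                          * block reward * blocks per day.
def coins_per_day(my_rate: float, net_rate: float,
                  reward: float, blocks_per_day: float) -> float:
    return my_rate / net_rate * reward * blocks_per_day

NET = 5e15                  # assumed ~5 PH/s BTC network hashrate
REWARD, BLOCKS = 25.0, 144  # 25 BTC per block, one block per ~10 min

print(f"GPU  0.6 GH/s: {coins_per_day(0.6e9, NET, REWARD, BLOCKS):.6f} BTC/day")
print(f"ASIC  10 GH/s: {coins_per_day(10e9, NET, REWARD, BLOCKS):.6f} BTC/day")
```

Scrypt ASICs didn't exist yet, which is why GPUs (eGPU or not) were still worth pointing at LTC.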


hey guys,

my new working setup:

MacBook Pro 13" Retina (Late 2013)

Windows 8.1

Intel i7 2.8GHz

16GB Ram

NVIDIA GTX 570

Vidock 4+ & Sonnet Echo Pro

I will post some benchmarks in the next few days.

First 3DMark06 run with external display: 21599

I will give you the links to this score and the others (internal + external).

I also got a Sonnet Express III and will do some tests with it, too.

Here is the proof (internal and external):

post-8492-14494996784187_thumb.jpg

post-8492-14494996784672_thumb.jpg

  • Thumbs Up 1
Link to comment
Share on other sites

~US$250 Silverstone T004 450W Thunderbolt eGPU enclosure ETA: Q1-2014 with Apple support

Found this nugget on SilverStone's Facebook page:

I'm sure this will turn out better than the MSI GUS II, which wasn't released due to the absence of Mac support, a result of Apple not wanting to test it and make it work on Macs. In an email from a SilverStone representative, I found out that the T004 project is receiving support from Apple, but Intel doesn't want to support it. I also found out that they expect to release it in the first quarter of 2014. It's looking pretty good. Although, if it doesn't get released, I'll be pretty suspicious of Magma, since the only two lower-cost GPU enclosures would have become vaporware.

@Tech Inferno Fan Has anyone revisited SLI/CF yet? Like trying HyperSLI, or modifying the reported pci-e speed so the driver allows it?

I've ordered three 1x-to-16x pci-e risers, a CF bridge and an SLI bridge to try on my PC first, but I don't have two graphics cards yet.

Unfortunately, the NVidia driver requires an x4 link for the SLI option to appear. Not sure what requirements, if any, exist for CF. I guess you may be the first to report what happens when you attach two AMD cards to a system with a CF bridge.


So how does one go about faking an x4 link? I'm looking for any deals until my bridges get here (AMD cards are in short supply).

I've also tried asking the litecoin folks to do a test, since they're all running multiple AMD GPUs on 1x risers to mine, which is perfect to test on! But they didn't respond; I guess they'd want some LTC to even "waste" their time on it.

Here's my thread asking:

Can someone test something for me? : litecoinmining

Here's another guy offering a reward:

LTC REWARD FOR HELPING ME GET MY 2x R9 290 MINER WORKING! I HAVE SPENT 30+ HOURS ON IT AND IT WILL NOT MINE! I KNOW WHAT IM DOING BUT IT WONT MINE! HELP PLEASE! : litecoinmining

10 people joined Skype to help him :P (I was one of them; I helped him so he could help me...)



I asked my friend here; he just has to find his CrossFire cables. We'll test CrossFire with both cards at 16x, then tape off the lanes for 1x on each and see if it still works. Not sure when he'll get around to it; I'll keep pestering him till he does, lol.

I'll meet up with him Sunday. We'll do it then.

--

Minor update:

Going to try CrossFire with a 7970 and a 280X (both Tahiti), which is shown to work natively in desktops. I will try CrossFire at a 1x link as well. If that does work, it would be neat to try it as an eGPU too, but I don't have an mPCIe PE4L adapter to attempt that with. Perhaps someone can lend me one if the 1x link does work (in the desktop)? If you live in eastern Texas, PM me; perhaps one of you is near me!
