squinks

2013 15" Macbook Pro GT750M + GTX780Ti@16Gbps-TB2 (Sonnet III-D) + Win8.1 [squinks]


Hi everyone. New to Tech Inferno. You can call me Stu. Nando asked me to post my setup and results which I'm happy to do.

 

I'm going to keep this as simple as possible since this eGPU project has been just that, simple and awesome. In my eyes, this is where the eGPU game ends. Thunderbolt 2 (and TB3 coming soon), full-size chassis, fastest single GPU on the market, running at desktop speeds. I suppose the only caveat is cost, but considering I only need to upgrade my GPU from this point on, it’s worth it. I live an hour away from Sonnet's headquarters in California and decided I had waited long enough to complete my eGPU experiment. The results surpassed expectations.

 

Feel free to scroll down for specific benchmark and game results.

 

Key Points

Completely plug and play. Standard Bootcamp 5.1 install (*see below), connect everything up, boot Windows, install drivers, you’re done.

eGPU vs Desktop performance: 85-90%

Gaming: Max out virtually any game. 60FPS+ (with a few exceptions; e.g. Black Flag has weird PhysX that kills performance, so I turn it off)

Discrete 750M graphics = No Optimus/Internal display support. MBP w/Iris only = Optimus support but not fully plug and play (**see below)

Simulated Optimus FPS loss: 5-20% (window drag method)

eGPU Setup Cost (Not including MacBook): $1500 to $2000 depending on GPU purchase

 

*It appears only 2013 MBPs with the newer PCIe flash storage default to a UEFI/GPT-based Windows installation with Bootcamp 5.1. The 2012 rMBP (and possibly other Macs/notebooks) will not format to GPT, and therefore no plug and play

**Conclusive Results for 2013 13" MacBook Pro - Optimus - Thunderbolt 2 - Plug and Play
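As a quick sanity check on the cost estimate above (simple arithmetic using the prices quoted in the Configuration section of this post):

```python
# Tally the quoted component prices (MacBook excluded) against the
# "US$1500 to $2000" estimate; GPU choice accounts for the spread.
parts = {
    "Sonnet Echo Express III-D chassis": 979,
    "EVGA GTX 780 Ti Superclocked": 720,
    "Corsair RM450 PSU": 100,
}
total = sum(parts.values())
print(f"Total: US${total}")  # US$1799, inside the quoted range
```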

 

Configuration

2013 Macbook Pro 15” w/GT 750M, i7-4850HQ @ 2.3GHz, 16GB DDR3L, 512GB SSD (US$2599)

Sonnet Thunderbolt 2 III-D Chassis (US$979)

EVGA Nvidia Geforce GTX 780 Ti Superclocked 3GB (US$720)

Corsair RM450 (Silent) Power Supply (US$100) for the 8-pin cable only (powered it on standalone by jumping the PSU with a small piece of metal. Google it)

Standard Bootcamp 5.1 (UEFI) Installation

Windows 8.1 w/latest Nvidia drivers

MSI Afterburner with custom fan curve, gpu temp/fan speed match (e.g. 68 degrees/68% fan speed)
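The temp/fan-speed match above is just a 1:1 curve clamped to sane limits; a minimal sketch of the idea (`fan_speed` is a hypothetical helper for illustration, not Afterburner's actual config format):

```python
def fan_speed(temp_c, floor=30, ceiling=100):
    """1:1 curve: fan duty (%) tracks GPU temperature (deg C), clamped."""
    return max(floor, min(ceiling, temp_c))

print(fan_speed(68))  # 68 -> 68% fan, matching the 68 degrees / 68% example
```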

 


 

 

 

— Performance and Benchmark Results —

 

Maximum Overclock Scores:

FireStrike Graphics: 11227 link

*Surpasses Titan and 780 Ti graphics scores (without overclock) by 1000+ points: comparison [Note: this is only one sample]

3DMark11 Score: 12781 link

3DMark11 Graphics: 13996 link

3DMark-Vantage Graphics: 46629 link

3DMark06 Score: 29254 link

Unigine Valley Extreme HD: 70FPS / 2924

Unigine Heaven 4.0 Extreme: 67FPS / 1683

BioShock Infinite Benchmark (UltraDX11): 126FPS Overall

 

 

eGPU vs Desktop performance

 

Fire Strike comparison (Desktop 780 Ti vs eGPU 780 Ti SC)

Reference: link

Graphics Score

Desktop: 11096

eGPU: 10410

Ratio: 93% (take into account reference doesn't mention overclock)

 

Unigine Valley comparison (Desktop vs eGPU)

Reference: link

Desktop FPS/Score: 73.1/3057

eGPU FPS/Score: 60.2/2520

Ratio: 82% (take into account desktop CPU which offsets results somewhat)

 

Bioshock Infinite

Reference: link

Ratio: 80-93% (calculated at multiple frame stops)

Another Unigine Valley Comparison

Reference: link

Ratio: 91% (no overclock mentioned)

 

Overall eGPU Performance vs Desktop Performance: 80-95% (games and benchmarks consistently show this)
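Those ratios follow directly from the scores quoted above; recomputing them:

```python
# eGPU-vs-desktop ratios from the scores listed in the comparisons above.
comparisons = {
    "Fire Strike graphics": (10410, 11096),  # eGPU vs desktop 780 Ti
    "Unigine Valley score": (2520, 3057),    # eGPU vs desktop
}
for name, (egpu, desktop) in comparisons.items():
    print(f"{name}: {egpu / desktop:.1%}")
# Fire Strike graphics: 93.8%
# Unigine Valley score: 82.4%
```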

 

Internal Display FPS Loss (window drag method)

Overall internal display FPS loss: 5-20%

Unigine Heaven: 16% (53FPS vs 63FPS)

Borderlands 2: 5-10%

 

CUDA-Z Bandwidth

Host to Device: 1258 MiB/s

Device to Host: 1366 MiB/s

Device to Device: 136 GiB/s

Reference Host to Device

TB1 10Gbps: 781MiB/s link

TB1 8Gbps (x2 2.0): 697MiB/s link
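A rough sanity check on those CUDA-Z numbers (my arithmetic, not from CUDA-Z; the per-lane figure is the PCIe 2.0 spec rate of 5 GT/s with 8b/10b encoding):

```python
# TB2 carries an x4 PCIe 2.0 link: 4 lanes x 500 MB/s theoretical.
lanes = 4
lane_Bps = 5e9 * (8 / 10) / 8          # 500 MB/s per PCIe 2.0 lane
theoretical_MiBps = lanes * lane_Bps / 2**20   # ~1907 MiB/s
measured = 1258        # CUDA-Z host-to-device on this setup, MiB/s
tb1_measured = 781     # TB1 10Gbps reference above, MiB/s
print(f"theoretical: {theoretical_MiBps:.0f} MiB/s")       # 1907
print(f"efficiency:  {measured / theoretical_MiBps:.0%}")  # 66%
print(f"vs TB1:      {measured / tb1_measured:.2f}x")      # 1.61x
```

The measured 1.61x over TB1 lines up with the "60% performance improvement" nando cites further down the thread.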

 

Unigine Heaven (Basic 720p)

107 FPS

Score: 2716

 

Unigine Heaven (Extreme 1080p 4XAA)

62.7 FPS

Score: 1580

 

Unigine Heaven (Extreme 1080p 8XAA)

54.2 FPS

Score: 1364

 

Unigine Valley (Basic 720p)

80FPS

Score: 3343

 

Unigine Valley (Extreme 1080p 2XAA)

78.6 FPS

Score: 3290

 

Unigine Valley (Extreme HD 8XAA)

60.2 FPS

Score: 2520

 

3DMark11

Score: 11269 link

Graphics: 12576

Physics: 8395

 

3DMark (2013)

Fire Strike

Score: 8807

Graphics: 10410

Physics 8102

Cloud Gate

Score: 18795

Graphics Score: 57882

Physics Score: 5588

 

Call of Duty: Ghosts

Max settings 1080p 2x AA: 60FPS+

 

Tomb Raider

Ultimate (Tess. hair off) 1080p: 60-100FPS

 

Crysis 1

Very High (Maxed) 2xAA 1080p: 60-90FPS (Fly-through Benchmark)

 

Crysis 3

Very High (Maxed) SMAA 1080p: 40-60FPS

 

Nvidia Demo - A New Dawn: 31FPS

BioShock Infinite Official Benchmark - 1080p UltraDX11: All scenes average: 108FPS

 

What about SLI?

SLI Success! 2x 780Ti + 2x Sonnet SEL on MacBook Pro @32Gbps TB2 (2x 16Gbps)

 


 

External discussion about this post:

AnandTech: Running An Nvidia GTX 780Ti over Thunderbolt 2

TechReport: Thunderbolt box mates MacBook Pro with GeForce GTX 780 Ti

MacRumors: 2013 15" Macbook Pro + GTX780Ti@16Gbps Thunderbolt2 eGPU implementation

YouTube: MacBook Pro running an NVIDIA GTX 780 Ti over Thunderbolt 2

PC Perspective: NVIDIA GTX 780 Ti on Thunderbolt 2 by DIYers

Linustechtips: Running an NVIDIA GTX 780 Ti Over Thunderbolt 2


@squinks , thank you for posting this exclusive info. Yes, this is a big deal for the eGPU community. Game-changer? Sure. Here we see a desktop-performance level pluggable eGPU opening creative doors in gaming, finance, video production, engineering and just sheer number crunching. Multi-GPU CUDA processing can give supercomputer power in your home or office :)

The US$979 20Gbps Sonnet Echo Express III-D is, however, too costly for most users. The good news is there are cost-friendly US$270-16Gbps and US$199-10Gbps enclosures available.

The $199-10Gbps Firmtek solution sees you get less bandwidth for a lower price and makes 4Gbps Expresscard-to-Thunderbolt (eg: PE4L-EC060A or Villagetronic ViDock) eGPUs redundant - they are overpriced and underperforming. Details about these more affordable Thunderbolt enclosures and how to use them are at http://forum.techinferno.com/diy-e-gpu-projects/6578-implementations-hub-tb-ec-mpcie.html#Thunderbolt .

Iris Pro 15" Macbook is preferred to the GT750M model for NVidia eGPU purposes

That's because the GT750M model doesn't have an active iGPU in Windows due to flawed Apple firmware. MacOSX does allow switching on the iGPU. The iGPU is required for

- Optimus internal LCD mode, where the image is rendered by the eGPU and displayed by the iGPU-attached internal LCD. The best the GT750M model can do is drag a Windows app across from the external LCD to the internal one.

- If using a cost-effective x1 eGPU implementation, consisting of a PE4L-EC060A (Expresscard-to-pcie adapter) and a Sonnet Echo Expresscard Pro (Thunderbolt-to-expresscard), then the iGPU becomes even more important. There the NVidia driver engages x1 pcie compression, greatly accelerating DX9 and, to a lesser degree, DX10 apps.

- and of course, the iGPU-only model will give better battery life under Windows.


Good one squinks.

A quick question: did you have to modify the Sonnet's PCIe slots to allow the card to be inserted?

I have all the parts for the same thing with the Sonnet Echo Express SE II and a GTX760, but when I tried to cut off the end of the PCIe slot on the Sonnet's motherboard I broke it - a new one is on its way now from Sonnet. For anyone reading, the Sonnet Echo Express SE II has a 4x slot with solid ends so a 16x card won't fit in. I found out too late that the correct way to cut the PCIe slot is to heat up a scalpel-type knife on your stove until very hot; then it will cut through the plastic with very little force.

Thanks


I can't wait to try out Thunderbolt 1 with a 780ti and a little more powerful CPU than you have in your Macbook (3840QM overclocked to 4.1GHz). I will be posting results similar to yours.


Wow @Relentless, sorry about the mishap. The Sonnet III-D has a 16x slot for full length, full width cards. No modifications. Pure plug and play

- - - Updated - - -

I'll be interested to hear your results @ha1o2surfer. You may experience a performance limit with Thunderbolt 1 but you never know until you try. Best of luck

- - - Updated - - -

Hi @sgluhov, the smaller chassis from Sonnet does have TB2; however, it doesn't support full-length, full-width cards. It only has 8x slots, so cutting is required (read the comment Relentless just made). The PSU for the PCIe board also has less power, so that may impact performance or be insufficient altogether. There may be other limitations I've heard about with that chassis regarding its full plug and play capability. My opinion: why bother with a smaller solution when the results are questionable? Pay a little more and know it works. Just my 2 cents.

- - - Updated - - -

Thanks @nak1017 and @sskillz. Appreciate the comments. The display is an ultra-slim Samsung S23C570 I picked up at Costco. Really nice for what I paid.

Wow @Relentless, sorry about the mishap. The Sonnet III-D has a 16x slot for full length, full width cards. No modifications. Pure plug and play

Thanks squinks.

For those playing at home this may be the biggest reason to get the big Sonnet box. I know it is actually not that hard to do the modification of the Sonnet's PCIe box but it was pretty stressful at the time - and it turned out that I broke it anyway ;) The next one should run more smoothly.

P.s congratulations on this: AnandTech | Running an NVIDIA GTX 780 Ti Over Thunderbolt 2


Featured on Anandtech? Didn't see that coming.


Impressive work squinks! The whole eGPU setup cost more than $2000 :numbness:, which is more than the computer itself!

Multi-GPU CUDA processing giving supercomputer power in your home or office :)

Assuming you can attach another eGPU to the other Thunderbolt port and successfully utilize it, the costs would total $4000+! That isn't a cost most people can justify at home. Higher-tier CUDA products (the NVIDIA Quadro line) cost more per GPU, pushing the total even further!

Impressive work squinks! The whole eGPU setup cost more than $2000 :numbness:, which is more than the computer itself!

Assuming you can attach another eGPU to the other Thunderbolt port and successfully utilize it, the costs would total $4000+! That isn't a cost most people can justify at home. Higher-tier CUDA products (the NVIDIA Quadro line) cost more per GPU, pushing the total even further!

squinks has demonstrated that it is doable. We know too that SLI requires a x4 link which a second Thunderbolt port would provide, or even using a pcie bridge as is found in the multi-slot Sonnet TB enclosure products.

Yes, cost is the detracting factor here. Anandtech's article does show a whole desktop system will cost less than this. Why does the Sonnet III-D enclosure cost US$979? I've seen figures like US$20 being bandied around for the cost of the Thunderbolt chip. So is there a >4000% markup for the enclosure? Why is Intel forcing vendors to provide an enclosure solution as part of certification? Or is this a case of monopolizing the market?

This requirement for an enclosure and the huge cost markup (profiteering) on it is absurd. The engineering to add an enclosure or pcie slot around a Thunderbolt chip is not at all complex. An enclosure can even detract from the usability of the product. I'd prefer to NOT have an enclosure. Rather, just a board like a PE4H 2.4 with a locked pci-e slot. If I need extra protection I'll just purchase a video card with a backplate option. Then the whole thing is much smaller AND has no ventilation issues.

The best thing is that the financial constraints of this solution are coming to the fore, and users will start investigating which part of the supply chain for these Thunderbolt enclosures is making the cash grab.

squinks has demonstrated that it is doable. We know too that SLI requires a x4 link which a second Thunderbolt port would provide, or even using a pcie bridge as is found in the multi-slot Sonnet TB enclosure products.

Yes, cost is the detracting factor here. Anandtech's article does show a whole desktop system will cost less than this. Why does the Sonnet III-D enclosure cost US$979? I've seen figures like US$20 being bandied around for the cost of the Thunderbolt chip. So is there a >4000% markup for the enclosure? Why is Intel forcing vendors to provide an enclosure solution as part of certification? Or is this a case of monopolizing the market?

This requirement for an enclosure and the huge cost markup (profiteering) on it is absurd. The engineering to add an enclosure or pcie slot around a Thunderbolt chip is not at all complex. An enclosure can even detract from the usability of the product. I'd prefer to NOT have an enclosure. Rather, just a board like a PE4H 2.4 with a locked pci-e slot. If I need extra protection I'll just purchase a video card with a backplate option. Then the whole thing is much smaller AND has no ventilation issues.

The best thing is that the financial constraints of this solution are coming to the fore, and users will start investigating which part of the supply chain for these Thunderbolt enclosures is making the cash grab.

I agree that enclosures aren't always preferred. Aside from the actual costs that go into making the enclosure, there is no real competition for cheap solutions on the market. MSI's GUS II was announced two years ago (The Verge briefly showed it in January 2012), but we still don't see it on the market (people speculate ~$150 or more). I recently came across one of your old posts (here), and I would go with hwtools' recalled TH05 if I could (the product page doesn't exist "anymore" but here).

Also, Thunderbolt hasn't been widely adopted by most laptop manufacturers. Dell has yet to announce a mobile Precision with Thunderbolt, and HP has pretty much left the laptop PC business (Elitebooks with/without Premiercolor). I would wait on a cheaper Thunderbolt eGPU solution or just go with an expresscard/mPCIe solution.


Here is my conspiracy theory:

Intel doesn't want eGPUs to happen through TB (or at all), so they make them financially unviable.

The reason I've figured out for this: to get maximum performance, a fairly powerful Intel quad-core desktop can be procured for the same price as the III-D.

They don't want a docked laptop to be as fast as a desktop, so when someone needs mobility and desk-bound performance, they are forced into buying a laptop and desktop both with Intel silicon.

The same shows with Ultrabooks: they are not pushing DisplayPort at all, and when you do get one with DisplayPort, MST isn't supported, so you can't easily run multiple monitors. I think Intel now sees most laptops as companion devices, or at least more dumbed down than proper EliteBooks.

This is further evidenced by the dropping of 16x 3.0 PCIe lanes from Haswell ULV.

I know many businesses would be served fine by eGPU laptops, but as it's not a viable solution they would rather just have to buy two separate machines, thus more money into Intel's pockets.

Same goes for NVidia, they would rather you have to buy a gaming laptop, and a desktop, both with GeForce, than just one GPU.

Wow @Relentless, sorry about the mishap. The Sonnet III-D has a 16x slot for full length, full width cards. No modifications. Pure plug and play

- - - Updated - - -

I'll be interested to hear your results @ha1o2surfer. You may experience a performance limit with Thunderbolt 1 but you never know until you try. Best of luck

- - - Updated - - -

Hi @sgluhov, the smaller chassis from Sonnet does have TB2; however, it doesn't support full-length, full-width cards. It only has 8x slots, so cutting is required (read the comment Relentless just made). The PSU for the PCIe board also has less power, so that may impact performance or be insufficient altogether. There may be other limitations I've heard about with that chassis regarding its full plug and play capability. My opinion: why bother with a smaller solution when the results are questionable? Pay a little more and know it works. Just my 2 cents.

- - - Updated - - -

Thanks @nak1017 and @sskillz. Appreciate the comments. The display is an ultra-slim Samsung S23C570 I picked up at Costco. Really nice for what I paid.

I don't care about their case - I can easily use it without the case with my own ATX power brick. And the price is $400, so I am saving $600. Why not?

I don't care about their case - I can easily use it without the case with my own ATX power brick. And the price is $400, so I am saving $600. Why not?

The $399 Sonnet Echo SEL's specs say it uses an x8 slot that's electrically x4 2.0 (that would be 16Gbps on a Thunderbolt2 port).

You'd require a US$6 x8-to-x16 PCIe riser to extend the slot outside the chassis, as well as an additional ATX PSU to supply pcie power if using a video card requiring > 80W.

Far more financially viable than the $979 Echo Express III-D demoed in this thread, while providing identical performance.


squinks has updated his opening post to this thread with more details. Important findings being:

- TB2 is a x4 2.0 16Gbps pci-e link, giving a 60% performance improvement over TB1's 10Gbps. TB2 is 20Gbps (max) for pcie + DP traffic.

- His i7-4850HQ + GTX780Ti-OC@16Gbps performance sees 3dmark results placing him on top of the leaderboard for all except the 3dmk11 result. Another highly OCed GTX780 just slightly outbenchmarks him there. It should be noted that squinks' Win8.1 implementation 3dmark results will be lower than the Win7 comparative ones.

- Upon detecting the pci-e SSD found in mid-2013 MBA/MBP systems, bootcamp 5.1 will create GPT Windows partitions for UEFI PnP use.

- An MBR installation was attempted via Bootcamp 4.0, but the Apple firmware would power off the system when attempting to boot Windows with the eGPU attached, in the same way as described at http://forum.techinferno.com/diy-e-gpu-projects/3062-%5Bguide%5D-2012-13-mbp-gtx660ti-hd7870%40x2-2-th05.html#post42483 . squinks didn't have an eGPU PCI Reset Delay circuit as a workaround.

Here is my conspiracy theory:

Intel doesn't want eGPUs to happen through TB (or at all), so they make them financially unviable.

The reason I've figured out for this: to get maximum performance, a fairly powerful Intel quad-core desktop can be procured for the same price as the III-D.

They don't want a docked laptop to be as fast as a desktop, so when someone needs mobility and desk-bound performance, they are forced into buying a laptop and desktop both with Intel silicon.

The same shows with Ultrabooks: they are not pushing DisplayPort at all, and when you do get one with DisplayPort, MST isn't supported, so you can't easily run multiple monitors. I think Intel now sees most laptops as companion devices, or at least more dumbed down than proper EliteBooks.

This is further evidenced by the dropping of 16x 3.0 PCIe lanes from Haswell ULV.

I know many businesses would be served fine by eGPU laptops, but as it's not a viable solution they would rather just have to buy two separate machines, thus more money into Intel's pockets.

Same goes for NVidia, they would rather you have to buy a gaming laptop, and a desktop, both with GeForce, than just one GPU.

I agree. The least expensive eGPU chassis from Sonnet is $400, and for that price you can buy an IBM-clone desktop. Most people aren't going to spend $400 on the chassis alone; I certainly wouldn't be willing to spend that much on just that one part of an eGPU setup. $400 is the most I'm willing to spend on an eGPU as a whole. This is why I've created a petition at Change.org regarding this issue: https://www.change.org/petitions/intel-allow-silverstone-and-asus-to-sell-the-asus-sg-station-2-and-allow-villagetronics-to-sell-the-4th-generation-vidock-in-essence-allow-the-sale-of-affordable-egpu-enclosures


Hey squinks, I've a question to your nice hardware setup:

Is the RM450 really enough for the EVGA GTX 780 Ti? I have an EVGA GTX 760, and the manual says it needs a 500W PSU. Currently I'm running into some problems (freezing) that may be due to an insufficient power supply. I'm using the Sonnet SEL.


They recommend 500W because 1. it needs to power the CPU/mobo too, and 2. people buy shitty PSUs that are rated higher than what they can actually deliver.

That card will draw at most 300W alone, and probably even less than that.
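To put numbers on that (these are the PCIe spec power-delivery ceilings, not measurements of this card):

```python
# PCIe power-delivery limits per the spec: 75W from the slot,
# 75W per 6-pin connector, 150W per 8-pin connector.
SLOT_W, SIXPIN_W, EIGHTPIN_W = 75, 75, 150
# A 780 Ti has one 6-pin and one 8-pin connector:
max_draw = SLOT_W + SIXPIN_W + EIGHTPIN_W
print(f"max spec draw: {max_draw} W")  # 300 W ceiling; TDP is lower in practice
```

With no CPU or motherboard to feed, a quality 450W unit has comfortable headroom over that ceiling.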


@squinks I have a mid-2014 Macbook Pro 15" 2.8GHz (4.0GHz Turbo Boost), 16GB RAM, GT 750M, 1TB PCIe SSD + (external GPU) GTX 780 Ti SC + Sonnet Thunderbolt 2 III-D chassis. I'm getting the same +/- scores as you, with and without the external PSU (Corsair RM550). So my question is: do I really need to lug around my 550W PSU when clearly it's not a factor in stability or benchmarks? Will or can I damage the GTX 780 Ti by not using the external PSU?


@squinks, @ReaverM1

I see you guys have the GT 750M. I also have the rMBP with the GT 750M and was really excited about eGPU-ing it. However, the inability to use the rMBP's LCD might halt my efforts (I don't actually own an external LCD). Is there absolutely no chance to use the internal LCD? No workarounds, no hot-swaps, no special Windows boot schemes/builds?

Hey squinks, I've a question to your nice hardware setup:

Is the RM450 really enough for the EVGA GTX 780 Ti? I have an EVGA GTX 760, and the manual says it needs a 500W PSU. Currently I'm running into some problems (freezing) that may be due to an insufficient power supply. I'm using the Sonnet SEL.

I am having freezing as well, WITH a psu that meets my requirements.

I am using the Echo SE II.

My details are here:

What have you tried for debugging?

I am having freezing as well, WITH a psu that meets my requirements.

I am using the Echo SE II.

My details are here:

What have you tried for debugging?

I was running a 550RM in mine (see video): no hiccups!


I am powering it with the corsair RM550 PSU

You could use the internal power of the Sonnet Echo III-D, but it won't be stable (insufficient wattage); I've tried that, since no one responded to my earlier question.


Providing enough power? Using a riser with molex?

I am powering it with the corsair RM550 PSU

You could use the internal power of the Sonnet Echo III-D, but it won't be stable (insufficient wattage); I've tried that, since no one responded to my earlier question.

Yeah, I can imagine a drop in stability with only the III-D's PSU. Does it freeze? What happens exactly? And is it only under max load?



  • Similar Content

    • By jimmyco2008
      Intro
      The Alienware Graphics Amplifier: probably the best implementation of an eGPU enclosure that has ever existed, and at about $130 "like new" on eBay, it's a bargain compared to other PCIe enclosures like the Akitio Thunder 2 and Sonnet Echo Express, especially when you consider it comes with a 460-watt PSU (and two 6-pin PCIe connectors that can separate into two 4-pin connectors). Fantastic.
      Obviously, though, it uses a proprietary Alienware PCIe port, making it just about useless to people like us who, if we had Alienware computers, probably wouldn't need an eGPU in the first place. I was curious how the Alienware GA did its magic; I figured it couldn't be all that different from how the Akitio Thunder 2 and others like it do their Thunderbolt-to-PCIe thing. So I scoured the Internet for some high-res pics of the inside of the GA, and finally found some. Similar, indeed: it has inside it a simple PCB with two PCIe slots: one PCIe x8 (which is backwards!) for the proprietary port and four USB 3.0 ports, and one PCIe x16 slot for a full-length, full-height, double-width GPU.
      I finally got the GA in the mail today, so I could look at the circuits up-close and see just how proprietary that x8 card with the proprietary port is. From what I can tell, not very. The circuits (sorry if this isn't the official term, but the lines you can see on the board running from component to component) from the proprietary port run mostly to the PCIe x8 slot, where they then go over to the PCIe x16 slot. Both slots, by the way, receive power directly from the ATX connector from the PSU, so in the case of the x16 slot, you have some circuits running to the ATX connector and the rest going straight to the x8 slot.
      My thinking is that the purposes of the x8 card are to, aside from doing a pass-through of the PCIe connection to an Alienware laptop, control the power state of the GA (turn it on and off) and of course control the USB 3.0 hub (that's what most of the ICs on the board appear to be for). I'm assuming that if the detection method for the proprietary connector is anything of substance, it is something we will have to work around when we take that proprietary x8 card out. Fortunately, it may be possible to simply tape down the power supply's reset button, which, if one holds it down, causes the GA to essentially power on: the fan in the PSU spins, as does the fan in the front of the GA. The Alienware logo on the front, however, does not turn on, so I'm thinking that's controlled by the x8 card as well.
      The Plan
      So I mentioned earlier that I will be replacing that proprietary x8 card, and you may guess with what: a Thunderbolt to PCIe x4 card. There's another thread here on it. ASRock Thunder II Manual: https://jatsby.com/echo/eGPU/Thunderbolt%202%20AIC.pdf
    • By Shelltoe
      NOTE: The US$180 BPlus TH05 (inc Thunderbolt cable) native Thunderbolt adapter used in this implementation was recalled in Jan 2013 due to (presumably) threats by Intel/Apple per the TH05 recall notice. As a result, refer to this solution that can be implemented today: http://forum.techinferno.com/diy-e-gpu-projects/4570-%5Bguide%5D-2012-13-rmbp-gtx660-sonnet-echo-express-se-%40-10gbps.html#post63754 (recommended for 15" rMBP/MBP due to iGPU issues) or 2013 11" Macbook Air + Win7 + Sonnet Echo ExpressCard + PE4L + Internal LCD [US$250].





      While reading some threads about EFI boot on Mac, I finally made my eGPU work.

      I fixed my "error 12" by enabling VGA output on the PCI bridge connected to my Thunderbolt ports using the EFI shell.



      For this task I installed rEFIt and used "pci -i -b" / "pci xx xx xx -i -b" to find VGA devices and their bridges.

      I noted all Bus/Dev/Func values as well as the required registers. After that I had to set those registers using "mm".





      On a MacBook Pro 15" 2012 you'll have to do the following:



      1. Install rEFIt



      2. In Mac OS mount the EFI partition using terminal:

      mkdir /Volumes/EFI

      sudo mount -t msdos /dev/disk0s1 /Volumes/EFI



      3. Create a text file called "startup.nsh" in its root:

      echo -off

      echo "Setting Registers"

      # IGPU Intel HD 4000

      # I noticed some improvements in boot timings while deactivating the Intel HD

      # (don't use this if you're running a 13" single-GPU model)

      mm 00020004 1 ;PCI :0



      # eGPU PCI Bridge

      # this line does the magic by enabling VGA Output

      mm 0001013E 1 ;PCI :8



      echo "Booting Windows"

      fs0:\EFI\Boot\bootx64.efi



      4. Reboot while eGPU is connected (SW1=1) and select "start EFI-Shell" in rEFIt.

      "startup.nsh" launches and Windows 8 starts up with eGPU enabled.





      Update:

      Replaced rEFIt with rEFInd, which is a fork of rEFIt.

      I'm now able to create a menu entry which boots Windows using my startup script. I'm also able to hide the non-functional Windows entries.



      Here's my current refind.conf:



      timeout 20

      hideui banner

      showtools shell, reboot, shutdown

      dont_scan_dirs EFI/Boot, EFI/Microsoft



      menuentry "Windows 8 with eGPU" {

      icon \EFI\refind\icons\os_win.icns

      loader \EFI\tools\shell.efi

      options "fs0:\Startups\eGPU.nsh"

      }







      Update (experimental):

      Thanks to Linux's "apple-gmux" developer Andreas Heider, I was able to switch graphics before booting Windows and enable the Intel HD as the primary VGA device. Though the Intel HD is still bugged, this will probably allow us to enable Optimus functionality in the future.



      You can also boot with the Intel HD enabled by installing gfxCardStatus 2.1.1(!) and setting it to integrated-only.



      For now you'll receive a black screen and reboot due to some "igdkmd64.sys" error.



      Add this to your startup.nsh before "mm 0001013E 1 ;PCI :8":



      echo Switch select

      mm 7C2 1 ;IO :1

      stall 100000

      mm 7D4 1 ;IO :28



      echo Switch display

      mm 7C2 1 ;IO :2

      stall 100000

      mm 7D4 1 ;IO :10



      echo Switch DDC

      mm 7C2 1 ;IO :2

      stall 100000

      mm 7D4 1 ;IO :40



      echo Power down discrete graphics

      mm 7C2 1 ;IO :1

      stall 100000

      mm 7D4 1 ;IO :50



      mm 7C2 1 ;IO :0

      stall 100000

      mm 7D4 1 ;IO :50



      echo enable eGPU

      mm 0001013E 1 ;PCI :8



      echo Boot Windows

      fs0:\EFI\Boot\bootx64.efi











      Some numbers:

      Model: rMBP 15"
      OS: Windows 8 EFI-Boot
      CPU: Intel i7-3820QM @ 2.7GHz
      RAM: 16GB
      iGPU: Intel HD Graphics 4000 (broken for Win8)
      dGPU: Nvidia GT 650M
      Adapter: TH05

      I tried some overclocking and ended up with some strange results for 3DMark 06. No matter what you change, you'll get lower points than at default settings.

      Benchmark       | dGPU GT650M | eGPU GTX560Ti | eGPU GTX660Ti | eGPU GTX660Ti OC
      3DMark 2011     | 2431p       | 4415p         | 7110p         | 7463p
      3DMark Vantage  | 10633p      | 16755p        | 23810p        | 24702p
      3DMark 2006     | 15289p      | 17479p        | 17979p        | like 15900?

      This might work for BIOS boot as well, but I don't know how to launch an MBR partition. (Update: BIOS method by nando)



      I'll try to tweak this a bit and hope for an Intel fix.

      Is there anyone with more knowledge of the EFI Shell?