Everything posted by ssamydla

  1. The PC has its glory with the Razer Core, Alienware Amplifier, ASUS XG2, etc., and all of its DIY eGPU projects are amazing. At this moment, the only eGPU that is perfectly compatible with the MacBook Pro Retina 15-inch is the BizonBox. On the MBPr 15-inch iGPU-only model you can even use the eGPU to game on the internal display. I've been following this product for quite a while, and a reasonable product review has finally come out, run on an iMac 5K + GTX 980 Ti with a 1440p monitor.

     BizonBox Official Benchmarks
     Mac Pro vs MacBook Pro Retina with BizonBox

     Here's the config pricing & complete technical spec: https://bizon-tech.com/us/bizonbox2-egpu.html/#config
     Make sure to read the FAQ: https://bizon-tech.com/us/faq-egpu

     I might go with the BizonBox instead for my MBPr 15", since the Akitio Thunder2 was not built for GPU usage. I hope this product will be the holy grail for Mac users before the new MacBook Pro is released with a desktop GTX 10xx and Thunderbolt 3 (fingers crossed; they'll probably go with ATI).
  2. Should eGPU fan stay on during setup?

     Got my W520 successfully running with the EXP GDC Beast ExpressCard + Dell power adapter, without Setup 1.x. I believe it should be the same architecture as your setup. Anyway, I got Error 43 when using mPCIe and the eGPU fan stayed slow; I found out my GPU vBIOS was corrupted. You can check my workarounds in my post.
  3. EXP GDC Beast/Ares-V7/V6 discussion

     My LED on the v8 is BLUE. I've never seen any EXP GDC with a red LED before (perhaps a new version?). I couldn't find a v8.3 either; Banggood doesn't reply to me and they're still using the same old product photos. But I can confirm there is a new version of the mPCIe signal cable. It can easily be identified by its physical form:
     - Smaller/thinner cable
     - 'BEAST Pro' logo on the mPCIe end
     - No PTD switch
     - Slightly different 8-pin power connector
     (Photos: Beast Pro; mine with the blue LED on.)
  4. What do you mean by "turn off power-hungry electronics"? Which electronics? Do you think it's possible to get Gen 2 with ExpressCard or mPCIe?
  5. Hi guys need some assistance.

     Hi @rradij, from what I've found, the only option to add an eGPU to your laptop is through its mPCIe slot, with your WiFi card removed. You should check that your chosen eGPU product supports your laptop. Bandwidth and GPU manufacturer all depend on the whole config/setup, but AMD drivers make it harder to run an eGPU alongside the iGPU; it also depends on your OS. You can find @Tech Inferno Fan's complete solution regarding which OS and drivers to use, or use his Setup 1.x. My Lenovo W520 laptop has the same manufacture date as yours, 2010-2011; at that time most PCI Express slots were built to Gen 1 and limited to an x1 link. I have a MacBook Pro Retina 2013 15-inch, but I prefer to make good use of my old 2010 laptop with an eGPU, and it has worked fantastically. PS: I believe your laptop was originally AMD-based and you swapped in an Intel i7, so your iGPU & dGPU are conflicting?
  6. @Boykodev Oops, sorry man, I'm on a Lenovo. Still waiting for my Mac signal cable from China; then I can try with my MacBook Pro Retina 15", but with the EXP GDC only. Thunderbolt on Windows via Boot Camp is so troublesome at the moment.
  7. I have managed to make the iGPU run my internal display and the eGPU (Quadro 4000 for Mac, 2 GB) run my external monitor. But this doesn't work when I use the Quadro 6000 6GB; output only goes to the eGPU. Both setups have no issue with the lid closed, or with restarting or sleeping while it's closed.
  8. @Boykodev Optimus is GPU switching technology created by Nvidia. It's (mostly) available on mobile workstations or laptops that pair the Intel HD Graphics from the Intel processor as integrated graphics (power saving) with an Nvidia GTX/Quadro graphics card as discrete graphics for more graphics/compute-demanding applications (max performance). Windows 10 has its own hybrid graphics capabilities. Nvidia also has 'Maximus' GPU switching technology, but it only works on workstations with a high-end Quadro GPU + Tesla card. By system default/BIOS, the laptop's internal display uses the Intel graphics, and an external monitor (through its digital output port) is driven by the discrete graphics, or per application via the 3D application options in the Nvidia Control Panel. Some have managed to make Optimus work with an eGPU on Win 10, but many have failed. (A quick way to see which adapters Windows currently reports is sketched below.)
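     If you want to check which graphics adapters Windows currently sees (iGPU, dGPU, and/or eGPU), here is a minimal sketch; it only wraps a standard PowerShell/WMI query (assumes Windows with PowerShell available), so treat it as illustrative rather than anything official:

         # List the display adapters Windows reports, with status and driver version.
         import subprocess

         out = subprocess.check_output(
             ["powershell", "-NoProfile", "-Command",
              "Get-CimInstance Win32_VideoController | "
              "Select-Object Name, Status, DriverVersion | Format-Table -AutoSize"],
             text=True,
         )
         print(out)  # expect one row per adapter, e.g. the Intel HD Graphics and the Nvidia card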
  9. I'm not sure if this is related to your issue, but there was a performance issue with the 970 back in 2014. http://www.pcworld.com/article/2875740/nvidia-explains-geforce-gtx-970s-memory-performance-issues-admits-error-in-specs.html http://www.gamespot.com/articles/did-you-buy-a-gtx-970-nvidia-owes-you-30-if-you-li/1100-6442239/
  10. Not sure if anyone in this forum knows about this, but there's a class-action lawsuit over false advertising: the GTX 970 does not fully run on 4 GB of GDDR5 as advertised by Nvidia. Because of the design, the 970's memory is split into a 3.5 GB section and a 0.5 GB section. The lawsuit has been consolidated since March 2015, after owners reported performance issues going back to the product's launch in September 2014. You can read the full articles here: http://www.gamespot.com/articles/did-you-buy-a-gtx-970-nvidia-owes-you-30-if-you-li/1100-6442239/ http://techreport.com/news/27721/nvidia-admits-explains-geforce-gtx-970-memory-allocation-issue GeForce discussion forum: https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/
  11. From what I've found, there is no iGPU on a 2012 iMac. Can you be more specific about the model? The latest iMacs have an iGPU since they're built on mobile processors. You should refer to a link for your iMac's specs, not Intel's, or simply go to About This Mac > System Information: if there are dual GPUs, you'll see them there (a quick Terminal check is sketched below). http://www.everymac.com/systems/apple/imac/specs/imac-core-i5-2.7-21-inch-aluminum-late-2012-specs.html [EDIT] If you're on an iMac, which is a desktop computer, why would you need Optimus? You get full power from AC all the time. Optimus is only for laptops: https://forums.geforce.com/default/topic/521518/geforce-mobile-gpus/is-nvidia-optimus-in-all-in-one-computers-/ If you disable the dGPU from the system, that is why it goes to a black screen and throws errors. Nvidia has Maximus for multi-GPU workstations, but it only works with Quadro + Tesla cards.
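     On a Mac, a minimal sketch like this (just wrapping the standard system_profiler tool that ships with macOS, nothing more) prints the Graphics/Displays section of System Information so you can see whether one or two GPUs are present:

         # Print the Graphics/Displays section of System Information.
         import subprocess

         print(subprocess.check_output(
             ["system_profiler", "SPDisplaysDataType"], text=True))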
  12. lol, I feel you. It took me 2 months to make it work on Windows; I bricked 2 of my Quadro cards and had to re-order cables from China. If I may suggest: if possible, test with a PCIe card other than a GPU (sound card, LAN card, video capture card). It's good to verify that your Akitio T2 runs at all on stupid Windows, since we're facing so many possibilities that make troubleshooting harder.
  13. This guy managed to make it work, but he's on a Mac mini and uses it under OS X: 2014 Mac Mini + GTX960@16Gbps-TB2 (AKiTiO Thunder2) + OSX 10.11 [lilins] https://r.tapatalk.com/shareLink?url=https://www.techinferno.com/index.php?/topic/10267-2014-Mac-Mini-%2B-GTX960%4016Gbps-TB2-%28AKiTiO-Thunder2%29-%2B-OSX-10%2E11-%5Blilins%5D&share_tid=10267&share_fid=33421&share_type=t
  14. @jowos You mean getting the internal display to use the eGPU? Or using it without an external display through the eGPU? Well, with my Q4000 I'm able to use the iGPU on the laptop monitor and the eGPU on the external display. But with my Q6000, if I plug in an external display the laptop monitor goes blank, so just to be sure I always close the laptop lid so everything goes through the eGPU only, and set PhysX to use the eGPU in the Nvidia Control Panel (Windows power settings on the maximum performance plan). Anyway, I believe that for recent GTX cards PhysX should work as-is from the ODE driver, and only a few games take advantage of PhysX; it's more useful for demanding compute+graphics work, i.e. benchmarks and 3D applications, so in my Quadro case I need PhysX. Here's a great systematic test of PhysX: http://www.volnapc.com/all-posts/how-much-difference-does-a-dedicated-physx-card-make
  15. @jowos Yeah, it's an old Fermi card; please don't compare it with recent GTX cards. My setup is pretty much built from old hardware: I'm trying to make good use of a 4-year-old GPU, laptop, and monitor, as in their glory days. lol. At first I used mPCIe, but it was very unstable and I believe it caused both of my Quadro cards to brick. Now I'm running on ExpressCard, and yes, it's x1 1.1. I can still play Fallout 4 on Ultra settings at 1920x1200@60Hz, but I tend to use it more for 3D & VFX work.
  16. @jowos Just found out that we need to install the PhysX System Software in order for PhysX to work properly and be detected by GPU-Z. I did some tests: installing the latest PhysX System Software greatly improves the PhysX benchmark in FluidMark, but the computer froze twice during Fallout 4 loading and the FluidMark fullscreen benchmark. I needed to install the Legacy PhysX software as well to make it run smoothly (for old games, or games that use an old PhysX SDK). I don't use it much for gaming though, only for 3D & visual FX applications. If you think it's working fine, there's no need to install it; it just works as it is. PhysX System Software: http://www.nvidia.com/object/physx-9.15.0428-driver.html Nice post btw!
  17. Can anyone confirm whether in GPU-Z, GTX cards normally have CUDA and PhysX unchecked? I've seen other GPU-Z screenshots with CUDA & PhysX enabled, but those were from desktop PCs.
  18. How did you manage to install the GTX 960 driver? And which driver did you use?
  19. help : eGPU usage 0%

     @Chevrotine669 You can't check those boxes yourself; they show what your GPU is capable of and what has been read/detected by the system or driver. Here's an example of someone else's MSI GTX 960 I found on the net, and here's my eGPU with the Quadro 6000: Is it possible to test your 960 in a desktop computer, or use another PCIe card in your eGPU enclosure, just to make sure both are OK? You mentioned in your first post that it worked without Error 43. [EDIT] My mistake: the PhysX System Software must be installed in order for PhysX to run properly and be detected by GPU-Z. The rest of the features are installed by the driver by default. Download PhysX System Software: http://www.nvidia.com/object/physx-9.16.0318-driver.html Download PhysX FluidMark to test: http://www.geeks3d.com/20130308/fluidmark-1-5-1-physx-benchmark-fluid-sph-simulation-opengl-download/ I just installed it and it greatly improves the PhysX benchmark. (A quick command-line check of whether the eGPU is actually doing any work is sketched below.)
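     Since the thread is about the eGPU sitting at 0% usage, a minimal sketch like this can poll GPU utilization while FluidMark or a game is running. It assumes the Nvidia driver's nvidia-smi tool is reachable on PATH (it ships with the driver; you may need its full path), so treat it as illustrative:

         # Poll nvidia-smi a few times and print name, GPU utilization, and memory used.
         import subprocess, time

         for _ in range(5):
             out = subprocess.check_output(
                 ["nvidia-smi",
                  "--query-gpu=name,utilization.gpu,memory.used",
                  "--format=csv,noheader"],
                 text=True,
             )
             print(out.strip())  # non-zero utilization means the eGPU is really being used
             time.sleep(1)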
  20. help : eGPU usage 0%

     @Chevrotine669 No, SLI is only enabled if you have multiple desktop GPUs connected through an SLI bridge. I'm more concerned about the Computing checklist; normally it should be checked/detected by the system.
  21. help : eGPU usage 0%

     @Chevrotine669 Can you screenshot Device Manager with that Error 43? Double-click your eGPU with Error 43: the General tab, and the Details tab > Property dropdown > Hardware Ids. Or screenshot GPU-Z. (If screenshots are awkward, the sketch below pulls the same information from the command line.)
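     For reference, a rough way to grab the same details without Device Manager is below. It assumes Windows with PowerShell available, and queries WMI; ConfigManagerErrorCode 43 corresponds to Error 43. Consider it a sketch rather than a supported procedure:

         # Print name, Device Manager error code, and hardware IDs for Nvidia devices.
         import subprocess

         cmd = ("Get-CimInstance Win32_PnPEntity | "
                "Where-Object { $_.Name -match 'NVIDIA' } | "
                "Select-Object Name, ConfigManagerErrorCode, HardwareID | Format-List")
         out = subprocess.check_output(
             ["powershell", "-NoProfile", "-Command", cmd],
             text=True,
         )
         print(out)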
  22. NVIDIA Quadro K5100M CLEVO VBIOS

     @yayatwiste I believe this would be a risky swap. Are you sure the Clevo K5100M kit is supported on your GT70? I believe there will be hardware dependencies, not just flashing. For $1,500 on an MXM mobile GPU, you could have just bought a full eGPU setup with a recent GTX card.
  23. @yama84 Yeah, in recent OS X/macOS versions all graphics card drivers are embedded in the macOS/OS X installer, unless you're using a Quadro for Mac certified model (Q4000 for Mac & K5000 for Mac). The GTX 980 is Maxwell and the K4200 is Kepler. However, here are the K5000 for Mac drivers, though I can't guarantee they will work for your K4200: http://www.nvidia.com/download/driverResults.aspx/105589/en-us If you would like to give it a shot, share the results with us. You might also need to install the separate CUDA for Mac driver to use the graphics API & compute API on a Mac: http://www.nvidia.com/object/macosx-cuda-7.5.30-driver.html Only install the CUDA driver for Mac if you plan to use your Quadro card for creative application computing. Good luck. [EDIT] If you think the driver installed successfully and the Quadro card is detected on El Capitan but it's not working properly, you might need to flash the card's VBIOS with one customized for OS X. But this is very risky; if you're unsure about flashing a VBIOS, don't do it. Consult the eGPU & VBIOS mod experts in this forum. (A quick check of whether the Nvidia kexts actually loaded is sketched below.)
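     As a rough sanity check after installing the drivers, something like this lists any loaded Nvidia kernel extensions. It's only a sketch: it assumes the standard kextstat tool on OS X, and kext names vary by driver version, so the simple name match below may need adjusting:

         # Print loaded kernel extensions whose names look Nvidia-related.
         import subprocess

         kexts = subprocess.check_output(["kextstat"], text=True)
         for line in kexts.splitlines():
             if "nvidia" in line.lower() or "nvda" in line.lower():
                 print(line)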
  24. Thank you, moderators, for putting this into the Implementation Guide - PC. Cheers!
  25. Hi, I would like to share my eGPU experience with the EXP GDC V8 ExpressCard on my W520: quad-core i7, 12 GB RAM, Quadro 2000M with 2 GB dedicated VRAM, Windows 10 Pro 64-bit.

     Last month, after weeks of failures and errors (especially the famous Error 43), I finally got my EXP GDC running using the mPCIe version with an NVIDIA Quadro 4000 for Mac (yes, for Mac!). I made it work by setting the PTD switch to 7s. But the moment Windows successfully detected my Q4000, I straight away hooked up my external monitor (EIZO CG245W 24-inch) to the eGPU; the system blinked, which I assumed meant it detected a display signal, but it didn't, and my W520 suddenly shut down. I could not make it work again, even after reinstalling and uninstalling the NVIDIA drivers with DDU hundreds of times.

     At this point I switched the GPU to my Quadro 6000 6GB GDDR5 and repeated the last successful process. Nothing happened; our good friend Error 43 came back and stayed there forever. In my desperation I finally ordered the ExpressCard signal cable. While it was en route from somewhere in China, I tested the EXP GDC with my AJA Kona 3G 4K I/O video capture card, and hey, it works! I can edit and preview 4K RAW video files in real time through the card flawlessly; it uses the card's hardware encoding (4K RAW files from ARRI & RED cameras) and also outputs a 16-channel 192 kHz audio signal from my DAW (no additional 6-pin power required for this card).

     Then I tested both my Quadro 4000 and Quadro 6000 back in my HP Z800 workstation, and guess what? They're both bricked! They receive power, but the system cannot boot and just beeps, which means the GPU is faulty. The next day, my AJA Kona 3G card also bricked! Today the ExpressCard signal cable finally arrived along with my 6-pin + 6+2-pin cable, but again, when I use the ExpressCard setup nothing happens. No power reaches the EXP GDC; my 500W ATX PSU is running, but the LED indicator on the EXP GDC is not lit. I set the slot to Generation 1 power mode in the BIOS.

     So I've tried both mPCIe and ExpressCard on the W520 with nothing but frustration. Did I miss anything here? I'd appreciate any workaround, since I've never found any eGPU setup with a W520 anywhere on the net, especially one using ExpressCard and a high-end Quadro GPU. Here's the last config with which I managed to make the Q4000 work (a quick link-speed check for once the card is detected is sketched after this post):

     mPCIe method:
     BIOS: whitelist latest BIOS 1.42
     Power Management: both Processor and PCI set to Disabled
     Display Mode: Integrated Graphics
     OS Detection Mode for Optimus Graphics: Disabled
     PXE Boot and all Network Boot options, including PCI LAN boot order: Disabled
     Win 10:
     Device Manager: system detects the eGPU, Error 43
     Uninstall the NVIDIA driver with DDU (safe mode)
     Reboot
     Check the BIOS first; usually it sets the display back to Optimus mode, and I let it boot in Optimus mode
     Make sure the graphics card(s) are detected in Device Manager as 'Microsoft Basic Display Adapter'
     Install the Quadro driver for both the discrete graphics and the eGPU
     Reboot to BIOS
     Display Mode: Integrated Graphics
     OS Detection Mode for Optimus Graphics: Disabled
     Quadro 4000 for Mac successfully detected in Device Manager
     Plug my external monitor into the eGPU
     GPU bricked

     ExpressCard method:
     Same as the mPCIe method
     ExpressCard power mode: Generation 1
     Nothing happened

     eGPU setup:
     EXP GDC Beast V8
     500W ATX PSU
     DELL 220W power adapter (backup for small cards)
     PCIe 6-pin + 6+2-pin auxiliary power cable (backup spare)
     ExpressCard signal cable
     mPCIe signal cable

     [IMPORTANT NOTE]: You will know your graphics card is working fine when you power on your laptop: normally all Quadro card fans spin at normal speed, then accelerate for 3-4 seconds at the Windows logo (meaning the system & OS detect your card), and then return to normal fan speed. During the setup, my Quadro 4000 fan often spun extremely fast, faster than during the POST boot init, but then returned to normal speed. Bricked Quadro graphics card: the Quadro 4000 fan spins fast forever, while the Quadro 6000 stays at low speed. Thanks a lot!
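     Once a card does show up cleanly (no Error 43), a small sketch like the one below is how I'd confirm what link it negotiated; over ExpressCard/mPCIe it should report Gen1 x1. It assumes the Nvidia driver's nvidia-smi is available (older Fermi cards may report N/A for some fields), so take it as illustrative only:

         # Report the current PCIe generation and link width for each Nvidia GPU.
         import subprocess

         print(subprocess.check_output(
             ["nvidia-smi",
              "--query-gpu=name,pcie.link.gen.current,pcie.link.width.current",
              "--format=csv"],
             text=True,
         ))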