Everything posted by davide445

  1. Hi biker63, reading here and there I can't find a specific way to do it; I only found that it MIGHT be possible, but no specific instructions to verify or do it. I hope others more expert than me can give you specific advice.
  2. I can comment, since I researched heavily for exactly this reason: offloading computation (not only DX11 graphics) to the GPU. GPU computing has been a reality in HPC for many years; just look at the TOP500 list and you will find plenty of GPU-accelerated supercomputers. In the graphics-workstation market, since the presentation of the new Mac Pro it has been clear that Apple decided to focus more on GPU than CPU; otherwise, why ship a high-end graphics machine with only one CPU but two GPUs by default, as many "traditional" 3D pros are complaining about? The reason is that Intel's CPU performance gains are slowing down, while GPU performance keeps climbing. Apple is following the trend, also using its usual strategic move of retaining customers through tight hw+sw integration, for example with Final Cut Pro X already optimized to use two GPUs via OpenCL. Apple basing its move on AMD GPUs is also pushing many vendors to adopt OpenCL (a recent example is the new Nuke 9), which means even broader and cheaper GPU power, with AMD normally being very price competitive, and the possibility for anyone to develop on an open standard rather than on proprietary CUDA.
There is also a big movement among 3D renderers to offload computation to the GPU, with some finished solutions (Furryball) and some hybrid CPU+GPU approaches (Indigo Renderer and a few others; Indigo Renderer plans to develop a pure-GPU rendering option in the future). This means at least a 4-6x speed increase over CPU rendering, with some limitations due to the available VRAM. Note the recent availability of 8GB VRAM "gaming" cards, useful not only for 4K gaming! Just read about pros asking for the Alienware Graphics Amplifier to be used with hybrid rendering solutions. Following this trend, 3D professionals are also considering using 3D game engines as rendering engines instead of traditional renderers. Game engines such as Unreal Engine and CryEngine can present stunning visuals in real time, leveraging the GPU's power with almost no CPU usage; for example, there are already many cases of UE4 being used for architectural interior rendering. Established renderers, by adopting GPU rendering, are moving toward the game engines' territory, just as game engines are pushing to adopt renderer technology such as dynamic global illumination now and realtime ray tracing in the near future. The separation between them will blur more and more. With huge-VRAM GPUs available, and perhaps the adoption of AMD's hUMA architecture, the transition from a CPU-centric to a GPU-centric (or at least hybrid) architecture will be complete.
In my understanding this is why Intel is blocking Thunderbolt eGPUs: they won the CPU race against AMD and refocused their R&D effort on the mobile market, only to find that the high-margin, high-end 3D graphics market is going the GPU way, where they have neither a brand nor a high-end solution, and that in the mobile market they must compete with many ARM-based architectures, again starting without market recognition or support. Thunderbolt eGPUs would let anyone keep their current hardware for years, just upgrading the GPU, as hUMA or equivalent solutions gain traction.
To finish, it's worth underlining AMD's situation in my view: they lost the CPU war, but in fact made the right strategic moves. They bought ATI well before anyone suspected why, so they now have the technology and the brand to compete in a GPU-driven market. They also developed hUMA, which can be the cornerstone of new general-purpose, GPU-driven architectures. They pushed out the Mantle API, which forces DirectX and OpenGL to keep pace and makes it even more viable to pair a low-cost CPU with a strong GPU. They also partnered with ARM, pushing the adoption of that energy-efficient architecture into the traditional server market. I think they have demonstrated really good strategic thinking, and I personally appreciate their GPUs' value, with many workstation-class features available at a small price. The same argument we made for Intel can be made for Nvidia, which is pushing to stop the adoption of OpenCL for evident reasons and is introducing more and more proprietary features to retain its loyal sw vendors. I'm not an AMD affiliate, nor do I think AMD is a non-profit angel organization. I only think they better embody what we are doing here: intelligent solutions that enable new possibilities for many in the mid-range market, not just the high end, while pushing growth toward more ambitious goals. I hope AMD's moves start to pay off, since IMHO that would also mean a push for broader eGPU technology availability.
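Since everything above hinges on applications being able to see an OpenCL-capable GPU, here is a minimal sketch of how to enumerate the OpenCL platforms and devices on a system. This is my illustration, not from the original post; it assumes the third-party pyopencl package plus a working OpenCL runtime (e.g. AMD's Catalyst/APP driver) are installed.

```python
# Minimal OpenCL device enumeration (assumes `pip install pyopencl` and an
# installed OpenCL runtime; an eGPU shows up here like any other GPU).
import pyopencl as cl

for platform in cl.get_platforms():          # e.g. "AMD Accelerated Parallel Processing"
    print(platform.name)
    for device in platform.get_devices():    # CPU, iGPU and eGPU devices
        print("  ", device.name,
              "| VRAM:", device.global_mem_size // 2**20, "MiB",
              "| compute units:", device.max_compute_units)
```

If an application such as DaVinci Resolve or Indigo Renderer supports OpenCL, the devices it can target are exactly the ones a listing like this reports.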
  3. I don't think so, since the circuitry is different. The PE4C V2 and V2.1 come with EC, mPCIe and PCIe adapters; if your new notebook comes with one of these, you can maybe change just the connector, if BPlus ships it separately. There are also EC-to-Thunderbolt adapters such as this one http://www.sonnettech.com/product/echoexpresscard34thunderbolt.html, but you will be limited to EC speed, and I don't know if it will work with another layer in the middle. They used to have a specific Thunderbolt adapter named TH05, which was pulled from the market, probably due to legal issues with Intel. How is your 16GB RAM test going?
  4. The mSATA slot is meant for a new HDD/SSD. The problem is whether it can be used as mPCIe.
  5. Here http://forum.techinferno.com/diy-e-gpu-projects/7969-pre-purchase-faq-draft-input-requested.html at point 4 you can find the answer about using the internal monitor.
  6. Hi Jakoob, I don't remember the version (1.36 or 1.38), but it's the latest one; I checked just last week. I'm also interested in upgrading the RAM even if it costs a bit, so I'd like to know if your setup is working. Where are you planning to buy the new RAM? From an international distributor or a national one?
  7. Your model seems to have no external PCIe-compatible interface such as Thunderbolt or ExpressCard. It does have an internal mSATA slot that might be usable as an mPCIe slot, where you could connect an eGPU. I think (others can confirm) you need to check in your BIOS whether you can switch the mSATA port to mPCIe, since mSATA and mPCIe are electrically compatible but not guaranteed to be interchangeable (it depends on the motherboard). If not, there is no way I know of to use an eGPU, since you have no way to connect one. - - - Updated - - - In your model I can't find any reference to a PCIe-compatible external or internal interface such as Thunderbolt, mSATA/mPCIe or ExpressCard. Without such an interface you can't connect an eGPU to your laptop. You'd better ask Asus to be sure.
  8. DIY eGPU guide for graphics applications
Objective: work on FullHD video editing, color grading, 3D modelling, VFX and realtime 3D engine software on a 2011 sub-notebook (Lenovo X220) at the lowest possible cost.
Precheck and software planning
- Check that the notebook can work without problems with an eGPU: http://forum.techinferno.com/diy-e-gpu-projects/2109-diy-egpu-experiences-%5Bversion-2-0%5D.html#prepurchasefaq . In my case, with 8GB RAM, a TOLUD check is needed: Device Manager > View > Resources by type > Memory; the first PCIe entry returned A0000000, meaning 2.5GB < 3.25GB, so no problem (see the sketch after this post for the arithmetic). Also, HWiNFO shows my X220 has an ExpressCard 2 5Gbps interface, so it can use an x1.2 connection (one PCIe 2.0 lane).
- Plan the software carefully, searching for software that works mostly on the GPU rather than on the weaker CPU: DaVinci Resolve instead of, say, Adobe Premiere for video editing, and GPU-driven rendering software instead of CPU-driven (e.g. Furryball vs V-Ray). Also look for the specific GPU features the planned software needs: games ask for high clocks and many shaders, graphics software asks for lots of VRAM and high data rate (a wide memory bus).
- Find the right GPU. I used these sources: Best Graphics Cards For The Money: September 2014 (best card for the money); Graphics Card Performance Hierarchy Chart (to find AMD vs Nvidia vs Intel equivalents); September 2014: Graphics Card Performance Per Dollar (performance per $); PassMark Software - Video Card Benchmarks - High End Video Cards; plus benchmarks for specific applications (e.g. LuxMark, Cinebench). Beyond roughly $250 you get less and less performance for the money. In my case, given the requirement for the biggest possible VRAM and a wide bus, the best choice under $250 is the 7950 Boost: the corresponding Nvidia GTX 670 normally has 2GB VRAM and a 256-bit bus vs 3GB and 384-bit, and much lower double-precision floating-point performance. With a 225W TDP it can be powered by the adapter provided with most eGPU kits and doesn't need an external ATX power supply.
- Recheck the software selection against the chosen GPU's acceleration. Having chosen an AMD GPU I can't use CUDA acceleration and need OpenCL-accelerated software. So DaVinci Resolve is fine for video editing; for VFX, Nuke 9 is better than Fusion or After Effects; Cinema 4D is fine for 3D modelling and animation but needs an external renderer, since C4D uses only the CPU; that will be Indigo Renderer (still not 100% sure). Unreal Engine is fine for realtime 3D graphics.
- Find the right eGPU adapter. This was the hardest part, since it was really difficult for me to understand the differences among the options. The PE4C V2.0 has an x2 connection option and an updated design with a GPU mount; it comes with a 230W power adapter with a changeable power cable (the provided one is for the USA market; I'm in Europe). Most important for me, it provides an 8-pin to 2x(6+2)-pin male-male Y splitter, difficult to find otherwise, which also gives flexibility in powering the GPU (the 7950 comes in various configurations: 6+6, 6+8, 8+8). The only drawback is that you don't get an enclosure. The EXP GDC V6 is quite similar, but its Y splitter is 6-pin to 6+6-pin only, and while an enclosure is optional, for V6 it's acrylic; the metal one is only for V7.
- Bill of materials: a) GPU (Sapphire 7950 Boost), 150 Euro; b) eGPU adapter (BPlus PE4C V2.0), 113 USD + 31 Euro customs duties; c) European power cable (to replace the USA-style one provided with the BPlus adapter), 13 Euro; d) external monitor (I already have one); e) DVI-to-VGA adapter (my monitor has only VGA in), 6 Euro; f) a 240GB mSATA SSD replacing my previous 64GB one, 40 Euro RMA + 60 USD. Total: 338 Euro, vs about 1000 Euro for a new PC. I can also keep the SSD and GPU for a new PC when I build one.
- - - Updated - - -
Setup
1) build the eGPU kit, connecting GPU, adapter and power cable
2) power up the laptop
3) power up the eGPU
4) connect it to the X220 through the ExpressCard adapter
5) the laptop recognizes the eGPU and installs drivers
6) after that, run amddriverdownloader to verify the AMD drivers, then download and install them
7) reboot, leaving the eGPU connected
8) extend the display to the external monitor
9) set the external monitor as Primary using AMD Catalyst Control Center
10) to power off, for safety I normally a) stop extending the display, b) remove the eGPU using 'Safely Remove Hardware'
11) to power up next time, connect the eGPU to the laptop while both are off, power up the eGPU, then power up the laptop
Troubleshooting
- The first time I tried to install, the eGPU was not recognized by the laptop. The secret is the sequence: you need to connect it already powered up, with the laptop also up and running.
- If you installed the drivers before getting the GPU properly running, better to uninstall them and do a clean installation following the steps above, otherwise it will not work even when following the right process.
- I discovered a lot of artifacts (horizontal stripes) on the external display when the eGPU is under load. In my case (the true cause can vary) the problem is the DVI-to-VGA analog conversion: using only a digital connection on a test monitor (either HDMI or DVI-D) there are no artifacts.
- - - Updated - - -
Graphics-related scores
Cinebench R15 OpenGL: 63.46 fps
Indigo Renderer (3.6.28, 3 min Erotica test scene, PT): CPU: 242.72 samples/pixel, 891k samples/s; +GPU: 311.99 samples/pixel, 1145k samples/s
Unreal Engine: TBD
Interestingly, in Resolve you can also work on the internal display alone (difficult on a 12" screen, but fine for setup and small things), since the eGPU is used as a computational unit and the iGPU drives the GUI. In the normal setup both the eGPU and the iGPU are active, each rendering graphics on its respective display (external and laptop).
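Here is the TOLUD arithmetic from the pre-check above as a small Python sketch (my addition, not from the original guide): it just converts the hex address Device Manager shows into GiB and compares it with the 3.25GB threshold from the pre-purchase FAQ.

```python
# TOLUD pre-check arithmetic: the first large PCIe memory entry from
# Device Manager > View > Resources by type > Memory, read off by eye.
TOLUD_LIMIT_GIB = 3.25            # threshold cited in the pre-purchase FAQ

first_pcie_entry = 0xA0000000     # value observed on the X220 in the guide above
tolud_gib = first_pcie_entry / 2**30
print(f"TOLUD = {tolud_gib} GiB")                        # -> 2.5 GiB
if tolud_gib <= TOLUD_LIMIT_GIB:
    print("OK: enough 32-bit address space for the eGPU")
else:
    print("Too high: extra steps (e.g. a DSDT override) may be needed")
```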
  9. Just an update about my flickering problem, in case it's useful for others. In my case I saw horizontal stripes all over the screen when the eGPU came under load, from any kind of app (browser, video editor, 3D engines). The stripes disappeared immediately after the GPU work finished. Reading on the net, it appears to be a common problem for the HD 7xxx series, without a single solution. It can come from the Catalyst driver version, voltage-management problems, a bad card, OC, connection-cable quality, vsync, bad VRAM, or a bad power supply. After testing GPU underclocking, RAM underclocking, voltage increase, voltage decrease, forcing vsync, forcing display refresh, strengthening the cable connection and disabling any GPU image enhancement, I may have discovered the problem. I attached my LED TV as a monitor using an HDMI cable instead of the DVI-I-to-VGA adapter I used on the old monitor: no flickering at all. So I will buy an HDMI, DP or DVI-D to VGA adapter and see whether it works the same way on my display.
  10. Maybe this one can be useful http://forum.techinferno.com/diy-e-gpu-projects/2129-diy-egpu-troubleshooting-faq.html#error43_faq1 I'm absolutely not a geek, so my best is to report first-hand experience or others' solutions!
  11. Unfortunately no idea about that, sorry. Maybe this can be useful: http://support.microsoft.com/kb/133240
  12. My experience was exactly the same (using the PE4C), and the solution was in the sequence: you need to wait until the laptop is up and running, and only then connect the (already powered) eGPU. The laptop then recognized a new device and searched for and installed drivers. After that you can install the vendor-specific drivers for your GPU; in my case it asked for a reboot. If you have already installed the vendor drivers, uninstall them and wait until the system has installed its own drivers. If all is ok, from then on you need to always boot with the eGPU already connected and powered. Also pay attention to setting your external display as primary if you want benchmarks and apps to benefit from the new hw acceleration. Hope this helps and your case is similar to mine.
  13. I think it's time to publish my official scores, since the system is pretty stable.
System: 12" Lenovo X220
CPU: i5-2520M 2.5GHz
RAM: 8GB
eGPU: [email protected]
Adapter: PE4C V2.0
Ports: QM67 EC2
OS: Win7/64
3dmk11.gpu: 7656 (AMD Radeon HD 7950 video card benchmark result - Intel Core i5-2520M Processor, LENOVO 429137G)
vant.gpu: 24255 (AMD Radeon HD 7950 video card benchmark result - Intel Core i5-2520M Processor, LENOVO 429137G)
3dm6: 15585 (AMD Radeon HD 7950 video card benchmark result - Intel Core i5-2520M Processor, LENOVO 429137G)
Cinebench R15 OpenGL: 63.46 fps
Indigo Renderer (3.6.28, 3 min Erotica test scene, PT): CPU only: 242.72 samples/pixel, 891k samples/s; +eGPU: 311.99 samples/pixel, 1145k samples/s
Indigo Renderer (4.0.30, 3 min McLaren test scene, PT): CPU: 0.265 Mpix/s; pure GPU: 2.259 Mpix/s
HitFilm 3 Pro (3.0.3521, Sun test scene render time): iGPU 47:38, eGPU 3:09
Unreal Engine: TBD
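As a quick cross-check of the render scores just listed, here is a small Python sketch (my addition, using only the numbers above) computing the eGPU speed-up factors:

```python
# Speed-ups derived from the scores listed above (higher rate = faster).
indigo_36 = 1145 / 891        # k samples/s, +eGPU vs CPU only -> ~1.3x
indigo_40 = 2.259 / 0.265     # Mpix/s, pure GPU vs CPU        -> ~8.5x

# HitFilm results are render times (lower = faster), so invert the ratio:
igpu_s, egpu_s = 47 * 60 + 38, 3 * 60 + 9     # 47:38 vs 3:09
hitfilm = igpu_s / egpu_s                     # -> ~15x

print(f"Indigo 3.6.28: x{indigo_36:.1f}")
print(f"Indigo 4.0.30: x{indigo_40:.1f}")
print(f"HitFilm 3 Pro: x{hitfilm:.1f}")
```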
  14. Crossing my fingers, maybe I found the problem. First I applied this fix: "Display driver stopped responding and has recovered" error in Windows 7 or Windows Vista. Second, looking at the Catalyst Control Center Performance settings, it appeared that OC was still enabled for some reason. I returned the settings to default and I've been working without any problem for an hour now. The last remaining problem to address is the screen flickering, but I'm now able to work in DaVinci Resolve 11 and the Unreal Engine 4 Editor without any problem, something impossible using the iGPU. Magic eGPU!
  15. OK, I also tested forcing the EC to Gen1 in the BIOS setup: I got the same error (AMD display driver stopped responding and was successfully recovered) almost immediately, even with no app open. So I put it back on Auto. HWiNFO reports the following info; do you see any problem?
PCI Express x1 Bus #5
Sapphire Radeon HD 7950
[General Information]
Device Name: Sapphire Radeon HD 7950
Original Device Name: ATI/AMD Radeon HD 7950/R9 280 (TAHITI PRO)
Device Class: VGA Compatible Adapter
Revision ID: 0
Bus Number: 5
Device Number: 0
Function Number: 0
PCI Latency Timer: 0
Hardware ID: PCI\VEN_1002&DEV_679A&SUBSYS_E249174B&REV_00
[PCI Express]
Version: 3.0
Maximum Link Width: 16x
Current Link Width: 1x
Maximum Link Speed: 8.0 Gb/s
Current Link Speed: 2.5 Gb/s
Device/Port Type: Legacy PCI Express Endpoint
Slot Implemented: No
Active State Power Management (ASPM) Support: L0s and L1
Active State Power Management (ASPM) Status: L0s and L1 Entry
[System Resources]
Interrupt Line: N/A
Interrupt Pin: INTA#
Memory Base Address 0: D0000000
Memory Base Address 2: F1BC0000
I/O Base Address 4: 4F00
[Features]
Bus Mastering: Enabled
Running At 66 MHz: Not Capable
Fast Back-to-Back Transactions: Not Capable
[Driver Information]
Driver Manufacturer: Advanced Micro Devices, Inc.
Driver Description: AMD Radeon HD 7900 Series
Driver Provider: Advanced Micro Devices, Inc.
Driver Version: 14.301.1001.0
Driver Date: 15-Sep-2014
DeviceInstanceId: PCI\VEN_1002&DEV_679A&SUBSYS_E249174B&REV_00\4&2211D9D4&0&00E3
ATI/AMD Tahiti - High Definition Audio Controller
[General Information]
Device Name: ATI/AMD Tahiti - High Definition Audio Controller
Original Device Name: ATI/AMD Tahiti - High Definition Audio Controller
Device Class: Mixed mode device
Revision ID: 0
Bus Number: 5
Device Number: 0
Function Number: 1
PCI Latency Timer: 0
Hardware ID: PCI\VEN_1002&DEV_AAA0&SUBSYS_AAA0174B&REV_00
[PCI Express]
Version: 3.0
Maximum Link Width: 16x
Current Link Width: 1x
Maximum Link Speed: 8.0 Gb/s
Current Link Speed: 2.5 Gb/s
Device/Port Type: Legacy PCI Express Endpoint
Slot Implemented: No
Active State Power Management (ASPM) Support: L0s and L1
Active State Power Management (ASPM) Status: L0s and L1 Entry
[System Resources]
Interrupt Line: IRQ16
Interrupt Pin: INTB#
Memory Base Address 0: F1440000
[Features]
Bus Mastering: Enabled
Running At 66 MHz: Not Capable
Fast Back-to-Back Transactions: Not Capable
[Driver Information]
Driver Manufacturer: Microsoft
Driver Description: High Definition Audio Controller
Driver Provider: Microsoft
Driver Version: 6.1.7601.17514
Driver Date: 19-Nov-2010
DeviceInstanceId: PCI\VEN_1002&DEV_AAA0&SUBSYS_AAA0174B&REV_00\4&2211D9D4&0&01E3
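One thing the report above makes visible is the link state: the card can negotiate up to x16 Gen3 (8.0 Gb/s), but is currently running at x1 Gen1 (2.5 Gb/s) through the ExpressCard slot. Here is a rough bandwidth sketch (my addition, using the standard PCIe encoding overheads: 8b/10b for Gen1/Gen2, 128b/130b for Gen3):

```python
# Approximate usable PCIe bandwidth per direction, in MB/s.
def pcie_mb_per_s(gt_per_s: float, lanes: int) -> float:
    # Gen1/Gen2 lose 20% to 8b/10b encoding; Gen3 loses ~1.5% to 128b/130b.
    efficiency = 128 / 130 if gt_per_s >= 8.0 else 8 / 10
    return gt_per_s * 1000 / 8 * efficiency * lanes

print(pcie_mb_per_s(2.5, 1))    # current link,  x1 Gen1  -> ~250 MB/s
print(pcie_mb_per_s(5.0, 1))    # hoped-for EC2, x1 Gen2  -> ~500 MB/s
print(pcie_mb_per_s(8.0, 16))   # card maximum, x16 Gen3  -> ~15750 MB/s
```

So getting the EC link to run at Gen2 would double the available bandwidth, which is why falling back to Gen1 is disappointing.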
  16. Maybe I'm wrong about my EC adapter and it can't reach Gen2 speed. Here is the data from HWiNFO:
Slot Designation: ExpressCard Slot
Slot Type: PCI Express
Slot Usage: Empty
Slot Data Bus Width: 1x / x1
Slot Length: Unknown
I also have 5 PCIe buses; how can I know which one is connected to the EC slot?
  17. Thanks Tech Inferno Fan, it's an honor to receive your first direct response. In fact I tested OC yesterday, but it creates crashes, so now everything is running at factory defaults. My Lenovo X220 has an EC2 interface, theoretically capable of Gen2 speed. Also, the PE4C V2.0 should support Gen2 (right?). I will check if setting it to Gen1 solves the issue, but I'd be really disappointed, since it would mean much worse performance.
  18. I also experienced a stability issue: after a few minutes of work (testing video-editing apps and game editors), the screen goes black, then comes back and the system states "AMD display driver stopped working and has recovered". I need to disable the external display and reboot the system, otherwise I'm not able to do anything more. Is this your problem too?
  19. Here is your performance-per-$ chart: September 2014: Graphics Card Performance Per Dollar. I don't know why the UK version cut this chart out. With such EC specs I think you can achieve more than 90% of the GPU's theoretical performance; more expert people here can confirm, please.
  20. Your choice is easier that way. Whether the GPU is bottlenecked by the connection depends on the interface you can use on your laptop; which laptop model do you have? Does it have an ExpressCard, miniPCIe or Thunderbolt connection? Choosing the GPU is a bit off topic here; I suggest looking at Best Graphics Cards For The Money: September 2014 - Best Graphics Cards For The Money, September Updates. My understanding after much reading is that if you don't game at very high resolutions, spending more than $200 becomes less and less convenient in terms of bang for the buck. I bought my used 7950 Boost for that price.
  21. I will finish my tests in a while. Do I need to post all the details here to be in the leaderboard, so that anyone can use them as a reference?
  22. Discovered the arcane: the leaderboard looks at the 3DMark .gpu component, so my marks are 3DMark Vantage.gpu = 24255 and 3DMark11.gpu = 7656, quite good and comparable to similar setups.
  23. I didn't use the EXP GDC, so I prefer to leave it to others with first-hand experience to give you confirmation... - - - Updated - - - Thanks! Anyway, looking at the leaderboard I found others with the same 7950 GPU and much higher scores. Is there any guide for performance optimization?
  24. Yes, it was the external screen. I discovered the problem: you need to use Catalyst Control Center and set the external screen as the Primary one. Using this setup, here are my scores: 3DMark Vantage: 1672 (iGPU), 16007 (eGPU); 3DMark11: 5338 (eGPU only, since the iGPU doesn't support DX11). I had a lot of flickering during the test, probably due to high fps not accepted by the LCD. I need to look at the setup, but I'm really pleased with the current performance.