Posts posted by prosetheus

  1. Looking for some advice on an enclosure and laptop purchase, centred around an EGPU setup for some gaming.

    I already have a Gigabyte GTX 970 G1 Gaming edition card that is quite long. I'm experiencing some graphic errors on boot that I think might be caused by the card so I *might* be able to RMA it and get it replaced with a smaller card. But let's assume for the moment that I can't.

    Laptop purchase

    I'm a fan of Lenovo's T/X/W ThinkPad range, and some reading here suggests Thunderbolt 2 is the way to go for maximum PCIe bandwidth (?). The ThinkPad W541 comes with a TB 2.0 port; a fuller spec can be found here: Lenovo ThinkPad W541 | Mobile Workstation | Lenovo UK

    I'm working on the assumption that this is a good choice for an eGPU (I believe it's in the 'good' list on one of the recommendation threads here). If there is something better, I'll need the laptop to be a sturdy, premium-build machine with a global warranty (one I can get serviced in a country outside the purchase country).

    Enclosure

    From what I've read, the AKiTiO Thunder2 is the preferred choice for a lot of users. Here are the things I would like from an enclosure, hoping someone can point me in the right direction:

    - ability to fit a full (or over-)sized GTX 970 - L=312mm, W=129mm, H=43mm

    - not needing a PC PSU to power it (desk space is at a premium)

    Implementation hopes/desires

    - I want to run the eGPU through the laptop's display, hopefully with minimal performance loss

    - I'd like it to be a fairly seamless process - plug and play as much as possible, and not one that invalidates my warranty with (possibly) Lenovo

    - will consider a slight loss in performance if I can get an enclosure that fits the card I have and can be powered with just the enclosure, or perhaps some form of power brick

    Let me know if anything I've said is nonsense/impossible. I haven't seen anyone with a full-length card in a closed enclosure in my browsing, so I'm hoping it's possible!

    Thanks!

    Hey.

    Thunderbolt 3 eGPUs have officially been sanctioned by Intel and its partners, and the race is on for major manufacturers such as MSI to provide enclosures and laptops.

    MSI is including a 'superport' in some new models that will allow you to add external gpus much more smoothly than possible now.

    Also, Lenovo is coming out with amazing new models with full thunderbolt 3 support.

    https://forums.lenovo.com/t5/Lenovo-IFA-2015/ThinkPad-P50-amp-P70-Hands-On/td-p/2164261

  2. So, as an update. In win 7 x64, with 12gb of 1333 ram via 2+2+4+4, my w520 with GTX 670 on PE4C 2.1 EC booted flawlessly into x1 2.0 without using Setup. Using Setup, I got full optimus support and framerates that were quite nice.

    With 8+8+8+8 1600 RAM, Win 7 x64 was unusable, and I didn't fiddle around with Setup enough to get anything above x1 1.1 (Setup saw the card as a PCIe 1 card, but GPU-Z in Windows reported the card as PCIe 2 x16 @ x1 1.1).

    Installed Win 8 WITH THE EGPU CONNECTED VIA EXPRESSCARD. Same 8+8+8+8 1600, and without using Setup the eGPU runs flawlessly at x1 1.1 (that means no memory problems).

    I'll fiddle around with setup to see if I can get it to 2.0

    THANKS NANDO FOR THE WIN8 SUGGESTION!

    Just wanted to make sure: ideally, for the W520, the eGPU should run at x1.2Opt, right?

    With over 12gb of ram and win 8.1, that does not seem to be possible?

    Asking because I am about to upgrade from 8 GB (4+4) of RAM to 24 GB (4+4+8+8).

  3. Good choice, I hope you got a non coil-whiny one =) (seems like the GTX900 series has a lot of that going on)

    I don't see why it shouldn't work with Limelight and GameStream; I mean, your system is a legit system, just on a crippled bus (compared to a Gen1/2/3 x16 bus).

    I'm thinking of testing my Raspberry Pi as a Limelight receiver since the concept is really neat!

    My friend will be bringing the gpu from the US in the first week of december so there is still some time till I can get it up and running.

    In case you or anyone else gets the time to pull this off before I do, here are the basic requirements for Limelight (Game streaming to other Android devices or P… | Nvidia Shield | XDA Forums)

    General requirements for current APK:

    SoC capable of decoding H.264 High Profile in hardware (Snapdragon, Exynos, Tegra 3 or higher, Rockchip, and more) OR an SoC powerful enough to decode in software (4 x Cortex-A9 at 1.5 GHz or similar)

    Android 4.1 or higher

    GeForce Experience with a GTX 600/700/800 GPU (GTX 600M/700M/800M supported with GeForce Experience 2.0)

    Steam

    Xbox, PS3 (with SixAxis app), Moga (HID mode), Shield, or Ouya controller (other controllers may work too in HID mode)

    Mid to high-end wireless router (preferably dual-band 802.11n or better)

    Good wireless connection to your Android device

    ---------------

    Please do bear in mind that Limelight also has a PC client-side version, i.e. it can stream games to a low-powered Windows netbook/laptop/tablet as well.

    One of the best demonstrations of the capabilities of NVIDIA GameStream (on its own proprietary hardware, NOT Limelight) that I've found so far was this:

    I really hope this works well with eGPU solutions, or I'm screwed because I chose the 970 over the R9 290X Lightning, which is going for $321 on Newegg now (after rebate).

    Let me know if you or anyone else gets the chance to try it out before I do.

  4. @prosetheus

    I'd go for the MSI GTX 970

    - - - Updated - - -

    I'd get the Gigabyte GTX 970 Gaming G1 over the GTX 780 Ti (GTX 970 Comparison: STRIX vs MSI Gaming vs Gigabyte G1). That card can be overclocked to a >1500MHz boost clock, outperforming the GTX 780 Ti.

    The AMD versus NVIDIA question is more complex. Some games can be optimized more for one than the other. Consider reviewing some comparative benchmark results: http://forum.techinferno.com/diy-e-gpu-projects/2109-diy-egpu-experiences-%5Bversion-2-0%5D.html#ati+nvperf . AMD has been doing some serious price cutting since the release of the GTX 9xx, which means you might pick up a serious performance bargain. Though do note that there is no accelerated internal LCD akin to NVIDIA Optimus with AMD cards unless you pay for LucidLogix Virtu, and even then it may not offer mobile support: http://forum.techinferno.com/diy-e-gpu-projects/2967-lucidlogix-virtu-internal-lcd-mode-amd-egpus.html

    Best would be if you could buy a GTX970 and R9 290x, run your important games on both and compare framerates and return the slower one.

    Thank you for the suggestions, guys. I ended up getting an MSI 970 to pair with my W520. Hopefully I'll have a decent experience.

    One more thing, and this is somewhat important to me. Can anyone confirm whether nvidia Gamestreaming works over pe4l expresscard connections?

    That is basically a feature which allows NVIDIA cards to stream games to their Shield controller/tablet. I will be using Limelight, which is basically a hacked version of NVIDIA's software that allows it to stream to any Android device or even a PC. Limelight is free and open source.

    It would be very helpful for me if someone could let me know what I can expect with that by testing it out on their end.

    relevant links:

    What is nvidia gamestream? Stream PC Games from your GeForce GTX Rig | NVIDIA SHIELD

    What is Limelight? Limelight Game Streaming

    Where to get it from? https://play.google.com/store/apps/details?id=com.limelight&hl=en

  5. Hey guys, need some help here.

    A friend is coming over from the US by the end of November, so I can order parts at good prices. At the moment I am choosing between building a desktop or setting up an eGPU over ExpressCard with a W520 (2820QM + 8GB RAM) I have.

    I will be gaming over external display at 1080P.

    For a diy egpu, which would give the best performance?

    GTX 780 Ti ($390)

    r9 290x (~$340 to 370)

    GTX 970 (~$340 to 360)

    Also, I wanted to get forum users' opinions on reliable figures for the power consumption of these cards, both stock and with reasonable, stable OCs, so I know which PSU I can get away with most cheaply (a rough back-of-the-envelope sketch is at the end of this post).

    I play all kinds of games, but what's important to me is performance in MP shooters (BF4 etc.) and open-world games (GTA 4 and the upcoming GTA 5, Assassin's Creed Black Flag), and later Arkham Knight.

    Much thanks in advance.
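    For reference, here is a rough back-of-the-envelope sketch of the PSU arithmetic mentioned above. The TDP values are the vendors' published spec-sheet board-power figures; the OC and headroom factors and the rest-of-system estimate are purely assumptions for illustration, not measurements:

    # Rough PSU sizing sketch (spec-sheet TDPs plus assumed overheads, not measured draws).
    CARD_TDP_W = {
        "GTX 780 Ti": 250,  # NVIDIA spec-sheet TDP
        "R9 290X": 290,     # AMD spec-sheet TDP
        "GTX 970": 145,     # NVIDIA spec-sheet TDP
    }
    REST_OF_SYSTEM_W = 150  # assumed desktop CPU/board/drives; roughly 0 for an eGPU-only PSU,
                            # since the laptop runs off its own power brick
    OC_FACTOR = 1.2         # assumed ~20% extra draw for a reasonable, stable OC
    HEADROOM = 1.3          # assumed ~30% PSU headroom for efficiency and aging

    for card, tdp in CARD_TDP_W.items():
        load = tdp * OC_FACTOR + REST_OF_SYSTEM_W
        print("{}: ~{:.0f} W load -> shop for a ~{:.0f} W PSU".format(card, load, load * HEADROOM))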

    btw, getting the new WS60 when it comes out, can't wait for wrecking graphics and the ability to connect an eGPU easily

    I would think twice before getting the WS60. I checked the specs on the link you provided and for some reason the WS60 only has 2 ram slots, whereas quad i7's support 4 ram slots. Thus the maximum RAM supported is actually only 16gb. And if you actually intend to use it as a workstation, 16 gb can be a limiting factor.

  7. Finally I've got my hands on MSI R9 270 GPU. My 3dmark11 score:

    i5-3210m + R9 270. GPU score is significantly higher than the GPU score of GTX660: 6785p vs 6019p. Does it perform that well in games?

    No :( I've tested Crysis 3, ACIV, Batman Arkham Origins.

    Some HWINFO logs:

    a) Crysis 3.

    [Attached HWiNFO screenshots: Crysis 3, mission 1 (low settings) and mission 3 (low settings)]

    Played on low settings in Full HD resolution. The performance is way worse than on the GTX 660. The FPS peaks in vents etc. are higher on the R9 270, but overall the 660 beats the R9 270. The third mission is barely playable on the R9 270 with low settings. With the GTX 660, I could play the mission on mixed medium to very high settings, getting higher FPS (about 35 average). I don't have a log for mission 2, "Welcome to the jungle", with the "grass problem", but the FPS was about 10-15% lower in the grass, reaching a minimum of 15 FPS. Neither card can achieve a playable framerate on that grass level.

    b) Batman Arkham Origins

    [Attached screenshot: Batman Arkham Origins benchmark]

    Played on almost full settings, without motion blur. The game is fully playable on both cards. The benchmark contains both a mission in a closed space and some free running around the city.

    For now, I am disappointed with the R9 270 eGPU performance. I hoped that the GCN architecture could give me some magic like NVIDIA Optimus does. I'm going to test BF4 with and without Mantle to see if that helps. In theory, CPU usage should be lower with Mantle, and the communication between the CPU and GPU, which is a serious drawback of eGPU setups due to PCIe bandwidth, should be faster.

    Thank you for your analysis. I will be following your experience closely, as I was interested in an AMD egpu as well. Shame to hear that you had problems with Crysis 3. Looking forward to comparisons in other games.

  8. Very interesting:

    I was surprised to see this in their benchmarks. We really need to see what Thunderbolt 2.0 can do to these numbers.

    Waiting anxiously for the promised Thunderbolt 2 products to start showing up. I've been holding out on setting up an ExpressCard solution on my W520, as I've found the Quadro 2000M to be sufficient for medium-low settings at 1080p, and it flies if I lower the resolution. I was hoping that TB solutions would be available by now so I could set it up. Hopefully the next few months will bring good news.

    It's been a long time since I made a post in this thread.

    Anyway, I'm requesting from @Tech Inferno Fan or @kizwan to help me on something.

    Remember a year back when I asked how to make an AMD CPU and an AMD eGPU work together? Well, I'm bringing this back now. The story is that I'm giving my Lenovo ThinkPad to my brother after I buy a Fujitsu Lifebook AH532, and my other brother has the Dell Inspiron, the one that's in my signature. Would you help me with this? I'd really appreciate it.

    Looking forward to seeing you succeed with this. AMD already has the ability to CrossFire between an APU and a dGPU, and if there were a way to make this work with external GPUs, it would hopefully lead to better performance at lower prices. Do you have the AMD DockPort in mind when thinking of an AMD eGPU?

    AnandTech Portal | AMD

    Anyway, I have an Acer V5-551-8401 with a Radeon 7600G (mobile) and 6GB of RAM. The GPU has 512MB of memory. It has a quad-core processor (AMD 4555M) at 1.6GHz (2.5GHz turbo boost), and I want to have an eGPU. I need to know what GPU would be suitable for me without bottlenecking my GPU/CPU (I need some low-cost options too), because I got this PC in November 2012 and I don't want to upgrade just for the GPU's sake.

    As far as I know, the diy egpu solution is for Intel processor equipped notebooks only.

    But on the topic of AMD, I also wanted to bring to the attention of forum experts this news:

    AnandTech Portal | AMD 2014 Mobile APU Update: Beema and Mullins

    Of importance in the above article is this:

    While DockPort sounds interesting (a non-Intel alternative to Thunderbolt that basically combines DisplayPort 1.2 with USB 3 into a single cable), AMD said precious little about DockPort in their presentation. Someone asked about it, and AMD said it was “up to laptop manufacturers” and that was about it. There’s the above slide as well, showing how a single cable could drive three external displays along with a variety of peripheral devices, but we’ll have to wait and see how many companies are willing to jump on the DockPort bandwagon.
  10. You could use a can of compressed air to clean your heatsink without having to repaste. Just remember not to let the fan spin in the opposite direction whilst cleaning. You could also raise the back of the laptop whilst gaming, as in prop it up with something to increase airflow.

    A powerful laptop cooler might also help.

    On the software side, use ThrottleStop to manage processor speed. Most games are very poorly optimized and try to get the most out of Turbo Boost, raising temps a lot for no good reason since they only utilize 2 or 3 cores at most. I even disable Turbo Boost for older games (a rough scripted sketch of the same idea is below).
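    For anyone who would rather not run ThrottleStop, here is a minimal sketch of the same "disable Turbo Boost" idea using Windows' built-in powercfg, driven from a small Python script. This is only an illustration under my own assumptions; it is not how ThrottleStop itself works (ThrottleStop adjusts the CPU directly), and the helper name set_turbo_boost is just made up for the example. It needs to be run from an elevated (administrator) prompt.

    # Sketch: toggle "Processor performance boost mode" (Turbo Boost) on the
    # currently active Windows power plan via powercfg.
    import subprocess

    def set_turbo_boost(enabled):
        value = "1" if enabled else "0"
        # Apply to both AC (plugged in) and DC (battery) settings.
        for subcommand in ("-setacvalueindex", "-setdcvalueindex"):
            subprocess.run(
                ["powercfg", subcommand, "SCHEME_CURRENT",
                 "SUB_PROCESSOR", "PERFBOOSTMODE", value],
                check=True,
            )
        # Re-apply the current scheme so the change takes effect immediately.
        subprocess.run(["powercfg", "-setactive", "SCHEME_CURRENT"], check=True)

    if __name__ == "__main__":
        set_turbo_boost(False)  # turn Turbo Boost off before launching an older game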

    Hope that TB2 eGPU solutions become feasible/affordable soon, as next-gen gaming is just around the corner. My dream machine would be a Surface Pro 2-like tablet/hybrid laptop with a TB2 port. However, that seems unlikely until at least the next generation. Is there no way the eGPU community can approach AMD for an alternative solution? They are both cornered and wishing to fight back tooth and nail against Intel/NVIDIA at this point. I know they had an alternative to TB; I think they called it Lightning Bolt (later DockPort) or something.

    Anyway, I think AMD would be far more open to considering eGPU as a viable product, as they do not really have any hold in the mobile gaming market (AMD's laptop CPUs being mostly useless for gaming, and NVIDIA also massively outsells them in the mobile GPU market).

    Hey Nando, if you could get in touch with someone at AMD, I am sure the community here would definitely rally and show their support for such a cause.

    After I had reapplied the paste last night it was running at 99C in SC2, but once I gave up on playing I changed my undervolt from -80mV to -100mV. I left it running the XTU stress test overnight and while I was at work today. The temps have been averaging between 65-70C at 100% CPU load. The entire time I can feel warm air blowing out the side, and even as I type this now I can feel it blowing at a constant warm/hot temperature like it should. However, if I load up SC2, at random times the fans go into overdrive, the air comes out cool, and the CPU throttles until I get about 1-5 FPS. It starts randomly, but once it does it continues until I close the game out. I'm thinking it's an issue with SC2 and Windows 8.1.

    While running Battlefield 3 the CPU never goes above 72C, and it's running like it should.

    How about running ThrottleStop, disabling Turbo Boost, and adjusting the multiplier to make the CPU run at speeds between stock and max turbo? That is what I have to do in most games. This alone changes temps from 97 degrees to 88-90 degrees max on the 2820QM in the laptop in my sig. The W520 has very poor cooling whilst the GPU is running.

    What I would like to know is whether I should stick to what I am doing (it's a one-click solution, really) or try undervolting like you did.

    Sorry if I am hijacking the thread but I felt this was better than starting a newer one as there are already people posting here.

    Current games are not optimized for just AMD. Who knows what will happen in the future, but PC exclusives will not be optimized for just AMD, and any game released for the PC in the near future will have to run on NVIDIA as well. Besides, things are more complicated; for example, the game will be optimized for the exact hardware of the consoles, so other hardware will need re-optimizing again. Not to mention, if it is optimized for an AMD CPU but Intel runs faster than AMD, does it matter? Better to have too much than too little, if the price is OK. And Battlefield 4 on PC bottlenecks even Intel CPUs.

    Funny you should mention Battlefield 4, an upcoming major title HEAVILY optimized for AMD (Mantle and multithreading). I think when building a new PC it's necessary to look up to a year in advance rather than considering only the past or present.

    Unless one is running SLI'd or Crossfired multi gpu setups, the GPU will become a bottleneck much earlier than the CPU. What you are saying is also correct by the way. I only wanted to help the thread starter with very specific, simplified information to get him started without much trouble.

    Thanks for the response b1te. At least now I can be sure that it's the GPU dying that's the problem. I got the laptop from the US (Newegg) back in Dec '08, and I'm not in the US at the moment, so I will just try getting the GPU reballed locally if someone can do it cheaply; otherwise I'll just scrap it and keep it for parts.

  15. Since you are new to the gaming world, I am going to assume that you will be playing a wide variety of games, and will likely prefer newer releases coming out before digging back into the classics in search of better games. AMD is the base for both PS4 and the Xbox One, and most major titles will be on both systems. Thus, just about all upcoming games will be optimized for AMD hardware.

    This is good for you, as AMD hardware is cheaper and offers more tweaking margin than Intel's mainstream CPU's.

    I would strongly recommend getting an AMD 8150 or 8350 processor and combining it with a ~$300 7970 GPU for future proof desktop gaming.

    For laptops, you should definitely go for Intel; they are more expensive but better than AMD's lackluster CPUs. But again, with newer drivers, AMD GPUs will be helpful as they are much cheaper than NVIDIA's severely overpriced mobile (laptop) GPUs.

    So, in conclusion, AMD CPU + GPU for desktop

    Intel CPU + AMD GPU for laptop.

  16. Thanks for the replies again guys.

    In case it is the cable or the GPU dying, I would be very grateful if you guys could suggest where I can purchase both from.

    Since the GPU is part of the motherboard (soldered rather than MXM), or if the cable is what's messed up, I would like to know a good online retailer to buy from. Is it even possible to buy just the motherboard of the MSI GX600? I will need the whole motherboard if the GPU is done.

  17. Possibly shorting on the VRAM, or dying VRAM.

    Thank you for the response. I was hoping for a way to identify the problem.

    This occurred before, a couple of years ago, and was fixed by a repair shop. It worked well for about a year, and I even played Mass Effect 3 on it for days. Then I didn't turn it on for a few months, but when I did, the problem cropped up within an hour. I'd also like to add that this is a rebadged MSI GX600.

    Could this be a problem with a cable that connects to the display?

    I was unsure of which section to post this in, but I think this is where it might get the most visibility, so here goes. I really need help with this.


    Laptop Problems by jfp555: http://www.dailymotion.com/video/x15dhnr_laptop-problems_tech (uploader: http://www.dailymotion.com/jfp555)

    If it does not embed properly, here is the link: Laptop Problems - Video Dailymotion

    Is there a solution to this problem? The laptop seems to be working just fine and boots into windows and I can even make out the display, use the mouse, shutdown, but obviously it is unusable.

    Specs are:

    C2D T5850

    4 gb ram

    8600m GT

    Win 7

    The fault seems to be hardware-based, not software.

    Much thanks in advance.

  19. Hello everyone,

    I was wondering if anyone had any suggestions for an ideal eGPU setup for a Lenovo W520 (see signature for details of the laptop). The main issue is: I have very large 3D data sets/meshes that are very taxing (stuttering) on my current GPU. Since school/work is paying for the setup, I figure I might as well get the best eGPU setup possible. I know the GTX 690 and GTX Titan have gotten rave reviews/incredible benchmarks. But will they be overkill given the PCI-E slot bottleneck effects? Will the GTX 780 even be overkill?

    Any and all help/instructions for a high-end eGPU setup for my laptop would be immensely appreciated.

    Thank you.

    The most powerful card you can plug into that right now is the 780. Due to memory bandwidth limitations, dual-GPU cards are not, and probably will not be, supported through ExpressCard solutions. For that, people are waiting for the 2014 Falcon Ridge Thunderbolt update. A user named carage (on the NBR forums, I think) got a Titan running with his W520, but the difference was very small compared to his 670 4GB. So it is possible, but the card will be held back by memory bandwidth, and I think even more so in 3D applications.

  20. This doesn't make sense, Adobe lists the 7000 series as compatible (Adobe Community: FAQ: What features use the GPU and how do I troubleshoot GPU issues?)

    Also if the MGE uses OpenGL and OpenCL frameworks then it has to work on any hardware which supports those features.

    I see where the misunderstanding is. This is for Adobe Photoshop. Unfortunately, what I am working with is Adobe Premiere.

    This link is what the Mercury Playback Engine is: Mercury Playback Engine | Adobe Premiere Pro CS6 - Adobe.com

    And in the list of cards under 'Accelerated Rendering', you'll see a huge number of NVIDIA cards and only 2 AMD mobile cards - and only the ones that are in the MacBook Pros, working ONLY with Mac OS X - and NOTHING for even desktop AMD cards. As much as I hate Crapple, I have to admit that they were powerful enough to twist Adobe's arm after a huge number of Mac fans were horrified at the beatings their MacBooks, with equal specs and many times the cost, took at the hands of much cheaper Windows machines. And since Apple put AMD GPUs into all MacBooks, they had to work this out with Adobe. The fiasco was quite entertaining, but the result remains the same: AMD is suffering due to NVIDIA and Adobe's deal. But that's the way it is.

    On a side note, NVIDIA is now more scared of Intel and its iGPUs getting close to what mainstream users find acceptable, so they allowed AMD to power ALL next-gen consoles, hopefully giving AMD a fighting chance against Intel and also buying NVIDIA some time to put out Project Shield.

