angerthosenear

Moderator
Posts posted by angerthosenear

  1. TB1 and TB2 both use an x4 2.0 electrical PCIe link. The only difference is that the former runs at 10Gbps across the TB channel (equiv to x2 2.0 + 12.5%) whereas the latter runs at 20Gbps, with x4 2.0 giving the equivalent of 88%/94% of desktop x16 2.0 performance. TB2 is the closest we've had to desktop-level graphics performance (bandwidth) on a mobile platform.

    Apple's Haswell 13/15" MBP is the first mobile system with TB2 port(s), letting us do exactly that.

    Ahhhh okay, thanks for the clarification. So the cables are interchangeable then? aka there will be no separate TB2 cable?
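
    For reference, the bandwidth arithmetic above works out roughly as follows (a sketch in Python; the 88%/94% figures come from benchmarking, not from this math):

        # PCIe 2.0 effective bandwidth vs. the TB1/TB2 channel caps.
        GT_PER_S = 5.0         # PCIe 2.0 signaling rate per lane (GT/s)
        ENCODING = 8.0 / 10.0  # 8b/10b line-encoding overhead

        def pcie2_gbps(lanes):
            """Effective one-way PCIe 2.0 bandwidth in Gbps."""
            return lanes * GT_PER_S * ENCODING

        x4 = pcie2_gbps(4)     # 16.0 Gbps electrical link behind the TB port
        x16 = pcie2_gbps(16)   # 64.0 Gbps desktop slot
        print(min(x4, 10.0))   # TB1: the 10Gbps channel is the bottleneck
        print(min(x4, 20.0))   # TB2: 16.0, the x4 link itself is now the limit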

  2. very overclocked 580.

    Knowing you, that 580 probably developed tear ducts and started crying...

    --

    semi off-topic:

    If a stock card is rated for 195W, why would it pull more than that when under load? aka why does it pull more than rated (even though it is stressed)? Is there a % of over-pull to estimate, like for computer builds? (Looking to build a desktop with two GPUs.)

  3. Hopefully the Apple TB2 -> PCIe chassis will work on any TB2 system, though I have a hunch the ones from other companies will be much cheaper.

    SilverStone needs to hurry up and release the T004 (with TB2)!

    ---

    Do you think there will be something similar to the PE4H but with TB connectors? Or does the way the ports are addressed prevent that? How long do you expect until we can get a full x16 link for an eGPU?

  4. @Tech Inferno Fan perhaps Apple will help with a Thunderbolt solution?? (inb4 really expensive)

    With the new Mac Pro it seems like there will be Apple-branded eGPUs.

    QUOTE (Apple - Mac Pro - Performance):

    "

    Unprecedented expansion. At 20Gb/s.

    Six Thunderbolt 2 ports redefine the possibilities for expansion by giving you up to 20Gb/s of throughput — without being limited by a set number of PCI Express slots. Connect to your SAN or fast local RAID storage. Then add video I/O, broadcast monitors, your existing PCI Express cards, and just about anything else your workflow requires. With HDMI output, you can even use a 4K TV as a preview monitor. Manufacturers like Promise, AJA, and Blackmagic are creating a host of advanced high-performance storage, video I/O, and expansion solutions for Thunderbolt 2. Another benefit of Thunderbolt: You can easily move your high-performance peripherals from one Mac to another based on the task at hand. And both generations of Thunderbolt technology are compatible with the new Mac Pro.

    "

    'your existing PCI Express cards'

    Might be interesting.

    More under the Audio section:

    "

    Connect to next-generation I/O technologies.

    Six Thunderbolt 2 ports and four USB 3 ports redefine the meaning of expansion. Connect up to 36 best-in-class audio I/O devices from Avid, Apogee, M-Audio, MOTU, Universal Audio, and more. And with a PCI expansion chassis connected via Thunderbolt, you can work with the DSP and audio I/O PCI Express cards you already have.

    "

  5. Well, my desktop is pretty simple:

    https://www.dropbox.com/s/2319nk4fuirsqs1/Captura%20de%20tela%202013-10-22%2000.31.33.png

    The wallpaper with filename "初音ミク.jpg" is a workaround for a strange bug on Windows: if Chrome is the first application to show non-ASCII characters, no other application will show any of them; so if you put a file with a non-ASCII name on your desktop, Explorer will always be the first application to show them and the bug will never be triggered. I don't know if they fixed this in Windows 8.1 (it happened on both Windows 7 and 8); maybe I should try deleting it.

    Well, and the "nice" text about SecureBoot is thanks to the new way Windows 8.1 manages SecureBoot. F#!"$ Microsoft!

    I used to have that Hatsune Miku wallpaper as one of mine too!

    That's a rather annoying sounding bug.

    I have my system locale set to Japanese, but I'm running Win7 Enterprise, so I don't know about Win8 / 8.1. Perhaps that will help? I have a lot of applications that use non-ASCII characters and luckily I haven't had any boxes show up instead.

  6. Hey Nando,

    I am implementing the eGPU with gtx 680 (AMP edition) on my Macbook Air. The only thing left to purchase now is the PSU.

    Its max power requirement is 195W. Now I am confused as to how PSUs work. Does a single cable from the PSU have a specific current output, OR can it carry the max possible output of the PSU if no other wires are used/connected?

    Also, the V/A rating is not specified anywhere for this.

    Also, kindly suggest the best possible PSU.

    ========================

    PS. Waiting for new eGPU setups on the new MBP with thunderbolt 2 :congratulatory:

    The PSU Tech Inferno Fan linked in his previous post would be suitable for your power needs:

    CORSAIR CX430M 430W ATX12V v2.3 80 PLUS BRONZE Certified Modular Active PFC Power Supply - Newegg.com

    If you go down the spec sheet to the 12V rail, you will see it provides 12V at 32A, which is 384W. So your 195W power requirement is met.

    Hopefully that math is close to reality. But regardless, that PSU would be suitable.
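
    A quick sanity check of that math (a sketch; the 32A figure comes from the spec sheet linked above):

        # P = V * I for the CX430M's 12V rail vs. the GTX 680 AMP's rated draw.
        rail_volts = 12.0
        rail_amps = 32.0                     # 12V rail rating from the spec sheet
        gpu_watts = 195.0                    # card's stated max power requirement

        rail_watts = rail_volts * rail_amps  # 384.0 W available on the 12V rail
        print(rail_watts - gpu_watts)        # 189.0 W of headroom to spare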

    -atn

  7. angerthosenear

    Thank you so much for answering my questions and for your help.

    It really helps me a lot, and you were fast too!

    What do you mean by hardware whitelist?

    On YouTube I heard user "drydreamer" say that if your laptop has Bluetooth, or your laptop model has an option for Bluetooth, then it should have mPCIe.

    But anyway, I can be sure my WLAN adapter is using mPCIe.

    I will start installations this week and post my findings/results here.

    I haven't decided yet which card I should buy; I am thinking of the GTX 5xx or 6xx ones.

    I still don't get the Opt stuff and speeds, and I have no idea what hardware I should get for my laptop to reach optimal speed.

    But I will read the information again and hopefully I will find out!

    Again thank you very much!

    A hardware whitelist means that only a predetermined set of hardware (connected via mPCIe or another internal connection) will work with the system. Things not on that list will either give you an error saying unsupported hardware, or just plain not show up. If you happen to have some spare laptop WiFi cards, you can swap them in to test whether they work - that'll give you a rough idea of whether you have a hardware whitelist or not.

    As for the connection, maybe something like what MikjoA has done here:

    http://forum.techinferno.com/diy-e-gpu-projects/2158-diy-egpu-guide-sony-vaio-vpc-z2-svz13.html

    for ease of connecting your eGPU.

    He uses the PE4L-PM060A. We have determined that the length of the cable doesn't adversely affect the performance (at least the 100cm cable doesn't).

    PE4L V2.1 (PCIe Adapter)

    I use a GTX 660Ti. MikjoA uses a Titan.

    I'm not sure if you will have any particular issues with your laptop. I know some don't like certain cards or whatnot. I don't have enough knowledge in the matter to clarify this.

    @Tech Inferno Fan , input on if a 660+ series card will work with his laptop?

    ---

    That adapter I linked will allow you to get the x1.2Opt link, which gives you pretty good performance (the best you can get over a single mPCIe / EC connection atm). You will want to contact Tech Inferno Fan to get Setup 1.x from him (I believe it is $25) so you can perform compaction to get your x1.2Opt link if necessary. I guess some computers can run without doing this, but be prepared to get it (it seems necessary with current-gen cards).

    --

    Ask if you need anything more,

    atn

  8. Laptop Model: X53s

    Intel Core i5 2410M @ 2.30GHz

    Chipset: Sandy Bridge

    Southbridge: HM65 (Intel)

    4GB of DDR3 RAM

    Motherboard model/type: K53SV

    BIOS updated to latest version

    dGPU: Nvidia GT520MX 1GB

    Also I have an Intel® HD Graphics Family adapter; I guess this is my onboard graphics card?

    iGPU: Intel® HD Graphics Family

    dGPU: Nvidia GT520MX 1GB

    I dont have an Expresscard slot.

    But when looking inside my laptop I found a wireless adapter (Intel Centrino Wireless-N 100).

    So, can someone tell me if this wireless adapter is using a PCIe slot, a normal PCI slot, or something totally different? I tried for hours to find out on the internet what kind of port it uses, but I am still not 100% sure if it's mPCIe or not.. very confusing.

    That would make it an mPCIe port. Assuming you have no hardware whitelist (you would have to ask the manufacturer directly to find out), you can connect your eGPU through this port.

    Since I have an onboard Intel® HD Graphics Family card, does it mean I can use the Optimus driver to use my internal display?

    Yes.

    I couldn't find my laptop in the list, that's why I'm asking questions here, and I really hope someone can help me out :)

    Mine isn't in the list either last I checked, no worries about asking questions!

    Also TechPowerUp says: Bus Interface PCI-E 1.1 x16 @ x16 1.1

    I guess this means my current GPU is running on PCIe?

    But it has nothing to do with mPCI-e right?

    Correct, this just tells you the link between your dGPU and the Southbridge (iirc). This is not mPCIe.

    Can someone give me any information or confirmation about my questions?

    1.) Whether my WLAN adapter is indeed using my mPCIe slot (which I don't mind removing)

    2.) Optimus driver can be used if I have an Intel HD Graphics Family card onboard.

    1. Yes it is. You will have to remove it to use an eGPU (unless you happen to have a Thunderbolt port - I didn't look up your specific laptop).

    2. Yes.

    Those are the most critical things I need to know before I can proceed with the installation.

    Thank you very much!!


    Feel free to ask more questions if ya need to!

    -atn

  9. Try to boot into Windows without the eGPU connected and disable the dGPU. Then reboot and start your eGPU, performing a compaction on iGPU/eGPU/dGPU if needed, then chainload into Windows again.

    If eGPU and iGPU are OK, try to re-enable your dGPU and see if that still works.

    Naw, that doesn't work either; I tried it in the past. That just leaves me with only the iGPU on boot. The dGPU is still active during boot (the pre-Windows environment), so it still throws the compaction error, hangs on boot, or lands in random other non-booting states.

    tbh I think I'm going to give up on having iGPU + eGPU only. All the stuff I run is DX11 and has PhysX, so I don't have to worry about DX9 performance at all other than benchmarks.

  10. -snip-

    Anyways, the temperature dropped by about 20°C.

    New 3dmark06 score is 15212. NVIDIA GeForce GTX 560 video card benchmark result - Intel Core i5-2410M Processor,Dell Inc. 0YW3P2

    Slightly lower than what I used to have 2 years ago, mainly from the SM2 score, despite the PCIe speed improvement. Hmm...

    Glad to see that helped. Not sure how to improve that final amount though; background processes are all I can think of.

  11. Given that case, you can try an iGPU+dGPU+eGPU compaction, force iGPU+eGPU into 32-bit, then do a dGPU [off]. That will assign the iGPU+eGPU into 32-bit PCI space and make the DSDT override unnecessary. Though since you do have the DSDT override, you can also try without forcing the iGPU+eGPU into 32-bit.

    I tried this again after updating my video drivers and I still cannot get my computer to boot with the dGPU off. I tried every combination of compaction and dGPU [ignore] / dGPU [off]. Hoping to figure this out to see why my benchmark scores are so low.

    What else would affect this? I think I have the thermal throttling issue mostly resolved, so I'm guessing it is something software-wise.

    Anything else in Setup 1.2 or in the BIOS I should look for/change?

    --

    Also, Setup 1.2 asks which devices to involve in compaction, and which to have in 36-bit versus 32-bit space. Any suggestions for this? It's getting to the point of too many options for me to try them all.

  12. Thanks again for the conversation. I decided to try out the EVE test server and the problem is completely resolved there, so I'll probably have to wait until the expansion in mid-November to get this fix. I found out that they've been doing a lot of work on their engine to make it DirectX 11 capable, which I know is not a trivial matter. Eventually they plan to add all the pretty tessellation, which they demoed not that long ago. An already pretty game is going to look pretty glorious when they're done.

    At least I know it's not my eGPU setup now! I tried out another game, Warframe, which I can run on absolute highest settings and, despite the fact that quite a few scenes have a lot of crap moving around and shooting at you, it is performing well. Looks like my eGPU conversion is an unqualified success. :)

    Oh cool, glad to hear it.

    I play Warframe too, you should add me (IGN: angerthosenear). If you do a 25-minute infested survival you can even bring your eGPU to its knees with all the ammo/loot on screen. You get lots of Toxics onscreen, each with its own little cloud, so it's a pile of effects. Orokin Derelict stuff is also good to test; the environment is very detailed and the mob density is usually really high.

    - - - Updated - - -

    hi guys, my new PE4L is finally here. Unfortunately my benchmark score is quite low. My PE4H 3DMark06 score was about 15k (you can still see it in the leaderboard). My new score now is 12232. Any ideas what might be causing the problem? In the eGPU Setup my 560 is also only listed as x1.1 instead of PCIe 2.0. :/ :( In fact, I just redid the 3DMark06 test and got only 10800. Weird.

    http://www.3dmark.com/3dm06/17413290

    edit: to make matters worse, I seem to be getting error 12 all the time now and can't even chainload the card anymore. I already tried 32bit and 32Abit compaction with various options but it does not seem to help at all... :/

    edit 2: ok, doing a 36-bit compaction worked. I also set the G2 speed to 5Gbps in the eGPU Setup... However, the score is now 11343: http://www.3dmark.com/3dm06/17413345. So still quite far away from 15k+ :( This is my original score with the PE4H 2 years ago: http://www.3dmark.com/3dm06/15975981

    edit 3: GPU-Z confirmed the card is running at 2.0 x1 under load. So at least that's working...

    Seeing as it has been two years: is your CPU thermally throttling? It might have dust buildup or thermal paste that has gone bad. Everything software-wise seems fine. I notice your CPU score is far lower too (which is why I thought of this).

    Perhaps try cleaning out your lappy / repasting your CPU.

  13. I'm running one (external) screen through the eGPU. I close the laptop lid to turn its screen off (I suspected that the round trip across the link while driving two screens might make the eGPU unhappy).

    I think that you and I and MikjoA are in agreement that it's likely this game which is doing it. Since the internal Intel HD3000 chipset does what most integrated graphics systems do (direct use of on-board memory), there probably isn't any delay at all as textures load and such. Now everything actually has to get paged to memory and sent across the link to get stuffed into the eGPU's memory, and I suspect they may be storing up a bunch and pushing it across in one fell swoop, causing the bus to saturate for a second and the game to stutter. I have been doing some research into it, and apparently other people are having problems like mine on desktop hardware through a normal PCIe connection. I'm starting to be convinced that it's not my setup causing this but the game's code managing its interactions with the video card poorly, which isn't a problem for everyone. The developer who was interacting with some other customers about this problem seemed pretty confused about why it's happening, and it may be a recurring issue with this game. I'll be testing out other games (I usually only play EVE) to see how well they perform.

    Lemme know what you find out, I've been curious about this.

  14. Thanks for the reply! I'm using an mSATA SSD which gets pretty nice performance. The game doesn't hitch at all when played on the internal GPU (same monitor, driven through the x220t's DisplayPort), though obviously at much lower settings, so I can't see how it would be a disk related issue. It only stutters on the eGPU and only occasionally, but it's enough to be annoying. Incidentally, I actually have tried the game out before with a RAM disk and really didn't see any performance gain, so I don't think this particular game is performance bound by read/write disk IO.

    I will try without V-sync on and see if that helps.

    Edit: V-sync did not help, nor did any of the other settings in the control panel.

    Whelp, that shot down a lot of things lol.

    You are running the external screen off the eGPU with the internal screen on, correct? Can you have GPU-Z running on your non-EVE screen (so your internal one)? Set it to view your eGPU and click on the Sensors tab.

    Are you maxing out your memory usage? If so, it just might be a combination of the bandwidth bottleneck and memclock speed.

    I was talking with @MikjoA about this. My guess is that graphical data is being held in memory until all of it has been sent to the card, but with the bandwidth bottleneck the eGPU has to store a lot in memory (more so than a desktop card) before it can do anything (i.e. it takes longer to have all the data sent). But I have no idea how any of that is handled, I'm just guessing here.

    How many screens do you have hooked up to your eGPU? I have 4 hooked up to mine and I have ~500MB of VRAM usage just from having my usual applications (non-games) open.

  15. -snip-

    My question is regarding a specific game (Eve Online). The game is running great on highest settings except for the occasional hitching / stutter / graphics lag spikes. Reducing the settings to bare minimum (which runs well on the integrated video), the hitching doesn't go away during identical in-game activities. In your opinion, is this hitching related to bus throughput problems? It appears that I've got Optimus working just fine, but is there any way to determine if the link compression aspect of the technology is working? I've tried using the 16-bit color trick, which affected nothing (I'm not actually sure that the game accepted it). I also tried forcing frame rates using MSI afterburner, which also had no effect on the hitching. Is there a problem with my setup or any tricks I can employ to reduce hitching that I haven't seen here?

    Also, I built my own enclosure for this and can provide people with some basic knowledge on how to make their own if anyone is handy with basic tools they're likely to already have in their household. It's my first case mod and some minor cosmetic mistakes were made but it came out quite nicely. If I can pull it off never having done it and with only an idea in my head of what I wanted, anyone handy with simple shop tools probably can too.

    The main issue with things like this is disk speed. For instance, my Skyrim has some smooth moments, but also some deathly slow times with lag spikes as pointy as the traps. Thus I use a RAM disk to bypass this issue. It is able to load the textures quickly and thus gets rid of the problem.

    So if you have spare RAM you could try setting up a RAM disk with IMDisk, put the texture files there, and set up a junction to point there; there's a rough sketch of the commands below. (This is a daunting task if you are new to doing this - try keeping it as a last resort if you are.)
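
    A minimal sketch of that setup (it assumes IMDisk is already installed; the drive letter R: and the Skyrim paths are just examples - adjust them to your own system):

        :: Create an 8GB NTFS RAM disk mounted as R:.
        imdisk -a -s 8G -m R: -p "/fs:ntfs /q /y"

        :: Copy the texture files onto the RAM disk.
        xcopy /e /i "C:\Games\Skyrim\Data\textures" "R:\textures"

        :: Swap the real folder for a junction that points at the RAM disk.
        ren "C:\Games\Skyrim\Data\textures" textures_backup
        mklink /J "C:\Games\Skyrim\Data\textures" "R:\textures"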

    --

    You are getting the expected bandwidth to your eGPU so there is no issue there.

    --

    Turn V-sync off. I've had this fix many issues with stuttering and whatnot.

  16. Hi all, I have a question to ask. My notebook is an Acer AS4736Z with Intel GMA 4500M graphics.

    If I purchase a PE4L-PM060A v2.1b + Gigabyte Radeon HD 6670, is it good for gaming? Or is Nvidia best?

    Can I use the internal notebook display?

    You will want to use an Nvidia card so you can get Optimus compression; it helps immensely with the bandwidth constraint you have. Yes, you can use the internal display, but you will see a performance drop.

  17. It would be a problem on your end. I'm running driver 327.23 at the moment; I used to use 314.22. If I'm not mistaken you do not have a dGPU (much less an Nvidia dGPU), so you cannot get Optimus compression. @Tech Inferno Fan would be able to shed more light on this aspect.

    I know @MikjoA does not have a dGPU but is still able to run his internal LCD with his Titan (albeit the bandwidth causes this to not be overly helpful).

  18. I also use IMDisk for certain games where rapid loading of textures helps immensely. I have 16GB of RAM (I only have two slots), so I allocate half of it to a RAM disk when needed, mainly when I play Skyrim. I have all of the (high-res) textures and a few miscellaneous other things loaded onto my ramdisk, with a junction link set up for the related files (the 'textures' folder essentially references R:\textures).

    And yes, it helps immensely. It only really helps with games that need to load a bunch of stuff at once, though.

  19. @MikjoA

    For your FPS cap testing, I cannot seem to find a game that has decently stable menu activity.

    For Skyrim, the stable fps depended on the number of little clouds at the bottom of the menu: it was either ~65, ~75, or upwards of ~110 fps.

    Can you think of anything that would be rather uniform?

    ---

    Saints Row: The Third :: Has a stable main menu; however, I cannot determine whether it is a video in the background or something actually rendered. I get ~110 fps.

    ---

    Borderlands 2 :: A pretty intensive game. I get a measly 12 fps at the main menu (since it is all rendered), and changing the video options doesn't change this. Essentially, low-video-settings fps = high-video-settings fps. I just think this game has a lot going on GPU-wise no matter what. Not sure. Maybe it is trying to do everything with the iGPU instead? At least with Skyrim I was able to select the GPU (in the launcher video options).
