Tech Inferno Fan

eGPU experiences [version 2.0]


Hi, just thought I would post up my results:

3DMark06: 15594

3DMarkVantage: P12259

3DMark11: P3650

System:

14" HP 8440p

CPU: i5 580m 2.67GHz

RAM: 8GB

Gfx: GTX 560 Ti @ x1.2Opt

Chipset: QM57 EC

OS: Win7/64

Setup 1.x - Yes (required)


Hey guys, I'm very new to this, but really want to try eGPU for obvious reasons.

What setup would you recommend for the following? (I would really appreciate detailed instructions :) )

I have:

Lenovo ThinkPad X201i

Intel Core i5-540M / 2.53 GHz (3.06 GHz turbo), dual-core

RAM: 4GB

Windows 7 (currently 32-bit, but in the process of getting 64-bit); I don't know if it matters for this...

Thx in advance!


Thanks for the benchmarks! :) Could you please tell me if your GTX 650 Ti Boost needs a 256MB memory block (like the ordinary GTX 650 Ti) or if it uses 128MB+64MB+32MB blocks (like all other Nvidia cards)? You can see it in Device Manager. Thank you.

Edit: also, could you tell me anything about your 175W XBOX PSU implementation? Is it any different from the 203W PSU?

This is what I see in the device manager:

post-8761-14494996871023_thumb.jpg

Three separate memory blocks, so probably the normal Nvidia allocation?

For the PSU, hardly anything different as compared to a 203W. The PSU is much lighter and uses a 2-prong vs 3-prong plug. The 3-prong fits too, but the ground connector is gone. I had a 203W for my eGPU but it was either bad or I fried it when I accidentally hooked up a 12V-->5V reducer backwards. I had 40V coming out of the 5V side hooked up to the PE4L...good thing it was smart enough to turn itself off! After I fried it, I could run a GT610 through full benchmarks but the 650TiB would freeze after a few seconds (power starvation maybe?). The 175W is missing the two small (extra 12V and gnd) wires that the 203W had, so there's only 8 total wires (3x12V switched, 3xGnd, 1x5V always on, 1xsense for switching on 12V).
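A quick sanity check on the per-wire current implied by those wire counts. This is just a sketch: the 12V rail ratings are the XBOX360 adapter specs quoted later in the thread, and the assumption that the load splits evenly across the wires is mine.

```python
# Hypothetical sketch: with only 3x12V wires carrying the whole 12V load,
# each wire sees roughly a third of the rail current (assuming an even split).
def amps_per_wire(rail_amps, n_wires):
    return rail_amps / n_wires

print(amps_per_wire(16.5, 3))  # 203W PSU: 5.5A per 12V wire
print(amps_per_wire(14.2, 3))  # 175W PSU: ~4.73A per 12V wire
```

Worth keeping in mind when choosing wire gauge for the soldered GPU power plug.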

My long-term goal is to get a project box for my eGPU, so I used one of these:

StarTech EPS8EXT 8" EPS 8 Pin Power Extension Cable - Newegg.com

as a connector between the PSU and eGPU. I ran 12V and 5V using a floppy Molex connector to the PE4L, and also soldered a gpu power plug. I put a switch between the 5V and power-on sense wire, and on the sense wire side, ran the 5V to the PE4L. This way I have switched 12V and 5V.

Some discoveries from tinkering with this setup:

The middle ground pin on the GPU power connector is used as some sort of sense line to detect whether the connector is hooked up. Having it not connected to ground threw an error 43, just as if the connector was not hooked up. Resoldering it to one of the ground lines fixed the error 43.

@Tech Inferno Fan mentioned something a few months ago about how a PE4L was supposed to automatically reset with #PERST after a restart. I saw this behavior when the 5V line on the Molex was disconnected and J4 on the PE4L was removed. You can simulate this on an ATX power supply setup by starting your laptop with the ExpressCard connected and eGPU power off, so the PE4L takes its 3.3V from the ExpressCard. Has anyone else noticed something along these lines? This could eliminate the soft-restart issues that most of us have been having.

I didn't have any issues with setting up the 650Ti Boost with my 2570P. Essentially plug and play. I have Setup 1.x for another reason, but I don't think that I needed it for this. I still have yet to run benchmarks on an external monitor, but right now I'm running Skyrim (no mods) on the internal monitor at 720P on a mix of high and ultra settings.


Thanks for the info. :) About the memory allocation, I think you're right. The second memory block in your screenshot is a 128MB block and the other two are even smaller, so I'd guess this card behaves like ordinary Nvidia ones. In fact the chip is different from the stock GTX 650 Ti, if I'm not mistaken.

Do you guys think the XBOX 203W PSU would have enough juice for a GTX 680 (rated 195W on the Nvidia site)? The other option would be a GTX 660 Ti (rated 150W), for which I would opt for the XBOX 175W adapter. What do you think? If the XBOX 203W adapter can't power the GTX 680, then I'll go for the smaller 175W PSU.

Thanks.


phillofoc's GTX650Ti Boost has a 128MB+32MB+16MB allocation, the same as a GTX660 or better card. *Some* GTX650 cards require 256MB+16MB, which would cause issues with his 2570P since there is no bios-allocated 256MB block free, meaning he'd need either Setup 1.x or a DSDT override to host one of those problematic cards. Same with AMD cards, which also require 256MB+16MB.

GTX460-GTX580 cards require 128MB+64MB+16MB. Hence I suggest treading carefully with those GTX650 cards, or just skipping them and getting a GTX560Ti or GTX660.
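To make the allocation talk concrete, here is a rough sketch of the fit check described above. The block sizes are the ones from this post, but the list of free regions is a made-up example rather than real 2570P BIOS data, and real PCI BAR placement also has alignment rules this ignores.

```python
# Hypothetical sketch: does a card's set of required 32-bit memory blocks
# fit into the free regions the notebook BIOS leaves below 4GB?
def fits(required_blocks, free_regions):
    """Greedily place each required block (largest first) into a free
    region that can hold it, carving the block out of the region."""
    regions = sorted(free_regions)
    for block in sorted(required_blocks, reverse=True):
        for i, region in enumerate(regions):
            if region >= block:
                regions[i] -= block  # carve the block out of this region
                regions.sort()
                break
        else:
            return False  # no free region large enough
    return True

# Sizes in MB, taken from the post above
gtx660_like = [128, 32, 16]  # GTX650Ti Boost / GTX660-class allocation
needs_256 = [256, 16]        # problematic GTX650 / AMD allocation

# Made-up example: BIOS left these regions free, but no 256MB block
free = [128, 64, 32, 16]
print(fits(gtx660_like, free))  # True  -> card allocates fine
print(fits(needs_256, free))    # False -> no room; would need Setup 1.x or a DSDT override
```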

The 203W XBOX360 PSU has a 12V/16.5A rail (198W) and a 5V/1A rail (5W). It would be touch-and-go powering a GTX680. Though if you read some comments by bjorm here, he suggests a GTX680 has very little performance benefit over a GTX670 on a x1.2Opt link, something I agree with. A GTX670 has a 170W TDP, comfortably accommodated by that 203W adapter.

If you used an XBOX360 175W AC adapter instead, you'd be limited to 12V/14.2A (170.4W) + 5V/1A (5W). That would be pushing it for a GTX670 with a 170W TDP, but would be fine for a GTX660Ti with a 150W TDP.
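The power-budget arithmetic above can be sketched as follows. The adapter rail ratings and card TDPs are the figures quoted in this post; the 10% headroom threshold is my own arbitrary rule of thumb, not a measured tolerance of the XBOX360 adapters.

```python
# Sketch of the PSU headroom arithmetic from the post above.
def rail_watts(volts, amps):
    return volts * amps

adapters = {
    "XBOX360 203W": rail_watts(12, 16.5),  # 198W on the 12V rail
    "XBOX360 175W": rail_watts(12, 14.2),  # 170.4W on the 12V rail
}
cards = {"GTX680": 195, "GTX670": 170, "GTX660Ti": 150}

for psu, avail in adapters.items():
    for card, tdp in cards.items():
        headroom = avail - tdp
        ok = headroom >= 0.10 * tdp  # demand at least 10% headroom (my assumption)
        print(f"{psu} + {card}: {headroom:+.1f}W headroom "
              f"({'comfortable' if ok else 'marginal/insufficient'})")
```

On these numbers the GTX680 on the 203W adapter and the GTX670 on the 175W adapter both come out marginal, matching the "touch-and-go" and "pushing it" calls above.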


Thanks Tech Inferno Fan. I was thinking the GTX 670 wasn't so appealing, because it's the same as the 660 Ti other than a higher memory bus width. Isn't such high memory bandwidth a bit overkill for playing at 1080p? Just asking.

Edit: to accommodate the 670 with the 175W PSU, could I do it by undervolting the card? Or would it not start up?


A GTX670 would certainly start up. It's under full-load conditions that power-starvation problems appear, e.g. the card stops working or starts throttling. I'm not sure how much additional tolerance those XBOX360 adapters have, e.g. +5%? The safe bet would be a 175W adapter with a GTX660Ti or a 203W adapter with a GTX670.

The 175W is missing the two small (extra 12V and gnd) wires that the 203W had, so there's only 8 total wires (6x12V switched, 6xGnd, 1x5V always on, 1xsense for switching on 12V).

My long-term goal is to get a project box for my eGPU, so I used one of these:

StarTech EPS8EXT 8" EPS 8 Pin Power Extension Cable - Newegg.com

as a connector between the PSU and eGPU. I ran 12V and 5V using a floppy Molex connector to the PE4L, and also soldered a gpu power plug. I put a switch between the 5V and power-on sense wire, and on the sense wire side, ran the 5V to the PE4L. This way I have switched 12V and 5V.

Did you mean 3x12V and 3xGnd?

Also, what did you do with that 8 pin power extension cable?

Thanks.


Good catch on the number of wires, it's 3x12V and 3xGnd. I edited my original post to prevent any confusion in the future.

For the 8 pin power cable, I cut the wires in the middle, separating the male and female 8 pin connectors to make the system a little more portable. I hooked up the male side to the power supply and the female side to the eGPU side; this way I can separate the PSU from the eGPU. If I didn't do this, I would have to solder the power supply directly to all the connections, which doesn't work with my future plans to make an enclosure.



Great idea! :) But how did you get the red cable (5V) to be detachable from the PE4L? It can't go through that 8-pin CPU cable, can it?


Hi all. I bought a PE4L 2.1b for use with a GeForce 560 Ti. The PE4L arrived today and I hooked it up to my Lenovo X201 (8GB RAM, Windows 8.1).

I hotplugged the GPU (it would BSOD on boot) and installed the latest Nvidia drivers. From then, it worked, and I could switch between my internal screen powered by Intel and my external display powered by my 560 Ti.

Anyway, I wanted to check in and make sure I'm not missing anything else here. Do the latest desktop drivers automatically use Optimus compression, or is that something I have to do via driver modding? I downloaded the latest mobile drivers and modified the nvamn.inf file to include entries for my 560 Ti. That got the driver installer to attempt installing the drivers, but it failed during installation.

So my questions are:

1. How do I get Optimus? Do I have Optimus? I don't really care about getting my internal display using the 560 Ti, I just want PCI-e compression.

2. What is best practice for removing/reconnecting it? Sleep mode? At boot now that I have the drivers installed?

Will be making a custom enclosure and posting pics no doubt once I'm done. :)


@qapn

How do I get Optimus? Do I have Optimus? I don't really care about getting my internal display using the 560 Ti, I just want PCI-e compression.

The desktop drivers have Optimus now. So yes, you will most probably have PCI-e compression; feel free to run 3DMark06 to check.

Here are some pictures of my enclosure (for anyone interested):

post-16814-14494996884651_thumb.jpg

post-16814-14494996886787_thumb.jpg

post-16814-14494996887916_thumb.jpg

I am getting a new fan because the current one is quite loud, but it will still be in the same position.

Feel free to ask if you want more photos/info.

post-16814-14494996885163_thumb.jpg

post-16814-14494996886335_thumb.jpg


Are BF3 and Crysis 3 the most recent demanding games where people have noticed the bandwidth limitation cutting performance to below ~85% of desktop?

I think there is a list somewhere on some Polish board.

(I'm not really interested in games that aren't demanding (e.g. World of Warcraft).)


What did you do to mount the PSU and GPU? I see the little slot cut for the GPU I/O shield, but that's about it. For mine (dunno if you've seen it), I used steel pipe hanger and a custom I/O shield with a little screw mount hole.

Album:

eGPU setup - Imgur


To mount the GPU I cut a little slot for the 90-degree curved edge of the I/O shield (see picture), slotted that in and put a stray rivet through the hole (works quite well).

The PSU is mounted with a little bracket riveted along the floor (see pic two of first post) and the jug cord plug fits very snugly which keeps it pretty solid.

My PE4L is just on the bottom of the 560ti with some cardboard to help support it underneath.

If I turned my box upside-down, things would go pear-shaped. Apart from that I could carry it around fine though.

post-16814-14494996893227_thumb.jpg


@bjorm: do you think there would be any advantage in going from a GTX660Ti to a GTX670 with a 1080p configuration? I ask because the only difference between those cards is the memory bandwidth, which (I think) could be more than enough for 1080p even on the GTX660Ti.


In my opinion there is no point in buying something newer if you already have a GTX660Ti.


No, I don't have it yet. But I was wondering if it would make any sense going for a GTX670 to play at 1080p.

First, a GTX660 vs HD6850 comparison (I don't have access to the 660 at the moment... on the other hand, such a comparison is a bit pointless, because the 6850 is way weaker than the 660, and it seems to be more limited by PCIe bandwidth):

Crysis 3, mission 6 (from the beginning to the second Ceph AA defence; getting there took me 7 minutes on the 660 and 12 minutes on the HD6850 due to lower FPS, which makes it hard to play well).

Settings:

Resolution: 1920x1080

All low

GTX660@1.2Opt:

18897026_crysis-3-gtx660-misja-6.png

Look at the red line; the green one represents gameplay at high settings (textures and the rest) and is really short.

AVG FPS: 45

HD6850@1.2:

19863389_crysis-3_mission-6_hd6850_fullhd-low.png

AVG FPS: 27

Like before, the CPU usage is way lower than it was with the GTX660. It's really worth considering, because as we know, Nvidia drivers cause a greater CPU load than AMD drivers. It's much more visible on older Core 2 Duo based desktop PCs, causing microstuttering, but I think it might affect a Core i5 as well. The problem should not be present when using a quad-core CPU.

On low settings, the GTX660 should easily average 60 FPS. I'm not sure about HD6850 performance, but according to benchmarks, 28 FPS is a valid result... for high settings. On low it should be more like 40 FPS, I suppose.

So, we can see that both cards are limited. By what, though... and now it's starting to get really difficult. I'm not sure which of the factors is more important here. Maybe the performance is CPU-limited, as I think it is on the Welcome to the Jungle level. Of course, when it comes to the "grass moment", the PCIe bandwidth is a real drawback there, giving us drops to 20 FPS instead of a much more playable 30-35. On the GTX660, mission 6 is really playable, even at high settings. On the HD6850 it's not playable even on low settings, which keeps me wondering how a GCN-based card would perform. It would have to be less bandwidth-limited to maintain good FPS.

P.S. I'm going to buy a GCN-based card like an HD7870/7870 XT/R9 270(X), but given that it's Christmas time, I think I'll have to wait till January, because prices are a bit higher now and shipping might take very long. That matters because in Poland, when buying online, I can return the GPU within 10 days without giving any reason. So I'd like to use that privilege for some almost-free benchmarking, if I don't end up staying with an AMD GPU.

Hey, I've tested Crysis 3 with a 6950 on both my desktop and my laptop.

I got a really strange result and I may have found the explanation.

I have a weak CPU on my desktop (but I think it's good enough), and I get 30 FPS at the start of the jungle scene (on the stairs) and almost 60 FPS while looking at the floor. This is on low settings with very high textures, original version of Crysis.

On the laptop I was getting just 20 FPS looking into the jungle and 30 FPS looking at the floor behind, so I checked GPU and CPU usage and they were both below 100%: 75% GPU and I think 20% CPU.

Then I tried disabling vsync in the game options and got 25 FPS looking down the jungle and 45 FPS looking at the floor behind; GPU utilization jumped to 100% while the CPU stayed at ~20%.

I can't really explain why it wasn't using the full GPU power with vsync at <=30 FPS, as if it was synced to 30Hz and not 60Hz as it seemed to be.

So it's still within the 83% of desktop performance, at least for this card, and doesn't seem to be affected more than other games, unless you'd like another scene to test (I can't get the map load command to work, so if it's near the start that would be great).

Related thread:

https://forums.geforce.com/default/topic/537360/crysis-3-locked-30-fps-when-v-sync-is-enabled/
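For anyone wanting to reproduce the "percent of desktop" figure mentioned above, it is just a ratio of the reported frame rates. The FPS numbers below are the HD6950 ones from this post (vsync off).

```python
# Sketch of the percent-of-desktop arithmetic used in these comparisons.
def pct_of_desktop(egpu_fps, desktop_fps):
    return 100.0 * egpu_fps / desktop_fps

print(pct_of_desktop(25, 30))  # looking into the jungle: ~83% of desktop
print(pct_of_desktop(45, 60))  # looking at the floor: 75% of desktop
```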


Yeah, the 670 seems to be the best card to get with x1.2Opt. I just bought an MSI 670 PE card, and while I haven't done any concrete tests yet, medium settings at 1080p resulted in a very consistent 60fps with few drops below (it hit 55 for about a second then went back up). And I got 7566 on 3DMark11 P. Not too shabby.

I'm finishing up a comparison between a 660 and a 670 at x1.2Opt so people can see the difference and make a call from there.


Tech Inferno Fan>> Full hardware spec of eGPU implementation posted earlier : http://forum.techinferno.com/diy-e-gpu-projects/2109-diy-egpu-experiences-%5Bversion-2-0%5D-222.html#post75247

HP ProBook 6460b (with Intel HD Graphics 3000)

Intel Core i5-2520M

EVGA GeForce GTX 670 2GB

PE4L ver.2.1b

PCIe 2.0 enabled in the BIOS

@lapytopy - Thanks for sharing your case pix. That looks really portable!

@bjorm - If the next gen PE4L lets us power through the bus (is this really theoretically possible?) then you can make a micro eGPU :)

@noric - GTX 670 for sure... Windforce if you can get your hands on one. If I had the chance to start over I would have held out for the Asus GTX 670 DirectCU II (or maybe the Asus GTX 670 DC Mini!)

I was finally able to upgrade my humble i5-2520M CPU to an i7-2860QM. I still can't touch any of the leaderboard numbers, but I'm happy with the boost. Here are some before and after snapshots:

3DMARK06

17268 - NVIDIA GeForce GTX 670 video card benchmark result - Intel Core i5-2520M Processor,Hewlett-Packard 161C

23128 - NVIDIA GeForce GTX 670 video card benchmark result - Intel Core i7-2860QM Processor,Hewlett-Packard 161C

3DMARK11

P5942 - Graphics Score 7382, Physics Score 3764

P7155 - Graphics Score 7427, Physics Score 7083 - NVIDIA GeForce GTX 670 video card benchmark result - Intel Core i7-2860QM Processor,Hewlett-Packard 161C

I didn't need to do anything to the heatsink or fan, but I went ahead and upgraded to a 90W laptop power supply. Honestly I don't think I really needed to though.


@kaladeth

Nice to see such a nice improvement with a change in CPU. My i7-2620M is only a smidgen better than the i5-2520M you had. You are tempting me to upgrade to one of the nicer i7 CPUs lol.

I see you have 3 displays. Have you done testing with 3 hooked up vs 1 hooked up? I noticed a pretty solid performance difference with a change in the number of displays active:

http://forum.techinferno.com/diy-e-gpu-projects/2109-diy-egpu-experiences-%5Bversion-2-0%5D-43.html#post74494

(1 to 6 displays in that test)


Nice upgrade with good performance boost. I'll mention that in the 2560P/2570P thread we discovered some useful info about the SB/IVB CPUs:

* SB i7-quad CPUs draw about 10-14W more at the same x27 multiplier than IVB ones, and x27 is often the 4-core multiplier limit on SB, significantly lower than on an IVB CPU. REF: http://forum.techinferno.com/hp-business-class-notebooks/2537-12-5-hp-elitebook-2570p-owners-lounge-42.html#post79514

* i7-x6xxQM, i7-x7xxQM and i7-x8xxQM CPUs are rated at 45W. Intel specs the 4-core turbo to typically be 2 multipliers less than the single-core turbo, BUT if you put the CPU under high load it will be TDP throttled to below even that level. E.g. the i7-3740QM has a x35 4-core multiplier, but will be TDP throttled down to x32 under extreme 4-core load such as LinX. The only way around this is to have power limits removed, where it will use somewhere upwards of 55W at that load. Unlocked power limits, however, are not available on the typical business-grade notebooks by HP, Lenovo and Dell that are used for eGPU purposes. One workaround would be to get an i7-x9x0XM with a 55W TDP limit, giving an extra 10W to work with.

A 2560P owner saw his i7-2760QM with a x32 4-core multiplier limited to x27. REF: http://forum.techinferno.com/hp-business-class-notebooks/2537-12-5-hp-elitebook-2570p-owners-lounge-42.html#post79502

Summary: given IVB systems are only a small cost premium over SB ones on eBay, I recommend getting an IVB system over an SB one for the best i7-quad eGPU performance.
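The multiplier arithmetic above is straightforward since SB/IVB CPUs use a 100MHz base clock; a quick sketch using the i7-3740QM figures from this post:

```python
# Sketch: core clock = multiplier x 100MHz base clock on SB/IVB CPUs.
BCLK_MHZ = 100

def core_clock_ghz(multiplier):
    return multiplier * BCLK_MHZ / 1000

# i7-3740QM: rated x35 4-core turbo, TDP-throttled to x32 under extreme load
rated = core_clock_ghz(35)      # 3.5 GHz
throttled = core_clock_ghz(32)  # 3.2 GHz
loss = 100 * (1 - throttled / rated)
print(f"{rated} GHz -> {throttled} GHz, about {loss:.0f}% clock lost to the 45W limit")
```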



What do you mean about a micro eGPU? What is the connection between the PCI-E generation and a smaller eGPU? :)

I prefer the MSI TwinFrozr or DirectCU II over the Windforce. I've owned many GPUs; the Windforce is a very good cooling system but feels like cheap plastic ;) The DirectCU II and TwinFrozr are the coolest and quietest :)

Note that your high-end SB CPU is at the same level (GS in 3DM11) as my TDP-limited IVB CPU (according to @Tech Inferno Fan's results) :)


Hey nando, what about idle power consumption on SB vs. IVB? That's a very important factor in a mobile environment. Did you say anything about IVB consuming more at idle? This could only be tested on an IVB system, by pulling the IVB CPU and plugging in an SB CPU with similar specifications...

@kaladeth: could you please check in ThrottleStop the voltage and TDP at maximum multithreaded performance and at the x27 multiplier (such as nando is testing)? Also, what about the temps of your 2860QM? Thanks.

