gerald

Y510p Ultrabay Graphics card

Recommended Posts

@rusTORK CPU PCIe lanes and chipset PCIe lanes are different and separate. Intel CPUs of that era have 16 PCIe 3.0 lanes, while the chipset has its own PCIe lanes (usually 2.0) used for things like M.2/LAN/SATA/USB/audio, so don't confuse the two. The Y500's CPU supposedly supports PCIe 3.0, so I'm not sure why the slot is limited; maybe they had some technical difficulty and had to limit it, or it's a design flaw they overlooked (wouldn't be the first time). The major difference between the Y500 and Y510p is the CPU generation: the Y500 has 3xxxMQ-series CPUs and the Y510p has 4xxxMQ-series, but again, both support PCIe 3.0, so I'm not really sure.
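For a rough sense of what the slot generation costs, the standard per-lane PCIe rates can be tabulated in a few lines of Python. These are theoretical link rates from the spec, not measurements from a Y500/Y510p:

```python
# Theoretical per-direction PCIe bandwidth per lane, in MB/s.
# PCIe 1.x/2.0 use 8b/10b encoding; PCIe 3.0 uses 128b/130b.
PER_LANE_MBPS = {
    "1.0": 2500 * 8 / 10 / 8,     # 2.5 GT/s -> 250 MB/s
    "2.0": 5000 * 8 / 10 / 8,     # 5.0 GT/s -> 500 MB/s
    "3.0": 8000 * 128 / 130 / 8,  # 8.0 GT/s -> ~985 MB/s
}

def link_bandwidth_mbps(gen: str, lanes: int) -> float:
    """Raw link bandwidth for a given PCIe generation and lane count."""
    return PER_LANE_MBPS[gen] * lanes

# The Ultrabay slot is x8 wide, so the generation matters a lot:
print(link_bandwidth_mbps("2.0", 8))  # 4000.0 MB/s
print(link_bandwidth_mbps("3.0", 8))  # ~7877 MB/s
```

So a Y510p running the slot at 3.0 would have nearly twice the raw link bandwidth of a Y500 stuck at 2.0.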

20 hours ago, Celestus said:

@rusTORK The Y500 CPU supposedly supports PCIe 3.0, so I'm not sure why the slot is limited; maybe they had some technical difficulty and had to limit it, or it's a design flaw they overlooked (wouldn't be the first time).

Who may know more (Intel or Compal)?


Hey,

I'm new here.

And Gerald, I'd really like to buy one of your adapters. PM sent

LG

On 2018/3/22 at 3:24 AM, gerald said:

Drawing and dimensions...

 

Ultrabay_V2.pdf

Hi gerald, I'm new here, and I major in EE. Could you send this file to my email (kangyifeizx at gmail)? I would like to contribute to this project. Many thanks.


So I just bought the adapter. Hm, I guess it's the latest version. Do I still have to remove the 103 resistor? I assumed I would get v3 already, but it seems I got v2. Or is that resistor OK as it is for running a GeForce GTX with BIOS v3?

IMG_20180918_182552[1].jpg


Hi,

the adapter is the latest one. The PCB version is #2 but the mentioned resistor has the new value -> 10k instead of 1k.


1 hour ago, gerald said:

Hi,

the adapter is the latest one. The PCB version is #2 but the mentioned resistor has the new value -> 10k instead of 1k.

Super!! 


I can now confirm that the Lenovo Y500 supports NVIDIA cards up to the GeForce GTX 1080 Ti.

 

IMG_3291_2.thumb.jpg.3b0e125a198d422cf46bf8d1a1a08e2c.jpg

 

I understand that this is a bit overkill, but it had a good price (the same as a new GTX 1070 Ti or 1080).

 

P.S. My adapter is also Ultrabay/2

Edited by rusTORK


Not sure how much of a bottleneck PCIe 2.0 x8 will be, but you can just run the Fire Strike benchmark or something and compare the graphics score with any review.

Edited by Celestus

30 minutes ago, Celestus said:

You can safely put in an RTX 2080, and PCIe 3.0 x8 will still be enough to handle it.

 

This is actually a very good question! I'm really looking forward to seeing somebody try the new RTX cards with this adapter. I don't have much concern about the bandwidth, but the trouble with error 43 (if it persists) might become a real issue, since the trick with older drivers likely won't work for obvious reasons.


@High_Voltage sorry, I edited my post; I forgot that the Y500 has PCIe 2.0, unlike the 3.0 in the Y510p. So for rusTORK it won't work at 100% speed; only Y510p users get the full benefit of the new cards. Also, I don't think there will be any other problems with error 43; that's already resolved, and RTX cards should be detected like any other with the adapter fix.

Edited by Celestus


@Celestus Yeah, I understand that this card is really overpowered for such a system, but as I said in my post, the real reason for choosing it wasn't performance but similar cost (new GTX 1070 Ti or GTX 1080 vs. used GTX 1080 Ti). I may also later move it into a regular desktop PC case and build a new PC around the card. For now it's just an upgrade of the laptop's graphics.

About benchmarks... I'll try them later. Right now I'm looking for a decent HDMI monitor emulator in local stores. The only thing I can connect via VGA is an old Samsung TV with a poor maximum resolution of 1360x768. It's just bad from all sides.

Also, my AIDA64 just doesn't see the card; it's probably outdated too. I need some more time to find a new version and keys for it.

Edited by rusTORK


Today I found interesting information about the internal display on the iO forum:

 

Quote

April-2018: Win10 Spring Creator's Edition (1803) can now provide an eGPU accelerated internal LCD mode without requiring an Intel iGPU or needing a ghost adapter. It allows assigning an app to run with a specified eGPU. 

 

Link: https://io/forums/pc-setup/guide-accelerated-internal-lcd-on-non-optimus-systems-enjoy/ (forum broke the link)

 

In this topic:

https://io/forums/pc-setup/windows-10-spring-update-1803-and-bootcamp-egpu-for-macs/ (forum broke the link)

 

One screenshot shows "Graphics Specifications" with a card selected for "Power saving" and "High performance". I get the same window, but it doesn't let me select cards; it's as if there is no such menu.

 

Or is all of this relevant only to Windows 10 / macOS via Thunderbolt?

Edited by rusTORK

13 hours ago, rusTORK said:

Today I found interesting information about the internal display on the iO forum:

Link: https://io/forums/pc-setup/guide-accelerated-internal-lcd-on-non-optimus-systems-enjoy/ (forum broke the link)

In this topic:

https://io/forums/pc-setup/windows-10-spring-update-1803-and-bootcamp-egpu-for-macs/ (forum broke the link)

One screenshot shows "Graphics Specifications" with a card selected for "Power saving" and "High performance". I get the same window, but it doesn't let me select cards; it's as if there is no such menu.

Or is all of this relevant only to Windows 10 / macOS via Thunderbolt?

 

I didn't get what you mean exactly. If you want to run the eGPU with the internal display, then all you have to do is disable the dedicated GPU (GT 755M in my case).

 

I've also made a script that disables the dGPU at startup. Now my dGPU is always disabled, and I can use the eGPU (1070) to drive the internal screen.
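The startup script itself wasn't posted; a minimal sketch of how one might build it with devcon.exe (a tool from the Windows Driver Kit) follows. The hardware-ID pattern is a placeholder assumption: look up the real one in Device Manager under Details -> Hardware Ids, and make it specific enough that it does not also match the eGPU:

```python
# Sketch: disable the dGPU at startup so the eGPU can drive the
# internal screen. Uses devcon.exe from the Windows Driver Kit.
# The hardware-ID pattern below is a placeholder, not a verified ID.
import subprocess

DGPU_PATTERN = r"PCI\VEN_10DE&DEV_0FCD*"  # placeholder: check Device Manager

def devcon_command(action: str, pattern: str) -> list[str]:
    """Build a devcon command line; action is 'disable' or 'enable'."""
    if action not in ("disable", "enable"):
        raise ValueError("action must be 'disable' or 'enable'")
    return ["devcon.exe", action, pattern]

if __name__ == "__main__":
    # Needs an elevated prompt; left inert here on purpose.
    # subprocess.run(devcon_command("disable", DGPU_PATTERN), check=True)
    print(devcon_command("disable", DGPU_PATTERN))
```

Registering the resulting script as a scheduled task at logon would give the "always disabled at startup" behavior described.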

 

If you want to access the option to use a specific GPU for a specific application, go to "Settings" -> "System" -> "Display" -> "Graphics settings", then select the app and the GPU you want it to use. You'll need version 1803 for this.
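For scripting that same per-app choice, Windows 10 1803+ appears to store the preference as string values under a per-user registry key; the key path and value format below are my understanding of it, so verify them on your machine before relying on this:

```python
# Per-app GPU preference as stored (as far as I know) by Win10 1803+ under:
#   HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences
# Value name = full path to the .exe; value data = "GpuPreference=N;"
# where 1 = power saving and 2 = high performance.
GPU_POWER_SAVING = 1
GPU_HIGH_PERFORMANCE = 2

def gpu_preference_entry(exe_path: str, preference: int) -> tuple[str, str]:
    """Build the (value name, value data) pair for the registry key."""
    return (exe_path, f"GpuPreference={preference};")

name, data = gpu_preference_entry(r"C:\Games\game.exe", GPU_HIGH_PERFORMANCE)
print(name, data)

# Writing it would look like this (Windows only, left inert here):
# import winreg
# key = winreg.CreateKey(winreg.HKEY_CURRENT_USER,
#                        r"Software\Microsoft\DirectX\UserGpuPreferences")
# winreg.SetValueEx(key, name, 0, winreg.REG_SZ, data)
```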

7 hours ago, intruder said:

If you want to run egpu with internal display then all you have to do is disable dedicated gpu (GT755 in my case).

I did so: when Windows loads to the desktop, I go to Device Manager and disable the GT 650M. But I can't select my eGPU in Graphics Settings. I can select an app and a performance type, but no graphics adapter or menu for it appears.

I have Windows 10 build 1803 (17134.286).

At the iO forum I saw that some people use an HDMI monitor emulator, but some don't. Also, one of the steps in their instructions is hot-plugging the eGPU, which isn't possible with the Ultrabay; as far as I know, it doesn't support hot plug.

31 minutes ago, rusTORK said:

I did so: when Windows loads to the desktop, I go to Device Manager and disable the GT 650M. But I can't select my eGPU in Graphics Settings. I can select an app and a performance type, but no graphics adapter or menu for it appears.

I have Windows 10 build 1803 (17134.286).

At the iO forum I saw that some people use an HDMI monitor emulator, but some don't. Also, one of the steps in their instructions is hot-plugging the eGPU, which isn't possible with the Ultrabay; as far as I know, it doesn't support hot plug.

 

You have to restart after disabling the GPU. I still have a hard time understanding what you want to do here; posting a screenshot would help.

Also, if you want to define which app should use which GPU, you can do that in the NVIDIA Control Panel as well.

I'm waiting for my new PSU to arrive; then I'll test the "Display Settings" option.

32 minutes ago, intruder said:

 

You have to restart after disabling the GPU. I still have a hard time understanding what you want to do here; posting a screenshot would help.

Also, if you want to define which app should use which GPU, you can do that in the NVIDIA Control Panel as well.

I'm waiting for my new PSU to arrive; then I'll test the "Display Settings" option.

The Yx10p models are different from the Yx00. The Yx10p has integrated Intel graphics, so it can easily use Optimus to accelerate the internal screen (which is wired to the integrated Intel GPU). The Yx00, however, has no integrated graphics (it is physically disabled), so some tricks are needed to accelerate the internal screen (any stable solution would be appreciated). Its internal screen is wired directly to the dGPU (GT 650M/750M), which makes Optimus impossible to use. (It seems the Win10 graphics settings won't do the trick, but I'm not sure.)

27 minutes ago, Swung Huang said:

The Yx10p models are different from the Yx00. The Yx10p has integrated Intel graphics, so it can easily use Optimus to accelerate the internal screen (which is wired to the integrated Intel GPU). The Yx00, however, has no integrated graphics (it is physically disabled), so some tricks are needed to accelerate the internal screen (any stable solution would be appreciated). Its internal screen is wired directly to the dGPU (GT 650M/750M), which makes Optimus impossible to use. (It seems the Win10 graphics settings won't do the trick, but I'm not sure.)

 

Ah, now I get it. I forgot he's using a Y500. I'll keep an eye out for solutions.


Solved the problem (kind of) by buying an HDMI monitor emulator. Now I can at least run games on the internal display (tested Deus Ex: Mankind Divided, DOOM (2016) and Far Cry 5; strange results with Fallout 4, but I'll download the latest version and test again).

 

Also, I ran a few benchmarks in AIDA64 and found the limitation:

 

My result on the laptop:

AIDA64_GPGPU_Benchmark_EVGA_GTX_1080_Ti_SC2.png.7b14da2ea3e3e50c611701653ae15556.png

 

The result I found with the same GPU (desktop):

AIDA64_GPGPU_Benchmark_EVGA_GTX_1080_Ti_SC2_Desktop.jpg.8d6c036399227441d927d8884675bdf9.jpg

 

You can clearly see the low "Memory Read" and "Memory Write" results. My guess is that it's the limit of PCIe 2.0 x8, which is exactly 3.2 GB/s.
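That ceiling lines up with the spec numbers; a small sanity check, assuming a rough ~20% protocol overhead (the exact efficiency varies with payload size, so this is a rule of thumb, not a measured constant):

```python
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding -> 0.5 GB/s raw per lane.
RAW_PER_LANE_GBPS = 5.0 * 8 / 10 / 8  # 0.5 GB/s

def effective_gbps(lanes: int, efficiency: float = 0.8) -> float:
    """Usable bandwidth after packet/protocol overhead (~20% assumed)."""
    return RAW_PER_LANE_GBPS * lanes * efficiency

print(RAW_PER_LANE_GBPS * 8)  # 4.0 GB/s raw for x8
print(effective_gbps(8))      # 3.2 GB/s usable, matching the AIDA64 ceiling
```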

5ba78b8467920_PCI-EExpressBandwidth.gif.cc404cbd1dd7a8c63e6468e405df6992.gif

Edited by rusTORK


That's 3.2 GB/s, not Gb/s, i.e. about 25.6 Gbit/s. So basically you have PCIe 3.0 x4 / Thunderbolt 3-class bandwidth. It ain't a big deal in my opinion, since your processor and old DDR3L memory speeds are going to be the primary bottlenecks.

Just use a 1440p external monitor to increase your performance.
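Since GB/s and Gb/s get mixed up easily, a one-line conversion makes the comparison explicit (pure unit arithmetic, nothing measured):

```python
def gbytes_to_gbits(gb_per_s: float) -> float:
    """Convert GB/s to Gbit/s (1 byte = 8 bits)."""
    return gb_per_s * 8.0

# 3.2 GB/s is about 25.6 Gbit/s, in the same ballpark as PCIe 3.0 x4.
print(gbytes_to_gbits(3.2))
```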

5 hours ago, Tesla said:

That's 3.2 GB/s, not Gb/s, i.e. about 25.6 Gbit/s.

Fixed. :)

 

5 hours ago, Tesla said:

It ain't a big deal in my opinion since your processor and old DDR3L Memory speeds are going to be the primary bottlenecks.

If I had PCIe 2.0 x16 or even PCIe 3.0 x16, the results should be better, right (with the same CPU and RAM)?

 

5 hours ago, Tesla said:

Just use a 1440p external monitor to increase your performance.

It won't boost Memory Read and Write, but yeah, I've seen tests and reviews at 1440p where the GTX 1080 Ti (in a desktop) did pretty well.

Maybe it will be a good part for the transition from laptop to desktop later.


Yeah, if you had 3.0 x16 it would certainly be better, but here's the caveat: you'll run into a CPU bottleneck before you can saturate 2.0 x8. It really depends on the task.

Gaming at a higher resolution puts more strain on the GPU itself, and using an external monitor frees up PCIe bandwidth, since the rendered frames no longer have to travel back over the link to the internal display.

9 hours ago, Tesla said:

Gaming at higher resolution would put more strain on the GPU itself and using an external monitor will increase the available PCI bandwidth.

You think 3.2 GB/s isn't a full load? Well, it's worth testing at least, but I don't have a spare 1440p monitor around (home or work). I need to think about it.

 

9 hours ago, Tesla said:

you will run into a cpu bottleneck before you can saturate the performance of 2.0 x8

Hmmm... Looks like there's a really small window. I need to run a few tests with CPU and GPU loads on screen to be sure.

I tested the Valley benchmark yesterday and couldn't run it in fullscreen mode on the internal monitor, only windowed. I'm not sure those results will be valid.

Also, I'm looking forward to getting 3DMark and running it too.



Hey guys. I'm having a weird issue with my power supply.

I was using a cheap Artis 500W PSU (borrowed from a friend) with my 1070 Mini, which needs an additional 8-pin PCIe connector, and everything was working great. The card was drawing up to 160W.

My friend needed that PSU back, so I returned it and bought a Corsair VS450. Now the card is detected and I can play some less demanding games, but when the 1070's power draw reaches 110W, the PSU shuts off. I don't know what to do now, as it's the only PSU I have.

I tested both the graphics card and the PSU together in a desktop and everything works fine.

The difference between the two PSUs is that the 500W one had multiple rails, whereas the VS450 has a single rail.

Any ideas what I can do now?

19 hours ago, intruder said:

My friend needed that PSU back, so I returned it and bought a Corsair VS450. Now the card is detected and I can play some less demanding games, but when the 1070's power draw reaches 110W, the PSU shuts off.

It's probably the PSU's protection kicking in. Since you're loading it with the GPU only, it sees a high draw on the +12V rail and shuts down.

In a desktop you connect all the components (fans, CPU, GPU, memory, HDD, SSD, etc.), so the PSU works normally because all the rails (+3.3V and +5V as well) are loaded.

Search for your PSU in Google ("Corsair VS450 shut down"); there are a lot of similar cases.

Edited by rusTORK


