Posts posted by juquinha

  1. Hi! :D

    So, I've got a strange situation here. I have an old Asus ROG G751JM laptop connected to an old Samsung monitor (model S20C301L) through a VGA connection. :victorious:
    The monitor supports both DVI-D and VGA inputs, while my laptop can output through VGA, HDMI, or Thunderbolt/Mini DisplayPort. I wanted to upgrade to a digital signal, so I bought a Thunderbolt/Mini DisplayPort to DVI cable.

    At first, it was a nightmare. There was absolutely nothing I could do to get it to work. I spent hours troubleshooting, but the monitor could not detect any signal over DVI, only the VGA one. But then this happened.
    I was testing with only one connection at a time and got tired of plugging and unplugging the cables each time. So I had the idea to keep both cables plugged in and just switch the input source in the monitor's menu whenever I wanted to. So, with the TB/MiniDP cable already connected, I then plugged in the VGA cable (while DVI was selected as the input source). The display started flashing, and after about 2 seconds, voilà, it was working. And I was like, "what the f***???" :S I then unplugged the VGA cable and the signal was lost. Plugged it back in, and the digital signal came back.
    Well, I'm happy that it's working now, but I simply don't understand what's happening here. Why do I need both connections simultaneously to make it work? Will the same thing happen if I use an HDMI to DVI cable?
    Fun fact: the image only appears after Windows starts up. So if I want to enter the BIOS setup, for example, I need to switch back to VGA as the input source.

    Sorry for my English.

     

    Update: I can't run games with it; I get a BSOD. :Banane13:
