Need help modifying the BIOS of an Asus G75VW to use a GTX 670MX...
Hello everybody. I want to put my GTX 670MX in my Asus G75VW, but at boot I get a black screen. The graphics card works properly in a G75VX.
Is there somebody who can help me with the process, or do it for me? Please PM me. Dido
I have tried to update the NVIDIA drivers on my MSI GE72VR 6RF Apache Pro notebook. The only drivers that work are the ones provided by MSI. The problem is that those haven't been updated since 2016, and I can't play newer games.
Some say you need to mod your GTX 1060 drivers, which I have no idea how to do; others say you need to update the flags in the driver's INF files. So please help me.
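From what I've gathered, the INF mod means adding the GPU's hardware ID to the INF file of a newer driver package so the installer will accept it. A minimal sketch of what that edit might look like (the section name and the subsystem ID below are illustrative guesses, not confirmed values for my machine; the real IDs show up under Device Manager > Display adapters > Details > Hardware Ids):

```
; Illustrative excerpt from a modded nv_dispi.inf. Section names vary between
; driver releases, so match whatever the package actually uses.
; VEN_10DE = NVIDIA, DEV_1C20 = GTX 1060 Mobile, 1462 = MSI's PCI vendor ID;
; the SUBSYS device value 11E8 here is a placeholder to replace with yours.

[NVIDIA_Devices.NTamd64.10.0]
%NVIDIA_DEV.1C20.11E8.1462% = Section001, PCI\VEN_10DE&DEV_1C20&SUBSYS_11E81462

[Strings]
NVIDIA_DEV.1C20.11E8.1462 = "NVIDIA GeForce GTX 1060"
```

As I understand it, editing the INF breaks the driver package's signature, so Windows will only install it with driver signature enforcement temporarily disabled.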
Thanks for your time.
I have an Asus X550CC notebook.
A few weeks ago the screen went black, though the backlight still works (the screen is black, but I can see brightness changes). I couldn't figure out the root of the problem: it happened suddenly, and the laptop works fine with an external monitor.
Recently I was trying to install an SSD in a caddy. I went into the BIOS to change the boot order but couldn't find the SSD, so I disabled the Launch CSM option, hoping the drive would be recognized. Since then I can't access the BIOS at all, and startup is very slow, with dxgkrnl.sys, wdfilter.sys, and CAD.sys shown as not loaded in the boot log.
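(For reference, that boot log comes from Windows boot logging, which I enabled from an elevated command prompt; these are standard bcdedit commands, nothing specific to this laptop:)

```
:: Enable boot logging for the current boot entry (run as administrator)
bcdedit /set {current} bootlog Yes

:: After the next boot, the log is written to C:\Windows\ntbtlog.txt;
:: drivers that failed to load appear there as BOOTLOG_NOT_LOADED lines.

:: Disable boot logging again when done
bcdedit /set {current} bootlog No
```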
I tried to:
- Remove the CMOS battery for 8 hours
- Short JRST2001 and JRST2002
- Access the BIOS through Windows 10 troubleshooting
None of the above worked for me.
The laptop does not restart or shut down correctly: it gets stuck somewhere, and I have to press the ON/OFF button again to shut it down completely. The only thing I can see on the external monitor during startup is the Windows 10 loading screen (I can't see the Asus loading screen), and it takes more than 5 minutes to get there (it seems like a boot loop).
I hope someone has the answer, because I'm running out of solutions.
Thanks in advance, everyone!
So, I've got a strange situation here. I have an old Asus ROG G751JM laptop connected to an old Samsung monitor (model S20C301L) through a VGA connection.
This monitor supports both DVI-D and VGA inputs, while my laptop can output through VGA, HDMI, or Thunderbolt/Mini DisplayPort. I wanted to upgrade to a digital signal, so I bought a Thunderbolt/Mini DisplayPort to DVI cable.
At first, it was a nightmare. There was absolutely nothing I could do to get it to work. I spent hours troubleshooting, but the monitor just could not detect any signal over DVI, only VGA. But then this happened.
I was testing with only one connection at a time and got tired of plugging and unplugging the cables. So I had the idea to keep both cables plugged in and change the input source in the monitor's menu whenever I wanted. With the TB/MiniDP cable already connected, I then connected the VGA cable (while DVI was selected as the input source). The display started flashing, and after 2 seconds, voilà, it was working. And I was like, "what the f***???" I then unplugged the VGA cable and the signal was lost; plugged it in again and the digital signal came back.
Well, I'm happy that it's working now, but I simply don't understand what's happening here. Why do I need both connections simultaneously to make it work? Will that still happen if I use an HDMI to DVI cable?
Fun fact: the image appears only after Windows starts. So if I want to enter the BIOS setup, for example, I need to switch back to VGA as the input source.
Sorry for my English.
Update: I can't run games with it; I get a BSOD.