Showing results for tags 'nvidia'.

  1. I had the privilege of retrieving a UEFI-compatible VBIOS for the GTX 670 Ti, P/N 03G-P4-3663-KR, and have uploaded the file. Unzip it and run update.exe to start the flashing process. Enjoy; the contents came directly from EVGA. gtx670ti.zip
  2. Version 1.0.0

    79 downloads

    Developed by Orbmu2k, NVIDIA Inspector is an invaluable tool for NVIDIA GPUs with a variety of features, including but not limited to the following.

    Feature overview:
      • Overclocking - set the GPU base clock, memory clock, power and temperature target, voltage offset and fan speed.
      • Profiles - create per-GPU overclocking and fan speed profiles.
      • NVIDIA profile manipulation - modify NVIDIA game profiles.

    Version 1.9.7.3 changelog:
      • updated setting constants up to driver r343
      • updated custom settings override
      • reported default value for AA gamma correction and shader cache
      • fixed donation link
      • show memory vendor short name after memory type
      • you can now add the "-silent" param after the .nip file for silent profile import (a scripted example follows this entry)
      • creating startup tasks will add a 30 sec task delay
      • creating separated startup tasks for each perf level
      • fixed default values for non-P0-state OC controls
      • fixed MDPS apps drag & drop from non-elevated desktop
      • fixed ROP calculation for Maxwell 2 cards

    Screenshots:
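    For reference, the "-silent" profile import mentioned in the changelog can be driven from a script. The sketch below is a minimal Python example, not an official workflow: the install path and profile name are hypothetical placeholders, and only the "-silent" switch after the .nip file is taken from the changelog above.

        # Minimal sketch: import an NVIDIA Inspector profile silently from a script.
        # The paths below are hypothetical placeholders; only "<profile>.nip -silent"
        # comes from the NVIDIA Inspector 1.9.7.3 changelog above.
        import subprocess
        from pathlib import Path

        INSPECTOR = Path(r"C:\Tools\nvidiaInspector\nvidiaInspector.exe")  # adjust to your install location
        PROFILE = Path(r"C:\Tools\nvidiaInspector\game.nip")               # a profile you exported earlier

        if INSPECTOR.exists() and PROFILE.exists():
            # Equivalent to running: nvidiaInspector.exe game.nip -silent
            subprocess.run([str(INSPECTOR), str(PROFILE), "-silent"], check=True)
        else:
            print("Adjust the INSPECTOR and PROFILE paths before running this sketch.")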
  3. If you're subscribed to NVIDIA's newsletter and are a fan of Ubisoft's upcoming game The Division, you'll be pleased to find an e-mail in your inbox from NVIDIA handing out closed beta access keys for the game. The closed beta starts on January 28th for Xbox One players and January 29th for PC and PS4 owners, and the only ways to get guaranteed access are pre-ordering the game or receiving a code from NVIDIA. According to the game's FAQ, the purpose of this beta is to test the servers and make sure the game is balanced. The Division is an upcoming military-themed RPG set in New York City after a pandemic has swept across the US, creating chaos throughout society. You play as a Division agent roaming a persistent open-world New York, exploring, fighting both other players and AI, and progressing your character. Amazon has a 30-minute video called Tom Clancy's The Division: Agent Origins which gives a nice background on the story. You can find more information on the closed beta here: http://tomclancy-thedivision.ubi.com/game/en-us/beta/
  4. Intel's Gregory Bryant, vice president and general manager of Intel's desktop clients platform, went on record during a speech at the J.P. Morgan forum saying that the company's IGPs (integrated graphics processors), Iris and Iris Pro, are fast enough for casual and mainstream gamers, who would no longer need a discrete graphics solution. That statement in itself does not sound unreasonable or outlandish, as Intel IGP performance has steadily increased over the years and eaten into AMD's and NVIDIA's low-end share. However, Mr. Bryant also stated that Iris and Iris Pro can outperform 80% of discrete graphics chips: "We have improved graphics 30 times what they were five years ago," while admitting that Intel has done a poor job communicating the benefits of integrated graphics. According to Steam's hardware survey, as of December 2015 Intel holds 18.66% of the overall share, with 54.61% going to NVIDIA and 26.23% to AMD. This market share is virtually unchanged from December 2014, when Intel had a share of 18.88%, so it seems they do have some work to do if they want to increase their appeal to gamers. Unlike NVIDIA, AMD manufactures APUs that compete with Intel's IGP solutions, but with the release of the Iris Pro 6200, Intel has taken a significant lead over AMD and has even approached the performance of NVIDIA's discrete GeForce GTX 750 at the entry level. With AMD's Zen APUs possibly being released in 2017, the firm may finally get the opportunity to take back low-end APU performance from Intel. Source: PC World
  5. NVIDIA has released the GeForce 361.60 hotfix driver, which addresses crashes in Photoshop and Illustrator as well as installation and clocking-related issues. You can grab the driver from the links below. Windows 7 and 8: 64-bit Version / 32-bit Version. Windows 10: 64-bit - Win10 / 32-bit - Win 10
  6. The essence of the problem: I have a Lenovo X230t tablet laptop (12.5" HD touch, i7-3520M 2.9 GHz, 16 GB RAM, 250 GB SSD, Windows 7 Pro, model number X230t). I bought a Gigabyte NVIDIA GTX 970 video card and, for it, an EXP GDC Beast external graphics adapter. I installed the drivers supplied with the video card, then downloaded the newest drivers and reinstalled them. Windows Device Manager sees both the integrated and the discrete card and reports them as working. After installing the driver, two settings panels appeared: Gigabyte OC Guru and GeForce Experience. And then the problems began. Launching the NVIDIA settings panel gives the error "crash nvidia control panel application 8.0.760.0". Problem signature:
     Problem Event Name: APPCRASH
     Application Name: nvcplui.exe
     Application Version: 8.0.760.0
     Application Timestamp: 5414b78f
     Fault Module Name: nvcplui.exe
     Fault Module Version: 8.0.760.0
     Fault Module Timestamp: 5414b78f
     Exception Code: 40000015
     Exception Offset: 00000000001c86b5
     OS Version: 6.1.7601.2.1.0.256.48
     Locale ID: 1049
     Additional Information 1: 2c72
     Additional Information 2: 2c723d80b9da5141cfaf23219eceb812
     Additional Information 3: 3c38
     Additional Information 4: 3c38e557a5c0fe5d129254087b723f77
     Clicking "Check online for a solution" just closes the window, and then silence. The other two utilities, Gigabyte OC Guru and GeForce Experience, do not react when launched. What should I do? Please advise and help.
  7. A little more than a year ago, NVIDIA, one of the largest graphics processing unit (GPU) companies in the world, claimed Samsung infringed on three of its core patents and asked the ITC to ban Samsung smartphones and tablets that used Samsung's Exynos SoC (system on chip) and Qualcomm's Snapdragon SoC. However, an ITC administrative law judge ruled that Samsung and Qualcomm did not infringe on two of NVIDIA's patents and declared the third, which they did infringe, to be invalid. After the case went to the full ITC commission, it upheld the administrative law judge's ruling in favor of Samsung. In turn, Samsung counter-sued NVIDIA, claiming that it had violated three of Samsung's patents, specifically 6,147,385, 6,173,349 and 7,804,734, which date back to the 1990s and cover implementations of SRAM. Now an ITC administrative law judge (ALJ) has found that NVIDIA did violate those patents, and the case is set to go before the full ITC commission. NVIDIA argues that the patents Samsung used in its countersuit are outdated and no longer used in modern designs: "We look forward to seeking review by the full ITC which will decide this case several months from now." One of the three patents is set to expire in 2016. NVIDIA, despite being the world leader in visual computing on the desktop, has not had much success replicating that dominance in mobile designs with its Tegra SoC, and has since moved on to using its technology in other products and applications such as the Drive PX self-driving platform and its consumer SHIELD Android-based gaming box. Sources: Seeking Alpha, Anandtech
  8. - Extract the drivers with 7-Zip: 3xx.xx-notebook-win8-win7-64bit-international-whql\Display.Driver

     - Display.Driver folder, OEM INF list:
       nvaci.inf   Acer, Gateway
       nvami.inf   Asus
       nvaoi.inf   Apple
       nvbli.inf   HP
       nvcti.inf   Compal
       nvcvi.inf   Clevo
       nvdmi.inf   Dell
       nvfmi.inf   Fujitsu
       nvfui.inf   Siemens
       nvhmi.inf   HP
       nvloi.inf   LG
       nvlti.inf   Lenovo
       NVMIi.inf   MSI
       nvqni.inf   NEC
       nvszci.inf  Sony
       nvtdi.inf   Toshiba Qosmio
       nvtsi.inf   Toshiba

     - INF file structure:

       ; NVIDIA Windows (64 bit) Display INF file
       ; Copyright © NVIDIA Corporation. All rights reserved.
       [Version]
       Signature   = "$Windows NT$"
       Provider    = %NVIDIA%
       ClassGUID   = {4D36E968-E325-11CE-BFC1-08002BE10318}
       Class       = Display
       DriverVer   = 10/23/2013, 9.18.13.3165
       CatalogFile = NV_DISP.CAT

       [Manufacturer]
       %NVIDIA_A% = NVIDIA_SetA_Devices,NTamd64.6.0,NTamd64.6.1,NTamd64.6.2,NTamd64.6.3
       (64-bit Windows versions: NTamd64.6.0 = Vista, NTamd64.6.1 = Win7, NTamd64.6.2 = Win8, NTamd64.6.3 = Win8.1)

       1 [NVIDIA_SetA_Devices.NTamd64.6.0]  (Vista)
       %NVIDIA_DEV.0407.01F1.1028% = Section004, PCI\VEN_10DE&DEV_0407&SUBSYS_01F11028
       %NVIDIA_DEV.0407.01F2.1028% = Section004, PCI\VEN_10DE&DEV_0407&SUBSYS_01F21028
       %NVIDIA_DEV.0407.0228.1028% = Section004, PCI\VEN_10DE&DEV_0407&SUBSYS_02281028

       2 [NVIDIA_SetA_Devices.NTamd64.6.1]  (Win7)
       %NVIDIA_DEV.0407.019C.1028% = Section001, PCI\VEN_10DE&DEV_0407&SUBSYS_019C1028
       %NVIDIA_DEV.0407.01F1.1028% = Section001, PCI\VEN_10DE&DEV_0407&SUBSYS_01F11028
       %NVIDIA_DEV.0407.01F2.1028% = Section001, PCI\VEN_10DE&DEV_0407&SUBSYS_01F21028

       3 [NVIDIA_SetA_Devices.NTamd64.6.2]  (Win8)
       %NVIDIA_DEV.0407.019C.1028% = Section002, PCI\VEN_10DE&DEV_0407&SUBSYS_019C1028
       %NVIDIA_DEV.0407.01F1.1028% = Section002, PCI\VEN_10DE&DEV_0407&SUBSYS_01F11028
       %NVIDIA_DEV.0407.01F2.1028% = Section002, PCI\VEN_10DE&DEV_0407&SUBSYS_01F21028

       4 [NVIDIA_SetA_Devices.NTamd64.6.3]  (Win8.1)
       %NVIDIA_DEV.0407.019C.1028% = Section003, PCI\VEN_10DE&DEV_0407&SUBSYS_019C1028
       %NVIDIA_DEV.0407.01F1.1028% = Section003, PCI\VEN_10DE&DEV_0407&SUBSYS_01F11028
       %NVIDIA_DEV.0407.01F2.1028% = Section003, PCI\VEN_10DE&DEV_0407&SUBSYS_01F21028

       5 [Strings]
       DiskID1 = "NVIDIA Windows (64 bit) Driver Library Installation Disk 1"
       NVIDIA = "NVIDIA"
       NVIDIA_A = "NVIDIA"
       NVIDIA_DEV.0407.019C.1028 = "NVIDIA GeForce 8600M GT "
       NVIDIA_DEV.0407.01F1.1028 = "NVIDIA GeForce 8600M GT"
       NVIDIA_DEV.0407.01F2.1028 = "NVIDIA GeForce 8600M GT "

     - nvdmi.inf (Dell) example: Alienware M17xR2 with a GTX 680M on Win7.
       Device Manager -> Display Adapters -> Details -> Device Description -> Hardware Ids:
       PCI\VEN_10DE&DEV_11A0&SUBSYS_043A1028  (10DE = NVIDIA, 11A0 = GTX 680M, 043A = M17xR2, 1028 = Dell)
       PCI\VEN_10DE&DEV_11A0&SUBSYS_05511028  = M17xR4/60Hz GTX 680M (reference)

     - Edit only the NTamd64 section that corresponds to your OS version, plus section 5 ("NVIDIA Windows (64 bit) Driver Library Installation Disk 1", i.e. [Strings]).
       Under 2 [NVIDIA_SetA_Devices.NTamd64.6.1] (Win7), search for
       %NVIDIA_DEV.11A0.0551.1028% = Section210, PCI\VEN_10DE&DEV_11A0&SUBSYS_05511028
       and replace 0551 with 043A.
       In 5 [Strings]:
       NVIDIA_DEV.11A0.0551.1028 = "NVIDIA GeForce GTX 680M "
       replace 0551 with 043A.

     - Notepad CTRL+H method (a scripted equivalent is sketched after this entry):
       Display.Driver folder -> open nvdmi.inf in Notepad -> CTRL+H -> Find what: 0551, Replace with: 043A -> Replace All -> Save.
       Then run setup.exe from C:\Nvidia...International\.

     - M17xR1 with GTX 260M/280M/285M:
       GTX 260M = 0618: %NVIDIA_DEV.0618.02A2.1028% = Section033, PCI\VEN_10DE&DEV_0618&SUBSYS_02A21028 -> replace 02A2 with 02A1.
       GTX 280M = 060A: %NVIDIA_DEV.0618.02A2.1028% = Section033, PCI\VEN_10DE&DEV_0618&SUBSYS_02A21028 -> replace 0618 / 02A2 with 060A / 02A1 respectively.
       In 5 [Strings]:
       NVIDIA_DEV.060A.02A1.1028 = "NVIDIA GeForce GTX 280M "
       NVIDIA_DEV.0618.02A1.1028 = "NVIDIA GeForce GTX 260M "

       Notepad CTRL+H method:
       GTX 260M: open nvdmi.inf -> CTRL+H -> Find what: 02A2, Replace with: 02A1 -> Replace All -> Save -> run setup.exe from C:\Nvidia...International\.
       GTX 280M:
       1- Open nvdmi.inf -> CTRL+H -> Find what: 02A2, Replace with: 02A1 -> Replace All.
       2- Find what: 0618, Replace with: 060A -> Replace All.
       3- Find what: 260M, Replace with: 280M -> Replace All -> Save.
       4- Run setup.exe from C:\Nvidia...International\.
       GTX 285M:
       1- Open nvdmi.inf -> CTRL+H -> Find what: 043A, Replace with: 02A1 -> Replace All -> Save.
       2- Run setup.exe from C:\Nvidia...International\.

     - Windows 8: disable driver signature enforcement.
       Command Prompt (Admin): Win key + X, then type
       bcdedit /set {current} testsigning yes
       -> "The operation completed successfully" -> reboot -> install the drivers.
       To exit test mode:
       bcdedit /set {current} testsigning no
       -> reboot.
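     The Notepad CTRL+H step above can also be scripted. The following is a minimal Python sketch, not part of the original guide: the extraction path is a hypothetical placeholder, the INF is assumed to be plain ANSI text as shipped, and the 0551 -> 043A substitution is the Dell M17xR2 GTX 680M example from above. It keeps a backup of the untouched nvdmi.inf.

         # Minimal sketch of the CTRL+H step: replace subsystem ID 0551 with 043A in nvdmi.inf.
         # The path is a hypothetical placeholder; point it at your extracted Display.Driver folder.
         import shutil
         from pathlib import Path

         inf_path = Path(r"C:\NVIDIA\International\Display.Driver\nvdmi.inf")  # adjust to your extraction path
         old_id, new_id = b"0551", b"043A"  # M17xR4 reference ID -> M17xR2 subsystem ID

         data = inf_path.read_bytes()                           # assumes an ANSI-encoded INF
         shutil.copy2(inf_path, inf_path.with_suffix(".bak"))   # keep the original as nvdmi.bak
         inf_path.write_bytes(data.replace(old_id, new_id))     # same effect as Notepad's Replace All
         print(f"Replaced {data.count(old_id)} occurrence(s) of 0551 with 043A.")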
  9. When Fallout 4 was first released, we saw a huge performance deficit between NVIDIA and AMD, and many attempted to blame GameWorks for AMD's performance issues with the game. Many (including myself) maintained that it was a matter of AMD drivers needing a revision to reach performance parity, rather than NVIDIA or Bethesda sabotaging AMD hardware. Now, ten days later, we have the following results, courtesy of OC3D: I hope this serves as a lesson that, when properly motivated, AMD can develop drivers that address performance deficits in DX11 games, and that it isn't always GameWorks' fault.
  10. Does anyone have? I'm currently experiencing problems with this card. I pulled it from an AW system; it couldn't display anything on an MSI system. I flashed an older VBIOS using a Clevo system and forgot to save the original VBIOS. I then installed it in the MSI system and installed Windows right away; it worked for about 10 minutes before the fans started spinning hard. I didn't want to kill the card, so I forced a shutdown. It now shows a Code 43 error on both the MSI and AW systems. I just want to rule out every possible option to determine what the problem is. :/
  11. I own a Gigabyte G1 Gaming edition of nVidia's massive new 980 Ti GPU. It's a great card as-is, but I do love to tinker, and I'm aware that it uses the same core (GM200, I believe) as the new Titan X. Aside from the 6 GB of GDDR5 versus the Titan's 12 GB, the only difference is that some of the CUDA cores are disabled on the 980 Ti, giving it 2,816 active cores instead of the full 3,072 on the Titan X. I'm curious whether it would be possible to flash a 980 Ti so it thinks it's a Titan X and would therefore use that additional block of previously disabled cores. I remember it was once very easy to do this with, say, the old ATI Radeon cards: one could buy a certain relatively cheap card, flash it to something better that used the same core, maybe even bolt on a better cooling fan and overclock, and that's all it took to build a far stronger rig for a bargain price. Those were the days, right? I'm sure there are safeguards in place now to make this more difficult, because companies like nVidia don't like to lose money, and the difference in video RAM might require a "hybrid" modded VBIOS, but do you think it could be done? What obstacles would have to be overcome? Where might I start in researching this further?
  12. This game has gotten pretty high reviews, and deservedly so: I just finished it (high chaos, evil ending) and I loved every minute of playing it. I'm not the type to finish SP games either; I usually get bored and quit (e.g. Skyrim, Witcher 2, etc.), but this game kept me intrigued the entire time, which says a lot about it. The rune system works really well, and the relationships you build with the characters in the game determine how everything plays out. I hope they make a sequel, because I think this game is up there with the Half-Life series in how fun it is. Some spoilers: My biggest complaint with Crossfire 7970M (12.8 WHQL driver) is that the UE3 texture streaming causes a lot of hitches. I heard this problem is mainly with AMD cards, but I'll investigate further when my 680M SLI cards get here on Tuesday. I played with maxed in-game settings plus SSAO and enhanced AA enabled in the AMD control panel. Framerate was generally 60+ the entire time, except where texture streaming took place (mainly at the beginning of a level) and caused hitching. My overall rating for the game: Fun factor: 10/10, Gameplay/Controls: 9/10, Replayability: 9/10 (multiple endings), Graphics: 8.5/10, Sound/Music: 9/10. Reviews: Metacritic overall: 91, IGN: 92, PC Gamer: 92, Gamespot: 90, Gamespy: 90, Edge Magazine: 90, Game Informer: 88
  13. Reading this NVIDIA news, I was wondering whether cloud GPUs could in the future become a way to overcome the lack of external interfaces for eGPUs. Right now NVIDIA's cloud GPU offering appears to be limited to gaming (with only NVIDIA hardware as the client), but I'm curious whether anyone has tested a GPU cloud service provider with not just a game but a 3D application.
  14. When I run 3DMark Vantage or play Battlefield Hardline, the applications close if I have PhysX set to GPU. If I move it to CPU, everything works properly. Why is this so? Thanks in advance!
  15. Officially, the Asus Maximus V Formula doesn't support Tri-SLI, just Tri-CrossfireX and 2-way SLI. The reason, I guess, is that nvidia can't officially utilize x4 speeds (e.g. the board does 8x/8x). The Extreme version of the Maximus V supports Tri-SLI, but that board is about $400+. One way around it is a program called HyperSLI; it's a simple install-and-click program. Just make sure you have a processor with VT-x (although software emulation is also possible) and 2 mini bridges + 1 long bridge to connect the 3 cards. In my case, I connected 3 GTX 680 cards (2x 680 SC Signature 2 + 1x 680 SC, all EVGA). The lack of bandwidth doesn't seem to hurt performance much in games or benchmarks on a single monitor at 2560x1440; with a triple display, things may change. 3DMark 11 Performance: NVIDIA GeForce GTX 680 video card benchmark result - Intel Core i7-3770K Processor, ASUSTeK COMPUTER INC. MAXIMUS V FORMULA, score: P21377. 3DMark 11 Extreme: NVIDIA GeForce GTX 680 video card benchmark result - Intel Core i7-3770K Processor, ASUSTeK COMPUTER INC. MAXIMUS V FORMULA, score: X10444. Games: Borderlands 2: everything maxed out @ 1440p, the fps never dips below 130. BF3: 4xAA, game maxed out @ 1440p: https://dl.dropbox.com/u/51380793/bf3_2013_04_05_00_27_53_326.jpg BF3: 8x CSAA enabled in the control panel with everything maxed in-game: https://dl.dropbox.com/u/51380793/bf3_2013_04_05_00_48_38_505.jpg PICS
  16. Hi all. I felt I had to post this info. Today I installed the Windows 10 preview again on my P370SM with 680M SLI. I wanted to try out a driver that "Cyris" posted on a Guru3D forum (349.72); here is the thread: windows 10 directx driver 349.65 windows update - Page 3 - Guru3D.com Forums. Here is the driver I downloaded: 349.65-desktop-win10-64bit-international-beta. It worked OK. When I went to set the NVIDIA control panel to my preferred settings, there was an option for "DSR". I tried it and it seemed to work. I have not used DSR before, but I definitely felt that it was working. BTW, I had to add my GPU hardware ID to the "nv_dispiwu.inf" file. Cheers! : )
  17. Picked up a couple of EVGA GTX 980 SCs with ACX 2.0 coolers, and compared to the Titans they are replacing, the temperatures are WAY better (Maxwell FTW), as is the operating frequency. It's pretty awesome how far NVIDIA has taken 28nm, and they aren't done yet: the big Maxwell (GM200) will probably arrive in 2 months in the form of a Titan 2, and then probably a GTX 1080i in June or so (I'll probably grab that). Anyhow, I added a side fan to the computer door to help exhaust heat, since these are open-air coolers and therefore dump heat into the case. Temps and OC are fantastic as I mentioned, with my OC so far hitting 1520 MHz on stock voltage and max temps of 75C and 63C respectively. These cards idle at 40C and 34C each, so I'm really happy about that. I'll probably do a run of 3DMark in a few minutes just to see how they do, but I'm not a hardcore benchmarker; I prefer to test the results of my overclock in games I actually play, then crank their settings up and see how much the OC benefits me. Here's my setup: Benchmarks: 3DMark, GTX Titan SC SLI vs 980 SC SLI (980s are on STOCK voltage): Result. Stock vbios, +130 core (1571 MHz core / 1853 MHz memory), +12 mV voltage, 120%. Temps never got above 69C on GPU 1; the second card barely hit 60C. My 3DMark run: 19155
  18. Hi guys, my Lenovo Y410p has recently been getting "Event ID 13 from source nvlddmkm cannot be found" when playing new games (I tried Tomb Raider, Sniper Elite 3, Shadow of Mordor and Watch Dogs, and they all give the same error). I already asked NVIDIA tech support; they told me to reinstall the driver, but that didn't work. My friend has a Y400, which doesn't have Optimus activated, and he plays games just fine and has never received this error. If I unlock the BIOS, can I disable Optimus? I'm wondering whether you guys have the same problem as me. Please help, I really appreciate your support. :)
  19. Hi all, I've been playing Watch Dogs like crazy lately, though it is increasingly coming up with stutters and sometimes even crashes, saying that the NVIDIA display driver crashed. I'm running the game on an MSI GT70 on Windows 8 with 8 GB RAM and a stock GTX 680M, using NVIDIA's recommended settings. The game runs off an SSD. I'm kind of new to PC gaming, so I'm looking for advice on PC/GPU performance mods or even game config settings. Thanks
  20. A new OS X 10.9.5 update has just been released. If you update to the latest version, you will lose compatibility with the old OS X NVIDIA web driver. Do the following to re-enable your driver with the new update and the new NVIDIA web driver: 1. Update to 10.9.5. 2. Download and install NVIDIA CUDA 6.5.18 from here: http://us.download.nvidia.com/Mac/Qu...4.01.03f01.pkg You will not be able to install the web driver because it will not recognize your hardware; [EDIT] this step can be skipped if you downloaded a modded package. Follow the steps here on how to skip the hardware verification.
  21. Hi, I am new here, but I have been reading the forums for a while, and I have just worked up the nerve to do my own eGPU. I am still confused about some things, so I will ask to clarify them even though I have read through the forums; I just can't seem to pin down with confidence that I know what I need to know. So here goes. My system is a Dell XPS L502x: 1080p monitor, 8 GB RAM, 1 TB HDD, Blu-ray drive, HDMI, an mPCIe slot (which I think houses my Intel a/b/g/n dual-band wireless card), a mini-DisplayPort, 2 USB 3.0 ports, a card reader, an NVIDIA GeForce 525M discrete GPU with Intel HD 3000 integrated graphics, and an Intel i7-2670QM (2.2 GHz, up to 3.1 GHz, quad core). So here is my question: I have a GTX 760 to use for the eGPU, I will be ordering a PE4L V2.1b + PE4L-PM060A to adapt the mPCIe slot, and I have a 520-watt Insignia PC power supply to power it. Will these components work with my setup? Is this everything I need? I really do not know what I am doing here, but I do everything myself, so I'm determined to do this as well. I think with the help of the genius minds I have been following on this forum I can do it. You people are very smart and helpful. Please let me know if I am totally off base with this setup. Thanks!
  22. Hi guys, I've got an M18x R1, and I've just swapped out two 6990Ms for a pair of 780Ms. I flashed the BIOS to unlocked A05 before the swap, and the change seemed to go well... until I tried turning it on. The screen stays black, and all I'm getting is two short beeps. I've taken the RAM out, swapped the order, etc., but to no avail. Any idea what the issue could be? Thanks for any help, Mark.
  23. GEFORCE GTX 780 - MSRP: $650

     GTX 780 GPU Engine Specs:
       • CUDA Cores: 2304
       • Base Clock: 863 MHz
       • Boost Clock: 900 MHz
       • Texture Fill Rate: 160.5 billion/sec

     GTX 780 Memory Specs:
       • Memory Clock: 6.0 Gbps
       • Standard Memory Config: 3072 MB
       • Memory Interface: GDDR5
       • Memory Interface Width: 384-bit
       • Memory Bandwidth: 288.4 GB/sec (a quick sanity check of this figure follows this entry)

     Confirmed reviews:
     http://www.anandtech.com/show/6973/n...gtx-780-review
     http://www.tomshardware.com/reviews/...view,3516.html
     http://www.guru3d.com/articles_pages..._review,1.html
     http://www.techpowerup.com/reviews/N...Force_GTX_780/
     http://www.computerbase.de/artikel/g...x-780-im-test/
     http://techreport.com/review/24832/n...-card-reviewed
     http://hothardware.com/Reviews/NVIDI...TX-780-Review/
     http://videocardz.com/41960/first-ge...ichill-exposed
     http://www.hardocp.com/article/2013/...w#.UZ4UmlaN3Yc
     http://www.hardwarecanucks.com/forum...80-review.html
     http://www.hardware.fr/articles/894-...u-presque.html

     Useful Applications & Bios Mods
     Bios mods
     Applications
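     As a side note, the memory bandwidth figure above follows directly from the per-pin data rate and the bus width. A quick check in Python, assuming the exact effective rate behind the rounded 6.0 Gbps figure is 6008 MHz:

         # Bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte.
         effective_rate_gbps = 6.008   # assumed exact effective rate behind the rounded "6.0 Gbps"
         bus_width_bits = 384
         bandwidth_gb_s = effective_rate_gbps * bus_width_bits / 8
         print(f"{bandwidth_gb_s:.1f} GB/sec")  # prints 288.4, matching the spec list above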
  24. Hey guys, I recently came upon something that I thought I would share for others out there with the same issue. Basically, I've had my eGPU setup for months and most of the games I play have been working perfectly (DX11, 1080p, 60+ fps). Some of these games include WoW, Bad Company 2, and other fairly recent titles. However, some older games have not worked properly this whole time, in particular CoD 2/4/5. Even at 1600x900 with no AA, etc., I was struggling to maintain a stable 20+ FPS, which confused me because they are less demanding than the other games that ran perfectly fine at max settings. Anyway, I seem to have finally tracked down the problem, and it appears to be limited to the Call of Duty games I listed running off an eGPU setup. To investigate, I went into the Nvidia control panel and noticed the PhysX configuration was set to auto-select, with my Core i5 and 650 Ti both visible. I manually set it to the 650 Ti and voila! All my CoD games run perfectly maxed out @ 1080p! I figured auto-select was the better option, thinking the software would choose whichever option gives better performance, but it seems you get better performance overall if you manually specify to only use the eGPU for PhysX. My $0.02.
  25. I was wondering whether the Y410p supports Optimus technology? I heard it just switches to the iGPU when on battery and to the NVIDIA GPU when plugged in?