
14" Dell Latitude E6430 - Performance Upgrades and System Mods



Hello! I have some interesting news. Starting with the 368.22 Nvidia driver (note: I'm not sure it's the first driver to do so), my GTX 660 is automatically detected on ExpressCard plug-in without the HDMI cable (Windows 10).


- iGPU is still enabled
- External LCD linked through the ExpressCard eGPU
- eGPU is in Optimus mode
- All work is redirected to the eGPU (if you say so in the Nvidia control panel)
- eGPU keeps the PCIe 2.0 link (before, I was not able to keep it and had to force 1.1)
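If you want to double-check the negotiated link from the command line instead of GPU-Z, the nvidia-smi tool bundled with the driver can query it. A quick sketch, assuming nvidia-smi is on your PATH:

nvidia-smi --query-gpu=name,pcie.link.gen.current,pcie.link.width.current --format=csv

This prints the PCIe generation and lane width the card is currently negotiating, so you can confirm the Gen2 link is holding.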

 

Edited by Dewos

Power supplies don't cause throttling. Your laptop won't sense that it is at the PSU's power limit and throttle itself; instead, power supplies just turn off if their current rating is exceeded. When the power supply is too weak, it's not so much that your laptop is throttling as that it is running on battery.

 

Some power supplies automatically turn back on after a few seconds, but most stay off until you unplug them from the wall and plug them back in.

 

Your problem is the CPU's low default VID table. You need to set the "additional turbo voltage" in the IFR to raise the voltage. I can't remember whether Latitudes have access to this, though.
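Purely as a hypothetical illustration of what that looks like with the grub setup_var method used later in this thread (the 0x1A4 offset below is made up; extract your own BIOS and search the IFR dump for the real "Additional Turbo Voltage" entry first):

setup_var 0x1A4        # read and note the current value first
setup_var 0x1A4 0x32   # example write; step size and units depend on the BIOS, so verify against your IFR dump

Change one variable at a time so you can revert if the machine misbehaves.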

Is it possible to modify the FSB or BCLK with an IFR value instead of using XTU? Thanks.


 

 

Is it possible to modify the FSB or BCLK with an IFR value instead of using XTU? Thanks.

 

 

Yes, look at page 3 near the end... but first you need to unlock the flash descriptor.

Already unlocked, so I'll do that, thanks.

About your request: the iGPU-only cooler is too weak to cool an overclocked quad core; the 3720QM at 3.9 GHz is already too much. I've modded the cooler from the dGPU model, but it isn't easy work. So I think it's better to buy a cheaper 3720QM instead. Bye.


After a lot of work, I've modded the original cooler of my iGPU-only E6430 by combining it with the dGPU model's cooler.

I desoldered the copper heatpipes from the dGPU model and soldered them onto the CPU plate of the iGPU-only model; the plate must be machined to fit two heatpipes in place of one.

During soldering, the opposite end of the heatpipes must be submerged in water, or it will desolder and everything will be ruined.

The necessary temperature was reached with a hot air gun set to maximum (around 800°C).

The aluminum tape is not strictly necessary; it's only there to avoid small air leaks. Everything is soldered copper to copper.

Here are the pictures of the final result.

[Five photos of the modded cooler]

Edited by aldimeola81

2 weeks later...

Hi, I just bought a Dell Latitude E6430 with an NVS 5200M and its performance is really poor: only 2300 points in 3DMark06, while I'm getting 4900 points on the integrated HD 4000. Updating the video drivers and the system and switching off Optimus in the BIOS doesn't change anything. According to many tests found on the internet, it should score a minimum of 7300 points on the NVS 5200M, but I can't find out what's wrong. Same problem on Windows 7 and after updating to Windows 10; the integrated video card is much faster than the Nvidia... Any suggestions?

Edited by matx86

1 hour ago, matx86 said:

Hi, I just bought a Dell Latitude E6430 with an NVS 5200M and its performance is really poor: only 2300 points in 3DMark06, while I'm getting 4900 points on the integrated HD 4000. Updating the video drivers and the system and switching off Optimus in the BIOS doesn't change anything. According to many tests found on the internet, it should score a minimum of 7300 points on the NVS 5200M, but I can't find out what's wrong. Same problem on Windows 7 and after updating to Windows 10; the integrated video card is much faster than the Nvidia... Any suggestions?

 

This can happen if you're using an unrecognized power supply. Do you get power adapter warnings on startup, or did you disable them? Disabling the warnings does nothing to prevent the resulting throttling.


On 22.06.2016 at 6:13 AM, Khenglish said:

 

This can happen if you're using an unrecognized power supply. Do you get power adapter warnings on startup, or did you disable them? Disabling the warnings does nothing to prevent the resulting throttling.

 

I bought the laptop with pre-installed Windows 7 and updated it to Windows 10 after the tests. I didn't see any warnings about the power adapter, but now I will do a clean install of Windows 10 and check; thanks for the advice.

 

Edit: I've checked the power supply and it's an original Dell unit. After performing a clean installation of Windows 10, the same problem: around 2300 points in 3DMark06 using the NVS 5200M with driver version 353.62 (any older driver gives the same result), and no warnings about the power supply on startup or during use.

Edited by matx86

I'm planning to upgrade my MSI GTX 960 4G Gaming to something better. I'm using a Dell DA-2 supply; what's the best card I can buy for around 400€? Thanks.

 

Since it's an eGPU system, I'd prefer a video card with a backplate, so I've been looking at the Zotac GTX 980... is it too big for the Dell power supply? Thanks.

Edited by aldimeola81

Wow this laptop is full of wonders :)

I was able to boot and even run some tests with single-channel RAM set to 2400 MHz.

 

So far I have never had problems with dual channel set at 2133 MHz @ 1.35 V directly as JEDEC (2x Crucial 8GB @ 1600 MHz, CT102464BF160B.C16), the cheapest RAM I could find...

 

I found some XMP profiles set to 2400 and decided to push a little further.
Single channel is able to run some tests but gives blue screens and fails after a few seconds of Prime95.

Dual channel does not boot
 

Tried 2400 MHz (1.65 V / 1.5 V) with these timings:

@11 13 13 31 50 361 7 16 9 9 

@12 13 13 31 50 361 7 16 9 9 

@13 13 13 31 50 361 7 16 9 9 
@11 11 11 11 50 361 7 16 9 9 

 

All fail Prime95... I played with other timings but no luck.

 

BTW, the CPU runs stable at 4x 3.8 GHz (3720QM QBC1); for higher frequencies it needs more voltage :( (and, as timohour & Khenglish pointed out, the voltage cannot be altered).

Too bad; I bet this CPU could do 4.5 GHz.

Can't wait to buy a video card and play with the eGPU once again (I had one for a while, but sold it shortly after I bought this laptop).

[Screenshots: 2400fail.png, read_2400mhz.png, ups.png]


@timohour   

 

I'm using the PE4C v3.0 with a single PCIe 2.0 lane, but I would like to use an x2 2.0 link like the PE4C 2.1 does. Is it possible to solder the additional lane onto the PE4C v3.0 PCB? Does someone have the schematics, or has anyone made a DIY adapter? I have no problem with soldering or doing hardware mods, but I need to know how to add the second lane. Thanks.



@timohour   

 

I'm using the PE4C v3.0 with a single PCIe 2.0 lane, but I would like to use an x2 2.0 link like the PE4C 2.1 does. Is it possible to solder the additional lane onto the PE4C v3.0 PCB? Does someone have the schematics, or has anyone made a DIY adapter? I have no problem with soldering or doing hardware mods, but I need to know how to add the second lane. Thanks.

Yes you can... you just have to use your internal mPCIe ports. You can't use the ExpressCard slot for an x2 2.0 link.

If you have an mPCIe adapter for the PE4C v3.0, I can guide you through soldering the 2nd lane's signals.

BTW, your cooling mod rocks.


@timohour

 

Thanks for the compliments.

 

I want to add the second lane using an internal mPCIe port. But have I understood correctly? Will I use only one mPCIe port to obtain the x2 2.0 connection, and not the ExpressCard as well?

 

Please send me a pinout, thanks.

 

P.S. Is it possible to do an x4 mod? :devil:

 

 

 


So I finally have an update on my machine: I got rid of the 3740QM, which was running too hot, and replaced it with a 3612QM, and everything is much smoother overall.

I ordered another EXP GDC Beast (v8.0 this time). The mPCIe version was $33, which convinced me to pull the trigger on it. Unfortunately, it confirmed my hunch, and it's very disappointing.

This new adapter can't hold a Gen2 signal either... on ExpressCard.

But on mPCIe it can. It occasionally throttles down to 1.1 for a moment and then ramps back up. Gonna game a bit on it, but... very disappointing :(

Considering the ExpressCard slot seems unstable... forking over $100 for a PE4C v3.0 seems like a much harder pill to swallow.


Tried switching out the newer EXP GDC for the older EXP GDC using the mPCIe cable, and it seems the old Beast doesn't even throttle down to 1.1, while the newer one does. So the older one actually has better signal integrity, just not over ExpressCard :( And I don't know whether it's the cable or the ExpressCard slot.

@Dewos, didn't you have this same exact problem?
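For what it's worth, one way to catch those momentary downgrades is to poll the link state in a loop while gaming. A sketch using the driver's bundled nvidia-smi, assuming it's on your PATH (very brief dips can still slip between one-second samples):

nvidia-smi --query-gpu=pcie.link.gen.current,pcie.link.width.current --format=csv -l 1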

Edited by sangemaru

On 6/11/2016 at 11:43 PM, carbotaniuman said:

I'm planning to upgrade my E6430 to an i7-3740qm and I was wondering if there would be any throttling concerns. iGPU only.

I disagree. I used the iGPU-only version with the 3740QM, decently cooled with liquid metal and the best pastes. It quickly ramps up to 105°C and then hard-throttles down to about 3.2 GHz, slowing down massively in games. Switching to a 3612QM made games much smoother.


I have a humble question about overclocking the iGPU (I am very new to this). I booted into grub and used setup_var to alter these variables:

Setting: GT OverClocking Support, Variable: 0x16F
Option: Disabled, Value: 0x0 
Option: Enabled, Value: 0x1 
End of Options 

Numeric: GT OverClocking Frequency, Variable: 0x170
Default: 8 Bit, Value: 0x16
End 

Numeric: GT OverClocking Voltage, Variable: 0x171
Default: 8 Bit, Value: 0x0 
End

I noticed, however, that when I did "setup_var 0x16F" it returned a message about an unexpected size and GUID. The value returned was "0x00". I set this value to 0x01 and later 0x10. I also altered 0x170 and 0x171 to 0x1f and 0x25 respectively.

 

After I rebooted and checked with GPU-Z, I noticed no improvement in my max graphics frequency. It still remained at a hard limit of 1250 MHz. Is there something I am doing blatantly wrong? Thank you for your time.

Edited by Hwrgrabe

0x16F set to 0x1

Make sure to keep in mind the range posted in badbadbad's guide:

GT OverClocking Frequency, variable 0x170, range 0x00-0xFF (8-bit value, 0-255) [iGPU]: [decimal value] x 50 MHz (example: 34 x 50 MHz = 1700 MHz)
GT OverClocking Voltage, variable 0x171, range 0x00-0xFF (8-bit value, 0-255) [iGPU]: +0.01 V per step from 0x00 to 0xFF (example: 0x05 = +0.05 V)

So if you'd like to try a relatively safe clock, try setup_var 0x170 0x1c, which should set 1400 MHz. Setting it to 0x17f most likely registers as 0x7f, which is 127 in decimal and way out of bounds.
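Putting the whole sequence together, a minimal sketch from the modified grub shell, using the offsets from your IFR dump above (they can differ on other BIOS versions):

setup_var 0x16F 0x1    # enable GT OverClocking Support
setup_var 0x170 0x1c   # frequency: 0x1c = 28 decimal, 28 x 50 MHz = 1400 MHz
setup_var 0x171 0x05   # voltage: +0.05 V (one step = +0.01 V)

Then reboot and check the new maximum clock in GPU-Z.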

As for the voltage, you gave it +0.37 V, which should be enough to get you well over 1500 MHz. Check the table below:

Freq value | Resulting frequency (GPU-Z log) | Voltage value | Voltage increment (speculation) | Memory freq (dual channel) | Max GPU temp (GPU-Z log) | Max GPU power (GPU-Z log) | Furmark 720p benchmark
unchanged | 1100 MHz | unchanged | +0.00 V | 1600 MHz | - | 18.1 W | 371
unchanged | 1250 MHz | unchanged | +0.00 V | 2133 MHz | - | 17.9 W | 515
0x1a | 1300 MHz | unchanged | +0.00 V | 2133 MHz | - | 19.5 W | 517
0x1b | 1350 MHz | unchanged | +0.00 V | 2133 MHz | - | 21.1 W | 532
0x1c | 1400 MHz | unchanged | +0.00 V | 2133 MHz | 81 °C | 22.2 W | 549
0x1d | 1450 MHz | 0x05 | +0.05 V | 2133 MHz | 83 °C | 23.5 W | 553
0x1e | 1500 MHz | 0x15 | +0.21 V | 2133 MHz | 84 °C | 27.5 W | 563
0x1f | 1550 MHz | 0x25 | +0.37 V | 2133 MHz | 87 °C | 31.6 W | 589
0x20 | 1600 MHz | 0x40 | +0.65 V | 2133 MHz | 93 °C | 37.4 W | 639
0x21 | 1650 MHz | 0x50 | +0.80 V | 2133 MHz | 102 °C | - | 717

 

Anyway, use ThrottleStop's TPL option to enable Intel Power Balance and give 0 to the CPU and 31 to the GPU in order to test overclocks. It's going to get pretty hot pretty fast.

 


2 hours ago, sangemaru said:

0x16F set to 0x1

Make sure to keep in mind the range posted in badbadbad's guide:

GT OverClocking Frequency, variable 0x170, range 0x00-0xFF (8-bit value, 0-255) [iGPU]: [decimal value] x 50 MHz (example: 34 x 50 MHz = 1700 MHz)
GT OverClocking Voltage, variable 0x171, range 0x00-0xFF (8-bit value, 0-255) [iGPU]: +0.01 V per step from 0x00 to 0xFF (example: 0x05 = +0.05 V)

So if you'd like to try a relatively safe clock, try setup_var 0x170 0x1c, which should set 1400 MHz. Setting it to 0x17f most likely registers as 0x7f, which is 127 in decimal and way out of bounds.

 

Thank you so much for responding! My apologies, I was not clear and I also mistyped (I've attached an image). I tried setting 0x16F to 0x1, but it ends up becoming 0x01. Originally, when I did setup_var 0x16f it displayed 0x00; setup_var 0x170 showed 0x16 and 0x171 showed 0x00. Also, I set 0x170 to 0x1f.

 

I believe the main issue is the enabler. Somehow I'm getting an 8-bit value for 0x16f when it's only supposed to be 0x0 or 0x1.

 

Here is the image: http://imgur.com/JEHuKAX (sorry, I couldn't upload it directly here; there was an error).

I also get this error for 1 second before the command line shows up: http://imgur.com/a/fnRAh

 

Edited by Hwrgrabe

Hello.

I want to get my Dell E6430 running cooler. CPU temps are about 60°C at idle while the fan sits at 0 RPM.
In SpeedFan I can turn up the fan for maybe a second before the motherboard turns it back down.
Is there a way to deactivate the BIOS fan control?

 

- BIOS is up to date
- Advanced Dell energy options for cooling don't have any effect
- Fan has been cleaned
- New thermal paste is coming soon

 

 

 

Maybe with this tutorial: Use UEFI variables to change many hidden BIOS options - How to make a permanent boot option to EFI Shell?
 

Or is there another way to control the fan?

Thanks in advance!
 

Edited by lonkodonko
