timohour

Posts posted by timohour

  1. Thank you for the feedback @timohour ! I've corrected the original post to allow the mods to do their cleanup magic. I haven't done the major tweaks yet, so most of it came from my understanding and imagination.

    Added this to the first post. Thanks for your efforts.

    Port 1's variable is 0xB2; port 4's is 0xB5.

    Also, some variables need a cold boot to take effect, so you'd better shut down and start up rather than just reboot.
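    For example, a minimal sketch of how that would look from the GRUB shell (assuming 0xB2 and 0xB5 take the same Auto/Gen1/Gen2 values as 0xB4):

    setup_var 0xB2          # read the current value for port 1
    setup_var 0xB2 0x1      # force port 1 to Gen1
    setup_var 0xB5 0x1      # force port 4 to Gen1

    Then shut down and start up again so the change takes effect.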

  2. OK this is almost perfect... I love it! Just needs some corrections

    UPDATE: Check the corrected post above.

    Spoiler



    Quote

    Actually, I had to hold off the task at hand because someone borrowed my laptop tools.

    I started writing the Step-by-Step guide and would like to clarify a few things. Hope I could get your input on this.



    NEWBIE GUIDE TO TWEAKING THE DELL E6430



    Permanently Unlocking the Flash Descriptor
    Requirements: Paper clip; USB flash drive with FreeDOS (c/o Rufus) and fpt.exe; FITC.exe or any hex editor (e.g. HxD)
    Purpose: Allow flashing of a modified ME firmware; prerequisite for UEFI variable modifications

    You don't need to unlock your Flash Descriptor to make UEFI variable modifications. Add here the ability to set port 1 or port3 to x2 mode.
    Quote

    Steps:

    • Locate the IDT chip

    • Using a paperclip, bridge Pin 5 to either Pin 9 or the resistor and boot the laptop

    • Open image.bin in a Hex Editor like HxD -where do I find image.bin?



    OK, after booting with the paperclip applied, "Boot to FreeDOS on startup using the USB (Press F12)". Use the command
    fpt -d image.bin
    to dump a full image of your BIOS.
    Quote





    • Locate HEX string: 00 00 0B 0A 00 00 0D 0C 18 01 08

    • Replace string with: 00 00 FF FF 00 00 FF FF 18 01 08

    • Save the file and reboot laptop

    • Boot to FreeDOS on startup using the USB (Press F12) -should I bridge pins 5 and 9 again?



    If you reboot the laptop you don't have to apply the pinmod again. If you shut down and then start up, you need to redo the pinmod to re-enable writing to the Descriptor region.
    Quote



    • Flash your new descriptor with fpt.exe using the command: fpt -desc -f editedimage.bin

    Note/s: If the bridging of pins was not done correctly, you will get Error 26

    Backup Original BIOS

    This procedure is part of Permanently Unlocking the Flash Descriptor... the image.bin you obtain after your first boot is the original BIOS.
    Quote

    Requirements: USB flash drive with FreeDOS (c/o Rufus) and fpt.exe; unlocked Flash Descriptor
    Purpose: Creates a dump of your original BIOS
    Steps:

    • Boot to FreeDOS on startup using the USB (continued from Permanently Unlocking Flash Descriptor)

    • Backup BIOS using the command: fpt -d filename.bin



    Note/s: Keep the backup .bin file in a safe place in case you need to revert back to original settings (for warranty/selling)



    Flash Modified ME Firmware and BCLK Overclocking
    Requirements: USB flash drive with FreeDOS (c/o Rufus), fpt.exe, and Khenglish's E6430_OC.bin; unlocked Flash Descriptor
    Purpose: Overclocking using BCLK, via XTU
    Steps:

    • Boot to FreeDOS on startup using the USB (continued from Permanently Unlocking Flash Descriptor)

    • Flash Modified ME Firmware using the command: fpt - f -me E6430_OC.bin



    This is maybe my mistake; the command is
     fpt -me -f yourimage.bin
    Quote





    • Reboot

    • Install Intel Extreme Tuning Utility (XTU)

    • Open XTU

    • Select Manual Tuning and adjust slider to desired overclock



    Quote

    Note/s: When balancing between UEFI TDP/Multiplier Unlocking and BCLK Overclocking, it is better to set the highest BCLK possible first, then adjust multipliers accordingly.



    Setting Port 1 @ x2.2
    Requirements: USB flash drive with FreeDOS (c/o Rufus), fpt.exe, and FITC.exe; unlocked Flash Descriptor
    Purpose: Combining two PCIe 2.0 ports to allow an eGPU at an x2.2 configuration (8GT/s)
    Steps:

    • Open FITC.exe



    At this moment you load your BIOS image (or your Descriptor region only)
    Quote





    • Navigate to: Flash Image > Descriptor Region > PCH Straps > PCH Strap 9

    • Double click on PCIe Port Configuration 1

    • Replace 00: 4x1 Ports 1-4 (x1) with 01: 1x2, 2x1 Port 1 (x2), Port 2 (disabled), Ports 3,4 (x1)

    • On the menu bar, select Build -> Build Image (or alternatively, press F5)

    • After the build completes, go to the FITC folder where an outimage.bin has been created

    • Transfer the file to your FreeDOS USB

    • Boot to FreeDOS on startup using the USB (Press F12)

    • Flash with fpt.exe using the command: fpt -desc -f editedimage.bin



    Quote

    Note/s: Port 2 will no longer function as long as Port 1 is set at x2.2 (transfer your Wireless card to Port 5)

    If you only want the descriptor region of the outimage.bin file (due to limited space on your DOS disc), you can reopen outimage.bin with FITC and inside the FITC folder you will find a new folder with the name of your bios image. Inside the Decomp folder you will find a 4K Flash Descriptor.bin which you can flash using the command: fpt -desc -f filename.bin )





    UEFI Variable Modifications
    Requirements: FAT32-formatted USB with GRUB bootloader; Unlocked Flash Descriptor

    You can change the UEFI vars without unlocking your Descriptor
    Quote

    Purpose: PCIe Generation 1/2 toggle, TOLUD/Power/CPU/RAM/Battery tweaks, and many more yet to be discovered!
    Steps:

    • Boot to GRUB on startup using the USB (Press F12)

    • Test out GRUB by looking up RAID0 value (use the command: setup_var 0x12D )

    • Depending on the desired modification, change the variables using the command: setup_var variable value
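    As a sketch, a typical GRUB session might look like this (the example values assume the variables in the table of modifiable variables below; verify against your own IFR dump first):

    setup_var 0x12D          # read the RAID0 toggle, should print 0x0 or 0x1
    setup_var 0x275 0xB      # set Max TOLUD to 3GB to free space for an eGPU
    setup_var 0xB40 0x50     # raise the long duration power limit to 80W

    Then shut down and start up again, since some variables only take effect after a cold boot.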





    Hope you don't mind; I will make the corrections in the table.
    Quote

    Modifiable Variables

    • Reboot

    Quote

    Note/s: Before booting to GRUB, make sure that the GRUB bootloader is located at: EFI\Boot\bootx64.efi

    No need for GRUB!!! Select the UEFI USB under the F12 menu. You just need the bootx64.efi file under the specified location.
    Option | Variable | Values | Notes
    PCIe Speed (Gen1/2) | 0xB4 | 0x0 (Auto), 0x1 (Gen1), 0x2 (Gen2) | [PCIe] Gen1 works if plugged in prior to boot; reverts to Gen2 if hot plugged. (This is for port 3, the ExpressCard; every port has its own variable. Check the IFR file.)
    Set Max TOLUD | 0x275 | 0x0 (Dynamic), 0x3 (1GB), 0x4 (1.25GB), 0x5 (1.5GB), 0x6 (1.75GB), 0x7 (2GB), 0x8 (2.25GB), 0x9 (2.5GB), 0xA (2.75GB), 0xB (3GB), 0xC (3.25GB) | [PCIe] Adjust TOLUD to free up more space for PCIe compaction
    Long Duration Power Limit | 0xB40 | Hexadecimal value, e.g. 80W = 0x50 | Sets the TDP for your CPU. The higher the CPU frequency, the more power it uses, so a higher TDP value saves you from unnecessary TDP throttling. Default is 45W for a quad-core CPU. The optimal setting is the one at which you see no TDP throttling (check with ThrottleStop).
    Short Duration Power Limit | 0xB41 | Hexadecimal value | Sets the short-duration TDP limit for your CPU. The default is +12.5% of the original value.
    BDPROCHOT | ?? | ?? | BiDirectional PROCHOT is a setting in the ThrottleStop program to avoid CPU throttling; it is not a UEFI variable. Read more in the ThrottleStop documentation.
    1-Core Ratio Limit | 0x25 | 0x00-0xFF (8-bit value, 0-255) | [CPU] Refer to this post to figure out the multiplier limit; it depends on the CPU. Even if you set a higher value, a CPU that is not fully unlocked reverts to its highest allowed multiplier.
    2-Core Ratio Limit | 0x26 | 0x00-0xFF (8-bit value, 0-255) | [CPU] One value less than 1-Core
    3-Core Ratio Limit | 0x27 | 0x00-0xFF (8-bit value, 0-255) | [CPU] One value less than 2-Core
    4-Core Ratio Limit | 0x28 | 0x00-0xFF (8-bit value, 0-255) | [CPU] Same value as 3-Core
    DIMM Profile | 0x1EE | 0x0 (Default), 0x1 (Custom), 0x2 (XMP Profile 1), 0x3 (XMP Profile 2) | [RAM] Not needed if the RAM already has an OC'd JEDEC profile; locked RAM requires pre-flashing an XMP profile (then select 0x2 or 0x3)
    ASPM Support | 0xC | 0x0 (Disabled), 0x37 (Auto), 0x1 (Force L0s) | [Battery] Set to Auto
    Native ASPM | 0xB04 | 0x0 (Disable), 0x1 (Enable) | [Battery] Set to Enable
    GT OverClocking Support | 0x16F | 0x0 (Disable), 0x1 (Enable) | [iGPU] Enable to allow iGPU overclocking
    GT OverClocking Frequency | 0x170 | 0x00-0xFF (8-bit value, 0-255) | [iGPU] Value x 50MHz (example: 34 x 50MHz = 1700MHz)
    GT OverClocking Voltage | 0x170 | 0x00-0xFF (8-bit value, 0-255) | [iGPU] +0.01V per step from 0x00 to 0xFF (example: 0x05 = +0.05V)
    RAID0 | 0x12D | 0x0 (Disable), 0x1 (Enable) | [H] Toggle RAID0 array
    RAID1 | 0x12E | 0x0 (Disable), 0x1 (Enable) | [H] Toggle RAID1 array
    RAID10 | 0x12F | 0x0 (Disable), 0x1 (Enable) | [H] Toggle RAID10 array
    RAID5 | 0x130 | 0x0 (Disable), 0x1 (Enable) | [H] Toggle RAID5 array
    Intel Rapid Recovery Technology | 0x131 | 0x0 (Disable), 0x1 (Enable) | [H] Toggle Intel Rapid Recovery Technology
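    For instance, combining the iGPU entries above would look roughly like this (a sketch only; check your own IFR dump before writing anything):

    setup_var 0x16F 0x1      # GT OverClocking Support: enable
    setup_var 0x170 0x22     # 0x22 = 34, i.e. 34 x 50MHz = 1700MHz

    Shut down and start up afterwards so the new values are picked up.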

    • Thumbs Up 2
  3. I'm currently in the process of unlocking the Flash Descriptor. Documenting the process so we can make the guide even easier to follow.

    I've downloaded FITC already. But I don't know what to do from there. I see the Flash Image\Bios Region

    You are right, this is not at all helpful. I assume you already unlocked your Descriptor.

    If not, open your image.bin in a Hex Editor (I use HxD) and apply the change you describe above...

    Save the file and flash your new descriptor using fpt.exe in a bootable DOS disk using the command

     fpt -desc -f editedimage.bin

    Using -desc is to verify that you are only flashing your descriptor region...

    Now that your descriptor is unlocked keep your original BIOS in a safe place (in case you need to return it for warranty issues or sell it) and open your unlocked edited image in FITC.

    In the album below you can see that the PCIe Port Configuration is in PCH Strap 9. Originally 00 is selected (4x1 ports 1-4 (x1)). To set your port 1 @ x2 you should select 01 (1x2, 2x1 Port 1 (x2), Port 2 (wifi disabled), Ports 3,4 (x1)).

    After selecting, press OK, go to the Build menu and select Build Image (or press F5). After the build completes, go into the FITC folder; there will be an outimage.bin created. This is the full BIOS with your new Descriptor settings. Flash it and you are good to go...

    (Optional: If you want only the descriptor region because there is limited space on your DOS disc, you can reopen the new image using FITC and inside the FITC folder you will find a new folder with the name of your bios image. Inside the Decomp folder you will find a 4K Flash Descriptor.bin you can use to flash your descriptor using the command I gave you above)

    Your WiFi port will be disabled but you can connect your WiFi under the WWAN port 5 which will be enabled (tested and working)

    Oh, you have a 130W PSU. I only have 90W PSUs. That should never shut down with this relatively low-power GPU.


    My stupidity is unique. I thought that I was using the 130W but I was using the 90W.... It goes up to 882MHz stable now... Thanks for the heads up...

    • Thumbs Up 1
  4. I get PSU shutdowns when overclocking both GPU and CPU, so overvolting the GPU would make that worse unless you got a stronger PSU. It'd definitely help though. I extracted the vBIOS from the sBIOS, but overvolting tools don't properly recognize the vBIOS to modify it. Volt mods might be hardmod only.

    FYI my 5200m does 844/2092 stable.

    844 stable??? Mine won't complete a 3DMark06 run @ 800/2050. :( In the middle of the test the driver restarts and the test fails. Any ideas? Maybe just a bad chip? Or power related?

    Your 3920XM is far more power hungry than my 3720QM... @ 3.8GHz my top TDP would be 55-60W. So I think the 130W PSU would be enough, wouldn't it?

    Although, I don't think I will do any hardmods until the warranty expires.

  5. OCing NVS5200M on a Dell Latitude E6430 

    Maybe I should consider a head to head comparison with the NVS5200M on the dGPU model too...


    Overclocking the NVS5200M is fairly easy compared to iGPU OC. OCing the NVS5200M's core and memory by ~30% can achieve up to ~35% better performance (Furmark Benchmark) and provide great graphics performance for gaming on the go. I can say that the performance gain is almost linear.

    To accomplish the OC you can use any application which can manipulate core/memory clocks. The lightest/easiest to use, IMHO, is Nvidia Inspector. You can download the software from the guru3d.com site. No installation is needed; just download the .rar, extract it and fire it up.

    The GUI is pretty straight forward.



    Of course, for a more permanent solution, or if you also use an eGPU, it is better to use MSI Afterburner and create profiles.

    KirVSlg.jpg


    Tested with the Furmark 720p benchmark and 3DMark06 to measure the performance gain from one frequency to the next and the difference between iGPU and dGPU.

    All tests were done using my i7-3720QM (4 top bins unlocked) with 2x4GB Kingston RAM @ 2133MHz.

    The core clock runs at 672 MHz stock and went all the way up to 891 MHz, with a total performance gain of up to ~31%.

    Be sure to use a 130W PSU though because with 90W and a 4C i7 you are limited to 780MHz stable.


    (A high-end GF108 implementation (e.g. GT440) running @ 810 MHz core / 1800 MHz memory has a TDP of up to 65W. Add to that the 45W a 4C i7 draws, or the ~60W an unlocked i7 draws, and you may be at the limits of your 130W PSU, let alone 90W.)

    GPU clock would go up to 891 MHz and the maximum memory would be 2100 MHz. I needed to lower it a bit more to get some stable 3DMark06 results

    The following tables show the results:

    Furmark Benchmark

    GPU Clock (dGPU) | Memory Clock (dGPU) | Shader Clock (dGPU) | Voltage (dGPU) | Furmark 720p Benchmark (dGPU) | Furmark 720p Benchmark (iGPU) | Core Clock (iGPU) | Memory Clock (system/iGPU)
    672 MHz | 1567 MHz | 1344 MHz | 0.98 V | 542 | 515 | 1250 MHz | 2133 MHz
    750 MHz | 2050 MHz | 1500 MHz | 0.98 V | 632 | 639 | 1600 MHz | 2133 MHz
    891 MHz | 2092 MHz | 1882 MHz | 0.98 V | 728 | 717 | 1650 MHz | 2133 MHz


    Raw power and stock performance seem to be on par with or better than the Intel HD4000, and a more synthetic benchmark seems to tell the same story.

    3DMark06 Benchmark

    GPU Core Clock | GPU Memory Clock | 3DMark06 Score | SM 2.0 Score | HDR/SM 3.0 Score | CPU Score | Performance Gain
    672 MHz | 1567 MHz | 9312 | 3615 | 3248 | 7213 | -
    780 MHz | 2050 MHz | 11028 | 4331 | 3912 | 7334 | up to 25%
    882 MHz | 2092 MHz | 11962 | 4736 | 4234 | 7700 | up to 31%

    NVS5200M, i7-3720QM @ 3.8GHz 4C, 2x4GB @2133MHz

    In comparison, the Intel HD4000 performance was:

    GPU Core Clock | GPU Memory Clock | 3DMark06 Score | SM 2.0 Score | HDR/SM 3.0 Score | CPU Score | Performance Gain
    1250 MHz | 2133 MHz | 7850 | 2436 | 3250 | 7698 | -
    1600 MHz | 2133 MHz | 9250 | 2942 | 3850 | 7516 | up to 21%

    Intel HD4000, i7-3720QM @ 3.8GHz 4C, 2x4GB @2133MHz

    Performance gain from stock clocks goes up to 25% in 3DMark06, and probably in most games too.

    When running at stock frequencies it seems to be on par with the Intel HD4000 (or even a bit faster), but when OCed it is up to 30% faster. It also runs much cooler than the OCed iGPU; the maximum Furmark temperature was 70C.

    @Khenglish has said he will probably manage to get the 5200M memory to run at quad rate.

    The E6530's NVS5200 overclocked only slightly beats out an overclocked HD4000 on average, and in some games even loses. Hopefully I can get the 5200's memory to run at quad rate soon. It physically has GDDR5, but it's only running at dual rate.

    If this is possible it would have a great impact on performance.

    UPDATE:
    The memory is already running at quad rate. I forgot the memory bus was only 64-bit so I interpreted it wrong.



    Also, @ 800MHz it seems unstable, like it needs more voltage. Temperatures were OK @ 55C (3DMark06) and up to 70C in Furmark. If we could somehow increase the core voltage a little, we could maybe reach a 1GHz core clock
    (same core implementations: GF117 GT720M @938MHz, GT 820M @954MHz), and if the performance gain stays as linear as it is from 672-800MHz, it would hit up to 14K in 3DMark06.

    UPDATE:
    I was stupid enough to use my 90W brick during testing... (I thought I was using the 130W). After using the 130W psu I could hit and benchmark @ 882MHz.

    Also, the 5200M seems to be using the GF108 core, not the GF117. GT440 OCing tests show stable performance up to 910MHz, and there are users claiming they reached 970MHz. Maybe I need a beefier PSU to achieve these numbers, but I have reasons to believe there would surely be cooling issues...
    • Thumbs Up 1
  6. Ah, now I understand. It's a bit confounding to see these design decisions resulting in such wasted potential.

    Nobody had eGPU use in mind when they built the E6430. The eGPU community promoted this idea, and this is how Intel was convinced to support it with TB3.

    Did you try keeping the iGPU enabled while connecting the eGPU before boot? Does it work?

    @timohour do you have an opinion on the settings for setting ports PEG0/1/2 to Gen3 speed with UEFI vars? The PE4C v3.0 is advertised as offering Gen3, right? The ExpressCard may be Gen2 only, but what about the WWAN/WLAN or that mPCIe half/mini port? Wouldn't that raise the transfer rate to 8 Gbps? (probably around 6.5 after 8b/10b encode/decode). Or am I misinterpreting something? What does the Gen1/Gen2/Gen3 setting on the PEG ports refer to?

    As Tech Inferno Fan noted

    Intel Series 6-8 chipsets support max Gen2 on the Southbridge PCIe ports used to host mPCIe and expresscard slots. It's only the northbridge PCIe port used to host dGPUs that are Gen3 capable

    in Series7+ chipsets. If Gen3 was possible on mPCIe/EC slots I can assure you it would have been set many times over already.

    EC - mPCIe --> Southbridge Gen2

    dGPU-->          Northbridge Gen 3

    dGPU on the Gen3 northbridge is wired directly to the CPU as you can see here pg.115.

    By the way, Gen3 uses a more efficient 128b/130b encoding, not 8b/10b, resulting in a real bandwidth of 984 MB/s per pair out of the 8 GT/s (8 GT/s x 128/130 / 8 bits ≈ 985 MB/s per direction). pg.30

    On pg.30 you can also see that the configuration with the most Gen3-capable PCIe ports is 1x8 and 2x4, a total of 3 ports. That's what PEG0/1/2 are. In our E6430 it is set as 1x16 and is dedicated to the dGPU, which is the only port you can set to Gen3 speed (the NVS5200M is only Gen2 capable, so it is mostly a waste). There are no systems AFAIK that sport an EC or mPCIe port rated Gen3.

    EDIT : Just for proof I tried some fiddling with the PEG0 port speed Gen.

    Setting variable 0x1f5 to Gen1 gives the following result:

    a1seb2Il.jpgU0JOHg1l.png

    While at the default (Gen2), it looks like this:

    x5swoUUl.jpgPKuZjsul.png

    Setting this to Gen3 (although it is useless since the NVS5200M is only Gen2 capable) has a strange effect on the card clocks. It would clock @ 202MHz and have terrible performance... I had to clear CMOS to revert to the default clocks and get Optimus to work normally.

    • Thumbs Up 2
    How are you setting Gen1 speed? I tried using UEFI variables to set Gen2 speed on a Dell E5540 and my EC port remained at the BIOS-set Gen1 speed.

    I'd advise checking the port speed with GPU-Z (with the GPU under load) or Setup 1.30.

    OK, I think I found the problem. For the value you set with UEFI vars to work, you need to have the device connected at boot. If you disconnect it or connect it after boot, it will revert to Gen2 speeds.

    Tested using Setup 1.30 (thanks)

    x5swoUU.jpgKP95I4P.jpg

    I set both ports 2 and 3 to Gen1. You can see that port 2 is always set to Gen1 because it is there when the laptop boots.

    Port3 on the other hand is set to Gen2 on the first screenshot because there is no device connected before boot. The same happens if I disconnect/hotplug the device after boot.

    On the 2nd screenshot it is set to Gen1 cause the device (EC Asmedia USB 3.0 module) was connected before boot.

    @sangemaru That's why you have problems when you connect your eGPU after boot: your card is connected at Gen2 speed and your gear cannot handle it without issues. You can use Setup 1.30 to change the port to Gen1 speed after boot, or try whether one of the above settings keeps your iGPU enabled after boot with the eGPU connected.

    Thanks @Tech Inferno Fan for pointing that out.

    • Thumbs Up 1
  9. My main problem, is that the behavior exhibited when enabling Gen2 (driver crash and reset, system freeze) is also exhibited when the iGPU is active at the same time as the eGPU, whether or not the adapter is set to Gen1 or Gen2. It is possible to boot with both iGPU and the eGPU, but putting any activity on the eGPU in that situation (such as plugging in an external display) will crash my drivers (so long as I have any drivers installed) and make the system almost unusable.

    I can only reliably use the eGPU by booting it as primary. An acceptable compromise might be booting the eGPU as main, but still somehow enabling the iGPU as secondary, so that I can get internal display. I wonder if setting the Internal Graphics variable to Enabled instead of Auto, and the Primary Display variable to PEG (this is a pretty poor selection, does the ExpressCard count as PEG?) might allow me to retain internal LCD capabilities.

    Sleep / connect / wake = guaranteed BSOD. Hotplugging to the EC slot after post before Windows, I've tried a few times, but tends to stall my windows loading more often than not - either way so long as I get to windows (or Ubuntu) with both the iGPU and the eGPU, the system isn't stable and will BSOD or freeze sooner or later.

    Your hangs don't make sense (to me). If it works without the iGPU, it should work with the iGPU/eGPU too. The sleep-resume method should also work. I had these random freezes with iGPU/eGPU when I tested Virtu the first time. The problem then was the HD3000 driver that came with Windows (WDDM).

    Installing the latest driver from Intel should solve this problem. In order to install the Intel Driver you may need to manually uninstall any previous version from the Device Manager.

    Also if possible consider rolling back to windows 7 or 8.1.

    PEG, among other things :), stands for PCI Express Graphics. So any graphics card connected to a PCI Express slot is considered PEG. Setting the Primary Display to PEG may cause problems, though, when you start your laptop without the eGPU connected.

    By the way, if I consider using a x2.2 setup, do you have any suggestions on making it less intrusive? Or do I have to give up either WLAN or WWAN, and make a very unsightly hole in my laptop or wear it with no bottom plate?

    Thanks for those variables, will try out combinations soon. Hope I don't render the system unbootable and have to clear CMOS again :D

    If you are going to purchase new gear, badbadbad has already ordered PE4C for x2.2, so we better wait and see how it works for him.

    • Thumbs Up 1
  10. Quote

    I got my hands on an external screen to experiment with, and the following conclusions have been drawn: My EXP GDC Beast works ONLY if: - I boot from the eGPU directly (booting from iGPU will either stall my boot, bsod, or make it to windows only for the driver to start crashing constantly once I connect the external monitor). - I use Gen1 Expresscard (Gen2 will do all of the above, regardless from which device I boot).

    So my EXP GDC only works at gen 1 speed. This sucks. I can't even use both internal and external LCD, external only.

    P6568 3DMark11 Score 4466 3DMark Firestrike



    If you recently purchased your EXP GDC you should consider talking to your seller for compensation cause it is supposed to be gen2 compatible.

    Many users, myself included, have connected Gen2-compatible hardware (both Nvidia and AMD) to the EC slot without issues, so it is probably the adapter's fault.

    If you boot with the eGPU plugged in, Latitudes disable the iGPU and you won't be able to enable it no matter what. That's why you don't get the internal screen along with the external one.

    You may try this EFI variable and set Primary Display to iGPU (if it is not already selected).
    Setting: Primary Display, Variable: 0x1D4 
    Option: Auto, Value: 0x3
    Option: IGFX, Value: 0x0 
    Option: PEG, Value: 0x1
    Option: PCI Bus, Value: 0x2

    or maybe this one

    Setting: Internal Graphics, Variable: 0x1D8

    Option: Auto, Value: 0x2

    Option: Disabled, Value: 0x0

    Option: Enabled, Value: 0x1
    This way you may be able to cold boot your laptop with the eGPU connected and it won't disable the iGPU. (haven't tried it, it may not work)
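    As a rough sketch of that idea (untested, values taken from the listings above), from the GRUB shell:

    setup_var 0x1D8 0x1      # Internal Graphics: Enabled instead of Auto
    setup_var 0x1D4 0x0      # Primary Display: IGFX

    Then cold boot with the eGPU already connected and check whether the iGPU stays enabled.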

    Have you tried hotplugging into the EC slot after POST, before Windows? That's how it worked for me. You can also try sleep, connect and then wake. It worked for W7 and W8.1, so it most probably works with W10.
    • Thumbs Up 1
  11. I'm using an r9-270x. No nvidia gpu unfortunately, and as far as i can tell, optimus internal lcd mode doesn't work for people in windows 10 anyway.

    Virtu also refuses to work in windows 10.

    Sent from my Neken N6 using Tapatalk

    If that's the case, I think the only solution (until you buy an external screen) would be to connect a dummy plug to your card, run the application/game in windowed mode and then drag it onto your internal screen.

    You can also try using Ultramon. I remember using it back in the day to help me drag the window from the dummy screen (used with an old HD4870 some years back)

    EDIT: Found the original post by Tech Inferno Fan here. Hope Ultramon works for 10 (it works on 8.1 without issues)

    • Thumbs Up 1
  12. Yea, made that happen. Actually setting max tolud is all that mattered. The crashing appears to have been due to virtu mvp (though i still can't hotplug).

    The last hangup is trying to find a workable way to render on the internal screen without virtu mvp.

    Sent from my Neken N6 using Tapatalk

    You are using the HD4850? If yes you won't be able to use it with internal screen even with Virtu Mvp.

    If you are using an Nvidia GPU, follow Tech Inferno Fan's guide: http://forum.techinferno.com/implementation-guides-pc/2747-12-dell-e6230-hd7870-gtx660%40x4gbps-c-ec2-pe4l-2-1b-win7-%5BTech Inferno Fan%5D.html.

    This way you will have optimus enabled and will be able to use the internal screen without issues.

  13. Quote

    Set var 0x1f8 (PEG3, the only PEG device that doesn't offer a Gen3 speed setting, I'm assuming this one is the ExpressCard) to 0x1, no change.

    This makes me wonder. If the other 3 PEG devices (devices 0,1 and 2, which I'm assuming are the mPCIe slots) support Gen3, then wouldn't using one of these in conjunction with the PE4C v3.0 with Gen3 support offer another doubling of bandwidth? Since they can be set manually to gen3 through EFI vars, PCIE x1 3.0 should be equal to x2 2.0 and x4 1.0 right? (with the added benefit that being x1 would engage nvidia compression).

    Also Ubuntu won't boot. Didn't try hotplugging ubuntu yet. EDIT: Booting an old HD4850 works, though the driver support is pretty much not there, all it can do is browse. EDIT2: Managed to boot it up properly using Leshcat drivers, after uninstalling VIRTU MVP. Virtu MVP software makes the gpu unusable. The question now is, how do I use the eGPU on internal display?



    Quote

    Intel Series 6-8 chipsets support max Gen2 on the Southbridge PCIe ports used to host mPCIe and expresscard slots. It's only the northbridge PCIe port used to host dGPUs that are Gen3 capable in Series7+ chipsets.

    If Gen3 was possible on mPCIe/EC slots I can assure you it would have been set many times over already.



    You are trying to change the gen on the wrong variable. That one is for PEG device 3, which is connected directly to the CPU and thus Gen3 capable (in case the laptop came with 3 dGPUs).

    The correct variable for southbridge PCI Express Root Port 3 (ExpressCard), as noted here, would be:

    Setting: PCIe Speed, Variable: 0xB4

    Option: Auto, Value: 0x0

    Option: Gen1, Value: 0x1

    Option: Gen2, Value: 0x2

    End of Options
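    So, as a sketch (using the listing above), from the GRUB shell:

    setup_var 0xB4           # read the current PCIe Speed for Root Port 3
    setup_var 0xB4 0x1       # force the ExpressCard port to Gen1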


    UPDATE: This will only work for devices that are connected to the machine prior to boot. If you disconnect or hotplug the device, the port will revert to Gen2 speeds, as pointed out here. Thanks @Tech Inferno Fan for pointing that out.

    I haven't done any tests myself on W10, only 8.1 and 7, which is probably why I can't help you more.

    Also Khenglish has noted many times that Fastboot in UEFI BIOS should be put to Thorough mode in order for the eGPU to work without issues. Could you try setting that?

  14. Surely will. I was planning to document my implementation anyway to help other beginners like me. Guide seems easy enough to understand though.

    Reverted back to Windows 7 Pro x64 so I can't really answer the DX12 question yet.

    Feel free to correct me or add more stuff, with a post or a PM, whenever you think something is not clear. English is not my native language, so you may not always understand what I really want to say.

    • Thumbs Up 1
  15. Making progress :D I noticed that unless I enable DIMM profile variable 0x1EE to value 0x1 (custom profile), changes I'd make to things such as tCL timing wouldn't stick.

    It makes sense. You have to enable custom profile in order for it to work...

    So I edited the variable above, checked that changes would stick, and now I successfully booted dual-channel to CL12. Command rate set to 2T. Now my biggest challenge is getting the system to boot. I seem to have trouble booting at 2133, where I didn't have that trouble before. I also reset my cmos beforehand to make sure I have everything as close to default as possible. I've noticed the default values on some of the variables are NOT the same as in the A07 EFI dump file. A16 might have switched some around, I'll need to extract the A16 efi variables and do a compare. Things like tRAS and maybe others have different default values (0x0a instead of 0x04 for tCL for example, 0x0b instead of 0x3 for tRAS, and so on. Though they seem to be defaulting to whatever the SPD table on the ram detects at the particular speed at that moment. Still, I'm not sure they're still the same variables, so I'll try to dump A16).

    I'm hoping I manage to boot 2133 at CL12 and command rate 2T at least.

    E6430-A16_IFR.txt I don't see any differences between the A07 and the A16. As you noticed, the defaults are the timings your RAM is currently running at.

    • Thumbs Up 1
  16. Quote

    Hey guys! I've got all my parts setup for overclocking now and was hoping to ask for some advice as well.

    My setup is the following:
    LAPTOP: E6430 NVS5200, 900p
    CPU: i7 3840QM (QCF1)
    RAM: Crucial 16GB PC12800 1600MT/s
    EGPU: PE4C v2.1 + GTX970 (hoping to get at least x2.2)

    For those who already did the paperclip mod, do you power on the laptop with the keyboard removed or can you just lift it from its position? Not sure if there is enough space for my hands to bridge the paperclip underneath an attached keyboard.

    Also, once the system has properly booted after bridging, do I have to permanently Unlock the Flash descriptor before resetting the Laptop? Because that might mean I would have to use an external keyboard, or hotplug the keyboard I disconnected.

    Thanks in advance!



    That's great news!!! Waiting for your implementation.

    Regarding the descriptor flash, I have done it both ways. The first time I removed the palmrest but left the keyboard connected at the side (I used a plastic bag under the keyboard to avoid any kind of short circuit). The second time I completely removed the palmrest and keyboard, and used a USB keyboard to boot/flash. If you have a second computer you can edit your BIOS there to unlock your descriptor. After you successfully apply the paperclip mod your descriptor stays unlocked as long as you don't shut down your laptop (rebooting won't matter). Flashing the descriptor region with a modified dump makes the unlock permanent.

    Keep us posted.

    Quote

    This kit. Currently running at 1866 CL10.

    EDIT: Tried editing the variable to set a custom profile. A good enough idea; unfortunately the latency won't stick unless I write an XMP profile to the module (I really want to write a 1.5V, 2400 or 2666MHz CL14-CL15 profile and try it out through XMP). Unfortunately, no ThaiPhoon.



    XMP at 2400MHz or 2666MHz was a no-go for me. In the best case it would downclock to 2133MHz, and in every other case it wouldn't even boot. I haven't tried 2400 JEDEC profiles though; those may stick.

    Maybe try setting your CR to 2T; it is under
    Setting: NMode Support, Variable: 0x1EF
    Option: Auto, Value: 0x10
    Option: 1N Mode, Value: 0x0
    Option: 2N Mode, Value: 0x2 
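    i.e., roughly (assuming the listing above matches your BIOS):

    setup_var 0x1EF 0x2      # NMode Support: 2N Mode (command rate 2T)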


    I'm kind of not getting it; mine have similar JEDEC timings.

    Maybe try setting it to 1.5V? There is this variable:
    Setting: DDR Selection, Variable: 0x1E9
    Option: DDR3, Value: 0x0

    Option: DDR3L, Value: 0x1
    Option: Auto, Value: 0x2
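    For example (untested on my side, just the idea):

    setup_var 0x1E9 0x0      # DDR Selection: DDR3 (1.5V) instead of Auto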


    Or find a way to flash some XMP Profiles.

    Quote

    In addition, NVIDIA will match Microsoft OS support for DX12. Over 70% of gaming PCs are now DX11 based. NVIDIA will support the DX12 API on all the DX11-class GPUs it has shipped; these belong to the Fermi, Kepler and Maxwell architectural families. With more than 50% market share (65% for discrete graphics) among DX11-based gaming systems, NVIDIA alone will provide game developers the majority of the potential installed base.



    source: http://blogs.nvidia.com/blog/2014/03/20/directx-12/
  17. Quote

    Thanks for the memory speed variable, managed to use it to get at least stable 1600 cl9 speeds for now, now i don't have to throw away the kit. The replacement kit performed the same, no idea why, I assume similarities in the design. Any idea of a way for me to set latency? So I can try 2133 CL12/Cl13 for example?



    If you know your way around tweaking memory there are a bunch of variables that you could try (never tried them myself):

    Numeric: tCL,  Variable: 0x1FE, Default: 8 Bit, Value: 0x4
    Numeric: tRCD, Variable: 0x1FF, Default: 8 Bit, Value: 0x3
    Numeric: tRP,  Variable: 0x200, Default: 8 Bit, Value: 0x3
    Numeric: tRAS, Variable: 0x201, Default: 16 Bit, Value: 0x9
    Numeric: tWR,  Variable: 0x203, Default: 8 Bit, Value: 0x5
    Numeric: tRFC, Variable: 0x204, Default: 16 Bit, Value: 0xF
    Numeric: tRRD, Variable: 0x206, Default: 8 Bit, Value: 0x4
    Numeric: tWTR, Variable: 0x207, Default: 8 Bit, Value: 0x3
    Numeric: tRTP, Variable: 0x208, Default: 8 Bit, Value: 0x4
    Numeric: tRC,  Variable: 0x209, Default: 16 Bit, Value: 0xF
    Numeric: tFAW, Variable: 0x20B, Default: 16 Bit, Value: 0xA

    Setting: NMode Support, Variable: 0x1EF
    Option: Auto, Value: 0x10
    Option: 1N Mode, Value: 0x0
    Option: 2N Mode, Value: 0x2


    For 8-bit values you can probably set anything from 0x0 to 0xFF (0-255), and for 16-bit values from 0x0 to 0xFFFF (0-65535). For example, if you want to set tCL to 13 you should set variable 0x1FE to 0xD, and so on. NMode seems to be the Command Rate: use 1N for 1T or 2N for 2T.

    But you should probably also set

    Setting: DIMM profile, Variable: 0x1EE
    Option: Default DIMM profile, Value: 0x0
    Option: Custom profile, Value: 0x1
    Option: XMP profile 1, Value: 0x2
    Option: XMP profile 2, Value: 0x3


    to Custom Profile.
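    Putting it together, a sketch of trying 2133 at CL12 with a 2T command rate would be (illustrative values only, not tested):

    setup_var 0x1EE 0x1      # DIMM profile: Custom profile
    setup_var 0x1FE 0xC      # tCL = 12
    setup_var 0x1EF 0x2      # NMode: 2N Mode (command rate 2T)

    Then shut down and start up, since memory settings need a cold boot.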

    What kit?
    • Thumbs Up 2
  18. The irony of this system is that with overclocking and high-speed ram, the dGPU becomes redundant and you can get better battery life and comfortably use eGPU. But you need cooling to get that performance level, and to get cooling you need the redundant and rather unnecessary NVS-5200. Would have been nice if both were DX12 and then you'd have the possibility for a doubling of performance under Windows 10, but the HD4000 is DX11 only (I'd heard Fermi supports DX12 but I wouldn't know).

    Gotta get myself some of those copper tiny heatsinks so stick them on the heatpipe.

    Haven't tried to OC the dGPU yet. I think it will outperform even the OCed Intel HD4000, so I don't think it is redundant. As for battery life, I don't think an iGPU-only system is going to be much more efficient. With battery life tweaks my dGPU E6430 has an idle discharge rate as low as 7.2W; considering that Tech Inferno Fan's iGPU-only E6230 would go as low as 5.4W, I don't think the dGPU is a battery sucker, especially with Optimus enabled.

    I had been looking for an iGPU-only model initially, and if this hadn't been such a sweet deal, I would have an iGPU-only one now. But I am pretty happy I went the dGPU way, because at work I can have up to 3 screens connected (using the docking station).

    Maybe there is a dual core i5/i7 with unlocked turbo bins? It would be nice to see how the iGPU would perform with your single pipe stock cooler.

    A fan pushing fresh air to the bottom of the laptop (open lid) is crucial for me to keep my temperatures under control. Did you try it?

  19. When trying to unlock the upper bins of a 3740QM, and editing into variables 0x25 to 0x28, what value should I be setting it to? Trying to understand the guides but coming up short.

    Can I also control maximum Jedec speeds and latencies? Say I want to use different jedec profiles on my ram than what is automatically selected.

    If I am right, the 3740QM has maximum unlocked bins of x41 for 1 core, x40 for 2 cores and x39 for 3/4 cores. You should set the hexadecimal of these multipliers in each variable. For example, for 0x25 (which is the 1-core ratio limit) you should set the value 0x29, which is 41 in hex. At 0x26 you set 0x28, etc. Generally you can set the highest value in every variable and your chip will go as far as it can.
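    So for the 3740QM the whole set would look roughly like this (assuming the x41/x40/x39/x39 limits above are right):

    setup_var 0x25 0x29      # 1-core ratio limit = 41
    setup_var 0x26 0x28      # 2-core ratio limit = 40
    setup_var 0x27 0x27      # 3-core ratio limit = 39
    setup_var 0x28 0x27      # 4-core ratio limit = 39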

    Regarding JEDEC, I suspect that you can select the JEDEC SPD profile with variable 0x1E6. I never tried it since I had no use for it.

    For the OC ME FW, should I boot a Rufus USB stick using DOS and use fpt.exe to flash the e6430oc.bin file? (Use variable 0x228 set to 0x1 to unlock the flash descriptor before flashing, right?)

    What's the risk on this?

    DO NOT USE VARIABLE 0x228 to flash an OC ME FW.

    It is possible that this mod causes BIOS corruption. I had problems after a flash since I couldn't re-enable the ME FW flash through UEFI variables, and I had to use the hardware mod to unlock the Descriptor region again.

    Unlock your descriptor like described here and flash a modified ME FW as described here.

    Unless you are interested in OCing your RAM a little beyond 2133, I wouldn't recommend BCLK OCing an [email protected] since you would definitely face thermal throttling.

    I tried going to 1600MHz on iGPU but the chip gets HOT fast, and I run into TDP throttling quickly. I've no idea what to do about cooling the chip properly, without access to the beefier heatsink. I might be better off using the i5 CPU for gaming on the go until my EXP GDC arrives.

    I faced some TDP throttling myself during the tests. I think you should try the highest core frequency without additional voltage (it should be stable at 1350-1400MHz) plus dual-channel high-speed RAM (2133MHz), if you haven't already. Keep in mind that dual-rank modules generally perform better than single-rank ones, and the faster the RAM, the better the iGPU performance. Check the http://forum.techinferno.com/general-notebook-discussions/6063-all-memory-not-created-equal-hyperx-2133-cl12-versus-ripjaws-2133-cl11.html thread for more info on RAM.

    • Thumbs Up 1
  20. Another question @timohour, do I need to convert my entire Windows install to GPT and run UEFI mode in order to use UEFI variables, or is it enough to boot a thumbdrive in UEFI mode, use the setup_var command, reboot in legacy mode and they stick?

    (Do they stick? Do efi variables need to be applied at every boot?)

    No need to reinstall Windows in UEFI mode to use UEFI variables. You simply boot with the USB drive in UEFI mode.

    As Tempest said changing efi variables is like changing a setting in UEFI bios. You need a CMOS reset to revert this to its default state.

    You'd better shut down rather than reboot your system, because some settings need a cold boot to change.

    My CPU is a 3630QM and I can confirm that once I add JEDEC timings @ 1866, all I have to do is shut down and start again and the memory automatically runs at 1866. A simple Windows 8.1 restart doesn't change the RAM settings. I was unable to get the laptop to boot with any of the RAM modules at 2133 (tested 4 different modules). Or is it my 3630QM's memory controller? This was tested using BIOS A16.

    Here is a picture of the weak Ram module.

    With A16 I run my Kingston @ 2133 CL12, and that's with both the 3630QM and the 3720QM. Maybe you should loosen your timings? If you have similar modules you could try my settings. PM me.

  21. Hey guys. Sorry to intervene.

    I've recently upgraded my bios to A16 and I can still use UEFI variables to enable Raid0, change TOLUD and select XMP Ram profiles.

    eGPU still works without issues. Tested in Windows 8.1 x64 using EC.

    Thanks for your input.

    Already have my Dell E6430 (iGPU only/900p/Bios A16) :)

    I had to use UEFI variables to unlock RAID0. As expected, the variable that needs to be changed is different from the E6440's ("0x19D"); in my case on the E6430 it is "0x12D". And I had to use them again to set up TOLUD in order to use the eGPU.

    Using Thaiphoon Burner

    I've purchased Thaiphoon Burner but I was unlucky. It turns out that one of my "cheap" RAM modules from 2012 (8GB Kingston 1600 CAS11) is a bit weak and only lets me lower the CAS to 10. At 1866 the E6430 won't boot, even with another module installed and with different CAS settings. The other module has no problems with 1866.

    I can set XMP profiles for both of them or even JEDEC. Some other modules I've tested only allow me to create XMP profiles.

    With other Ram modules (2x4GB) I managed to get 1866 working in dual channel.

    Rereading your edited post, I say nicely done. Could you elaborate on your RAM module (s/n and/or a photo of the memory chips) so we can avoid them in the future?

    Also, could you add your CPU? I have been thinking of building a table on the first post with user configurations and mods.

    Can you confirm that memory with JEDEC timings @ 1866MHz or 2133MHz will automatically run at 1866MHz or 2133MHz respectively?

  22. @timohour What BIOS are you using? I'm currently on A07, was wondering if there's reasons to upgrade to A16? Would I lose any features?

    I am currently running A16 but I see no reason to upgrade your bios. No features gains/losses that I can think of.

    I am still running my memory modules and they still pick up their timings plug-and-play, and I can still set the multiplier using UEFI variables. What features do you have in mind?

  23. Hey guys? Can you tell me if the dual-heatpipe heatsink is compatible with the intel-only e6430? I have the opportunity to get one for about 20$ tonight.

    It seems that those two have different mounting holes but I can send you some images from mine so you can see for yourself

    WCtcaqo.jpg

    Seems like screw #1 is way out of position compared to the iGPU model. I wouldn't recommend it... For more photos check here.

    • Thumbs Up 1
  24. @Tech Inferno Fan I was wondering if it's possible to upgrade the e6430 with an e6440 mobo? Any idea?

    As an E6430 user I think this is almost impossible without serious modding. Even the ports are laid out the other way around.

    Hi Guys,

    One of my friends bricked his e6440 during a BIOS update (not exactly sure how it happened).

    Is there a way to do an emergency BIOS update (i.e. from a thumb drive) on the motherboard so that it can be made bootable again?

    Regards,

    Cheng Mun Wai

    If this happened during a BIOS update and it is still under warranty (you can check with the Service Tag) contact Dell support and they will replace the motherboard.
