
sangemaru

Registered User
  • Posts: 157
  • Days Won: 1

Everything posted by sangemaru

  1. Thanks for chiming in. Mostly I want to make sure that any tweaks/optimizations are there (still holding out a tiny bit of hope that my failing RAM might be due to the system MRC being older than MRC 1.5.0.0, and that I might fix the glitching with just a BIOS update - fat chance, but I'd still try it). And yeah, my main worry is that a newer BIOS might refuse to set JEDEC speeds over 1600, as I'd seen in one user's post on the forum, where he had to use XMP to get his RAM back to 2133.
  2. @timohour What BIOS are you using? I'm currently on A07 and was wondering if there's any reason to upgrade to A16. Would I lose any features?
  3. Hey guys, can you tell me if the dual-heatpipe heatsink is compatible with the Intel-only E6430? I have the opportunity to get one for about $20 tonight.
  4. That looks so delicious I'd eat it. Especially with DX12 making better use of the iGPU and that shitty undervolted dual-core CPU.
  5. Sold the machine, kept the CPU and RAM; I'll test them in an E6430 to see if it was the components or the machine at fault. Sent from my Neken N6 using Tapatalk
  6. @Tech Inferno Fan I was wondering if it's possible to upgrade the E6430 with an E6440 mobo? Any idea?
  7. The Kingston data sheet advertises it as compatible with Intel Panther Point, and even the previous Sandy Bridge chipsets. That shouldn't be the issue. Sent from my Neken N6 using Tapatalk
  8. Well, that didn't last long. I had a semi-stable config and wanted to see if Windows 10 would work. It didn't, and upon reverting back to Windows 8.1 the flickering resumed, and this time no driver could fix it.
  9. Huh, interesting thread. I just picked up a Kingston HX321LS11IB2K2/16 16GB HyperX Impact kit and ran into the behavior described here (basically the system is pretty unstable; I blame the iGPU). I'm trying to use these in an HP 2570p, whose BIOS has crap-all in terms of options. I was considering increasing the latencies, but I'm not sure it's the RAM that's the problem (memtests complete OK). Is it possible the i7-3740QM simply can't handle the clock rate? What would I have to do to make these chips usable, somehow delete every SPD profile above 1866MHz?
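     If you want to see exactly which JEDEC profiles the DIMMs advertise, on Linux you can dump the SPD EEPROM and decode the minimum cycle time. A minimal Python sketch, assuming the SPD shows up at a hypothetical sysfs path once the eeprom/at24 driver is loaded (the path below is a guess; it depends on which i2c bus the DIMMs sit on):

         # Decode a DDR3 SPD's top JEDEC speed from a raw EEPROM dump.
         # Byte offsets are per the JEDEC DDR3 SPD spec: bytes 10/11 hold
         # the medium timebase (MTB) dividend/divisor, byte 12 is tCKmin
         # in MTB units. This ignores the fine-timebase correction in
         # byte 34, so the fastest grades will read slightly off.
         SPD_PATH = "/sys/bus/i2c/devices/0-0050/eeprom"  # assumed path

         with open(SPD_PATH, "rb") as f:
             spd = f.read(128)

         mtb_ns = spd[10] / spd[11]           # usually 1/8 = 0.125 ns
         tck_min_ns = spd[12] * mtb_ns        # minimum clock cycle time
         rate_mts = round(2000 / tck_min_ns)  # DDR: 2 transfers per clock
         print(f"tCKmin = {tck_min_ns:.3f} ns -> about DDR3-{rate_mts}")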
  10. My Kingston HyperX Impact 2133 CL11 16GB kit arrived today. Needless to say, I was overjoyed to try out the difference between 4GB 1066 and 16GB 2133. The good news is, GTA 5 now runs smoothly in DX11 at 768p, whereas before it had to be 800x600 DX10. It's also amazingly stable - in GTA V, at least. Windows begins flickering a few seconds after login, followed by driver crashes and a system freeze soon after. So does Ubuntu. Putting a gaming load on makes the system instantly stable; alt-tabbing to the desktop makes it instantly unstable again. Tried it with one and two modules, same behavior. I used the BIOS test and the Windows memory test to check RAM integrity; it's all good. Apparently the machine simply hates the speed. I'm using the i7-3740QM CPU, and will also use the i5 to check - might it be a memory controller flaw? Anyone have any suggestions? Is there a way I can underclock the RAM on the 2570p? EDIT: Tried to boot the RAM with my i5-3320M; the system won't even POST. So the 3740QM is not strong enough to drive 2133 RAM?? EDIT2: Solved with the IntelliMod32 driver. If anyone runs into similar issues, give it a try.
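     For stressing the modules from inside the OS, here's a crude user-space pattern test in Python - only a rough sanity check, since the OS decides which physical pages you actually touch; a bootable tester like MemTest86 is still the real tool:

         # Crude user-space RAM test: fill a large buffer with known
         # patterns and verify every byte reads back. Catches gross
         # errors only; it can't target specific physical addresses.
         def pattern_test(size_mb: int = 512,
                          patterns=(0x00, 0xFF, 0xAA, 0x55)) -> bool:
             n = size_mb * 1024 * 1024
             buf = bytearray(n)
             for p in patterns:
                 buf[:] = bytes([p]) * n  # write the pattern
                 if buf.count(p) != n:    # read back and verify
                     return False
             return True

         print("pass" if pattern_test() else "FAIL: bit errors detected")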
  11. I wonder how much the iGPU performance increase will be going from 1066 CL7 to 2133 CL11. I have really crappy RAM. Sent from my Neken N6 using Tapatalk
  12. The way it sounds to my ear, whether I'm using TPFanControl or NBFC, I don't think it's going past around 3000 RPM.
  13. [Quote:]

          Strange for sure. Must be using some other internal algorithm to throttle the CPU then. May I suggest you move to a 14" Dell E6430 platform instead? It can unlock the i7-3740QM's extra 400MHz. REF: Unlocking i7-37x0QM/38x0QM extra 4 multipliers on a Dell Latitude E6430 http://forum.techinferno.com/dell-latitude-vostro-precision/9690-14-dell-latitude-e6430-performance-upgrades-system-mods.html An E6430 has a 1600x900 LCD and an Nvidia dGPU option, and can do a x2 2.0 eGPU. I find those units to be more of a powerhouse than a 2570P. There's only a small increase in weight and size. I 'upgraded' from a 2570P to a Haswell Dell E6440. After seeing the E6440 wasn't x2 2.0 eGPU capable AND that the Haswell CPUs run hot, I regretted not getting an E6430 instead. The only things that slightly favor the E6440 are the new FHD eDP model upgrade and the mSATA slot. The E6430 is worth sidegrading to. You can just pop in your i7-3740QM and start tweaking away. EDIT: I've bannered this info on the opening post of this thread.

      That machine looks incredibly tempting. I'd go for it, but I'd need to find a buyer for my 2570p first, and people don't seem very interested (I've had it listed for weeks now on multiple sites, and nothing). It's still got warranty until 2017. What do you think about that RSA sniffing hack? I don't personally have the resources or the tools, but in the hands of the right person, this could basically open up almost any interesting locked-down machine (HP EliteBooks, Dell Precisions, etc.). Edit: If you happen to run into someone looking for a 2570p, mind giving me a heads up?

      [From a separate post:] Since you're not trying to keep an OS from the old HDD on the new one, there's absolutely no problem just copy/pasting. The M9T is a really sweet piece of hardware. I also got mine for $100 right at launch.
  14. But isn't it weird for the multiplier not to tick up with HT disabled, given the corresponding massive decrease in TDP? I mean, I'm at like 29W with HT disabled.
  15. Does this look weird to anyone else, or just me? Disabling Chipset Clock Mod and Clock Modulation, as well as Hyper-Threading, lowered my TDP and voltage a decent bit, and temps went down by as much as 18C. But no matter what I do, the CPU isn't breaking 33x, and I'm pretty sure by this point it has nothing to do with power consumption, voltage, or TDP/heat. I'm running Windows 8.1 and have cycled through a 3612QM and a 3320M, and now I'm on an OEM 3740QM. @Tech Inferno Fan did you happen to run into any such behavior when testing out your own 45W CPU? EDIT: Also, single-core performance can push up to somewhere around 3.55-3.6GHz and no higher, even though it should reach 3.7. EDIT2: Supposedly there's new tech out there that allows attackers to steal encryption keys, including RSA keys. RSA such as what HP uses to protect the BIOS? Suppose we got our hands on the RSA keys... could we unlock the BIOS?
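      To put numbers on the multiplier cap, a quick sanity check in Python, assuming Ivy Bridge mobile's 100MHz BCLK (this is just the arithmetic, not ThrottleStop output):

          # Core clock = BCLK x multiplier. With the stock 100 MHz BCLK,
          # a 33x ceiling pins the chip at 3.3 GHz, the 3740QM's rated
          # single-core turbo needs 37x, and the E6430 mod in the REF
          # thread above adds 4 bins on top (41x -> 4.1 GHz).
          BCLK_MHZ = 100

          def core_clock_ghz(multiplier: int) -> float:
              return multiplier * BCLK_MHZ / 1000

          for label, mult in [("observed 33x cap", 33),
                              ("rated 1-core turbo", 37),
                              ("turbo + 4 unlocked bins", 41)]:
              print(f"{label}: {mult}x -> {core_clock_ghz(mult):.1f} GHz")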
  16. So I got my hands on a 3740QM. Temps are better than my 3612QM's at higher clocks, but I still can't push an inch past a 33x multi. Anyone have suggestions? TS reports max power usage around 37W. How'd you guys reach 3.8-4GHz?
  17. I find the rules a bit unclear on whether soliciting outside the marketplace is allowed, but since I can't PM you, I'd just like to let you know I have a 3612QM I'd be willing to part with around your budget, due to financial hardship. Great chip, usually pretty cool at around 60-70C with low fan speeds. In my own machine it maxes out at 96C if I uncap vsync while playing SC2, for example, but there doesn't seem to be much in the way of throttling (partly because the speed bins don't go that high). It's a 35W part.
  18. I find the design quite interesting, from the tinkering I've done with laptops. The lack of HDD bays and the reliance on M.2 make storage capacity an expensive proposition. It's built for speed; quad RAID would be nice. Your design doesn't seriously account for the trace routing and mobo space spent on all those auxiliary features - the wlan/wwan/Bluetooth and port areas - or for using a dock to offload the space taken by multiple ports. It would be nice to take some cues from smartphone designs and see how they implement small, low-power, quality radios and features (wlan/wwan/3G/audio DAC/Bluetooth); basically, try to reproduce smartphone-level designs to get radio capability cheaply and compactly. There may be room for one hard-drive slot by pushing the CPU out of the center of the machine and closer to the top, keeping it away from the battery and other heat-sensitive components in the bottom area. I'm also a big fan of the caging designs used by Dell Precision machines: beautifully complex, yet tight and well packaged. Such a machine would be solid, not overwhelmingly heavy, and a true speed demon, so long as sufficient space and importance is truly given to the cooling system (and so long as we have hundreds of pages of user mods here at NBR pushing machines further and further, including cooling mods, adding heatpipes, etc.). I think we're positioned at the right point in time, with access to exotic materials (possibly indium/gallium foil in a high-pressure heatsink design around 100PSI - the heatsinks would be expensive because they'd have to be machined and lapped to great precision) and to energy-efficient designs, such that we could reasonably cool three spots generating 100W of heat, maybe more. Your design is still a traditional one; I've been seeing videos of much more interesting heatsinks that may cut down cooling space requirements tremendously: Cooler Master's "kinetic" engine turns a rotating heatsink into a fan - The Tech Report
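      For a rough feel for those cooling numbers, a back-of-envelope in Python; the thermal-resistance figures below are assumptions for illustration, not measurements:

          # Steady-state die temp ~= ambient + P x R_theta, where R_theta
          # is the heatsink's total thermal resistance in C/W. The R_theta
          # values here are assumed, illustrative figures, not measured data.
          AMBIENT_C = 25

          def die_temp_c(power_w: float, r_theta: float) -> float:
              return AMBIENT_C + power_w * r_theta

          for r_theta in (0.7, 0.5, 0.35):  # assumed C/W; lower = better
              print(f"100 W zone at {r_theta} C/W -> "
                    f"{die_temp_c(100, r_theta):.0f} C")
          # => each 100 W zone needs roughly 0.7 C/W or better to stay
          #    under ~95 C in a 25 C room.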
  19. OK, so as far as I can tell, you power the adapter through a 4-pin-Molex-to-floppy connector on the adapter (correct me if it's not there), and the adapter in turn supplies the card with up to around 75W through the PCIe slot. Extra power goes into the card directly, via the ATX PSU's standard GPU power connectors or a Molex-to-6-pin adapter. I hope for your sake @Tech Inferno Fan chimes in, because I'm only guessing based on what I've read so far; I haven't actually gotten an adapter yet.
  20. What amperage does the PSU theoretically provide on the 12V rail? Have you tried setting the ExpressCard slot to Gen1 speed in the BIOS to confirm it's not a signal-integrity issue? Unfortunately, I'm not sure exactly how the ATX PSU setup should work - I think you feed the PSU's 12V directly into the GPU and use one Molex-to-floppy cable to power the adapter.
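      To put numbers on the amperage question, a rough sketch assuming the nominal PCIe connector budgets (75W slot, 75W 6-pin, 150W 8-pin); the card wattage here is just an example value:

          # Rough 12 V budget check for an eGPU rig.
          def required_amps_12v(card_watts: float) -> float:
              return card_watts / 12.0  # I = P / V on the 12 V rail

          card_watts = 150.0            # example: a 150 W TDP card
          amps = required_amps_12v(card_watts)
          print(f"{card_watts:.0f} W card needs ~{amps:.1f} A on +12 V, "
                "plus headroom for transient spikes")
          # A PSU label listing e.g. 18 A on +12 V can supply ~216 W there;
          # compare that to the card's TDP before blaming signal issues.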