angerthosenear

Moderator
  • Posts: 551
  • Joined
  • Last visited
  • Days Won: 10

Everything posted by angerthosenear

  1. I'm using the vBIOS off a random Clevo. With the Dell one you are stuck with the 67C thermal limit. I can't use ThrottleStop since it'll mess up the voltage the CPU needs to operate, at least for me. I've done all my changing in the BIOS or with XTU. What CPU do you have? I have the 3920XM running at 4.4GHz at a little over 1.3V atm. I don't really run into thermal issues, and that's around 65W.
  2. I'll try this and report in tomorrow. I don't have issue with the GPU getting too hot. I wouldn't really call 75C hot. Maybe that's just me though. Might need another vbios mod to allow the slider to bump up a bit more if this does have a significant enough impact.
  3. The size / weight doesn't bother me much. I used to carry a 17" brick around before I got the T901 and the m14xR2. I was talking with @svl7 and he said the 8570P (at least the one he used) had a 1080p display, so that's pretty attractive. Potential for an RGB IPS display too, apparently. I use a mSATA drive for my downloads folder lol..... it's not overly critical, just neat to have. The E6440 is about $200 more than the 8470P, so that's a bit much for me. I'm going to be firm on the $500 max to leave money for a new GPU and PSU for my desktop. Apart from the screen, is there any other advantage with the 8570P over the 8470P? The price can exceed the $500 limit, so that might not happen regardless. --- Any large critical difference that I'm missing, apart from size/weight, between the 2570P and the 8470P? Don't worry, I'm not bashing the 2570P at all, it's just that 12" is a bit too small for me. Not sure if you've seen my desktop setup, but I love my screen real-estate ;D
  4. SLI won't work unless there exists some dual port TB adapter and I'm sure a lot of fiddling will be needed. Crossfire is a PAIN to setup, and not worth it imo, much better to just spend the money on one nice single card instead. You can see my experience with Crossfire here: http://forum.techinferno.com/diy-e-gpu-projects/5622-%5Bblog%5D-crossfire-testing.html#post82280
  5. Seeing there is no dedicated 8470P thread that I can find, I'm guessing this thread would encompass both since they are so similar. I greatly like the 14" laptop I currently have and didn't like the 13.1" Fujitsu T901 I had earlier; it felt too small. So I'm going to go with an 8470P or 8570P, whichever I can find cheap. I am planning on selling my current AW m14xR2 w/o drives and getting one of these nice HPs and a GPU for my desktop. Soooooo, all the little mods on the 2570P, do those apply to the 8470P/8570P as well? Battery, wireless, etc. I'm not really shooting for gaming performance out of my laptop anymore since I got a desktop. I'm more concerned about battery life and perhaps some very small-end CAD work (engineering student). I don't plan on using an eGPU with it either, so one with a dGPU might be handy. Should I look into the 8470W/8570W then? The price is a bit steep for my liking though. I'd like to stay sub-$500 (prefer around $400). What display options for the 14" / 15" models can be modded in? I know the 2570P uses an LVDS connector and maxes out at 900p for 14" laptops. Rather, are the 8470P / 8570P LVDS as well? Or is one of those two utilizing eDP? --- Anyone got an 8470P / 8570P + money they'd like to trade for a m14xR2 (specs in sig)? --- As for mSATA, would the 8470W BIOS (which has mSATA support) work on the 8470P to enable the mSATA port if the connections were bridged? Or are these systems too different?
  6. For CPU, I'm still messing with it to make sure it isn't overly hot. 4.5GHz is a bit too hot when under full load. (over 85C upwards of 90C+) For GPU, since it has the same core as the 660M and 750M, it can be OC'd quite a bit. (can't even get over 75C when I'm really trying to heat it up) Here is a screenie of NVI for the settings I use when gaming on the go:
  7. More than likely just a maximum of two active displays. I tried with my m14x R2 and could only have two displays active at a time: internal + VGA, internal + HDMI, or VGA + HDMI. I don't have any mDP adapters, so you'd have to test that. Would be curious though.
  8. @Anthonycy Were you installing Win8 via UEFI / GPT install or legacy mode? That would explain why Win7 said everything was all messed up during install. I usually drop all partitions on the drive and then just click 'Next' in the installer without formatting any partitions. Then again, I've only downgraded from Win8 to Win7 once (and that was a while ago). --- As for disabling the iGPU, I can't help much there. I can't do it on my m14xR2 since the screen is attached to the iGPU and not the dGPU. Someone else would have to chime in on that one. What error does it give you exactly on your 6990M (right-click -> properties)? No driver? Not enough resources?
  9. I tried disabling Optimus on my m14x R2. But that just ended with bricks and blind flashing of the BIOS. Since the display is hooked up to the iGPU, I don't think I can disable the iGPU and have just the dGPU. It just beeps at me. If you want, I can disable that and try eGPU stuffs. Might just result in more bricks haha (I'll unplug the speakers, I hate hearing the beeps). Would you want me to test: iGPU + dGPU + eGPU (if possible; the eGPU shows up in device manager, but not enough resources); iGPU + eGPU (this will probably work, the eGPU would have enough resources); dGPU + eGPU (don't think this will do anything, the dGPU would be stuck since there would be no iGPU); eGPU alone (perhaps? not really sure on this one. Would have to figure out what ?) I can give it a whirl tonight / tomorrow if you want. I'm running Windows 8.1 Update 1 at the moment with a GPT/UEFI install, so I don't think Setup 1.x will work (unless something changed).
  10. Pretty much this. I can't see the HD 5830 being much of a gain over the GT 750M unless you are doing stuff like CAD work or encoding or something that benefits from having more cores and isn't overly bandwidth sensitive. Your GT 750M would probably be much better if you are gaming. You might be able to OC your GT 750M nicely too. The GT 650M (same core as the GT 750M) OC's really well. NVIDIA GeForce GT 650M video card benchmark result - Intel Core i7-3820QM Processor,Alienware M14xR2
  11. Yeah, I might have to fiddle some, about 8 of the mSATA pins (on the mobo daughterboard) weren't even soldered, so I had to solder them up to get the connector ... well... connecting. Will probably resolder tomorrow. That might help. Dunno.
  12. ADATA ASP300S-64GM-C 64GB SATA 3Gb/s mSATA Not the fastest thing ever, but good enough since I just am using this for my Downloads folder
  13. It would not work through USB; you must use ExpressCard / mPCIe / Thunderbolt, which carry native PCIe signals. USB does not carry PCIe signals.
  14. You can really get the R2 goin.. Here is my HWBot submission: angerthosenear`s 3DMark11 - Performance score: 3547 marks with a GeForce GT 650M I had 1271 MHz on the core @ 1.2V. Thermal throttling disabled. I even beat out a lot of 650M SLI setups ;D ...... -- I would like to get a 3920XM to put in here. I have my 3820QM set to 41x/40x/39x/39x and I still am not running into thermal issues at all. Would easily be able to support an extreme processor.
  15. For eGPU projects, see the forum section here: DIY e-GPU Projects It's got a lot of information; be sure to read up on it if you intend to use an eGPU. I tried it on the m14x very quickly. It doesn't seem like it'll be too hard to set up. Would probably need Setup 1.x to accomplish it, but I'm on Win 8.1 with a GPT install so I can't quite do that. Could probably fiddle around enough and make it work however.
  16. Well, it would certainly help out if you are maxing out your VRAM (like I did). Everything else is the same. Same GPU core and such. I got svl7's unlocked bios mod on it atm if interested. Once you get another quality post I can PM you. --- Tested it multi-displays. Can only have a max of two active displays (even with the R2). http://i.imgur.com/9VnWiqD.png (semi large screenie) http://i.imgur.com/JpGKMyj.jpg (pic of setup)
  17. Honestly not much. You'd be better off just swapping the R1 mobo with an R2 mobo. The jump from a GT 555M with DDR3 to a GT 650M with GDDR5 (which has the same core as the 660M and can be clocked as such) is a very worthwhile upgrade. See my link to that in my signature. I'll test the multi-screen thing tomorrow, been busy with stuff this weekend.
  18. Hehe, I have a huge Elfen Lied poster on my wall too. Actually, I've always had yellow+green+red when the eGPU was powered but laptop was off. The red light goes away as soon as the laptop was powered on. -- Everything hardware wise looks fine. Personally, I never had that two wire cable connecting the PE4L and the SWEX, some do, some don't - don't think it'll make a difference though. Have you tried updating your BIOS? Is your eGPU detected when you enter Setup 1.x? If so, then it'll just require some fiddling to get it working in Windows, but it should show up in Windows if that was the case (at least with an error).
  19. Man that turned out awesome. Glorious work! I'm guessing you might shave off all of a fraction of a degree if you cut away all of the plastic there (the diagonal bits). I also noticed you put a copper sheet across both of your heatpipes; I'm guessing this is to even out the temperature between both pipes / distribution? Also, I see little aluminum/steel plates under two of your screws + springs, was the stock mount that uneven @@? I hunted around for a while for a 12mm or larger heatpipe, but still couldn't find any reasonably priced. And looking at it, doing direct pipe cooling looks to be tricky there; it would need two sharp 90 degree bends. Could you perhaps replace the copper plate you have with a thin copper sheet, or would that be more detrimental since it would be like having your current dual heatpipes directly on the die, leaving hotspots? Another thought, space restrictive: could you move your fan back about an inch? You could use shorter heatpipes (that are straight), then just fab a tunnel to exhaust the heat. Or does heatpipe efficiency not change much with that minor a change in heatpipe length and the bend? (I lost the page to that calculator thing). --- Completely unrelated, we have the same fingers... das kinda creepy.
  20. I was just messing with a M5110 (AMD version of the N5110) that came into the computer shop I work at. That WiFi card is in a horrible location... Could you take a picture of your setup? You said you have the floppy power cable going to your PE4L and then the PCIe power cables going to your GPU correct? Does the GPU fan spin? What lights on the PE4L adapter are lit and what color? This'll help the most. What OS are you using?
  21. I'm only going to take a guess on this one. But since the dGPU has to pass back to the iGPU, and SB based iGPUs can only support two displays, I feel you will be restricted to two active monitors. I used to have a R1 (still have the mobo, trying to sell it), but I have the R2 now. I could try it with the R2 and see if I can have internal + 2 external active. I don't have a miniDP adapter of any sort so I can't test 3 external. Side note, I got the 3GB version if you wanna upgrade ;D .. :shameless plug: hehe
  22. Also, try not to discharge below 20% if you can help it. I've done this with my phone (original iPhone). And even 6 years later it is still amazing (3-5 days battery life). For laptops when you are on the go it's a bit harder. Perhaps try to stick with 35-50% charge. Mainly because: DISCHARGE RATE IS NOT LINEAR. It isn't. So don't think it is. ;D Here is some experience from custom flashlight building (my other hobby). Here are some discharge charts for various batteries: http://www.lygte-info.dk/info/Battery%2018650%20UK.html You can also see how much capacity changes with different current draws: Battery test-review 18650 summary As you can see, the voltage drops as the battery cells get more and more discharged. Guess what happens? The controller has to pull more current to make up for the loss in voltage so the output is still the same. Your computer isn't going to run off 9V when you have an 11V battery, so the battery has to supply more current from the cells to put out the 11V required. Remember to take this into account when you view your battery percentage; it's quite important.
  23. Laptop: i7-2620M with R9 280X eGPU Desktop: i7-4770K with R9 280X I played in the Test Range so I can have the most stable comparison between tests. Things like crashing through the concrete walls / shooting those tended to give the largest performance drop. I wanted to stray from CPU intensive things since the T901 does get really hot and likes to throttle.
  24. Mantle does help with performance. I did some testing here: http://forum.techinferno.com/diy-e-gpu-projects/2109-diy-egpu-experiences-%5Bversion-2-0%5D-54.html#post82601 and a comparison in a desktop here: http://forum.techinferno.com/amd/5908-amd-catalyst%99-14-1-mantle-beta-driver.html#post83225 The improvement is about the same for a desktop GPU and a laptop eGPU (about 10fps in BF4 on 14.1). I don't have Star Swarm, so I didn't try it. A nice Nvidia card is still better for eGPU configurations.
  25. It was probably your RAM setting that caused the issue. I know when I tried clocking the RAM higher on my system (AW m14xR2), it refuses to boot, beeps at me, and eventually decides to reset its settings / I have to clear CMOS. Did you change this setting and then it all went downhill? Or was this something you changed in between edit 1 and 2?
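A quick editor's sketch of the overclock arithmetic from post 1 (the 3920XM at 4.4GHz / ~1.3V / ~65W): dynamic CPU power scales roughly with frequency × voltage², which is why a small voltage bump costs a disproportionate amount of heat. The 65W / 4.4GHz / 1.30V operating point is from the post; the 4.5GHz / 1.35V target is a hypothetical illustration, not a measured result.

```python
def scaled_power(p0_w: float, f0_ghz: float, v0: float,
                 f1_ghz: float, v1: float) -> float:
    """Estimate power at (f1, v1) from a known point (p0 at f0, v0),
    using the rough dynamic-power relation P ~ f * V^2."""
    return p0_w * (f1_ghz / f0_ghz) * (v1 / v0) ** 2

if __name__ == "__main__":
    base_w = 65.0  # reported: ~65 W at 4.4 GHz, ~1.30 V
    # Hypothetical step to 4.5 GHz at 1.35 V:
    est = scaled_power(base_w, 4.4, 1.30, 4.5, 1.35)
    print(f"Estimated package power: {est:.1f} W")  # roughly 72 W
```

The takeaway matches the posts: the extra ~100MHz is cheap, but the voltage needed to stabilize it is what pushes the chip past the thermal limit.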
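The battery point in post 22 (discharge is not linear) comes down to one relation: a laptop is roughly a constant-power load, so as cell voltage sags, current draw I = P/V rises, draining the remaining capacity faster. A minimal sketch, with a hypothetical 30W load and typical 3-cell Li-ion pack voltages (these numbers are my illustration, not from the post):

```python
def current_draw_a(load_w: float, pack_v: float) -> float:
    """Current a constant-power load pulls from the pack at a given voltage."""
    return load_w / pack_v

if __name__ == "__main__":
    load_w = 30.0  # hypothetical steady laptop load
    for v in (12.6, 11.1, 9.9):  # full / nominal / near-empty 3S Li-ion pack
        # Same wattage out, more amps in as the cells sag:
        print(f"{v:5.1f} V -> {current_draw_a(load_w, v):5.2f} A")
```

Higher current at lower voltage also increases resistive losses in the cells, which is why the last stretch of the battery gauge disappears faster than the first.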