Posts posted by angerthosenear

  1. I was wondering if underclocking the GPU memory by about 750 would bring down heat for a better core clock? I never seem to need to utilize the full memory speed in any games I play anyhow.

    I'll try this and report in tomorrow. I don't have an issue with the GPU getting too hot; I wouldn't really call 75C hot. Maybe that's just me though.

    Might need another vbios mod to allow the slider to bump up a bit more if this does have a significant enough impact.

    • Thumbs Up 1
  2. I had an 8470P for a little bit. Nice machine, but it was noticeably heavier than a 2570P and larger too. The only benefits were the 900P LCD, the bigger heatsink and fan (resulting in better cooling), and dedicated PgUp/PgDn keys. mSATA I'm not sure about; besides, it's designed to be a cache drive rather than a boot drive even if it did work. eDP I very much doubt.

    I'd suggest you look at a 14" Dell E6440 instead. A lighter/thinner Haswell machine with a socketed CPU. *Some* have an ExpressCard slot too. 900P is an option. If it uses eDP then there's potential to install a 1080P 14" LCD too. It has an HD 8690M dGPU option as well; no mSATA though. Only problem is it will be pricier than an 8470P.

    The size / weight doesn't bother me much. I used to carry a 17" brick around before I got the T901 and the m14xR2.

    I was talking with @svl7 and he said the 8570P (at least the one he used) had a 1080P display, so that's pretty attractive. Potential for an RGB IPS display too, apparently.

    I use a mSATA drive for my downloads folder lol..... it's not overly critical, just neat to have.

    The E6440 is about $200 more than the 8470P, so that's a bit much for me. I'm going to be firm on the $500 max to leave money for a new GPU and PSU for my desktop.

    Apart from the screen, any other advantage with the 8570P over the 8470P? Its price can exceed the $500 limit, so that might not happen regardless.

    ---

    Any large critical difference that I'm missing apart from size/weight between the 2570P and the 8470P? Don't worry, I'm not bashing the 2570P at all, it's just 12" is a bit too small for me. Not sure if you've seen my desktop setup, but I love my screen real-estate ;D

  3. Hello TI members,

    I want to ask a noob question.

    Maybe one of you has already tried and successfully used an eGPU in SLI mode?

    I just wonder if this might be possible, since I want an x2 configuration, but I only have port 1 (WLAN) and port 3 (ExpressCard) available. Port 2 is already used by the memory card reader, and freeing it would require a hardware-level modification that is not so easy to do.

    It would need 2 eGPU adapters (PE4H or PE4L + 2 HDMI cables), but it might be worth trying if the result would be great.

    Thanks

    SLI won't work unless there exists some dual-port TB adapter, and even then I'm sure a lot of fiddling will be needed.

    Crossfire is a PAIN to set up, and not worth it imo; much better to just spend the money on one nice single card instead.

    You can see my experience with Crossfire here:

    http://forum.techinferno.com/diy-e-gpu-projects/5622-%5Bblog%5D-crossfire-testing.html#post82280

    • Thumbs Up 1
  4. Seeing as there is no dedicated 8470P thread, I'm guessing this thread encompasses both since they are so similar. I greatly like the 14" laptop I currently have and didn't like the 13.1" Fujitsu T901 I had earlier; it felt too small. So I'm going to go with an 8470P or 8570P, whichever I can find cheap. I am planning on selling my current AW m14xR2 w/o drives and getting one of these nice HPs plus a GPU for my desktop.

    Soooooo, all the little mods on the 2570P, do those apply to the 8470P/8570P as well? Battery, wireless, etc.

    I'm not really shooting for gaming performance out of my laptop anymore since I got a desktop. I'm more concerned about battery life and perhaps some very small-end CAD work (engineering student). I don't plan on using an eGPU with it either, so one with a dGPU might be handy. Should I look into the 8470W/8570W then? The price is a bit steep for my liking though. I'd like to stay sub-$500 (preferably around $400).

    What display options for the 14" / 15" models can be modded in? I know the 2570P uses an LVDS connector and maxes out at 900p for 14" panels. Rather, are the 8470P / 8570P LVDS as well, or is one of those two utilizing eDP?

    ---

    Anyone got a 8470P / 8570P + money they'd like to trade for a m14xR2 (specs in sig)?

    ---

    As for mSATA, would the 8470W BIOS (the 8470W has mSATA support) work on the 8470P to enable the mSATA port if the connections were bridged? Or are these systems too different?

  5. For the CPU, I'm still messing with it to make sure it isn't overly hot. 4.5GHz is a bit too hot under full load (over 85C, upwards of 90C+).

    For the GPU, since it has the same core as the 660M and 750M, it can be OC'd quite a bit (I can't even get it over 75C when I'm really trying to heat it up).

    Here is a screenie of NVI for the settings I use when gaming on the go:

    (attached screenshot: TA11pB3.png)

    • Thumbs Up 2
  6. Hi All

    I am just about to get some new 21" monitors and intend to use them for my laptop.

    Can anyone tell me if I can use the DisplayPort with the relevant adapter and also the HDMI out at the same time to drive 2 independent monitors?

    Hope someone can help.

    More than likely just a maximum of two active displays.

    I tried with my m14x R2 and could only have two displays active at a time:

    internal + VGA connected

    internal + HDMI connected

    VGA + HDMI

    I don't have any mDP adapters, so you'd have to test that. Would be curious though.

  7. @Anthonycy

    Were you installing Win8 via UEFI / GPT install or legacy mode? That would explain why Win7 said everything was all messed up during install. I usually drop all partitions on the drive and then just click 'Next' in the installer without formatting any partitions. Then again, I've only downgraded from Win8 to Win7 once (and that was a while ago).

    ---

    As for disabling the iGPU, I can't help much there. I can't do it on my m14xR2 since the screen is attached to the iGPU and not the dGPU. Someone else would have to chime in on that one.

    What error does it give you exactly on your 6990M (right-click -> properties)? No driver? Not enough resources?

  8. I tried disabling Optimus on my m14x R2. But that just ended with bricks and blind flashing of the BIOS. Since the display is hooked up to the iGPU, I don't think I can disable the iGPU and have just the dGPU. It just beeps at me. If you want, I can disable that and try eGPU stuffs. Might just result in more bricks haha (I'll unplug the speakers, I hate hearing the beeps).

    Would you want me to test:

    iGPU + dGPU + eGPU (if possible, eGPU shows up in device manager, but not enough resources)

    iGPU + eGPU (this will probably work, eGPU would have enough resources)

    dGPU + eGPU (don't think this will do anything, dGPU would be stuck since there would be no iGPU)

    eGPU (perhaps? not really sure on this one. Would have to figure out what

    ?

    I can give it a whirl tonight / tomorrow if you want. I'm running Windows 8.1 Update 1 at the moment with a GPT/UEFI install, so don't think Setup 1.x will work (unless something changed).

  9. Do you think it would be useless with the HD 5830? Performance surpassing the GT 750M is enough for me for now. After that, if I can get this system working, I will buy a better graphics card.
    To be honest I don't think you will gain much at all.

    AMD + PE4L2.1b/EXP GDC will only give a PCI-e 1.2 link, whilst NVIDIA cards have the benefit of utilizing Optimus compression => a 1.2opt link.

    See 5830 vs 750M for comparison.

    Pretty much this. I can't see the HD 5830 being much of a gain over the GT 750M unless you are doing stuff like CAD work or encoding, or something that benefits from having more cores and isn't overly bandwidth sensitive. Your GT 750M would probably be much better if you are gaming. You might be able to OC your GT 750M nicely too.

    The GT 650M (same core as the GT 750M) OC's really well.

    NVIDIA GeForce GT 650M video card benchmark result - Intel Core i7-3820QM Processor,Alienware M14xR2
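    To put the link-speed shorthand above in context ("1.2" meaning an x1 lane at PCIe 2.0 speed, "1.2opt" the same link with Optimus compression on top), here's a rough bandwidth sketch in Python. The figures are textbook per-lane rates, not measurements from this hardware:

```python
# Rough PCIe link bandwidth: raw transfer rate x lane count,
# discounted by 8b/10b line encoding (used by PCIe 1.x and 2.0),
# then converted from gigabits to gigabytes per second.
def pcie_bandwidth_gbs(gt_per_s: float, lanes: int = 1) -> float:
    encoding_efficiency = 8 / 10  # 8b/10b: 8 data bits per 10 line bits
    return gt_per_s * lanes * encoding_efficiency / 8  # bits -> bytes

x1_gen1 = pcie_bandwidth_gbs(2.5)  # x1 PCIe 1.x ("1.1"): 0.25 GB/s
x1_gen2 = pcie_bandwidth_gbs(5.0)  # x1 PCIe 2.0 ("1.2"): 0.5 GB/s
```

    Optimus compression doesn't change the physical link; it just squeezes more frame data through that same 0.5 GB/s pipe, which is why NVIDIA cards come out ahead on ExpressCard setups.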

    • Thumbs Up 1
  10. 4k reads are a bit on the low end; other than that, quite impressive for a downloads drive lol

    Tapatalkin'

    Yeah, I might have to fiddle some; about 8 of the mSATA pins (on the mobo daughterboard) weren't even soldered, so I had to solder them up to get the connector... well... connecting. Will probably resolder tomorrow. That might help. Dunno.

    • Thumbs Up 1
  11. I've had my M14x R2 since Dec 2012. For almost a year I felt like I was duped by Dell for buying their all-talk, no-delivery crap machine. Then I came across this forum while researching how to safely overclock my notebook. Being a complete noob to all of this, as well as to BIOS/vBIOS flashing, I wasn't too sure about taking the risk of bricking my machine. I had bricked a few Android phones in my early days of rooting and flashing because I would read and do instead of researching and understanding. After joining this forum and learning for quite some time now, I feel more comfortable with what I want to do and I think it's time to start modding my PC. I wanna say thanks to everyone who has helped or come up with the mods and fixes, asked questions, and posted results. You all make this an amazing forum that I'm glad to be a part of!

    You can really get the R2 goin..

    Here is my HWBot submission:

    angerthosenear`s 3DMark11 - Performance score: 3547 marks with a GeForce GT 650M

    I had 1271 MHz on the core @ 1.2V. Thermal throttling disabled.

    I even beat out a lot of 650M SLI setups ;D ......

    --

    I would like to get a 3920XM to put in here. I have my 3820QM set to 41x/40x/39x/39x and I'm still not running into thermal issues at all. It would easily be able to support an Extreme processor.
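    For anyone unfamiliar with the multiplier notation, those per-core turbo bins translate to clock speeds against the base clock. A quick sketch, assuming the stock Ivy Bridge 100 MHz BCLK:

```python
BCLK_MHZ = 100  # stock Ivy Bridge base clock

# active cores -> turbo multiplier, matching the 41x/40x/39x/39x setting
turbo_bins = {1: 41, 2: 40, 3: 39, 4: 39}

# resulting frequency per active-core count, in MHz
freqs_mhz = {cores: mult * BCLK_MHZ for cores, mult in turbo_bins.items()}
print(freqs_mhz)  # {1: 4100, 2: 4000, 3: 3900, 4: 3900}
```

    So that's 4.1 GHz single-core down to 3.9 GHz all-core, which on a 45 W quad is indeed a fair bit of sustained heat.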

  12. First of all, thank you to this forum for what everyone is doing for people like me. Second, angerthosenear, thank you very, very much for spending some of your time on me.

    I saw something about using an external graphics card. I forgot where I saw it, but how possible is that? Is it applicable to our m14x?

    For eGPU projects, see the forum section here:

    DIY e-GPU Projects

    It's got a lot of information; be sure to read up on it if you intend to use an eGPU.

    I tried it on the m14x very quickly. It doesn't seem like it'll be too hard to set up. It would probably need Setup 1.x to accomplish, but I'm on Win 8.1 with a GPT install so I can't quite do that. Could probably fiddle around enough and make it work, however.

  13. Okay, so you mean to say there is not much difference? It is still a GT 555M with 3GB DDR3, am I correct? Then everything else is the same?

    Well, it would certainly help out if you are maxing out your VRAM (like I did). Everything else is the same. Same GPU core and such. I got svl7's unlocked bios mod on it atm if interested. Once you get another quality post I can PM you.

    ---

    Tested the multi-display thing. It can only have a max of two active displays (even with the R2).

    http://i.imgur.com/9VnWiqD.png (semi large screenie)

    http://i.imgur.com/JpGKMyj.jpg (pic of setup)

  14. Hi,

    Thank you for your response. What will I get if you are going to sell yours? Is it worth it to upgrade just for that purpose?

    Honestly, not much. You'd be better off just swapping the R1 mobo for an R2 mobo. The jump from a GT 555M with DDR3 to a GT 650M with GDDR5 (which has the same core as the 660M and can be clocked as such) is a very worthwhile upgrade. See my link to that in my signature.

    I'll test the multi-screen thing tomorrow, been busy with stuff this weekend.

  15. I tried to take a picture of everything possible.

    First of all, here are my system specifications: http://puu.sh/7XZe7.png

    And here is my setup.

    1. This is my power adapter, an AC 230V: https://dl.dropboxusercontent.com/u/40343524/Photo%2006-04-14%2011%2015%2056.jpg

    2. My power adapter is connected to the swex with a 24 pin, as seen here: https://dl.dropboxusercontent.com/u/40343524/Photo%2006-04-14%2011%2016%2006.jpg

    3. This is my r7 260x in a powered state and connected to the laptop and the PE4L, the fans are running: https://dl.dropboxusercontent.com/u/40343524/Photo%2006-04-14%2011%2016%2019.jpg

    4. These are the lights which are on when the PE4L and GPU are connected to the laptop, the yellow and orange ones. (A red light shows up when the PE4L isn't connected to the computer, but goes away once I connect it): https://dl.dropboxusercontent.com/u/40343524/Photo%2006-04-14%2011%2016%2027.jpg

    5. The PE4L is connected to the power adapter by a 4 pin as seen here: https://dl.dropboxusercontent.com/u/40343524/Photo%2006-04-14%2011%2016%2046.jpg

    6. This is the mini HDMI cable, I believe, which connects the PCIe port and the PE4L: https://dl.dropboxusercontent.com/u/40343524/Photo%2006-04-14%2011%2017%2000.jpg

    7. Same as above but closer, in case it's hard to see in the previous picture: https://dl.dropboxusercontent.com/u/40343524/Photo%2006-04-14%2011%2017%2006.jpg

    8. Yet again the same as above, but with the original wifi card installed: https://dl.dropboxusercontent.com/u/40343524/Photo%2006-04-14%2011%2021%2029.jpg

    9. A picture of the PE4L itself in case someone is interested: https://dl.dropboxusercontent.com/u/40343524/Photo%2006-04-14%2011%2022%2035.jpg

    Here are the properties of the WiFi card when installed, which PCI port it uses, etc: http://puu.sh/7XZCU.png

    I'm really grateful for your help, and big up on the Elfen Lied profile pic.

    Hehe, I have a huge Elfen Lied poster on my wall too.

    You shouldn't get the red light when the PE4L isn't connected to the laptop. It should ALWAYS be yellow and green. Please have a look again.

    Actually, I've always had yellow+green+red when the eGPU was powered but laptop was off. The red light goes away as soon as the laptop was powered on.

    --

    Everything hardware-wise looks fine. Personally, I never had that two-wire cable connecting the PE4L and the SWEX; some do, some don't. Don't think it'll make a difference though.

    Have you tried updating your BIOS?

    Is your eGPU detected when you enter Setup 1.x? If so, then it'll just require some fiddling to get it working in Windows, but it should show up in Windows if that was the case (at least with an error).

    • Thumbs Up 1
  16. Man that turned out awesome. Glorious work!

    I'm guessing you might shave off all of a fraction of a degree if you cut away all of the plastic there (the diagonal bits).

    I also noticed you put a copper sheet across both of your heatpipes; I'm guessing this is to even out the temperature between both pipes / for heat distribution?

    Also, I see little aluminum/steel plates under two of your screws + springs; was the stock mount that uneven @@?

    I hunted around for a while for a 12mm or larger heatpipe, but still couldn't find any reasonably priced. And looking at it, direct heatpipe cooling looks to be tricky there; it would need two sharp 90-degree bends. Could you perhaps replace the copper plate you have with a thin copper sheet, or would that be more detrimental, since it would be like having your current dual heatpipes directly on the die, leaving hotspots?

    Another thought, space restrictions permitting: could you move your fan back about an inch? You could use shorter, straight heatpipes and then just fab a tunnel to exhaust the heat. Or does heatpipe efficiency not change much with that minor a change in heatpipe length and the bend? (I lost the page to that calculator thing.)

    ---

    Completely unrelated, we have the same fingers... das kinda creepy.

  17. I tried every possible thing o.o

    Setup 1.x, hot plugging, anti-whitelisting the PCI, messing around with the delay... I'm about to give up :(

    I'm really frustrated because I spent a lot on this setup.

    I was just messing with a M5110 (AMD version of the N5110) that came into the computer shop I work at. That WiFi card is in a horrible location...

    Could you take a picture of your setup? You said you have the floppy power cable going to your PE4L and then the PCIe power cables going to your GPU correct? Does the GPU fan spin?

    What lights on the PE4L adapter are lit and what color? This'll help the most.

    What OS are you using?

    • Thumbs Up 1
  18. hi all,

    I was wondering if it is possible to get multiple displays on my m14xR1 with the GT 555M 1.5GB.

    I know 2 monitors is possible, but how about 3?

    I'm only going to take a guess on this one. But since the dGPU has to pass back through the iGPU, and SB-based iGPUs can only support two displays, I feel you will be restricted to two active monitors.

    I used to have an R1 (I still have the mobo, trying to sell it), but I have the R2 now. I could try it with the R2 and see if I can have internal + 2 external active. I don't have a miniDP adapter of any sort, so I can't test 3 external.

    Side note, I got the 3GB version if you wanna upgrade ;D ..

    :shameless plug: hehe

  19. Also, try not to discharge below 20% if you can help it. I've done this with my phone (original iPhone), and even 6 years later it is still amazing (3-5 days battery life). For laptops, when you are on the go it's a bit harder; perhaps try to stick with a 35-50% charge. Mainly because:

    DISCHARGE RATE IS NOT LINEAR

    It isn't. So don't think it is. ;D

    Here is some experience from custom flashlight building (my other hobby). Here are some discharge charts for various batteries:

    http://www.lygte-info.dk/info/Battery%2018650%20UK.html

    You can also see how much capacity changes with different current draws:

    Battery test-review 18650 summary

    As you can see, the voltage drops as the battery cells get more and more discharged. Guess what happens? The controller has to pull more current to make up for the loss in voltage so the output power stays the same. Your computer isn't going to run off 9V when you have an 11V battery, so the battery has to supply more current from the cells to deliver the power required.

    Remember to take this into account when you view your battery percentage, it's quite important.
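    To make the constant-power effect concrete, here's a minimal sketch. The 33 W load and the 3-cells-in-series pack are hypothetical round numbers, and real packs have more cells plus conversion losses:

```python
# At constant power draw, current pulled from the pack rises as the
# cells discharge and their voltage sags: I = P / V.
def pack_current_a(power_w: float, cell_v: float,
                   cells_in_series: int = 3) -> float:
    return power_w / (cell_v * cells_in_series)

full = pack_current_a(33, 4.2)   # ~2.62 A at full charge (4.2 V/cell)
empty = pack_current_a(33, 3.3)  # ~3.33 A near empty (3.3 V/cell)
# Same watts out, roughly 27% more amps in near empty -- so the last
# stretch of the charge gauge drains faster than it suggests.
```

    That rising current is also exactly what the discharge charts linked above show eating into usable capacity.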

    • Thumbs Up 1
  20. It's interesting. Your CPU doesn't seem to be much more powerful. Which map did you play? The Siege of Shanghai map is more difficult to handle for the CPU. I tried other maps too. Your GPU Utilization is way more stable, which means that the CPU is not bottlenecking the GPU that much. Mantle shows better results, while comparing powerful CPU and GPU combinations, but it should work with the weaker ones too. Your GPU is a much better performer. I'll make sure about Mantle working or not with the Star Swarm benchmark.

    Laptop: i7-2620M with R9 280X eGPU

    Desktop: i7-4770K with R9 280X

    I played in the Test Range so I can have the most stable comparison between tests. Things like crashing through the concrete walls / shooting those tended to give the largest performance drop.

    I wanted to stray from CPU intensive things since the T901 does get really hot and likes to throttle.

  21. Yes, I am sure. Safe mode and many reboots to be sure.

    I used the latest 14.3 drivers and older ones: 14.1 and 13.9 (which lacks Mantle support). The performance was pretty much the same. I'll check if Mantle works in the Star Swarm demo tomorrow.

    I could post the videos, but I'm still working on my DIY enclosure... and I'll need to return the GPU on Monday. So, I'm not sure if it will be possible to record the video.

    If you're planning on making a DIY eGPU, why don't you stay with Nvidia graphics? They work very well and the performance is quite good. Only GeForce is capable of the fastest ExpressCard connection, 1.2Opt. Radeon lacks Optimus support, and AMD didn't make an Optimus alternative. I hoped that Mantle could be a performance booster.

    Mantle does help with performance. I did some testing here:

    http://forum.techinferno.com/diy-e-gpu-projects/2109-diy-egpu-experiences-%5Bversion-2-0%5D-54.html#post82601

    and comparison in a desktop here:

    http://forum.techinferno.com/amd/5908-amd-catalyst%99-14-1-mantle-beta-driver.html#post83225

    The improvement is about the same for a desktop GPU and a laptop eGPU (about 10fps in BF4 on 14.1). I don't have Star Swarm and didn't try it. A nice Nvidia card is still better for eGPU configurations.

  22. It was probably your RAM setting that caused the issue. I know when I tried clocking the RAM higher on my system (AW m14xR2), it refuses to boot, beeps at me, and eventually decides to reset its settings / I have to clear CMOS.

    Did you change this setting and then it all went downhill? Or was this something you changed between edit 1 and 2?

    • Thumbs Up 1
  23. It's MATLAB; the last time I used it was 10 years ago for my thesis... it was slower than death... it took me 3 months to do something that in C required at most a week... I am pretty sure things have changed since then, and it will probably offer code optimization for the toolboxes you use, but I wouldn't expect miracles.

    Don't worry, it hasn't changed much. I haven't run anything fancy that required a decently measurable amount of time to run, but even with the small stuff I think "this could be a bit faster...."

    So yeah, don't expect it to run blazing fast.
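    The usual workaround in MATLAB (and any interpreted array language) is to vectorize instead of writing element-by-element loops. A minimal Python/NumPy illustration of the same principle, purely as an analogy rather than MATLAB code:

```python
import numpy as np

def loop_sum_squares(xs):
    # Interpreted per-element loop: the kind of code that crawls in
    # MATLAB and plain Python alike.
    total = 0.0
    for x in xs:
        total += x * x
    return total

data = np.random.rand(100_000)

# Vectorized version: the loop runs inside compiled library code.
vectorized = float(np.dot(data, data))

# Same result, typically orders of magnitude faster for large arrays.
assert np.isclose(loop_sum_squares(data), vectorized)
```

    The interpreter overhead per element is what dominates the loop version, which is why pushing work into built-in array operations pays off so much in these languages.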
