Everything posted by jot23

  1. A 13.3" or 14" notebook with powerful MXM graphics and maybe an aluminium body would be nice.
  2. Thank you, but I'm testing a Radeon R9 270 now. I'll try the GTX 660 soon, so your advice may be useful.
  3. That explains your CPU usage. Is this map made for a single player? I mean that there are no enemies, just you, the map and jets/tanks etc.? I'll give it a try tomorrow to make my benchmarks comparable to yours.
  4. It's interesting. Your CPU doesn't seem to be much more powerful. Which map did you play? The Siege of Shanghai map is harder for the CPU to handle. I tried other maps too. Your GPU utilization is much more stable, which means the CPU is not bottlenecking the GPU as much. Mantle shows better results when comparing powerful CPU and GPU combinations, but it should work with weaker ones too. Your GPU is a much better performer. I'll verify whether Mantle works at all using the Star Swarm benchmark.
  5. Yes, I am sure. Safe mode and many reboots, to be sure. I used the latest 14.3 drivers and older ones: 14.1 and 13.9 (which lacks Mantle support). The performance was pretty much the same. I'll check if Mantle works in the Star Swarm demo tomorrow. I could post the videos, but I'm still working on my DIY enclosure... and I'll need to return the GPU on Monday, so I'm not sure if recording a video will be possible. If you're planning on making a DIY eGPU, why don't you stay with Nvidia graphics? They work very well and the performance is quite good. Only GeForce is capable of the fastest ExpressCard connection, x1.2Opt. Radeon lacks Optimus support and AMD didn't make an Optimus alternative. I hoped that Mantle could be a performance booster.
  6. First log: BF4 on the Siege of Shanghai map, FullHD + medium settings. eGPU setup: HP 2570p, i5-3210m, MSI Radeon R9 270.
     The first 7 minutes are gameplay without Mantle; starting at the 7th minute, I played with Mantle. As we can see, Mantle gave me... nothing. The framerate is not very good, with many drops below 40 FPS. The CPU usage is very high and it bottlenecks the GPU. Mantle should help with CPU usage, but it does not. I'll check out the Star Swarm benchmark to see if Mantle even works on an eGPU. For now, Mantle gives me nothing. (A sketch of how such a HWiNFO log can be split and averaged follows the post list below.)
  7. That solved the issue! I'll post the logs soon.
  8. I have the HD 4000 iGPU and no dGPU. Maybe it's a driver conflict, because usually I use a GTX 660 and I didn't uninstall the Nvidia drivers. But the GeForce GPU works fine with AMD Catalyst installed.
  9. I wanted to benchmark Battlefield 4. Unfortunately, the framerate stayed at about 20-30 FPS. GPU usage seems strange: it sits at 30-50% all the time, while CPU usage is constantly at 100%. Moreover, the GPU Render Test in GPU-Z shows barely 20 FPS. I tried 4 different drivers, turning ULPS off, a different PSU... Nothing helped. Do you know what the issue could be?
  10. Finally I've got my hands on an MSI R9 270 GPU. My 3DMark11 score: i5-3210m + R9 270. The GPU score is significantly higher than that of the GTX 660: 6785 vs 6019 points. Does it perform that well in games? No. I've tested Crysis 3, AC IV and Batman: Arkham Origins. Some HWiNFO logs:
      a) Crysis 3, played on low settings at FullHD resolution. The performance is way worse than on the GTX 660. The FPS peaks in vents etc. are higher on the R9 270, but overall the 660 beats the R9 270. The third mission is barely playable on the R9 270 with low settings; with the GTX 660 I could play it on mixed medium to very high settings, getting higher FPS (about 35 average). I don't have a log for mission 2, "Welcome to the Jungle", with the "grass problem", but the FPS was about 10-15% lower in the grass, reaching a 15 FPS minimum. Neither card can achieve a playable framerate on that grass level.
      b) Batman: Arkham Origins, played on almost full settings, without motion blur. The game is fully playable on both cards. The benchmark contains both a mission in a closed space and some free running through the city.
      For now, I am disappointed with the R9 270 eGPU performance. I hoped that the GCN architecture could give me some magic like Nvidia Optimus does. I'm going to test BF4 with and without Mantle to see if that helps. In theory, CPU usage should be lower with Mantle, and the communication between the CPU and GPU, which is a serious drawback of eGPU due to PCIe bandwidth, should be faster.
  11. Hi! Not sure if you saw it, but I've tested an HD 6850 eGPU at x1 bandwidth. The drivers used were 13.9, so no severe driver-related slowdowns, I think. You could check out my Crysis 3 benchmarks (FPS logs) in the main topic. Since the 5770 and 6850 are based on the same old architecture, I think the same drivers apply to them, so the performance loss should be comparable. But the performance I got is about what you'd expect from such an eGPU setup.
  12. And here, catch "The Root of All Evil", part one, from the beginning to destroying the badass hydroelectric generator. HWiNFO log: Screenshots: placed in a spoiler, because there are 20 screenshots and someone might not have played the game yet... it's my favourite mission in the game, spectacular and mind-blowing even on low settings like in this test.
      I don't have the GTX 660 log at the moment, but I remember the mission was fully playable at >30 FPS minimum and about 35 average, played with high textures and med-high settings, so again, way better. Maybe the FPS was higher because of better CPU utilisation? The CPU usage stays at 60% avg on the HD 6850, while on the 660 it was more like 80-85%.
      Because I need a GPU comparable with the GTX 660, I'll rather stay at the level of the HD 7870 / R9 270. But the new Radeon series is... strange. The R9 270 is sometimes better, sometimes way worse than the 660, and it costs about 10-20% more than I paid for the 660 four months ago. The 270X is a bit more powerful, but the 270 should be easy to overclock to match 270X performance. Then there is a gap, because the next GPU is the 280X, which can be compared to the 680. There should be an R9 280, priced at $200 and competing well with the 660 Ti or 670. I'll probably take the 270 or find an older 7870, which has a higher GPU clock but a lower memory clock... and that's the only difference. I could take the 7870 XT too; it's priced at the level of the R9 270X here in Poland and performs better, but it's only available in Club3D and XFX versions, and I'd rather stay with Asus, MSI or Gigabyte.
  13. First GTX 660 vs HD 6850 comparison (I don't have access to the 660 atm... on the other hand, such a comparison is a bit pointless, because the 6850 is way weaker than the 660, and it seems to be more limited by PCIe bandwidth): Crysis 3, mission 6, from the beginning to the second Ceph AA defence. Getting there took me 7 minutes on the 660 and 12 minutes on the HD 6850 due to the lower FPS, which makes it hard to play well.
      Settings: resolution 1920x1080, all low.
      GTX660@x1.2Opt: look at the red line; green represents gameplay at high settings (textures and the rest) and is really short. AVG FPS: 45
      HD6850@x1.2: AVG FPS: 27
      Like before, the CPU usage is way lower than it was with the GTX 660. That's really worth consideration because, as we know, Nvidia drivers cause a greater CPU load than AMD drivers. It's much more visible on older Core 2 Duo based desktop PCs, causing microstuttering, but I think it might affect a Core i5 as well. The problem should not be present when using a quad-core CPU.
      On low settings, the GTX 660 should easily average 60 FPS. Not sure about HD 6850 performance, but according to benchmarks, 28 FPS is a valid result... for high settings. On low it should be more like 40 FPS, I suppose. So we can see that both cards are limited. By what... and now it's starting to get really difficult. I'm not sure which of the factors is more important here. Maybe the performance is CPU limited, like I think it is on the Welcome to the Jungle level. Of course, when it comes to the "grass moment", the PCIe bandwidth is a real drawback there, giving us drops to 20 FPS instead of a far more playable 30-35. On the GTX 660, mission 6 is really playable, even at high settings. On the HD 6850 it's not playable on low settings, which keeps me thinking about how a GCN-based card would perform. It would have to be less bandwidth-limited to maintain good FPS.
      p.s. I'm going to buy a GCN-based card like the HD 7870 / 7870 XT / R9 270(X), but given that it's Christmas time, I think I'll have to wait till January, because prices are a bit higher now and shipping might take very long. That's really important, because in Poland, when buying online, I can return the GPU within 10 days of testing without giving any reason. I'd like to use that privilege for some almost-free benchmarking if I don't stay with an AMD GPU.
  14. OK, maybe there is some magic that AMD implemented in the GCN architecture which works as well with eGPU as Nvidia Optimus does, and AMD doesn't even know how brilliant they are. So I'll try to get my hands on any GCN-based GPU. I'm excited about these benchmarks too.
  15. OK, I hoped that my score was simply too low. Looking at my performance, I doubt that going with an AMD GPU is a good choice. I thought it would be less demanding on the PCIe bus, but it's not. And using an Nvidia GPU is easier.
  16. I tested it on a desktop PC with this GPU, but that machine has an ancient Core 2 Duo E6550 CPU. Performance was fairly the same in the grass scene, which is why I thought it's rather CPU related. But the first mission is not playable even on low settings; the GTX 660 or the desktop HD 6850 gave me better FPS there. On our forum nhl.pl there is a user, waldeksik, who really helps us with testing by emulating an x1 2.0 PCIe bus on his Quadro 4000M. I'll ask him to make a Crysis 3 comparison, if he hasn't done it already (I'm not sure). Or we can wait till you get a new GPU. My GTX 660 and HD 6850 aren't going anywhere at the moment. To be honest, I hoped that AMD would give me higher FPS and thus be a better choice. Now, looking at your BF4 performance, I'd like to get a powerful AMD GPU, but looking at how limited the HD 6850 is in the grass scene, I'm not sure if Radeon is a better choice than GeForce. For now, given that the 6850 is way weaker than the 660, AMD and Nvidia are pretty much the same in eGPU use. I still think I'm missing something. I feel there is an important factor affecting my performance.
  17. You mean it's too low for a comparison, or too low for an HD 6850 card? Isn't it possible to get a kind of x1E 2.0? I mean, could I somehow use the x1E tweak on my setup? For now it's not possible to get a GCN-based GPU. I'm curious what the difference is between the HD 6850 and the 7770 as an eGPU. Maybe I could OC my GPU, but I'm not sure about the results. Crysis 3 performance is quite disappointing; I hoped to get FPS closer to the desktop HD 6850. The FPS is not CPU limited, is it?
  18. HP Elitebook 2570p: Radeon HD6850 eGPU implementation
      In order to compare AMD and Nvidia eGPU setups, I decided to build an HD 6850 eGPU implementation. It took me three days of struggle, but I finally got it working. The reason for the comparison is that I wanted to check to what degree both cards are limited by eGPU bandwidth. I've experienced huge FPS drops in a few games on my usual GTX 660 eGPU, and it got me thinking whether an AMD GPU would perform better. The great performance user sskillz achieved with an HD 7970 was very promising. Maybe an AMD GPU could somehow overcome the bandwidth limitation problem? Great thanks to Tech Inferno Fan and sskillz, who helped me successfully install the HD 6850 on my notebook. Of course, using the HD 6850 required doing a DSDT override and PCIe compaction. I used the newest Setup 1.3, because it's much more user friendly and powerful than the previous version I owned, Setup 1.1. After three days I achieved this: Looks nice to me.
      First, I ran a PCIe bandwidth benchmark using PCIeSpeedTest. The peak values over 20 runs were: CPU->GPU: 360 MB/s, GPU->CPU: 440 MB/s. Compared to the GTX660@x1.2Opt, there is no big difference; in fact, depending on how many runs we do, the numbers come out roughly the same. So there is a chance that comparing a GTX 660 with its AMD equivalent (7870?) would show similar performance. (A quick calculation of the theoretical x1 2.0 ceiling follows the post list below.)
      First bench: HD6850@x1.2, 3DMark11. To me the GPU score looks normal for an HD 6850, but on our forum nhl.pl people said it's too low... what do you think?
      And the most important test for me: Crysis 3, mission 2, "Welcome to the Jungle" (map "fields"). Its importance comes from being the first and most distinctive example of a GPU limited by eGPU bandwidth. After about 2 minutes of gameplay there is a beautiful "grass moment": the player goes outside and discovers a New York fully covered in grass and trees. The grass moves naturally with the blowing wind, and somehow it doesn't only load the CPU... it loads the PCIe bus really hard. On the GTX 660 I got about 20 FPS in this scene. How about the HD 6850? Unfortunately, performance is even worse than with the GTX 660. I could play the first mission on the GTX 660 at 30-45 FPS on medium settings; with the HD 6850 I could barely pass 25 FPS. In the second mission we see really low FPS, about 3 less on average than with the GTX 660. And here is the fun part: on the GTX 660 the CPU was constantly loaded at about 95-99%, while here we see a waaay lower CPU utilisation. GPU usage is not present in the HWiNFO logs, but checked with MSI Afterburner it's a constant 99% load.
      Now, what do you think about this performance? Is something wrong? Do I need to turn something off, like AMD HD Audio...?
  19. Would it be possible for the PCIe lanes to provide 150 W of total power for the GPU? Currently the PE4L can provide 75 W. So a GPU with very small power consumption, like the relatively weak HD 7750 or maybe a GT 440, can be powered by a small power brick. There is a power connector for this on the PE4L board, but it's limited to 75 W. If they could push it to 150 W, which I think shouldn't be a problem, as the PCIe 2.x standard doubled this amount to 150 W (compared to the 75 W provided by PCIe 1.x), it would be possible to power some great performance-per-watt cards, like the GTX 660 or HD 7850... And we could get rid of our big desktop PSUs and all the wires that go with them. There would be only one power wire, to the power connector of the adapter. (A quick power-budget check for this idea is sketched after the post list below.)
  20. Yes, of course, I considered buying a 660 Ti, but the 660 was way cheaper for me. My 3DMark11 score is very good for a 660 eGPU, but... I don't understand why we (mostly) can't get a score equal to the desktop one. In games we see that the PCIe bus sometimes causes a huge performance drop, but as I mentioned, 3DMark11 is far less demanding on the PCIe bus than those games (the Crysis 3 "Welcome to the Jungle" mission is a perfect example). So I just don't get it: how can we lose that much performance in 3DMark11? What causes it? The more powerful the card, the bigger the difference. For example, a GTX 760 scores about 1k points less than it should.
  21. The best score I've achieved with the 2570p, i5-3210m and a GTX 660 eGPU: NVIDIA GeForce GTX 660 video card benchmark result - Intel Core i5-3210M Processor, Hewlett-Packard 17DF. GTX660@1241MHz core / 6008 MHz memory (effective). I wonder what's holding back our 3DMark11 scores? It's not as demanding on the PCIe bus as some games like Crysis 3 or Assassin's Creed III. The GPU score should be a lot higher.
  22. Maybe HP could at least tell us whether there is any point in looking for an eGPU performance boost this way. If there's a 0% chance of overclocking PCIe, there's no reason to investigate this issue further, as I don't even have an i7 QM that would be limited by TDP. Wasn't there a 2570p (somewhere on eBay) with a 45W i7? Correct me if I'm wrong. I tried every version of XTU and reinstalled it dozens of times. Before the flash there was at least a greyed-out slider; then it disappeared and I can't get it to appear again. Khenglish said some voodoo is needed to make it work... so it makes me wonder why the reinstallation method worked for Aikimox.
  23. I know it's not a very important issue, but I would ask HP about allowing BCLK overclocking, of course in order to use it with XTU. I realise we have the genius Khenglish and his ME FW files, but... I still haven't figured out how to enable the BCLK slider in XTU. It's a kind of lottery, unfortunately. It could be connected with raising the max TDP allowed by the BIOS. Oh, and maybe they would answer whether BCLK OC affects the ExpressCard PCIe frequency, because if it doesn't, we would only need higher TDP limits to unleash the power of the strong i7 CPUs. This OC is really important for ExpressCard eGPUs, because it could be the only way to increase eGPU bandwidth. (A toy estimate of what a BCLK OC would buy in bandwidth is sketched after the post list below.)
  24. I thought that it was part of the keyboard itself and that the only problem was the power supply/wiring, and that the 2170p keyboard didn't fit because of size and latch incompatibility.
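
A note on quantifying the Mantle on/off comparison from post 6: one way is to split the HWiNFO CSV log at the 7-minute mark and average the framerate on each side. This is a minimal sketch; the column names ("Time", "Framerate [FPS]"), the timestamp format and the file name are assumptions about how the log was exported, so adjust them to the actual file.

```python
import csv
from datetime import datetime

SPLIT_SECONDS = 7 * 60  # first 7 minutes: no Mantle; the rest: Mantle

def segment_averages(path):
    """Return (avg FPS without Mantle, avg FPS with Mantle)."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    # Assumed HWiNFO export columns and time format -- rename to match your log.
    t0 = datetime.strptime(rows[0]["Time"], "%H:%M:%S.%f")
    before, after = [], []
    for row in rows:
        t = datetime.strptime(row["Time"], "%H:%M:%S.%f")
        elapsed = (t - t0).total_seconds()
        fps = float(row["Framerate [FPS]"])
        (before if elapsed < SPLIT_SECONDS else after).append(fps)
    return sum(before) / len(before), sum(after) / len(after)

# Hypothetical file name for the BF4 Siege of Shanghai run from post 6.
no_mantle, mantle = segment_averages("bf4_shanghai_r9270.csv")
print(f"avg FPS without Mantle: {no_mantle:.1f}")
print(f"avg FPS with Mantle:    {mantle:.1f}")
```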
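
For the PCIeSpeedTest results in post 18 (360 MB/s up, 440 MB/s down), here is a back-of-the-envelope ceiling for an x1 PCIe 2.0 link. This is spec-level arithmetic only: a 5 GT/s line rate with 8b/10b encoding.

```python
# Theoretical ceiling of an x1 PCIe 2.0 link, to put the measured
# 360/440 MB/s in context.
line_rate_gtps = 5.0   # PCIe 2.0: 5 GT/s per lane
encoding = 8 / 10      # 8b/10b encoding overhead
lanes = 1              # ExpressCard exposes a single lane

raw_mb_s = line_rate_gtps * 1e9 * encoding / 8 / 1e6 * lanes
print(f"raw ceiling: {raw_mb_s:.0f} MB/s")  # 500 MB/s

for measured in (360, 440):
    print(f"{measured} MB/s -> {measured / raw_mb_s:.0%} of ceiling")

# Packet headers and flow control eat the remaining 10-25%, so
# 360-440 MB/s is about as good as an x1 2.0 link gets.
```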
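
For the 150 W question in post 19, a hedged power-budget check: compare typical vendor board-power figures against the 75 W the PE4L delivers today and against the doubled budget the post proposes. The 150 W limit is the post's premise, not an existing PE4L feature, and the TDP numbers below are the usual vendor specs, worth re-checking for any specific card.

```python
CURRENT_LIMIT_W = 75    # what the PE4L's power input delivers today
PROPOSED_LIMIT_W = 150  # the doubled budget the post asks about

# Approximate vendor board-power (TDP) figures in watts.
cards = {
    "HD 7750": 55,   # runs off slot power alone even today
    "GT 440":  65,
    "HD 7850": 130,  # would fit a 150 W budget, not 75 W
    "GTX 660": 140,
}

for name, tdp in cards.items():
    fits_now = tdp <= CURRENT_LIMIT_W
    fits_150 = tdp <= PROPOSED_LIMIT_W
    print(f"{name}: {tdp} W -> 75 W budget: {fits_now}, 150 W budget: {fits_150}")
```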
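
Finally, on the BCLK question from post 23: if a BCLK overclock really does propagate to the ExpressCard PCIe clock (the open question in that post), link bandwidth should scale roughly linearly with it. A toy estimate under that assumption, seeded with the measured 360-440 MB/s range:

```python
# Linear bandwidth scaling with BCLK, assuming the PCIe clock follows it.
BASE_BCLK_MHZ = 100.0
BASELINE_MB_S = 420.0  # rough midpoint of the measured 360-440 MB/s

def estimated_bandwidth(bclk_mhz):
    return BASELINE_MB_S * (bclk_mhz / BASE_BCLK_MHZ)

for bclk in (100, 103, 105, 107):
    print(f"BCLK {bclk} MHz -> ~{estimated_bandwidth(bclk):.0f} MB/s")

# Even a stable 5-7% BCLK OC buys only a proportional 5-7% more
# bandwidth: useful, but nowhere near closing the gap to x16.
```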