Can I CrossFire a 5770 and a 5830?

Ookami Distinguished.

Is it possible? When I was putting together my new system about six months ago, I decided to go with the 5770 because of its price point, in hopes of later adding another and CrossFiring them together. Now I have the money to buy the second card, and I can probably afford the small difference in upgrading to the 5830, part of the 58xx series which has Eyefinity (an option I wouldn't mind having for my next upgrade), or possibly something higher up the range if my bonus is as much as I'm hoping.

I just want to make sure that you can indeed CrossFire a 57xx-series card with a 58xx-series card before I spend the money. Thanks for any input that can be given.

No; CrossFire needs both cards built on the same GPU, and the 5770 (Juniper) and 5830 (Cypress) are not. However, an HD 5770 CrossFire setup is very potent, especially if you overclock both of them.

JDFan Splendid.

Best option is either to get a second 5770, or sell the one you have and get a faster single card!!

OK, thanks for the info. I was actually afraid of this, but coming from the nVidia side of the house originally, I was hoping CrossFire worked the same way SLI did, in that you could SLI two different series together. Of course you didn't get the same power as two of the same cards, but it was something. I guess ATi didn't enable this type of option for their cards.
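To make the pairing rule in these replies concrete, here is a minimal sketch, assuming a hand-rolled lookup table (the card-to-GPU mapping comes from AMD's published specs; `can_crossfire` is a hypothetical helper for illustration, not anything in the Catalyst driver):

```python
# CrossFire of this era wants both cards built on the same GPU, so the
# 5770 (Juniper) cannot pair with the 5830 (Cypress).
GPU_FAMILY = {
    "HD 5750": "Juniper",
    "HD 5770": "Juniper",
    "HD 5830": "Cypress",
    "HD 5850": "Cypress",
    "HD 5870": "Cypress",
}

def can_crossfire(card_a: str, card_b: str) -> bool:
    """Same-GPU check; real compatibility also depends on board and driver."""
    return GPU_FAMILY[card_a] == GPU_FAMILY[card_b]

print(can_crossfire("HD 5770", "HD 5830"))  # False: the OP's plan won't work
print(can_crossfire("HD 5770", "HD 5750"))  # True: 57xx cards can pair up
```

By this rule, the 5830 only pairs with other Cypress cards (5850/5870), which is why the advice above comes down to either a second 5770 or a straight replacement.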

I'll have to mull this around some and figure out which way I want to go.

As I recall, the 4850 and 4870 were doing so well at their respective price points that NV had to re-release the GTX 260 with more shaders, because not even they could justify charging what they were for that card. Hands down. Mine trucks along quite nicely. Some of those PhysX titles are also upcoming, though. And hey, at least I did not hide the fact that many of those games were still in development.

Yes, there may be fewer titles out now for DirectX 11, on the other hand… They are about equal, broadly speaking. It would be cheaper to produce, and might have performed better too. The 5830 is not good. Good try. I feel like the issue was they tried to cram too many cards into the same general area of performance. Would have made the whole line-up look more complete and more desirable.

That way the 5830 could have been priced lower and the 5850 could have sat above it, filling in the price gap fine. So you really have 1 or 2 games available. You really have to bend the truth a lot (break even) for your response to be even close to valid. Great, we know IGPs are bad for newer games… not the most useful tests.

It would, however, be very useful to know which older cards low-end GPUs or IGPs are equivalent to, for the sake of playing older games. Even just one such article would be useful, because then we could reference off of it in the future, using extrapolation for newer IGPs (a sketch of the idea follows below). Pretty likely, given the clockspeed advantage. A card running those speeds is pretty close to the real thing. This thing is awful. And just think of all the price hikes to come when nVidia releases nothing in this segment or lower for months yet…
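On the equivalence-table idea: a minimal sketch of how such a reference could be used once measured, with made-up placeholder scores standing in for real benchmark results (none of these numbers come from actual tests):

```python
# Hypothetical equivalence table: relative performance scores for older
# cards. The values are invented placeholders, NOT measured results.
OLD_CARD_SCORES = {
    "GeForce 6600 GT": 10.0,
    "Radeon X1950 Pro": 18.0,
    "GeForce 8600 GT": 25.0,
}

def closest_equivalent(igp_score: float) -> str:
    """Map a new IGP's score onto the nearest older card in the table."""
    return min(OLD_CARD_SCORES, key=lambda card: abs(OLD_CARD_SCORES[card] - igp_score))

# If a new IGP benched at 17.0 on the same scale, you would read it as
# roughly an X1950 Pro when checking an older game's requirements.
print(closest_equivalent(17.0))
```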

I might do something like that. Remove MW2 from the test list. If you can afford 3 monitors (decent ones, not crappy ones), then forking out for a 5870 will be no problem at all. Green fans whine but… It is an instant visual representation of what I wanted from the article. Disappointing to say the least. BTW, who buys 3 monitors only to pair them with a 5830?

Less stuff, more speed, say what?

Conclusions: The case against the Radeon HD 5830 was made quite clearly in the value scatter plot on the preceding page.

Nutmeg 12 years ago: Me too.
OneArmedScissor 12 years ago: A higher-clocked part would use a lot of power and require more expensive memory as well. I imagine that will be a viable option for the midrange of the series.
Freon 12 years ago: Nothing would indicate all chips have that many flaws.

Freon 12 years ago: This was only a comment with regards to video cards. I promise to keep out of commercial airspace.
JustAnEngineer 12 years ago: Agreed.
JustAnEngineer 12 years ago: Amen to that. Using a rare overclocked card and not labeling it as such just leads to confusion.
NeelyCam 12 years ago: Address?
NeelyCam 12 years ago: Priceless!

YellaChicken 12 years ago: Waka, waka, waka.
Rakhmaninov3 12 years ago: Still like my GTX.
Rakhmaninov3 12 years ago: Amen, brother.
Freon 12 years ago: My anecdotes nullify your argument!
Freon 12 years ago: I agree more on the lower-end cards, but less on the higher-end ones.
LoneWolf15 12 years ago: Need a band-aid for that? Luckily, so far all the cards appear to have beefy coolers, basically 5870-class coolers, so until they figure out they can save a few bucks on a smaller cooler… It will be interesting to see how much you can overclock a flawed chip.

LoneWolf15 12 years ago: I waited for the 5830 to come out.
BoBzeBuilder 12 years ago: Last generation was the golden age.
MadManOriginal 12 years ago: I see it as an attempt at interspecies communication and understanding.
SubSeven 12 years ago: I bow at your intelligence. Yet in the current tests the 5830 is clearly inferior.
Joerdgs 12 years ago: Kind of missed the GeForce 7 series in the power consumption graphs; wanted to have a final laugh at them.

Joerdgs 12 years ago: Re-read the HD series review. I, for one, would like to see more ROPs. Every post is golden. PhysX takes a bigger performance hit than DX11. Pick one.
MadManOriginal 12 years ago: They are about equal, broadly speaking.
Palek 12 years ago: Yes, and 14 out of 15 of those PhysX-capable cards slow to a crawl when you enable PhysX. You should cry for how biased you are.

BoBzeBuilder 12 years ago: Wow.
StuG 12 years ago: I feel like the issue was they tried to cram too many cards into the same general area of performance.

Same memory, but the v1s wouldn't clock very well; two of the same cards with the same layout work extremely well together.

Bollocks, I got all excited for a minute. I'd love to try CrossFiring mine. I've CrossFired cards before and was a bit disappointed with the results, and have always considered a single GPU to be a better solution.

I have a question that I see sort of answered, to my understanding, in the previous posts, but can someone with experience answer two questions?

I just ordered an LG monitor.

Dragonusdrako New Member.

Okay, I've seen the answer already, but I want to double-check before doing so, as I'm building a computer that I will upgrade over time, but right now I'm on a low budget. The question I have is: will an AMD A10 work with an R7 250X, and later with an R9 270 or 270X?

If not, what CPU can I get, budget-wise, to work with the R7 250X for the time being and then upgrade to the R9 270 or 270X? Thank you so much for answering the question if you can.

The R7 250X might be a bit too powerful to CrossFire with that CPU - most folks would recommend an R7 240 or 250.

Just some guy New Member.

The power consumption of 24 watts according to AMD is not that high. Therefore, this GPU could also be used in smaller laptops. In this setup, alternate frame rendering (AFR) is the most common rendering technique. Because each card renders a different frame, the CrossFire solution may suffer from micro-stuttering at low FPS ranges, as there are irregular delays between sequential frames.

Therefore, the CrossFire HD setup may need higher framerates than a single card for fluent gameplay. If DDR3 is used by the laptop vendor, then the graphics performance will suffer noticeably.
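To illustrate why AFR's irregular frame pacing matters even when the average FPS looks fine, here is a minimal sketch with invented frame times (illustrative numbers, not measurements from this hardware):

```python
# Two runs with the SAME average FPS: one paced evenly (single GPU),
# one alternating short/long gaps, as AFR can when frames queue unevenly.
even_ms = [33, 33, 33, 33, 33, 33]  # ~30 FPS, smooth
afr_ms  = [12, 54, 12, 54, 12, 54]  # ~30 FPS average, but jerky

def avg_fps(frame_times_ms):
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

for name, run in [("even pacing", even_ms), ("AFR-like pacing", afr_ms)]:
    print(f"{name}: {avg_fps(run):.0f} FPS average, worst frame {max(run)} ms")
# Both report ~30 FPS, but the 54 ms hitches feel closer to ~18 FPS,
# which is why a CrossFire setup may need a higher average to feel fluent.
```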

All in all, 2x 1040 million transistors offer a theoretical computation power of up to 2x 1.12 TFLOPS. For comparison, the previous HD 4000 series received no such optimizations. The gaming performance of the CF solution should be sufficient for all current games at high resolutions and high detail settings with some AA enabled. Since Flash 10.1, the GPU can also accelerate Flash video playback. The GPU also supports multiple displays via Eyefinity; of course, this depends on how many monitor outputs the laptop vendor adds to the notebook and how they are wired.

Most laptops will support up to three screens simultaneously (one internal and two external). Due to the high power consumption of the Mobility Radeon HD 5870 CF, it is only suited for large laptops with good, and possibly loud, cooling solutions. According to AMD, the performance-per-watt ratio (same power consumption, better performance) and the idle power consumption have been improved thanks to Memory Clock Scaling and Clock Gating.

Compared to desktop graphics cards, the Mobility Radeon HD 5870 CF should perform somewhere between a Radeon HD 5750 and a 5770 in CrossFire mode, based on the shader count of the 5770 and the clock rate of the 5750. The following benchmarks stem from our tests of review laptops.
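That placement follows from the raw throughput numbers. A quick back-of-the-envelope sketch, assuming the commonly cited specs (Mobility HD 5870: 800 shaders at 700 MHz; desktop HD 5750: 720 shaders at 700 MHz; desktop HD 5770: 800 shaders at 850 MHz; 2 FLOPs per shader per clock for these parts):

```python
# Theoretical single-precision throughput:
# shaders * clock (MHz) * 2 FLOPs per clock, scaled to TFLOPS.
def tflops(shaders: int, mhz: int) -> float:
    return shaders * mhz * 2 / 1e6

print(f"Mobility HD 5870: {tflops(800, 700):.2f} TFLOPS (x2 in CrossFire)")
print(f"Desktop HD 5750:  {tflops(720, 700):.2f} TFLOPS")
print(f"Desktop HD 5770:  {tflops(800, 850):.2f} TFLOPS")
# 1.12 lands between 1.01 and 1.36, hence "somewhere between a 5750 and 5770".
```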
