I have used this graphics card for a long time, and what I can say from the very start is that it never failed to amaze me with its image quality, both in 3D games and while viewing movies through its TV-out. This is a great trait we've come to expect from all ATI graphics cards.
To be more specific, these are the main advantages of this video card:
1. It offers brilliant image quality in games, provided the games themselves have nice graphics to display. Like most ATI video cards, the Radeon VE has built-in texture filters, antialiasing features, DirectX 8.0 support, environment mapping, and many others.
2. The possibility to extend the desktop across two monitors is amazing and gives you a great deal more workspace. When I was using this card, most of the time I would leave Winamp on one monitor and the application I was working with on the main one.
3. It is equipped (mine was, at least) with both types of TV-out connectors: S-Video and composite. It is also possible to connect two TVs at the same time, although I can't imagine why you would want that. All the usual TV standards are covered: NTSC, the two main types of PAL, and the two main types of SECAM. It is pointless to comment on the image quality on the TV, since it's a known fact that ATI video cards have excellent TV-out features.
4. Another plus is the nice, user-friendly controls for the card's various features, available in the "Display Properties" menu. These not only look nice but are also comprehensive, offering you control over all of the Radeon's important functions.
5. ATI graphics cards generally do not get hot, and the Radeon VE stays particularly cool at all times, so it is shipped without an active cooling system. That's a plus, since most modern graphics cards have noisy coolers, which add to the already existing noise of the PC. Also, if you decide to replace its small heatsink with an active cooler, you get a "ready-for-overclocking" video card. You'll want to do this if you're a gamer, as I'll explain later on.
Cheap video cards always have some drawbacks, and are never perfectly suited for all applications, or are equally (but poorly) suited for all of them. This card from ATI is no exception; it too has some drawbacks, which I will enumerate below (only the ones I know of for sure, of course):
1. The Radeon VE is well known among IT professionals as one of the few 3D video cards that actually does NOT have a hardware geometry processing unit... That's a little strange and also sad, since it means the geometry calculations, as well as rendering and the various texture filtering procedures, are all done by the same graphics engine, which simply can't handle all of these tasks. Thus, don't expect fantastic performance in games unless they are not the latest releases. So it's frustrating that the image quality is rather wasted as far as gaming is concerned. Being an ATI graphics card, all games including the latest will launch, but some won't be playable, and that too is because there's no geometry processor. Even if you can stand playing a game that runs only slightly slow, you may not be able to stand the areas with missing textures, which appear because the graphics processor is simply overloaded with tasks and cannot render everything if the scene's textures are too varied. Here's a short list of games you should avoid if using this graphics card:
- Homeworld 2 - appears acceptable at first, but some ships appear without textures; when there's a battle you lose because of the slowdown
- Area 51 - runs too slow, no textures at all
- The Punisher - runs too slow
- Far Cry - runs slow and has lots of missing textures
- Need for Speed - anything later than the 4th is not playable
- Unreal Tournament - none of the versions are playable
There may be others, but I haven't tried many games.
2. There's also a slight slowdown in 2D performance compared to single-display cards, but it's not too noticeable. Strangely, this slowdown occurs even if all displays but the primary one are disabled.
I bought an OEM ATI Radeon VE (StLabs 64Mb 4xAGP TV-Out), mainly to be able to connect the computer up to the TV and so watch DVDs and films on the TV while running them from the computer, fed up of having to watch them in the spare room on the PC screen. I've had the card for about 6 months now and I've found it very good running under both Windows XP and 98SE. The basic spec of the card is 64Mb SDRAM memory, and it has outputs for VGA (for the monitor) and S-Video or RGB TV-out. Computer-wise it runs very well and can handle Quake 3 smoothly, although I do not generally use it for graphics or games, more for the TV-out functions. It has had no problems and does not generate any significant noise or heat. It is fine for general word processing, internet usage, photo and spreadsheet work.

Looking on the web for info, the ATi cards generally got favourable opinions on their TV-out features and quality. There are GeForce TV-out cards too, but I saw so many different OEM GeForce graphics cards, some with TV-out and others not, and others without dual display, that it was too hard to tell which ones would be suitable. Therefore I decided to go for ATi.

The drivers for the card installed without any problems - basically you install the drivers and another utility called ATi Hydravision, which controls all the display settings and switches the output between TV, monitor or both. Importantly, the ATi VE allows you to output simultaneously to a monitor and a TV at once - this is vital when setting up if your TV and computer are in separate rooms, as otherwise you will spend time running between the rooms trying to select options, menu items and buttons to set up the display. ATi cards are basically numbered in versions 7000, 7500 or VE (same thing), 8000, 8500 and so on - the cards from VE onwards have Hydravision on them, so they are capable of dual display.

Basically the TV display is controlled by accessing the refresh rate and colour settings tab on your computer (Display Properties, Settings); the Advanced button then takes you to a raft of options for the ATI VE (OpenGL settings, Direct3D settings, refresh rate, flip the display, colour profiles, font sizes etc.) - one of which is switching the output between TV and monitor (TIP: make the TV the primary display here if watching on the TV, or sometimes films will not show on the TV). If the connections are in place, you have simple graphical options here for monitor on/off and TV on/off, toggling the primary display, or showing the same view as on the other device. You can have both TV and monitor on at once.

The TV output is very good - the display and films are certainly at TV quality. I found my TV could not handle SVHS input and only showed the picture in black and white, so I had to use RGB instead. However, as with most TV-out graphics cards, the text quality on a TV screen is terrible and certainly would not be suitable for word processing or other such activities - basically the TV does not have the sharpness, resolution or refresh rate of a monitor, so it has no chance of competing here. It can even be difficult to read text on the TV screen.

To connect the TV and PC there are many useful links on the web - do a search for "PC to TV". YOU WILL NEED:
- a TV with a spare SCART socket
- some cables for the sound and video connections from the PC to the TV
- a SCART convertor - basically a SCART connector with 3 plugs on the back: two phono sound inputs and one RGB (phono) input; some also have an SVHS input plug and an input/output switch - set this to input.

FOR THE SOUND:
- twin phono cable (same as the interconnects on your hi-fi)
- speaker doubler adaptor - this basically plugs into the back of your sound card or PC speaker output and has two raised sockets on the back for plugging two phono cables in (left and right); plug the other 2 ends into the SCART convertor. Don't buy the one with 3.5mm plugs on the back that look like headphone sockets - you need the raised plugs on the back for phono leads (look at the back of your hi-fi amplifier to see what they look like).

FOR THE PICTURE:
(A) RGB - single phono lead; plug from the TV-out on the graphics card into the back of the SCART convertor on the TV, or
(B) SVHS - SVHS lead and SVHS-to-RCA (i.e. phono) convertor. These are basically 4-pin plugs similar to the mouse and keyboard PS2 plugs; the convertor plugs into the SVHS lead and then into the phono lead on the SCART convertor. If your SCART convertor already has an SVHS input (4-pin circular plug) you do not need the SVHS-RCA convertor.

Generally some TVs are capable of displaying SVHS pictures and others are not - as the SVHS signal is carried as black and white with the colour as extra information on top, some TVs are not capable of picking up the colour information, and so you will only get a black and white picture. There is generally no way of knowing this until you try - if the picture is monochrome, you will then need to use RGB and the phono lead for the picture instead. The SVHS picture is better, though, if you can get it. If your TV does not have a spare SCART socket, get a SCART switcher (about £10 from Argos) to add more SCART bays and save having to pull leads in and out to swap them around.

Generally this equipment costs about £25 at a computer fair; PC World now sell it as a complete kit at a premium. This is what it generally costs at a fair:
- SCART convertor: c. £10
- 10m twin phono (RCA) lead, for the sound: £5
- 10m phono (RCA) lead, for RGB video: £3
- 10m SVHS lead and SVHS-RCA convertor, for SVHS video: £15
- 3.5mm (headphone jack) male to 2-phono doubler adaptor, for the sound: £1

If your TV is further away or nearer you will obviously need a different length of lead; usually 5m or 15m leads are available instead of 10m. SVHS leads are expensive, and longer ones cost a lot. If you are using RGB only, it may be cheaper to get a 3-lead RCA/phono set. The Radeon VE is about £50-£80 depending on which graphics card manufacturer you are buying from and where you buy it - as far as I know, all of them use ATi's drivers though. You will also need a mouse extension lead to be able to control the PC from in front of your TV; you can get these up to 5m, but beyond this the mouse signal is much less effective, so it is not recommended to go much further.

Hopefully this has not got too techie - it really is simple to use once set up. Basically, switch off both TV and PC and connect everything up, then boot the computer, turn on the TV and use the Hydravision settings to switch the output to the TV. When finished, switch the output back to the PC only, and you can unplug the leads, switch off the TV and carry on working with the PC. I would certainly recommend it - after a few uses it becomes second nature to set up. It is great to be able to watch videos and DivX trailers etc. on the TV, rather than hunched in front of the monitor in the spare room or bedroom whilst a DVD drive whirs away noisily. Add to this the fact that hardly any DVD players can play DivX or other video formats, and it is even better. Overall an excellent card for both graphics and TV display.
With a street price under £70, the ATI Radeon VE graphics card is the least expensive dual monitor graphics card you'll find! It has good performance and functional utilities, which add up to an excellent solution for most mainstream business users and consumers, though fast-twitch gamers and digital artists should definitely look elsewhere. The board itself has one 15-pin VGA connector, a 24-pin DVI-I digital flat panel connector, and an S-Video output port, with the supplied DVI-I-to-VGA adapter enabling dual VGA monitor support. Any two connectors can operate simultaneously. I ran a benchmark and it scored 93 fps, illustrating the 3D power you'll give up in gaining dual monitor support. Application support is extensive. Once you configure an application on either monitor or both, ATI can save these settings and apply them the next time you run the program. ATI also provides configurable hot keys for display functions and the ability to control where dialog boxes appear, which is essential for everyday dual monitor use. Overall, the Radeon VE is a great dual monitor card for a range of applications. Just don't expect it to provide the same performance as a $500 gaming card or a $2,000 dual monitor workstation graphics solution.
It's always amazing to see how fast the computer market can change. For example, over the past few years we've seen ATI, the Canadian graphics card maker, slip from being the world's largest supplier of consumer-level graphics accelerators to a close second behind NVIDIA. Last year, when it was still considered the largest of all graphics chip makers, ATI released their Radeon DDR video card in an effort to slow NVIDIA, with its GeForce2 GTS, from gaining the right to be the biggest of them all. With performance almost matching, and sometimes even beating, the NVIDIA chipset, the ATI Radeon DDR was a top performer; however, it wasn't met with much success in its goal of slowing down NVIDIA, probably because ATI had never built a reputation for top 3D performance, while NVIDIA, with their TNT2 and GeForce256 chipsets, had just the opposite and was known throughout the enthusiast and gamer community for top-performing products.

Now, NVIDIA has covered all aspects of the consumer graphics market with their GeForce2 line of products; ATI has only managed to produce direct competition to the GeForce2 GTS and MX lines with their Radeon DDR and SDR chips, respectively. Each version of the Radeon delivers performance similar to that of its NVIDIA equivalent: the Radeon DDR performs similarly to the GeForce2 GTS, and the Radeon SDR similarly to the GeForce2 MX. There is, however, one feature in NVIDIA's GeForce2 MX line of mainstream-level chipsets that ATI's Radeon SDR lacks: native dual-display support. With the Radeon VE, it is clear that ATI's intention is to try to steal consumer support for dual display on a single chip from NVIDIA. Today, we look at the ATI Radeon VE and compare it directly to NVIDIA's GeForce2 MX line of products in terms of performance, features and quality. My money is on NVIDIA; I still think it is the best video card maker on the market at the moment.