Leadtek Graphic Card Reviews
Leadtek Winfast A170 DDR T
by The Duke
When I was in the market last year for a new graphics card for my PC, I hunted around the web to see what was on offer. I had roughly £50 to spend, and I didn't know what to get as I hadn't read any of the PC magazines (and therefore hadn't seen any recent hardware reviews). On one of the sites, I saw a selection of GeForce 4 cards ranging from £30 up to over £100. One of these caught my eye: the Leadtek WinFast A170DDRT. It was priced at £43.70 ex. VAT and was duly purchased.
* Note that this was a year ago and these cards will now have fallen in price as newer versions have come onto the market.
The WinFast A170 DDRT is an NVIDIA GeForce 4 based card with 64MB of DDR memory (double data rate, meaning it transfers data twice per clock cycle compared with ordinary memory) and a TV-out socket as well, which explains the 'DDRT' jumble of letters in the card's name.
¤ Very brief history of graphics cards for non-techs
Years ago, when PCs were in their infancy, there were only simple graphics cards (also known as graphics adapters). These could produce a limited number of colours (256 was good in those days!) and were typically good enough for basic gaming (Space Invaders!) and normal word processing.
As the PC took off as a gaming machine, the hardware needed to evolve at the same time. Around the time the original Quake hit the market, dedicated 3D cards (adapters designed purely for producing 3D graphics) started to appear. The early market leader was a company called 3dfx, who produced a range of chips known as 'Voodoo'. Cards with these Voodoo chipsets connected to your ordinary 2D graphics card, and the two worked together when 3D graphics were needed. 3dfx ruled the roost for a few years with a huge market share, continually updating their chipsets (I believe the last 'proper' version was the Voodoo3). This continued until a company called NVIDIA came along with their 'GeForce' chip, which they described as a GPU (graphics processing unit - much like your PC's CPU, only dedicated to graphics). These GeForce cards were different from the old Voodoo cards in that they were combined 2D/3D cards: you only needed one card to perform all graphical operations, rather than the old set-up of two different cards, and this has become the standard for all cards released today.
Recently, NVIDIA have had competition from other chipset manufacturers, who have started to catch up and produce high-quality chips with performance comparable to the GeForce cards on the market. This is good for the consumer, because prices will drop as competition forces better cards onto the market.
¤ What do you get for your money?
Contained inside the box are two manuals - a quick installation guide and a more general guide for the card. These guides are not specific to this card, but cover the whole A170 series. My particular card came with three leads - an S-video AV cable, an ordinary AV cable, and an AV / S-video adaptor - which should cover any connection worries. Then you get the card itself and, finally, the CD with the drivers and support software.
¤ The Card
The card was a standard size for an AGP (accelerated graphics port) graphics card, and was solidly built. The cooling fan for the GPU was metal and added a bit of weight to the card; while this doesn't affect performance, it gave the card a good, solid feel.
¤ The Documentation
The quick installation leaflet contains enough information to fit the card in your PC and install the drivers. It's very well laid out, has plenty of diagrams to help you, and the tasks it describes are presented in a simple, logical sequence, with referrals to the other documentation at the relevant points.
The General Guide is a 60-page, English-only guide to the card and software. Most of it is taken up by a description of WinFox, a piece of software that lets the user tweak their graphics card's settings to get better results. Also included are instructions for installing DirectX (a series of Microsoft modules which standardise how programs use the graphics and sound hardware in your computer), updating the BIOS of the graphics card (only really needed if you are a complete nerd) and a section on VIVO use (which this card isn't capable of).
¤ The Software
WinFox is a good piece of software, although only people who want to squeeze every last bit of performance out of their system will ever use it to its fullest extent. Most ordinary users might use it once in a while to check that their settings are OK. It's easy to understand where to go to do what you want, and it would not be too confusing for less technical people.
WinFastDVD is a program which allows you to watch DVD movies on your PC (provided you have a DVD-ROM drive, of course) and has quite a lot of features. This piece of software isn't covered in the manual, but is simplicity itself to use, especially if you've used DVD software before.
The drivers can be installed automatically, and use NVIDIA's own reference drivers. More than likely, the drivers on the enclosed CD will be out of date, but you can download up-to-date ones direct from NVIDIA's site or from Leadtek's site.
DirectX8.1 is bundled with the card as well, but it has recently been superseded by DirectX9.0. You can download this from Microsoft directly if you ever need to.
¤ Card performance and specifications
Without throwing some really high-end games at a card, it's quite hard to tell these days how it is actually performing. I'm currently running the card at 1280 x 1024 pixels at a colour depth of 32 bits (i.e. lots of colours), and everything whizzes about just fine when I'm surfing, word processing etc., even with the new, fancy graphics effects in Windows XP. The card can handle resolutions from 640 x 480 all the way up to 2048 x 1536 at 32-bit colour depth (if you've got a good enough monitor!). More detailed specifications can be obtained from the Leadtek website (URL given below).
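As a rough illustration (my own back-of-the-envelope sketch, not something from Leadtek's documentation), you can estimate how much of the card's 64MB a single framebuffer eats at the resolutions mentioned above. At 32-bit colour, each pixel takes 4 bytes:

```python
def framebuffer_mb(width, height, bits_per_pixel):
    """Approximate memory for one framebuffer, in megabytes (1 MB = 1024*1024 bytes)."""
    bytes_total = width * height * (bits_per_pixel // 8)
    return bytes_total / (1024 * 1024)

# The resolutions quoted in the review, all at 32-bit colour depth:
for w, h in [(640, 480), (1280, 1024), (2048, 1536)]:
    print(f"{w} x {h}: {framebuffer_mb(w, h, 32):.1f} MB")
# prints:
# 640 x 480: 1.2 MB
# 1280 x 1024: 5.0 MB
# 2048 x 1536: 12.0 MB
```

Even at the maximum 2048 x 1536, a single frame only needs about 12MB, which is why desktop work at high resolutions is no strain at all; it's 3D games, with their double buffering, Z-buffers and textures, that really fill up video memory.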
Games are no different. Recently I've been playing Grand Theft Auto 3 and No One Lives Forever 2 and the standard of graphics in these games is amazing with a graphics card powerful enough to show them off. The card was able to deal with these easily (games performance also depends on things like CPU speed, amount of memory etc.) on my system. I have given brief details of my PC below to use as a reference.
One minor point is my DVD playback performance. Sometimes the DVD seems to skip a bit (i.e. one frame is held on screen for a second or so, and then the film continues as normal), but I haven't pinned this down to the graphics card yet, and I suspect it might actually be a DVD-ROM problem.
So, it's about 40% of the price of a high-end GeForce 4 card with all the bells and whistles, but I really don't play enough brand-new games to justify that amount of cash. However, I regularly have two or three different programs running at once, and a high-resolution screen makes things much easier and clearer. I'm quite pleased with this card, especially at the price, and would have no hesitation in recommending it to anyone looking to upgrade their own graphics adapter. It's worth noting, though, that if you do play a lot of the newer games on the market then this card probably isn't for you. The MX 440 versions of the GeForce 4 are supposedly no more technologically advanced than older GeForce cards, just with their speed pumped up to give better performance. If you're trying to play the likes of Deus Ex: Invisible War, you'd be better off saving a few extra pounds and going for something more recent with a bit more kick to it.
One final point, in case there is any confusion. The TV out socket is so you can use your TV as a monitor (for example, playing games). This is useful if you currently have a small monitor. The TV socket does not enable you to watch TV on your PC.
¤ System requirements:
AGP compatible motherboard (most are these days, but check with your motherboard manual just to be sure)
Any Windows operating system from Windows98 onward (98, ME, NT4.0 [with SP6], 2000 or XP)
CDROM for installing software (although you could just download it from the internet)
¤ Using my PC as a comparison (note only relevant components are listed).
Operating System: Windows XP Service Pack 1
Memory: 704 MB
CPU: AMD Duron 1.4 GHz
Motherboard: ABit KT7E
Monitor: 19" CTX
Leadtek's website is at:
NVIDIA can be found at:
Microsoft can be found at:
Leadtek Winfast GeForce3 TD
My PC sits way above the "minimum" or even "recommended" specification for all of the games on the market at the moment. But as any "hardcore" games player will tell you, it's all about constant upgrading. In the past year, my motherboard, processor, memory, hard drive and case have all fallen prey to the upgrade bug, and I have been through three graphics cards in the past year alone. Luckily, upgrading decent kit to even higher-spec kit isn't too expensive, as you can get good cash or exchange prices for many of these parts at computer exchange stores. I also found several friends willing to pay cash for second-hand parts, all of which still had 8 months of guarantee left.
After all this upgrading, I'm left with an Athlon 1.2GHz running on the superb Abit KG-7 Lite board with 512MB of Crucial PC-2100 DDR and a Geforce 2 Ultra graphics card. A week ago, I thought that would be enough for a long time.
However, whilst playing the excellent single-player demo of "Medal of Honor: Allied Assault", I was unhappy with the lack of anti-aliasing, which was making all the trees look really "bitty" and "pixelated". Then I got a new copy of PC Gamer, and there were several articles about new PC games like "Unreal Tournament 2", "Unreal 2" and so on, all of which have features written for Geforce 3 graphics cards.
I had been looking at this expensive upgrade in two ways:
1) upgrade to Geforce 3
2) Keep the 2 Ultra and buy X-box console
Both upgrades cost about the same, although with X-Box games at £40-50 and extra controllers at £30, the console would end up costing more. But, seeing all these cool games coming out for the PC, I was happy to upgrade my graphics card and wait until Christmas or maybe the New Year for an X-Box. By then it could be cheaper, and will have more games than the 10 or so launch titles that reach the UK for the 14 March launch.
Something I have found about the upgrade path is that you can't leave it too long, otherwise your parts don't attract a good trade-in value. As an example, I got my Geforce 2 Ultra for £220 at Computer Exchange in London. I traded in my Geforce 2 GTS for about £120, so £100 wasn't bad to pay for the Ultra. When I came to trade in my Ultra yesterday, they were selling at £160, so I got £108 in cash. Things move so quickly that prices always tumble when new, faster cards come onto the market.
Overall, I had been pretty happy with my Geforce 2 Ultra; it was a fast card that could hold its own against most cards. One problem was that switching on anti-aliasing would severely impact performance, and even then the maximum AA rate was 3-sample, rather than the 4x modes of the Geforce 3 (Quincunx) or Radeon 8500. Also, both those cards are designed to use AA without a massive impact on frame rates.
There is a lot of confusion in the market about Geforce 3 cards. The Geforce 3 came out, and was followed two months later by a range of cards labelled "Titanium". These new cards were available in three flavours:
Geforce 2 Titanium
Geforce 3 Titanium-200
Geforce 3 Titanium-500
The 2 Ti replaces the Geforce 2 Ultra, although it is a little slower than the Ultra due to lower clock speeds.
The 3 Ti-200 replaces the Geforce 3, although the Ti-200 is actually slower than the original GF3 due to its 175MHz core speed, in contrast to the Geforce 3's 200MHz.
The 3 Ti-500 is the fastest of all these cards, with 240MHz core / 500MHz memory speeds, and was the one I wanted.
I took a trip down Tottenham Court Road in London and visited a dozen computer stores to get a price. There were plenty of Ti-200 cards kicking about, but few Ti-500s. I went to my favourite store, and they told me they had a Ti-500 for £290 cash. That seemed a reasonable price, so I parted with the money and went home.
I was puzzled by the lack of any reference on the box to the Ti-500 - just "Geforce 3 TDH" from Leadtek. My Geforce 2 Ultra had been a Leadtek card, and had been great (nice build quality, big fan and heatsinks), so I wasn't too worried.
Installation was easy, and I loaded Leadtek's software suite which includes the video drivers and Winfox card utilities.
I tested the new card using 3D Mark 2001, a great benchmarking program that now boasts specific Geforce 3 features like pixel and vertex shading tests.
The results are below:
3D Mark 2001 scores:

1024 x 768, 32-bit colour, 24-bit Z-buffer, no AA:
Geforce 2 Ultra - 4509
Geforce 3 - 5574
Geforce 3 O'c - 6006

1280 x 1024, 32-bit colour, 24-bit Z-buffer, AA enabled:
Geforce 2 Ultra - 1973 (2-sample AA)
Geforce 3 - 2436 (3-sample AA)
Geforce 3 O'c - 2781
You can see quite a big difference in the figures: the 3 is 1065 3D Marks faster than the 2 Ultra, and the gap becomes even more noticeable when you implement anti-aliasing or jump to higher resolutions.
The 2 Ultra would not run at 1280x1024 with 3-sample AA, only 2-sample. The 3 was happy running either 3 or 4 (Quincunx) samples at this resolution.
The O'c figures for the Geforce 3 relate to overclocking, as follows:
Geforce 3 core clock (stock) - 200MHz
Geforce 3 memory clock (stock) - 458MHz
GF3 core overclocked - 240MHz
GF3 memory overclocked - 509MHz
I could have overclocked it higher, but I didn't want to fry my new card - not yet, anyhow!
Overclocking was easy with a special utility within Winfox; the core clock will go up to 350MHz and the memory clock to 600MHz, as long as your system remains stable.
After checking out my new card on the internet, I found out I had been sold a Geforce 3, and not a Geforce 3 Ti-500. I rang the store, spoke to the boss and he's arranged an ASUS Ti-500 for this afternoon.
The Ti-500 differs from the standard GF3 in that the core clock speed is 20% higher, it has faster memory, higher bandwidth (8GB/sec instead of 7.2GB/sec), lower power consumption and a new 8-layer printed circuit board. You can make up the difference in clock speed between the GF3 and the GF3 Ti-500 by slight overclocking, but this places more heat stress on the card. The Ti-500 was designed to run faster, and so features a redesigned PCB, heatsink and fan.
Although the GF3 is a great card, I paid for a GF3 Ti-500 and that is what I want! I have just been told that the Ti-500 is no longer available; new GF4 cards are out in the next few weeks, but these will cost considerably more (128MB of video RAM!!) and won't be supported by games for at least another year, so I'll stick with the GF3 for now.
Overall, a useful jump in speed, and the ability to implement full screen anti-aliasing without impacting performance too much.
There is little software on the market that exploits the GF3's new features. One game is Dronez, which is bundled with the card; you also get WinDVD and Rebellion's "Gunlok" game. However, many new games out this year will support the new DirectX 8.1 feature set, something the 2 Ultra did not support. Many of these features relate to high-level detail, such as being able to accurately simulate fur, feathers, water, etc.
Note: the card has TV-out and DVI output in addition to the normal D-pin monitor connector. The TV-out quality is OK, not as good as the Radeon's but acceptable.
You get an older Nvidia driver with the bundle. I tested it using 3D Mark, and it gave the figures seen above. I then tried the new Nvidia Detonator XP driver for Win 9x, and 3D Mark showed a decrease in performance, which is quite common with beta drivers. I reinstalled the bundled driver and performance increased. If you use Nvidia's own XP driver, you'll need to install a Geforce control program, otherwise you have no control over the settings. You can't use "Geforce Tweak", as this doesn't work with GF3 cards.
It's a lot of money to pay for a small increase in performance, but it's nice to have the best, and it gives a good degree of future-proofing for maybe two years of games to come. When the new games arrive this year, the GF3 cards will finally come of age and that money will be well spent.
If you are looking for a new video card, it may be worth waiting a few weeks to see what happens with the GF4.
Leadtek DDR Winfast Geforce2 Ultra
Much has been written about the Geforce family of graphics cards, and rightly so, as Nvidia revolutionised the idea of a GPU (graphics processing unit) with the original Geforce graphics cards. Before this, 3D cards were added to existing 2D cards, which didn't make the best use of bandwidth or system resources. Following the success of the original Geforce card, Nvidia have continually introduced, updated and refined their graphics chipset. This is now available in many flavours, and the chipset is sold to many card manufacturers across the world.
The most common card is the cheap and cheerful Geforce 2MX, which uses a cut down and bandwidth restricted version of the Geforce 2 GTS (Gigatexel shader) chipset found in the more expensive cards. The 2MX is fine for occasional gamers, or people using smaller (15" and 17") monitors.
But if you want to run your games at high resolutions (i.e. 1280 and 1600) in 32 bit colour with effects turned on (and who wouldn't want all those cool effects that game creators now fill the new games with?), or you are running a larger monitor like a 19" or 21", then you'll need a faster graphics card.
If you're really looking for a decent budget card, get one based around the Kyro chipset - this is much more capable than the Geforce 2MX, which seems OK but is deliberately crippled by a half-width memory bus, as opposed to the 128-bit bus used in the 2 GTS cards.
However, for power users check out the rest of the Geforce 2 family - this includes the GTS 32mb, the GTS 64mb, the "Pro" and the "Ultra".
And finally there is the Geforce 3, a new chipset offering a brand-new feature package and even faster memory. But as many games do not, and will not, support the Geforce 3's new features, is it worth getting one yet? Price must be the decision-maker: you can get Geforce 3 cards for around £300, but most cost £350-£375.
The release of the Geforce 3 is great news for value-conscious buyers, as it seems to have driven down the price of the Geforce 2 cards. As an example, the Geforce 2 Ultra originally cost over £350 when released; I picked up mine for £220. I've already owned a Geforce 2MX and a Geforce 2 GTS 32MB, so I'm well placed to discuss the differences between these three cards. And why would I change cards three times since March 2001?
I found the Geforce 2MX very limited, especially when I bought a Diamondtron 19" monitor. Larger monitors eat video RAM, and the 2MX couldn't cope. Frame rates became a real issue, and I didn't think the card was doing the rest of my system justice (Athlon 1GHz, RAID motherboard, 256MB, etc.).
So I traded the 2MX for a Geforce 2 GTS with 32MB of DDR memory. This was fine until I started using 1600x1200 or 1280 resolutions and the dreaded frame rate started to shudder. It wasn't just a case of bandwidth, but a lack of video RAM: 64MB of video RAM seems essential to make the most of modern games.
I checked out the Geforce 2 Pro 64MB and the Geforce 2 GTS 64MB, but these didn't have the extra bandwidth or the faster memory of the Ultra card. The Ultra has a memory clock speed of 230MHz, and being DDR this gives a whopping 460MHz effective speed. This can be overclocked to 490MHz without problems.
So I went for the Ultra, and got a great card from Leadtek, a Taiwanese manufacturer with a good reputation for quality and price. Included in the bundle was a software DVD player, some colour-balancing software and other bits and pieces.
The card is based around the same GTS chipset that the Geforce 2 GTS and Pro cards use, but with higher core and memory clock speeds, giving a more useful 7.2GB/sec of video memory bandwidth. This compares to 3-5GB/sec for the GTS and Pro cards. The Ultra uses the fastest memory that was available at the time of release: 4ns double data rate. This has now been superseded by the 3.8ns and 3.5ns memory fitted to the Geforce 3 card, but 4ns is plenty fast for my purposes.
The Geforce chip has always been more capable than the memory fitted to these graphics cards, with bandwidth restricting the Geforce chip from flexing its muscles. The faster memory and higher core and memory clock speeds of the Ultra card let the Geforce chip work more efficiently, without being bottlenecked by slow video RAM or limited bus bandwidth (as with the Geforce 2MX).
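Where the bandwidth figures quoted above come from is simple arithmetic: bytes per transfer (bus width divided by 8) multiplied by transfers per second, with DDR doubling the transfer rate for a given clock. A quick sketch (my own illustration, assuming the Ultra's 128-bit bus; note the review's 7.2GB/sec figure is slightly lower than this raw calculation, presumably down to rounding or how a gigabyte is defined):

```python
def memory_bandwidth_gb(bus_width_bits, clock_mhz, ddr=True):
    """Peak video memory bandwidth in GB/sec (1 GB = 10**9 bytes here).

    DDR memory transfers data on both edges of the clock, doubling the
    effective transfer rate compared with single data rate memory.
    """
    transfers_per_sec = clock_mhz * 1_000_000 * (2 if ddr else 1)
    return (bus_width_bits / 8) * transfers_per_sec / 1_000_000_000

# Geforce 2 Ultra: 128-bit bus, 230MHz DDR memory (460MHz effective)
print(memory_bandwidth_gb(128, 230))  # prints 7.36
```

The same sum shows why a half-width bus (as on some Geforce 2MX cards) is so crippling: halving the bus width halves the bandwidth regardless of how fast the memory is clocked.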
What this translates to for the end user is the ability to use maximum resolution settings and turn all of a game's features on (i.e. fog, particles, shadows, tri-linear filtering) without the game degenerating into a 10-frames-per-second mess. To give an example, my Ultra will hit 120 frames per second in Quake III at 1024x768 in 32-bit colour.
The Leadtek card is well built, and features a dedicated cooling solution in the form of a huge heatsink / alloy slab and cooling fan. This is vastly superior to the "orb" style cooling fan fitted to most Ultra cards, and should give your system more stability, especially for overclockers. The card is physically quite hefty, so be careful when fitting it (i.e. don't let it drop!).
Other cool features include TV-out in the form of an S-Video socket, and an unusual but very useful DVI (digital video) socket. The card will drive both the TV-out and the DVI output at 800x600, plenty for working on video capture and editing. The TV-out image quality is good and stable, and colours are accurately presented. It was weird playing Unreal Tournament on my 28" widescreen television!
Overall, a very impressive graphics card at an affordable price. It's definitely faster than ATI's Radeon 64MB and noticeably faster than the Geforce 2 GTS and Pro cards. You're realistically looking at 10-20% faster frame rates up to 1280, and at 1600 resolution the Ultra is around 35% faster. You'll only really notice the improvements at higher resolutions, so if you plan to play games at 1024x768 then check out the cheaper Geforce 2MX and GTS cards; these will do for your needs.
Video-intensive games like Max Payne or Giants: Citizen Kabuto work very fluidly with the Ultra. Giants makes best use of the Geforce chip's hardware T&L (transform and lighting) mode, something that many games now support (check the box).
But if, like me, you like things to work properly and want to push your system with higher resolutions and 32-bit colour without frame rate problems, then check out the Ultra. Leadtek's card is now around £200-£250; I've seen Creative's Ultra priced at £225, so prices have definitely fallen.
I spoke to an Nvidia spokesman at the recent ECTS show in London, and asked him what kind of performance difference there would be between an Ultra and a Geforce 3.
He said that with older and current games you wouldn't notice any real difference, although the 3 would be more efficient. With new games like Aquanox, which uses features from the new DirectX 8.1 set, there would be operations that the Ultra doesn't do (as its feature set doesn't support them). Within 6 months, he recommended getting the Geforce 3. Nvidia is planning to release its latest drivers, Detonator 4, before the end of this month (September 01). These will give a 10-15% speed boost, and can be applied across the entire Nvidia range thanks to their unified driver architecture (smart guys!). And the drivers are free; that's customer support!!
If you're not planning to upgrade for 6 months then wait for the Geforce 3 prices to fall. I'll probably end up with a Geforce 3 within the year, but for now the Ultra is plenty fast and made better sense in terms of performance / price.
*A final point about higher resolutions*: if you're running games at 1280x900 or 1600x1200, you should turn anti-aliasing OFF. You don't need it at these resolutions; it just eats bandwidth and reduces your card's performance. If you're running under 1024 then it's a good idea, but really it's only essential at the low resolutions that games consoles use (i.e. television resolution, around 640x480). To turn anti-aliasing off, use the software supplied with your card, or, for best performance, get hold of the Detonator 3 / 4 drivers and use something like the "Geforce Tweak Utility", available as shareware - a great utility that lets you control all the features of your Geforce card.