VGA dead: not many end users out of pocket
New lamps for old monitors arrive at 15 per cent premium
Intel Developer Forum The bells were tolling for the VGA standard today as Intel and a large claque of manufacturers said the Digital Visual Interface (DVI) will appear real soon now. Nor will systems incorporating the interface cost much more to buy once it has taken off, major players averred.

At the Intel Developer Forum in Palm Springs, representatives from both Viewsonic and IBM rolled out their plans for displays using the interface, which is part of an initiative sponsored by the chip giant. At Intel's showcase, there are dozens of monitors using DVI, which turns all analogue functions over to the monitor itself.

Marc McConnaughey, VP of technology and sourcing at display company Viewsonic, admitted that DVI monitors will cost between 10 and 15 per cent more than those using the current analogue standard. But parity will be reached after a year or so, he claimed, once the whole industry adopts DVI.

And Ed Anwyl, brand marketing manager of Flat Panel Monitors at IBM, said that Big Blue has already incorporated DVI into one of its Aptiva models. It plans to do the same with its desktops, while supplying "dongles" for large companies which still have legacy equipment in house. According to McConnaughey, DVI-based CRT (cathode ray tube) systems will deliver crisper, sharper images to end users.

There is another twist to this story, and Hollywood is involved. According to Steve Spina, Intel's strategic initiative manager, the large film studios, worried about their films being shown on digital displays, had given VGA a waiver. But the parties in the Digital Display Working Group, which are legion, have now agreed on technology which will protect Hollywood content. It is, therefore, a win-win situation for manufacturers and Hollywood. ®