Vista Banter - A Windows Vista forum


Hardware and Windows Vista: hardware issues in relation to Windows Vista (microsoft.public.windows.vista.hardware_devices)

VGA to DVI?



 
 
#1 - Charles W Davis - October 24th 09, 09:45 PM

I have a nearly new XP machine. I also have a new Win7 machine, and I
continually work on other computers for our computer club.

I have bought a 4-port KVM switch with DVI in and out. My XP machine
(although it has a DVI connection on the back of the video card) won't
connect through it. I have now added a DVI-to-VGA connector at the XP
machine, and it still won't recognize the monitor. The Win7 machine has
no problem with the switch.

Thoughts? Thanks

#2 - Curious - October 24th 09, 10:13 PM

Unless the specs or instructions for the graphics card in your XP machine
specifically state that it can output either DVI using a DVI cable or VGA
using a DVI-to-VGA connector (dongle), the odds are that it does not.

"Charles W Davis" wrote in message
...
I have a nearly new XP machine. I also have a new Win7 machine. I
continually work on other computers for our computer club.

I have bought a 4 port KVM switch with DVI in and out. My XP machine
(although it has a DVI connection on the back of the video card) won't
connect. I have now added a DVI to VGA connector at the XP machine. Still
won't recognize the monitor. The Win7 machine has no problem with it.

Thoughts? Thanks


#3 - Charles W Davis - October 25th 09, 12:07 AM

It's a GeForce 8500 GT. I see nothing on the NVIDIA site that explains...
"Curious" wrote in message
...
Unless the specs or instructions for the graphics card in your XP machine
specifically state that can output either DVI using a DVI cable or VGA
using a DVI to VGA connector(dongle) the odds are that it does not.

"Charles W Davis" wrote in message
...
I have a nearly new XP machine. I also have a new Win7 machine. I
continually work on other computers for our computer club.

I have bought a 4 port KVM switch with DVI in and out. My XP machine
(although it has a DVI connection on the back of the video card) won't
connect. I have now added a DVI to VGA connector at the XP machine. Still
won't recognize the monitor. The Win7 machine has no problem with it.

Thoughts? Thanks



#4 - Curious - October 25th 09, 12:33 AM

I have an 8500GT card, and there is no dongle for it to connect to VGA
using a DVI-VGA adapter. My card came with a 7-pin round mini-DIN
connection and a dongle for it that supports a component or S-Video
connection. My card also has a standard VGA connection. Each manufacturer
of 8500GT cards can use different output connections and connection
options, so the specific make of your 8500GT card matters in knowing its
abilities.
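
Pinning down the board maker is easiest from the machine itself. A minimal
sketch of one way to do it, assuming Python 3.7+ and the stock wmic tool
that ships with XP and later (you could just as easily type the wmic
command at a command prompt):

    import subprocess

    # Query WMI for every video controller; AdapterCompatibility is the
    # chip vendor and Name is the marketing name of the card.
    raw = subprocess.run(
        ["wmic", "path", "Win32_VideoController",
         "get", "AdapterCompatibility,DriverVersion,Name", "/format:list"],
        capture_output=True, check=True,
    ).stdout

    # wmic usually emits UTF-16 when its output is piped; fall back to
    # the Windows code page if it did not.
    text = (raw.decode("utf-16", errors="ignore")
            if raw.startswith(b"\xff\xfe")
            else raw.decode("mbcs", errors="ignore"))

    for line in text.splitlines():
        if "=" in line:
            print(line.strip())  # e.g. Name=NVIDIA GeForce 8500 GT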

"Charles W Davis" wrote in message
...
GeForce 8500 GT, I see nothing on the NVIDIA site that explains...
"Curious" wrote in message
...
Unless the specs or instructions for the graphics card in your XP machine
specifically state that can output either DVI using a DVI cable or VGA
using a DVI to VGA connector(dongle) the odds are that it does not.

"Charles W Davis" wrote in message
...
I have a nearly new XP machine. I also have a new Win7 machine. I
continually work on other computers for our computer club.

I have bought a 4 port KVM switch with DVI in and out. My XP machine
(although it has a DVI connection on the back of the video card) won't
connect. I have now added a DVI to VGA connector at the XP machine.
Still won't recognize the monitor. The Win7 machine has no problem with
it.

Thoughts? Thanks



#5 - Charles W Davis - October 25th 09, 07:27 PM

Curious, just an update. I have been to the IOGEAR web site and found one
bit of information: the connector should be a DVI-I to VGA. I have chased
all over town, and the only connector that mentions DVI-I is a Radio Shack
item labeled "DVI-A Female to HDD/VGA Male Adapter." It didn't work. On the
back is a note that reads: "Note: This adapter works with DVI-A (analog)
and DVI-I (digital and analog integrated) cables. It does not work with
DVI-D (digital) cables."
"Curious" wrote in message
...
I have a 8500GT card and there is no dongle for it to connect to VGA using
a DVI-VGA adapter. My card came with a 7Pin round mini pin connection and
a dongle for it that supports a component or S-video connection. My card
also has a standard VGA connection. Each manufacturer of 8500GT cards can
have different output connections and connection options. So the specific
make of your 8500GT card is important to know its abilities.

"Charles W Davis" wrote in message
...
GeForce 8500 GT, I see nothing on the NVIDIA site that explains...
"Curious" wrote in message
...
Unless the specs or instructions for the graphics card in your XP
machine specifically state that can output either DVI using a DVI cable
or VGA using a DVI to VGA connector(dongle) the odds are that it does
not.

"Charles W Davis" wrote in message
...
I have a nearly new XP machine. I also have a new Win7 machine. I
continually work on other computers for our computer club.

I have bought a 4 port KVM switch with DVI in and out. My XP machine
(although it has a DVI connection on the back of the video card) won't
connect. I have now added a DVI to VGA connector at the XP machine.
Still won't recognize the monitor. The Win7 machine has no problem with
it.

Thoughts? Thanks



#6 - Curious - October 25th 09, 07:50 PM

All DVI-to-VGA or DVI-to-component adapters are DVI-I to something
adapters, since they use the four analog pins on a graphics card's DVI-I
connector (the I stands for integrated, meaning both digital and analog
capable) to provide the VGA or component signal. If your graphics card and
drivers do not support VGA output on those four analog pins, then the
adapter cannot do anything, since there is no analog signal available to
connect to. Cables are not the issue, since you are trying to use an
adapter, not a cable.
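
The rule here reduces to a couple of table lookups. A purely illustrative
sketch (the connector names follow the DVI spec; card_drives_analog stands
in for the unknown in this thread, namely whether the card and driver put
an analog signal on those four pins):

    # DVI-D carries digital only, DVI-A analog only, DVI-I carries both.
    ANALOG, DIGITAL = "analog", "digital"

    SIGNALS = {
        "DVI-D": {DIGITAL},
        "DVI-A": {ANALOG},
        "DVI-I": {DIGITAL, ANALOG},
    }

    def vga_adapter_works(connector, card_drives_analog):
        # A passive DVI-to-VGA adapter just rewires the analog pins, so
        # it needs the pins to exist AND the card to actually drive them.
        return ANALOG in SIGNALS[connector] and card_drives_analog

    for drives in (True, False):
        print("DVI-I port, analog driven:", drives,
              "-> adapter works:", vga_adapter_works("DVI-I", drives))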

"Charles W Davis" wrote in message
...
Curious, just an update. I have been to the Iogear web site. I found one
bit of information. The connector should be a DVI-I to VGA. I have chased
all over town and the only connector that mentions DVI-I is a Radio Shack
item that reads "DVI-A Female to HDD/VGA Male Adapter." It didn't work. On
the back is a note that reads: "Note: This adapter works with DVI-A
(analog) and DVI-I (digital and analog integrate) cables. It does not work
with DVI-D (digital) cables."
"Curious" wrote in message
...
I have a 8500GT card and there is no dongle for it to connect to VGA using
a DVI-VGA adapter. My card came with a 7Pin round mini pin connection and
a dongle for it that supports a component or S-video connection. My card
also has a standard VGA connection. Each manufacturer of 8500GT cards
can have different output connections and connection options. So the
specific make of your 8500GT card is important to know its abilities.

"Charles W Davis" wrote in message
...
GeForce 8500 GT, I see nothing on the NVIDIA site that explains...
"Curious" wrote in message
...
Unless the specs or instructions for the graphics card in your XP
machine specifically state that can output either DVI using a DVI cable
or VGA using a DVI to VGA connector(dongle) the odds are that it does
not.

"Charles W Davis" wrote in message
...
I have a nearly new XP machine. I also have a new Win7 machine. I
continually work on other computers for our computer club.

I have bought a 4 port KVM switch with DVI in and out. My XP machine
(although it has a DVI connection on the back of the video card) won't
connect. I have now added a DVI to VGA connector at the XP machine.
Still won't recognize the monitor. The Win7 machine has no problem
with it.

Thoughts? Thanks



#7 - Charles W Davis - October 25th 09, 09:20 PM

When attached directly to a Samsung flat panel monitor (its mate for two
years), it functions flawlessly, using a VGA cable.
"Curious" wrote in message
...
All DVI to VGA or DVI to Component adapters are DVI-I to XXX adapters
since they are using the 4 analog pins on a graphics card DVI-I( the I
stands for integrated digital and analog support capable) connector to
provide the VGA or Component content. If your graphics card and drivers
do not support the output of VGA using the 4 analog pins then the adapter
can not do anything since there is no analog signal available to connect
to.
Cables are not an issue since you are trying to use an adapter an not a
cable.

"Charles W Davis" wrote in message
...
Curious, just an update. I have been to the Iogear web site. I found one
bit of information. The connector should be a DVI-I to VGA. I have chased
all over town and the only connector that mentions DVI-I is a Radio Shack
item that reads "DVI-A Female to HDD/VGA Male Adapter." It didn't work.
On the back is a note that reads: "Note: This adapter works with DVI-A
(analog) and DVI-I (digital and analog integrate) cables. It does not
work with DVI-D (digital) cables."
"Curious" wrote in message
...
I have a 8500GT card and there is no dongle for it to connect to VGA
using a DVI-VGA adapter. My card came with a 7Pin round mini pin
connection and a dongle for it that supports a component or S-video
connection. My card also has a standard VGA connection. Each
manufacturer of 8500GT cards can have different output connections and
connection options. So the specific make of your 8500GT card is
important to know its abilities.

"Charles W Davis" wrote in message
...
GeForce 8500 GT, I see nothing on the NVIDIA site that explains...
"Curious" wrote in message
...
Unless the specs or instructions for the graphics card in your XP
machine specifically state that can output either DVI using a DVI
cable or VGA using a DVI to VGA connector(dongle) the odds are that it
does not.

"Charles W Davis" wrote in message
...
I have a nearly new XP machine. I also have a new Win7 machine. I
continually work on other computers for our computer club.

I have bought a 4 port KVM switch with DVI in and out. My XP machine
(although it has a DVI connection on the back of the video card)
won't connect. I have now added a DVI to VGA connector at the XP
machine. Still won't recognize the monitor. The Win7 machine has no
problem with it.

Thoughts? Thanks




#8 - Curious - October 25th 09, 09:36 PM

I am now confused; I am not sure what works and what does not work. When
you say that "it" functions when attached directly to a Samsung flat panel
monitor, I am not sure what "it" is. Is "it" a direct VGA connection from
a 15-pin D-sub connector on the graphics card to the monitor? Or is it a
DVI-I to VGA adapter, to a VGA cable, to the monitor?

"Charles W Davis" wrote in message
...
When attached directly to a Samsung flat panel monitor (its mate for two
years) it functions flawlessly, using VGA cable.
"Curious" wrote in message
...
All DVI to VGA or DVI to Component adapters are DVI-I to XXX adapters
since they are using the 4 analog pins on a graphics card DVI-I( the I
stands for integrated digital and analog support capable) connector to
provide the VGA or Component content. If your graphics card and drivers
do not support the output of VGA using the 4 analog pins then the adapter
can not do anything since there is no analog signal available to connect
to.
Cables are not an issue since you are trying to use an adapter an not a
cable.

"Charles W Davis" wrote in message
...
Curious, just an update. I have been to the Iogear web site. I found one
bit of information. The connector should be a DVI-I to VGA. I have
chased all over town and the only connector that mentions DVI-I is a
Radio Shack item that reads "DVI-A Female to HDD/VGA Male Adapter." It
didn't work. On the back is a note that reads: "Note: This adapter works
with DVI-A (analog) and DVI-I (digital and analog integrate) cables. It
does not work with DVI-D (digital) cables."
"Curious" wrote in message
...
I have a 8500GT card and there is no dongle for it to connect to VGA
using a DVI-VGA adapter. My card came with a 7Pin round mini pin
connection and a dongle for it that supports a component or S-video
connection. My card also has a standard VGA connection. Each
manufacturer of 8500GT cards can have different output connections and
connection options. So the specific make of your 8500GT card is
important to know its abilities.

"Charles W Davis" wrote in message
...
GeForce 8500 GT, I see nothing on the NVIDIA site that explains...
"Curious" wrote in message
...
Unless the specs or instructions for the graphics card in your XP
machine specifically state that can output either DVI using a DVI
cable or VGA using a DVI to VGA connector(dongle) the odds are that
it does not.

"Charles W Davis" wrote in message
...
I have a nearly new XP machine. I also have a new Win7 machine. I
continually work on other computers for our computer club.

I have bought a 4 port KVM switch with DVI in and out. My XP machine
(although it has a DVI connection on the back of the video card)
won't connect. I have now added a DVI to VGA connector at the XP
machine. Still won't recognize the monitor. The Win7 machine has no
problem with it.

Thoughts? Thanks




#9 - smlunatick - October 29th 09, 05:44 PM

A 15-pin VGA cable into a 15-pin VGA socket on the monitor is an analog
VGA connection.
#10 - Charles W Davis - November 2nd 09, 01:19 AM


"smlunatick" wrote in message
...
On Oct 25, 9:36 pm, "Curious" wrote:
I am now confused I am not sure what works and what does not work.
When you say when attached directly to a Samsung Flat Panel Monitor "It"
functions I am not sure what "it: is.
Is "it" a direct VGA connection from a 15 pin Din connector on the
graphics
card to the monitor. Or is it a DVI_I to VGA adapter to a VGA cable to the
monitor?

"Charles W Davis" wrote in
...

When attached directly to a Samsung flat panel monitor (its mate for two
years) it functions flawlessly, using VGA cable.
"Curious" wrote in message
...
All DVI to VGA or DVI to Component adapters are DVI-I to XXX adapters
since they are using the 4 analog pins on a graphics card DVI-I( the I
stands for integrated digital and analog support capable) connector to
provide the VGA or Component content. If your graphics card and drivers
do not support the output of VGA using the 4 analog pins then the
adapter
can not do anything since there is no analog signal available to
connect
to.
Cables are not an issue since you are trying to use an adapter an not a
cable.


"Charles W Davis" wrote in message
...
Curious, just an update. I have been to the Iogear web site. I found
one
bit of information. The connector should be a DVI-I to VGA. I have
chased all over town and the only connector that mentions DVI-I is a
Radio Shack item that reads "DVI-A Female to HDD/VGA Male Adapter." It
didn't work. On the back is a note that reads: "Note: This adapter
works
with DVI-A (analog) and DVI-I (digital and analog integrate) cables.
It
does not work with DVI-D (digital) cables."
"Curious" wrote in message
...
I have a 8500GT card and there is no dongle for it to connect to VGA
using a DVI-VGA adapter. My card came with a 7Pin round mini pin
connection and a dongle for it that supports a component or S-video
connection. My card also has a standard VGA connection. Each
manufacturer of 8500GT cards can have different output connections and
connection options. So the specific make of your 8500GT card is
important to know its abilities.


"Charles W Davis" wrote in message
...
GeForce 8500 GT, I see nothing on the NVIDIA site that explains...
"Curious" wrote in message
.. .
Unless the specs or instructions for the graphics card in your XP
machine specifically state that can output either DVI using a DVI
cable or VGA using a DVI to VGA connector(dongle) the odds are that
it does not.


"Charles W Davis" wrote in message
...
I have a nearly new XP machine. I also have a new Win7 machine. I
continually work on other computers for our computer club.


I have bought a 4 port KVM switch with DVI in and out. My XP
machine
(although it has a DVI connection on the back of the video card)
won't connect. I have now added a DVI to VGA connector at the XP
machine. Still won't recognize the monitor. The Win7 machine has
no
problem with it.


Thoughts? Thanks


15 pin VGA cable to a 15 pin VGA "socket" port on the monitor is an
Analog VGA cable.

I know what the cables and sockets are. I have returned the KVM switch and
ordered this:
http://www.buy.com/prod/startech-com...207534815.html
It should arrive in a couple of days.

 



