Discussion:
Blue, white, and HDMI cables for a 23-inch diagonal monitor
Jean Fredette
2019-03-20 01:50:55 UTC
Permalink
I was given a 23-inch diagonal monitor and desktop that has all three
colors (blue white & hdmi) and the desktop can output blue white or hdmi
but the owners who gave it to me couldn't find the cables.

Does it matter in the "quality" of the end result which cable I buy?
😉 Good Guy 😉
2019-03-20 02:04:45 UTC
Permalink
Post by Jean Fredette
I was given a 23-inch diagonal monitor and desktop that has all three
colors (blue white & hdmi) and the desktop can output blue white or hdmi
but the owners who gave it to me couldn't find the cables.
Does it matter in the "quality" of the end result which cable I buy?
What is this blue white thing you are talking about? If you just want a
cable for HDMI output then go and buy one with gold-plated connectors.
Cables are pretty cheap these days, so buy any IMO as long as they are
called HDMI!

Is this the first time you are trying to connect a monitor to your
desktop? The monitor must have a VGA connection as well, which will work
the way many people have connected monitors so far. HDMI is supposed to
give you a clearer picture, but most people don't even notice it as long
as there is some output to work with.

You just need to cut the crap and stop talking about a "diagonal monitor",
because they are all monitors. I've never heard of a diagonal monitor; you
would need to turn your head to read anything on the screen if it were
diagonal! After some time you'd get tired of doing that.
--
With over 950 million devices now running Windows 10, customer
satisfaction is higher than any previous version of windows.
Paul
2019-03-20 02:36:44 UTC
Permalink
Post by Jean Fredette
I was given a 23-inch diagonal monitor and desktop that has all three
colors (blue white & hdmi) and the desktop can output blue white or hdmi
but the owners who gave it to me couldn't find the cables.
Does it matter in the "quality" of the end result which cable I buy?
With this amount of info to go on, I'd take HDMI, because "it works".

If you want additional feedback:

1) Provide the monitor make and model number. E.g. Acer MX123 or Asus 97GX
This allows reviewing the native resolution we're trying to hit
(which is likely 1920x1080). And perhaps the Internet has a picture
of the back of the item.

2) If you provide the information about the driving video card itself,
that allows heading off trouble. This is usually harder for posters
to dig up, unless they held onto the video card box when they bought it.
The only reason we care in this case is if the video card is an AGP one
from when NVidia made their first cards with digital output: there are
a couple of card models which only do a 135MHz clock, when the spec
says they should work out to 165MHz. That might prevent such a card
from reaching 1920x1080, and you might consider using the VGA
output in such an (obscure) case. My NVidia 7900GT, for example: I don't
think I have any accurate identifiers on it, and Device Manager only tells
you "7900GT/7900GTX", so it won't say specifically what it is. Then
I might attempt to use GPU-Z from Techspot.

The chances of you being screwed on (2) are slim to none.
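
(If you want to check what the card reports itself as without installing
anything, here's a rough Python sketch that just shells out to the built-in
WMI command-line tool; wmic ships with Windows 7, and the class and property
names used here are the standard ones, but treat it as a sketch rather than
a polished tool.)

# Quick sketch: ask Windows what it thinks the video card is.
# Uses the built-in "wmic" tool, so no extra install is needed on Windows 7.
import subprocess

result = subprocess.run(
    ["wmic", "path", "Win32_VideoController", "get", "Name,DriverVersion"],
    capture_output=True, text=True)
print(result.stdout)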

If you have a really old video card, an ancient one from when dual
head first came out, the second connector on those might only
do 1024x768, but you're not likely to have something that
ancient on a Windows 7 computer. There could be corner cases.
We were able to buy ATI 7000 video cards for years and years,
so there was rubbish in the retail channel for a long, long time
(i.e. you're unlikely to still be able to find specs for it).

But if you have HDMI at both ends in front of you, then
that is highly likely to be "the answer". To get the best
price, don't buy locally, unless you have a Fry's maybe.
I could see you paying $25 for a $3 cable with some
"Monster" branding on it, and gold-tinted connectors.

*******

If you want to use 25 feet of HDMI cable, then you're going to
need a higher quality cable. It is possible to run some
distance, but you might not get it right on your first
purchase. The higher the resolution, the higher the clock
rate, and the harder it is to get decent signal amplitude
on a really long cable. You could likely send 1024x768 @ 60Hz over
HDMI a good long distance. Sending 4K at 144Hz via a modern
HDMI standard, less so.
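
(To put rough numbers on that, a back-of-envelope sketch in Python; it
ignores blanking intervals and the newer encodings, so the real figures are
somewhat higher, but the ratio is the point.)

# TMDS sends 10 bits on the wire for every 8-bit pixel byte, per colour lane.
def per_lane_gbit(width, height, refresh_hz):
    pixel_clock = width * height * refresh_hz   # active pixels only
    return pixel_clock * 10 / 1e9

print(per_lane_gbit(1024, 768, 60))    # ~0.47 Gbit/s per lane - easy on a long cable
print(per_lane_gbit(3840, 2160, 144))  # ~12 Gbit/s per lane - needs a short, good cable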

HTH,
Paul
Jean Fredette
2019-03-20 08:09:21 UTC
Permalink
Post by Paul
1) Provide the monitor make and model number. E.g. Acer MX123 or Asus 97GX
On the front it says LG FLATRON E2341.
On the back, it says LG Flatron E2341V-BN, model E2341V, December 2011.

The computer has blue (vga), white (dvi), and dark (hdmi).
The monitor also has a headphone jack but I don't see speakers anywhere.

I don't see a jack for sound input to the monitor.
Where does it get the sound for the headphone from?
Post by Paul
2) If you provide the information about the driving video card itself,
I might attempt to use GPU-Z from Techspot.
Do you mean techspot or techpowerup?
https://www.techspot.com/downloads/5716-gpu-monitor.html
https://www.techpowerup.com/download/gpu-z/

I couldn't get the techspot to download so I used techpowerup.
NVIDIA GeForce 201
Driver version 9.18.13.4174 (NVIDIA 341.74)
Paul
2019-03-20 09:53:47 UTC
Permalink
Post by Jean Fredette
Post by Paul
1) Provide the monitor make and model number. E.g. Acer MX123 or Asus 97GX
On the front it says LG FLATRON E2341.
On the back, it says LG Flatron E2341V-BN, model E2341V, December 2011.
The computer has blue (vga), white (dvi), and dark (hdmi).
The monitor also has a headphone jack but I don't see speakers anywhere.
I don't see a jack for sound input to the monitor.
Where does it get the sound for the headphone from?
Post by Paul
2) If you provide the information about the driving video card itself,
I might attempt to use GPU-Z from Techspot.
Do you mean techspot or techpowerup?
https://www.techspot.com/downloads/5716-gpu-monitor.html
https://www.techpowerup.com/download/gpu-z/
I couldn't get the techspot to download so I used techpowerup.
NVIDIA GeForce 201
Driver version 9.18.13.4174 (NVIDIA 341.74)
That's a 1920x1080 monitor, and either HDMI or DVI would be sufficient.
Single-link DVI works to 1920x1200 @ 60Hz CVT-RB (reduced blanking), so
it covers 1920x1080 OK.

At that resolution, VGA would still work, but might look slightly
worse than the digital ones.

You can then decide, which cable is cheaper, between DVI and HDMI,
as electrically they could work with a single lane.

*******

Yes, this is probably what I was thinking about.

https://www.techpowerup.com/download/gpu-z/

It's probably a Geforce 210. And you're lucky to get this from
the VGA perspective, as in 2018 VGA kinda disappeared from
video card faceplates. In 2019, people are buying active
adapters to go from HDMI to VGA or DisplayPort to VGA. Your
card is still fully functional, by the looks of it. But VGA is
best kept to more pedestrian resolutions like 1600x1200;
pushing it all the way to 2048 would be kinda nutty
and wouldn't look too good. The two digital standards tend
and wouldn't look too good. The two digital standards tend
to keep their quality (with short cable, good quality cable).

https://www.newegg.com/Product/Product.aspx?Item=9SIA6ZP3R86688

1 x VGA 2048x1536
1 x DL-DVI-I Dual Link DVI 2560x1600 or
VGA via passive adapter at 2048x1536.
1 x HDMI (1.3A claimed by one source...)

You'll notice as well that NVidia absolutely refuses to tell
us which HDMI standards version that implements. It has to do
1920x1080 at least. 1920x1080 fits within an HDMI clock of 165MHz,
the same limit as single-link DVI. But later standards of HDMI have
faster clocks, like 330MHz, and that's where the higher resolutions
come from. Knowing the card has DVI and the DVI does 165MHz (its
final limit), we know the HDMI has to be at least that good.

https://www.geforce.com/hardware/desktop-gpus/geforce-210/specifications
https://www.nvidia.in/object/product_geforce_210_in.html

The review here seems to be claiming it is HDMI 1.3a, and has
8 channel LPCM audio over HDMI capability. I'm not sure that
people in the field were seeing this, but this is the
"chartware" from NVidia. For some reason, the HDMI version
is suppressed in adverts. Like, maybe it's broken or something.
You don't suppress a spec like that, unless you're ashamed of
something.

https://www.pcper.com/reviews/Graphics-Cards/Galaxy-GeForce-210-and-GT-220-Review-NVIDIA-40nm-GPUs-hit-consumers

And HDMI 1.3a here has a clock of 330MHz. That's the "transportation
equivalent" of dual-link DVI: two DVI links in parallel at 165MHz are the
same as the one lane on HDMI running at 330MHz. Which means, in theory,
2560x1600 @ 60Hz should be in reach.
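
(A rough sanity check on those clock budgets, assuming about 15% blanking
overhead for reduced-blanking timings; the exact figures depend on the
timing standard used.)

# Approximate pixel clock needed, assuming ~15% blanking overhead (assumption).
def approx_clock_mhz(width, height, refresh_hz, overhead=0.15):
    return width * height * refresh_hz * (1 + overhead) / 1e6

print(approx_clock_mhz(1920, 1080, 60))  # ~143 MHz, fits the 165 MHz single-link budget
print(approx_clock_mhz(2560, 1600, 60))  # ~283 MHz, needs the 330 MHz "dual-link" budget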

With that out of the way, you can use either the DVI or HDMI.
And because the video card connector is DVI-I, there should
not be a cabling problem. On some later cards, such as a 2018 card
stripped of all VGA, the connector on the video card end
is DVI-D and the VGA blade slots are filled in with plastic.

If using the DVI cable, you want to verify both ends:
check whether they're DVI-D or DVI-I, and look at the cable carefully
to ensure there isn't a conflict. I don't expect a problem,
but sometimes "stuff happens". The video card end, on your
card, appears to support "any cable".

https://en.wikipedia.org/wiki/Digital_Visual_Interface

The monitor end looks DVI-D. DVI-D on both ends should work.


Paul
Jean Fredette
2019-03-20 14:56:54 UTC
Permalink
Post by Paul
You can then decide, which cable is cheaper, between DVI and HDMI,
as electrically they could work with a single lane.
If that's the choice then I'll get HDMI since I might need it in the
future. The price of the cable isn't an issue I care about. I just want to
choose the right cable type since the GeForce 210 and Flatron E2341 have 3
choices.
Post by Paul
It's probably a Geforce 210.
You are right. I must have transposed the letters.
I checked GPU-Z again where it's a GeForce 210 as you said.
Post by Paul
The two digital standards tend
to keep their quality (with short cable, good quality cable).
I was thinking six feet or maybe ten feet as four is too short I think.

The desktop will sit on the floor to the side of the monitor where the
cable has to snake around the desk a bit to get to the monitor down below.
Post by Paul
The review here seems to be claiming it is HDMI 1.3a, and has
8 channel LPCM audio over HDMI capability.
I think what you're saying is that the hdmi will carry the audio to the
headphone jack of the LG Flatron E2341 monitor but the DVI will not?
Post by Paul
With that out of the way, you can use either the DVI or HDMI.
Thank you for that research where both will work.
Did I correctly read you that the DVI does not carry the audio?

I think from what you wrote, I'll buy a 6 or 10 foot hdmi cable where the
main difference is only that the hdmi carries audio?

My main confusion is that I think you said the dvi has slightly better
resolution under some circumstances? But I am just using it for normal
things where the basic good resolution should be ok for me.
Paul
2019-03-20 21:43:06 UTC
Permalink
Post by Jean Fredette
Post by Paul
You can then decide, which cable is cheaper, between DVI and HDMI,
as electrically they could work with a single lane.
If that's the choice then I'll get HDMI since I might need it in the
future. The price of the cable isn't an issue I care about. I just want to
choose the right cable type since the GeForce 210 and Flatron E2341 have 3
choices.
Post by Paul
It's probably a Geforce 210.
You are right. I must have transposed the letters.
I checked GPU-Z again where it's a GeForce 210 as you said.
Post by Paul
The two digital standards tend
to keep their quality (with short cable, good quality cable).
I was thinking six feet or maybe ten feet as four is too short I think.
The desktop will sit on the floor to the side of the monitor where the
cable has to snake around the desk a bit to get to the monitor down below.
Post by Paul
The review here seems to be claiming it is HDMI 1.3a, and has
8 channel LPCM audio over HDMI capability.
I think what you're saying is that the hdmi will carry the audio to the
headphone jack of the LG Flatron E2341 monitor but the DVI will not?
Post by Paul
With that out of the way, you can use either the DVI or HDMI.
Thank you for that research where both will work.
Did I correctly read you that the DVI does not carry the audio?
I think from what you wrote, I'll buy a 6 or 10 foot hdmi cable where the
main difference is only that the hdmi carries audio?
My main confusion is that I think you said the dvi has slightly better
resolution under some circumstances? But I am just using it for normal
things where the basic good resolution should be ok for me.
1) All ports have roughly the same resolution choices,
give or take a bit. At least in the current situation
all are theoretically better than is needed for the
1920x1080 application at 60Hz. I'd have to be more careful
shooting from the hip, if your monitor was 144Hz (gamer monitor).

We don't like to push VGA too far, because the cabling is the
issue with VGA. The connector design isn't suited to "high frequency
signaling". So just picking a figure out of the air, I suggest
that maybe 1600x1200 is the point at which the digital ones
might start looking better, and VGA is running out of steam.
At 1024x768, you likely couldn't tell the difference between
VGA and HDMI.

2) HDMI appears to have audio in this case. But for the time,
this might have been the first generation of low end card
with the audio integrated. A previous generation used SPDIF
passthru, with the HDMI standard unaltered and having "slots"
for 8 channel audio.

3) I can see reports of "funny things happening" with the audio
over DVI. It appears it can work.

https://forums.tomsguide.com/threads/audio-through-dvi.233582/

The hard part, would be digging up a technical backing for it.

The "swapping connectors thing" started before audio carriage
existed.

https://en.wikipedia.org/wiki/Digital_Visual_Interface

"For example, an HDMI display can be driven by a DVI-D source
because HDMI and DVI-D both define an overlapping minimum set
of supported resolutions and frame buffer formats.

Some DVI-D sources use non-standard extensions to output HDMI
signals including audio (e.g. ATI 3000-series and
NVIDIA GTX 200-series).[9] Some multimedia displays use a
DVI to HDMI adapter to input the HDMI signal with audio.
Exact capabilities vary by video card specifications."

To me, where this might "break" is if the monitor were
2560x1600 and used dual-link DVI; I doubt the audio would
work over that, because the "overlap of standards" no longer
applies when dual lanes are needed for DVI carriage. But in
your case, a 165MHz clock on a single TMDS (transition-minimized
differential signaling) interface means "easy swapping" via passive
connector conversion. There's sufficient overlap of standards
at 1920x1080 for this to be possible. The monitor also plays a
part, if, say, the designer chose to be picky and "only supported
legacy (no audio) data extraction" on the DVI.

If you had an Apple 30" Cinema display with speakers, this
would be a much more iffy proposition, because you'd need two
lanes on the DVI to work.

But the above information suggests "it could happen" at 1920x1080.

The Wikipedia article refers to EDID as well, and both HDMI and
DVI have EDID (the monitor declares what it supports), so that
should work on both of them. HDMI has additional functions
using the extra wires it's got, such as CEC for switching
equipment off. On a TV set, if you hit the power button,
using CEC the BluRay player would power down, because
the TV would tell the BluRay player "we won't be needing
you now". That channel doesn't exist on DVI, so a DVI to
HDMI adapter would have no signal to drive that pin. There
might be a similar issue with the audio return channel or something,
which isn't an issue in this case; audio return channel is
more of a home theater issue. These are examples of pins
not driven when a DVI to HDMI adapter is used.

HDMI

Pin 13  CEC (Consumer Electronics Control extensions, power down)
Pin 14  Reserved (HDMI 1.0–1.3a)
        Utility/HEAC+ (HDMI 1.4+, optional, HDMI Ethernet Channel
        and Audio Return Channel)
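
(As an aside on the EDID mentioned above: on Windows the EDID block the
monitor hands over gets cached in the registry, so you can peek at it with
a few lines of Python. This is a sketch; the registry path is the usual
one, but the exact subkeys vary per machine, and it lists every monitor
that has ever been plugged in, not just the current one.)

# Sketch: dump cached EDID blocks from the registry (Windows, standard winreg module).
import winreg

base = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, base) as display:
    for i in range(winreg.QueryInfoKey(display)[0]):
        model = winreg.EnumKey(display, i)
        with winreg.OpenKey(display, model) as mkey:
            for j in range(winreg.QueryInfoKey(mkey)[0]):
                inst = winreg.EnumKey(mkey, j)
                try:
                    with winreg.OpenKey(mkey, inst + r"\Device Parameters") as params:
                        edid = winreg.QueryValueEx(params, "EDID")[0]
                        print(model, len(edid), "bytes of EDID")
                except OSError:
                    pass   # no EDID cached for this instance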

Paul
Jean Fredette
2019-03-21 01:08:01 UTC
Permalink
Post by Paul
1) All ports have roughly the same resolution choices,
I was surprised, but you are right: the vga resolution was no different
from the hdmi. The only difference I notice is that the hdmi carries the
headphone audio (which I won't likely use but it's ok to have anyway).
Post by Paul
2) HDMI appears to have audio in this case.
Yes.
Post by Paul
3) I can see reports of "funny things happening" with the audio
over DVI. It appears it can work.
I didn't realize dvi also handled audio.
I'm fine with the 10 foot hdmi cable I bought.

They had twenty, thirty, forty, and fifty dollar 10 foot cables!
I just bought the cheapest 10 foot cable they had.

It's called osmartech electronic High-Speed HDMI Cable (10 feet/3 meters)
Gaming Edition, designed for PS4 and Xbox One game consoles,
Ver 1.4, supports 1080p, 4k2k, Ethernet, and 3D.
The SKU is 7 00253 86173 0

It was 7 dollars.
Ken Blake
2019-03-20 15:28:51 UTC
Permalink
On Wed, 20 Mar 2019 03:09:21 -0500, Jean Fredette
Post by Jean Fredette
Post by Paul
1) Provide the monitor make and model number. E.g. Acer MX123 or Asus 97GX
On the front it says LG FLATRON E2341.
On the back, it says LG Flatron E2341V-BN, model E2341V, December 2011.
The computer has blue (vga), white (dvi), and dark (hdmi).
The monitor also has a headphone jack but I don't see speakers anywhere.
Jacks for loudspeakers are usually on the computer, not the monitor.
Jean Fredette
2019-03-20 17:40:56 UTC
Permalink
Post by Ken Blake
Jacks for loudspeakers are usually on the computer, not the monitor.
I followed Paul's advice which activated the monitor headphone jack.

The output with both the vga & hdmi cable was the same at 1920x1080.
With hdmi I also get sound out of the headphone jack on the monitor.

I found switching to sound outputs was different than I expected.
I had expected that merely plugging in the headphone jack would instantly
disable the speakers, which is what happens with my external speakers
(which also have a headphone jack).

That would have been a problem with this monitor, because just plugging the
headphones in takes more effort than you'd want: you have to turn the
monitor around, flip it upside down, and find the tiny headphone port in a
crevice.

I was worried because it's pretty hard to insert the headphone plug into
the back bottom crevice of the Flatron E2341 monitor, so I was very happy
to find out that there is a software choice to switch between 3 outputs:

E2341 (NVIDIA High Definition Audio)
Speakers (High Definition Audio Device)
Digital Audio (S/PDIF)(High Definition Audio Device)

The first of those 3 switches audio to the headphone jack on the monitor.
It's nice that this has a software control so that I can leave the
headphones plugged into the monitor full time.

The second switches to the external speakers I connected to the desktop.
The sound plays out of those speakers if no headphone is connected to them,
but the output automatically switches in hardware to headphones if I
connect headphones to those external speakers. So I can't leave the
headphones plugged into the external speakers full time.

I have no idea what the third choice switches to though.
Paul
2019-03-20 22:13:40 UTC
Permalink
Post by Jean Fredette
I have no idea what the third choice switches to though.
SPDIF:

That's either TOSLink (a glowing red LED color leaks through the
rubber-capped square connector on the back of the PC), or it
can be delivered by a legacy copper coax connection (RCA/Cinch
connector, the same style of connector which peppers the back
of large screen TV sets). The SPDIF connector isn't
keyed, so it can "go in the wrong holes", and with RCA/Cinch
you do have to be careful. Audio doesn't have the same
attention to detail as computer connectors; at least
a few things on computers "won't fit", to help prevent
you from blowing stuff up.

You can see rather poor examples of both here. (The RCA/Cinch
should be showing more metal reflections in the picture.)

https://www.dx.com/p/spdif-toslink-to-coaxial-spdif-coaxial-to-spdif-coaxial-converter-2000407#.XJK1YaUwDQx

The TOSLink is a fiber optic connection that uses plastic "dental" fiber,
the same kind of fiber that conducts light for dental work.
The connector is squarish on its perimeter. On the "out"
connector, you would see red LED light "leaking" from
under the rubber cover on the port. You peel back the
rubber cover, before inserting the cable.

The RCA/Cinch is for the copper equivalent of the signal.

https://en.wikipedia.org/wiki/RCA_connector

The signal in that case is carried on coaxial cable, with
RCA/Cinch on either end. The signal has a nominal characteristic
impedance of 75 ohms, and the RCA/Cinch is a lousy choice for
maintaining a controlled-impedance environment (it's not an RF
connector). When you look at cable TV connectors,
F-series maybe, those are 75 ohms too and are going to be
closer to being the proper impedance for their application.
RCA/Cinch are used for:

Audio speakers
Line In audio
Composite video (red/yellow/white on TV back)
YPbPr video ?
SPDIF

The Speaker Out on a high power stereo, would likely
have sufficient amplitude to destroy the SPDIF-In on
your AV receiver, if connecting the cable to the wrong holes.

But in any case, both the TOSLink and RCA SPDIF standards carry roughly
a 6Mbit/sec stream, sufficient for two channels of high definition
audio. An alternate format is to support four channels
using "fewer bits", which would not particularly be
an audiophile choice. I keep seeing references to that
mode, but nothing seems to use it.
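
(The arithmetic behind that figure, roughly: each sample travels in a 32-bit
subframe, two subframes per frame for left and right, and the biphase-mark
coding on the wire doubles the transition rate, which is likely where the
"about 6 Mbit/sec" number comes from at 48 kHz.)

# S/PDIF framing arithmetic (two channels, 32-bit subframes).
sample_rate = 48000
payload_bits = sample_rate * 2 * 32          # 3.072 Mbit/s of framed payload
wire_rate = payload_bits * 2                 # biphase-mark coding doubles transitions
print(payload_bits / 1e6, "Mbit/s payload,", wire_rate / 1e6, "Mbaud on the wire")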

When computers first got "digital audio", it was just
that RCA connector. It was only later that TOSLink
optical output was offered, and at first, there
was an adapter card you put in a PC slot with the
simple driving components (LED and switching transistor).
Later, TOSLink got put in the I/O plate area, and
PCs could have both TOSLink and SPDIF at the same time.
When you have both, you can drive two home theater
receivers at the same time, since SPDIF is unidirectional
and the PC only "sends" to each AV receiver. The TOSLink
and RCA, are copies of the same signal.

PCs have also had SPDIF-in, but that was only
via the adapter card that sits in a slot. The reason
the industry tried to "hide" that one was DRM and
"making perfect copies" of audio content. They
didn't want to encourage people to use computers
to record digital audio. But the bastards did have a way
to "get even": some 24 bit audio sent that way
had the 8 least significant bits "set to zero"
to "ruin" the resolution, making perfect copies
impossible. So while the user might have a big
shit-eating grin on their face "recording 24 bit audio",
they were in fact only getting 16 bit copies. You could
always examine a recording with your hex editor
and figure out you were "ripped off".
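
(A small Python version of the hex-editor check described above; the file
name is just a placeholder for whatever you recorded.)

# Check whether a "24-bit" WAV actually uses its bottom 8 bits.
import wave

with wave.open("capture.wav", "rb") as w:        # hypothetical file name
    assert w.getsampwidth() == 3, "not 24-bit samples"
    data = w.readframes(w.getnframes())

# WAV stores 24-bit samples little-endian, so every third byte starting at 0 is an LSB.
if any(data[0::3]):
    print("Low 8 bits carry signal - looks like genuine 24-bit audio.")
else:
    print("Low 8 bits are all zero - effectively a 16-bit copy.")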

Paul
Jean Fredette
2019-03-21 01:08:13 UTC
Permalink
Thank you for showing the pictures.

On the back of the desktop, near the motherboard ports, I found a connector
that looks the same, only with a white covering cap, labeled "Optical Audio
Out".

I was surprised it wasn't anywhere near the Nvidia GeForce 210 graphics
card, so it must come with the motherboard.
Post by Paul
The Speaker Out on a high power stereo, would likely
have sufficient amplitude to destroy the SPDIF-In on
your AV receiver, if connecting the cable to the wrong holes.
I am not an audiophile so I'm only using the "green" connector.

There are six different "headphone like" audio jacks on the back.
Blue, green, and red in the top row.
Brown, black, and grey in the bottom row.

I connected separate powered speakers only to the green audio out jack.
Post by Paul
When computers first got "digital audio", it was just
that RCA connector.
I think you explained the three software selections
[1] E2341 (NVIDIA High Definition Audio)
[2] Speakers (High Definition Audio Device)
[3] Digital Audio (S/PDIF)(High Definition Audio Device)

[1] audio out from the computer through hdmi to the monitor headphone jack
[2] audio out from the computer through the green jack to powered speakers
[3] unused "optical audio out" which I don't think I will ever need to use

I think I'm all set now with the $7 hdmi cable & the software controls!
Thank you for your help!
J. P. Gilliver (John)
2019-03-21 02:04:35 UTC
Permalink
In message <q6uo5r$nd5$***@gioia.aioe.org>, Jean Fredette
<***@jolens.com> writes:
[]
Post by Jean Fredette
There are six different "headphone like" audio jacks on the back.
Blue, green, and red in the top row.
Brown, black, and grey in the bottom row.
[]
Blue: stereo line in. Now very rare on laptops, usually OK on desktops.

Green: stereo line out; may have enough oomph to drive 35 ohm
headphones. (Old sound cards - and I'm talking ISA, like the original
SoundBlaster! - sometimes had enough oomph to drive unpowered speakers,
e.g. 2 watts per channel; but the almost universal provision of powered
speakers stopped sound card manufacturers putting drive in their cards.
[I'm pretty sure they predate the colour code.])

Pink: microphone in. Often (usually, I think) mono - it may be a
three-terminal connector, but the third terminal is bias volts out for
electret mics, not the other channel in.

Brown, black, grey: for rear channel speakers and sub-woofer, in 5.1 or
7.1 channel use. (I forget which is which.)

Those are the defaults; however, many these days have auto-detect,
noticing when you misconnect - plugging an input into an output or vice
versa.
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

in the kingdom of the bland, the one idea is king. - Rory Bremner (on
politics), RT 2015/1/31-2/6
NY
2019-03-20 19:45:47 UTC
Permalink
Post by Ken Blake
Jacks for loudspeakers are usually on the computer, not the monitor.
Although some monitors have a pass-through connector, so you connect the PC
to the monitor, which feeds a socket on the monitor that
headphones/microphones can be plugged into. The same applies to USB: some
have a square input socket for the lead from the computer and then one or
more flat USB output sockets for devices to be plugged into.

In the case of a loudspeaker socket on the monitor, it may even play sound
through the monitor's speakers but mute that if headphones are plugged into
the monitor's socket.

In both cases, you are taking advantage of the monitor's sockets being more
accessible than the corresponding ones on the PC if that is under a desk.
pjp
2019-03-20 03:29:02 UTC
Permalink
In article <q6s69t$1673$***@gioia.aioe.org>, ***@jolens.com
says...
Post by Jean Fredette
I was given a 23-inch diagonal monitor and desktop that has all three
colors (blue white & hdmi) and the desktop can output blue white or hdmi
but the owners who gave it to me couldn't find the cables.
Does it matter in the "quality" of the end result which cable I buy?
Almost surely not. Same as old days and stereos with the gold plated
cables which are no better than regular quality made cables at 10X the
price. Signal is only electrons and they either can or can't traverse
the cable.
Paul
2019-03-20 04:11:16 UTC
Permalink
Post by pjp
says...
Post by Jean Fredette
I was given a 23-inch diagonal monitor and desktop that has all three
colors (blue white & hdmi) and the desktop can output blue white or hdmi
but the owners who gave it to me couldn't find the cables.
Does it matter in the "quality" of the end result which cable I buy?
Almost surely not. Same as old days and stereos with the gold plated
cables which are no better than regular quality made cables at 10X the
price. Signal is only electrons and they either can or can't traverse
the cable.
There is the skin effect, and the question of where the currents for a
high frequency signal actually travel.

But usually the precious-metal tinting on cables is
on the business ends, and not in the part buried under
the plastic insulation. Plain copper under there
would be more the norm. When I started in the business, lots
of stuff was nickel plated, but that isn't a popular practice
any more. The nickel plating stopped corrosion or oxidation.

A cable can have conductive loss as well as dielectric loss.
It's quite possible the dielectric choice is more significant
than the outside conductor finish. HDMI cables do seem to differ
in dielectric loss, so there must be some differences in the "goo"
inside the coaxial or biaxial sections. High speed signals
tend to be differential, and it helps if the environment
the signals move through is "equal" for both halves of the pair.

[Linked picture of the HDMI connector pinout, showing the differential pairs.]

The main diff pairs there are R,G,B, and CLK. The CLK being
one tenth the rate of the data waveforms. By sending a LF clock,
you can synthesize up a sampling clock from it, then phase
shift it dynamically to keep it centered. I have no idea
what the "training method" is on that standard. And how often
it might recalibrate.

When the signal launches, it's low amplitude. Let's pretend
for the sake of argument that it is 1 volt tall. As the signal
moves down the HDMI cable, it shrinks in height. It might
be 0.05 volt high at the receiving end. The receiver thresholds
are just barely sensitive enough to reliably detect that
signal. If you make the cable longer, the slicing action
will be off, the signals will start to get "fuzzy", the
eye opening will close, and the image will start to get
"colored snow" from the transmission errors. If you double
the length of the cable, so little "intelligence" will be
sensed at the far end that you lose "sync". Then the screen goes dark.
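
(A toy model of that shrinkage, with a made-up loss figure just to show the
exponential shape; real cables vary a lot.)

# Toy attenuation model: amplitude falls off exponentially with cable length.
loss_db_per_metre = 1.5          # made-up figure; depends on cable and clock rate
launch_volts = 1.0

for metres in (2, 5, 10, 20):
    received = launch_volts * 10 ** (-loss_db_per_metre * metres / 20)
    print(metres, "m ->", round(received, 3), "V at the receiver")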

Paul
Jean Fredette
2019-03-20 08:09:23 UTC
Permalink
gold plated cables which are no better than regular quality made cables
It's the quality of the output on the monitor screen that I was asking about
for the three possible formats (not the cable quality).
blue vga
white dvi
dark hdmi

And I was also wondering about how it gets the sound since I don't see any
speakers but it has a headphone output but I don't see any sound input.

The monitor is an LG Flatron E2341.
Paul
2019-03-20 10:03:33 UTC
Permalink
Post by Jean Fredette
gold plated cables which are no better than regular quality made cables
It's the quality of the output on the monitor screen that I was asking about
for the three possible formats (not the cable quality).
blue vga
white dvi
dark hdmi
And I was also wondering about how it gets the sound since I don't see any
speakers but it has a headphone output but I don't see any sound input.
The monitor is an LG Flatron E2341.
Audio over HDMI.

The claim is, your video card barely has it. (Audio provided as of
that version of HDMI. 8 channel LPCM.)

NVidia was originally a bit lazy. They put an SPDIF connector
at the top edge of their video cards, and you were supposed
to run a cable from the motherboard, over to the video card.
Your Geforce 210 is supposed to have an actual HDAudio digital
source to drive that function instead. The video card driver
package should mention it also contains an audio driver file.
In Windows, you select "HDMI audio" or at least select some
other option besides the ones your sound card is providing.
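
(If you want to see the same list of playback endpoints from a script
rather than the Sound control panel, a sketch using the third-party
"sounddevice" Python package - pip install sounddevice - would be:)

# List the playback endpoints Windows knows about (HDMI, speakers, SPDIF, ...).
import sounddevice as sd

for index, dev in enumerate(sd.query_devices()):
    if dev["max_output_channels"] > 0:
        print(index, dev["name"])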

The monitor, if it has a headphone jack, could extract
2-channel LPCM from the HDMI and send it to the headphone jack.
A headphone jack is usually good for a 32 ohm load (i.e. not
enough to drive some 100W 2 ohm speakers :-) ). The DVI
might not support that; it's hard to be certain what comes
out of the DVI port.
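
(Rough power arithmetic behind that smiley, with an assumed output voltage:)

# Why a headphone jack won't drive big speakers: P = V^2 / R.
import math
v_headphone = 1.0                         # ~1 Vrms out of a headphone jack (assumption)
print(v_headphone ** 2 / 32, "W into 32 ohm headphones")          # ~0.03 W
print(math.sqrt(100 * 2), "Vrms needed for 100 W into 2 ohms")    # ~14 V, plus real current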

That gives the HDMI cable a slight edge, in terms of
"fun factor" and "what-if".

Paul
NY
2019-03-20 11:41:05 UTC
Permalink
Post by Jean Fredette
I was given a 23-inch diagonal monitor and desktop that has all three
colors (blue white & hdmi) and the desktop can output blue white or hdmi
but the owners who gave it to me couldn't find the cables.
Does it matter in the "quality" of the end result which cable I buy?
I interpret "white" as being DVI and "blue" as being VGA, since those seem
to the industry-standard colours for the plastic insert with the pinholes in
it (in the socket) and the plastic shroud (on the plug).

Good question: faced with a graphics card that can produce all three outputs
(at the same resolution) and a monitor that can accept all three inputs, is
there a preference for HDMI over DVI over VGA? I'm assuming digital output
on DVI or HDMI rather than analogue VGA, so different contact resistance
will not cause a colour cast or missing colour, and cable capacitance will
not cause ghosting/blurring.

I suppose one advantage of HDMI is that the same cable can also carry the
sound.