The online racing simulator
LFS in HD
(65 posts, closed, started )
Quote from logitekg25 :maybe cause hdmi and dvi is either on or off, while vga is on, faded, off.

What exactly do you mean?
Quote from logitekg25 :actually that was my friend...never trusting him again :banghead:

Don't trust anyone!
you have to get to a record shop and listen to a few tunes on record. it's a much warmer sound, more natural if you know what i mean. even the little clicks from dust beat that awful digital distortion you can get sometimes.
HD is 1080p, 1080i and 720p.
ED is 480p (NTSC) and 576p (PAL).
SD is 480i (NTSC) and 576i (PAL).

The p is for progressive, the i is for interlaced. Progressive means the display draws every line of each image, much the same way your computer display does. Interlaced means that every other line is drawn with each successive field, so a full image is built from two half-height fields. 1080i is still called 1080 because it mimics 1080p, but it does it with half the amount of data per refresh.
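A minimal sketch of the interlacing idea above: two half-height fields (even and odd scanlines) weave together into one full progressive frame. The 540-line field size for 1080i is the only assumption here.

```python
# Sketch: weaving two interlaced fields into one progressive frame.
# Each field carries every other scanline; together they rebuild the image.

def weave(top_field, bottom_field):
    """Interleave scanlines: top field fills even rows, bottom field odd rows."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)     # even scanline (0, 2, 4, ...)
        frame.append(bottom_line)  # odd scanline  (1, 3, 5, ...)
    return frame

# A 1080i stream carries 540-line fields; two fields weave into 1080 lines.
top = [f"even line {2 * i}" for i in range(540)]
bottom = [f"odd line {2 * i + 1}" for i in range(540)]
full_frame = weave(top, bottom)
assert len(full_frame) == 1080
```

Each field on its own is only half the data of a full frame, which is exactly why 1080i needs half the bandwidth of 1080p per refresh.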
But doesn't interlaced equal no vsync? In games, for example.
Quote from dadge :http://h10025.www1.hp.com/ewfr ... /Doc/images/c01532378.gif
red = digital
grey = analog
i prefer analog. same with radio. same with music media (vinyl over CD). same with game controllers.

I don't see how that's relevant to this discussion. Computers are digital machines, not analog. If your computer is set to 32bit color, you're going to get the same information whether you use VGA or a digital interface. Digital just uses less power, and reduces interference.
Quote from hazaky :But doesn't interlaced equal no vsync? In games, for example.

It could seem like that, but it's not really the same. Refreshing the screen across two fields is considered better for sports like football/soccer because it helps to avoid flickering in static images.

On the other hand, in movies/games with rapid movement it could lead to something like the no-vsync problem.

(I hope this information is right, I'm not an expert in AV hehe)
Vsync has nothing to do with interlacing. And Vsync has nothing to do with sports on TV, since a 1080i camera is outputting 60fps (fields per second, not frames per second) no matter what.
Quote from wheel4hummer :I don't see how that's relevant to this discussion.

check post #25. a diagram is much easier to understand than an explanation.
Quote from dadge :check post #25. a diagram is much easier to understand than an explanation.

Clearly you don't understand how VGA works.
clearly i don't need to know how it works; i only need to know what it is. clearly, in response to post #25 (which i can only guess you've read), the diagram shows what was being explained. clearly you would have noticed that the discussion went a tad off topic, and that it had moved to analog (record) audio over digital (CD) audio.
clearly you don't understand how a conversation works.

nice use of a generic forum taunt, m8 (stating that a person knows nothing about the given subject). but i kinda saw that one coming. what's next? tell me to "STFU"? maybe pull me up on my lack of grammar? or how about you get the last word in and then put me on your ignore list?
back on topic,
maybe you could find it within yourself to offer an explanation of what VGA is. unless you have deemed me unworthy of such knowledge. i kneel before you, oh humble one. please enlighten me.
You appear to want to start an argument. I just want to make sure that people don't get too much false information.

Modern computers are digital machines. Colors on your computer are digital: your computer uses a color palette, and the colors are NOT infinitely varied over VGA. Sure, it's an analog signal, but you're not getting any more colors than what is represented in the data on your computer. You lose absolutely no color information with a digital video output versus VGA, because the data is digital in the first place when it's rendered on your computer.

Vinyl vs. CD, on the other hand, is a completely different situation. On vinyl, the signal varies continuously, whilst on a CD it is represented digitally, and therefore you lose some information. The closer you approach the Nyquist frequency, the more the signal becomes a triangle wave. This is of course very different from VGA vs. digital.

EDIT: Come to think of it, one actually could make an analogy with VGA and listening to a CD. When you listen to a CD, the digital signal is converted to analog. But you don't have the same information as a signal that was analog in the first place, because the data started out as digital. Same with VGA. Sure, the output is analog, but it was derived from digital information. Vinyl is to CD as real life is to VGA.
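To make the quantization point above concrete, here is a minimal sketch (not anyone's actual DAC code) of CD-style rounding: once an analog value has been snapped to one of the 2^16 levels of 16-bit audio, converting back to analog cannot recover the lost detail.

```python
# Sketch: digital audio quantizes an analog value to a finite grid of levels.
# The round trip analog -> digital -> analog keeps the grid value, not the
# original fine detail.

def quantize(sample, bits=16):
    """Round a sample in [-1.0, 1.0] to the nearest of 2**bits levels."""
    levels = 2 ** (bits - 1)
    return round(sample * levels) / levels

original = 0.123456789012   # a continuously varying "analog" value
cd_value = quantize(original)

# The round trip is close, but any detail finer than one quantization step
# (about 3e-5 at 16 bits) is gone for good.
assert cd_value != original
assert abs(cd_value - original) < 1 / 2 ** 15
```

This is the sense in which "the data started out as digital" matters: playing a CD through a DAC gives you an analog signal, but only of the gridded values, never the original continuous ones.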
on the contrary, bud. you felt like telling me that my post was irrelevant and then went on to tell me that i didn't know what i was talking about, without giving further information to clarify any misunderstandings. to me, you were starting the old classic flame war. but we're done now that we have some clarification.
i was always under the impression that a digital signal is either on or off (like a digital joypad) and an analog signal is progressive (like the analog sticks on said controller).
using that as an example, would i be right in saying that an analog signal is less responsive but more accurate?


getting back to vinyl/CD(not mp3).
for me, i'd only notice the difference when something goes wrong. if there's dust on the record, there will be clicking sounds, but on cd it would skip and sound horrible. a lot of people prefer vinyl because even when there's an imperfection (dust, small scratches, or warp) it can sometimes add to the overall "feel" of the record, whereas on cd it's just a nasty skipping sound. there aren't many things in this world that can be better when damaged or dirty.
Quote from dadge :i was always under the impression that a digital signal is either on or off (like a digital joypad) and an analog signal is progressive (like the analog sticks on said controller).

That's correct, I was just saying that when you're talking about VGA, the data is digital when it's on the computer anyway, and converting it to analog before sending it to the monitor doesn't make the quality any better.

Quote from dadge :
for me, i'd only notice the difference when something goes wrong. if there's dust on the record, there will be clicking sounds. but on cd, it would skip and sound horrible. a lot of people prefere vinyl only because that even when there's an imperfection (dust or small scratches or warp) it can sometimes add to the overall "feel" of the record. whereas on cd, it's just a nasty skipping sound. there's not many things in this world that can be better when damaged or dirty.

I agree with that. I keep my vinyl clean with a swiffer cloth; it really keeps a lot of the dust off. And I have pretty good needles (Shure M44G), which are low-wear needles (low wear for DJ needles, at least; audiophile needles probably wear the record even less). All of my vinyl has fewer scratches than my CDs. I've actually never purchased a CD. All the CDs I have were free.
Quote from wheel4hummer :the data is digital when it's on the computer anyway, and converting it to analog before sending it to the monitor doesn't make the quality any better.

ya know, i deleted something similar to that in my previous post because i felt i may have misunderstood and left myself open for a bit of a smackdown.
i don't mind being wrong about something as long as i am pointed in the right direction.
i use stanton 500 styli for my Technics 1210's. they're the stock needles for most "DJ" turntables. they do the job well and i have always used this model of needle (since 1995). i'm planning on getting a pair (or two) of the STANTON DIABLO V3 cartridges and needles, and a pair of STANTON 681 EEE MKIII for personal listening/practice.
Quote from dadge :http://h10025.www1.hp.com/ewfr ... /Doc/images/c01532378.gif
red = digital
grey = analog
i prefer analog. same with radio. same with music media (vinyl over CD). same with game controllers.

So you have a digital version of the image in your graphics card's memory.
Then it is converted to an analog signal and put out through an analog VGA cable.
The signal loses strength with every electromagnetic interference and, of course, with the length of the cable.
After that the signal gets converted back to digital to display it on your flatscreen.

Are you sure connecting a digital video device through an analog connection is a wise choice?
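The round trip described above can be sketched in a few lines. All the numbers here are illustrative assumptions (0.7 V is the nominal VGA video swing; the attenuation and noise figures are made up): an 8-bit pixel value becomes an analog voltage, picks up a little cable loss and interference, and is re-digitized by the flatscreen.

```python
# Sketch of the digital -> analog -> digital round trip over VGA.
# Voltage range and noise values are assumed for illustration only.

def dac(value, vmax=0.7):
    """Convert an 8-bit pixel value to a voltage (VGA swings 0-0.7 V)."""
    return value / 255 * vmax

def adc(voltage, vmax=0.7):
    """Re-digitize a voltage back to an 8-bit pixel value."""
    return max(0, min(255, round(voltage / vmax * 255)))

pixel = 200
voltage = dac(pixel)
voltage *= 0.98          # ~2% cable attenuation (assumed)
voltage += 0.004         # small interference offset (assumed)
recovered = adc(voltage)

# With 256 levels packed into 0.7 V, the levels sit close together, so
# analog noise shows up directly as a slightly wrong pixel value.
assert recovered != pixel
```

This is the contrast with a two-level digital link: with only 0 and 1 to distinguish, the same amount of noise would be clamped away harmlessly, but with 256 closely spaced levels it lands on a neighboring value.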
Quote from yankman :Are you sure connecting a digital video device through an analog connection is a wise choice?

it has nothing to do with wisdom. and it's been done for years with no problems.
Quote from yankman :
After that the signal gets converted back to digital to display it on your flatscreen.

Are you sure connecting a digital video device through an analog connection is a wise choice?

This is one of the joys of a digital signal - it can be broadcast over an analogue medium and even if the signal has suffered from interference during the time it was analogue, it can be rebuilt into a perfect signal again fairly easily.

As an example, suppose your signal starts out as 1001010, which you then broadcast over an analogue medium. Due to a little interference, the signal received by the analogue receiver at the other end could be:

0.91, 0.1, 0.002, 0.987, 0.04, 0.991, 0.086.

As you know the original signal was either 1 or 0, it's very easy to clamp the analogue values back to their original values. You need a hefty amount of interference to flip a bit from 1 to 0 or vice-versa.
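The clamping step described above is simple enough to show directly. This sketch just thresholds the noisy analog readings from the example back to clean bits:

```python
# Sketch: recovering a digital signal from noisy analog readings.
# Anything above the threshold snaps back to 1, anything below to 0.

def clamp_bits(readings, threshold=0.5):
    """Recover the original bit for each analog reading."""
    return [1 if r > threshold else 0 for r in readings]

# The noisy readings from the example above:
received = [0.91, 0.1, 0.002, 0.987, 0.04, 0.991, 0.086]
assert clamp_bits(received) == [1, 0, 0, 1, 0, 1, 0]  # original 1001010
```

As the post says, a bit only flips if the interference pushes a reading across the threshold, which takes far more noise than it takes to visibly distort a genuinely analog signal.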
i have been running LFS for about a year on a 47" HDTV at 1920x1080, off a DVI to HDMI adapter. I have sound and all from the DVI to HDMI adapter. You have to have a video card capable of sending sound through it; in my case, i have an ASUS P5N-D and a BFG 250GTS 1GB. I had to get a wire to go from my motherboard sound output to the input on the video card, then just make sure audio is enabled in the Nvidia control panel.
Attached image: HDTVinfopic.jpg
Quote from Crashgate3 :This is one of the joys of a digital signal - it can be broadcast over an analogue medium and even if the signal has suffered from interference during the time it was analogue, it can be rebuilt into a perfect signal again fairly easily.

As an example, suppose your signal starts out as 1001010, which you then broadcast over an analogue medium. Due to a little interference, the signal received by the analogue receiver at the other end could be:

0.91, 0.1, 0.002, 0.987, 0.04, 0.991, 0.086.

As you know the original signal was either 1 or 0, it's very easy to clamp the analogue values back to their original values. You need a hefty amount of interference to flip a bit from 1 to 0 or vice-versa.

You couldn't be more wrong. What you are talking about is digital transmission.
In analog transmission the signal, an analog wave, is transmitted via frequency modulation (see here). There is no 0 or 1.

To detail what I was talking about: the digital image is converted to multiple analog waves. These are transmitted via the VGA cable, and afterwards these waves are converted back to a digital signal.
Quote from dadge :before HD became available, the standard resolution on all televisions was 640x480. but to say it's low quality is really stupid. after all, we've been watching it for decades without any problems.

So what you mean by this is that you can get me a video in 480i that is JUST AS CLEAR as a video in 1080p?

And what does a TV have to do with LFS? Sure, ???x480 is enough for an SD signal to an analog TV, but feed a digital (flat) TV with an SD signal and it looks rather ugly (most of the time, of course)..
Quote from Feffe85 : So what you mean by this is that you can get me a video in 480i that is JUST AS CLEAR as a video in 1080p?

no. i said that 640x480 on a normal tv is not as bad as most people think.

Quote from Feffe85 : And what does a TV have to do with LFS?

if you have to ask that question, maybe you should read the thread again.
Quote from Feffe85 : feed a Digital (flat) TV with SD-signal and it looks rather ugly (most of the time ofc)..

of course a 640x480 resolution is going to look crap on a flatscreen tv. the tv stretches the ratio just like lcd monitors do.
Quote from dadge :no. i said that 640x480 on a normal tv is not as bad as most people think.

if you have to ask that question, maybe you should read the thread again. of course a 640x480 resolution is going to look crap on a flatscreen tv. the tv stretches the ratio just like lcd monitors do.

on a normal tv (fatscreen) it looks nice because the set was made for the SD signal, which dates from when the fatscreen revolutionized the world..

he wanted LFS in HD.. doesn't say anything about a TV.. period.

stretched the ratio? If it is a flatscreen with 640x480, will it still look crap as you just stated?

Quote from dadge :before HD became available, the standard resolution on all televisions was 640x480.

Just noticed.. WHEN was 640x480 a STANDARD res in TVs? :S
Quote from Feffe85 : he wanted LFS in HD.. doesnt say anything about a TV.. period.

at a guess i'd say that the OP is NOT the only person to have made a post in this thread. in fact, by the 4th post the discussion was leading more towards television resolutions. but you'd see that if you READ THE WHOLE THREAD!
Quote from Feffe85 :stretched the ratio? If it is a flatscreen with 640x480, will it still look crap as u just stated?

seriously, think about what you're saying. it's bullshit. i've unsubscribed from this thread because you're chatting out of your arse. period.