FREAKED'S RETROSPACE - Keeping old technology alive


No analog sound when the TV is simultaneously connected to the PC via a DVI-HDMI cable

-----------------------------------------------------------------------------


This guide, especially Part III, helped me when I switched from an nVidia 7900GS to a GT330 and then had no sound anymore on a TV (Toshiba) connected via DVI-HDMI and, at the same time, via analog audio. The TV supports VGA/HDMI as well as AV. I use a DVI-to-HDMI cable, which carries no sound. I had also disabled audio output over DVI in the nVidia settings, and no additional audio device from the GT330 showed up either. Maybe the card simply can't do it by itself?

Nevertheless, the Toshiba TV received (or assumed) the information that audio can/should be delivered over HDMI, and it muted the analog sound coming in over AV as soon as it switched to the HDMI input. The guide makes sure this message is bypassed: Windows thinks no sound can be sent over DVI, and the TV outputs the analog sound again even when it is switched to HDMI.

Part I - Introduction

-----------------------------------------------------------------------------


(DISCLAIMER: YOU CAN SAFELY SKIP THIS FIRST PARAGRAPH IF YOU'RE ONLY INTERESTED IN TECHNICAL DETAILS)

I spent the whole weekend racking my brain over this... and finally found a solution. I live in a 'cardboard' house with very thin walls, so to avoid the wrath of my neighbors I'm restricted to a good earphone / amplifier / audio card setup. I'm also fond of gaming, so some time ago I picked up a Creative X-Fi card, something like Fatality Pro. Even though it's a cool sound adapter, it lacks one thing: an onboard SPDIF-out header. The card does have an optical out, but that's on the exterior and only accepts a TOSLINK cable. So I'm not able to feed the sound signal directly to my graphics card so that it could add it to the HDMI carrier. Well, I had no need to do that until I replaced a burnt 8800 Ultra with a GTX 295.

A few days ago I wanted to watch a movie from my PC - not alone, but with friends. In such cases I simply used the analog sound input on my plasma TV panel. The two built-in speakers were just enough for me and my friends - you can't have a home theatre in a 'cardboard' house unless you want the police knocking at your door. Long story short, it turned out the GTX 295 thought itself too smart to let the analog sound override the silence it cast over HDMI. Remember that I couldn't just connect my sound card to the GeForce due to the unfortunate lack of SPDIF-out pins on the former - but my GTX 295 (equipped with an onboard SPDIF-in) would rather broadcast dead silence as its HDMI audio component than let a third-party analog sound stream override it. I tried DVI->HDMI and HDMI->HDMI connections and got the analog sound muted either way, just because the TV panel kept its speakers busy playing the silent signal it kept receiving from the wicked GeForce chip.

Part II - Possible user mistakes and hardware workarounds

-----------------------------------------------------------------------------


(DISCLAIMER: YOU CAN SAFELY SKIP MOST OF THE TEXT BELOW AND JUMP TO PART III OF THIS GUIDE IF YOU DO NOT WANT TO TRY A HARDWARE WORKAROUND FOR THE ISSUE AND/OR CHECK WHETHER YOUR ANALOG SOUND WAS LOST DUE TO YOUR OWN ACTIONS RATHER THAN THE MISCHIEVOUS NATURE OF YOUR GRAPHICS CARD)

Now, let's get down to technical details. First of all, I realized that a DVI connection can carry a sound component encoded as part of the video signal. One of my buddies showed me how a DVI->HDMI connection carries sound to the TV panel via an HDMI audio passthrough link (his sound card's SPDIF-out pins are connected to his graphics card's SPDIF-in pins - so his life is much easier!). I don't know how perverted a graphics chip has to be to encode sound as part of the video signal - DVI was never meant to carry sound - but one thing is clear: even with a DVI->HDMI connection, you won't be able to cut out the silent audio signal. So it makes no difference whether you use DVI->HDMI or HDMI->HDMI if you face the No Analog Input problem.

Secondly, make sure you are using the 'special' HDMI port on your TV panel. Most TV panels pair only one of their several onboard HDMI ports with an analog sound input. Such 'special' HDMI ports usually have a couple of associated RCA sockets nearby. Before proceeding further with this guide, please make sure you are using the 'special' HDMI port and its 'special' analog sound input. For example, my Samsung plasma panel has its 'special' RCA jacks marked as belonging to HDMI2 only; ridiculously, the printed manual doesn't say a thing about that. So you should be looking for an audio input labelled 'HDMI2 only' (or whichever HDMI number it happens to be) or something like that. You won't get any analog sound if you choose the wrong HDMI port or the wrong RCA or minijack sockets, and that's perfectly normal.

Thirdly, strip your TV panel of all the other cables: S-Video, HDCP, everything. Still no sound? Congratulations - now you can be sure it's your graphics card that needs fixing.

Fourthly, look for a DVI connector on your TV panel. If you've got one, the best choice is to use a DVI->DVI or HDMI->DVI cable to connect your graphics card to the TV. You'll get your analog sound back for sure - just don't forget to use the analog sound input specifically associated with that DVI connector (consult the manual and look for printed markings on the panel; normally there's some sort of audio-in planted just half an inch away from the DVI connector). An HDMI->DVI switch can also be an option. As long as you use a DVI connector on your TV panel, it will magically tell your video card that there's no HDMI audio signal to be sent.

As a last resort, you might want to use the VGA connector. In this case, it's safer to go with a DVI->DVI or HDMI->DVI cable with a DVI->VGA adapter plug added to the TV-side end of the cable. Let me explain. A plain VGA cable carries a plain analog video signal that's prone to noise and distortion. DVI and HDMI cables carry a digital signal that doesn't care about distortion unless it's exposed to an alien EMP emitter strong enough to blur the difference between the 0s and 1s sent through. The only way to get a distorted picture with a DVI or HDMI cable is to buy one so cheap and crappy that it simply fails to meet the DVI or HDMI specifications, but that's a very rare case (anything costing above $20 will surely do). VGA cables are quite different - the longer the cable, the more vulnerable it is to distortion. Quality VGA cables can cost a lot ($200-$400 depending on the length), and here the price and quality of the cable can be crucial to the picture quality. So it's always better to go with DVI or HDMI cables: you get nearly incorruptible picture quality for little money, regardless of the cable length.

To use the VGA connector with little to no picture distortion, just take a DVI->DVI or HDMI->DVI cable and stick a DVI->VGA adapter onto the TV-side DVI end. Now you'll have a digital signal running along the whole cable, and the digital-to-analog conversion is done within the adapter and fed straight into the TV. As there's no actual VGA cable in such a setup (just a tiny adapter plugged directly into the TV set!), there's no room for EMP interference. There might be a very minor loss in picture quality due to the digital->analog conversion, but it's really hard to notice, so don't bother. As for solid DVI->VGA or even the rarer HDMI->VGA cables, my guess is that the digital-to-analog conversion is done at the near end of the cable and the rest of the cable is analog, so basically those should be regarded as analog cables with a digital adapter built onto one end - and you are likely to get some picture corruption from interference unless you buy a very expensive one. So stick with digital cables and analog adapter plugs.

In my case, a DVI->DVI cable with a DVI->VGA adapter on the TV end worked fine. Once I plugged it into the VGA port of my TV panel, the analog sound was back (remember that the VGA connector uses its own associated analog sound input, just like the 'special' HDMI port or the DVI connector, so look up where to plug in your analog sound so it gets paired with the VGA video input). I could have stopped at this point - the picture was hardly worse than the one I had with the DVI->HDMI connection - but I was still hoping to get back to the HDMI+analog setup. Honestly, I didn't like the look of the VGA adapter, which made the bulky DVI connector twice as bulky - a five-inch-long piece of crap that would never fit behind a wall-mounted TV panel. Mine stands on a pedestal, so I got away with it, but commonly that's not the case.

Part III - Spoofing the EDID signature

-----------------------------------------------------------------------------


If none of the measures described above helped you to get your screen picture and analog audio the way you want them to look and sound, it's time to resort to EDID spoofing. Also, if you are an experienced Windows XP / Vista / 7 user, it should be easier for you to fiddle with the registry than with the cables and adapters.

EDID spoofing means you are about to deceive your graphics card into thinking that your TV panel only has a DVI connection and no HDMI capability. You may get some of your display settings messed up. Or not - it really depends on the system, but more often than not you'll get away with it. And there's always a way to undo the changes.

Here are the instructions (don't ask me why it has to be done this way and not another - it just works, and that's OK).

The instructions below are given for Windows XP 64-bit, but there's little difference between its registry structure and what you'll see in Vista / Win7.

1. Shut down the system, disconnect all the displays except the TV panel (use a digital connection: DVI->HDMI, DVI->DVI or HDMI->DVI)

2. Boot up and get Monitor Asset Manager (moninfo.exe).

3. Run it and look into the upper left window. One of the entries should be marked as 'real-time', not 'registry'. That's the display that you are currently using. Select it.

4. In the 'Raw Data' window, copy the first four bytes (pairs of hex digits) from the second row, as seen in the picture:
[Screenshot: the Raw Data window in Monitor Asset Manager]

5. Now it's time to make up the magic spell. Paste those byte pairs, e.g. 58 B3 00 37, into a text document. Then append the following bytes right after them: 00 00 FF FF 04 00 00 00 7E 01 00.
In my case, the spell looks like 58 B3 00 37 00 00 FF FF 04 00 00 00 7E 01 00. Obviously, the first four bytes will vary depending on the model of your TV panel.

6. Press Start, choose Run, type regedit and press Enter.

7. Let's see where Nvidia graphics adapter settings are stored. Go to HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video\

8. Inside \Video\ you will see one or more folders named like {4B74B220-C714-4200-8809-697049CFB9FF}. Each folder corresponds to a different GPU found in your system. Cards with two GPUs will give you two folders per card, and the same goes for SLI setups. So a triple-SLI setup of GTX 295 cards should result in six different folders there.

9. If there are two or more folders there, it's hard to tell which one corresponds to the GPU that actually drives your TV panel. So let's make each and every GPU forget to use an HDMI connection with this particular display model and emulate a DVI link instead. No other displays will be affected as long as they are a different model - and if they are the same model, you'd want to fix their missing analog audio too. (Honestly, I can hardly imagine a guy with two or three widescreen TVs hooked up to the same PC in the same room.)

10. Inside each folder named like {########-####-####-####-############} you will see a folder titled 0000. Open it and behold a crapload of driver-specific parameters. Scroll the list to the very bottom, right-click the empty space below it (or use the Edit menu) and choose New / Binary Value. You'll get an empty binary value at the bottom of the long list.

11. Rename the new binary value to OverrideEdidFlags0 (no typos please, or it won't work!). Right-click it and choose Modify. In the window that opens, type in your spell, e.g. 58 B3 00 37 00 00 FF FF 04 00 00 00 7E 01 00. This is how it looks when done properly:
[Screenshot: the finished OverrideEdidFlags0 value in Regedit]

12. Apply the same procedure to each folder named like {########-####-####-####-############} inside \Video\ that is related to your graphics card. (If you'd rather script steps 6-12 than click through Regedit, see the sketch after these steps.)

13. Reboot and enjoy your analog audio just as much as you enjoy my broken English.

14. If the analog sound isn't back, send for the witch doctor. But most likely you did something wrong - misspelled the spell or tinkered in the wrong registry folder.

15. To undo the changes, simply delete all the newly created OverrideEdidFlags0 values and reboot.

16. If you lose your analog audio again after reinstalling or updating video drivers, repeat the whole ritual.
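
For those who prefer scripting to clicking through Regedit, here is a minimal sketch of steps 6-12 in Python, assuming Python 3 is installed on the affected machine and the script is run from an elevated (Administrator) prompt; winreg is part of the standard library. The first four bytes are only the example values from step 5 and must be replaced with your own, and since this guide never explains what the fixed tail bytes actually do, the interpretation in the comments is nothing more than an educated guess.

    # Minimal sketch of steps 6-12 (assumes Python 3, run as Administrator).
    import winreg

    # First four bytes of YOUR spell, copied from Monitor Asset Manager in
    # steps 4-5. The value below is just the example from the guide.
    EDID_ID = bytes.fromhex("58B30037")

    # Fixed tail from step 5. Educated guess, not stated in the guide: these
    # bytes seem to tell the driver to patch the byte at EDID offset 0x7E
    # (the extension block count) to 0x00, hiding the extension block that
    # advertises HDMI audio, so the link is treated as plain DVI.
    TAIL = bytes.fromhex("0000FFFF040000007E0100")

    SPELL = EDID_ID + TAIL
    VIDEO_PATH = r"SYSTEM\CurrentControlSet\Control\Video"

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, VIDEO_PATH) as video:
        index = 0
        while True:
            try:
                guid = winreg.EnumKey(video, index)   # e.g. {4B74B220-...}
            except OSError:
                break                                 # no more GPU folders
            index += 1
            try:
                with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                                    VIDEO_PATH + "\\" + guid + "\\0000",
                                    0, winreg.KEY_SET_VALUE) as gpu:
                    winreg.SetValueEx(gpu, "OverrideEdidFlags0", 0,
                                      winreg.REG_BINARY, SPELL)
                    print("spell cast on", guid)
            except OSError:
                print("skipped", guid)  # no 0000 subfolder or no write access

To undo the changes (step 15), delete the value again from each folder - either in Regedit or with winreg.DeleteValue(gpu, "OverrideEdidFlags0") - and reboot.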

