• USB Port on Computer to HDMI port on Monitor?

    From Alan Holbrook@21:1/5 to All on Tue Apr 16 12:41:19 2024
    I have an older Win10 box with USB 2.0 ports and no HDMI out. I have video
    on it that I'd like to send to a monitor with an HDMI port using VLC or
    some similar software. I see that there are USB to HDMI dongles available
    that I could plug into the computer and then attach to the monitor with a standard HDMI cable. Would such a setup work for what I want to do? Is
    USB 2.0 liable to give me performance issues while playing the video?

    Or, is there any other solution anyone can suggest?

  • From Big Al@21:1/5 to Alan Holbrook on Tue Apr 16 09:57:44 2024
    On 4/16/24 08:41 AM, Alan Holbrook wrote:
    I have an older Win10 box with USB 2.0 ports and no HDMI out. I have video on it that I'd like to send to a monitor with an HDMI port using VLC or
    some similar software. I see that there are USB to HDMI dongles available that I could plug into the computer and then attach to the monitor with a standard HDMI cable. Would such a setup work for what I want to do? Is
    USB 2.0 liable to give me performance issues while playing the video?

    Or, is there any other solution anyone can suggest?
    If this is a rare occasion, you might look into a Chromecast adapter for your standard TV. You
    could then cast the video to your TV via the Chromecast. But that means buying the
    Chromecast adapter for $29 US.

    You could do without the 4K model and just get the plain ole simple Chromecast. https://www.amazon.com/Chromecast-Google-TV-Streaming-Entertainment/dp/B0B9HS6DLZ

    --
    Linux Mint 21.3 Cinnamon 6.0.4 Kernel 5.15.0-102-generic
    Al

  • From Joerg Walther@21:1/5 to Alan Holbrook on Tue Apr 16 15:28:06 2024
    Alan Holbrook wrote:

    I have an older Win10 box with USB 2.0 ports and no HDMI out. I have video on it that I'd like
    to send to a monitor with an HDMI port using VLC or some similar software. I see that there are
    USB to HDMI dongles available that I could plug into the computer and then attach to the monitor
    with a standard HDMI cable. Would such a setup work for what I want to do? Is
    USB 2.0 liable to give me performance issues while playing the video?

    Your box must be very old; USB 3.0 came out 16 years ago, and it probably
    took a year or two until nearly every new PC had it.
    All the USB-to-HDMI adapters I can find are USB 3.0, so USB 2.0 will likely
    only manage a very low frame rate, if it works at all. But you seem to be
    mixing up two methods, mentioning VNC (not VLC, that's a video player).
    With VNC you can transfer a picture over the network to another computer;
    that should work quite nicely even on older PCs, but it does not fit the
    setup you described, since it would require a second PC with an HDMI
    port to play back the picture from the first one.
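
    To put rough numbers on that, here is a back-of-the-envelope sketch in Python. The
    effective-throughput figures are assumptions (roughly 35 MB/s usable out of USB 2.0's
    480 Mbit/s, around 400 MB/s for USB 3.0), not measurements.

        # Why raw (uncompressed) video does not fit through USB 2.0.
        USB2_EFFECTIVE_MBPS = 35      # MB/s, typical real-world bulk transfer (assumed)
        USB3_EFFECTIVE_MBPS = 400     # MB/s, rough figure for a 5 Gbit/s link (assumed)

        def raw_video_rate(width, height, fps, bytes_per_pixel=3):
            """Uncompressed RGB24 bandwidth in MB/s."""
            return width * height * bytes_per_pixel * fps / 1e6

        for w, h, fps in [(1280, 720, 30), (1920, 1080, 30), (1920, 1080, 60)]:
            rate = raw_video_rate(w, h, fps)
            print(f"{w}x{h}@{fps}: {rate:6.0f} MB/s "
                  f"(USB2 ~{USB2_EFFECTIVE_MBPS} MB/s, USB3 ~{USB3_EFFECTIVE_MBPS} MB/s)")

    Even 720p at 30 fps is roughly 80 MB/s uncompressed, which is why the USB 2.0 display
    adapters have to compress heavily or drop frames.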

    -jw-
    --
    And now for something completely different...

  • From Frank Slootweg@21:1/5 to Big Al on Tue Apr 16 14:14:35 2024
    Big Al <alan@invalid.com> wrote:
    On 4/16/24 08:41 AM, Alan Holbrook wrote:
    I have an older Win10 box with USB 2.0 ports and no HDMI out. I have video on it that I'd like to send to a monitor with an HDMI port using VLC or some similar software. I see that there are USB to HDMI dongles available that I could plug into the computer and then attach to the monitor with a standard HDMI cable. Would such a setup work for what I want to do? Is USB 2.0 liable to give me performance issues while playing the video?

    Or, is there any other solution anyone can suggest?

    If this is a rare occasion, you might look into a Chromecast adapter
    for your standard TV. You could then cast the video to your TV
    via the Chromecast. But that means buying the Chromecast
    adapter for $29 US.

    You could do without the 4K model and just get the plain ole simple Chromecast.
    https://www.amazon.com/Chromecast-Google-TV-Streaming-Entertainment/dp/B0B9HS6DLZ

    Be careful with this, because as far as I know, you need a smartphone
    (with the Google 'Home' app) to set up the Chromecast. In the old days
    there was a Chromecast setup program for Windows (I used it), but IIRC
    it no longer exists and/or no longer works.

  • From Big Al@21:1/5 to Frank Slootweg on Tue Apr 16 11:33:50 2024
    On 4/16/24 10:14 AM, Frank Slootweg wrote:
    Big Al <alan@invalid.com> wrote:
    On 4/16/24 08:41 AM, Alan Holbrook wrote:
    I have an older Win10 box with USB 2.0 ports and no HDMI out. I have video on it that I'd
    like to send to a monitor with an HDMI port using VLC or some similar software. I see that
    there are USB to HDMI dongles available that I could plug into the computer and then attach
    to the monitor with a standard HDMI cable. Would such a setup work for what I want to do?
    Is USB 2.0 liable to give me performance issues while playing the video?

    Or, is there any other solution anyone can suggest?

    If this is a rare occasion, you might look into a Chromecast adapter
    for your standard TV. You could then cast the video to your TV
    via the Chromecast. But that means buying the Chromecast
    adapter for $29 US.

    You could do without the 4K model and just get the plain ole simple Chromecast.
    https://www.amazon.com/Chromecast-Google-TV-Streaming-Entertainment/dp/B0B9HS6DLZ

    Be careful with this, because as far as I know, you need a smartphone (with the Google 'Home' app) to set up the Chromecast. In the old days
    there was a Chromecast setup program for Windows (I used it), but IIRC
    it no longer exists and/or no longer works.
    Actually some TVs will not let you cast directly to them. I have one.
    So I can cast to Roku, Chromecast, or LG TV. Too confusing huh!?
    --
    Linux Mint 21.3 Cinnamon 6.0.4 Kernel 5.15.0-102-generic
    Al

  • From Newyana2@21:1/5 to Alan Holbrook on Tue Apr 16 13:29:51 2024
    On 4/16/2024 8:41 AM, Alan Holbrook wrote:
    I have an older Win10 box with USB 2.0 ports and no HDMI out. I have video on it that I'd like to send to a monitor with an HDMI port using VLC or
    some similar software. I see that there are USB to HDMI dongles available that I could plug into the computer and then attach to the monitor with a standard HDMI cable. Would such a setup work for what I want to do? Is
    USB 2.0 liable to give me performance issues while playing the video?

    Or, is there any other solution anyone can suggest?


    When I wanted to stream Netflix to a new TV from Win7, the
    computer had no HDMI port. I got a cheap graphics card. I think
    it was about $35, so that I could have the ports I needed for the
    TV and the monitor both.

  • From Paul@21:1/5 to Alan Holbrook on Tue Apr 16 14:08:49 2024
    On 4/16/2024 8:41 AM, Alan Holbrook wrote:
    I have an older Win10 box with USB 2.0 ports and no HDMI out. I have video on it that I'd like to send to a monitor with an HDMI port using VLC or
    some similar software. I see that there are USB to HDMI dongles available that I could plug into the computer and then attach to the monitor with a standard HDMI cable. Would such a setup work for what I want to do? Is
    USB 2.0 liable to give me performance issues while playing the video?

    Or, is there any other solution anyone can suggest?


    You really need to give more details about your desktop platform at the source end and your TV at the destination end.

    VGA ---- (adapter) ----- HDMI # Active adapter, uses USB for VBUS power source. 1920x1080.

    https://www.startech.com/en-ca/audio-video-products/vga2hdu

    Now, that's a good solution, as it does not load the CPU any more than regular video playback would.
    This would be my recommendation. My big assumption here is that your computer is old enough to,
    say, only have a VGA output lying around. Laptops frequently have a VGA port, as an example of where
    that adapter would be gold.

    *******

    USB2 to DisplayLink HDMI    # Only suited to PowerPoint slide shows. NOT a video player.
                                # Used HEAVY compression, which is why it was suited to slide shows.

    USB3 to DisplayLink HDMI    # Can play video. Uses LITE compression, needs some CPU.
                                # A "FAT" USB3 link might be useful in this case.

    To get USB3 on a USB2 computer, you could use a PCI Express x1 slot and a NEC/Renesas USB3 card (200 MB/sec).
    Or, you could use a PCI Express x4 slot and an Asmedia x2-wired USB3 chip (400 MB/sec). Numbers are approximate.

    The purpose of the x2 (two-lane) wiring is to take the lower-speed PCIe Rev 1.1 lanes of an older
    computer (250 MB/sec each) and get enough aggregate speed for full-rate USB3. If your expansion
    slot runs Rev 2.0, then the sum of the lanes is more than enough with one of those x2 cards.
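
    A quick sanity check on those lane numbers in Python (the per-lane figures are nominal
    payload rates, and real cards deliver a bit less after overhead, so treat this as a sketch):

        # Nominal PCIe payload per lane vs. what full-rate USB3 needs.
        PCIE_LANE_MBPS = {"1.1": 250, "2.0": 500}   # MB/s per lane (approximate)
        USB3_TARGET_MBPS = 400                      # rough full-rate USB3 payload (assumed)

        for rev in ("1.1", "2.0"):
            for lanes in (1, 2, 4):
                bw = PCIE_LANE_MBPS[rev] * lanes
                verdict = "enough" if bw >= USB3_TARGET_MBPS else "short"
                print(f"PCIe {rev} x{lanes}: {bw:4d} MB/s -> {verdict} for full-rate USB3")

    A single Rev 1.1 lane comes up short (hence the ~200 MB/sec cards), while x2 on Rev 1.1,
    or a single Rev 2.0 lane, is enough.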

    Your E8400 might be just enough CPU to run 720P or 1080P videos.

    If the TV set has four HDMI, three will be Media HDMI (1280x720 or 1920x1080, no other res available),
    and the "Computer" HDMI connector will support 640x480, 800x600, 1024x768, 1280x720, 1600x1200, 1920x1080.
    As long as the computer is willing to drive 1920x1080 out the VGA port for the first
    solution, it won't matter which HDMI input you use on the TV.

    VGA ports have had "400MHz bandwidth" for a long time. This was
    variously listed as 2048x1536 or 2048x2048, to imply limits which
    are ridiculous for practical use. There used to be "reflection" problems
    on the VGA cable, which made high res a bit distorted. In the case
    of your active VGA to HDMI adapter above, the adapter screws right to the
    VGA port, the interconnect is short, there should be no problem
    running 1920x1080. Should be a clean signal. When you want "reach" and a long cable, just make the HDMI cable longer. At 1920, an HDMI cable should support
    a fairly long run. (At 4K, 140FPS, DeepColor, the HDMI cable would be six
    feet at most. The more whizzy the standard on HDMI, the shorter the cable.)

    Video cards had:

    VGA output (a long time ago, no dongle to adapt. Laptops preserved this output connector for a longer span.)
    DVI-I (a mix of digital and analog, use passive DVI to VGA adapter to get a VGA connector and its VGA signal)

    DVI-D (In this generation, NVidia removes analog entirely, passive adapter won't help,
    cross on connector is missing for the analog signal pins)

    No DVI at all (same as previous, even DVI of any description is now missing).

    An older video card could have VGA and DVI-I, and you could make two VGA from it.

    A video card with VGA, HDMI, DVI-D, that was really just DVI-I plus HDMI and only two ports.
    You cannot use the VGA and the DVI-D at the same time, because they're the same port.

    To convert the DP (not a DP++ port) on my Optiplex 780, I used an active DP to HDMI adapter.
    Most DisplayPort (DP) connectors you find today will have a DP++ logo, and you can
    make HDMI using a passive adapter (which is half the price). But on the Optiplex, I
    needed the active converter to convert from non-DP++ mode to HDMI.

    There are still more ways to do it. A DVI to HDMI adapter might work, but
    in order to guarantee a solution works for you, I'm trying to avoid iffy adaptations.
    I've even had the HDMI output from a modern video card not work
    with the HDMI on my new monitor; then I used the active DP to HDMI
    adapter, and THAT worked. It's a good thing I bought that adapter,
    as I've got more use from it than I was supposed to.

    Paul

  • From Vlad Siffer@21:1/5 to Alan Holbrook on Tue Apr 16 20:00:00 2024
    On 16/04/2024 13:41, Alan Holbrook wrote:

    Or, is there any other solution anyone can suggest?

    My suggestion is to buy a new machine with all the features you need.
    That's the best solution for what you want to do.

  • From Paul@21:1/5 to All on Tue Apr 16 16:47:39 2024
    On 4/16/2024 1:29 PM, Newyana2 wrote:
    On 4/16/2024 8:41 AM, Alan Holbrook wrote:
    I have an older Win10 box with USB 2.0 ports and no HDMI out. I have video on it that I'd like
    to send to a monitor with an HDMI port using VLC or some similar software. I see that there are
    USB to HDMI dongles available that I could plug into the computer and then attach to the monitor
    with a standard HDMI cable. Would such a setup work for what I want to do? Is
    USB 2.0 liable to give me performance issues while playing the video?

    Or, is there any other solution anyone can suggest?


      When I wanted to stream Netflix to a new TV from Win7, the
    computer had no HDMI port. I got a cheap graphics card. I think
    it was about $35, so that I could have the ports I needed for the
    TV and the monitor both.

    You should have stated where you got that card :-)

    This card (1050ti) has a video decoder block. My movie is played by the 1050Ti. The CPU is sleeping in the corner.

    [Picture] Currently using 77W total to play the movie... Yikes. Not proud of this.
    Machine totally idle is 36W. Rail one CPU core is 130W. Use whole CPU, 224W.

    https://i.postimg.cc/X7mvzcpS/video-decoder-acceleration-cheap-card-1050.gif

    There are even older cards, you might find for sale.

    (GT710) https://www.newegg.com/gigabyte-geforce-gt-710-gv-n710d3-2gl-1-0/p/N82E16814125844

    (GT730)
    https://www.newegg.com/p/1FT-000P-005R5

    The 710/730 have MPEG2 and H.264, which is a
    start at playback capability.

    The 1030/1050 have MPEG2, H.264, and H.265.

    https://en.wikipedia.org/wiki/Nvidia_NVDEC
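
    If you want to check whether a given file falls inside that MPEG2/H.264/H.265 subset before
    relying on the card, ffprobe (part of ffmpeg) will tell you the codec. A small Python wrapper,
    assuming ffprobe is on the PATH; "movie.mkv" is a placeholder filename:

        import subprocess

        def video_codec(path):
            """Return the codec of the first video stream, e.g. 'mpeg2video', 'h264', 'hevc'."""
            out = subprocess.run(
                ["ffprobe", "-v", "error", "-select_streams", "v:0",
                 "-show_entries", "stream=codec_name",
                 "-of", "default=noprint_wrappers=1:nokey=1", path],
                capture_output=True, text=True, check=True)
            return out.stdout.strip()

        # GT 710/730 class decoders cover MPEG2 + H.264; GT 1030/1050 add H.265 (hevc).
        print(video_codec("movie.mkv"))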

    And the RTX4090 plays everything, and is a yard wide and a yard long. It does Deep Color.
    Even if your TV does not have Deep Color.

    Video cards also have encoders, and there is a table elsewhere that does a better summary.
    You can re-encode a movie at 10x realtime, if the card has the encoder. Encoder uses 60W.

    https://en.wikipedia.org/wiki/Nvidia_NVENC
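
    To make that concrete, a sketch of handing the re-encode to NVENC through ffmpeg. This
    assumes an ffmpeg build with NVENC support and an NVENC-capable card (e.g. a 1050 Ti);
    the file names are placeholders:

        import subprocess

        # Re-encode a title to H.264 on the GPU's NVENC block; audio is copied untouched.
        subprocess.run(
            ["ffmpeg", "-i", "input.mkv",
             "-c:v", "h264_nvenc",   # GPU encoder
             "-b:v", "5M",           # target bitrate, adjust to taste
             "-c:a", "copy",
             "output.mp4"],
            check=True)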

    *******

    https://www.techpowerup.com/gpu-specs/geforce-gt-1030.c2954

    GP108 GPU Notes
    NVENC: No Support   <=== A 1030 has no accelerated video encoding capability
    NVDEC: 3rd Gen      <=== But still plays your movie
    PureVideo HD: VP8
    VDPAU: Feature Set H

    https://www.techpowerup.com/gpu-specs/geforce-gtx-1050-ti.c2885

    GP107 GPU Notes
    NVENC: 6th Gen <=== A 1050Ti has an encoder...
    NVDEC: 3rd Gen <=== as well as the limited decoder
    PureVideo HD: VP8
    VDPAU: Feature Set H

    https://www.techpowerup.com/gpu-specs/geforce-gt-710.c1990

    GK208B GPU Notes
    NVENC: No Support
    NVDEC: 1st Gen <=== after being rebranded so many times, this is what you would expect
    PureVideo HD: VP5
    VDPAU: Feature Set D

    https://www.techpowerup.com/gpu-specs/geforce-gt-730.c1988

    GK208B GPU Notes
    NVENC: No Support
    NVDEC: 1st Gen <=== spending extra for a 730 isn't helping with movies
    PureVideo HD: VP5
    VDPAU: Feature Set D

    And just for giggles...

    https://www.techpowerup.com/gpu-specs/geforce-rtx-4090.c3889

    AD102 GPU Notes
    Ray Tracing Cores: 3rd Gen <=== games
    Tensor Cores: 4th Gen <=== "AI"
    NVENC: 8th Gen
    NVDEC: 5th Gen <=== "Plays everything, Deep Color included" even if you don't need it
    PureVideo HD: VP12
    VDPAU: Feature Set L

    So that's an analysis for a non-gamer. No mention of CUDA cores or DirectX37. My card is "weak sauce" for gaming. Have not tested.

    Paul

  • From Paul@21:1/5 to All on Tue Apr 16 17:44:52 2024
    On 4/16/2024 5:34 PM, Newyana2 wrote:
    On 4/16/2024 4:47 PM, Paul wrote:


    You should have stated where you got that card :-)

    This card (1050ti) has a video decoder block. My movie is played by the 1050Ti.
    The CPU is sleeping in the corner.

        [Picture]  Currently using 77W total to play the movie... Yikes. Not proud of this.
                   Machine totally idle is 36W. Rail one CPU core is 130W. Use whole CPU, 224W.

         https://i.postimg.cc/X7mvzcpS/video-decoder-acceleration-cheap-card-1050.gif

    There are even older cards, you might find for sale.

    (GT710)
    https://www.newegg.com/gigabyte-geforce-gt-710-gv-n710d3-2gl-1-0/p/N82E16814125844


       GT710 sounds familiar. I just went to Microcenter and got what was
    cheap that had HDMI plus other options. I'm afraid I don't understand your analysis and the distinction of decoding on the card. I don't play games
    or do anything that needs optimized graphics, so whatever is basic
    works fine for me.

    When you have a machine with a weak CPU, it's nice to have the
    video card do all the movie playback for you.

    In years past, you could rail a core doing movie playback in software,
    and then it depended on the quality of the coding, as to whether
    the movie playback was flawless.

    If the video card plays mpeg2 or H.264, chances are better it will
    work and have some quality.

    My laptop for example, has a single core, and is little better than
    a Pentium 4. But it still plays video, because the chipset has
    a decoder. But the laptop isn't good at much else. The CPU sees
    to that (it is like a 64-bit version of the AthlonXP ).

    Paul

  • From Newyana2@21:1/5 to Paul on Tue Apr 16 17:34:33 2024
    On 4/16/2024 4:47 PM, Paul wrote:


    You should have stated where you got that card :-)

    This card (1050ti) has a video decoder block. My movie is played by the 1050Ti.
    The CPU is sleeping in the corner.

    [Picture] Currently using 77W total to play the movie... Yikes. Not proud of this.
    Machine totally idle is 36W. Rail one CPU core is 130W. Use whole CPU, 224W.

    https://i.postimg.cc/X7mvzcpS/video-decoder-acceleration-cheap-card-1050.gif

    There are even older cards, you might find for sale.

    (GT710) https://www.newegg.com/gigabyte-geforce-gt-710-gv-n710d3-2gl-1-0/p/N82E16814125844


    GT710 sounds familiar. I just went to Microcenter and got what was
    cheap that had HDMI plus other options. I'm afraid I don't understand your analysis and the distinction of decoding on the card. I don't play games
    or do anything that needs optimized graphics, so whatever is basic
    works fine for me.

  • From Newyana2@21:1/5 to Paul on Tue Apr 16 22:44:46 2024
    On 4/16/2024 5:44 PM, Paul wrote:

    If the video card plays mpeg2 or H.264, chances are better it will
    work and have some quality.

    My laptop for example, has a single core, and is little better than
    a Pentium 4. But it still plays video, because the chipset has
    a decoder. But the laptop isn't good at much else. The CPU sees
    to that (it is like a 64-bit version of the AthlonXP ).

    Would it be accurate to say the newer the graphics, the more
    it's taking over the work, or does it really depend on the hardware?
    The 710 seems to handle streaming fine, but actually it's been better
    since a while back, when I discovered that I had some old RAM that
    fit that machine, jacking up the total from 4 to 6 GB. Before that, the
    machine was slightly groggy and video would occasionally stutter.
    The computer is a Dell XPS625. My father ordered it custom back
    in 2010, but then at some point his mental functioning deteriorated
    to the point that he could no longer use it. So now it's one of my
    streaming boxes.

  • From Paul@21:1/5 to All on Wed Apr 17 02:44:28 2024
    On 4/16/2024 10:44 PM, Newyana2 wrote:
    On 4/16/2024 5:44 PM, Paul wrote:

    If the video card plays mpeg2 or H.264, chances are better it will
    work and have some quality.

    My laptop for example, has a single core, and is little better than
    a Pentium 4. But it still plays video, because the chipset has
    a decoder. But the laptop isn't good at much else. The CPU sees
    to that (it is like a 64-bit version of the AthlonXP ).

       Would it be accurate to say the newer the graphics, the more
    it's taking over the work, or does it really depend on the hardware?
    The 710 seems to handle streaming fine, but actually it's been better
    since a while back, when I discovered that I had some old RAM that
    fit that machine, jacking up the total from 4 to 6 GB. Before that, the
    machine was slightly groggy and video would occasionally stutter.
    The computer is a Dell XPS625. My father ordered it custom back
    in 2010, but then at some point his mental functioning deteriorated
    to the point that he could no longer use it. So now it's one of my
    streaming boxes.

    First you would want to review the BIOS settings.

    I haven't committed all these things to memory :-)

    There was some deal with AMD at one point, where you
    could decouple dual-channel. This allowed cores (like on
    your quad core) to make independent requests on a single
    channel. Ah, yes, it was called "Unganged". It could depend
    on what the machine is doing, as to which option is better.
    A lot of software is single core and does not take
    advantage of multiple cores, and leaving the RAM in dual
    channel mode may help. On a server (multiple uncorrelated processes),
    maybe Unganged is better.

    At one time, the AMD memory controller was not very sophisticated.
    Intel had Flex memory, whereas AMD did not bother with this. It
    would have made their test bench too large to test. With Intel,
    it didn't matter which holes you put the RAM in; the Intel could
    pair them up virtually, in a sense. The AMD was too clumsy for
    this, and required you to put the sticks in the correct holes. Nobody
    would tell you that you made a mistake.

    Ch0    Ch1
     |      |
    1GB    2GB   <---- Not matched across; preferred to be matched horizontally.
    2GB    1GB         This is 6GB in single-channel mode (a bit slow).
                       You won't like my drawing -- consult the user manual for a diagram :-)

    Ch0    Ch1
     |      |
    2GB    2GB   <---- Now we have a better-performing setup. It can't interleave
    1GB    1GB         between rows quite as well as it might, but it is still a lot better.

    Some things will be more snappy (partly because the AMD cache at one time
    was a little bit slow, and there was more of a dependency on RAM
    performance because of the hit rate; the Intel design seemed to
    "hide" issues with RAM better). Now it's dual-channel mode. It can
    rock back and forth between rows, but the pattern will be a bit
    asymmetric.

    Ch0    Ch1
     |      |
    2GB    2GB   <---- Depending on the stride through memory, when you access a block of
    2GB    2GB         locations, it can alternate between the two rows.

    This gives several percent more performance. You can keep pages open on two ranks
    on each DIMM, times the number of DIMMs, times the number of banks inside
    the DIMM. Intel had a name for this mode that I forget
    (symmetric something-or-other). It's still dual channel, only with less
    waiting for precharge or something.
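
    For a rough sense of what dual channel buys you, here is the peak-bandwidth arithmetic
    (DDR2-800 is my assumption for a machine of that vintage; sustained figures are lower):

        # Peak theoretical DRAM bandwidth: transfers/sec * 8 bytes per 64-bit channel.
        DDR2_800_TRANSFERS = 800e6     # transfers per second (assumed DDR2-800)
        BYTES_PER_TRANSFER = 8         # one 64-bit channel

        single = DDR2_800_TRANSFERS * BYTES_PER_TRANSFER / 1e9
        dual = 2 * single
        print(f"single channel: {single:.1f} GB/s peak, dual channel: {dual:.1f} GB/s peak")
        # -> single channel: 6.4 GB/s peak, dual channel: 12.8 GB/s peak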

    But something might also be going on with VRAM on the video card, versus an allocation of system memory to help the VRAM. Otherwise, I don't know
    what would be clogging it up. If my video card has VRAM, perhaps I would configure things so only that VRAM was on offer, and I would not want it playing with my system RAM. Some of the low end AMD cards, could use a little of both. And some cards didn't have local VRAM at all, and using system RAM wasn't helping matters (when it's DDR2 and a bit slow).

    With things like Bulldozer and Piledriver, you had Intel-style hyperthreading versus the sort of hybrid approach AMD used (a Y-shaped major axis: the bottom of the
    Y is a shared resource, the top is duplicated stuff, and it might have beaten
    Intel's approach given how Intel's worked). You needed a "CPU driver" to get the
    best from the hardware (it might have been scheduler related). Sometimes they didn't tell people this, and the customers did not know that a little extra performance would come their way if they tracked down the driver on the AMD.com site.

    Maybe if I was an Asus customer, I would not expect to find that driver
    on their site, and I would have to spelunk the AMD site instead. Which
    is harder than you would think. This is the kind of stuff that
    should have been bundled into a Windows Update. And perhaps today, with the "drivers on Catalog server" thing, it would get done that way. The CPU driver may have been a bit late arriving, which makes it harder for a Dell to put
    it on the hard drive on the day of release. It could have been a year late. There was some fighting about money, behind the scenes, and this resulted
    in issues of this type. And eventually we had Microsoft indicating "no support for new processors on OS nn", as evidence there was no free lunch available.

    Your box is tune-able, but you really needed to be reading the forums
    of the time, and keeping your notes file up to date, on the latest
    discovered "issue".

    Yours is a Phenom II, so you don't have the "half my TLB is dead" issue.

    https://www.anandtech.com/show/2489

    *******

    If you go back a long, long way in time, the only thing the video card
    had was IDCT (inverse discrete cosine transform). Most video formats
    (Cinepak excepted) use "macroblocks", square groups of pixels, and
    the video is encoded in the frequency domain. Decoding involves doing
    some sort of "trig" to convert the content from the frequency domain
    back into pixels. The video card offered IDCT as hardware acceleration.
    That might mean the CPU transferred a block of coefficients to the video card, pushed a button,
    the pixel values came back, and it copied them out.

    This is ridiculously CPU intensive. My software "buddy" at work: when I designed
    something, I'd add some accelerator to the hardware, and she'd bench
    the damn thing and tell me it was shit (but in a polite way). IDCT probably fell into that area: appreciated that someone tried to do it,
    but not enough automation.
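
    For reference, this is the kind of math that got offloaded: an 8x8 macroblock of
    frequency-domain coefficients turned back into pixels with a 2-D inverse DCT. A minimal
    sketch using numpy and scipy (the coefficient values are made up for illustration):

        import numpy as np
        from scipy.fft import idctn

        # One 8x8 block of made-up DCT coefficients: a DC term plus a little
        # low-frequency detail, roughly the shape of a typical compressed macroblock.
        coeffs = np.zeros((8, 8))
        coeffs[0, 0] = 1024.0    # DC: average brightness of the block
        coeffs[0, 1] = -60.0     # low-frequency horizontal ripple
        coeffs[1, 0] = 30.0      # low-frequency vertical ripple

        # The inverse DCT (type II, orthonormal) recovers the 8x8 block of pixel values.
        pixels = idctn(coeffs, norm="ortho")
        print(np.round(pixels, 1))

    A software decoder does that for every macroblock of every frame, which is why handing
    it (and everything downstream of it) to the GPU matters.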

    Fast forward to today, the video card does everything. It's probably given
    a pointer in memory, it uses DMA, loads the video card, some vertex or texture processors do all the math, video loads at a certain offset in the frame buffer,
    you're watching a movie. Maybe the CPU finds out, and loads another 64KB into RAM and the CPU goes back to sleep.

    If the video is 720x480, the video card can have a scaler to change
    the dimensions. It used to take 40% of a Pentium 4 to "do this by hand".
    When a hardware scaler was added to the video card, suddenly this was free.

    It's the progress of the automation in there, that is the shocker. We've come all the way from "pathetic acceleration", to "series of logic blocks, still
    not tied together well", to "runs the show almost completely itself". The
    video card likely can't read the disk drive, and so that's where it has
    to stop. That is the boundary.

    It would take a while, to find out which generation of hardware reached the full automation level. Anandtech stopped evaluating video playback after
    a while, because the "engineering tweaking" had stopped. Maybe full automation has been around for a decade or so. A very wild guess. It's sort of the
    way that nobody pays any attention to Direct2D performance. Nobody uses
    a stopwatch on BitBLT any more :-)

    And having MPEG2 and H.264, that subset is enough to keep a person
    happy, even if you have to transcode from other formats when building
    your library of titles.

    Paul

  • From Paul@21:1/5 to Chris on Wed Apr 17 04:00:24 2024
    On 4/17/2024 2:45 AM, Chris wrote:
    Alan Holbrook <no.thanks@lets.not> wrote:
    I have an older Win10 box with USB 2.0 ports and no HDMI out. I have video on it that I'd like
    to send to a monitor with an HDMI port using VLC or some similar software. I see that there are
    USB to HDMI dongles available that I could plug into the computer and then attach to the monitor
    with a standard HDMI cable. Would such a setup work for what I want to do? Is
    USB 2.0 liable to give me performance issues while playing the video?

    Or, is there any other solution anyone can suggest?

    I would look at other options before USB 2.0. What video outputs do you have? DVI, DisplayPort, and even VGA would work with the appropriate adapter.


    We still don't know much about the equipment. I suspect there isn't DP++
    on it. There's no HDMI, so we have to adjust our range of possibilities
    for what the vid card might have. And since it's Windows 10, to a certain extent
    the card can't be too old, or there would be no acceleration at all.
    A lot of stuff would require fallback code to cover the missing bits.
    A disaster (CPU loading; you see this in VMs).

    My 7900GT (it has a Win10 driver from 2015 or so) has:

    DVI-I                      <== can make HDMI, DVI-D, or VGA

    DVI-I

    Mini-DIN YPbPr (analog)

    Some TV sets have YPbPr input (fifteen year old set).

    And as for DVI

    https://www.pcmag.com/encyclopedia/term/hdmi-dvi-compatibility

    "The HDMI circuit detects the DVI signals and switches to the DVI protocol."

    Well maybe. If my brand new HDMI monitor would not take HDMI from my
    video card, then what are the odds the HDMI would have perfectly
    functional DVI backward compatibility ? :-/

    This shows what a passive DVI to HDMI adapter looks like. But then the TV
    has to like the flavour of the signals, and it's hard to guess what
    that's like these days.

    https://www.newegg.com/p/1DG-0786-00013

    If a card has a mini-DIN, you check in the video card box for one of these.
    The three outputs are YPbPr.

    https://global.discourse-cdn.com/standard11/uploads/lzxindustries/original/2X/d/dd381757aab70303e1a619e58518873b96546f3f.jpeg

    The signals are luminance- and chrominance-related.

    https://en.wikipedia.org/wiki/YPbPr

    If the TV has it, it would look similar to the colours on this example.

    https://en.wikipedia.org/wiki/YPbPr#/media/File:Component_video_jack.jpg

    Paul

  • From Newyana2@21:1/5 to Paul on Wed Apr 17 08:01:13 2024
    On 4/17/2024 2:44 AM, Paul wrote:

       Would it be accurate to say the newer the graphics, the more
    it's taking over the work, or does it really depend on the hardware?
    The 710 seems to handle streaming fine, but actually it's been better
    since a while back, when I discovered that I had some old RAM that
    fit that machine, jacking up the total from 4 to 6 GB. Before that, the
    machine was slightly groggy and video would occasionally stutter.
    The computer is a Dell XPS625. My father ordered it custom back
    in 2010, but then at some point his mental functioning deteriorated
    to the point that he could no longer use it. So now it's one of my
    streaming boxes.

    First you would want to review the BIOS settings.

    I haven't committed all these things to memory :-)

    There was some deal with AMD at one point, where you
    could decouple dual-channel. This allowed cores (like on
    your quad core) to make independent requests on a single
    channel. Ah, yes, it was called "Unganged". It could depend
    on what the machine is doing, as to which option is better.
    A lot of software is single core and does not take
    advantage of multiple cores, and leaving the RAM in dual
    channel mode may help. On a server (multiple uncorrelated processes),
    maybe Unganged is better.

    At one time, the AMD memory controller was not very sophisticated.
    Intel had Flex memory, whereas AMD did not bother with this. It
    would have made their test bench too large to test. With Intel,
    it didn't matter which holes you put the RAM in; the Intel could
    pair them up virtually, in a sense. The AMD was too clumsy for
    this, and required you to put the sticks in the correct holes. Nobody
    would tell you that you made a mistake.

    I suspect that was the trouble with the RAM on the XPS.
    It seemed awfully slow and I decided to open it up to make
    sure it all looked kosher. It turned out that the first RAM stick
    was missing, but I had a stick of the same stuff in an old
    computer. I'll have to recheck to see whether the order is
    optimized. I didn't realize that would matter.

    I wouldn't be surprised if my father took that stick out.
    He was a fearless tinkerer and may have felt that the other sticks
    might get too lazy with so many working together. "Why do
    you need 4 sticks to do your work?" He once called to ask
    if I could fix a problem and I arrived to find it autorebooting
    at a fast rate. It turned out that he had somehow either reset
    the BIOS or just changed a few things willy nilly. I'm sure it
    started with, "Hmm... What's this thing? Hmm... That thing
    sounds good. Why is it disabled? I'll just fix that."

    I never tried streaming with the XPS until I got the new
    card, so I don't know how it would have worked. But of course,
    the cable company and the streaming company could also
    play a role. In any case, it seems to work OK now. Interestingly,
    my little RPi4B, which can barely handle one task at a time,
    streams quite well. And even better since I upgraded from Buster
    to Bookworm. What a pleasure to find an OS that's actually much
    more efficient than the last one. I don't recall that happening since
    going from 98 to XP.

  • From Alan Holbrook@21:1/5 to Big Al on Sat Apr 20 13:46:31 2024
    Big Al <alan@invalid.com> wrote in news:uvm04o$vlq7$1@dont-email.me:

    If this is a rare occasion, you might look into a Chromecast adapter
    for your standard TV. You could then cast the video to your TV
    via the Chromecast. But that means buying the Chromecast
    adapter for $29 US.

    You could do without the 4K model and just get the plain ole simple Chromecast. https://www.amazon.com/Chromecast-Google-TV-Streaming-Entertainment/dp/B0B9HS6DLZ

    Not casting to a TV. It's a monitor.

  • From Alan Holbrook@21:1/5 to All on Sat Apr 20 13:45:09 2024
    Joerg Walther <joerg.walther@magenta.de> wrote in news:pqus1jdl4lnpadi08b38vc1jbj4vfe7afe@joergwalther.my-fqdn.de:
    But you seem to be
    mixing up two methods, mentioning VNC (not VLC, that's a video
    player).

    Nope, not mixing them up at all. The monitor is not on a network; it's
    just a stand-alone monitor. I was proposing using VLC to play the video
    out through the USB-to-HDMI dongle.

  • From Frank Slootweg@21:1/5 to Alan Holbrook on Sat Apr 20 14:51:09 2024
    Alan Holbrook <no.thanks@lets.not> wrote:
    Big Al <alan@invalid.com> wrote in news:uvm04o$vlq7$1@dont-email.me:

    If this is a rare occasion, you might look into a Chromecast adapter
    for your standard TV. You could then cast the video to your TV
    via the Chromecast. But that means buying the Chromecast
    adapter for $29 US.

    You could do without the 4K model and just get the plain ole simple Chromecast. https://www.amazon.com/Chromecast-Google-TV-Streaming-Entertainment/dp/B0B9HS6DLZ

    Not casting to a TV. It's a monitor.

    Doesn't matter. A Google Chromecast device is connected to an HDMI
    input. It's normally used on a TV, because with a monitor you normally
    connect the computer directly with a cable, but there's nothing to prevent
    you from connecting a Chromecast to a monitor.
