• Re: [gentoo-user] New monitor, new problem. Everything LARGE O_O

    From Michael@21:1/5 to All on Tue Jul 2 20:15:17 2024
    On Tuesday, 2 July 2024 19:58:59 BST Dale wrote:
    Howdy,

    New monitor came in. Found it on my porch. Unboxed it and hooked it
    up. Started with no xorg.conf file. It's high res, plasma comes up so
    all that is fine. Now comes the problem. Everything is HUGE. Even on
    SDDM screen, it's like it is zoomed in or something. Icons are huge,
    fonts are huge. I think what it is doing is this, and it may be the
    video card. I think it is thinking I have four monitors set up as one
    large screen. With this one monitor, I'm seeing the top left monitor
    but not seeing the rest since I only have one monitor connected. I
    might add, when I click to logout, that screen is huge too.

    I generated an xorg.conf with the nvidia tool. No change. I tried
    removing options that I thought might cause this largeness problem. I
    even rebooted. No change. I opened the Nvidia GUI software and tried
    adjusting things there. No change. I compared the settings to my main
    rig; other than the difference in the cards, all settings appear to be
    the same, including what should be displayed where. The resolution is
    correct too.

    Is it possible this card functions differently than my usual cards?
    Someone mentioned these tend to be used in business systems, which could
    mean they drive four monitors as one large display, for presentations or
    something. Is there something I need to disable or enable so the card
    knows I want independent monitors?

    I'm attaching the xorg.conf file, including options I commented out. I
    also checked the logs; no errors or anything. It's just the usual
    things like the mouse and keyboard loading and it finding the monitor
    connected, unlike the old LG. Maybe someone here has run into this at
    work or for a client and has an idea on how to fix it, or knows that
    this card is not designed for my use.

    While I'm waiting on a reply, I'm going to try one of my older spare
    video cards. If it works fine, it could be that the new video card is
    set up to work this way and may not even be changeable.

    Thanks.

    Dale

    :-) :-)

    If it is *this* monitor:

    https://www.samsung.com/us/computing/monitors/flat/32--s30b-fhd-75hz-amd-freesync-monitor-ls32b300nwnxgo/#specs

    you should be able to set:

    Section "Monitor"
    Identifier "Monitor0"
    VendorName "Unknown"
    ModelName "Samsung LS32B30"
    HorizSync 30.0 - 84.0
    VertRefresh 50.0 - 75.0
    Option "PreferredMode" "1920x1080_75.00"
    Option "DPMS"
    EndSection

    for better responsiveness with less flicker when watching sports.

    Did you comment out all the lines in the "Screen" section, rather than
    letting nvidia configure it as it needs to?

    Set the screen to the desired resolution to match the monitor, e.g.:

    Option "metamodes" "1920x1080 +0+0"

    and it should scale correctly.

    Also check whether "Samsung Magic Upscale" has been enabled, as this affects what screen size is eventually displayed on the monitor:

    https://www.samsung.com/us/support/answer/ANS00086623/


    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Michael@21:1/5 to All on Wed Jul 3 15:53:56 2024
    On Wednesday, 3 July 2024 10:22:33 BST Dale wrote:

    Another update. I rebooted several times to check whether things
    would be consistent. Most of the time, it came up as it should.
    Sometimes, not so much. When I had just the new Samsung monitor
    connected, it was consistent. When I added the old LG, it would not
    always come up like it should. The biggest thing: the plasma panel
    would be on the wrong monitor.

    If you are adding a second monitor then you need an additional "Monitor" section with a different identifier in your xorg.conf for a multi-headed
    setup. You need to add in the first monitor section:

    Section "Monitor"
    Identifier "Monitor0"
    VendorName "Unknown"
    ModelName "Samsung LS32B30"
    HorizSync 30.0 - 84.0
    VertRefresh 50.0 - 75.0
    Option "PreferredMode" "1920x1080_60.00"
    Option "Primary" "true"
    Option "DPMS" "true"
    EndSection

    and then in the second monitor section:

    Section "Monitor"
    Identifier "Monitor1"
    VendorName "Unknown"
    ModelName "LG blah-blah"
    Option "PreferredMode" "1920x1080_60.00"
    Option "RightOf" "Monitor0"
    Option "DPMS" "true"
    EndSection

    Section "Screen"
    Identifier "Screen0"
    Device "Device0"
    Monitor "Monitor0"
    SubSection "Display"
    Depth 24
    Virtual 3840 1080 # 1920 + 1920 (3840), 1080 + 0 (1080)
    EndSubSection
    EndSection

    You'll get the correct identifiers and "Modelines", "PreferredMode", resolution, refresh rate, etc. values for the above by using 'xrandr -q'.
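    As a minimal sketch of that step, connected output names can be pulled out of 'xrandr -q' output like this (the sample text below is illustrative, not from the actual card; on a live system you would pipe xrandr -q directly):

    ```shell
    # Saved sample of 'xrandr -q' output; output names and geometry are examples.
    sample='HDMI-0 connected primary 1920x1080+0+0 (normal) 698mm x 393mm
    DP-2 connected 1920x1080+1920+0 (normal) 698mm x 393mm
    DP-4 disconnected (normal)'

    # Print only the connected outputs; matching on field 2 avoids also
    # matching the word "disconnected".
    printf '%s\n' "$sample" | awk '$2 == "connected" {print $1}'
    ```

    The printed names are what you would map onto the "Monitor" identifiers in xorg.conf.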


    I tried using xrandr to set this, but it kept changing which monitor was connected where, which would throw off which monitor got which priority.

    Manually instructing xrandr to set up your monitors will not survive between reboots unless you store its settings in your xorg.conf. You need to rerun it each time, manually or via a script. Or you just set your
    xorg.conf correctly once and then you can forget about it. ;-)


    Finally, I removed the old LG. It has caused enough grief already. I
    unhooked the TV cable for my bedroom TV and connected it to the new
    rig. I then booted. I installed a package called arandr. It's a
    sister to xrandr but GUI based. Makes it very easy to see what is
    what. On the first boot, the Samsung showed as connected to port 1.
    The TV showed as port 3 I think. It seems each port can do two displays
    so it kinda skips. The first port is actually 0. Anyway, I used arandr
    to set it up like I wanted. I saved the generated command to a file in
    my home directory, then moved it to a file in
    /etc/X11/xinit/xinitrc.d/. They are usually started with a number in
    the file name. Don't forget to add the shebang (#!/bin/bash) line at
    the top if needed and make it executable as well. Once I did that, the
    displays worked like they should. So far at least.
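    A script along those lines might look like the following sketch; the file name, output names (HDMI-0, DP-2) and geometry are illustrative, so substitute whatever arandr/xrandr reports on your own system:

    ```shell
    #!/bin/bash
    # /etc/X11/xinit/xinitrc.d/10-monitor-layout.sh  (hypothetical name)
    # Re-applies the monitor layout at X startup. Make it executable:
    #   chmod +x /etc/X11/xinit/xinitrc.d/10-monitor-layout.sh
    xrandr --output HDMI-0 --mode 1920x1080 --pos 0x0 --primary \
           --output DP-2  --mode 1920x1080 --pos 1920x0
    ```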

    The lesson to be learned is this. When you have a monitor that is
    having issues and keeps showing as connected to different ports and
    such, you can't use that display to get a reliable configuration that
    will survive a reboot, maybe even a power off and back on. Another
    thing: either xrandr or arandr is a nifty tool if set up correctly.
    Those two make it so a display, or more importantly a set of displays,
    works like you want. The arandr command, since it is a GUI, makes it a
    lot easier to create the xrandr command with the right options. If you
    use that route tho, make sure all monitors are connected and on before
    starting. You may be able to do it without that with xrandr, but
    arandr needs the monitor to be on and working. The other thing:
    putting the settings in /etc/X11/xinit/xinitrc.d/ seems to work pretty
    well. So far at least.

    To be honest tho, I wish Nvidia would generate a conf file that contains
    both monitors and I could set it up properly there. Then when I boot
    up, it reads that file and knows which monitor is which long before the
    DM and/or sddm even starts. It could also keep a monitor powered on
    even while on a console with nothing GUI running. I kinda wish we
    could do it like we did back in the old days.

    I also had another thought. When changing the xorg.conf file, I wonder
    if it is only read when loading the nvidia drivers but not when the DM
    is started/restarted. I noticed on my system, when I booted but had
    not started the DM, the Nvidia drivers were already loaded. I'm not
    sure when the xorg.conf file is read, but if it is read when the
    drivers load, then that could explain why some changes didn't make any
    difference to the display. The changes were not seen unless I
    rebooted, which I didn't always do. Maybe someone here knows what
    order this happens in. It could explain a lot tho.

    I think if you change parameters in the "Device" section for the graphics driver in your xorg.conf, you need to reload the driver itself, then
    restart X. If the driver is built into the kernel, you have to reboot.

    If you change something in your "Monitor" section you just need to restart X.


    I'm hoping all this will help someone. It sure has been a hair puller
    for me. LOL

    Yeah, that LG monitor has been a pain. You better keep it matched to the old PC where you know it just works. ;-)



  • From Michael@21:1/5 to All on Sat Jul 6 13:19:11 2024
    On Saturday, 6 July 2024 10:59:30 BST Dale wrote:
    Mark Knecht wrote:
    On Tue, Jul 2, 2024 at 12:44 PM Dale <rdalek1967@gmail.com> wrote:
    <SNIP>

    I tried it with those options and without. Neither changed anything. I originally tried it with no xorg.conf at all. I was hoping maybe the Nvidia GUI thing would adjust things. I may try that again. No xorg.conf and use the GUI thing. That's what I use to set up my TV and such anyway. Thing is, the sddm screen is HUGE too.

    <SNIP>

    :-) :-)

    ???

    xdpyinfo | grep -B2 resolution

    ???

    I booted my new rig up again. Dang that thing. It was HUGE again. I
    started reading stuff, mainly about xorg.conf and the available
    settings. I changed all sorts of stuff, including some things Michael
    suggested. I restarted the DM each time. I was about ready to toss it
    in the old minnow pond; that's where everything goes to die here. Lots
    of CRT monitors in there. LOL Anyway, I had to install that package to
    run that command. It spit out an oops when I tried to run it after a
    copy and paste. I also installed it on my main rig, just to compare.
    On the new rig, the DPI was a fairly large number. I thought I had the
    output saved but it seems to be gone. My main rig tho showed 80x80 dots
    per inch. I did a duck search and finally found how to set that. I
    then restarted the DM and YEPPIE!!! It was a normal size again.

    Now, the monitor on my main rig is a bit older too. Maybe 6 or 7
    years??? Should newer monitors be set to a higher number for DPI? Is
    that normal? Why was it using such a high number by default? I want to
    say one was like 200 or something. It was quite large. The reason I'm
    asking: I may need to set something else to make the screen the right
    size but let it use that larger DPI number, if that is what the newer
    monitor prefers to use.

    Now to reboot, see if I have thoughts of that minnow pond again. :/

    Dale

    :-) :-)

    I'm struggling to follow your post because you do not provide specific information on the commands you input, the output you get in your terminal and the observed changes in the monitor.

    You also don't provide info on the changes you made in your xorg.conf, or xrandr and the corresponding changes observed each time in your Xorg.0.log.

    Strictly speaking, the pixel density of an on-screen digital image is referred to as Pixels Per Inch (PPI), but the term DPI which refers to a printed image of ink Dots Per Inch has stuck.

    In addition, there is the physical pixel density of your monitor and the rendered pixel density of the X11 image(s). Tweaking the latter allows you to scale the display and make images look larger than the native monitor resolution.

    You can set the DPI in your xorg.conf, or you can set it with xrandr, or you can set it on the CLI when you launch X, but usually this is not necessary and could mess up the scaling of your fonts, window decorations and symbols too (the font DPI is set differently, by setting Xft.dpi: in ~/.Xresources, or via the window manager's/DE's font settings).
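    As a minimal sketch of the ~/.Xresources route (the value 96 is illustrative; pick whatever suits your monitor and eyes):

    ```shell
    # Append an Xft font-DPI setting to the X resources file.
    echo 'Xft.dpi: 96' >> ~/.Xresources
    # Merge it into the running X server's resource database so Xft-based
    # applications pick it up without restarting X.
    xrdb -merge ~/.Xresources
    ```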

    A good starting point is to get the manual of your monitor and look at its published native resolution, e.g. 1920x1080 and the physical monitor size over which this resolution is displayed. Let's assume this 1920x1080 native resolution belongs to a 23" monitor. A 23" diagonal would correspond to a 20" wide screen real estate. Consequently the horizontal PPI would be:

    PPI = 1920 pixels / 20" = 96

    The same resolution on a 24" wide monitor would give a PPI of:

    PPI = 1920 pixels / 24" = 80

    Obviously a physically wider 24" monitor with the same native screen resolution as a smaller 20" monitor will not look as sharp when viewed from the *same* distance.

    Similarly, changing the selected resolution on the same 23" monitor from say 1920 pixels wide to a lower resolution of 1280 pixels gives a PPI of 64.

    I leave the calculation of the vertical PPI to the reader.
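    For those who don't want to do the homework, both axes of the example can be checked with a one-liner; the 11.3" height below is the approximate 16:9 height of the hypothetical 23" panel:

    ```shell
    # Horizontal PPI: native width in pixels over screen width in inches.
    awk 'BEGIN { printf "horizontal: %.0f\n", 1920 / 20 }'
    # Vertical PPI: native height in pixels over screen height in inches
    # (a 23" 16:9 panel is roughly 11.3" tall), which lands on the same value.
    awk 'BEGIN { printf "vertical: %.0f\n", 1080 / 11.3 }'
    ```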

    Usually I start with no xorg.conf and leave the card to detect what the monitor prefers, then use the Scale setting in the desktop settings to increase/decrease (zoom in/zoom out) the displayed scale. This has the effect of altering the PPI to higher or lower values to improve readability of content. The above should help you arrive at some practical resolution, but I would start with the native resolution of the monitor and work down from there if you find it difficult to read its display.

    NOTE: Using Qt scaling can mess up window decorations, widgets, etc. I've found it doesn't work well with some KDE applications and their menus/submenus, or pop-up windows. You need to set PLASMA_USE_QT_SCALING=1 to make it follow Qt scaling, and there's GTK3 too which may need tuning. This is the reason I calculate PPI before I venture into buying a new monitor, unless I can see it in person to make sure I can still read its content. ;-)

  • From Michael@21:1/5 to All on Sun Jul 7 00:00:38 2024
    On Saturday, 6 July 2024 17:11:23 BST Dale wrote:
    Michael wrote:
    <SNIP>
    The reason I picked Mark's post is that I used the command he gave to
    find out the DPI info was different from my main rig. When I first
    booted up and started the DM, I got that HUGE screen again. It worked
    last time. I hadn't changed anything. I sometimes still wonder if it
    reads xorg.conf each time. Anyway, when it didn't work, I started
    reading up a bit. I tried several things, including checking options
    you posted, but nothing worked. It stayed HUGE. Then I ran the
    command Mark gave and noticed the difference in DPI between my main
    rig and the new rig. I then found out how to set that in xorg.conf
    and set it the same as my main rig. As soon as I restarted the DM,
    the screen came up the correct size. The HUGE part was gone. When I
    rebooted, it was still the normal size. It also worked each time I
    restarted the DM. The only change is setting DPI. Like this:


    Section "Monitor"
    Identifier "Monitor0"
    VendorName "Unknown"
    ModelName "Samsung LS32B30"
    Option "PreferredMode" "1920x1080_60.00"
    Option "DPMS"
    Option "DPI" "80 x 80"
    EndSection


    I just booted the new rig again and it has a normal display. None of
    that huge stuff. I think I've rebooted like three times now and it
    worked every time. I think that is the most reboots with a config that
    works since I built this rig. Usually, it works once, maybe twice, then
    fails. Later on, it might work again. This machine is like rolling
    dice. You never know what you're going to get. Three consecutive
    reboots with it working gives me hope on this one. I won't be
    surprised if it breaks again when I hook up a second monitor or the TV
    tho. ;-)

    My only question now, is that a good setting or is there a better way to
    make sure this thing works, each time I reboot?

    Dale

    :-) :-)

    Is this your monitor?

    https://www.samsung.com/us/business/computing/monitors/flat/32--s30b-fhd-75hz-amd-freesync-monitor-ls32b300nwnxgo/#specs

    If the screen is 27.5" wide and 15.47" high, then at a native 1,920 x 1,080 pixel resolution the DPI would be approx. 70x70. However, if you're happy with the way it looks @80x80, then that's a good setting. After all, you're the one looking at it! :-)
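    The 70x70 figure follows directly from the quoted panel dimensions:

    ```shell
    # DPI = native pixels / physical inches, per axis, for a 27.5" x 15.47" panel.
    awk 'BEGIN { printf "horizontal: %.0f\nvertical: %.0f\n",
                 1920 / 27.5, 1080 / 15.47 }'
    ```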


  • From Frank Steinmetzger@21:1/5 to All on Sun Jul 7 22:10:02 2024
    On Sat, Jul 06, 2024 at 07:32:49PM -0500, Dale wrote:

    Michael wrote:
    On Saturday, 6 July 2024 17:11:23 BST Dale wrote:
    Michael wrote:
    On Saturday, 6 July 2024 10:59:30 BST Dale wrote:

    Now the monitor on my main rig is a bit older too. Maybe 6 or 7
    years??? Should newer monitors be set to a higher number for DPI?

    DPI does not depend on age, but only on physical characteristics, of course.

    <SNIP>

    Is this your monitor?

    https://www.samsung.com/us/business/computing/monitors/flat/32--s30b-fhd-75hz-amd-freesync-monitor-ls32b300nwnxgo/#specs

    If the screen is 27.5" wide and 15.47 high, then at a native 1,920 x 1,080 pixel resolution the DPI would be approx. 70x70. However, if you're happy with the way it looks @80x80, then that's a good setting. After all, you're
    the one looking at it! :-)


    Actually, mine is an LS32B304NWN. I'm not sure what the difference is
    between the 300 and the 304. There may just be a minor version change,
    but the display is the same.

    If I look at the Samsung pages: https://www.samsung.com/us/computing/monitors/flat/32--s30b-fhd-75hz-amd-freesync-monitor-ls32b300nwnxgo/
    https://www.samsung.com/us/business/computing/monitors/flat/32--s30b-series-with-dp-cable-ls32b304nwnxgo/
    then the difference is in the caption: the 304 comes with a DP cable.

    It's hi res and a good deal.  :-D 

    Please define hi res. Full HD at 32″ is definitely not hi res. ;-P
    It’s about as much as CRTs back in the day, close to 1024×768 at 17″.

    Compared to the HUGE display, yea, it looks good. The reason I was
    asking if that is correct is this: maybe it should be set to, just
    guessing, 128 x 128, but some other setting makes the picture the right
    size, not HUGE. If 70 x 70 or 80 x 80 is a setting that the monitor is
    designed for and ideal, then that is fine.

    Well technically, a monitor is not designed for, but designed with a
    specific number. It is determined by the size of its physical pixels.

    Monitors, even the old CRTs, have resolutions and settings they work
    best at.

    True; with bigger pictures (meaning more pixels), the frame rate went down and the CRT started to visibly flicker. So the sweet spot was at the highest resolution for which a comfortably high frame rate could be maintained. I was too young in the CRT era to know the exact reason, but there are many to choose from:
    - insufficient GPU power to deliver enough pixels per second
    - limited bandwidth in the display cable
    - the monitor couldn’t keep up
    - the CRT’s pixel pitch in the phosphor screen

    I read once where a
    person had a monitor that had a great picture at 60Hz refresh.  Even tho
    it would work at 75Hz, the picture wasn't as good.  It seems that
    something didn't like that 75Hz setting.  That person used the 60Hz setting.  Some things are picky that way.  Higher isn't always better.

    How long ago was that? If it was in the VGA era, maybe the analog circuits weren’t good enough and produced a bad signal.

    I may try that 70 setting.  Odds are, just like the difference between
    60 and 75Hz refresh rate, I likely won't be able to tell the
    difference.  Time will tell tho. 

    Well, don’t mix up frame rate and scaling. 75 Hz vs. 60 is quite subtle; you might not even notice 90 Hz. But changing DPI from 80 to 70 will mean an increase in font size of about 14 %.
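    The 14 % figure is just the ratio between the two DPI values:

    ```shell
    # The ratio 80/70 determines the relative change in rendered size when
    # switching between the two DPI settings: about 14 %.
    awk 'BEGIN { printf "%.0f%%\n", (80 / 70 - 1) * 100 }'
    ```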

    By the way, I booted the rig up when I went to heat up supper and was downloading new messages.  It booted to a normal screen.  I think it is
    at least being consistent now.  Before, it was hit or miss, mostly
    miss.  Given how good things are at just working, I'm surprised that the correct setting wasn't used automatically.  I'd think it should be. 
    Maybe that is a bug???? 

    Now to go eat supper.  I hope the monitor comes in soon.

    I’m confused. I thought the new one has already arrived and is the one where everything was HUGE. %-)

    --
    Grüße | Greetings | Salut | Qapla’
    Please do not share anything from, with or about me on any social network.

    People who are not convex with foreign words should not renovate with them.

  • From Wols Lists@21:1/5 to Frank Steinmetzger on Sun Jul 7 23:10:02 2024
    On 07/07/2024 21:08, Frank Steinmetzger wrote:
    I read once where a
    person had a monitor that had a great picture at 60Hz refresh.  Even tho
    it would work at 75Hz, the picture wasn't as good.  It seems that
    something didn't like that 75Hz setting.  That person used the 60Hz
    setting.  Some things are picky that way.  Higher isn't always better.

    How long ago was that? If it was in the VGA era, maybe the analog circuits weren’t good enough and produced a bad signal.

    In Europe, I think many CRTs were 50Hz. Basically, you need at least
    24Hz to fool the eye into seeing a moving picture. And if I'm correct
    (I never really dug into it), it's clearly something to do with the
    frequency of the AC supplying the monitor.

    I know that has a whole series of weird effects that are "oh THAT'S why"
    when it's pointed out.

    Cheers,
    Wol

  • From Frank Steinmetzger@21:1/5 to All on Sun Jul 7 23:30:01 2024
    On Sun, Jul 07, 2024 at 04:12:11PM -0500, Dale wrote:

    It's hi res and a good deal.  :-D 
    Please define hi res. Full HD at 32″ is definitely not hi res. ;-P
    It’s about as much as CRTs back in the day, close to 1024×768 at 17″.

    Well, I still consider 1080p hi res. That's what I get for any monitor
    or TV I buy. The biggest thing I have is a 32" tho. My rooms are kinda
    small. No need for a 60" TV/monitor.

    Well my TV sits over 4 m (that’s 13 feet for the imperialists) away from the sofa. So I splurged and got myself a 65″ one.

    Now to go eat supper.  I hope the monitor comes in soon.
    I’m confused. I thought the new one has already arrived and is the one where
    everything was HUGE. %-)

    I ordered a second identical monitor.  I been wanting two monitors for a while.  On occasion when I have hundreds of files to process manually, I need a second monitor just to stick a file manager on and drag files
    from one directory to another but being able to see both at the same
    time.

    I’ve never grown accustomed to multi-monitor setups. I’ve always used just one (a habit from my long laptop days). Instead I multitask with virtual desktops (as you do) and with terminal multiplexers.

    At 70 DPI, I recommend the terminus font, a bitmap font which is very
    readable at small sizes and allows you to get lots of information on the screen.

      A second monitor will help with this.  Plus, I have a spare as
    well.  So, first monitor is here and fixed a lot of problems except it
    added a new one, being HUGE.  Now that is fixed as well.  When I connect the second monitor, I should be able to set it up the same way except connected to a different port.

    Considering that you have a thread about GPU temps going, be warned: GPUs
    tend to suck a lot more power when running in multi-head setups.

    The biggest thing I dread right now, cleaning off my desk.  -_o

    A clean desk is just a sign for a messy drawer.

    --
    Grüße | Greetings | Salut | Qapla’
    Please do not share anything from, with or about me on any social network.

    “A computer is like air conditioning:
    it becomes useless when you open Windows.” – Linus Torvalds

  • From Frank Steinmetzger@21:1/5 to All on Mon Jul 8 00:20:02 2024
    Am Sun, Jul 07, 2024 at 02:06:04PM -0700 schrieb Mark Knecht:
    On Sun, Jul 7, 2024 at 1:09 PM Frank Steinmetzger <Warp_7@gmx.de> wrote:

    Am Sat, Jul 06, 2024 at 07:32:49PM -0500 schrieb Dale:
    <SNIP>

    Well don’t mix up frame rate and scaling. 75 Hz vs. 60 is quite subtle, you might not even notice 90 Hz. But changing DPI from 80 to 70 will mean an increase in font size by 14 %.

    So I understand the 14% calculation, but help me understand the underlying technology. Is the DPI how a font file, which I presume is some fixed size, like 25x25, gets scaled onto the screen? I'm not clear about the conversion from the font to the number of dots used to draw the font on the screen.

    Yeah. So, big convoluted topic. ^^

    First, there is the physical pixel raster of the screen, which determines
    the PPI value. But here is what may confuse people without them knowing (I was very
    confused in my early computing days when I was using Windows): font sizes
    and their units. People usually think in pixels, but font sizes are given in
    points, especially on modern Linux desktops. Historically, points come from
    lead typesetting, where 1 pt = 1/72 inch. And monitors of early publishing
    machines (and I think at the time in general) all had 72 ppi, so if you have
    a font size of 12 pt == 1/6 in == 4.233 mm on your screen, it will be
    exactly the same size on the printed paper. No scaling necessary.

    I forgot some of the minutiae over time; AFAIR Windows 9x+ assumed a standard density of 96 ppi and font sizes were set in pixels in the control panel. The monitor market was very homogeneous, there was not much diversity, so no need for scaling factors. The default in Windows 2000 and XP was Tahoma at 8 pixels. And it was the same on Pocket PCs (PDAs with 3″ touch screens of 240×320). So if you took a screenshot on all of those screens, the font was identical down to the pixel.

    Now comes the clash between the logical and the physical world. Today we have
    - high-density screens like tablets and laptops: 4K at 14″ equals 315 ppi
    - the standard cheap office screen of 1920×1200 at 24″ equals 94 ppi
    - my 8-year-old Thinkpad with FullHD at 12.5″ and 176 ppi

    A text of size 12 pixels will always be 12 pixels high, so it will appear smaller to the eye when the pixels are small, and bigger when the pixels are big.

    OTOH, a text at 12 pt should be displayed physically (in millimeters or
    inches on the screen) at the same size no matter how fine a screen resolves
    an image. So the computer needs to know how many pixels it needs to reach
    that size. That’s where the ppi come in:

                            font size in pt
    Number of pixels  =  -------------------  ×  screen density in px/in
                              72 pt/in

    The first factor gives you the font’s physical dimension in inches, the second factor converts that into pixel height. The units all cancel out, with pixels remaining.
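    A quick sanity check of that conversion (a throwaway helper I made up for illustration, not anything from the thread):

    ```shell
    # pixels = pt / 72 * ppi   (since 1 pt = 1/72 inch)
    pt_to_px() {
      awk -v pt="$1" -v ppi="$2" 'BEGIN { printf "%.0f\n", pt / 72 * ppi }'
    }

    pt_to_px 12 96    # classic 96-ppi assumption -> prints 16
    pt_to_px 12 176   # dense 176-ppi laptop panel -> prints 29
    ```

    Same 12 pt font, nearly double the pixels on the dense panel — which is exactly the scaling the desktop has to do for you.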

    That’s why you can enter the screen’s ppi into the settings (or use it automatically, if possible). So the font size you set will be the same to
    your eye no matter what monitor you plug in. The scaling factor business
    hides that: 100 % means 96 ppi, 200 % means 192 ppi.

    This produces two “Unfortunately”s:

    Unfortunately 1: people don’t know what the scaling means and how it works physically.

    Unfortunately 2: UI developers stick to this scaling factor idea. Everything
    outside certain values (meaning integer multiples of 96) looks ugly. But
    none of my screens has a ppi of n × 96. They are all in between (117, 176,
    216), and when I set the correct scaling, the Plasma UI becomes ugly as hell,
    because the previously nice-looking pixel-perfect lines become blurred or
    their thickness varies depending on where on the screen they are drawn.


    I’m confused. I thought the new one has already arrived and is the one
    where everything was HUGE. %-)

    Dale does this at times and I get confused also. He will (the way I read the messages) sometimes be talking about different machines or different monitors. His "main rig", his "new rig", etc.

    We could stick to hostnames. *ducksandruns*

    --
    Grüße | Greetings | Salut | Qapla’
    Please do not share anything from, with or about me on any social network.

    It’s a pity that at the end of the money there’s so much month left.

  • From Frank Steinmetzger@21:1/5 to All on Mon Jul 8 00:30:01 2024
    Am Sun, Jul 07, 2024 at 05:10:18PM -0500 schrieb Dale:

    It's hi res and a good deal.  :-D 
    Please define hi res. Full HD at 32″ is definitely not hi res. ;-P
    It’s about as much as CRTs back in the day, close to 1024×768 at 17″.
    Well, I still consider 1080P hi res.  That's what I get for any monitor or TV I buy.  The biggest thing I have is a 32" tho.  My rooms are kinda small.  No need for a 60" TV/monitor.

    Well my TV sits over 4 m (that’s 13 feet for the imperialists) away from the sofa. So I splurged and got myself a 65″ one.

    Well, I saw on a website once where it gave info on distance, monitor
    size and what you are watching can factor in too.  It claimed that a 32"
    is the ideal size for my room.  Given my old eyes tho, a 42" might serve
    me better.  Thing is, I'm bad to watch old videos from the 80's, 70's
    and even 60's.  Most of those are 480P or if lucky, just a little higher resolution.  With those, monitor size can make videos worse.

    This website’s goal probably was about covering your eyes’ natural field of
    view. Sitting at my desk, my 27 inch monitor appears only slightly smaller
    than my 65 inch TV 4 m away. Watching 50s TV shows will be the same
    experience on both in those situations.

    If you want to fill that entire field of view with details, then naturally,
    a 50s TV show in 480p won’t suffice. The more of your viewing arc you want to cover, the more picture resolution you need. You basically want to map
    X amount of pixels on each degree of viewing arc. Physical units are great.
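    To put a rough number on “pixels per degree of viewing arc” (a back-of-the-envelope helper; the screen widths and distances below are my own approximations, not figures from the thread):

    ```shell
    # Pixels per degree of horizontal viewing arc, given the screen width (m),
    # the horizontal resolution (px) and the viewing distance (m).
    ppd() {
      awk -v w="$1" -v px="$2" -v d="$3" 'BEGIN {
        pi  = atan2(0, -1)
        deg = 2 * atan2(w / 2, d) * 180 / pi   # total horizontal viewing angle
        printf "%.1f\n", px / deg
      }'
    }

    ppd 1.43 3840 4     # ~65" UHD TV seen from 4 m: roughly 190 px/degree
    ppd 0.60 1920 0.7   # ~27" FullHD monitor at a desk: roughly 41 px/degree
    ```

    The two screens cover a similar arc in those positions, but the UHD TV packs four to five times the pixels per degree — which is where the extra resolution actually pays off.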

    It also goes in the other direction: people these days™ watch 4K movies on their phones. Why, just why? Even if the screen can display it physically, their eyes cannot resolve that fine detail, because the pixels are too small.

    --
    Grüße | Greetings | Salut | Qapla’
    Please do not share anything from, with or about me on any social network.

    How do you recognise a male hedgehog?
    It has one more spine.

  • From Wol@21:1/5 to Frank Steinmetzger on Mon Jul 8 01:20:01 2024
    On 07/07/2024 23:29, Frank Steinmetzger wrote:
    It also goes into the other direction: people these days™ watch 4K movies on
    their phones. Why, just why? Even if the screen can display it physically, their eyes cannot resolve that fine detail, because the pixels are too small.

    What's worse, as far as I can tell, I have no control over the download resolution! Why would I want to download a video in Full HD, when I only
    have an HD Ready screen, and it's over a metered connection! So
    basically, I'm paying for resolution I can't see!

    Cheers,
    Wol

  • From Michael@21:1/5 to All on Mon Jul 8 10:57:41 2024
    On Monday, 8 July 2024 00:14:59 BST Wol wrote:
    On 07/07/2024 23:29, Frank Steinmetzger wrote:
    It also goes into the other direction: people these days™ watch 4K movies on their phones. Why, just why? Even if the screen can display it physically, their eyes cannot resolve that fine detail, because the
    pixels are too small.
    What's worse, as far as I can tell, I have no control over the download resolution! Why would I want to download a video in Full HD, when I only
    have an HD Ready screen, and it's over a metered connection! So
    basically, I'm paying for resolution I can't see!

    Cheers,
    Wol

    yt-dlp -F <URL>

    will show you what different resolutions the streaming server offers and you can then select the one you need/prefer. Of course this is conditional on yt-dlp being capable of parsing the stream you want to download and on the streaming server offering different resolutions.
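    To make that concrete, a couple of hedged examples (the URL is a placeholder, and format IDs vary per video — pick one from the table `-F` prints):

    ```shell
    # Show the table of formats the server offers (IDs, resolutions, codecs):
    yt-dlp -F '<URL>'

    # Download a specific format by the ID picked from that table:
    yt-dlp -f 137 '<URL>'

    # Or cap the resolution declaratively: best video stream no taller than
    # 720p merged with the best audio, falling back to a combined 720p file:
    yt-dlp -f 'bv*[height<=720]+ba/b[height<=720]' '<URL>'
    ```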



  • From Michael@21:1/5 to All on Mon Jul 8 10:56:44 2024
    On Sunday, 7 July 2024 23:29:21 BST Frank Steinmetzger wrote:
    Am Sun, Jul 07, 2024 at 05:10:18PM -0500 schrieb Dale:
    It's hi res and a good deal. :-D

    Please define hi res. Full HD at 32″ is definitely not hi res. ;-P It’s about as much as CRTs back in the day, close to 1024×768 at 17″.

    Well, I still consider 1080P hi res. That's what I get for any monitor or TV I buy. The biggest thing I have is a 32" tho. My rooms are kinda small. No need for a 60" TV/monitor.

    Well my TV sits over 4 m (that’s 13 feet for the imperialists) away from
    the sofa. So I splurged and got myself a 65″ one.

    Well, I saw on a website once where it gave info on distance, monitor
    size and what you are watching can factor in too. It claimed that a 32"
    is the ideal size for my room. Given my old eyes tho, a 42" might serve
    me better. Thing is, I'm bad to watch old videos from the 80's, 70's
    and even 60's. Most of those are 480P or if lucky, just a little higher resolution. With those, monitor size can make videos worse.

    This website’s goal probably was about covering your eyes’ natural field of
    view. Sitting at my desk, my 27 inch monitor appears only slightly smaller
    than my 65 inch TV 4 m away. Watching 50s TV shows will be the same experience on both in those situations.

    If you want to fill that entire field of view with details, then naturally,
    a 50s TV show in 480p won’t suffice. The more of your viewing arc you want to cover, the more picture resolution you need. You basically want to map
    X amount of pixels on each degree of viewing arc. Physical units are great.

    It also goes into the other direction: people these days™ watch 4K movies on
    their phones. Why, just why? Even if the screen can display it physically, their eyes cannot resolve that fine detail, because the pixels are too
    small.

    The rule of thumb is to come as close as possible to the TV screen until you start seeing individual pixels. Then you back off a little bit and plonk your armchair there. Obviously with a UHD TV the higher pixel density at a given screen size means you can sit much closer - or buy a larger TV. At some point the TV size becomes too large to provide a sharp non-pixelated image, if the room is small. When I asked a friend why he kept upgrading his TV to ever larger sizes for what was becoming an obviously worse visual experience, he responded "... for a man there's no such thing as too large a TV size!". o_O

    The problem arises when you are watching old TV material recorded at a much lower resolution than UHD. For these you have to move your seat further back when using a UHD TV, until the displayed pixels in the picture appear to merge and give a smooth(er) non-pixelated image. The TV chipset will try upscaling/interpolating and smoothing algorithms to improve the situation, but this won't fare well when the jump is from a VHS equivalent of 320 pixels to the 2160 of UHD.


  • From Wols Lists@21:1/5 to Michael on Mon Jul 8 13:10:01 2024
    On 08/07/2024 10:57, Michael wrote:
    will show you what different resolutions the streaming server offers and you can then select the one you need/prefer. Of course this is conditional on yt-dlp being capable of parsing the stream you want to download and on the streaming server offering different resolutions.

    And on you even having a clue what the hell the url is, seeing as it's
    all hidden behind/inside the tv streaming app ... I would guess it's
    streaming a .ts ...

    Cheers,
    Wol

  • From Michael@21:1/5 to All on Mon Jul 8 11:48:30 2024
    On Monday, 8 July 2024 00:57:40 BST Dale wrote:
    Frank Steinmetzger wrote:
    Am Sun, Jul 07, 2024 at 05:10:18PM -0500 schrieb Dale:
    It's hi res and a good deal. :-D

    Please define hi res. Full HD at 32″ is definitely not hi res. ;-P It’s about as much as CRTs back in the day, close to 1024×768 at 17″.

    Well, I still consider 1080P hi res. That's what I get for any monitor or TV I buy. The biggest thing I have is a 32" tho. My rooms are kinda small. No need for a 60" TV/monitor.

    Well my TV sits over 4 m (that’s 13 feet for the imperialists) away from
    the sofa. So I splurged and got myself a 65″ one.

    Well, I saw on a website once where it gave info on distance, monitor
    size and what you are watching can factor in too. It claimed that a 32"
    is the ideal size for my room. Given my old eyes tho, a 42" might serve
    me better. Thing is, I'm bad to watch old videos from the 80's, 70's
    and even 60's. Most of those are 480P or if lucky, just a little higher resolution. With those, monitor size can make videos worse.

    This website’s goal probably was about covering your eyes’ natural field
    of view. Sitting at my desk, my 27 inch monitor appears only slightly
    smaller than my 65 inch TV 4 m away. Watching 50s TV shows will be the
    same experience on both in those situations.

    If you want to fill that entire field of view with details, then naturally,
    a 50s TV show in 480p won’t suffice. The more of your viewing arc you want
    to cover, the more picture resolution you need. You basically want to map
    X amount of pixels on each degree of viewing arc. Physical units are great.

    It also goes into the other direction: people these days™ watch 4K movies on their phones. Why, just why? Even if the screen can display it physically, their eyes cannot resolve that fine detail, because the
    pixels are too small.


    Yea. The website at the time was most likely to help people not buy a
    TV that is waaaay too large.

    I made a DVD of the TV series Combat for my neighbor. That was when he
    had a little smaller TV. He said it looked like large blocks on the
    screen. He watched it tho. lol He sits about 10 feet from the TV. It
    is a nice TV tho. All that smart stuff.

    I agree, a device should pick a resolution that it can easily display
    without downloading more than it needs. There's really not much need
    putting a 4K or even a 1080P video on a small cell phone. Unless a
    person is using a magnifying glass, they won't see the difference. I remember some of my old CRTs that ran at 720P. For their small size,
    that was plenty.

    Devices send a viewport size to the server to fetch scaled images and fonts as required, instead of downloading a huge resolution only for it to be consumed on the small screen of a phone or tablet. I'm not sure how the screen size information is shared between server-phone-TV when you mirror your phone on a TV.


  • From Wols Lists@21:1/5 to Michael on Mon Jul 8 14:00:01 2024
    On 08/07/2024 11:48, Michael wrote:
    Devices send a viewport size to the server to fetch scaled images and fonts as required, instead of downloading a huge resolution only for it to be consumed on the small screen of a phone or tablet. I'm not sure how the screen size information is shared between server, phone and TV when you mirror your phone on a TV.

    You wish. Given that the broadcast organisations control both the
    sending server and the consuming app, and that fibre and 5g and
    youngsters with unlimited bandwidth give the impression that "bandwidth
    doesn't matter", monitoring my data usage tells me that these apps
    expect to stream and receive in whatever bandwidth is available, namely
    the single hi-res version on the broadcaster's servers.

    Why should I pay £50/month for unlimited data, when my normal monthly
    usage hardly hits 500MB? I'm currently paying £6/m (the cheapest tariff available is £5) for unlimited phone, text, and 15GB with rollover. For
    10 months of the year that's effectively unlimited! Why should I pay so
    much more just because some idiot insists on spamming my paid-for
    bandwidth for stuff I can't even use !!!

    Cheers,
    Wol

  • From Wol@21:1/5 to Dale on Mon Jul 8 15:00:01 2024
    On 08/07/2024 13:27, Dale wrote:
    I don't know about cell phones but if using the youtube app, I'd think
    it would know what you are using and the resolution too.

    BBC iPlayer, ITVx. Along with the other Freeview apps for Channels 4 and 5.

    To me, it
    looks like it would be best for everyone to only download what is needed
    and no more.

    You'd think. Trouble is - most people DON'T think!

    It saves bandwidth of the server, bandwidth for the user
    as well.  Most people have unlimited nowadays but still, one would think
    a company like youtube would see the benefit of only sending enough resolution to get the job done.  If they do that for millions of users,
    it would have to save them some amount of money.  I'd think anyway.

    Youtube doesn't (yet) have a monopoly on streaming, fortunately ...

    Cheers,
    Wol

  • From Peter Humphrey@21:1/5 to All on Mon Jul 8 17:00:01 2024
    On Monday 8 July 2024 13:59:27 BST Wol wrote:
    On 08/07/2024 13:27, Dale wrote:
    I don't know about cell phones but if using the youtube app, I'd think
    it would know what you are using and the resolution too.

    BBC iPlayer, ITVx. Along with the other Freeview apps for Channels 4 and 5.
    To me, it
    looks like it would be best for everyone to only download what is needed and no more.

    You'd think. Trouble is - most people DON'T think!

    It saves bandwidth of the server, bandwidth for the user
    as well. Most people have unlimited nowadays but still, one would think
    a company like youtube would see the benefit of only sending enough resolution to get the job done. If they do that for millions of users,
    it would have to save them some amount of money. I'd think anyway.

    Youtube doesn't (yet) have a monopoly on streaming, fortunately ...

    I don't know about dedicated services with their own clients, but anything you get via a web browser is tailored to your screen. That was so when I was operating a web site, anyway.

    A script blocker in your browser may be able to thwart this query-reply about screen sizes; I don't know.

    --
    Regards,
    Peter.

  • From Michael@21:1/5 to All on Mon Jul 8 18:26:26 2024
    On Monday, 8 July 2024 15:52:03 BST Peter Humphrey wrote:
    On Monday 8 July 2024 13:59:27 BST Wol wrote:
    On 08/07/2024 13:27, Dale wrote:
    I don't know about cell phones but if using the youtube app, I'd think
    it would know what you are using and the resolution too.

    BBC iPlayer, ITVx. Along with the other Freeview apps for Channels 4 and 5.

    To me, it
    looks like it would be best for everyone to only download what is needed and no more.

    You'd think. Trouble is - most people DON'T think!

    It saves bandwidth of the server, bandwidth for the user
    as well. Most people have unlimited nowadays but still, one would think a company like youtube would see the benefit of only sending enough resolution to get the job done. If they do that for millions of users, it would have to save them some amount of money. I'd think anyway.

    Youtube doesn't (yet) have a monopoly on streaming, fortunately ...

    ;-)

    I don't know about dedicated services with their own clients, but anything you get via a web browser is tailored to your screen. That was so when I
    was operating a web site, anyway.

    Still is the case today. I have not worked on mobile apps, beyond their browsers, but the reasonable assumption must be that mobile devices should only download what they are able to render. It is really odd they don't do this - as Wol attests.

    Back to the previous topic, I have not yet found a case where changing the scale via the desktop settings arrives at non-blurred fonts. The clearest, sharpest fonts are always rendered at the native monitor resolution, at a 100% scale setting. Am I missing a trick, or is this to be expected?


  • From Frank Steinmetzger@21:1/5 to All on Mon Jul 8 22:30:01 2024
    Am Mon, Jul 08, 2024 at 06:26:26PM +0100 schrieb Michael:

    Back to the previous topic, I have not yet found a case where changing the scale by means of the desktop settings, arrives at non-blurred fonts. The clearest sharpest fonts are always rendered at the native monitor resolution,
    at a 100% scale setting. Am I missing a trick, or is this to be expected?

    That doesn’t really make sense. Fonts are always rendered natively, no matter what size. Except if they are really rendered at 100 % and then the rendered bitmap is scaled by the GPU or somesuch.

    Or because their hinting information is limited to a certain size range.
    This info gives the renderer special knowledge on how to render the glyphs.

    Do you have screenshots?

    --
    Grüße | Greetings | Salut | Qapla’
    Please do not share anything from, with or about me on any social network.

    One doesn’t eat salad, one feeds salat to one’s food.

  • From Michael@21:1/5 to All on Wed Jul 10 10:45:19 2024
    On Wednesday, 10 July 2024 06:00:41 BST Dale wrote:
    New subthread. Slightly new situation.
    [snip ...]
    On the old LG monitor, when I first plugged up the new monitor, it
    wouldn't power up from standby. It wouldn't even when I connected only
    the new monitor. The BIOS would beep that it can't find a display as
    well. I thought at first I had a broken monitor. Getting power but
    won't wake up. I tried another cable, it worked. So, the first cable,
    same as I used on the LG monitor, didn't work at all with the new
    monitor. Might be that the cable has a problem not the LG monitor
    itself. I didn't think of that. I've used that cable quite often
    before. I'm not sure how old that cable is but it may find a trash can.

    It is best to swap cables around to make sure you are not getting bad or inconsistent results just because of a faulty cable. It goes without saying you should select a cable specification which is capable of the required bandwidth for your card and monitor, and not gimp this via some lower-throughput adaptor in between.


    On to the new monitor. I'm trying to decide whether to use xrandr and friends to set this up or xorg.conf. Using both seems to cause a bit of
    a clash and KDE isn't helping any. I'd kinda like to go the xorg.conf
    route. I think, but not sure, that xorg.conf is read very early on. It seems, but not sure, that the xinit files are read later on. I'm not
    sure on all that. It could be the other way around. I'm also pretty
    sure that if set up in xorg.conf, it would work if I logged into another
    GUI; Gnome, Fluxbox or some other flavor.

    Yes, xorg.conf would determine your screen layout for any window manager you launch, unless the window manager/DE has its own specific layout configuration overriding the default xorg.conf file settings (using libxrandr).


    It could be that xrandr and friends would as well.

    No, xrandr is a client of the RandR extension of the X11 protocol; it
    is meant to dynamically change your settings in real time, or to query X for the current settings. If you run xrandr from a script, it will change the X settings each time the script is run.

    I suggest you use one tool at a time to avoid conflicts and duplication.


    Current situation, config wise. The first problem I noticed: the
    monitors are nearly identical. Even the serial numbers are only a few
    digits off, and that's the only difference I can see. I did some
    searching and wanted to set an option in the xorg.conf Monitor section
    that identifies the monitors not only by model but also by serial number.
    That way I could set one to the right or left of the other, or above/below,
    and it would know specifically which monitor is which, even if plugged into a different port. I can't find an option for serial number yet. I did
    find where someone else wanted to do the same a couple of years ago but
    sadly, no answer to that question. So, if that is not doable, I may have
    to specify the port number. If I ever have to disconnect and reconnect, getting the order right could prove interesting. ;-)

    xrandr --listmonitors

    will show, for each display: the monitor number; the graphics card port it is connected to (either of which you can use as the monitor identifier in xorg.conf); whether or not it is detected as the primary display; and its screen position, size, and other current settings.
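    As a sketch of what to expect (the listing below is hypothetical; the real output depends on your card and connections), the names in the last column are the identifiers usable in xorg.conf:

```shell
# Hypothetical `xrandr --listmonitors` output; the asterisk marks the
# primary display.  The awk line pulls out just the output names, which
# are the identifiers usable in xorg.conf Monitor sections.
sample='Monitors: 2
 0: +*DP-3 1920/530x1080/300+0+0  DP-3
 1: +DP-1 1920/530x1080/300+1920+0  DP-1'
printf '%s\n' "$sample" | awk 'NR > 1 { print $NF }'
```

    On a live system you would pipe `xrandr --listmonitors` itself instead of the sample text.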


    Right now, I have this:


    root@Gentoo-1 ~ # cat /etc/X11/xinit/xinitrc.d/20.xrandr
    xrandr --output DP-0 --off --output DP-1 --mode 1920x1080 --pos 1920x0 --rotate normal --output DP-2 --off --output DP-3 --primary --mode 1920x1080 --pos 0x0 --rotate normal --output DP-4 --off --output DP-5 --off --output DP-6 --off --output DP-7 --off
    root@Gentoo-1 ~ #


    From what I've read, that is the correct place for that command. If
    not, please let me know where it should be. Keep in mind, putting it in
    /usr somewhere gets overwritten when the package providing that file
    gets updated.

    It is a correct place to put it, if you intend to run xrandr every time you launch X to change your monitor layout from whatever it would otherwise be detected as.


    I'm also attaching the current xorg.conf file. Keep in
    mind, gotta add the TV later on. I think if I can get the monitors set
    up, I can add the TV pretty easily. Even if it only works in KDE, that
    is fine since I need KDE up and running to use the TV anyway. I'm
    mostly needing to know if there is a way to add the serial number in xorg.conf. I think the rest is OK. I also need a command to get what
    the system sees as the serial number, in case it only sees a part of it,
    the last few digits or something.

    I don't think specifying a serial number is necessary. Use the identification xrandr provides for each monitor.


    I also had to argue with KDE about which is primary. At first, like
    with the LG, it wanted to make the second monitor the primary despite
    the first monitor being marked primary. I did the old set it backwards, apply and then set it back the way I want it trick. It took it a second
    but KDE reversed it. Main screen went to the primary display. This is
    what I mean by it clashing and me setting the displays up in xorg.conf
    and KDE hopefully getting its info from that and adjusting itself.
    Plus, if I use a different GUI, it should work too.

    You can specify which monitor is the primary monitor with:

    Option "Primary" "true"


    These new monitors sure are nice. My old eyes like them. o_O Oh, I
    was going to copy over the KDE config directories, but I think I'm going to
    take the opportunity to give KDE a fresh start. I think those old
    config files date back to the switch to KDE4. We're on KDE 6 now.

    If anyone knows if/how to set up xorg.conf with a monitor serial number,
    please chime in. Other info welcome as well.

    Dale

    :-) :-)

    The serial number should be provided in the EDID table. You can see this and check if it is different for your two monitors in the Xorg.0.log output. Some cheap monitors have the same EDID flashed in their EEPROM and the serial
    number would not be of any use. However, I doubt you need this to provide an identifier for each monitor. It would be much easier using the card port the monitor is connected to, e.g.:

    Section "Monitor"
    Identifier "DP-3"
    ....
    Option "Primary" "true"
    ....
    EndSection

    Section "Monitor"
    Identifier "DP-5"
    ....
    Option "RightOf" "DP-3"
    ....
    EndSection

    Section "Monitor"
    Identifier "DP-7"
    ....
    Option "RightOf" "DP-5"
    ....
    EndSection
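    That said, if you do want to inspect the serial numbers yourself, the EDID base block keeps the 32-bit serial in bytes 12-15 (little-endian) and the manufacturer ID in bytes 8-9. A minimal parsing sketch, run here against a synthetic 16-byte blob rather than a real monitor's EDID (on Linux the raw blob is typically readable from /sys/class/drm/<connector>/edid, with the connector name varying per system):

```python
import struct

def edid_identity(edid: bytes):
    """Return (manufacturer letters, serial number) from a raw EDID blob."""
    if edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        raise ValueError("not an EDID base block")
    # Bytes 8-9: manufacturer ID, three 5-bit letters packed big-endian.
    mfg = struct.unpack(">H", edid[8:10])[0]
    letters = "".join(chr(((mfg >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))
    # Bytes 12-15: serial number, 32-bit little-endian (zero on some panels).
    serial = struct.unpack("<I", edid[12:16])[0]
    return letters, serial

# Synthetic example blob: 8-byte header, manufacturer code 0x4C2D ("SAM"),
# product code 0, serial 231 -- NOT a real monitor's EDID.
fake = (b"\x00\xff\xff\xff\xff\xff\xff\x00"
        b"\x4c\x2d" b"\x00\x00" b"\xe7\x00\x00\x00")
print(edid_identity(fake))  # ('SAM', 231)
```

    As noted above, two cheap monitors may carry identical EDIDs, in which case the serial field is useless as an identifier anyway.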


    The randr extension of X11 allows for dynamic changes in real time of your layout and it makes sense to use it on the CLI when you want to resize the screen displayed in a monitor, its orientation, etc. Since you want to set up a permanent dual-monitor layout, you probably want to configure this in xorg.conf. Today, modern graphics drivers do most of the leg work themselves and you only need to specify what the driver won't know about, e.g. the relative positioning of your monitors and which is the primary monitor. So, less is more in this respect.

    A minimalist xorg.conf configuration would need to include sections for your monitors. With the monitors connected and X running, query xrandr to find out what X11 has configured:

    xrandr --listmonitors
    xrandr -q

    Then move your current xorg.conf out of the way and create the file '/etc/X11/xorg.conf.d/10-monitor.conf' to add your two or three monitor/TV sections in there. Reboot and see if everything works as expected. If not, add the
    minimum directives necessary, e.g. "PreferredMode" or "DPI", but add only one of these at a time and restart X or reboot. Normally you wouldn't need any other sections, but if you do, e.g. for Screen, add it with the minimum configuration possible and work up from there. Soon you should arrive
    at a working layout which will be replicated across different window
    managers and DEs.
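    For instance, a minimal 10-monitor.conf along these lines might be a starting point (the DP-* names, mode and relative position are examples; substitute what xrandr reports on your system):

```
# /etc/X11/xorg.conf.d/10-monitor.conf -- minimal sketch
Section "Monitor"
    Identifier "DP-3"
    Option     "Primary"       "true"
    Option     "PreferredMode" "1920x1080"
EndSection

Section "Monitor"
    Identifier "DP-1"
    Option     "Above" "DP-3"
EndSection
```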

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Michael@21:1/5 to All on Wed Jul 10 14:14:46 2024
    On Wednesday, 10 July 2024 12:44:28 BST Dale wrote:

    It sounds like you recommend me using xorg.conf and not xrandr. I was thinking that using both would also cause a clash. Basically, I need
    one tool to do this. That's why I picked xorg.conf for long term,
    xrandr is just for now or a second option. I may comment that command
    and reboot. See if it is the xorg.conf file doing the work or xrandr.

    I recommend using whichever tool does the job best for your specific needs. Normally, sections in xorg.conf can be used for special input and display configurations, when the default configuration (running without an xorg.conf) will not do.

    The xrandr command is there to interface manually, in real time, with the RandR extension of the X11 API and change settings to suit your preferences. You can, if you want to, script it and run it every time X starts, to change the default settings.

    If you are always using Plasma, then it may be convenient to use neither an xorg.conf, nor xrandr and instead use the 'Plasma > SystemSettings > Display and Monitor' GUI to configure your desktop setup.

    Any of the above three options should be able to do the job, but some may be more reliable than others. I found that whenever Plasma was upgraded to
    a new major/minor version, the layout on a dual-monitor setup running on X was all over the place. I moved that system over to Wayland and had no more complaints from users about a displaced toolbar, a reversed monitor layout
    and the like. YMMV.


    I think we talked about this maybe off list. On my old machine, when
    sddm comes up, the password field on the second monitor shows the dots,
    TV in my case. On the new machine, both monitors show the dots for the password. I'm not sure what is different tho. It did that even before
    I set the primary option. I like it that way myself but makes me
    curious why my main rig is different. It seems the new rig sends the
    same screen to both monitors. Once logged into KDE, it splits into two monitors. My main rig it seems is always two separate screens.

    As far as I know SDDM uses the file(s) in /usr/share/sddm/scripts/ to
    start the login GUI. I haven't looked into how far these can be tweaked for a dual monitor setup, or whether they even have a 'primary' monitor concept.


    I got some things going on. I'll read the email closer later and make
    some changes. I'll post back then. Oh, so far, it shows several
    packages headed in the right direction. The monitor stand left a small
    hub and when it leaves there, it almost always gets delivered that day.
    So, I may get the monitor stand today. The new /home hard drive is on
    the right path too. I'm expecting quite a lot of packages. While
    proofing this, got text from USPS that stand and several other packages
    are out for delivery. UPS updates a little later.

    Oh, in /etc/X11/xorg.conf.d/ when the files are numbered, does it read
    them from low to high?

    Yes.

    If I set a option in one file but set the same
    option differently in another file, which one does it apply? Or does it
    not apply either?

    First the lower numbered file, then the higher numbered file (see man run-parts). Also see the explanation at the URL below.

    Thanks for the info. :-D Will work on it shortly.

    Dale

    :-) :-)

    https://wiki.gentoo.org/wiki/Xorg.conf

    The separate files in /etc/X11/xorg.conf.d/ are meant to break things up and make it easier to check, add, or take out sections. Configuration files are read in numeric order and sequentially, i.e. 10-monitor.conf will be read and applied before 20-monitor.conf, or 30-something-else.conf. Files will be read in alphabetic order if they are not prefixed by a number.

    Note, as the above URL points out, if you have a /etc/X11/xorg.conf file it will take precedence over any files in /etc/X11/xorg.conf.d/ and these in turn will take precedence over the default files installed in /usr/share/X11/xorg.conf.d/.
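    The ordering is plain collation, so it can be previewed without touching X at all; a quick sketch with made-up file names:

```shell
# Files in xorg.conf.d are parsed in collating order, so a numeric
# prefix pins the position: 10-monitor.conf is read before
# 20-monitor.conf, and unnumbered names fall into alphabetic order.
printf '%s\n' 20-monitor.conf 10-monitor.conf 05-keyboard.conf | sort
```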


  • From Michael@21:1/5 to All on Thu Jul 11 13:44:16 2024
    On Thursday, 11 July 2024 07:23:58 BST Dale wrote:
    Michael wrote:

    As far as I know SDDM uses the file(s) in /usr/share/sddm/scripts/ to start the login GUI. I haven't looked into how far these can be tweaked for a dual monitor setup, or whether they even have a 'primary' monitor concept.
    I've never really looked into it either. I mentioned it because it
    seems something has changed. On my old rig, it seems to have kept some setting somewhere but on new installs, it uses a new setting which we
    may both like better. Luckily one of my TVs is in the same room so I
    can see the screen. If however, you have a second monitor that you
    can't see, it may be worth looking into and setting it to the new way.
    It could be that someone reading this long thread would also like to
    know to do the same. ;-)

    Hmm ... on a system here running with two monitors, the SDDM password field only shows what is being typed on the right-hand (secondary) monitor.
    The primary monitor's password field remains empty, unless I click on it before I start typing. There is no custom SDDM config and no xorg.conf on this system. :-/


    I found the man page and another web page with a ton of info on
    options. Link below in case others want to bookmark it. Some of them I
    have no idea what they do. Even their description for some settings
    makes no sense since the terms used are things I never heard of. I
    doubt I need those anyway, thank goodness. Anyway, I've been playing with
    this thing a bit. I made a simple change in xorg.conf just to see if it worked or not without changing anything else. I added this to the
    options for the second monitor:


    Option "Above" "DP-3"


    I'll see how that works. May try another GUI too, Fluxbox or something.
    For some reason tho, the port numbers are still odd, consistent but
    odd. Primary monitor is plugged into the lowest port, the one with #1 stamped on the bracket. It sees it as DP-3 tho. Even more odd, the
    second monitor is DP-1, which is marked as port #2 on the bracket. I
    can't make heads or tails of that mess. o_O

    Yes, this numbering incongruity between physical and logical ports is quite strange. o_O


    I did change how I plan to lay out the monitors tho. From the primary monitor as a starting point, the second monitor, which I use for handling
    large volumes of files and such, will be above the primary monitor. My TV
    will be to the right of the Primary monitor. The reason for that is
    mostly the physical layout. The monitor stand came in and I'll be
    putting the primary monitor on the bottom and second monitor on top of
    it. The TV can just go anywhere config wise but it has been to the
    right for so long, when I need my mouse pointer over there, habit makes
    me push the mouse to the right. It's as good a place as any.

    At first, I had the second monitor to the right of primary but then it
    hit me, dragging the mouse pointer, and files, to the right to go up to
    the top monitor seems kinda odd. Plus, for a long time now, the TV has
    been there on the right. I rearranged things a bit. Given the physical layout, it makes more sense this way. While I'm thinking on this. I
    may turn off the second monitor at times. Should I add an option to
    xorg.conf to make sure it doesn't go weird on me? I wouldn't want it to
    move my TV location for example. I'd just want it to power off but not affect anything else. I'd close all the apps first tho. I'd also like
    it to have the right settings if it has been off a while and I turn it
    on to use it. I'm not sure how hotpluggable monitors are.

    I have not observed any discrepancy when a monitor is switched off/on at the time of booting or thereafter, but I've used the Plasma Display settings to configure the monitors' positions, and in any case the desktop here is on Plasma-Wayland. Therefore your experience may differ.




  • From Michael@21:1/5 to All on Sun Jul 14 10:01:17 2024
    On Sunday, 14 July 2024 06:08:27 BST Dale wrote:

    I then plugged the TV into the new rig. KDE saw that right away and
    popped up a screen wanting to know what to do. I just closed it and
    went to KDE settings then Display and Monitor section. I arranged the monitors like I wanted, several times. It would pop up a window asking
    if I wanted to keep the settings. Problem is, finding where the mouse pointer went. After three or four tries, I finally was able to hit the
    Keep button before it reverted back and I had to set it up again. I did
    set it up as described earlier. It's early on yet but it worked. Next
    is setting up xorg.conf for this. Gotta add the TV as "Right of" and
    all that.

    Do you even need an xorg.conf at all, if the Plasma Display settings can set
    up your monitors/TV reliably, as you want them?


    I did run into an error when trying to copy a couple of video test files
    from a USB stick to the desktop. The error was this: 'The file or
    folder message recipient disconnected from message bus without replying
    does not exist.' It was confusing to say the least. It reads like it
    went through a translator or something. I checked to make sure
    everything was mounted rw correctly, checked permissions and such.
    Everything was set correctly. Did a search and dug out a thread on the forums. It said the kernel had to have the Fuse driver enabled. I'm
    not sure why but OK, I enabled and rebuilt the kernel. When I booted
    the new kernel, I could copy files over. Weird but it works, cool. :-)

    https://bugs.kde.org/show_bug.cgi?id=473733


    Before connecting the TV and all, I tested the audio. Soundless.
    Earlier, I thought it was detecting nothing plugged into the output
    jack. Well, it appears it just didn't have any devices. I enabled the
    driver the boot media uses and lspci -k showed it as loaded on the new
    install. After a bit of searching, it seemed to be missing some decode
    stuff. It so happens, my old rig and the new rig have almost identical
    audio chips. I just pulled up menuconfig on both in a Konsole and
    enabled the same things on the new rig that I had on the old rig.
    Recompiled the kernel and rebooted. I have sound to the output jacks
    now. That also likely helped me to have audio on the TV as well.

    These links are for the MS Windows OS, but the corresponding settings on Linux should also work for video cards which include audio processing capability:

    https://www.nvidia.com/content/Control-Panel-Help/vLatest/en-us/mergedProjects/nvdsp/To_set_up_digital_audio_on_your_graphics_card.htm

    https://www.nvidia.com/content/Control-Panel-Help/vLatest/en-us/mergedProjects/nvdsp/Set_Up_Digital_Audio.htm

    https://www.xda-developers.com/set-up-nvidia-high-definition-audio-is-it-worth-using/


    I haven't posted much but I've been busy. I also checked the serial
    numbers of the monitors. One ends in 231 while the other ends in 240.
    They are only 9 apart. About as identical as one can get. ;-)

    You can compare the EDIDs in Xorg.0.log to see if they are the same.

    It seems you're making good progress with your new PC & monitors. :-)


  • From Michael@21:1/5 to All on Sun Jul 14 12:25:26 2024
    On Sunday, 14 July 2024 10:44:30 BST Dale wrote:
    Michael wrote:
    On Sunday, 14 July 2024 06:08:27 BST Dale wrote:
    I then plugged the TV into the new rig. KDE saw that right away and
    popped up a screen wanting to know what to do. I just closed it and
    went to KDE settings then Display and Monitor section. I arranged the
    monitors like I wanted, several times. It would pop up a window asking
    if I wanted to keep the settings. Problem is, finding where the mouse
    pointer went. After three or four tries, I finally was able to hit the
    Keep button before it reverted back and I had to set it up again. I did set it up as described earlier. It's early on yet but it worked. Next
    is setting up xorg.conf for this. Gotta add the TV as "Right of" and
    all that.

    Do you even need an xorg.conf at all, if the Plasma Display settings can set up your monitors/TV reliably, as you want them?

    That is likely true unless KDE has a bad update and won't come up. I'd
    like my monitors to come up the same way regardless of what GUI I use.
    I figure xorg.conf is the best way to make sure. At least as sure as I
    can be anyway. That said, it's been a long time since I had a bad KDE update. It may have a minor bug or something but it tends to work OK.

    Yes, an xorg.conf set up as you need it would be universal in its effect, across different DEs.


    Before connecting the TV and all, I tested the audio. Soundless.
    Earlier, I thought it was able to detect nothing plugged into the output jack. Well, it appears it just didn't have any devices. I enabled the
    driver the boot media uses and lspci -k showed it as loaded on the new
    install. It seemed to be missing some decode stuff after a bit of
    searching. It so happens, my old rig and the new rig has almost
    identical audio chips. I just pulled up menuconfig on both in a Konsole and enabled the same things on the new rig that I had on the old rig.
    Recompiled the kernel and rebooted. I have sound to the output jacks
    now. That also likely helped me to have audio on the TV as well.

    These links are for MSWindows OS, but corresponding settings to Linux should also work for video cards which include audio processing
    capability:

    https://www.nvidia.com/content/Control-Panel-Help/vLatest/en-us/ mergedProjects/nvdsp/To_set_up_digital_audio_on_your_graphics_card.htm

    https://www.nvidia.com/content/Control-Panel-Help/vLatest/en-us/ mergedProjects/nvdsp/Set_Up_Digital_Audio.htm

    https://www.xda-developers.com/set-up-nvidia-high-definition-audio-is-it-worth-using/
    Well, this was about codec support. It seems I had none of them
    available. I likely enabled more than needed, but if one isn't needed, it
    just gets ignored, so I've read anyway. This is a short list.

    [*] Support initialization patch loading for HD-audio
    <*> Build Realtek HD-audio codec support
    <*> Build Analog Devices HD-audio codec support
    <*> Build IDT/Sigmatel HD-audio codec support
    <*> Build VIA HD-audio codec support
    <*> Build HDMI/DisplayPort HD-audio codec support
    <*> Build Cirrus Logic codec support
    < > Build Cirrus Logic HDA bridge support
    <*> Build Conexant HD-audio codec support
    <*> Build Creative CA0110-IBG codec support
    <*> Build Creative CA0132 codec support
    <*> Build C-Media HD-audio codec support
    <*> Build Silicon Labs 3054 HD-modem codec support


    With those and a few others, it works. I suspect the Realtek is the one needed but it may need the HDMI one as well. I just set the same as on
    my main rig. It works. It seems the card itself had the right driver,
    just not the bit that tells the card how to process the sound.

    You can build them as modules, see which of these are loaded successfully and disable the rest. Some are generic, e.g.

    "Build HDMI/DisplayPort HD-audio codec support"

    Say Y or M here to include HDMI and DisplayPort HD-audio codec support in the snd-hda-intel driver. This includes all AMD/ATI, Intel and Nvidia HDMI/DisplayPort codecs.

    Others are more specific to individual OEM chips, like e.g. Realtek with its proprietary audio converter driver.
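    A quick way to audit this is to filter lsmod for the codec modules that actually bound to the hardware. The excerpt below is hypothetical, but the filter works the same on real output:

```shell
# Hypothetical lsmod excerpt -- on a real system, pipe `lsmod` itself.
# Only the snd_hda_codec_* entries that appear here need to stay enabled
# in the kernel config; the rest can be dropped.
lsmod_sample='Module                  Size  Used by
snd_hda_codec_realtek 163840  1
snd_hda_codec_generic 102400  1 snd_hda_codec_realtek
snd_hda_codec_hdmi     81920  1
snd_hda_intel          57344  5'
printf '%s\n' "$lsmod_sample" | grep '^snd_hda_codec_' | awk '{ print $1 }'
```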


    It's getting close. Any day now. It has been an adventure. That pesky
    LG monitor, or that HDMI cable, caused some issues tho. I'm looking to
    buy either 4K or 8K cables next. Be ready for the future. ;-)

    I think the future is DP rather than HDMI; DP works with AMD's FreeSync and Nvidia's G-Sync, assuming both card and monitor support this. You can also daisy-chain monitors from a single DP connection. However, cable bandwidth and functionality requirements are dictated by the specification of the devices
    you connect to your card. A 16K-capable DP 2.1 cable, or a 10K-capable HDMI 2.1b one, won't make any difference when you're connecting 1920x1080 (i.e. sub-2K) display panels. I don't buy cables often, so I tend to buy the highest specification available at the time to future-proof, as monitors and TVs are increasingly made available at higher resolutions and refresh rates.


  • From Michael@21:1/5 to All on Mon Jul 22 10:22:51 2024
    On Sunday, 21 July 2024 16:20:54 BST Frank Steinmetzger wrote:
    Oopsie, I found this mail in my drafts folder just now, where it’s been sitting since the ninth. Perhaps I had to pause writing, but now I can’t remember anymore. So I’ll just send it off. ;-)

    Am Tue, Jul 09, 2024 at 12:02:47AM +0100 schrieb Michael:
    On Monday, 8 July 2024 21:21:19 BST Frank Steinmetzger wrote:
    Am Mon, Jul 08, 2024 at 06:26:26PM +0100 schrieb Michael:
    Back to the previous topic, I have not yet found a case where changing the
    scale by means of the desktop settings arrives at non-blurred fonts. The
    clearest, sharpest fonts are always rendered at the native monitor resolution, at a 100% scale setting. Am I missing a trick, or is this to
    be expected?

    That doesn’t really make sense. Fonts are always rendered natively, no matter what size. Except if they are really rendered at 100 % and then the
    rendered bitmap is scaled by the GPU or somesuch.

    Or because their hinting information is limited to a certain size range. This info gives the renderer special knowledge on how to render the glyphs.

    Do you have screenshots?

    I attach two screenshots, one at 100% and one at 90%. When viewed on the actual 1366x768 monitor they are worse than what the screenshots have captured. Perhaps I need to take a photo of the monitor. Anyway, if you view them on a 1920x1080 monitor you should hopefully see the difference. The font DPI is 96.

    I can see it. I use 2560×1440, but viewing an image pixel-perfect is not dependent on the screen’s resolution per se, but on it being run at its native resolution. So that one pixel in the image is actually displayed by one pixel on the screen without any scaling-induced blurring.

    I have no real explanation for the fonts. Do they also get blurry at scales bigger than 100 %?

    I'll check this when I'm next at that PC.


    The only thing I can say is that I use a font setting of
    slight hinting with no RGB subpixel rendering. The latter means that I don’t
    want the coloured fringes, but prefer greyscale anti-aliasing instead. See my screenshot. 96 dpi (100 % scaling), main fonts set to 11 pt.

    I can see the slight hinting on your screenshot. On a same resolution monitor (2560×1440), I have:

    General font: Noto Sans 10pt
    Fixed width: Hack 10pt
    RGB sub-pixel rendering
    Slight hinting

    and mine look (very slightly) less blurred to the naked eye. However, this may have to do with the choice of font and of course the monitor panel construction.

    Something else which affects font rendering is the selection of fontconfig .conf files. A lot of mine are unset - not sure which I should/shouldn't have enabled:

    ~ $ eselect fontconfig list
    Available fontconfig .conf files (* is enabled):
    [1] 05-reset-dirs-sample.conf
    [2] 09-autohint-if-no-hinting.conf
    [3] 10-autohint.conf
    [4] 10-hinting-full.conf
    [5] 10-hinting-medium.conf
    [6] 10-hinting-none.conf
    [7] 10-hinting-slight.conf *
    [8] 10-no-antialias.conf
    [9] 10-scale-bitmap-fonts.conf *
    [10] 10-sub-pixel-bgr.conf
    [11] 10-sub-pixel-none.conf *
    [12] 10-sub-pixel-rgb.conf
    [13] 10-sub-pixel-vbgr.conf
    [14] 10-sub-pixel-vrgb.conf
    [15] 10-unhinted.conf
    [16] 10-yes-antialias.conf *
    [17] 11-lcdfilter-default.conf *
    [18] 11-lcdfilter-legacy.conf *
    [19] 11-lcdfilter-light.conf *
    [20] 11-lcdfilter-none.conf
    [21] 20-unhint-small-dejavu-sans.conf *
    [22] 20-unhint-small-dejavu-sans-mono.conf *
    [23] 20-unhint-small-dejavu-serif.conf *
    [24] 20-unhint-small-vera.conf *
    [25] 25-unhint-nonlatin.conf
    [26] 30-metric-aliases.conf *
    [27] 35-lang-normalize.conf
    [28] 40-nonlatin.conf *
    [29] 45-generic.conf *
    [30] 45-latin.conf *
    [31] 48-spacing.conf *
    [32] 49-sansserif.conf *
    [33] 50-user.conf *
    [34] 51-local.conf *
    [35] 57-dejavu-sans.conf *
    [36] 57-dejavu-sans-mono.conf *
    [37] 57-dejavu-serif.conf *
    [38] 60-generic.conf *
    [39] 60-latin.conf *
    [40] 60-liberation.conf *
    [41] 61-urw-bookman.conf *
    [42] 61-urw-c059.conf *
    [43] 61-urw-d050000l.conf *
    [44] 61-urw-fallback-backwards.conf *
    [45] 61-urw-fallback-generics.conf *
    [46] 61-urw-fallback-specifics.conf *
    [47] 61-urw-gothic.conf *
    [48] 61-urw-nimbus-mono-ps.conf *
    [49] 61-urw-nimbus-roman.conf *
    [50] 61-urw-nimbus-sans.conf *
    [51] 61-urw-p052.conf *
    [52] 61-urw-standard-symbols-ps.conf *
    [53] 61-urw-z003.conf *
    [54] 65-fonts-persian.conf *
    [55] 65-khmer.conf
    [56] 65-nonlatin.conf
    [57] 66-noto-mono.conf *
    [58] 66-noto-sans.conf *
    [59] 66-noto-serif.conf *
    [60] 69-unifont.conf *
    [61] 70-no-bitmaps.conf
    [62] 70-yes-bitmaps.conf
    [63] 75-noto-emoji-fallback.conf *
    [64] 80-delicious.conf *
    [65] 90-synthetic.conf *

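    For what it's worth, those eselect entries just toggle symlinks to the system-wide snippets under /etc/fonts/conf.d. The same choices can be overridden per user with a fontconfig file. A minimal sketch mirroring the starred selections above (slight hinting, antialiasing on, sub-pixel rendering off) - adjust to taste, and note the path assumes the usual XDG location:

    ```xml
    <?xml version="1.0"?>
    <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
    <!-- ~/.config/fontconfig/fonts.conf - per-user overrides; these take
         precedence over the /etc/fonts/conf.d snippets for this user. -->
    <fontconfig>
      <match target="font">
        <edit name="antialias" mode="assign"><bool>true</bool></edit>
        <edit name="hinting"   mode="assign"><bool>true</bool></edit>
        <edit name="hintstyle" mode="assign"><const>hintslight</const></edit>
        <!-- "none" disables sub-pixel (LCD) rendering; use "rgb" etc. to enable -->
        <edit name="rgba"      mode="assign"><const>none</const></edit>
      </match>
    </fontconfig>
    ```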

    I used to use full hinting in my early (KDE 3) days, which gives sharp 1-pixel lines, because I was used to the crisp look of non-anti-aliased fonts on Windows. But for many years now I’ve been using only slight hinting, so fonts look more “real-worldy”: natural and not as computer-clean. I think that’s something I picked up during the few times I looked at a Mac screen or screenshot (I’ve never sat at one for a longer time myself).


    PS: Do you really still use KDE 4, or is it just Oxygen on Plasma 5? I kept using the Oxygen icons in Plasma 5, but more and more of them are no longer updated, so I got wrong icons or placeholders, and I bit the bullet and switched to Breeze. :-/
    On second thought, I think I can answer that myself, because the blurred icons give it away. With Plasma 6, global scaling affects not only fonts but the entire UI. I wish this could be disabled, because it is the actual reason I can no longer keep using a custom DPI setting. The UI just becomes ugly, with far too much spacing and those blurry icons.
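    By “custom DPI setting” I mean the old X11 trick of pinning the font DPI independently of any UI scaling, typically via Xresources. A sketch of that, purely for illustration - it has no effect on a Wayland session like the one quoted below:

    ```
    ! ~/.Xresources - force the font rendering DPI under X11 only
    ! (merge with: xrdb -merge ~/.Xresources)
    Xft.dpi: 96
    ```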

    I am still on (mostly) stable portage:

    Operating System: Gentoo Linux 2.15
    KDE Plasma Version: 5.27.11
    KDE Frameworks Version: 5.116.0
    Qt Version: 5.15.14
    Kernel Version: 6.6.38-gentoo (64-bit)
    Graphics Platform: Wayland

    with Oxygen selected as the Global Theme and Oxygen on all relevant selections below that, under Appearance in SystemSettings. The Oxygen application icons look clear on my 2560×1440 desktop at 96x96 DPI. I'm glad of this, because I really dislike the Breeze theme and its application icons - too close to the MS Windows flat icon style. :p

