Howdy,
New monitor came in. Found it on my porch. Unboxed it and hooked it
up. Started with no xorg.conf file. It's high res, plasma comes up so
all that is fine. Now comes the problem. Everything is HUGE. Even on
SDDM screen, it's like it is zoomed in or something. Icons are huge,
fonts are huge. I think what it is doing is this, and it may be the
video card: it thinks I have four monitors set up as one large screen.
With only this one monitor connected, I'm seeing just the top left
quarter and not the rest. I might add, when I click to logout, that
screen is huge too.
I generated an xorg.conf with the nvidia tool. No change. I tried
removing options that I thought might cause this largeness problem. I
even rebooted. No change. I opened the Nvidia GUI software and tried
adjusting things there. No change. I compared the settings to my main
rig and, other than the difference in the cards, all settings appear to
be the same, including what should be displayed where. The resolution
is correct too.
Is it possible this card functions differently than my usual cards?
Someone mentioned these tend to be used in a business system which could
mean they use four monitors for one large display, for presentations or something. Is there something I need to disable or enable so the card
knows I want independent monitors maybe?
I'm attaching the xorg.conf file, including options I commented out. I
also checked the logs, no errors or anything. It's just the usual
things like mouse and keyboard loading and it finding the monitor
connected, unlike the old LG. Maybe someone here has run into this at
work or for a client and has an idea on how to fix it, or knows that
this card is not designed for my use.
While I'm waiting on a reply, I'm going to try one of my older spare
video cards. If it works fine, it could be that the new video card is
set up to work this way. I may not even be able to change it.
Thanks.
Dale
:-) :-)
Another update. I rebooted several times to see whether things would
be consistent. Most of the time, it came up as it should. Sometimes,
not so much. When I had just the new Samsung monitor connected,
it was consistent. When I added the old LG, it would not always come up
like it should. The biggest thing, the plasma panel would be on the
wrong monitor.
I tried using xrandr to set this but it kept changing which monitors were connected where, which would throw off which monitor got what priority.
Finally, I removed the old LG. It has caused enough grief already. I unhooked the TV cable for my bedroom TV and connected it to the new
rig. I then booted. I installed a package called arandr. It's a
sister to xrandr but GUI based. Makes it very easy to see what is
what. On the first boot, the Samsung showed as connected to port 1.
The TV showed as port 3 I think. It seems each port can do two displays
so it kinda skips. The first port is actually 0. Anyway, I used arandr
to set it up like I wanted. I saved the file with the command in my
home directory. I then moved the command to a file in
/etc/X11/xinit/xinitrc.d/. The files there usually start with a
number in the file name. Don't forget to add the bash bit on the first
line if needed and make the file executable as well. Once I did that, the
displays worked like they should. So far at least.
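For reference, the end result was something like this. It's a sketch from
memory, so the port names and file name are from my setup and will differ
on other systems:

    $ cat /etc/X11/xinit/xinitrc.d/20-monitors.sh
    #!/bin/sh
    # Layout saved from arandr (it writes to ~/.screenlayout/ by default),
    # then copied here so it runs when X starts.
    xrandr --output DP-1 --primary --mode 1920x1080 --pos 0x0 \
           --output DP-3 --mode 1920x1080 --pos 1920x0
    $ chmod +x /etc/X11/xinit/xinitrc.d/20-monitors.sh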
The lesson to be learned is this. When you have a monitor that is
having issues and keeps showing as connected to different ports and
such, you can't use that display to get a reliable configuration that
will survive a reboot, maybe even a power off and back on. Another
thing: using either xrandr or arandr is a nifty feature if set up
correctly. Those two make it so a display, or more importantly a set of
displays, works like you want. The arandr command, since it is a GUI,
makes it a lot easier to create the xrandr command with the right
options. If you use that route tho, make sure all monitors are
connected and on before starting. You may be able to do it without that
with xrandr, but arandr needs the monitor to be on and working. The other
thing, putting the setting in /etc/X11/xinit/xinitrc.d/ seems to work
pretty well. So far at least.
To be honest tho, I wish Nvidia would generate a conf file that contains
both monitors and I could set it up properly there. Then when I boot
up, it reads that file and knows what monitor is what long before DM
and/or sddm even starts. It could also keep a monitor powered on even
while on a console with nothing GUI running. I kinda wish we could do
it like we did back in the old days.
I also had another thought. When changing the xorg.conf file, I wonder
if it only reads that file when loading the nvidia drivers but not when
DM is started/restarted. I noticed on my system, when I booted but had
not started DM, the Nvidia drivers were already loaded. I'm not sure
when the xorg.conf file is loaded, but if it is loaded when the drivers
load, then that could explain why some changes didn't show up on the
display. The changes were not seen unless I rebooted, which I didn't
always do. Maybe someone here knows what order this happens in. It
could explain a lot tho.
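One way to check, I think, is the X log. Each time the X server starts it
logs which config it parsed, something like this (output illustrative):

    $ grep -i "using config" /var/log/Xorg.0.log
    (==) Using config file: "/etc/X11/xorg.conf"
    (==) Using config directory: "/etc/X11/xorg.conf.d"

If a fresh line like that shows up every time DM restarts, the file is
being re-read when the X server starts, not when the kernel driver loads.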
I'm hoping all this will help someone. It sure has been a hair puller
for me. LOL
Mark Knecht wrote:
On Tue, Jul 2, 2024 at 12:44 PM Dale <rdalek1967@gmail.com> wrote:
<SNIP>
I tried it with those options and without. Neither changed anything. I originally tried it with no xorg.conf at all. I was hoping maybe the Nvidia GUI thing would adjust things. I may try that again. No xorg.conf and use the GUI thing. That's what I use to set up my TV and such anyway. Thing is, the sddm screen is HUGE too.
<SNIP>
:-) :-)
    xdpyinfo | grep -B2 resolution
I booted my new rig up again. Dang that thing. It was HUGE again. I
started reading stuff, mainly about xorg.conf and the available
settings. I changed all sorts of stuff, including some things Michael
suggested. I restarted DM each time. I was about ready to toss it in
the old minnow pond, that's where everything goes to die here. Lots of
CRT monitors in there. LOL Anyway, I had to install that package to
run that command. It spit out an oops when I tried to run it after a
copy and paste. I also installed it on my main rig, just to compare.
On the new rig, the DPI was a fairly large number. I thought I had the
output saved but it seems to be gone. My main rig tho showed 80x80 dots
per inch. I did a duck search and finally found how to set that. I then
restarted DM and YEPPIE!!! It was a normal size again.
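For anyone following along, this is the kind of output that command gives.
The numbers here are illustrative since my saved output is gone:

    $ xdpyinfo | grep -B2 resolution
    screen #0:
      dimensions:    1920x1080 pixels (610x343 millimeters)
      resolution:    80x80 dots per inch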
Now the monitor on my main rig is a bit older too. Maybe 6 or 7
years??? Should newer monitors be set to a higher number for DPI? Is
that normal? Why was it using such a high number by default? I want to
say one was like 200 or something. It was quite large. The reason I'm asking, I may need to set something else to make the screen the right
size but let it use that larger dpi number, if that is what the newer
monitor prefers to use.
Now to reboot, see if I have thoughts of that minnow pond again. :/
Dale
:-) :-)
Michael wrote:
On Saturday, 6 July 2024 10:59:30 BST Dale wrote:
<SNIP>
I'm struggling to follow your post because you do not provide specific information on the commands you input, the output you get in your terminal and the observed changes in the monitor.
You also don't provide info on the changes you made in your xorg.conf, or xrandr and the corresponding changes observed each time in your
Xorg.0.log.
Strictly speaking, the pixel density of an on-screen digital image is referred to as Pixels Per Inch (PPI), but the term DPI which refers to a printed image of ink Dots Per Inch has stuck.
In addition, there is the physical pixel density of your monitor and the rendered pixel density of the X11 image(s). Tweaking the latter allows
you to scale the display and make images look larger than the native monitor resolution.
You can set the DPI in your xorg.conf, or you can set it with xrandr, or you can set it on the CLI when you launch X, but usually this is not necessary and could mess up the scaling of your fonts, window decorations and symbols too (the font DPI is set differently, by setting Xft.dpi: in ~/.Xresources, or in the window manager's/DE's font settings).
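For example, any one of these would do it; 96 is just a common value:

    xrandr --dpi 96                       # at runtime, for the running session
    startx -- -dpi 96                     # passing the X server its -dpi flag
    echo 'Xft.dpi: 96' >> ~/.Xresources   # font DPI only, read via xrdb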
A good starting point is to get the manual of your monitor and look at its published native resolution, e.g. 1920x1080 and the physical monitor size over which this resolution is displayed. Let's assume this 1920x1080 native resolution belongs to a 23" monitor. A 23" diagonal would correspond to a 20" wide screen real estate. Consequently the horizontal PPI would be:
PPI = 1920 pixels / 20" = 96
The same resolution on a 24" wide monitor would give a PPI of:
PPI = 1920 pixels / 24" = 80
Obviously a physically wider 24" monitor with the same native screen resolution as a smaller 20" monitor will not look as sharp when viewed from the *same* distance.
Similarly, changing the selected resolution on the same 23" monitor from say 1920 pixels wide to a lower resolution of 1280 pixels gives a PPI of 64.
I leave the calculation of the vertical PPI to the reader.
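(Spoiler, for the 23" example above: a 16:9 screen with a 23" diagonal is roughly 11.3" tall, so PPI = 1080 pixels / 11.3" ≈ 96, matching the horizontal figure.)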
Usually I start with no xorg.conf and leave the card to detect what the monitor prefers, then use the Scale setting in the desktop settings to increase/decrease (zoom in/zoom out) the displayed scale. This has the effect of altering the PPI to higher or lower values to improve
readability of content. The above should help you arrive at some
practical resolution, but I would start with the native resolution of the monitor and work down from there if you find it difficult to read its display.
NOTE: Using Qt scaling can mess up window decorations, widgets, etc. I've found it doesn't work well with some KDE applications and their menus/submenus, or pop up windows. You need to set PLASMA_USE_QT_SCALING=1 to make it follow Qt scaling, and there's GTK3 too which may need tuning.
This is the reason I calculate PPI before I venture into buying a new monitor, unless I can see it in person to make sure I can still read its content. ;-)
The reason I picked Mark's post is that I used the command he gave to
find out that the DPI info was different from my main rig. When I first
booted up and started DM, I got that HUGE screen again. It worked last
time. I hadn't changed anything. I sometimes wonder still if it reads xorg.conf each time. Anyway, when it didn't work, I started reading up
a bit. I tried several things including checking options you posted but nothing worked. It stayed HUGE. Then I ran the command Mark gave and noticed the difference in DPI between my main rig and the new rig. I
then found out how to set that in xorg.conf and set it the same as my
main rig. As soon as I restarted DM, the screen came up the correct
size. The HUGE part was gone. When I rebooted, it was still the normal size. It also worked each time I restarted DM. The only change is
setting DPI. Like this:
Section "Monitor"
Identifier "Monitor0"
VendorName "Unknown"
ModelName "Samsung LS32B30"
Option "PreferredMode" "1920x1080_60.00"
Option "DPMS"
Option "DPI" "80 x 80"
I just booted the new rig again and it has a normal display. None of
that huge stuff. I think I've rebooted like three times now and it
worked every time. I think that is the most reboots with a config that
works since I built this rig. Usually, it works once, maybe twice, then
fails. Later on, it might work again. This machine is like rolling
dice. You never know what you're going to get. Three consecutive reboots
with it working gives me hope on this one. I won't be surprised if when
I hook up a second monitor or the TV that it breaks again tho. ;-)
My only question now, is that a good setting or is there a better way to
make sure this thing works, each time I reboot?
Dale
:-) :-)
Michael wrote:
On Saturday, 6 July 2024 17:11:23 BST Dale wrote:
Michael wrote:
On Saturday, 6 July 2024 10:59:30 BST Dale wrote:
Now the monitor on my main rig is a bit older too. Maybe 6 or 7
years??? Should newer monitors be set to a higher number for DPI?
Strictly speaking, the pixel density of an on-screen digital image is
referred to as Pixels Per Inch (PPI), but the term DPI, which refers to a
printed image of ink Dots Per Inch, has stuck.
In addition, there is the physical pixel density of your monitor and the
rendered pixel density of the X11 image(s). Tweaking the latter allows
you to scale the display and make images look larger than the native
monitor resolution.
Is this your monitor?
https://www.samsung.com/us/business/computing/monitors/flat/32--s30b-fhd-75hz-amd-freesync-monitor-ls32b300nwnxgo/#specs
If the screen is 27.5" wide and 15.47" high, then at a native 1,920 x 1,080 pixel resolution the DPI would be approx. 70x70. However, if you're happy with the way it looks @80x80, then that's a good setting. After all, you're
the one looking at it! :-)
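(That is, using the same formula as before: 1920 pixels / 27.5" ≈ 70 and 1080 pixels / 15.47" ≈ 70.)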
Actually, mine is a LS32B304NWN. I'm not sure what the difference is
between 300 and 304. There may just be a minor version change but the
display is the same.
It's hi res and a good deal. :-D
Compared to the HUGE display, yea, it looks good. The reason I was
asking if that is correct is this: maybe it should be set to, just
guessing, 128 x 128, but some other setting makes the picture the right
size, not HUGE. If 70 x 70 or 80 x 80 is a setting that the monitor is
designed for and ideal, then that is fine.
Monitors, even the old CRTs, have resolutions and settings they work
best at.
I read once where a
person had a monitor that had a great picture at 60Hz refresh. Even tho
it would work at 75Hz, the picture wasn't as good. It seems that
something didn't like that 75Hz setting. That person used the 60Hz setting. Some things are picky that way. Higher isn't always better.
I may try that 70 setting. Odds are, just like the difference between
60 and 75Hz refresh rate, I likely won't be able to tell the
difference. Time will tell tho.
By the way, I booted the rig up when I went to heat up supper and was downloading new messages. It booted to a normal screen. I think it is
at least being consistent now. Before, it was hit or miss, mostly
miss. Given how good things are at just working, I'm surprised that the correct setting wasn't used automatically. I'd think it should be.
Maybe that is a bug????
Now to go eat supper. I hope the monitor comes in soon.
I read once where a
person had a monitor that had a great picture at 60Hz refresh. Even tho
it would work at 75Hz, the picture wasn't as good. It seems that
something didn't like that 75Hz setting. That person used the 60Hz
setting. Some things are picky that way. Higher isn't always better.
How long ago was that? If it was in the VGA era, maybe the analog circuits weren’t good enough and produced a bad signal.
It's hi res and a good deal. :-D
Please define hi res. Full HD at 32″ is definitely not hi res. ;-P
It’s about as much as CRTs back in the day, close to 1024×768 at 17″.
Well, I still consider 1080P hi res. That's what I get for any monitor
or TV I buy. The biggest thing I have is a 32" tho. My rooms are kinda small. No need for a 60" TV/monitor.
Now to go eat supper. I hope the monitor comes in soon.
I’m confused. I thought the new one has already arrived and is the one
where everything was HUGE. %-)
I ordered a second identical monitor. I've been wanting two monitors for
a while. On occasion, when I have hundreds of files to process manually,
I need a second monitor just to stick a file manager on and drag files
from one directory to another while being able to see both at the same
time.
A second monitor will help with this. Plus, I have a spare as
well. So, first monitor is here and fixed a lot of problems except it
added a new one, being HUGE. Now that is fixed as well. When I connect the second monitor, I should be able to set it up the same way except connected to a different port.
The biggest thing I dread right now, cleaning off my desk. -_o
On Sun, Jul 7, 2024 at 1:09 PM Frank Steinmetzger <Warp_7@gmx.de> wrote:
On Sat, Jul 06, 2024 at 07:32:49PM -0500, Dale wrote:
<SNIP>
Well don’t mix up frame rate and scaling. 75 Hz vs. 60 is quite subtle, you
might not even notice 90 Hz. But changing DPI from 80 to 70 will mean an
increase in fonts by 14 %.
So I understand the 14% calculation, but help me understand the underlying technology. Is the DPI how a font file, which I presume is some fixed size, like 25x25, gets scaled onto the screen? I'm not clear about the conversion from the font to the number of dots used to draw the font on the screen.
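(My rough mental model, which may be off: a glyph specified in points is rasterised at about points × DPI / 72 pixels, so moving between 80 and 70 DPI changes the rendered size by a factor of 80/70 ≈ 1.14, which would be where the 14 % comes from.)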
I’m confused. I thought the new one has already arrived and is the one where everything was HUGE. %-)
Dale does this at times and I get confused also. He will (the way I read the messages) sometimes be talking about different machines or different monitors. His "main rig", his "new rig", etc.
It's hi res and a good deal. :-D
Please define hi res. Full HD at 32″ is definitely not hi res. ;-P
It’s about as much as CRTs back in the day, close to 1024×768 at 17″.
Well, I still consider 1080P hi res. That's what I get for any monitor or
TV I buy. The biggest thing I have is a 32" tho. My rooms are kinda
small. No need for a 60" TV/monitor.
Well my TV sits over 4 m (that’s 13 feet for the imperialists) away from
the sofa. So I splurged and got myself a 65″ one.
Well, I saw a website once that gave info on distance and monitor size,
and what you are watching can factor in too. It claimed that a 32" is
the ideal size for my room. Given my old eyes tho, a 42" might serve me
better. Thing is, I'm bad to watch old videos from the 80's, 70's and
even 60's. Most of those are 480P or, if lucky, just a little higher
resolution. With those, monitor size can make videos look worse.
It also goes into the other direction: people these days™ watch 4K movies on
their phones. Why, just why? Even if the screen can display it physically, their eyes cannot resolve that fine detail, because the pixels are too small.
On 07/07/2024 23:29, Frank Steinmetzger wrote:
It also goes into the other direction: people these days™ watch 4K movies
on their phones. Why, just why? Even if the screen can display it
physically, their eyes cannot resolve that fine detail, because the
pixels are too small.
What's worse, as far as I can tell, I have no control over the download
resolution! Why would I want to download a video in Full HD, when I only
have an HD Ready screen, and it's over a metered connection! So
basically, I'm paying for resolution I can't see!
Cheers,
Wol
On Sun, Jul 07, 2024 at 05:10:18PM -0500, Dale wrote:
It's hi res and a good deal. :-D
Please define hi res. Full HD at 32″ is definitely not hi res. ;-P
It’s about as much as CRTs back in the day, close to 1024×768 at 17″.
Well, I still consider 1080P hi res. That's what I get for any monitor or
TV I buy. The biggest thing I have is a 32" tho. My rooms are kinda
small. No need for a 60" TV/monitor.
Well my TV sits over 4 m (that’s 13 feet for the imperialists) away from
the sofa. So I splurged and got myself a 65″ one.
Well, I saw on a website once where it gave info on distance, monitor
size and what you are watching can factor in too. It claimed that a 32"
is the ideal size for my room. Given my old eyes tho, a 42" might serve
me better. Thing is, I'm bad to watch old videos from the 80's, 70's
and even 60's. Most of those are 480P or if lucky, just a little higher resolution. With those, monitor size can make videos worse.
This website’s goal probably was about covering your eyes’ natural field of
view. Sitting at my desk, my 27 inch monitor appears only slightly smaller
than my 65 inch TV 4 m away. Watching 50s TV shows will be the same experience on both in those situations.
If you want to fill that entire field of view with details, then naturally,
a 50s TV show in 480p won’t suffice. The more of your viewing arc you want to cover, the more picture resolution you need. You basically want to map
X amount of pixels on each degree of viewing arc. Physical units are great.
It also goes into the other direction: people these days™ watch 4K movies on
their phones. Why, just why? Even if the screen can display it physically, their eyes cannot resolve that fine detail, because the pixels are too
small.
yt-dlp will show you what different resolutions the streaming server
offers and you can then select the one you need/prefer. Of course this is
conditional on yt-dlp being capable of parsing the stream you want to
download and on the streaming server offering different resolutions.
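For example; the URL is a placeholder and the format IDs vary per video:

    yt-dlp -F 'https://www.youtube.com/watch?v=XXXXXXXXXXX'    # list offered formats
    yt-dlp -f 18 'https://www.youtube.com/watch?v=XXXXXXXXXXX' # fetch a specific one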
Frank Steinmetzger wrote:
<SNIP>
Yea. The website at the time was most likely to help people not buy a
TV that is waaaay too large.
I made a DVD of the TV series Combat for my neighbor. That was when he
had a little smaller TV. He said it looked like large blocks on the
screen. He watched it tho. lol He sits about 10 feet from the TV. It
is a nice TV tho. All that smart stuff.
I agree, a device should pick a resolution that it can easily display
without downloading more than it needs. There's really not much need to
put a 4K or even a 1080P video on a small cell phone. Unless a person
is using a magnifying glass, they won't see the difference. I remember
some of my old CRTs that ran at 720P. For their small size, that was
plenty.
Devices send a viewport size to the server to fetch scaled images and fonts as
required, instead of downloading a huge resolution only for it to be consumed on the small screen of a phone or tablet. I'm not sure how the screen size information is shared between server-phone-TV when you mirror your phone on a TV.
I don't know about cell phones but if using the youtube app, I'd think
it would know what you are using and the resolution too.
To me, it
looks like it would be best for everyone to only download what is needed
and no more.
It saves bandwidth of the server, bandwidth for the user
as well. Most people have unlimited nowadays but still, one would think
a company like youtube would see the benefit of only sending enough resolution to get the job done. If they do that for millions of users,
it would have to save them some amount of money. I'd think anyway.
On 08/07/2024 13:27, Dale wrote:
I don't know about cell phones but if using the youtube app, I'd think
it would know what you are using and the resolution too.
BBC iPlayer, ITVx. Along with the other Freeview apps for Channels 4 and 5.
To me, it
looks like it would be best for everyone to only download what is needed and no more.
You'd think. Trouble is - most people DON'T think!
It saves bandwidth of the server, bandwidth for the user
as well. Most people have unlimited nowadays but still, one would think
a company like youtube would see the benefit of only sending enough resolution to get the job done. If they do that for millions of users,
it would have to save them some amount of money. I'd think anyway.
Youtube doesn't (yet) have a monopoly on streaming, fortunately ...
On Monday 8 July 2024 13:59:27 BST Wol wrote:
<SNIP>
I don't know about dedicated services with their own clients, but anything you get via a web browser is tailored to your screen. That was so when I
was operating a web site, anyway.
Back to the previous topic, I have not yet found a case where changing the
scale by means of the desktop settings arrives at non-blurred fonts. The
clearest, sharpest fonts are always rendered at the native monitor
resolution, at a 100% scale setting. Am I missing a trick, or is this to
be expected?
New subthread. Slightly new situation.
[snip ...]
When I first plugged up the new monitor, using the cable from the old
LG, it wouldn't power up from standby. It wouldn't even when I
connected only the new monitor. The BIOS would beep that it can't find
a display as well. I thought at first I had a broken monitor: getting
power but won't wake up. I tried another cable and it worked. So the
first cable, the same one I used on the LG monitor, didn't work at all
with the new monitor. It might be that the cable has a problem, not the
LG monitor itself. I didn't think of that. I've used that cable quite
often before. I'm not sure how old that cable is but it may find a
trash can.
On to the new monitor. I'm trying to decide whether to use xrandr and
friends to set this up or xorg.conf. Using both seems to cause a bit of
a clash and KDE isn't helping any. I'd kinda like to go the xorg.conf
route. I think, but I'm not sure, that xorg.conf is read very early on
and the xinit files later; it could be the other way around. I'm also
pretty sure that if it is set up in xorg.conf, it would work if I logged
into another GUI: Gnome, Fluxbox or some other flavor. It could be that
xrandr and friends would as well.
Current situation config wise. The first problem I noticed, the
monitors are nearly identical. Even the serial numbers are only a few
digits off and that's the only difference I can see. I did some
searching and was wanting to set an option in the xorg.conf Monitor
section that identifies the monitors by not only model but also serial
number. That way I could set one to the right or left of the other, or
above/below, and it would know specifically which monitor is which, even
if plugged into a different port. I can't find an option for serial
number yet. I did find where someone else wanted to do the same a couple
years ago but sadly, no answer to that question. So, if that is not
doable, I may have to specify the port number. If I ever have to
disconnect and reconnect, getting the order right could prove
interesting. ;-)
Right now, I have this:
root@Gentoo-1 ~ # cat /etc/X11/xinit/xinitrc.d/20.xrandr
xrandr --output DP-0 --off \
       --output DP-1 --mode 1920x1080 --pos 1920x0 --rotate normal \
       --output DP-2 --off \
       --output DP-3 --primary --mode 1920x1080 --pos 0x0 --rotate normal \
       --output DP-4 --off --output DP-5 --off \
       --output DP-6 --off --output DP-7 --off
root@Gentoo-1 ~ #
From what I've read, that is the correct place for that command. If
not, please let me know where it should be. Keep in mind, putting it in
/usr somewhere gets overwritten when the package providing that file
gets updated.
I'm also attaching the current xorg.conf file. Keep in
mind, gotta add the TV later on. I think if I can get the monitors set
up, I can add the TV pretty easily. Even if it only works in KDE, that
is fine since I need KDE up and running to use the TV anyway. I'm
mostly needing to know if there is a way to add the serial number in
xorg.conf. I think the rest is OK. I also need a command to get what
the system sees as the serial number, in case it only sees a part of it,
the last few digits or something.
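In case it helps others, this is what I've found so far for reading the
serial the system sees. It assumes the edid-decode tool is installed and
that the kernel exposes the EDID in sysfs, which may not happen with every
driver; the connector name is from my setup:

    edid-decode /sys/class/drm/card0-DP-1/edid | grep -i serial

xrandr --props will also dump the raw EDID block per output, which can be
fed to edid-decode by hand.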
I also had to argue with KDE about which is primary. At first, like
with the LG, it wanted to make the second monitor the primary despite
the first monitor being marked primary. I did the old set it backwards, apply and then set it back the way I want it trick. It took it a second
but KDE reversed it. Main screen went to the primary display. This is
what I mean by it clashing and me setting the displays up in xorg.conf
and KDE hopefully getting its info from that and adjusting itself.
Plus, if I use a different GUI, it should work too.
These new monitors sure are nice. My old eyes like them. o_O Oh, I
was going to copy over the KDE config directories. I think I'm going to
take the opportunity to give KDE a fresh start. I think those old
config files date back to the switch to KDE4. We're on KDE6 now.
If anyone knows if/how to set up xorg.conf with a monitor serial number,
please chime in. Other info welcome as well.
Dale
:-) :-)
It sounds like you recommend using xorg.conf and not xrandr. I was
thinking that using both would also cause a clash. Basically, I need
one tool to do this. That's why I picked xorg.conf for the long term;
xrandr is just for now or a second option. I may comment out that
command and reboot, to see if it is the xorg.conf file doing the work or
xrandr.
I think we talked about this maybe off list. On my old machine, when
sddm comes up, the password field on the second monitor shows the dots,
TV in my case. On the new machine, both monitors show the dots for the password. I'm not sure what is different tho. It did that even before
I set the primary option. I like it that way myself but makes me
curious why my main rig is different. It seems the new rig sends the
same screen to both monitors. Once logged into KDE, it splits into two monitors. My main rig it seems is always two separate screens.
I got some things going on. I'll read the email closer later and make
some changes. I'll post back then. Oh, so far, it shows several
packages headed in the right direction. The monitor stand left a small
hub and when it leaves there, it almost always gets delivered that day.
So, I may get the monitor stand today. The new /home hard drive is on
the right path too. I'm expecting quite a lot of packages. While
proofing this, got text from USPS that stand and several other packages
are out for delivery. UPS updates a little later.
Oh, in /etc/X11/xorg.conf.d/, when the files are numbered, does it read
them from low to high? If I set an option in one file but set the same
option differently in another file, which one does it apply? Or does it
not apply either?
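(From what I've read so far, and I'm not certain of the override rules,
the directory is read in plain sort order, low to high, with the files
treated as if appended to xorg.conf, e.g.:

    /etc/X11/xorg.conf.d/10-monitor.conf   # parsed first
    /etc/X11/xorg.conf.d/20-dpi.conf       # parsed after

so I'd avoid setting the same option in two files at all.)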
Thanks for the info. :-D Will work on it shortly.
Dale
:-) :-)
Michael wrote:
As far as I know SDDM is using the file(s) in /usr/share/sddm/scripts/ to
start a login GUI. I haven't looked into how far these can be tweaked for
a dual monitor setup and if they even have a 'primary' monitor concept.
I've never really looked into it either. I mentioned it because it
seems something has changed. On my old rig, it seems to have kept some
setting somewhere but on new installs, it uses a new setting which we
may both like better. Luckily one of my TVs is in the same room so I
can see the screen. If however, you have a second monitor that you
can't see, it may be worth looking into and setting it to the new way.
It could be that someone reading this long thread would also like to
know to do the same. ;-)
I found the man page and another web page with a ton of info on
options. Link below in case others want to bookmark it. Some of them I
have no idea what they do. Even the descriptions for some settings make
no sense, since the terms used are things I've never heard of. I doubt
I need those anyway, thank goodness. Anyway, I've been playing with
this thing a bit. I made a simple change in xorg.conf just to see if it
worked or not without changing anything else. I added this to the
options for the second monitor:
Option "Above" "DP-3"
I'll see how that works. May try another GUI too, Fluxbox or something.
For some reason tho, the port numbers are still odd, consistent but
odd. The primary monitor is plugged into the lowest port, the one with
#1 stamped on the bracket. It sees it as DP-3 tho. Even more odd, the
second monitor is DP-1, which is marked as port #2 on the bracket. I
can't make heads or tails of that mess. o_O
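Since I can't match on serial numbers, my fallback plan is to tie the
Monitor sections to connector names in the Device section. A sketch,
untested, with identifiers taken from my config:

Section "Device"
    Identifier "Device0"
    Driver     "nvidia"
    Option     "Monitor-DP-3" "Monitor0"   # primary Samsung, port #1 bracket
    Option     "Monitor-DP-1" "Monitor1"   # second Samsung, the "Above" one
EndSection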
I did change how I plan to lay out the monitors tho. With the primary
monitor as a starting point, the second monitor, the one I use for
handling large volumes of files and such, will be above the primary
monitor. My TV will be to the right of the primary monitor. The reason
for that is mostly the physical layout. The monitor stand came in and
I'll be putting the primary monitor on the bottom and the second monitor
on top of it. The TV can go anywhere config wise, but it has been to the
right for so long that when I need my mouse pointer over there, habit
makes me push the mouse to the right. It's as good a place as any.
At first, I had the second monitor to the right of the primary but then
it hit me: dragging the mouse pointer, and files, to the right to go up
to the top monitor seems kinda odd. Plus, for a long time now, the TV
has been there on the right. I rearranged things a bit. Given the
physical layout, it makes more sense this way. While I'm thinking on
this: I may turn off the second monitor at times. Should I add an
option to xorg.conf to make sure it doesn't go weird on me? I wouldn't
want it to move my TV location for example. I'd just want it to power
off but not affect anything else. I'd close all the apps first tho.
I'd also like it to have the right settings if it has been off a while
and I turn it on to use it. I'm not sure how hotpluggable monitors are.
I then plugged the TV into the new rig. KDE saw that right away and
popped up a screen wanting to know what to do. I just closed it and
went to KDE settings, then the Display and Monitor section. I arranged
the monitors like I wanted, several times. It would pop up a window
asking if I wanted to keep the settings. Problem is, finding where the
mouse pointer went. After three or four tries, I finally was able to
hit the Keep button before it reverted and I had to set it up again. I
did set it up as described earlier. It's early on yet but it worked.
Next is setting up xorg.conf for this. Gotta add the TV as "RightOf"
and all that.
I did run into an error when trying to copy a couple of video test
files from a USB stick to the desktop. The error was this: 'The file or
folder message recipient disconnected from message bus without replying
does not exist.' It was confusing to say the least. It reads like it
went through a translator or something. I checked to make sure
everything was mounted rw correctly, checked permissions and such.
Everything was set correctly. Did a search and dug out a thread on the
forums. It said the kernel had to have the FUSE driver enabled. I'm
not sure why, but OK, I enabled it and rebuilt the kernel. When I booted
the new kernel, I could copy files over. Weird but it works, cool. :-)
Before connecting the TV and all, I tested the audio. Soundless.
Earlier, I thought it was detecting that nothing was plugged into the
output jack. Well, it appears it just didn't have any devices. I
enabled the driver the boot media uses, and lspci -k showed it as loaded
on the new install. After a bit of searching, it seemed to be missing
some decode stuff. It so happens my old rig and the new rig have almost
identical audio chips. I just pulled up menuconfig on both in a Konsole
and enabled the same things on the new rig that I had on the old rig.
Recompiled the kernel and rebooted. I have sound to the output jacks
now. That also likely helped me to have audio on the TV as well.
I haven't posted much but I've been busy. I also checked the serial
numbers of the monitors. One ends in 231 while the other ends in 240.
They are 9 apart. About as identical as one can get. ;-)
Michael wrote:
On Sunday, 14 July 2024 06:08:27 BST Dale wrote:
<SNIP>
Do you even need an xorg.conf at all, if the Plasma Display settings can set up your monitors/TV reliably, as you want them?
That is likely true unless KDE has a bad update and won't come up. I'd
like my monitors to come up the same way regardless of what GUI I use.
I figure xorg.conf is the best way to make sure. At least as sure as I
can be anyway. That said, it's been a long time since I had a bad KDE update. It may have a minor bug or something but it tends to work OK.
<SNIP>
These links are for the MSWindows OS, but the corresponding settings on
Linux should also work for video cards which include audio processing
capability:
https://www.nvidia.com/content/Control-Panel-Help/vLatest/en-us/mergedProjects/nvdsp/To_set_up_digital_audio_on_your_graphics_card.htm
https://www.nvidia.com/content/Control-Panel-Help/vLatest/en-us/mergedProjects/nvdsp/Set_Up_Digital_Audio.htm
https://www.xda-developers.com/set-up-nvidia-high-definition-audio-is-it-worth-using/
Well, this was about codec support. It seems I had none of them
available. I likely enabled more than needed but if it isn't needed, it
just ignores them, so I've read anyway. This is a short list:
[*] Support initialization patch loading for HD-audio
<*> Build Realtek HD-audio codec support
<*> Build Analog Devices HD-audio codec support
<*> Build IDT/Sigmatel HD-audio codec support
<*> Build VIA HD-audio codec support
<*> Build HDMI/DisplayPort HD-audio codec support
<*> Build Cirrus Logic codec support
< > Build Cirrus Logic HDA bridge support
<*> Build Conexant HD-audio codec support
<*> Build Creative CA0110-IBG codec support
<*> Build Creative CA0132 codec support
<*> Build C-Media HD-audio codec support
<*> Build Silicon Labs 3054 HD-modem codec support
With those and a few others, it works. I suspect the Realtek is the one needed but it may need the HDMI one as well. I just set the same as on
my main rig. It works. It seems the card itself had the right driver,
just not the bit that tells the card how to process the sound.
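If anyone wants to check which codec their board actually has before
enabling things, this worked here; the codec name in the output will
differ per board (shown value illustrative):

    $ grep -i codec /proc/asound/card*/codec#*
    /proc/asound/card0/codec#0:Codec: Realtek ALC887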
It's getting close. Any day now. It has been an adventure. That pesky
LG monitor or that HDMI cable caused some issues tho. I'm looking to
buy either 4K or 8K cables next. Be ready for the future. ;-)
Oopsie, I found this mail in my drafts folder just now, where it’s been sitting since the ninth. Perhaps I had to pause writing, but now I can’t remember anymore. So I’ll just send it off. ;-)
On Tue, Jul 09, 2024 at 12:02:47AM +0100, Michael wrote:
On Monday, 8 July 2024 21:21:19 BST Frank Steinmetzger wrote:
On Mon, Jul 08, 2024 at 06:26:26PM +0100, Michael wrote:
Back to the previous topic, I have not yet found a case where changing the
scale by means of the desktop settings arrives at non-blurred fonts. The
clearest sharpest fonts are always rendered at the native monitor resolution, at a 100% scale setting. Am I missing a trick, or is this to
be expected?
That doesn’t really make sense. Fonts are always rendered natively, no matter what size. Except if they are really rendered at 100 % and then the
rendered bitmap is scaled by the GPU or somesuch.
Or because their hinting information is limited to a certain size range. This info gives the renderer special knowledge on how to render the glyphs.
Do you have screenshots?
I attach two screenshots, one at 100% and one at 90%. When viewed on the 1366x768 actual monitor they are worse than what the screenshots have captured. Perhaps I need to take a photo of the monitor. Anyway, if you view it on a 1920x1080 monitor you should hopefully see the difference. The font DPI is 96.
I can see it. I use 2560×1440, but viewing an image pixel-perfect is not dependent on the screen’s resolution per se, but on it being run at its native resolution. So that one pixel in the image is actually displayed by one pixel on the screen without any scaling-induced blurring.
I have no real explanation for the fonts. Do they also get blurry at scales bigger than 100 %?
The only thing I can say is that I use a font setting of
slight hinting with no RGB subpixel rendering. The latter means that I don’t
want the coloured fringes, but prefer greyscale aliasing instead. See my screenshot. 96 dpi (100 % scaling), main fonts set to 11 pt.
I used to use full hinting in my early (KDE 3) days, which gives me sharp 1-pixel-lines, because I was used to the crisp look of non-aliased fonts on Windows. But for many years now I’ve been using only slight hinting, so the font looks more “real-worldy”, natural and not as computer-clean. I think that’s something I picked up during the few times I looked at a mac screen or screenshot (I’ve never sat at one for a longer time myself).
PS.: Do you really still use KDE 4 or is it just Oxygen on Plasma 5? I kept using Oxygen Icons in Plasma 5. But more and more icons are not updated, so
I get wrong icons or placeholders, so I bit the bullet and switched to breeze. :-/
On second thought, I think I can answer that myself, because the blurred icons give it away. With Plasma 6, the global scaling not only affects fonts but also the entire UI. I wish this could be disabled, because that is the actual reason why I can’t keep on using custom DPI setting any longer. The UI just becomes ugly with far too much spacing and those blurry icons.