Update: the problem has fixed itself. It must have happened with an update of xrandr, Xorg, the NVIDIA or Intel GPU drivers, or something else related, but it works fine again! It wasn't my fault after all...
My laptop has a 4K (3840x2160) screen (dual gpu, nvidia-prime) and sometimes I want to connect my Full HD TV on the HDMI port to watch a video. I prefer the TV not to mirror my 4K screen.
So I start arandr and enable the HDMI input.
On my previous installation of this laptop (Arch Linux), enabling HDMI limited my 4K screen to 1920x1080, so 3/4 of the panel was unusable even when not mirrored. I would then just mirror to watch the video, which was limited to 1/4 of the 4K panel but filled the whole TV.
On my new installation of this laptop (again Arch Linux), the desktop seems to resize to 5760x2160, which makes the TV show only the upper half of its portion of the screen, and videos display incorrectly because the video application thinks that screen is 1920x2160.
So neither situation was exactly what I wanted.
I want it to work like it does in Windows (or like the Arch installation on my desktop PC, where a Full HD screen on the left and a 1:1 2K screen on the right work perfectly fine): one 4K screen on the left and one Full HD screen on the right, so the desktop is 3840x2160 on the left and 1920x1080 on the right. The two screens share the same upper edge, so the mouse stays at the top when I move to the right screen and cannot cross from the lower part of the left screen to the right.
As I am trying to automate this, I would like it to work with xrandr (I've already written a udev script, which works), but even with xrandr I don't seem to be able to set the parameters correctly.
I've tried so many things, but I seem to misunderstand the options, because whatever I try, it doesn't work as I expect.
The simplest thing I tried was just: xrandr --output HDMI-1-0 --auto --right-of eDP-1 --mode 1920x1080 --pos 3840x0. I've also tried adding --fb 1920x1080. Neither seems to work.
Can anyone help me understand it, maybe even help me configure it?
Edit: I don't know if it's important, but I use i3.
Edit: If I change the resolution of the 4K screen to 1920x1080, mirroring works perfectly. That's the only usable way, but it's still not what I want.
Edit: this is so strange, I just did xrandr --output eDP-1 --auto --output HDMI-1-0 --auto --right-of eDP-1 and my TV now shows an empty desktop, but my mouse is limited to the 4K screen, and when I'm in the upper left area the mouse also shows up on the TV... so it feels like the TV shows an area to the right of the 4K screen, while the mouse appears in its top left part.
Edit: another thing I tried doesn't work:
xrandr --output eDP-1 --mode 3840x2160 --fb 3840x2160 --output HDMI-1-0 --mode 1920x1080 --fb 1920x1080 --right-of eDP-1
xrandr: specified screen 1920x1080 not large enough for output eDP-1 (3840x2160+0+0)
xrandr: specified screen 1920x1080 not large enough for output HDMI-1-0 (1920x1080+3840+0)
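For what it's worth, --fb describes the size of the single shared framebuffer, not one output, which is why passing 1920x1080 fails: it must cover the union of both outputs. A sketch of a command matching the layout described above (output names eDP-1 and HDMI-1-0 taken from the question) would be:

```shell
#!/bin/sh
# Desired layout: 4K panel at 0x0, Full HD TV to its right at 3840x0.
# --fb is the total virtual screen, so it must span BOTH outputs:
# 3840+1920 = 5760 wide, max(2160, 1080) = 2160 tall.
cmd="xrandr --fb 5760x2160 \
  --output eDP-1 --mode 3840x2160 --pos 0x0 \
  --output HDMI-1-0 --mode 1920x1080 --pos 3840x0"

# Only apply the layout when an X session is actually available.
if [ -n "${DISPLAY:-}" ] && command -v xrandr >/dev/null 2>&1; then
    $cmd
else
    echo "no X display; would run: $cmd"
fi
```

With --pos given explicitly for both outputs, --right-of is not needed, and the single --fb covers the whole combined desktop.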
Have you tried tools like arandr? They help you configure this kind of thing visually, with menus and drop-down lists for the various parameters.
Link: https://christian.amsuess.com/tools/arandr/
In arandr you can save a layout as a simple sh script (Layout -> Save As, or the blue icon). Make it executable (chmod +x) and run it like any other script.
The problem solved itself with updates, be it to xrandr, Xorg, or the NVIDIA/Intel drivers.
The same xrandr commands that didn't work then, work perfectly fine now.
I'm quite new to the Raspberry Pi world, but not to Linux.
I have bought a 7-inch screen with a touchscreen. It works well, but if I attach another screen (a TV monitor) to HDMI1, that screen is on but shows only a black display (it is not disconnected). If I connect the TV monitor to HDMI0 and the 7-inch display to HDMI1, the TV monitor works fine, but the touchscreen is sometimes black or shows the Raspberry Pi image (I attached a photo).
Touch input works fine, but I can see the mouse move on the other screen. If I swap the HDMI cables, the image appears on the touchscreen and the TV monitor goes black.
The touchscreen is this https://www.amazon.co.uk/gp/product/B07 ... UTF8&psc=1
I have changed /boot/config.txt:
hdmi_force_hotplug=1
hdmi_drive=2
I tried removing the lines below, but the behaviour is the same: most of the time the screen on HDMI1 is black while HDMI0 works perfectly.
hdmi_edid_file:1=1
hdmi_edid_filename:1=edid.dat
hdmi_force_hotplug:1=1
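As a sketch (assuming a Pi 4, where the hdmi_* options take a :0/:1 suffix to target HDMI0/HDMI1, as in the lines above), forcing both ports on symmetrically would look like:

```
# /boot/config.txt
# Treat both HDMI ports as connected even if no EDID is read at boot.
hdmi_force_hotplug:0=1
hdmi_force_hotplug:1=1
# Normal HDMI mode (with audio) on both ports.
hdmi_drive:0=2
hdmi_drive:1=2
```

An unsuffixed option like the plain hdmi_force_hotplug=1 above applies only to HDMI0, which may explain why HDMI0 behaves better than HDMI1 here.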
Does anyone have any suggestions?
Thank you in advance for your replies.
I solved the issue: I just used another SD card and flashed it again. It worked immediately.
I'm building an application that will run on both HoloLens and mobile devices (iOS/Android). I'd like to be able to use the same manipulation handlers on all devices with the goals:
Use ARFoundation for mobile device tracking and input
Use touch input with MRTK's ManipulationHandler, and otherwise use touch input as normal (UI)
Simulate touch input in the editor (using a touch screen or mouse) but retain the keyboard/mouse controller for camera positioning.
So far I've tried/found:
MixedRealityPlayspace always parents the camera, so I added the ARSessionOrigin to that component, and all the default AR components to the camera (ARCameraManager, TrackedPoseDriver, ARRayCastManager, etc.)
Customizing the MRTK pointer profile to only contain MousePointer and TouchPointer.
Removing superfluous input data providers.
Disabling Hand Simulation in the InputSimulationService
Generally speaking, the method of adding the ARSessionOrigin to the MixedRealityPlayspace works as expected and ARFoundation is trivial to set up. However, I am struggling to understand how to get the ManipulationHandler to respond to touch input.
I've run into the following issues:
Dragging on a touch screen with a finger moves the camera (editor). Disabling the InputSimulationService fixes this, but then I'm unable to move the camera...
Even with the camera disabled, clicking and dragging does not affect the ManipulationHandler.
The debug rays are drawn in the correct direction, but the default touchpointer rays draw in strange positions.
I've attached a .gif explaining this. This is using touch input in the editor. The same effect is observed running on device (Android).
This also applies to Unity UI (world space canvas) whereby clicking on a UI element does not trigger (on device or in editor), which suggests to me that this is a pointer issue not a handler issue.
I would appreciate some advice on how to correctly configure the touch input and mouse input both in editor and on device, with the goal being a raycast from the screen point using the projection matrix to create the pointer, and use two-finger touch in the same way that two hand rays are used.
Interacting with Unity UI in world space on a mobile phone is supposed to work in MRTK, but there are a few bugs in the input system preventing it from working. The issue is tracked here: https://github.com/microsoft/MixedRealityToolkit-Unity/issues/5390.
The fix has not been checked in, but you can apply a workaround for now (thanks largely to the work you yourself did, newske!). The workaround is posted in the issue. Please see https://gist.github.com/julenka/ccb662c2cf2655627c95ffc708cf5a69. Just replace each file in MRTK with the version in the gist.
In the appxmanifest file, there are five options for the splash screen: 620x300, 775x375, 930x450, 1240x600 and 2480x1200.
I have the following resolutions on my 15" laptop:
1366x768 (Recommended)
1360x768
1280x720
1280x600
1024x768
800x600
My question is: if I provide all five images in the appxmanifest for the splash screen, how are they going to affect my laptop's different resolutions? I looked into MSDN blogs mentioning a scale factor. I also noticed that each splash screen image corresponds to a name like "SplashScreen.scale-100", "SplashScreen.scale-125", "SplashScreen.scale-150", "SplashScreen.scale-200" or "SplashScreen.scale-400".
Actually it's a very beginner-level question, but I am kind of puzzled by all of this.
There is a one-minute video explaining scaling and effective pixels in UWP apps. In addition, the Store picks the assets to download based in part on the DPI of the device; only the assets that best match the device are downloaded. By the way, don't hesitate to provide as many asset sizes as possible.
To help you, the following extension can generate the different sizes for you:
UWP Tile Generator Extension for Visual Studio
I've been having this problem with the Matlab GUI (linux) that has been annoying me for over a year but I still haven't found a solution.
Basically, the autofix hints are not displayed. When I move the mouse cursor over a potential warning/suggestion, a gray-background pop-up appears but the text inside is missing. The same happens when I hover over those little warning bars on the right hand side of the editor. Does anyone have any clue what might be causing this?
Screenshot: http://i58.tinypic.com/4veu.png
This happens only on my linux machine (Ubuntu 14.04 LTS, NVidia GeForce with nvidia driver).
Thanks!
For those interested, this issue appears to be related to the Unity Desktop. Mathworks does not provide a fix but suggests using a different XServer instead. Here is the answer I received from support:
This issue is known to occur due to a windowing system used with "linux" on which MATLAB has not been tested. It has been observed that if you are using "Unity desktop" in "linux", then the tooltips are displayed as blanks.
To work around this issue, you may try switching off "Unity desktop".
You can refer to the following links for more information on this issue:
http://www.mathworks.com/matlabcentral/answers/116987-empty-tooltips-in-code-analyzer
matlab code analyzer produces empty tooltips
Indeed, I tried lubuntu and XUbuntu (Xfce) and the tooltips in Matlab were working in both cases. I find Unity very handy because I got used to it, so for now, I will probably simply not use this Matlab feature. Hopefully this will be fixed eventually.
It's an old post, but some people may still be looking for a solution or a hack. Well, I also had this issue on R2015a when using two monitors, and hiding the Ubuntu 14.04 sidebar seems to do the trick. This link explains how to do it: http://www.howtogeek.com/198218/how-to-easily-hide-the-unity-launcher-in-ubuntu-14.04/. Hope it helps!
This is accomplished, in the article, by:
1) Select “System Settings” from the drop-down menu.
2) The “System Settings” dialog box displays. In the “Personal” section, click “Appearance.”
3) On the “Appearance” screen, click the “Behavior” tab.
4) On the right side of the “Behavior” tab, there’s an ON/OFF switch. Click the switch so it reads ON.
5) The ON/OFF switch also turns orange. Additional options for how to show the hidden Unity Launcher become available in the “Auto-hide the Launcher” section of the “Behavior” tab. Under “Reveal location,” select whether you want to move the mouse to any location on the “Left side” or just to the “Top left corner” of the screen to reveal the Unity Launcher. Use the “Reveal sensitivity” slider to change the sensitivity of the reveal location.
6) Once you have chosen your settings, close the “Settings” dialog box by clicking the “X” button in the upper-left corner of the dialog box.
This happens to me in Ubuntu 15.10 using xfce, with two monitors connected to an Nvidia GeForce 8600 GT, one of which is rotated to portrait orientation. The "workaround", from the page Anton linked, is to resize my matlab desktop such that the red underlined text or red scrollbar annotation is in the bottom third of my left monitor. Unbelievable.
My preferred workaround is to use Python+Scipy+Matplotlib instead of Matlab.
I have created an Android 4.2 AVD. For the needs of my current project, the main screen orientation is landscape. The software keys option is selected.
The mode is set to xhdpi, like a Galaxy Nexus phone.
As I start the emulator, I see a black stripe on the right where the buttons should appear, but it remains black. Thus, there is no way to trigger a Back action since the emulated physical keys are disabled.
The problem can be fixed by configuring the emulator in portrait mode and then rotating it once started; the buttons appear as expected. However, this causes another problem: the window is automatically scaled down to fit my monitor, and I did not find any shortcut to restore 1:1 scaling at runtime after the rotation. This matters because I would like to see pixel-perfect results.
I am using SDK version 21 and platform-tools version 16.0.2, as updated yesterday.
Found the solution myself... this is a workaround that gives both 1:1 scale and working software buttons.
Leave the AVD (Galaxy Nexus or cloned from it) on portrait mode.
Run the emulator from the command line using the -scale 1 option; this is the trick that forces a pixel-perfect 1:1 ratio even if the window does not fit on the screen at startup. By default, the window is automatically scaled down to fit the monitor.
Rotate the display using Ctrl+F11 or Ctrl+F12 to get landscape mode.
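The steps above boil down to something like this (the AVD name "Galaxy_Nexus" is my assumption; substitute the name of your own AVD):

```shell
#!/bin/sh
# Start the portrait AVD at exact 1:1 pixel scale, then rotate with
# Ctrl+F11 / Ctrl+F12 once it is up. The AVD name is an assumption.
cmd="emulator -avd Galaxy_Nexus -scale 1"

if command -v emulator >/dev/null 2>&1; then
    $cmd
else
    echo "emulator not on PATH; would run: $cmd"
fi
```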