WPF application with a control running in a separate process

My WPF application hosts a graphics application as a control. This graphics control/tool runs in a separate process, while the WPF application runs in its own process. The problem is that when I navigate from one screen to another, my WPF application window is sometimes minimized or pushed behind other applications.
I want to fix this problem. Can anyone give me a solution?
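One direction worth experimenting with (a sketch, not a confirmed fix): at the Win32 level, the usual way to keep two top-level windows together in the z-order is to make one the owner of the other, so activating either no longer pushes its partner behind unrelated applications. The C++ sketch below shows the idea with hypothetical handles hwndTool and hwndHost; from WPF you would call the same API via P/Invoke, getting the host handle from WindowInteropHelper.

#include <windows.h>

// Hypothetical handles: hwndTool is the top-level window of the external
// graphics process, hwndHost is the WPF application's main window.
void TieToolWindowToHost(HWND hwndTool, HWND hwndHost)
{
    // Setting GWLP_HWNDPARENT makes the tool window an "owned" window of
    // the WPF window, which keeps the pair adjacent in the z-order.
    SetWindowLongPtr(hwndTool, GWLP_HWNDPARENT,
                     reinterpret_cast<LONG_PTR>(hwndHost));
}

Whether this behaves well depends on how the external tool manages its own windows, so treat it as a starting point for investigation rather than a drop-in answer.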

Related

Demystifying the Virtual Keyboard and Touchpad in Windows 10

I'm new to Windows development, and am looking for assistance on where to get started for a particular project.
In short, I want to create a windowed application that allows a user to send keyboard and mouse inputs to another application by interacting with various UI controls via touch - essentially a custom on-screen keyboard/touchpad that can be used for sending keyboard shortcuts to other applications.
There are two applications in Windows 10 that behave exactly the way I would want my new app to - the On-Screen Keyboard and Touchpad:
https://support.microsoft.com/en-us/help/4337906/windows-10-open-the-on-screen-touchpad
https://support.microsoft.com/en-us/help/10762/windows-use-on-screen-keyboard
At the most basic level, I want to define my own interface (or allow the end user to define their own), and use the same code that the on-screen keyboard/touchpad use for handling touch events and injecting inputs into the system.
I'm uncertain at what level I would need to start to get the functionality I need - UWP? WPF? C++?
If anyone has any insight into how the on-screen utilities were built, I think that would give me an excellent head start.
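I can't say exactly how the built-in utilities are implemented, but the documented Win32 route for the input-injection half is the SendInput API, which synthesizes keyboard and mouse events system-wide; an on-screen utility would typically pair it with the WS_EX_NOACTIVATE extended window style so that tapping its buttons never steals focus from the target application. A minimal C++ sketch (the Ctrl+C combination is just an example):

#include <windows.h>

// Inject a Ctrl+C shortcut into whichever window currently has
// keyboard focus. A real on-screen keyboard would also create its
// own window with the WS_EX_NOACTIVATE extended style so that
// touching it does not take focus away from the target app.
void SendCtrlC()
{
    INPUT inputs[4] = {};

    inputs[0].type = INPUT_KEYBOARD;
    inputs[0].ki.wVk = VK_CONTROL;            // Ctrl down

    inputs[1].type = INPUT_KEYBOARD;
    inputs[1].ki.wVk = 'C';                   // C down

    inputs[2].type = INPUT_KEYBOARD;
    inputs[2].ki.wVk = 'C';
    inputs[2].ki.dwFlags = KEYEVENTF_KEYUP;   // C up

    inputs[3].type = INPUT_KEYBOARD;
    inputs[3].ki.wVk = VK_CONTROL;
    inputs[3].ki.dwFlags = KEYEVENTF_KEYUP;   // Ctrl up

    SendInput(4, inputs, sizeof(INPUT));
}

This works from plain C++ (Win32); from WPF you can reach the same API through P/Invoke, while a sandboxed UWP app faces extra restrictions on input injection, which may influence your framework choice.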

Make electron window able to receive click event when above keynote app

I am working on an Electron app where I open a child window that I want to be above all other windows (though not above fullscreen windows). I managed to do it by using
win.setAlwaysOnTop(true, "screen-saver");
It now stays on top of all other open apps, including the Keynote app in presentation mode. I want the user to be able to click buttons inside my window, but the issue is that as soon as the user clicks a button (or just clicks anywhere in the child window), the Keynote window minimizes, since focus shifts to my window.
What I tried: I tried almost all the window options given in the Electron docs, with different variations. I also tried playing with modals, but modals obviously stay inside the parent window, and I want to keep the main app minimized while keeping the child window on top of other apps. I also found the electron-modal package, but it behaves the same way.
Working example
I tried different applications to check whether any other application could do this, and I found that the Zoom app window (in screen-share mode) is able to stay on top of the Keynote app: you can click buttons inside it and move the window, and Keynote keeps running in the background with no issues. I am trying to achieve exactly the same behaviour.
This is something that you won't be able to recreate with Electron currently, except through a native Node module that manipulates your window's OS-level flags.
You can follow this issue on the Electron repository, since the flags introduced there should resolve your issue, or at least give you a point of entry to write your own PR or node module:
https://github.com/electron/electron/issues/10078

Linux Window Manager Forces Window Size/Location

We're using Red Hat Linux 6.4, and our application is built using Qt. The application has multiple windows and we support a layout system where our users can save the application layout and restore it later. The application is cross-platform, and on Windows, everything is fine. On Linux, we're having problems restoring windows when a window spans multiple monitors. Our configuration uses a single virtual X display spanning all monitors, and the users can manually position and size windows across the monitors as desired.
What we've found is that the window manager enforces a policy on windows whose geometry is set programmatically, forcing them not to span the divide between two monitors. When we attempt to restore a saved layout containing a window that spanned monitors, the window manager reduces its size and repositions it as it sees fit. Basically, as long as the user makes the change by dragging and resizing the window, the window manager respects it, but an application that sets it programmatically gets overridden. I'm sure someone somewhere thought this was a reasonable restriction, but our customers disagree.
A developer here has spent days searching and experimenting, trying to find a way to work around this behavior programmatically, or better yet, to tell the window manager to stop doing it. We're using the GNOME desktop and Qt 4.8.x.
Any ideas?
Thank you,
Doug McGrath
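One blunt workaround to experiment with (a sketch, not a recommendation): Qt can ask X11 to map a window without window-manager interference using the Qt::X11BypassWindowManagerHint flag, which lets a programmatic setGeometry() span the divide between monitors. The cost is that a bypassed window also loses normal WM decoration, stacking, and focus handling, so it may trade one complaint for another. A minimal Qt 4.8-style sketch, with hypothetical geometry values:

#include <QApplication>
#include <QWidget>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QWidget w;
    // Ask X11 to map this window without window-manager interference,
    // so the geometry below is honoured even when it crosses monitors.
    w.setWindowFlags(w.windowFlags() | Qt::X11BypassWindowManagerHint);
    w.setGeometry(0, 0, 3200, 1000);  // hypothetical span across two monitors
    w.show();

    return app.exec();
}

A gentler alternative is to re-apply the saved geometry with setGeometry() after the window has been shown and mapped, in case the window manager only overrides the initial placement; whether that helps depends on the specific GNOME window manager version shipped with Red Hat 6.4.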

Hidden controls appearing in web instance [LabVIEW]

I have created an application in which the VI has some controls that are useful only during development and, in special instances, can be unlocked in the application. I basically use the App.Kind property node to determine which environment the VI is running in and hide/unhide the controls accordingly.
I have published the application on the web using the NI Web Publishing Tool. On the computer that hosts the app everything works fine (the controls remain invisible), but the controls can be seen on the web page. The VI is in "Embedded" mode. As a workaround, I have pushed these controls some distance away, which keeps the user from noticing them, but this introduces the problem that I cannot view the controls when I unlock them.
Any help would be greatly appreciated.
You have built a stand-alone application and enabled the web server, correct?
Are you sure the web panel is connecting to the stand-alone application (app.kind = 2)
and not still reaching the development LabVIEW (app.kind = 1) listening on that web server port?
I would add an indicator to display the value of app.kind at all times.
What happens if you toggle the hidden fields on and off? I would add a button on the VI to do this.
Do they disappear/reappear reliably in the window where you have control?
Also, you said this was in Embedded mode - but are you also transferring control to the web page?
Those are some approaches I'd try to help pin this down.

Web Browser in a fullscreen Direct3D application

I need to have a working web browser in a fullscreen Direct3D application. For example, Valve's Source-based games (sort of) do it in the MotD window when you join a server. Any tips on where to look?
Second Life uses ubrowser (http://ubrowser.com/) to embed a browser over a 3D world. As the source code for the Second Life client is available (http://wiki.secondlife.com/wiki/Get_source_and_compile), it would be a good place to see how they have done it.
Note, however, that they are using OpenGL, not Direct3D... but nothing in ubrowser itself is specific to OpenGL.
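If you do take the same route under Direct3D, the general shape is: render the page offscreen with an embeddable browser, copy its pixel buffer into a dynamic texture whenever it changes, and draw that texture as a quad over your scene. A minimal D3D9-flavoured C++ sketch, assuming the browser hands you a BGRA buffer (bgra, width, and height are hypothetical placeholders for whatever your embedding layer provides):

#include <d3d9.h>
#include <cstring>

// Create a dynamic texture sized to the browser's output.
IDirect3DTexture9* CreateBrowserTexture(IDirect3DDevice9* device,
                                        UINT width, UINT height)
{
    IDirect3DTexture9* tex = NULL;
    device->CreateTexture(width, height, 1, D3DUSAGE_DYNAMIC,
                          D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &tex, NULL);
    return tex;
}

// Copy the browser's BGRA pixels into the texture, row by row,
// respecting the texture's pitch.
void UploadBrowserPixels(IDirect3DTexture9* tex, const unsigned char* bgra,
                         UINT width, UINT height)
{
    D3DLOCKED_RECT rect;
    if (SUCCEEDED(tex->LockRect(0, &rect, NULL, D3DLOCK_DISCARD)))
    {
        for (UINT y = 0; y < height; ++y)
            memcpy(static_cast<unsigned char*>(rect.pBits) + y * rect.Pitch,
                   bgra + y * width * 4, width * 4);
        tex->UnlockRect(0);
    }
}

Drawing the quad is then ordinary textured-quad rendering, and forwarding mouse/keyboard input to the browser component is up to you, which tends to be the bulk of the work in practice.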
