I'm working on a Direct3D application, and I sometimes need to terminate it forcefully via the debugger. After the application has been terminated this way a couple of times, Direct3D reports an "Out of video memory" error when attempting to create a new device. Presently I work around this by switching my display resolution to 800x600 and back to the previous resolution in the hope of "resetting" video memory. This works 99% of the time. But it would be nice to have a simple utility -- a command-line app, perhaps -- to quickly reset my graphics card and the Direct3D stack by releasing video memory explicitly.
How can I free video memory and other resources left behind by a forcefully terminated Direct3D application?
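For what it's worth, the manual workaround above can be automated. Below is a minimal sketch of such a command-line utility (Win32 C++); the 800x600 values simply mirror the manual toggle described above, and whether the mode switch actually forces the driver to flush its allocations is only the hope expressed there:

```cpp
#include <windows.h>

int main()
{
    // Temporarily switch to 800x600. CDS_FULLSCREEN marks the change as
    // dynamic, i.e. it is not written to the registry.
    DEVMODE dm = {};
    dm.dmSize       = sizeof(dm);
    dm.dmPelsWidth  = 800;
    dm.dmPelsHeight = 600;
    dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT;
    if (ChangeDisplaySettings(&dm, CDS_FULLSCREEN) != DISP_CHANGE_SUCCESSFUL)
        return 1;

    Sleep(1000); // give the driver a moment to tear down and rebuild state

    // Passing NULL restores the mode stored in the registry, i.e. the
    // resolution we started from.
    ChangeDisplaySettings(NULL, 0);
    return 0;
}
```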
This is a driver bug. The driver should release the memory when the process exits. If it isn't doing so, you need to update your driver. If that still doesn't work, send a repro case to the IHV (independent hardware vendor) that made your card.
I had a completely working, fully debugged PyQt5 application which (1) accepted streaming real-time data from a USB peripheral, (2) allowed data recording, and (3) saved recorded data to disk. I'm running on top of Ubuntu Linux 17.10.
This application is now broken: the operating system is no longer connecting to the USB peripheral dependably. Kernel updates may be involved. The problem is complicated, and I discuss it here if you are interested in the details. I'm trying to fix this problem, but in the meantime I need to collect data from the device.
My original application allowed full-duplex communication over USB. So in the original application, when I wanted to open a QFileDialog to save data to disk, I first issued a command instructing the peripheral to stop transmitting. It has been a long time since I wrote this program, and I don't remember whether this was truly necessary to avoid a crash; it just seemed like a smart thing to do.
When the OS fails to cooperate, I can no longer do this. I only have half-duplex communication. The peripheral must start automatically and must keep running.
My program works as before, right up to the point where I save data: I open the QFileDialog, name the file, and save it. The file actually does save to disk, but immediately afterwards a segmentation fault occurs.
I am trying to devise a work-around for this situation. Any suggestions are appreciated! Thanks.
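One direction for such a workaround: since the peripheral can no longer be told to stop transmitting, pause consumption on the PC side instead, so nothing touches the incoming stream while the dialog runs its event loop. A minimal sketch of the pattern, shown in C++ Qt (the same calls exist in PyQt5); MainWindow, m_port, onDataReady and writeRecordingTo are hypothetical stand-ins for the application's own objects:

```cpp
#include <QFileDialog>

void MainWindow::onSaveClicked()
{
    // Stop our slot from firing while the dialog's event loop runs.
    disconnect(m_port, &QIODevice::readyRead, this, &MainWindow::onDataReady);

    // DontUseNativeDialog keeps the dialog inside Qt's own event handling,
    // which sidesteps some platform-dialog re-entrancy problems.
    const QString path = QFileDialog::getSaveFileName(
        this, tr("Save recording"), QString(), tr("Data files (*.dat)"),
        nullptr, QFileDialog::DontUseNativeDialog);

    if (!path.isEmpty())
        writeRecordingTo(path); // hypothetical helper that writes the buffer

    // Resume streaming; bytes that arrived meanwhile are still buffered in
    // the device and will be handled on the next readyRead.
    connect(m_port, &QIODevice::readyRead, this, &MainWindow::onDataReady);
}
```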
I am looking for a way to control which applications have access to the GPU and video memory when using X.Org on Linux.
The setup:
I am developing an embedded Linux system where several 3D games are started on boot. I start all the games up front so that they are resident in memory, which makes it possible to switch between them quickly.
The idea is that only one game is visible at a time. This is done by mapping the windows of the active game and unmapping the windows of the inactive ones, i.e. "minimizing" the inactive games' windows (see the sketch below).
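A minimal sketch of that switch in Xlib, assuming the controlling process knows each game's top-level window ID (obtained via XQueryTree, over IPC, or however the system tracks them):

```cpp
#include <X11/Xlib.h>

// "Minimize" the old game and bring the new one on screen.
void switch_game(Display *dpy, Window next, Window previous)
{
    XUnmapWindow(dpy, previous);
    XMapWindow(dpy, next);
    XFlush(dpy); // push both requests to the X server immediately
}
```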
The problem:
The performance of the active game is not optimal, so it seems that the inactive games are still taking up resources on the graphics card even though their windows are unmapped from X.Org.
What I have tried so far:
My current solution is to suspend the inactive games and keep only the active one running (see the sketch below). This is, however, not optimal, since I would also like to communicate with the inactive games over IPC, which a suspended process cannot do.
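For reference, that suspend/resume approach boils down to POSIX job-control signals, which is exactly why IPC stops working: a stopped process is not scheduled at all. A minimal sketch, assuming each game runs as its own process:

```cpp
#include <signal.h>
#include <sys/types.h>

// Freeze a game: it gets no CPU time and submits no GPU work.
void suspend_game(pid_t pid) { kill(pid, SIGSTOP); }

// Thaw it again: execution continues exactly where it left off.
void resume_game(pid_t pid) { kill(pid, SIGCONT); }
```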
The question:
How can I disallow a Linux application access to the graphics card, such that its draw calls are simply ignored?
If your games start slowly, it's time to optimize that.
If a game is inactive and not using GPU resources, the driver will eventually swap all of its OpenGL resources out of GPU memory, so when the game resumes it will experience a noticeable delay while the OpenGL driver swaps those resources back into GPU memory.
Just unmapping a window is not enough; you also have to stop the main game loop, so that the inactive games don't keep consuming CPU and GPU time (sketched below).
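A minimal sketch of that idea: each game watches for Unmap/Map events on its own window (StructureNotifyMask must be selected on it) and simply blocks while invisible. render_frame() stands in for the game's own per-frame work:

```cpp
#include <X11/Xlib.h>

void render_frame(); // hypothetical: the game's own update-and-draw step

void run_game_loop(Display *dpy, Window win)
{
    bool visible = true;
    for (;;) {
        // Drain pending events; while unmapped, block in XNextEvent
        // instead of rendering, so the game consumes no CPU or GPU time.
        while (XPending(dpy) > 0 || !visible) {
            XEvent ev;
            XNextEvent(dpy, &ev);
            if (ev.type == UnmapNotify)
                visible = false;
            else if (ev.type == MapNotify)
                visible = true;
        }
        render_frame();
    }
}
```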
How can I disallow a Linux X11 application access to the graphics card, such that its draw calls are simply ignored?
With current-generation X11 and drivers: switch to a different VT than the one the X11 server runs on. You can start any number of X11 servers, but only one of them can own the active VT at any given time.
Note that, given the right driver architecture, inactive VTs may also access the GPU, but right now this is not the case.
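Programmatically, a VT switch is what chvt(1) does; a minimal sketch (requires permission to open the console, typically root):

```cpp
#include <fcntl.h>
#include <linux/vt.h>
#include <sys/ioctl.h>
#include <unistd.h>

bool activate_vt(int vt)
{
    int fd = open("/dev/tty0", O_RDWR);
    if (fd < 0)
        return false;
    bool ok = ioctl(fd, VT_ACTIVATE, vt) == 0 &&
              ioctl(fd, VT_WAITACTIVE, vt) == 0; // block until the switch completes
    close(fd);
    return ok;
}
```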
I recently developed a Symbian application. In the Qt simulator it works perfectly, but on an actual device the application terminates unexpectedly. I suspect this is a memory issue.
I have already followed all the memory-cleanup mechanisms I know of:
Creating new objects as pointers (on the heap).
Deleting the objects after use with the delete keyword.
Calling deleteLater() on UI objects.
But the application still terminates on the device.
Please suggest possible solutions.
You can try increasing the heap and/or stack sizes using the EPOCHEAPSIZE and EPOCSTACKSIZE statements in your .pro file (see the example below).
http://qt-project.org/doc/qt-4.8/qmake-platform-notes.html#stack-and-heap-size
This may depend on which Qt SDK you are using, though, as the documentation now states that the Qt toolchain already sets these to the maximum possible values.
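For reference, the statements go straight into the .pro file; the sizes below are purely illustrative (minimum heap, maximum heap, then stack, all in bytes):

```
symbian {
    EPOCHEAPSIZE = 0x020000 0x2000000   # 128 KB minimum, 32 MB maximum heap
    EPOCSTACKSIZE = 0x14000             # 80 KB stack
}
```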
I have an MFC application that I want to distribute via USB flash drives.
One of the requirements is that when a user unplugs the USB flash drive, the MFC app should exit on its own.
I added code to detect USB flash removal and exit the app when that occurs (a sketch of this detection is shown below).
The app exits itself fine most of the time when the USB flash drive is unplugged.
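For context, the detection can be done by handling WM_DEVICECHANGE; a simplified sketch (real code should also check DEV_BROADCAST_VOLUME::dbcv_unitmask against the app's own drive letter):

```cpp
#include <dbt.h>

// In the message map: ON_WM_DEVICECHANGE()
BOOL CMainFrame::OnDeviceChange(UINT nEventType, DWORD_PTR dwData)
{
    if (nEventType == DBT_DEVICEREMOVECOMPLETE) {
        const DEV_BROADCAST_HDR *hdr =
            reinterpret_cast<const DEV_BROADCAST_HDR *>(dwData);
        if (hdr && hdr->dbch_devicetype == DBT_DEVTYP_VOLUME) {
            // A volume was removed; ask the app to shut down cleanly.
            AfxGetMainWnd()->PostMessage(WM_CLOSE);
        }
    }
    return TRUE; // acknowledge the device event
}
```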
But sometimes (about 50% of the time) I get an "AppName.exe has stopped working." error message in Windows 7. When running under the debugger, I get more detail: a "First-chance exception: 0xC0000006: In page error." exception.
After some googling, that error seems to be caused by the underlying media being removed, leaving the memory manager unable to page the executable back in from the media (the USB flash drive in this case).
Reference 1: In page error 0xc0000006
Reference 2: http://blogs.msdn.com/b/oldnewthing/archive/2008/12/04/9172708.aspx
This MFC app is a small, single-executable program. My question is whether there's a way to force Windows to load an entire .exe file into memory before executing it. My thinking is that if I can get Windows to load the entire program into memory up front, the memory manager won't need to access the disk when the USB flash drive is removed, and the unwanted error message might just go away.
Thanks for your help!
Link the program with the /SWAPRUN option. It is also available as an EditBin.exe option, so you can apply it to an existing binary.
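For example, to set the flag on an already-built binary (run from a Visual Studio command prompt; AppName.exe is a placeholder):

```
rem /SWAPRUN:CD sets the "copy to the swap file and run from there when
rem started from removable media" flag; /SWAPRUN:NET is the network-drive variant.
editbin /SWAPRUN:CD AppName.exe
```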
Okay, I'm trying to implement fast switching in my app. If I don't have to execute any background code or save any user data, do I really even need to do anything? Or do I just upgrade to iPhone SDK 4.0, click compile, and deploy?
Is there any way to simulate an out-of-memory condition and make the OS purge the application, to test how it gets relaunched?
Thanks,
Teja.
If all you want is fast app switching, I found that recompiling with the iOS 4 SDK is sufficient to make your app 'fast-switch ready.' You just need to make sure your app can deal with memory warnings and restore any data it had to release as a result of a memory warning.
The easiest way to simulate a memory warning is through the iPhone Simulator, under Hardware -> Simulate Memory Warning. Personally, I used the Mac OS X keyboard preferences to set a hotkey for that menu item so I can quickly trigger memory warnings in my app.