I have been at this for a couple of days and I just can't get CeRunAppAtTime working. I just want to ask if anyone has ever gotten this to work?
Could anyone please post a working code sample?
If you have a look at notify.h, where CeRunAppAtTime is declared, you'll notice it is obsolete and unsupported:
//
// Obsolete; provided to maintain compatibility only
//
HANDLE CeSetUserNotification (HANDLE hNotification,
                              TCHAR *pwszAppName,
                              SYSTEMTIME *lpTime,
                              PCE_USER_NOTIFICATION lpUserNotification);
BOOL CeRunAppAtTime (TCHAR *pwszAppName, SYSTEMTIME *lpTime);
BOOL CeRunAppAtEvent(TCHAR *pwszAppName, LONG lWhichEvent);
BOOL CeHandleAppNotifications (TCHAR *pwszAppName);
It might work, perhaps even most of the time, but you are on your own there.
It is recommended to use CeSetUserNotificationEx instead.
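For what it's worth, scheduling a launch with it looks roughly like this (an untested sketch put together from the declarations in notify.h; the helper name is made up and error handling is omitted):

#include <windows.h>
#include <notify.h>   // link with coredll.lib

// Schedule an app to be launched at the given (local) time.
// A NULL notification handle creates a new notification; a NULL
// CE_USER_NOTIFICATION means "just launch the app, no user alert".
HANDLE ScheduleAppAtTime(WCHAR *pszExePath, SYSTEMTIME *pstWhen)
{
    CE_NOTIFICATION_TRIGGER trigger = { 0 };
    trigger.dwSize          = sizeof(trigger);
    trigger.dwType          = CNT_TIME;      // fire at an absolute time
    trigger.lpszApplication = pszExePath;    // app to run
    trigger.lpszArguments   = L"";
    trigger.stStartTime     = *pstWhen;

    return CeSetUserNotificationEx(NULL, &trigger, NULL);   // NULL on failure
}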
CeRunAppAtTime does work, but unfortunately it seems to be rather platform-dependent. On some devices, it's extremely unreliable. You can use CeSetUserNotificationEx to accomplish the same thing. I've found it much more reliable on newer devices, but on some devices it too just doesn't work well. I don't know what's so hard about this particular task, but many OEMs just can't seem to get it right.
It seems that the device must be set to a full power-on state after your app is launched, or it goes back to "sleep". You can accomplish this through a call to SetSystemPowerState, as detailed here.
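That call is a one-liner; a minimal sketch, assuming the Power Manager header (pm.h) is available on your platform:

#include <windows.h>
#include <pm.h>   // Power Manager; link with coredll.lib

// Force the device to full power so it doesn't drop straight
// back to sleep after the notification launches us.
SetSystemPowerState(NULL, POWER_STATE_ON, POWER_FORCE);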
I'm currently stuck with a pesky little issue. I developed an application that zeroes out the DXGI_MODE_DESC structure and calls FindClosestMatchingMode() to, as advertised, "gravitate towards the desktop resolution".
This works fine as long as the laptop runs solely on its own display -- as soon as I plug in another monitor it goes berserk. If I extend my desktop, it will still correctly get the laptop monitor's resolution, yet the attached one (running 1080p) yields a preference for 800*480 :) (sure, poor man's 16:10, but...)
Doing the same thing with the monitors cloned/combined (which results in one output device), even when their resolutions are equal, gives the same 800*480 crap.
What gives? And has anyone perhaps found a way to properly get a display's current mode through DXGI or a pointer for a wholly different yet functional approach to this here problem?
Life was easier back in the D3D9 days =)
-- Update
As it turns out, any FindClosestMatchingMode() call made on the IDXGIOutput instance belonging to the external monitor behaves differently (and in most cases plain wrong) compared to the internal display, even though their native resolutions are identical. To top it all off, other systems don't have this issue, yet I can't get around supporting this particular laptop, including its drivers.
Time for a good old setup dialog.
Not the best solution, but as I was constrained to these exact machines, I settled for getting the monitor's current resolution through GetSystemMetrics() (SM_CXSCREEN/SM_CYSCREEN) -- which admittedly only works for the primary monitor, though there are other ways -- and feeding this resolution into the ModeToMatch structure passed to FindClosestMatchingMode().
It then settles for the correct (desktop) resolution.
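For reference, the workaround boils down to something like this (a sketch; pOutput and pDevice stand in for the IDXGIOutput and device you already have):

DXGI_MODE_DESC modeToMatch;
ZeroMemory(&modeToMatch, sizeof(modeToMatch));
modeToMatch.Width  = (UINT)GetSystemMetrics(SM_CXSCREEN);   // primary monitor only
modeToMatch.Height = (UINT)GetSystemMetrics(SM_CYSCREEN);
modeToMatch.Format = DXGI_FORMAT_R8G8B8A8_UNORM;            // whatever your swap chain uses

DXGI_MODE_DESC closestMatch;
ZeroMemory(&closestMatch, sizeof(closestMatch));
HRESULT hr = pOutput->FindClosestMatchingMode(&modeToMatch, &closestMatch, pDevice);
// closestMatch now holds the (hopefully correct) desktop mode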
Better answers are very welcome of course ;)
First, I apologize for any silliness that might occur, as I'm not quite sure how to properly put this question.
I recently got curious about how some applications (like Midnight Commander) control text-mode output, forming a so-called "text-based user interface". Is that some evil magic with standard output operations, or something else that I'm unaware of? I did some poking around Google but didn't find anything of particular interest, and I hope someone here can point me in the right direction.
Thanks in advance,
~Insomnia Array
What you're looking for is NCurses - a library which uses terminal escape sequences to set color, cursor position, etc.
http://www.gnu.org/s/ncurses/
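To give you a taste, here's about the smallest complete ncurses program (compile with -lncurses):

#include <ncurses.h>

int main(void)
{
    initscr();                    /* enter curses mode, take over the terminal */
    start_color();
    init_pair(1, COLOR_YELLOW, COLOR_BLUE);

    attron(COLOR_PAIR(1));
    mvprintw(5, 10, "Hello from text mode!");   /* move cursor and print */
    attroff(COLOR_PAIR(1));

    refresh();                    /* flush changes to the screen */
    getch();                      /* wait for a keypress */
    endwin();                     /* restore the terminal */
    return 0;
}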
In addition to ncurses, take a look at S-lang.
http://www.s-lang.org/
Many sites and articles on getting widescreen monitors to work on notebooks in their native resolution mention something called the "Mode Removal Table" in the Video BIOS which specifically prevents certain video modes:
http://www.avsforum.com/avs-vb/showthread.php?t=947830
http://software.intel.com/en-us/forums/showthread.php?t=61326
http://forum.notebookreview.com/dell-xps-studio-xps/313573-xps-m1330-hdmi-hdmi-tv-issue-2.html
http://forums.entechtaiwan.com/index.php?action=printpage;topic=3363.0
Does such a thing really exist? The fix worked for me but I wanted to find out if I can read, modify, or work around this table. However I can't find any mention of it in the various VESA standards. Perhaps it actually goes by some other more cryptic name?
“Many sites and articles”? The first couple of dozen results are from you, and most of the rest are from that Intel article you mentioned or other people linking to that article.
You could always try asking someone who talks as though they know how to do it. There's another thread that discusses it—though it too has no information on the table, only a quick mention of it.
There does not seem to be any currently known way to read the GMA video BIOS. You would have to dump the BIOS and reverse-engineer it to figure out where the table is and how to interpret it. Unfortunately, even extracting it is difficult, since nobody seems to have had enough interest to create a tool to automate it. Looks like you've got even more reversing to do. (Technically, because the GMA is an integrated graphics adapter, you'll need to extract the video BIOS from the system BIOS, then extract the table from that.)
Good day,
I am working on a Stratix III FPGA which contains M9K block memories, the contents of which are conveniently initialised to zero on power-on. This suits my application very well.
Is there a way to reset the contents back to zero without power-cycling/reflashing/etc. the FPGA? There seems to be no such option in the MegaWizard Plug-In Manager, and I would like to avoid wasting a bunch of logic that just goes and sequentially writes zero to every address...
I have looked around and there is no reference to such a mechanism, but I thought I'd ask just in case someone knew a handy trick :] By the way I'm working in VHDL but I should be able to translate any Verilog.
Datasheet (does not contain the answer!) : http://www.altera.com/literature/hb/stx3/stx3_siii51004.pdf
Thanks in advance,
- Thomas
PS: This be my first post here, so if I've violated any etiquette please let me know :)
Sorry, the conventional ways to do that are:
to re-configure the FPGA (you could trigger that from within your hardware if you don't mind the whole thing "disappearing" while it reconfigures)
to explicitly write zeros in (as you already suggested)
At the wackier end of the solution space, I guess you could also wire something up to the JTAG port if you already have a microcontroller either in the FPGA or outside - you might be able to overwrite the RAM contents that way too.
This is a similar problem: Link
It was solved by calling GetAsyncKeyState(). While that's all fine and dandy, I need a Linux alternative. I need to know if a button is being held down, not just pressed (because of the keyboard buffer delay). Does anything like this exist in the OpenGL/Glut libraries, or will I have to look elsewhere?
I have never used Glut, but I know that many people will say SDL is better. I have used SDL and I like it a lot. It does everything Glut does and a lot more. In SDL, you can use SDL_PollEvent() to get key state without the keyboard buffer delay.
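For example (an SDL 1.2 sketch; running and MoveLeft() are placeholders for your own main loop and handler):

#include <SDL/SDL.h>

/* Pump the event queue, then read the key-state array directly --
   the array reflects what is physically held down right now,
   so there is no key-repeat delay involved. */
SDL_Event event;
while (SDL_PollEvent(&event)) {
    if (event.type == SDL_QUIT)
        running = 0;
}

Uint8 *keys = SDL_GetKeyState(NULL);
if (keys[SDLK_LEFT])        /* true for as long as the key is held */
    MoveLeft();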
Edit: I know almost nothing about Glut, but it looks like you can use glutKeyboardFunc to detect normal keys, and glutSpecialFunc for keys that do not generate ASCII characters (such as shift). I'm not sure if there is a better way, as this doesn't seem very nice.
You can detect when a key-press event occurs, record that state, and then listen for a key-release event.
As said, you will have to make your own state machine, which is easy, but I think you also need this callback:
http://pyopengl.sourceforge.net/documentation/manual/glutKeyboardUpFunc.3GLUT.xml
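Putting that together, a minimal sketch of such a state machine (note that glutIgnoreKeyRepeat() needs GLUT API version 4 / freeglut, so treat that line as optional):

#include <GL/glut.h>

static int keyDown[256];    /* our own key-state table */

static void OnKeyDown(unsigned char key, int x, int y) { keyDown[key] = 1; }
static void OnKeyUp(unsigned char key, int x, int y)   { keyDown[key] = 0; }

/* in your init code: */
glutKeyboardFunc(OnKeyDown);
glutKeyboardUpFunc(OnKeyUp);
glutIgnoreKeyRepeat(1);     /* report only real transitions, not auto-repeat */

/* then, in your idle/display callback, keyDown['a'] (etc.) tells you
   whether that key is held down right now */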