I want to track whether I'm making a fist or holding my hand open, like this example shows:
http://channel9.msdn.com/coding4fun/kinect/Simple-Hand-Tracking-with-MS-Kinect-SDK--WPF
My problem is that I'm working in C++, and that example uses an older version of the Kinect SDK.
First I have the problem of displaying just the hand in the depth view, and then I'll have the problem of detecting whether it is a fist or an open hand.
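One common approach to the detection part is to isolate the hand region in the depth image and count the convexity defects of the hand contour: spread fingers produce several deep valleys, a fist produces almost none. Below is a rough sketch of that idea, assuming OpenCV is available and that a binary mask of the hand has already been built (for example by keeping only depth pixels within a small range around the tracked hand position); the thresholds are heuristic placeholders, not something the linked example provides.

#include <opencv2/opencv.hpp>
#include <cmath>
#include <vector>

// Sketch: classify a binary hand mask as fist vs. open hand by counting
// deep convexity defects (the valleys between spread fingers).
// handMask is assumed to already isolate the hand, e.g. by keeping only
// depth pixels within a small range around the tracked hand joint.
bool isOpenHand(const cv::Mat& handMask)
{
    std::vector<std::vector<cv::Point> > contours;
    cv::findContours(handMask.clone(), contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    if (contours.empty())
        return false;

    // Take the largest contour as the hand outline.
    size_t largest = 0;
    for (size_t i = 1; i < contours.size(); ++i)
        if (cv::contourArea(contours[i]) > cv::contourArea(contours[largest]))
            largest = i;

    // Convex hull (as point indices) and its defects.
    std::vector<int> hull;
    cv::convexHull(contours[largest], hull, false, false);
    std::vector<cv::Vec4i> defects;
    if (hull.size() > 3)
        cv::convexityDefects(contours[largest], hull, defects);

    // Count defects that are deep relative to the hand size.
    double handSize = std::sqrt(cv::contourArea(contours[largest]));
    int deepDefects = 0;
    for (size_t i = 0; i < defects.size(); ++i)
        if (defects[i][3] / 256.0 > 0.3 * handSize)  // defect depth is stored in 1/256 pixel units
            ++deepDefects;

    return deepDefects >= 3;  // heuristic: 3+ deep valleys suggests spread fingers
}

How robust a fixed threshold like this is will depend on the distance to the sensor and the depth resolution, so it would need tuning for a real setup.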
I'm looking forward to your help.
Related
I have been trying to build a TV application using an SD.
I have features like an image gallery and a video player running.
However, I also want to add a virtual on-screen keyboard that works with the up/down/left/right arrow keys. Can somebody help me with how to get started?
When I wanted to do this with my Vestel (Polaroid-branded) smart TV, which uses "Opera for TV devices" as its HbbTV browser, I found that I didn't need to.
I just used HTML text fields and input types where needed, and as soon as I clicked into them, the browser/OS popped up a built-in onscreen keyboard for me.
However, I did do some research to see if I needed one, and on some devices you do. While I never actually implemented it (my app was just for my own use), the BBC Television Application Layer (TAL): https://github.com/bbc/tal has pretty good keyboard support.
Another one that might be worth looking at is the Mautilus SDK: https://github.com/mautilus/sdk
Be aware, though, that both are horribly convoluted and use quite complex code where it's really not needed.
I'm using Unity3D and I'm having issues with the keyboard when I 'build' the project.
When I run the game within the Unity Editor, the input works fine. However, when I build the project and test it, I have no directional input whatsoever. The mouse works fine, the game registers keystrokes (the Esc key works), but the player won't move.
I'm using Input.GetAxisRaw("Vertical") and Input.GetAxisRaw("Horizontal").
I researched the issue on the web, but I'm still stuck. The only solution I found (in various links) mentions a problem with DirectInput and says you should remove the key "Input" from [HKEY_CURRENT_USER\Software\Unity\Player] in the Windows registry, but that doesn't seem to work for me.
Has anybody else fought this problem? Any working solutions? Am I doing the registry trick wrong?
You should check out the Input Manager in Edit -> Project Settings -> Input. Have you tried using GetAxis instead of GetAxisRaw? I also suggest trying, e.g., Input.GetKey(KeyCode.A).
I've figured out what was troubling me.
Input.GetAxisRaw("Vertical") and Input.GetAxisRaw("Horizontal") were fine the whole time. There was some code in the script that worked fine in the Editor, but when running as a standalone build it produced a value of 0 (the calculation depends on the delta time, so I assume the delta time in the compiled build is much smaller than in the Unity Editor), and thus the character didn't move at all.
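The failure mode is easy to illustrate: if a per-frame step is rounded or truncated to an integer somewhere, it produces a nonzero step at the editor's larger frame time but collapses to zero once the frame time gets smaller. A hypothetical sketch (plain C++ just to show the arithmetic; the names and numbers are made up, not taken from the original script):

#include <cstdio>

// Hypothetical illustration: truncating a per-frame step to an integer
// gives a nonzero step at a larger delta time, but 0 at the smaller
// delta time of a faster compiled build, so the character never moves.
static int stepFor(float speed, float deltaTime)
{
    return static_cast<int>(speed * deltaTime);  // bug: truncates toward zero
}

int main()
{
    std::printf("dt = 0.033 (editor-like):     step = %d\n", stepFor(60.0f, 0.033f)); // prints 1
    std::printf("dt = 0.008 (standalone-like): step = %d\n", stepFor(60.0f, 0.008f)); // prints 0
    return 0;
}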
I recently got a Mac and downloaded Xcode 4.2 from the App Store. I am trying to get to grips with iPhone development, but I am having real trouble. In all the tutorials I find online, when they create a project they have a Resources folder, and inside it a .xib file that lets them use Interface Builder.
This does not appear in 4.2, which makes it hard to follow the majority of tutorials that rely on the Resources folder. How do I get this back? Or how do I access this file in 4.2?
Also, could someone explain to me where the objects list is? I started following this tutorial:
http://maybelost.com/2011/10/tutorial-storyboard-in-xcode-4-2-with-navigation-controller-and-tabbar-controller-part1/
as it seemed to be using Xcode 4.2, but when I get down to the storyboard section, it says
"Of course, we really want another tab on there so we can see the switching between the two – so lets drag in another Navigation Controller from the Utilities (objects) list and plonk it down somewhere. "
Except I cannot find this objects list. How do I open it? What am I missing?
Sorry if these questions seem very basic, I am new to both Macs and iPhones. Android development seems a HELL of a lot easier from what I can see so far.
Thanks in advance for any help.
I would also be grateful if anyone could point me in the direction of any good, up-to-date tutorials.
I have a post on http://www.armandvanderwalt.co.za that will give you a good understanding of how most things fit together. I don't use Interface Builder at all, since it only makes the app bigger. Have a look at my blog post; I still need to do the styling and add more posts, but it is a nice beginner guide.
Most posts you are finding still use Xcode 3; that's why you can't find certain things.
Also have a look at http://www.raywenderlich.com
What they are referring to as the objects list is found in the bottom-right corner of Interface Builder in Xcode 4. In Xcode 4, Interface Builder is part of Xcode and no longer a separate application, so whenever you open a XIB file, Interface Builder opens automatically.
Recently I have had a problem playing my sound effects using CocosDenshion. The sound plays in the iPhone Simulator, but not on my device. I am not sure what I am doing wrong. Of course I checked that my speakers still work; they do when I use other apps or the iPod library.
I just use this simple code:
[[SimpleAudioEngine sharedEngine] playEffect:@"button.wav"];
I double checked the name of the file and it indeed is button.wav.
I hope someone can help me out.
I had the same problem recently and I've figured it out.
I'm using Xcode 4.3.2 and there seems to be a bug.
Not all resources you add (drag) into the project are added to the target by default.
When you add a resource, make sure that (apart from checking the "copy items into destination" checkbox) you also check the "add to target" checkbox.
You can check whether the resources have been added to the bundle by clicking on the target icon. In the Build Phases tab, check whether the sound files are included under Copy Bundle Resources.
If not, add them there manually with the + button.
You might try preloading the sound effects at some earlier point in the program; otherwise the engine has to load a sound effect before it can play it. You could create a splash-screen scene that runs first, loads all your assets, and then transitions to the first "actual" scene.
SimpleAudioEngine *engine = [SimpleAudioEngine sharedEngine];
[engine preloadEffect:@"Example-Sound.caf"];
Not sure whether it is the case here, but keep in mind that if your file is named "Button.wav" and you call playEffect:@"button.wav", it will play on the simulator but not on the device. This happens because the file system on your desktop Mac is case-insensitive, while on iOS devices it is case-sensitive.
I had the same problem and found this question, but I fixed it myself.
If you reinstall iOS on an iPad, the side switch (above the volume buttons) defaults to acting as a mute switch. I had changed it to act as a rotation lock while the device was muted, which left everything muted, so nothing in cocos2d could play sound any more. Once I switched the setting back and turned mute off, SimpleAudioEngine worked fine. It is an iOS system quirk.
Make sure to check the audio file itself. A mono, 16-bit, uncompressed WAV with a sample rate of 44100 Hz should 'just work'. I recommend grabbing a sample .wav from an online Cocos2D example, such as Cowbell.wav, and trying to play that file.
We've got some in-house applications built in MFC, with OpenGL drawing routines. They all use the same code to draw on the screen and either print the screen or save it to a JPEG file. Everything's been working fine in Windows XP, and I need to find a way to make them work on Vista.
In three of our applications, everything works. In the remaining one, I can get the window border, title bar, menus, and task bar, but the interior never shows up. As I said, these applications use the exact same code to write to the screen and capture the window image, and the only difference I see that looks like it might be relevant is that the problem application uses the MFC multiple document interface, while the ones that work use the single document interface.
Either the answer isn't on the net, or I'm worse at Googling than I thought. I asked on the MSDN forums, and the only practical suggestion I got was to use GDI+ rather than GDI, and that made no difference. I have tried different things with every part of the code that captures and prints or saves, given a pointer to the window, so apparently it's a matter of the window itself. I haven't rebuilt the offending application using SDI yet, and I really don't have any other ideas.
Has anybody seen anything like this?
What I've got is four applications. They use a lot of common code, and share the actual .h and .cpp files, so I know the drawing and screen capture code is identical.
There is a WindowtoDIB() routine that takes a *pWnd, a source rectangle, and a destination size. It looks like very slightly adapted Microsoft code, and I've found other functions in this file on the Microsoft website. Of my four applications, three handle this just fine, but one doesn't. The most obvious difference is that the problem one is MDI.
It looks to me like the *pWnd is the problem. I'm not an MFC guru by a long shot, and it seems to me that the issue may be that we have one window setup in the SDI applications and more than one in the MDI one. I may be passing the wrong *pWnd to the function.
In the meantime, it has started working properly on the 64-bit Vista test machine, although it still doesn't work on the 32-bit Vista machine. I have no idea why. I haven't changed anything since the last tests, and I didn't think anybody else had. (On the 32-bit version, the Print Screen key works as expected, but it does not save the screen as a JPEG.)
Your question title mentions screen capture, but your actual question doesn't. Please elaborate more clearly. Is the problem that you can do a screen capture of three of your applications, but not the fourth? You can use different screen-capture software that can capture OpenGL/DirectX windows. Those surfaces are handled directly by the window manager and won't show up with a simple PrtScn.
Switching to GDI+ won't solve it, nor will switching to SDI.
If it's the content of the CView that you want, then yes, that should be the right one. If it's the content of the whole main window (at least its contents, without the toolbar(s) and status bar), then you should pass it the CMainFrame (that's the default name, which may have been changed; it's the one derived from CMDIFrameWnd).
Can you post the code of WindowToDIB()? I've just tried it and It Works For Me (TM), but without OpenGL code in the view. Try passing the following windows to your WindowToDIB() function:
CMainFrame* mainfrm = static_cast<CMainFrame*>(::AfxGetMainWnd());
- mainfrm
- mainfrm->MDIGetActive()
- mainfrm->MDIGetActive()->GetActiveView()
and see what you get.
The contents of each window are DirectX surfaces that are only assembled by the window manager on the graphics card. You won't be able to capture them unless you switch off the new interface (DWM) or code specifically for screen capture from the DWM.
Wikipedia has a good description of the Desktop Window Manager (DWM)
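If temporarily switching composition off around the capture is acceptable, the Vista-era DWM API lets you do that from code. A minimal sketch, assuming you link against dwmapi.lib and are targeting Vista/7 (this call is deprecated on Windows 8 and later), with the actual capture left out:

#include <windows.h>
#include <dwmapi.h>
#pragma comment(lib, "dwmapi.lib")

// Turn off desktop composition around a classic GDI capture so the
// window contents are drawn where BitBlt-style code can see them.
void CaptureWithCompositionDisabled()
{
    BOOL wasEnabled = FALSE;
    DwmIsCompositionEnabled(&wasEnabled);

    if (wasEnabled)
        DwmEnableComposition(DWM_EC_DISABLECOMPOSITION);

    // ... run the existing WindowToDIB()/print/save code here ...

    if (wasEnabled)
        DwmEnableComposition(DWM_EC_ENABLECOMPOSITION);
}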
Sorry, I still don't understand. Are you trying to get the Print Screen key to work in all four applications? Or are you trying to get the WindowtoDIB() function to work, which takes a 'screenshot' (from within your own application) of the application itself, so that it can be saved as an image file?
Also, what do you mean by 'the Print Screen key works as expected, but it does not save the screen as a JPEG'? Print Screen only copies to the clipboard; what happens when you paste into Paint?
If your WindowtoDIB() function only 'captures' the window you pass to it, then yes, your MDI child windows are not going to show up.
We eventually solved this by creating a different OpenGL context, and drawing everything to that. We gave up on the screen capture.
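That kind of approach usually ends with reading the pixels back from GL itself rather than from the window. A minimal sketch of the read-back step, assuming the relevant rendering context is current and the scene has just been drawn (row flipping and the JPEG writing are left out; this is an illustration, not the actual code from the applications above):

#include <windows.h>
#include <GL/gl.h>
#include <vector>

// Read back the framebuffer that was just rendered, as tightly packed RGB.
// OpenGL returns the rows bottom-up, so flip them before saving an image.
std::vector<unsigned char> ReadBackFramebuffer(int width, int height)
{
    std::vector<unsigned char> pixels(static_cast<size_t>(width) * height * 3);

    glPixelStorei(GL_PACK_ALIGNMENT, 1);  // no per-row padding
    glReadBuffer(GL_BACK);                // read the buffer we just drew into
    glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, &pixels[0]);

    return pixels;
}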