I would like to display an animated mushroom cloud (probably with particle emitters) as the result of an explosion in an iPad game I am developing.
I was wondering if anyone out there has already tried this and would be willing to give me some pointers.
Trainyard by Matt Rix has a little explosion animation that happens when a train crashes. You could ask him.
I have an LG G3 phone which, after some testing, turns out not to have a gyroscope (only an accelerometer). I've been testing Cardboard with it and have run into some issues.
Sometimes the camera suddenly jumps up to 90 degrees in some direction from where I was looking, and at its worst this happens every 10 seconds or so (usually about every 30 seconds). I did test the accelerometer output, and it didn't seem inaccurate enough to make the camera jump that much. I've looked around and found a couple of other users reporting the same issue.
The issue is present not only in the Unity Cardboard SDK demo but also in some VR apps; by "some" I mean that a couple of apps I've tried work perfectly fine (I can't remember which ones right now, but one was a roller coaster VR app). The issue is especially apparent in the Cardboard Labs app.
This jumpiness doesn't just destroy the immersion; it also induces disorientation as well as nausea when the jumping gets really bad. I had a hard time finishing the Cardboard Labs tests because of it.
So, last but not least: can the head-tracking code be optimized for phones without a gyro so that these experiences are improved? If not on Google's side of the SDK, is there anything I can do to the SDK to help minimize this effect?
Ok, so after some testing I seem to have it fixed now.
The cause seems to be that my device is rooted and I often fiddle with its CPU frequency, which somehow messes up the motion tracking. It's easily fixed by rebooting with stock clocks.
I'm not sure whether the motion sensor polling fetches incorrect data when it reads too fast, or whether the CPU just can't keep up, but either way I seem to be stuck with stock clocks if I'm going to play VR games. I'm leaving this question here for anyone who runs into the same issue.
EDIT: After some more testing, the issue reappeared after a while. I'm guessing an app or service might be the problem here, because after a restart it's fixed again. I'll post more when I've tested it further.
I'm looking for an example of a multi-threading implementation using the game toolkit. I have the MultiCube example, but that is for WinForms and I use WPF, and I can't use the toolkit's tools from Direct3D11 because I need an instance of the GraphicsDevice. The MultiCube example displays nothing but a black screen; I tried it on several computers. My video card doesn't support command lists, but I don't know if that has anything to do with it. I was also wondering how many models SharpDX can handle, because I have to draw hundreds of small scaffold couplers, and after adding about 100 on the default GraphicsDevice the application slows down and locks up. Any help would be appreciated.
Regards,
Haris
I was looking for the same thing but couldn't find any examples. I tried converting the MultiCube example to use the toolkit and got it basically working; it's still very messy and needs optimizing, but at least it renders.
https://github.com/PlehXP/SharpDX-Samples/tree/MultiCubeToolkit/Toolkit/WindowsDesktop/MultiCube
I'm looking into making a project with the Kinect to let my Grandma control her TV without being daunted by the remote. So I've been looking into basic gesture recognition. The aim is to, say, turn the TV's volume up by sending the right IR code to the TV when the program detects that the right hand is being "waved."
The problem is that no matter where I look, I can't seem to find a Linux-based tutorial which shows how to do something as a result of a gesture. One other thing to note: I don't need any GUI apart from the debug window, as a GUI would slow my program down a fair bit.
Does anybody know of something that would let me constantly check for a hand gesture in a loop and, when one is detected, control something, without the need for any GUI at all, and on Linux? :/
I'm happy to go for any language but my experience revolves around Python and C.
Any help will be greatly appreciated.
Thanks in advance
Matt
In principle, this concept is great, but the number of features a remote offers is going to be hard to replicate with a set of gestures that an older person can memorize. They will probably be even less incentivized to do this (learning new things sucks) if they already have a solution (the remote), even though they really love you. I'm just warning you.
I recommend you use OpenNI and NITE. Note that the current version of OpenNI (2) does not have Kinect support. You need to use OpenNI 1.5.4 and look for the SensorKinect093 driver. There should be some gesture code that works for that (googling OpenNI Gesture yields a ton of results). If you're using something that expects OpenNI 2, be warned that you may have to write some glue code.
The basic control set would be Volume +/-, Channel +/-, Power on/off. But that will be frustrating if she wants to go from Channel 03 to 50.
I don't know how low-level you want to go, but a really, REALLY simple gesture recognizer could look at horizontal and vertical swipes of the right hand exceeding an (averaged) velocity threshold. Be warned: detected skeletons can get really wonky when people are sitting (that's actually a bit of what my PhD is on).
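To make that concrete, here is a minimal sketch of that velocity-threshold idea, in C# just to have something compilable (the same logic ports easily to Python or C). Everything here is made up for illustration: OnHandPosition is a hypothetical entry point you would feed with right-hand positions from whatever tracker you end up using (OpenNI/NITE skeleton or similar), and the thresholds will need tuning for your setup.

```csharp
using System;

enum Swipe { None, Left, Right, Up, Down }

class SwipeDetector
{
    const float VelocityThreshold = 1.2f;   // metres per second; tune for your setup
    const float Smoothing = 0.7f;           // exponential moving-average factor
    const double CooldownSeconds = 0.8;     // ignore samples briefly after a swipe fires

    float _avgVx, _avgVy;                   // smoothed velocity estimate
    float _lastX, _lastY;
    DateTime _lastSample = DateTime.MinValue;
    DateTime _lastSwipe = DateTime.MinValue;

    public event Action<Swipe> SwipeDetected;

    // Call this with every new right-hand position (in metres) from the tracker.
    public void OnHandPosition(float x, float y, DateTime timestamp)
    {
        if (_lastSample != DateTime.MinValue)
        {
            float dt = (float)(timestamp - _lastSample).TotalSeconds;
            if (dt > 0)
            {
                // Instantaneous velocity, then exponential smoothing to reject jitter.
                float vx = (x - _lastX) / dt;
                float vy = (y - _lastY) / dt;
                _avgVx = Smoothing * _avgVx + (1 - Smoothing) * vx;
                _avgVy = Smoothing * _avgVy + (1 - Smoothing) * vy;

                bool coolingDown = (timestamp - _lastSwipe).TotalSeconds < CooldownSeconds;
                if (!coolingDown)
                {
                    Swipe swipe = Classify(_avgVx, _avgVy);
                    if (swipe != Swipe.None)
                    {
                        _lastSwipe = timestamp;
                        var handler = SwipeDetected;
                        if (handler != null) handler(swipe);
                    }
                }
            }
        }
        _lastX = x;
        _lastY = y;
        _lastSample = timestamp;
    }

    // Pick the dominant axis and check it against the threshold.
    static Swipe Classify(float vx, float vy)
    {
        if (Math.Abs(vx) >= Math.Abs(vy))
        {
            if (vx > VelocityThreshold) return Swipe.Right;
            if (vx < -VelocityThreshold) return Swipe.Left;
        }
        else
        {
            if (vy > VelocityThreshold) return Swipe.Up;
            if (vy < -VelocityThreshold) return Swipe.Down;
        }
        return Swipe.None;
    }
}
```

You would then hook SwipeDetected up to whatever sends the IR codes, e.g. Swipe.Up meaning volume up, Swipe.Right meaning channel up, and so on.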
I am animating a small space ship (derived from UIView), and periodically (while it is animating) I send it a PointF to check whether that point is near the space ship's current position.
However, when I read out the View's Frame position, it keeps returning the position from before the animation started.
I think this is by design, but it is causing me big problems, since the space ship(s) should move independently along Paths and doing this by hand is very tricky for me.
Is there another way - and/or has anyone some sample code?
Not sure of a workaround for your issue, but I have some suggestions on game development for iOS.
Your problem is one of the reasons why using GUI frameworks like UIKit/CoreGraphics for games isn't a good idea: both for performance reasons and because they simply aren't designed for it.
If you are looking for a simple framework for making games on iOS, have you looked at MonoGame? If you are doing lots of animations, we also use XNA Tweener along with MonoGame to get some lifelike animations.
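For illustration, here is a minimal sketch of why the proximity check becomes trivial once the ship's position lives in your own update loop, as it would in MonoGame. The Ship class, its waypoint queue, and the numbers are all made up for this example; only Vector2 and GameTime come from MonoGame/XNA.

```csharp
using System.Collections.Generic;
using Microsoft.Xna.Framework;

// Moves a ship along a list of waypoints and lets you ask, at any moment,
// whether a given point is near the ship's *current* position.
class Ship
{
    public Vector2 Position;
    public float Speed = 120f;              // pixels per second

    readonly Queue<Vector2> _waypoints = new Queue<Vector2>();

    public Ship(Vector2 start) { Position = start; }

    public void AddWaypoint(Vector2 point) { _waypoints.Enqueue(point); }

    // Call once per frame from Game.Update.
    public void Update(float dtSeconds)
    {
        if (_waypoints.Count == 0) return;

        Vector2 target = _waypoints.Peek();
        Vector2 toTarget = target - Position;
        float distance = toTarget.Length();
        float step = Speed * dtSeconds;

        if (step >= distance)
        {
            Position = target;              // reached this waypoint
            _waypoints.Dequeue();
        }
        else
        {
            Position += Vector2.Normalize(toTarget) * step;
        }
    }

    // The check the question asks for: is this point near where the ship is *now*?
    public bool IsNear(Vector2 point, float radius)
    {
        return Vector2.DistanceSquared(Position, point) <= radius * radius;
    }
}
```

From Game.Update you would call ship.Update((float)gameTime.ElapsedGameTime.TotalSeconds) and then ship.IsNear(targetPoint, 20f) whenever you need the check; the position is always the real, current one because you own it.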
PS - check out our game here.
I'm making a 3D game in XNA. I want to make a cube of water (part of a mob) that will splash a little, but I can't think of a way to render this kind of cube, or of a shader with this effect. Any suggestions?
(BTW, sorry for the bad English.)
Take a look at this post:
http://forums.create.msdn.com/forums/p/31417/180167.aspx#180167
Hope it helps you!