I am trying to create an app to stream the screen to a Chromecast.
I followed the tutorials and was able to create the Cast button and connect to a device.
However, what is the next step to actually cast?
Does anyone know how to do this using the latest Android Sender SDK 17.1.0?
Thanks
You should register your receiver app and a Cast device for testing at https://cast.google.com. Anyway, I assume you already have your MediaRoute menu item created.
Now you have to use RemoteMediaClient to load some content (video, audio, ...) onto your Cast device. What exactly is your problem?
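As a rough sketch (not a drop-in implementation), loading content once a Cast session is connected looks roughly like this; the URL and content type below are placeholders, not anything from your project:

import com.google.android.gms.cast.MediaInfo;
import com.google.android.gms.cast.MediaLoadRequestData;
import com.google.android.gms.cast.framework.CastContext;
import com.google.android.gms.cast.framework.CastSession;
import com.google.android.gms.cast.framework.media.RemoteMediaClient;

// Call this from an Activity after the user has connected to a Cast device.
private void castVideo() {
    CastSession session = CastContext.getSharedInstance(this)
            .getSessionManager()
            .getCurrentCastSession();
    if (session == null || !session.isConnected()) {
        return; // no active Cast session yet
    }

    // Placeholder media URL and MIME type -- replace with your own content.
    MediaInfo mediaInfo = new MediaInfo.Builder("https://example.com/video.mp4")
            .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
            .setContentType("video/mp4")
            .build();

    RemoteMediaClient remoteMediaClient = session.getRemoteMediaClient();
    if (remoteMediaClient != null) {
        remoteMediaClient.load(new MediaLoadRequestData.Builder()
                .setMediaInfo(mediaInfo)
                .setAutoplay(true)
                .build());
    }
}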
I have an app that needs to cast (mirror) the screen to a Chromecast or a similar device. I notice that the SDK 28 phone I use for testing already has this feature built into the Android OS.
Could I simply invoke (control) this Android feature from within my app via an API? I could not find out how to do it.
Furthermore, the material I've read indicates that if I am using the default Cast receiver, I can just use CastMediaControlIntent.DEFAULT_MEDIA_RECEIVER_APPLICATION_ID ("CC1AD845").
Is this correct?
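For reference, this is how I understand the default receiver ID would be wired into an OptionsProvider (just a sketch; the class name is mine, and it would still need to be registered in the manifest):

import android.content.Context;
import com.google.android.gms.cast.CastMediaControlIntent;
import com.google.android.gms.cast.framework.CastOptions;
import com.google.android.gms.cast.framework.OptionsProvider;
import com.google.android.gms.cast.framework.SessionProvider;
import java.util.List;

// Hypothetical provider class; it has to be referenced in AndroidManifest.xml via the
// com.google.android.gms.cast.framework.OPTIONS_PROVIDER_CLASS_NAME meta-data entry.
public class DefaultCastOptionsProvider implements OptionsProvider {

    @Override
    public CastOptions getCastOptions(Context context) {
        // Uses the default media receiver app ID ("CC1AD845").
        return new CastOptions.Builder()
                .setReceiverApplicationId(
                        CastMediaControlIntent.DEFAULT_MEDIA_RECEIVER_APPLICATION_ID)
                .build();
    }

    @Override
    public List<SessionProvider> getAdditionalSessionProviders(Context context) {
        return null;
    }
}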
I also found a project on GitHub, "android_library_screen_mirroring_chromecast-master", created two years ago, that uses "CastScreenPresentation", but when I rebuilt it with "com.android.tools.build:gradle:3.5.0" it did not work.
I am currently developing my first HoloLens 2 application. In it, I create an Interactable with a box collider at runtime. In the Unity editor I can grab it and move it around, but on the HoloLens nothing happens when I try to grab it. Does anyone have any leads on why this could be?
EDIT: The example app is installed and hand tracking works as intended, so the issue does not seem to be caused by faulty sensors.
Properties (at runtime, after instantiating):
It is recommended to double-check the Unity project settings to troubleshoot. Please follow this doc to declare the appropriate capabilities, and then configure the XR plugin: for the XR SDK, follow Getting started with MRTK and XR SDK; for Legacy XR, have a look at For Legacy XR.
After this, please try deploying the scene Assets/MRTK/Examples/Demos/HandTracking/Scenes/HandInteractionExamples.unity to your device. Do you see the same behavior? This will help narrow down whether it's an issue with your configuration or an issue with your scene.
Besides, we always recommend the latest Unity LTS (Long Term Support) stream as the best version for developing MR apps; the current recommendation is Unity 2019.4: https://unity3d.com/unity/qa/lts-releases
I want to bind the GVRSDK.ios (Google Cardboard SDK) library into my Xamarin project, but I am running into some issues.
When I try to create the binding using Objective Sharpie, only "GoogleToolboxForMac" and "GTMSessionFetcher" get bound; two sets of API definition and struct files are created.
The library I want is GVRSDK, but it is not being bound. Can someone point me in the right direction? This is my first binding for iOS.
Commands used:
sharpie pod init ios GVRSDK
sharpie pod bind
I read the documentation on the site, but I still can't get my head around it. Most of the samples out there use a simple library with no dependencies, which does not apply to this case.
EngagementReachAgent.Initialize(); does not exist in the current context
How can I solve this error? I can't deliver push notifications, and I can't find a fix for this error anywhere. I tried everything explained here: https://azure.microsoft.com/en-us/documentation/articles/mobile-engagement-unity-android-get-started/
You did not follow the instructions completely. You need to download and import the Azure plugin package into Unity; you can get it here. Once that is done, the EngagementReachAgent class and its Initialize() method will be present in your Unity project. They are not there now, which is why you can't use them.
After getting out of a procrastination rut, I've finally gotten to the item on the project's to-do list that says "Run on a virtual device to see why it crashes".
My project is a libGDX application that I plan to port to various platforms, the two most important being desktop and Android.
During development I've used the desktop launcher exclusively, as it's a lot easier to fire up when just checking minor things.
I did build an .apk at one point just to see if it would run out of the box, but it didn't. Now that I've tried it on a virtual device, this is what the log says:
com.badlogic.gdx.utils.GdxRuntimeException: Couldn't load file: rifleman.png
The same goes for every other asset my game uses. The files are all placed in projectname/android/assets, as I remember learning way back that this was the way to go.
The virtual device I tried is the stock Nexus 5, but I also ran the .apk on my actual phone and got a crash right after start, so I assume the problem is the same and therefore unrelated to the test platform.
I am not using an AssetManager yet, as implementing one is scheduled for after the basic core mechanics are up and running. In the meantime I'm loading assets like this: new Pixmap(new FileHandle("rifleman.png"));
Using Android Studio 2.2 on Linux Mint 17 Cinnamon.
Please comment if more info is needed.
Use Gdx.files.internal() instead of new FileHandle().
From the FileHandle(String) method documentation:
Creates a new absolute FileHandle for the file name. Use this for
tools on the desktop that don't need any of the backends.
Do not use this constructor in case you write something cross-platform.
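A minimal sketch of the fix, assuming rifleman.png sits in android/assets as described in the question:

import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.Pixmap;

// Wrong: new FileHandle("rifleman.png") resolves an absolute/desktop path,
// which is why the Android build can't find the file.
// Pixmap pixmap = new Pixmap(new FileHandle("rifleman.png"));

// Right: Gdx.files.internal() resolves the path through the platform backend
// (android/assets on Android, the working directory/classpath on desktop).
Pixmap pixmap = new Pixmap(Gdx.files.internal("rifleman.png"));

When you later switch to an AssetManager, the same internal-style paths should keep working, since it resolves internal files by default.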
I just want to know what can make IAP throw this kind of exception when testing in the emulator: "com.nokia.mid.payment.IAPClientPaymentException: Application ID initialization fails".
Another related question: the documentation says to remove the TEST_MODE.TXT file when pointing at the live server. I removed the file, but now I get a SecurityException. What's the workaround for this? I am using the Java SDK 2.0 and the emulator that came with it.