In my application, I use AVPlayer to play videos. However, I do not use FairPlay.
I tried to use RPScreenRecorder to detect whether screen recording is on:
[RPScreenRecorder sharedRecorder].isRecording
but that does not work.
Does anyone know how to detect whether screen recording is on?
Get the window of your app delegate and check isCaptured:
[appDel.window.screen isCaptured]
That will indicate whether anything is "using" the screen: RPScreenRecorder, AirPlay, or mirroring.
RPScreenRecorder is normally not compatible with AVPlayer. That might be the cause of what you are experiencing.
It is stated in the documentation here:
ReplayKit is incompatible with AVPlayer content.
I am trying to make an application that will "trigger" when a certain image is shown on screen. I have the app streaming the camera to the screen inside the app using AVCaptureSession. Now I was wondering how to make some sort of EventHandler for each frame of the video, in which I will check to see if the image contains one of my triggers. Does anyone know the best way to handle this? I could not find any resources on an OnFrame EventHandler for AVCaptureSession.
This is the tutorial I followed for showing the camera on screen:
https://github.com/messier16/FullCameraPage/blob/master/FullCameraApp.iOS/CameraPageRenderer.cs
Any advice is awesome. Thank you!
I ended up using a while loop inside of an overridden ViewWillAppear() to grab the frame consistently.
I have an XPage that takes too long to load, so the client asked me to add a loading indicator.
I searched and found XSP.startAjaxLoading(), which I put in the onClientLoad event of the Custom Control.
But now I don't know where I should put XSP.endAjaxLoading() to make the loading screen go away.
I tried to use view.postScript("XSP.endAjaxLoading()") in afterRenderResponse and beforeRenderResponse, since this command is CSJS, but it doesn't work.
Thanks in advance.
I think you want to put it in the onComplete event. That can be difficult to find. You typically need to use the outline control to find it.
I have a video demo on NotesIn9 that has an example on this.
http://www.notesin9.com/2016/02/19/notesin9-188-adding-a-please-wait-to-xpages/?unique=http%3A//www.notesin9.com/2016/02/19/notesin9-188-adding-a-please-wait-to-xpages/
Your attempt (view.postScript) works only with full/partial updates and does not work for page loading.
You have used onClientLoad, which is executed when your page has finished loading. I guess you get the Ajax animation after a while and it won't stop.
You should make a preload screen: a very simple XPage which starts the animation and does not bother to turn it off. In its onClientLoad event, redirect to your slow XPage. That will discard the animation.
I'd highly recommend using the Standby Dialog XSnippet https://openntf.org/XSnippets.nsf/snippet.xsp?id=standby-dialog-custom-control. I use it as a standard in all XPages applications.
I used this answer as solution: https://stackoverflow.com/a/35481981/5339322
I saw it a few days ago; what made me think twice is that to use it I would need to know what was making my XPage slow. I ran some tests and found out: it was a call to a method in the afterRestoreView event. I then moved it to the onClientLoad event and used the solution in the answer cited above.
But I'm afraid I'll have to keep an eye on it: if someone adds slow code to one of the other XPages events, I'll have to move it again, if that's possible; if it's not, I'll figure out something different.
Thanks for all the answers and comments.
I'm familiarizing myself with the Google Cast SDK by building a small test application, following the Cast SDK for Android guide. I've created a standard ActionBarActivity-based app as the guide suggests (even though ActionBarActivity is deprecated... shrug). I've added all the library dependencies, added the necessary XML to AndroidManifest.xml and menu_main.xml, and am using the MediaRouteActionProvider to handle device detection and to show/hide the Cast button in the action bar. All is working well, and the "Connect to device" box appears as it should when I tap the Cast button.
However, when I tap the Cast button again to disconnect, the "Stop Casting" box appears but it is unusually small.
The box is supposed to be wide enough to show the volume bar -- in my super-small version, the volume slider does in fact show up but it is unusably small. Compare to what it is supposed to look like, for example as in the YouTube app shown below.
Any idea what could be going on here? I am using basically the exact code that the Cast SDK for Android guide uses, the only addition being some custom Buttons with onClick listeners to control the media playback and to start casting several different test streams (both audio and video).
This is due to changes that were made to the MediaRouter support library and will be fixed in future updates to that library. If absolutely needed, the current workaround is to override that dialog.
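In case it helps before the library fix lands, here is a rough Java sketch of that workaround: supplying your own controller dialog through a MediaRouteDialogFactory so you can adjust its theme or size. The ThemedDialogFactory and ThemedControllerDialogFragment names are placeholders I made up for illustration, not part of the Cast or MediaRouter APIs.

import android.content.Context;
import android.os.Bundle;
import android.support.v7.app.MediaRouteControllerDialog;
import android.support.v7.app.MediaRouteControllerDialogFragment;
import android.support.v7.app.MediaRouteDialogFactory;

// Factory that hands the MediaRouteActionProvider our own controller dialog fragment.
public class ThemedDialogFactory extends MediaRouteDialogFactory {
    @Override
    public MediaRouteControllerDialogFragment onCreateControllerDialogFragment() {
        return new ThemedControllerDialogFragment();
    }

    public static class ThemedControllerDialogFragment extends MediaRouteControllerDialogFragment {
        @Override
        public MediaRouteControllerDialog onCreateControllerDialog(Context context, Bundle savedInstanceState) {
            // Return a customized dialog here (for example one created with an explicit
            // theme, or a subclass that fixes the width). The stock dialog below only
            // marks where the hook is.
            return new MediaRouteControllerDialog(context);
        }
    }
}

Attach the factory where you set up the Cast button, for example in onCreateOptionsMenu (adjust the menu item id to whatever your menu_main.xml uses):

MediaRouteActionProvider provider = (MediaRouteActionProvider)
        MenuItemCompat.getActionProvider(menu.findItem(R.id.media_route_menu_item));
provider.setDialogFactory(new ThemedDialogFactory());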
I am using this line of code to start the Video Component:
videoComponent = VideoComponent.createVideoPeer("capture://video");
The code I have works perfectly on a Nokia phone but I have another phone by OLG and this line always fails. Both "capture://video" and "capture://image" don't work.
Does anyone know how to find out what's the proper string to put there, or all the possibilities?
I would search it but I don't know what to type into Google.
VideoComponent in LWUIT 1.5 had many bugs and never really worked well, and since LWUIT is effectively abandoned it will probably never be fixed. With Codename One we migrated to a more traditional Media approach; I would recommend migrating to Codename One.
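For what it's worth, if you stay on plain LWUIT/MMAPI, the device can at least be asked what it claims to support via the standard JSR-135 system properties and Manager queries. A minimal sketch (the CaptureSupportProbe class name is mine; the output varies per handset, so treat it as a hint rather than a guarantee):

import javax.microedition.media.Manager;

public class CaptureSupportProbe {

    // Collects what the device reports about capture support so it can be shown
    // on screen or logged while testing on each handset.
    public static String describeCaptureSupport() {
        StringBuffer sb = new StringBuffer();
        sb.append("supports.video.capture=");
        sb.append(System.getProperty("supports.video.capture"));
        sb.append("\n");
        sb.append("video.encodings=");
        sb.append(System.getProperty("video.encodings"));
        sb.append("\n");
        sb.append("video.snapshot.encodings=");
        sb.append(System.getProperty("video.snapshot.encodings"));
        sb.append("\n");

        // Content types the device claims to handle for the "capture" protocol;
        // an empty array means capture locators are not supported at all.
        String[] types = Manager.getSupportedContentTypes("capture");
        for (int i = 0; i < types.length; i++) {
            sb.append("capture content type: ");
            sb.append(types[i]);
            sb.append("\n");
        }
        return sb.toString();
    }
}

Some handsets accept only "capture://video", while others expose additional device-specific locators, which is why the same string can work on one phone and fail on another.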
If you use the Google Plus app on Android and switch to the Stream, you get a view where you can swipe left and right between the All circles/Incoming/Nearby streams. What view component is used for this? Is this a standard Android component? Or where can I find demo code showing how I can build such a view component?
You should take a look at the ViewPager from the Android Compatibility Package for the desired widget/swipe navigation. Find more about it here:
http://developer.android.com/sdk/compatibility-library.html
Also, check out this recently posted tutorial and some sample code on ViewPager by Richard:
http://geekyouup.blogspot.com/2011/07/viewpager-example-from-paug.html
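To give an idea of what that looks like in code, here is a minimal sketch of a ViewPager backed by a PagerAdapter (assuming a reasonably recent revision of the compatibility library on the classpath; StreamPagerAdapter and the plain TextView pages are placeholders for illustration):

import android.support.v4.view.PagerAdapter;
import android.support.v4.view.ViewPager;
import android.view.View;
import android.view.ViewGroup;
import android.widget.TextView;

// Supplies one swipeable page per stream, in the spirit of All circles/Incoming/Nearby.
public class StreamPagerAdapter extends PagerAdapter {
    private final String[] titles = {"All circles", "Incoming", "Nearby"};

    @Override
    public int getCount() {
        return titles.length;
    }

    @Override
    public Object instantiateItem(ViewGroup container, int position) {
        // A real app would inflate a list of stream items here; a TextView
        // stands in to keep the sketch short.
        TextView page = new TextView(container.getContext());
        page.setText(titles[position]);
        container.addView(page);
        return page;
    }

    @Override
    public void destroyItem(ViewGroup container, int position, Object object) {
        container.removeView((View) object);
    }

    @Override
    public boolean isViewFromObject(View view, Object object) {
        return view == object;
    }
}

Then, with an android.support.v4.view.ViewPager (id "pager", for example) in the activity layout:

ViewPager pager = (ViewPager) findViewById(R.id.pager);
pager.setAdapter(new StreamPagerAdapter());

A position indicator (as mentioned below) is a separate view; the stock ViewPager only handles the swiping.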
It is a ViewPager combined with an indicator showing where you currently are and where you can go by swiping left and right.
A sample of how this can be done along with code you can use in your own apps may for example be found here. I've played with this code a little and it works pretty well.
None of the default widgets/views. I guess it's some kind of custom view with swipe functionality.
Honeycomb opens up a few new widgets which seem to have this functionality. Have a look here:
http://developer.android.com/sdk/android-3.0.html (New Widgets)
I used APKTool to take a look at what's going on. Hopefully it is okay to post this here. This is from version 1.0.2 of the G+ APK.
removed google+ app code as per CommonsWare's suggestion
So, it looks like they're using standard views, though perhaps with a good deal of gesture detection and smooth animation magic.
EDIT: If you really want to know the exact inner workings of what is going on in the Stream activity, I suggest you use APKTool yourself and examine the .smali code.