Does ActionBar Sherlock cause problems with MediaRecorder? - actionbarsherlock

I have a problem using MediaRecorder to record video with ActionBar Sherlock. I have a base app that sets up and records video files without any problem, but as soon as I add ABS, it stops working.
Right now I have a non-ABS app that can record video with no problems at all, and the same app (in all other respects) with ABS that doesn't work.
Initialising the video recorder using the CAMCORDER settings gets through the MediaRecorder setup OK, but when I stop recording, I get a '-1' error in the logcat and the resulting .MP4 file has no audio or video track (although the file size looks OK).
If I try setting up MediaRecorder manually (i.e., format, frame rate, size, etc.), the setup keeps falling over at 'setVideoSource...CAMERA'.
Does anyone know if there's any reason why ABS would upset MediaRecorder this way?
Added 01/01/13: The application I'm working on is being upgraded from a photo-taking app to a video-recording app. The camera preview was previously done with an ImageView, which works OK for showing an image preview but not as the video preview for MediaRecorder. For MediaRecorder, I needed to use a SurfaceView. Once I switched to a SurfaceView, all was good!
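For anyone hitting the same wall, here is a minimal sketch of that kind of setup, with a SurfaceView as the preview surface for MediaRecorder. The view id, output path and quality profile are placeholders, not the actual app's code:

// Sketch only, inside an Activity after setContentView(...):
// hand a SurfaceView's Surface to MediaRecorder as the preview display.
// R.id.preview_surface and the output path are hypothetical.
SurfaceView preview = (SurfaceView) findViewById(R.id.preview_surface);
SurfaceHolder holder = preview.getHolder();
// (In a real app, wait for SurfaceHolder.Callback.surfaceCreated before this.)

MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
recorder.setOutputFile("/sdcard/test.mp4");
recorder.setPreviewDisplay(holder.getSurface());
try {
    recorder.prepare();
    recorder.start();
} catch (IOException e) {
    Log.e("Recorder", "MediaRecorder prepare() failed", e);
}
// ...and later: recorder.stop(); recorder.reset(); recorder.release();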

Related

How does one play higher resolution videos (4k and higher) in VLC (desktop app and Unity)?

I'm trying to get a 360 streamed video to play at its full (4K) resolution. I did some searching and found the thread below, but it didn't answer my question.
How do I adjust the video resolution?
My end goal is to integrate this into VLC Unity, but I also don't see how to bring the resolution up to 4K (or higher) for streamed videos even in the desktop app. As a test, I tried a 4K video from YouTube in the VLC desktop app and couldn't see how to get it to play in 4K. The end goal is to stream from AWS, but I want to confirm that my stream can indeed play at 4K before continuing to figure out why it's playing at a lower resolution within Unity. It resizes my AWS stream down to 1920x1080 instead of the intended 4096x2048, and resizes the YouTube video down to 1280x720 from its 3840x2160.
I've tried messing with the advanced settings and looking at Input/Codecs, and the best option for "preferred video resolution" is "best available" (below that is Full HD). I've looked through other suggestions online and didn't see a lot to change, but I did change Settings-->Video-->Output to "DirectX (DirectDraw) video output". This one was just me fumbling through the desktop app, but I also tried going through Media-->Open Network Stream, clicking the arrow next to Play and selecting Stream, choosing RTSP as the destination, and then editing the resolution within the profile to a specified width and height. None of the above changed the resolution of the video.
TL;DR - How can I play a streamed video at higher resolutions (4k and up) in the VLC desktop app and in VLC Unity?
Edit - adding the requested code snippets. This just modifies the UseRenderingPlugin file from the VLC Unity asset.
Also, similar to what was being discussed in the comments, when I play the video locally instead of streaming it, it plays at the full resolution of the video.
Within Awake() I set the material to the 4096x2048 one mentioned for the skybox
RenderSettings.skybox = material4k;
And then in Update(), here is the relevant segment:
var texptr = _mediaPlayer.GetTexture(out bool updated);
if (updated && texptr != IntPtr.Zero)
{
    Debug.Log("Creating texture ");
    tex = Texture2D.CreateExternalTexture((int)width,
        (int)height,
        TextureFormat.RGBA32,
        false,
        true,
        texptr);
    RenderSettings.skybox.mainTexture = tex;
}

mediaelement.js not working for my HLS on iPhone

Here is my m3u8 file:
cat 8.m3u8
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-ALLOW-CACHE:YES
#EXT-X-MEDIA-SEQUENCE:1131
#EXT-X-TARGETDURATION:5
#EXTINF:4.950, no desc
1545049888215.ts
#EXTINF:4.950, no desc
1545049893218.ts
I serve it as a static file at http://104.248.205.68:31339/8.m3u8
I use mediaelement.js to play this HLS video: jsfiddle
html:
<video width="240" height="160"
id="player1" src="http://104.248.205.68:31339/8.m3u8"
controls="controls" autoplay preload="auto" muted ></video>
js:
$('video').mediaelementplayer({});
It works fine in Chrome on desktop macOS, but it is not working on an iPhone 8 Plus (Safari and Chrome): no errors in the console, the video just doesn't play, only a black screen. The video's fullscreen mode is the same.
At the same time, if I find a random m3u8 on the internet and use mediaelement.js to play it, it works well on the iPhone (at least in fullscreen mode): jsfiddle 2.
So I guess something is wrong with my m3u8 file, since other m3u8s play on the iPhone.
If I open the network tab while loading the problem page on the iPhone, I see it downloading the files, but for some reason the video doesn't show.
Update
I checked on Android (Galaxy S5 and Galaxy S9+, in Chrome): both work.
Update 2
Zip archive with ts files and m3u8: http://104.248.205.68:31339/8.m3u8.zip
When inspecting the media files [1], it can be seen that the PMT (Program Map Table) signals that there is audio in the stream, but no TS packets for audio (i.e. no audio data) are actually present.
It looks like the player waits for the audio TS packets in order to build a common buffer for both audio and video, and only then starts playback. Since the stream lacks audio data, that never happens. To back that up, you can use ffmpeg to remove the audio track from the media segments with the command below and find that playback works once you do this.
ffmpeg -i 1545049893218.ts -an -vcodec copy 1545049893218-v.ts
Further, the reason this problem only manifests in Safari and Chrome on iOS is that in those cases mediaelement.js uses the browser's native capabilities to play HLS, instead of a JavaScript player (like hls.js), which is used on other platforms (e.g. Chrome on desktop) and is more tolerant of such problem cases.
[1] E.g. using http://thumb.co.il/ or ffprobe
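For example, assuming ffprobe is installed, the streams a segment actually contains can be listed with something like (segment name taken from the playlist above):

ffprobe -show_streams 1545049893218.ts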
EDIT:
While the above may be sufficient to make it work on older Apple mobile devices (I tested on an iPhone 6 with iOS 10), newer devices seem to be more restrictive. The official HLS Authoring Specification states:
8.11. You MUST provide at least 6 segments in a live/linear playlist.
which does not seem to be a hard requirement on some iOS versions. However, to ensure it works on all versions, this requirement should be met.
I did a quick test on an iPhone X with iOS 12 and found it would play if at least 3 segments are provided in the playlist, for example by just duplicating the last segment entry.
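For illustration, a padded playlist in the same style as the one in the question, with the last segment entry repeated to reach three segments (segment names taken from above), would look like this:

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-ALLOW-CACHE:YES
#EXT-X-MEDIA-SEQUENCE:1131
#EXT-X-TARGETDURATION:5
#EXTINF:4.950, no desc
1545049888215.ts
#EXTINF:4.950, no desc
1545049893218.ts
#EXTINF:4.950, no desc
1545049893218.ts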

Adobe Edge Animate: audio starts playing even though the preloader is there

[This happens when I am on a slow internet connection, or when larger images coming from the server take time to load.]
Images take time to load, so the animation starts playing even though the images are not loaded yet; to overcome this I have used a preloader.
The problem now is that the audio I use on this page loads early and starts playing while the page is still showing the preloader.
So the preloader doesn't seem to work for the audio.
Que1:
For images, I have noticed that until all assets are ready, the Edge min file attaches the style
.edgeLoad-EDGE-137360589 { visibility:hidden; }
so for a while it shows a white screen, because everything else is hidden while the animation runs in the background, which doesn't look good.
Que2:
To resolve the above issue, I tried a preloader. Using it resolves that issue, but it creates a problem with the audio.
Even while the preloader is showing, my audio starts playing in the background.
[The audio's preload is set to false, but as soon as it loads it starts playing; it does not wait for the preloader to complete.]
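A generic way to approach this, sketched in plain HTML5/JavaScript rather than Edge Animate's own API (the element id and file name are placeholders), is to keep the audio paused and only start it once all assets have loaded:

// Assumes an <audio id="bgAudio" preload="none" src="bg.mp3"> element;
// the id and file name are hypothetical, not taken from the question.
var bgAudio = document.getElementById('bgAudio');

// Do not autoplay; start the audio only when every asset (images included)
// has finished loading, i.e. at the point where the preloader is removed.
window.addEventListener('load', function () {
    bgAudio.play();
});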

HTML5 video on iOS5 not loading/showing

I have an HTML/JS app running in a webview in an iPad app. The app uses the HTML5 video tag. Videos work fine on iOS 4.3, but today I tested on iOS 5 and the videos simply do not show up.
I have verified that it is not a layout related issue by setting background color and borders on the video element.
The same behaviour is evident irrespective of whether the app is run directly in Mobile Safari, from the home screen, or within the webview.
The template for the video is simply:
<video controls src='{url}'></video> <!-- {url} is substituted at runtime -->
The relevant video url plays correctly directly in the mobile Safari on iOS5.
I have tried proxying the app's traffic, and it seems that it does start loading the video but then stops; no video controls show, and only the background color I have set shows through.
Any ideas would be greatly appreciated. Thanks.
Have you tried creating an empty webview, without additional parameters and scripts, and making sure that you call it only once? I had the same issue when I called it twice without clearing the previously created one: only the audio was played.
Try to look at http://blog.millermedeiros.com/2011/03/html5-video-issues-on-the-ipad-and-how-to-solve-them/ and see if the fix works for you...
Heh, I should have read the fine print and noticed you had answered yourself. Remember to tick the thread off as answered.
I've fixed this in code by changing the width and height by a pixel once the video element is created. It must invoke a repaint or something to that effect.
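A rough sketch of that workaround in plain JavaScript (the container id and video URL are placeholders, not taken from the original app):

// Create the video element as usual; the id and URL are hypothetical.
var video = document.createElement('video');
video.setAttribute('controls', 'controls');
video.src = 'http://example.com/movie.mp4';
video.width = 640;
video.height = 360;
document.getElementById('videoContainer').appendChild(video);

// Nudging the size by one pixel after the element is created forces a
// repaint, which is what appeared to work around the iOS 5 webview issue.
video.width = video.width + 1;
video.height = video.height + 1;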

Images do not show up when I run my app on iOS device

I have images titled like so in my app:
image~iPhone.png
image@2x~iPhone.png
In Interface Builder I am loading image.png into my UIImageView. I also programmatically load some images into a different view using imageWithContentsOfFile. The images all load fine when I run in the simulator, but I get no images when I run on the device. If I use the full name of the image in Interface Builder it works, but I want iOS to distinguish between the high-res and lower-res versions. I have tried a lot of different things but can't figure this out. I also see this error in the debugger:
Could not load the "image.png" image referenced from a nib in the bundle with identifier "com.mycompany.myproject"
Xcode 4
Deployment Target 4.1
Base SDK 4.3
Thanks for any help.
OK, so after much experimenting I got it working.
I had two images named:
image@2x~iPhone.png
image~iPhone.png
and I was trying to load them using IB or imageWithContentsOfFile using
image.png
This worked fine in the simulator but not on my device. I just got a blank white screen where the image should be.
I finally renamed the high resolution image to:
image~iPhone@2x.png
Moving the '@2x' modifier after the device modifier (~iPhone) when referencing my images allowed it to work the way I understood it should from reading Apple's docs. I was under the impression that you didn't need to include the device modifier when referencing images, but I had to.
To sum up, I am now using
- image~iPhone.png
to reference my images in IB and programmatically for some images. iOS now recognizes that I am on a Retina screen and loads the @2x images accordingly. So the @2x modifier had to go at the end, and the ~iPhone modifier had to be included in the name of the '.png'.
That is what worked for me. Hope it helps someone else. Note that I am only building my app for iOS 4.1 and above, so there might be some issues with this if you are supporting earlier versions.
iOS does not automatically pick the right image for the device like that. You are going to have to write code to check which device it is, and set the image by full name.
e.g. if ([[UIScreen mainScreen] scale] == 2) // set hi res image
Or, you can just use the same image in both, and set the content mode to scale to fill. It will look the same.
EDIT: Try writing either ~iphone (lowercase), or just don't write ~iPhone at all in the file name. If your app is not universal, then writing the ~iphone suffix is completely pointless.
The iOS file system is case-sensitive and device modifiers should be lowercase; it should be
image~iphone.png
image@2x~iphone.png
The @2x comes before the device modifier.
See the Resource Programming Guide.
