How to save a CameraX video recording via a FileProvider - android-camerax

I'm making a mobile app that records a video with the camera and uploads it to a server.
I am using CameraX to record the video, but the recording is saved to the gallery:
val mediaStoreOutput = MediaStoreOutputOptions.Builder(
    requireActivity().contentResolver,
    MediaStore.Video.Media.EXTERNAL_CONTENT_URI
)
This is where the storage location is set, so I tried editing it.
How can I save the video via a FileProvider without it appearing in the gallery?
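Not part of the original question, but a minimal sketch of one possible approach, assuming a CameraX `VideoCapture<Recorder>` use case (androidx.camera.video): record into app-private storage with FileOutputOptions instead of MediaStoreOutputOptions so the file never reaches the gallery, then expose it through the app's manifest-declared FileProvider. The file name and provider authority below are hypothetical.

```kotlin
// Sketch: record to app-private storage, then share via FileProvider.
val videoFile = File(requireContext().filesDir, "recording.mp4") // example name
val fileOutput = FileOutputOptions.Builder(videoFile).build()

val recording = videoCapture.output
    .prepareRecording(requireContext(), fileOutput)
    .start(ContextCompat.getMainExecutor(requireContext())) { event ->
        if (event is VideoRecordEvent.Finalize && !event.hasError()) {
            // Expose the finished file to other apps (e.g. for upload)
            // through the FileProvider declared in the manifest:
            val uri = FileProvider.getUriForFile(
                requireContext(),
                "${requireContext().packageName}.fileprovider", // assumed authority
                videoFile
            )
        }
    }
```

Because the file lives under `filesDir`, it is never indexed by MediaStore, so it will not show up in the gallery.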

Related

How can I create a MediaStream track from a continuous stream of images in node.js? (for usage with WebRTC)

I want to stream a robot cam from a web media element. I have access to the camera in node.js, which is providing a live stream of images (continually producing a new frame at ~20fps).
In this same situation in the browser, one could write the image to a canvas and capture the stream.
Is there some way to construct a MediaStreamTrack object that can be directly added to the RTCPeerConnection, without having to use browser-only captureStream or getUserMedia APIs?
I've tried the npm module canvas, which is supposed to port canvas to node -- then maybe I could captureStream the canvas after writing the image to it. But that didn't work.
I'm using the wrtc node WebRTC module with the simple-peer wrapper.
Check out the video-compositing example here.
https://github.com/node-webrtc/node-webrtc-examples
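A rough sketch of the approach that example takes, assuming node-webrtc's nonstandard API (`RTCVideoSource` and the I420 helpers shipped by the wrtc module); the frame-conversion details depend on your camera's pixel format:

```javascript
const { nonstandard } = require("wrtc");

// A programmatic video source: frames pushed into it appear on the
// track, which can be added to an RTCPeerConnection (or simple-peer).
const source = new nonstandard.RTCVideoSource();
const track = source.createTrack();
// peerConnection.addTrack(track);

// Called for each new camera image (~20 fps). Frames must be I420;
// nonstandard.rgbaToI420() can convert if you have RGBA pixels.
function pushFrame(width, height, i420Data) {
  source.onFrame({ width, height, data: i420Data });
}
```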

Can't show static loaded videos by Express.js

I'm serving some static HTML, CSS and JS files and also a folder called tmp with some images and video files using Express.js for my Node app:
app.use(express.static("build"));
app.use(express.static("../tmp"));
When I go to http://localhost:3003, it loads my app nicely along with all the images on my webpage (located in the tmp folder), but every video file looks like this:
If I press fullscreen on the video player or even visit the url directly http://localhost:3003/video_1.mp4, it works.
Is this a problem with Express.js trying to stream the video data from the tmp folder? I really don't know how to solve this issue. I tried to delay the playback and use a 3rd party library to play the video but no luck.
It seems to work when I directly specify the whole path (localhost:3003/picture.png) in the src of the video element.

Video quality in VideoView

Here is the code of the VideoView that loads a video from a URL:
mediacontroller = new MediaController(this);
mediacontroller.setAnchorView(vv);
String uriPath = "https://firebasestorage.googleapis.com/v0/b/fire-b6fff.appspot.com/o/Nissan_-_Ignite_the_Excitement(1).mp4?alt=media&token=2f329bc8-7045-4f4e-a683-64169fc4562c";
uri = Uri.parse(uriPath);
vv.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
    @Override
    public void onCompletion(MediaPlayer mp) {
        if (isContinuously) {
            vv.start();
        }
    }
});
How can I offer video quality options like 180p and 360p (as YouTube does) in a VideoView?
You may find it easier to use ExoPlayer to do this:
https://github.com/google/ExoPlayer
It is supported by Google and is also used as the basis for many mainstream Android video players. It is described like this at the link above (at the time of writing):
ExoPlayer is an application level media player for Android. It provides an alternative to Android’s MediaPlayer API for playing audio and video both locally and over the Internet. ExoPlayer supports features not currently supported by Android’s MediaPlayer API, including DASH and SmoothStreaming adaptive playbacks. Unlike the MediaPlayer API, ExoPlayer is easy to customize and extend, and can be updated through Play Store application updates.
The ExoPlayer demo application includes track selection as standard:
https://google.github.io/ExoPlayer/demo-application.html
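As a hedged sketch (ExoPlayer 2.x; class and method names vary between versions), capping playback quality can be done through DefaultTrackSelector, the same component the demo's track-selection UI is built on. `context` and the question's `uriPath` are assumed to be in scope:

```java
// Sketch only: cap the auto-selected video track at roughly "360p".
DefaultTrackSelector trackSelector = new DefaultTrackSelector(context);
trackSelector.setParameters(
        trackSelector.buildUponParameters()
                .setMaxVideoSize(640, 360));

SimpleExoPlayer player = new SimpleExoPlayer.Builder(context)
        .setTrackSelector(trackSelector)
        .build();
player.setMediaItem(MediaItem.fromUri(uriPath));
player.prepare();
player.play();
```

Note that per-quality switching like YouTube's only works when the source actually exposes multiple renditions (e.g. a DASH or HLS stream); a single progressive MP4 has only one quality.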

Capture Thumbnails from a Video on Azure Media Service

I'm trying to get an image thumbnail from a video on Azure Media Services.
I can't tell whether a thumbnail is generated automatically, and if so, what its URI is.
Documentation talks about 'Thumbnail Collections' in AssetFile - but I can't find anything further.
Any ideas?
Thanks
Here is sample code that adds a thumbnail task to an encoding job:
ITask task = job.Tasks.AddNew("My thumbnail task",
    processor,
    "Thumbnails",
    TaskOptions.None);
You can control the thumbnail task's parameters by using an XML preset instead of a system named preset. The following is pasted from the SDK GitHub repo, file Jobtests.cs:
string presetXml = @"<?xml version=""1.0"" encoding=""utf-8""?>
<Thumbnail Size=""80,60"" Type=""Jpeg"" Filename=""{OriginalFilename}_{ThumbnailTime}.{DefaultExtension}"">
<Time Value=""0:0:0""/>
<Time Value=""0:0:3"" Step=""0:0:0.25"" Stop=""0:0:10""/>
</Thumbnail>";
IJob job = CreateAndSubmitOneTaskJob(_mediaContext, name, mediaProcessor, presetXml, asset, TaskOptions.None);
var task = job.Tasks.First();
var asset = task.OutputAssets.First();
var files = asset.AssetFiles.ToList();
Run the test ShouldFinishJobWithSuccessWhenPresetISUTF8(), which uses the thumbnail preset, and you will find that the job generates one output asset containing around 30 files. To download these files you can simply call Download or DownloadAsync:
files[0].Download();
If you need a URL for a selected file, you can execute the following code:
var accessPolicy = mediaContext.AccessPolicies.Create("12HoursRead", TimeSpan.FromHours(12), AccessPermissions.Read);
// Creating a read-only access URL which will be available for 12 hours
var locator = mediaContext.Locators.CreateSasLocator(asset, accessPolicy);
// Getting the URL for the first file in the collection
UriBuilder uriBuilder = new UriBuilder(locator.BaseUri);
uriBuilder.Path += String.Concat("/", files[0].Name);
var sasUrl = uriBuilder.Uri.AbsoluteUri;
Please note that all Azure media asset files are stored in Azure Storage.
If you have high volume website, it will be better to download thumbnails from storage and publish them through CDN.
MSDN docs related to thumbnail preset

.wav sound file stored on server does NOT play when Loaded in web application

This is the scenario I'm trying to achieve: a sound stored on the same server as a web application plays when a condition is met on the client. It works perfectly when I run it in the IDE and change the web.config to point to the server where the DB is. However, when I deploy it and access it via the browser, the sound does not play - the same sound that played on my development machine. Code is:
var configSettings = new System.Configuration.AppSettingsReader();
string soundPath = configSettings.GetValue("Notification", typeof(System.String)).ToString();
var sound = new System.Media.SoundPlayer { SoundLocation = Server.MapPath(soundPath) };
sound.Load();
sound.Play();
The web.config entry is:
<add key="Notification" value="~/beep-4.wav" />
The sound file is sitting in the root folder of the ASP.NET web application. So what could be wrong? There is no audio output device on the server, nor is there a player like Media Player, yet these factors did NOT stop it from working on my dev machine.
Looking at the code you posted, I will assume you wrote it in C#.
This code runs on the server side, so the client side (the web browser) will never know about it or about your audio file. Please read about ASP.NET code-behind and how it works. If you want to play an audio file in the browser (client side), you need to use either JavaScript, Flash, or the <audio> tag from HTML5.
By installing a sound card on the server you would only achieve (in the best case) getting the file played on that server.
Thanks yms, the <audio> tag worked. I added a routine that writes the tag's HTML to a div at run time and put it on a timer.
sounddiv.InnerHtml = "<audio preload=\"auto\" autoplay=\"autoplay\">" +
    "<source src=\"" + soundPath + "\" type=\"audio/wav\" />" +
    " Your browser does not support the audio tag. </audio>";
This code is called in the code-behind from a timer in response to the condition, so the sound repeats every 30 seconds. Problem solved. Thanks guys for the leads.