File sharing is not working in Xamarin.iOS

I am using the code below to share a file/image with other apps in Xamarin.iOS, but it is not working properly. There are no exceptions and the code executes without errors, yet the list of apps (the share sheet) never appears. What is the issue in the code below? Do we need to change any configuration settings in the project?
var documentName = shortName + ".pdf";
var ContentPath = Environment.GetFolderPath(Environment.SpecialFolder.Personal);
var fullFilename = Path.Combine(ContentPath, documentName);
NSData dataToShare = NSFileManager.DefaultManager.Contents(fullFilename);
var items = new NSObject[] { dataToShare };
var controller = new UIActivityViewController(items, null);
UIApplication.SharedApplication.KeyWindow.RootViewController.PresentViewController(controller, true, null);

I'm using this code and it works properly:
var url = NSUrl.FromFilename(this.filePath);
var item = url.Copy();
var activityItems = new[] { item };
var activityController = new UIActivityViewController(activityItems, null);
float width = (float)this.PdfView.Frame.Width;
float height = (float)this.PdfView.Frame.Height;
UIPopoverController popoverController = new UIPopoverController(activityController);
popoverController.SetPopoverContentSize(new CGSize(width, height), true);
popoverController.PresentFromRect(new CGRect(0, 0, width, height), this.MainView, 0, true);
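One likely cause, though it isn't confirmed in the question: on iPad, a UIActivityViewController has to be anchored to a popover, otherwise the share sheet silently never appears even though no exception is thrown. Below is a minimal sketch that shares the file as an NSUrl (rather than raw NSData) and anchors the popover through the controller's own PopoverPresentationController instead of the deprecated UIPopoverController; fullFilename is reused from the question.
// Minimal sketch, assuming `fullFilename` from the question points to an existing PDF.
var fileUrl = NSUrl.FromFilename(fullFilename);
var shareController = new UIActivityViewController(new NSObject[] { fileUrl }, null);

var rootController = UIApplication.SharedApplication.KeyWindow.RootViewController;
if (shareController.PopoverPresentationController != null)
{
    // Required on iPad; without an anchor the share sheet does not show.
    shareController.PopoverPresentationController.SourceView = rootController.View;
    shareController.PopoverPresentationController.SourceRect = rootController.View.Bounds;
}
rootController.PresentViewController(shareController, true, null);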


How to get access to sprite 2D context?

Is it possible to get the CanvasRenderingContext2D at the sprite level instead of for the whole canvas?
Has anyone tried to access it? Any ideas?
I tried:
child.addEventListener(RenderEvent.RENDER_CANVAS, function(event)
{
    var renderer:openfl.display.DisplayObjectRenderer = event.renderer;
    var ctx:CanvasRenderingContext2D = renderer.context;
    ctx.shadowBlur = 20;
    ctx.shadowColor = "black";
});
However, I am getting this error:
openfl.display.DisplayObjectRenderer has no field context
How would I access the sprite's CanvasRenderingContext2D?
Try casting the renderer to a CanvasRenderer:
child.addEventListener(RenderEvent.RENDER_CANVAS, function(event)
{
    var renderer:openfl.display.CanvasRenderer = cast(event.renderer);
    var ctx:CanvasRenderingContext2D = renderer.context;
    ctx.shadowBlur = 20;
    ctx.shadowColor = "black";
});

Cadlib draw model with images to bitmap

I am using CadLib to read DWG, DXF and PDF architecture files, and some of those files have images inside them. I create a DxfModel, add those DxfImages to it, and then export it to an image, but I always get a white image.
DxfModel dxfModel = new DxfModel();
foreach (DxfImage image in cadMimages)
{
    dxfModel.Images.Add(image.ImageDef);
    image.SetDefaultBoundaryVertices();
    dxfModel.Entities.Add(image);
}
GraphicsConfig graphicsConfig = new GraphicsConfig(false, ArgbColors.White)
{
    FixedForegroundColor = ArgbColors.Black,
    CorrectColorForBackgroundColor = false,
    ApplyLineType = true,
    DisplayLineWeight = true,
    DrawImages = true,
    DrawImageFrame = true
};
var boundsCalculator = new BoundsCalculator();
boundsCalculator.GetBounds(dxfModel);
var bounds_ = boundsCalculator.Bounds;
Size maxSize = new Size(4096, 4096);
var width = maxSize.Width;
var height = maxSize.Height;
var point1 = new WW.Math.Point3D(0, height - 1, 0);
var point2 = new WW.Math.Point3D(width - 1, 0, 0);
var to2DTransform = DxfUtil.GetScaleTransform(bounds_.Corner1, bounds_.Corner2, point1, point2);
var bitmap = ImageExporter.CreateAutoSizedBitmap(dxfModel, new GDIGraphics3D(graphicsConfig), SmoothingMode.AntiAlias, to2DTransform, maxSize);
using (Stream stream = File.Create("d:/test.png"))
{
    ImageExporter.EncodeImageToPng(bitmap, stream);
}
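If the drawing itself renders but only the images come out blank (or everything is white), two things worth checking are whether the raster files behind each image definition can actually be found on disk, and whether the calculated bounds really cover the entities. Below is a small diagnostic sketch along those lines; note that the FileName property on DxfImageDef is an assumption about the CadLib API, not something taken from the question.
// Diagnostic sketch, not a confirmed fix: check that each image's source file exists
// and print the computed model bounds before exporting.
foreach (DxfImage image in cadMimages)
{
    // Assumed property name; verify against your CadLib version.
    string sourceFile = image.ImageDef.FileName;
    if (!File.Exists(sourceFile))
    {
        Console.WriteLine("Image source not found: " + sourceFile);
    }
}

var check = new BoundsCalculator();
check.GetBounds(dxfModel);
Console.WriteLine("Bounds: " + check.Bounds.Corner1 + " - " + check.Bounds.Corner2);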

Azure server not letting me use a NuGet package

I have a website hosted on Azure that includes a Web API, which I'm using to develop an Android app. I'm trying to upload a media file to the server, where it is encoded by a media encoder and saved to a path. The encoder library is called "Media Toolkit", which I found here: https://www.nuget.org/packages/MediaToolkit/1.0.0.3
My server side code looks like this:
[HttpPost]
[Route("upload")]
public async Task<HttpResponseMessage> Upload(uploadFileModel model)
{
    var result = new HttpResponseMessage(HttpStatusCode.OK);
    if (ModelState.IsValid)
    {
        string thumbname = "";
        string resizedthumbname = Guid.NewGuid() + "_yt.jpg";
        string FfmpegPath = Encoding_Settings.FFMPEGPATH;
        string tempFilePath = Path.Combine(HttpContext.Current.Server.MapPath("~/video"), model.fileName);
        string pathToFiles = HttpContext.Current.Server.MapPath("~/video");
        string pathToThumbs = HttpContext.Current.Server.MapPath("~/contents/member/" + model.username + "/thumbs");
        string finalPath = HttpContext.Current.Server.MapPath("~/contents/member/" + model.username + "/flv");
        string resizedthumb = Path.Combine(pathToThumbs, resizedthumbname);
        var outputPathVid = new MediaFile { Filename = Path.Combine(finalPath, model.fileName) };
        var inputPathVid = new MediaFile { Filename = Path.Combine(pathToFiles, model.fileName) };
        int maxWidth = 380;
        int maxHeight = 360;
        var namewithoutext = Path.GetFileNameWithoutExtension(Path.Combine(pathToFiles, model.fileName));
        thumbname = model.VideoThumbName;
        string oldthumbpath = Path.Combine(pathToThumbs, thumbname);
        var fileName = model.fileName;
        try
        {
            File.WriteAllBytes(tempFilePath, model.data);
        }
        catch (Exception e)
        {
            Console.WriteLine(e.Message);
        }
        using (var engine = new Engine())
        {
            engine.GetMetadata(inputPathVid);
            // Saves the frame located on the 15th second of the video.
            var outputPathThumb = new MediaFile { Filename = Path.Combine(pathToThumbs, thumbname + ".jpg") };
            var options = new ConversionOptions { Seek = TimeSpan.FromSeconds(0), CustomHeight = 360, CustomWidth = 380 };
            engine.GetThumbnail(inputPathVid, outputPathThumb, options);
        }
        Image image = Image.FromFile(Path.Combine(pathToThumbs, thumbname + ".jpg"));
        //var ratioX = (double)maxWidth / image.Width;
        //var ratioY = (double)maxHeight / image.Height;
        //var ratio = Math.Min(ratioX, ratioY);
        var newWidth = (int)(maxWidth);
        var newHeight = (int)(maxHeight);
        var newImage = new Bitmap(newWidth, newHeight);
        Graphics.FromImage(newImage).DrawImage(image, 0, 0, newWidth, newHeight);
        Bitmap bmp = new Bitmap(newImage);
        bmp.Save(Path.Combine(pathToThumbs, thumbname + "_resized.jpg"));
        //File.Delete(Path.Combine(pathToThumbs, thumbname));
        using (var engine = new Engine())
        {
            var conversionOptions = new ConversionOptions
            {
                VideoSize = VideoSize.Hd720,
                AudioSampleRate = AudioSampleRate.Hz44100,
                VideoAspectRatio = VideoAspectRatio.Default
            };
            engine.GetMetadata(inputPathVid);
            engine.Convert(inputPathVid, outputPathVid, conversionOptions);
        }
        File.Delete(tempFilePath);
        Video_Struct vd = new Video_Struct();
        vd.CategoryID = 0; // store categoryname or term instead of category id
        vd.Categories = "";
        vd.UserName = model.username;
        vd.Title = "";
        vd.Description = "";
        vd.Tags = "";
        vd.Duration = inputPathVid.Metadata.Duration.ToString();
        vd.Duration_Sec = Convert.ToInt32(inputPathVid.Metadata.Duration.Seconds.ToString());
        vd.OriginalVideoFileName = model.fileName;
        vd.VideoFileName = model.fileName;
        vd.ThumbFileName = thumbname + "_resized.jpg";
        vd.isPrivate = 0;
        vd.AuthKey = "";
        vd.isEnabled = 1;
        vd.Response_VideoID = 0; // video responses
        vd.isResponse = 0;
        vd.isPublished = 1;
        vd.isReviewed = 1;
        vd.Thumb_Url = "none";
        //vd.FLV_Url = flv_url;
        vd.Embed_Script = "";
        vd.isExternal = 0; // website own video, 1: embed video
        vd.Type = 0;
        vd.YoutubeID = "";
        vd.isTagsreViewed = 1;
        vd.Mode = 0; // filter videos based on website sections
        //vd.ContentLength = f_contentlength;
        vd.GalleryID = 0;
        long videoid = VideoBLL.Process_Info(vd, false);
        return result;
    }
    else
    {
        throw new HttpResponseException(Request.CreateResponse(HttpStatusCode.NotAcceptable, "This request is not properly formatted"));
    }
}
When the debugger hits the line using (var engine = new Engine()), a 500 Internal Server Error is thrown. I don't get this error when testing on my local IIS server. Since it works fine locally but not on the Azure-hosted server, I figured it had to do with the Azure service rather than an error in my code. If that is the case, how can I get around this issue? I don't want to use Azure Blob Storage, as it would require a lot of changes to my code. Does anyone have any idea what the issue might be?
Any helpful suggestions are appreciated.
Server.MapPath works differently on Azure Web Apps; change it to:
string pathToFiles = HttpContext.Current.Server.MapPath("~//video");
Also, see this SO post for another approach.
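If the path does resolve but the target folders are missing on the Azure deployment, File.WriteAllBytes and the encoder will fail before the conversion ever runs. As a general precaution (not a confirmed fix for the 500 error), the app-relative paths can be resolved through HostingEnvironment.MapPath and the folders created up front; the folder names below mirror the ones used in the question.
// Sketch: resolve app-relative paths without relying on HttpContext and make sure
// the folders exist before the upload/encode pipeline touches them.
// HostingEnvironment is in System.Web.Hosting; Directory is in System.IO.
string videoFolder = HostingEnvironment.MapPath("~/video");
string thumbsFolder = HostingEnvironment.MapPath("~/contents/member/" + model.username + "/thumbs");
string flvFolder = HostingEnvironment.MapPath("~/contents/member/" + model.username + "/flv");

Directory.CreateDirectory(videoFolder);   // no-op if the folder already exists
Directory.CreateDirectory(thumbsFolder);
Directory.CreateDirectory(flvFolder);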

How do I change the pitch of a wave with SharpDX/XAudio2?

All I want to do is play a WAV at random pitches in my Universal App. If there is a more straightforward way to do this than using SharpDX, tell me!
private void PlayTone(float randomPitch)
{
    var xAudio = new XAudio2();
    var masteringVoice = new MasteringVoice(xAudio);
    var nativeFileStream = new NativeFileStream("Assets/440tone2.wav", NativeFileMode.Open, NativeFileAccess.Read, NativeFileShare.Read);
    var stream = new SoundStream(nativeFileStream);
    var waveFormat = stream.Format;
    var buffer = new AudioBuffer
    {
        Stream = stream.ToDataStream(),
        AudioBytes = (int)stream.Length,
        Flags = BufferFlags.EndOfStream
    };
    var sourceVoice = new SourceVoice(xAudio, waveFormat, true);
    sourceVoice.SubmitSourceBuffer(buffer, stream.DecodedPacketsInfo);
    // sourceVoice.SetFrequencyRatio(200.0f);
    sourceVoice.Start();
}
I had to switch these around:
sourceVoice.SubmitSourceBuffer(buffer, stream.DecodedPacketsInfo);
sourceVoice.SetFrequencyRatio(200.0f);
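To actually get a random pitch on each call, pass a frequency ratio inside the voice's allowed range; by default an XAudio2 source voice is created with a maximum frequency ratio of 2.0 and values above it are clamped, so 200.0f likely does not do what was intended. A minimal sketch along those lines; the rng field and the 0.5–2.0 range are assumptions, not part of the original post.
// Sketch: submit the buffer, pick a random frequency ratio, then start playback.
private static readonly Random rng = new Random();

private void PlayRandomPitch(SourceVoice sourceVoice, AudioBuffer buffer, uint[] decodedPacketsInfo)
{
    sourceVoice.SubmitSourceBuffer(buffer, decodedPacketsInfo);

    // Random ratio between 0.5 (one octave down) and 2.0 (one octave up).
    float ratio = 0.5f + (float)rng.NextDouble() * 1.5f;
    sourceVoice.SetFrequencyRatio(ratio);

    sourceVoice.Start();
}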

How to compress/resize an image in Windows Phone 8

I am creating a project in which I want to compress an image so that it can be uploaded easily to Windows Azure and later retrieved easily from Windows Azure into my application. Can you please help me with how to do that? I am using BitmapImage right now. Following is the code I am using to upload an image to Azure:
void photoChooserTask_Completed(object sender, PhotoResult e)
{
    if (e.TaskResult == TaskResult.OK)
    {
        BitmapImage bitmap = new BitmapImage();
        bitmap.SetSource(e.ChosenPhoto);
        WriteableBitmap wb = new WriteableBitmap(bitmap);
        using (MemoryStream stream = new MemoryStream())
        {
            wb.SaveJpeg(stream, wb.PixelWidth, wb.PixelHeight, 0, 0);
            AzureStorage storage = new AzureStorage();
            storage.Account = **azure account**;
            storage.BlobEndPoint = **azure end point**;
            storage.Key = **azure key**;
            string fileName = uid;
            bool error = false;
            if (!error)
            {
                storage.PutBlob("workerimages", fileName, imageBytes, error);
            }
            else
            {
                MessageBox.Show("Error uploading the new image.");
            }
        }
    }
}
Be careful using the WriteableBitmap, as you may run out of memory if you resize a lot of images. If you only have a few, then pass the size you want saved to the SaveJpeg method. Also, make sure you use a value higher than 0 for the quality (the last parameter of SaveJpeg):
var width = wb.PixelWidth / 4;
var height = wb.PixelHeight / 4;
using (MemoryStream stream = new MemoryStream())
{
    wb.SaveJpeg(stream, width, height, 0, 100);
    ...
    ...
}
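Once the resized JPEG is in the MemoryStream, you still need a byte array for the PutBlob call from the question (imageBytes is never assigned there). A minimal sketch, assuming the same stream, storage, fileName and error variables:
// Sketch: pull the resized JPEG out of the stream and hand it to the upload call.
wb.SaveJpeg(stream, width, height, 0, 100);
byte[] imageBytes = stream.ToArray();
storage.PutBlob("workerimages", fileName, imageBytes, error);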
You can also use the JpegRenderer from the Nokia Imaging SDK to resize an image.
var width = wb.PixelWidth / 4;
var height = wb.PixelHeight / 4;
using (var imageProvider = new StreamImageSource(e.ChosenPhoto))
{
    IFilterEffect effect = new FilterEffect(imageProvider);
    // Get the resize dimensions
    Windows.Foundation.Size desiredSize = new Windows.Foundation.Size(width, height);
    using (var renderer = new JpegRenderer(effect))
    {
        renderer.OutputOption = OutputOption.PreserveAspectRatio;
        // set the new size of the image
        renderer.Size = desiredSize;
        IBuffer buffer = await renderer.RenderAsync();
        return buffer;
    }
}
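The JpegRenderer hands back an IBuffer; if the upload path from the question expects a byte array, it can be converted with the WindowsRuntime buffer extensions. A small sketch, reusing the variable names from the snippets above:
// Sketch: convert the rendered JPEG buffer to a byte[] for the blob upload.
// ToArray() comes from System.Runtime.InteropServices.WindowsRuntime.
byte[] imageBytes = buffer.ToArray();
storage.PutBlob("workerimages", fileName, imageBytes, error);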
