App crashes when it tries to show sharing dialog - android-studio

I'm trying to implement a sharing function in my app, but it crashes every time it tries to bring up the sharing dialog.
The error occurs in these lines of code:
val builder = AlertDialog.Builder(mContext)
builder.setTitle(mContext.getString(R.string.dialog_title_options))
builder.setItems(items) { dialog, item ->
    if (item == 0) {
        shareFileDialog(holder.adapterPosition)
    }
    if (item == 1) {
        renameFileDialog(holder.adapterPosition)
    } else if (item == 2) {
        deleteFileDialog(holder.adapterPosition)
    }
}
**2nd portion**
fun shareFileDialog(position: Int) {
    val shareIntent = Intent()
    shareIntent.action = Intent.ACTION_SEND
    shareIntent.putExtra(Intent.EXTRA_STREAM, Uri.fromFile(File(getItem(position)?.filePath)))
    shareIntent.type = "audio/mp4"
    mContext.startActivity(
        Intent.createChooser(
            shareIntent,
            mContext.getText(R.string.send_to)
        )
    )
}
Crash log
android.os.FileUriExposedException: file:///storage/emulated/0/SoundRecorder/3f4f4c.mp4 exposed beyond app through ClipData.Item.getUri()
at android.os.StrictMode.onFileUriExposed(StrictMode.java:2083)
at android.net.Uri.checkFileUriExposed(Uri.java:2395)
at android.content.ClipData.prepareToLeaveProcess(ClipData.java:980)
at android.content.Intent.prepareToLeaveProcess(Intent.java:11455)
at android.content.Intent.prepareToLeaveProcess(Intent.java:11461)
at android.content.Intent.prepareToLeaveProcess(Intent.java:11440)
at android.app.Instrumentation.execStartActivity(Instrumentation.java:1714)
at android.app.Activity.startActivityForResult(Activity.java:5258)
at androidx.fragment.app.FragmentActivity.startActivityForResult(FragmentActivity.java:675)
at android.app.Activity.startActivityForResult(Activity.java:5203)
at androidx.fragment.app.FragmentActivity.startActivityForResult(FragmentActivity.java:662)
at android.app.Activity.startActivity(Activity.java:5587)
at android.app.Activity.startActivity(Activity.java:5555)
at com.khumomashapa.notes.adapter.FileViewerAdapter.shareFileDialog(FileViewerAdapter.kt:175)
at com.khumomashapa.notes.adapter.FileViewerAdapter$onBindViewHolder$2$1.onClick(FileViewerAdapter.kt:73)
This is what the log shows me when it crashes. I'm trying to get the app to share or send a file to another device.

If you are targeting Android N (API level 24) or above, you have to either use FileProvider or ask the VM to ignore file URI exposure by adding
StrictMode.VmPolicy.Builder builder = new StrictMode.VmPolicy.Builder();
StrictMode.setVmPolicy(builder.build());
in your Application.onCreate() method.
How to use FileProvider is explained here.
My personal recommendation is to use FileProvider.

Related

Create a custom module for PDF manipulation

I want to create a custom Kofax module. When it comes to batch processing, the scanned documents get converted to PDF files. I want to fetch these PDF files, manipulate them (add a custom footer to the PDF document) and hand them back to Kofax.
So what I know so far:
create Kofax export scripts
add a custom module to Kofax
I have the APIRef.chm (Kofax.Capture.SDK.CustomModule) and the CMSplit as an example project. Unfortunately I'm struggling to get into it. Are there any resources out there showing step by step how to get into custom module development?
So I know that the IBatch interface represents one selected batch and the IBatchCollection represents the collection of all batches.
I would just like to know how to set up a "Hello World" example to which I could add my code. I don't think I even need a WinForms application, because I only need to manipulate the PDF files and that's it...
Since I realized that your question was rather about how to create a custom module in general, allow me to add another answer. Start with a C# Console Application.
Add Required Assemblies
A custom module requires several Kofax assemblies; all of them reside in KC's binaries folder (by default C:\Program Files (x86)\Kofax\CaptureSS\ServLib\Bin on a server).
Setup Part
Add a new User Control and Windows Form for setup. This is purely optional - a CM might not even have a setup form, but I'd recommend adding it regardless. The user control is the most important part here - it adds the menu entry in KC Administration and initializes the form itself:
[InterfaceType(ComInterfaceType.InterfaceIsIDispatch)]
public interface ISetupForm
{
    [DispId(1)]
    AdminApplication Application { set; }

    [DispId(2)]
    void ActionEvent(int EventNumber, object Argument, out int Cancel);
}

[ClassInterface(ClassInterfaceType.None)]
[ProgId("Quipu.KC.CM.Setup")]
public class SetupUserControl : UserControl, ISetupForm
{
    private AdminApplication adminApplication;

    public AdminApplication Application
    {
        set
        {
            value.AddMenu("Quipu.KC.CM.Setup", "Quipu.KC.CM - Setup", "BatchClass");
            adminApplication = value;
        }
    }

    public void ActionEvent(int EventNumber, object Argument, out int Cancel)
    {
        Cancel = 0;

        if ((KfxOcxEvent)EventNumber == KfxOcxEvent.KfxOcxEventMenuClicked && (string)Argument == "Quipu.KC.CM.Setup")
        {
            SetupForm form = new SetupForm();
            form.ShowDialog(adminApplication.ActiveBatchClass);
        }
    }
}
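The SetupForm instantiated above is not part of the snippet. A minimal sketch of such a form might look like the following (the ShowDialog overload and the way the batch class is used are assumptions for illustration, not the original implementation):
using System.Windows.Forms;

// Hypothetical sketch of the setup form created by the user control above.
public class SetupForm : Form
{
    private object activeBatchClass;

    // Custom overload so the user control can hand over the batch class that is
    // currently selected in KC Administration; batch-class-level settings would
    // be read and written here.
    public DialogResult ShowDialog(object batchClass)
    {
        activeBatchClass = batchClass;
        return ShowDialog();
    }
}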
Runtime Part
Since I started with a console application, I could go ahead and put all the logic into Program.cs. Note that this is for demo purposes only; I would recommend adding dedicated classes and forms later on. The example below logs into Kofax Capture, grabs the next available batch, and just outputs its name.
class Program
{
    static void Main(string[] args)
    {
        AppDomain.CurrentDomain.AssemblyResolve += (sender, eventArgs) => KcAssemblyResolver.Resolve(eventArgs);
        Run(args);
        return;
    }

    static void Run(string[] args)
    {
        // start processing here
        // todo encapsulate this to a separate class!

        // login to KC
        var login = new Login();
        login.EnableSecurityBoost = true;
        login.Login();
        login.ApplicationName = "Quipu.KC.CM";
        login.Version = "1.0";
        login.ValidateUser("Quipu.KC.CM.exe", false, "", "");

        var session = login.RuntimeSession;

        // todo add timer-based polling here (note: mutex!)
        var activeBatch = session.NextBatchGet(login.ProcessID);

        Console.WriteLine(activeBatch.Name);

        activeBatch.BatchClose(
            KfxDbState.KfxDbBatchReady,
            KfxDbQueue.KfxDbQueueNext,
            0,
            "");

        session.Dispose();
        login.Logout();
    }
}
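Program.cs above hooks AssemblyResolve to a KcAssemblyResolver that is not shown in the snippet. A minimal sketch of such a resolver, assuming the default Bin path mentioned earlier (the class name and probing logic are illustrative, not the original implementation):
using System;
using System.IO;
using System.Reflection;

// Hypothetical sketch: resolves Kofax assemblies from the KC Bin folder at runtime,
// so the custom module does not need local copies of every Kofax DLL.
static class KcAssemblyResolver
{
    // Default install path from above; adjust for your environment.
    const string KcBinPath = @"C:\Program Files (x86)\Kofax\CaptureSS\ServLib\Bin";

    public static Assembly Resolve(ResolveEventArgs eventArgs)
    {
        // eventArgs.Name is the full assembly name; we only need the simple name.
        var name = new AssemblyName(eventArgs.Name).Name;
        var candidate = Path.Combine(KcBinPath, name + ".dll");
        return File.Exists(candidate) ? Assembly.LoadFrom(candidate) : null;
    }
}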
Registering, COM-Visibility, and more
Registering a Custom Module is done via RegAsm.exe and ideally with the help of an AEX file. Here's an example - please refer to the documentation for more details and all available settings.
[Modules]
Minimal CM
[Minimal CM]
RuntimeProgram=Quipu/CM/Quipu.KC.CM/Quipu.KC.CM.exe
ModuleID=Quipu.KC.CM.exe
Description=Minimal Template for a Custom Module in C#
Version=1.0
SupportsTableFields=True
SupportsNonImageFiles=True
SetupProgram=Minimal CM Setup
[Setup Programs]
Minimal CM Setup
[Minimal CM Setup]
Visible=0
OCXFile=Quipu/CM/Quipu.KC.CM/Quipu.KC.CM.exe
ProgID=Quipu.KC.CM.Setup
Last but not least, make sure your assemblies are COM-visible:
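For example, via the assembly-level attribute (a minimal sketch; the RegAsm command in the comment is illustrative and should be adapted to your paths):
// AssemblyInfo.cs - make the assembly's public types visible to COM so that
// KC Administration can create the setup control via its ProgId.
using System.Runtime.InteropServices;

[assembly: ComVisible(true)]

// The assembly is then registered (as mentioned above) with something like:
//   RegAsm.exe Quipu.KC.CM.exe /codebase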
I put up the entire code on GitHub, feel free to fork it. Hope it helps.
Kofax exposes a batch as XML, and DBLite is basically a wrapper for said XML. The structure is explained in AcBatch.htm and AcDocs.htm (to be found under the CaptureSV directory). Here's the basic idea (only documents are shown):
AscentCaptureRuntime
  Batch
    Documents
      Document
A single document has child elements itself such as pages, and multiple properties such as Confidence, FormTypeName, and PDFGenerationFileName. This is what you want. Here's how you would navigate down the document collection, storing the filename in a variable named pdfFileName:
IACDataElement runtime = activeBatch.ExtractRuntimeACDataElement(0);
IACDataElement batch = runtime.FindChildElementByName("Batch");
var documents = batch.FindChildElementByName("Documents").FindChildElementsByName("Document");

for (int i = 0; i < documents.Count; i++)
{
    // 1-based index in kofax
    var pdfFileName = documents[i + 1]["PDFGenerationFileName"];
}
Personally, I don't like this structure, so I created my own wrapper for their wrapper, but that's up to you.
With regard to the custom module itself, the sample shipped is already a decent start. Basically, you would have a basic form that shows up if the user launches the module manually - which is entirely optional if the work happens in the background, preferably as a Windows Service. I like to start with a console application, adding forms only when needed. Here, I would launch the form as follows, or start the service. Note that I have different branches depending on whether the user wants to install my Custom Module as a service:
else if (Environment.UserInteractive)
{
    // run as module
    Application.EnableVisualStyles();
    Application.SetCompatibleTextRenderingDefault(false);
    Application.Run(new RuntimeForm(args));
}
else
{
    // run as service
    ServiceBase.Run(new CustomModuleService());
}
}
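The CustomModuleService used in the service branch is not shown here. A minimal sketch might look like the following (CustomModule refers to the wrapper class used in the next snippet; the Logout call is an assumption and should be adjusted to the actual wrapper API):
using System.ServiceProcess;

// Hypothetical sketch of the Windows Service wrapper referenced above.
public class CustomModuleService : ServiceBase
{
    private CustomModule cm;

    protected override void OnStart(string[] args)
    {
        // Same runtime logic as the interactive form, just without a UI.
        cm = new CustomModule();
        cm.Login("", "");
    }

    protected override void OnStop()
    {
        // Assumed counterpart to Login(); adapt to the real wrapper.
        cm.Logout();
    }
}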
The runtime itself just logs you into Kofax Capture, registers event handlers, and processes batch by batch:
// login to KC
cm = new CustomModule();
cm.Login("", "");
// add progress event handlers
cm.BatchOpened += Cm_BatchOpened;
cm.BatchClosed += Cm_BatchClosed;
cm.DocumentOpened += Cm_DocumentOpened;
cm.DocumentClosed += Cm_DocumentClosed;
cm.ErrorOccured += Cm_ErrorOccured;
// process in background thread so that the form does not freeze
worker = new BackgroundWorker();
worker.DoWork += (s, a) => Process();
worker.RunWorkerAsync();
Then, your CM fetches the next batch. This can either make use of Kofax's Batch Notification Service, or be based on a timer. For the former, just handle the BatchAvailable event of the session object:
session.BatchAvailable += Session_BatchAvailable;
For the latter, define a timer - preferably with a configurable polling interval:
pollTimer.Interval = pollIntervalSeconds * 1000;
pollTimer.Elapsed += PollTimer_Elapsed;
pollTimer.Enabled = true;
When the timer elapses, you could do the following:
private void PollTimer_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
{
    mutex.WaitOne();
    ProcessBatches();
    mutex.ReleaseMutex();
}
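For completeness, the pollTimer and mutex used in these snippets would be fields on the runtime class. A minimal sketch of those declarations (the interval value is only an example):
// Hypothetical field declarations backing the polling snippets above.
private readonly System.Timers.Timer pollTimer = new System.Timers.Timer();
private readonly System.Threading.Mutex mutex = new System.Threading.Mutex();
private int pollIntervalSeconds = 30; // ideally read from configuration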

Transition is lost when using PRISM in UWP

I am taking my first steps with the Universal Windows Platform (Windows 10 store apps technology). I have an application with two pages and use the navigation service to navigate between them. I am used to having a transition applied when navigating between pages, but now I don't see any transition.
I tried to set the content transition manually, but that does not work either:
protected override Task OnLaunchApplicationAsync(LaunchActivatedEventArgs args)
{
    var shell = Shell as Frame;
    shell.ContentTransitions = new TransitionCollection();
    shell.ContentTransitions.Clear();
    shell.ContentTransitions.Add(new ContentThemeTransition { HorizontalOffset = 300 });

    this.NavigationService.Navigate(Experiences.Main, null);
    return Task.FromResult<object>(null);
}
Is there any other way to configure the navigation service to apply a transition?
I found the solution to the problem. I ended up using the **EntranceThemeTransition** and it seemed to work:
shell.ContentTransitions.Add(new EntranceThemeTransition
{
    FromHorizontalOffset = 400,
    FromVerticalOffset = 0,
    IsStaggeringEnabled = true
});
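Putting the question's method and the fix together, the working OnLaunchApplicationAsync would look roughly like this (a sketch assembled from the snippets above, not verified against every Prism version):
protected override Task OnLaunchApplicationAsync(LaunchActivatedEventArgs args)
{
    var shell = Shell as Frame;

    // EntranceThemeTransition animates new page content sliding in from the offset.
    // (Requires the Windows.UI.Xaml.Media.Animation namespace.)
    shell.ContentTransitions = new TransitionCollection();
    shell.ContentTransitions.Add(new EntranceThemeTransition
    {
        FromHorizontalOffset = 400,
        FromVerticalOffset = 0,
        IsStaggeringEnabled = true
    });

    this.NavigationService.Navigate(Experiences.Main, null);
    return Task.FromResult<object>(null);
}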

NServiceBus Event Subscriptions Not Working With Azure Service Bus

I'm attempting to modify the Azure-based Video Store sample app so that the front-end Ecommerce site can scale out.
Specifically, I want all instances of the web site to be notified of events like OrderPlaced so that no matter which web server the client web app happens to be connected to via SignalR, it will correctly receive the notification and update the UI.
Below is my current configuration in the Global.asax:
Feature.Disable<TimeoutManager>();
Configure.ScaleOut(s => s.UseUniqueBrokerQueuePerMachine());

startableBus = Configure.With()
    .DefaultBuilder()
    .TraceLogger()
    .UseTransport<AzureServiceBus>()
    .PurgeOnStartup(true)
    .UnicastBus()
    .RunHandlersUnderIncomingPrincipal(false)
    .RijndaelEncryptionService()
    .CreateBus();

Configure.Instance.ForInstallationOn<Windows>().Install();
bus = startableBus.Start();
And I've also configured the Azure Service Bus queues using:
class AzureServiceBusConfiguration : IProvideConfiguration<NServiceBus.Config.AzureServiceBusQueueConfig>
{
    public AzureServiceBusQueueConfig GetConfiguration()
    {
        return new AzureServiceBusQueueConfig()
        {
            QueuePerInstance = true
        };
    }
}
I've set the web role to scale to two instances, and as expected, two queues (ecommerce and ecommerce-1) are created. I do not, however, see additional topic subscriptions being created under the videostore.sales.events topic. Instead, I see:
I would think that you would see VideoStore.ECommerce-1.OrderCancelled and VideoStore.ECommerce-1.OrderPlaced subscriptions under the Videostore.Sales.Events topic. Or is that not how subscriptions are stored when using Azure Service Bus?
What am I missing here? I get the event on one of the ecommerce instances, but never on both. Even if this isn't the correct way to scale out SignalR, my use case extends to stuff like cache invalidation.
I also find it strange that two error and audit queues are being created. Why would that happen?
UPDATE
Yves is correct. The AzureServiceBusSubscriptionNamingConvention was not applying the correct individualized name. I was able to fix this by implementing the following EndpointConfig:
namespace VideoStore.ECommerce
{
    public class EndpointConfig : IConfigureThisEndpoint, IWantCustomInitialization
    {
        public void Init()
        {
            AzureServiceBusSubscriptionNamingConvention.Apply = BuildSubscriptionName;
            AzureServiceBusSubscriptionNamingConvention.ApplyFullNameConvention = BuildSubscriptionName;
        }

        private static string BuildSubscriptionName(Type eventType)
        {
            var subscriptionName = eventType != null ? Configure.EndpointName + "." + eventType.Name : Configure.EndpointName;

            if (subscriptionName.Length >= 50)
                subscriptionName = new DeterministicGuidBuilder().Build(subscriptionName).ToString();

            if (!SettingsHolder.GetOrDefault<bool>("ScaleOut.UseSingleBrokerQueue"))
                subscriptionName = Individualize(subscriptionName);

            return subscriptionName;
        }

        public static string Individualize(string queueName)
        {
            var parser = new ConnectionStringParser();
            var individualQueueName = queueName;

            if (SafeRoleEnvironment.IsAvailable)
            {
                var index = parser.ParseIndexFrom(SafeRoleEnvironment.CurrentRoleInstanceId);
                var currentQueue = parser.ParseQueueNameFrom(queueName);

                if (!currentQueue.EndsWith("-" + index.ToString(CultureInfo.InvariantCulture))) // individualize can be applied multiple times
                {
                    individualQueueName = currentQueue
                        + (index > 0 ? "-" : "")
                        + (index > 0 ? index.ToString(CultureInfo.InvariantCulture) : "");
                }

                if (queueName.Contains("#"))
                    individualQueueName += "#" + parser.ParseNamespaceFrom(queueName);
            }

            return individualQueueName;
        }
    }
}
I could not, however, get NServiceBus to recognize my EndpointConfig class. Instead, I had to call it manually before starting the bus. From my Global.asax.cs:
new EndpointConfig().Init();
bus = startableBus.Start();
Once I did this, the subscription names appeared as expected:
Not sure why it's ignoring my IConfigureThisEndpoint, but this works.
This sounds like a bug; can you raise a GitHub issue on this at https://github.com/Particular/NServiceBus.Azure?
That said, I think it's better to use SignalR's scaleout feature instead of QueuePerInstance, as SignalR needs to replicate other information (like connection/group mappings) internally as well when running in scaleout mode.
Update:
I think I see the issue: the subscriptions should be individualized as well, which isn't the case in the current naming conventions:
https://github.com/Particular/NServiceBus.Azure/blob/master/src/NServiceBus.Azure.Transports.WindowsAzureServiceBus/NamingConventions/AzureServiceBusSubscriptionNamingConvention.cs
while it is in the queue naming conventions:
https://github.com/Particular/NServiceBus.Azure/blob/master/src/NServiceBus.Azure.Transports.WindowsAzureServiceBus/NamingConventions/AzureServiceBusQueueNamingConvention.cs#L27
As these conventions are public, you can override them to work around the problem by changing the func in IWantCustomInitialization until I can get a fix in: just copy the current method and add the individualizer logic. The queue individualizer is internal, though, so you'll have to copy that class from
https://github.com/Particular/NServiceBus.Azure/blob/master/src/NServiceBus.Azure.Transports.WindowsAzureServiceBus/Config/QueueIndividualizer.cs

Processing an email list asynchronously in MVC4

I'm trying to make my MVC4 website check whether people should be alerted with an email because they haven't done something.
I'm having a hard time figuring out how to approach this. I checked whether the shared hosting platform would allow me to activate some sort of cron job, but this is not available.
So now my idea is to perform this check on each page request, which already seems suboptimal (because of the overhead). But I thought that by doing it asynchronously it would not be in the way of people just visiting the site.
I first tried to do this in the Application_BeginRequest method in Global.asax, but that gets called multiple times per page request, so that didn't work.
Next I found that I can make a global filter which executes on OnResultExecuted, which seemed promising, but it's still a no-go.
The problem I get there is that I'm using MvcMailer to send the mails, and when I execute it I get the error: {"Value cannot be null.\r\nParameter name: httpContext"}
This probably means that the mailer needs the context.
The code I now have in my global filter is the following:
public override void OnResultExecuted(ResultExecutedContext filterContext)
{
    base.OnResultExecuted(filterContext);
    HandleEmptyProfileAlerts();
}

private void HandleEmptyProfileAlerts()
{
    new Thread(() =>
    {
        bool active = false;
        new UserMailer().AlertFirst("bla#bla.com").Send();

        DB db = new DB();
        DateTime CutoffDate = DateTime.Now.AddDays(-5);
        var ProfilesToAlert = db.UserProfiles.Where(x => x.CreatedOn < CutoffDate && !x.ProfileActive && x.AlertsSent.Where(y => y.AlertType == "First").Count() == 0).ToList();

        foreach (UserProfile up in ProfilesToAlert)
        {
            if (active)
            {
                new UserMailer().AlertFirst(up.UserName).Send();
                up.AlertsSent.Add(new UserAlert { AlertType = "First", DateSent = DateTime.Now, UserProfileID = up.UserId });
            }
            else
                System.Diagnostics.Debug.WriteLine(up.UserName);
        }

        db.SaveChanges();
    }).Start();
}
So my question is, am I going about this the right way, and if so, how can I make sure that MVCMailer gets the right context?
The usual way to do this kind of thing is to have a single background thread that periodically does the checks you're interested in.
You would start the thread from Application_Start(). It's common to use a database to queue and store work items, although it can also be done in memory if it's better for your app.
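A minimal sketch of that approach in Global.asax.cs (the method name CheckProfilesAndSendAlerts and the 15-minute interval are placeholders, and on shared hosting the app pool can be recycled, so this is best-effort):
using System;
using System.Threading;

public class MvcApplication : System.Web.HttpApplication
{
    // Single background timer for the whole application.
    private static Timer alertTimer;

    protected void Application_Start()
    {
        // ... existing route/bundle/filter registration ...

        // Run the check periodically on a thread-pool thread, outside any request.
        alertTimer = new Timer(
            _ => CheckProfilesAndSendAlerts(),   // placeholder for the alerting logic
            null,
            TimeSpan.Zero,
            TimeSpan.FromMinutes(15));
    }

    private static void CheckProfilesAndSendAlerts()
    {
        // Query profiles past the cutoff date and send the alert mails here.
        // Note: MvcMailer renders views via HttpContext, so mails sent from a
        // background thread may need a different mail-building approach.
    }
}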

Why does my SharePoint workflow always stop when I use this code?

I need to find a user in a list to set the AssignedTo task property; this information is in a list. So I use this method:
public static SPUser GetSPUser(SPListItem item, string key)
{
    SPFieldUser field = item.Fields[key] as SPFieldUser;

    if (field != null)
    {
        SPFieldUserValue fieldValue = field.GetFieldValue(item[key].ToString()) as SPFieldUserValue;
        if (fieldValue != null)
        {
            return fieldValue.User;
        }
    }

    return null;
}
The problem is that when I use this method or this part of code, my workflow stops without saying anything. Here is an example of the code where I use it:
using (SPSite site = new SPSite(adress_of_my_site))
{
    using (SPWeb web = site.OpenWeb())
    {
        SPList list = web.Lists["Acteurs du projet"];
        SPView view = cobj_ListeDesActeursDuProjet.DefaultView;
        SPListItemCollection itemcollection = list.GetItems(view);

        foreach (SPListItem item in itemcollection)
        {
            SPUser lobj_acteur = Utilities.GetSPUser(item, "acteur");

            // Dictionary<string,class>
            ActeursDuProjet[item["Rôle"].ToString()] =
                new ActeursDuProjet()
                {
                    Login = lobj_acteur.LoginName,
                    Email = lobj_acteur.Email
                };
        }
    }
}
If I comment out the content of my foreach, the workflow continues fine...
If anybody has an idea, that would be great.
Regards,
Loïc
edit: problem in the code
Here are some debugging tips that might help:
ULS logs
Any exceptions should be reported here in some detail.
Enable debugging for all .NET code
This will cause the debugger to break whenever an exception occurs in SharePoint as well as your code. The downside is that the debugger will break on 'normal' exceptions that cause no side effects. So don't be misled!
To enable: Go to Debug, Exceptions and tick Common Language Runtime Exceptions. Also go to Tools, Options, Debugging and untick Enable Just My Code. Then attach to w3wp.exe.
Commenting code
You could also comment out all of your code. If the workflow step fails, you know there is a problem elsewhere. If the workflow step passes, then start uncommenting code until it fails - then you know where to look.
I tried posting this as a comment above, but it didn't format nicely, so here it is.
It probably is fine, but this looks fishy to me:
// Dictionary<string,class>
ActeursDuProjet[item["Rôle"].ToString()] =
    new ActeursDuProjet()
    {
        Login = lobj_acteur.LoginName,
        Email = lobj_acteur.Email
    };
I would think it would be something like:
// dictionary declared somewhere earlier
Dictionary<string, ActeursDuProjet> roles = new Dictionary<string, ActeursDuProjet>();

// inside the foreach
string role = item["Rôle"].ToString();
if (!roles.ContainsKey(role))
{
    ActeursDuProjet foo = new ActeursDuProjet();
    foo.Login = lobj_acteur.LoginName;
    foo.Email = lobj_acteur.Email;
    roles.Add(role, foo);
}
