How to create an image import job using the KTA SDK?

I am trying to create a job using the SDK. A simple job with a send email activity works like a charm!
But when I try to create a job with an input folder variable to import a few images, it doesn't work at all. Am I missing some trivial setting?
My process has classification and extraction activities.
Variables: DefaultImportFolder
FYI: my process works fine if I set Import Settings -> Import Sources, which tells me there is no issue with the process itself. But when I try to run it through a console app with dynamic variables, it doesn't work.
Following is my sample code. Any help?
ProcessIdentity processIdentity = new ProcessIdentity
{
    Name = "SDK TestProcess"
};
var jobService = new TotalAgility.Sdk.JobService();
JobInitialization jobInitialization = new JobInitialization();
InputVariableCollection variablesCollections = new InputVariableCollection();
InputVariable inputVariable = new InputVariable
{
    Id = "DefaultImportFolder",
    Value = @"\\FolderPath",
};
variablesCollections.Add(inputVariable);
inputVariable = new InputVariable
{
    Id = "ExportSuccess",
    Value = "true"
};
variablesCollections.Add(inputVariable);
var createJobAndProgress = jobService.CreateJob(sessionId, processIdentity, jobInitialization);
Console.WriteLine($"Job ID {createJobAndProgress.Id}");
As suggested by Steve, I tried the WithDocuments method. Still no luck...
JobWithDocumentsInitialization jobWithDocsInitialization = new JobWithDocumentsInitialization();
Agility.Sdk.Model.Capture.RuntimeDocumentCollection documentsCollection = new Agility.Sdk.Model.Capture.RuntimeDocumentCollection();
Agility.Sdk.Model.Capture.RuntimeDocument runtimeDoc = new Agility.Sdk.Model.Capture.RuntimeDocument
{
    FilePath = @"FolderPath\abc.tif",
};
documentsCollection.Add(runtimeDoc);
jobWithDocsInitialization.Documents = documentsCollection;
var jobIdentity = jobService.CreateJobWithDocuments(sessionId, processIdentity, jobWithDocsInitialization);
Console.WriteLine($"Job ID {jobIdentity.Id}");

A folder variable represents a reference to a folder that already exists in the KTA database, so you can't just set a file path to the variable. When you create a job via an import source, it is creating the folder and documents as part of creating the job.
To do the same in your code, you would use one of the "WithDocuments" APIs, such as CreateJobWithDocuments, which has parameters specific to importing documents into the process, including by file path.
As discussed in this other answer (Kofax TotalAgility Send a PDF Document to Jobs Queue (KTA)), you may want to look at the sample code included with the product (most people don't realize it is available), and also at other API functions, for more context on the parameters needed for the "WithDocuments" APIs mentioned above.
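For reference, here is a minimal sketch that combines the two snippets from the question into the shape described above. It is not a definitive implementation: the process name and *.tif filter are placeholders, sessionId is assumed to come from an earlier logon, and the using directives are assumed to match the console sample above (plus System.IO for Directory).
// Sketch: one RuntimeDocument per image file; KTA creates the folder and
// documents as part of creating the job, as described above.
static void CreateImportJob(string sessionId, string importFolder)
{
    var jobService = new TotalAgility.Sdk.JobService();
    var processIdentity = new ProcessIdentity { Name = "SDK TestProcess" };

    var jobInit = new JobWithDocumentsInitialization();
    var documents = new Agility.Sdk.Model.Capture.RuntimeDocumentCollection();

    foreach (var filePath in Directory.GetFiles(importFolder, "*.tif"))
    {
        documents.Add(new Agility.Sdk.Model.Capture.RuntimeDocument { FilePath = filePath });
    }
    jobInit.Documents = documents;

    var jobIdentity = jobService.CreateJobWithDocuments(sessionId, processIdentity, jobInit);
    Console.WriteLine($"Job ID {jobIdentity.Id}");
}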

Related

Scheduled CSV Import Deployment Error in NetSuite

I tried to update employee records in a NetSuite system using a Scheduled CSV Import.
While deploying the scheduled script, I set the status to "Testing" and clicked the "Save & Execute" button for testing purposes.
Then I checked the execution log, which shows the following error message:
" com.netledger.app.common.scripting.version1.nlobjCSVImportImplV1"
Scheduled Script Code:
function scheduled(type) {
    var fileId = nlapiGetContext().getSetting('SCRIPT', 'custscript_sfg_customer_rec_cus_param');
    nlapiLogExecution('DEBUG', 'fileId:', fileId);
    var import1 = nlapiCreateCSVImport();
    nlapiLogExecution('DEBUG', 'import1:', import1);
    var mapping = import1.setMapping('CUSTIMPORT_emp_rec_imp'); // internal id of the mapping
    nlapiLogExecution('DEBUG', 'mapping:', mapping);
    var setPrimary = import1.setPrimaryFile(nlapiLoadFile(fileId)); // fileId is the internal id of the employee file
    nlapiLogExecution('DEBUG', 'setPrimary:', setPrimary);
    var submitImport = nlapiSubmitCSVImport(import1); // submit the import
    nlapiLogExecution('DEBUG', 'submitImport:', submitImport);
}
Please help me find a solution for creating the scheduled CSV import.
Thanks in advance.
There are a couple of possible issues with your code:
When I have done this I've used all lowercase for the import id, e.g. use 'customimport_emp_rec_imp' rather than 'CUSTIMPORT_emp_rec_imp'.
Make sure the saved CSV import definition is public.
Set a job name with import1.setOption("jobName", "Employee Import");

Making JSLink target a specific list

Background
I have a page where I'm showing two list views from two separate lists, which both have Custom List as their ListTemplate. They have separate JSLink files because I don't want them to look alike.
Problem
The JSLink file targets both list views since they use the same template.
Code
(function () {
    var listContext = {};
    listContext.Templates = {};
    listContext.ListTemplateType = 100;
    listContext.Templates.Header = "<div><ul>";
    listContext.Templates.Footer = "</ul></div>";
    listContext.Templates.Item = LinkTemplate;
    SPClientTemplates.TemplateManager.RegisterTemplateOverrides(listContext);
})();
Question
Is there any way to make the JS target only a specific list?
I ended up going with Paul Hunt's solution, which he writes about on myfatblog.co.uk: http://www.myfatblog.co.uk/index.php/2013/09/listview-web-part-issues-with-jslink-and-display-templates-a-solution/
The script ended up looking like this; I pasted it into the JSLink function where I define which listContext to override.
// Override the RenderListView once the ClientTemplates.js has been called
ExecuteOrDelayUntilScriptLoaded(function () {
    // Copy and override the existing RenderListView
    var oldRenderListView = RenderListView;
    RenderListView = function (ctx, webPartID) {
        // Check the title and set the BaseViewId
        if (ctx.ListTitle == "List")
            ctx.BaseViewID = "list";
        // Now call the original RenderListView
        oldRenderListView(ctx, webPartID);
    };
}, "ClientTemplates.js");

How to make the cache refresh when the XML is changed?

I am using MvcSiteMapProvider 4.6.3 with MVC 4, and DI to configure the sitemap.
this.For<System.Runtime.Caching.ObjectCache>()
    .Use(s => System.Runtime.Caching.MemoryCache.Default);
this.For(typeof (ICacheProvider<>)).Use(typeof (RuntimeCacheProvider<>));
var rootCacheDependency = this.For<ICacheDependency>().Use<RuntimeFileCacheDependency>()
    .Ctor<string>("fileName").Is(rootFileName);
var rootCacheDetails = this.For<ICacheDetails>().Use<CacheDetails>()
    .Ctor<TimeSpan>("absoluteCacheExpiration").Is(absoluteCacheExpiration)
    .Ctor<TimeSpan>("slidingCacheExpiration").Is(TimeSpan.MinValue)
    .Ctor<ICacheDependency>().Is(rootCacheDependency);
var cacheDetails = new List<SmartInstance<CacheDetails>>();
var xmlSources = new List<SmartInstance<FileXmlSource>>();
How do I make it automatically update the cache when the sitemap XML is updated?
I am upgrading MvcSiteMapProvider from v3 to v4.
In version 3, the sitemap seemed to be refreshed automatically.
I set the cache expiration time to 5 minutes; is this causing the problem?
TimeSpan absoluteCacheExpiration = TimeSpan.FromMinutes(5);
var rootCacheDetails = this.For<ICacheDetails>().Use<CacheDetails>()
    .Ctor<TimeSpan>("absoluteCacheExpiration").Is(absoluteCacheExpiration)
    .Ctor<TimeSpan>("slidingCacheExpiration").Is(TimeSpan.MinValue)
    .Ctor<ICacheDependency>().Is(rootCacheDependency);
UPDATE
When I change a sitemap XML file, the cache is not updated until the cache expires after 5 minutes.
I am using multiple sitemap XML files.
var sitemapPath = HostingEnvironment.MapPath("~/Sitemaps");
var sitemaps = new List<string>();
if (sitemapPath != null)
{
    sitemaps.AddRange(Directory.GetFiles(sitemapPath, "*.sitemap"));
}
foreach (var sitemapFileName in sitemaps)
{
    var cacheDependency = this.For<ICacheDependency>()
        .Use<RuntimeFileCacheDependency>()
        .Ctor<string>("fileName")
        .Is(sitemapFileName);
    cacheDetails.Add(this.For<ICacheDetails>().Use<CacheDetails>()
        .Ctor<TimeSpan>("absoluteCacheExpiration").Is(absoluteCacheExpiration)
        .Ctor<TimeSpan>("slidingCacheExpiration").Is(TimeSpan.MinValue)
        .Ctor<ICacheDependency>().Is(cacheDependency));
    xmlSources.Add(this.For<IXmlSource>().Use<FileXmlSource>()
        .Ctor<string>("fileName").Is(sitemapFileName));
}
Could this be the reason it's not working?
I don't see a problem with the code you posted. However, it is the RuntimeFileCacheDependency that will make it reload when the XML is changed.
The RuntimeFileCacheDependency expects the fileName argument to be an absolute path. So you must convert it using HostingEnvironment.MapPath before providing it to the RuntimeFileCacheDependency constructor.
var rootFileName = HostingEnvironment.MapPath("~/root.sitemap");
Response to Your Update
The purpose of the cacheDetails object is to specify the caching policy for a single SiteMapBuilderSet instance. If you look further down in the (original) DI module, notice that the variable is passed to the constructor of the SiteMapBuilderSet class.
// Configure the builder sets
this.For<ISiteMapBuilderSetStrategy>().Use<SiteMapBuilderSetStrategy>()
    .EnumerableOf<ISiteMapBuilderSet>().Contains(x =>
    {
        x.Type<SiteMapBuilderSet>()
            .Ctor<string>("instanceName").Is("default")
            .Ctor<bool>("securityTrimmingEnabled").Is(securityTrimmingEnabled)
            .Ctor<bool>("enableLocalization").Is(enableLocalization)
            .Ctor<bool>("visibilityAffectsDescendants").Is(visibilityAffectsDescendants)
            .Ctor<bool>("useTitleIfDescriptionNotProvided").Is(useTitleIfDescriptionNotProvided)
            .Ctor<ISiteMapBuilder>().Is(builder)
            .Ctor<ICacheDetails>().Is(cacheDetails); // <- caching specified here explicitly.
    });
This is what is used to expire the cache, but it is a completely separate mechanism from the part that specifies to use multiple files to build a SiteMap:
// Register the sitemap node providers
var siteMapNodeProvider = this.For<ISiteMapNodeProvider>().Use<CompositeSiteMapNodeProvider>()
    .EnumerableOf<ISiteMapNodeProvider>().Contains(x =>
    {
        x.Type<XmlSiteMapNodeProvider>()
            .Ctor<bool>("includeRootNode").Is(true)
            .Ctor<bool>("useNestedDynamicNodeRecursion").Is(false)
            .Ctor<IXmlSource>().Is(rootXmlSource);

        // NOTE: Each additional XmlSiteMapNodeProvider instance for the same SiteMap instance must
        // specify includeRootNode as "false"
        x.Type<XmlSiteMapNodeProvider>()
            .Ctor<bool>("includeRootNode").Is(false)
            .Ctor<bool>("useNestedDynamicNodeRecursion").Is(false)
            .Ctor<IXmlSource>().Is(childXmlSource1);
        x.Type<XmlSiteMapNodeProvider>()
            .Ctor<bool>("includeRootNode").Is(false)
            .Ctor<bool>("useNestedDynamicNodeRecursion").Is(false)
            .Ctor<IXmlSource>().Is(childXmlSource2);
        // Add additional XmlSiteMapNodeProviders here (with includeRootNode as "false")...

        // You only need this if you intend to use MvcSiteMapNodeAttribute in your application
        x.Type<ReflectionSiteMapNodeProvider>()
            .Ctor<IEnumerable<string>>("includeAssemblies").Is(includeAssembliesForScan)
            .Ctor<IEnumerable<string>>("excludeAssemblies").Is(new string[0]);
    });

// Register the sitemap builders
var builder = this.For<ISiteMapBuilder>().Use<SiteMapBuilder>()
    .Ctor<ISiteMapNodeProvider>().Is(siteMapNodeProvider);
This is how to specify multiple XML files for a single SiteMap, but it is also possible to make each XML file into its own SiteMap instance by passing each instance of XmlSiteMapNodeProvider to a separate SiteMapBuilder and a separate SiteMapBuilderSet as described in Multiple SiteMaps in One Application.
IMPORTANT: For multiple XML files to work on a single SiteMap instance, you must specify the same key for the root node of each SiteMap as shown at the bottom of this answer. But you cannot specify a node representing the same controller action in more than one XML file (other than the root node).
If you need more flexibility than this, I would suggest implementing your own XmlSiteMapNodeProvider or abandoning the idea of using XML altogether, since using ISiteMapNodeProvider or IDynamicNodeProvider is much more flexible.
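If you do go the IDynamicNodeProvider route, a rough sketch of a provider looks something like the following. This is only an illustration: the ProductDynamicNodeProvider name, the controller/action values, and the hard-coded ids are placeholders, and the exact member signatures are worth checking against the MvcSiteMapProvider version you are using.
using System.Collections.Generic;
using MvcSiteMapProvider;

// Sketch of a dynamic node provider; replace the hard-coded ids with your own data access.
public class ProductDynamicNodeProvider : DynamicNodeProviderBase
{
    public override IEnumerable<DynamicNode> GetDynamicNodeCollection(ISiteMapNode node)
    {
        foreach (var id in new[] { 1, 2, 3 })
        {
            var dynamicNode = new DynamicNode
            {
                Title = "Product " + id,
                Controller = "Product",
                Action = "Details"
            };
            dynamicNode.RouteValues.Add("id", id);
            yield return dynamicNode;
        }
    }
}
The provider is then referenced from a node definition, typically via the dynamicNodeProvider attribute in the XML or the DynamicNodeProvider property of MvcSiteMapNodeAttribute.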
Now, back to the caching. If you are indeed using multiple XML files in the same SiteMap instance, you need to use a RuntimeCompositeCacheDependency so each of the files will be considered a dependency for the same cache, but you must use a single instance of CacheDetails.
var rootCacheDependency =
    this.For<ICacheDependency>().Use<RuntimeFileCacheDependency>()
        .Ctor<string>("fileName").Is(rootAbsoluteFileName);
var childCacheDependency1 =
    this.For<ICacheDependency>().Use<RuntimeFileCacheDependency>()
        .Ctor<string>("fileName").Is(childAbsoluteFileName1);
var childCacheDependency2 =
    this.For<ICacheDependency>().Use<RuntimeFileCacheDependency>()
        .Ctor<string>("fileName").Is(childAbsoluteFileName2);
var cacheDependency =
    this.For<ICacheDependency>().Use<RuntimeCompositeCacheDependency>()
        .Ctor<ICacheDependency[]>().Is(new ICacheDependency[]
        {
            (ICacheDependency)rootCacheDependency,
            (ICacheDependency)childCacheDependency1,
            (ICacheDependency)childCacheDependency2
        });
var cacheDetails =
    this.For<ICacheDetails>().Use<CacheDetails>()
        .Ctor<TimeSpan>("absoluteCacheExpiration").Is(absoluteCacheExpiration)
        .Ctor<TimeSpan>("slidingCacheExpiration").Is(TimeSpan.MinValue)
        .Ctor<ICacheDependency>().Is(cacheDependency);

Unable to change the account reference inside a contact using the SDK in CRM 2011

I am unable to change the client by updating the contact using the CRM 2011 SDK. Here is the code I am using to do that:
Entity contact = new Entity();
contact.LogicalName = "contact";
contact.Attributes = new AttributeCollection();
EntityReference clientLookup = new EntityReference();
clientLookup.Id = NewClientBId;
clientLookup.LogicalName = "account";
contact.Attributes.Add("parentcustomerid", clientLookup);
contact.Attributes.Add("contactid", workItem.Id);
SynchronousUtility.UpdateDynamicEntity(CrmConnector.Service, contact);
The code runs fine without any error, but when I go to the web portal and check the record, it still points to the old account, though the modification timestamp was updated. I also checked the SQL Profiler trace, which shows the query below:
exec sp_executesql N'update [ContactBase] set
[ModifiedOn]=@ModifiedOn0, [ModifiedBy]=@ModifiedBy0,
[ModifiedOnBehalfBy]=NULL where ([ContactId] =
@ContactId0)',N'@ModifiedOn0 datetime,@ModifiedBy0
uniqueidentifier,@ContactId0
uniqueidentifier',@ModifiedOn0='2013-07-04
09:21:02',@ModifiedBy0='2F8D969F-34AB-E111-9598-005056947387',@ContactId0='D80ACC4E-A185-E211-AB64-002324040068'
As can be seen above, the column I updated is not even in the SET clause of the update query. Can anyone help me with this?
I tested your code and it works:
Entity contact = new Entity();
contact.LogicalName = "contact";
contact.Attributes = new AttributeCollection();
EntityReference clientLookup = new EntityReference();
clientLookup.Id = new Guid("3522bae7-5ae5-e211-9d27-b4b52f566dbc");
clientLookup.LogicalName = "account";
contact.Attributes.Add("parentcustomerid", clientLookup);
contact.Attributes.Add("contactid", new Guid("16dc4143-5ae5-e211-9d27-b4b52f566dbc"));
As you can see, I used Ids that exist in my environment, and to perform the update I used
service.Update(contact);
Reasons why your code might not be working (the sketch below can help narrow down which one applies):
NewClientBId is not the right account Guid
workItem.Id is not the right contact Guid
the function SynchronousUtility.UpdateDynamicEntity has errors
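A minimal sketch along these lines can help isolate the cause. It assumes service is a working IOrganizationService; the method name and column choices are purely illustrative:
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

// Quick check for the three possible causes, assuming the two Guids come from your own code.
static void UpdateParentCustomer(IOrganizationService service, Guid contactId, Guid accountId)
{
    // Causes 1 and 2: these calls throw if either Guid does not point at an existing record.
    service.Retrieve("account", accountId, new ColumnSet("name"));
    service.Retrieve("contact", contactId, new ColumnSet("fullname"));

    // Cause 3: bypass SynchronousUtility.UpdateDynamicEntity and call the SDK directly.
    var contact = new Entity("contact") { Id = contactId };
    contact["parentcustomerid"] = new EntityReference("account", accountId);
    service.Update(contact);
}
If both Retrieve calls succeed and the direct Update sticks, the problem is inside SynchronousUtility.UpdateDynamicEntity; if either Retrieve throws, the corresponding Guid is wrong.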

Saving per-user or per-document preferences in a Photoshop script

I'm working on a Photoshop script in JavaScript using ExtendScript. My script allows some user input, and I'd like to save it between uses. That is, I'm looking for a way to save a simple string or numeric value under a particular key so that I'll be able to access it on subsequent uses of the script. Simply put, I want to save a preference for my script. How do I do that?
Even better would be to be able to save at least some preferences on a per-document basis. Is that possible? That is, can I store an arbitrary bit of data with a document?
You can use put/get custom options to save preference parameters that persist across Photoshop launches:
const kMyFlag = app.stringIDToTypeID( "myFlag" );
const kMyNumber = app.stringIDToTypeID( "myNumber" );
const kMySettings = "mySettings";

function saveSettings()
{
    var desc = new ActionDescriptor();
    desc.putBoolean(kMyFlag, true);
    desc.putInteger(kMyNumber, 42);
    // "true" means the setting persists across Photoshop launches.
    app.putCustomOptions( kMySettings, desc, true );
}

function getSettings()
{
    var desc = app.getCustomOptions( kMySettings );
    return [desc.getBoolean( kMyFlag ), desc.getInteger( kMyNumber )];
}
You have some options. You can create a text file and write to it using the File object:
var prefs = new File("~/desktop/prefs.txt");
prefs.open("w"); // or "a" to append
prefs.writeln("user:lenny;favorite_color:ff6600;likes:sunsets;");
prefs.close();
...if you wanted your preferences tied to the script itself.
If you want per-document preferences, you could write a string to one of the metadata fields of the file you're working on using Document.info, like this (using the 'instructions' field, but you could use any writable field):
var doc = app.activeDocument;
doc.info.instructions = "user:lenny;favorite_color:ff6600;likes:sunsets;";
//alert(doc.info.instructions); // see, it works!
As for how to actually format the string, you could do it like a simple config file, or, if you have a complex user preferences object, you could use the XML object to construct and serialize it. JSON would be great for this, but unfortunately there is no built-in JSON object in ExtendScript.
For per-document prefs I suggest using the XMP metadata. You can find an example snippet here: http://forums.adobe.com/thread/790973. You can leverage the AdobeXMPScript library to create your own namespace, as Paul Riggott suggests in that link.
