I have developed an application on Datacap 9.0.0.2.
I need to make a certain field in my application read-only. I run my application with FastDoc. I tried everything and succeeded in making this field read-only in Datacap Desktop and in the web admin client, but I could not make it read-only in FastDoc.
How can I make this field read-only with FastDoc?
The read-only mechanism you reference (the field variable 'ReadOnly' = 1) is only available, and only operates, in the VeriFine web client and the Datacap Desktop thick client.
You will likely need to raise an enhancement request to have this supported in FastDoc.
It's possible to 'hack' around the problem by using validation rules that always repopulate the initial 'read-only' value; however, I suppose this largely depends on your use case.
IBM Content Navigator for Datacap (Datacap Navigator) supports a read-only mechanism via EDS (External Data Services).
We are upgrading from Maximo 7.5 to 7.6.1. Our web service that uses MXINVISSUEInterface is throwing an exception when we try to issue a part that is marked as a spare part and the work order has an asset. The exception says "BMXAA4195 - A value is required for the Organization field on the SPAREOBJECT object." The part is not in the SPAREPART table for the asset so it is trying to add it, but for some reason the ORGID is not populated from the MXINVISSUE_MATUSETRANSType object.
I re-generated the WSDL on the new server and rebuilt the solution, but even after populating a new required field, I still get the same error.
Is there a system property that must be set? It works in 7.5, writing the record to MATUSETRANS and SPAREPART.
This sounds like a bug, so you might raise a Support Case with IBM about it. For a workaround until IBM releases a fix and you install said fix, consider the following options.
Can you set the Default Insert Site for the user using the web service?
Is it practical to put a Default Value on SPAREPART.ORGID?
Create an automation script called SPAREPART.NEW that will somehow figure out an ORGID to use. To "figure out", my first thought would be to check whether the mbo has an owner that has an ORGID and, assuming it does, use that.
I hope I am using the proper terminology. Please correct me if not.
How can I access an .nsf file with PHP? I need to get some field values from tables for client login purposes around the globe. Let's say I need to validate whether a client is registered, and that value is stored in an xyz.nsf file. How do I connect, and how do I access it?
Access your data with HTTP/HTTPS requests via URL.
You have several choices:
use out-of-the-box Domino URL commands, such as the URL commands for opening documents by key
use IBM Domino Access Services (DAS)
create your own REST service based on ExtLib REST Service
create an XAgent which delivers the data as JSON (see example)
create your own Domino REST service using DAS
Choice #1 returns HTML. All the others return JSON data, which is probably easiest to handle with PHP. I'd go for choice #3, the ExtLib REST Service.
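To see the URL-plus-JSON pattern concretely, here is a minimal sketch of option #2 (DAS), which needs no server-side code, assuming DAS is enabled on xyz.nsf and that a view named "Clients" is keyed by the client ID; the host, view name, key, and credentials are placeholders. It is shown in C# purely for illustration; in PHP the same GET is a single curl or file_get_contents() call followed by json_decode().

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class DasLookup
{
    static async Task Main()
    {
        using var http = new HttpClient();

        // DAS normally requires authentication; basic auth is shown as one option.
        http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes("user:password")));

        // Read the view entries whose key matches the client id; the response is JSON.
        string url = "https://domino.example.com/xyz.nsf/api/data/collections/name/Clients"
                     + "?keys=client%40example.com";

        string json = await http.GetStringAsync(url);
        Console.WriteLine(json); // an empty entry list means the client is not registered
    }
}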
Following is my requirement:
Whenever a site is created, we add some custom attributes to the created site with the help of a GroupListener.
So assume you are creating a "Liferay Test" site; it will then have a fixed custom attribute "XYZ" whose value is set in the GroupListener's onAfterCreate method.
I can see this value in custom fields under site settings.
Now, based on these values, we create groups in another system (outside Liferay, using web services).
So far so good.
Whenever we delete the site, we need to remove the equivalent groups from the other system via the web service.
But while deleting the site, we are not able to retrieve the custom attributes in the GroupListener.
On further debugging, by adding an expando listener, I observed that the Expando listeners are called first, and only then the delete method of GroupLocalService/GroupListener.
Hence we are not able to delete the groups present in the other system.
So I was wondering if we can have an ordering defined for the listeners.
Note: Since we were not getting the custom attributes in the listeners, we implemented GroupLocalServiceImpl; with this we do get the custom attributes in the delete method on our local environment, but not on our stage environment, which has clustering.
You shouldn't use ModelListeners for this kind of change; rather, create ServiceWrappers, e.g. wrap the interesting methods of GroupLocalService (for creation as well as deletion).
This will also enable you to react to failures to create records in your external system etc.
We have a working website using ServiceStack as the back end that amounts to a complex data-entry form.
My users have requested an "offline editor" for the forms. To use the offline program, the user will have to connect to the ServiceStack service, create empty instances of the forms, and then I will save the POCOs from the service to disk using ServiceStack's JSON serializer. From there the user can log off the service and edit the POCOs. When they're done, they reconnect to the service, and post/put the edited POCO object.
This all works great. My question involves validation. The validation logic is built into my Service.Interface library, which isn't available offline. The WinForms program references only the POCO library and the ServiceStack "common" libraries, which do not appear to include the ServiceStack.Validation namespace.
Is there a way I can rearrange my project so that both the service and the Winforms client can run Validation against the POCOs, so that they can have data validation while offline?
UPDATE:
Getting closer, I think: I moved all of the validation classes into their own project. From my WinForms project, I can now manually set up a validator for a POCO class like this:
ServiceStack.FluentValidation.IValidator<SomePOCO> validator =
    new Tonto.Svc.Validation.SomePOCOValidator();
ServiceStack.FluentValidation.Results.ValidationResult vr =
    validator.Validate(_rpt);
I can see the validator constructor being set up and the rules being initialized, but the .Validate method doesn't seem to do anything (the object comes back as valid, and breakpoints in the custom validator code are never hit).
UPDATE #2
I discovered my validator code wasn't running from WinForms because my validators all specify a ServiceStack ApplyTo of Put/Post only (see the sample code below). When I remove the entire RuleSet clause, though, validation happens in my service on GETs, which is something I never want.
Can anyone think of a way to configure the validator rules to run for POST/PUT only when called from ServiceStack, but to also always run when NOT in ServiceStack? So close!
public class SomePOCOValidator : AbstractValidator<SomePOCO>
{
    public SomePOCOValidator()
    {
        RuleSet(ApplyTo.Put | ApplyTo.Post, () =>
        {
            // (rules)
        });
    }
}
If your validation is doing anything interesting, then it probably HAS to be done "online".
Maybe just allow your client to save the POCOs locally until they go back online, at which point you send them up to your server. Any transactions that are okay get processed normally, and any that fail get returned for the user to edit (so your client will need some smarts to maintain a working set of POCOs for editing)...
If you don't want ANY extra stuff on the client, just have the transactions that fail to validate get stuffed into a "needs_corrections" table on the server, and then code up a supervisor-sort of screen to manage that table.
The validation framework that ServiceStack uses is named FluentValidation. There is no WinForms support in it. Jeremy Skinner, the creator of FluentValidation, answered a question about this back in 2010 on his forum here.
Personally I don't use FV with WinForms; the vast majority of my projects are web-based, with the occasional WPF project.
However, if I was going to do this then I probably wouldn't validate the controls directly, but instead use a ViewModel which is bound to the controls. I'd use a fairly strict convention where the names of the controls would match the names of the properties that they're bound to. Then, after validation completes I'd walk the control hierarchy to find the control with the name that matches the property that failed validation (I'm not sure how you'd do this in WinForms, but in WPF I'd use LogicalTreeHelper.FindLogicalNode) and then use the ErrorProvider to set the appropriate error.
Jeremy
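For illustration only (this sketch is not part of Jeremy's answer): in WinForms, the control-matching idea above might look roughly like the following, assuming each input control is named after the POCO property it is bound to and reusing the SomePOCO validator from this question; the form and method names are placeholders.

using System.Windows.Forms;
using ServiceStack.FluentValidation.Results;

public class SomePOCOEditForm : Form
{
    private readonly ErrorProvider _errorProvider = new ErrorProvider();

    // Map each validation failure to the control named after the failing property.
    private void ShowValidationErrors(ValidationResult result)
    {
        _errorProvider.Clear();

        foreach (ValidationFailure failure in result.Errors)
        {
            Control[] matches = Controls.Find(failure.PropertyName, searchAllChildren: true);
            if (matches.Length > 0)
                _errorProvider.SetError(matches[0], failure.ErrorMessage);
        }
    }
}

ShowValidationErrors would be fed the ValidationResult returned by SomePOCOValidator.Validate(...).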
I was able to work out a solution that allowed me to use ServiceStack validation libraries on both a ServiceStack client and an offline client. Here are the details.
Move all AbstractValidators to their own project: Proj.Svc.Validation.
Get rid of all RuleSets in your AbstractValidators.
Reference Proj.Svc.Validation from Proj.Svc.Interface and Proj.OfflineWinformsClient projects.
Turn OFF the ValidationFeature() plugin in your service. All validation will have to be done manually. This means no IoC-injected validators in your service classes.
When it's time to validate, either from your service or from the offline client, manually declare the validator and use it like this:
ServiceStack.FluentValidation.IValidator<SomePOCO> validator =
    new Tonto.Svc.Validation.SomePOCOValidator();
ServiceStack.FluentValidation.Results.ValidationResult vr =
    validator.Validate(poco);
if (!vr.IsValid)
{
    // throw an exception or notify the user somehow
}
When I launch a RemoteApp via Remote Desktop Web Access, is there a way to send data to the remote app?
Desired scenario:
A user logs into a website with their credentials. They also provide demographic information such as first name, last name, address, etc.
The website connects to the RemoteApp via SSO and makes the demographic information available to the RemoteApp.
For example, if the RemoteApp is a Windows Forms app, can I get this information and display it in a message box?
Edit1: TomTom's response in this question mentions using named pipes to send data. Is that applicable to this problem?
It turns out you can pass command-line parameters to the RemoteApp using the remoteapplicationcmdline property, like so:
remoteapplicationcmdline:s:/Parameter1: 5234 /Parameter2: true
(The names "/Parameter1" and "/Parameter2" are just examples. Your remote app will have to define and handle these as appropriate.)
This setting is part of the RdpFileContents property of the MsRdpClientShell object.
Here is a resource for other RdpFileContents properties.
Your code might end up looking something like this:
MsRdpClientShell.PublicMode = true;
MsRdpClientShell.RdpFileContents = 'redirectclipboard:i:1 redirectposdevices:i:0 remoteapplicationcmdline:s:/Parameter1: 5234 /Parameter2: true [Other properties here...]';
MsRdpClientShell.Launch();
For larger amounts of information, we might send preliminary data to a web service, retrieve an identifier back, pass this identifier to the RemoteApp via the command line, then have the RemoteApp query the web service to get all the information.
Of course, for the parameters to be of use, the program must be looking for them. Setting up a database to query raises a bit of a security issue if the data is sensitive.
If the program (RemoteApp) is looking for data in the form of a CSV or a table or something, then you might be able to send a lot of data to be processed. It just depends on what parameters (and what form) the program is going to use.
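As a rough sketch of the "looking for them" side, assuming the RemoteApp is a Windows Forms program launched with the example /Parameter1 and /Parameter2 arguments from above (a real application would want sturdier parsing):

using System;
using System.Windows.Forms;

static class Program
{
    [STAThread]
    static void Main(string[] args)
    {
        string param1 = null, param2 = null;

        // With remoteapplicationcmdline:s:/Parameter1: 5234 /Parameter2: true,
        // args arrives as { "/Parameter1:", "5234", "/Parameter2:", "true" }.
        for (int i = 0; i < args.Length - 1; i++)
        {
            if (args[i].Equals("/Parameter1:", StringComparison.OrdinalIgnoreCase))
                param1 = args[i + 1];
            else if (args[i].Equals("/Parameter2:", StringComparison.OrdinalIgnoreCase))
                param2 = args[i + 1];
        }

        // Show the received values in a message box, as asked in the question above.
        MessageBox.Show("Parameter1 = " + param1 + Environment.NewLine +
                        "Parameter2 = " + param2);
    }
}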