Remote creation of custom actions in Thunar - Linux

I'm using Thunar as the file browser on a Linux network of 100 CentOS 7.2 machines. We manage the installation of those workstations with a PXE server and SaltStack.
I need to create those custom actions during the installation.
Currently, Thunar is installed on every workstation and the script is available on a share, but I have to create the custom action on each machine by hand: open Thunar, go to Edit > Configure custom actions, and set the command that launches my script in xterm for the selected folder:
xterm -e "/path/to/my/script.sh %f"
Is there a way to create Thunar custom actions from the command line, or by editing a file, so that I can deploy them through Salt's cmd.run?
Thanks for your help.

I've found out that those custom actions are stored in this file:
cat ~/.config/Thunar/uca.xml
Here is an example of the syntax:
<?xml encoding="UTF-8" version="1.0"?>
<actions>
  <action>
    <icon>script.png</icon>
    <name>My custom action</name>
    <unique-id>1479309009025049-2</unique-id>
    <command>xterm -e "/path/to/my/script.sh %f"</command>
    <patterns>*</patterns>
    <startup-notify/>
    <directories/>
    <audio-files/>
    <image-files/>
    <other-files/>
    <text-files/>
    <video-files/>
  </action>
</actions>
That way, I can create a template file and copy it into each user's configuration folder.
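For example, here is a minimal sketch of how the file could then be pushed with Salt from the master. It assumes the template has been placed on the Salt fileserver as salt://thunar/uca.xml and that copying it into /etc/skel is acceptable for newly created users; both paths are assumptions, not part of the original setup:
# create the target directory on every minion (path under /etc/skel is an assumption)
salt '*' cmd.run 'mkdir -p /etc/skel/.config/Thunar'
# copy the template from the Salt fileserver onto each machine
salt '*' cp.get_file salt://thunar/uca.xml /etc/skel/.config/Thunar/uca.xml
Existing users would still need the file copied into their own ~/.config/Thunar/uca.xml.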

Related

"An option is expected here instead of "T"" error when validating or deploying a SuiteApp project using the sdfcli command

When validating or deploying the SuiteApp project, it returned: An option is expected here instead of "T".
For validating the project I used this command: sdfcli validate -authid [AUTH_ID] -p [PATH_TO_SUITE_APP_FOLDER] -applycontentprotection T
For deploying the project I used this command: sdfcli deploy -authid [AUTH_ID] -p [PATH_TO_SUITE_APP_FOLDER] -applycontentprotection T
hiding.xml
<preference type="HIDING" defaultAction="UNHIDE">
  <apply action="HIDE">
    <path>~/FileCabinet/SuiteApps/xxx.xxx.xxx/script.js</path>
  </apply>
</preference>
locking.xml
<preference type="LOCKING" defaultAction="UNLOCK">
  <apply action="LOCK">
    <object>custcontenttype_myobject</object>
  </apply>
</preference>
NetSuite Account Release: 2020.2
SDFCLI: 2020.2
JDK: 11
Additional note: when I ran the command without -applycontentprotection T, it deployed to the target account, but the file content was visible in the target account.
No T is necessary; the switch is just --applycontentprotection (note the two dashes as well). If the switch is present, Content Protection is applied; if not, it is not applied.
Reference Help - project validate
In the SuiteCloud CLI for Node.js you only have to provide the flag --applycontentprotection; the old behaviour, where a user had to provide a T or F value, has been changed.
You can always check the help with suitecloud {command} --help.
For applycontentprotection, you can check the help via suitecloud project:deploy -h.
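Put together, the corrected invocations would look roughly like this; the [AUTH_ID] and [PATH_TO_SUITE_APP_FOLDER] placeholders are the same as above, and the exact flag spelling should be confirmed against your CLI version's help:
sdfcli validate -authid [AUTH_ID] -p [PATH_TO_SUITE_APP_FOLDER] --applycontentprotection
sdfcli deploy -authid [AUTH_ID] -p [PATH_TO_SUITE_APP_FOLDER] --applycontentprotection
suitecloud project:deploy --applycontentprotection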

Trying to create a handler for .maff files in Linux

MAFF files are simply zip files. I'm trying to create a handler for .maff on Linux so that when I click on them or type xdg-open x.maff it will call my handler instead of the default, which is to open the directory in nautilus. I created an application-x-maff.xml file that contains:
<?xml version="1.0" encoding="UTF-8"?>
<mime-info xmlns="http://www.freedesktop.org/standards/shared-mime-info">
  <mime-type type="application/x-maff">
    <comment>maff type</comment>
    <magic priority="100">
      <match offset="0" type="string" value="PK\x03\x04" />
    </magic>
    <glob pattern="*.maff"/>
  </mime-type>
</mime-info>
and saved it in ~/.local/share/mime/packages. I also created ~/.local/share/applications/maffapplication.desktop, which contains:
[Desktop Entry]
Type=Application
MimeType=application/x-maff;
Name=Maff Handler
Exec=<my home path>/bin/linux/maffHandler
and executed
% update-mime-database ~/.local/share/mime/packages/
% update-desktop-database ~/.local/share/applications
If I do
% gio info x.maff (filtered)
standard::content-type: application/x-maff
standard::fast-content-type: application/x-maff
and if I do
% gio mime application/x-maff
Registered applications:
maffapplication.desktop
Recommended applications:
maffapplication.desktop
everything seems to be right... but xdg-open x.maff does not work; it still calls nautilus. Worse yet, if I do
% xdg-mime query filetype x.maff
application/zip
I'm sure I'm missing something: somehow I need to override the association that comes from the .maff file starting with the same magic as a zip file, but so far to no avail. I tried all kinds of modifications to the XML file, with and without the magic element; nothing works.
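For reference, these are the checks that can be run from the command line, plus the explicit override for the default application; the last line assumes the maffapplication.desktop file above and only sets the default handler, it does not change how the file type itself is detected:
# check how the file type resolves and what is registered for the custom type
xdg-mime query filetype x.maff
xdg-mime query default application/x-maff
# point the custom type at the handler's .desktop file
xdg-mime default maffapplication.desktop application/x-maff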
By the way, if I do
% maffHandler x.maff
it works perfectly and opens the MAFF file in Firefox. I'm willing to share the C++ code for it if anyone is interested.
It seems that TDE (Trinity Desktop) does not properly set two important environment variables:
setenv XDG_CURRENT_DESKTOP KDE
setenv KDE_SESSION_VERSION 5
Once they are set in .login (unfortunately I had to log out and log in again), the xdg-* scripts started working properly and recognizing the MIME types. The other problem is that TDE requires you to manually add the association in Control Center -> TDE Components -> File Associations.
With the environment variables properly set for my environment and the File Association added, it all works perfectly. Thanks.
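For anyone on a Bourne-style shell rather than csh, the equivalent of the two setenv lines above would be (for example in ~/.profile):
# sh/bash equivalent of the csh setenv lines above
export XDG_CURRENT_DESKTOP=KDE
export KDE_SESSION_VERSION=5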

Cruisecontrol.net with UCM Clearcase - How to?

I am trying to configure Cruisecontrol.net for UCM Clearcase for the first time. Following is the sourceControl tag in the ccnet.config file:
<sourcecontrol type="clearCase">
  <branch>123_India_Release</branch>
  <autoGetSource>true</autoGetSource>
  <viewName>admin_123_CRUISE</viewName>
  <viewPath>$(ViewDirectory)</viewPath>
  <useLabel>false</useLabel>
  <useBaseline>false</useBaseline>
  <executable>cleartool.exe</executable>
</sourcecontrol>
I constantly receive the following error:
ThoughtWorks.CruiseControl.Core.CruiseControlException: Source control operation failed: cleartool: Error: Not an object in a vob: "PATH TO THE VIEW"
When I run cleartool from an arbitrary directory with the following parameters:
cleartool.exe lshist -r -nco -branch "123_India_Release" -since 05-Dec-2012.14:38:18 -fmt
I get the same error. But if I change the working directory to $(ViewDirectory) before running cleartool, it runs fine.
How should I make Cruisecontrol.net run cleartool.exe from the $(ViewDirectory)?
I have already tried adding <workingDirectory>$(ViewDirectory)</workingDirectory> tag before <executable>cleartool.exe</executable> but it did not work.
Any help would be appreciated.
EDIT 1:
As a workaround I have done the following:
<exec>
  <executable>cleartool.exe</executable>
  <baseDirectory>d:\Workspace\123_India_Release\VOB</baseDirectory>
  <buildArgs>update -force</buildArgs>
  <buildTimeoutSeconds>6000</buildTimeoutSeconds>
</exec>
I have added this to the tasks tag. I have configured an hourly trigger which does the following:
1) Update snapshot view
2) Build the VS 2010 solutions mentioned in the tasks tag.
The limitations are:
1) The trigger is hourly; I want it to be a commit-based trigger.
2) This is a workaround
EDIT 2:
Further experimentation revealed that ccnet.exe works fine; it does all that is needed. The issue is caused by the ccservice service.
I have stopped ccservice for now and started ccnet.exe. I plan to leave it running.
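For reference, switching from the service to the console server by hand looks roughly like this; the service name and install path below are the usual defaults and are assumptions, so adjust them to your installation:
rem stop the CruiseControl.NET Windows service (default service name assumed)
net stop CCService
rem run the console server instead (default install path assumed)
"C:\Program Files (x86)\CruiseControl.NET\server\ccnet.exe"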
The view directory isn't enough: you must specify a VOB.
See for instance:
"clearfsimport: Error: Not an object in a vob: "\"." (as an illustratio of that error message)
this thread (or this one): "You have to specify explicitly the VOB(s) to check for modification set"
The path should looks like:
<viewPath>Drive:\path\to\view\vobname</viewPath>
If your $(ViewDirectory) already references Drive:\path\to\view, then you could use:
<viewPath>$(ViewDirectory)\vobname</viewPath>
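As a quick sanity check outside of CruiseControl.NET, the lshist command from the question should succeed once it is run from a directory that is actually inside the VOB; the path below reuses the snapshot view path from EDIT 1 and is only an example:
rem change into the VOB root inside the snapshot view (example path from EDIT 1)
cd /d d:\Workspace\123_India_Release\VOB
cleartool.exe lshist -r -nco -branch "123_India_Release" -since 05-Dec-2012.14:38:18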

How to remove multiple virtual directories?

I need to remove a big amount of virtual directories, some of them don't have associated physical directories.
Ideas?
As you need to remove a large amount, I'm guessing you'll want to use some form of script.
IIS 6.0, using iisvdir.vbs (article on MSDN):
At the command prompt, use the cd command to change to the directory where the iisvdir.vbs script is installed. The default location for this file is systemroot\system32\iisvdir.vbs.
At the command prompt, type:
cscript iisvdir.vbs /delete "Sample Web Site" VirtualDirectoryName
Substitute your Web site name and virtual directory name as appropriate. If there are spaces in the Web site name, use quotation marks around the Web site name, as shown in the preceding example.
IIS 7, using AppCmd.exe (article on TechNet):
To remove a virtual directory, use the following syntax:
appcmd delete vdir /vdir.name:string
The variable string is the virtual path of the virtual directory.
For example, to remove a virtual directory named photos from the root application of a site named contoso, type the following at the command prompt, and then press ENTER:
appcmd delete vdir /vdir.name:"contoso/photos"
To remove a virtual directory named photos from an application named marketing in a site named contoso, type the following at the command prompt, and then press ENTER:
appcmd delete vdir /vdir.name:"contoso/marketing/photos"
HTH
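Since the question is about removing a large number of virtual directories, the IIS 7 variant can be wrapped in a small batch loop, roughly like this; vdirs.txt is a hypothetical file with one "site/path" entry per line (for example contoso/photos):
rem delete every virtual directory listed in vdirs.txt (one "site/path" per line)
for /f "usebackq delims=" %%v in ("vdirs.txt") do (
    %windir%\system32\inetsrv\appcmd.exe delete vdir /vdir.name:"%%v"
)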
You could also write an MSBuild script to do this and use the MSBuild Extension Pack, which is available here. I have used this successfully to do exactly what you are describing for hundreds of vdirs in IIS 6 and in IIS 7.5.
It's pretty simple; it took me longer to write the .proj file than it did to figure out how to do it.
Have fun :)
The resulting MSBuild target would look as follows:
<Target Name="IIS7VirtualDirectories:Delete">
  <MSBuild.ExtensionPack.Web.Iis7Application
    TaskAction="Delete"
    Website="%(Application.WebsiteName)"
    Applications="@(Application)"
    MachineName="$(MachineName)"
    ContinueOnError="false"/>
  <MSBuild.ExtensionPack.Web.Iis7Website
    TaskAction="DeleteVirtualDirectory"
    Name="%(VirtualDirectory.WebsiteName)"
    VirtualDirectories="@(VirtualDirectory)"
    ContinueOnError="false"
    MachineName="$(MachineName)"/>
</Target>
Where Application and VirtualDirectory are defined in an external proj file :)

CruiseControl.NET: xcopy not working for copying files to remote server

I am trying to copy a folder to a remote machine using xcopy. This command is executed via a CruiseControl task. I can run xcopy source destOnRemoteMachine successfully if I run it from the command prompt, but if I execute it from CruiseControl.NET, it always complains with an "Invalid drive specification" error.
I tried :
<exec executable="c:\Windows\System32\xcopy.exe">
  <baseDirectory>$(BASE)\Project</baseDirectory>
  <buildArgs>.\*.* RemoteMachine\Project /Y</buildArgs>
  <buildTimeoutSeconds>10</buildTimeoutSeconds>
  <successExitCodes>-1,0,1</successExitCodes>
</exec>
and :
<exec executable="c:\Windows\System32\cmd.exe">
  <baseDirectory>$(BASE)\Project</baseDirectory>
  <buildArgs>/C xcopy $(BASE)\Project\*.* RemoteMachine /y</buildArgs>
  <buildTimeoutSeconds>30</buildTimeoutSeconds>
  <successExitCodes>-1,0,1</successExitCodes>
</exec>
Any Suggestions?
Thanks for your answer. Putting xcopy in a batch file was also not helpful; the problem was something else. Here is the problem:
It was a problem with the account the CruiseControl.NET service was running under. It was running as Local Service and thus was not able to see the network path. I changed the CCNet service to run under a domain account which has permission to read/write on the remote machine. This solved my problem.
Thanks.
Create a batch file with the xcopy source destOnRemoteMachine command and execute that .bat file using an <exec> task in CruiseControl.NET.
In my case I was running CC as a user account, after having used NET USE to persistently map a drive letter to a network share on another domain. Although it worked in a desktop session, CC could still not get to the share as part of the build. Executing NET USE as part of the build (without a drive mapping) allowed the copy to succeed.
<exec program="net" verbose="true">
  <arg value="use" />
  <arg value="\\server\share" />
  <arg value="password" />
  <arg value="/user:domain\username" />
</exec>
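For reference, the equivalent manual test from a console running under the build account would be roughly the following; the server, share, credentials, and local path are placeholders:
rem authenticate against the remote share without mapping a drive letter (placeholders)
net use \\server\share password /user:domain\username
rem then copy the build output across (/Y as in the config above)
xcopy d:\Workspace\Project\*.* \\server\share\Project /Y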
