uCommerce Secure Ticket Transaction with Logging - security

I'm working on a webshop in which it should be possible to subscribe to a service for a monthly $20 fee.
I've made a script in a C# user control which will be called monthly by the Umbraco scheduled task command:
<task log="true" alias="test60" interval="60" url="http://mysite/umbraco/subscriptionPayment.aspx"/>
For that I have two questions.
1) Where should I put the user control in order to make it inaccessible to the public but accessible to the Umbraco task command? It is very important that the script can only be accessed by local server calls.
2) I would like the script to log to a file each time a transaction is made. I'm using the following code:
File.AppendAllText("paymentlog.txt",
    "Transaction " + transactionNumber.ToString() + " successfully executed at " + DateTime.Now.ToString() + Environment.NewLine);
I just don't know which path I should give to the paymentlog.txt file, since handling real paths in Umbraco seems kind of obscure to me. I would like the paymentlog.txt file to be placed in the root Umbraco folder. How do I do that?
Thanks in advance (I'm running Umbraco 4.8 and uCommerce 2.6.1).
Best regards,
Brinck10

The task scheduler in Umbraco isn't very reliable, and this sounds crucial to your business, so I would probably use the built-in scheduler in Windows on your server to do the processing.
That being said, to make the task scheduler approach a little more secure, you could make the URL you are calling accessible only on localhost/127.0.0.1 and use the localhost address in your task configuration.
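One way to enforce that inside the page itself is a minimal sketch like this, assuming the payment logic lives in the subscriptionPayment.aspx code-behind (the status code and comments are illustrative):
protected void Page_Load(object sender, EventArgs e)
{
    // only accept requests that originate from the server itself
    string caller = Request.UserHostAddress;
    if (caller != "127.0.0.1" && caller != "::1")
    {
        Response.StatusCode = 403; // reject external callers
        Response.End();
        return;
    }

    // run the monthly subscription charge and logging here
}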

You can make a call to the authentication service to verify that the caller is logged in. The following code will achieve that:
using UCommerce.Infrastructure;
using UCommerce.Security;

// resolve uCommerce's authentication service and check the current caller
var authService = ObjectFactory.Instance.Resolve<IAuthenticationService>();
if (authService.IsAuthenticated())
{
    // do secure code here
}
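For the log file path in question 2, a minimal sketch, assuming the code runs inside a page or user control in the Umbraco site so Server.MapPath can resolve the application root (the file name is the one from the question):
// "~" resolves to the root folder of the Umbraco website on disk
string logPath = Server.MapPath("~/paymentlog.txt");
File.AppendAllText(logPath,
    "Transaction " + transactionNumber.ToString() + " successfully executed at "
    + DateTime.Now.ToString() + Environment.NewLine);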

Related

UseLegacyUI Setting Not Working

I have run a SQL script to update UserPreferences.UseLegacyUI to 0 for all of the users in our 2018 R1 system, but the users are still getting the classic UI when they log into the system. Any thoughts on why this might be? We're trying to avoid every user having to change the setting in their profile.
EDIT: Users are having to switch to the modern UI every time they log in. Is there a cookie involved?
Try using the UserPreference graph to change that value, that's how it's used in 'Main.aspx.cs'. Sometimes there's code in event handlers that needs to be executed too:
// create the personal settings maintenance graph
PX.SM.SMAccessPersonalMaint prefGraph = PX.Data.PXGraph.CreateInstance<PX.SM.SMAccessPersonalMaint>();
// fetch the current preference record, or insert one if none exists yet
PX.SM.UserPreferences prefs = prefGraph.UserPrefs.SelectSingle() ?? prefGraph.UserPrefs.Insert();
// switch off the classic UI flag and save through the graph so its event handlers run
prefs.UseLegacyUI = false;
prefGraph.UserPrefs.Update(prefs);
prefGraph.Persist();
There was an HTTP redirect set in IIS on both the web site and the web application that was causing this. I suspect this was a holdover from the 5.3 installation, but I'm not sure. Removing the redirect from the web application and removing "/main.aspx" from the web site redirect cured this issue.

ColdFusion 2016 - 403 forbidden error for scheduled tasks

I have two scheduled tasks running in the ColdFusion administrator. They give a 403 Forbidden error when run through the ColdFusion administrator. Here is the log I get.
"Information","DefaultQuartzScheduler_Worker-8","02/22/17","10:11:00","","Task default.example - Get detail Dev triggered."
"Information","DefaultQuartzScheduler_Worker-4","02/22/17","10:11:00","","Task default.example - Get detail Live triggered."
"Error","DefaultQuartzScheduler_Worker-8","02/22/17","10:11:00","","403 Forbidden "
"Error","DefaultQuartzScheduler_Worker-4","02/22/17","10:11:00","","403 Forbidden "
The task URL runs fine through the browser, so it seems to be a permission problem. I have checked the permissions of the ColdFusion application 'log on as' user on the CFIDE directory and the task URL directory; it has full control.
Can anyone guide me in solving this problem?
This post is a little old but I've happened across the same problem and I thought I'd share our solution here.
We're running ColdFusion 2016 on a dedicated Windows 2012R2 box. We have several client sites on our box and we're completely locked down using Peter Freitag's lockdown guide.
This was a new server migration from a CF10 server on another box. Once I set up the scheduled task exactly as we had done before, I received several "403 Forbidden" responses.
The only real way to troubleshoot this is to activate the "Save output to a file" option on the scheduled task itself and save the file to a directory your CFUser has write access to. "CFUser" of course is the Windows user your CF service runs as.
My first test of the URL was through Chrome on the server, and it worked just fine, so my URL was valid and publicly accessible.
When I fired the scheduled task, it said "The scheduled task ran successfully", but the output file showed it didn't. In our case, an outside service called Cloudflare was blocking the request. The error from Cloudflare asked us to enable cookies, which we can't do in a scheduled task request, so our hosting provider must add an exception for requests made from our server's dedicated IP.
Most of the time, these errors are generated because of file permission issues on Windows. If you're sure your CFUser has read & execute permission on the requested template, then you need to output the scheduled task result to fully understand the error.

How to determine site web root on Shared Hosting and set up a Scheduled Task?

I am still new to setting up scheduled tasks. My problem is where I should put the PHP script I will make.
When creating a Scheduled Task I need to fill in this:
Specify the full path to the script. Example: /tmp/script.php
How can I get the full path? I already created a web user in my domain.
For example, in my domain I will put my script inside my sample_website folder. So will my full path be like this?
/usr/bin/php -q /home/my_domain.ph/public_html/sample_website/cron_script.php
Please help me, guys. I am still new at doing this. Thanks.
Can you provide a step-by-step process for this?
To determine your script's full path on the server, you can always create a PHP script with this content:
<?php
// prints the absolute path of this file on the server
echo(__FILE__);
and place it in your web root, like /httpdocs/script.php
Then you access it via the browser, e.g. http://you-domain.name/script.php
It will show your script's path on the server; it should be something like:
/var/www/vhosts/you-domain.name/httpdocs/script.php
Now you know that your files are placed at /var/www/vhosts/you-domain.name/httpdocs/ and can use that path to call scripts from scheduled tasks, as in the example below.
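For example, combining that web root with the folder and file names from the question, the scheduled task command would look something like this (the exact vhosts path depends on your hosting setup):
/usr/bin/php -q /var/www/vhosts/you-domain.name/httpdocs/sample_website/cron_script.php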

Standard way of setting up a webserver deploy using webhooks

I am working on code for a webserver.
I am trying to use webhooks to do the following tasks, after each push to the repository:
update the code on the webserver.
restart the server to make my changes take effect.
I know how to make the revision control run the webhook.
Regardless of the specifics of which revision control etc. I am using, I would like to know the standard way to create a listener for the POST call from the webhook on Linux.
I am not completely clueless - I know how to make an HTTP server in Python and I can make it run the appropriate bash commands, but that seems cumbersome. Is there a more straightforward way?
Set up a script to receive the POST request (a PHP script would be enough).
Save the request into a database and mark the request as "not yet finished".
Run a crontab job that checks the database for "not yet finished" tasks, and do whatever you want with the information you saved into the database.
This is definitely not the best solution, but it works.
You could use IronWorker, http://www.iron.io, to SSH in and perform your tasks on every commit. To kick off the IronWorker task you can use its webhook support. Here's a blog post that shows you how to use IronWorker's webhook functionality, and the post already has half of what you want (it starts a task based on a GitHub commit): http://blog.iron.io/2012/04/one-webhook-to-rule-them-all-one-url.html

Updating a Web Application in IIS - Best practices

What are the best practices for updating a web application in IIS?
The first page you see when you visit our application is a login page.
What I want to achieve is that visitors are redirected to a page stating that the application is being updated, and that users with an admin role can still log in successfully (to check whether everything is working properly).
In web.config we keep track of whether the application is being updated (updating = [true|false]), and then on the authentication event:
if (updating)
{
    if (User.IsInRole("admin"))
    {
        redirect to main web app...
    }
    else
    {
        redirect to "web is being updated" page...
    }
}
else
{
    redirect to main web app...
}
Any advice will be appreciated immensely.
I test everything locally, then I use the built-in app_offline.htm file on the production server.
When this file is present it will be served to clients; in the meantime I upload the new content. When done, I rename app_offline.htm to something different and the new app starts.
Instead of doing it via web.config, you should save the updating flag in some other XML file or a database, as each time you update your web.config file your application restarts, invalidating current application variables, caches, etc.
Apart from that, you've got the other logic right. But avoid touching web.config as far as possible - I personally only keep the connection string in web.config, as that hardly changes.
The rest of the key/value pairs that change often I keep in a SQL database table, so I never have to do an application restart unless my connection string changes :-)
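A minimal sketch of that separate-file idea (the file name, location and key are illustrative, not from the answer above); because the flag lives outside web.config, changing it does not recycle the application:
using System;
using System.IO;
using System.Web;

// read the maintenance flag from App_Data instead of web.config
bool updating = false;
string flagPath = HttpContext.Current.Server.MapPath("~/App_Data/maintenance.txt");
if (File.Exists(flagPath))
{
    // the file holds a single word: "true" while the site is being updated
    updating = File.ReadAllText(flagPath).Trim()
        .Equals("true", StringComparison.OrdinalIgnoreCase);
}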
Looks like you pretty much have it figured out.
if (WebConfigurationManager.AppSettings["updating"] == "true")
{
    if (User.IsInRole("admin"))
        Response.Redirect("~/Main.aspx");
    else
        Response.Redirect("~/Updating.aspx");
}
else
{
    Response.Redirect("~/Main.aspx");
}
