Management API stops reporting data after almost 4 hours of execution - Azure

I was working to generate DLP logs with the help of the script present here
$body = @{grant_type="client_credentials"; resource=$APIResource; client_id=$AppClientID; client_secret=$ClientSecretValue}
Write-Host -ForegroundColor Blue -BackgroundColor white "Obtaining authentication token..." -NoNewline
try{
$oauth = Invoke-RestMethod -Method Post -Uri "$loginURL/$tenantdomain/oauth2/token?api-version=1.0" -Body $body -ErrorAction Stop
$OfficeToken = @{'Authorization'="$($oauth.token_type) $($oauth.access_token)"}
Write-Host -ForegroundColor Green "Authentication token obtained"
} catch {
write-host -ForegroundColor Red "FAILED"
write-host -ForegroundColor Red "Invoke-RestMethod failed."
Write-host -ForegroundColor Red $error[0]
exit
}
This part of the code helps to generate the access token.
The script seems to work fine until about 700 MB of data has been exported, which takes around 3-4 hours for me.
Now, the issue is, I believe the only reason this is happening is that the access token is expiring. And since I am new to PowerShell, I haven't been able to write the part of the code that would generate a new token every 3-4 hours.
Any documentation that might help me would be of great help since I have not been able to find anything of substance on the internet regarding my request.
If it's not about the access token, what else could it be about?
Please help!
thanks in advance!
$StopWatch = [System.Diagnostics.Stopwatch]::StartNew()
if ($StopWatch.Elapsed.TotalSeconds -ge 3599) {
    $OfficeToken = Get-Token
    $StopWatch.Restart()
}
I tried using this particular piece of code to generate a new token every hour, but that doesn't seem to help either. I was expecting it to work flawlessly after this, but alas, that isn't the case.
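For what it's worth, a minimal sketch of how that expiry check could run inside the retrieval loop instead of just once; `Get-Token` and `$contentBlobs` here are hypothetical stand-ins for the script's own token helper and its paged API results:

```powershell
# Hedged sketch: refresh the token inside the loop that pulls content,
# not once before it. Get-Token and $contentBlobs are assumed names.
$StopWatch = [System.Diagnostics.Stopwatch]::StartNew()
foreach ($blob in $contentBlobs) {
    # Refresh ~5 minutes before the typical 60-minute expiry
    if ($StopWatch.Elapsed.TotalSeconds -ge 3300) {
        $OfficeToken = Get-Token
        $StopWatch.Restart()
    }
    Invoke-RestMethod -Method Get -Uri $blob.contentUri -Headers $OfficeToken
}
```

The point is that a single check before the loop never fires again; the check has to sit on the path that runs every iteration.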

The script in question does not produce data for large sources, as confirmed by the author of the script.
I ran the script for other days and it worked fine.

Related

Am I safe (XSS injection, etc) with my website contact form with Google Captcha?

I have suffered injection in my website (from a search box in a KB system). I removed that KB system but have a Contact Form (with Google Captcha) where the user enters his name, email and message and I use PHP mail() to send me the message.
Is it possible that an attacker can get access to my website from a possible attack to that form? Or the worst scenario could just be that he uses it to send Spam?
This is my PHP code before calling "main()":
<?php
$fname = $_POST['contact-f-name'];
$lname = $_POST['contact-l-name'];
$email = $_POST['contact-email'];
$text = $_POST['contact-message'];
$companyname = $_POST['company-name'];
$subject = $_POST['subject'];
$address = "myemail@myemail.com";
$headers = "From: " . strip_tags($email) . "\r\n";
$headers .= "Reply-To: ". strip_tags($email) . "\r\n";
$headers .= "MIME-Version: 1.0\r\n";
$headers .= "Content-type:text/plain; Charset=UTF-8 \r\n";
$message = "Name: ".strip_tags($fname)." ".strip_tags($lname)."\r\n"
."Email: ".strip_tags($email)."\r\n"
."Company Name: ".strip_tags($companyname)."\r\n"
."Subject: ".strip_tags($subject)."\r\n"
."Message: ".strip_tags($text)."\r\n";
if(@mail($address, $subject, $message, $headers)) { echo "true"; }
else { echo "false"; }
exit;
?>
TL;DR: Short answer: Maybe.
While I do not have the time right now to write a complete and exacting answer to this post, I will point you to some best practices, and to plenty of more verbose answers to similar questions about making user-submitted data safe.
How to make the inputs safer?
Disable certain dangerous PHP functions. Read the second answer rather than the "ticked" answer.
Use PHP's filter_var() to coerce inputs to their correct types, especially for emails:
$email = filter_var($_POST['contact-email'], FILTER_SANITIZE_EMAIL);
Use preg_replace() (or str_replace()) to remove unwanted characters from your values - most typically backticks, quotes of any kind, forward slashes, or backslashes. Example.
I recommend replacing mail() in your code with PHPMailer.
strip_tags() is ok, but just ok. It has flaws (such as its handling of unclosed tags), so be aware of that.
Your PHP should be suitably jailed so that even if someone can run exec(...) commands (Ohsh1tOhsh1tOhsh1t) you have not (literally) lost your server.
What else can I read?
This huge topic on how to deal with forms on PHP
This question about how to "sanitise" user input.
OWASP PHP filters for cleaning inputs.
Disable dangerous functions
PHP filter_var sanitisation filter list
Securing user variables (database related mostly)
Further wise words on data sanitisation.

PHPMailer - Does body have to come from a file now?

I've been using PHPMailer successfully for a couple of years. I just refreshed my PHPMailer class from their github site, and now my server throws 500 errors. I tracked down the problem to this line (simplified for this post):
$mail->Body = "<p>Hello World</p>";
All of the example that I see on the worxware website these days show the body of the email being read from a file like this:
$body = file_get_contents('contents.html');
$body = eregi_replace("[\]",'',$body);
$mail->MsgHTML($body);
I also tried modifying my code to use the MsgHTML syntax, but I still have the same result:
$body = "<p>Hello World</p>";
$mail->MsgHTML($body);
I can't imagine that it matters whether this body gets populated from a file or from a local variable, but nothing that I try works. What am I missing? Thanks!
Try this:
$output = str_replace(array("\n","\r"), "", $output);

Powershell: Recursive Functions and Multiple Parameters in Start-Job

I have been banging my head against a wall for a couple of days now trying to get Start-Job and background jobs working to no avail.
I have tried ScriptBlock and ScriptFile but neither seem to do what I want, or I can't seem to get the syntax right.
I have a number of recursive functions and need to split the script up to work in parallel across many chunks of a larger data set.
No matter how I arrange the Start-Job call, nothing seems to work, and the recursive functions seem to be making everything twice as hard.
Can anyone give me a working example of Start-Job calling a recursive function and having multiple parameters, or point me somewhere where one exists?
Any help appreciated
This works for me:
$sb = {
    param($path, $currentDepth, $maxDepth)
    function EnumFiles($dir, $currentDepth, $maxDepth) {
        if ($currentDepth -gt $maxDepth) { return }
        Get-ChildItem $dir -File
        Get-ChildItem $dir -Directory | ForEach-Object {
            EnumFiles $_.FullName ($currentDepth + 1) $maxDepth
        }
    }
    EnumFiles $path $currentDepth $maxDepth
}
$job = Start-Job -ScriptBlock $sb -ArgumentList $pwd,0,2
Wait-Job $job | Out-Null
Receive-Job $job
Keep in mind your functions have to be defined in the scriptblock because the script runs in a completely separate PowerShell process. Same goes for any externally defined variables - they have to be passed into Start-Job via the -ArgumentList parameter. The values will be serialized, passed to the PowerShell process executing the job, where they will then be provided to the scriptblock.

Is there a way to edit the HTML of multiple SharePoint 2013 pages at once?

You guys gotta save me.
I'm supposed to go into 70+ SharePoint 2013 pages and manually replace each of the 3+ links in the HTML of each page with the correct copied version. As in, all the links currently point to /place1/place2/link and now they need to be /place3/place4/link.
So, there has got to be a way to mass edit all of the pages, like a find and replace, because otherwise I'm just going to go sit in a corner and cry for hours. I can't edit the folder structure because I'm not the project leader.
I will do whatever it takes to keep my sanity.
You can use PowerShell.
From this question:
function ProcessWeb($currentWeb)
{
if([Microsoft.SharePoint.Publishing.PublishingWeb]::IsPublishingWeb($currentWeb))
{
$publishingWeb = [Microsoft.SharePoint.Publishing.PublishingWeb]::GetPublishingWeb($currentWeb)
$publishingPages = $publishingWeb.GetPublishingPages()
foreach ($publishingPage in $publishingPages)
{
if($publishingPage.ListItem.File.CheckOutStatus -eq "None")
{
UpdatePage -page $publishingPage
}
}
Write-Host -ForegroundColor red "FINISHED"
}
else
{
Write-Host -Foregroundcolor Red "^ not a publishing site"
}
}
function UpdatePage($page)
{
$page.CheckOut();
Write-Host -Foregroundcolor green $page.Url;
$NewPageContent = $page["PublishingPageContent"].Replace("/place1/place2/link","/place3/place4/link");
$page["PublishingPageContent"] = $NewPageContent
$page.ListItem.Update();
$page.CheckIn("nothing");
$page.ListItem.File.Approve("Updated PublishingPageContent to replace place1/place2 with place3/place4");
}
ProcessWeb(Get-SPWeb -identity http://myintranet.com)
Note that you will need to work out how a good replace statement will work.
Also this automates changes that could go wrong, so make sure you do it first on a dev/uat environment before backing up the content database on production and then finally giving it a go.
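Along the same lines, here is a hedged read-only sketch (assuming the same publishing API as above) that only reports which pages still contain the old prefix, without checking anything out - useful for previewing the blast radius before running the real update:

```powershell
# Dry run: list pages whose content still contains the old link prefix.
$web = Get-SPWeb -Identity http://myintranet.com
$publishingWeb = [Microsoft.SharePoint.Publishing.PublishingWeb]::GetPublishingWeb($web)
foreach ($page in $publishingWeb.GetPublishingPages())
{
    if ($page["PublishingPageContent"] -like "*/place1/place2/*")
    {
        Write-Host "Would update:" $page.Url
    }
}
$web.Dispose()
```

Because it never calls CheckOut or Update, this is safe to run repeatedly while you refine the replace statement.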

How does threading in powershell work?

I want to parallelize some file-parsing actions with network activity in PowerShell. A quick Google search turned up
start-thread, which looked like a solution, but:
The term 'start-thread' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
The same thing happened when I tried start-job.
I also tried fiddling around with System.Threading.Thread
[System.Reflection.Assembly]::LoadWithPartialName("System.Threading")
# This next line errors; something about the arguments I can't figure out from the .NET documentation
$tstart = new-object System.Threading.ThreadStart({DoSomething})
$thread = new-object System.Threading.Thread($tstart)
$thread.Start()
So, I think the best thing would be to find out what I'm doing wrong when I use start-thread, because it seems to work for other people. I use v2.0 and I don't need backward compatibility.
PowerShell does not have a built-in command named Start-Thread.
V2.0 does, however, have PowerShell jobs, which can run in the background, and can be considered the equivalent of a thread. You have the following commands at your disposal for working with jobs:
Name Category Synopsis
---- -------- --------
Start-Job Cmdlet Starts a Windows PowerShell background job.
Get-Job Cmdlet Gets Windows PowerShell background jobs that are running in the current ...
Receive-Job Cmdlet Gets the results of the Windows PowerShell background jobs in the curren...
Stop-Job Cmdlet Stops a Windows PowerShell background job.
Wait-Job Cmdlet Suppresses the command prompt until one or all of the Windows PowerShell...
Remove-Job Cmdlet Deletes a Windows PowerShell background job.
Here is an example on how to work with it. To start a job, use start-job and pass a script block which contains the code you want run asynchronously:
$job = start-job { get-childitem . -recurse }
This command will start a job, that gets all children under the current directory recursively, and you will be returned to the command line immediately.
You can examine the $job variable to see if the job has finished, etc. If you want to wait for a job to finish, use:
wait-job $job
Finally, to receive the results from a job, use:
receive-job $job
You can't use threads directly like this, but you can't be blamed for trying since once the whole BCL is lying in front of you it's not entirely silly to expect most of it to work :)
PowerShell runs scriptblocks in pipelines which in turn require runspaces to execute them. I blogged about how to roll your own MT scripts some time ago for v2 ctp3, but the technique (and API) is still the same. The main tools are the [runspacefactory] and [powershell] types. Take a look here:
http://www.nivot.org/2009/01/22/CTP3TheRunspaceFactoryAndPowerShellAccelerators.aspx
The above is the most lightweight way to approach MT scripting. There is background job support in v2 by way of start-job, get-job but I figured you already spotted that and saw that they are fairly heavyweight.
The thing that comes closest to threads and is way more performant than jobs is PowerShell runspaces.
Here is a very basic example:
# the number of threads
$count = 10
# the pool will manage the parallel execution
$pool = [RunspaceFactory]::CreateRunspacePool(1, $count)
$pool.Open()
try {
# create and run the jobs to be run in parallel
$jobs = New-Object object[] $count
for ($i = 0; $i -lt $count; $i++) {
$ps = [PowerShell]::Create()
$ps.RunspacePool = $pool
# add the script block to run
[void]$ps.AddScript({
param($Index)
Write-Output "Index: $index"
})
# optional: add parameters
[void]$ps.AddParameter("Index", $i)
# start async execution
$jobs[$i] = [PSCustomObject]@{
PowerShell = $ps
AsyncResult = $ps.BeginInvoke()
}
}
foreach ($job in $jobs) {
try {
# wait for completion
[void]$job.AsyncResult.AsyncWaitHandle.WaitOne()
# get results
$job.PowerShell.EndInvoke($job.AsyncResult)
}
finally {
$job.PowerShell.Dispose()
}
}
}
finally {
$pool.Dispose()
}
It also allows you to do more advanced things like
Throttle the number of parallel runspaces on the pool
Import functions and variables from the current session
etc.
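As a hedged sketch of the variable-import point: an InitialSessionState can seed the pool with values from the current session (the variable name below is made up for illustration):

```powershell
# Seed the runspace pool with a variable from the parent session.
$iss = [System.Management.Automation.Runspaces.InitialSessionState]::CreateDefault()
$entry = New-Object -TypeName System.Management.Automation.Runspaces.SessionStateVariableEntry `
    -ArgumentList 'Greeting', 'Hello from the parent session', $null
$iss.Variables.Add($entry)
$pool = [RunspaceFactory]::CreateRunspacePool(1, 4, $iss, $Host)
$pool.Open()
try {
    $ps = [PowerShell]::Create()
    $ps.RunspacePool = $pool
    # The runspace can read $Greeting without it being passed as a parameter
    [void]$ps.AddScript({ "Runspace sees: $Greeting" })
    $ps.Invoke()
    $ps.Dispose()
}
finally {
    $pool.Dispose()
}
```

Functions can be imported the same way via SessionStateFunctionEntry added to $iss.Commands.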
The answer is now quite simple: use the ThreadJob module, per the Microsoft docs.
Install-Module -Name ThreadJob -Confirm:$true
$Job1 = Start-ThreadJob `
    -FilePath $YourThreadJob `
    -ArgumentList @("A", "B")
$Job1 | Get-Job
$Job1 | Receive-Job
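A variant sketch using -ScriptBlock and -ThrottleLimit (both parameters of Start-ThreadJob), in case the work doesn't live in a separate file:

```powershell
# Queue several thread jobs, with at most 4 running concurrently.
$jobs = 1..8 | ForEach-Object {
    Start-ThreadJob -ScriptBlock { param($n) $n * $n } -ArgumentList $_ -ThrottleLimit 4
}
$jobs | Wait-Job | Receive-Job
$jobs | Remove-Job
```

Because these are threads in the same process rather than separate PowerShell processes, startup cost is far lower than with Start-Job.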