No log folder is created under each task directory in Cuckoo Sandbox

I am running Cuckoo 0.6 and everything seems fine: the virtual machine is started by Cuckoo and the analysis runs. But after each analysis, only tcpdump.pcap and the reports folder are created in the /storage/analyses/taskID/ directory, and I cannot see any behavioural analysis results in the HTML report generated in the reports folder.
It seems no dynamic analysis has been done and only static analysis results are shown! Everything is enabled in the configuration files.
I receive this error, which shows there is no logs folder:
[modules.processing.behavior] ERROR: Analysis results folder does not exist at path "/home/sam/cuckoo/cuckoo/storage/analyses/7/logs"

If you see only static analysis results, then perhaps something went wrong with the execution of the binary. My best guess would be that the EXE (if it is one) got corrupted.

Actually, the main reason was an inappropriate network setting between the host and the VirtualBox analyser: the VM could not return the results to the host!
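A quick way to confirm this failure mode is to check, from inside the analysis VM, whether the guest can reach the host at all. A minimal sketch, assuming a typical VirtualBox host-only setup; the address and port below are placeholders, so substitute the values from your own cuckoo.conf:

    # Run from inside the analysis VM. HOST_IP and PORT are assumptions
    # for a typical host-only setup, not values from the original report.
    import socket

    HOST_IP = "192.168.56.1"  # hypothetical host-only address of the Cuckoo host
    PORT = 2042               # hypothetical result server port

    try:
        with socket.create_connection((HOST_IP, PORT), timeout=5):
            print("Guest can reach the host, so results can be delivered.")
    except OSError as exc:
        print(f"Cannot reach {HOST_IP}:{PORT}: {exc}")
        print("Check the VirtualBox host-only network and firewall rules.")

If the connection fails, revisit the network adapter assigned to the VM and any host firewall rules before re-running the analysis.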

Related

ADF Copy Activity FTP Source strange behavior

I have created an ADF pipeline to copy around 18 files from an FTP location into an Azure Blob container. First, I use a Get Metadata activity to list all the files in the FTP location. Then a ForEach activity loops through the files. Inside the ForEach activity, a Copy Data activity copies each file from the FTP location to the Blob location.
While running the pipeline, some of the files are copied; however, some of them fail with the error message below:
"ErrorCode=UserErrorFileNotFound,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The remote server returned an error: (550) File unavailable (e.g., file not found, no access).,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Net.WebException,Message=The remote server returned an error: (550) File unavailable (e.g., file not found, no access).,Source=System,'"
I am not sure what is wrong here, because the other files are copied successfully while a few are not. I have tried multiple times, and there is still no guarantee that all the files will be copied.
When I test the connection of the FTP linked service, it connects successfully. The FTP linked service is SSL-enabled and configured to get its password from Azure Key Vault.
Any thoughts as to what is going wrong here? Is there any limit on the number of files being copied at one time?
Thank you in advance.
As @Joel Cochran said, the issue may be a concurrency limit.
When Sequential is checked, the ForEach runs its Copy activity single-threaded, one file at a time. Uncheck it and the iterations run in parallel, which greatly improves efficiency.
So our solution is (see the sketch after this list):
Uncheck Sequential
Increase the Batch count, i.e. the maximum number of parallel runs of the inner activities.
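If you prefer editing the pipeline JSON to using the designer checkbox, the two settings map to the ForEach activity's isSequential and batchCount properties. A trimmed sketch follows; the activity names and the batch count of 20 are illustrative, not taken from the original pipeline (batchCount is capped at 50):

    {
        "name": "CopyEachFile",
        "type": "ForEach",
        "typeProperties": {
            "isSequential": false,
            "batchCount": 20,
            "items": {
                "value": "@activity('GetFileList').output.childItems",
                "type": "Expression"
            },
            "activities": [
                { "name": "CopyFtpToBlob", "type": "Copy" }
            ]
        }
    }

Keep in mind that a higher batchCount also multiplies the concurrent connections to the FTP server, so if the 550 errors come from the server throttling simultaneous logins, a moderate value may be more reliable than the maximum.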

NT Authority/System can't see protected OS files

The Question:
Why can't the LocalSystem account (NT Authority\System) see files in the Recycle Bins or the Temporary Internet Files directory?
Background:
I created a scheduled task that runs under the System account. The purpose of the task is to execute the Disk Cleanup utility with predefined settings (for example: cleanmgr.exe /sagerun:1). When it executes, it seems to run with no errors. But when I check the resources it is supposed to clean (Temporary Internet Files, Recycle Bin, etc.), they are still there.
So I thought cleaning up the two resources manually might work. I developed a C# console application that clears the Recycle Bin and the Temporary Internet Files. I tested it and it works just fine. But again, when I run it as a scheduled task under the System account, I hit the same issue.
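For context, clearing the Recycle Bin programmatically usually boils down to a single Win32 call, SHEmptyRecycleBin, which a C# implementation would typically P/Invoke. A minimal Python sketch of that call, as an assumption about what such an app does rather than the original code:

    # Windows-only illustration of emptying the Recycle Bin via the same
    # Win32 API (SHEmptyRecycleBinW) a C# app would typically P/Invoke.
    # This is a guess at what the console app does, not the original code.
    import ctypes

    # Flag values from shellapi.h
    SHERB_NOCONFIRMATION = 0x00000001
    SHERB_NOPROGRESSUI = 0x00000002
    SHERB_NOSOUND = 0x00000004

    result = ctypes.windll.shell32.SHEmptyRecycleBinW(
        None,  # no owner window
        None,  # NULL root path = empty the Recycle Bin on all drives
        SHERB_NOCONFIRMATION | SHERB_NOPROGRESSUI | SHERB_NOSOUND,
    )
    # SHEmptyRecycleBinW returns an HRESULT; 0 (S_OK) means success.
    print("HRESULT: 0x%08X" % (result & 0xFFFFFFFF))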
Following the log, it looks like the application, when run under the System account, sees no files in the Recycle Bin or the Temporary Internet Files directory.
Checking the Security tab of the Temporary Internet Files directory shows that System has full access to that directory.
I'm puzzled by this issue. I may be missing something, but I assumed the LocalSystem account has the highest privileges on a machine. Is that not the case?

Writing a flat file in SSIS succeeds if run from Visual Studio, fails if run from the Integration Services Catalog

I have an SSIS package that does a Bulk Insert, executes a SQL Task, and finally writes some database data to a flat file on our network. The package runs fine if I run it from Visual Studio 2012. However, if I deploy the package to the Integration Services Catalog on a 2012 SQL Server and run it from there, the Bulk Insert and SQL Task run fine, but when the package tries to write the flat file, I get these error messages:
Cannot open the datafile "\\nyfil006\Projects\Accounting\CostRecovery\Cafe de Novo\HospitalityCharges.csv".
HospitalityCharges Flat File failed the pre-execute phase and returned error code 0xC020200E.
I'm able to output System::UserName to an error log, and it's what I think it should be: an account that has full permissions to the folder of the flat file destination (and its parent folders). I've tried creating a blank version of HospitalityCharges.csv, and DelayValidation is set to True for the Data Flow Task that outputs the flat file. My system admin has granted Network Service permissions to the folder as per this link and this link, but that doesn't help. I've also made the connection string an expression as described here. We've also created a mapped drive and used that for the Destination Connection String instead of a UNC path. No joy. Does anyone know why this is happening?
Another note: if I change the flat file destination to point to the C: drive, the package runs fine, whether I run it from Visual Studio or from the Integration Services Catalog.

Source code location for debugging multiple instances of an application

I have an application running separately (one instance per customer) in a different folder for each customer.
Each customer is a separate user on my machine.
At the moment I have the source code in each of these folders, and I rebuild the code for each instance. Would it be better to do something like the following?
create a shared folder where I build the code
deploy the binary in each user folder.
allow permission for each user to access the source code in READ ONLY mode.
when it is time to debug, running gdb in each user folder will pick up the source code and debugging will work.
Do you think this would be a better approach, or is there a better practice?
My only concern is that each user would be able to read the source code, but since the users do not access their folders directly (they are under my control), this should not trouble me.
I am using CentOS 6.4, SVN, and G++/GDB.
in different folders
There are no "folders" on UNIX; they are called directories.
I rebuild the code per each instance
Why would you do that?
Is the code identical (it sounds like it is)? If so, build the application once. There is no reason at all to have multiple copies of the resulting binary, or the sources.
If you make the directory with sources and binaries world-readable, then every user will be able to debug it independently.

Exporting IIS configuration

Windows 2003/IIS 6...
I have a virtual directory on a web site that closely mirrors the configuration another virtual directory on the same site will need. Since we have multiple dev/staging/test/prod environments, I'd like to be able to export the values of one virtual directory and quickly fire one up on either the same machine (with a different name/source directory) or on another machine (with perhaps the same name/source directory).
Can that be done? I see you can export the configuration through the IIS manager, but it seems to have a lot of keys embedded in it, and I'm not sure whether it can be directly imported into a separate entity on the same or a different machine, or whether it's only meant for backups in case the original gets corrupted and needs to be restored.
You may want to take a look at the sample VBScript files installed with IIS 6. On my system they are in C:\Windows\System32.
Two in particular seem relevant to your question:
iisvdir.vbs - allows listing, creating, and deleting virtual directories locally or remotely.
iiscnfg.vbs - allows exporting configuration for copying to another machine.
Neither one of these does exactly what you want, but it looks to me like they could be used as sample code to help you get where you want to go.
Have you taken a look at the IIS6 Migration Tool yet? It may address your needs.
