I am creating a port scanner (in Python), and whenever it receives something, the result should be written to an .xlsx file.
What it should do:
The scanner finds port 21 open
Receives data
Writes it to the .xlsx file in row 2
The scanner finds port 80 open
Receives data
Writes it to the .xlsx file in row 3
My code:
wb = load_workbook('scanreport.xlsx')
hitdetails = (str(hostname), str(host), str(port), str(keyword), str(banner))
wb = Workbook()
ws = wb.active
start_row = 1
start_column = 1
for searchresult in hitdetails:
    ws.cell(row=start_row, column=start_column).value = searchresult
    start_column += 1
start_row += 1
wb.save("scanreport.xlsx")
Result: the existing file is overwritten every time and the previous data is lost.
How can I fix this?
@skjoshi, you sir just fixed my problem with overwriting:
write into excel file without overwriting old content with openpyxl (this page)
Because I was already loading an existing file with an existing sheet, I was also creating a new workbook, which overwrote the old one, so I lost my data every time.
In this case I removed the ws = wb.active line and it worked!
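For anyone hitting the same issue, here is a minimal sketch of the append pattern with openpyxl (the file name and column headers are just placeholders): load the existing workbook, reuse its active sheet, and let ws.append write below the rows that are already there instead of creating a fresh Workbook().

```python
from openpyxl import Workbook, load_workbook

REPORT = "scanreport.xlsx"  # placeholder file name

# Create the report with a header row once, if it does not exist yet.
wb = Workbook()
wb.active.append(["hostname", "host", "port", "keyword", "banner"])
wb.save(REPORT)

def log_hit(hostname, host, port, keyword, banner):
    """Append one scan result without touching the existing rows."""
    wb = load_workbook(REPORT)
    ws = wb.active  # reuse the existing sheet; do NOT call Workbook() again
    ws.append([hostname, host, str(port), keyword, banner])
    wb.save(REPORT)

log_hit("example.local", "192.0.2.1", 21, "ftp", "220 ready")       # lands in row 2
log_hit("example.local", "192.0.2.1", 80, "http", "Server: nginx")  # lands in row 3
```

ws.append always writes to the first row after ws.max_row, so each open port ends up on its own line, exactly the row-2, row-3 behaviour described above.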
I'm not an expert in Python and this is my first post on Stack Overflow. Maybe you can help me.
I am trying to use Python instead of VBA to create documents for a specific application. This application is used to configure programmable RTUs.
It is possible to automate the creation of documents via Excel by activating, in Excel, a reference to this specific application.
This is an example that adds a specific tag to an existing timer in a document, with VBA in Excel:
Sub timer()
    Dim TWsdoc As TWinSoft.Document
    Dim MyTag As TWinSoft.Tagname
    Dim file_path As String
    file_path = ThisWorkbook.Path
    Set TWsdoc = GetObject("/RTU:MS32S2", "TWinSoft.Document")
    Set TWsdoc = GetObject(file_path & "\test.tws")
    Dim MyTimer As TWinSoft.OSCTimer
    Set MyTimer = TWsdoc.OSCTimers("Timer_1")
    'add Status Tag in Timer_1
    Set MyTimer.Status = TWsdoc.Tagnames("Timer_1_State")
    TWsdoc.Save 'save the document
    Set TWsdoc = Nothing
End Sub
This is the Python code I use to:
create a tag (Status) with some parameters
create a timer (Timer_1)
try to associate the status tag with the timer
and save the document
I use Visual Studio Code.
The problem: I cannot associate the tag Timer_1_Status with Timer_1 the way I do in VBA.
I always get this error:
An exception occurred: AttributeError Property '.Status' can not be set.
import os
import win32com.client  # needs pywin32 (pip install pywin32)

file = "\\test.tws"
file_path = os.path.dirname(os.path.abspath(__file__))
file_path = file_path + file

twdoc = win32com.client.Dispatch('Twinsoft.Document')

# ----------- Create the timer status tag -----------------
tagTimer1Sta = twdoc.AddTag('Timer_1_Status', 'DIV')
tagTimer1Sta.comment = 'Timer 1 Status'
tagTimer1Sta.ModbusAddress = 5000

# ----------- Create the timer -----------------
twdoc.AddOSCTimer("Timer_1")

# ----------- Associate the status tag with the timer -----------------
mytimer = twdoc.OSCTimers("Timer_1")
mytimer.Status = twdoc.Tagnames(tagTimer1Sta)  # this line raises the AttributeError

# ----------- Save the created tws document -----------------
twdoc.SaveAs(file_path)
Thank you in advance for your help, and I apologize for the bad English; I am French.
Yvan
I have written the following VBA code to automate SAS processes.
Rem Start the SAS server
Dim SASws As SAS.Workspace
Dim SASwsm As New SASWorkspaceManager.WorkspaceManager
Dim strError As String
Set SASws = SASwsm.Workspaces.CreateWorkspaceByServer _
("MySAS", VisibilityProcess, Nothing, "", "", strError)
Dim code_location As String, code_name As String, param_str As String
Dim param_flag As Boolean
code_location = "file:" & ThisWorkbook.Sheets("Control").Range("B2").Value
code_name = ThisWorkbook.Sheets("Control").Range("C2").Value
param_flag = ThisWorkbook.Sheets("Control").Range("C4").Value
If param_flag = True Then
    param_str = ThisWorkbook.Sheets("Control").Range("D5").Value
Else
    param_str = "ds=Sasuser.Export_output"
End If
Rem Run the stored process
Dim SASproc As SAS.StoredProcessService
Set SASproc = SASws.LanguageService.StoredProcessService
SASproc.Repository = code_location
SASproc.Execute code_name, param_str
'SASproc.Repository = "file:C:\Duopa_Repository\SAS_Codes\Weekly"
'SASproc.Execute "weekly_refresh_run.sas", "ds=Sasuser.Export_output"
Rem Shut down the SAS server
SASwsm.Workspaces.RemoveWorkspaceByUUID SASws.UniqueIdentifier
SASws.Close
Set SASws = Nothing
All the SAS programs seem to work through this, except the ones that import CSV files using the PROC IMPORT method. Can someone please explain why?
Thanks in advance.
I need to optimize the import of .xls files into MATLAB, because xlsread is very time-consuming with a large number of files. My current xlsread script is as follows:
scriptName = mfilename('fullpath');
[currentpath, filename, fileextension]= fileparts(scriptName);
xlsnames = dir(fullfile(currentpath,'*.xls'));
xlscount = length(xlsnames);
xlsimportdata = zeros(7,6,xlscount);
for k = 1:xlscount
    xlsimport = xlsread(xlsnames(k).name,'D31:I37');
    xlsimportdata(:,1:size(xlsimport,2),k) = xlsimport;
end
I have close to 10k files per week that need processing, and at approx. 2 seconds per processed file on my current workstation, that comes to about 5½ hours.
I have read that ActiveX can be used for this purpose; however, that is far beyond my current programming skills, and I have not been able to find a solution elsewhere. Any help on how to do this would be appreciated.
If it is simple to do with ActiveX (or another proposed method), I would also be interested in the data in cells D5 and G3, which I am currently grabbing from xlsnames(k,1).name and xlsnames(k,1).date.
EDIT: updated to reflect the solution
% Get path to .m script
scriptName = mfilename('fullpath');
[currentpath, filename, fileextension] = fileparts(scriptName);
% Generate list of .xls file data
xlsnames = dir(fullfile(currentpath,'*.xls'));
xlscount = length(xlsnames);
SampleInfo = cell(xlscount,2);
xlsimportdata = cell(7,6,xlscount);
% Define xls data ranges to import
SampleID = 'G3';
SampleRuntime = 'D5';
data_range = 'D31:I37';
% Initiate progression bar
h = waitbar(0,'Initiating import...');
% Start actxserver
exl = actxserver('excel.application');
exlWkbk = exl.Workbooks;
for k = 1:xlscount
    % Restart actxserver every 100 loops due to limited system memory
    if mod(k,100) == 0
        exl.Quit
        exl = actxserver('excel.application');
        exlWkbk = exl.Workbooks;
    end
    exlFile = exlWkbk.Open([currentpath filesep xlsnames(k).name]);
    exlSheet1 = exlFile.Sheets.Item('Page 0');
    rngObj1 = exlSheet1.Range(SampleID);
    xlsimport_ID = rngObj1.Value;
    rngObj2 = exlSheet1.Range(SampleRuntime);
    xlsimport_Runtime = rngObj2.Value;
    rngObj3 = exlSheet1.Range(data_range);
    xlsimport_data = rngObj3.Value;
    SampleInfo(k,1) = {xlsimport_ID};
    SampleInfo(k,2) = {xlsimport_Runtime};
    xlsimportdata(:,:,k) = xlsimport_data;
    % Progression bar updater
    progress = round((k / xlscount) * 100);
    importtext = sprintf('Importing %d of %d', k, xlscount);
    waitbar(progress/100, h, importtext);
    disp(['Import progress: ' num2str(k) '/' num2str(xlscount)]);
end
% Close actxserver
exl.Quit
% Close progression bar
close(h)
Give this a try. I am not an ActiveX Excel guru by any means; however, this works for me on my small set of test XLS files (3). I never close exlWkbk, so I don't know whether memory usage is building up or whether it is automatically cleaned up when the workbook goes out of scope after the next one is opened in its place ... so use at your own risk. I am seeing an almost 2.5x speed increase, which seems promising.
>> timeit(@getSomeXLS)
ans =
    1.8641
>> timeit(@getSomeXLS_old)
ans =
    4.6192
Please leave some feedback on whether this works on a large number of Excel sheets, because I am curious how it goes.
function xlsimportdata = getSomeXLS()
scriptName = mfilename('fullpath');
[currentpath, filename, fileextension]= fileparts(scriptName);
xlsnames = dir(fullfile(currentpath,'*.xls'));
xlscount = length(xlsnames);
xlsimportdata = zeros(7,6,xlscount);
exl = actxserver('excel.application');
exlWkbk = exl.Workbooks;
dat_range = 'D31:I37';
for k = 1:xlscount
    exlFile = exlWkbk.Open([currentpath filesep xlsnames(k).name]);
    exlSheet1 = exlFile.Sheets.Item('Sheet1'); % whatever your sheet is called
    rngObj = exlSheet1.Range(dat_range);
    xlsimport = cell2mat(rngObj.Value);
    xlsimportdata(:,:,k) = xlsimport;
end
exl.Quit
I am trying to read a .csv file's contents, find the records that contain the letter N, and export each row containing that N to an external .txt file.
At this point there are 12 customers in total, and 7 of them contain the letter N I am looking for. I want all of them to be exported, but only one of them ends up in the file.
CSV file (column A and column B):
John 345N
Andy 346K
Andrew 564B
Richard 645N
John 563N
Andy 345N
Andrew 346K
Richard 564B
John 645N
Andy 563N
Andrew 345N
Richard 346K
Code:
Dim currentRow As String()
Dim customerName(11) As String
Dim idAndProduct(11) As String
Dim FileWriter As StreamWriter

Using FileReader As New Microsoft.VisualBasic.FileIO.TextFieldParser("C:\Files\customers.csv")
    FileReader.TextFieldType = FileIO.FieldType.Delimited
    FileReader.SetDelimiters(",")
    For index = 0 To 11
        currentRow = FileReader.ReadFields
        customerName(index) = currentRow(0)
        idAndProduct(index) = currentRow(1)
    Next index
    For counter = 0 To 11
        If Mid$(idAndProduct(counter), 4, 1) = "N" Then
            FileWriter = New StreamWriter("C:\Files\export.txt")
            FileWriter.WriteLine(customerName(counter) & "," & idAndProduct(counter))
            FileWriter.Close()
        End If
    Next counter
End Using
Output (only one line gets exported instead of all 7):
Andrew,345N
How can I make all 7 get exported?
I would be thankful for any answer!
EDIT: To solve this problem, the code needs to be changed from FileWriter = New StreamWriter("C:\Files\export.txt") to FileWriter = New StreamWriter("C:\Files\export.txt", True), which opens the file in append mode and prevents it from being overwritten on every match.
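For comparison, the same filter can be sketched in Python (the file name and sample data here are made up for illustration). Opening the output file once, outside the loop, sidesteps the overwrite problem entirely, since a writer recreated per match truncates the file each time:

```python
import csv
import io

# Made-up sample in the question's two-column layout.
csv_text = """John,345N
Andy,346K
Andrew,564B
Richard,645N
"""

matches = []
for name, id_and_product in csv.reader(io.StringIO(csv_text)):
    # The product letter is the 4th character of the second column.
    if id_and_product[3] == "N":
        matches.append(name + "," + id_and_product)

# Open the export file once and write every match; recreating the
# writer inside the loop would truncate it on each iteration.
with open("export.txt", "w") as out:
    for line in matches:
        out.write(line + "\n")
```

The append-mode fix above achieves the same end result, but opening the file a single time is usually the cleaner design, and it avoids stale lines from a previous run surviving in the export file.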
I have a program that writes files to a network share at a high rate, from a few (3) threads at once.
After running for a while (usually a short while), some of these threads get stuck. Using Process Monitor, I can see calls to WriteFile and CloseFile that simply never complete.
At this point I can't shut down the process at all; even killing it from Task Manager does nothing.
The interesting thing is that this happens when the computer hosting the shares is running Windows Server 2008 (R2). If I move the shares to a Windows 2003 computer, I don't see these problems. Also, I only see the problem when the program runs on a computer that is itself running Windows Server 2008 (a different computer than the share host).
Here is a short program that quickly reproduces the problem. The files in the source directory range in size from 1 to 20 MB:
Imports System.IO
Imports System.Threading

Module Module1
    Private m_sourceFiles As FileInfo()
    Private m_targetDir As String

    Sub Main(ByVal args As String())
        Dim sourceDir As New DirectoryInfo(args(0))
        m_sourceFiles = sourceDir.GetFiles()
        m_targetDir = args(1)
        For i As Integer = 0 To 2
            ThreadPool.QueueUserWorkItem(AddressOf DoWork)
        Next
        Console.ReadLine()
    End Sub

    Private Const BUFFER_SIZE As Integer = (128 * 1024)

    Private Sub DoWork(ByVal o As Object)
        Console.WriteLine(Thread.CurrentThread.ManagedThreadId)
        Dim random As New Random(Thread.CurrentThread.ManagedThreadId)
        While True
            Dim fileIndex As Integer = random.Next(m_sourceFiles.Length)
            Dim sourceFile As FileInfo = m_sourceFiles(fileIndex)
            Dim input As FileStream = sourceFile.OpenRead
            Dim targetName As String = sourceFile.Name.Replace(sourceFile.Extension, random.Next(Integer.MaxValue) & sourceFile.Extension)
            Dim targetPath As String = m_targetDir & "\" & targetName
            Dim output As FileStream = File.Create(targetPath)
            Dim bytes() As Byte = New Byte((BUFFER_SIZE) - 1) {}
            Dim read As Integer = input.Read(bytes, 0, bytes.Length)
            While read <> 0
                output.Write(bytes, 0, read)
                read = input.Read(bytes, 0, bytes.Length)
            End While
            output.Flush()
            output.Close()
            input.Close() ' close the source file so handles are not leaked
            Console.WriteLine(Thread.CurrentThread.ManagedThreadId & " - " & targetName)
        End While
    End Sub
End Module
The problem was caused by Symantec Antivirus.
Apparently they don't support 2008 R1 yet.
I was able to work around the issue by disabling SMB 2.0 on the client computer, as described here:
sc config lanmanworkstation depend= bowser/mrxsmb10/nsi
sc config mrxsmb20 start= disabled