I have a .qvw file with a SQL query:
Data:
LOAD source, color, date;
SQL SELECT source, color, date
FROM MyTable;
STORE Data into [..\QV_Data\Data.qvd] (qvd);
Then I export the data to Excel and save it.
I need something that does this automatically instead of me.
I need to run the query every day and automatically send the data to Excel, keeping the old data and appending the new values.
Can QlikView do that?
For that you would need to create some crazy macro that runs from an on-open trigger after a reload task. If you schedule a Windows task that executes a .bat file containing the path to qlikview.exe, with the document's file path as a parameter and the /r flag for reload, you can probably accomplish this... there is a lot of code from similar projects to be found on Google.
I suggest adding this to the load script instead:
STORE Data into [..\QV_Data\Data.csv] (txt);
and then opening that file in Excel.
If you need to append data, you could concatenate the new data onto the previous data, something like:
Data:
LOAD * FROM [..\QV_Data\Data.csv] (txt);
//add latest data
Concatenate(Data)
LOAD source, color, date FROM ...;
STORE Data into [..\QV_Data\Data.csv] (txt);
I assume you have the Desktop version, so you don't have access to the QlikView Management Console (if you do, that is obviously the best way).
So, without the Console, you should create a text file with this command: "C:\Program Files\QlikView\Qv.exe" /r "\\thePathToYourFile\test.qvw". Save this file with a .cmd extension. After that you can schedule this command file with the Windows Task Scheduler.
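For example, the whole .cmd file can be just that one line, and the scheduling step can be scripted too. A minimal sketch; the task name, script location, and 06:00 start time are placeholders, not values from this thread:

rem reload_qlikview.cmd -- the entire file is this single line
"C:\Program Files\QlikView\Qv.exe" /r "\\thePathToYourFile\test.qvw"

rem Register it once with the Windows Task Scheduler to run daily
schtasks /create /tn "QlikViewDailyReload" /tr "C:\scripts\reload_qlikview.cmd" /sc daily /st 06:00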
I have an Excel workbook that uses a hotkey to launch a batch file, which launches a Node script, which updates a CSV file. Technical details on that are further below.
The workbook uses the CSV file as a data source. I can manually update the Workbook with the data from the CSV file by going to Data > Refresh All > Refresh All.
Is there any way to trigger an update in the workbook once there is new data in the CSV file, or when the batch file finishes? Conceptually, I'm asking how an external event can trigger something in Excel.
Here are the finer details of the process:
When a hotkey is pressed in the Excel workbook, it launches the MS console ("cmd.exe") and passes it the location of a batch file to be run and the value of the selected cell. The reason the batch file is run this way is probably not relevant to this question, but I know it will be asked, so I'll explain: the batch file is to be located in the same directory as the workbook, which is not to be a hard-coded location. The problem is that launching a batch file/cmd.exe directly defaults to a working directory of C:\users\name\documents. So to launch the batch file in the same directory as the workbook, the path of the workbook is passed along to cmd.exe like so: CD [path], which is then concatenated inline with another command that launches the batch file with the value of the selected cell as an argument, like so: CD [path] & batch.bat cellValue
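In VBA terms, that launch step might look something like the sketch below; the macro name and the batch file name are assumptions based on the description, not code from the post:

' Hypothetical sketch of the hotkey macro: switch cmd.exe to the workbook's
' folder, then run batch.bat with the selected cell's value as an argument
Sub LaunchBatch()
    Dim cmd As String
    cmd = "cmd.exe /c CD /d """ & ThisWorkbook.Path & """ & batch.bat " & ActiveCell.Value
    Shell cmd, vbNormalFocus
End Sub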
Still with me?
The batch file then launches a Node script, again with the selected cell value as an argument.
The Node script pulls data from the web and dumps it into a CSV file.
At this point, the workbook still has outdated data and needs to be refreshed. How can this be automatic?
I could just start a static timer in VBA after the batch file is launched, which then runs ActiveWorkbook.RefreshAll, but if the batch file takes too long, there will be issues.
I found a solution, although it may not be the most efficient way.
Right now, after Excel launches the batch file, I have it set to loop and repeatedly check the date modified of the CSV file via FileDateTime("filename.csv").
At first, this looping was an issue because I was worried about Excel excessively checking the date modified of the CSV. I thought it might cause resource issues while it checks however many hundreds or thousands of times a second. I could add a one-second delay with the Sleep or Wait functions, but those cause Excel to hang: it would be frozen until the CSV file was updated, if ever, and the user would have to use CTRL+BREAK in an emergency.
I was able to use a solution that just loops and performs DoEvents while checking until a certain amount of time has passed. This way, Excel is still functional during the wait. More info on that here: https://www.myonlinetraininghub.com/pausing-or-delaying-vba-using-wait-sleep-or-a-loop
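A minimal sketch of that approach; the CSV living next to the workbook and the one-minute timeout are both assumptions, not details from the post:

' Hypothetical sketch: poll the CSV's modified time without freezing Excel,
' then refresh the workbook once the Node script has rewritten the file
Sub WaitForCsvAndRefresh()
    Dim csvPath As String, lastModified As Date, deadline As Date

    csvPath = ThisWorkbook.Path & "\filename.csv"
    lastModified = FileDateTime(csvPath)
    deadline = Now + TimeSerial(0, 1, 0) ' give up after one minute

    ' DoEvents keeps Excel responsive while we wait
    Do While FileDateTime(csvPath) = lastModified And Now < deadline
        DoEvents
    Loop

    If FileDateTime(csvPath) <> lastModified Then ActiveWorkbook.RefreshAll
End Sub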
Using an existing SSIS package, I was trying to import .xlsx files we received from a client. I received the error message:
External table is not in the expected format
These files will open in Excel.
When I use Excel (currently Excel 2010) to Save As... the file without making any changes:
The new file imports just fine
The new file is 330% the size of the original file
When changing .xlsx to .zip and investigating the contents with WinZip, the original file has only 4 .xml files and a _rels folder (with 2 .rels files), while the new file has the expected .xlsx contents.
Does anyone know what kind of file this could be?
It would be nice to develop my SSIS package to work with these original files, without having to open and re-save each one. There are only 12 files, so if there are no other options, opening and saving each file is not that big a deal... and I could automate it with VBA going forward.
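For reference, a minimal sketch of that open-and-re-save automation; the folder paths are placeholders:

' Hypothetical sketch: open each problem file and re-save it as a true .xlsx
Sub ResaveFiles()
    Dim fname As String, wb As Workbook
    fname = Dir("C:\Original\*.xlsx")
    Do While fname <> ""
        Set wb = Workbooks.Open("C:\Original\" & fname)
        wb.SaveAs "C:\Fixed\" & fname, FileFormat:=xlOpenXMLWorkbook
        wb.Close SaveChanges:=False
        fname = Dir ' next file in the folder
    Loop
End Sub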
Thanks for any help anyone can provide,
CTB
There are many Excel file formats.
The file you are trying to import may actually be in another Excel format with the extension renamed to .xlsx (it could have been edited by someone else), or it could have been created with a different Excel version.
There is a third-party application called TrIDNet File Identifier, a utility designed to identify file types from their binary signatures. You can use it to determine the real format of the file.
Also, after a simple search on "External table is not in the expected format": this error is thrown when the definition (or version) of the Excel files supported by the connection string differs from the file selected. Check the connection string used in the Excel connection manager; it might help to identify the version of the file.
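As an illustration (these are well-known example strings, not taken from the thread), the two provider families differ like this, and a Jet-era string typically throws exactly this error on a real .xlsx file:

Jet, for Excel 97-2003 (.xls):
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\files\book.xls;Extended Properties="Excel 8.0;HDR=YES"

ACE, for Excel 2007+ (.xlsx):
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\files\book.xlsx;Extended Properties="Excel 12.0 Xml;HDR=YES"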
I have two problems using openpyxl:
1. The spreadsheet has 1,048,498 rows. Iterating over it hogs memory, so I added logic to check whether the first five columns are empty and break out of the loop.
2. Logic 1 works for me, and the code no longer iterates indefinitely over the spreadsheet's blank cells. But I am using P4Python to delete this read-only file after I am done reading it, and openpyxl is still holding the file open: there is no method except save to close the archive it uses internally, and since my file is opened in read-only mode, I cannot save it. When P4 tries to delete the file, I get this error: "The process cannot access the file because it is being used by another process."
Help is appreciated :)
If you open the file in read-only mode then it will not hog memory. Cells are created only when read. Memory use has been tested with huge files but if you think this is a bug then please submit a bug report with a sample file.
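A minimal sketch of the read-only, row-at-a-time pattern the answer describes; the file name is a placeholder, the break condition mirrors the question's description, and values_only requires a reasonably recent openpyxl:

# Hypothetical sketch: stream a huge sheet lazily in read-only mode
from openpyxl import load_workbook

wb = load_workbook("huge_file.xlsx", read_only=True)
ws = wb.active

for row in ws.iter_rows(values_only=True):
    # Stop at the first row whose first five columns are all empty,
    # mirroring the break-out logic from the question
    if all(cell is None for cell in row[:5]):
        break
    # ... process the row here ...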
This looks like an existing issue or intended behavior with openpyxl. If you have a read-only file (from a P4Python sync operation, p4.run_sync(file_path_to_sync)) and you are reading it using openpyxl, you will not be able to delete the file (P4Python p4.run_sync(file_path_to_sync + '#0'), which removes it from the workspace) until you save the file, which is not possible (or intended in my case) since it is a read-only file.
So I have a folder into which Excel files get loaded daily. I need to create a package that extracts those files to a SQL table. However, the file changes name daily because the name includes the current date, so the file name looks like filename+currentdate (name20140801.xlsx).
I tried using a variable holding filename?.xlsx / filename?*xlsx as the ExcelFilePath property of the Excel connection manager, but the task gives me the error DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.
Does anyone know how I can do this using the Data Flow task? (I am trying to avoid the Script task.)
Thank you!
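One common pattern, offered here only as a sketch rather than an answer from this thread: build the file name with an expression on the Excel connection manager's ExcelFilePath property and set the connection manager's DelayValidation to True so the package validates at run time. The folder and base name below are placeholders:

"C:\\Drop\\name"
    + (DT_WSTR, 4) YEAR(GETDATE())
    + RIGHT("0" + (DT_WSTR, 2) MONTH(GETDATE()), 2)
    + RIGHT("0" + (DT_WSTR, 2) DAY(GETDATE()), 2)
    + ".xlsx"

On 2014-08-01 this evaluates to C:\Drop\name20140801.xlsx, matching the daily naming scheme.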
I need to write a stored procedure/function that reads data from a worksheet of an Excel workbook. How do I do that in DB2? I am using the AIX OS.
I tried "Read Excel from DB2", but it won't work on my OS.
I also tried
IMPORT FROM FileName.csv OF DEL COMMITCOUNT 1000 INSERT INTO TableName
but in vain.
You have several options; the cleanest is probably to write a Java stored procedure using the Apache POI library, if you intend to read actual Excel workbooks (.xls or .xlsx) rather than plain CSV-formatted text files.
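A minimal sketch of the POI reading side (just the file-reading logic, not the full stored-procedure wrapper; the class name and file name are placeholders, and it assumes poi and poi-ooxml are on the classpath):

// Hypothetical sketch: print every cell of the first worksheet
import java.io.FileInputStream;
import org.apache.poi.ss.usermodel.Cell;
import org.apache.poi.ss.usermodel.DataFormatter;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;

public class ReadSheet {
    public static void main(String[] args) throws Exception {
        DataFormatter fmt = new DataFormatter(); // renders cells as Excel displays them
        try (FileInputStream in = new FileInputStream("Workbook.xlsx");
             Workbook wb = new XSSFWorkbook(in)) {
            Sheet sheet = wb.getSheetAt(0); // first worksheet
            for (Row row : sheet) {
                StringBuilder line = new StringBuilder();
                for (Cell cell : row) {
                    line.append(fmt.formatCellValue(cell)).append('\t');
                }
                System.out.println(line);
            }
        }
    }
}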
Not as clean, but just as effective: you can write a Perl / Python / PHP script that reads the file and returns a line at a time, and invoke the script from a stored procedure; see: Making Operating System Calls from SQL.
It would be better to convert your Excel file to a flat file such as CSV if possible, because DB2 does not natively understand Excel files. It is CSV files that DB2 can process natively, using the IMPORT, LOAD, or INGEST tools.