I am generating a PDF file using dompdf. The filename keeps changing because I pass a variable to dompdf's stream() output, so the filename is always unique.
I want to attach this file to an email that I am sending with PHPMailer.
Problem:
I can attach the file properly if I give PHPMailer a static file name (a particular name assigned), but I am not successful with dynamic file names (using a variable, as below). Here, $attach is my PHP variable for the file name.
I am trying the code below:
$dompdf->stream($attach);//Code for dompdf
$mail->AddAttachment('C:\Downloads\$attach.pdf'); //code for phpmailer to attach file
What I did is this:
I have all my HTML in a variable called `$html`. Then I followed the procedure below.
$dompdf->load_html($html);
$dompdf->render();
$pdf = $dompdf->output();
// $file_location holds the dynamic file name
file_put_contents($file_location, $pdf);
Now use this $file_location variable to attach the file in PHPMailer. This needs to go in the same PHP file where you generate the PDF.
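For example, the attachment step could look like this (a minimal sketch; $mail is assumed to be an already configured PHPMailer instance):
// Sketch: attach the freshly written PDF; $mail is assumed configured elsewhere.
$mail->AddAttachment($file_location);
$mail->Send();
This works because $file_location now points at a real file on disk, rather than an unparsed '$attach' string inside single quotes.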
I have an xlsx file that will be dropped into a folder on a monthly basis. The filename changes every month based on the date (e.g. filename_8292019), and I cannot change that.
I want to build a Foreach Loop to pick up the xlsx file and manipulate it (load it into a SQL Server table, then move the file to an archive folder). I cannot figure out how to do this with a dynamic filename (where the date changes).
I was able to successfully run the package when converting the xlsx to CSV, and also when pointing directly to the xlsx filename.
[Flat File Destination [219]] Error: Cannot open the datafile "filename"
OR errors relating to file not found
The Files: entry on the Collection tab of the Foreach Loop container will accept wildcard characters.
The general pattern here is to create a variable, say, FileName. Set your Files: to something like:
Files:
BaseFileName*
or, if you want to be sure to only pick up spreadsheets, maybe:
Files:
BaseFileName*.xlsx
Select either Name and extension or Fully qualified, which will include the full file path. I usually just use Name and extension and put the file path into another variable so when Ops tells me they're moving my drop location, I can change a parameter instead of editing the package. This step tells the container to remember the name of the file it just found so you can use it later for a variable mapping.
On the Variable Mappings tab, select your variable name and assign it to Index 0.
Then, for each spreadsheet, the container will loop, pick up the name of the first file it finds that matches your pattern, and assign the full name, with the date extension (and path, if you go that way), to your variable. Pass the variable as an input parameter to the tasks inside the loop and use that to process the file, including moving it to the archive, or you'll get yourself into an infinite loop, processing the same file(s) over and over. <--Does that sound like the voice of experience? Yeah. Been there, done that.
Edit:
Here, the FullFilePath variable holds just the folder name, without a file reference, and goes in the Folder box.
The FileBaseName variable drives what shows up in the Files box.
A third variable picks up the actual file name, with the date extension. Later, say in a File System Task, if I need the folder and file name together, I concatenate the variables.
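That concatenation is just an SSIS expression on another variable, something like this sketch (the variable names FullFilePath and FileName are the ones assumed above; "\\" is an escaped backslash in the expression language):
@[User::FullFilePath] + "\\" + @[User::FileName]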
As far as the Excel Connection Manager error you're getting, unfortunately I'm no help. I don't use it. We have SentryOne's Task Factory for SSIS which includes a much more resilient Excel connector.
I have set up a logic app with the FTP trigger [When a file is added or modified (properties only)]. This works just fine when I upload a 50+ MB file to that FTP server. I have a [Get File Content] action set up right after the trigger. For the File input of the [Get File Content] action, I used the [List of Files Name] dynamic content from the trigger, AND I have also tried filling in the path using the available 'File Picker' (which connects to the FTP just fine). When I test this out, it fails on the [Get File Content] action with a BadRequest and this body:
{
"status": 400,
"message": "An invalid request was made. Inspect the passed parameters and actions.\r\nclientRequestId: 7d9f2ff3-62d0-4f69-8cc5-f41c35297882",
"source": "ftp-eus.azconn-eus.p.azurewebsites.net"
}
The inputs that come into the action show the correct file name and path. So I am confused on what it means by "Inspect the passed parameters and actions". Can someone point me in the right direction on how to solve this?
EDIT
Here are some screenshots to show it. I don't get [File Name] as a dynamic option from my trigger. It doesn't even matter, though: I can pick the exact file I want from the FTP picker and it still fails. See screenshots:
Dynamically select file:
Statically select file:
Same result from both of them:
If you use Get file content and pick the file with the picker, you will find that the File input it stores is the Path of the file, so you cannot get the file content from the file name alone. You can use either the File path or the File name, but if you use the File name you also have to know the path.
If you want to use the File name, the input ends up as the path plus the name, which is a little inconvenient.
Or just use the File path. The inputs in these two ways are actually the same, so both can get the file.
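As a sketch of the path-based approach: in code view, the File input of Get file content can be set with an expression like the one below (this assumes the trigger's output exposes a Path property for each file, which is my reading of the properties-only trigger; verify the property name in your run history):
@{triggerBody()?['Path']}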
Hi, I'm new to Puppet and trying to work out a sample that copies a file from one location to another. Any sample script to do that?
Ex: I have a file at d:\temp\test.txt and I want to copy it to the E:\mycopy\ folder.
You can "ensure" that the file at target location exists and provide the file to be copied as source in file type. A partial code snippet only showing relevant parts:
file { 'E:/mycopy/folder/filename':
  ensure => present,
  # Forward slashes (or single quotes around backslashes) stop "\t" being read as a tab escape.
  source => 'd:/temp/test.txt',
}
Check the documentation of the file type and of how the source attribute behaves. Now, this will work with a few caveats:
If you are using an absolute file path as the source, then the file must be present on the agent machine.
If you are serving the file from Puppet's file server, then the source file must be in the appropriate location on the file server.
But what is your exact purpose? A similar thing can be achieved with the content attribute of the file type, or other attributes; see the sketch below.
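For example, a minimal sketch using content instead of source (the file() function reads the named file on the machine where the catalog is compiled, so with puppet apply it reads the local d:/temp/test.txt; the paths are the ones from the question):
file { 'E:/mycopy/folder/test.txt':
  ensure  => file,
  content => file('d:/temp/test.txt'),  # contents are baked in at compile time
}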
I want to identify the file format of the input file given to my shell script - whether it is a .pst or a .dbx file. I checked How to check the extension of a filename in a bash script?. That one deals with txt files, and two methods are given there:
check if the extension is txt
check if the mime type is application/text etc.
I tried file -ib <filename> on a .pst and a .dbx file and it showed application/octet-stream for both. However, if I just do file <filename>, then I get
this for the dbx file -
file1.dbx: Microsoft Outlook Express DBX File Message database
and this for the pst file -
file2.pst: Microsoft Outlook binary email folder (Outlook >=2003)
So, my questions are:
Is it better to use MIME type detection every time, when the output can be anything and we need a proper check?
How do I apply a MIME type check in this case, when both files return "application/octet-stream"?
Update
I didn't want to do extension-based detection because it seems we just can't be sure, on a Unix system, that a .dbx file truly is a dbx file. file <filename> returns a line which contains the correct information about the file (e.g. "Microsoft Outlook Express DBX File Message database"), which means the file command is able to identify the file type properly. Then why does it not return the correct information with the file -ib <filename> command?
Will parsing the string output of file <filename> be fine? Is it advisable, assuming I only need to identify a narrow set of data storage files from the Outlook family (MS Outlook Express, MS Office Outlook 2003/2007/2010, etc.)? A small text identifier like application/dbx which could be compared would be all I need.
The file command relies on having a file type detection database which includes rules for the file types that you expect to encounter. It may not be possible to recognize these file types if the file content doesn't have a unique code near the beginning of the file.
Note that the -i option to emit mime types actually uses a separate "magic" numbers file to recognize file types rather than translating long descriptions to file types. It is quite possible for these two databases to be out of sync. If your application really needs to recognize these two file types I suggest that you look at the Linux source code for "file" to see how they recognize them and then code this recognition algorithm right into your app.
If you want to do the equivalent of DOS file type detection, then strip the extension off the filename (everything after the last period) and look up that string in your own table where you define the types that you need.
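For the narrow Outlook-family case above, a hedged sketch of that idea in shell: parse the human-readable file output into small, comparable identifiers (the application/dbx and application/pst labels are invented for comparison purposes, and the match patterns come from the two outputs shown in the question):
#!/bin/sh
# Map `file` descriptions to short identifiers we can compare against.
desc=$(file -b "$1")
case "$desc" in
  *"Outlook Express DBX"*)         echo "application/dbx" ;;
  *"Outlook binary email folder"*) echo "application/pst" ;;
  *)                               echo "unknown" ;;
esac
New file types just mean new case branches, but note the earlier caveat: the patterns are only as stable as the local magic database.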
I want to know how to dynamically assign the file name using log4net. My application is such that 10 different files should be dynamically created based on user input, and later, based on the name, the corresponding file needs to be picked up and information written to it.
For example, per my business requirement, for every XML file a corresponding log file with the same name as the XML file should be created. Later, whenever I modify the XML file, an entry needs to be made in the corresponding log file.
Please help. I am having trouble getting control of the appropriate log to write to it.
I have not done this, but there are probably a number of ways of doing it, so this may not be the best way, but it should work:
public void OpenLogFile(string fileName)
{
    var layout = new log4net.Layout.PatternLayout("%d [%t]%-5p : - %m%n");
    layout.ActivateOptions();
    var appender = new log4net.Appender.FileAppender
    {
        Layout = layout,
        File = fileName,
        AppendToFile = true,
        Threshold = log4net.Core.Level.Info
    };
    appender.ActivateOptions();
    log4net.Config.BasicConfigurator.Configure(appender);
}
Then just call OpenLogFile when you need to switch files.
You might need to tweak the layout or appender type.
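For instance, usage could look like this sketch (the path, variable, and logger name are arbitrary assumptions):
// Hypothetical call site: one log file per XML file being edited.
OpenLogFile(@"C:\logs\" + xmlFileName + ".log");
var log = log4net.LogManager.GetLogger("XmlEditor");
log.Info("XML file modified");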
A big disadvantage of this method is that you lose the XML configuration and the ability to change settings at runtime. So a better way might be to configure your appender in the XML file to use a property,
e.g.
<file type="log4net.Util.PatternString" value="Logfiles\Log_For_%property{MyLogFileName}" />
Then in your code you could change the property
log4net.GlobalContext.Properties["MyLogFileName"] = ...;
The tricky bit is getting log4net to reload itself. I haven't read the documentation on this, so I don't know if there is a way of forcing a reload. It might work if you just call log4net.Config.XmlConfigurator.ConfigureAndWatch again. Otherwise, it should work if you open the XML file and save it again (without needing to change anything).
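A minimal sketch of the property-based variant (assuming the <file> element above, and assuming that simply re-running the configurator is enough to re-evaluate the PatternString; the "Customer1" value is a placeholder):
// Set the property, then re-read the XML configuration.
log4net.GlobalContext.Properties["MyLogFileName"] = "Customer1";
log4net.Config.XmlConfigurator.Configure();
var log = log4net.LogManager.GetLogger("MyLogger");
log.Info("This should land in Logfiles\\Log_For_Customer1");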
Hope this helps.