Command Line Arg in VC++ 2010 - visual-c++

My command line arguments have a null byte after each character. Suppose I call the program from the command prompt like "abc.exe test data"; then in memory there is an extra byte after each character and the data looks like "t.e.s.t..d.a.t.a".
The program prints only "t", the first character, instead of the complete string "test". What is the issue?
#include <stdio.h>
#include <tchar.h>

int _tmain(int argc, _TCHAR* argv[])
{
    printf("The number of Argc %d %s", argc, argv[1]);
    return 0;
}

You are building with Unicode enabled (note the _t prefix in _tmain and the _TCHAR type).
In a Unicode build each character is stored in 2 bytes (UTF-16), so the narrow-string %s of printf stops at the first null byte and you only see "t".
Hence you should use _tprintf instead of printf (or keep printf and use %ls).
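A minimal sketch of the corrected program, assuming a Unicode (wide-character) build; in that configuration _tprintf maps to wprintf and _T() produces a wide literal, so argv[1] prints in full:
#include <stdio.h>
#include <tchar.h>

int _tmain(int argc, _TCHAR* argv[])
{
    if (argc > 1)
    {
        // In a Unicode build, %s in _tprintf expects a _TCHAR* (wide) string,
        // so the whole of argv[1] is printed instead of stopping at "t".
        _tprintf(_T("The number of Argc %d %s\n"), argc, argv[1]);
    }
    return 0;
}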

Related

How do I set a global variable from the command line using (int argc, char *argv[])?

I have a global variable that I want to set such that when I write ./file 5 on the command line of a Linux terminal, it will result in m = 5.
#define m
...
int main(int argc, char* argv[]) {
    m = atoi(argv[1]);
    for (int i = 0; i < m; i++) {
        cout << "hello" << endl;
    }
}
You need to replace #define m with int m; for it to work. #define m is a preprocessor macro, not a global variable.
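A minimal corrected sketch along the lines of that answer, with m as a real global set from argv[1]:
#include <cstdlib>
#include <iostream>
using namespace std;

int m;  // a real global variable instead of the #define

int main(int argc, char* argv[]) {
    if (argc < 2) {
        cerr << "usage: " << argv[0] << " count" << endl;
        return 1;
    }
    m = atoi(argv[1]);              // ./file 5  ->  m == 5
    for (int i = 0; i < m; i++) {
        cout << "hello" << endl;
    }
    return 0;
}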

Execve problems when reading input from pipe

I wrote a simple C program to execute another program using execve.
exec.c:
#include <unistd.h>
#include <stdio.h>

int main(int argc, char** argv) {
    char path[128];
    scanf("%s", path);
    char* args[] = {path, NULL};
    char* env[] = {NULL};
    execve(path, args, env);
    printf("error\n");
    return 0;
}
I compiled it:
gcc exec.c -o exec
and after running it and typing "/bin/sh", it successfully ran the shell and displayed the $ prompt like a normal shell.
Then I did the following: I created a server using nc -l 12345 and ran nc localhost 12345 | ./exec. It worked, but for a reason I can't understand, the $ prompt was not displayed this time.
Now, here is the weirdest thing: when I try to pass the program path AND more input through the pipe at once, the executed process seems to just ignore the extra input and close.
But if I run it in a way that keeps the pipe open after the input has been written, it works exactly the same way it did when I piped the nc output.
So, to conclude, my questions are:
I don't understand why the executed shell doesn't print the $ prompt when it reads input from a pipe instead of from the terminal.
Why won't the executed program read the input from the pipe when the input is already sitting there? It seems to work only in the cases where the pipe remains open after the command execution.
As AlexP already mentioned, the prompt sign is only displayed when input comes from a terminal.
The second question is trickier: when you call the libc function scanf, its implementation not only consumes "/bin/sh" from the pipe, it also reads the following input ("ls") ahead into its internal buffers. Those internal buffers are discarded when execve replaces the process image, so the shell gets nothing.
Here is your program without scanf to verify this:
#include <unistd.h>
#include <stdio.h>

int main(int argc, char** argv) {
    char path[128];
    read(0, path, 8);   // consume exactly the 8 bytes of "/bin/sh\n"
    path[7] = '\0';     // replace the newline with a terminator
    char* args[] = {path, NULL};
    char* env[] = {NULL};
    execve(path, args, env);
    printf("error\n");
    return 0;
}
Why did the example with cat work in the first place?
That's (probably) because of buffering also. Try:
(echo /bin/sh; echo ls) | stdbuf -i0 ./exec
I recommend this nice article about buffering for further reading.
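An alternative sketch, my own addition rather than part of the original answer: keep scanf, but make stdin unbuffered with setvbuf so stdio never reads ahead of what scanf actually parses; this is essentially what stdbuf -i0 arranges from the outside:
#include <unistd.h>
#include <stdio.h>

int main(int argc, char** argv) {
    setvbuf(stdin, NULL, _IONBF, 0);   // no stdio read-ahead on stdin
    char path[128];
    if (scanf("%127s", path) != 1)     // consumes "/bin/sh", leaves "ls\n" in the pipe
        return 1;
    char* args[] = {path, NULL};
    char* env[] = {NULL};
    execve(path, args, env);
    perror("execve");                  // only reached if execve fails
    return 1;
}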

How to extract numbers correctly from an input file in command line argument

I am attempting to use ifstream to extract two numbers from the file passed in argv[1] (inputFile.txt), but the extraction operator seems to be producing garbage rather than the numbers in the file.
inputFile.txt was added to the command-line arguments by right-clicking the project in Visual Studio 2017 and going to Properties -> Debugging -> Command Arguments, then typing inputFile.txt there.
The file inputFile.txt is as below:
1 2
#include <iostream>
#include <fstream>
#include <string>
#include <iomanip>
using namespace std;

int main(int argc, char *argv[])
{
    // Test opening the file
    cout << "Input file: " << argv[1] << endl;
    ifstream in(argv[1]);
    if (!in)
    {
        cerr << "Unable to open " << argv[1] << " for input";
        return 1;
    }
    // Extract the numbers
    int num1;
    int num2;
    in >> num1 >> num2;
    cout << num1 << endl << num2 << endl;
    in.close();
    return 0;
}
I expect the int num1 to hold 1, and the int num2 to hold 2, but instead each variable holds the number -858993460.
Are you sure the file you are reading contains the data you expect? This code works fine for me, compiled with Visual Studio 2005. But if I change the contents of the file to something that is not numbers (for example a b), num1 and num2 end up as -858993460. That value is 0xCCCCCCCC, the pattern MSVC debug builds use to fill uninitialized stack variables, which is what num1 and num2 remain when extraction fails.
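A small sketch of the same program with the extraction result checked, so a file whose contents cannot be parsed is reported instead of printing the uninitialized values:
#include <fstream>
#include <iostream>
using namespace std;

int main(int argc, char *argv[])
{
    if (argc < 2)
    {
        cerr << "usage: " << argv[0] << " inputFile.txt" << endl;
        return 1;
    }
    ifstream in(argv[1]);
    if (!in)
    {
        cerr << "Unable to open " << argv[1] << " for input" << endl;
        return 1;
    }
    int num1 = 0, num2 = 0;
    if (!(in >> num1 >> num2))          // fails if the file holds e.g. "a b"
    {
        cerr << "Could not read two numbers from " << argv[1] << endl;
        return 1;
    }
    cout << num1 << endl << num2 << endl;
    return 0;
}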

more than one command for system call in linux

I am trying to execute a program (say target.c) that contains the following code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

void foo(char * arg)
{
    char cmd[16];
    char par[16];
    char * p;
    strcpy(cmd, "ls --color -l ");
    strcpy(par, arg);
    printf("You can use \"%s %s\" to list the files in dir \"%s\"!\n", cmd, par, par);
    p = (char*)malloc(strlen(cmd) + strlen(par) + 2);
    strcpy(p, cmd);
    strcat(p, " ");
    strcat(p, par);
    system(p);
}

int main(int argc, char ** argv)
{
    int i;
    char test[256];
    if (argc > 1)
        foo(argv[1]);
    else
        printf("usage: %s dir\n", argv[0]);
    return 0;
    foo(test);
}
Now I am trying to get a shell by invoking it from another program (it is important that it be invoked from another program), shown below:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    char * arrv[] = {NULL};
    char *payload;
    int i; int j;
    char * argo[] = {"../targets/target1", "sdknsd", NULL};
    strcpy(payload, "sd;/bin/sh");
    argo[1] = payload;
    i = fork();
    if (i == 0)
    {
        execve("../targets/target1", argo, arrv);
        exit(1);
    }
    else if (i == -1)
    {
        perror("fork()");
    }
}
My question is: when I execute the target directly and pass a command-line argument like something ;/bin/sh, I get the shell, but not when invoking it through execve.
Any help would be really appreciated
Alright here is the output:
[hvalayap#localhost targets]$ ./target1 ds;/bin/sh
ls: ds: No such file or directory
sh-2.05$
The above program appends the user input string onto ls and passes it to system, so system("ls ds;/bin/sh") gives me the shell.
But when I try to do the same with execve from another program (the second program), it doesn't work;
it just says the "ds" directory was not found.
Look at your code very carefully. char *payload is an uninitialized pointer on the stack, and you strcpy to whatever address it happens to hold, so you overwrite other local variables on the stack (or crash). You never allocated memory for this pointer (e.g. with malloc, or by using a local array). If the user input string were longer (say 255 characters) you could get a segmentation fault.
BTW: why not use snprintf instead of strcpy? It is more careful about security, I suppose.
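A hedged sketch of the second program with that bug fixed: the payload now lives in a real array, and the parent waits for the child instead of exiting immediately. The target path ../targets/target1 is taken from the question as-is:
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/wait.h>

int main(int argc, char **argv)
{
    char payload[] = "sd;/bin/sh";      // real storage, no strcpy to a wild pointer
    char *argo[] = { (char*)"../targets/target1", payload, NULL };
    char *envp[] = { NULL };

    pid_t pid = fork();
    if (pid == 0)
    {
        execve(argo[0], argo, envp);
        perror("execve");               // only reached if execve fails
        _exit(1);
    }
    else if (pid == -1)
    {
        perror("fork()");
        return 1;
    }
    waitpid(pid, NULL, 0);              // keep the parent alive while the child runs
    return 0;
}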

Can not print out the argv[] values using std::cout in VC++

This is my first question on the site, even though I have been coming here for reference for quite some time now. I understand that argv[0] stores the name of the program and that the rest of the command-line arguments are stored in the remaining argv[k] slots. I also understand that std::cout treats a character pointer like a null-terminated string and prints the string out. Below is my program.
#include "stdafx.h"
#include <fstream>
#include <iostream>
using namespace std;
int _tmain(int argc, _TCHAR* argv[])
{
cout << argv[0] << " ";
cout << argv[1] ;
return 0;
}
According to all the other programs I have seen in my internet searches on this issue, this program should print out two strings: the name of the program and the command-line argument. Instead, the console window shows
0010418c 001048d6
I believe these are the pointers to argv[0] and argv[1] respectively.
The only command-line argument I have is "nanddumpgood.bin", which goes in argv[1] and shows the correct string if I mouse over the argv[] entries while debugging.
Why is this happening? What am I doing wrong? I understand that arrays decay to pointers in certain cases; is this a case where that doesn't happen?
I also understand that std::cout treats a character pointer like a null terminated string and prints the string out.
That's mostly correct. It works for char*, but not for other character types. Which is exactly the problem: you have a _TCHAR*, which is char* in an ANSI build but wchar_t* in a Unicode build, so instead of getting the special string behavior you get the default pointer behavior.
I understand, arrays decay to pointers in special cases? Is this a case where it doesnt?
argv is an array, but neither argv[0] nor argv[1] are arrays, they are both pointers. Decay is not a factor here.
The simplest fix is to use int main(int argc, char* argv[]) so that you get non-Unicode strings for the command-line arguments. I'm recommending this, rather than switching to wcout, because it's much more compatible with other code you find on the internet.
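A minimal sketch of that fix: a plain narrow-character main, so std::cout receives char* and prints the strings themselves rather than their addresses.
#include <iostream>
using namespace std;

int main(int argc, char* argv[])
{
    cout << argv[0] << " ";
    if (argc > 1)                // guard against running with no arguments
        cout << argv[1];
    cout << endl;
    return 0;
}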
Use wcout for Unicode strings.
I guess you are compiling your application with the Unicode compiler switch, which makes _TCHAR a wchar_t. cout does not treat a wchar_t* as a string, so it prints argv[0] and argv[1] as plain pointer values.
Write instead
wcout << argv[0] << L" ";
wcout << argv[1] ;
or change the Character Set to "Use Multi-Byte Character Set" in Project Settings / General.
