I can't write into the Control register for the parallel port on Windows 7 64-bit, why?

I am using inpout32.dll to work with the parallel port on my PC. I find that I can change the value of the control register (0x37A) on Windows 7 32-bit, but I can't on 64-bit.
Does anyone know the reason?
The home page of the DLL is http://www.highrez.co.uk/
I have pasted the source code for inpoutx64.sys below. It's pretty simple, it just calls the WRITE_PORT_UCHAR kernel API. Are there any differences between the 64-bit and 32-bit versions of this function?
NTSTATUS hwinterfaceDeviceControl(IN PDEVICE_OBJECT DeviceObject, IN PIRP pIrp)
{
    PIO_STACK_LOCATION stkloc;
    NTSTATUS ntStatus = STATUS_SUCCESS;
    struct tagPhys32Struct Phys32Struct;
    PUCHAR cData;
    PUSHORT sData;
    PULONG lData;
    PUSHORT address;
    ULONG inBuffersize;
    ULONG outBuffersize;
    ULONG inBuf;
    PVOID CtrlBuff;

    stkloc = IoGetCurrentIrpStackLocation( pIrp );
    inBuffersize = stkloc->Parameters.DeviceIoControl.InputBufferLength;
    outBuffersize = stkloc->Parameters.DeviceIoControl.OutputBufferLength;
    CtrlBuff = pIrp->AssociatedIrp.SystemBuffer;

    cData = (PUCHAR) CtrlBuff;
    sData = (PUSHORT) CtrlBuff;
    lData = (PULONG) CtrlBuff;
    address = (PUSHORT) CtrlBuff;

    switch ( stkloc->Parameters.DeviceIoControl.IoControlCode )
    {
    case IOCTL_READ_PORT_UCHAR:
        if ((inBuffersize >= 2) && (outBuffersize >= 1))
        {
            UCHAR value;
            value = READ_PORT_UCHAR((PUCHAR)address[0]);
            cData[0] = value;
        }
        else
        {
            ntStatus = STATUS_BUFFER_TOO_SMALL;
        }
        pIrp->IoStatus.Information = sizeof(UCHAR);
        ntStatus = STATUS_SUCCESS;
        break;

    case IOCTL_WRITE_PORT_UCHAR:
        if (inBuffersize >= 3)
        {
            WRITE_PORT_UCHAR((PUCHAR)address[0], cData[2]); // Byte 0,1 = Address, Byte 2 = Value
            pIrp->IoStatus.Information = 10;
        }
        else
        {
            ntStatus = STATUS_BUFFER_TOO_SMALL;
            pIrp->IoStatus.Information = 0;
            ntStatus = STATUS_SUCCESS;
        }
        break;
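
For illustration, here is a minimal user-mode sketch of how the driver gets exercised through the DLL. This is only a sketch: it assumes the Inp32/Out32 exports documented on the highrez page, and uses the 0x37A control register address from above.

// Hedged user-mode sketch: load the DLL, write the control register, read it back.
// Assumes the Inp32/Out32 exports described on highrez.co.uk.
#include <windows.h>
#include <stdio.h>

typedef short (__stdcall *Inp32Fn)(short portAddress);
typedef void  (__stdcall *Out32Fn)(short portAddress, short data);

int main()
{
    HMODULE hDll = LoadLibrary(TEXT("inpoutx64.dll")); // inpout32.dll on a 32-bit build
    if (hDll == NULL)
    {
        printf("LoadLibrary failed: %lu\n", GetLastError());
        return 1;
    }
    Inp32Fn Inp32 = (Inp32Fn)GetProcAddress(hDll, "Inp32");
    Out32Fn Out32 = (Out32Fn)GetProcAddress(hDll, "Out32");
    if (Inp32 == NULL || Out32 == NULL)
    {
        printf("Inp32/Out32 exports not found\n");
        return 1;
    }
    Out32(0x37A, 0x04);                                             // write the control register
    printf("control register reads back as 0x%02X\n", Inp32(0x37A) & 0xFF);
    FreeLibrary(hDll);
    return 0;
}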

I assume you have some knowledge of hardware architecture. The 64-bit architecture is quite different, and changing register values can be blocked at the hardware level or the software level.
Try to share what you actually want to do; your question doesn't make clear what your main goal is.

Related

Using "Microsoft Windows Security Auditing" provider in real-time consumer with ETW (Event Tracing for Windows)

My task is to make an ETW real-time consumer with events provided by 'Microsoft Windows Security Auditing'.
I made a simple controller and consumer application, based on this example http://msdn.microsoft.com/en-us/library/windows/desktop/ee441325%28v=vs.85%29.aspx
and changed the flags to work in real-time mode.
The main function looks like this:
LPTSTR SessionName = L"hahahaaa";
ULONG status = ERROR_SUCCESS;
PEVENT_TRACE_PROPERTIES pSessionProperties = NULL;
EVENT_TRACE_LOGFILE trace;
TRACEHANDLE hTrace = 0;
TRACEHANDLE hSession = 0;
const GUID providerId = { 0x54849625, 0x5478, 0x4994, { 0xA5, 0xBA, 0x3E, 0x3B, 0x03, 0x28, 0xC3, 0x0D } };
//const GUID providerId = { 0xA68CA8B7, 0x004F, 0xD7B6, { 0xA6, 0x98, 0x07, 0xE2, 0xDE, 0x0F, 0x1F, 0x5D } };
HANDLE hToken = NULL;
HANDLE hProcess = NULL;
hProcess = GetCurrentProcess();
if (OpenProcessToken(hProcess, TOKEN_ADJUST_PRIVILEGES, &hToken) == FALSE) {
printf("Error: Couldn't open the process token\n");
goto cleanup;
}
if(!SetPrivilege(hToken, SE_SECURITY_NAME, TRUE)) goto cleanup;
if (!pSessionProperties) {
const size_t buffSize = sizeof(EVENT_TRACE_PROPERTIES)+(_tcslen(SessionName) + 1) * sizeof(TCHAR);
pSessionProperties = reinterpret_cast<EVENT_TRACE_PROPERTIES *>(malloc(buffSize));
ZeroMemory(pSessionProperties, buffSize);
pSessionProperties->Wnode.BufferSize = buffSize;
pSessionProperties->Wnode.ClientContext = 1;
pSessionProperties->Wnode.Flags = WNODE_FLAG_TRACED_GUID;
pSessionProperties->LogFileMode = EVENT_TRACE_REAL_TIME_MODE;
pSessionProperties->LoggerNameOffset = sizeof(EVENT_TRACE_PROPERTIES);
}
// Create the trace session.
status = StartTrace(&hSession, SessionName, pSessionProperties);
if (ERROR_SUCCESS != status) {
wprintf(L"StartTrace() failed with %lu\n", status);
goto cleanup;
}
status = EnableTraceEx2(hSession, &providerId, EVENT_CONTROL_CODE_ENABLE_PROVIDER, TRACE_LEVEL_VERBOSE, 0, 0, 0, NULL);
if (ERROR_SUCCESS != status) {
wprintf(L"EnableTrace() failed with %lu\n", status);
goto cleanup;
}
ZeroMemory(&trace, sizeof(EVENT_TRACE_LOGFILE));
trace.LogFileName = NULL;
trace.LoggerName = SessionName;
trace.CurrentTime = 0;
trace.BuffersRead = 0;
trace.BufferSize = 0;
trace.Filled = 0;
trace.EventsLost = 0;
trace.Context = NULL;
trace.ProcessTraceMode = PROCESS_TRACE_MODE_REAL_TIME | PROCESS_TRACE_MODE_EVENT_RECORD;
trace.EventRecordCallback = (PEVENT_RECORD_CALLBACK)(ProcessEvent);
hTrace = OpenTrace(&trace);
if (INVALID_PROCESSTRACE_HANDLE == hTrace)
{
wprintf(L"OpenTrace failed with %lu\n", GetLastError());
goto cleanup;
}
status = ProcessTrace(&hTrace, 1, 0, 0);
if (status != ERROR_SUCCESS && status != ERROR_CANCELLED)
{
wprintf(L"ProcessTrace failed with %lu\n", status);
goto cleanup;
}
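The ProcessEvent callback registered above is not shown here; a minimal sketch of what it might look like, assuming nothing beyond the standard PEVENT_RECORD_CALLBACK signature, just dumps a few header fields:

#include <windows.h>
#include <evntrace.h>
#include <evntcons.h>
#include <stdio.h>

// Hypothetical minimal callback: print the event ID, logging process ID and
// timestamp for every record ProcessTrace() delivers.
VOID WINAPI ProcessEvent(PEVENT_RECORD pEvent)
{
    wprintf(L"Event Id=%hu Pid=%lu Time=%lld\n",
        pEvent->EventHeader.EventDescriptor.Id,
        pEvent->EventHeader.ProcessId,
        pEvent->EventHeader.TimeStamp.QuadPart);
}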
At the ProcessTrace() call the application should wait for incoming events and write their metadata to stdout, but it simply doesn't. All the events I generate (e.g. I turn on Detailed Tracking - Process Creation and run an application) show up in Event Viewer, but my program shows nothing.
I thought it might be a privileges problem, so following this example http://msdn.microsoft.com/en-us/library/windows/desktop/aa446619%28v=vs.85%29.aspx I set the SE_SECURITY_NAME privilege and, of course, ran the application as Administrator. But nothing changed.
Another attempt was the session name. Maybe it is the same problem as with 'Windows Kernel Trace', which can log only to the system session 'NT Kernel Logger'. The only thing I found was that 'Microsoft Windows Security Auditing' is correlated with the 'Eventlog-Security' session, but when I set the session name to this I received an 'Access Denied' error. I don't know which extra privilege I would need to handle this.
My last attempt was to use 'logman' and collect events into a file, but the result was the same. When I set the session name to 'Eventlog-Security' I received 'Access Denied'. On the other hand, when I set it to something else I only received one event, provided by 'MSNT_SystemTrace', which is the abstract class for other events.
If I change the provider to e.g. 'Microsoft Windows Kernel General' (the commented-out GUID) and generate an event (update the system clock), everything works as it should, both in my application and using 'logman'.
I work on Windows 7 Professional x64 with Visual Studio Ultimate 2013.
My question is, what can I do to receive events from 'Microsoft Windows Security Auditing' provider?
Thanks for any help!
EDIT
As I wrote in a comment, if SessionName is set to Eventlog-Security, the application is reduced to just OpenTrace() and ProcessTrace().
EDIT 2
As Luke suggested in a comment, I ran my application with LocalSystem privileges, and everything started working.
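(One common way to try that, not stated in the thread itself: start a shell as LocalSystem with Sysinternals PsExec, e.g. psexec -s -i cmd.exe, and launch the consumer from it; installing the consumer as a service that runs as LocalSystem should work as well.)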

wxDirDialog Returns the Wrong Directory on Vista

I recently ported the following code to wx 3.0 under Visual Studio 2013:
void PanelFileControl::on_retrieve_clicked(wxCommandEvent &event)
{
    if(!chosen_files.empty())
    {
        Csi::ModalCounter counter;
        wxDirDialog query(
            this,
            make_wxString(my_strings[strid_choose_retrieve_dir]),
            make_wxString(wxGetApp().get_config()->get_last_prog_dir()));
        int rcd;
        query.CentreOnParent();
        rcd = query.ShowModal();
        if(rcd == wxID_OK)
        {
            // we need to generate an operation for every selected file.
            StrAsc path(make_StrAsc(query.GetPath()));
            DlgFileControl::operations_type operations;
            if(path.last() != Csi::FileSystemObject::dir_separator())
                path.append(Csi::FileSystemObject::dir_separator());
            for(files_type::iterator fi = chosen_files.begin(); fi != chosen_files.end(); ++fi)
            {
                file_type const &file(*fi);
                StrAsc file_path(path + file.get_file_name());
                bool use_file(true);
                if(Csi::file_exists(file_path.c_str()))
                {
                    OwxStringStream message;
                    message << boost::format(my_strings[strid_overwrite_file_confirm].c_str()) %
                        file_path;
                    wxMessageDialog overwrite_query(
                        this,
                        message.str(),
                        wxT(""),
                        wxYES_NO | wxNO_DEFAULT | wxICON_QUESTION);
                    int rcd;
                    overwrite_query.CentreOnParent();
                    rcd = overwrite_query.ShowModal();
                    if(rcd != wxID_YES)
                        use_file = false;
                }
                if(use_file)
                    operations.push_back(new ReceiveFileOperation(file, file_path));
            }
            // we can now display the operation dialogue
            if(!operations.empty())
            {
                DlgFileControl dialogue(this, device_name, operations);
                dialogue.show_modal();
            }
        }
    }
} // on_retrieve_clicked
Following this change (which didn't require any changes to the code above), I have received complaints that, if the user selects the desktop and then double-clicks a directory on the desktop, the file save operation fails. This appears to be caused by the path produced by wxDirDialog::GetPath() and has only been seen under Windows Vista. I have done some testing and found that, under Vista, the last path component appears twice in the string returned by GetPath().
Has anyone seen this issue? Are there any workarounds?
I found that I can address the issue by preventing wxDirDialog from using the IFileDialog interface. My ShowModal() method now looks like this:
int wxDirDialog::ShowModal()
{
    WX_HOOK_MODAL_DIALOG();

    wxWindow* const parent = GetParent();
    WXHWND hWndParent = parent ? GetHwndOf(parent) : NULL;

    // Use IFileDialog under new enough Windows, it's more user-friendly.
    int rc;
#if wxUSE_IFILEDIALOG
    if ( wxGetWinVersion() > wxWinVersion_Vista )
    {
        rc = ShowIFileDialog(hWndParent);
    }
    else
    {
        rc = wxID_NONE;
    }

    if ( rc == wxID_NONE )
#endif // wxUSE_IFILEDIALOG
    {
        rc = ShowSHBrowseForFolder(hWndParent);
    }

    // change current working directory if asked so
    if ( rc == wxID_OK && HasFlag(wxDD_CHANGE_DIR) )
        wxSetWorkingDirectory(m_path);

    return rc;
}
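
As an alternative workaround (a sketch only, not part of the fix above): one could keep the stock dialog and sanity-check the string GetPath() returns, dropping a duplicated trailing component before using it. Something like this, using wxFileName:

#include <wx/filename.h>

// Hypothetical helper: if Vista's dialog reports the last directory twice
// (e.g. "C:\Users\me\Desktop\Foo\Foo"), drop the duplicate component.
wxString NormaliseDirDialogPath(const wxString &path)
{
    wxFileName fn = wxFileName::DirName(path);
    const wxArrayString &dirs = fn.GetDirs();
    const size_t count = dirs.GetCount();
    if (count >= 2 && dirs[count - 1] == dirs[count - 2])
        fn.RemoveLastDir();
    return fn.GetPath();
}

This only papers over the symptom (and would also strip a genuinely nested directory with the same name), so the ShowModal() change is probably the safer route.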

Windows + SetCommState how to set RTS?

I have an application that has been ported up from VC++ 6, where it worked fine. The code in question uses the Win API for a serial device driver. When ported to VS2012, the same code behaves rather differently.
It used to be: create a DCB, call SetCommState and go. CTS was set high, RTS was set high and you were on your way.
Since porting up to VS2012 Pro MFC, I am finding that it sets up SetCommState the same way with or without hardware flow control:
memset(&dcb, 0x00, sizeof(dcb));
dcb.DCBlength = sizeof(DCB);
// Must be TRUE, only binary mode in Windows
dcb.fBinary = TRUE;
dcb.fParity = FALSE;
// XOn/XOff disabled
dcb.fTXContinueOnXoff = TRUE;
dcb.fOutX = FALSE;
dcb.fInX = FALSE;
dcb.XonLim = 0;
dcb.XoffLim = 0;
dcb.XonChar = 0;
dcb.XoffChar = 0;
// Misc Stuff
dcb.EofChar = 0;
dcb.EvtChar = 0;
dcb.ErrorChar = 0;
dcb.fErrorChar = FALSE;
dcb.fNull = FALSE;
dcb.fAbortOnError = FALSE;
// 8N1 Setup
dcb.ByteSize = 8;
dcb.Parity = NOPARITY;
dcb.StopBits = ONESTOPBIT;
// Baud Rate
if (dwBaudRate == BAUD_115200)
{
dcb.BaudRate = CBR_115200;
}
else
{
dcb.BaudRate = CBR_38400;
}
// setup hardware flow control
if (bHardware == eYesHardwareFlowControl)
{
// ================ FLOW CONTROL ON ================
switch (bIgnoreCTS)
{
case eIgnoreCTS:
dcb.fOutxCtsFlow = FALSE;
break;
case eEnableCTS:
dcb.fOutxCtsFlow = TRUE;
break;
default:
case eCTSDecideLater:
dcb.fOutxCtsFlow = TRUE;
break;
}
// DSR Flow Control
dcb.fDsrSensitivity = FALSE;
dcb.fOutxDsrFlow = FALSE;
// <<Hardware flow control On(TRUE) Off(FALSE)>>
dcb.fDtrControl = DTR_CONTROL_ENABLE;
// <<Hardware flow control On(_HANDSHAKE) Off(_ENBLE)>>
dcb.fRtsControl = RTS_CONTROL_HANDSHAKE;
}
else
{
// ================ FLOW CONTROL OFF ================
switch (bIgnoreCTS)
{
case eIgnoreCTS:
dcb.fOutxCtsFlow = FALSE;
break;
case eEnableCTS:
dcb.fOutxCtsFlow = TRUE;
break;
default:
case eCTSDecideLater:
dcb.fOutxCtsFlow = FALSE;
break;
}
// DSR Flow Control
dcb.fDsrSensitivity = FALSE;
dcb.fOutxDsrFlow = FALSE;
dcb.fDtrControl = DTR_CONTROL_ENABLE;
dcb.fRtsControl = RTS_CONTROL_ENABLE;
}
if (SetCommState(m_hIdComDev, &dcb) == WINDOWS_API_ZERO_IS_BAD)
{
dwLastError = GetLastError();
}
At this point, I have cleared the DCB and set it up. I don't read in the previous state, as I don't want to trust any garbage left by whoever previously used the port. Then I set up every field, with baud rate, flow control and CTS ignore being the only optional items.
So, what I've noticed is that I can create a situation where the device and the PC don't communicate. Now, mind you, they always did before, and they always work with HyperTerminal, ProComm, TeraTerm and so on. What I can see is that when these comm programs start (and the old VC++ 6 app), as soon as the device is created and set up, RTS is immediately set high.
Now, in my app, once the DCB is set up and SetCommState is called, RTS is always LOW. And when this happens, communication is toast.
I want to FORCE RTS high and thought I could do it like this:
if (EscapeCommFunction(m_hIdComDev, CLRRTS) == WINDOWS_API_ZERO_IS_BAD)
{
dwLastError = GetLastError();
}
if (EscapeCommFunction(m_hIdComDev, SETRTS) == WINDOWS_API_ZERO_IS_BAD)
{
dwLastError = GetLastError();
}
But this fails with error 87 (invalid parameter), and I cannot quite figure out why. Even if I only try to set RTS high with SETRTS, it fails.
Any ideas on how I can force Windows to set RTS high after I set up the comm parameters in the DCB?
Well, it turned out to be a device problem and not a Windows problem at all.
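
For what it's worth (a hedged note, not part of the original resolution): error 87 from EscapeCommFunction(SETRTS) is also what the DCB documentation leads you to expect when fRtsControl is RTS_CONTROL_HANDSHAKE, because the driver then owns the RTS line. If you really want to drive RTS by hand, the line has to be left under application control, roughly:

// Sketch: let the application, not the driver, own RTS, then toggle it.
DCB dcb = { 0 };
dcb.DCBlength = sizeof(DCB);
if (GetCommState(m_hIdComDev, &dcb))
{
    dcb.fRtsControl = RTS_CONTROL_ENABLE;        // not RTS_CONTROL_HANDSHAKE
    if (SetCommState(m_hIdComDev, &dcb))
    {
        EscapeCommFunction(m_hIdComDev, SETRTS); // raise RTS
        // ... communicate ...
        EscapeCommFunction(m_hIdComDev, CLRRTS); // drop RTS
    }
}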

How to send data from barcode scanner MT2070

I've got problems with the MT2070 barcode scanner from Motorola. I use the EMDK 2.6 for .NET (Update 2) to create strings from the scanned barcode and then transmit them to the host PC. But the transmit fails.
The MT2070 runs Windows CE 5.0 and is connected over Bluetooth to the STB2078 cradle. But every time I get "send failed" and the ResultCode is "E_INCORRECT_MODE".
The problem is that I don't understand what they mean by "INCORRECT_MODE"; I set the mode to DECODE. And in RawData, what is meant by "source"?
ScannerServicesClient scannerServices;
scannerServices = new ScannerServicesClient();
SCANNERSVC_MODE mode;
if(scannerServices.Connect(true))
{
Logger("start service with decode rights"); // primitiv method to see what happen
scannerServices.GetMode(out mode);
if (mode != SCANNERSVC_MODE.SVC_MODE_DECODE)
{
mode = SCANNERSVC_MODE.SVC_MODE_DECODE;
if (scannerServices.SetMode(mode) != RESULTCODE.E_OK)
{
Logger("cant set mode: " + mode.ToString());
}
}
// wanna know which connection is use
string connection = "";
switch (scannerServices.HostParameters.CurrentConnection)
{
case SCANNERSVC_DATA_CONNECTION.NO_CONNECTION:
connection = "Not connected";
break;
case SCANNERSVC_DATA_CONNECTION.BLUETOOTH:
connection = scannerServices.HostParameters.BluetoothConnection.ToString();
break;
case SCANNERSVC_DATA_CONNECTION.RS232:
connection = scannerServices.HostParameters.RS232Connection.ToString();
break;
case SCANNERSVC_DATA_CONNECTION.USB_CABLE:
connection = scannerServices.HostParameters.USBConnection.ToString();
break;
}
Logger(connection);
ScannerHostParameters scnHost = new ScannerHostParameters(scannerServices);
//example hello
string input = "hello"; //what should send
byte[] output = new byte[input.Length]; //field with converted data
byte source = 0; //<-- what mean source? i sum all byte-value but this cant be correct
for (int i = 0; i < input.Length; ++i)
{
output[i] = Convert.ToByte(input[i]);
source += output[i];
}
RawData rawData = new RawData(output, input.Length, source);
//RawParameters rawParam = new RawParameters();
//rawParam.BaudRate = RawParameters.RawBaudRates.RAWSERIAL_9600;
//rawParam.Type = RawParameters.RawHostType.Auto;
RESULTCODE result = scannerServices.SendRawData(rawData, 2000);
if(result == RESULTCODE.E_OK)
{
Logger("successful send");
}
else
{
Logger("Send failed: " + result.ToString());
}
Logger("ScannerService kill");
scannerServices.Disconnect();
}
Logger("\n");
scannerServices.Dispose();
scannerServices = null;
Thanks for your help! (and sorry for my english)
At some point (somewhere where you're setting the mode - I do it right after setting the mode) you'll want to do this:
//set raw mode
if (RESULTCODE.E_OK != scannerServices.SetAttributeByte((ushort)ATTRIBUTE_NUMBER.ATT_MIA_HOSTNUM, (byte)ENUM_HOSTS.HOST_RAW))
{
    // clean up the service connection, then report the failure
    scannerServices.Disconnect();
    scannerServices.Dispose();
    throw new Exception("Can't set RAW mode");
}
Where you have:
RawData rawData = new RawData(output, input.Length, source);
you can leave source as 0:
RawData rawData = new RawData(output, input.Length, 0);
Unfortunately I'm not the greatest when it comes to programming, so I've only managed to stumble my way through getting my scanner to work. The documentation isn't great; in fact I find it severely lacking. Even the people at Motorola don't seem to know much about it or how to program it. I've been given misinformation by them on at least one point.
I use the CDC COM Port Emulation mode for the scanner so that it shows up under Ports in Device Manager (I need the scanner to work with an old program we have which uses COM ports). A driver is also needed for this.
Depending on how you're using the scanner, the above may or may not work.

HEAP CORRUPTION DETECTED after upgrading the project from VC6 to VC9

The function, which was written in VC6:
bool CProductionTestDlg::GetVariables(CString strFilename, CMapStringToOb *cVariableMap)
{
    int iMaxEntryLen = 1000;
    //char rgbEntryNames[1000]; //previous
    char *rgbEntryNames = (char*)malloc(iMaxEntryLen * sizeof(int)); //Now
    CString strEntryName = "";
    CString strEntryValue = "";
    UINT uiSeperator = 0;
    ULONG dwRetCode, dwSizeOfReturn;

    dwSizeOfReturn = GetPrivateProfileString(cszVariables,
                                             NULL,
                                             "",
                                             rgbEntryNames,
                                             iMaxEntryLen,
                                             strFilename);
    while ( uiSeperator < dwSizeOfReturn )
    {
        strEntryName.Format("%s", &rgbEntryNames[uiSeperator]);
        uiSeperator += strEntryName.GetLength() + 1;

        CString *strValue = new CString();
        dwRetCode = GetPrivateProfileString(cszVariables,
                                            strEntryName,
                                            "",
                                            strEntryValue.GetBufferSetLength(strEntryValue.GetLength()),
                                            iMaxEntryLen,
                                            strFilename);
        strValue->Format("%s", strEntryValue);
        cVariableMap->SetAt(strEntryName, (CObject*)strValue);
    }
    return true;
}
Now I have upgraded it to VS2008 (VC9). The project builds correctly, but when I run the exe it throws an exception:
HEAP CORRUPTION DETECTED: CRT detected that the application wrote to memory after end of heap buffer.
When I debug the application, control goes to dbgheap.c at line 2103, right after return true.
The problem is here:
dwRetCode = GetPrivateProfileString(cszVariables,
strEntryName,
"",
strEntryValue.GetBufferSetLength(strEntryValue.GetLength()),
iMaxEntryLen,
strFilename);
You pass a buffer of size 0 (strEntryValue is initialized to ""), but claim its size is iMaxEntryLen. So GetPrivateProfileString thinks it has a much larger buffer than it actually got, and writes beyond its bounds.
The reason you get this error after upgrading is, I guess, improved bounds validation. The bug was there in VC6 as well; it just wasn't detected.
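A minimal sketch of one possible fix (assuming iMaxEntryLen characters is enough for any value; this is not taken from the original code): reserve a buffer of the size you report, and release it afterwards so the CString picks up the returned length.

// Reserve iMaxEntryLen characters, let the API fill them, then release the
// buffer so strEntryValue's length is recalculated from the terminator.
dwRetCode = GetPrivateProfileString(cszVariables,
                                    strEntryName,
                                    "",
                                    strEntryValue.GetBufferSetLength(iMaxEntryLen),
                                    iMaxEntryLen,
                                    strFilename);
strEntryValue.ReleaseBuffer();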
