How do I handle this error - HEAP CORRUPTION DETECTED: after Normal block (#136) - visual-c++

#include <iostream>
#include <cstring>
using namespace std;

void main() {
    char *p;
    p = new char[3];
    strcpy_s(p, sizeof(p), "AB");
    cout << p << endl << "input : ";
    cin.get(p, 3);
    delete[] p;
}
Why does an error occur at the 'delete'?
The error does not occur anywhere else.

Based on Igor's and Han's comments, it appears that using sizeof(p) in the strcpy_s call is actually your issue. Since p is a char *, sizeof(p) returns the size of the pointer (usually 4 or 8 depending on the architecture), not the size of the allocated array. For debug builds, the documentation for strcpy_s states:
The debug versions of these functions first fill the buffer with 0xFE. To disable this behavior, use _CrtSetDebugFillThreshold.
So the debug runtime fills sizeof(p) = 4 (or 8) bytes with this pattern, which is larger than the 3-byte array p points to, thus corrupting the heap.
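A minimal corrected sketch, keeping the question's 3-byte buffer: pass the allocation's element count to strcpy_s and cin.get instead of sizeof(p):

#include <iostream>
#include <cstring>
using namespace std;

int main() {
    const size_t n = 3;          // size of the allocation, tracked explicitly
    char *p = new char[n];
    strcpy_s(p, n, "AB");        // "AB" plus the terminating '\0' fills exactly 3 bytes
    cout << p << endl << "input : ";
    cin.get(p, n);               // reads at most n-1 characters plus the terminator
    delete[] p;                  // no heap corruption: nothing wrote past the block
    return 0;
}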

Related

Memory not freed on Mac when vector push_back string

The code is below. I found that when a vector push_backs a string in a Mac demo app, the memory is not freed. I thought the stack variable would be freed when it goes out of function scope; am I wrong? Thanks for any tips.
in model.h:
#pragma once
#include <cstdint>

namespace NS {
const uint8_t kModel[8779041] = {4,0,188,250,....};
}
in ViewController.mm:
- (void)start {
    std::vector<std::string> params = {};
    std::string strModel(reinterpret_cast<const char *>(NS::kModel), sizeof(NS::kModel));
    params.push_back(strModel);
}
The answer to your question depends on your understanding of "free" memory. The behaviour you are observing can be reproduced with as little as a couple of lines of code:
#include <cstdint>

void myFunc() {
    const auto *ptr = new uint8_t[8779041]{};
    delete[] ptr;
}
Let's run this function and see how the memory consumption graph changes:
int main() {
    myFunc();                                 // 1 MB
    std::cout << "Check point" << std::endl;  // 9.4 MB
    return 0;
}
If you put one breakpoint right at the line with the myFunc() invocation and another one at the line with the "Check point" console output, you will witness how the memory consumption of the process jumps by about 8 MB (for my system and machine configuration, Xcode shows a sudden jump from 1 MB to 9.4 MB). But wait, isn't it supposed to be 1 MB again after the function, as the allocated memory is freed at the end of it? Well, not exactly. The system doesn't reclaim this memory right away, because reclaiming is not a cheap operation to begin with, and if your process requests the same amount of memory one CPU cycle later, that work would have been entirely redundant. Thus, the system usually doesn't bother shrinking the memory dedicated to a process until it's needed for another process, or until it runs out of available resources (it can also happen on some kind of fixed timer, but overall I would say this is implementation-defined).

Another common reason the memory is not freed is that you often observe it in debug mode, where memory may remain dedicated to the process in order to track some tricky scenarios (like NSZombie objects, whose addresses need to remain accessible to the process so it can report use-after-free occasions).
The most important point here is that, internally, the process can differentiate between "deleted" and "occupied" memory pages, so it can re-occupy memory which has already been deleted. As a result, no matter how many times you call the same function, the memory dedicated to the process remains the same:
int main() {
    myFunc();                                   // 1 MB
    std::cout << "Check point" << std::endl;    // 9.4 MB
    for (int i = 0; i < 10000; ++i) {
        myFunc();
    }
    std::cout << "Another point" << std::endl;  // 9.4 MB
    return 0;
}
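One way to see this re-occupation directly (a sketch, assuming typical allocator behavior rather than anything the C++ standard guarantees) is to print the address of each large allocation; after the first delete[], subsequent allocations of the same size usually land on the very same pages:

#include <cstdint>
#include <iostream>

int main() {
    void *first = nullptr;
    for (int i = 0; i < 3; ++i) {
        auto *ptr = new uint8_t[8779041]{};
        if (first == nullptr) first = ptr;
        std::cout << "allocation " << i << " at " << static_cast<void *>(ptr)
                  << (i > 0 && ptr == first ? " (reused)" : "") << std::endl;
        delete[] ptr;  // the pages stay with the process and get re-occupied
    }
    return 0;
}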

How to make an array of pointers and let the user enter its size?

I want to make an array with pointers inside it, like this:
int *arrp[size];
and I want the user to enter its size.
I tried to do this:
#include <iostream>
using namespace std;

int main()
{
    int size;
    cout << "Enter the size of the array of pointers" << endl;
    cin >> size;
    int *arrp[size];
    return 0;
}
but this doesn't work.
I also tried to do this:
#include <iostream>
using namespace std;

int main()
{
    int size;
    cout << "Enter the size of the array of pointers" << endl;
    cin >> size;
    int* arrp[] = new int[size];
    return 0;
}
also doesn't work, can someone help?
The error in the first snippet is that the size must be a constant. I tried to fix that by writing the second snippet, but it gives an error for the word "new" on line 9:
E0520 initialization with '{...}' expected for aggregate object
and another error for the size on the same line:
C2440 'initializing': cannot convert from 'int *' to 'int *[]'
To make an array of pointers you should type: int** arr = new int*[size];
We write two stars '*': int* is a pointer to an integer, and the second star makes a pointer to that pointer. Then we reserve a place in memory for those pointers by writing = new int*[size]. You can use this as a 2D array that is stored on the heap (not the stack); see this website for the difference: https://www.geeksforgeeks.org/stack-vs-heap-memory-allocation/.
To learn more about how to use an array of pointers to pointers to integers, you can watch this video: https://www.youtube.com/watch?v=gNgUMA_Ur0U&ab_channel=TheCherno.
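Putting the answer together, a minimal sketch (the row length of 4 ints is just an assumption for illustration):

#include <iostream>

int main()
{
    int size;
    std::cout << "Enter the size of the array of pointers" << std::endl;
    std::cin >> size;

    int **arrp = new int*[size];  // an array of 'size' pointers to int
    for (int i = 0; i < size; ++i)
        arrp[i] = new int[4]{};   // give each pointer its own row of 4 ints

    arrp[0][0] = 42;              // use it like a 2D array
    std::cout << arrp[0][0] << std::endl;

    for (int i = 0; i < size; ++i)
        delete[] arrp[i];         // free every row first
    delete[] arrp;                // then the array of pointers itself
    return 0;
}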

Why is the protocol buffer C++ library not reading binary objects properly?

I created a binary file with a C++ program using protocol buffers. I had issues reading the binary file in my C# program, so I decided to write a small C++ program to test the reading.
My proto file is as follows
message TradeMessage {
    required double timestamp = 1;
    required string ric_code = 2;
    required double price = 3;
    required int64 size = 4;
    required int64 AccumulatedVolume = 5;
}
When writing to the protocol buffer stream, I first write the object type, then the object length, and then the object itself.
coded_output->WriteLittleEndian32((int) ObjectType_Trade);
coded_output->WriteLittleEndian32(trade.ByteSize());
trade.SerializeToCodedStream(coded_output);
Now, when I try to read the same file in my C++ program, I see strange behavior.
My reading code is as follows:
coded_input->ReadLittleEndian32(&objtype);
coded_input->ReadLittleEndian32(&objlen);
tMsg.ParseFromCodedStream(coded_input);
cout << "Expected Size = " << objlen << endl;
cout<<" Trade message received for: "<< tMsg.ric_code() << endl;
cout << "TradeMessage Size = " << tMsg.ByteSize() << endl;
In this case, I get the following output:
Expected Size = 33
Trade message received for: .CSAP0104
TradeMessage Size = 42
When I write to the file, I write trade.ByteSize() as 33 bytes, but when I read the same object back, its ByteSize() is 42 bytes, which throws off the rest of the data. I am not sure what is wrong here. Please advise.
Regards,
Alok
This is guesswork based on the above: when you use ParseFromCodedStream, you aren't actually limiting it to the objlen that you previously read; thus, if the stream contains any more data than that (i.e. this isn't the end of the file), the engine will simply keep reading to EOF. You must cap the length to your expectation. I am not a C++ expert, so I can't offer direct guidance, but if this were C# (using protobuf-net):
objType = ProtoReader.DirectReadLittleEndianInt32(file);
len = ProtoReader.DirectReadLittleEndianInt32(file);
// assume GetObjectType returns typeof(TradeMessage) for our objType
Type type = GetObjectType(objType);
msg = RuntimeTypeModel.Default.Deserialize(file, null, type, len, null);
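In protobuf's C++ API, the same capping is available through CodedInputStream's PushLimit/PopLimit; a sketch of how the question's reading code could apply it:

coded_input->ReadLittleEndian32(&objtype);
coded_input->ReadLittleEndian32(&objlen);

// Cap parsing at objlen bytes so ParseFromCodedStream stops at the
// message boundary instead of consuming the rest of the stream.
google::protobuf::io::CodedInputStream::Limit limit = coded_input->PushLimit(objlen);
tMsg.ParseFromCodedStream(coded_input);
coded_input->PopLimit(limit);  // restore the outer limit for the next record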
So apparently, I was making a very silly mistake while creating the binary files: I did not open the file in binary mode when I wrote the protobuf data to it, which caused weird ASCII characters to be added in the middle. This was what broke reading the data with the protobuf-net library. The issue is resolved now. Shouldn't have taken so long to resolve this.
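For reference, a sketch of the corrected writer setup (the file name and the stream wiring around the question's coded_output are assumptions, not the original code); the std::ios::binary flag is the crucial part:

#include <fstream>
#include <google/protobuf/io/coded_stream.h>
#include <google/protobuf/io/zero_copy_stream_impl.h>

int main() {
    // Without std::ios::binary, Windows text mode rewrites any '\n' byte
    // inside the serialized protobuf data as "\r\n", corrupting it.
    std::ofstream file("trades.bin", std::ios::out | std::ios::binary);
    google::protobuf::io::OstreamOutputStream raw_output(&file);
    google::protobuf::io::CodedOutputStream coded_output(&raw_output);
    // ... WriteLittleEndian32 / SerializeToCodedStream as in the question ...
    return 0;
}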

Debug and release modes giving different outputs

I have a function in my program that outputs a data structure consisting of three doubles, in two formats: one text and one binary.
When I run the program in debug and release modes, I end up with different binary outputs but identical text outputs. What is going on?
Here is the binary output code:
void outputPoints(xyz* points, string description, int length, param parameters)
{
    stringstream index;
    index.str("");
    index << setw(3) << setfill('0') << parameters.stage;
    string outputName = parameters.baseFileName + " " + index.str() + " " + description + ".bin"; // create the file name
    ofstream output; // create the output object
    cout << "Output " << outputName.c_str() << "...";
    output.open(outputName.c_str(), std::ios::binary | std::ios::out); // open or create the file for output
    output.write(reinterpret_cast<char*>(points), sizeof(xyz) * length);
    output.close(); // close the output object
    cout << "done" << endl;
}
The debug build usually initializes memory with fill patterns: freshly allocated heap data has the content CDCD, and deleted objects are overwritten with FEEE. The CDCD pattern is overwritten once you initialize your variables. The release build doesn't initialize memory with these patterns.

It's worth checking your program for uninitialized variables. You can define a Dump function that just prints the first few bytes of the suspected variables, as sketched below.
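Such a Dump helper could be as simple as this (a sketch; the name and signature are made up for illustration):

#include <cstddef>
#include <cstdio>

// Print the first n bytes of an object in hex; runs of CD (allocated but
// never initialized) or FE/EE (freed) patterns point at the culprit.
void Dump(const void *obj, std::size_t n) {
    const unsigned char *bytes = static_cast<const unsigned char *>(obj);
    for (std::size_t i = 0; i < n; ++i)
        std::printf("%02X ", bytes[i]);
    std::printf("\n");
}

// Usage: Dump(&point, sizeof(point));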
I don't know whether you already got a solution for your issue, and I did not look at your code, but I had the same problem because I was adding an unsigned char to an unsigned short and saving the result into an unsigned short. I changed all the variables to unsigned short and the issue was solved.

VC++ read variable length char*

I'm trying to read a variable-length char* from user input, and I want to be able to specify the length of the string to read when the function is called:
char *get_char(char *message, unsigned int size) {
    bool correct = false;
    char *value = (char*)calloc(size + 1, sizeof(char));
    cout << message;
    while (!correct) {
        int control = scanf_s("%s", value);
        if (control == 1)
            correct = true;
        else
            cout << "Enter a correct value!" << endl
                 << message;
        while (cin.get() != '\n');
    }
    return value;
}
So, upon running the program and trying to enter a string, I get a memory access violation, so I figured something went wrong when accessing the allocated space. My first idea was that it failed because the size of the scanned char* is not specified within scanf_s(), but it doesn't work with strings of a correct length either. Even if I give calloc a size of 1000 and try to enter a single character, the program crashes.
What did I do wrong?
You have to specify the size of value to scanf_s:
int control = scanf_s("%s", value, size);
does the trick.
See the documentation of scanf_s for an example of how to use the function:
Unlike scanf and wscanf, scanf_s and wscanf_s require the buffer size to be specified for all input parameters of type c, C, s, S, or [. The buffer size is passed as an additional parameter immediately following the pointer to the buffer or variable.
I omit the rest of the MSDN description here because in the example they provide, they use scanf instead of scanf_s, which is quite irritating...
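A quick usage sketch of the fixed function (the prompt text and the length of 10 are arbitrary):

int main() {
    // With the scanf_s fix applied, this reads one whitespace-delimited
    // word into the 10+1 byte buffer allocated by get_char.
    char *name = get_char("Enter your name: ", 10);
    cout << "You entered: " << name << endl;
    free(name);  // calloc'd memory must be freed by the caller
    return 0;
}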
