I'm integrating plCrashReporter into one of my apps to add crash reporting functionality. Essentially, if I detect a crash I gather the crash report as NSData...
NSData *crashData;
NSError *error;
crashData = [crashReporter loadPendingCrashReportDataAndReturnError: &error];
crashData now contains the entire report. I can load this crashData into a PLCrashReport object and read out its parameters, but I'd rather just send the whole blob to my servers and look at it there. When the data reaches me, it looks something like this:
706c6372 61736801 0a110801 1205342e 322e3118 02209184 82e80412
1b0a1263 6f6d2e73 6d756c65 2e545061 696e4465 76120531 2e362e32
1adb0208 00120618 d4a5f59d 03120618 bda5f59d 03120418 b5b96c12
0618df95 b09d0312 0618938b 9f9a0312 0618f9bb f68d0312 0618cdbc
f68d0312
I haven't been able to get anything meaningful out of this. I tried using plcrashutil, but with no luck...
./plcrashutil convert --format=iphone example.plcrash
Could not decode crash log: Could not decode invalid crash log header
I also tried using Google's protobuf but was unable to get it running.
I do have a dSYM file but am not even at the point of trying to symbolicate this yet.
I'm running Mac OS X 10.6.5.
Any advice would be greatly, greatly appreciated. Thanks!
Got this sorted out! The report gets sent through as hex, but converting it back to binary lets you run it through plcrashutil nicely. Here is my HexToBinary.cpp implementation.
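In outline, the conversion just strips the whitespace and turns each pair of hex digits back into one byte; a minimal sketch along those lines (file handling and names here are illustrative, not the exact original):

// Sketch of a hex-to-binary converter: reads a hex dump like the one above
// and writes the raw bytes so plcrashutil can parse them.
#include <cctype>
#include <cstddef>
#include <fstream>
#include <iostream>
#include <string>

int main(int argc, char *argv[]) {
    if (argc != 3) {
        std::cerr << "usage: " << argv[0] << " <hex-input> <binary-output>\n";
        return 1;
    }

    std::ifstream in(argv[1]);
    std::ofstream out(argv[2], std::ios::binary);

    // Collect hex digits, skipping the spaces and newlines in the dump.
    std::string hex;
    char c;
    while (in.get(c)) {
        if (std::isxdigit(static_cast<unsigned char>(c)))
            hex.push_back(c);
    }

    // Every two hex digits become one byte of the original .plcrash report.
    for (std::size_t i = 0; i + 1 < hex.size(); i += 2) {
        unsigned byte = std::stoul(hex.substr(i, 2), nullptr, 16);
        out.put(static_cast<char>(byte));
    }
    return 0;
}

The resulting binary file then converts cleanly with the plcrashutil command shown above.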
I am currently trying to process data for a simple network. This is the code I entered:
[screenshot of the code]
I keep getting this error message but can't find any syntax problems or anyone else with this issue. I'm guessing it's something to do with my venv, because I've seen tutorials of people using that exact code with no issues. It's possible I haven't imported a package into my IDE; I am using Anaconda and PyCharm, if that helps.
Anyway, this is the error message I keep getting.
[screenshot of the error message]
You need to use transforms.ToTensor() instead of transforms.ToTensor when passing to transforms.Compose, so that you pass an instance of the transform rather than the class itself.
I'm new to Intel RealSense. I want to learn how to save the color and depth streams to bitmap. I'm using C++ as my language. I have learned that there is a ToBitmap() function, but it can only be used from C#.
So I wanted to know is there any method or any function that will help me in saving the streams.
Thanks in advance.
I'm also working my way through this; it seems that the only option is to do it manually. We need to get ImageData from PXCImage. The actual data is stored in ImageData.planes, but I still don't understand how it's organized.
Here you can find an example of getting depth data: https://software.intel.com/en-us/articles/dipping-into-the-intel-realsense-raw-data-stream?language=en
But I still have no idea what pitches is or how the data inside planes is organized.
Here a kind of reverse process is described: https://software.intel.com/en-us/forums/intel-perceptual-computing-sdk/topic/332718
I would be glad if you are able to get some insight from this information, and I would obviously be glad if you discover some insight you can share :).
UPD: Here is something that looks like what we need. I haven't worked with it yet, but it sheds some light on the internal organization of planes[0]: https://software.intel.com/en-us/forums/intel-perceptual-computing-sdk/topic/514663
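Based on that thread, planes[0] of a depth image appears to hold rows of 16-bit depth values, with pitches[0] giving the distance in bytes between consecutive rows. A hedged sketch of walking it (the depthImage variable is an assumption on my side, and I haven't verified this against the SDK headers):

// Sketch: reading 16-bit depth samples out of ImageData, using pitches[0] as the
// row stride in bytes; depthImage is assumed to hold a PIXEL_FORMAT_DEPTH frame.
auto depthData = PXCImage::ImageData();
if (depthImage->AcquireAccess(PXCImage::ACCESS_READ, PXCImage::PIXEL_FORMAT_DEPTH, &depthData) >= PXC_STATUS_NO_ERROR) {
    auto info = depthImage->QueryInfo();
    for (pxcI32 y = 0; y < info.height; ++y) {
        // Each row starts pitches[0] bytes after the previous one.
        auto row = reinterpret_cast<const pxcU16*>(depthData.planes[0] + y * depthData.pitches[0]);
        for (pxcI32 x = 0; x < info.width; ++x) {
            pxcU16 depth = row[x];   // depth sample for pixel (x, y)
            // ... use depth ...
        }
    }
    depthImage->ReleaseAccess(&depthData);
}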
UPD2: To add some completeness to the answer:
You can then create a GDI+ image from the data in ImageData:
auto colorData = PXCImage::ImageData();
if (image->AcquireAccess(PXCImage::ACCESS_READ, PXCImage::PIXEL_FORMAT_RGB24, &colorData) >= PXC_STATUS_NO_ERROR) {
    auto colorInfo = image->QueryInfo();
    // pitches[0] is the row stride in bytes, planes[0] points at the pixel data
    auto colorPitch = colorData.pitches[0] / sizeof(pxcBYTE);
    auto baseColorAddress = reinterpret_cast<BYTE*>(colorData.planes[0]);
    Gdiplus::Bitmap tBitMap(colorInfo.width, colorInfo.height, colorPitch, PixelFormat24bppRGB, baseColorAddress);
    // ... use or save tBitMap here, then release the frame ...
    image->ReleaseAccess(&colorData);
}
And Bitmap is a subclass of Image (https://msdn.microsoft.com/en-us/library/windows/desktop/ms534462(v=vs.85).aspx), so you can save the Image to a file in various formats.
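For the actual saving, a sketch using the usual GDI+ encoder lookup (this assumes GdiplusStartup has already been called somewhere in the app; GetEncoderClsid is the standard helper from the GDI+ documentation, not part of the RealSense SDK):

// Sketch: look up the PNG encoder's CLSID and save the Gdiplus::Bitmap built above.
#include <windows.h>
#include <gdiplus.h>
#include <cstdlib>
#include <cwchar>
#pragma comment(lib, "gdiplus.lib")

int GetEncoderClsid(const WCHAR* mimeType, CLSID* pClsid)
{
    UINT num = 0, size = 0;
    Gdiplus::GetImageEncodersSize(&num, &size);
    if (size == 0) return -1;

    auto* codecs = static_cast<Gdiplus::ImageCodecInfo*>(std::malloc(size));
    if (codecs == nullptr) return -1;
    Gdiplus::GetImageEncoders(num, size, codecs);

    for (UINT j = 0; j < num; ++j) {
        if (wcscmp(codecs[j].MimeType, mimeType) == 0) {
            *pClsid = codecs[j].Clsid;
            std::free(codecs);
            return static_cast<int>(j);
        }
    }
    std::free(codecs);
    return -1;
}

// Usage, inside the AcquireAccess block above and before ReleaseAccess:
//   CLSID pngClsid;
//   if (GetEncoderClsid(L"image/png", &pngClsid) >= 0)
//       tBitMap.Save(L"color_frame.png", &pngClsid);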
Greetings people,
My Windows Store game has been released for more than three weeks now, and I have started getting crash reports. I could download the TriageDump.dmp file and open it in Visual Studio 2012, but it did not help much; I constantly get a "No Debugging Information" error message when I click on "Debug with Native Only".
Also, the tool-tip on my Dashboard shows no information about the crashing function (could "Unknown" here mean an inlined function or a lambda expression in concurrency::task?).
I would like to believe that I have done everything the way it should be done, though of course I may be wrong. Here is some additional information that might be helpful in finding the issue:
It uses DirectX and is written purely in C++ (without C# or XAML)
Project setting: C++\General\Debug Information Format = Program Database (/Zi)
Project setting: Linker\Debugging\Generate Debug Info = Yes (/DEBUG)
The game is made up of two native modules: Labyrinth.App.exe and Labyrinth.Core.dll
The generated APPXUPLOAD contains both APPX and APPXSYM files
The APPXSYM file contains both Labyrinth.App.pdb and Labyrinth.Core.pdb
I'm on an x64 development machine, and the triagedump.dmp is for x86
I did click on "Include public symbol files, if any, to enable crash analysis for the app" when generating the APPXUPLOAD file.
Please let me know if you spot the issue or suspect something is wrong above. Thanks in advance for your help, guys! :)
The very same problem here. MS had that working in the past though.
Actually, if you still have the .appxsym around, you can easily extract the .pdbs out of it; the .appxsym seems to be just a .zip file.
You can then load these .pdbs as symbols after loading the triagedump.dmp.
I'm working on a Windows Store app and I'm getting a WinRT error that doesn't really give me any information, so I would like to know how to understand these sorts of errors.
Basically I get the error on the following line which is called inside OnPointerPressed:
m_gestureRecognizer->ProcessDownEvent(args->GetCurrentPoint(nullptr));
The error is:
First-chance exception at 0x76F54B32 (KernelBase.dll) in DXAML2.exe: 0x40080201: WinRT originate error (parameters: 0x80070057, 0x00000044, 0x03CEE72C).
This error didn't appear before; the only thing I've changed is that this line is now wrapped in an if statement that tests whether the current pointer's PointerId is the same as one I've stored, using a plain == comparison:
if(args->GetCurrentPoint(nullptr)->PointerId == m_UIPointerID)
I have no idea why this has started happening.
So my question is in two parts:
More generally, how do I understand what an error such as the above means?
And does anyone know why this error has suddenly started happening now that I check the PointerId?
Thanks for your time.
P.S. I guess another thing that has changed is that there will already be two pointers on the screen: the one that gets pushed into this GestureRecognizer as well as another one, hence the PointerId check.
"How to Decipher such an error"...
For any WinRT originate error, you can take the third address in the parameters list (in your example, 0x03CEE72C), and find a description of your error in the memory window.
While debugging, break when your error is thrown and open the memory window via Debug -> Windows -> Memory -> Memory 1.
Copy and paste the address into the memory window to get your "easy-to-find" error message.
As Raman said, it's good to look up the hex values shown. The first one is the memory location, which won't tell you much without the symbols/source; in this case the error is reported directly by Windows. Perhaps the public symbols can shed some more light on where the error came from, but the error code lookups are more helpful.
If you Bing for 0x80070057 you will find an MSDN article on Common HRESULT Values which lists
E_INVALIDARG : One or more arguments are not valid : 0x80070057
It doesn't give you all the details of course, so you're left to theorize. Perhaps you can only call args->GetCurrentPoint(nullptr) once and should store/reuse the value? Maybe the gesture recognizer is not configured correctly? Maybe the app window is not visible at the time the exception is thrown, or the call is on the wrong thread. Maybe some expected calls to the gesture recognizer were missed because they were filtered out with these "if" statements.
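If that first theory is the issue, one way to restructure the handler is to call GetCurrentPoint only once and reuse the returned PointerPoint for both the check and the gesture recognizer (member names are taken from the question; this is only a sketch of that theory, not a confirmed fix):

// Inside OnPointerPressed: a single GetCurrentPoint call, reused for both
// the PointerId check and ProcessDownEvent.
auto point = args->GetCurrentPoint(nullptr);

if (point->PointerId == m_UIPointerID)
{
    m_gestureRecognizer->ProcessDownEvent(point);
}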
I am curling a website and writing the output to a .json file; this file is input to my Java code, which parses it using a JSON library, and the necessary data is written back to a CSV file which I later use to store it in a database.
As you know, data coming from a website can be in different encodings, so I make sure that I read and write in UTF-8, but I still get wrong output.
For example, Østerriksk becomes �sterriksk.
I am doing all this on Linux. I think there is some encoding problem, because this same code runs fine on Windows but not on Unix/Linux.
I am quite sure my Java code is correct, but I am not able to find out what I'm doing wrong.
You're reading the data as ISO 8859-1, but the file is actually UTF-8. I think there's an argument (or setting) on the file reader that lets you specify the charset explicitly and should solve that.
Also: curl isn't going to care about the encodings. It's really something in your Java code that's wrong.
What IDE are you using? For example, this can happen if you are using the Eclipse IDE and have not set your default encoding to UTF-8 in the properties.