I use Process Explorer (a Microsoft tool) on Windows XP, and the "physical memory" often fills to its maximum (3 GB) while I use Visual C++. At that point all my programs become slow and unresponsive, and when things return to normal, nearly half of the memory becomes available again. What is wrong?
I'm working on a project with Ogre3D; maybe I can deactivate some options in Visual Studio. What exactly is it caching that eats that much memory?
Apparently MSVC is designed to run on big machines. There are many settings under Text Editor -> C++ to trim some weight, but my guess is that Windows XP and recent Microsoft applications simply don't play nicely together.
I'm running Android Studio 1.2.2 on OS X 10.10.3. The CPU usage swings wildly up and down. Trying to edit anything is a real pain: deleting characters, typing, type-checking are all slow because Studio is consuming a huge amount of resources. I can press a key and sometimes have to wait 5 seconds before it shows up on screen.
Has anyone else had this problem and figured out how to make Android Studio usable again?
I am running Android Studio under 64-bit Windows 7, with DPI scaling set to 150%. The text is of course blurred, and the only thing I could do is check "Disable display scaling on high DPI settings" in the program's Properties -> Compatibility tab: the problem is that the setting is greyed out.
Does anybody know what the problem could be?
Thank you in advance
(I would have added the screenshot but my rank is too low, sorry)
I've managed to solve this problem in the following way.
I am not sure whether it has caveats, but it's really simple. The setting is greyed out because the program is either a) a 64-bit application or b) part of the 64-bit Windows distribution.
The solution is to use a 32-bit shell, with high-DPI scaling disabled, to start the 64-bit program. I'm not sure whether this has memory-addressing implications, but it should work for any application.
Good candidates are 32-bit builds of bash, or anything else you can find.
I have written some C++ code for a vehicle routing project. My Dell laptop has both Ubuntu and Windows 7 installed. When I compile my code with gcc on the UNIX side, it runs at least 10x faster than the exact same code built with Visual C++ 2010 on Windows (both on the same machine). This isn't limited to one particular program; it happens with almost every piece of C++ code I have been using.
I assume there is an explanation for such a large difference in runtimes, and for why gcc outperforms Visual C++ here. Could anyone enlighten me on this?
Thanks.
In my experience, both compilers are fairly equal, but you have to watch out for a few things:
1. Visual Studio defaults to stack-checking on, which means that every function starts with a small amount of "memset" and ends with a small amount of "memcmp". Turn that off if you want performance; it's great for catching the case where you write to the 11th element of a ten-element array.
2. Visual Studio also does buffer-overflow checking. Again, this can add a significant amount of time to the execution.
See: Visual Studio Runtime Checks
I believe these are normally enabled in debug mode, but not in release builds, so you should get similar results from release builds and -O2 or -O3 optimized builds on gcc.
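As a small, hedged sketch of the kind of bug those checks exist to catch (not the original poster's code): in a default Visual Studio debug build, the /RTC run-time checks will typically report the off-by-one write below as stack corruption, while an optimized release build, or a gcc -O2/-O3 build, skips that instrumentation.

    // Off-by-one write that MSVC's debug run-time checks (/RTC1, on by default
    // in debug builds) usually flag; release builds omit the instrumentation.
    #include <iostream>

    int main()
    {
        int values[10] = {};
        for (int i = 0; i <= 10; ++i)   // bug: writes values[10], one past the end
            values[i] = i;

        long long sum = 0;
        for (int i = 0; i < 10; ++i)
            sum += values[i];

        std::cout << sum << '\n';
        return 0;
    }

(On the gcc side, -fstack-protector or -fsanitize=address provide roughly comparable checking.)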
If this doesn't help, then perhaps you can give us a small (compilable) example, and the respective timings.
The application I created works far too slowly; it looks like there are a lot of memory leaks, and there are a lot of pointers involved. So please, can you advise an effective tool for detecting run-time errors and memory leaks in Visual Studio C++?
You can use Deleaker. It should help you.
I know two good tools for Windows: Purify and Insure++.
For Linux: Valgrind.
If you use the debug version of the CRT library, you can find all memory leaks very easily.
Basically, after including the appropriate headers, you call
_CrtSetDbgFlag ( _CRTDBG_ALLOC_MEM_DF | _CRTDBG_LEAK_CHECK_DF );
somewhere at the beginning of your program.
You can also call
_CrtSetReportMode( _CRT_ERROR, _CRTDBG_MODE_DEBUG );
to direct CRT error reports to the Debug Output window; with _CRTDBG_LEAK_CHECK_DF set, the leak report itself is dumped there automatically when the program exits.
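Here is a minimal, self-contained sketch of that setup (illustrative only; the _CRTDBG_MAP_ALLOC define is an optional extra that lets the report show file and line information for malloc-based allocations):

    // CRT debug-heap leak detection sketch (MSVC, debug build).
    #define _CRTDBG_MAP_ALLOC   // optional: file/line info for malloc/free
    #include <stdlib.h>
    #include <crtdbg.h>

    int main()
    {
        // Track allocations and dump anything still allocated when the program exits.
        _CrtSetDbgFlag( _CRTDBG_ALLOC_MEM_DF | _CRTDBG_LEAK_CHECK_DF );

        int* leaked = new int[100];   // deliberately never deleted
        leaked[0] = 42;

        return 0;   // the leak report appears in the Debug Output window
    }

Run it in a debug configuration under the debugger and the leaked block shows up in the Output window at exit.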
But the application being slow might be unrelated to memory leaks. For performance profiling, you can follow the directions in Find Application Bottlenecks with Visual Studio Profiler.
For catching bad C++ constructs at compile time, you can use the static code analysis feature of Visual Studio 2010 or later.
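As a rough illustration of what that analysis flags (the exact warning numbers differ between Visual Studio versions, so treat this as an assumption rather than a guaranteed diagnostic), building with /analyze typically warns about a potentially uninitialized read like this:

    // Sketch of a construct Visual Studio's /analyze pass usually reports:
    // 'result' may be read without having been initialized.
    int divide(int denom)
    {
        int result;
        if (denom != 0)
            result = 100 / denom;
        return result;   // uninitialized when denom == 0
    }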
I have a VC++ application developed in VC6. Currently it supports 32-bit operating systems.
My requirement is to convert this application to support 64-bit operating systems (like Windows 7, Windows Server 2008, etc.).
What is the easiest way, and what are the steps, to migrate such an application?
In practice, if you use every data type as it should be used, there shouldn't be a problem.
Typical errors that are made are (see the sketch after this list):
using [unsigned] long instead of size_t when referring to sizes
subtracting pointers and assigning the result to a long (should be ptrdiff_t or something like this)
converting pointers to long or long to pointers
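Here is a minimal sketch of those three pitfalls, assuming Windows' LLP64 model, where long stays 32-bit while pointers grow to 64-bit; the function and variable names are made up for illustration:

    #include <cstddef>   // size_t, ptrdiff_t

    void pitfalls(char* begin, char* end, std::size_t count)
    {
        // 1. Storing a size in (unsigned) long: on 64-bit Windows, long is still
        //    32 bits, so large sizes get truncated. Use size_t instead.
        unsigned long bad_size  = count;            // may truncate
        std::size_t   good_size = count;            // correct

        // 2. Pointer difference assigned to long: the result can exceed 32 bits.
        long           bad_diff  = (long)(end - begin);   // may truncate
        std::ptrdiff_t good_diff = end - begin;            // correct

        // 3. Round-tripping a pointer through long: the upper 32 bits are lost.
        long  bad_addr  = (long)begin;              // truncates the pointer value
        char* recovered = (char*)bad_addr;          // now points somewhere invalid
        // If an integer form is really needed, use intptr_t/uintptr_t from <cstdint>.

        (void)bad_size; (void)good_size; (void)bad_diff; (void)good_diff; (void)recovered;
    }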
The page http://msdn.microsoft.com/en-us/library/aa384198%28v=VS.85%29.aspx on Microsoft's MSDN site gives a list of important things to think about when going to 64-bit.
Hope this helps.