I have a server with an AMD Opteron(tm) Processor 246 running a customised Linux kernel (2.6.9-100.ELhugemem). When I check the processor using dmidecode, it reports a speed of 2000 MHz, whereas /proc/cpuinfo shows 1000 MHz.
Can anybody explain this and also give me a method to check the current CPU speed?
What you are seeing is probably due to frequency scaling. You can see the minimum, maximum and current CPU frequency with:
cat /sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_min_freq
cat /sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_max_freq
cat /sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_cur_freq
(Replace cpu0 as appropriate.)
See cpufreq-info and cpufreq-set in http://www.thinkwiki.org/wiki/How_to_use_cpufrequtils (you may need to install cpufrequtils first, depending on your distro).
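If you want the value in MHz for every core, a rough sketch like the following works. It assumes the cpufreq sysfs interface is present and reports values in kHz; scaling_cur_freq is usually world-readable, while cpuinfo_cur_freq may require root:
for c in /sys/devices/system/cpu/cpu[0-9]*; do
    # cpufreq values are in kHz, so divide by 1000 to get MHz
    khz=$(cat "$c/cpufreq/scaling_cur_freq" 2>/dev/null) || continue
    echo "$(basename "$c"): $((khz / 1000)) MHz"
done
On a machine where frequency scaling is active you should see the reported MHz rise when the CPU is under load.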
I would like to plot the CPU and memory usage of an application on Linux vs time. What is the best way to do this?
Would grepping these values out of top every 0.1s and writing them into some file work, or is there a better and easier way?
There is an easier way. All of the information displayed in top can be found in /proc/<pid>/, most of it in /proc/<pid>/stat. man proc describes the content of these files.
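As a rough sketch, a sampler could look like this. It assumes a POSIX shell, that the PID is passed as the first argument, and that the process name contains no spaces or parentheses (which would shift the field numbers in stat):
#!/bin/sh
# Log the CPU time and resident memory of one process once per second, as CSV.
# Field numbers are documented in "man 5 proc".
pid=$1
page_kb=$(( $(getconf PAGESIZE) / 1024 ))   # rss is reported in pages
echo "epoch,cpu_ticks,rss_kb"
while [ -d "/proc/$pid" ]; do
    set -- $(cat "/proc/$pid/stat")         # split stat into positional fields
    # field 14 = utime, 15 = stime (clock ticks); field 24 = rss (pages)
    echo "$(date +%s),$(( ${14} + ${15} )),$(( ${24} * page_kb ))"
    sleep 1
done
The resulting CSV can be plotted with gnuplot or any spreadsheet; to convert cpu_ticks to seconds, divide by the value of getconf CLK_TCK.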
I am looking for a GNU/Linux distribution that works in real mode. I want to install it in a virtual machine so I can study assembly. Can anyone help me with this?
There's ELKS, a subset of Linux suitable for the 8086 processor (i.e., no memory management unit needed, real mode only, etc.). But I don't think such a beast will be your best vehicle for studying assembly...
I want to determine the maximum speed of the CPU in MHz on a Linux operating system running on MIPS hardware. The command cat /proc/cpuinfo only reports BogoMIPS=1000.00, and
`cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq`
does not work. How can I convert this to MHz, or what unit is BogoMIPS measured in?
Thanks in advance
Try this:
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq
or this:
cat /sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_max_freq
There should be similar files for the minimum frequency as well.
The information should be under cpuinfo. What does it give you if you enter:
cat /proc/cpuinfo | grep Hz
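On many MIPS systems /proc/cpuinfo does not report a clock speed at all, so a rough fallback sketch is to try cpufreq first and only then grep /proc/cpuinfo (this assumes cpuinfo_max_freq, where present, is reported in kHz):
# cpufreq reports kHz; fall back to /proc/cpuinfo if it is absent
f=/sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_max_freq
if [ -r "$f" ]; then
    echo "$(( $(cat "$f") / 1000 )) MHz"
else
    grep -iE 'mhz|bogomips' /proc/cpuinfo
fi
Note that BogoMIPS is a boot-time busy-loop calibration value, not a clock frequency, so there is no reliable way to convert it to MHz.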
I'm trying to find out how to do an IISRESET if the CPU usage gets above 90%. I was thinking about using a batch file or something like that, but really I have no clue.
The problem is that the CPU usage reaches 99%, the machine slows down, and the only way to get the website working again is to log onto the machine and manually perform an IISRESET. I'm trying to find out whether there is a way to automate this so that the IISRESET happens when CPU usage gets to around 95%.
Hope someone can help.
Thanks
Alex
This PowerShell script will do that:
if ((Get-Counter '\Processor(_Total)\% Processor Time').CounterSamples[0].CookedValue -gt 90) { &iisreset }
Conceptual question, just out of curiosity:
What is less taxing on the graphics processor: anti-aliasing (2x? 4x? higher?) on a typical desktop display (around 120-150 dpi), or driving a high-density (>300 dpi) screen without anti-aliasing? This question could pertain to both desktop systems and embedded devices (smartphones). I'm interested to see the responses!
Usually neither, since font rendering and anti-aliasing are done by the CPU (though you can use GPU features to blur). Beyond that it depends on the font rasterizer and how well it was implemented, and on how the anti-aliasing was done: whether a matrix blur was applied, an FFT, or a simple render-larger-and-bicubic-downsample. Only runtime tests can tell.