Some x86 ASM Reference/Tutorials? [closed]

I'm trying to find some references on the x86 assembly language: tutorials and examples to help my understanding.
Thanks

Programming from the Ground Up (free book, highly recommended)
x86 Assembly (wikibooks.org)
Essential Resources for x86 Programmers

I recommend Roby's PC Assembly Tutorial Lesson. It's also available for download. It contains diagrams and examples.
"This assembly lesson is for x86 specific, i.e. for Intel 8088, 80286, 80386, etc. Yes, it is compatible with your Pentium or Pentium III. AMD users could also use this tutorial as well because I cover only the basics. I assume that you have some grasp on some programming language like Pascal, C or C++. I don't want to go over the basic concepts of programming all over again."
Preliminary lesson -- Low Level Basic Concepts
Talks about registers, flags, memory, the stack, and interrupts. Don't worry too much if so many concepts seem confusing at first; as you follow the lessons, each concept should become clear enough.
Chapter 1 -- COM program structure
Begin your journey in assembly by observing the simplest program structure.
Chapter 2 -- Variables in Assembly
Discover the unique concept of variables in assembly language. The notion is far different from that of a normal high-level programming language. I also explain how the mov instruction works.
Chapter 3 -- Arithmetic Instructions
How can we perform some arithmetic in assembly?
Chapter 4 -- Bitwise Logic, part 1
Using and, or, and xor to perform various logical tasks, including bit masking and flipping.
Chapter 5 -- Bitwise Logic, part 2
Bit shifting and rotating can be useful aids to the bit masking introduced in the previous chapter.
Chapter 6 -- Branching
Branching is essential in all programs. Let's try some assembly branching instructions to improve the logic of our programs.
Chapter 7 -- Loop
The loop instruction in assembly can be used to mimic the loop constructs of higher-level programming languages.
Chapter 8 -- Interrupt Essentials
Using common system interrupt services to write screen output and read input.
Chapter 9 -- Stacks
Using push and pop and knowing how the stack behaves. Some details about the tiny memory model are also explained here.
Chapter 10 -- Making Subroutines
Using subroutines to mimic a structured programming approach.
Chapter 11 -- Macros
Using macros.
Chapter 12 -- Array Access
See how assembly provides only very crude array-access instructions.
Chapter 13 -- Basic String Instructions
Using various string instructions: movs, lods, cmps, scas, and stos.
Chapter 14 -- Structures
Using structures like the ones in high-level languages. The structure in assembly is equivalent to struct in C/C++ and record in Pascal.
Chapter 15 -- A Bit of Theory
Addressing modes and memory models explained, plus Computer Architecture 101.

The best, most comprehensive source on assembly I've seen is the "Art of Assembly Language" book. It's a free download, available in several versions.

For reference, the Intel IA-32 reference manuals are the definitive source.

I found Professional Assembly Language to be a good start.

Related

Is there any merit to assembly language? [closed]

I've heard that some developers use assembly language in embedded systems.
I wonder what benefit they get from learning assembly language, and in which fields it is used.
Do you have any experience with this?
The need for assembly code is proportional to the lack of specific compiler support.
Embedded systems are tailored to specific needs. For example, Texas Instruments DSPs have some "exotic" addressing modes, like circular and bit-reversed addressing, that are absent from other architectures.
The C language cannot address all these differences in a standard way.
However, the C standard doesn't forbid vendor extensions, and compilers targeting specific environments come with built-in functions that expose low-level functionality. These functions are called intrinsics and, being non-standard reserved identifiers, they start with an underscore.
For example, the TMSxC6000 Optimization Manual lists the intrinsics in section 7.5.4.
One very common operation in DSP is saturated addition, where in an n-bit word (2^n - 1) + 1 = 2^n - 1, as opposed to (2^n - 1) + 1 = 0 for the usual modular addition.
In the TI C dialect this translates to
int x1, x2, y;
y = _sadd(x1, x2); // _sadd mimics the name of the SADD assembly instruction
With the appropriate intrinsics you can avoid assembly language entirely.
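To make the saturating behaviour concrete outside the TI toolchain, here is a minimal plain-C sketch for the unsigned case from the formula above (sat_add_u32 is an illustrative helper, not a vendor intrinsic):

#include <stdint.h>

/* Unsigned saturating addition: the result clamps at UINT32_MAX instead of
 * wrapping, matching (2^n - 1) + 1 = 2^n - 1 described above. A DSP intrinsic
 * such as _sadd maps this idea onto a single instruction instead of a
 * compare-and-select. */
static uint32_t sat_add_u32(uint32_t a, uint32_t b)
{
    uint32_t sum = a + b;                 /* modular addition, may wrap */
    return (sum < a) ? UINT32_MAX : sum;  /* wrapped? clamp to the maximum */
}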
There are, however, at least three situations where assembly language is still needed:
No adequate intrinsics are present.
The programmer is forced to fall back to assembly.
The compiler is known to be particularly bad at optimizing code and you want to write the critical parts yourself.
Think at least thrice before taking this path.
You need to use the same C code base for different platforms.
This is the case for the Linux kernel, for example, where a small portion of assembly is used to "abstract" the execution environment enough for the rest to be handled with mostly pure C code.
Often the differences are so pronounced that simply calling intrinsics is not enough; a different approach is needed instead: an abstraction.

Is it worthwhile to learn assembly language? [closed]

Is it still worthwhile to learn ASM?
I know a little of it, but I haven't really used it or learned it properly because everything I learn to do in assembler I can do in 1/10th the time with some language like C or C++. So, should I really learn and use ASM? Will it do me any good professionally? Will it increase my resourcefulness? In short, would it make me a better programmer?
Note: I am talking about low-level assembly like FASM or NASM and not something like HLA (High-Level Assembler).
I learned from Kip Irvine's book. If you ignore the (fair) criticisms of his (irrelevant) libraries, I can recommend it as a good introduction to the language itself -- although for the really interesting stuff you have to hunt out obsessives on the net.
I think it's useful to understand what happens at the lower levels. As you research assembler you will learn about CPU pipelining, branch prediction, cache alignment, SIMD, instruction reordering and so on. Knowledge of these will help you write better high-level code.
Furthermore, the conventional wisdom is to not try to hand-optimise assembly most of the time but let the compiler worry about it. When you see some examples of the twisted things that compilers generate, you will better understand why the conventional wisdom holds.
Example: LFSRs run fast with the rotate-with-carry instruction; for specific cases like this, it's just as easy to write the assembler version as it is to discover whether or not the compiler is smart enough to figure it out. Sometimes you just know something that the compiler doesn't.
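For a concrete picture of the kind of loop being discussed, here is a minimal C sketch of one Galois LFSR step (the 16-bit width and the 0xB400 tap mask are just illustrative); the shift-then-conditional-XOR is the pattern a rotate/shift-with-carry sequence can implement in a couple of instructions, and checking whether your compiler finds that encoding means reading its assembly output:

#include <stdint.h>

/* One step of a 16-bit Galois LFSR (illustrative taps). */
static uint16_t lfsr_step(uint16_t state)
{
    uint16_t lsb = state & 1u;  /* bit about to be shifted out (the "carry") */
    state >>= 1;
    if (lsb)
        state ^= 0xB400u;       /* apply the feedback taps when that bit was set */
    return state;
}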
It also increases your understanding of security issues: write-xor-execute, stack overruns, etc.
Some concurrency issues only become apparent when you are aware of what is happening at the per-instruction level.
It can be useful sometimes when debugging if you don't have the complete source code.
There's the curiosity value. How are virtual functions implemented, anyway? Ever tried to write DirectX or COM programs in assembler? How do large structures get returned: does the calling function offer space for them, or vice versa?
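As one example of that curiosity value, here is a hand-rolled C sketch of the usual way virtual functions are implemented: each object carries a pointer to a per-class table of function pointers, and a virtual call becomes an indirect call through that table (the names here are illustrative, not any particular compiler's ABI):

#include <stdio.h>

struct Shape;
struct ShapeVTable { double (*area)(const struct Shape *self); };
struct Shape { const struct ShapeVTable *vtbl; double w, h; }; /* vtable pointer stored in the object */

static double rect_area(const struct Shape *s) { return s->w * s->h; }
static const struct ShapeVTable rect_vtbl = { rect_area };      /* one table per "class" */

int main(void)
{
    struct Shape r = { &rect_vtbl, 3.0, 4.0 };
    printf("%f\n", r.vtbl->area(&r));   /* the "virtual" dispatch: an indirect call */
    return 0;
}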
Then there are the special assembly languages for graphics hardware, although shader languages went high-level a few years ago; anything which lets you think about a problem in a different way is good.
I find it interesting that so many people jump to say that yes, you need/should learn assembly. To me the question is how much assembly do you need to know? I don't think you have to know assembly like a programming language; that is, I don't believe that everyone should be able to write a program in assembly. On the other hand, being able to read it and understand what it actually means (which might require more knowledge of the architecture than of the assembler) is enough.
I certainly cannot write assembly (i.e. write any non-trivial piece of code in assembly), but I can read it, and that, together with knowledge of the actual hardware architecture and the calling conventions in use, is enough to analyze performance and identify which piece of C++ code was the source of a given stretch of assembly.
Yes: the primary reason for C and C++ developers to learn assembly is that it helps them understand what's going on under the hood of C and C++ code. It's not that you will actually write code in assembly, but you will be able to look at a disassembly to assess the code's efficiency, and you will understand how the different C and C++ features work much better.
It's worthwhile to learn lots of different languages, from lots of different paradigms. Learning Java, C++, C#, and Python doesn't count, since they are all instances of the same paradigm.
As assembly is at the root (well, close to the root) of all languages, I for one say that it is worthwhile to learn assembly.
Then again, it's worthwhile to learn a functional programming language, logic programming, scripting languages, math-based languages. You only have so much time, so you do have to pick and choose.
Knowing ASM is also useful when debugging, as sometimes all you have is an ASM dump of the error.
It depends on which programming level you wish to reach.
If you need to work with debuggers, then YES.
If you need to know how compilers work, then YES.
Any assembler/debugger is CPU-dependent, so there is a lot of work involved; just look at how big and old the x86 family is.
Do you have any use for it in what you plan to do? Is it going to aid you in any way in what you currently do or plan to do? Those are the two questions you should ask yourself; the answer to them is the answer to your question.
In a more general sense, yes, in my opinion it is well worth learning asm (something like x86 or ARM); how well it serves you depends on what you are programming and how you are debugging it.

Why is Verilog not considered a programming language? [closed]

In class the professor said that students shouldn't say that they learned to program in Verilog. He said something like Verilog isn't used to program it's used to design. So how is Verilog different from other programming languages?
Verilog, just like VHDL, is meant to describe hardware. By contrast, programming languages such as C or C++ provide a high-level description of software programs, that is, a series of instructions for a microprocessor to execute.
In practice, Verilog and VHDL do not offer the same features as programming languages, even though they look very much alike. For instance, a for loop in C/C++ describes the sequential execution of a given snippet of code; by contrast, a for ... generate loop in Verilog/VHDL describes multiple parallel instances of the same hardware building block (say, an AND logic gate). To be precise, there is also a plain for loop in Verilog, but again, it has to be "synthesizable", that is, the compiler must be able to generate logic that fits the description.
Typically, a beginner in Verilog/VHDL will be tempted to "translate" a given function/algorithm from C/C++-style pseudocode directly to Verilog/VHDL: surprisingly, it might sometimes work, but it always leads to a dramatically poor design. One must really be aware of these differences in order to become a good Verilog/VHDL programmer.
Verilog is a hardware description language. Programming languages are generally understood to be languages for telling existing hardware what to do, not for reconfiguring said hardware.
I don't know anything about Verilog, but I just did some quick googling and the wiki pages seem to do a pretty good job of explaining the conceptual differences your teacher seemed to be alluding to. As some of the other posters here wrote, I wouldn't dismiss it as not being a programming language. I think there's a strong tendency for programmers to believe that if it isn't application programming or assembly programming then it's not really programming, but in short that's BS. Everything above machine code is basically the same to me: if it's a file I give to a computer and it tells the computer how to do something, then it's programming the computer (I guess the problem is drawing a line between users and developers; we like to feel special). Unless we plan to roll back to punch cards sometime soon, I think anything that has a C-like syntax, or that lets you describe things in a syntactically strict (well-defined) way and modifies the behavior of the computer (what it outputs for a given input), means you've done some programming in one sense or another.
http://dictionary.reference.com/browse/programming
From the wiki page:
http://en.wikipedia.org/wiki/Dataflow_language
Dataflow programming focuses on how things connect, unlike imperative programming, which focuses on how things happen. In imperative programming a program is modeled as a series of operations (things that "happen"); the flow of data between these operations is of secondary concern to the behavior of the operations themselves. However, dataflow programming models programs as a series of (sometimes interdependent) connections, with the operations between these connections being of secondary importance.
(I think the key here is the qualifier on the type of programming, not that one is a "programming language" and the other is a "design language"; from what I understand they're both programming languages, they just have distinct purposes and implementations.) When I think of design I basically think of this:
http://dictionary.reference.com/browse/design
and that is not a program, although a program may utilize designs (and probably should; these are generally referred to as design patterns, but that is not what you're doing here).
Linked in from: http://en.wikipedia.org/wiki/Verilog
To your teacher's point, this language would likely be used to solve different problems than your everyday Java/C program, and via different means; however, to say it is not programming seems wrong.
Because it is an HDL, it is used to define hardware, and anything done in Verilog (not really anything, but the synthesizable subset) will be synthesized into actual hardware. So you can't just use programming features like classes and other OOP concepts, because they can't create any hardware.
In C, by contrast, everything is converted into an executable (hex) file, which is loaded into RAM when the program is executed.
Another basic difference is that everything in hardware is concurrent, so if you have written a = b + 1 and c = d + 1 in Verilog, then in the synthesized hardware both assignments will work simultaneously. In C everything is sequential, so in the same C program both instructions would actually be executed one by one on your processor.
It is a programming language, not for programming software but for describing a hardware design; the output is not necessarily an "application" as we understand it.
The language has a formal syntax.
Verilog contains features for describing logic netlists (RTL) and features to facilitate simulating them. Describing an RTL description as a program may convey that the person describing it as such does not thoroughly understand logic design or synthesis. Describing testbench stimulus as a program would be appropriate.
Verilog/VHDL is used to create and design application-specific systems on a chip, which are embedded into electronic devices.
C/C++ is used to design software that runs on a computer.
I am going to tackle this question in a different way. What is the purpose of a programming language? Can the output of a program affect the real world and your goals and expectations? If yes, then of course Verilog is a programming language. console.log has only as much meaning as what it translates to in the real world; e.g. console.log("you have a million units") carries no weight without an authority behind it. So Verilog is a programming language in a certain sense.

Which programming languages aren't considered high-level? [closed]

In informatics theory I hear and read about high-level and low-level languages all the time.
Yet I don't understand why this is still relevant, as there aren't any (relevant) low-level languages except assembler in use today.
So you get:
Low-level
Assembler
Definitely not low-level
C
BASIC
FORTRAN
COBOL
...
High-level
C++
Ruby
Python
PHP
...
And if assembler is low-level, how could you put, for example, C into the same list? I mean, C is extremely high-level compared to assembler. The same goes even for COBOL, Fortran, etc.
So why does everybody keep mentioning high- and low-level languages if assembler is really the only low-level language?
You will find that
many of the truths we cling to depend upon our own point of view.
For a C programmer, Assembler is a low-level language.
For a Java programmer, C is a low-level language and so on.
I suspect the folks programming the first stored-program computer with 1s and 0s would have thought Assembler a high-level language. It's all relative.
(Quote from Return of the Jedi)
According to Wikipedia, the low-level languages are machine code and assembly.
From the source:
In computer science, a low-level programming language is a programming language that provides little or no abstraction from a computer's instruction set architecture. The word "low" refers to the small or nonexistent amount of abstraction between the language and machine language; because of this, low-level languages are sometimes described as being "close to the hardware."
Then, to answer:
So why does everybody keep mentioning high- and low-level languages if assembler is really the only low-level language?
I don't know who "everyone" is, but I would venture a guess that back when high-level languages were not as commonplace as they are today, it was more relevant to talk about low-level vs. high-level (because there was a relatively significant amount of programmers writing assembly code). In modern times it is a less important distinction. Personally, I rarely hear people using these terms except to differentiate between assembly or not (except for those times when you might hear someone raised on Python referring to C or C++ as low-level, but this is not in the spirit of the original definition).
You're asking a relatively subjective question; it's a question about terminology, vernacular, and perspective.
For example, is Lisp a high-level or a low-level language? What if the implementation is running on a Lisp Machine?
Often, when people attempt to build a spectrum from low-level to high-level, what they are trying to quantify is a degree of "closeness to the hardware" as opposed to the degree of "abstraction."
Qualities which count toward an implementation's closeness to the hardware:
The programmer directly controls the memory layout of data and has access at run-time to memory addresses of data.
Mathematical operations are defined in terms of the hardware or loosely defined in order to conform to different types of hardware.
There may be a library providing dynamic memory allocation, but use of dynamic memory is manual.
Management of memory during string manipulation is manual (a short C sketch after this list makes this concrete).
Converse qualities which count toward an implementation's abstraction from the hardware:
The programmer does not have run-time access to address of data (references instead of pointers).
Mathematical operations are defined in specific terms not tied to specific hardware. (e.g., ActionScript 3 supports the Number type, which self-converts from integer to floating-point rather than experiencing overflow.)
Management of dynamic memory is handled by the environment, possibly through reference counting, garbage collection, or another automated memory management scheme.
Management of memory during string manipulation is always hidden from the programmer and handled by the environment.
Other qualities might render a language very abstract compared to the hardware on which it runs:
Declarative, search-based syntax. (e.g. Prolog)
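To make the manual string-memory point flagged in the first list concrete, here is a minimal C sketch of the bookkeeping a close-to-the-hardware language demands for a simple concatenation; a more abstract language hides all of this behind something like a + b (concat is just an illustrative name):

#include <stdlib.h>
#include <string.h>

char *concat(const char *a, const char *b)
{
    size_t la = strlen(a), lb = strlen(b);
    char *out = malloc(la + lb + 1);   /* manual allocation, +1 for the NUL */
    if (out == NULL)
        return NULL;                   /* caller must handle allocation failure */
    memcpy(out, a, la);
    memcpy(out + la, b, lb + 1);       /* copies the terminating NUL too */
    return out;                        /* caller must free() the result */
}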
With factors like these in mind, I would revise the spectrum you have written as follows:
Lowest level:
Assembly language of the platform in question.
Low-level languages with higher-level flow control than assembly:
C, C++
Pascal
High-level languages:
FORTRAN
COBOL
Python
Perl
Highest-level languages:
PROLOG
Python
Scheme
Python appears twice by intent -- it spans a portion of the spectrum depending on how the code is written.
As low-level, I would add:
.NET IL
Java JVM
Other P-Code used in environments like VB6
The "level" of a language is a moving target. In 1973, PL/I was considered a high-level language. Today, C is considered (at least by language professionals) as a low-level language [see footnote]. Some of the reasons:
Exposes machine-level representations of numbers
"Integer" arithmetic can overflow
No real support for strings, or at the very least, strings are not first-class
Manual memory management
Address arithmetic
Unsafe
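A minimal C sketch of the first two points above (exposed machine representation and silent overflow); the exact values printed depend on the target's integer widths:

#include <stdio.h>
#include <limits.h>

int main(void)
{
    unsigned int u = UINT_MAX;
    printf("UINT_MAX     = %u\n", u);                  /* machine-dependent maximum */
    printf("UINT_MAX + 1 = %u\n", u + 1u);             /* silently wraps to 0 */
    printf("sizeof(int)  = %zu bytes\n", sizeof(int)); /* representation is exposed */
    return 0;
}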
A high-level language might include
Support for integer types independent of the target machine
Default integer arithmetic never overflows unless the machine runs out of memory
Strings as first-class values with, e.g., concatenation built in
Automatic memory management with no address arithmetic
Safe
Some candidates as "high-level languages" by this definition might include Icon, Scheme, Smalltalk, and some of your favorite scripting languages.
Back in the day when I was a young scholar and dinosaurs roamed the earth, people referred to Icon as a "very high-level language". As recently as 15 years ago you could even attend a learned symposium on Very High Level Languages. But that term isn't used much any more.
Why does everybody keep mentioning high and low-level languages?
Even though the difference between "high" and "low" keeps changing, distinctions like the ones listed above are still important. And there are so many distinctions that the words "high" and "low" can be a useful shorthand. But not that useful: to a cynic, a high-level language is one that looks at least as powerful as whatever my favorite language is, and a low-level language is everything else. In other words, "level" can easily degenerate into mere name-calling.
Footnote: It's hard to find citations for terminology used at professional meetings, especially when professionals don't use the terms "low-level" and "high-level" because they're not so technical. But danben asked about citations, and I found a couple:
"To provide the required precision, experimental programs are usually written in a low-level language (eg C or Pascal)," in a refereed article on computer vision.
"The C programming language is well-known for its flexibility in dealing with low-level constructs," in an important paper by Necula et al.
P.S. Don't count too heavily on Wikipedia for good information on programming languages, especially if the Wikipedia article cites no references or sources.
Purely guessing here, but this may be a case of language shift, whereby the distinction between low- and high-level languages is slowly evolving in people's minds into the difference between managed and unmanaged languages, typed and untyped languages, etc. (at least in the way people are using the terminology).
To a large extent, "low-level" and "high-level" are not binary categories but a continuum. There are some languages that are clearly low-level (assembly, machine code), but beyond that there is really only "higher-level" and "lower-level".
As I see it, "lower-level" languages require code that looks more like the architecture of the computer, and "higher-level" languages accept code that looks more like the structure of the problem. But with that, languages can be high-level for one problem and low-level for another.
Low-level
Binary
Assembler
.NET IL
Java JVM
Other P-Code used in environments like VB6
Definitely not low-level
C
BASIC
FORTRAN
COBOL
Python
Perl
Pascal
High-level
C++
Ruby
Python
PHP
PROLOG
Scheme

What is the best way to learn x86 assembly on a Linux platform? [closed]

I have no prior knowledge of assembly programming, and would like to learn how to code x86 assembly on a Linux platform. However, I'm having a hard time finding a good resource to teach myself with.
The Art of Assembly book looks good, but it teaches HLA. I'm not interested in having to learn one way, then relearning it all over again. It also seems like RISC architectures have better resources for assembly, but unfortunately I do not have a RISC processor to learn with. Does anyone have any suggestions?
http://asm.sf.net has some material on architectures besides x86.
If you are interested in RISC architectures, you could run Linux on Qemu. Qemu emulates several RISC architectures like PowerPC, ARM and MIPS. You might be able to find a ready to use Qemu hard disk image here.
Another way to experiment with RISC architectures would be to use gdb's built-in simulator.
I found Assembly Language Step-by-Step to be a very good resource. It has a section in the back that's aimed at Linux assembly too.
Probably nothing much better than The Art of Assembly Language Programming and the other resources at that web site.
There are really two parts to learning assembly-level programming: the basic concepts, and then specific architectures. If you haven't had any exposure to asm programming, I strongly suggest you get the basics down first with a simple, small architecture, even though it likely is not directly applicable to any real hardware. If many folks are pointing to a particular resource like "The Art of...", take another look at it and use it to learn what an architecture is and how to use the basic tools (assembler, debugger, disassembler, etc.).
Once those are out of the way, then you can start looking into more advanced instruction sets. The x86 architecture and instruction set are pretty convoluted and there are many obscure ways to twist your brain - learn something simple before you tackle that.
Even though many people I know at school hated this book, I will link it anyway:
http://www.amazon.com/Professional-Assembly-Language-Programmer/dp/0764579010
The main reason I used this book is that it uses x86 on Linux with the GNU assembler. That last point helped, since I had to use that assembler in our school's lab, and in case you aren't aware, its default AT&T syntax is different from Intel syntax.
Also, I would just add that learning how high level languages are compiled into assembly language really helped me move along.
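One low-friction way to do that is to ask the compiler for its assembly output and read it alongside your source. A minimal sketch (the file name and function are just an illustration; gcc's -S emits assembly, and -masm=intel switches x86 output from the GNU assembler's default AT&T syntax):

/* example.c: compile with
 *   gcc -O2 -S example.c              (AT&T syntax, the GNU assembler default)
 *   gcc -O2 -S -masm=intel example.c  (Intel syntax, closer to most tutorials)
 * then read the generated example.s to see what the compiler emits. */
int sum_array(const int *a, int n)
{
    int total = 0;
    for (int i = 0; i < n; i++)
        total += a[i];
    return total;
}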
x86 assembly is really an Intel language, best learned with an Intel chip and a Windows platform that can run DOS.
If you have something like WinXP, there used to be a DOS interpreter which showed the user the basics of asm and allowed the user to reverse a command and tweak the code in real time, then assemble the code into a block which could be run on the interpreter.
It was called the "Ketman Interpreter".
It was for DOS asm only, but it was pretty unique because it lets you see what happens with all the registers and flags and allows a totally clueless individual to get a handle on the logic.
Try http://www.emu8086.com, which is a Windows-hosted 8086 emulator with an assembler and debugger. It comes with a tutorial.
I learned x86 assembler from a book about the 8086 (which I can't remember the name of at present... it was obviously quite old, and purple. If you're really interested I can dig it up when I get home). That will only teach you 16-bit stuff; for the more advanced 32-bit stuff I read some tutorials online. I've never done 64-bit. At least at first, the OS you're targeting probably won't matter, as you're too low-level... the BIOS is all you really care about. If you don't have access to a test system, an emulator is probably a good choice, as others have mentioned, but you can also build yourself an 8088 or 8086 without too much trouble from discrete parts. You can find tutorials and circuit diagrams online easily. It should cost less than $50 and it's a great learning experience -- you're essentially building a motherboard from scratch.
If you're not too attached to x86 assembly and want to learn RISC, I recommend the Microchip PIC microcontrollers. You can pick up a starter kit for less than $50 (the PICKit 1, which I have, even works under Linux). They have extensive documentation and plenty of third-party tutorials aimed at hobbyists.
Don't forget to grab a copy of the Guide to Assembly Language Programming in Linux book.
The Art of Assembly Language Programming
