I'm trying to develop a kind of security program on Linux. My current plan is to use AES-128 in CBC mode. I've heard AES-128 is supported out of the box on OS X. Are there any libraries like that on Linux?
The libcrypto library in the OpenSSL package supports AES-128 encryption. Most Linux distributions, such as RHEL, SUSE and Ubuntu, come with OpenSSL.
The AES_set_encrypt_key() and AES_cbc_encrypt() functions from <openssl/aes.h> implement the function you're after.
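For illustration, here is a minimal sketch of encrypting one buffer with those two calls. The key, IV and plaintext are hard-coded placeholders only; real code should generate the key and IV with a CSPRNG such as RAND_bytes(), handle padding for input that isn't a multiple of 16 bytes, and note that newer OpenSSL releases steer you toward the higher-level EVP_* interface instead.

    /* Minimal AES-128-CBC sketch using the low-level OpenSSL API named above.
     * Demo values only -- never hard-code a key/IV in real code.
     * Build with: gcc aes_demo.c -lcrypto */
    #include <openssl/aes.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        unsigned char key[16]   = "0123456789abcdef";  /* 128-bit demo key */
        unsigned char iv[16]    = "fedcba9876543210";  /* demo IV */
        unsigned char plain[32] = "exactly thirty-two bytes here!!";  /* 31 chars + NUL = 2 blocks */
        unsigned char cipher[32];

        AES_KEY enc_key;
        AES_set_encrypt_key(key, 128, &enc_key);

        /* AES_cbc_encrypt() updates the IV in place, so work on a copy. */
        unsigned char iv_copy[16];
        memcpy(iv_copy, iv, sizeof iv_copy);
        AES_cbc_encrypt(plain, cipher, sizeof plain, &enc_key, iv_copy, AES_ENCRYPT);

        for (size_t i = 0; i < sizeof cipher; i++)
            printf("%02x", cipher[i]);
        printf("\n");
        return 0;
    }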
How do I get the benefit of sendfile() (on Linux) and TransmitFile() (on Windows) if I also want to use encryption? Are there any Linux kernel modules or Windows drivers that provide this functionality? The only thing I've found so far is an implementation on FreeBSD by Netflix, but unfortunately that is neither of my two target platforms.
Google for "Linux KTLS". It's pretty new/experimental though.
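For what it's worth, the general shape of kTLS is: do the TLS handshake in user space as usual, push the negotiated keys into the socket, and from then on plain write() and sendfile() on that socket produce encrypted TLS records. A heavily simplified sketch, assuming a Linux 4.13+ kernel built with CONFIG_TLS, an established TLS 1.2 AES-128-GCM session, and key material already exported from your TLS library (the helper name enable_ktls_tx() is just for illustration):

    /* Hedged kTLS sketch: hand the transmit keys of a finished TLS 1.2
     * AES-128-GCM session to the kernel so sendfile() output is encrypted.
     * `sock` is an already-connected TCP socket. */
    #include <linux/tls.h>
    #include <netinet/tcp.h>
    #include <sys/socket.h>
    #include <string.h>

    #ifndef SOL_TLS
    #define SOL_TLS 282   /* may be missing from older userspace headers */
    #endif

    int enable_ktls_tx(int sock,
                       const unsigned char *key, const unsigned char *iv,
                       const unsigned char *salt, const unsigned char *rec_seq)
    {
        /* Attach the "tls" upper-layer protocol to the socket. */
        if (setsockopt(sock, SOL_TCP, TCP_ULP, "tls", sizeof("tls")) < 0)
            return -1;

        struct tls12_crypto_info_aes_gcm_128 ci;
        memset(&ci, 0, sizeof ci);
        ci.info.version     = TLS_1_2_VERSION;
        ci.info.cipher_type = TLS_CIPHER_AES_GCM_128;
        memcpy(ci.key,     key,     TLS_CIPHER_AES_GCM_128_KEY_SIZE);
        memcpy(ci.iv,      iv,      TLS_CIPHER_AES_GCM_128_IV_SIZE);
        memcpy(ci.salt,    salt,    TLS_CIPHER_AES_GCM_128_SALT_SIZE);
        memcpy(ci.rec_seq, rec_seq, TLS_CIPHER_AES_GCM_128_REC_SEQ_SIZE);

        /* From here on, write()/sendfile() on `sock` is encrypted in-kernel. */
        return setsockopt(sock, SOL_TLS, TLS_TX, &ci, sizeof ci);
    }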
I need to secure my UDP traffic. As far as I understand, the DTLS protocol is the best way to do it. There is another option, IPsec, but it doesn't look applicable in my case because it isn't easy to use and there may be hardware problems.
I've found that there are some libraries which have DTLS implemented. So now I'm trying to choose: OpenSSL or GnuTLS? Could you please advise which is better to use? What are the drawbacks and advantages? Or maybe there is another library with a DTLS implementation?
I've found the following facts about the libraries and DTLS.
There is another library with DTLS support, CyaSSL, but for now it supports DTLS only in test mode.
Although RFC 4347 dates from April 2006, OpenSSL has supported DTLS since 2005 (v0.9.8). Many Linux distributions include this version. The OpenSSL API looks a little ugly, but the DTLS implementation seems stable.
GnuTLS has supported DTLS since 2011 (v3.0.0). It looks like no Linux distribution includes this version yet. (For example, Ubuntu 11.04 uses v2.8.6, and Ubuntu 11.10 is going to use v2.10.5, not v3.0.0.) There is no information about when v3.0 will be picked up. It can be built manually, but it depends on many additional libraries which may have no native packages in some distributions.
It looks like all of these libraries can be used on other platforms (e.g. Windows).
Known OpenSSL issue: OpenSSL has compression enabled by default for DTLS, but it shouldn't be. The v0.9.8 API doesn't provide any method to disable compression, so it has to be worked around manually (see the sketch below).
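A rough sketch of what that manual workaround can look like against the 0.9.8-era API; the helper name make_dtls_ctx() is just for illustration, and newer OpenSSL versions offer SSL_OP_NO_COMPRESSION instead:

    /* Hedged sketch: create a DTLS context and empty the process-global
     * compression-method list by hand, since SSL_OP_NO_COMPRESSION does not
     * exist in OpenSSL 0.9.8.  Error handling mostly omitted. */
    #include <openssl/ssl.h>

    SSL_CTX *make_dtls_ctx(void)
    {
        SSL_library_init();
        SSL_load_error_strings();

        SSL_CTX *ctx = SSL_CTX_new(DTLSv1_method());
        if (ctx == NULL)
            return NULL;

        /* With the compression stack cleared, no DTLS connection made
         * through this process can negotiate compression. */
        sk_SSL_COMP_zero(SSL_COMP_get_compression_methods());

        return ctx;
    }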
SUMMARY:
Speaking about usability, personally I would prefer the GnuTLS API, but at the moment OpenSSL looks like the more practical choice.
IPsec is the oldest and hence the most compatible and stable option, but it requires work from the sysadmin and can be quite challenging for novices. DTLS tackles the problem from the application side, which lets the programmer simplify things significantly and integrate with existing environments with less change.
The choice between OpenSSL and GnuTLS is almost always due to license.
The OpenSSL license includes an advertising clause:

3. All advertising materials mentioning features or use of this software
   must display the following acknowledgment: "This product includes
   software developed by the OpenSSL Project for use in the OpenSSL
   Toolkit. (http://www.openssl.org/)"
GnuTLS from Wikipedia:
GnuTLS was initially created to allow applications of the GNU project
to use secure protocols such as TLS. Although OpenSSL already existed,
OpenSSL's license is not compatible with the GPL;[4] thus software
under the GPL, such as GNU software, could not use OpenSSL without
making a GPL linking exception.
http://en.wikipedia.org/wiki/GnuTLS
In short: I'm going to release an application written in OCaml, and I was planning to distribute it as source code.
The problem is that the OCaml development system is neither lightweight nor commonly installed, so I would also like to release binaries for various operating systems.
Windows is no problem, since I can compile it through Cygwin and distribute it with the required DLLs.
OS X is not a problem either, since I can compile it and distribute it easily (no external dependencies, from what I've tried).
Linux is where the problems start, since I don't really know the best way to compile and distribute it. The program itself doesn't depend on anything (everything is statically linked), but how do I cover many distributions?
I have a virtualized Ubuntu Server 10 on an amd64 architecture; I used this machine to test the program under Linux and everything works fine. Of course, if I try to move the binary to a 32-bit Ubuntu it stops working, and I haven't been able to try other distributions... Are there tricks for managing this kind of issue? (It seems to be a recurring one.)
For example:
Can I compile both 32-bit and 64-bit binaries from the same machine?
Will a binary compiled under Ubuntu also run on other distributions?
Which "branches" should I consider when wanting to cover as many distros as possible?
You can generally produce 64-bit and 32-bit binaries on a 64-bit machine with relative ease - i.e. the distribution usually has the proper support in its compiler packages and you can actually test your build. Note that the OS needs to be 64-bit too, not just the CPU.
Static binaries generally run everywhere, provided adequate support from the kernel and CPU - keep an eye on your compiler options to ensure this. They are your best bet for compatibility. Shared libraries can be an issue. To deal with this, binaries linked with shared libraries can be bundled with those libraries and run with a loader script if necessary.
You should at least target Debian/Ubuntu with dpkg packages, Red Hat/Fedora/Mandriva with RPM, and SUSE/openSUSE with RPM again (I mention these two RPM cases separately because you might need to produce separate packages for these "families" of distributions). You should also provide a .tar.bz2 or a .run installer for the rest.
You can have a look at the options provided by e.g. Oracle for Java and VirtualBox to see how they provide their software.
You could look at building it in the openSUSE Build Service. Although run by openSUSE, this will build packages for:
openSUSE
SUSE Linux Enterprise variants
Mandriva
Fedora
Red Hat Enterprise/CentOS
Debian
Ubuntu
The best solution is to release the source code under a free license. You can package it for a couple distributions yourself (e.g. Debian, Fedora), then cooperate with other people porting it to others. The maintainers will often do most of this work with only a few required upstream changes.
Yes, you can compile for both 32-bit and 64-bit from the same machine (see the sketch after the link):
http://gcc.gnu.org/onlinedocs/gcc/i386-and-x86_002d64-Options.html
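As a concrete illustration, a trivial C program compiled with the -m32/-m64 switches shows the idea; on Debian/Ubuntu the -m32 build typically also needs the multilib packages (e.g. gcc-multilib) installed, and the same principle applies to whatever compiler ultimately produces your binary:

    /* wordsize.c - tiny check that a build really is 32-bit or 64-bit.
     *
     * Build both flavours on one x86_64 machine:
     *     gcc -m64 wordsize.c -o app64
     *     gcc -m32 wordsize.c -o app32
     * Adding -static can further improve portability across distributions,
     * as discussed above. */
    #include <stdio.h>

    int main(void)
    {
        printf("compiled with %zu-bit pointers\n", sizeof(void *) * 8);
        return 0;
    }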
Most likely a binary built on Ubuntu will run on other distributions; the only thing you need to worry about is whether you are using shared libraries (especially if you use a GUI framework or something like that).
Not sure what you mean by "branches", but if you are talking about distributions, I would use the most vanilla Ubuntu distribution...
I'd recommend you just package 32-bit and 64-bit binaries as .deb and RPM; that way you can hit most of the major distros (Debian, Fedora, openSUSE, Ubuntu).
Just give clear installation instructions regarding dependencies, command-line fu for other distros, etc., and you shouldn't have much of a problem just distributing a source tarball.
I want to change the Linux distro on my development (host) machine, which I use for embedded development.
I cross-compile applications for many different processors. I need to download various libraries to evaluate their functionality/performance/stability on different devices, as well as on the PC.
So is Ubuntu 9.04 a good choice for me?
Thanks,
Sunny.
If you are using gcc or another source-based compiler that runs on Linux, then I would say yes, you want a Linux distro, and Ubuntu is currently the most popular/best. I would try to avoid distro-specific things; drive down the middle of the road and you should be able to use any distro equally well.
That will largely depend on your needs. For an embedded system, I'd go with any distribution that sports a very small footprint and supports the necessary hardware.
Depending on your hardware, Debian might work fine. You could create your image with debootstrap which allows for fairly small customized installs. It still includes apt and other things which might not be desirable, although that could be to your benefit if you need to push out updates.
If you did go with Debian, you could most likely do all your development on Ubuntu and then push to your embedded system.
I use Ubuntu for my host system and a chrooted Gentoo install for building apps for an embedded target. I found Gentoo was a good choice, as it is source-distributed and makes it easy to select which version of a particular library is installed.
One thing that is good to know is that Ubuntu and its derivatives use dash, not bash, as /bin/sh. This confuses crosstools and can give you severe headaches.
Ubuntu 9.04, Fedora 11, Red Hat...
What are the differences from a web server/development standpoint?
None. They differ only in how they package things, but they're all essentially the same - same operating system, same software. Some people get quite emotional about this choice, but I've used several, and there's nothing to pick between them these days.
I like to choose Linux distros based on whichever ones have the most help available online. I'd probably go with CentOS or Ubuntu for that reason.
Use whatever your hardware vendor is happy to support. If you're serious about running a production system, you will use a supported OS.
Having said that, most vendors don't officially support CentOS; however, it is sufficiently similar (i.e. almost identical) to Red Hat Enterprise Linux that they ignore the difference.
Your code might run anywhere, but your hardware vendor's tools probably won't. You'll want to use those.
For Tomcat there isn't any difference, but as a server Ubuntu is more cutting-edge in terms of kernel and packages, and its package management is superior and easier. If you prefer Ubuntu, use the Server edition; it is optimized for server use. CentOS is said to be solid, but I don't have much experience with it. If you are considering a virtual server, keep in mind that different distros have different levels of support for different virtualization technologies.
If you are new to Linux, then I definitely recommend Ubuntu. You can be up and running in no time with apt-get.