Is there a cat utility for Apache's Avro?

I'd like to know if there is a 'cat' utility for Apache's binary Avro file format, something along the lines of zcat or bzcat for gzip and bzip2 respectively.
Thanks for your help!

Yes, avrocat was included in the 1.6.0 release. Look in the scripts/ directory of the Python implementation of Avro.
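Once it is on your PATH, usage follows the zcat pattern and it should print the file's records as JSON, e.g. (file name hypothetical):
avrocat weather.avro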

Alternatively, the fastavro package ships a command-line interface that can dump an Avro file's records as JSON:
pip install fastavro
fastavro filename.avro --pretty

Related

Unzip single file from zip archive using node.js zlib module

Let's say I have a zip archive test.zip which contains two files:
test1.txt and test2.txt
I want to extract only test1.txt using Node's built-in zlib module.
How do I do that?
I don't want to install any package.
You could run a shell command to unzip, assuming that unzip is installed on your system. (It very likely is.)
As far as I can tell, there is no zip functionality within node.js without installing a package.
You can use zlib to help you with the decompression part, but you will have to write your own code to interpret the zip format. zlib.inflateRaw can decompress the raw deflate-compressed data of a zip entry, but you first have to find where that compressed data starts by reading and interpreting the zip file headers; a sketch follows below.
The zip format is documented here.
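For illustration, here is a minimal Node-only sketch of that approach. It reads the whole archive into memory, walks the local file headers, and inflates one entry. It assumes the simple case: sizes recorded in the local headers (general-purpose bit 3 unset) and no zip64; a robust reader should instead parse the central directory at the end of the archive.

const fs = require('fs');
const zlib = require('zlib');

// Extract a single entry from a zip archive using only built-in modules.
function extractEntry(zipPath, wantedName) {
  const buf = fs.readFileSync(zipPath);
  let off = 0;
  // Walk the local file headers (signature "PK\x03\x04" = 0x04034b50).
  while (off + 30 <= buf.length && buf.readUInt32LE(off) === 0x04034b50) {
    const method = buf.readUInt16LE(off + 8);   // 0 = stored, 8 = deflate
    const compSize = buf.readUInt32LE(off + 18);
    const nameLen = buf.readUInt16LE(off + 26);
    const extraLen = buf.readUInt16LE(off + 28);
    const name = buf.toString('utf8', off + 30, off + 30 + nameLen);
    const dataStart = off + 30 + nameLen + extraLen;
    const data = buf.subarray(dataStart, dataStart + compSize);
    if (name === wantedName) {
      // Zip entries hold raw deflate data, so inflateRaw, not gunzip/inflate.
      return method === 8 ? zlib.inflateRawSync(data) : data;
    }
    off = dataStart + compSize;
  }
  throw new Error(wantedName + ' not found in ' + zipPath);
}

console.log(extractEntry('test.zip', 'test1.txt').toString('utf8'));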

How to get rpm file path in spec file

I want to copy the installed rpm file to some directory during %post, like:
%post
cp rpm_path %_prefix/rpm/
Is there any way to get the rpm file's path in the spec file?
It is not possible from within the spec file. However, you can achieve it at the DNF level by using the local plugin, or by writing a similar one:
https://github.com/rpm-software-management/dnf-plugins-core/blob/master/plugins/local.py
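As a rough sketch of that route (config path and key follow dnf-plugins-core conventions; verify against your distribution's packaging):
dnf install dnf-plugins-core
# enable the plugin in /etc/dnf/plugins/local.conf:
# [main]
# enabled = true
With the plugin enabled, every package DNF installs is also copied into a local on-disk repository, giving you a copy of each installed rpm.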

Single node standalone Spark?

Is there any way I can install and use Spark without Hadoop/HDFS on a single node? I just want to try some simple examples and I would like to keep it as light as possible.
Yes, it is possible. Spark needs the Hadoop client libraries underneath, but the prebuilt packages bundle them, so no running Hadoop/HDFS cluster is required. Download any prebuilt Spark release, e.g.:
wget http://www.apache.org/dyn/closer.lua/spark/spark-1.6.0/spark-1.6.0-bin-hadoop2.4.tgz
tar -xvzf spark-1.6.0-bin-hadoop2.4.tgz
cd spark-1.6.0-bin-hadoop2.4/bin
./spark-shell
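In this setup the shell should start in local mode by default; you can also request it explicitly, here with two worker threads:
./spark-shell --master local[2]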

tar --acls option is missing from version 1.25

I am trying to upgrade the tar version on my system.
This is the current version:
# tar --version
tar (GNU tar) 1.17
If we execute tar --help:
Handling of file attributes:
--acls Save the ACLs to the archive
--atime-preserve don't change access times on dumped files
we can see that the --acls option is available.
I downloaded tar 1.25, compiled it, and now the --acls option is not available in this newer version.
Am I missing something, or has that option been replaced with some other option?
The --acls option in your tar 1.17 almost certainly comes from a distribution patch (Red Hat and SUSE, among others, have shipped ACL-patched tars), not from upstream GNU tar. Vanilla GNU tar, including 1.25, has no --acls option. Upstream support for --acls, --xattrs and --selinux arrived in tar 1.27, so build 1.27 or later, with the libacl development headers installed so that configure enables the feature.
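Under those assumptions, a minimal build sketch (version number only as an example; install the libacl development headers first):
tar xzf tar-1.27.1.tar.gz && cd tar-1.27.1
./configure
make
./src/tar --help | grep -- --acls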

Pack Node.js and JavaScript code into one executable file

I would like to pack Node.js (including its modules installed via npm) and my JavaScript code into a single executable file for different platforms (Windows, OS X, Linux).
Is this possible, or is there another solution?
Your comments are welcome.
From my understanding, you can't really create a single executable file for multiple platforms; each platform has its own packaging format for binary executables. What you can do is create a .tar.gz file and expand it on your target platform. I haven't done it myself, but theoretically it's possible. Here is an example (assuming you're using GNU tar on all your platforms):
To pack it, do:
tar cvzf nodeproject.tar.gz nodeproject
To expand, do
tar xvzf nodeproject.tar.gz
