Capture still image from video stream on a Linux Access Control Server

I am currently adding features to a custom access control server at work. What I would like to do is access the camera at the door and store at least one still image after the door is unlocked (each door has an RFID reader that interfaces with this server).
For example, one of the cameras is a Vivotek FD8136, which has multiple output streams that I can access as long as I have a username and password.
With this camera I can access a stream by going to the local address http://xxx.xxx.xxx.xxx:8002/video2.mjpg
I am just unsure how to go about saving a simple image to the server from that stream. Any and all help will be appreciated. I know I am probably overcomplicating it in my head.

I found one possible way to do it using avconv. If anyone knows a better way, please feel free to leave a message so that anyone else who needs this can benefit as well.
With avconv the command is as follows (I believe it will work with ffmpeg as well):
avconv -i http://xxx.xxx.xxx.x:8002/video2.mjpg -vframes 1 output.jpg
That outputs a single file for testing, but a bash script using date can easily put a date and time in the file name. Then, when needed, just run the script and there is an image to access later.
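For reference, such a script could look roughly like this (a minimal sketch; the camera URL and snapshot directory are placeholders for whatever your setup uses):

#!/bin/bash
# Minimal sketch: grab one frame from the door camera and save it with a
# timestamped name. CAMERA_URL and SNAPSHOT_DIR are hypothetical placeholders.
CAMERA_URL="http://xxx.xxx.xxx.xxx:8002/video2.mjpg"
SNAPSHOT_DIR="/var/lib/access-control/snapshots"
mkdir -p "$SNAPSHOT_DIR"
avconv -i "$CAMERA_URL" -vframes 1 "$SNAPSHOT_DIR/door_$(date +%Y%m%d_%H%M%S).jpg"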

Related

Dash deployment

I just created this dashboard using Dash and it works fine on my local machine, but when I try to input larger data the graphs lag a little. I already have a good server and I wish to host my Dash application there. I don't want the app to run all the time, but each time I run it, anyone who has access should be able to see the dashboard, and it shouldn't go down unless I want it to.
So my questions are:
Should I just upload my code to that server machine and run the application from there?
Would the lagging be fixed if I used a better machine?
Should I use Flask?
Thank you.

How to live-track progress in the terminal through a web portal?

So, I have a question about something I have not tried before, and I was not sure how to search for it or what to do.
To give a little background: a couple of friends and I are building a drone hacking platform with a Raspberry Pi on it. Our plan is to establish a WiFi connection with a base station and have a portal running on the Pi that triggers some scripts, including WPA brute-forcing. For the brute-forcing, the captured handshake will be forwarded to the base station and the brute-forcing process will run on that machine.
My question is: we need to be able to live-track the brute-forcing process through the locally built portal, so that when we go to the portal we can see the percentage of progress and so on.
However, I have not done anything like this before and do not know how to research it. If anyone can give me a lead or some ideas, I would appreciate it.
Thank you,
Write the progress to a file and then track each change using tail -f <filename>. You can do this remotely over SSH.
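As a minimal sketch, assuming the cracking job on the base station writes its output to a log file (the tool, paths, and hostname below are hypothetical examples, not part of your setup):

# On the base station: run the job and mirror its output to a log file.
aircrack-ng -w wordlist.txt capture.cap | tee /tmp/bruteforce.log

# From the Pi (or any machine with SSH access): follow the log live.
ssh user@base-station tail -f /tmp/bruteforce.log

The portal could then read the same log file server-side and parse out a percentage to display on the page.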

Using a CLI to recover a disk image saved with Clonezilla

I have set up a live CentOS 7 that is booted via PXE if the client is connected to a specified network port.
Once Linux is booted, a small script I wrote checks whether a newer image version is available on a central host than the one already deployed on the client. This is done by comparing the contents of a versions file. If there is a newer version, the image should be deployed to the client; otherwise, only parts of the image (qcow2 files) should be replaced, to save time.
Since the image is up to 1 TB, I do not want to apply the whole image in every case. It would also take too long.
On the client there is a volume group consisting of LVs of different sizes, as well as "normal" partitions (like /dev/sda1).
Is there a way to deploy a whole partition structure using a CLI?
I have already figured out how to recover one disk out of the whole system, but it would take a lot of scripting effort around that to get the destination structure I want.
I found out that there is no way to run Clonezilla as a CLI (which I honestly cannot understand). I tried to use parts of the Clonezilla live ISO with the command ocs-sr, but I got stuck somewhere and it always gives me an "unknown commands" error.
For my case, the best thing would be something like:
clonezilla --restore /path/to/images/folder --dest /dev
which applies all the images in the image folder generated by Clonezilla to the client.
Any help is highly appreciated.
I've found that using Clonezilla's preparation script does the trick for me. You can use the ocs_prerun parameter to run a script before Clonezilla does anything else.
If you are stuck with a company-hardened image, you can try this to set up an (Ubuntu) Linux with the needed programs on it.
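For reference, inside the Clonezilla live environment a restore can be scripted with ocs-sr along these lines (a sketch only; the image name and target disk are placeholders, and the exact options vary between Clonezilla versions, so check ocs-sr --help):

# Hypothetical: restore a saved Clonezilla image onto /dev/sda unattended.
/usr/sbin/ocs-sr -g auto -e1 auto -e2 -r -j2 -p true restoredisk my-image-20xx sda

Note that ocs-sr depends on helper scripts shipped with the live environment, which may explain the "unknown commands" errors when only parts of the ISO are used elsewhere.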

A way to convert bitrate/format of audio files (between upload & storage to S3)

Currently using PHP 5.3.x & Fedora
OK, I'll try to keep this simple. I'm working on a tool that allows uploading and storing audio files on S3 for playback. Essentially, the user uploads a file (currently only mp3 and m4a are allowed) to the server, and the file is then pushed to S3 for storage via the PHP SDK for Amazon AWS.
The missing link is that I would like to perform a simple bitrate and format conversion of the file prior to uploading it (ensuring that all files are 160 kbps .mp3).
I've looked into ffmpeg, although it seems that the PHP library only allows reading bitrates and other metadata, not actual conversion.
Does anyone have any thoughts on the best way to approach this? Would running a shell_exec() command that performs the conversion be sufficient, or is there a more efficient/better way of doing this?
Thanks in advance! Any help or advice is much appreciated.
You need to perform the conversion and the upload to S3 'outside' of the PHP application, as it'll take too long for the user to hang around on the page. This could be a simple app that uses ffmpeg from the command line.
I'm not familiar with Linux, so perhaps someone else can provide a more specific answer, but here is the basic premise:
1. User uploads a file to the server.
2. You set some kind of flag (e.g. in a database) so the user can see that the file is being processed.
3. You 'tell' your external encoder that a file needs to be processed and uploaded; you could use an entry in a database or some kind of message queue for this.
4. The encoder (possibly a command-line app that invokes ffmpeg) picks up the next file in the queue and encodes it.
5. When complete, it uploads the result to S3.
6. The flag is then updated to show that processing is complete and the file is available.
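A rough sketch of steps 4 and 5 as a shell script (the queue directory, bucket name, and use of the aws CLI are my assumptions, not part of your setup):

# Hypothetical encoder loop: convert each queued upload to a 160 kbps MP3,
# push it to S3, then clean up. Paths and bucket name are placeholders.
for f in /var/spool/audio-queue/*; do
    out="/tmp/$(basename "${f%.*}").mp3"
    ffmpeg -i "$f" -codec:a libmp3lame -b:a 160k "$out"
    aws s3 cp "$out" "s3://my-audio-bucket/" && rm "$f" "$out"
done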

Linux: What should I use to run terminal programs based on a calendar system?

Sorry about the really ambiguous question; I have no idea how to word it, but hopefully I can give you more detail here.
I am developing a project where a user can log into a website and book a server to run a game for a specific amount of time. When the time is up, the server stops running and the players on the server are kicked off. The website part is not a problem; I am doing this in PHP and everything works. It has a calendar system to book a server and can generate config files based on what the user wants.
My question is: what should I use to run the specific game server on the Linux box with those config files at the correct time? I have got this working with bash scripts and cron, but it seems very inelegant. It literally uses FTP to connect to the website so it can download all the necessary config files and put them in a folder for that game and time. I was wondering if there is a better way of doing this. Perhaps writing a program in C, but I am not sure how to go about that.
(I am not asking for someone to hold my hand and tell me "write this code here", just some ideas for a better way of approaching this problem.)
Thanks so much, guys!
Edit: The webserver is a totally different machine. I would theoretically like to have more than one game server, where each of them "connects" (at the moment over FTP) to the webserver, gets a file saying what it has to do at a specific time, downloads any associated files, and then disconnects.
I think at is better suited than cron for running one-time jobs.
For a better approach to the file downloading etc., you should give more details on your setup (e.g. are the website and the game server on the same machine? On the same network? And so on).
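For example, a one-off job can be queued with at like this (the script paths, config name, and times are hypothetical):

# Hypothetical: start the booked game server at 18:00 and stop it at 20:00.
echo "/opt/gameserver/start.sh --config booking42.cfg" | at 18:00
echo "/opt/gameserver/stop.sh booking42" | at 20:00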
You need a distributed task scheduler. With that, you can:
Schedule command "X" to be run at a certain time.
Specify the machine (or ask the scheduler to pick one from a pool of available machines).
The webserver would send a request to this scheduler via the command line or via a web service when the user selects a game server and a time.
You can have a look at: http://www.acelet.com/super/SuperWatchdog/index.html
EDIT:
One more option: http://jobscheduler.sourceforge.net/
