I have a Perforce stream.
I would like to archive this stream and give it to someone else (a client) at the end of a project, so that they can import the stream into their own Perforce server.
Can this be done? Can streams be exported from one serverA/depot1 and then imported into, say, SomeOtherServer/SomeOtherDepot?
I bet you could use remote depots.
First, set up Server1's Depot1 as RemoteDepot1 in Server2, so that Server2 can pull files from Server1.
Then, in Server2, create a stream depot Depot2 with a mainline stream Stream2. Then, while still in Server2, run:
p4 populate //RemoteDepot1/Stream1/... //Depot2/Stream2/...
This seeds Server2's //Depot2/Stream2 with the files from Server1's //Depot1/Stream1.
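For reference, a minimal sketch of what the remote depot spec on Server2 might look like (edited via p4 depot RemoteDepot1), assuming Server1 is reachable at p4server1:1666 (a hypothetical address):

Depot:    RemoteDepot1
Owner:    admin
Type:     remote
# Address is Server1's P4PORT (hypothetical value here)
Address:  p4server1:1666
# Map exposes Server1's Depot1 through this remote depot
Map:      //Depot1/...

Once the spec is saved, //RemoteDepot1/... on Server2 is a read-only window onto Server1's //Depot1/..., which is what the populate command above copies from.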
It looks like simulcasting a single user's RTMP stream is as simple as creating an RTMP server configuration in my nginx.conf file and using the push directive to push the stream to the various social media RTMP URLs. But what would be the best way to do this if I have multiple users, each needing to stream to their own social media live accounts?
Here are the possible solutions I can think of:
Use Docker to create a container with Nginx RTMP installed for each individual who signs up. I could then edit and manage a separate RTMP server configuration per user and reload it so they can each begin streaming. This sounds like a lot of overhead, though.
If it's possible, I could set up multiple RTMP server configs, one per user, in a single environment (sites-enabled) and reload the config without NGINX going down. But using different ports doesn't seem ideal, and I worry that if something happens while the server is reloading the config, every individual who is streaming could drop their connection. Edit: sites-enabled seems to be out of the question, since the RTMP block has to live in the root context (nginx.conf only), per https://github.com/arut/nginx-rtmp-module/issues/1492
Map each user's stream key to their own RTMP push directives and then forward to that user's social media? (See the sketch after the example config below.)
Any thoughts?
Here's my example single server configuration:
rtmp {
    server {
        listen 1935;
        chunk_size 4096;

        application live {
            live on;
            record off;
            push rtmp://facebook/STREAM KEY HERE;
            push rtmp://youtube/STREAM KEY HERE;
        }
    }
}
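For what it's worth, the third option can be done on a single port: one rtmp server block can contain one application block per user, each with its own push targets. A minimal sketch, assuming two hypothetical users (alice and bob) with placeholder stream keys:

rtmp {
    server {
        listen 1935;
        chunk_size 4096;

        # alice publishes to rtmp://yourhost/alice
        application alice {
            live on;
            record off;
            push rtmp://facebook/ALICE_FACEBOOK_KEY;
            push rtmp://youtube/ALICE_YOUTUBE_KEY;
        }

        # bob publishes to rtmp://yourhost/bob
        application bob {
            live on;
            record off;
            push rtmp://facebook/BOB_FACEBOOK_KEY;
            push rtmp://youtube/BOB_YOUTUBE_KEY;
        }
    }
}

Adding a user still means regenerating nginx.conf and running nginx -s reload, but a reload is graceful: old worker processes keep serving their existing connections while new workers pick up the new config, so in-progress streams should not be dropped.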
I'm new to RTMP if you haven't picked that up yet :P
Check out the following project on GitHub; it does exactly what you need:
https://github.com/jprjr/multistreamer
Note: the project is now archived.
I'm making a Node.js application with a MySQL database. Some tables have string fields that represent filesystem paths, for example: "C:\Users\steve\Desktop\my-nodejs-app\files\Apple.jpg". When a client connects to the server for the first time, it automatically downloads all the files at those paths. Every client can add records and send files to the server.
My question is:
Suppose two clients use the app regularly and client1 adds a new file to the files folder. Client1's folder now has all the files up to date, but when client2 next connects to the app, it has all the old files and is missing the newly added one.
How can I fix this? How can I make the app download only the files a client doesn't have, rather than downloading all the files every time it connects to the server?
I'd suggest having the client request a list of all files on "connect", then loop through them to see if they exist locally. If they don't, request/download them from the server.
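A minimal sketch of that approach on the client side, assuming hypothetical endpoints GET /files (a JSON array of file names) and GET /files/:name (the file contents), and Node 18+ for the built-in fetch:

const fs = require('fs');
const path = require('path');

const FILES_DIR = path.join(__dirname, 'files'); // assumes this folder exists

async function syncFiles(serverUrl) {
    // 1. Ask the server which files it has.
    const names = await (await fetch(`${serverUrl}/files`)).json();

    for (const name of names) {
        const localPath = path.join(FILES_DIR, name);
        if (fs.existsSync(localPath)) continue; // already downloaded, skip

        // 2. Fetch only the missing file and save it locally.
        const res = await fetch(`${serverUrl}/files/${encodeURIComponent(name)}`);
        fs.writeFileSync(localPath, Buffer.from(await res.arrayBuffer()));
    }
}

syncFiles('http://localhost:3000').catch(console.error);

If files can change as well as appear, comparing a hash or last-modified timestamp from the list, rather than just checking existence, would catch updates too.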
I'm testing out Postgres's binary capabilities by storing some MP3 data in a table. I've read that you're supposed to store such files in an external filesystem like S3, but for various reasons I don't want to do that right now.
So, for now, I'd like to test storing files in the DB. The MP3 files are TTS files from a third party, and I've stored them in a Postgres table. This is working OK. But how do I serve them to the client? In other words:
The client requests the files over HTTP.
Node requests the records (one or many) from the DB via pg-promise.
The data arrives from the DB to Node in binary format.
??? Do I have to convert it to an MP3 file before sending? Can I send the binary data directly? Which would be better?
The client receives the file(s).
The client queues the files in order for audio playback.
My main question is whether I need to convert the binary record I receive from Postgres before sending it, and if so, how.
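For what it's worth, pg-promise (via node-postgres) hands a bytea column back as a Node Buffer, and a Buffer can be sent as an HTTP response body as-is; there is no need to write an .mp3 file to disk first. A minimal sketch, assuming an Express route and a hypothetical table tts_audio(id, data bytea):

const express = require('express');
const pgp = require('pg-promise')();

const db = pgp('postgres://user:pass@localhost:5432/mydb'); // hypothetical connection string
const app = express();

app.get('/audio/:id', async (req, res) => {
    try {
        // The bytea column arrives as a Buffer.
        const row = await db.one('SELECT data FROM tts_audio WHERE id = $1', [req.params.id]);
        res.type('audio/mpeg'); // tell the client it is MP3 data
        res.send(row.data);     // send the Buffer directly
    } catch (err) {
        res.sendStatus(404);    // no matching row
    }
});

app.listen(3000);

The Content-Type header is what matters to the client; the bytes themselves are already a valid MP3 stream.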
I'm working on a REST server script written in NodeJS. This script lets users make POST requests to upload files. My server is not responsible for storing the files; that is taken care of by another remote server, which is also a REST server. So I'd like to forward/redirect the file upload stream to that remote server. What is the best approach to do that?
Should I buffer the stream and wait until I receive the file in its entirety before streaming it to the remote server? Or
Should I just pipe the chunks to the remote server as I receive them? I tried piping in my upload route with this statement:
req.pipe(request.post(connection).pipe(res))
but I received an error from the remote server: "Connection has been cancelled by the client". Or
Is it possible to redirect the incoming upload stream to the remote server so that my script isn't a middleman at all?
Thanks
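For what it's worth, option 2 (piping the chunks through) is the canonical pattern, and the statement above may have failed simply because of the parenthesis placement: req should be piped into request.post(...), and the returned request piped into res. A minimal sketch, assuming Express and a hypothetical remote URL (note the request library is deprecated, but it is what the question uses):

const express = require('express');
const request = require('request');

const app = express();
const REMOTE = 'https://files.example.com/upload'; // hypothetical remote REST server

// No body-parsing middleware on this route, so req is still a raw stream.
app.post('/upload', (req, res) => {
    // Stream the incoming body straight into the outgoing POST,
    // then stream the remote server's response back to the client.
    req.pipe(request.post(REMOTE)).pipe(res);
});

app.listen(3000);

A true redirect (option 3) would require the remote server to accept uploads directly from clients, for example via a 307 redirect or a pre-signed upload URL, which depends on what the remote API supports.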
I have implemented a read/write stream that reads a buffer, manipulates the data (for example, adding headers and footers while creating the output file), and writes it to a file. What should I do to write to a file in a remote location instead of writing locally, given that I have only FTP access to the remote server?
I wrote a client using POCO to transfer the file to the FTP server, but that is a two-step process. How can I implement a solution that writes directly to the FTP server? I can't work out how to connect a source stream (which is actually a ReadFile call) to the FTP network stream.
Thanks.
You need an FTP client library that you can call directly from your app, to avoid the need to write the file to disk and then send it via a separate process.
This earlier question has some useful info.
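In POCO's case this is directly supported: Poco::Net::FTPClientSession::beginUpload() opens the FTP data connection and returns a std::ostream, so headers, transformed data, and footers can be written straight to the server without an intermediate file. A minimal sketch, with hypothetical host, credentials, and paths (error handling omitted):

#include <fstream>
#include <ostream>
#include "Poco/Net/FTPClientSession.h"

int main() {
    Poco::Net::FTPClientSession ftp("ftp.example.com"); // hypothetical host
    ftp.login("user", "password");                      // hypothetical credentials

    // beginUpload() returns an ostream connected to the FTP data channel.
    std::ostream& remote = ftp.beginUpload("/remote/output.dat");

    std::ifstream src("input.dat", std::ios::binary);
    remote << "HEADER\n";   // write the generated header
    remote << src.rdbuf();  // stream the source data (transform here as needed)
    remote << "\nFOOTER\n"; // write the generated footer

    ftp.endUpload(); // completes the transfer and closes the data connection
    ftp.close();
    return 0;
}

The same ostream can be handed to whatever code currently writes the local output file, so the existing read/manipulate logic does not need to change.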