How to stream video capture between devices like video chat in the iPhone SDK? - iOS 4

Hi, I want to make an app which does video calling between iOS devices. I have looked at OpenTok and iDoubs, but I want to build it myself from scratch. I searched a lot but could not find any solution, so I tried to implement it the way I think video chat works. Until now I have done the following (by using a streaming Bonjour tutorial):
Created an AVCaptureSession and got the CMSampleBufferRef data in:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (captureOutput == _captureOutput) {
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // Lock the image buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0);
        // Get information about the image
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);
        // Create a CGImageRef from the CVImageBufferRef
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                        kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef newImage = CGBitmapContextCreateImage(newContext);
        // Release the context and color space
        CGContextRelease(newContext);
        CGColorSpaceRelease(colorSpace);
        previewImage = [UIImage imageWithCGImage:newImage scale:1.0 orientation:UIImageOrientationRight];
        CGImageRelease(newImage);
        [uploadImageView performSelectorOnMainThread:@selector(setImage:) withObject:previewImage waitUntilDone:YES];
        // Unlock the image buffer
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
        [pool drain];
        [self sendMixedData:@"video1"];
    }
    else if (captureOutput == _audioOutput) {
        dataA = [[NSMutableData alloc] init];
        CMBlockBufferRef blockBuffer;
        CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, NULL, &currentInputAudioBufferList,
                                                                sizeof(currentInputAudioBufferList), NULL, NULL, 0, &blockBuffer);
        for (int y = 0; y < currentInputAudioBufferList.mNumberBuffers; y++) {
            AudioBuffer audioBuffer = currentInputAudioBufferList.mBuffers[y];
            Float32 *frame = (Float32 *)audioBuffer.mData;
            [dataA appendBytes:frame length:audioBuffer.mDataByteSize];
        }
        CFRelease(blockBuffer); // the retained block buffer must be released
        [self sendMixedData:@"audio"];
    }
}
Now the sendMixedData: method writes these video/audio bytes to the NSStream:
NSData *data = UIImageJPEGRepresentation([self scaleAndRotateImage:previewImage], 1.0);
const uint8_t *message1 = (const uint8_t *)[@"video1" UTF8String];
[_outStream write:message1 maxLength:strlen((char *)message1)];
[_outStream write:(const uint8_t *)[data bytes] maxLength:[data length]];

const uint8_t *message2 = (const uint8_t *)[@"audio" UTF8String];
[_outStream write:message2 maxLength:strlen((char *)message2)];
[_outStream write:(const uint8_t *)[dataA bytes] maxLength:[dataA length]];
The bytes are then received in the NSStream delegate method on the receiving device.
Now the problem is that I don't know whether this is how video chat actually works.
Also, I have had no success displaying the received bytes as video.
I tried sending the "audio" and "video1" strings along with the bytes so the receiver knows whether the data is video or audio, and I also tried without the extra strings. The images are received and displayed correctly, but the audio is very distorted.
Please tell me whether this is the correct way to make a video chat app. If it is, what should I do to make it usable? For example, should I send the audio and video data together rather than separately as in my example? Here I am using a simple Bonjour tutorial, but how would I achieve the same thing with a real server?
Please point me in the right direction, as I am stuck here.
Thanks.
(Sorry for the formatting; I tried but was unable to format it correctly.)

Video streaming apps use video codecs like VP8 or H.264, which will compress far better than your JPEG-encoded frames.
You should be able to display your received NSData by doing:
UIImage *image = [UIImage imageWithData:data];
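One more thing to watch: NSStream writes are not message-oriented, so your "video1"/"audio" markers can arrive split across reads or glued to the payload, which may be part of why the audio arrives distorted. A minimal sketch of length-prefixed framing (the packet layout and helpers below are illustrative, not taken from your code) would let the receiver split the stream back into whole JPEG frames and audio chunks:

// Illustrative framing sketch (not from the original code):
// each packet is [1-byte type][4-byte big-endian length][payload].
enum { kPacketTypeVideo = 1, kPacketTypeAudio = 2 };

static void WritePacket(NSOutputStream *stream, uint8_t type, NSData *payload)
{
    uint8_t header[5];
    header[0] = type;
    uint32_t bigEndianLength = CFSwapInt32HostToBig((uint32_t)[payload length]);
    memcpy(&header[1], &bigEndianLength, sizeof(bigEndianLength));
    [stream write:header maxLength:sizeof(header)];

    // NSOutputStream may accept fewer bytes than requested, so loop.
    NSUInteger written = 0;
    while (written < [payload length]) {
        NSInteger n = [stream write:(const uint8_t *)[payload bytes] + written
                          maxLength:[payload length] - written];
        if (n <= 0) break; // handle the error properly in real code
        written += (NSUInteger)n;
    }
}

// On the receiving side, append every chunk read from the input stream to an
// NSMutableData and peel off complete packets as they become available.
- (void)consumeReceivedBuffer:(NSMutableData *)buffer
{
    while ([buffer length] >= 5) {
        const uint8_t *bytes = [buffer bytes];
        uint8_t type = bytes[0];
        uint32_t length;
        memcpy(&length, bytes + 1, sizeof(length));
        length = CFSwapInt32BigToHost(length);
        if ([buffer length] < 5 + length) return; // wait for the rest of the packet

        NSData *payload = [buffer subdataWithRange:NSMakeRange(5, length)];
        [buffer replaceBytesInRange:NSMakeRange(0, 5 + length) withBytes:NULL length:0];

        if (type == kPacketTypeVideo) {
            UIImage *frame = [UIImage imageWithData:payload]; // one JPEG frame
            [uploadImageView performSelectorOnMainThread:@selector(setImage:)
                                              withObject:frame
                                           waitUntilDone:NO];
        } else if (type == kPacketTypeAudio) {
            // raw Float32 PCM; feed it to an Audio Queue or Audio Unit for playback
        }
    }
}

Even with framing, raw PCM over the network is heavy; for a real app the usual route is to compress video with H.264 and audio with AAC (or use an RTP/WebRTC-style stack) rather than hand-rolled streams.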

Related

Image getting zoomed and cut off when converting from binary data

I have some background images stored as binary data. I need to draw something on them based on data and then show the result as a single image in the browser. The issue is that some of the images get zoomed in and some get zoomed out when I try to do this with the following code. Can anyone tell me where I am going wrong?
int imageWidth = 0, imageHeight = 0;
Image bmpImg;
if (datatable.Rows.Count > 0)
{
    bmpImg = Bitmap.FromStream(new MemoryStream((byte[])datatable.Rows[0]["data"]));
    imageWidth = bmpImg.Width;
    imageHeight = bmpImg.Height;
}
else
{
    bmpImg = null;
}
bitmap = new Bitmap(imageWidth, imageHeight);
//bitmap = new Bitmap(1000, 800);
renderer = SvgRenderer.FromImage(bitmap);
graphics = Graphics.FromImage(bitmap);
if (bmpImg != null)
{
    graphics.DrawImage(bmpImg, 0, 0);
}
// perform other drawings using graphics
MemoryStream ms = new MemoryStream();
bitmap.Save(ms, ImageFormat.Png);
bitmap.Dispose();
renderer.Dispose();
After some searching, I got the answer from another question: Graphics.DrawImage unexpectedly resizing image.
Basically you have to change this:
graphics.DrawImage(bmpImg, 0, 0);
to
graphics.DrawImage(bmpImg, new Rectangle(0, 0, imageWidth, imageHeight));
The x/y overload draws the image at a size derived from its own DPI, so images saved at a different resolution get scaled; the Rectangle overload forces the draw to the exact pixel dimensions of the destination bitmap.

UIGraphicsBeginImageContext created image stays in memory forever

I'm creating many colour variations of an image using Core Graphics. I need to create about 320 in total, but gradually the memory usage of the app keeps increasing until it crashes. In Instruments I can see more and more CGImage objects being created and staying alive. I want them released, because I store the images to the cache directory in PNG format.
I have searched through all the other solutions I could find, without success. Any help is appreciated. Thanks.
Here's the main part:
+ (UIImage *)tintedImageFromImage:(UIImage *)sourceImage colour:(UIColor *)color intensity:(float)intensity {
    if (UIGraphicsBeginImageContextWithOptions != NULL) {
        UIGraphicsBeginImageContextWithOptions(sourceImage.size, NO, 0.0);
    } else {
        UIGraphicsBeginImageContext(sourceImage.size);
    }
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGRect rect = CGRectMake(0, 0, sourceImage.size.width, sourceImage.size.height);

    // draw alpha-mask
    CGContextSetBlendMode(context, kCGBlendModeNormal);
    CGContextDrawImage(context, rect, sourceImage.CGImage);

    // draw tint color, preserving alpha values of original image
    CGContextSetBlendMode(context, kCGBlendModeSourceIn);
    [color setFill];
    CGContextFillRect(context, rect);

    // Set the original greyscale template as the overlay of the new image
    sourceImage = [self verticallyFlipImage:sourceImage];
    [sourceImage drawInRect:CGRectMake(0, 0, sourceImage.size.width, sourceImage.size.height) blendMode:kCGBlendModeMultiply alpha:intensity];

    UIImage *colouredImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    colouredImage = [self verticallyFlipImage:colouredImage];
    return colouredImage;
}
This is used to flip the image:
+ (UIImage *)verticallyFlipImage:(UIImage *)originalImage {
    UIImageView *tempImageView = [[UIImageView alloc] initWithImage:originalImage];
    UIGraphicsBeginImageContext(tempImageView.frame.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, tempImageView.frame.size.height);
    CGContextConcatCTM(context, flipVertical);
    [tempImageView.layer renderInContext:context];
    UIImage *flippedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return flippedImage;
}
Add the line tempImageView.image = nil; to verticallyFlipImage so that the image view releases the image it is holding. This solves the problem.
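For clarity, this is roughly what the helper looks like with that line added (just the suggested fix applied to the method above):

+ (UIImage *)verticallyFlipImage:(UIImage *)originalImage {
    UIImageView *tempImageView = [[UIImageView alloc] initWithImage:originalImage];
    UIGraphicsBeginImageContext(tempImageView.frame.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, tempImageView.frame.size.height);
    CGContextConcatCTM(context, flipVertical);
    [tempImageView.layer renderInContext:context];
    UIImage *flippedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    tempImageView.image = nil; // let the image view release the image it is holding
    return flippedImage;
}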

Multiple UI threads in Cocoa

I have a Mac OS X server application that renders NSViews and returns them over an HTTP interface as images for use elsewhere. There's no visible UI, and the application creates detached NSViews without an NSWindow.
The application can receive many requests at once, but the layout and rendering process is synchronized around the main thread (using dispatch_sync in GCD) as Cocoa UI isn't thread safe, reducing the throughput to a single request at a time in that portion of the code.
Given that each request is entirely separate, with nothing shared between them, is there a way for a Cocoa application to effectively run multiple, entirely separate UI threads? Perhaps using multiple run loops?
I'd like to avoid having to run multiple processes, if possible.
It's hard to say with certainty that this will work for your specific needs (since your specific needs may have main-thread dependencies not called out in your question) but I don't see anything particularly controversial here. For instance, the following code works just fine without incident:
CGImageRef CreateImageFromView(NSView *view)
{
    const CGSize contextSize = CGSizeMake(ceil(view.frame.size.width), ceil(view.frame.size.height));
    const size_t width = contextSize.width;
    const size_t height = contextSize.height;
    const size_t bytesPerPixel = 32;
    const size_t bitmapBytesPerRow = 64 * ((width * bytesPerPixel + 63) / 64); // Alignment

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, bitmapBytesPerRow, colorSpace, kCGBitmapByteOrder32Host | kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);

    [view displayRectIgnoringOpacity:view.bounds inContext:[NSGraphicsContext graphicsContextWithGraphicsPort:context flipped:YES]];

    CGImageRef image = CGBitmapContextCreateImage(context);
    CGContextRelease(context);
    return image;
}

- (IBAction)doStuff:(id)sender
{
    static NSUInteger count = 0;
    for (NSUInteger i = 0; i < 100; ++i)
    {
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            NSButton *button = [[[NSButton alloc] initWithFrame:NSMakeRect(0, 0, 200, 100)] autorelease];
            button.title = [NSString stringWithFormat:@"Done Stuff %lu Times", (unsigned long)count++];

            CGImageRef image = CreateImageFromView(button);
            NSImage *nsImage = [[[NSImage alloc] initWithCGImage:image size:NSMakeSize(CGImageGetWidth(image), CGImageGetHeight(image))] autorelease];
            CGImageRelease(image);

            dispatch_async(dispatch_get_main_queue(), ^{
                self.imageView.image = nsImage;
            });
        });
    }
}
The key here is that everything be "private" to the background rendering task. It gets its own view, its own graphics context, etc. If you aren't sharing anything, this should be OK. Since you explicitly said, "Given that each request is entirely separate, with nothing shared between them", I suspect you've already satisfied this condition.
Try it out. Leave a comment if you run into trouble.
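Since the original goal is to return these renders over HTTP, the remaining piece is encoding the CGImageRef to PNG bytes off the main thread. This is a sketch I'm adding for illustration (it assumes the ImageIO framework is linked; it is not part of the answer above):

#import <ImageIO/ImageIO.h>
#import <CoreServices/CoreServices.h> // kUTTypePNG

// Encode a CGImage to PNG data suitable for an HTTP response body.
static NSData *PNGDataFromImage(CGImageRef image)
{
    NSMutableData *pngData = [NSMutableData data];
    CGImageDestinationRef destination =
        CGImageDestinationCreateWithData((CFMutableDataRef)pngData, kUTTypePNG, 1, NULL);
    if (destination == NULL) {
        return nil;
    }
    CGImageDestinationAddImage(destination, image, NULL);
    bool finalized = CGImageDestinationFinalize(destination);
    CFRelease(destination);
    return finalized ? pngData : nil;
}

The returned NSData can then be handed straight to whatever HTTP layer is serving the request, still entirely off the main thread.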

Cannot set thumbnail image for UIWebView

I want to set a thumbnail image in the UIWebView frame on my view.
However, when I use the following code, no image is displayed, although the image value is NOT NULL. Can someone help me and let me know where I am going wrong?
Please note that in the code, the VideoView class is derived from UIWebView and works just like it.
video = [[VideoView alloc] initWithStringAsURL:@"http://www.youtube.com/watch?v=T6X3j7zxUHA"
                                         frame:CGRectMake(11, 7, 298, 311)];
[self addSubview:video];

UIGraphicsBeginImageContext(video.bounds.size);
[video.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *resultImageView = UIGraphicsGetImageFromCurrentImageContext();

thumbnailButton = [UIButton buttonWithType:UIButtonTypeCustom];
[thumbnailButton setFrame:CGRectMake(0, 0, 298, 311)];
[thumbnailButton setImage:resultImageView forState:UIControlStateNormal];
[video addSubview:thumbnailButton];
UIGraphicsEndImageContext();
I would not recommend using a UIWebView just to get a thumbnail of a YouTube video, but here is some code that I use to embed YouTube videos in my app.
static NSString *EmbedHTML = @"<html>\
<head>\
<meta name=\"viewport\" content=\"initial-scale=1.0, user-scalable=no, width=%0.0f\"/>\
</head>\
<iframe width=\"%0.0f\" height=\"%0.0f\" src=\"%@\" frameborder=\"0\" allowfullscreen></iframe></html>";
Use that to embed the video and change your youtube URL from http://www.youtube.com/watch?v=T6X3j7zxUHA to http://www.youtube.com/embed/T6X3j7zxUHA.
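For example, formatting that template and loading it into the web view could look something like this (a sketch; webView stands in for whatever UIWebView or VideoView instance you are using):

// Format the EmbedHTML template for the web view's size and load it.
CGRect frame = webView.frame; // placeholder: your UIWebView / VideoView instance
NSString *embedURL = @"http://www.youtube.com/embed/T6X3j7zxUHA";
NSString *html = [NSString stringWithFormat:EmbedHTML,
                  frame.size.width,   // viewport width
                  frame.size.width,   // iframe width
                  frame.size.height,  // iframe height
                  embedURL];
[webView loadHTMLString:html baseURL:nil];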
Another way to get this, without the web view, is to just build the thumbnail URL by hand (example below).
///////////////////////////////////////////////////////////////////////////////////////////////////
- (NSString *)youtubeThumb:(NSString *)url
{
    NSRange end = [url rangeOfString:@"?"];
    if (end.location != NSNotFound)
    {
        NSRange start = [url rangeOfString:@"/" options:NSBackwardsSearch range:NSMakeRange(0, end.location)];
        if (start.location != NSNotFound)
        {
            NSString *video_id = [url substringWithRange:NSMakeRange(start.location + 1, (end.location - 1) - start.location)];
            return [NSString stringWithFormat:@"http://i.ytimg.com/vi/%@/2.jpg", video_id];
        }
    }
    return nil;
}
Now you have the image thumbnail URL, just fetch it (via a simple GET request).
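A quick (synchronous) way to do that fetch and drop the result onto the button could look like the snippet below; in a real app you would load it asynchronously so the UI doesn't block:

// Build the thumbnail URL from the watch URL and fetch the JPEG.
NSString *thumbURL = [self youtubeThumb:@"http://www.youtube.com/watch?v=T6X3j7zxUHA"];
NSData *thumbData = [NSData dataWithContentsOfURL:[NSURL URLWithString:thumbURL]]; // blocking GET
UIImage *thumbImage = [UIImage imageWithData:thumbData];
[thumbnailButton setImage:thumbImage forState:UIControlStateNormal];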

Is there a way to record an audio stream using Matt Gallagher's audio streamer?

I use Matt Gallagher's audio streamer for streaming radio stations, but how do I record the audio? Is there a way to get the downloaded packets into NSData and save them to an audio file in the documents folder on the iPhone?
Thanks
Yes, there is, and I have done it. My problem is playing it back IN the same streamer (asked elsewhere); it plays back fine with the standard AVAudioPlayer in iOS. The approach below saves the data to a file by writing it out in the streamer code.
This example is missing some error checks, but it will give you the main idea.
First, a call from the main thread to start and stop recording. This is in my view controller and runs when someone presses record:
//---------------------------------------------------------
// Record button was pressed (toggle on/off)
// writes a file to the documents directory using date and time for the name
//---------------------------------------------------------
- (IBAction)recordButton:(id)sender {
    // only start if the streamer is playing (self.streamer is my streamer instance)
    if ([self.streamer isPlaying]) {
        NSDate *currentDateTime = [NSDate date]; // get current date and time
        NSDateFormatter *dateFormatter = [[[NSDateFormatter alloc] init] autorelease];
        [dateFormatter setDateFormat:@"EEEE MMMM d YYYY 'at' HH:mm:ss"];
        NSString *dateString = [dateFormatter stringFromDate:currentDateTime];
        self.isRecording = !self.isRecording; // toggle recording state BOOL
        if (self.isRecording)
        {
            // start recording here
            // change the record button to show it is recording - this is an IBOutlet
            [self.recordButtonImage setImage:[UIImage imageNamed:@"Record2.png"] forState:UIControlStateNormal];
            // call AudioStreamer to start recording. It returns the file path back
            self.recordFilePath = [self.streamer recordStream:TRUE fileName:dateString]; // start file stream and get file path
        } else
        {
            // stop recording here
            // change the button back
            [self.recordButtonImage setImage:[UIImage imageNamed:@"Record.png"] forState:UIControlStateNormal];
            // call streamer code, stop the recording. Also returns the file path again.
            self.recordFilePath = [self.streamer recordStream:FALSE fileName:nil]; // stop stream and get file path
            // add to "recorded files" for selecting a recorded file later.
            // first, add channel, date, time
            dateString = [NSString stringWithFormat:@"%@ Recorded on %@", self.model.stationName, dateString]; // used to identify the item in a list later
            // the dictionary will be used to hold the data on this recording for display elsewhere
            NSDictionary *row1 = [[[NSDictionary alloc] initWithObjectsAndKeys:self.recordFilePath, @"path", dateString, @"dateTime", nil] autorelease];
            // save the stream info in an array of recorded streams
            if (self.model.recordedStreamsArray == nil) {
                self.model.recordedStreamsArray = [[NSMutableArray alloc] init]; // init the array
            }
            [self.model.recordedStreamsArray addObject:row1]; // dict for this recording
        }
    }
}
NOW, in AudioStreamer.m, I need to handle the recordStream: call from above:
- (NSString *)recordStream:(BOOL)record fileName:(NSString *)fileName
{
    // this will start/stop recording, and return the file path
    if (record) {
        if (state == AS_PLAYING)
        {
            // now open a file to save the data into
            NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
            NSString *documentsDirectory = [paths objectAtIndex:0];
            // will call this an mp3 file for now (this may need to change)
            NSMutableString *temp = [NSMutableString stringWithString:[documentsDirectory stringByAppendingFormat:@"/%@.mp3", fileName]];
            // remove the ':' in the time string, and create a file name w/ time & date
            [temp replaceOccurrencesOfString:@":" withString:@"" options:NSLiteralSearch range:NSMakeRange(0, [temp length])];
            self.filePath = temp; // file name is date/time generated
            NSLog(@"Stream Save File Open = %@", self.filePath);
            // open the recording file stream output
            self.fileStream = [NSOutputStream outputStreamToFileAtPath:self.filePath append:NO];
            [self.fileStream open];
            NSLog(@"recording to %@", self.fileStream);
            self.isRecording = TRUE;
            return (self.filePath); // if started, send back the file path
        }
        return (nil); // if not started, return nil for error checking
    } else {
        // we are done, close the stream.
        if (self.fileStream != nil) {
            [self.fileStream close];
            self.fileStream = nil;
        }
        NSLog(@"stop recording");
        self.isRecording = FALSE;
        return (self.filePath); // when stopping, return the file path
    }
}
LASTLY, we need to modify the data portion of the streamer so it actually saves the bytes. You need to modify the stream-handling code in the method -(void)handleReadFromStream:(CFReadStreamRef)aStream eventType:(CFStreamEventType)eventType.
Scroll down in that method until you find:
@synchronized(self)
{
    if ([self isFinishing] || !CFReadStreamHasBytesAvailable(stream))
    {
        return;
    }

    //
    // Read the bytes from the stream
    //
    length = CFReadStreamRead(stream, bytes, kAQDefaultBufSize);

    if (length == -1)
    {
        [self failWithErrorCode:AS_AUDIO_DATA_NOT_FOUND];
        return;
    }
RIGHT after the length = line, add the following code:
//
// if recording, save the raw data to a file
//
if (self.isRecording && length != 0) {
    //
    // write the data to a file
    //
    NSInteger bytesWritten;
    NSInteger bytesWrittenSoFar = 0;
    do {
        bytesWritten = [self.fileStream write:&bytes[bytesWrittenSoFar] maxLength:length - bytesWrittenSoFar];
        NSLog(@"bytesWritten = %ld", (long)bytesWritten);
        if (bytesWritten == -1) {
            [self.fileStream close];
            self.fileStream = nil;
            NSLog(@"File write error");
            break;
        } else {
            bytesWrittenSoFar += bytesWritten;
        }
    } while (bytesWrittenSoFar != length);
}
Here are the .h declarations:
Added to the interface for AudioStreamer.h
// for recording and saving a stream
NSString* filePath;
NSOutputStream* fileStream;
BOOL isRecording;
BOOL isPlayingFile;
In your view controller you will need:
@property (nonatomic, assign) IBOutlet UIButton *recordButtonImage;
@property (nonatomic, assign) BOOL isRecording;
@property (nonatomic, copy) NSString *recordFilePath;
Hope this helps someone. Let me know if you have questions; I'm always happy to hear from someone who can improve this.
Also, someone asked about self.model.xxx. Model is a data object I created so I can easily pass around data that is used and modified by more than one object. I know global data is bad form, but there are times it just makes things easier. I pass the data model to each new object when it is created. I keep an array of channels, the song name, the artist name, and other stream-related data in the model. I also put any data I want to persist across launches there, like settings, and write the model to a file each time a piece of persistent data changes. In this example you can keep the data locally. If you need help on passing the model around, let me know.
OK, here is how I play back the recorded file. When playing a file, the station URL contains the path to the file. self.model.playRecordedSong contains a time value for how many seconds into the stream I want to start playing. I keep a dictionary of song names and time indexes, so I can jump into the recorded stream at the start of any song. Use 0 to start from the beginning.
NSError *error;
NSURL *url = [NSURL fileURLWithPath:[NSString stringWithFormat:self.model.stationURL, [[NSBundle mainBundle] resourcePath]]];

// get the file URL and then create an audio player if we don't already have one.
if (audioPlayer == nil) {
    // set the seconds count to the proper start point (0, or some time into the stream)
    // this will be 0 for start of stream, or some value passed back if they picked a song.
    self.recordPlaySecondsCount = self.model.playRecordedSong;
    // create a new player
    audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
    // set self as delegate so we can catch the end of the file.
    [audioPlayer setDelegate:self];
    // the audio player needs an NSTimeInterval. Get it from the seconds start point.
    NSTimeInterval interval = self.model.playRecordedSong;
    // seek to the proper place in the file.
    audioPlayer.currentTime = interval;
}
audioPlayer.numberOfLoops = 0; // do not repeat

if (audioPlayer == nil) {
    NSLog(@"AVAudioPlayer error: %@", error);
    // I need to do more on the error of no player
} else {
    [audioPlayer play];
}
I hope this helps you play back the recorded file.
Try this class; it has a full solution for radio streaming, recording, and playback. You can find it on GitHub, and it is very easy to use.
