My app is an AR app that plays H.264 video.
We create an SKVideoNode from an AVPlayer, add the SKVideoNode to an SKScene, and then create an SCNNode whose material uses this SKScene.
- Sample Code
AVPlayer *avPlayer = [self getMoviePlayer:path];
SKVideoNode *videoNode = [[SKVideoNode alloc]initWithAVPlayer: avPlayer];
CGSize videoSize = CGSizeMake(100, 50);
videoNode.size = videoSize;
videoNode.position = CGPointMake(50, 50);
videoNode.yScale = -1.0;
SKScene *skScene = [[SKScene alloc] initWithSize:videoSize];
skScene.scaleMode = SKSceneScaleModeAspectFit;
[skScene addChild:videoNode];
SCNNode *planeNode = [[SCNNode alloc] init];
planeNode.geometry = [[SCNPlane alloc] init];
SCNMaterial *material;
material = [[SCNMaterial alloc] init];
material.diffuse.contents = skScene;
planeNode.geometry.firstMaterial = material;
========================
We use this SCNNode to play videos.
It works fine up to iOS 12.4, but when I test it on an iOS 13 beta device it shows a black screen while the sound is still heard.
I had a very similar issue. I also thought it might be an iOS 13 bug, but I couldn't reproduce the issue in a test project that isolated the particular SKVideoNode code.
It turned out the root cause of my issue was that my Info.plist file had PrefersOpenGL set to YES. Changing it to NO (or deleting the entry) fixed the problem of the video not displaying in my SKVideoNode on iOS 13. That change also appears to work fine back on iOS 12.4 and 11.4.
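If you want to verify which value actually ends up in the built app, a quick sanity check (the key name is taken from the answer above; this is just an illustrative sketch) is to log it at runtime:

// Log whether the build's Info.plist still carries PrefersOpenGL = YES
NSNumber *prefersOpenGL = [[NSBundle mainBundle] objectForInfoDictionaryKey:@"PrefersOpenGL"];
NSLog(@"PrefersOpenGL = %@ (should be NO or nil)", prefersOpenGL);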
I had the same problem. I solved it by replacing:
[myViewController.layer addSublayer:playerViewController.view.layer];
with:
[myViewController addSubview:playerViewController.view];
Hope that helps.
Related
I'm building an iOS app that recognizes a 2D image. After recognition it covers the area with a red background using this code:
SCNPlane *plane = [SCNPlane planeWithWidth:width height:height];
plane.firstMaterial.diffuse.contents = [UIColor redColor];
Now I want it to play video inside this detected area. I created AVPlayer object using:
NSURL *url = [[NSBundle mainBundle] URLForResource:@"IMG_2099" withExtension:@"MOV"];
AVPlayer *player = [AVPlayer playerWithURL:url];
and added player to material, replacing redColor with:
plane.firstMaterial.diffuse.contents = player;
I guess I need to do something else to make the AVPlayer play. What should I add to the code (Objective-C)?
You need to create a SpriteKit scene, add a video node to it, and make it the material of your SceneKit geometry.
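A minimal Objective-C sketch of that approach, reusing the url, width and height from your own code above (the SpriteKit scene size is just an illustrative texture resolution, and you need to start playback explicitly):

AVPlayer *player = [AVPlayer playerWithURL:url];
SKVideoNode *videoNode = [[SKVideoNode alloc] initWithAVPlayer:player];

// SpriteKit scene that hosts the video; its size sets the texture resolution
SKScene *videoScene = [SKScene sceneWithSize:CGSizeMake(1024, 512)];
videoNode.position = CGPointMake(videoScene.size.width / 2, videoScene.size.height / 2);
videoNode.size = videoScene.size;
videoNode.yScale = -1.0; // flip so the video isn't upside down on the material
[videoScene addChild:videoNode];

// Use the SpriteKit scene as the plane's diffuse contents
SCNPlane *plane = [SCNPlane planeWithWidth:width height:height];
plane.firstMaterial.diffuse.contents = videoScene;

[videoNode play]; // or [player play];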
I am new to this, and here comes my first problem.
I built an object loader with SceneKit. I have the path and the object loads, but I have no clue how to color the displayed object.
SCNScene *testScene = [SCNScene sceneWithURL:url options:nil error:NULL];
testScene.background.contents = [UIImage imageNamed:@"color.png"];
[self.mainView.scene.rootNode addChildNode:testScene.rootNode];
This didn't work. I also tried with:
SCNMaterialProperty *testColor = [SCNMaterialProperty materialPropertyWithContents:[UIImage imageNamed:@"color.png"]];
testScene.rootnode.geometry.materials = testcolor;
Or:
SCNMaterial *testColor = [SCNMaterial material];
testColor.diffuse.contesnts = [UIColor redColor];
testScene.rootnode.geometry.firstMaterial = testColor;
Nothing works. When I start the app, every object is displayed; so far the OBJ loader works very well, but everything is still gray. I have no clue how to color the displayed objects. Does anyone have a hint/idea/solution for me?
Btw, I want to avoid having to build the geometry from the OBJ data manually, which is why I'm trying to solve this through SCNScene.
The major problem was that importing via SCNScene does not work that way. The right solution is to import the .obj file into an SCNNode, add an SCNMaterial with the chosen color (or an image) to the SCNNode, and add the SCNNode to the SCNScene. To load the .obj file, you need to import it via the Model I/O framework.
Here is the code I used to make it colorful.
#import <SceneKit/SceneKit.h>
#import <ModelIO/ModelIO.h>
#import <SceneKit/ModelIO.h>
...
@property (nonatomic) SCNView* mainView;
....
MDLAsset *asset = [[MDLAsset alloc] initWithURL:url];
SCNScene *scene = [SCNScene scene];
SCNNode *node = [SCNNode nodeWithMDLObject:[asset objectAtIndex:0]];
SCNMaterial *material = [SCNMaterial material];
material.diffuse.contents = [UIColor colorWithHue:0 saturation:0.1 brightness:0.5 alpha:1];
node.geometry.firstMaterial = material;
[scene.rootNode addChildNode:node];
[self.mainView.scene.rootNode addChildNode:scene.rootNode];
Alternatively, you can apply a color via an image:
material.diffuse.contents = [UIImage imageNamed:@"farbe.png"];
Now you can import any .obj file externally (from any folder you like) and color it.
Thanks to SGlindemann, cashmash and Hal Mueller, who helped find this solution.
UPDATE (29.1.2017)
Somehow the approach above does not work anymore; I have not yet figured out what changed. But here is different code that makes loading 3D files possible (from the main bundle, not externally). Here I start from an SCNNode subclass which is instantiated by the ViewController.m; the SCNScene is set up in the ViewController. Below is the code I wrote for the SCNNode subclass.
Before you start, put the .obj and .mtl files (both with the same name) into your Xcode project. You don't need to convert them to a scene.
#import <ModelIO/ModelIO.h>
#import <SceneKit/ModelIO.h>
...
@property (nonatomic) SCNNode *objectNode;
...
NSString* path = [[NSBundle mainBundle]
pathForResource:[NSString stringWithFormat:@"name of the obj file"]
ofType:@"obj"];
NSURL *url = [NSURL fileURLWithPath:path];
MDLAsset *asset = [[MDLAsset alloc]initWithURL:url];
// Create the Block
self.objectNode = [SCNNode nodeWithMDLObject:[asset objectAtIndex:0]];
[self addChildNode: self.objectNode];
return self;
The returned node then has to be added to your scene:
[self.view.scene.rootNode addChildNode:returnedObj];
The MDLAsset loads the .obj file together with the corresponding .mtl file and PNG file. I used this code to load objects from MagicaVoxel (which exports obj + mtl + png at once). I have not dug deeper into it yet.
I did NOT try this code with external loading or with setting colors manually via SCNMaterial, so I can't say whether that works or not.
Your first example will set the background of the scene but do nothing with your object.
Your second example should be giving you some compiler warnings. You're assigning a single SCNMaterialProperty to testScene.rootnode.geometry.materials, which expects an array of SCNMaterial (not SCNMaterialProperty). Is that your real code?
The final example shouldn't compile at all: you've misspelled contents as contesnts. Other than that, it ought to work.
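For comparison, a corrected version of the material assignment would look something like this (modelNode here is a placeholder for whichever node actually carries the loaded geometry):

SCNMaterial *testColor = [SCNMaterial material];
testColor.diffuse.contents = [UIColor redColor];
modelNode.geometry.materials = @[testColor]; // materials expects an NSArray of SCNMaterial
// or simply: modelNode.geometry.firstMaterial = testColor;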
Note that MDLAsset can import an OBJ file and return an SCNNode. See How do you convert Wavefront OBJ file to an SCNNode with Model I/O. If the object is an asset you'll be shipping with your project, save it as an SCNScene (which is compact and optimized), and ship that, not the original OBJ.
I'm new to developing iOS apps.
I would like to use the built-in Core Image filters in my app, but I don't know how to use them. I have already created the "CISepiaTone" filter. Now I would like to create the "CIPhotoEffectTransfer" filter, but I don't know how to do this =(
Does anybody know a good tutorial, or can someone show me how to apply Core Image filters in Xcode 5?
This is the filter I want to add. Could someone give me the code for this filter?
Hope someone can help me.
Observe the following code...
//First create a CIImage
UIImageOrientation originalOrientation = origim.imageOrientation;
data3 = UIImagePNGRepresentation(origim);
CIImage *result = [CIImage imageWithData:data3];
//Create the context
CIContext *context = [CIContext contextWithOptions:nil];
//Apply the filter
CIFilter *colorControlsFilter = [CIFilter filterWithName:@"CIColorControls"];
[colorControlsFilter setDefaults];
[colorControlsFilter setValue:result forKey:@"inputImage"];
[colorControlsFilter setValue:[NSNumber numberWithFloat:slySlider1.value] forKey:@"inputSaturation"];
result = [colorControlsFilter valueForKey:@"outputImage"];
//Create CGImage with the original orientation, CIImage, and context.
CGImageRef imgRef = [context createCGImage:result fromRect:result.extent];
//Create the new UIImage from CGImage
theimage.image = [[UIImage alloc] initWithCGImage:imgRef scale:1.0 orientation:originalOrientation];
//Release CGImageRef
CGImageRelease(imgRef);
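If you specifically want the CIPhotoEffectTransfer filter mentioned in the question, it takes no parameters other than the input image, so a minimal sketch (reusing result, context, originalOrientation and theimage from above) would be:

CIFilter *transferFilter = [CIFilter filterWithName:@"CIPhotoEffectTransfer"];
[transferFilter setValue:result forKey:kCIInputImageKey];
CIImage *transferOutput = [transferFilter outputImage];
CGImageRef transferRef = [context createCGImage:transferOutput fromRect:transferOutput.extent];
theimage.image = [[UIImage alloc] initWithCGImage:transferRef scale:1.0 orientation:originalOrientation];
CGImageRelease(transferRef);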
A good tutorial: Beginning Core Image.
I have a memory leak situation with UITabBarItem. My app has a TabBarController with customized images, so here is the code I use:
UIImage *selectedImage0 = [UIImage imageNamed:@"A-click.png"];
UIImage *unselectedImage0 = [UIImage imageNamed:@"A.png"];
UIImage *selectedImage1 = [UIImage imageNamed:@"B-click.png"];
UIImage *unselectedImage1 = [UIImage imageNamed:@"B.png"];
UITabBar *tabBar = self.mytabbarcontroller.tabBar;
UITabBarItem *item0 = [tabBar.items objectAtIndex:0];
UITabBarItem *item1 = [tabBar.items objectAtIndex:1];
[item0 setFinishedSelectedImage:selectedImage0 withFinishedUnselectedImage:unselectedImage0];
[item1 setFinishedSelectedImage:selectedImage1 withFinishedUnselectedImage:unselectedImage1];
And it works totally fine!
However, when I use the Instruments tool to check for memory leaks, it shows a leak coming from:
[item0 setFinishedSelectedImage:selectedImage0 withFinishedUnselectedImage:unselectedImage0];
[item1 setFinishedSelectedImage:selectedImage1 withFinishedUnselectedImage:unselectedImage1];
If I comment out these two lines (i.e. don't use setFinishedSelectedImage), there is no leak. That seems weird and doesn't make sense to me. I already searched the documentation and reference and couldn't find any relevant information. I am using iOS 6 and Xcode 4.5. Does anyone know about this? Thanks in advance.
I am trying to add a menu to a layer in cocos2d, but it just does not appear. Here's the code, which is written in the init method of the layer:
CCMenuItem *aButton = [CCMenuItemImage itemFromNormalImage:@"btnImg.png" selectedImage:@"btnImgSel.png" target:self selector:@selector(buttonPressed:)];
aButton.position = ccp(60.0,30.0);
CCMenu *aMenu = [CCMenu menuWithItems:aButton, nil];
aMenu.position = ccp(500.0,20);
[self addChild:aMenu];
Nothing is overlapping the position I specified for the menu. Is anything wrong with the code?
Try it like this:
CCLayer *menuLayer1 = [[[CCLayer alloc] init]autorelease];
[self addChild:menuLayer1];
CCMenuItemImage *startButton1 = [CCMenuItemImage
itemFromNormalImage:@"Play.png"
selectedImage:@"Play.png"
target:self
selector:@selector(Play:)];
CCMenu *menu1 = [CCMenu menuWithItems: startButton1,nil];
menu1.position = ccp(157, 157);
[menu1 alignItemsVertically];
[menuLayer1 addChild: menu1];
For those facing the irritating situation where the code is right but the menu items are not showing: check the image file. I was using .png images and they refused to be displayed. There was something internally wrong with the file, so I replaced it and that solved the problem :)
Is the iPad your target platform? If so, the menu should appear at the bottom of the screen. To display the menu on iPhone, adjust aMenu.position so that the first argument of ccp is lower than 480.
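For example, something along these lines keeps the menu visible on an iPhone (the exact coordinates are just illustrative):

// roughly centered on a 480x320-point iPhone screen
aMenu.position = ccp(240, 160);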