Android Studio inline compiler is not showing red lamp errors

My Android Studio is not showing the red lamp when I have an error, but Gradle does report it when I compile the project. The strangest thing is that this only happens in .java files; it works fine for XML files. I have tried cleaning and rebuilding the project, restarting Android Studio, and I have checked that Power Save Mode is disabled.
My Gradle file:
sourceSets {
    main {
        manifest.srcFile 'AndroidManifest.xml'
        resources.srcDirs = ['src']
        res.srcDirs = ['res']
        assets.srcDirs = ['assets']
    }
    debug {
        java.srcDirs = ['src', 'build/generated-sources', 'src-debug']
    }
    release {
        java.srcDirs = ['src', 'build/generated-sources', 'src-release']
    }
    androidTest.setRoot('project-test')
    androidTest {
        java.srcDirs = ['project-test/src']
        resources.srcDirs = ['project-test/src']
        aidl.srcDirs = ['project-test/src']
        renderscript.srcDirs = ['project-test/src']
        res.srcDirs = ['project-test/res']
        assets.srcDirs = ['project-test/assets']
    }
}
Also, my source folder in the project is marked with a "J" inside a red circle.
Does anyone know where the problem is? Thank you!

I finally found the problem: somehow, in the Gradle file, my src folder was declared as a resources folder:
sourceSets {
    main {
        manifest.srcFile 'AndroidManifest.xml'
        resources.srcDirs = ['src']   // <-- the problematic line
        res.srcDirs = ['res']
        assets.srcDirs = ['assets']
    }
    ...
}
I changed that line to this:
sourceSets {
    main {
        manifest.srcFile 'AndroidManifest.xml'
        java.srcDirs = ['src']
        res.srcDirs = ['res']
        assets.srcDirs = ['assets']
    }
    ...
}
I hope this can be useful for other people!

Related

After updating Firebase to the latest version, a MultiDexApplication error started

Hi, I am coding with Kotlin. Everything was perfect. After adding
implementation 'com.google.firebase:firebase-auth-ktx:21.0.1'
to build.gradle, I started getting this error:
Didn't find class "com.google.firebase.provider.FirebaseInitProvider" on path: DexPathList[[zip file.................
What is the problem? I am including some of my build.gradle code below; maybe it is necessary to mention something. Thanks.
apply plugin: 'com.google.gms.google-services'
android {
    buildToolsVersion "30.0.3"
    compileSdkVersion 30
    sourceSets {
        main {
            manifest.srcFile 'AndroidManifest.xml'
            java.srcDirs = ['src']
            aidl.srcDirs = ['src']
            renderscript.srcDirs = ['src']
            res.srcDirs = ['res']
            assets.srcDirs = ['assets']
            jniLibs.srcDirs = ['libs']
        }
    }
    packagingOptions {
        exclude 'META-INF/robovm/ios/robovm.xml'
    }
    defaultConfig {
        applicationId "com.mygdx.game"
        minSdkVersion 26
        targetSdkVersion 30
        versionCode 1
        versionName "1.0"
        multiDexEnabled true
    }
}
dependencies {
    implementation 'com.android.support:multidex:1.0.3'
    implementation 'org.jetbrains.kotlinx:kotlinx-coroutines-core:1.3.9'
    implementation 'org.jetbrains.kotlinx:kotlinx-coroutines-android:1.3.9'
    implementation 'org.jetbrains.kotlinx:kotlinx-coroutines-play-services:1.1.1'
    implementation 'com.google.firebase:firebase-database:19.7.0'
    implementation 'com.google.firebase:firebase-database-ktx:19.7.0'
    implementation 'com.google.firebase:firebase-auth-ktx:21.0.1'
    // implementation 'com.google.firebase:firebase-auth:21.0.1'
}
eclipse.project.name = appName + "-android"
Try updating the multidex version.
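Since firebase-auth-ktx 21.x is an AndroidX artifact while com.android.support:multidex belongs to the old support library, one possible fix, sketched here under the assumption that the rest of the project has migrated to AndroidX, is to swap the multidex dependency:
// Hypothetical change in the dependencies block: replace the old
// support-library multidex artifact with its AndroidX equivalent.
dependencies {
    // implementation 'com.android.support:multidex:1.0.3'  // old support library
    implementation 'androidx.multidex:multidex:2.0.1'       // AndroidX version
}
Note that with minSdkVersion 26 the platform handles multidex natively, so the explicit library may not be needed at all.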

How can I play a custom sound in watchOS 3 that will play back on the watch speakers?

I've read that we can now play custom sounds on the Apple Watch in watchOS 3.
According to the announcement from Apple there apparently is a way, but I don't have an example to test it out: "3D spatial audio implemented using SCNAudioSource or SCNAudioPlayer. Instead, use playAudioSource:waitForCompletion: or the WatchKit sound or haptic APIs." Found here: https://developer.apple.com/library/prerelease/content/releasenotes/General/WhatsNewInwatchOS/Articles/watchOS3.html
Can someone post a simple example of this? I'm not using SceneKit in my app, as I don't need it, but if that's the only way to play a custom sound then I'd like to know the minimum code required to accomplish it. Preferably in Objective-C, but I'll take it in whatever shape. I'm also OK with using SpriteKit if that's easier.
Here's what I have so far, but it doesn't work:
SCNNode *audioNode = [[SCNNode alloc] init];
SCNAudioSource *audioSource = [SCNAudioSource audioSourceNamed:@"mysound.mp3"];
SCNAudioPlayer *audioPlayer = [SCNAudioPlayer audioPlayerWithSource:audioSource];
[audioNode addAudioPlayer:audioPlayer];
SCNAction *play = [SCNAction playAudioSource:audioSource waitForCompletion:YES];
[audioNode runAction:play];
I can confirm that @ApperleyA's solution really works!
Here is the Swift version:
var _audioPlayer: AVAudioPlayerNode!
var _audioEngine: AVAudioEngine!
func playAudio() {
    if _audioPlayer == nil {
        _audioPlayer = AVAudioPlayerNode()
        _audioEngine = AVAudioEngine()
        _audioEngine.attach(_audioPlayer)
        let stereoFormat = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 2)
        _audioEngine.connect(_audioPlayer, to: _audioEngine.mainMixerNode, format: stereoFormat)
        do {
            if !_audioEngine.isRunning {
                try _audioEngine.start()
            }
        } catch {}
    }
    if let path = Bundle.main.path(forResource: "test", ofType: "mp3") {
        let fileUrl = URL(fileURLWithPath: path)
        do {
            let asset = try AVAudioFile(forReading: fileUrl)
            _audioPlayer.scheduleFile(asset, at: nil, completionHandler: nil)
            _audioPlayer.play()
        } catch {
            print("asset error")
        }
    }
}
This is Objective-C, but it can be translated into Swift.
I ended up using AVAudioEngine and AVAudioPlayerNode to play audio on the Apple watch.
The gist of how to do this is as follows:
I call the following inside the init method of my AudioPlayer (it's an NSObject subclass to encapsulate the functionality)
_audioPlayer = [[AVAudioPlayerNode alloc] init];
_audioEngine = [[AVAudioEngine alloc] init];
[_audioEngine attachNode:_audioPlayer];
AVAudioFormat *stereoFormat = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:44100 channels:2];
[_audioEngine connect:_audioPlayer to:_audioEngine.mainMixerNode format:stereoFormat];
if (!_audioEngine.isRunning) {
    NSError *error;
    [_audioEngine startAndReturnError:&error];
}
I have a cache set up so I don't recreate the AVAudioFile assets every time I want to play a sound, but you don't need to.
Next, create an AVAudioFile object:
NSError *error;
NSBundle *appBundle = [NSBundle mainBundle];
NSURL *url = [NSURL fileURLWithPath:[appBundle pathForResource:key ofType:@"aifc"]];
AVAudioFile *asset = [[AVAudioFile alloc] initForReading:url error:&error];
Then play that file:
[_audioPlayer scheduleFile:asset atTime:nil completionHandler:nil];
[_audioPlayer play];
UPDATE: If the app goes to sleep or is put into the background, there is a chance the audio will stop playing or fade out. Activating an audio session prevents this.
NSError *error;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];
if (error) {
    NSLog(@"AVAudioSession setCategory ERROR: %@", error.localizedDescription);
}
[[AVAudioSession sharedInstance] setActive:YES error:&error];
if (error) {
    NSLog(@"AVAudioSession setActive ERROR: %@", error.localizedDescription);
}
I didn't go over handling any errors, but this should work. Don't forget to #import <AVFoundation/AVFoundation.h> at the top of your implementation file.
This worked for me in the simulator
let soundPath = Bundle.main.path(forResource: "cheerMP3", ofType: "mp3")
let soundPathURL = URL(fileURLWithPath: soundPath!)
let audioFile = WKAudioFileAsset(url: soundPathURL)
let audioItem = WKAudioFilePlayerItem(asset: audioFile)
let audioPlayer = WKAudioFilePlayer(playerItem: audioItem)
if audioPlayer.status == .readyToPlay {
    audioPlayer.play()
} else {
    print("Not ready!!")
}
but only if I had a breakpoint at both audioPlayer.play() and after the last }.
dunqan, what did you put at the top of the file for the import statements? I wasn't able to include
import AVFoundation
without an error using Xcode 8.2.1.

How do we remove sound from a video through SCPlayer?

I am trying to remove the audio from a video, and I am using the SCRecorder class, but the audio still plays. Is there a way to remove the audio from a video using the SCRecorder class? I tried the following code in my project:
SCRecorder *recorder = [SCRecorder recorder]; // You can also use +[SCRecorder sharedRecorder]
SCAudioConfiguration *audio = recorder.audioConfiguration;
// Whether the audio should be enabled or not
audio.enabled = NO;
[_player play];
In the SCRecorder class you need to remove or comment out this block of code:
// if (self.audioConfiguration.enabled) {
//     if (_audioOutput == nil) {
//         _audioOutput = [[AVCaptureAudioDataOutput alloc] init];
//         [_audioOutput setSampleBufferDelegate:self queue:_audioQueue];
//     }
//
//     if ([session canAddOutput:_audioOutput]) {
//         [session addOutput:_audioOutput];
//         _audioOutputAdded = YES;
//     } else {
//         audioError = [SCRecorder createError:@"Cannot add audioOutput inside the sesssion"];
//     }
// }
You can find this code in the following method:
- (void)openSession:(void(^)(NSError *sessionError, NSError *audioError, NSError *videoError, NSError *photoError))completionHandler {

GroovyFX application run with gradle

I have the following sample (from the GroovyFX site); it's a simple window.
import static groovyx.javafx.GroovyFX.start
start {
    stage(title: 'GroovyFX Hello World', visible: true) {
        scene(fill: BLACK, width: 700, height: 250) {
            hbox(padding: 60) {
                text(text: 'Groovy', font: '80pt sanserif') {
                    fill linearGradient(endX: 0, stops: [PALEGREEN, SEAGREEN])
                }
                text(text: 'FX', font: '80pt sanserif') {
                    fill linearGradient(endX: 0, stops: [CYAN, DODGERBLUE])
                    effect dropShadow(color: DODGERBLUE, radius: 25, spread: 0.25)
                }
            }
        }
    }
}
How do I run it with gradle run?
My build.gradle:
apply plugin: 'groovy'
sourceCompatibility = 1.8
targetCompatibility = 1.8
project.ext.set('javafxHome', System.env['JAVAFX_HOME'])
repositories {
    mavenCentral()
}
configurations {
    ivy
}
dependencies {
    ivy "org.apache.ivy:ivy:2.3.0"
    compile 'org.codehaus.groovy:groovy-all:2.3.6'
    compile 'org.codehaus.groovyfx:groovyfx:0.4.0'
    compile files("${javafxHome}/rt/lib/jfxrt.jar")
}
tasks.withType(GroovyCompile) {
    groovyClasspath += configurations.ivy
}
I can run it from the IDE, but how do I run it from the CLI and then build a jar with the path to the Main class?
It works in the following configuration.
Structure:
build.gradle
src/
  main/
    groovy/
      Main.groovy
build.gradle
apply plugin: 'groovy'
sourceCompatibility = 1.8
targetCompatibility = 1.8
repositories {
    mavenCentral()
}
configurations {
    ivy
}
dependencies {
    ivy "org.apache.ivy:ivy:2.3.0"
    compile 'org.codehaus.groovy:groovy-all:2.3.6'
    compile 'org.codehaus.groovyfx:groovyfx:0.4.0'
    compile files("${System.getenv('JAVA_HOME')}/jre/lib/ext/jfxrt.jar")
}
tasks.withType(GroovyCompile) {
    groovyClasspath += configurations.ivy
}
task run(type: JavaExec) {
    main = 'Main'
    classpath sourceSets.main.runtimeClasspath
}
Main.groovy
import static groovyx.javafx.GroovyFX.start
start {
    stage(title: 'GroovyFX Hello World', visible: true) {
        scene(fill: BLACK, width: 700, height: 250) {
            hbox(padding: 60) {
                text(text: 'Groovy', font: '80pt sanserif') {
                    fill linearGradient(endX: 0, stops: [PALEGREEN, SEAGREEN])
                }
                text(text: 'FX', font: '80pt sanserif') {
                    fill linearGradient(endX: 0, stops: [CYAN, DODGERBLUE])
                    effect dropShadow(color: DODGERBLUE, radius: 25, spread: 0.25)
                }
            }
        }
    }
}
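For the second half of the question, building a jar that records its main class, one possible sketch (assuming the same Main class as above) is to add a Main-Class attribute to the jar manifest in build.gradle:
// Hypothetical addition to build.gradle: record the entry point in the
// jar's manifest. Dependencies are not bundled; the Groovy and GroovyFX
// jars still have to be on the classpath when the jar is run.
jar {
    manifest {
        attributes 'Main-Class': 'Main'
    }
}
After gradle jar, the archive can be run with java -cp by listing the dependency jars alongside it, or the application plugin can be used to generate start scripts instead.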
Have you already seen this site?

Wikitude app crash (POI adding issue) in iPhone 4.0

I am currently working on an iPhone augmented reality application in which I add POIs over the camera view, but my application crashes and throws an exception (CALayer NAN 15).
The following is the code I am using:
wikitudeAR = [[WikitudeARViewController alloc] initWithDelegate:self applicationPackage:nil applicationKey:nil applicationName:nil developerName:nil];
- (void)verificationDidSucceed {
    id appDelegate = [[UIApplication sharedApplication] delegate];
    UIWindow *window = [appDelegate window];
    [window addSubview:[wikitudeAR start]];
}
- (void)verificationDidFail {
}
- (void)didUpdateToLocation:(CLLocation *)newLocation fromLocation:(CLLocation *)oldLocation {
}
- (void)APIFinishedLoading {
    // arr holds the current location data
    NSMutableArray *addPOIData = [[NSMutableArray alloc] init];
    for (int i = 0; i < [arr count]; i++) {
        NSDictionary *dict = [arr objectAtIndex:i];
        WTPoi *poi = [[WTPoi alloc] initWithName:currentMapLocation.locationTitle
                                     AndLatitude:[[dict objectForKey:@"lat"] doubleValue]
                                    AndLongitude:[[dict objectForKey:@"long"] doubleValue]];
        poi.icon = @"http://img560.imageshack.us/img560/9931/parking.png";
        poi.shortDescription = @"Open Monday to Friday 6:30 to 7pm. Tariff plan range from £5";
        poi.thumbnail = @"http://img560.imageshack.us/img560/9931/parking.png";
        [addPOIData addObject:poi];
        [poi release];
    }
    [[WikitudeARViewController sharedInstance] addPOIs:addPOIData];
    [addPOIData release];
}
Please try the new version of the Wikitude iPhone API, which should fix the described issue. You can download it from http://www.wikitude.org/developers
Cheers, Nicolas
