I need to test two apps in the same test case in Appium (Android).
For example: write a test case that publishes an ad in app A and then verifies the ad in a second app B. Also, can I run this on an emulator, or do I have to use a real device?
After much research, I found a solution. The simple way to do this is to initialize a new driver session with the selected app; this closes the first app and opens the new one.
Example code:
return driver
  .elementById('username')
  .click()
  .init(Common.SelectApp(apps.AppName)) // re-initialize the session with the second app's capabilities (see SelectApp below)
  .setImplicitWaitTimeout(10000)        // give the new app time to come up
  .elementById('username')
  .click();
// SelectApp function: clone the base desired capabilities and point them at the chosen app
function SelectApp(appName) {
  var desired = process.env.npm_package_config_sauce ?
    _.clone(require("./helpers/caps").android18) :
    _.clone(require("./helpers/caps").android19);
  desired.app = appName;
  return desired;
}
This code is based on the Appium examples (Node).
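For comparison, here is a minimal sketch of the same two-app flow using the Appium Python client (v2.x-style capabilities; this is not from the original example, and all package names, activities, and element IDs are placeholders). Both sessions can run on an emulator just as well as on a real device, as long as both apps are installed on it.

from appium import webdriver

SERVER = "http://127.0.0.1:4723/wd/hub"

def caps_for(package, activity):
    return {
        "platformName": "Android",
        "appium:automationName": "UiAutomator2",
        "appium:appPackage": package,
        "appium:appActivity": activity,
        "appium:noReset": True,  # keep app data so the published ad survives
    }

# Session 1: publish the ad in app A.
driver = webdriver.Remote(SERVER, caps_for("com.example.appA", ".MainActivity"))
driver.implicitly_wait(10)
driver.find_element("id", "publish_button").click()
driver.quit()

# Session 2: verify the ad in app B (starting a new session closes app A,
# just like .init() in the Node example above).
driver = webdriver.Remote(SERVER, caps_for("com.example.appB", ".MainActivity"))
driver.implicitly_wait(10)
assert driver.find_element("id", "ad_title").is_displayed()
driver.quit()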
I'm trying to make my own 3D viewer app.
I just want it to work like this:
Open the app (tap on my smartphone screen),
show a 3D model,
close the app when the back button is tapped.
I actually made an app (with Kotlin in Android Studio) that works like this.
The app works well with a file that is located on a server.
But I want the app to work without an internet connection.
How do I import a file (glTF format) into the app?
In this part:
.appendQueryParameter("file", "PATH/XXXX.gltf")
If the PATH starts with "https://", it works with an internet connection.
But how can I point the PATH at local storage so the app works without an internet connection?
Here is the Kotlin code snippet that I'm using:
val sceneViewerIntent: Intent = Intent(Intent.ACTION_VIEW)
val intentUri: Uri = Uri.parse("https://arvr.google.com/scene-viewer/1.0").buildUpon()
    .appendQueryParameter(
        "file", "ludovic_v2_gltf.gltf" // <<<<---- how to import this file?
        // "https://raw.githubusercontent.com/KhronosGroup/glTF-Sample-Models/master/2.0/Avocado/glTF/Avocado.gltf"
        // <<<--- with this line, it works very well.
    )
    .appendQueryParameter("mode", "3d_only")
    .build()
sceneViewerIntent.data = intentUri
sceneViewerIntent.setPackage("com.google.ar.core")
startActivity(sceneViewerIntent)
finish()
Thanks a lot!!!
This is my environment:
Windows 10, Android Studio 2021.2.1 Patch 1 (built on May 18, 2022).
My project was started from the "Empty Activity" template.
I built an Electron app following the tutorial from here.
The problem now is that the "minimize to tray" function and the autostart function no longer work. When I start my app via npm start they work, but not with the built .exe.
The code for the tray function is from this answer: Electron.js How to minimize/close window to system tray and restore window back from tray?
The code for the autostart function is from here: How to use auto-launch to start app on system startup?
Does anybody know why these functions no longer work after building the .exe? (Starting as admin doesn't help.)
The reason this didn't work for me was that the path of the tray icon was defined as ./icon.png, but after building the application the file is no longer in the same place: all the app files are moved to ./resources/app/.
So this was the fix for me:
let trayIcon = null;
// app.isPackaged is false when launched via `npm start`, true in the built .exe
if (!app.isPackaged) {
    trayIcon = './icon.png'; // dev mode: the icon sits next to the sources
} else {
    trayIcon = './resources/app/icon.png'; // packaged: app files are moved here
}
I have a small sample application to test speech recognition. It works on some machines but not on others. In my dev environment, where I first installed the necessary packages, everything worked with no issues, but my teammates are unable to get it working with the installation of our software that contains this code. We have mixed environments: in some cases we use Remote Desktop with the application running on the remote machine (so with device integration via RDP), and in others we run locally without RDP. The microphone is not detected in either case. Windows detects the mic, the recorder app works, and all tests pass, so we know the mic is being recognized by Windows.
However, the Speech SDK does not recognize it.
I have tried two approaches. First, using FromDefaultMicrophoneInput; when that didn't work, I changed it to FromMicrophoneInput and specified the microphone ID.
Using NAudio to enumerate the capture devices, the mic is detected and listed:
// Enumerate the active capture devices with NAudio and grab the ID of the target mic
var enumerator = new MMDeviceEnumerator();
string specifiedMicID = string.Empty;
foreach (var endpoint in
    enumerator.EnumerateAudioEndPoints(DataFlow.Capture, DeviceState.Active))
{
    if (endpoint.FriendlyName == this.MicName)
    {
        specifiedMicID = endpoint.ID;
        break;
    }
}
audioConfig = AudioConfig.FromMicrophoneInput(specifiedMicID);
But when trying to instantiate the SpeechRecognizer with that audio config:
using (var recognizer = new SpeechRecognizer(config, audioConfig))
{
...
}
We get SPXERR_MIC_NOT_FOUND, even though the microphone is clearly there, works everywhere else in Windows, and is detected fine by NAudio.
Any ideas what is going on here?
Thank you.
Are you creating a UWP application? If so, you'll need to retrieve the audio device IDs differently:
// UWP: enumerate the capture devices; the device.Id values are what you pass to FromMicrophoneInput
var devices = await DeviceInformation.FindAllAsync(DeviceClass.AudioCapture);
foreach (var device in devices)
{
    Console.WriteLine($"{device.Name}, {device.Id}\n");
}
Please refer to the documentation here for more information:
https://learn.microsoft.com/en-us/azure/cognitive-services/speech-service/how-to-select-audio-input-devices#audio-device-ids-on-uwp
If you're still having issues, we'd need to get the SDK logs to debug further. Instructions on how to turn on logging can be found here:
https://learn.microsoft.com/en-us/azure/cognitive-services/speech-service/how-to-use-logging
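For reference, here is a rough sketch of the same device-selection-plus-logging flow in the Python Speech SDK (azure-cognitiveservices-speech); it is only meant to illustrate the two concepts from the links above, and the key, region, and device ID are placeholders.

import azure.cognitiveservices.speech as speechsdk

speech_config = speechsdk.SpeechConfig(subscription="YOUR_KEY", region="YOUR_REGION")
# Turn on the verbose SDK log described in the logging article above.
speech_config.set_property(speechsdk.PropertyId.Speech_LogFilename, "speech-sdk.log")

# Select a specific capture device by its endpoint ID instead of the default mic.
audio_config = speechsdk.audio.AudioConfig(device_name="{0.0.1.00000000}.{YOUR-DEVICE-GUID}")

recognizer = speechsdk.SpeechRecognizer(speech_config=speech_config, audio_config=audio_config)
result = recognizer.recognize_once()
print(result.reason, result.text)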
I have written a Python program that takes a user ID and a movie name from the user and recommends a list of 10 movies. I have also created an Android app using a React Native UI. I want to take the input (user ID and movie name) from the user through the app, process it, and show the recommendation (a list of movies) in the app. What should I do to achieve this objective?
One way would be to run your recommender under Flask, then have your mobile app query the recommender service and display the results.
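A minimal sketch of that setup, assuming your existing logic can be wrapped in a recommend(user_id, movie_name) function (the endpoint name and port below are arbitrary):

from flask import Flask, request, jsonify

app = Flask(__name__)

def recommend(user_id, movie_name):
    # Placeholder: call your existing recommender here.
    return ["Movie 1", "Movie 2"]

@app.route("/recommend")
def recommend_endpoint():
    user_id = int(request.args["user_id"])
    movie_name = request.args["movie"]
    return jsonify(movies=recommend(user_id, movie_name))

if __name__ == "__main__":
    # Bind to all interfaces so the phone can reach it over your LAN.
    app.run(host="0.0.0.0", port=5000)

The React Native side can then call something like fetch("http://<server-ip>:5000/recommend?user_id=1&movie=Inception") and render the returned JSON list.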
I have a location-mocking method in my main activity. Unfortunately, I can't move this method into another class (yet!), so I need a service that calls this method from my main activity every 5 seconds. I created a countdown inside a service that should run the method in my MainActivity while the app is in the background. But it doesn't.
public void OnTimedEvent(object sender, System.Timers.ElapsedEventArgs e)
{
    Log.Info("2", "Countdown executed!");
    // Create a new MainActivity instance and call its mock-location method
    var test = new MainActivity();
    test.getMockLocation();
}
This is my code. As you can see, I'm creating a new instance of my MainActivity and then calling the method on that activity. This compiles; at least Visual Studio does not complain. But if I debug the app on my phone, nothing happens. I don't get any errors at all.
When I run the app step by step and it reaches this point:
var test = new MainActivity();
I get "Frame not in Module".
So it basically crashes as soon as I ask it to create a new instance of my MainActivity.
Can anybody tell me why this is?
Thanks :)
Unfortunately, in Android you cannot create Activities like this; they need to be instantiated by the OS. Also, instantiating a whole Activity just for one method is not ideal. I suggest you find a way to move that method/function out of the Activity so you can use it anywhere in your program.
Did you create your app with Xamarin.Forms? If so, you can use the Xamarin.Forms MessagingCenter for background services and then call your mock-location tasks.
This is the link to a very helpful walk-through and example of MessagingCenter.