I am new to the Android development platform and working on a project to write some tests comparing different gesture keyboard implementations (the Android 4.2 built-in keyboard, Swype and SwiftKey).
I am using Eclipse as my IDE and relying primarily on the emulator for testing. I am using the 4.2 gesture keyboard from here (http://forum.xda-developers.com/showthread.php?t=1964663).
What I would like to be able to do (starting with the Android 4.2 built-in keyboard):
Pass a string in, for example “hello” and then have that string entered using the gesture keyboard.
Get the words shown in the Candidates View box.
Compare the words to an expected word and return pass/fail depending on whether the words returned from the Candidates View were what I expected.
Store the passed-in word and the returned suggestions in a log file for analysis.
Ideally I would like to be able to use APIs only for this and not rely on the UI, but I am not sure if that is possible.
Some possible solutions I am looking at:
Use the TouchUtils class (http://developer.android.com/reference/android/test/TouchUtils.html) to draw shapes between keys. However, I am not sure how to map specific x and y coordinates on the device to specific keys. Also, I don't see any methods in this class that would let me draw a custom shape (a rough sketch of the straight-line drag I mean is just below).
I also see sendKeys(String keysSequence), but that looks like it just presses keys and would not exercise the gesture keyboard at all.
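To illustrate, the closest thing I can see in TouchUtils is a plain straight-line drag, something like this rough sketch (MyKeyboardTestActivity and all the coordinates are made up; I still don't know how to find the real key positions):

import android.test.ActivityInstrumentationTestCase2;
import android.test.TouchUtils;

// Rough sketch only: MyKeyboardTestActivity and the coordinates are placeholders.
public class GestureKeyboardTest extends ActivityInstrumentationTestCase2<MyKeyboardTestActivity> {

    public GestureKeyboardTest() {
        super(MyKeyboardTestActivity.class);
    }

    public void testStraightLineSwipe() {
        // Guessed screen positions of the 'h' and 'o' keys on the 4.2 keyboard.
        float fromX = 300f, fromY = 1150f;
        float toX = 520f, toY = 1150f;

        // drag() only moves in a straight line between the two points,
        // so it cannot trace the curved path a real gesture-typing swipe needs.
        TouchUtils.drag(this, fromX, toX, fromY, toY, 20);
    }
}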
So I am curious how you more experienced Android developers would approach or solve this problem, and whether there are already any resources or APIs out there that can help.
Thanks
Pete
You can try the MonkeyRunner tool:
http://developer.android.com/tools/help/monkeyrunner_concepts.html
Or you can try the "Programmatically Injecting Events on Android" approach from the two blog posts below (part 1 and part 2):
http://www.pocketmagic.net/2012/04/injecting-events-programatically-on-android/#.UckpTZwlubq
http://www.pocketmagic.net/2013/01/programmatically-injecting-events-on-android-part-2/#.UckpSZwlubp
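For the gesture part, here is a minimal monkeyrunner sketch. The coordinates are placeholders (you would need to work out where the keys of the 4.2 keyboard actually sit on your emulator), and drag() lifts the finger between calls, so for a true continuous swipe you will probably need the event-injection approach from the posts above:

# Run with: monkeyrunner gesture_sketch.py
from com.android.monkeyrunner import MonkeyRunner, MonkeyDevice

device = MonkeyRunner.waitForConnection()

# Tap a text field first so the keyboard comes up (placeholder coordinates).
device.touch(240, 400, MonkeyDevice.DOWN_AND_UP)
MonkeyRunner.sleep(1)

# Straight-line drags across the keyboard; each drag is a separate touch,
# so this only approximates a gesture-typing swipe.
device.drag((60, 1150), (300, 1150), 0.25, 10)   # roughly 'a' towards 'h' (guessed positions)
device.drag((300, 1150), (520, 1100), 0.25, 10)  # continue towards 'o'

MonkeyRunner.sleep(1)
# Grab a screenshot so you can read the candidate/suggestion bar afterwards.
device.takeSnapshot().writeToFile('candidates.png', 'png')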
Introduction
I am developing a script in Python 3.7 using Appium. It will automate some tasks on a real Android smartphone. My script needs to type some text into a text field.
I don't want to use the send_keys method or ActionChains.
I would prefer to type the text, character by character, using the keyboard of the smartphone.
Problem
I investigated and read various docs:
press-keycode
https://appium.readthedocs.io/en/latest/en/commands/device/keys/press-keycode/
which brings me to KeyEvent
https://developer.android.com/reference/android/view/KeyEvent.html
Which brings me to KeyCharacterMap
https://developer.android.com/reference/android/view/KeyCharacterMap.html
To be honest, after reading it all, I find it super difficult to understand.
So I experimented by trying different lines of code to see what happens:
driver.press_keycode(0)
driver.press_keycode(1)
driver.press_keycode(2)
etc...
It seems that nothing happens.
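For reference, here is roughly what I was hoping character-by-character input would look like, assuming the standard Android keycodes for letters (KEYCODE_A is 29, KEYCODE_B is 30, and so on; keycodes 0–2 are KEYCODE_UNKNOWN and the soft keys, which may be why my test above did nothing visible). I don't know if this is the right approach:

def type_word(driver, word):
    """Send one keycode per letter, assuming the target text field already has focus."""
    for char in word.lower():
        if 'a' <= char <= 'z':
            # KEYCODE_A = 29, so 'h' -> 36, 'e' -> 33, 'l' -> 40, 'o' -> 43
            driver.press_keycode(29 + ord(char) - ord('a'))

# type_word(driver, "hello")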
Does anyone know a good tutorial or article that explains how to type text into a text field of an Android application using the keyboard, rather than the send_keys method or ActionChains?
Could you please help me find the way?
I have been trying to build a TV application using an SDK.
I have got features like an image gallery and a video player running.
However, I also wanted to add a virtual on-screen keyboard that works with the up/down/left/right arrow keys. Can somebody help me with how to get started?
When I wanted to do this with my Vestel (Polaroid-branded) smart TV, which uses "Opera for TV devices" as its HbbTV browser, I found that I didn't need to.
I simply used HTML text fields and input types where needed, and as soon as I clicked into them, the browser popped up a built-in on-screen keyboard for me.
However, I did do some research to see whether I needed one, and on some devices you do. While I never actually implemented it (my app was just for my own use), the "BBC Television Application Layer" (TAL for short), https://github.com/bbc/tal, has pretty good keyboard support.
Another one that might be worth looking at is the "Mautilus SDK": https://github.com/mautilus/sdk
Be aware, though, that both are horribly convoluted and use quite complex code where it's really not needed.
I'm writing a WinRT game for Windows 8, in C#, using the excellent MonoGame. I've reached the part where the user has achieved a high score and needs to enter their name. This is causing me more pain than I'd anticipated so I thought I'd ask for help.
First of all, is there a simple "enter some text" function that I can call, similar to Guide.BeginShowKeyboardInput in Windows Phone 7, or the ancient InputBox command in VB? I'm using Windows.UI.Popups.MessageDialog for displaying simple dialog messages, but can't find any similar thing for requesting text from the user.
Failing that, is there a way I can easily use a little piece of XAML to present a textbox for the user to use?
If neither of these is possible, I guess I'll have to wire this all up myself... I would then plan to intercept keystrokes and display the entered text on screen myself. As I don't have a physical tablet (just the simulator), I'm struggling to get started. How can I:
Detect whether the device has a physical keyboard, so I know whether or not to display the on screen keyboard?
If there is no physical keyboard, how can I show and hide the on screen keyboard?
Some of these sound like they should be easy to answer, but I've yet to track down answers to any of them.
Many thanks!
Adam.
Hey, there is a way to do this in MonoGame. There is a new template that lets you create a XAML + Game project, which allows you to use the Game class you are used to along with the XAML bits. These links should get you started. The MonoGame team rocks.
The three game types are listed there. You want XAML + Game; there is a template for it now if you get the proper version of MonoGame.
https://github.com/mono/MonoGame/wiki/Windows-8-Project-Types
let me know if you need more help
This is not a cross-platform solution, but you could use a Flyout and place the controls for data entry on the window. Flyout guidelines are here and guidelines for text-input UI controls are here. I have also used MessageDialog in a MonoGame game for asking the user simple questions (up to three options) or to get a Yes/No response. You can get details of that class here.
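For the MessageDialog part, a minimal sketch of the kind of Yes/No prompt I mean (the strings and the save handler are just placeholders):

using System;
using System.Threading.Tasks;
using Windows.UI.Popups;

static class HighScorePrompt
{
    // Sketch only: assumes this is called from the UI thread of the game window.
    public static async Task AskToSaveAsync(Action saveScore)
    {
        var dialog = new MessageDialog("New high score! Save it?", "Game over");
        dialog.Commands.Add(new UICommand("Yes", _ => saveScore()));
        dialog.Commands.Add(new UICommand("No"));
        dialog.DefaultCommandIndex = 0; // Enter
        dialog.CancelCommandIndex = 1;  // Esc / hardware back
        await dialog.ShowAsync();
    }
}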
How (i.e. using which API) is the virtual keyboard opened on Symbian S60 5th edition? The documentation seems to lack information about this.
You are right, this should obviously be a published API and it should be highlighted in the documentation. No such luck.
If you are using one of the platform's native controls, the virtual keyboard will automatically pop up when the user accesses a text-editing control.
If you are making a custom control, you need to handle its selection by adding your own version of the virtual keyboard: make a new text-editing, window-owning, virtual-keyboard-look-alike custom control with the right buttons. Reuse it across all your applications. One day, Nokia will realize they have made an obvious mistake and make the API publicly available.
If you are using direct screen access, well, you wouldn't exactly expect the very S60-looking virtual keyboard to pop up out of nowhere. Again, draw a nice image on the screen to let the user know where the virtual keys are, and react to pointer events. This is going to be less reusable unless you build a good amount of customization (background, button edges...) into it.
EDIT: Nokia may be relying on Qt to fix this issue. I would expect the control to be part of the current 4.7 version of Qt.
Tinkering with focus on a QLineEdit inside a custom-coded kinetic scroll area, I had a similar problem (how to open the virtual keyboard manually). Then I found it; this works in Qt 4.6.3 on a C7 Symbian^3 phone:
// lineEdit is an instance of QLineEdit
QApplication::postEvent(lineEdit, new QEvent(QEvent::RequestSoftwareInputPanel));
Before that, I also had to post a QEvent::FocusIn event to that same line edit; otherwise the QLineEdit did not update its content from the virtual keyboard.
Hope this is helpful. I lost hours.
Thank you tihi, very useful tip! There's also the "close virtual keyboard" event that can be triggered:
QApplication::postEvent(lineEdit, new QEvent(QEvent::CloseSoftwareInputPanel));
We have a MIDlet that needs to let the user switch input languages on the fly (it's a dictionary-type app) between several languages (say English to Arabic, etc.). All was charming in the "old days" of the numeric keypad: we handled the input ourselves, matching two presses of the 5 key to feed the correct character to our program. Then the E71 came out; it has a QWERTY keyboard, and in our Canvas keyPressed we get the character the user pressed, say "a" on the keyboard.
Now the task of matching this to the correct language (say the user is now searching the Arabic-to-English side of the dictionary) involves matching "a" (on the QWERTY layout, I guess) to the Arabic letter that would come out if the layout were Arabic.
There is a special keyboard shortcut on these S60 devices (it varies between devices) that allows the user to open the input-language selector (Function + Space in the case of the E71), but this does not seem to work while our MIDlet is running.
Another suggested solution was to somehow use an editable TextField for the input, in which case standard support for changing the input language is offered by the JVM. However, we render a Canvas (a nice-looking one), and replacing it with a TextField is a last resort for us.
So, the question is: what other solutions can someone think of to tackle this issue?
Or has anyone found a way around this annoyance?
best regards,
--tzurs
I think you can do the mapping using the Nokia-specific system properties for keypad settings. Using com.nokia.keyboard.type, com.nokia.key.scancode and com.nokia.key.modifier, you should be able to create a generic enough solution for Nokia devices.
More info on these system properties is available in the Nokia docs.
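For what it's worth, a rough sketch of how that could look inside the Canvas, assuming the properties behave as described in the Nokia docs (i.e. com.nokia.key.scancode and com.nokia.key.modifier reflect the most recent key event):

import javax.microedition.lcdui.Canvas;
import javax.microedition.lcdui.Graphics;

// Sketch only: the property semantics are taken from Nokia's documentation, not tested here.
public class DictionaryCanvas extends Canvas {

    protected void keyPressed(int keyCode) {
        String layout   = System.getProperty("com.nokia.keyboard.type");  // e.g. full QWERTY vs. phone keypad
        String scancode = System.getProperty("com.nokia.key.scancode");   // hardware scan code of the key just pressed
        String modifier = System.getProperty("com.nokia.key.modifier");   // modifier state (shift/fn)

        // Map (scancode, modifier) to a character in the currently selected
        // dictionary language instead of trusting keyCode's Latin mapping.
    }

    protected void paint(Graphics g) {
        // Render the custom dictionary UI here.
    }
}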