I've recently been playing around with the Flurry AppCircle Clips SDK, but during my tests the SDK ran out of videos to play... According to their FAQ, the video frequency cap is predetermined by the advertisers based on their daily budget, and I can only view new clips after a 24-hour rollover.
This is painful and slows down my development. Does anyone know if there is a sandbox mode for the Clips SDK that plays unlimited test videos? If not, how can I get around this problem?
Clips supports a test mode just like the regular AppCircle ads. In test mode you won't generate revenue, but more videos should be available; it's not unlimited, though. At the moment you can't set this yourself and will need to contact either your account manager or support#flurry.com.
(disclaimer: I work on the Android SDK at Flurry)
I am currently implementing a ChatBot using Google Cloud Speech.
I am using socket.io to record a microphone stream and then sending that through node to Google Cloud Speech.
Everything is working fine on my laptop and on my Android phone (Nexus 5X, Chrome 68).
I record the audio and, having set single_utterance to true, get a result with "isFinal" as soon as I pause speaking.
But if I set the language code to 'da-DK', I never get an "isFinal" result on mobile (unless I end the stream myself). It works fine on my laptop, just not on mobile.
Has anyone experienced anything similar?
As a bonus piece of info: if I set interimResults to true, I do get multiple results, but they are just never isFinal.
So just to be clear: everything is working perfectly apart from the one case: da-DK on mobile.
Since this behavior occurs only when using the supported da-DK language on a mobile device, it might be related to an internal service issue. I therefore think you should use the Issue Tracker tool to raise a Speech-to-Text API issue, so that the Google Technical Support Team can verify this scenario. That way you will be able to share your code, audio files, and internal project information if the troubleshooting process requires it.
Additionally, I suggest you take a look at this link, which contains some useful documentation and an example of using the Google Cloud Speech API in an Android environment that you may use as a reference for your project.
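For the record, the streaming setup described in the question corresponds to a StreamingRecognitionConfig along these lines (field names from the Speech-to-Text v1 streaming API; the encoding and sample rate are assumptions, since the question doesn't state them). Only languageCode differs between the working and failing cases:

```json
{
  "config": {
    "encoding": "LINEAR16",
    "sampleRateHertz": 16000,
    "languageCode": "da-DK"
  },
  "singleUtterance": true,
  "interimResults": true
}
```

With singleUtterance set to true, the service is expected to detect the end of an utterance and return a final (isFinal) result on its own, which is exactly the step that appears to fail here for da-DK on mobile.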
I am making a game where I want to command the AI using words I speak.
For example, I say "go" and the AI bot moves a certain distance.
The problem is that I've been searching for an asset, and no provider will guarantee me that it is possible.
What are the difficulties in doing this?
I am a programmer, so if someone suggests a way to handle it, I can do it.
Should I keep the mic listener on all the time, read the audio, and then pass it to some external SDK that can convert my voice to text?
These are the asset providers I have contacted:
https://www.assetstore.unity3d.com/en/#!/content/73036
https://www.assetstore.unity3d.com/en/#!/content/45168
https://www.assetstore.unity3d.com/en/#!/content/47520
and a few more!
If someone just explains the steps I need to follow, then I can certainly try it.
I am currently using this external api for pretty much the same thing: https://api.ai/
It comes with a unity SDK that works quite well:
https://github.com/api-ai/api-ai-unity-sample#apiai-unity-plugin
You have to connect an audio source to the SDK and tell it to start listening. It will then convert your voice audio to text, and can even detect pre-defined intents from your voice audio / text.
You can find all steps on how to integrate the unity plugin in the api.ai Unity SDK documentation on github.
EDIT: It's free too btw :)
If you want to recognize speech offline, without sending data to a server, try this plugin:
https://github.com/dimixar/unity3DPocketSphinx-android-lib
It uses CMUSphinx, an open-source speech recognition engine.
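Whichever SDK you pick, the pipeline is the same: keep the mic listening, let the SDK turn speech into text, then map keywords in that text to game actions. The last step is trivial and engine-agnostic; here is a minimal sketch (plain Java for illustration — in Unity this would be C#, and the command words and distances are made up, not part of any SDK):

```java
import java.util.Locale;
import java.util.Map;

// Hypothetical command table: once the speech SDK has returned text,
// commanding the AI reduces to keyword lookup.
public class VoiceCommands {
    // Distances are illustrative; "go" moves the bot a fixed amount.
    static final Map<String, Double> COMMANDS = Map.of(
            "go", 5.0,
            "run", 10.0,
            "stop", 0.0);

    // Returns the distance the AI bot should move, or null if the
    // recognized phrase contains no known command word.
    static Double parse(String recognizedText) {
        for (String word : recognizedText.toLowerCase(Locale.ROOT).split("\\s+")) {
            if (COMMANDS.containsKey(word)) {
                return COMMANDS.get(word);
            }
        }
        return null;
    }
}
```

The hard parts (noise robustness, latency, accents) live in the recognition SDK, which is why no asset provider will guarantee results; the game-side logic above is the easy bit.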
Is there a way to fit a single dxp file to different screen sizes? For example, I want to use the same dxp from an iPhone, an iPad, and a desktop monitor, so it has to adjust automatically.
If you are going to develop mobile analytics, I strongly suggest you download the Spotfire app from the Google Play and Apple stores. This will ensure your visualization is rendered properly, and it provides you with mobile touch features that you may find cumbersome to attempt through a mobile web browser.
Before you start building your analytic, go to Edit > Document Properties > Visualization area size and select one of the iPad options, or set a custom size. You will see the window resize in the professional client, emulating what the mobile application will look like. Personally, I'd steer clear of creating one for a phone. It's just unrealistic, so much so that Spotfire discontinued its mobile analytics product for phones because, in my opinion, it didn't have a ton of value. It's hard to gain a lot from a visualization that small.
I am using Android Studio. I followed these steps: https://developers.google.com/admob/android/quick-start
After adding AdMob, my app size increased by almost 5 MB: it was 13 MB and became 18 MB.
Using
compile 'com.google.android.gms:play-services:7.5.0'
implies you are using every feature of Google Play Services, including location services. If you only need a particular API, you should be using the selective APIs.
In the case of ads, you can use solely:
compile 'com.google.android.gms:play-services-ads:7.5.0'
I'd recommend looking at "How to reduce app size increased after AdMob ads?"
Simply put: the library has a lot of extra stuff. You need to use ProGuard to strip that out.
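Putting both suggestions together, a build.gradle fragment along these lines (a sketch, not a drop-in file; version number kept from the question) depends only on the Ads API and enables ProGuard for release builds:

```groovy
dependencies {
    // Pull in only the Ads API instead of the full Play Services bundle
    compile 'com.google.android.gms:play-services-ads:7.5.0'
}

android {
    buildTypes {
        release {
            minifyEnabled true   // run ProGuard to strip unused classes
            proguardFiles getDefaultProguardFile('proguard-android.txt'),
                          'proguard-rules.pro'
        }
    }
}
```

The selective dependency avoids shipping the unused Play Services modules in the first place, and ProGuard then removes whatever the ads module itself pulls in but never touches.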
What kind of list view is in this navigation drawer? It has two sections: one with Inbox, Starred, Sent mail, and Drafts, and the other with All mail, Trash, Spam & Follow up.
So is it a sectioned ListView without section headers, or are these two ListViews?
Found the image here: http://www.google.com/design/spec/style/typography.html#typography-standard-styles
Google offers I/O 2014 app source code as Material Design sample code
The Google I/O 2014 app has successfully lived up to its initial purpose of providing scheduling for Google I/O attendees and allowing us at home to check in and watch the keynote presentation, as well as other live-streamed sessions. But what happens to the app now?
Instead of leaving the app to be forgotten in the Google Play Store, Google has decided to use the app as a shining light for developers. Google updated the I/O 2014 app with Material Design and the Android L developer preview before making the full source code free and available for developers to download and utilize as a template for their own apps.
Although most of the practical benefits of the Google I/O 2014 app applied only during the two days of the conference back in June, the app now offers developers examples of a number of features and techniques. Developers can look forward to sample code for:
Google Drive API
Google Cloud Messaging
Android L developer preview
Android Wear
Video streaming
Reminders and alarms
NFC scanning and beaming
Feedback mechanisms
In addition to simply dropping the code on developers, there is also some reference material available on the project's GitHub page; expect to see video tutorials coming soon through the developer channel as well.
Once you've got your Android L device or emulator up and running, grab the source code for the Google I/O 2014 app from the GitHub page and get on building your own Material Design apps.
GITHUB CODE
Maybe this is what you are looking for:
https://github.com/google/iosched/tree/master/android/src/lpreview/res/layout-v21
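As for the original question: the usual approach is a single ListView whose adapter serves two row types — item rows plus one divider row between the two sections. A hypothetical helper showing the position bookkeeping such an adapter needs (plain Java so it runs anywhere; class and method names are made up, though getCount/getItemViewType/getItem mirror the Android Adapter methods they would back):

```java
import java.util.List;

// Hypothetical backing model for a one-ListView drawer with two
// sections separated by a divider row (as in the Gmail-style drawer).
public class DrawerRows {
    static final int TYPE_ITEM = 0;
    static final int TYPE_DIVIDER = 1;

    final List<String> sectionOne;
    final List<String> sectionTwo;

    DrawerRows(List<String> sectionOne, List<String> sectionTwo) {
        this.sectionOne = sectionOne;
        this.sectionTwo = sectionTwo;
    }

    // Total rows: both sections plus one divider between them.
    int getCount() {
        return sectionOne.size() + 1 + sectionTwo.size();
    }

    // The divider sits right after the first section.
    int getItemViewType(int position) {
        return position == sectionOne.size() ? TYPE_DIVIDER : TYPE_ITEM;
    }

    // Label for an item row; null for the divider row.
    String getItem(int position) {
        if (position < sectionOne.size()) return sectionOne.get(position);
        if (position == sectionOne.size()) return null;
        return sectionTwo.get(position - sectionOne.size() - 1);
    }
}
```

So it can be one ListView: the "two sections" are just a view-type trick, no section headers or second list required.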