I'm trying to understand what the introduction of the Web Audio API has meant for the development of web-based games.
Flash games can of course do some quite advanced audio processing, and for simpler games the audio element was maybe enough. But how has the Web Audio API changed the game-dev scene, in terms of what can be done, supported platforms, and so on?
Supported platforms are Chrome, Safari (with some prefixing caveats) and Firefox across all supported hardware/OS platforms; IE support is under development, though the long tail of older IE versions will take a while to be replaced.
Web Audio enables very complex processing, but also very precise timing and multiple sounds; sound management is far, far easier than previously possible in HTML5. In short, Web Audio dramatically improves the story for game audio development on the Web - which, of course, was one of its goals.
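To make "precise timing and multiple sounds" concrete, here is a minimal sketch of the pattern game developers use: decode a sound once, then schedule cheap overlapping playbacks on the audio clock. It uses the modern unprefixed API (early builds needed webkitAudioContext), and the file name is just a placeholder:

```javascript
// A minimal sketch: decode a sound once, then fire overlapping,
// sample-accurately scheduled copies of it - the kind of thing
// that was impractical with <audio> elements.
const ctx = new AudioContext();

async function loadSound(url) {
  const response = await fetch(url);
  return ctx.decodeAudioData(await response.arrayBuffer());
}

function playAt(buffer, when, gain = 1) {
  const source = ctx.createBufferSource(); // sources are cheap: one per shot
  source.buffer = buffer;
  const g = ctx.createGain();
  g.gain.value = gain;
  source.connect(g);
  g.connect(ctx.destination);
  source.start(when); // scheduled on the audio clock, not setTimeout
}

// 'jump.wav' is a placeholder asset name.
loadSound('jump.wav').then((buffer) => {
  const now = ctx.currentTime;
  for (let i = 0; i < 4; i++) {
    playAt(buffer, now + i * 0.25, 1 - i * 0.2); // four shots, 250 ms apart
  }
});
```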
I know I could resolve the problem easily with Red5 Media Server. I'm just curious, since I need a Node server anyway, whether I can bypass Red5. Using the HTML5 camera API is not an option, since I'm targeting up to four cameras at the same time.
I didn't see anything specific for Node.js to work with webcams. However, I did find a solution online that mainly uses JavaScript, Node.js, HTML5 (canvas), and WebSockets: it extracts frames from the video stream, then transfers each image over WebSockets into an HTML5 canvas, with Node.js acting as an intermediary server.
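The relay part of that approach is small enough to sketch. A minimal version using the ws package (npm install ws); the port and the frame format (e.g. base64 JPEGs) are assumptions, not anything required by the approach:

```javascript
// A minimal sketch of the relay idea, using the "ws" package.
// A capture page pushes frames in; the server rebroadcasts each
// frame to every other connected client (the viewers).
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 8080 }); // port is an assumption

wss.on('connection', (socket) => {
  socket.on('message', (frame) => {
    // Relay the incoming frame (e.g. a base64 JPEG) to all other clients.
    for (const client of wss.clients) {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(frame);
      }
    }
  });
});
```

Each viewer page would then decode the received frame and draw it onto a canvas with drawImage.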
Hope it helps.
I'm the author of https://www.npmjs.com/package/camera-capture, which provides a portable, easy-to-use API for Node.js (server side) to capture camera video, audio, and even the desktop. It doesn't require any native libraries or dependencies and is easy to install, since it's based on Puppeteer. I'm already using it in desktop apps based on GTK, Cairo, Qt, and others, and it performs acceptably fast, although I keep looking for optimizations since the project is pretty new. Feedback is most welcome.
I'm working on a C++ application which takes microphone input, processes it, and plays back some audio. The processing will incorporate a database located on a server. For ease of creating UI and for maximum portability, I'm thinking it would be nice to have the front end be done in HTML. Essentially, I want to record audio in a browser, send that audio to the server for processing, and then receive audio from the server which will then be played back inside the browser.
Obviously, it would be nice if HTML5 supported microphone input, but it does not. So, I will need to create a plugin of some kind in order to make this happen. NPAPI scares me because of the security issues involved, so I was looking into PPAPI and Native Client. Native Client does not yet support microphone input, and I believe that the PPAPI audio input API would be limited to a dev build of Chrome. FireBreath doesn't look like it supports any microphone function either. So, I believe my options are:
Write my own NPAPI plugin to record the audio
Use Flash to get microphone input
Bail on browsers altogether and just make a native application
The target audience for this is young children and people who aren't computer-adept. I'd like to make it as portable and simple to use as possible. Any suggestions?
If you can do it all in Flash and have the relevant knowledge, that would probably be the best solution:
You can avoid writing platform-specific code, delivery/updating is easy and Flash has broad coverage so users don't need to install any custom plugins.
"FireBreath doesn't look like it supports any microphone function either."
You can write your own (platform-dependent) code for audio recording with FireBreath, just like you could in a plain NPAPI plugin. FireBreath just makes it easier for you to write the plugin; the result is still an NPAPI (and ActiveX) plugin with access to native APIs, etc.
You can now use the audio and video capturing features in HTML5 (getUserMedia); see this link for more information.
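For later readers: here's roughly what that looks like with the getUserMedia/MediaRecorder APIs that eventually shipped. A minimal sketch; the /process endpoint and the idea that it responds with processed audio are assumptions about your server:

```javascript
// A minimal sketch: record a few seconds of microphone audio,
// send it to the server, and play back the processed result.
async function recordAndSend(seconds = 5) {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(stream);
  const chunks = [];

  recorder.ondataavailable = (event) => chunks.push(event.data);
  recorder.onstop = async () => {
    const blob = new Blob(chunks, { type: recorder.mimeType });
    // POST the recording for processing ("/process" is a placeholder URL).
    const response = await fetch('/process', { method: 'POST', body: blob });
    // Play back whatever processed audio the server returns.
    const audio = new Audio(URL.createObjectURL(await response.blob()));
    audio.play();
  };

  recorder.start();
  setTimeout(() => recorder.stop(), seconds * 1000);
}
```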
Recently I started developing a J2ME app prototype. I noticed how difficult it is to develop a good-looking UI. Consider developing an app in J2ME for booking flights that talks to a web service.
A website to book flights would be easy to develop with a nice UI and could be accessed from the browser on a handset. I understand that not all handsets have a browser, but all the new and upcoming ones do, and they have big screens as well.
Is it a good idea to develop such an application in J2ME when it needs to talk to a web service in order to work? Or is J2ME only suitable for standalone apps?
Advantages of J2ME:
Can access phone resources, like the file system, phone book, and GPS. The last is very important in map applications.
You can build richer user interfaces. It may be difficult, as you say, but there are many GUI libraries that can assist you. In contrast, the UI for a mobile browser (where you can't rely on CSS and JavaScript working) would be very poor.
Greater flexibility in the communication logic. You can encrypt/decrypt data, compress it, and use SOAP web services. With the browser, your best bet would be to develop REST services.
Disadvantages of J2ME:
MIDlets need to be signed. This has some cost, and there are situations where even a signed app won't run properly on specific phones.
Developing a MIDlet to run on all types of phones is a nightmare. In contrast, a well-designed mobile web application would display properly on all recent phones.
You need a channel for distributing your application. People would need to download it and get charged for the required bandwidth. You would need to deal with angry customers having problems with the application. Things are easier with a web site.
J2ME apps are inevitably compared with native applications (iPhone, Windows Mobile, Symbian). Compared to these, they are very poor, and many would find that paying for them or even using them isn't justified.
My conclusion: nowadays real smartphones are becoming more popular and are winning an ever-growing market share. Under these circumstances, the advantages of J2ME can't really overcome its restrictions. The only exception I can think of is if you have to develop a GPS application. For all other cases, a mobile web site is a better idea.
There are a lot of misunderstandings and plain wrong statements in the previous answers.
I advise you to do your research yourself. Nowadays you CAN develop really good-looking apps with J2ME without writing your own GUI framework; take a look at LWUIT, really. For example, it offers a virtual keyboard as one of its touch-screen features, which you then get even on devices like the N97 that don't have one natively. BTW, LWUIT includes BlackBerry and Android ports, if anyone cares.
Also, apps are now taking center stage on many platforms, not just the iPhone. Look at the recent developments in this area: Ovi, RIM, Samsung, SE, and Orange World are all launching app shops.
"Getting people to use a website on their mobile phone is easier than getting them to download an application." That is just a claim without proof; you can't state it like that, as it depends on a lot of other factors. Why should users type your mobile URL into that rather small screen again?
Anyway, this answer is probably too late, so I'm not going to write much more. The mobile industry is changing fast right now, but there is not yet an alternative to J2ME for cross-platform development. Maybe in the future, with better browsers and widget technologies.
Just a short note: applications like Google Maps or Gmail mobile probably don't use web services to talk to their server side. A web service has a lot of overhead, especially considering that mobile users are usually billed by the amount of data they transmit. The best way to handle communication between your client app and its server part is to send binary data over a socket connection.
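To put a rough number on that overhead, compare the same record serialized as a SOAP-style envelope versus a hand-packed binary layout (sketched here in Node.js; the field layout is made up purely for illustration):

```javascript
// A rough illustration of the payload-size argument: one flight
// record as a SOAP-style XML envelope vs. a packed binary message.
const flight = { id: 4711, price: 129.95, seats: 3 };

const xml =
  '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">' +
  '<soap:Body><flight><id>4711</id><price>129.95</price>' +
  '<seats>3</seats></flight></soap:Body></soap:Envelope>';

// Packed binary: 4-byte int id, 4-byte float price, 1-byte seat count.
const packed = Buffer.alloc(9);
packed.writeUInt32BE(flight.id, 0);
packed.writeFloatBE(flight.price, 4);
packed.writeUInt8(flight.seats, 8);

console.log(Buffer.byteLength(xml), 'bytes as SOAP'); // ~180 bytes
console.log(packed.length, 'bytes packed');           // 9 bytes
```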
I personally think it's really hard to make a consistent and reliable J2ME application that will run across a large set of mobile phones. Based on my experience, I would only develop a J2ME application (instead of a web application) if it's a strict requirement - for example, to be able to view your bookings without being connected to the network. There are other costs associated with J2ME applications - the applications must be downloaded, the user will be asked whether the application is allowed to connect to the network whenever it attempts to (there are exceptions to this, but I believe the application has to be signed by a 3rd-party company - more $$$ involved), you will have to maintain different versions of the application running on a variety of mobile phones (more complexity in the application), and so on...
Think about it this way: if you were developing a similar thing for a computer, would you build a desktop application or a web application? With the cellphones of today (many of which can access full-HTML sites with JavaScript, which means Ajax), the same question applies.
I think a good rule of thumb should be: if what you're trying to achieve can be done with a mobile website, go for the website.
IMHO, apps are only warranted if they take advantage of the mobile hardware: location, sound, video, 3D, pictures, etc.
Even if the dev costs for the app were insignificant (they usually aren't), you'd have to offer some really amazing capabilities to make the users go through the trouble of downloading it.
(All of this is essentially true for J2ME/BREW. The iPhone is a little different, as apps take center stage there.)
One thing worth highlighting: the only standard way of deploying a MIDlet is via OTA download, so you wouldn't expect a J2ME-capable phone not to have a web browser.
Mobile web browsers like WebKit and Opera are getting better faster than J2ME is (at least until MIDP 3.0 starts shipping, if ever).
No matter which platform you choose, you will need to test your service on many devices. I don't think switching from J2ME to a webapp makes a huge difference in that regard, because phone manufacturers keep changing the binaries that go into their phones' firmware.
Getting people to use a website on their mobile phone is easier than getting them to download an application. Unless that application is already installed when they buy the phone, that is.
You might want to look at LWUIT for a better and easier J2ME GUI.
One thing J2ME will accomplish for a flight-booking service is saving battery life by not requiring constant network data transfer, thanks to its local storage mechanisms.
There are many great J2ME apps that (need to) talk to web services; just think of the Google apps, like Gmail mobile and Maps for mobile. They are faster and easier to use than accessing those services via the phone's browser. So if you can design a good app, it's definitely worth it.
EDIT: Also, a J2ME app makes possible features that a web application can't provide: integration with phone features (address book, calendar), "call this number", the Location API, etc.
I think for business apps, or more text/data-oriented things, a mobile web/WAP site might be easier to maintain, since you won't have to deal with pushing client updates out to handsets.
For UI-intensive apps (maps, games, etc.), client apps are probably the way to go, so you can handle more of the processing and rendering on the client side.
Both options are difficult, though, since there are so many compatibility issues with phones. You might be best served by narrowing down what types of phones you want to support for your app. If you think most of your customers will have iPhones or Android phones, you can target those platforms (with either client apps or web apps) and avoid old-school J2ME completely.
I hate WebApps on phones. They are slow and they don't work in a semi-connected environment.
J2ME apps can do local backups, Bluetooth backups, and Bluetooth data sharing between two phones, and can offer a more responsive UI. However, that requires money, skill, time, etc.
My main gripes with MIDP, though, are pushing software updates and real-time WAV mixing. Technically both are possible within the scope of MIDP, but the goons at the wheel are not very creative.
I have a large amount of audio stored on my web server in a very custom format that can't be replayed by anything other than my own application. That application is a Win32 app that can connect to my web server and stream and replay that audio.
I'd really like to be able to do the streaming and replaying from within a browser, but don't know where to start. Ideally I'd like the technology to be cross-platform (unlike my current Win32 app) and cross-browser (IE 6 and above and Firefox).
My current thoughts are to look at things like:
Flash, but doesn't that only replay mp3 audio?
Java, are VMs freely available still?
Converting the audio to a WAV file on the web server and then using someone else's plugin to replay that file. I'd rather keep the conversion off the web server for performance reasons, but it is still an option.
Writing my own custom plugin to do the complete stream and replay operation.
Any guidance would be most useful.
Please note that the audio is not music, and simply converting it to another audio format is not trivial. The stored audio also changes frequently (every minute), so it would need constant conversion.
Why are you using a proprietary music format? I'd probably not even bother downloading a program to listen to it.
I would suggest you convert it to MP3 and then use Flash.
Building your own plugin would probably be hard; there are so many different platforms you'd have to cater for, and something like Flash has already been written for them.
Apart from converting server-side: implement a decoder for your format in ActionScript or Java. Then you can write a Flash movie or Java applet that plays it. Both languages/runtimes should be fast enough to decode in real time unless your format is very complex. Flash would be the more accessible of the two, since nearly everyone has the plugin installed. (It's possible that playing a raw sound buffer isn't supported by Flash versions older than 10; I'm no expert on that.) The Java plugin is definitely free, but you'd require the users to install it.
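For what it's worth, the decode-in-script idea later became possible in plain JavaScript via the Web Audio API. A minimal sketch, where decodeMyFormat() is a stand-in for your custom decoder and the sample rate is an assumption:

```javascript
// A sketch of the decode-in-script approach using the Web Audio API.
// decodeMyFormat() is a placeholder for your own format's decoder;
// it is assumed to return mono Float32 samples in [-1, 1].
const ctx = new AudioContext();

async function playCustomAudio(url, sampleRate = 22050) {
  const bytes = new Uint8Array(await (await fetch(url)).arrayBuffer());
  const samples = decodeMyFormat(bytes); // hypothetical: Float32Array of PCM

  // Copy the raw PCM into an AudioBuffer and play it.
  const buffer = ctx.createBuffer(1, samples.length, sampleRate);
  buffer.copyToChannel(samples, 0);

  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.connect(ctx.destination);
  source.start();
}
```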
I'd go with converting the audio to WAV (or MP3) on the server. Writing your own cross-platform browser component would be a lot of work, thanks to the different ways the major OSes handle their audio APIs.
Try taking a look at SHOUTcast.
Basically, it's a server app that will stream music to any client that connects to it through a browser (effectively your own radio station). I've never used it myself, but it should be straightforward.
Another idea is Winamp Remote. Again, you install the app on the server, but this time you can browse your music collection on their website and play individual songs.