I have seen many people writing that Java ME (J2ME) is dying. Is it true? What is its future? Should I learn Java ME if I want to create apps for smartphones? If yes, where to start?
I've been developing for J2ME for several years, and based on the download statistics of my applications, most downloads (around 90%) come from developing countries. So if your target users are not in Western countries, go ahead and learn J2ME; otherwise learn Android and/or iPhone.
If I were you, I'd start learning to write apps for Android and iPhone.
Java ME may not be entirely dead, but you'll do much better with these. The potential market for your products is much bigger and keeps growing, and I bet it would be a lot more fun, too.
I used to develop for J2ME. I think it all depends on demand. Right now, I mostly get requests for iPhone apps. Android is also making headway, but I'd say for every 10 iPhone/iPad apps, I get about 2-3 Android apps and maybe 1 J2ME app. And that's in the UK. It all depends on you: if you want to freelance, I'd say go and learn Objective-C. It is a very simple language, and simpler than Java in my opinion.
Nowadays many phones support J2ME, so it is still very useful. J2ME does not need a high-end hardware configuration, which is why so many phones support it, and not only phones but lots of small devices as well. So saying "J2ME is dying" is wrong; J2ME is still growing.
If you think only in terms of mobile phones, you may get the impression that J2ME is dying, but if you look at the wider world you can see the continuing need for it. Today many devices such as set-top boxes, home appliances, wireless phones, etc. use or support J2ME.
J2ME also still has many job opportunities.
At present, in the smartphone market, Android, BlackBerry and iPhone are growing far faster than J2ME.
But some of the ideas, libraries and concepts from J2ME are used in BlackBerry, Android, etc.
J2ME is quite old, and it laid the groundwork for modern smartphone technology such as Android, BlackBerry, etc.
It depends on which phones you want to target. Since you mention smartphones rather than feature phones, I would suggest Android, especially if you are already coming from a Java background; otherwise either iOS (iPhone, iPod Touch and iPad) or Android. These are far more enjoyable for a developer, have far better documentation and example open-source applications available, and let you deploy and debug on a device relatively painlessly.
Don't forget that the expected UI polish of Android and iPhone apps is far higher than that of MIDlets. That polish takes a lot of time and effort to create.
As seen in this thread, it seems that the missing piece needed to run DirectX natively on Linux is vendor drivers.
What exactly are vendor drivers? Are they drivers interfacing with a specific model of a component, with a family of them, or with any of them? What are they written in? ASM and C, most likely?
How would someone (or a team) create these drivers for Linux? How would they be integrated into Linux? Would games or applications in general that were made for Windows and use DirectX need any tweaks on Linux? Would companies making games build them for Linux knowing they could run with few or no tweaks needed?
How hard would it be to make these drivers? How long would it take? Would it require any specific knowledge?
I know this is a lot of questions, but I'm very curious about this, and about why no big group has ever worked on it seriously (even though there must be a good reason).
Thank you a lot in advance for your answers!
EDIT: This is by no means an invitation to a debate of, for example, OpenGL vs DirectX, or Windows vs Linux. Having read the FAQ, I can't really see why this thread isn't constructive, as it asks fairly well-aimed questions which should be answerable quickly.
IMHO the main reason no one really bothers with DirectX is that there is already a graphics library (Mesa, in the specific case of Linux) that fully supports every desired graphics operation also available with DirectX.
In contrast to DirectX, which is a specification based on so-called intellectual property owned by a single corporation, the API implemented by this library, OpenGL, is an open standard agreed upon by a consortium of hardware manufacturers.
Rather than constraining its use to just one operating system, possibly to shackle users to a single platform, OpenGL was intended as a platform-independent API right from the beginning.
Following this principle, and in contrast to DirectX being available on just a single platform, OpenGL is available on practically any computing platform, ranging from Android-based systems, Mac and numerous other UNIX-like systems including Linux, to Windows machines.
Using any API other than OpenGL would break this platform independence, which would probably be received not as progress but as a regression.
To sum it up, the main reasons to favor OpenGL over DirectX are probably the following:
OpenGL is an open standard, while DirectX is proprietary
OpenGL is available on practically any platform; DirectX is only available on a single platform
any operation supported by DirectX is supported by OpenGL as well
if they are really needed, DirectX calls can be provided by a wrapper library that pushes operations down to OpenGL, as is done in WINE, for example (a small sketch of this idea follows below)
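To illustrate that last point: a wrapper layer receives Direct3D-style calls and re-expresses them as OpenGL calls. The sketch below is a deliberately simplified stand-in, assuming an OpenGL context is already current; the MY_CLEAR_* flags and my_d3d_clear() are invented names for illustration and are not the real Direct3D API, while glClearColor, glClearDepth and glClear are genuine OpenGL calls.

    /* Illustration only: a hypothetical Direct3D-style Clear() forwarded to OpenGL.
     * MY_CLEAR_* and my_d3d_clear() are made-up stand-ins; a real translation
     * layer (as in WINE) hooks the genuine Direct3D entry points instead. */
    #include <GL/gl.h>

    #define MY_CLEAR_TARGET  0x1   /* clear the color buffer */
    #define MY_CLEAR_ZBUFFER 0x2   /* clear the depth buffer */

    void my_d3d_clear(unsigned flags, float r, float g, float b, float a, float depth)
    {
        GLbitfield mask = 0;

        if (flags & MY_CLEAR_TARGET) {
            glClearColor(r, g, b, a);   /* real OpenGL call */
            mask |= GL_COLOR_BUFFER_BIT;
        }
        if (flags & MY_CLEAR_ZBUFFER) {
            glClearDepth(depth);        /* real OpenGL call */
            mask |= GL_DEPTH_BUFFER_BIT;
        }
        if (mask)
            glClear(mask);
    }

WINE does essentially this, only for the complete Direct3D API surface and with a lot of state tracking in between.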
Mere availability of a DirectX library implementation alone wouldn't enable binary code built for the Windows platform to run at all, since the whole set of Windows system libraries and infrastructure would still be missing. As a matter of fact, even the binary formats in use differ: PE/COFF on Windows, ELF on Linux.
An effort to supply a whole compatibility layer, including the needed system libraries, is already under way. As mentioned above, it goes by the name of WINE (see: http://www.winehq.org/).
I hope I gave you some good reasons why no one has ever seriously tried (or is likely to try) what you asked about.
I have a Samsung TV connected via optical output to my DTS sound system. The TV has an ARM CPU and an embedded Linux operating system, and it comes with a built-in media player. But the internal media player doesn't pass DTS audio through to my sound system; it only passes through AC3 and other formats. I want to hack the TV's firmware so that it passes DTS audio through to my sound system.
What are good books, learning resources, etc. to start this kind of hobby project? I have never programmed on an embedded platform. What should I know before I start? For example, should I know audio programming, electronics, the Linux kernel, C programming? Any recommendation would be helpful.
Actually, there is a whole SourceForge project dedicated to firmware hacking on Samsung TVs, called SamyGO. But when I asked one of the main firmware hackers on that site, I didn't get a reply from him. So I thought someone here could answer my question. Thanks.
UPDATE:
How much electronics knowledge do I need for this kind of job? (I have a CS degree and basic knowledge of electronics, logic design, etc.) Do I need to be a Linux kernel or C expert?
Actually, I have always wondered how embedded-device hackers like George Hotz gain this kind of knowledge. These are closed systems without any documentation, so how do they do it? Do they learn it at school? (I don't think so.) If they do, what do they study and which books do they use? If they don't, how do they learn it?
I came across this excellent blog post on reverse engineering the firmware of a Linksys router:
http://www.devttys0.com/2011/05/reverse-engineering-firmware-linksys-wag120n/
It explains in detail, in clear steps, how to reverse engineer the firmware and get at the file system. If you can follow the steps, you will get a really good insight into what it takes to hack a firmware.
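To give a rough idea of what the tooling in that post does under the hood: extraction tools such as binwalk scan the firmware image for known signatures (filesystem and compression magic bytes) to locate the embedded file system. The following is only a minimal sketch of that idea in plain C, with a deliberately tiny signature list; it is not a substitute for the real tools.

    /* Simplified illustration of signature scanning in a firmware image.
     * Real tools (e.g. binwalk) know hundreds of signatures; this sketch
     * only looks for two common ones to show the idea. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    struct sig { const char *name; const unsigned char *magic; size_t len; };

    int main(int argc, char **argv)
    {
        static const unsigned char squashfs_le[] = { 'h', 's', 'q', 's' }; /* SquashFS, little-endian */
        static const unsigned char gzip_hdr[]    = { 0x1f, 0x8b };         /* gzip stream */
        const struct sig sigs[] = {
            { "SquashFS (LE)", squashfs_le, sizeof squashfs_le },
            { "gzip",          gzip_hdr,    sizeof gzip_hdr    },
        };

        if (argc != 2) { fprintf(stderr, "usage: %s firmware.bin\n", argv[0]); return 1; }

        FILE *f = fopen(argv[1], "rb");
        if (!f) { perror("fopen"); return 1; }
        fseek(f, 0, SEEK_END);
        long size = ftell(f);
        fseek(f, 0, SEEK_SET);

        unsigned char *buf = malloc(size);
        if (!buf || fread(buf, 1, size, f) != (size_t)size) { fprintf(stderr, "read error\n"); return 1; }
        fclose(f);

        /* report the offset of every signature hit */
        for (long off = 0; off < size; off++)
            for (size_t i = 0; i < sizeof sigs / sizeof sigs[0]; i++)
                if (off + (long)sigs[i].len <= size &&
                    memcmp(buf + off, sigs[i].magic, sigs[i].len) == 0)
                    printf("0x%08lx  %s\n", off, sigs[i].name);

        free(buf);
        return 0;
    }

Once you know where a SquashFS or gzip blob starts, you can carve it out and unpack it, which is exactly what the blog post walks through.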
I don't think you need a lot of electronics knowledge to hack a firmware; basic 101-level knowledge should suffice. If you understand basic operating systems and systems in general, you should be able to work your way through.
Hacking an undocumented system is nowhere near a trivial task. You should definitely learn C and practice embedded programming, and a good knowledge of electronics will help a lot (you'll have to look at the circuit and guess how it works if you can't find any docs on Google).
My advice is: get a (documented) ARM board to start hacking on (a BeagleBoard or PandaBoard comes to mind). You'll learn a lot about Linux, C, kernel development and even electronics if you want to.
Trying to dive directly into a TV system will probably be very frustrating for you.
UPDATE:
On the electronics side, you don't need to be an electronics engineer. Study a lot of digital electronics and understand how CPUs, buses and common peripherals work. Most of the time you can read a chip's markings and search for its datasheet, but sometimes chips are designed specially for one device, or have no identification at all. In that case, you'll need a logic analyzer to reverse engineer them and work out how to "talk" to them.
You can learn CS and electronics from books, but real reverse engineering can only be learnt by experience (of course, learning how others do things helps a lot).
Go ahead and open devices you find interesting, try to understand how they work, and change things on them. You'll certainly burn some of them (begin with the cheaper ones), but it's the best way to learn how to hack devices.
Just take care not to die while messing with high-voltage devices (and LCD TVs do have some HV parts).
Suitable development boards to consider, with very active communities, i.e. easy to get help if you are stuck :-)
http://www.arduino.cc/
http://beagleboard.org/
Start with them to learn about embedded systems before you move on to more difficult tasks.
Yes, C language knowledge is important.
First try to learn the Raspberry Pi, then jump into Arduino.
You can find many communities for these on Facebook and Google+; join them and get involved.
Then you can pick up plenty of hacks.
I'm planning to use an Intel Atom on a board for an embedded system. The embedded system will be running programs written in C for image processing. Since it's an embedded system, footprint is obviously a concern. I was thinking about using a modified version of the Linux kernel. Any other options?
I've written my own OS for embedded systems, so I'm not too sure. But one project I've been wanting to try is uClinux, though that might not be enough for what you want to do. If you have more resources, you might want Puppy Linux or Damn Small Linux. They should all have a C compiler that will suit your needs.
Hope this helps!
I don't know how much memory you have, but Windows CE might be another choice. Going this route lets you stay with Windows tools (if you like those). There is also a micro edition of the .NET Framework available for use on Windows CE.
It depends on what services you need from your OS. The smallest footprint will be achieved by using a simple RTOS kernel such as uC/OS-II or FreeRTOS; however, support for devices, filesystems, etc. will be entirely down to you or to third-party libraries, with the associated integration issues. Also, the simpler kernels do not use the MMU to provide protection between tasks and the kernel; typically everything runs as a single multithreaded application (a minimal sketch of this model follows after this answer).
Broader and more comprehensive hardware support can be provided by 'heavyweights' such as Linux or Windows Embedded.
A middle ground can probably be achieved with a more fully featured RTOS such as eCos, VxWorks, Nucleus, or QNX Neutrino. QNX is especially strong on MMU support.
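To make the "single multithreaded application" point concrete, here is a minimal FreeRTOS-style sketch: the whole system is one C program that creates tasks and then hands control to the scheduler, with no process or MMU separation between tasks and kernel. The task body and its name are placeholders of my own; xTaskCreate, vTaskStartScheduler, vTaskDelay and pdMS_TO_TICKS are the real FreeRTOS calls.

    /* Minimal FreeRTOS sketch: on a small RTOS kernel the entire system is one
     * multithreaded C application. process_next_frame() would be the actual
     * image-processing work; it is left as a commented-out placeholder here. */
    #include "FreeRTOS.h"
    #include "task.h"

    static void vImageTask(void *pvParameters)
    {
        (void)pvParameters;
        for (;;) {
            /* placeholder: grab a frame from the sensor and process it */
            /* process_next_frame(); */
            vTaskDelay(pdMS_TO_TICKS(10));   /* wait for the next frame period */
        }
    }

    int main(void)
    {
        /* One task per concern; all tasks share a single flat address space. */
        xTaskCreate(vImageTask, "img", configMINIMAL_STACK_SIZE + 256,
                    NULL, tskIDLE_PRIORITY + 1, NULL);

        vTaskStartScheduler();   /* never returns if the scheduler starts successfully */
        for (;;) { }
    }

The flip side, as noted above, is that a bug in any task can corrupt any other task or the kernel itself, which is exactly the protection the MMU-based options give you.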
"Image processing" in an embedded box almost always means real-time image processing. Your number one concerns are going to be maximizing data throughput and minimizing latency processing overhead.
My personal prejudice, from having done real-time image processing (staring focal plane array FLIR nonuniformity compensation and target tracking) for a living, is that using an Intel x86-ANYTHING for real-time embedded image processing is a horrible mistake.
However, assuming that your employer has crammed that board down your throat, and you aren't willing to quit over their insistence on screwing up, my first recommendation would be QNX, and my second choice would be VxWorks. I might consider uCOS.
Because of the low-overhead, low-latency requirements inherent in moving massive numbers of pixels through a system, I would not consider ANYTHING from Microsoft, and I would put any Linux at a distant third or fourth place, behind QNX, VxWorks, and uCOS.
If you are needing to do real-time image processing, then you will likely want to use a Real-Time Operating System. If that is the route you want to take, I would recommend trying out QNX. I (personally) find that QNX has a nice balance of available features and low overhead. I have not used VxWorks personally, but I have heard some good things about it as well.
If you do not need Real-Time capabilities, then I would suggest starting with a Linux platform. You will have much better luck stripping it down to meet your hardware limitations than you would a Windows OS.
The biggest factor you should consider is not your CPU, but the rest of the hardware on your board. You will want to make sure that whatever OS you choose has drivers available for all of your hardware (unless you are planning on writing your own drivers), and embedded boards can often have uncommon or specialized chipsets that don't yet have open-source drivers available. Driver availability alone might make your decision for you.
Which is better, Qt4 or JavaFX?
I work for a startup. We built the first version using JavaFX. Now my superiors are suggesting a shift to Qt4.
Is it a good decision to shift?
A question like this is too broad to answer meaningfully.
Qt4 and JavaFX have different goals.
Qt4 is for writing cross-platform desktop applications
JavaFX is for writing rich Internet applications
Qt4 allows for better desktop integration (drag&drop, playing nice with the configuration systems of different platforms, native look & feel, ...), so if you want a nice desktop app, use Qt4.
OTOH, Qt4 cannot be used for an Internet application (web app), so if that's what you want, use JavaFX. JavaFX can also be used for desktop apps, but it requires more compromises than a dedicated desktop GUI toolkit like Qt4.
So what are your requirements?
Edit:
Based on your comments:
Standalone desktop apps are not JavaFX's main goal, but if it works for you, I see no reason to change it.
You can write great applications using JavaFX, especially because you have access to all the stuff the JDK offers. As to system integration: while Qt4 is better in this respect, Java already offers a lot (such as Swing and JDK 6's new system-integration features). So unless you intend to write something highly integrated (such as a Windows shell extension), JavaFX will be fine.
I'm looking for options to replace an old application running on a Psion Workabout mx handheld, developed in OPL.
The handheld and the application (developed more than 10 years ago) are both still working fine, but the device is discontinued, and it gets harder every time to find replacement parts for it.
Then I started to look at the newer Psion handheld models, but they are expensive and filled with features that I don't need at all (color screen, barcode reader, ...). Also, they look a lot less rugged than the Workabout mx that I'm using now. I have around 50 handhelds to replace, and I'm looking for good options with these features:
Reasonable priced
Fast numeric data entry, optionally alphanumeric data (not usual)
Readable screen, with at least 7 lines of text visible. No color needed
Rugged
Replacement parts available
Reasonable development environment (handheld emulator, IDE, minimal GUI support, PC / handheld connectivity)
Maybe an old mobile phone with Java support can do the work?
Please indicate the suggested device model and the development options available for it.
Thanks in advance
Perhaps a Compaq iPAQ would be a suitable replacement, but I'm not sure they make those anymore.
I was also thinking an iPod Touch (serious suggestion!) may be a good device to get (cheapest version £165). It's a good development environment (Objective-C, free compiler download, although you'll probably have to register with Apple to get a certificate for your apps so they work on the device). This may be too expensive, though, and far above the requirements you're looking for.
If you're thinking about Java-enabled phones (I'm not sure what your performance requirements are, but they sound quite minimal if it's a port of a 10-year-old app), you want to be careful: some mobile Java implementations won't support floating-point arithmetic directly, so you may have to implement a fixed-point math library. Some phone Java VMs vary quite dramatically performance-wise too, though again this may not be your primary concern. The mobile phone development route may be a valid one if you can assume that your off-site engineers all have company phones anyway!
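To make the fixed-point remark concrete, here is a minimal Q16.16 fixed-point sketch (16 integer bits, 16 fractional bits), written in C purely for illustration; on a Java ME phone the same idea would be expressed with Java int/long. The fx_* names are my own, not from any particular library.

    /* Minimal Q16.16 fixed-point sketch. The fx_* helpers are illustrative
     * names, not a standard API; intermediate results use 64 bits so the
     * multiply and divide do not overflow. */
    #include <stdint.h>
    #include <stdio.h>

    typedef int32_t fx;                  /* Q16.16 value */
    #define FX_ONE (1 << 16)

    static fx fx_from_int(int v)         { return (fx)(v * FX_ONE); }
    static fx fx_mul(fx a, fx b)         { return (fx)(((int64_t)a * b) >> 16); }
    static fx fx_div(fx a, fx b)         { return (fx)(((int64_t)a << 16) / b); }
    static double fx_to_double(fx a)     { return a / 65536.0; }

    int main(void)
    {
        fx half  = FX_ONE / 2;           /* 0.5 */
        fx three = fx_from_int(3);       /* 3.0 */
        printf("3 * 0.5 = %f\n", fx_to_double(fx_mul(three, half)));  /* 1.5 */
        printf("3 / 0.5 = %f\n", fx_to_double(fx_div(three, half)));  /* 6.0 */
        return 0;
    }

On a phone without floating-point support you would obviously drop the double conversion and the printf formatting as well; they are only here so the sketch prints something checkable on a PC.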