Is JXTA 2.7 production ready, and when is Chaupal 1.0 due?

I am about to develop a new application and I was looking at the JXSE framework. It seems some political issues with Oracle have killed the project, from what I understand. This raises a few questions:
Is JXSE 2.7 stable enough for production use?
Will JXSE/JXTA for Java still be developed?
Is Chaupal the continuation of JXSE (does it start from the 2.7 codebase)?
When is Chaupal 1.0 set for release? (An approximation, I guess.)
How similar will Chaupal's interfaces be to JXSE's (how easy will it be to migrate)?
Thanks!

Political issues did not kill the project. Some people are still posting patches, but the situation has certainly slowed down involvement.
Is JXSE 2.7 stable enough for production use?
Yes, some companies are even using the 2.6 branch.
Will JXSE/JXTA for Java still be developed?
Some people are posting patches on the 2.6.x branch, and these will be re-integrated into the trunk somewhere down the road. The future of JXTA depends on voluntary contributions.
Is Chaupal the continuation of JXSE (does it start from the 2.7 codebase)?
Yes and no. No, it does not start from the 2.7 codebase; it starts from scratch. Yes, it is a continuation 'in spirit' of the development of an open source Java-based P2P library.
When is Chaupal 1.0 set for release? (An approximation, I guess.)
There is no planned date. We are working in our spare time, so progress is not fast.
How similar will Chaupal's interfaces be to JXSE's (how easy will it be to migrate)?
Chaupal aims at being simpler and more straightforward. One of the weaknesses of JXTA was its complicated API, and we want to solve this issue. We also want to focus more on communication between peers and drop the service-loading features of JXTA.
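A minimal sketch of the kind of API we are aiming for might look like this in Java. This is purely illustrative, not Chaupal's actual API; every name in it is hypothetical:

    // Hypothetical sketch, not Chaupal's real API: a minimal surface for
    // peer-to-peer messaging, with none of JXTA's advertisement and
    // service-loading machinery.
    interface Peer {
        String id();                                           // stable peer identifier
        void send(byte[] payload) throws java.io.IOException;  // deliver a message to this peer
    }

    interface PeerGroup extends AutoCloseable {
        java.util.List<Peer> discover() throws java.io.IOException;            // peers currently reachable
        void onMessage(java.util.function.BiConsumer<Peer, byte[]> handler);   // incoming messages
        @Override void close() throws java.io.IOException;     // leave the group
    }

The intent is that joining a group and exchanging messages should take a handful of calls, instead of the module and service plumbing JXTA required.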
Update (August 2013): You thought JXTA/JXSE was dead? Well, someone worked further on it and developed a DZone tutorial (unfortunately, SO does not allow links to DZone, so Google: JXSE and Equinox Tutorial).

Related

Consequences of not upgrading Expression Engine?

How necessary is an Expression Engine software update for an existing website that works fine? I have version EE 2.11.6 and have been advised to upgrade to version 4, at great expense.
I need the site to work for maybe five more years before I retire my business. It is a simple brochure site for a photography business displaying pictures and text with a blog and a contact form. Nothing fancy, no sensitive customer data, nothing of interest to hackers.
If I don't do the upgrade, is it really likely to stop working anytime soon? Would upgrading to version 3 instead of 4 give me a few more years?
Thanks in advance!
Dennis
Whether it will keep working largely depends on your hosting environment. If your hosting environment stays the same, you don't want new functionality (i.e., plugins), and you have no payment systems involved, you could probably make this site stretch another five years. On the other hand, if your host wants to upgrade operating systems and PHP versions, eventually EE 2 will not be an option.
An upgrade to EE 3 is a big jump from EE 2; the jump from EE 3 to 4 is not nearly as difficult. What creates the cost in upgrading is often incompatible plugins. The EE 3 release changed the underlying code framework to the point that many useful plugins were abandoned and many developers decided to exit the EE marketplace. Finding replacements and re-implementing the functionality a plugin provided can be time-consuming and expensive.
By the way, every site has something of interest to hackers. Your content is not necessarily the end goal: if attackers can get access to your server through some exploit, they can use it to send spam emails, host a botnet, or use the computing power for their own purposes with obscured traceability. That can bring negative consequences for your site in the form of blacklisting and other penalties for being hacked.
Best of luck!

Why has JXTA been abandoned? Any alternatives out there?

P2P/grid computing seems like a promising concept, and JXTA looks like the only all-in-one framework for it. Is there a reason this field is so sparsely pursued?
I led the releases of JXTA 2.6 and 2.7. JXTA is not completely abandoned; some people have posted patches on the 2.6 branch, and these could easily be merged into the 2.7 branch.
There are many reasons why people did not carry on participating in JXTA:
Oracle did not follow-up on their duties regarding project governance, which left the project in a limbo state.
Oracle did not follow-up on a request to move the project to Apache.
The code base was old. We cleaned it up and implemented unit tests, but moving the project to the next level would have required a lot of rewriting, and there were not enough volunteers.
But more fundamentally, the reason few P2P frameworks took off is that P2P is fundamentally complex once you get into the details. Most people don't realize this until they get their hands dirty. It is not possible to implement P2P 'in a simple way'.
So it had nothing to do with all-Java clients, licensing fees, or other such issues.
Update (August 2013): You thought JXTA/JXSE was dead? Well, someone worked further on it and developed a DZone tutorial (unfortunately, SO does not allow links to DZone, so Google: JXSE and Equinox Tutorial).
Update (November 2013): A group of people is working on new releases of JXTA. For more information, register on the mailing lists.
Interestingly, what was missing from all the P2P initiatives of the past was a motivation for a peer to stay active. The question was always: why would a peer keep running CPU-draining, verbose XML-based protocols?
Trust was another factor: how can I trust a peer? As a key member of the team, I helped introduce security, but security doesn't address trust.
To make it even worse, JXTA introduced the concept of super nodes, defeating the very concept of peer-to-peer.
However, not everything was that bad. JXTA provided a lot of new concepts, one being edge computing with JXME and JXTA sitting together. You could call it the current-day fog computing, where the heavy lifting was on the JXTA node and some intelligence ran on the constrained JXME nodes.
Fast forward, and blockchain addressed most if not all of the questions that P2P platforms could not answer: trust, incentivizing peers, tamper resistance, and much more.
P2P is still alive :)
I think it's for the same reasons that RMI, CORBA, and Jini aren't much in favor: complex and closed.
Simple and open win most of the time.
It might have had something to do with all-Java clients or licensing fees or something else.
It could be competition. MPI is a widely accepted messaging standard for computing. Hadoop is getting a lot of traction.
UPDATE: The answer that was accepted discusses why people may or may not choose to participate in JXTA. I think my answer has more to do with user adoption, which is different. Mine goes back to the origins of JXTA, not the details of releases 2.6 and 2.7.
If you work with Linux, try this: http://www.p2pns.org/
"P2PNS (Peer-to-Peer Name Service) is a distributed name service using a peer-to-peer network. The current focus of P2PNS is to provide a secure and efficient SIP name resolution for decentralized VoIP ( P2PSIP)."
In most cases, name resolution is enough to build a P2P app on top of it.
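As a rough illustration of that resolve-then-connect pattern (a sketch only: P2PNS itself is a daemon you query, not a Java library, so plain DNS stands in for the name service here, and the peer name is made up):

    import java.io.IOException;
    import java.io.OutputStream;
    import java.net.InetAddress;
    import java.net.Socket;

    // Once a peer's name resolves to an address, the rest is an ordinary
    // direct connection. With P2PNS you would ask its resolver instead of
    // calling InetAddress.getByName.
    public class ResolveThenConnect {
        public static void main(String[] args) throws IOException {
            String peerName = "peer.example.org";                // hypothetical peer name
            InetAddress addr = InetAddress.getByName(peerName);  // the name-service step
            try (Socket socket = new Socket(addr, 5060)) {       // 5060 is the conventional SIP port
                OutputStream out = socket.getOutputStream();
                out.write("hello peer\n".getBytes("UTF-8"));     // application protocol goes here
            }
        }
    }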

Viability of using ADF Faces outside of JDeveloper Studio

Having worked with two JSF component libraries, or frameworks if you wish, I can't help but wonder if I'm missing a trick by not evaluating ADF Faces. The big stumbling block for me is the way ADF is clearly deliberately tied in with JDeveloper Studio. I work with NetBeans on a GlassFish application server, and as open as I am to change, I want to maximise my existing experience and leverage this stability.
My (tentative) rationale is that perhaps a company with the resources of Oracle could come up with a better quality product than the alternatives, with superior functionality, and this is what I wish to evaluate.
So I just wondered if anyone had any experience they could share with respect to working with ADF Faces outside a JDeveloper environment. Presumably, from a technical perspective, it's not much more than extracting the necessary jars from the distribution and taking it from there. Equally important would be any licensing/legal considerations, of course.
I've read that ADF has a business layer; however, I'd want to continue working with a Java EE 6 stack at this time.
What would be really cool would be a Maven repository, although presumably the reason that no such thing exists is to protect the JDeveloper business.
Has anyone got ADF Faces working outside JDeveloper, and can you provide a small amount of direction as to any major considerations?
If so, is it worth it? Is ADF Faces a superior and more in-depth product than can otherwise be found (oh no, what have I said...)?
What are the licensing considerations? My present understanding is that fees are payable if you're not using WebLogic. How about GlassFish Enterprise?
I have only recently found out, and was very surprised to realise, that ADF has the third JSF core implementation, alongside Mojarra and MyFaces. Is this just a case of taking the RI and making a few necessary changes to support core ADF functionality, or more than that? I see from the JIRA that Ed Burns corresponds closely with the ADF team; of course, they now work for the same company. Clearly the RI has to mirror the spec, and that takes time, so this in itself could be interesting.
Thanks.
I don't know if this will help you, but it seems that you can now freely deploy ADF on the GlassFish server:
https://blogs.oracle.com/shay/entry/deploying_oracle_adf_applications_to
http://docs.oracle.com/cd/E35521_01/admin.111230/e16179/ap_glassfish.htm#BABIEADD
ADF is only officially certified to work with WebSphere 7.0.0.13 AS or ND (aside from WebLogic, of course), although there are blogs and forum posts indicating that people were able to make it work with Tomcat (with much pain, of course).
I know for a fact that there is an Eclipse plugin of some kind that has ADF Faces features.
AFAIK, you only have to pay when you are using a container/application server that is not WebLogic.
ADF Faces is great. It has all the components you would expect, plus more. It also has a great controller framework (see ADF Task Flows), and there is a demo application you can check out.

Can Gobby/Sobby be used for collaborative editing by a team of developers?

Gobby/Sobby is an open source client/server for collaborative editing of plain text files (source code).
My question is four-fold:
Can you share any real-life usage of Gobby/Sobby for development among a group of physically separated developers?
Is the project mature enough as a productivity tool?
What are the working use cases?
What versions should be used? (It seems the 'undo' feature is not yet officially packaged.)
Thanks,
Jerome
Yes
Mostly, you encounter issues running Windows clients (random crashes). Find a version that works and stick to it.
I recently started using it with the team I manage. It has definitely increased my productivity when reviewing code, collaborating on implementation, and answering general questions. It's great for those "hey, can you take a quick look at this?" questions.
We have only used development release 0.4.93.

How important is backwards-compatibility? [closed]

When you are considering an upgrade in a development tool, how important is backwards-compatibility to you? Would you still purchase Visual Studio 2010 if it required significant changes to your source code? Where is the tipping point for you in terms of trading backwards compatibility for new features?
While you asked this from the point of view of the developer, I think it would be a more interesting question in reference to the software you develop. So I'm going to answer that question instead. :)
Hardware and software that's backwards compatible (and, more importantly, future-compatible) provides a sense of security to your users, especially when they are buying or upgrading platforms like Windows. If nothing else, Windows is known for its meticulous attention to backwards compatibility. You can run programs written over a decade ago on Windows Vista with only minor problems, provided they were "well written" (i.e., they don't use undocumented APIs).
On the other hand, strict attention to backwards compatibility can tie your hands when you're attempting to introduce new features or to revolutionize the platform. Apple knew it had a dying OS, and in one of its most daring moves, it purchased NeXT and decided to make NeXTSTEP the new Mac OS. One of the key things that sold people on the transition was the backwards-compatibility layer, Classic. Again, when Apple decided to switch to Intel chips, a mechanism for running PowerPC apps on Intel called Rosetta, along with Universal Binaries, allowed people to move freely between PowerPC and Intel without fear of losing their applications.
One interesting thing is that with the transition to Intel, the Classic environment disappeared, but nobody really cared, because they had had the preceding five years to transition away from Mac OS 9. So it is possible to eventually drop support for legacy systems, as long as you provide an easy way to migrate to the new system and give your users ample time to do so.
In a development tool, if it doesn't provide total backward compatibility with my previous code, I won't buy it, and I doubt anyone would. Frankly, there's no point. If I already have a compiler that builds my source code into executable code that works for me, then I'll use that. Why bother changing my code to conform to what is, evidently even to the toolmaker, not a standard? And if they force source code changes from one version to the next, why would they bother to make the version after that compatible?
100% backwards compatibility with source is a requirement. The only situation where this isn't a total requirement is when the incompatible bits are extensions, i.e., API changes that are specific to the tool, such as Eclipse plugins. Even then, I'd like compatibility, but I realize that it can't be completely expected. But if you provide an API for base application/tool development and can't be bothered to maintain compatibility, well, then you clearly aren't serious about your tools, and I won't pay serious money for them.
For home projects, backwards compatibility really isn't important. For the office/enterprise, it's absolutely critical.
It depends on what environments you need to support and what third-party tools are used that may or may not be compatible.
For example, where I work, we upgraded everyone from VS2005 to VS2008 except for our BI group, as the SQL Server BI tools were not compatible with VS2008. Once the tools were updated, they upgraded to VS2008.
When looking at VS specifically, keep in mind that VS2008 can target .NET 2.0, .NET 3.0, and .NET 3.5. The trick is to realize that it actually targets .NET 2.0 SP1 and .NET 3.0 SP1. As such, upgrading the IDE should not require you to make changes to your code.
In general, if you are developing a platform that will constantly be used by a lot of other users to build their own products, and you plan to develop the application for a long time, then it is important. See PHP, Python, Eclipse, and other open source projects that put a lot of importance on backwards compatibility. It is also important when developing services or other open APIs used in an n-tier architecture; otherwise, applications all over an enterprise can break every time you change your services.
Now, if you are building a shrink-wrapped application or a business application, then it is not so important, because each version is separate from its predecessors.
Since a lot of changes are happening in the software and hardware fields, I think it is a good idea to be open to new changes and better tools while you architect your solution. For example, we didn't have multi-core processors, high-end graphics cards, or fast network cards back in the 90s, so naturally the optimization goals of the compilers and tools were different. But at the same time, tools like Visual Studio are doing their best to accommodate old frameworks and apps.
I think if we are looking forward to a better world, we should be open to constant change until this industry is fully mature. (That may not happen in our lifetime, though. :))
Define "significant changes". I'd go for it if the changes could be made with a carefully crafted "search & replace" even if they were extensive.
However, that's what I would do. Any company I've worked for would balk at any changes to existing code.
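For what it's worth, the kind of mechanical rewrite I mean could be scripted in a few lines of Java. This is only a sketch, and the OldApi/NewApi names are invented for the example:

    import java.io.IOException;
    import java.io.UncheckedIOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.stream.Stream;

    // Walk a source tree and apply one careful regex rename to every .java
    // file: OldApi.doWork(...) becomes NewApi.execute(...). Both names are
    // made up; the point is that the migration is mechanical and auditable.
    public class BulkRename {
        public static void main(String[] args) throws IOException {
            Path root = Paths.get("src");                        // tree to migrate
            try (Stream<Path> files = Files.walk(root)) {
                files.filter(p -> p.toString().endsWith(".java"))
                     .forEach(BulkRename::rewrite);
            }
        }

        static void rewrite(Path file) {
            try {
                String src = new String(Files.readAllBytes(file), StandardCharsets.UTF_8);
                String migrated = src.replaceAll("\\bOldApi\\.doWork\\(", "NewApi.execute(");
                if (!migrated.equals(src)) {                     // only touch changed files
                    Files.write(file, migrated.getBytes(StandardCharsets.UTF_8));
                }
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        }
    }

Run it on a version-controlled checkout, so the resulting diff can be reviewed before anyone commits to the change.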
