Data Access Layer in ASP.NET - c#-4.0

I'm afraid I'm overdoing things here.
We recently started a .NET project containing different class libraries for DAL, Services and DTO.
The question is about our DAL layer: we wanted a clean and easily maintained data access layer, and we wanted to go with Entity Framework 4.1.
We are still not clear whether to opt for plain ADO.NET using the DAO and DAOImpl methodology, or for Entity Framework.
Could anyone please suggest the best approach?

It depends on how much work you want to put into creating your own customized DAL. Writing your own implementation on plain ADO.NET gives you the most control, but it also means maintaining and optimizing it yourself and handling complex cases such as concurrency, caching and the mapping between your business objects, the DAL and the database.
If you want to concentrate more on business value and functionality, you might decide to go with Entity Framework (4.3 is now released, with 5.0 to come). The advantage is that you would use a DAL that has been carefully tested and that already contains solutions for concurrency, caching and mapping.
But I would strongly suggest using the Repository and Unit of Work patterns on top of it, to abstract the usage of Entity Framework away from your other layers. You would then have the option of completely changing the underlying technology later without any impact on the other layers (for example, you could replace EF with your own ADO.NET implementation if you find that the performance is not as good as it should be).
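To make that suggestion concrete, here is a minimal sketch of a Repository and Unit of Work abstraction over the EF 4.1 DbContext API. All names here (Customer, ICustomerRepository, AppContext) are illustrative, not taken from the question:

using System;
using System.Collections.Generic;
using System.Data.Entity; // EF 4.1 DbContext API
using System.Linq;

public class Customer // illustrative entity
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// The other layers depend only on these interfaces, never on EF itself.
public interface ICustomerRepository
{
    Customer GetById(int id);
    IEnumerable<Customer> GetAll();
    void Add(Customer customer);
}

public interface IUnitOfWork : IDisposable
{
    ICustomerRepository Customers { get; }
    void Commit();
}

// EF-backed implementation, replaceable later by a plain ADO.NET one.
public class AppContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }
}

public class EfCustomerRepository : ICustomerRepository
{
    private readonly AppContext _context;
    public EfCustomerRepository(AppContext context) { _context = context; }

    public Customer GetById(int id) { return _context.Customers.Find(id); }
    public IEnumerable<Customer> GetAll() { return _context.Customers.ToList(); }
    public void Add(Customer customer) { _context.Customers.Add(customer); }
}

public class EfUnitOfWork : IUnitOfWork
{
    private readonly AppContext _context = new AppContext();
    private ICustomerRepository _customers;

    public ICustomerRepository Customers
    {
        get { return _customers ?? (_customers = new EfCustomerRepository(_context)); }
    }

    public void Commit() { _context.SaveChanges(); }
    public void Dispose() { _context.Dispose(); }
}

A service or controller would then depend only on IUnitOfWork, so swapping EF for a hand-rolled ADO.NET DAL later means writing new implementations of the same interfaces.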
It depends on the type of application you need to build and on its performance requirements. Using EF can really reduce your work and give you much quicker results. It also depends on the development team's capabilities: if you only have senior developers and architects working on the project, you will create your own DAL easily, but for beginners it is really hard to implement a good, optimized and robust DAL.
I hope that helps!

I've been using the ADO.NET and DTO combination in my DAL for as long as I can remember, and I love the fact that I control the entire process of creating the entities and methods. However, that comes with the price of having to write classes for every entity and methods for every stored procedure. I don't mind that, but recently I discovered PLINQO for LINQ to SQL and I'm loving it. It gives you easy creation and updating of classes based on your database schema while allowing for high levels of customization. It's basically LINQ to SQL on steroids.
I also liked NHibernate, but I think it has a steeper learning curve than PLINQO.
I'd give PLINQO a try if I were you.
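For contrast, this is roughly what the hand-rolled ADO.NET + DTO approach described above looks like: one class per entity, one method per stored procedure. The GetCustomerById procedure, the connection string and the column names are all assumed for illustration:

using System.Data;
using System.Data.SqlClient;

// Plain DTO, one class per entity.
public class CustomerDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class CustomerDao
{
    private readonly string _connectionString;
    public CustomerDao(string connectionString) { _connectionString = connectionString; }

    // One method per stored procedure; "GetCustomerById" is a hypothetical proc.
    public CustomerDto GetById(int id)
    {
        using (var connection = new SqlConnection(_connectionString))
        using (var command = new SqlCommand("GetCustomerById", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@Id", id);
            connection.Open();

            using (var reader = command.ExecuteReader())
            {
                if (!reader.Read()) return null;
                return new CustomerDto
                {
                    Id = reader.GetInt32(reader.GetOrdinal("Id")),
                    Name = reader.GetString(reader.GetOrdinal("Name"))
                };
            }
        }
    }
}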

Related

Is it possible to remove the Oracle ADF component from a web application and make it pure JSF?

We have an Oracle Forms application, and one of the many thoughts we considered for converting to non-Oracle-Forms technology was to use JHeadStart (an Oracle product) that converts Oracle Forms to an ADF application. But we would prefer not to use ADF, so is there any way we can remove the dependency on ADF?
If anyone feels this is not the right question to ask, instead of downvoting please leave a comment and I will remove the question.
Thanks.
As always, it depends on what you want to achieve. I don't know JHeadStart, but to me, it sounds like a tool converting a legacy application to a framework that might be considered legacy soon. There are a few supporters of ADF, so I believe it's a good thing if you're ready to live with the compromises a full-stack framework brings. But in general, ADF is not popular among JSF developers (mostly because of those compromises, which often turn out to be too restrictive). Even more generally speaking, JSF is not popular among UI developers. That, in turn, is a bit unfair, but I observe a huge movement to pure JavaScript UI frameworks.
This indicates that using a tool like JHeadStart isn't the most future-proof approach. It's (probably) good for surviving the next month, but in the long run it's likely to backfire.
Let's look at the question from another angle. Why do you want to get rid of Oracle Forms? Most likely it's because of recruiting problems, but it might also have something to do with architecture. Oracle Forms supports a programming style that integrates the database layer tightly with the UI layer. That's a very efficient way to write small applications, but it scales badly as your application grows.
So I'd recommend spending some extra money and time to re-implement your application from scratch. Automated tools tend to generate code that's hard to maintain. Re-designing your application from scratch gives you the opportunity to build an application that lasts a decade.
Oh, and I don't think it's possible to use JHeadStart without introducing ADF. Simply because JHeadStart has been designed with ADF in mind.

Choice of technical solution for handling and processing data in a Liferay project

I am researching how to start a new project based on Liferay.
It relies on a system that will require its own data model and a certain agility and flexibility in data management, as well as in its visualization.
These are my options:
Using Liferay Expando fields and defining my own data models. I would have to build the entire view layer myself.
Using the Liferay ECMS, adding patches and creating structures and hooks that let me define master-detail data models. This makes the view side much easier (Velocity templates), but it is perhaps the "dirtiest" way.
Generating the data layer and service access with Hibernate and Spring (using a service factory, for example).
Liferay Service Builder, which would be similar to the option of building the platform with Hibernate and Spring.
CRUD generation systems such as OpenXava or XMLPortletFactory.
And now my question: what is your advice? What advantages or disadvantages do you think each option would bring?
Thanks in advance.
I can't speak for the other CRUD generation systems but I can tell you about the Liferay approaches.
I would take a hybrid approach.
First, I would create the required data models as best I can with the current requirements in Liferay Service Builder, and maintain them there as much as possible. This requires that you rebuild and redeploy your plugin every time you change the data model, but it greatly enhances performance compared to all the other Liferay approaches you've mentioned. Service Builder is much more rigid in that regard and cannot be changed via the GUI.
However, in the event that for some reason you cannot use Service Builder to redefine your data models and you need certain aspects of them to be changed via the GUI, you can also use Expandos to extend the models you've created with Service Builder. So it is the best of both worlds.
As for the other option, using the ECMS would be a specialized case, and I would only take that approach if there is a particular requirement it satisfies (like integration with the ECMS).
With that said, Liferay provides you many different ways to create your application. It ultimately depends on how you're going to use your application.

Best ORM to use with C# 4.0 [closed]

What is the best approach: to use an ORM like NHibernate or Entity Framework, or to build a custom ORM?
I will use this ORM for a C# 4.0 project.
UPDATE 2016
Six years later things are VERY different. NHibernate is all but abandoned, other alternatives have been abandoned as well (e.g. Subsonic), Entity Framework is perhaps the most common full-featured ORM, and people have been moving to micro ORMs like Dapper for years, to map from queries to objects with a minimum of overhead.
The application scenarios have also changed. Instead of loading and caching one big object graph at the expense of memory and performance, web services and REST APIs need to service a high number of smaller requests. This means that a full ORM's overhead is no longer acceptable.
This means that patterns and techniques like Active Record and transaction-per-request have become throughput- and scalability-killing anti-patterns.
One of the most important features nowadays is asynchronous execution, to reduce thread and CPU waste due to waits. NHibernate never made this transition.
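As a rough illustration of the micro-ORM style this update describes, here is what a small asynchronous Dapper query might look like; the Product type, the Products table and the connection handling are invented for the example:

using System.Collections.Generic;
using System.Data.SqlClient;
using System.Threading.Tasks;
using Dapper; // provides the QueryAsync extension method on IDbConnection

public class Product // illustrative type, mapped by column name
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ProductQueries
{
    private readonly string _connectionString;
    public ProductQueries(string connectionString) { _connectionString = connectionString; }

    // One small request, one small query: no change tracking, no big object graph.
    public async Task<IEnumerable<Product>> GetByCategoryAsync(int categoryId)
    {
        using (var connection = new SqlConnection(_connectionString))
        {
            await connection.OpenAsync();
            return await connection.QueryAsync<Product>(
                "SELECT Id, Name FROM Products WHERE CategoryId = @categoryId",
                new { categoryId });
        }
    }
}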
Original Answer
Define "best": is it the most mature, the one with the most documentation, the bigger community, the more mainstream choice?
NHibernate is more mature, feature rich, with a more advanced community and not likely to be discontinued when MS decides to break compatibility again. Entity Framework is more mainstream and is supported out-of-the-box. You will find more beginner books for EF, more advanced books for NH.
A good option would be to try one of the simpler ORMs like Subsonic and move to more advanced ORMs once you understand how ORMs work, what the various pitfalls are, and what SELECT N+1 means [:P]
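For readers who haven't met the term, here is a short sketch of what SELECT N+1 looks like with EF; the Customer/Order model is invented for the example:

using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;

public class Order { public int Id { get; set; } }

public class Customer
{
    public int Id { get; set; }
    public virtual ICollection<Order> Orders { get; set; } // virtual enables lazy loading
}

public class NPlusOneDemo
{
    public static void Run(DbContext context)
    {
        // SELECT N+1: one query for the customers, then one extra query per
        // customer when its Orders collection is lazily loaded inside the loop.
        foreach (var customer in context.Set<Customer>().ToList())
        {
            var orderCount = customer.Orders.Count;
        }

        // The fix: eager-load the association so a single query is issued.
        var customers = context.Set<Customer>().Include("Orders").ToList();
    }
}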
Just don't try to create your own ORM, there are several dozens out there already! Subsonic, Castle ActiveRecord, NH, EF (of course), LLBLGenPro...
If you can spend some money, definitely have a look at LLBLGen Pro 3.0:
full .NET 4.0 support, and it's a mature product; good support is also useful.
wide database support (Oracle, MS SQL, Firebird, MySQL, PostgreSQL, Sybase)
a nice designer, with model-first support and also database-first support
If your budget is thin, then try NHibernate. It's also a mature product, but it has a steeper learning curve. And if you need some support, you can always call Ayende :-)
For smaller projects, EF 4.0 is a good choice.
Most ORMs have their own strengths and weaknesses.
Entity Framework, for example, has the (huge?) advantage of being in the framework itself, but it is also fairly heavyweight and a bit more difficult to get up and running (a steeper learning curve).
There are some very nice, very easy to use commercial ORMs. I'm currently using Lightspeed in a C# 4 project, and I'm extremely happy with it for this specific scenario.
It really comes down to what you need from the ORM. If you want something very quick and easy to set up and use, Lightspeed, Subsonic and others are very nice. If you need something full-featured, then Entity Framework and NHibernate are good options.
Calling one ORM the best in a general sense is impossible: each of them is best from a different perspective, and you choose the one that best fits your needs. Linq2Sql was written with performance in mind and is very fast, but it lacks support for providers other than SQL Server. Others may not be as fast as Linq2Sql when dealing with SQL Server, but they support a wide variety of providers. The best idea is to list the features you want an ORM to have for your project and select the one that serves all your needs. You may ask these questions to choose the right ORM for your project:
Which database providers do you want the ORM to support? SQL Server, MySQL, Oracle, etc.
Do you need model-first or db-first support?
What are your performance criteria (memory, processing)?
Are you going to use it in a web app or a desktop app?
Do you have distributed clients in your application?
And the list goes on...
I use LINQ to SQL as my main ORM when creating C# applications. I'll eventually move on to Entity Framework, but for now this one is really easy to use and fast.
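A minimal LINQ to SQL sketch of that style, with an invented Customers table and attribute mapping:

using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Linq;

[Table(Name = "Customers")] // assumed table
public class Customer
{
    [Column(IsPrimaryKey = true)]
    public int Id { get; set; }

    [Column]
    public string Name { get; set; }
}

public class LinqToSqlDemo
{
    public static void Run(string connectionString)
    {
        using (var db = new DataContext(connectionString))
        {
            // The query is translated to SQL and executed on enumeration.
            var names = db.GetTable<Customer>()
                          .Where(c => c.Name.StartsWith("A"))
                          .Select(c => c.Name)
                          .ToList();
        }
    }
}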
I would agree with @this.__curious_geek that choosing the right ORM depends on your requirements. Having worked on both NHibernate and Entity Framework, I feel the latter is more user-friendly, as it has a GUI-based editor. On the feature-richness front, NHibernate has the advantage of supporting a lot of database providers. Also, customizing NHibernate turned out to be much easier than tweaking Entity Framework.
Assuming most tools satisfy your core requirements, I would prefer NHibernate, as a vibrant and engaged user community is a big plus for any tool.

Using java classes in Grails

I have a Java/Spring/Hibernate application, complete with domain classes that are basically Hibernate POJOs.
There is a piece of functionality that I think could be written well in Grails.
I wish to reuse the domain classes that I have created in the main Java app.
What is the best way to do so?
Should I write new domain classes extending the Java classes? That sounds tacky.
Or can I 'generate' controllers off the Java domain classes?
What are the best practices around reusing Java domain objects in Grails/Groovy?
I am sure there must be others writing some pieces in Grails/Groovy.
If you know about a tutorial that talks about such an integration, that would be awesome!
PS: I am quite a newbie in Grails/Groovy, so I may be missing the obvious. Thanks!
Knowing just how well Groovy and Grails excel at integrating with existing Java code, I think I might be a bit more optimistic than Michael about your options.
First thing is that you're already using Spring and Hibernate, and since your domain classes are already POJOs they should be easy to integrate with. Any Spring beans you might have can be specified in an XML file as usual (in grails-app/conf/spring/resources.xml) or much more simply using the Spring bean builder feature of Grails. They can then be accessed by name in any controller, view, service, etc. and worked with as usual.
Here are the options, as I see them, for integrating your domain classes and database schema:
Bypass GORM and load/save your domain objects exactly as you're already doing.
Grails doesn't force you to use GORM, so this should be quite straightforward: create a .jar of your Java code (if you haven't already) and drop it into the Grails app's lib directory. If your Java project is Mavenized, it's even easier: Grails 1.1 works with Maven, so you can create a pom.xml for your Grails app and add your Java project as a dependency as you would in any other (Java) project.
Either way you'll be able to import your classes (and any supporting classes) and proceed as usual. Because of Groovy's tight integration with Java, you'll be able to create objects, load them from the database, modify them, save them, validate them etc. exactly as you would in your Java project. You won't get all the conveniences of GORM this way, but you would have the advantage of working with your objects in a way that already makes sense to you (except maybe with a bit less code thanks to Groovy). You could always try this option first to get something working, then consider one of the other options later if it seems to make sense at that time.
One tip if you do try this option: abstract the actual persistence code into a Grails service (StorageService perhaps) and have your controllers call methods on it rather than handling persistence directly. This way you could replace that service with something else down the road if needed, and as long as you maintain the same interface your controllers won't be affected.
Create new Grails domain classes as subclasses of your existing Java classes.
This could be pretty straightforward if your classes are already written as proper beans, i.e. with getter/setter methods for all their properties. Grails will see these inherited properties as it would if they were written in the simpler Groovy style. You'll be able to specify how to validate each property, using either simple validation checks (not null, not blank, etc.) or with closures that do more complicated things, perhaps calling existing methods in their POJO superclasses.
You'll almost certainly need to tweak the mappings via the GORM mapping DSL to fit the realities of your existing database schema. Relationships are where it might get tricky: for example, your schema might model an association some other way where GORM expects a join table, though there may even be a way to work around differences such as these. I'd suggest learning as much as you can about GORM and its mapping DSL and then experimenting with a few of your classes to see if this is a viable option.
Have Grails use your existing POJOs and Hibernate mappings directly.
I haven't tried this myself, but according to Grails's Hibernate Integration page this is supposed to be possible: "Grails also allows you to write your domain model in Java or re-use an existing domain model that has been mapped using Hibernate. All you have to do is place the necessary 'hibernate.cfg.xml' file and corresponding mappings files in the '%PROJECT_HOME%/grails-app/conf/hibernate' directory. You will still be able to call all of the dynamic persistent and query methods allowed in GORM!"
Googling "gorm legacy" turns up a number of helpful discussions and examples, for example this blog post by Glen Smith (co-author of the soon-to-be-released Grails in Action) where he shows a Hibernate mapping file used to integrate with "the legacy DB from Hell". Grails in Action has a chapter titled "Advanced GORM Kungfu" which promises a detailed discussion of this topic. I have a pre-release PDF of the book, and while I haven't gotten to that chapter yet, what I've read so far is very good, and the book covers many topics that aren't adequately discussed in other Grails books.
Sorry I can't offer any personal experience on this last option, but it does sound doable (and quite promising). Whichever option you choose, let us know how it turns out!
Do you really want/need to use Grails rather than just Groovy?
Grails really isn't something you can use to add a part to an existing web app. The whole "convention over configuration" approach means that you pretty much have to play by Grails' rules, otherwise there is no point in using it. And one of those rules is that domain objects are Groovy classes that are heavily "enhanced" by the Grails runtime.
It might be possible to have them extend existing Java classes, but I wouldn't bet on it - and all the Spring and Hibernate parts of your existing app would have to be discarded, or at least you'd have to spend a lot of effort to make them work in Grails. You'll be fighting the framework rather than profiting from it.
IMO you have two options:
Rewrite your app from scratch in Grails while reusing as much of the existing code as possible.
Keep your app as it is and add new stuff in Groovy, without using Grails.
The latter is probably better in your situation. Grails is meant to create new web apps very quickly, that's where it shines. Adding stuff to an existing app just isn't what it was made for.
EDIT:
Concerning the clarification in the comments: if you're planning to write basically a data entry/maintenance frontend for data used by another app and have the DB as the only communication channel between them, that might actually work quite well with Grails; it can certainly be configured to use an existing DB schema rather than creating its own from the domain classes (though the latter is less work).
This post provides some suggestions for using grails for wrapping existing Java classes in a web framework.

How can I still use DDD, TDD in BizTalk?

I just started getting into BizTalk at work and would love to keep using everything I've learned about DDD, TDD, etc. Is this even possible, or am I always going to have to use the Visio-like editors when creating things like pipelines and orchestrations?
You can certainly apply a lot of the concepts of TDD and DDD to BizTalk development.
You can design and develop around the concept of domain objects (although in BizTalk and integration development I often find interface objects or contract-first design to be a more useful way of thinking: what messages get passed around at my interfaces). And you can also follow the 'build the simplest possible thing that will work' and 'only build things that make tests pass' philosophies of TDD.
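In BizTalk itself the contracts are XSD schemas, but as a rough C# illustration of that contract-first mindset (all names here are invented), you would define the messages that cross the boundary before implementing anything behind it:

using System.Runtime.Serialization;

// Define the messages that cross the interface first...
[DataContract]
public class OrderReceived
{
    [DataMember] public string OrderId { get; set; }
    [DataMember] public decimal Amount { get; set; }
}

[DataContract]
public class OrderAcknowledged
{
    [DataMember] public string OrderId { get; set; }
    [DataMember] public bool Accepted { get; set; }
}

// ...then express the service boundary purely in terms of those messages.
public interface IOrderIntake
{
    OrderAcknowledged Handle(OrderReceived message);
}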
However, your question sounds like you are asking more about the code-centric sides of these design and development approaches.
Am I right that you would like to follow the test-driven development approach of first writing a unit test that exercises a requirement and fails, then writing a method that fulfils the requirement and causes the test to pass, all within a traditional programming language like C#?
For that, unfortunately, the answer is no. The majority of BizTalk artifacts (pipelines, maps, orchestrations...) can only really be built using the Visual Studio BizTalk plugins. There are ways of viewing the underlying c# code, but one would never want to try and directly develop this code.
There are two tools, BizUnit and BizUnit Extensions, that give some ability to control the execution of BizTalk applications and test them, but this really only gets you to the point of performing more controlled and more test-driven integration tests.
The shapes that you drag onto the Orchestration design surface will largely just do their thing as one opaque unit of execution. And Orchestrations, pipelines, maps etc... all these things are largely intended to be executed (and tested) within an entire BizTalk solution.
Good design practices (taking pointers from approaches like TDD) will lead to breaking BizTalk solutions into smaller, more modular and testable chunks, and there are ways of testing things like pipelines in isolation.
But the detailed specifics of TDD and DDD in code sadly don't translate.
For some related discussion that may be useful see this question:
Mocking WebService consumed by a Biztalk Request-Response port
If you often make use of pipelines and custom pipeline components in BizTalk, you might find my own PipelineTesting library useful. It allows you to use NUnit (or whatever other testing framework you prefer) to create automated tests for complete pipelines, specific pipeline components or even schemas (such as flat file schemas).
It's pretty useful if you use this kind of functionality, if I may say so myself (I make heavy use of it on my own projects).
You can find an introduction to the library here, and the full code on github. There's also some more detailed documentation on its wiki.
I agree with the comments by CKarras. Many people have cited that as their reason for not liking the BizUnit framework. But take a look at BizUnit 3.0. It has an object model that allows you to write the entire test step in C#/VB instead of XML. BizUnitExtensions is being upgraded to the new object model as well.
The advantage of the XML-based system is that it is easier to generate test steps and there is no need to recompile when you update the steps. In my own Extensions library, I found the XmlPokeStep (inspired by NAnt) to be very useful: my team could update test step XML on the fly. For example, let's say we had to call a web service that created a customer record and then check a database for that same record. If the web service returned a dynamically generated ID, we could update the test step for the next step on the fly (not in the same XML file, of course) and then use that to check the database.
From a coding perspective, IntelliSense should be addressed now in BizUnit 3.0. The lack of an XSD did make things difficult in the past; I'm hoping to get an XSD out that will aid IntelliSense. There were also some snippets for an old version of BizUnit, but those haven't been updated; maybe if there's time I'll give that a go.
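To give a flavor of that object model, here is a hedged sketch of a BizUnit test written in C# with NUnit. The namespaces, step classes and property names follow the BizUnit 3.x/4.x object model as I remember it and may differ in your version; all file paths are invented:

using BizUnit;                    // core engine; exact namespaces vary by version
using BizUnit.TestSteps.Common;   // FileDataLoader (assumed location)
using BizUnit.TestSteps.File;     // file-based test steps (assumed location)
using BizUnit.Xaml;               // TestCase (assumed location)
using NUnit.Framework;

[TestFixture]
public class OrderProcessTests
{
    [Test]
    public void DroppedOrderFileIsProcessed()
    {
        var testCase = new TestCase { Name = "DroppedOrderFileIsProcessed" };

        // Drop a sample message into the receive location's pickup folder.
        testCase.ExecutionSteps.Add(new CreateStep
        {
            CreationPath = @"C:\BizTalk\In\Order.xml",          // invented path
            DataSource = new FileDataLoader { FilePath = @"TestData\SampleOrder.xml" }
        });

        // Wait for the expected output file to appear on the send side.
        testCase.ExecutionSteps.Add(new FileReadMultipleStep
        {
            DirectoryPath = @"C:\BizTalk\Out",                  // invented path
            SearchPattern = "*.xml",
            ExpectedNumberOfFiles = 1,
            Timeout = 30000
        });

        new BizUnit(testCase).RunTest();
    }
}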
But coming back to the TDD issue: if you take some of the intent behind TDD, the specification- or behavior-driven element, then you can apply it to some extent to BizTalk development as well, because BizTalk is based heavily on contract-driven development. So you can specify your interfaces first, create stub orchestrations etc. to handle them, and then build out the core. You could write the BizUnit tests at that time. I wish there were some tools that could automate this process, but right now there aren't.
Using frameworks such as the ESB guidance can also help give you a base platform to work off so you can implement the major use cases through your system iteratively.
Just a few thoughts. Hope this helps; I think it's worth blogging about more extensively.
This is a good topic to discuss. Do ping me if you have any questions, or we can always discuss more over here.
Rgds
Benjy
You could use BizUnit to create and reuse generic test cases, both in code and in Excel (for functional scenarios):
http://www.codeplex.com/bizunit
BizTalk Server 2009 is expected to have more IDE integrated testability.
Cheers
Hemil.
BizUnit is really a pain to use because all the tests are written in XML instead of a programming language.
In our projects, we have "ported" parts of BizUnit to a plain old C# test framework. This allows us to use BizUnit's library of steps directly in C# NUnit/MSTest code. This makes tests that are easier to write (using VS Intellisense), more flexible, and most important, easier to debug in case of a test failure. The main drawback of this approach is that we have forked from the main BizUnit source.
Another interesting option I would consider for future projects is BooUnit, which is a Boo wrapper on top of BizUnit. It has advantages similar to our BizUnit "port", but also has the advantage of still using BizUnit instead of forking from it.
