Just wanted to know if LINQ to NHibernate is ready for production. I want to use it for a fairly large website and don't want it to blow up. Also, do I still have to set up XML or use Fluent for the mappings?
You can use Fluent NHibernate for the mappings; it is very reliable. You can also make use of the automatic mappings, but that tends to be a bit too much magic for a lot of people. The LINQ provider for NHibernate 2 is pretty sketchy, but the provider for NHibernate 3 is very good. I've been using it in production for some time and found only a couple of issues, which were soon corrected.
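For illustration, here is a minimal sketch of a Fluent NHibernate mapping plus an NHibernate 3 LINQ query; the Product entity and its properties are invented for the example:

    using System.Collections.Generic;
    using System.Linq;
    using FluentNHibernate.Mapping;
    using NHibernate;
    using NHibernate.Linq;

    // Hypothetical entity used only for this example.
    public class Product
    {
        public virtual int Id { get; set; }
        public virtual string Name { get; set; }
        public virtual decimal Price { get; set; }
    }

    // The fluent mapping replaces the equivalent .hbm.xml file.
    public class ProductMap : ClassMap<Product>
    {
        public ProductMap()
        {
            Table("Products");
            Id(x => x.Id);
            Map(x => x.Name);
            Map(x => x.Price);
        }
    }

    public static class ProductQueries
    {
        // Query through the NHibernate 3 LINQ provider.
        public static IList<Product> CheapProducts(ISession session)
        {
            return session.Query<Product>()
                          .Where(p => p.Price < 10m)
                          .ToList();
        }
    }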
We have an Oracle Forms application, and one of the many options we considered for moving to non-Oracle-Forms technology was JHeadStart (an Oracle product) that converts Oracle Forms to an ADF application. But we would rather not use ADF, so is there any way we can remove the dependency on ADF?
If anyone feels this is not the right question to ask, please leave a comment instead of downvoting and I will remove it.
Thanks.
As always, it depends on what you want to achieve. I don't know JHeadStart, but to me, it sounds like a tool converting a legacy application to a framework that might be considered legacy soon. There are a few supporters of ADF, so I believe it's a good thing if you're ready to live with the compromises a full-stack framework brings. But in general, ADF is not popular among JSF developers (mostly because of those compromises, which often turn out to be too restrictive). Even more generally speaking, JSF is not popular among UI developers. That, in turn, is a bit unfair, but I observe a huge movement to pure JavaScript UI frameworks.
This indicates that using a tool like JHeadStart isn't the most future-proof approach. It's (probably) good to survive the next month, but in the long run, it'll probably backfire.
Let's have a look at the question from another angle. Why do you want to get rid of Oracle forms? Most likely, it's because of recruiting problems, but it might also have something to do with architecture. Oracle Forms supports a programming style integrating the database layer tightly with the UI layer. That's a very efficient way to write small applications, but it scales badly if your application grows.
So I'd recommend spending some extra money and time to re-implement your application from scratch. Automated tools tend to generate code that's hard to maintain. Re-designing your application from scratch gives you the opportunity to build an application that lasts a decade.
Oh, and I don't think it's possible to use JHeadStart without introducing ADF, simply because JHeadStart has been designed with ADF in mind.
We have several legacy SQL Server databases that we occasionally make schema changes to. We currently have a utility written in C++ that allows users to update their DBs with these schema changes. The utility generates dynamic SQL to create all DB objects.
I am looking into redoing this and thought EF migrations might be a good way to go. I have read up a bit on the subject and have a general idea of how it works, but I'm having a hard time figuring out how I would set it up to replace our current procedure (or whether that is even possible). Currently, a client could be on any one of a number of previous versions. I'm assuming I would have to go back to the oldest possible version and create my model/initial migration from that, then generate incremental migrations for each version change in order to support updates from all versions. Is that a correct assumption?
Also, our clients could currently be using SQL Server 2000, 2005, or 2008. Would this have any effect on how I would set things up (or whether I even could)?
Further, the goal is to create a utility with a (C#, probably WPF) UI that the user can use to manipulate the migrations (up or down, preferably). I've seen a lot of examples of how to manipulate migrations from the command line within Package Manager, but not a lot on how to create a utility with a friendly UI for upgrading/downgrading DBs in production. Also, I have not seen anything that shows how to create stored procedures in a migration (our DBs rely on some stored procedures). I'm assuming that, if nothing else, I can use the Sql() method to generate a SQL query to create an SP. Is that correct? Is there a better way?
I know my questions are a bit non-specific and I apologize for that. But I'm still in the beginning processes of learning this and I'd like to get an idea of whether or not this is a good way to go. Any guidance would be greatly appreciated.
Thanks,
Dennis
Firstly, on SQL Server support, Entity Framework doesn't really support SQL Server 2000. See this question:
EntityFramework SQL Server 2000?
On the question of supporting multiple versions, you have the right idea: generate an initial migration for the oldest version first, then incrementally alter the model and generate migrations to support the later versions. This will be a pain, as migrations are opinionated about how they represent the model in the database, and you will do a lot of messing about to end up with a model and a set of migrations that fully represent it. Specific concerns are indexes, column lengths, data types, stored procedures, triggers, functions, and partitioning.
The Sql() function gets you around most issues; also helpful in the migrations are methods like CreateIndex and AlterColumn.
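For example, a migration that uses Sql() to create a stored procedure, alongside the built-in schema helpers, might look roughly like this (table, column and procedure names are invented for the sketch):

    using System.Data.Entity.Migrations;

    // Hypothetical migration shown only as a sketch.
    public partial class AddCustomerLookupSproc : DbMigration
    {
        public override void Up()
        {
            // Sql() runs arbitrary SQL, which covers stored procedures,
            // triggers and functions that the model cannot express.
            Sql(@"CREATE PROCEDURE dbo.GetCustomerById
                      @Id int
                  AS
                      SELECT * FROM dbo.Customers WHERE Id = @Id");

            // The built-in helpers cover common schema tweaks.
            AlterColumn("dbo.Customers", "Name", c => c.String(maxLength: 100));
            CreateIndex("dbo.Customers", "Name");
        }

        public override void Down()
        {
            DropIndex("dbo.Customers", new[] { "Name" });
            Sql("DROP PROCEDURE dbo.GetCustomerById");
        }
    }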
For automating this, the migrations commands are available as PowerShell cmdlets, which are backed by plain .NET objects, so they can be called programmatically.
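A minimal sketch of driving migrations from code (say, behind a WPF button) via the DbMigrator API follows; Configuration here is the migrations configuration class that Enable-Migrations generates for you:

    using System.Collections.Generic;
    using System.Data.Entity.Migrations;

    public class MigrationRunner
    {
        // Apply every pending migration, as Update-Database would.
        public void UpgradeToLatest()
        {
            new DbMigrator(new Configuration()).Update();
        }

        // Target a specific migration by name; an earlier name
        // downgrades the database, which suits an up/down UI.
        public void MoveTo(string targetMigration)
        {
            new DbMigrator(new Configuration()).Update(targetMigration);
        }

        // Migration names the UI could show in a list.
        public IEnumerable<string> PendingMigrations()
        {
            return new DbMigrator(new Configuration()).GetPendingMigrations();
        }
    }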
As this question is a year old, I assume you will have made a decision on whether to do this. My opinion is that it is hard to see that it's worth the effort. If you were re-platforming the code base that uses this database to Entity Framework then it would make sense. Otherwise there are bound to be better tools out there for database version management. My first port of call would be Redgate.
I'm afraid I'm overdoing things here.
We recently started a .NET project containing different class libraries for DAL, Services and DTOs.
The question is about our DAL layer: we want a clean and easily maintained data access layer, and we were planning to go with Entity Framework 4.1.
So I'm still not clear whether to opt for plain ADO.NET using the DAO and DAOImpl methodology, or for Entity Framework.
Could anyone please suggest the best approach?
It depends on how much work you want to put into creating your own customized DAL. Using ADO.NET with your own implementation gives you the most control, but it also means maintaining and optimizing it yourself and handling complex cases such as concurrency, caching and the mapping between your BOs, the DAL and the database.
If you want to concentrate more on business value and functionality you might decide to go with Entity Framework (now 4.3 released and 5.0 to come). The advantage would be that you use a DAL that was carefully tested and that already contains solutions for concurrency, caching and mapping.
But I would strongly suggest using the Repository and Unit of Work patterns on top of it to abstract the usage of Entity Framework away from your other layers. That gives you the option to completely change the underlying technology later without any impact on the other layers (for example, you could replace EF with your own ADO.NET implementation if you find the performance is not as good as it should be).
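A minimal sketch of that abstraction, with an invented Customer entity and repository names, might look like this:

    using System;
    using System.Data.Entity;

    // Hypothetical entity used only for this example.
    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public interface ICustomerRepository
    {
        Customer GetById(int id);
        void Add(Customer customer);
    }

    public interface IUnitOfWork : IDisposable
    {
        ICustomerRepository Customers { get; }
        void Commit();
    }

    // EF-backed implementation; a hand-rolled ADO.NET version could
    // be swapped in later without touching the layers above.
    public class EfCustomerRepository : ICustomerRepository
    {
        private readonly DbContext _context;
        public EfCustomerRepository(DbContext context) { _context = context; }

        public Customer GetById(int id) { return _context.Set<Customer>().Find(id); }
        public void Add(Customer customer) { _context.Set<Customer>().Add(customer); }
    }

    public class EfUnitOfWork : IUnitOfWork
    {
        private readonly DbContext _context;

        public EfUnitOfWork(DbContext context)
        {
            _context = context;
            Customers = new EfCustomerRepository(context);
        }

        public ICustomerRepository Customers { get; private set; }
        public void Commit() { _context.SaveChanges(); }
        public void Dispose() { _context.Dispose(); }
    }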
It depends on the type of application you need to build and on its performance requirements. Using EF can really reduce your work and give you much quicker results. It also depends on the development team's capabilities. If you only have senior developers and architects working on the project, then you will create your own DAL easily. But for beginners it is really hard to implement a good, optimized and robust DAL.
I hope that helps!
I've been using the ADO.NET and DTO combination in my DAL for as long as I can remember, and I love the fact that I control the entire process of creating entities and methods. However, that comes at the price of having to write classes for every entity and methods for every stored procedure, which I don't mind. But recently I discovered PLINQO for LINQ to SQL and I'm loving it. It makes creating and updating classes based on your database schema easy while allowing for high levels of customization. It's basically LINQ to SQL on steroids.
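To illustrate the kind of boilerplate being traded away, a typical hand-written ADO.NET method for a DTO looks roughly like this (the table, procedure and class names are invented):

    using System.Data;
    using System.Data.SqlClient;

    // Hand-rolled DTO; one of these per entity.
    public class CustomerDto
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public class CustomerDal
    {
        private readonly string _connectionString;
        public CustomerDal(string connectionString) { _connectionString = connectionString; }

        // One of these per stored procedure.
        public CustomerDto GetById(int id)
        {
            using (var connection = new SqlConnection(_connectionString))
            using (var command = new SqlCommand("dbo.GetCustomerById", connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                command.Parameters.AddWithValue("@Id", id);
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    if (!reader.Read()) return null;
                    return new CustomerDto
                    {
                        Id = reader.GetInt32(reader.GetOrdinal("Id")),
                        Name = reader.GetString(reader.GetOrdinal("Name"))
                    };
                }
            }
        }
    }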
I also liked NHibernate, but I think it has a steeper learning curve than PLINQO.
I'd give PLINQO a try if I were you.
Azure, probably ASP.NET WebForms. We're building a management interface for about 8 tables. The usual CRUD :-)
2 backend users, and it doesn't need to look flash.
We'd like to use Mindscape's LightSpeed as the ORM tool.
Question: Is Dynamic Data worth pursuing? (Mindscape have a connector to DD).
We've also got the Telerik Suite which is next on the list to check out.
Rapid development here is key.
For rapid development, Dynamic Data is great. In its current form it has some good extension points. However, I think you are stuck with either Entity Framework or Linq to SQL for O/RM solutions. Not necessarily a bad thing for an 8 table app, but something to consider.
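For context, scaffolding a whole model in Dynamic Data takes only a few lines in Global.asax. This sketch uses LINQ to SQL with an invented NorthwindDataContext; the LightSpeed connector would register its own model type instead:

    using System.Web.DynamicData;
    using System.Web.Routing;

    public class Global : System.Web.HttpApplication
    {
        private static readonly MetaModel DefaultModel = new MetaModel();

        public static void RegisterRoutes(RouteCollection routes)
        {
            // ScaffoldAllTables = true generates List/Edit/Insert pages
            // for every table in the model.
            DefaultModel.RegisterContext(typeof(NorthwindDataContext),
                new ContextConfiguration { ScaffoldAllTables = true });

            routes.Add(new DynamicDataRoute("{table}/{action}.aspx")
            {
                Constraints = new RouteValueDictionary(
                    new { action = "List|Details|Edit|Insert" }),
                Model = DefaultModel
            });
        }

        protected void Application_Start(object sender, System.EventArgs e)
        {
            RegisterRoutes(RouteTable.Routes);
        }
    }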
I just started getting into BizTalk at work and would love to keep using everything I've learned about DDD, TDD, etc. Is this even possible, or am I always going to have to use the Visio-like editors when creating things like pipelines and orchestrations?
You can certainly apply a lot of the concepts of TDD and DDD to BizTalk development.
You can design and develop around the concept of domain objects (although in BizTalk and integration development I often find interface objects, or contract-first design, to be a more useful way of thinking: what messages get passed around at my interfaces). And you can also follow the 'build the simplest possible thing that will work' and 'only build things that make tests pass' philosophies of TDD.
However, your question sounds like you are asking more about the code-centric sides of these design and development approaches.
Am I right that you would like to follow the test-driven development approach of first writing a unit test that exercises a requirement and fails, then writing a method that fulfils the requirement and causes the test to pass, all within a traditional programming language like C#?
For that, unfortunately, the answer is no. The majority of BizTalk artifacts (pipelines, maps, orchestrations...) can only really be built using the Visual Studio BizTalk plugins. There are ways of viewing the underlying C# code, but one would never want to try to develop that code directly.
There are two tools, BizUnit and BizUnit Extensions, that give some ability to control the execution of BizTalk applications and test them, but this really only gets you to the point of performing more controlled and more test-driven integration tests.
The shapes that you drag onto the Orchestration design surface will largely just do their thing as one opaque unit of execution. And Orchestrations, pipelines, maps etc... all these things are largely intended to be executed (and tested) within an entire BizTalk solution.
Good design practices (taking pointers from approaches like TDD) will lead to breaking BizTalk solutions into smaller, more modular and testable chunks, and there are ways of testing things like pipelines in isolation.
But the detailed specifics of TDD and DDD in code sadly don't translate.
For some related discussion that may be useful see this question:
Mocking WebService consumed by a Biztalk Request-Response port
If you often make use of pipelines and custom pipeline components in BizTalk, you might find my own PipelineTesting library useful. It allows you to use NUnit (or whatever other testing framework you prefer) to create automated tests for complete pipelines, specific pipeline components or even schemas (such as flat file schemas).
It's pretty useful if you use this kind of functionality, if I may say so myself (I make heavy use of it on my own projects).
You can find an introduction to the library here, and the full code on github. There's also some more detailed documentation on its wiki.
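To give a feel for it, a pipeline test looks roughly like the sketch below. I'm recalling the API shape from the library's examples, so treat the exact names (PipelineFactory, MessageHelper and so on) as assumptions to verify against the wiki:

    using NUnit.Framework;
    using Microsoft.BizTalk.Message.Interop;
    using Winterdom.BizTalk.PipelineTesting;

    [TestFixture]
    public class XmlReceivePipelineTests
    {
        [Test]
        public void Disassembles_A_Valid_Document()
        {
            // API names recalled from the library's docs; check the wiki.
            ReceivePipelineWrapper pipeline = PipelineFactory.CreateReceivePipeline(
                typeof(Microsoft.BizTalk.DefaultPipelines.XMLReceive));
            pipeline.AddDocSpec(typeof(MyOrderSchema)); // hypothetical schema type

            IBaseMessage input = MessageHelper.CreateFromString("<Order>...</Order>");
            MessageCollection output = pipeline.Execute(input);

            Assert.AreEqual(1, output.Count);
        }
    }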
I agree with the comments by CKarras. Many people have cited that as their reason for not liking the BizUnit framework. But take a look at BizUnit 3.0. It has an object model that allows you to write the entire test step in C#/VB instead of XML. BizUnitExtensions is being upgraded to the new object model as well.
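As a rough sketch of that object model (the step and property names are from memory of the BizUnit 3.0 samples, so treat them as assumptions to check against the current source):

    using BizUnit;

    public class OrderFlowTests
    {
        public void Drops_A_File_Into_BizTalk()
        {
            // Build the test case in C# instead of XML.
            var testCase = new TestCase { Name = "Copy file through BizTalk" };

            // A file-drop step; exact step/property names should be
            // verified against the BizUnit 3.0 samples.
            testCase.ExecutionSteps.Add(new BizUnit.TestSteps.File.CreateStep
            {
                SourcePath = @"C:\TestData\order.xml",
                CreationPath = @"C:\BizTalk\In\order.xml"
            });

            // Execute the test case.
            new BizUnit.BizUnit(testCase).RunTest();
        }
    }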
The advantage of the XML-based system is that it is easier to generate test steps and there is no need to recompile when you update the steps. In my own Extensions library, I found the XmlPokeStep (inspired by NAnt) to be very useful. My team could update test step XML on the fly. For example, let's say we had to call a web service that created a customer record and then check a database for that same record. If the web service returned the (dynamically generated) ID, we could update the test step for the next step on the fly (not in the same XML file, of course) and then use that to check the database.
From a coding perspective, IntelliSense should be addressed now in BizUnit 3.0. The lack of an XSD did make things difficult in the past. I'm hoping to get an XSD out that will help with IntelliSense. There were also some snippets for an old version of BizUnit, but those haven't been updated; maybe if there's time I'll give that a go.
But coming back to the TDD issue: if you take some of the intent behind TDD, the specification- or behavior-driven element, then you can apply it to some extent to BizTalk development as well, because BizTalk is based heavily on contract-driven development. So you can specify your interfaces first, create stub orchestrations etc. to handle them, and then build out the core. You could write the BizUnit tests at that time. I wish there were tools that could automate this process, but right now there aren't.
Using frameworks such as the ESB guidance can also help give you a base platform to work off so you can implement the major use cases through your system iteratively.
Just a few thoughts. Hope this helps. I think it's worth blogging about more extensively.
This is a good topic to discuss. Do ping me if you have any questions, or we can always discuss more over here.
Rgds
Benjy
You could use BizUnit to create and reuse generic test cases, both in code and in Excel (for functional scenarios):
http://www.codeplex.com/bizunit
BizTalk Server 2009 is expected to have more IDE integrated testability.
Cheers
Hemil.
BizUnit is really a pain to use because all the tests are written in XML instead of a programming language.
In our projects, we have "ported" parts of BizUnit to a plain old C# test framework. This allows us to use BizUnit's library of steps directly in C# NUnit/MSTest code. This makes tests easier to write (using VS IntelliSense), more flexible and, most importantly, easier to debug in case of a test failure. The main drawback of this approach is that we have forked from the main BizUnit source.
Another interesting option I would consider for future projects is BooUnit, which is a Boo wrapper on top of BizUnit. It has advantages similar to our BizUnit "port", but also has the advantage of still using BizUnit instead of forking from it.