I have a plan to create a desktop app (language not chosen yet) that will serve as an admin tool for manipulating data. At the same time, the database will be used by a website.
My only concern is that I may mix technologies that aren't compatible, even though the only thing tying them together is the database.
Say I use Delphi to create the desktop app that manages an Access or MSSQL/MySQL database (if possible), and then use PHP to build the website.
Can there be obvious problems with this idea that I am blind to right now?
Any other ideas or suggestions are greatly appreciated.
Databases have to be one of the most common ways I see two languages communicating/cooperating. I've seen databases as a conduit between C/C++, Java, Perl, Python, C#, etc... Databases have the benefit of storing data in a pretty language agnostic way. Almost all languages have a way to talk to a database.
The main downside of using two different languages is that you won't be able to reuse code between your web project and your desktop project. That may sound fine, but every time you update your DB schema, you have to update the two code bases. Not a deal-breaker, but annoying nonetheless.
I would recommend avoiding Access if you can help it. Access works for a simple desktop application, but once you start introducing multiple users, you should go with something a little more robust (and secure). Go with something like SQL Server Compact or SQLite if you need a file-based database. I personally would bite the bullet and go straight for MySQL.
I have two separate cloud-based APIs that I am working on integrating. Neither piece of software talks directly to the other, so I am creating something in the middle to get them to communicate. I have had trouble finding examples or documentation on how exactly to do this; does anyone know of any resources that could help me out?
My plan going in was to use a MERN stack running on a local server to make GET and POST requests to both APIs, use some mapping and logic to transpose the data into the correct format, and send it to the other piece of software. I do not have a client per se (other than myself) on my end, so I will really be skipping the React part of MERN, at least that is what I'm thinking. I'll be using Mongo to keep track of both sets of data for redundancy. I also considered using a LAMP stack, but felt that MERN would be faster at handling the data and that Mongo is more flexible in handling different data formats. If there is another process or technology that could help me that I'm not thinking of, I would be grateful to hear about it.
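To make that concrete, here is a rough sketch of the kind of middle layer I have in mind. The endpoints, field names, and mapping below are placeholders I made up for illustration, not the real APIs:

    // Rough sketch only: pull records from one API, reshape them, push them to
    // the other API, and keep a copy in Mongo. All URLs, field names, and
    // collection names here are invented placeholders.
    const express = require('express');
    const axios = require('axios');
    const { MongoClient } = require('mongodb');

    const app = express();

    app.post('/sync', async (req, res) => {
      try {
        // 1. GET the source records from the first cloud API (placeholder URL).
        const { data: records } = await axios.get('https://api-one.example.com/v1/records');

        // 2. Transpose them into the format the second API expects.
        const mapped = records.map(r => ({
          externalId: r.id,
          title: r.name,
          amount: r.total_cents / 100
        }));

        // 3. POST the reshaped data to the second cloud API (placeholder URL).
        await axios.post('https://api-two.example.com/v1/items/bulk', mapped);

        // 4. Keep both the raw and mapped copies in Mongo for redundancy.
        const client = await MongoClient.connect('mongodb://localhost:27017');
        await client.db('integration').collection('syncRuns')
          .insertOne({ ranAt: new Date(), raw: records, mapped });
        await client.close();

        res.json({ synced: mapped.length });
      } catch (err) {
        res.status(500).json({ error: err.message });
      }
    });

    app.listen(3000);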
Has anyone encountered something like this before? Thank you.
As with most architecture questions, there's no completely right or wrong answer here. You could certainly design a well-built system for this purpose with either stack, even more so given that you say the front-end framework is not an important consideration. Instead, ask yourself questions like these:
Which stack do you have more experience with, and is this an appropriate time to learn a new set of technologies, or is it important to do the best work you're capable of right now (how important is time, cost, or quality in this case)?
Another generalization I'll stick my neck out for is a data-first approach: what sort of data are you dealing with from each cloud integration, and what kind of data do you need to support and/or create in order to make your system work? Mongo, being a NoSQL persistence layer, will let you change your data model and handle more varied data more quickly and easily than a SQL solution will. This is a double-edged sword, however, as the lack of validation and of a strongly constrained (typed) data model will make your application harder to work with and debug as it grows. In short: how big might this application grow?
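To illustrate that double-edged sword, here is a tiny sketch (the collection and field names are invented, not from your APIs):

    // Mongo happily accepts documents of different shapes in one collection,
    // which is handy while the two cloud formats are still in flux.
    const { MongoClient } = require('mongodb');

    async function demo() {
      const client = await MongoClient.connect('mongodb://localhost:27017');
      const payloads = client.db('integration').collection('payloads');

      // Two records from the two services, with different fields; no schema
      // migration is needed to store both.
      await payloads.insertOne({ source: 'serviceA', invoiceId: 42, totalCents: 1999 });
      await payloads.insertOne({ source: 'serviceB', orderRef: 'B-77', total: '19.99', currency: 'USD' });

      // The flip side: this typo is accepted silently and only surfaces later,
      // when a query on "totalCents" quietly misses the record.
      await payloads.insertOne({ source: 'serviceA', invoiceId: 43, totalCent: 2599 });

      await client.close();
    }

    demo();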
If you have a handy and familiar way to manage the three different data models you're dealing with (cloud service 1, cloud service 2, and your app) via MySQL, then that's a compelling reason to use it. However, if your style is to start dumping data into your database and you're comfortable with a more iterative approach (which may require more, albeit shorter, rounds of refactoring), then Mongo with MERN may be the preferable choice.
Finally, will others ever be working on this application? If so, which language would you prefer to be working with them in: PHP or JavaScript?
We have several legacy SQL Server databases that we occasionally make schema changes to. We currently have a utility written in C++ that allows users to update their DBs with these schema changes; it generates dynamic SQL to create all of the DB objects. I am looking into redoing this and thought EF migrations might be a good way to go. I have read up a bit on the subject and have a general idea of how it works, but I'm having a hard time figuring out how I would set it up to replace our current procedure (or whether that is even possible).
Currently, a client could be on any one of a number of previous versions. I'm assuming I would have to go back to the oldest possible version and create my model/initial migration from that, then generate incremental migrations for each version change in order to support updates from all versions. Is that a correct assumption? Also, our clients could be using SQL Server 2000, 2005, or 2008. Would this have any effect on how I would set things up (or on whether I even could)?
Further, the goal is to create a utility with a (C#, probably WPF) UI that the user can use to apply the migrations (up or down, preferably). I've seen a lot of examples of how to run migrations from the Package Manager Console, but not much on how to build a utility with a friendly UI for upgrading/downgrading databases in production. I also have not seen anything that shows how to create stored procedures in a migration (our DBs rely on some stored procedures). I'm assuming that, if nothing else, I can use the Sql() method to run the SQL that creates a stored procedure. Is that correct? Is there a better way?
I know my questions are a bit non-specific and I apologize for that. But I'm still in the early stages of learning this, and I'd like to get an idea of whether or not this is a good way to go. Any guidance would be greatly appreciated.
Thanks,
Dennis
Firstly, on SQL Server support, Entity Framework doesn't really support SQL Server 2000. See this question:
EntityFramework SQL Server 2000?
On the question of supporting all the multiple versions, you have the right idea: generate an initial migration for the oldest version first, then incrementally alter the model and generate migrations to support the later versions. This will be a pain, as migrations are opinionated about how they represent the model in the database, and you will do a lot of messing about to end up with a model and a set of migrations that fully represent your databases. Specific concerns are indexes, column lengths, data types, stored procedures, triggers, functions, and partitioning.
The Sql() method gets you around most issues; methods like CreateIndex and AlterColumn are also helpful in the migrations.
As for automating this, the migrations commands are available as PowerShell cmdlets, which are themselves just .NET objects and so can be called programmatically.
As this question is a year old, I assume you will have made a decision on whether to do this. My opinion is that it is hard to see that it's worth the effort. If you were re-platforming the code base that uses this database to Entity Framework then it would make sense. Otherwise there are bound to be better tools out there for database version management. My first port of call would be Redgate.
I have an app that I would like to create, but I am not sure how to go about it. I am using Node.js and would like to use CouchDB, but if something like MongoDB or Riak would be a better choice, then I'm willing to hear ideas. I have a site, say
cool.com
and on there is a CouchDB instance, as well as a site to manage a store, say a shopping cart. The DB houses all the store's items and data. The app itself has an admin backend to manage that data and can change items. What I would like is the ability to have the user be disconnected from the internet and still have the admin backend work. I realize that for this to work I need to use a client-side framework with my models/routes/controllers/whatever. But what I am not sure of is how to let the site function while offline. CouchDB, if installed locally, can sync the data from local to remote when back online, and if the admin user is on a computer, I could have them install Couch, but that could be messy.
Also, what if the admin user is on a tablet or a phone? Would I need an actual mobile app and a desktop app to do this? Is there some way I can set this up so it is seamless to the end user? I would also like this to be offline for end users too, but the bigger audience is the admin.
Another use case: an in-store POS system where the power goes out. The POS system can be loaded from the web onto a tablet, and they can still make card-based sales if the Wi-Fi is out, because the app is available offline.
I'm just not sure how to do this. Let's assume I need a client framework that can handle the data as well as the backend, something like Ember or Angular. There are also all-in-one stacks like Meteor and Derby.js, but those aren't fully offline; they are built for the appearance of real time. Though Meteor does have minimongo, so it might be worth looking into.
I was hoping someone could help me figure out how to get this setup to work, preferably with Couch, but other NoSQL databases would work too if there is a way to sync the data.
I'm not sure if it would work for you, but I have been thinking about such an application for quite a long time now and have been doing some research on what's possible. The best solution I could come up with is using a server with a CouchDB instance and writing the application to be client-side based. Then, for data storage, use PouchDB and synchronize it regularly with your server-side CouchDB when the app is online. I know Pouch is at an early stage and not production-ready, but if you are willing to put some work into it, I'd say it's doable.
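A minimal sketch of that setup, assuming the admin backend runs in the browser and your server-side CouchDB is reachable under your site (the database names and URL are placeholders):

    // Writes go to the local PouchDB, so the admin backend keeps working with
    // no connection at all; continuous replication pushes local changes up and
    // pulls remote changes down whenever the browser is online.
    var local  = new PouchDB('store_admin');
    var remote = new PouchDB('https://cool.com/couch/store_admin');

    local.put({ _id: 'item:1234', name: 'Blue T-Shirt', price: 1999 })
      .catch(function (err) { console.error('local write failed', err); });

    PouchDB.sync(local, remote, { live: true, retry: true })
      .on('paused', function ()    { /* caught up, or currently offline */ })
      .on('error',  function (err) { console.error('replication error', err); });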
If you want clients that work seamlessly as they go offline and come online (like a POS with the power out), then I would recommend making the app primarily work off local storage, with background publishing or synchronization to the cloud.
Local storage options range from something light like SQLite, SQL Server Express, or Firebird to NoSQL options like Mongo, CouchDB, etc.
But for the client or device, consider the ease of configuration and the weight of the option. You also need to consider the types of clients: do you have many platforms, varying from devices to PCs? You don't want something with a heavy configuration and runtime footprint on the client; that's fine on the service side.
On the service side, consider the nature of your data and whether it is better suited to transactional/relational systems (banking, etc.) or eventually consistent, non-transactional (NoSQL) documents. Don't forget hybrid as an option. Also consider the service platform; for example, Node goes well with MongoDB (JSON objects front to back).
The device and service storage options can be (and likely should be) different, separated by service interfaces (SOAP, REST/HTTP, sockets, etc.).
It's hard to have a one-size-fits-all solution, but often something lightweight like SQLite on the device or client makes for easy installation/configuration, while something like SQL Server/MySQL or CouchDB/MongoDB on the service side makes sense for scalability.
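To make the local-first idea concrete, here is a rough browser-side sketch; the endpoint and storage key are invented, and a real POS would obviously need conflict handling and proper retries:

    // Writes always land in local storage first, so the app keeps working
    // offline; a background timer publishes anything unsynced to the service.
    var QUEUE_KEY = 'pendingSales';

    function recordSale(sale) {
      var queue = JSON.parse(localStorage.getItem(QUEUE_KEY) || '[]');
      queue.push(sale);
      localStorage.setItem(QUEUE_KEY, JSON.stringify(queue));
    }

    function flushQueue() {
      if (!navigator.onLine) return;                  // still offline, try again later
      var queue = JSON.parse(localStorage.getItem(QUEUE_KEY) || '[]');
      if (queue.length === 0) return;
      fetch('/api/sales/batch', {                     // hypothetical service endpoint
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(queue)
      }).then(function (res) {
        if (res.ok) localStorage.setItem(QUEUE_KEY, '[]');  // clear only once the server accepted the batch
      });
    }

    setInterval(flushQueue, 30 * 1000);               // background publish every 30 seconds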
Some links to read:
http://www.mongodb.org/display/DOCS/Comparing+Mongo+DB+and+Couch+DB
http://www.sqlite.org/
http://blogs.msdn.com/b/sqlexpress/archive/2011/07/12/introducing-localdb-a-better-sql-express.aspx
Your question is pretty wide open and there's no one-size-fits-all solution. Hopefully I provided some options to think about.
There's an interesting project out there called AppJs (http://appjs.com/), which packages Node.js and Chromium as a desktop environment. It's currently very fresh (very little documentation), but it appears to be straightforward enough (you'll be using the same tools as you would for your online application).
As for synchronising the offline and online environments. I doubt you can rely on CouchDB in the way that you envisage. CouchDB mobile support is not as comprehensive as some of the documentation suggests. So in this sense, it would be no different to using SQL/Mongo/Punchcards.
You might have more luck with designing a suitable serialisation scheme based on XML or JSON (or just plain text), and passing files between the online and offline installations.
Edit: Since writing this, Node Webkit (http://nwjs.io/) has clearly become the most obvious replacement for AppJs. It has a very simple API and some great features.
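As a starting point, a node-webkit app is essentially your existing web app plus a package.json telling the runtime what to load. The values below are just an illustrative minimum, not a full configuration:

    {
      "name": "store-admin",
      "version": "0.1.0",
      "main": "index.html",
      "window": {
        "title": "Store Admin",
        "width": 1024,
        "height": 768
      }
    }

Drop that next to your index.html, run the folder with the nw executable, and the same HTML/JS/CSS you serve online runs as a desktop app with Node APIs available.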
Ok, here's the thing.
I have a good JS background, had my share of JS in the past, and have lots of cool bare-bones tools I take with me from project to project that act like a library.
I'm trying to figure out a good way of working with CouchDB.
Now, after getting used to the luxury of cool tools that you wrote yourself and that simplify the language for you, I find it a little frustrating to write many things in a bare-bones manner.
I'm looking for a way to load into the database context a limited, highly efficient, and generic set of tools that focus on the pure language and make working with it much more groovy (and gosh, no, I'm not talking about jQuery or any of the even bulkier libraries out there).
If, on top of that, there were a way to add some of my own logic tools (BL model functions) to the execution context of the CouchDB JS engine, it would be a great and admirable power and would make CouchDB the new home for a JavaScripter like me.
Maybe I'm aiming too low.
I'd be satisfied with a way to register a set of extensions even for a specific database, and I don't mind doing it for every database separately. Or, worse, adding it to every design document, so I can teach, for example, several views in the same design doc what a Person is and what a Worker is, and use their methods to retrieve data from them according to logic coded in a reusable manner.
Can anybody point me the way?
Whatever way you can point me, I'll be very, very grateful.
If there are ways for all of these - then great.
Trust me to know which logic belongs to which layer...
You open my possibilities - I promise to use them :D
CouchDB now supports code sharing as CommonJS modules.
http://docs.couchbase.org/couchdb-release-1.1/index.html#couchdb-release-1.1-commonjs
http://caolanmcmahon.com/posts/commonjs_modules_in_couchdb
In this way, you can share your javascript modules between views, lists, and shows in the same design doc. (Server-side)
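For example, a design doc might look roughly like this (the names are invented; the important parts are the module under views.lib and the require() calls):

    {
      "_id": "_design/store",
      "views": {
        "lib": {
          "person": "exports.fullName = function (doc) { return doc.first + ' ' + doc.last; };"
        },
        "workers_by_name": {
          "map": "function (doc) { if (doc.type === 'worker') { var person = require('views/lib/person'); emit(person.fullName(doc), null); } }"
        }
      },
      "shows": {
        "card": "function (doc, req) { var person = require('views/lib/person'); return '<p>' + person.fullName(doc) + '</p>'; }"
      }
    }

As far as I know, map functions can only require() modules stored under the views object (hence the lib convention above), while shows, lists, and other design-doc functions can require() modules from anywhere in the design document.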
Also, you can load these modules on the browser side with this library:
https://github.com/couchapp/couchapp/blob/master/couchapp/templates/vendor/couchapp/_attachments/jquery.couch.app.js
You also might want to look at Kanso:
http://kansojs.org/
It does a really good job of making your JavaScript work seamlessly between the server and the client.
You can find some helpful tools here: https://github.com/vivekpathak/casters
The running examples and test cases may particularly help you.
I currently have an application written in PHP using the Symfony framework. Rather than have separate installs for each customer on a hosted server, I would like to move to a SaaS model with one install for all customers, possibly running on Google Code or another cloud-based service. I am not tied to PHP, though I would like to have the benefits of a good framework.
So the challenge: if all customers are using the same application, we have to find a way of isolating each customer's data. Customers do, for example, have admin access and can manage their own users and privileges. At a simplistic level you could just have an organisation identifier in each table and add that to all database operations. However, most application frameworks use an ORM of some kind, and I have not been able to find one that will easily/seamlessly facilitate this at a level that has minimal impact on the application code.
Has anyone looked at this? Are there any good approaches to this problem?
As Itay says, a multi-tenant system is a common requirement. A while back I was doing some research on this problem and came across a pretty good presentation on the different ways to handle this issue, and the pros and cons of each: http://aac2009.confreaks.com/06-feb-2009-14-30-writing-multi-tenant-applications-in-rails-guy-naor.html
This particular presentation is targeted to a Rails audience, but the principles are the same as with any language.
The approach you described is common, and PHP (one of its strengths) will allow you to go into the ORM code comparatively easily and modify it to your needs.
A second approach is to create a separate DB for each organization and a joint DB for shared resources.
A bit of a design challenge (but just a bit).
If you are really big, then you will even need to consider a separate DB server for each organization (I would say this is serious overkill in 99.99999% of cases).
This MSDN article gives you a very good overview of Data Architecture in Multi-tenancy: http://msdn.microsoft.com/en-us/library/aa479086.aspx