I have a stored procedure I've created for Postgres. The app I'm developing has a backend written in Node, and I need to upload the stored procedure to our dev server. By "upload" I just mean persisting the stored procedure on the dev server by executing its CREATE PROCEDURE ... statement.
How is this process usually done? I could just copy the code and paste it into pgAdmin on our dev server, and that would work. But creating a migration file for it would also work, as would executing a CREATE OR REPLACE PROCEDURE raw query from the application code itself.
How is uploading a stored procedure to a running Postgres server usually done?
There's no definitive consensus on this. I once wrote a console task in a PHP application to apply stored procedures, and in hindsight I think that was the wrong choice.
I quite often make other non-structural changes, such as updating or deleting existing records, in a migration, and that is the method I would be inclined to use now.
I'm trying to get my React/Node/Express app deployed to Azure, and everything is working so far except that I cannot write to the SQLite3 database. Whenever a function attempts to write, the logs show the following error:
SQLITE_BUSY: database is locked
I suspect that the .sqlite file is read-only, but I have no idea how to change this. Any help greatly appreciated!
It appears that the issue was that the default way Azure/VS Code deploys an app is into a read-only wwwroot folder. In the end I abandoned SQLite and converted everything to Azure SQL DB, which is working flawlessly, though the transition was a fair bit of work.
I have an ASP.NET MVC 5 project with a database and an external JSON file that is updated once a day by a third-party website.
What I'm trying to do is update my DB once a day in line with the JSON (precision is not the issue here).
Currently I'm using a button that calls an action that parses the JSON and updates the database, and I want to do this automatically.
As far as I understand, running the scheduled task from the MVC application itself is bad practice and risky, and running an external dedicated service is preferred.
If I understood correctly, I can build a console application that parses the JSON and updates the DB automatically, but I'm not sure whether such a console application can run on the Windows server, and if so, how to set it up (I'm also not sure this is really a good idea).
So, I would be very happy if you can advise me here.
Thanks.
In the end the solution was to build a console application that parses the JSON and updates the database.
Then I used the built-in task scheduler in my hosting control panel to run the application (in my case the control panel is Plesk).
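The core of that console task — parse the third-party JSON, then decide which records to insert and which to update — can be sketched as follows. This is an illustration in Node rather than the answerer's actual C# application, and every name in it (`diffRecords`, the `id`/`value` fields) is a made-up placeholder:

```javascript
// Given the rows already in the DB and the freshly parsed JSON records,
// compute which records are new and which have changed. A scheduler
// (Windows Task Scheduler, Plesk scheduled tasks, cron, ...) would run
// a script like this once a day and then apply the resulting plan to the DB.
function diffRecords(existing, incoming) {
  const byId = new Map(existing.map((r) => [r.id, r]));
  const toInsert = [];
  const toUpdate = [];
  for (const rec of incoming) {
    const current = byId.get(rec.id);
    if (current === undefined) {
      toInsert.push(rec); // not in the DB yet
    } else if (current.value !== rec.value) {
      toUpdate.push(rec); // present but stale
    }
  }
  return { toInsert, toUpdate };
}

// Example run with hypothetical data:
const existing = [{ id: 1, value: 'a' }, { id: 2, value: 'b' }];
const incoming = JSON.parse('[{"id":1,"value":"a"},{"id":2,"value":"c"},{"id":3,"value":"d"}]');
const plan = diffRecords(existing, incoming);
console.log(plan.toInsert.length, plan.toUpdate.length); // prints: 1 1
```

Keeping the diff logic pure like this makes the task easy to test independently of the database, which is one advantage of the dedicated-console-app approach over a button in the web app.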
I'm using Entity Framework with a code-first model; I've got my InitialCreate migration set up and working locally, I can run code against my database context, and everything works.
But when I deploy my project to Azure, I just get a connection string error ("Format of the initialization string does not conform to specification starting at index 0.").
I can't seem to find where in the Publish dialog the options to create the Azure database are. Do I have to create the database separately and hook it up manually? If so, what exact process should I follow? Does the database need to have contents?
I thought Microsoft was making a big deal that this could all be done in a single deploy step, but that doesn't seem to be the case from my current experience.
When you publish your project, there is an option for code-first migrations in the Settings tab of the Publish dialog. It will automatically show your data context and let you set the remote connection string, and it will add a section to web.config specifying the data context and the migrations Configuration class to run during deployment.
It also lets you choose whether or not to run the code-first migrations at all.
You can also take a backup of the dev database, clear the data, and upload it to Azure SQL DB; that way the code-first data context will check on first connection and find that the code and the database match.
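For reference, the web.config section that the publish wizard adds when "Execute Code First Migrations" is checked has roughly this shape — the context, configuration class, and assembly names below are placeholders, not from the question:

```xml
<appSettings>
  <!-- Added by the publish wizard; MyApp.MyContext and
       MyApp.Migrations.Configuration are placeholder names. -->
  <add key="DatabaseInitializerForType MyApp.MyContext, MyApp"
       value="System.Data.Entity.MigrateDatabaseToLatestVersion`2[[MyApp.MyContext, MyApp], [MyApp.Migrations.Configuration, MyApp]], EntityFramework" />
</appSettings>
```

This registers a MigrateDatabaseToLatestVersion initializer for the context, so pending migrations run against the remote database on first use after deployment.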
Liferay 6.1.1 CE GA2 works fine and dandy on my local machine, and on my coworkers' machines as well. When we deploy it to our main dev server, we find that we can't save any changes (e.g., create web content, change themes/schemes, etc.).
Any idea why this is happening?
Cheers!
Check the log output; this might give you hints about what's going wrong.
If you get UI errors, this might help as well.
Check which database you're connecting to, and make sure that the user account you use has write access to the database.
In case you use HSQL (don't do this except on an unimportant demo server), make sure that the user your app server runs as is able to write to the data directory and persist the database.
If you're running a cluster and the second machine doesn't pick up changes made through the first machine, you need to configure clustering properly. The User Guide at https://www.liferay.com/documentation/ has a chapter on this.
If none of this helps, please give more information.
I am experiencing weird behavior when trying to call a stored procedure using SubSonic from a website. I get a "timeout expired" error when I call the stored procedure through SubSonic, but if I execute the same stored procedure in SQL Server Management Studio, it runs instantly. I don't know what the problem is, and I cannot step into the code because it is referenced as a DLL. I am using version 2.1. Please let me know if you have any ideas.
Thanks,
sridhar.
It was my fault. I had a transaction open in SQL Server Management Studio and ran an update statement on a table, so all the records affected by that update were locked by the open transaction. It was working for some employees because their records were not locked. I ran sp_who2 to identify the blocked processes and figured out the problem.
Thanks,
sridhar.