Service Builder doesn't delete table in DB - Liferay

Hi all,
I'm using LR 6.1.
I created a new entity called "Recommendation" in service.xml, then ran build-service with Ant to generate the Service Builder files.
Now I want to delete the entity. So I deleted it from service.xml, but nothing happened in the database or the other files: the database table is still there, along with the other generated files.
Any ideas?
Thanks a lot.

Liferay will never delete a table that was in your service.xml but has since been removed. It will also not delete any of the classes generated by a build-service command. For example, you might have refactored your portlet into two different projects; on a deploy, you don't want to lose all your data just because the table is now defined in a different service.xml. So don't see this as a bug, but rather as a safety feature. If you want to delete the table, you have to issue a drop command on your database and manually delete the service files generated by Liferay Service Builder.
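If it helps, the drop can be scripted. Here is a minimal Java/JDBC sketch, assuming the default Service Builder naming convention of namespace_EntityName, where the namespace is whatever your service.xml declares ("MyNamespace" and "Recommendation" below are placeholders):

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

public class DropEntityTable {

    // By default Service Builder names the table "<namespace>_<EntityName>",
    // where the namespace comes from the <namespace> element in service.xml.
    static String dropStatement(String namespace, String entityName) {
        return "DROP TABLE " + namespace + "_" + entityName;
    }

    // Executes the drop over an already-open JDBC connection to the portal DB.
    static void dropEntityTable(Connection con, String namespace, String entityName)
            throws SQLException {
        try (Statement st = con.createStatement()) {
            st.executeUpdate(dropStatement(namespace, entityName));
        }
    }
}
```

Point the connection at the same database your portal uses and call, for example, dropEntityTable(con, "MyNamespace", "Recommendation"). Double-check the actual table name in your database first, since the prefix depends on your service.xml.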

Related

Jhipster Elasticsearch org.elasticsearch.indices.IndexMissingException

I am trying out jhipster and learning the technology stack.
Environment:
Database:
Oracle (both prod and dev)
Elasticsearch
Windows
I created a new jhipster project and copied some externally generated entities into the domain folder.
Then wrote a parser that generates the [Entity].json file in the .jhipster folder.
I ran the entity subgenerator using this json file, which asked me whether to overwrite the existing entity file (the one I copied from the external project).
I select no and then the generator generates the CRUD html/js files.
When I run the application, it can save/edit data correctly.
But when I search, I get IndexMissingException.
I checked the target folder and found that target/elasticsearch/data does not contain any index for this entity.
I am not very familiar with Elasticsearch and would like to know if there is any workaround for this IndexMissingException.
There are a few ways to solve this.
You can simply delete your target folder while the application isn't running, then rerun it. This will regenerate the indexes for all of your entities, but because Elasticsearch is essentially a data store, you will lose all the data in it, so this approach is not appropriate for a production environment.
I have created a Yeoman generator that will generate a service to reinsert all data from your main datastore into your elasticsearch indexes. This can help resolve the data issue from the first solution. It will also programmatically delete and recreate your indexes, so it can be used to solve your problem directly.
You can use the Create Index API while the server is running. This is the right choice for a production environment, where you need to keep the data in your index.
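For the third option, the Create Index API boils down to an HTTP PUT against the index name. Here is a minimal sketch using the JDK's built-in HTTP client, assuming Elasticsearch is reachable on localhost:9200 and that the index is named after the entity in lowercase (both are assumptions; adjust for your setup):

```java
import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CreateIndex {

    // The Create Index API is a PUT to /<indexName>.
    // The host URL and index name used here are placeholders.
    static HttpRequest createIndexRequest(String host, String indexName) {
        return HttpRequest.newBuilder(URI.create(host + "/" + indexName))
                .PUT(HttpRequest.BodyPublishers.noBody())
                .build();
    }

    // Sends the request to a running cluster, e.g.
    // createIndex("http://localhost:9200", "recommendation").
    static String createIndex(String host, String indexName)
            throws IOException, InterruptedException {
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(createIndexRequest(host, indexName),
                      HttpResponse.BodyHandlers.ofString());
        return response.body();
    }
}
```

After the index exists, previously saved entities still have to be re-indexed (for example by re-saving them), since creating the index only fixes the missing-index error, not the missing documents.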

Recover Deleted Data Table Azure Mob Service

I accidentally deleted a table from the Data tab of my Azure Mobile Service. Is there any way I can recover it?
I used the default free database that comes with a mobile service. I really do not care about the data in the table; what I really want are the scripts that ran on it.
In order to retrieve the data I did the following:
Cloned the mobile service and reverted it to a previous commit
Copied the deleted table and its script files
Pulled again from the server
Added the table and the script files back where they should be
Added the files to the git tracking index
Pushed the commit to master
Now the files are there in the azure mobile service, but the table is not being displayed in the GUI.
I tried to restart the azure mobile service but still it is not there.
To confirm the table and its files were indeed there, I even cloned the mobile service again, and this time the table folder contained users.json and its script files, but sadly they are not visible in the Azure portal.
To get the table to show again in the UI, you need to use the portal's create table command. It will essentially be a no-op if it detects that the table already exists in SQL. I don't believe it will touch your table scripts, but it may overwrite the .json permissions file.
If it does overwrite those files, then after creating the table through the UI you can revert the commit that changed the table's json/js files as part of that process.
At that point you should be good again.

DACPAC not including composite objects

I have a VS2012 database project which includes tables, stored procs, views etc.
I then have a second database project which contains a database reference to the first project. I have ensured that the "Include composite objects" options is selected from the "Project Properties -> Debug -> Advanced" menu.
When I build the second project, take the resulting DACPAC file and deploy it through SSMS 2012, it doesn't create the first project's objects.
Am I missing something? Why don't the composite objects get included in the DACPAC?
As Peter Schott said in the comments above, I needed to deploy both outputted Dacpacs in order, with the referenced database first.
You cannot do this through SSMS; it won't allow you to deploy two different dacpacs to the same database. It is not a problem through SqlPackage, however.
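That two-step SqlPackage deployment can be scripted. A sketch of the idea in Java, where the dacpac file names, server, and database are placeholders and the flags are SqlPackage's standard publish options:

```java
import java.io.IOException;
import java.util.List;

public class DeployDacpacs {

    // Builds a SqlPackage publish command for one dacpac.
    // The server and database names are placeholders for your environment.
    static List<String> publishCommand(String dacpacPath, String server, String database) {
        return List.of(
                "SqlPackage.exe",
                "/Action:Publish",
                "/SourceFile:" + dacpacPath,
                "/TargetServerName:" + server,
                "/TargetDatabaseName:" + database);
    }

    // Publishes the referenced (first) project's dacpac before the second
    // project's, both into the same target database.
    static void deployInOrder(String server, String database)
            throws IOException, InterruptedException {
        for (String dacpac : List.of("FirstProject.dacpac", "SecondProject.dacpac")) {
            new ProcessBuilder(publishCommand(dacpac, server, database))
                    .inheritIO().start().waitFor();
        }
    }
}
```

The important part is the ordering: the referenced database's dacpac must be published first so its objects exist when the second dacpac is deployed.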
I had the same problem when publishing a dacpac with DacFx. As answered here, I had to activate the "Include composite objects" option. But that's not all: when you compile your project, Visual Studio generates two dacpac files for your second project. You must put both files in the same directory when you deploy the second project.
Hope that helps, for me it did ;-)

Did a successful Core Data migration but the existing store was deleted

I have a Core Data store managed with MagicalRecord. I did a successful migration, but lost the data in the newly created store. This is what I have:
salonbookV1.0 is the original xcdatamodel for the initial store. I added only new attributes to an existing entity, and the mapping model looks like this (partial image):
Let me elaborate on what I did...
created the xcdatamodeld folder with both xcdatamodel's in it
marked salonbookV1.0 as the current version and ran the app, creating some entries
stopped the app, marked salonbookV1.5 as the current version, and ran the app again
the data which was entered previously was gone! Apparently the migration did not occur?
The migration was accomplished; I know that because I can use the new attributes. However, the existing Core Data store was deleted. I have read all I can on MagicalRecord, and there is only one method that deals with migration; MagicalRecord does the rest without any coding from me.
So the question remains: why is the existing store being deleted?
I don't know about MagicalRecord, but in "normal" Core Data you have to pass the NSMigratePersistentStoresAutomaticallyOption option when adding the persistent store; otherwise it will not migrate your existing data to the new store version.

Some default objects of core data entities

I have some entities in Core Data, for example "Account",
and I wish that, when a user opens my app for the first time, there is already something in Account.
To make it clearer, maybe I should say I want to provide some default managed objects for an entity, so that they are there when the app is first installed.
How can I achieve this?
Thanks a lot!
I have a similar requirement for an app I'm working on. I am using a SQLite persistent store for my data, so I basically want to pre-populate that store with the data for my default entities.
To do this, I have created a separate project in my Xcode workspace that shares the same data model with my app. In this separate project I write the code I need to insert entities into the table, and to store the file in a well-known place on my Mac.
Then, I copy the sqlite file that my initializer app has created into the resources directory for my "real" app. As part of the startup for that app, I wrote a small bit of code that copies the default DB from the resources to my app's documents folder if a copy doesn't already exist there.
The end result is this: I can run my initializer app to populate the default data as I need to. I then copy the file into my real app, so when the app is first run there is a default copy of that DB ready to go.
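The "copy the default DB on first run, but only if it isn't already there" guard in that last step is not iOS-specific; the same pattern works on any platform. A small sketch of it in Java (the file names are illustrative, and on iOS the equivalent would use the app bundle and documents directory):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class SeedDatabase {

    // Copies a bundled default database into the app's writable data
    // directory, but only if no copy exists there yet, so user data is
    // never overwritten on later launches.
    static boolean copyDefaultIfMissing(InputStream bundledDb, Path target)
            throws IOException {
        if (Files.exists(target)) {
            return false; // already initialized on a previous run
        }
        Files.createDirectories(target.getParent());
        Files.copy(bundledDb, target);
        return true;
    }
}
```

The existence check is what makes the startup code safe to run on every launch: only the very first run pays the cost of the copy.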
