How to access data in an in-memory database? - groovy

I have created an in-memory database using H2 in Groovy, and I have successfully added data to it. Now I want to access that data somewhere else in my program, such as in a service, but I was not able to. I've tried using the findAll() and getAll() methods, but nothing is returned, even though the database has content.
How could I fix this?
Please help. Thanks.

If you are using an H2 database in Groovy, you'll probably want to access it via JDBC through the groovy.sql.Sql class. For example:
@GrabConfig(systemClassLoader=true)
@Grab(group='com.h2database', module='h2', version='1.3.168')
import groovy.sql.Sql
def sql = Sql.newInstance("jdbc:h2:mem:db1", "sa", "sa", "org.h2.Driver")
println sql.rows("select * from MY_TABLE")

Related

Best practice to use django db connection in network call

I have two Django projects with DRF and some REST APIs.
I want to use the DB connection of one Django project in the other, so I need to pass the DB connection/credentials over the API. I know we can get the connection object using the following:
from django.db import connections
db_conn = connections['default']
What I am thinking is:
1. Pass the above db_conn as a parameter in an API call to the second project's API; but when I receive it on the other side, it shows up as a string, not as a DB connection object.
2. Pass the credentials and create the connection using SQLAlchemy in the second project.
So which of the two approaches is better? It would also be great if you could explain how to pass a DB connection object in API params as an object rather than a string.
I don't think you can put the object as a parameter in the API request. A string is the best result you will get. Besides, I would not be comfortable exposing that information in the API!
Option 2 is the better approach, I would think.

node and express - how to use fake data

It's been a while since I used Node and Express, and I was sure this was possible, but I'm having trouble figuring it out now.
I have a simple Postgres database with Sequelize. I am building a back end and don't have a populated database yet. I want to be able to provide fake data to use for building the front end and for testing. Is there a way to populate the database when the Node server is started? Maybe by reading a JSON file into the database?
I know that I could point to this fake data using a setting in the environment file, but I don't see how to read in the data on startup. Is there a way to create a local database, read in the data, and point to that?
You can use the faker-factory package; I think it can solve your problem.
https://www.npmjs.com/package/faker-factory
FakerJs provides that solution.
import { faker } from '@faker-js/faker';
const randomName = faker.name.findName();
const randomEmail = faker.internet.email();
With the above, you can run a loop (a for loop, to be specific) to create the desired data you may need for your project.
Also, check out the free web APIs that provide fake or real data to work with.
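To tie this back to the original question about populating the Sequelize/Postgres database when the Node server starts, here is a minimal seeding sketch. The connection string, the User model, and the seedFakeUsers function are assumptions for illustration only; adapt them to your real models.

import { Sequelize, DataTypes } from 'sequelize';
import { faker } from '@faker-js/faker';

// Hypothetical connection string and model; replace with your own.
const sequelize = new Sequelize('postgres://localhost:5432/devdb');

const User = sequelize.define('User', {
    name: DataTypes.STRING,
    email: DataTypes.STRING,
});

export async function seedFakeUsers(count = 50) {
    await sequelize.sync(); // make sure the tables exist
    const rows = Array.from({ length: count }, () => ({
        name: faker.name.findName(),   // same faker calls as in the snippet above
        email: faker.internet.email(),
    }));
    await User.bulkCreate(rows); // insert all the fake rows in one go
}

You could then call seedFakeUsers() from your startup code (before app.listen), guarded by an environment flag, so the fake data is only loaded in development.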

DocumentDB Data migration Tool, can't migrate from db to db

I'm using the DocumentDB Data Migration Tool to migrate a DocumentDB database to a newly created DocumentDB database. The connection strings verify as OK.
It doesn't work: no data is transferred (= 0), but no failure is written in the log file (Failed = 0).
Here is what I have done. I've tried many things, such as:
migrating / transferring a collection to a JSON file
migrating to a partitioned / non-partitioned DocumentDB database
for the target indexing policy, using the source indexing policy (the JSON taken from Azure, in the DocumentDB collection settings).
...
Nothing is working, but I have no error logs. Maybe it is a problem with the DocumentDB version?
Thanks in advance for your help.
After debugging the solution from the tool's repo, I figured out that the tool fails silently if you mistype the database's name, like I did.
DocumentDBClient just returns an empty async enumerator:
var database = await TryGetDatabase(databaseName, cancellation);
if (database == null)
    return EmptyAsyncEnumerator<IReadOnlyDictionary<string, object>>.Instance;
I can import from an Azure Cosmos DB DocumentDB API collection using the DocumentDB Data Migration Tool.
Also, based on my test, if the collection name specified for the source DocumentDB does not exist, no data is transferred and no error log is written.
Please make sure the source collection you specified exists. If possible, try creating a new collection, import data from that new collection, and check whether the data can be transferred.
I've faced the same problem, and after some investigation found that the internal document structure had changed. Therefore, after migration with the tool, the documents are present but can't be found with Data Explorer (though with Query Explorer, using select *, they are visible).
I've migrated the collection through the Mongo API using MongoChef.
@fguigui: To help troubleshoot this, could you please re-run the same data migration operation using the command-line option? Just launch dt.exe from the same folder as the Data Migration Tool to see the required syntax. Then, after you launch it with the required parameters, please paste the output here and I'll take a look at what's broken.

User specific database in MongoDB

I am currently working on inventory management software in Node.js and MongoDB. I am pretty new to MongoDB, having worked with Oracle and MySQL for most of my projects.
Is it possible to create a separate database schema for every client who uses my software, with each client having access only to their own copy of the database schema and collections?
The equivalent of selecting data in an Oracle database would be
Select * from User1.table,
Select * from User2.table etc.
Also, if it is possible, how would it be implemented using a Node.js MongoDB client like mongoose?
I looked at the MongoDB documentation, but it talks mainly about adding users to a database for authorization.
I apologize if this seems like a silly question, but I'd appreciate it if someone could point me in the right direction.
Before starting to invest a lot of time in the development of your project, check out other possible approaches to the scenario that you are trying to build.
I did a quick search on SO and found some additional threads with similar scenarios:
MongoDB Database vs. Collection
MongoDB Web App - Database per User
Additional info about mongoose database creation
Whenever you call the connect method on the mongoose object, you are either connecting to an existing database or creating it if it doesn't already exist.
You could have a function that takes a name argument and creates databases programmatically:
function createDatabase(name) {
    var conn_string = 'mongodb://localhost/';
    if (typeof name === 'string') {
        conn_string += name;
    } else {
        return false;
    }
    mongoose.connect(conn_string);
}
Also, be aware that a database is only created when you first insert a record into a collection of that database; it is not sufficient to just connect to it.
As in the example above, you could also pass a schema parameter to the function, tailored to each user's profile, and fire an insert statement after you connect to that database.
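To make that concrete, below is a minimal sketch of one database per user using mongoose's createConnection, including a first insert so the database actually materializes. The connection string, the Item model, and the schema fields are assumptions for illustration, not something from the original answer.

import mongoose from 'mongoose';

const itemSchema = new mongoose.Schema({
    name: String,
    quantity: Number,
});

export async function createUserDatabase(userName: string) {
    // One database per user, e.g. mongodb://localhost/inventory_user1
    const conn = mongoose.createConnection('mongodb://localhost/inventory_' + userName);

    // Bind the model to this specific connection, not to the default one.
    const Item = conn.model('Item', itemSchema);

    // The database only shows up in MongoDB once something is written to it,
    // so insert a first record right away.
    await Item.create({ name: 'initial item', quantity: 0 });

    return conn;
}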

azure table storage query

When I POST data to and query a table in the development storage (emulator), it works.
When I POST data to a table in the Azure storage account, it works.
When I GET data from a table in the Azure storage account, it does not work.
In both cases the code is the same, except for the key and account credentials.
Is there anything else I should do to query?
var query = azure.TableQuery
.select().from('dummytable').where('PartitionKey eq ?', key);
Can anyone suggest why the query is not working?
Should anything else be done?
From Storage Explorer it works; I am able to see the entities.
Only from the program am I not able to get a response, yet in the same program the "PUT" operation works.
The same was happening to me. I upgraded the azure npm package from 0.6.1 to 0.6.7 and now it works; hope this helps.
I would look at the value of your partition key. There are some values that aren't on Azure's list of invalid characters but that it still has issues with. For example, before SDK 1.7 you could safely insert a % in a key, but if you queried for it specifically, it wouldn't work. To test whether this is the problem, try running your query without the filter and make sure your row is returned.
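A quick sketch of that test, assuming the old azure npm package (0.6.x) from the question; the account name and key below are placeholders, and the query is the one from the question minus the where() filter.

const azure = require('azure');

const tableService = azure.createTableService('myaccount', 'mykey'); // placeholder credentials

// Same query as in the question, but without the PartitionKey filter.
const unfilteredQuery = azure.TableQuery
    .select()
    .from('dummytable');

tableService.queryEntities(unfilteredQuery, function (error, entities) {
    if (error) {
        console.error('query failed:', error);
        return;
    }
    // If the row shows up here but not with the PartitionKey filter,
    // the problem is most likely the value of the key itself.
    console.log('entities returned:', entities.length);
});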
After reading the MSDN mailing lists, I upgraded the azure npm package to the latest version, 0.6.7, and it works. It looks to have been an issue with the azure package.
