What are different methods to import/export client data in Openbravo?

I'd like to know if there are different ways to import/export client data in Openbravo. The built-in Import Client doesn't work well when we have a client with a large amount of data.

There are three different ways to bring a client into a new system.
1. Binary database dump
`pg_dump dbname > outfile`
`psql -U postgres -f outfile dbname`
These two commands create and restore a plain SQL dump; for a compressed binary (custom-format) dump you can use `pg_dump -Fc dbname > outfile.dump` and restore it with `pg_restore -d dbname outfile.dump`.
2. Using the Import/Export UI provided by Openbravo (not for production)
This client export should only be used for smaller datasets and for development purposes. It should not be used in production environments with large amounts of data, since its XML format is not designed for such a purpose. For production, a database dump (option 1) should be used instead.
3. `ant install.source` (the Export Client UI is used here)
For the client data to be imported automatically, the exported XML files should be moved from `referencedata/importclient` to `referencedata/sampledata` (e.g. `mv referencedata/importclient/*.xml referencedata/sampledata/`) before running `ant install.source`.
Hope this helps!

Related

CosmosDB return data from external API on read

I am attempting to write an Azure Cosmos DB integration (Core SQL API) that integrates with an external service to provide some of the query data. As an example, I need a query made on Cosmos DB to convert some of the data returned by the query (e.g. IDs) into real data by calling an external service via a REST API. This should only happen when querying certain columns.
I initially investigated using a JS stored procedure and/or a UDF to make this external call, but the JS environment seems to be extremely limited and doesn't provide any way to make external calls. I then tried using this https://github.com/Oblarg/cosmosdb-storedprocs-ts repository, which uses webpack to bundle all of node.js into the stored procedure, allowing node modules to be used in stored procedures. Whilst this does allow some node modules to be used, whenever I try and use "https", "fetch", or "axios" modules to make an HTTP GET request I get errors (the same code works fine in a normal node environment, but I'm not a JS expert and can't seem to work past these errors). After a day of attempts it seems like the stored procedure approach is not possible.
Is this the case or is there some way of making HTTP GET requests from a JS stored procedure? If not possible with stored procedures, are there any other techniques to achieve the requirement of reading data from a remote API when querying cosmos DB?
Thanks
There is no way to achieve this from Cosmos DB directly. For plain queries you also cannot use the change feed, since the documents don't change, so your only option is to use a function or some preprocessor app to handle it. As you say, it's not ideal, but there is no other solution here. If it were an insert or an update, the change feed would let you do this, but for plain queries it's not possible.
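A minimal sketch of such a preprocessor in Node.js, assuming the @azure/cosmos client and a hypothetical lookup endpoint (https://api.example.com/items/{id}; the database, container, and field names here are made up too):

```js
// Hypothetical preprocessor: run the Cosmos DB query first, then resolve the
// raw IDs against the external REST API before returning results to callers.
const { CosmosClient } = require("@azure/cosmos");

const client = new CosmosClient({
  endpoint: process.env.COSMOS_ENDPOINT,
  key: process.env.COSMOS_KEY,
});

async function queryWithEnrichment() {
  const container = client.database("mydb").container("mycontainer");

  // Plain Cosmos DB query; the documents only hold the raw external IDs.
  const { resources } = await container.items
    .query("SELECT c.id, c.externalId FROM c")
    .fetchAll();

  // Resolve each ID via the external service (global fetch, Node 18+).
  return Promise.all(
    resources.map(async (doc) => {
      const res = await fetch(`https://api.example.com/items/${doc.externalId}`);
      return { ...doc, externalData: await res.json() };
    })
  );
}
```

The same logic could sit in an Azure Function or a thin API layer between your UI and Cosmos DB, so callers never query Cosmos DB directly.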

How to query small client-side database in Node + Express

I am working on a simple Node/Express weather app in which the user can type in the name of a city and the program will query a weather server using the geographic coordinates of that city.
These coordinates are stored in a simple CSV file (city, country, longitude, latitude) which I have converted to an SQL database using SQLite.
My question is: Is there a way to allow my App to load and query a DB locally (client-side) without resorting to a dedicated DB server?
The reason I ask is because the DB is quite small (less than one MB) and it seems like overkill to use a dynamic server for this use-case. I tried finding a solution online, but couldn't come up with anything conclusive.
Just convert your CSV file to JSON and put it in your local project. In Node you can require() a .json file directly, or wrap it in a module and expose it with module.exports; after that you just load it and use it.
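As a minimal sketch (the file name, fields, and route are assumptions), the lookup could be an Express route over a JSON file bundled with the project:

```js
// cities.json is assumed to look like:
// [{ "city": "Paris", "country": "FR", "longitude": 2.35, "latitude": 48.86 }, ...]
const express = require("express");
const cities = require("./cities.json"); // require() parses .json files directly

const app = express();

app.get("/coords/:city", (req, res) => {
  const match = cities.find(
    (c) => c.city.toLowerCase() === req.params.city.toLowerCase()
  );
  if (!match) return res.status(404).json({ error: "Unknown city" });
  res.json({ longitude: match.longitude, latitude: match.latitude });
});

app.listen(3000);
```

Since the file is under one MB, it can comfortably live in memory for the lifetime of the process.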

how to use different database with same table structure in typed dataset xsd

I'm not sure how to explain this well, so here is my scenario.
I have a database named "database1", and I have used a typed dataset in Visual Studio and added more than 200 stored procedures to the table adapters.
It is a desktop-based application. Now I want to deploy the same software at another school with the same database structure, but I have changed the database name.
When I generate the new database from a query, write all the stored procedures into it, and change the database name in the connection string, it doesn't work.
I'd recommend you don't change the database name, or if it's something specific to the first client (like ParisTechnicalCollegeDatabase), change it now to something generic (SchoolManager) so you never have to change it again. There's no specific problem I can think of regarding reusing a typed dataset on a different database: I do it daily on databases that aren't clones of each other. Ensure your second server is set up with the user and default schema that are specified in the connection string. The problem will either be a faulty connection string or an incorrect database setup, not the fault of the dataset.
For more targeted help, post the error messages that appear when you try to run your app against the new database.

Real-Time Database Messaging

We've got an application in Django running against a PGSQL database. One of the functions we've grown to support is real-time messaging to our UI when data is updated in the backend DB.
So... for example we show the contents of a customer table in our UI, as records are added/removed/updated from the backend customer DB table we echo those updates to our UI in real-time via some redis/socket.io/node.js magic.
Currently we've rolled our own solution for this entire thing using overloaded save() methods on the Django table models. That actually works pretty well for our current functions, but as tables continue to grow into GBs of data, it is starting to slow down on some larger tables as our engine digs through the currently 'subscribed' UIs and works out which updates need to be messaged out to which clients.
Curious what other options might exist here. I believe MongoDB and other NoSQL-type engines support some constructs like this out of the box, but I'm not finding an exact hit when Googling for better solutions.
> Currently we've rolled our own solution for this entire thing using overloaded save() methods on the Django table models.
Instead of working at the app level, you might want to work at the lower, database level.
Add a PostgreSQL trigger after row insertion, and use pg_notify to notify external apps of the change.
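For instance (a sketch only; the customer table, channel, and function names are assumptions), the trigger could be installed with a one-off Node script using the node-postgres (pg) client:

```js
// One-off setup script: create a trigger that calls pg_notify on every
// insert/update. Table name 'customer' and channel 'channelName' are assumed.
const { Client } = require("pg");

const ddl = `
CREATE OR REPLACE FUNCTION notify_customer_change() RETURNS trigger AS $$
BEGIN
  -- NOTIFY payloads are capped at 8000 bytes; send just the key for big rows
  PERFORM pg_notify('channelName', row_to_json(NEW)::text);
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

DROP TRIGGER IF EXISTS customer_change ON customer;
CREATE TRIGGER customer_change
AFTER INSERT OR UPDATE ON customer
FOR EACH ROW EXECUTE PROCEDURE notify_customer_change();
`;

(async () => {
  const client = new Client({ connectionString: process.env.DATABASE_URL });
  await client.connect();
  await client.query(ddl); // multiple statements are fine in one simple query
  await client.end();
})();
```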
Then in NodeJS:
```js
var PGPubsub = require('pg-pubsub');

// The connection string points at the database (not a table)
var pubsubInstance = new PGPubsub('postgres://username@localhost/databasename');

pubsubInstance.addChannel('channelName', function (channelPayload) {
  // Handle the notification and its payload
  // If the payload was JSON it has already been parsed for you
});
```
You can do the same in Python: https://pypi.python.org/pypi/pgpubsub/0.0.2.
Finally, you might want to use data partitioning in PostgreSQL. Long story short, PostgreSQL already has everything you need :)

jmeter auto create user/password account

I'm trying to use JMeter to simulate 500 usernames/passwords being created on a test site I have. The home page has 3 fields: username, email address, and password. How can I get JMeter to auto-fill those fields?
The next question is: can JMeter then go to the next page and fill in credit information, for example?
One thing to note here is that JMeter is not like QTP/Selenium; it is not a pure functional testing tool. However, it can be used for functional testing when you know how to use it!
For your question:
1. Record the HTTP requests for creating the user and entering the credit information (see http://jmeter.apache.org/usermanual/jmeter_proxy_step_by_step.pdf for more information).
2. Update the recorded scripts to parameterize the username, password, email, etc.
3. Update the loop count to rerun the script again and again to create the data you want.
JMeter is an excellent tool for performance testing, functional testing, creating test data, etc.
JMeter can either use external pre-defined data or generate random values.
To use existing username/password/email combinations, there are the following options:
- CSV Data Set Config - to read information from CSV files (see the sample below)
- JDBC PreProcessor - to fetch information from any database which supports the JDBC protocol
- __StringFromFile - to read a string from a file
- __CSVRead - similar to CSV Data Set Config
- __RandomString - to generate a random string
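For example, a CSV file fed to the CSV Data Set Config might look like this (hypothetical values):

```csv
username,password,email
user001,S3cret!001,user001@example.com
user002,S3cret!002,user002@example.com
```

With "Variable Names" set to username,password,email, each thread/iteration reads the next row, and the recorded HTTP request can then reference ${username}, ${password}, and ${email} in its form parameters.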
In regards to "go to the next page", that is also possible, given that you have an HTTP Cookie Manager.
Remember that JMeter acts at the protocol level, so you'll need to construct the HTTP requests properly.
The best way to trace execution and visualize requests/responses is the View Results Tree listener.
