Is there a way to copy an activity to another database? - brightway

I would like to create a copy of an activity and move it to another database.
I can see two solutions:
Option 1 - Use the .copy() method of my activity and then change the database of the copy. The problem is that I cannot figure out how to move an activity from one database to another, or even whether it is possible.
Option 2 - Re-create an activity in the destination database that contains the data of the activity to copy. Although this is not difficult, I find this option way less elegant.
Below is an implementation of option 2.
import brightway2 as bw

def copy_activity(activity, target_database=None, target_name=None):
    """
    Copy an activity to another database.

    Args:
        activity: Activity to copy
        target_database (str): Name of the database in which to copy the
            activity; default is the database of the activity
        target_name (str): Name of the copy; default is the name of the
            activity
    """
    if (target_database is None) and (target_name is None):
        raise ValueError('You must specify at least a target database or a target name')
    db_name = target_database or activity.key[0]
    db = bw.Database(db_name)
    activity_copy = db.new_activity(activity.key[1])
    # Copy all attributes except the ones that define the key of the copy
    for attribute in activity:
        if attribute not in ('database', 'code'):
            activity_copy[attribute] = activity[attribute]
    if target_name is not None:
        activity_copy['name'] = target_name
    activity_copy.save()
    for exchange in activity.exchanges():
        # A production exchange must point to the copy itself
        if exchange['input'] == exchange['output']:
            exchange_copy = activity_copy.new_exchange(input=activity_copy.key)
        else:
            exchange_copy = activity_copy.new_exchange(input=exchange['input'])
        # Copy the remaining exchange data (amount, type, unit, ...)
        for key in exchange:
            if key not in ('input', 'output'):
                exchange_copy[key] = exchange[key]
        exchange_copy.save()

You can change the database of an activity with my_activity['database'] = 'something'. You don't need to save afterwards.
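For example, a minimal sketch (assuming both databases already exist; the names are made up):

import brightway2 as bw

act = bw.Database("source_db").random()   # any activity
act['database'] = "target_db"             # moves it to the other database, no .save() needed
print(act.key)                            # ('target_db', <same code as before>)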
This functionality is implemented (and tested) in the bw2data source code.

Related

How to set up output path while copying data from Azure Cosmos DB to ADLS Gen 2 via Azure Data Factory

I have a cosmos DB collection in the following format:
{
    "deviceid": "xxx",
    "partitionKey": "key1",
    .....
    "_ts": 1544583745
}
I'm using Azure Data Factory to copy data from Cosmos DB to ADLS Gen 2. If I copy using a copy activity, it is quite straightforward. However, my main concern is the output path in ADLS Gen 2. Our requirements state that we need to have the output path in a specific format. Here is a sample of the requirement:
outerfolder/version/code/deviceid/year/month/day
Now, since deviceid, year, month, and day are all in the payload itself, I can't find a way to use them except to create a Lookup activity and use the output of the Lookup activity in the Copy activity.
And this is how I set the output folder using the dataset property:
I'm using SQL API on Cosmos DB to query the data.
Is there a better way I can achieve this?
I think that your way works, but it's not the cleanest. What I'd do is create a separate pipeline variable for each value: version, code, deviceid, etc. Then, after the Lookup, you can assign the variables, and finally run the Copy activity referencing the pipeline variables, as sketched below.
It may look kind of redundant, but think of someone (or you, two years from now) having to modify the pipeline: if you are not around (or have forgotten), this way makes it clear how the pipeline works and what you should modify.
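For example, each Set Variable activity after the Lookup could use an expression such as @activity('LookupDevice').output.firstRow.deviceid (the activity and variable names here are made up), and the sink dataset's folder path could then be built from the variables. A sketch, assuming a parameterized ADLS Gen 2 dataset:

"folderPath": {
    "value": "outerfolder/@{variables('version')}/@{variables('code')}/@{variables('deviceid')}/@{variables('year')}/@{variables('month')}/@{variables('day')}",
    "type": "Expression"
}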
Hope this helped!!

Navigating from Location to Workorder

I need to:
1. Create a single-page location application
2. Display all the assets present in the selected location in a table
3. Provide a button from which the user can navigate to WOTRACK to view all the work order(s) created on the selected location and its asset(s).
I am facing difficulty with the 3rd one. I have tried Launch in Context and it is working fine, except that I am not able to pass an SQL query like 'location={location} and assetnum in ({asset.assetnum})'. I need to filter work orders with a particular location and all its assets.
I tried saving all the assets in the location to a non-persistent attribute and passing the values of the attribute in the Launch in Context URL. It's working as expected, but to do so I have written a script on 'Initialize value', which is causing performance issues.
The script goes like this:
from psdi.server import MXServer
from psdi.mbo import MboConstants

if app == "LOCATION1":
    if mbo.getString("LOCATION") is not None:
        Locsite = mbo.getString("SITEID")
        desc = mbo.getString("DESCRIPTION")
        MaxuserSet = MXServer.getMXServer().getMboSet("MAXUSER", mbo.getUserInfo())
        MaxuserSet.setWhere(" userid='" + user + "' ")
        MaxuserSet.reset()
        UserSite = MaxuserSet.getMbo(0).getString("DEFSITE")
        if Locsite == UserSite:
            AssetSet = mbo.getMboSet("ASSET")
            AssetSet.setFlag(MboConstants.DISCARDABLE, True)
            if not AssetSet.isEmpty():
                AssetList = ""
                AssetMbo = AssetSet.moveFirst()
                while AssetMbo is not None:
                    AssetList = AssetList + str(AssetMbo.getString("ASSETNUM")) + "%2C"
                    AssetMbo = AssetSet.moveNext()
                # "non-persistant" is the custom non-persistent attribute (spelled to match the URL below)
                mbo.setValue("non-persistant", str(AssetList), 11L)
and in the LIC URL I have given: 'http://xx.x.x.xx/maximo/ui/?event=loadapp&value=wotrack&tabid=List&additionalevent=useqbe&additionaleventvalue=location={LOCATION}|assetnum={non-persistant}'
Is there any other feasible solution to this requirement?
Thanks in advance.
Launch In Context is better used for sending the user to an outside-of-Maximo application and passing along some data from inside-Maximo to provide context in that external app.
What you are doing sounds like a good place to use a workflow process with an Interaction node. The developer tells the Interaction node which app to take the user to and which Relationship to use to find the data the user should work with there.
Why don't you add a table control inside the table details (expanded table row) and show a list of the work orders there? From the WONUM in that table, you could have an app link to take the user to WOTRACK if they want more details about a particular work order. No customization (automation scripting) needed. No workflow needed. Nice and simple. A sketch of the relationship such a table could use follows.
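For reference, a minimal sketch of a relationship from LOCATIONS to WORKORDER that could drive such a table control. The WHERE clause below is an assumption: it pulls in work orders on the location itself and on the assets at that location:

siteid = :siteid and (location = :location or assetnum in
    (select assetnum from asset where location = :location and siteid = :siteid))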

Azure Data factory copy activity failed mapping strings (from csv) to Azure SQL table sink uniqueidentifier field

I have an Azure Data Factory (ADF) pipeline that consists of a Copy activity. The Copy activity uses the HTTP connector as source to invoke a REST end-point, and returns a CSV stream that sinks into an Azure SQL Database table.
The Copy fails when the CSV contains strings (such as 40f52caf-e616-4321-8ea3-12ea3cbc54e9) that are mapped to a uniqueidentifier field in the target table, with the error message The given value of type String from the data source cannot be converted to type uniqueidentifier of the specified target column.
I have tried wrapping the source string with {}, such as {40f52caf-e616-4321-8ea3-12ea3cbc54e9}, with no success.
The Copy activity works if I modify the target table field from uniqueidentifier to nvarchar(100).
I reproduced your issue on my side.
The reason is that the data types of the source and sink do not match. You can check the data type mapping for SQL Server.
Your source data type is string, which is mapped to nvarchar or varchar, while a uniqueidentifier column in a SQL database needs the GUID type in Azure Data Factory.
So, please configure a SQL Server stored procedure in your SQL Server sink as a workaround.
Please follow these steps:
Step 1: Configure your Sink dataset.
Step 2: Configure the Sink section in the Copy activity as follows:
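A sketch of the relevant sink settings in the Copy activity JSON (the stored procedure and table type are defined in the next two steps; the property names follow the ADF SQL sink):

"sink": {
    "type": "SqlSink",
    "sqlWriterStoredProcedureName": "convertCsv",
    "sqlWriterTableType": "CsvType"
}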
Step 3: In your database, define the table type with the same name as sqlWriterTableType. Notice that the schema of the table type should be the same as the schema returned by your input data.
CREATE TYPE [dbo].[CsvType] AS TABLE(
    [ID] [varchar](256) NOT NULL
)
Step 4: In your database, define the stored procedure with the same name as SqlWriterStoredProcedureName. It handles input data from your specified source and merges it into the output table. Notice that the parameter name of the stored procedure should be the same as the "tableName" defined in the dataset.
CREATE PROCEDURE convertCsv @ctest [dbo].[CsvType] READONLY
AS
BEGIN
    MERGE [dbo].[adf] AS target
    USING @ctest AS source
    ON (1 = 1)
    WHEN NOT MATCHED THEN
        INSERT (id)
        VALUES (CONVERT(uniqueidentifier, source.ID));
END
Hope it helps you. Any concerns, please feel free to let me know.
There is a way to fix GUID conversion into the uniqueidentifier SQL column type properly via JSON configuration.
Edit the Copy activity via the Code {} button in the top-right toolbar.
Put:
"translator": {
"type": "TabularTranslator",
"typeConversion": true
}
into the typeProperties block of the Copy activity, as sketched below. This will also work if the mapping schema is unspecified/dynamic.
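In context, the edited Copy activity might then look something like this (a sketch; the source and sink types are assumptions that depend on your connectors):

"typeProperties": {
    "source": { "type": "DelimitedTextSource" },
    "sink": { "type": "AzureSqlSink" },
    "translator": {
        "type": "TabularTranslator",
        "typeConversion": true
    }
}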

Brightway2: Modifying/deleting exchanges from activity without using activity as dict

I would like to modify an activity's exchanges and save the activity back to the database.
It is possible to change other aspects of the activity, like its name:
some_act['name'] = "some new name"
and then save the activity with:
some_act.save()
It is also possible to modify exchanges the same way:
some_exc['scale"] = 0.5
and then save the exchange with:
some_exc.save()
However, the only way I have found to add/delete exchanges from a specific activity is to go through the dictionary version of the activity:
some_act_dataset = some_act._data
some_act_dataset['exchanges'] = [{exchange1}, {exchange2}] # exc must be valid exchange dict
The problem is that I don't know how to save the new activity (as dict) back to the database.
some_act_dataset.save() doesn't work, since dictionaries don't have a save method.
Database("my_database").write(some_act_dataset)overwrites all the other data in the database.
I could work in the loaded database:
loaded_db = Database("my_database").load()
and make the changes I need in the resulting dictionary, and then write the whole database, but when the databases are big, this seems like a costly operation.
So, the question is: is there a way to modify an activity's exchanges and save the activity back to the database without needing to overwrite the entire database?
Activities and exchanges are stored in separate tables in the SQLite database, and they each have their own object. On the journey to and from the database, several translation layers are used: the raw SQLite rows, the peewee ORM objects ActivityDataset and ExchangeDataset, and the Activity and Exchange proxies.
However, we almost always work with Activity or Exchange objects. The key point here is that because activities and exchanges are two separate tables, they have to be treated separately.
To create a new exchange, use Activity.new_exchange():
In [1]: from brightway2 import *
In [2]: act = Database("something").random()
In [3]: exc = act.new_exchange()
In [4]: type(exc)
Out[4]: bw2data.backends.peewee.proxies.Exchange
You can also specify data attributes in the new_exchange method call:
In [5]: exc = act.new_exchange(amount=1)
In [6]: exc['amount']
Out[6]: 1
To delete an Exchange, call Exchange.delete(). If you are doing a lot of data manipulation, you can either execute SQL directly against the database, or write peewee queries with ActivityDataset or ExchangeDataset (see e.g. the queries built in the construction of an Exchanges object).
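Putting the pieces together, a minimal sketch (continuing the example above; the exchange attributes here are made-up values):

from brightway2 import *

act = Database("something").random()
other = Database("something").random()

# Add a new technosphere exchange and persist it
exc = act.new_exchange(input=other.key, amount=0.5, type="technosphere")
exc.save()

# Remove it again
for exc in act.exchanges():
    if exc.input.key == other.key and exc["amount"] == 0.5:
        exc.delete()
        break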

Rename Object not supported in Azure SQL Data Warehouse?

[Posting question from customer on an internal thread]
I tried to run the following commands in SQL DW:
RENAME OBJECT dbo.test TO test2
RENAME OBJECT test TO test2
Both failed with the following error:
No item by the name of '[DemoDB].[dbo].[test]' could be found in the current database 'DemoDB', given that @itemtype was input as '(null)'.
Is this a defect or is there a workaround that I can use?
RENAME is now supported. In order to use RENAME OBJECT, you must prefix the table you want to rename with the schema name, like this:
RENAME OBJECT x.T_New TO T;
Notice that there is no schema qualification on the target name. This is because the renamed object must continue to reside inside the same schema. To transfer a table from one schema to another, you need to use the following command:
ALTER SCHEMA dbo TRANSFER OBJECT::x.T_NEW;
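Applied to the names in the question, the two operations might look like this (a sketch; the target schema x is made up):

-- Rename within the current schema (no schema prefix on the new name)
RENAME OBJECT dbo.test TO test2;

-- Then, if needed, move the renamed table to another schema
ALTER SCHEMA x TRANSFER OBJECT::dbo.test2;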
In case someone else is looking at this now: it is possible in Azure Synapse Analytics, formerly Azure SQL Data Warehouse; you can go with:
ALTER DATABASE AdventureWorks2012
MODIFY NAME = Northwind;
