Mapping contracts to ganache addresses - truffle

I am deploying a factory contract to ganache using truffle migrate.
In response to user events, my factory deploys other contracts. All the contracts are in the Truffle project whose truffle-config.js I associate with Ganache, and they are visible in the workspace.
However, I only deploy the factory itself with truffle migrate; I never use it to deploy the other contracts. As a result, Ganache doesn't seem to know about any of the other contracts or events in my project.
The newly deployed contracts are visible in Ganache as addresses, but they are not recognized as being of the types in my project.
Is there a way to map the contracts deployed by my factory to addresses that Ganache knows about? And their events?
Do I have to deploy each contract in my project to Ganache once?

When your Factory contract uses new to create an instance of another contract (much like creating an instance in OOP), each instance is in fact a new contract deployed at its own address. The creation code of those contracts is embedded in the Factory's bytecode because you import them at the top of the Factory contract, which is why you never deploy them with truffle migrate yourself.
In addition, you can see the contracts in Ganache on the Contracts tab if you link your Truffle project by specifying its truffle-config.js file.
Updated Answer:
These answers explain how the factory pattern works:
https://ethereum.stackexchange.com/a/45918/77376
https://ethereum.stackexchange.com/a/14012/77376
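To actually map a factory-created address back to one of your project's contract types (and read its events), you can attach the compiled artifact to that address yourself, e.g. from the Truffle console. A minimal sketch: Factory and Child stand for your own contract names, and the children(0) getter is hypothetical, so substitute however your factory exposes the addresses it creates:

// In the Truffle console (truffle console --network <your ganache network>):
const factory = await Factory.deployed();
const childAddress = await factory.children(0); // hypothetical getter for a created address
const child = await Child.at(childAddress);     // attach the Child artifact's ABI to that address
const events = await child.getPastEvents("allEvents", { fromBlock: 0 }); // the child's past events

Once attached with .at(), the instance behaves exactly like one deployed via truffle migrate.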

Related

Entity Framework Migrations running in Azure

I am moving an application to Azure and I am not sure where I can run my entity framework migrations. We are deploying from github, but I can't run the migrations from the runner due to security policies outside of my control. I need the migrations to run in Azure.
I already have a command line program that runs the migrations, but I am not opposed to running them with the dotnet command (dotnet ef database update).
This is something I only need to run once on each deployment.
I got it to work with a container instance, but that doesn't support using KeyVault references for environment variables. My deployment can't have any secrets and I need to use a KeyVault reference for the connection string.
I thought about a timer-triggered function, but the deployment needs to know that it completed successfully.
What is a good cloud-native solution for something that only needs to run once per deployment?
Thanks to Albert Starreveld for the doc.
There are different approaches to running EF migrations in Azure.
Code First is one approach.
For this approach, a data model needs to be created.
For example, for an Address class a table is created in the database, and likewise for a Person class, each with a unique identifier.
Add the following NuGet packages to the project:
Microsoft.EntityFrameworkCore
Microsoft.EntityFrameworkCore.SqlServer
Microsoft.EntityFrameworkCore.Design
To communicate between the database and the model classes, we use a DbContext class, a base class provided by Entity Framework Core.
To run the commands below, you need to install the dotnet-ef tool; pick the version that matches your .NET version.
.NET Core 3: dotnet tool install --global dotnet-ef --version 3.1
Database updates can then be done by using the command below:
dotnet ef database update
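Note that dotnet ef database update only applies migrations that already exist in the project; if there are none yet, create the first one beforehand (the migration name below is just an example):
dotnet ef migrations add InitialCreate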
Updating the database in an Azure DevOps pipeline
For Azure DevOps, there is a deployment task for SQL databases, and this task is compatible with Entity Framework Core migrations.
By executing the 'ef migrations script' command, a SQL file is generated that can be used by the SqlAzureDacpacDeployment@1 task in Azure Pipelines.
For more information, please check link1 and link2.
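A minimal sketch of that flow, assuming the .NET SDK is available on the build agent; the service connection, server, database, and file paths are placeholders:

# Generate an idempotent SQL script from the migrations:
dotnet ef migrations script --idempotent --output $(Build.ArtifactStagingDirectory)/migrate.sql

# Azure Pipelines step that runs the generated script against the Azure SQL database:
- task: SqlAzureDacpacDeployment@1
  inputs:
    azureSubscription: 'my-service-connection'
    ServerName: 'myserver.database.windows.net'
    DatabaseName: 'mydb'
    SqlUsername: '$(SqlUser)'      # can be resolved from Key Vault via a variable group
    SqlPassword: '$(SqlPassword)'
    deployType: 'SqlTask'
    SqlFile: '$(Build.ArtifactStagingDirectory)/migrate.sql'

The --idempotent flag makes the generated script check which migrations have already been applied, so the same script works on every deployment regardless of the database's current state.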

Can you deploy a smart contract from a webpage?

I am trying to deploy a smart contract from a webpage. I am using React for the frontend and I wrote a smart contract in Solidity, but the contract is deployed only when I run the Truffle commands in the terminal. I want to deploy the contract when the user clicks a button.
How can I do that?
You can send the deployment transaction from the frontend using the web3.js or ethers.js library. For how to deploy a contract with these libraries, please refer to the documentation of their deploy functions.
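A minimal sketch with ethers.js (v6), assuming an injected wallet such as MetaMask in the browser; abi and bytecode come from your Truffle build artifact (e.g. build/contracts/MyContract.json):

import { ethers } from "ethers";

// Deploy the contract with the user's wallet when they click a button.
async function deployOnClick(abi: ethers.InterfaceAbi, bytecode: string) {
  const provider = new ethers.BrowserProvider((window as any).ethereum); // injected wallet
  const signer = await provider.getSigner();          // prompts the user to connect
  const factory = new ethers.ContractFactory(abi, bytecode, signer);
  const contract = await factory.deploy(/* constructor args, if any */);
  await contract.waitForDeployment();                 // wait until the tx is mined
  return contract.target;                             // the deployed address
}

Wire deployOnClick to the button's onClick handler; the user's wallet then pays the gas for the deployment.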

Deployment deep learning system with some models with MLaaS

I read some articles with deployment examples and they were about deploying one model but not a whole deep learning system.
If I want to deploy my project, including launching multiple deep models built with different frameworks (PyTorch, TensorFlow), what's a good option for that:
build a Docker image with the whole project and deploy it with an ML service (Azure, AWS Lambda, etc.);
or deploy every single model with the chosen MLaaS, and deploy the logic that makes requests to those models elsewhere?
I would appreciate any reference/link on the subject.
Thanks.
We have a public open-source release of the Many Models solution accelerator. The accelerator is now available on GitHub and open to everyone: https://aka.ms/many-models.
• Check out a blog on Many Models from MTC AI Architect Sam here
Check this document on using the designer: https://learn.microsoft.com/en-us/azure/machine-learning/how-to-deploy-model-designer? Basically, you can register a trained model in the Designer and bring it out with the SDK/CLI to deploy it.
One approach with the current integration between Azure ML and Azure DevOps is to set up a release pipeline in Azure DevOps that is triggered by model registration in your Dev workspace's model registry and then deploys to your Prod workspace.
There is guidance, with examples, in this repo:
https://github.com/Microsoft/MLOpsPython
And more general guidance for MLops at http://aka.ms/mlops
This also allows for putting integration tests into your release process or other steps like approval processes if needed using DevOps functionality.

Machine learning in Azure: How do I publish a pipeline to the workspace once I've already built it in Python using the SDK?

I don't know where else to ask this question, so I would appreciate any help or feedback. I've been reading the SDK documentation for Azure Machine Learning service (in particular azureml.core). There's a class called Pipeline that has methods validate() and publish(). Here are the docs for this:
https://learn.microsoft.com/en-us/python/api/azureml-pipeline-core/azureml.pipeline.core.pipeline.pipeline?view=azure-ml-py
When I call validate(), everything validates, and then I call publish(), but it seems to only create an API endpoint in the workspace; it doesn't register my pipeline under Pipelines, and there's obviously nothing in the designer.
My question: I want to publish my pipeline so I just have to launch from the workspace with one click. I've built it already using the SDK (Python code). I don't want to work with an API. Is there any way to do this or would I have to rebuild the entire pipeline using the designer (drag and drop)?
Totally empathize with your confusion. Our team has been working with Azure ML pipelines for quite some time, but PublishedPipelines still confused me initially because:
what the SDK calls a PublishedPipeline is called a Pipeline Endpoint in the Studio UI, and
it is semi-related to Dataset's and Model's .register() methods, but fundamentally different.
TL;DR: all Pipeline.publish() does is create an endpoint that you can use to:
schedule and version Pipelines, and
re-run the pipeline from other services via a REST API call (e.g. via Azure Data Factory).
You can see PublishedPipelines in the Studio UI in two places:
Pipelines page :: Pipeline Endpoints tab
Endpoints page :: Pipeline Endpoints tab
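To make the REST point concrete, here is a minimal TypeScript sketch of re-running a PublishedPipeline: the endpoint URL is the one shown in the Studio UI, the AAD token must be obtained separately (e.g. with azure-identity), and the experiment and parameter names are placeholders:

// Submit a run of a PublishedPipeline via its REST endpoint.
async function triggerPipeline(endpointUrl: string, aadToken: string): Promise<string> {
  const response = await fetch(endpointUrl, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${aadToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      ExperimentName: "my-experiment",               // placeholder experiment name
      ParameterAssignments: { learning_rate: 0.01 }, // placeholder pipeline parameter
    }),
  });
  const { Id } = await response.json(); // id of the submitted pipeline run
  return Id;
}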

Endpoint name in project deployed on Azure Service Fabric

I'm using Service Fabric as a container for deploying existing executables.
I intend to spawn a listener on the endpoint configured at deployment time. Is it possible to get the endpoint settings from the context somehow? I know that the stateful/stateless/actor boilerplate project types allow retrieval of a CodePackageActivationContext, but what about a basic console project deployed as an exe?
Thanks
You should be able to retrieve the activation context using FabricRuntime.GetActivationContext()
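If referencing the Service Fabric runtime from a plain console exe is awkward, another option is to read the environment variables Service Fabric sets for declared endpoints. A sketch, assuming an endpoint named ServiceEndpoint is declared in ServiceManifest.xml (the name is a placeholder):

// Service Fabric exposes declared endpoints to activated processes as
// environment variables of the form Fabric_Endpoint_<EndpointName>.
const port = process.env["Fabric_Endpoint_ServiceEndpoint"];
if (!port) {
  throw new Error("Endpoint not set: not running under Service Fabric?");
}
console.log(`Listening on port ${port}`);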
