Adding a new Dashboard in OpenStack Horizon

I am trying to create a new dashboard, but whenever I do so, nothing changes in the OpenStack dashboard and the new dashboard doesn't get added to it.
from django.utils.translation import ugettext_lazy as _

import horizon


class BasePanelGroup(horizon.PanelGroup):
    # A panel group subclasses horizon.PanelGroup, not horizon.Dashboard
    name = _("Overview")
    slug = "overview"
    panels = ("hypervisors",)


class Chargeback(horizon.Dashboard):
    name = _("Chargeback")
    slug = "chargeback"
    panels = (BasePanelGroup,)  # reference the class, not a string
    default_panel = "hypervisors"  # must match an existing panel slug
    permissions = ('openstack.roles.admin',)


horizon.register(Chargeback)  # register the dashboard class itself
Following this link, I have even tried adding the file _50_chargeback.py with the given details in it, and now my existing OpenStack dashboard doesn't show up.

Check the error log file generated by the Apache server. Also add the dashboard's enabled configuration file to the enabled directory referenced by your local settings. I hope this helps!
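As a minimal sketch, assuming the dashboard package lives at openstack_dashboard/dashboards/chargeback, the enabled file could look like this (restart the Apache server afterwards so Horizon picks it up):

# openstack_dashboard/enabled/_50_chargeback.py
# The slug of the dashboard to be added to HORIZON['dashboards'].
DASHBOARD = 'chargeback'
# If set to True, this dashboard will not be added to the settings.
DISABLED = False
# A list of Django applications to be prepended to INSTALLED_APPS.
ADD_INSTALLED_APPS = ['openstack_dashboard.dashboards.chargeback']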

Related

Accessing GraphDB with RDF4J with a specific user

I'm using RDF4J to add RDF triples to a completely open (Security off) GraphDB instance. I use the RemoteRepositoryManager and it works fine:
RepositoryManager repositoryManager = new RemoteRepositoryManager(GraphDBInstanceURL);
Repository repository = repositoryManager.getRepository(graphDBrepoName);
RepositoryConnection repositoryConnection = repository.getConnection();
Now we need to add security to GraphDB, but it is not clear to me how to add the specific GraphDB user credentials in the above code. Any pointers are welcome, thanks.
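In case a sketch helps: assuming a recent RDF4J release, RemoteRepositoryManager exposes setUsernameAndPassword, which can be called before the manager is initialized (the credential strings below are placeholders):

RemoteRepositoryManager repositoryManager = new RemoteRepositoryManager(GraphDBInstanceURL);
// Pass the GraphDB user credentials before initializing the manager
repositoryManager.setUsernameAndPassword("username", "password");
repositoryManager.init();
Repository repository = repositoryManager.getRepository(graphDBrepoName);
RepositoryConnection repositoryConnection = repository.getConnection();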

Terraform - Azure - Extract API from one resource group and import into another resource group

I have 5 different APIs in my Dev environment. This environment was built manually.
However, for the subsequent environments like Test, Pre-Prod, etc., Terraform is being used.
Since I need to create each of the APIs in the subsequent environments, I am extracting each of these APIs as a JSON file, making minor tweaks to the API URLs, and importing it into the new environments.
The following is the process that I am following right now.
Went to Resource groups in Azure
Then under API Management service > APIs, clicked on the necessary API
Now, clicked on the three dots next to the API that I need and clicked on Export
Selected OpenAPI v3 (JSON) format
Now, I'm using the extracted JSON file and using the Terraform code below to add it to the APIM
resource "azurerm_api_management_api" "example" {
name = "example-api"
resource_group_name = azurerm_resource_group.example.name
api_management_name = azurerm_api_management.example.name
revision = "1"
display_name = "Example API"
path = "api/path"
protocols = ["https"]
service_url = "https://actualURL-of-the-API"
import {
content_format = "openapi+json"
content_value = file("extracted-filename.json")
}
}
The issue here is:
Even though the API gets added to the APIM, the import doesn't populate all of the data, such as the Web service URL and the backend HTTP(s) endpoint.
How do I go about doing this?
Are you locked into exporting the JSON file and importing it into the other environments through Terraform?
The reason I ask is because I attempted something similar but decided to go another route.
Initially I created the API manually in a Dev environment. I then re-created the same API from the ground up using only Terraform. No JSON export & import.
I then used that Terraform script to create my other environments.
That allowed me to bypass the import problem altogether since nothing is imported.
I have found that there are downsides to taking this approach: it is much less intuitive to author the API through the Terraform script than through the Azure GUI, and therefore more time consuming, especially since my initial API was discarded for the one generated with the Terraform script.
Additionally, I have had problems with Terraform diffs reporting changes when there are none (I suspect the same problem occurs when using the import method).
If you are wondering why I decided to go another route, the reason was twofold: firstly, similar to you, I had trouble getting the export/import to generate the API that I wanted; secondly, I prefer not to rely on auto-generated files. A rough sketch of the no-import approach follows.
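For illustration, here is what authoring the API and a single operation directly in Terraform can look like (the operation name, method, and URL template below are made up for the example):

# Author the API itself in Terraform; service_url becomes the Web service URL in APIM
resource "azurerm_api_management_api" "example" {
  name                = "example-api"
  resource_group_name = azurerm_resource_group.example.name
  api_management_name = azurerm_api_management.example.name
  revision            = "1"
  display_name        = "Example API"
  path                = "api/path"
  protocols           = ["https"]
  service_url         = "https://actualURL-of-the-API"
}

# Each operation is authored as its own resource instead of being imported
resource "azurerm_api_management_api_operation" "get_items" {
  operation_id        = "get-items"
  api_name            = azurerm_api_management_api.example.name
  api_management_name = azurerm_api_management.example.name
  resource_group_name = azurerm_resource_group.example.name
  display_name        = "Get Items"
  method              = "GET"
  url_template        = "/items"

  response {
    status_code = 200
  }
}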

How to update the existing web service with a new docker image on Azure Machine Learning Services?

I am currently working on a machine learning project with Azure Machine Learning Services, but I have run into the problem that I can't update the existing web service with a new docker image (I want to keep the same URL as the running web service).
I have read the documentation but it doesn't really tell me how to update (documentation link: https://learn.microsoft.com/en-us/azure/machine-learning/service/how-to-deploy-and-where).
The documentation said that we have to use update() with image = new-image.
from azureml.core.webservice import Webservice
service_name = 'aci-mnist-3'
# Retrieve existing service
service = Webservice(name = service_name, workspace = ws)
# Update the image used by the service
service.update(image = new-image)
print(service.state)
But it isn't described where new-image comes from.
Does anyone know how to figure out this problem?
Thank you
The documentation could be a little more clear on this part, I agree. The new-image is an image object that you should pass into the update() function. If you just created the image you might already have the object in a variable, then just pass it. If not, then you can obtain it from your workspace using
from azureml.core.image.image import Image
new_image = Image(ws, image_name)
where ws is your workspace object and image_name is a string with the name of the image you want to obtain. Then you go on calling update() as
from azureml.core.webservice import Webservice
service_name = 'aci-mnist-3'
# Retrieve existing service
service = Webservice(name = service_name, workspace = ws)
# Update the image used by the service
service.update(image = new_image) # Note that dash isn't supported in variable names
print(service.state)
You can find more information in the SDK documentation.
EDIT:
Both the Image and the Webservice classes above are abstract parent classes.
For the Image object, you should really use one of these classes, depending on your case:
ContainerImage
UnknownImage
(see Image package in the documentation).
For the Webservice object, you should use one of these classes, depending on your case:
AciWebservice
AksWebservice
UnknownWebservice
(see Webservice package in the documentation).
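As a concrete sketch, assuming an ACI deployment and an image that was built with ContainerImage (the image name below is a placeholder):

from azureml.core.image import ContainerImage
from azureml.core.webservice import AciWebservice

# Retrieve the concrete image and service objects by name
new_image = ContainerImage(ws, name='mnist-image')  # 'mnist-image' is hypothetical
service = AciWebservice(ws, 'aci-mnist-3')

# Point the existing service at the new image; the scoring URL stays the same
service.update(image=new_image)
service.wait_for_deployment(show_output=True)
print(service.state)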

Register/ Login/ Membership module in Orchard

I can't figure out how to add Register/Login functionality to a site in Orchard. Is there a Membership module or some configuration I need to enable?
EDIT: What I had in mind were modules along the lines of these that extend the existing User model with registration/profile functionality:
Extended Registration module: http://extendedregistration.codeplex.com/
Orchard Profile module: http://orchardprofile.codeplex.com/
It's under Settings > Users in the admin UI.
In the Dashboard, scroll down to Settings and select Users.
Make sure "Users can create new accounts on the site" is checked and click "Save".
Once this is done, log out.
Then click Log in, and below the username and password fields there will be a small text with a blue link to Register.
You don't actually need the extended registration and profile for this. Those are for adding additional information to the registration form.
This can also be done programmatically:
var registrationSettings = _services.WorkContext.CurrentSite.As<RegistrationSettingsPart>();
registrationSettings.UsersCanRegister = true;
However, this will not work if you're doing it from Migrations, because you won't be able to use WorkContext.
For migrations you can use IRepository for RegistrationSettingsPartRecord:
RegistrationSettingsPartRecord currentSetting = _registrationSettingRepository.Table.First();
currentSetting.UsersCanRegister = true;
_registrationSettingRepository.Update(currentSetting);
However, this no longer works as of Orchard version 1.8, as the record no longer exists. As of 1.8, one way I know of is to use ISiteService:
var site = _siteService.GetSiteSettings();
var regsettings = site.As<RegistrationSettingsPart>();
regsettings.UsersCanRegister = true;

Retrieving IIS Logs from Azure

I have been trying to get IIS logs from Azure, and I was able to get it going once - now, no matter what I try, I can't get it to transfer logs to my Storage account.
I was trying to do this without re-deploying my code, which after reading around seemed possible - and, as I mentioned, I was successful. But this is driving me insane; it just won't do it anymore. It does create the queue in my storage account when I start a transfer, but that's all it seems to do.
The basic steps I am doing are:
Adding the storage name and key to my config as "DiagnosticsConnectionString"*.
Setting a DiagnosticMonitorConfiguration for one minute, with a DirectoriesBufferConfiguration.
Starting an OnDemand Transfer with a new queue name.
I've done all of the above both programmatically and through the cmdlets for PowerShell. As soon as I start a transfer, it just stays with a status of "Not Yet Published (Do not end/cancel)".
I have tried Logs, Directories, and even deleted and recreated my storage account. Nothing seems to be working. It appeared to work when I directly added my storage account info to my role config via the Azure portal; after it updated the deployment I saw the logs. But this is not working anymore. Does anyone have some good advice/material? I just want to transfer my IIS logs to my storage account - I've been at it for days.
Update: *This is my config: . My WebRole.cs contained the following when it worked:
DiagnosticMonitor.Start("DiagnosticsConnectionString");
I've updated it to start transfers:
var diag = new DiagnosticMonitorConfiguration()
{
    ConfigurationChangePollInterval = TimeSpan.FromMinutes(1),
    Directories = new DirectoriesBufferConfiguration()
    {
        ScheduledTransferPeriod = TimeSpan.FromMinutes(1)
    },
    Logs = new BasicLogsBufferConfiguration()
    {
        ScheduledTransferLogLevelFilter = LogLevel.Verbose,
        ScheduledTransferPeriod = TimeSpan.FromMinutes(1)
    }
};
DiagnosticMonitor.Start("DiagnosticsConnectionString", diag);
Change one line:
From
var diag = new DiagnosticMonitorConfiguration()
to
var diag = DiagnosticMonitor.GetDefaultInitialConfiguration()
Afterwards, use the existing objects within diag rather than adding your own.
This is my OnStart:
var config = DiagnosticMonitor.GetDefaultInitialConfiguration();
config.DiagnosticInfrastructureLogs.ScheduledTransferLogLevelFilter = LogLevel.Information;
config.DiagnosticInfrastructureLogs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
config.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
DiagnosticMonitor.Start("DiagnosticsConnectionString", config);
It could be that the logs are not being generated, rather than a problem at the download time.
There is a program called AzureLogFetcher that may help; tips on getting logging to work can be found here.
