Loading the site returns the error "site map root node is null"; the database is stock demo data.
You can run this SQL command against the database to check if the node exists:
SELECT * FROM sitemap WHERE Title = 'Sitemap Root' AND CompanyID > 0;
If it is not there, the deployment is corrupted and you will need to re-install the website.
There's not enough information in the question to determine how the site was deployed and which data set was used. For SalesDemo data, it should be deployed from the wizard.
I am following this blog step by step: http://www.windowsazure.com/en-us/documentation/articles/web-sites-dotnet-deploy-aspnet-mvc-app-membership-oauth-sql-database/#setupdevenv
If I run the application from my local machine, I see the data coming from the Windows Azure database, and I can add, update, and delete records; it works perfectly. The problem is that when I publish my application to Windows Azure, all the static pages work fine, but the one page that interacts with the database does not.
Here is my web.config connection string:
<add name="DefaultConnection" connectionString="server=tcp:*insert server name*.database.windows.net,1433;Database=*insert database name*;User ID=*insert username*@*insert server name*;Password={*insert password here*};Trusted_Connection=False;Encrypt=True;Connection Timeout=30;" />
I get this message when I try to access the page at http://XXXX.azurewebsites.net/Employee
Error Message:
Migrations is enabled for context 'ApplicationDbContext' but the database does not exist or contains no mapped tables. Use Migrations to create the database and its tables, for example by running the 'Update-Database' command from the Package Manager Console.
It seems that your database cannot be created automatically. The fastest way to fix this is to follow the suggestion in the error message: open the Package Manager Console with the project that contains the connection string and Configuration.cs (your migrations) selected as the startup project, and run Update-Database. You may need to pass some parameters to this command if you have changed anything in your migrations.
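If you would rather not run the command manually against the Azure database, Entity Framework can also apply pending migrations automatically when the application starts. A minimal sketch, assuming the default MVC template names (ApplicationDbContext and the generated Migrations/Configuration.cs; the YourApp.Migrations namespace is a placeholder):

// A sketch, not the only fix: have EF apply pending migrations on first use of the context,
// so the Azure database is created and updated without a manual Update-Database step.
using System.Data.Entity;
using YourApp.Migrations; // placeholder namespace containing Configuration

protected void Application_Start()
{
    Database.SetInitializer(
        new MigrateDatabaseToLatestVersion<ApplicationDbContext, Configuration>());
    // ... the rest of the usual Application_Start registration calls
}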
Sitecore 6.6 was deployed on Azure using the Sitecore Azure 3.0 tool. I'm looking for the Sitecore logs, because I need to see who changed a particular item, but I can't find them. Specifically, I'm interested in the AUDIT log lines.
I was looking in blob storage, but there are only containers: cacheclusterconfigs, publishtargets, sitecore-auto-deploy, wad-control-container, wad-iis-logfiles.
I was looking in table storage, using Azure Storage Explorer, but there are only these tables: WADDiagnosticInfrastructureTable, WADDirectoriesTable, WADLogsTable, WADPerformanceCountersTable, WADWindowsEventLogsTable.
Where are audit logs stored?
Bartłomiej.
You can find the Sitecore log entries in the WADLogsTable table in the Azure Storage service that the Sitecore Azure module creates during deployment.
Please take a look at the following article for more details:
https://kb.sitecore.net/articles/400950
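If you prefer to pull the AUDIT lines programmatically instead of browsing with a storage explorer, here is a minimal sketch, assuming the Microsoft.WindowsAzure.Storage NuGet package and the diagnostics storage account the module deployed to (the connection string is a placeholder):

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

var account = CloudStorageAccount.Parse("<your diagnostics storage connection string>");
var table = account.CreateCloudTableClient().GetTableReference("WADLogsTable");

// WAD log entities carry the logged text in a string property named "Message";
// Sitecore audit entries are ordinary log lines containing "AUDIT".
var query = new TableQuery<DynamicTableEntity>().Take(1000);
foreach (var entity in table.ExecuteQuery(query))
{
    EntityProperty message;
    if (entity.Properties.TryGetValue("Message", out message)
        && message.StringValue.Contains("AUDIT"))
    {
        Console.WriteLine(message.StringValue);
    }
}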
Best Wishes,
Oleg Burov
With Sitecore Experience Platform 8.2, the location of the log files has been updated: https://kb.sitecore.net/articles/911837
Logs can now be accessed via Application Insights. Find the proper Application Insights object in the Azure portal.
Select the object, which will launch a new query window.
Open a new tab (query) and enter the following query exactly:
traces
| extend scinstancename=parsejson(customDimensions).InstanceName
| where timestamp > now(-1d)
| summarize count(), any(tostring(scinstancename)) by cloud_RoleInstance
| extend InstanceName=any_scinstancename
| extend CloudRole=cloud_RoleInstance
| project InstanceName, CloudRole
| order by InstanceName asc
Hit 'GO'.
Find the instance whose logs you wish to view, and take note of its CloudRole column value.
Lastly, run the following query in a new tab, swapping 'REPLACE_THIS_ROLE' for the role you found in the previous step:
traces
| where cloud_RoleInstance == 'REPLACE_THIS_ROLE'
| where timestamp > now(-14d)
| project timestamp, message
| sort by timestamp desc
The log entries are displayed in the results pane. Results may be exported to CSV if desired.
Sitecore logs, by default, are stored in a folder named "logs" under the folder specified in this setting in the web.config file:
<sc.variable name="dataFolder" value="/data" />
By default, this value is "/data". In a typical non-Azure implementation, this value is usually an absolute path to the "Data" folder that is one level up from the Sitecore "Website" folder.
When using the Sitecore Azure module, this value is transformed to "/App_Data" during deployment. So you would find your logs under "/App_Data/logs" on your Azure instance.
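That is, after deployment the variable ends up looking like this:
<sc.variable name="dataFolder" value="/App_Data" />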
If you don't want to access the logs via file system/remote desktop, you can also view them within the Sitecore desktop interface. Simply log in to the Sitecore desktop, then click the Sitecore button -> Reporting Tools -> Log Viewer. In the Log Viewer application, you will be able to open and view a log file from the Sitecore instance you are logged into.
The migration of the SQL Reporting component of Windows Azure from the old portal to the newer HTML 5 one has, in the process, limited the folder hierarchy to 2 levels deep (as indicated in this article).
The article does, however, state that existing reporting services can still have deeper hierarchies, and Business Intelligence Development Studio still allows you to deploy to sub folders.
We have preserved our hierarchy like so:
Root Level
    Client Reports
    Internal Reports
        Report Category 1
            Data Source
            Report1.rdl
            Report2.rdl
        Report Category 2
Due to the number of reports we have, it is unfeasible to put every folder at root level, and, thus far, the deeper hierarchy has still been functioning correctly.
However, we have run into a problem: we can no longer update any data sources or delete reports that sit more than 2 levels deep.
Rather than restructure all our reports to suit what feels like an extremely restrictive structure, is there a way of managing our SQL Reporting reports outside the portal, via an API, BIDS, or PowerShell?
OK, so I've done a bit of research into this: SQL Reporting on Windows Azure exposes the ReportService2010 SOAP interface for administering reports programmatically.
Using a proxy generated through the WSDL tool we can remotely connect to SQL Reporting using the below C# code:
var reportServiceUrl =
    "https://X.reporting.windows.net/reportserver/reportservice2010.asmx?wsdl";
var serviceUsername = "AdminUserName";
var servicePassword = "AdminPassword";

// The cookie container is required so the authentication cookie
// issued by LogonUser is retained across subsequent calls.
var reportingService = new ReportingService2010
{
    Url = reportServiceUrl,
    CookieContainer = new CookieContainer()
};

reportingService.LogonUser(serviceUsername, servicePassword, reportServiceUrl);
The reportingService object can then be used to upload, update, and delete all items on the SQL Reporting instance. If, for example, I wanted to delete a report in a sub folder that can no longer be accessed in the Windows Azure portal, I could use:
reportingService.DeleteItem("Internal Reports/Report Category 1/Report1.rdl");
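The proxy can also enumerate the whole catalog, which is useful for finding the exact paths of items the portal no longer displays. A small sketch using the same reportingService object:

// Recursively list every catalog item so deeply nested paths can be located.
var items = reportingService.ListChildren("/", true);
foreach (var item in items)
{
    Console.WriteLine("{0} ({1})", item.Path, item.TypeName);
}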
That said, it is much easier to refactor the report folders into a 2-level hierarchy instead. The naming convention we ended up using is:
Root Level
    Internal Reports - Report Category 1
        Data Source
        Report1.rdl
        Report2.rdl
    Internal Reports - Report Category 2
I have copied the Drupal files to the live server using FileZilla. Now when I open the live site it says the site is offline, and the MySQL error was: Unknown MySQL server host 'dbramha' (1). In settings.php I have set the db_url as $db_url = 'mysql://dbramha/testing' ('testing' is the database used locally). Do I have to install Drupal again on the server?
The error is Drupal saying it can't see the database. Depending on your hosting arrangement, the database may or may not be on the same server as the Drupal installation (it's most likely on another server, though).
Have you uploaded the database? This is most commonly done by producing a SQL dump from the local development DB and importing it into the live one.
Just copying the Drupal files is only half of the story.
You have to change the db_url to match one of these formats:
* Database URL format:
*   $db_url = 'mysql://username:password@localhost/databasename';
*   $db_url = 'mysqli://username:password@localhost/databasename';
*   $db_url = 'pgsql://username:password@localhost/databasename';
You also have to import your MySQL data into the live DB :)
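For example, assuming the live MySQL server runs on localhost and you have created a dedicated user for the site (the credentials below are placeholders), the asker's db_url would become:
$db_url = 'mysql://liveuser:livepassword@localhost/testing';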
I'm trying to deploy an application on Azure, but I'm facing some problems.
On my dev box everything works fine, but I have a problem when I try to use the application once it is deployed.
On the dev box, I have an action that I run manually which creates the test tables in my local SQL Server Express instance.
But I do not know how to create the tables on the server, so when I run my website application it says TableNotFound.
Can someone guide me through this final step? Do I need to do anything additional?
Thanks in advance.
The table storage client provides a method to create the schema in cloud storage; I forget the name (will look it up in a second). Call that when you initialise whatever you're using as your data service layer.
Edit: The following snippet is what I use:
var info = StorageAccountInfo.GetDefaultTableStorageAccountFromConfiguration();
TableStorage.CreateTablesFromModel(typeof(<Context>), info);
where <Context> is your data context object.
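For reference, later versions of the SDK expose the same capability through CloudTableClient; a minimal sketch, assuming a role configuration setting named "DataConnectionString" and a placeholder table name:

using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;

// Create the table on startup if it does not exist yet.
var account = CloudStorageAccount.Parse(
    RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));
var tableClient = account.CreateCloudTableClient();
tableClient.CreateTableIfNotExist("Employee");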