How to link an Excel file to SQL Server 2008 R2 Management Studio

We have recently moved from Access to SQL Server 2008 R2 and are completely new to it. We need to work on data that arrives in Excel spreadsheets. In Access we simply linked the spreadsheets and ran our queries against them. Is a similar approach available in SQL Server? If not, how do we link Excel files into SQL Server?
Also, what is SQL Server Compact?

To link the Excel files: in Excel, go to the Data tab, where you will find the From Other Sources option. A wizard will appear requesting the server name and your credentials for the database -> Next -> select the database you want to connect to, then the table, and click Finish.
Hope it helps.

I don't know about "linking" your Excel workbook or sheets to a SQL Server database. Are these always on the same host?
In any case, you can easily import your Excel data into a SQL Server table:
-- Enable ad hoc distributed queries (required for OPENDATASOURCE)
EXEC sp_configure 'Show Advanced Options', 1;
RECONFIGURE;
GO
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;
GO
-- Allow the ACE OLE DB provider to run in-process
EXEC sp_MSset_oledb_prop N'Microsoft.ACE.OLEDB.12.0', N'AllowInProcess', 1;
GO
EXEC sp_MSset_oledb_prop N'Microsoft.ACE.OLEDB.12.0', N'DynamicParameters', 1;
GO
-- Import the worksheet into an existing table
INSERT INTO dbo.YOUR_SQL_TABLENAME
SELECT EXCEL_COLUMN_NAME1 [, EXCEL_COLUMN_NAME2...]
FROM OPENDATASOURCE('Microsoft.ACE.OLEDB.12.0',
    'Data Source=C:\YOUR_EXCEL_FILENAME.xlsx;Extended Properties=Excel 12.0')...[Sheet1$];
This will import from Sheet1$; if you have named your worksheets, use that name instead.
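If you would rather do the import from client code instead of T-SQL, here is a minimal sketch of the same idea in C#: the ACE OLE DB provider reads the worksheet into a DataTable and SqlBulkCopy pushes it into the target table. The file path, sheet name, table name and connection strings are placeholders you would replace with your own.
using System.Data;
using System.Data.OleDb;
using System.Data.SqlClient;

class ExcelImport
{
    static void Main()
    {
        // Placeholder connection strings - adjust the file path, database and table names.
        var excelConnStr = @"Provider=Microsoft.ACE.OLEDB.12.0;" +
                           @"Data Source=C:\YOUR_EXCEL_FILENAME.xlsx;" +
                           "Extended Properties='Excel 12.0 Xml;HDR=YES'";
        var sqlConnStr = "Server=.;Database=YOUR_DB;Integrated Security=true";

        // Read the worksheet into memory.
        var data = new DataTable();
        using (var excel = new OleDbConnection(excelConnStr))
        using (var adapter = new OleDbDataAdapter("SELECT * FROM [Sheet1$]", excel))
        {
            adapter.Fill(data);
        }

        // Bulk-copy the rows into the existing SQL Server table.
        using (var sql = new SqlConnection(sqlConnStr))
        using (var bulk = new SqlBulkCopy(sql))
        {
            sql.Open();
            bulk.DestinationTableName = "dbo.YOUR_SQL_TABLENAME";
            bulk.WriteToServer(data); // column names/order must line up with the target table
        }
    }
}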

Related

Syncing Azure Easy Table with WHERE clause

I'm developing a Xamarin.Forms app which uses an Azure App Service with a SQL database linked through Easy Tables. I've run the samples and successfully tested querying tables etc. on the server, and enabled offline sync so that a local db is created.
I've created the store, defined the table and synced it; however, I want to be able to query it somehow with a WHERE clause - is that possible? Can I add a WHERE clause to the client.GetSyncTable line?
var store = new MobileServiceSQLiteStore("localstore.db");
store.DefineTable<Journey_Stages>();
await client.SyncContext.InitializeAsync(store);
tbl_Stages = client.GetSyncTable<Journey_Stages>();
Some of the tables I'm pulling down will grow over time and are linked to individual user profiles, so I only want the data that belongs to the current user, and I don't want to bring down masses of data each time. Preferably the server should handle that and only send down what I need on a user-by-user basis.
Thanks,
Steve
You should add this filtering logic on the server side, so that each user's data isn't exposed to all of your other users. See for example this sample if you are using the Node.js backend -- line 17 adds a WHERE clause for the table read query. If you have the .NET backend, similar logic would go in your table controller (a sketch follows the Node.js snippet below).
// Configure specific code when the client does a request
// READ - only return records belonging to the authenticated user
table.read(function (context) {
    context.query.where({ userId: context.user.id });
    return context.execute();
});
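For the .NET backend, the equivalent is to filter the query in your table controller. A minimal sketch, assuming an Azure Mobile Apps .NET backend with a Journey_Stages entity that has a UserId column (the context class and the claim used for the user id are illustrative):
using System.Linq;
using System.Security.Claims;
using System.Web.Http.Controllers;
using Microsoft.Azure.Mobile.Server;

public class Journey_StagesController : TableController<Journey_Stages>
{
    protected override void Initialize(HttpControllerContext controllerContext)
    {
        base.Initialize(controllerContext);
        var context = new MobileServiceContext(); // your EF DbContext
        DomainManager = new EntityDomainManager<Journey_Stages>(context, Request);
    }

    // GET tables/Journey_Stages - only return rows belonging to the authenticated user
    public IQueryable<Journey_Stages> GetAllJourney_Stages()
    {
        var userId = ((ClaimsPrincipal)User).FindFirst(ClaimTypes.NameIdentifier)?.Value;
        return Query().Where(s => s.UserId == userId);
    }
}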

View rows and tables for my data in Graphcool?

With Graphcool is it possible to view the SQL database to see your tables, rows, items etc?
My application will use GraphQL queries (obviously!) but as an admin it would be handy for me to be able to see my entire database.
{
  allGroups {
    name
    id
  }
}
In the terminal, run graphcool console from your server directory. There is then a Data menu link on the page that opens.

DocumentDB Data migration Tool, can't migrate from db to db

I'm using the DocumentDB Data Migration Tool to migrate a DocumentDB database to a newly created DocumentDB database. The connection string verification says it is OK.
It doesn't work: no data is transferred (= 0), but no failure is written to the log file either (Failed = 0).
Here is what I have done. I've tried many things, such as:
migrating / transferring a collection to a JSON file
migrating to a partitioned / non-partitioned DocumentDB database
for the target indexing policy, taking the source indexing policy (the JSON obtained from Azure, DocumentDB collection settings)
...
Actually nothing is working, but I have no error logs. Maybe it is a problem with the DocumentDB version?
Thanks in advance for your help.
After debugging the solution from the tool's repo, I figured out that the tool fails silently if you mistype the database name, as I did.
DocumentDBClient just returns an empty async enumerator:
// TryGetDatabase returns null for an unknown database name,
// so the import source silently yields no documents.
var database = await TryGetDatabase(databaseName, cancellation);
if (database == null)
    return EmptyAsyncEnumerator<IReadOnlyDictionary<string, object>>.Instance;
I can import from an Azure Cosmos DB DocumentDB API collection using the DocumentDB Data Migration Tool.
Also, based on my test, if the collection name specified for the source DocumentDB does not exist, no data is transferred and no error log is written.
(screenshot: import result)
Please make sure the source collection you specified exists. If possible, try creating a new collection, importing data from that new collection, and checking whether the data is transferred.
I've faced the same problem, and after some investigation I found that the internal document structure had changed. Therefore, after migration with the tool, the documents are present but cannot be found with the Data Explorer (although with the Query Explorer, using select *, they are visible).
I ended up migrating the collection through the Mongo API using MongoChef.
#fguigui: To help troubleshoot this, could you please re-run the same data migration operation using the command-line option? Just launch dt.exe from the same folder as the Data Migration Tool to see the required syntax. Then, after you launch it with the required parameters, please paste the output here and I'll take a look at what's broken.

View extended event file

When I try to read the extended event file for an Azure database, I get the following error:
I am able to download the .xel file from blob storage and view it through SSMS.
I can also run select * from sys.fn_xe_file_target_read_file ( 'http location of .xel file', null, null, null ), but that is not user friendly.
Is there any other way to view an extended event .xel file?
There is only one way that I can think of to make the interface more user-friendly, and that is to create your own. I do not know if you have the time to invest in the task, but in a similar situation I used this library to query and parse extended events with LINQ (it is the same library that Microsoft uses for the SSMS Extended Events viewer in SQL Server 2016).
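A minimal sketch of reading a downloaded .xel file this way, assuming the library in question is Microsoft.SqlServer.XEvent.Linq (the QueryableXEventData class); the file path and event name are placeholders:
using System;
using System.Linq;
using Microsoft.SqlServer.XEvent.Linq;

class XelReader
{
    static void Main()
    {
        // Point this at the .xel file downloaded from blob storage (placeholder path).
        var events = new QueryableXEventData(@"C:\temp\MyAzureSession.xel");

        // LINQ over the events, e.g. keep only rpc_completed events.
        foreach (var evt in events.Where(e => e.Name == "rpc_completed"))
        {
            Console.WriteLine($"{evt.Timestamp:o} {evt.Name}");
            foreach (var field in evt.Fields)
                Console.WriteLine($"  {field.Name} = {field.Value}");
        }
    }
}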

Delete Reports in an SQL Azure Sub-Folder

The migration of the SQL Reporting component of Windows Azure from the old portal to the newer HTML 5 one has, in the process, limited the folder hierarchy to 2 levels deep (as indicated in this article).
The article does, however, state that existing reporting services can still have deeper hierarchies, while Business Intelligence Development Studio still allows you to deploy to the sub-folders.
We have preserved our hierarchy like so:
Root Level
  Client Reports
  Internal Reports
    Report Category 1
      Data Source
      Report1.rdl
      Report2.rdl
    Report Category 2
Due to the number of reports we have, it is unfeasible to have every folder at root level and, thus far, the hierarchy has still been functioning correctly.
However we have run into a problem; we can no longer update any data sources or delete reports that are more than 2 levels deep.
Rather than restructure all our reports to suit what feels like an extremely restrictive structure, is there a way of managing our SQL Reporting reports outside the portal, via an API, BIDS or PowerShell?
OK, so I've done a bit of research into this: SQL Reporting on Windows Azure exposes the ReportService2010 SOAP interface for administering reports programmatically.
Using a proxy generated with the WSDL tool, we can connect remotely to SQL Reporting using the C# code below:
var reportServiceUrl =
    "https://X.reporting.windows.net/reportserver/reportservice2010.asmx?wsdl";
var serviceUsername = "AdminUserName";
var servicePassword = "AdminPassword";

var reportingService = new ReportingService2010
{
    Url = reportServiceUrl,
    CookieContainer = new CookieContainer()
};

reportingService.LogonUser(serviceUsername, servicePassword, reportServiceUrl);
The reportingService object can then be used to upload, update and delete all items on the SQL Reporting instance. If, for example, I wanted to delete a report in a sub folder that cannot be accessed on the Windows Azure portal, I could use:
reportingService.DeleteItem("Internal Reports/Report Category 1/Report1.rdl");
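The same proxy can also enumerate everything under the root, including items nested more than 2 levels deep, which makes it easier to find the paths you need to update or delete. A rough sketch reusing the reportingService object from above (the output formatting is just illustrative):
// List every catalog item recursively, whatever its depth.
var items = reportingService.ListChildren("/", true);

foreach (var item in items)
{
    Console.WriteLine("{0}: {1}", item.TypeName, item.Path);
}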
That said, it is much easier to refactor the report folders into a 2-level hierarchy instead; a MoveItem sketch follows the folder layout below. The naming convention we ended up using is:
Root Level
  Internal Reports - Report Category 1
    Data Source
    Report1.rdl
    Report2.rdl
  Internal Reports - Report Category 2
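If you do decide to flatten the hierarchy, the refactoring itself can also be scripted with the same proxy rather than done by hand. A rough sketch (the folder and report names follow the layout above and are illustrative):
// Create the new root-level folder and move a report into it.
reportingService.CreateFolder("Internal Reports - Report Category 1", "/", null);
reportingService.MoveItem(
    "/Internal Reports/Report Category 1/Report1.rdl",
    "/Internal Reports - Report Category 1/Report1.rdl");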
