I was looking for a way to upload a text file containing a dictionary of synonyms to Azure Search. The closest I could find was:
https://azure.microsoft.com/en-in/blog/azure-search-synonyms-public-preview/
https://learn.microsoft.com/en-us/azure/search/search-synonyms
I know it is not ideal to compare products from different companies, but if there is a way to upload a dictionary of synonyms to Azure Search, as Elasticsearch supports, it would be a great help and might save a lot of time and rework.
How can I upload a synonym dictionary to Azure Search?
The latest .NET SDK for Azure Cognitive Search has this capability. From this sample:
// Create a new SearchIndexClient
Uri endpoint = new Uri(Environment.GetEnvironmentVariable("SEARCH_ENDPOINT"));
AzureKeyCredential credential = new AzureKeyCredential(
    Environment.GetEnvironmentVariable("SEARCH_API_KEY"));
SearchIndexClient indexClient = new SearchIndexClient(endpoint, credential);

// Create a synonym map from a file containing country names and abbreviations
// using the Solr format with entry on a new line using \n, for example:
// United States of America,US,USA\n
string synonymMapName = "countries";
string synonymMapPath = "countries.txt";

SynonymMap synonyms;
using (StreamReader file = File.OpenText(synonymMapPath))
{
    synonyms = new SynonymMap(synonymMapName, file);
}

await indexClient.CreateSynonymMapAsync(synonyms);
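Once the synonym map exists, searchable fields in the index have to reference it by name before it takes effect at query time. A minimal sketch with the same .NET SDK, assuming a hypothetical "hotels" index with a "country" field:

// Reference the synonym map from a searchable field when defining the index
// (the "hotels" index and its fields are illustrative)
SearchIndex index = new SearchIndex("hotels")
{
    Fields =
    {
        new SimpleField("hotelId", SearchFieldDataType.String) { IsKey = true },
        new SearchableField("country") { SynonymMapNames = { synonymMapName } }
    }
};

await indexClient.CreateOrUpdateIndexAsync(index);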
The SDKs for Java, Python, and JavaScript also support creating synonym maps. The Java SDK accepts a string rather than a file stream, so you'd have to read the file contents yourself. Unfortunately, the Python and JavaScript SDKs seem to require a list of strings (one for each line of the file), which is something we should improve. I'm following up with the Azure SDK team on these improvements.
We are looking to write a C# function that can return the connection string for an EventHub in our application.
We understand that there is a PowerShell Get-AzEventHubNamespaceKey that can do it, but looking for a C# equivalent.
Any suggestions welcome.
MrHappyhead
To help anybody else out there looking for something like this, it is possible to use the Azure.Management.* namespaces.
For example for an EventHub:
EventHubManagementClient eventHubClient = new EventHubManagementClient(tokenCredentials);
You can then use
AccessKeys accessKeys = await eventHubClient.Namespaces.ListKeysAsync(resourceGroup, eventHubNamespace, authorisationRuleName);
It is better to use Namespaces rather than EventHubs, because the former provides connection strings when you are using Geo DR, e.g. you can obtain the primary/secondary alias connection strings.
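For completeness, a minimal sketch of pulling a connection string this way (assuming Microsoft.Azure.Management.EventHub; the subscription, resource group, namespace, and rule names are placeholders):

// tokenCredentials: ServiceClientCredentials obtained via your preferred auth flow
EventHubManagementClient eventHubClient = new EventHubManagementClient(tokenCredentials)
{
    SubscriptionId = "<subscription-id>"
};

// List the keys of a namespace-level authorization rule
AccessKeys accessKeys = await eventHubClient.Namespaces.ListKeysAsync(
    "my-resource-group", "my-namespace", "RootManageSharedAccessKey");

// The connection strings are exposed directly on the result
string primaryConnectionString = accessKeys.PrimaryConnectionString;
string aliasPrimaryConnectionString = accessKeys.AliasPrimaryConnectionString; // populated when Geo DR aliasing is configured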
I am trying to get files from a Google Cloud Storage bucket. The file names are something like 20180618_1400/SOMEID_20180618.jpg, 20180618_1200/SOMEID_20180618.jpg, 20180617_1400/SOMEOTHERID_20180617.jpg, etc.
I want to get files based on SOMEID.
I tried using the following code with a regular expression:
bucket.getFiles({
  prefix: new RegExp(`[0-9_]*\/SOMEID_`),
}, (err, files) => {
  if (err) return reject(err);
  resolve(files);
});
The expected result is files 20180618_1400/SOMEID_20180618.jpg and 20180618_1200/SOMEID_20180618.jpg. But the code returns all the files in the bucket.
I searched on the internet but couldn't find anything.
Is there any other way to achieve this?
The prefix has to be a string; it is a prefix, not a regex. I checked the documentation to be sure, and, as expected, a regex is not possible.
The correct way to do this in GCS is to structure your bucket so that a plain string prefix is usable, for example a directory for profile pictures, another for PDFs, and so on, with all files named after your user ID.
Example:
profiles/1245.jpg
profiles/7561.jpg
billing/1245-2018-10.pdf
billing/1245-2018-09.pdf
billing/7561-2018-10.pdf
...
If you cannot, you will have to get all items and then apply your regex to them client-side. There is an example at the end of the getFiles() documentation.
I think (it's been a while) you can use a regex with gsutil, but gsutil also fetches all files and then applies the regex on the client side, so it wouldn't be a better solution.
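For illustration, a minimal sketch of the list-then-filter approach; this uses the .NET client (Google.Cloud.Storage.V1) rather than the Node.js library from the question, and the bucket name is a placeholder, but the idea is the same with getFiles():

using System;
using System.Linq;
using System.Text.RegularExpressions;
using Google.Cloud.Storage.V1;

// List every object in the bucket, then keep only the names matching the pattern
var client = StorageClient.Create();
var pattern = new Regex("^[0-9_]+/SOMEID_");

var matching = client.ListObjects("my-bucket", prefix: null)
    .Where(obj => pattern.IsMatch(obj.Name))
    .Select(obj => obj.Name)
    .ToList();

foreach (var name in matching)
    Console.WriteLine(name);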
I'm making a program that stores data from CSV files in Azure Tables and reads it back. The CSV files can have a varying number of columns and between 3k and 50k rows. What I need to do is upload that data into an Azure table. So far I have managed to both upload data and retrieve it.
I'm using the REST API, and for uploading I'm creating an XML batch request with 100 rows per request. That works fine, except it takes a bit too long to upload, e.g. for 3k rows it takes around 30 seconds. Is there any way to speed that up? I noticed that most of the time is spent processing the response (the ReadToEnd() call). I read somewhere that setting the proxy to null could help, but it doesn't do much in my case.
I also read somewhere that it is possible to upload the whole XML request to a blob and then execute it from there, but I couldn't find any example of doing that.
using (Stream requestStream = request.GetRequestStream())
{
    requestStream.Write(content, 0, content.Length);
}

using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    Stream dataStream = response.GetResponseStream();
    using (var reader = new StreamReader(dataStream))
    {
        String responseFromServer = reader.ReadToEnd();
    }
}
As for retrieving data from Azure Tables, I managed to get 1000 entities per request. That takes me around 9 s for a CSV with 3k rows. Again, most of the time is spent reading from the stream, when I'm calling this part of the code (again ReadToEnd()):
response = request.GetResponse() as HttpWebResponse;
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    string result = reader.ReadToEnd();
}
Any tips?
Since you mentioned you are using the REST API, you would have to write extra code and rely on your own methods to implement performance improvements, unlike when using the client library. In your case, using the Storage client library would be best, as you can use already-built features to expedite insert, upsert, etc., as described here.
If you were using the Storage Client Library and ADO.NET, you could use the article below, written by the Windows Azure Table team, as a supported way to improve Azure Table access performance:
.NET and ADO.NET Data Service Performance Tips for Windows Azure Tables
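For illustration, a minimal sketch of a batched insert with the table client library (this assumes the Microsoft.Azure.Cosmos.Table package, a connectionString variable, and a hypothetical RowEntity type derived from TableEntity; all entities in one batch must share the same partition key, and a batch holds at most 100 operations):

using Microsoft.Azure.Cosmos.Table;

// Connect to the destination table
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudTable table = account.CreateCloudTableClient().GetTableReference("csvrows");
await table.CreateIfNotExistsAsync();

// Insert parsed CSV rows in batches of up to 100 entities
var batch = new TableBatchOperation();
foreach (RowEntity entity in entities) // entities: hypothetical collection of parsed rows
{
    batch.Insert(entity);
    if (batch.Count == 100)
    {
        await table.ExecuteBatchAsync(batch);
        batch = new TableBatchOperation();
    }
}
if (batch.Count > 0)
{
    await table.ExecuteBatchAsync(batch);
}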
I cannot search the Twitter API for tweets which contain one of multiple hashtags.
Like: q="#tag1 OR #tag2 OR #tag3"
If I leave out the hashes and only search for words, the OR-ing works; for tags it doesn't.
When I only use spaces, the search terms are AND-ed, which shrinks the result...
I use the twitter4j library with:
Twitter rest = new TwitterFactory().getInstance();
Query query = new Query();
query.setQuery("#win | #fail");
QueryResult result = rest.search(query);
Isn't it possible, or didn't I use it correctly?
It might just be easier to use Twitter's REST API. You'll want to use the search query. Here's an example search URL searching for #LA, #NYC or #Boston. Note the spaces and #s are all URL-encoded. Just pop a URL like that into a getJSON call like the one below, and you can easily extract your values from the returned JSON object as in the example.
var requestedData = "http://search.twitter.com/search.json?q=%23LA%20OR%20%23NYC%20OR%20%23Boston%22&callback=?";
$.getJSON(requestedData, function(ob)
{
    var firstTweet = ob.results[0].text;
    var firstTweeter = ob.results[0].from_user;
});
From there it's just a matter of looping through your results and pulling the appropriate fields, which are all outlined in the JSON if you simply visit that example search link in your browser. I don't know this TwitterFactory API, but it's possible they haven't updated to Twitter's new API, or they're just not URL-encoding appropriately. Good luck!
Try using the OR operator instead of "|":
query.setQuery("#win OR #fail");
See available Twitter search API operators here:
Using the Twitter Search API
I am trying to upload a file with metadata using SharePoint Web Services. The first approach I took was to use the WebRequest/WebResponse objects and then update the metadata using the Lists.asmx UpdateListItems method. This works just fine, but it creates two versions of the file. The second approach I took was to use the Copy.asmx web service and its CopyIntoItems method, which copies the file data along with the metadata. This works fine and creates v 1.0, but when I try to upload the same file with some changes in the metadata (using Copy.asmx), it does not update anything. Has anybody come across the same issue, or does anyone have other ideas for implementing the required functionality?
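For reference, a sketch of the CopyIntoItems call described above (the destination URL and field values are illustrative; the Copy proxy class comes from a web reference to Copy.asmx):

// Upload the file bytes together with a metadata field in a single call
byte[] fileBytes = File.ReadAllBytes(@"C:\docs\report.docx");
string destinationUrl = "http://server/sites/team/Shared Documents/report.docx";

FieldInformation titleField = new FieldInformation
{
    DisplayName = "Title",
    InternalName = "Title",
    Type = FieldType.Text,
    Value = "Quarterly report"
};

Copy copyService = new Copy();              // proxy generated from Copy.asmx
copyService.UseDefaultCredentials = true;

CopyResult[] results;
copyService.CopyIntoItems(
    destinationUrl,                          // SourceUrl
    new[] { destinationUrl },                // DestinationUrls
    new[] { titleField },
    fileBytes,
    out results);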
Thanks,
Kiran
This might be a bit off topic (sorry), but I'd like to point you to a real time-saving shortcut when working with SharePoint remotely: http://www.bendsoft.com/net-sharepoint-connector/
It enables you to work with SharePoint lists and document libraries with SQL and stored procedures.
Uploading a file as a byte array
...
string sql = "CALL UPLOAD('Shared Documents', 'Images/Logos/mylogo.png', #doc)";
byte[] data = System.IO.File.ReadAllBytes("C:\\mylogo.png");
SharePointCommand cmd = new SharePointCommand(sql, myOpenConnection);
cmd.Parameters.Add("#doc", data);
cmd.ExecuteNonQuery();
...
Uploading from a stream
using (var fs = System.IO.File.OpenRead("c:\\150Mb.bin")) {
    string sql = "CALL UPLOAD('Shared Documents', '150Mb.bin', #doc)";
    SharePointCommand cmd = new SharePointCommand(sql, myOpenConnection);
    cmd.Parameters.Add("#doc", fs);
    cmd.ExecuteNonQuery();
}
There are quite a few methods to simplify remote document management:
UPLOAD(listname, filename, data)
DOWNLOAD(listname, filename)
MOVE(listname1, filename1, listname2, filename2)
COPY(listname1, filename1, listname2, filename2)
RENAME(listname, filename1, filename2)
DELETE(listname, filename)
CREATEFOLDER(listname, foldername)
CHECKOUT(list, file, offline, lastmodified)
CHECKIN(list, file, comment, type)
UNDOCHECKOUT(list, file)
Cheers