Sybase ASEBulkCopy is not working - sap-ase

Sybase ASEBulkCopy is not working.
I have set the EnableBulkLoad attribute to 1 in the connection string.
It is uploading one record at a time even after I set the batch size to 500.
What other settings am I missing?
Please someone help me with this.
Thanks in advance.

Whether bulk load actually happens depends on other things as well, such as the presence of indexes on the target table. By enabling bulk load you're basically telling the ASE server that it should try to do bulk uploading if it can; if it cannot, it falls back to regular, non-bulk inserts.
I'm not sure I understand the details of your question, though. What do you mean by "upload"? Does your client app send only one record to the ASE server at a time?
Or do you mean that ASE performs regular inserts instead of bulk inserts? If the latter, how did you diagnose that?
I recommend trying it first with the 'bcp' client utility to figure out if bulk loading is possible to start with.
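On the client side, here is a minimal sketch of where EnableBulkLoad and the batch size fit, assuming the SAP ASE ADO.NET provider (Sybase.Data.AseClient), whose AseBulkCopy mirrors SqlBulkCopy; the connection-string values and table name are placeholders:

using System.Data;
using Sybase.Data.AseClient;

class BulkLoadSketch
{
    static void Main()
    {
        // EnableBulkLoad=1 only *requests* bulk loading; the server may
        // still fall back to row-by-row inserts (e.g. because of indexes).
        var connStr = "Data Source=myhost;Port=5000;Database=mydb;" +
                      "Uid=user;Pwd=secret;EnableBulkLoad=1";

        using (var conn = new AseConnection(connStr))
        {
            conn.Open();
            var bulk = new AseBulkCopy(conn)
            {
                DestinationTableName = "target_table", // placeholder
                BatchSize = 500                        // rows per batch
            };
            bulk.WriteToServer(LoadRows());
        }
    }

    static DataTable LoadRows()
    {
        // Hypothetical helper: fill a DataTable whose columns match target_table.
        return new DataTable();
    }
}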

Related

How can I apply a ProseMirror (tiptap) transaction to data persisted on the server?

I'm using TipTap to build a document editor. On the server side (node), I've got a postgres db storing tiptap's output as json. My issue is that I'm expecting the content of the editor to get big, and so posting the entire content to the server on each save isn't going to work.
What I'd like to do is take the transactions from tiptap and send them to the server, load the persisted version, apply the transactions, and then persist the result. Does anyone know how I would go about doing this?
It seems that tiptap depends on there being a DOM, so I'm not sure it's possible to load it in Node. And even if it were, I'm unclear on how I could apply those transactions, although ProseMirror does have the apply method and the prosemirror-transform package, which seem promising.
It seems like people would have run into this issue before. Any thoughts?
Thanks!

How to avoid startup latency for Azure Table Storage/Cosmos DB Table API on .Net Core

Microsoft documentation here suggests using await client.OpenAsync(); to avoid startup latency to Cosmos DB. This seems to apply only to the SQL API. I am trying to use the Table API and could not manage to do the same. My first request executes in 1500 ms and subsequent ones take only 40 ms, so this would be a very nice improvement.
I have tried both Microsoft.Azure.Cosmos.Table and Microsoft.WindowsAzure.Storage to connect, but did not find any way of doing this. The only thing I can think of is instead issuing a "dummy" request that is guaranteed to return nothing, to achieve the same goal.
Is there any better way to initialize the connection?
A simple solution would be to query anything that you know exists.
Any call using the client will initialise the connection and perform the (approximately) 8 requests that Cosmos DB needs.
Reading the database account would be the simplest way to achieve this.
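A minimal warm-up sketch along those lines, assuming Microsoft.Azure.Cosmos.Table and a table name that is known to exist (any similarly cheap call would do):

using System.Threading.Tasks;
using Microsoft.Azure.Cosmos.Table;

class WarmUp
{
    static async Task Main()
    {
        var account = CloudStorageAccount.Parse("<connection-string>"); // placeholder
        CloudTableClient client = account.CreateCloudTableClient();
        CloudTable table = client.GetTableReference("MyTable"); // a table known to exist

        // Fire one cheap request at startup so the first real query
        // doesn't pay the ~1500 ms connection-establishment cost.
        await table.ExistsAsync();
    }
}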

Import data from Clio to Azure database using API v4

Let me start out by saying I am a SQL Server database expert, not a coder, so making API calls is certainly not an everyday task for me.
Having said that, I am trying to use Azure Data Factory's data copy tool to import data from Clio into an Azure SQL Server database. I have had some limited success: data is copied over using the API and inserted into the target table, but paging really seems to be an issue. I am testing this with the billable_clients call, and the first 25 records with the fields I specify are inserted, along with the paging record. As I understand it, the billable_clients call is eligible for bulk actions, which may be the solution, although I've not been able to figure out how that works. The URL I am calling is below:
https://app.clio.com/api/v4/billable_clients.json?fields=id,unbilled_hours,name
Using Postman, I've tried to make the same call while adding X-BULK: true to the headers, but that returns no results. If anyone can shed some light on how the X-BULK header flag is used when making a call, or has any experience loading Clio data into a SQL Server database, I'd love some feedback on your methods.
If any additional information regarding my attempts or setup would help please let me know.
Thanks!
You need to download the JSON files with the Bulk API and then load them into the database yourself.
It isn't possible to insert the data directly.
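The overall flow would look something like the sketch below. This is a hedged outline, not a verified client: the polling step and the shape of the bulk-action responses are assumptions to check against Clio's bulk-action docs, and the helpers, table name, and connection strings are hypothetical placeholders.

using System;
using System.Data;
using System.Data.SqlClient;
using System.Net.Http;
using System.Threading.Tasks;

class ClioBulkImport
{
    static async Task Main()
    {
        var http = new HttpClient();
        http.DefaultRequestHeaders.Add("Authorization", "Bearer <token>"); // placeholder
        http.DefaultRequestHeaders.Add("X-BULK", "true"); // ask for a bulk action

        // 1. Kick off the bulk action; this returns a status resource, not the data.
        string kickoff = await http.GetStringAsync(
            "https://app.clio.com/api/v4/billable_clients.json?fields=id,unbilled_hours,name");

        // 2. Poll the status URL from the kickoff response until the export is
        //    ready, then download the generated JSON file. (Hypothetical helper;
        //    the status payload's field names are an assumption.)
        string fileUrl = await PollUntilComplete(http, kickoff);
        string json = await http.GetStringAsync(fileUrl);

        // 3. Flatten the JSON into a DataTable and bulk insert into SQL Server.
        DataTable rows = ToDataTable(json); // hypothetical helper
        using (var bulk = new SqlBulkCopy("<sql-connection-string>")) // placeholder
        {
            bulk.DestinationTableName = "dbo.BillableClients"; // placeholder
            bulk.WriteToServer(rows);
        }
    }

    static Task<string> PollUntilComplete(HttpClient http, string kickoffJson)
        => throw new NotImplementedException();

    static DataTable ToDataTable(string json)
        => throw new NotImplementedException();
}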

Syncing Problems with Xamarin Forms and Azure Easy Tables

I've been working on a Xamarin.Forms application in Visual Studio using Azure for the backend for a while now, and I've come across a really strange issue.
Please note that I am following the methods mentioned in this blog.
For some strange reason, the PullAsync() method seems to have some bizarre problems. Any data that I create and sync will only be pulled by PullAsync() from that solution. What I mean is that if I create another solution that accesses the exact same backend, it can create and sync its own data, but it will not bring over the data generated by the other solution, even though both seem to have the exact same access. This appears to be some kind of security feature/issue, but I can't quite make sense of it.
Has anyone else encountered this? Was there a work-around? This could potentially cause problems down the road if I ever want to create another solution that accesses the same system/data for whatever reason.
For some strange reason the PullAsync() method seems to have some bizarre problems. Any data that I create and sync will only be pulled by PullAsync() from that solution.
According to the tutorial you provided, I found that the related PullAsync call uses incremental sync.
await coffeeTable.PullAsync("allCoffees", coffeeTable.CreateQuery());
Incremental Sync:
The first parameter to the pull operation is a query name that is used only on the client. If you use a non-null query name, the Azure Mobile SDK performs an incremental sync. Each time a pull operation returns a set of results, the latest updatedAt timestamp from that result set is stored in the SDK local system tables. Subsequent pull operations retrieve only records after that timestamp.
Here is my test; you can refer to it for a better understanding of incremental sync:
Client : await todoTable.PullAsync("todoItems-02", todoTable.CreateQuery());
The client SDK checks whether there is a record whose id equals deltaToken|{table-name}|{query-id} in the __config table of your SQLite local store.
If there is no such record, the SDK sends a request like the following to pull your records:
https://{your-mobileapp-name}.azurewebsites.net/tables/TodoItem?$filter=(updatedAt%20ge%20datetimeoffset'1970-01-01T00%3A00%3A00.0000000%2B00%3A00')&$orderby=updatedAt&$skip=0&$top=50&__includeDeleted=true
Note: the $filter would be set as (updatedAt ge datetimeoffset'1970-01-01T00:00:00.0000000+00:00')
If there is such a record, the SDK picks up its value as the latest updatedAt timestamp and sends a request like this:
https://{your-mobileapp-name}.azurewebsites.net/tables/TodoItem?$filter=(updatedAt%20ge%20datetimeoffset'2017-06-26T02%3A44%3A25.3940000%2B00%3A00')&$orderby=updatedAt&$skip=0&$top=50&__includeDeleted=true
Per my understanding, if you run the same logical query with the same (non-null) query ID in different mobile clients, you need to make sure the local database is newly created by each client. Also, if you want to opt out of incremental sync, pass null as the query ID; in that case, all records are retrieved on every call to PullAsync, which is potentially inefficient. For more details, you could refer to How offline synchronization works.
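For instance, with todoTable being the IMobileServiceSyncTable<TodoItem> from the tutorial, the two modes look like this:

// Incremental sync: the non-null query ID ("todoItems") keys a stored
// deltaToken, so only records updated since the last pull are fetched.
await todoTable.PullAsync("todoItems", todoTable.CreateQuery());

// Full sync: a null query ID skips the deltaToken, so every record is
// re-fetched on each call (simpler, but potentially inefficient).
await todoTable.PullAsync(null, todoTable.CreateQuery());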
Additionally, you could leverage Fiddler to capture the network traces when you invoke PullAsync, in order to troubleshoot your issue.

How do you know new data has been added in your MongoDB

I have a Node.js server running which shows data on a web interface. The data is fetched from MongoDB using mongoose. The data is added via a Node-RED application which is isolated from the rest.
Currently my Node.js server fetches the data every 5 seconds. Is there a way to know if the data in my MongoDB has changed?
Thanks, I hope my question is clear.
I was also looking for something similar a few months back. A few ways I know of to do it are:
1) You can use middleware (hooks) when inserting your documents into the DB. You can then send out the new data either after saving it in the DB or at insertion time only.
2) Refer to this answer, which talks about solving your problem using built-in MongoDB features; one such mechanism, change streams, is sketched after this list. You can study them in depth in the MongoDB docs.
3) Another way is to listen for changes in the log files. Everything done in Mongo is recorded and logged to files, so whenever some data changes you can learn about it from there as well. You will have to do the digging yourself with this method.
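A minimal sketch of the change-stream approach, using the official MongoDB .NET driver (change streams need MongoDB 3.6+ and a replica set; mongoose exposes the same idea via Model.watch() on the Node side; the URI, database, and collection names are placeholders):

using System;
using MongoDB.Bson;
using MongoDB.Driver;

class ChangeStreamSketch
{
    static void Main()
    {
        var client = new MongoClient("mongodb://localhost:27017"); // placeholder URI
        var collection = client.GetDatabase("mydb")                // placeholder names
                               .GetCollection<BsonDocument>("readings");

        // Blocks and yields one notification per insert/update/delete, so the
        // web server can push updates instead of polling every 5 seconds.
        using (var cursor = collection.Watch())
        {
            foreach (var change in cursor.ToEnumerable())
            {
                Console.WriteLine($"{change.OperationType}: {change.FullDocument}");
            }
        }
    }
}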
Hope it helps!
