I have seen the video of paging using subsonic
http://subsonicproject.com/querying/webcast-using-paging/
I am calling a stored procedure to retrieve the data. Is it possible for me to use paging? Please give me a few samples.
Thanks
You'll need to add paging to your stored procedure. Methods to accomplish this vary based on which database you're using. I would capture the SQL generated by a paged query, copy that SQL into a new stored procedure, add parameters for page number and page size, then call the proxy method SubSonic generates for your newly created stored procedure.
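As a rough sketch of what that captured paging SQL typically looks like on SQL Server 2012+ (the OFFSET/FETCH pattern), here is a small JavaScript helper that builds the statement from a page number and page size; the table and column names (dbo.Products, ProductID) are hypothetical.

```javascript
// Sketch of the OFFSET/FETCH paging pattern (SQL Server 2012+).
// Table and column names (dbo.Products, ProductID) are hypothetical.
function buildPagedSql(pageNumber, pageSize) {
  if (pageNumber < 1 || pageSize < 1) {
    throw new Error('pageNumber and pageSize must be >= 1');
  }
  // Page 1 starts at offset 0; page N skips (N - 1) * pageSize rows.
  const offset = (pageNumber - 1) * pageSize;
  return (
    'SELECT * FROM dbo.Products ' +
    'ORDER BY ProductID ' + // OFFSET requires a deterministic ORDER BY
    `OFFSET ${offset} ROWS FETCH NEXT ${pageSize} ROWS ONLY`
  );
}

// Example: page 3 with 20 rows per page skips the first 40 rows.
console.log(buildPagedSql(3, 20));
```

In a stored procedure you would take the same two values as parameters and use them directly in the OFFSET/FETCH clause.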
We use node-mssql and we're trying to send an array of data to a stored procedure.
Unlike TVPs, which seem a little complex, we found the bulk method very interesting, but all the examples we found create a new table instead of pushing data to a stored procedure.
Is there a way to use it to get the bulk-loaded data into a stored procedure?
Our SQL Server version is 2012. Really appreciate any help in advance.
If table-valued parameters seem too complex to pass from node.js, you can instead bulk-load the data into a staging table and then process that staging table inside the stored procedure.
Follow below steps:
Bulk-load the data into the staging table
Process the staging table inside the stored procedure
Clean up the staging table inside the stored procedure once you are done with the processing
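The steps above can be sketched with node-mssql's bulk API. This assumes the mssql package is installed; the staging table dbo.StagingOrders, its columns, and the procedure dbo.usp_ProcessStagingOrders are hypothetical names for illustration.

```javascript
// Sketch of the staging-table approach using node-mssql's bulk API.
// Table, column, and procedure names are hypothetical.

// Pure helper: flatten an array of objects into row arrays in column order.
function toRows(records, columns) {
  return records.map(r => columns.map(c => r[c]));
}

async function bulkLoadAndProcess(pool, records) {
  const sql = require('mssql'); // assumes the mssql package is installed
  const table = new sql.Table('dbo.StagingOrders');
  table.create = false; // the staging table already exists
  table.columns.add('OrderId', sql.Int, { nullable: false });
  table.columns.add('Amount', sql.Decimal(18, 2), { nullable: true });
  for (const row of toRows(records, ['OrderId', 'Amount'])) {
    table.rows.add(...row);
  }
  await pool.request().bulk(table);                             // step 1: bulk load
  await pool.request().execute('dbo.usp_ProcessStagingOrders'); // steps 2-3: process + clean up
}
```

The stored procedure then reads from the staging table and truncates or deletes its rows when done, so each poll starts from a clean slate.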
Is it possible to do multiple operations in a single stored procedure in Cosmos DB with bounded execution?
I have to perform the operations below in a single stored procedure:
A new record to be inserted
Few records to be deleted
Update Operation to be performed
How can data consistency be maintained with transaction support in this case?
Cosmos DB stored procedures work transactionally, within a single partition of a single collection. So, as long as your inserts, deletes, and updates (replacements, to be more accurate) are all within a single partition, they would all be handled transactionally within a single stored procedure call.
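A minimal sketch of such a server-side stored procedure follows, assuming the caller passes an array of documents to insert, an array of document links to delete, and an array of link/document pairs to replace (this parameter shape is an assumption, not part of the Cosmos API). If any request is not accepted or any callback reports an error, throwing aborts the script and rolls back the whole transaction.

```javascript
// Sketch of a Cosmos DB server-side stored procedure that inserts, deletes,
// and replaces documents in one transactional call. All documents must share
// the partition key value the procedure is invoked with.
function bulkMutate(toInsert, toDelete, toReplace) {
  var context = getContext();
  var collection = context.getCollection();
  var link = collection.getSelfLink();
  var done = 0;
  var total = (toInsert || []).length + (toDelete || []).length +
              (toReplace || []).length;

  function check(isAccepted) {
    // A rejected request means the script is near its resource limits;
    // throwing rolls back everything done so far.
    if (!isAccepted) throw new Error('Operation not accepted; transaction aborted');
  }
  function onComplete(err) {
    if (err) throw err; // abort and roll back on any failure
    done++;
    if (done === total) context.getResponse().setBody({ applied: done });
  }

  (toInsert || []).forEach(function (doc) {
    check(collection.createDocument(link, doc, onComplete));
  });
  (toDelete || []).forEach(function (docLink) {
    check(collection.deleteDocument(docLink, {}, onComplete));
  });
  (toReplace || []).forEach(function (item) {
    check(collection.replaceDocument(item.link, item.doc, onComplete));
  });
}
```

A production version would also carry a continuation token so the caller can resume if the script hits its execution bounds before finishing.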
Hey Ramakrishna Reddy,
As David mentioned, transactions can only be achieved within a partition of a collection. See the documentation here: https://learn.microsoft.com/en-us/azure/cosmos-db/database-transactions-optimistic-concurrency. I have seen a case where multiple collections were merged into one collection to make transactions possible; you might need to do the same.
There are examples of how to create stored procedures that achieve transactions here: https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/cosmos-db/how-to-write-stored-procedures-triggers-udfs.md
In your particular situation, you will probably need to write a transaction that takes an array of items to upsert, and an array of items to delete. You can find an example of a deletion transaction here: https://github.com/Azure/azure-cosmos-dotnet-v2/blob/master/samples/clientside-transactions/DocDBClientBulk/DocDBClientBulk/bulkDelete.js
Alternatively, you can use the transactional batch support that Cosmos DB now offers in the .NET SDK. In addition to transaction support, you can see other long-awaited updates in this blog: https://devblogs.microsoft.com/cosmosdb/whats-new-in-azure-cosmos-db-nov-2019/. However, I am unclear whether it supports the deletion you are seeking; I haven't had a chance to play with it. Maybe you can share when you figure it out!
I have an Excel template which reads data from a source Excel file using VLOOKUP and INDEX/MATCH functions. Is there a way to prevent the end user from accessing the source data file/sheet? e.g. by storing the source file in a remote location and making the VLOOKUPs read from there.
Depending on what resources are available to you, it may be difficult to prevent users from just going around the restrictions you put in place. Even if the data is in a database table you will need measures in place to prevent users from querying it outside of your Excel template. I don't know your situation, but ideally there would be someone (i.e. database administrator, infosec, back-end developer) who could help engineer a proper solution.
Having said that, I do believe your idea of using MS SQL Server could be a good way to go. You could create stored procedures instead of using SQL queries to limit access. See this link for more details:
Managing Permissions with Stored Procedures in SQL Server
In addition, I would be worried about users figuring out other user IDs and arbitrarily accessing data. You could implement some sort of protection by having a mapping table so that there's no way to access information with user IDs. The table would be as follows:
Columns: randomKey, userId, creationDateTime
randomKey is just an x-digit random number/letter sequence
creationDateTime is a timestamp used for timeout purposes
Whenever someone needs a user id you would run a stored procedure that adds a record to the mapping table. You input the user id, the procedure creates a record and returns the key. You provide the user with the key which they enter in your template. A separate stored procedure takes the key and resolves to the user id (using the mapping table) and returns the requested information. These keys expire. Either they can be single use (the procedure deletes the record from the mapping table) or use a timeout (if creationDateTime is more than x hours/days old it will not return data).
For the keys, Mark Ransom shared an interesting solution for creating random IDs for which you could base your logic:
Generate 6 Digit unique number
Sounds like a lot of work, but if there is sensitivity around your data it's worth building a more robust process around it. There's probably a better way to approach this, but I hope it at least gives you food for thought.
No, it's not possible.
Moreover, you absolutely NEED these files open to refresh the values of formulas that refer to them. When you open a file with external references, their values are calculated from a local cache (which may not match the actual remote file contents). When you open the remote files, the values will refresh.
The JDBC inbound channel adapter relies on the update query to mark already-processed records; that's how it retrieves only unprocessed records in subsequent polls. This makes sense, but I am working with a table that doesn't have a column I can modify to indicate that a record has been processed.
I was wondering if I can use a stored procedure which returns a cursor, and somehow that will help me avoid loading all of the, let's say, million records into memory while still processing, say, 1000 every poll.
Edit: I am working with Oracle
Yes, you can use a stored procedure for this. For this purpose Spring Integration provides the <int-jdbc:stored-proc-inbound-channel-adapter> component.
Here you can find the sample.
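A minimal configuration sketch follows, assuming a procedure named get_unprocessed_rows that returns a result set (on Oracle, typically via a ref cursor); the channel, bean names, and poll interval here are assumptions.

```xml
<!-- Sketch: poll a stored procedure instead of a SELECT + UPDATE pair.
     Procedure name, channel, row mapper bean, and poller settings are
     hypothetical. -->
<int-jdbc:stored-proc-inbound-channel-adapter
        id="storedProcInboundAdapter"
        channel="fromDbChannel"
        data-source="dataSource"
        stored-procedure-name="get_unprocessed_rows"
        expect-single-result="false">
    <int:poller fixed-rate="5000" max-messages-per-poll="1"/>
    <!-- Map the returned cursor's rows to message payload objects -->
    <int-jdbc:returning-resultset name="out" row-mapper="rowMapper"/>
</int-jdbc:stored-proc-inbound-channel-adapter>
```

The procedure itself can fetch a bounded batch (e.g. 1000 rows) per invocation, so the adapter never pulls the full table into memory at once.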
I am using a big stored procedure which uses many linked-server queries. If I run this stored procedure manually it runs fine, but if I call it from an exe using multi-threading, it raises "Cannot get the data of the row from the OLE DB provider "SQLNCLI11" for linked server "linkedserver1"" and "Row handle referred to a deleted row or a row marked for deletion" on each execution. Performance of the stored procedure is also very slow compared to the same stored procedure without linked-server queries. Please give me some tips to improve the performance of the stored procedure and fix the issue mentioned above.
Thanks
If you are querying over linked servers, you will see a decrease in performance. Could it be that the concurrent procedure calls are touching the same rows, therefore giving you those exceptions? If so, you might be looking at dirty reads. Is that acceptable for your result set?
From the looks of it, you seem to need to call the procedures sequentially rather than in parallel. What you can do is cache the data on one server and sync the updates, etc., in batches.