So, I have this problem:
fun doStuff(newUser: User): ReturnType {
    // --- little bit of things here ---
    // --- some AWS things ---
    // --- some MongoDB insertions ---
    // --- some Kafka things ---
    return newUser
}
Here is the question: I need to make this an async function of sorts. The AWS, MongoDB and Kafka work should execute while I return the user, because I don't need the user info for any of these tasks. Is there a way I can return the user while the AWS calls, DB insertions, etc. are still in progress?
I've tried taking a look at coroutines, but I have no idea how to make this work.
Thanks for your help!
You can try the following flow:
Create a parent function. Let's call it A
fun A(): User {
    // scope is an application-level CoroutineScope; launch runs each task
    // in the background while the user is created and returned immediately.
    scope.launch { asyncAWSFunction() }
    scope.launch { asyncMongoDBInsertions() }
    scope.launch { asyncKafkaThings() }
    val user = createUserFunction(xyz)
    return user
}
This will return the user while the additional work runs asynchronously.
The lambda's job is to check whether a query returns any results and alert subscribers via an SNS topic. If no rows are returned, all good, no action needed. This has to be done every 10 minutes.
For various reasons, I was told that we can't add any triggers to the database, and that no on-prem environment is suitable to host a cron job.
Here comes Lambda.
This is what I have in the handler, inside a loop for each database.
sequelize.authenticate()
  .then(() => {
    for (let j = 0; j < database[i].rawQueries.length; j++) {
      sequelize.query(database[i].rawQueries[j])
        .then(results => {
          if (results[0].length > 0) {
            let message = "Temporary message for testing purposes" // + query results
            publishSns("Auto Query Alert", message)
          }
        })
        .catch(err => {
          publishSns("Auto Query SQL Error", `The following query could not be executed: ${database[i].rawQueries[j]}\n${err}`)
        })
    }
  })
  .catch(err => {
    publishSns("Auto Query DB Connection Error", `The following database could not be accessed: ${databases[i].database}\n${err}`)
  })
  .then(() => sequelize.close())
// sns publisher
function publishSns(subject, message) {
  const params = {
    Message: message,
    Subject: subject,
    TopicArn: process.env.SNStopic
  }
  return SNS.publish(params).promise()
}
I have 3 separate database configurations, and for those few SELECT queries, I thought I could just loop through the connection instances inside a single lambda.
The process is asynchronous and takes 9 to 12 seconds per invocation, which I assume is far from optimal. The whole thing feels very suboptimal, but that's my current level :)
To make things worse, I have now read that Lambda and sequelize don't really play well together.
I am using sequelize because that's the only way I could get 3 connections to the database working in the same invocation without issues. I tried the mssql and tedious packages and couldn't manage it with either of them.
It now feels like using an ORM is overkill for this very simple task of a SELECT query, and I would really like to at least have the connections and their queries run asynchronously to save some execution time.
I am looking into different ways to accomplish this, and I went down the rabbit hole and now have more questions than before! Generators? Are they still useful? Observables with RxJS? Could they apply here? Async/await or just Promises? Do I even need sequelize?
Any guidance/opinion/criticism would be very appreciated
I'm not familiar with sequelize.js, but I hope I can help. I don't know your level with RxJS and Observables, but it's worth a try.
I think you could definitely use Observables and RxJS.
I would start with an interval() that runs the code at whatever period you define.
You can then pipe the interval, since it's an Observable: do the auth bit, then a map() to get an array of Observables (one for each .query call; I'm assuming all your calls, authenticate and query, return Promises, so you can turn them into Observables with from()). You can then use something like forkJoin() on that array to get a response after all calls are done.
In the .subscribe at the end, you would make the publishSns().
You can pipe a catchError() too and process errors.
The map() part might not be necessary; you could build the array beforehand and store it in a variable, since it doesn't depend on the authenticate() value.
I'm certain my solution isn't the only one or the best, but I think it would work.
Hope it helps, and let me know if it works!
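Since authenticate() and query() are Promise-based anyway, the same fan-out can also be sketched with plain Promise.all. In this sketch, `authenticate`, `runQuery` and `connectAndQuery` are stand-ins I made up to replace the sequelize calls, so the flow is self-contained:

```javascript
// Stubs standing in for sequelize.authenticate() and sequelize.query();
// in the real lambda these would be the actual sequelize calls.
const authenticate = () => Promise.resolve();
const runQuery = (sql) => Promise.resolve([[`row for ${sql}`]]);

async function connectAndQuery(rawQueries) {
  await authenticate();
  // Fire all queries for this connection concurrently and wait for all of them.
  const results = await Promise.all(rawQueries.map(q => runQuery(q)));
  // First element of each result = rows, mirroring sequelize's raw-query shape.
  return results.map(r => r[0]);
}

// The three database configurations could be processed the same way:
// await Promise.all(databases.map(db => connectAndQuery(db.rawQueries)));
```

This avoids waiting for each query sequentially, which is where most of the 9 to 12 seconds likely goes.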
I've been looking all over, but I can't seem to find out why I'm getting nothing in the file when exiting.
For context, I'm writing a discord bot. The bot stores its data once an hour. Sometime between stores I want to store the data in case I decide I want to update the bot. When I manually store the data with a command, then kill the process, things work fine. Now, I want to be able to just kill the process without having to manually send the command. So, I have a handler for SIGINT that stores the data the same way I was doing manually and after the promise is fulfilled, I exit. For some reason, the file contains nothing after the process ends. Here's the code (trimmed).
app.ts
function exit() {
  client.users.fetch(OWNER)
    .then(owner => owner.send('Rewards stored. Bot shutting down'))
    .then(() => process.exit());
}

process.once('SIGINT', () => {
  currencyService.storeRewards().then(exit);
});

process.once('exit', () => {
  currencyService.storeRewards().then(exit);
});
currency.service.ts
private guildCurrencies: Map<string, Map<string, number>> = new Map<string, Map<string, number>>();

storeRewards(): Promise<void[]> {
  const promises = new Array<Promise<void>>();
  this.guildCurrencies.forEach((memberCurrencies, guildId) => {
    promises.push(this.storageService.store(guildId, memberCurrencies));
  });
  return Promise.all(promises);
}
storage.service.ts
store(guild: string, currencies: Map<string, number>): Promise<void> {
  return writeFile(`${this.storageLocation}/${guild}.json`, JSON.stringify([...currencies]))
    .catch(err => {
      console.error('could not store currencies', err);
    });
}
So, as you can see, when SIGINT is received, I have the currency service store its data: it maps guild IDs to guild member currencies, which are maps of guild members to their rewards. It stores the data in different files (each guild gets its own file) using the storage service. The storage service returns the promise from writeFile (which should be a promise of undefined once the file has finished writing). The currency service accumulates all the promises and returns a promise that resolves when all of the store promises resolve. Then, after all of the promises are resolved, a message is sent to the bot owner (me), which returns a promise. After that promise resolves, we exit the process. It should be a clean exit with all the data written and the bot letting me know that it's shutting down, but when I read the file later, it's empty.
I've tried logging in all sorts of different places to make sure the steps are being done in the right order and I'm not getting weird async stuff, and everything seems to be proceeding as expected, but I'm still getting an empty file. I'm not sure what's going on, and I'd really appreciate some guidance.
EDIT: I remembered something else. As another debugging step, I tried reading the files after the currency service's storeRewards() promise resolved, and the contents of the files were valid (they contained valid data, but probably old data, as the data doesn't change often). So one of my thoughts was that the promise from writeFile resolves before the file is fully written, but that isn't indicated in the documentation.
EDIT 2: The answer was that I was writing twice. None of the code shown in the post or the first edit would have made it clear that I was having a double write issue, so I am adding the code causing the issue so that future readers can get the same conclusion.
Thanks to @leitning for their help finding the answer in the comments on my question. After writing a random UUID in the file name, I found the file was being written twice. I had assumed when asking the question that I had shared all the relevant info, but I had missed something: process.once('exit', ...) was being called after process.exit() (more details here). The callback for the exit event cannot handle asynchronous calls; when the callback returns, the process exits. Since I had duplicated the SIGINT callback's logic in the exit callback, the file was being written a second time, and the process exited before that second write could finish, resulting in an empty file. Removing the process.once('exit', ...) logic fixed the issue.
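For future readers, here is a minimal sketch of the corrected shutdown path. The discord.js client and the real currency service are replaced by stubs (the `stored` array and the simplified `exit` are mine, not from the bot); the point is that the async cleanup lives only in the SIGINT handler:

```javascript
// Stub standing in for currencyService.storeRewards(); the real one
// writes one JSON file per guild via fs/promises writeFile.
const stored = [];
const currencyService = {
  storeRewards() {
    stored.push('guild-data');
    return Promise.resolve();
  }
};

function exit() {
  // In the real bot: client.users.fetch(OWNER).then(o => o.send(...))
  process.exit(0);
}

// Keep the async cleanup in the SIGINT handler only. The 'exit' event
// fires after process.exit() and cannot wait on Promises, so repeating
// the same work there starts a second, truncated write.
process.once('SIGINT', () => {
  currencyService.storeRewards().then(exit);
});
```

The key design choice is that anything Promise-based must complete before process.exit() is called, never inside an 'exit' listener.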
This is a common pattern from my previous work, so I usually have a very complex use case. Take, for example:
async function doThis() {
  for (100x) {
    try {
      insertToDatabase()
      await selectAndManipulateData()
      createEmailWorker()
      /** and many more **/
    } catch {
      logToAFile()
    }
  }
}
The code works, but it's complicated: one function doing everything. The only reason I do this is that if one function fails, I can make sure the other functions won't run, so there won't be any incorrect data.
What I want to know is: what is the best architecture for structuring this project without sacrificing data integrity? (Or is it already good enough?)
const doThis = async () => {
  try {
    for (100x) {
      await insertToDatabase();
      await selectAndManipulateData();
      await createEmailWorker();
      /** and many more **/
    }
  } catch {
    await logToAFile();
  }
}
The best way of doing this is to always use await when calling these functions, and to use ES6 syntax, as it gives you a lot more features. Your function should always be async.
Always put your loop inside the try/catch: any error will land in the catch block, specific to the function that failed.
Actually, I would separate the persistence, manipulation and email jobs. Consider that storing your data is a single responsibility. In addition, your manipulation and email workers could run as scheduled jobs; once triggered, each job checks whether there is data related to its responsibility.
Another way is to replace these scheduled jobs with triggered jobs: you can build a chain of responsibility in which each job triggers the next, and each decides whether or not to do its work.
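A minimal sketch of that trigger chain (all names here are illustrative, not from the original post): each job does its own piece of work and then hands the result to the next job, which can inspect the data it receives and decide whether to act.

```javascript
// makeJob wires one responsibility to the next: do this job's work,
// then trigger the next job in the chain (if any) with the result.
function makeJob(work, next) {
  return (record) => {
    const result = work(record);
    // A real job would first check whether the incoming data needs action;
    // returning early here is how a job "decides not to work".
    return next ? next(result) : result;
  };
}

// Build the chain back-to-front: insert -> manipulate -> email.
const emailJob = makeJob(r => ({ ...r, emailed: true }));
const manipulateJob = makeJob(r => ({ ...r, manipulated: true }), emailJob);
const insertJob = makeJob(r => ({ ...r, inserted: true }), manipulateJob);
```

Calling insertJob({ id: 1 }) walks the whole chain; because a later job only runs when the earlier one completes and calls it, a failure upstream naturally stops the downstream work, which preserves the data-integrity property the question asks about.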
I'm trying to do offline synchronisation from a Xamarin.iOS app. When I call PullAsync on the IMobileServiceSyncTable, the call never returns.
I've tried with a regular IMobileServiceTable, which seems to work fine. The sync table seems to be the thing that doesn't work for me.
Code that doesn't work:
var client = new MobileServiceClient(ApiUrl);
var store = new MobileServiceSQLiteStore("syncstore.db");
store.DefineTable<Entity>();
await client.SyncContext.InitializeAsync(store);
table = client.GetSyncTable<Entity>();
try
{
await table.PullAsync("all", table.CreateQuery());
}
catch (Exception e)
{
Debug.WriteLine(e.StackTrace);
}
return await table.ToListAsync();
Code that works:
var client = new MobileServiceClient(configuration.BaseApiUrl);
return await table.ToListAsync();
Can anyone point out something that seems wrong? I don't get an exception or anything else that points me in any direction; it just never completes.
UPDATE 1:
I've seen some other SO questions where people had a similar issue because somewhere in their call stack they didn't await, but instead used Task.Result or Task.Wait(). However, I await this call throughout my whole chain. Here's e.g. a unit test I've written which shows the exact same behaviour as described above: it hangs and never returns.
[Fact]
public async Task GetAllAsync_ReturnsData()
{
var result = await sut.GetAllAsync();
Assert.NotNull(result);
}
UPDATE 2:
I've been sniffing the requests sent by the unit test. It seems that it hangs because it keeps repeating the HTTP request over and over, several hundreds of times, and never quits that operation.
Finally! I've found the issue.
The problem was that the server returned an IEnumerable from the GetAll operation. Instead, it should have been an IQueryable.
The answer to this question pointed me in the right direction:
IMobileServiceClient.PullAsync deadlock when trying to sync with Azure Mobile Services
I have created a simple web interface for Vertica.
I expose simple operations on a Vertica cluster.
One of the functionalities I expose is querying Vertica.
When my user enters a multi-query, the node module throws an exception and my process exits with code 1.
Is there any way to catch this exception?
Is there any way to overcome the problem differently?
Right now there's no way to overcome this when using a callback for the query result.
Preventing it would involve making sure there's only one query in the user's input, which is hard because it involves parsing SQL.
The callback API isn't built to deal with multi-queries. I simply haven't bothered implementing proper handling of this case, because it has never been an issue for me.
Instead of a callback, you could use the event-listener API, which sends you lower-level messages, and handle this yourself.
q = conn.query("SELECT...; SELECT...");
q.on("fields", function(fields) { ... }); // 1 time per query
q.on("row", function(row) { ... }); // 0...* times per query
q.on("end", function(status) { ... }); // 1 time per query