All the libraries I have come across that provide Swift concurrency functionality either depend on an Objective-C library or are wrappers around the C-based GCD (iOS and OS X).
I'm just wondering if anyone knows of a pure Swift concurrency library? No dependencies. Or if not how might we go about making one?
The Swift language doesn't have constructs to support concurrency yet. Rumor has it that those are going to be added to a future version of the language, but in the meantime, I don't think you can do this without OS support.
GCD makes for fairly clean concurrent solutions in Swift, since GCD blocks are Swift closures. It's not platform-independent, however (at least not outside the Apple ecosystem).
Edit:
I guess you could write a concurrency library that would run on all POSIX-compliant OSes by using the POSIX threading APIs, but Windows isn't really POSIX-compliant, so that might not be a perfect solution either.
Edit #2:
In Swift 3 the GCD interface has been cleaned up and made much more "Swifty". There is now a DispatchQueue class whose methods replace the old dispatch_* global functions.
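For instance, dispatching work to a background queue with the Swift 3 API looks like this (a minimal sketch; the DispatchGroup is only there so a command-line program can wait for the result):

```swift
import Dispatch

let group = DispatchGroup()
var result = 0

// Run work on a background queue; the old dispatch_async global
// function is now a method on DispatchQueue.
DispatchQueue.global(qos: .userInitiated).async(group: group) {
    result = (1...100).reduce(0, +)   // some expensive work
}

group.wait()        // block until the async work finishes
print(result)       // 5050
```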
You can create a simple abstraction of queue-based dispatching like this:
protocol OperationQueue {
    var suspended: Bool { get set }
    var operationCount: Int { get }

    func addOperationWithBlock(block: () -> Void)
    func cancelAllOperations()
    func waitUntilAllOperationsAreFinished()
}
Then, on OS X (or any other Apple platform), you could use Foundation's NSOperationQueue (which is built on top of GCD) directly:
extension NSOperationQueue: OperationQueue { }
And on all other systems, you could create your own implementations:
class POSIXOperationQueue: OperationQueue { /* implementation with pthread */ }
class Win32OperationQueue: OperationQueue { /* implementation with Win32 API */ }
As of Swift 5.5, Swift supports concurrency with built-in language features such as async/await, tasks, task groups, actors, and structured concurrency.
Swift Concurrency
[Concurrency vs Parallelism]
[Sync vs Async]
Official doc
Swift has built-in support for writing asynchronous and parallel code in a structured way.
It is available from Swift 5.5 / iOS 15, with backward deployment down to iOS 13.
It solves:
- callback hell
- error handling (using the native do/try/catch mechanism)
- natural, straight-line code which is simple to read, write, and maintain
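As a sketch of the "straight-line" point: with hypothetical fetchUser/fetchPosts functions (stubbed here so the snippet is self-contained and runs under Swift 5.7+ top-level concurrency), the nested-callback pyramid collapses into sequential lines:

```swift
import Foundation

// Hypothetical async functions, stubbed so the sketch runs on its own.
func fetchUser() async throws -> String { "alice" }
func fetchPosts(for user: String) async throws -> [String] { ["post1", "post2"] }

// Straight-line code replaces nested completion handlers;
// errors propagate through ordinary try/throws.
let user = try await fetchUser()
let posts = try await fetchPosts(for: user)
print("\(user) has \(posts.count) posts")   // alice has 2 posts
```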
High-level structure:
- async/await markers for declaring asynchronous code and awaiting its results. Inside a callback-based API you can use withCheckedThrowingContinuation and withCheckedContinuation to bridge it to async/await:
  - await on an async func
  - await on an async let
  - await on an actor's members
- Task - creates a concurrent environment in which await/async is allowed. It starts immediately after creation, can be given a priority at creation, and can be cancelled; it has its own lifetime and, as a result, a current state. Task.init(priority:operation:) creates unstructured concurrency (the task doesn't have a parent task).
- Task group - allows running tasks in parallel (more dynamic than async let); when all of them are done, the task group is finished. TaskGroup.addTask and async let create structured concurrency, because there is an explicit parent-child relationship between the task and the task group.
- [Actor] - shares mutable reference-type data between tasks in a concurrent environment while preventing data races.
Example:
func runImageTask() {
    let imageSaver = ImageSaver()
    Task.init {
        do {
            let url = URL(string: "https://www.google.com/images/branding/googlelogo/2x/googlelogo_dark_color_272x92dp.png")!
            guard let image = try await self.downloadImage(from: url) else { return }
            try await imageSaver.storeImage(image)
            guard let transformedImage = try await self.transformImage(image) else { return }
            try await imageSaver.storeImage(transformedImage)
            self.showImage(transformedImage)
        } catch {
            print("error: \(error)")
        }
    }
}

func downloadImage(from url: URL) async throws -> UIImage? {
    let (data, _) = try await URLSession.shared.data(from: url)
    return UIImage(data: data)
}

func transformImage(_ image: UIImage) async throws -> UIImage? {
    return try await withCheckedThrowingContinuation { continuation in
        DispatchQueue.global().async {
            UIGraphicsBeginImageContext(image.size)
            image.draw(at: CGPoint.zero)
            guard let context = UIGraphicsGetCurrentContext() else {
                continuation.resume(with: .failure(NSError(domain: "context is nil", code: 1)))
                return
            }
            context.setStrokeColor(UIColor.green.cgColor)
            context.setLineWidth(5)
            context.addEllipse(in: CGRect(x: 50, y: 50, width: 50, height: 50))
            context.drawPath(using: .stroke)
            let transformedImage = UIGraphicsGetImageFromCurrentImageContext()
            UIGraphicsEndImageContext()
            continuation.resume(with: .success(transformedImage))
        }
    }
}

func showImage(_ image: UIImage) {
    self.imageView.image = image
}

class ImageSaver: NSObject {
    var continuation: CheckedContinuation<Void, Error>? = nil

    func storeImage(_ image: UIImage) async throws {
        return try await withCheckedThrowingContinuation { continuation in
            self.continuation = continuation
            UIImageWriteToSavedPhotosAlbum(image, self, #selector(saveCompleted), nil)
        }
    }

    @objc func saveCompleted(_ image: UIImage, didFinishSavingWithError error: Error?, contextInfo: UnsafeRawPointer) {
        let result: Result<(), Error>
        if let error = error {
            result = .failure(error)
        } else {
            result = .success(())
        }
        self.continuation?.resume(with: result)
    }
}
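The example above covers Task and continuations. A minimal sketch of the remaining two pieces, a task group feeding an actor (the Counter name and its methods are illustrative, not from the example), would look like:

```swift
// An actor serializes access to its mutable state, preventing data races.
actor Counter {
    private var total = 0
    func add(_ value: Int) { total += value }
    func current() -> Int { total }
}

let counter = Counter()

// A task group runs child tasks in parallel (structured concurrency:
// the group is the parent of every task added via addTask).
await withTaskGroup(of: Int.self) { group in
    for i in 1...4 {
        group.addTask { i * 10 }
    }
    for await value in group {       // collect results as they finish
        await counter.add(value)
    }
}

print(await counter.current())       // 100
```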
[Kotlin Coroutine]
I'm using the recommended rocket_db_pools crate to create a database pool and attach it to Rocket e.g.
#[derive(Database)]
#[database("birthdays")]
struct DB(sqlx::SqlitePool);
// in main
rocket::build()
.attach(DB::init())
I would like to access the same database pool outside of the context of Rocket. I'm attempting to spawn a separate tokio task that runs alongside Rocket and does some additional background work e.g.
async fn main() -> _ {
    rocket::tokio::spawn(async {
        // access DB pool here
    });

    rocket::build()
    // ...
}
Is there a way to initialize a sqlx::SqlitePool and easily provide it to both the background task and Rocket so I can still leverage the automatic #[derive(Database)] goodies that rocket_db_pools provides?
As a relative rust beginner I'm having a hard time reading through the traits and understanding if there is a way to do so that doesn't require doing it fully custom by creating a pool, writing my own impl FromRequest, etc.
I found an alternative based on the example from rocket_sync_db_pools which I adapted for the differences with rocket_db_pools.
It uses a fairing to get access to the Database after it has been initialized and then clones the wrapped SqlitePool to move into the task.
#[derive(Database)]
#[database("birthdays")]
struct DB(sqlx::SqlitePool);

// in main
rocket::build()
    .attach(DB::init())
    .attach(AdHoc::try_on_ignite("Background job", |rocket| async {
        let pool = match DB::fetch(&rocket) {
            Some(pool) => pool.0.clone(), // clone the wrapped pool
            None => return Err(rocket),
        };
        rocket::tokio::task::spawn(async move {
            loop {
                if let Ok(mut conn) = pool.acquire().await {
                    let results = sqlx::query_as::<_, Birthday>(
                        "SELECT name, birthday, account_id FROM birthdays",
                    )
                    .fetch_all(&mut conn)
                    .await
                    .expect("query should succeed");
                    debug!("selected from birthdays: {:?}", results);
                }
                sleep(Duration::from_secs(10)).await;
            }
        });
        Ok(rocket)
    }))
I am developing a CLI program for rendering template files using the new MiniJinja library by mitsuhiko.
The program is here: https://github.com/benwilber/temple.
I would like to be able to extend the program by allowing the user to load custom Lua scripts for things like custom filters, functions, and tests. However, I am running into Rust lifetime errors that I've not been able to solve.
Basically, I would like to be able to register a Lua function as a custom filter function. But it's showing an error when compiling. Here is the code:
https://github.com/benwilber/temple/compare/0.3.1..lua
Error:
https://gist.github.com/c649a0b240cf299d3dbbe018c24cbcdc
How can I call a Lua function from the MiniJinja add_filter function? I would prefer to try to do this in the regular/safe way. But I'm open to unsafe alternatives if required.
Thanks!
Edit: Posted the same on Reddit and users.rust-lang.org
Lua uses state that is not safe to use from more than one thread.
A consequence of this is that LuaFunction is neither Sync nor Send.
This is being enforced by this part of the error message:
help: within `LuaFunction<'_>`, the trait `Sync` is not implemented for `*mut rlua::ffi::lua_State`
In contrast a minijinja::Filter must implement Send + Sync + 'static.
(See https://docs.rs/minijinja/0.5.0/minijinja/filters/trait.Filter.html)
This means we can't share LuaFunctions (or even LuaContext) between calls to the Filters.
One option is to not pass your lua state into the closures, and instead create a new lua state every call, something like this.
env.add_filter(
    "concat2",
    |_env: &Environment, s1: String, s2: String|
        -> anyhow::Result<String, minijinja::Error> {
        // Create a fresh Lua state on every call, since the state
        // cannot be shared across Send + Sync filter invocations.
        let lua = rlua::Lua::new();
        lua.context(|lua_ctx| {
            lua_ctx.load(include_str!("temple.lua")).exec().unwrap();
            let globals = lua_ctx.globals();
            let temple: rlua::Table = globals.get("temple").unwrap();
            let filters: rlua::Table = temple.get("_filters").unwrap();
            let concat2: rlua::Function = filters.get("concat2").unwrap();
            let res: String = concat2.call::<_, String>((s1, s2)).unwrap();
            Ok(res)
        })
    },
);
This is likely to have relatively high overhead.
Another option is to create your rlua state in one thread and communicate with it via pipes. This would look more like this:
pub fn test() {
    let mut env = minijinja::Environment::new();
    let (to_lua_tx, to_lua_rx) = channel::<(String, String, SyncSender<String>)>();
    thread::spawn(move || {
        let lua = rlua::Lua::new();
        lua.context(move |lua_ctx| {
            lua_ctx.load("some_code").exec().unwrap();
            let globals = lua_ctx.globals();
            let temple: rlua::Table = globals.get("temple").unwrap();
            let filters: rlua::Table = temple.get("_filters").unwrap();
            let concat2: rlua::Function = filters.get("concat2").unwrap();
            while let Ok((s1, s2, channel)) = to_lua_rx.recv() {
                let res: String = concat2.call::<_, String>((s1, s2)).unwrap();
                channel.send(res).unwrap()
            }
        })
    });
    let to_lua_tx = Mutex::new(to_lua_tx);
    env.add_filter(
        "concat2",
        move |_env: &minijinja::Environment,
              s1: String,
              s2: String|
              -> anyhow::Result<String, minijinja::Error> {
            let (tx, rx) = sync_channel::<String>(0);
            to_lua_tx.lock().unwrap().send((s1, s2, tx)).unwrap();
            let res = rx.recv().unwrap();
            Ok(res)
        },
    );
}
It would even be possible to start multiple lua states this way, but would require a bit more plumbing.
DISCLAIMER: This code is all untested - however, it builds with a stubbed version of minijinja and rlua in the playground. You probably want better error handling and might need some additional code to handle cleanly shutting down all the threads.
This warning appears even when used with the IO context.
My code:
override fun download(url: String, file: File): Flow<Long> = flow {
    var total: Long = 0
    var count: Long = 0
    withContext(Dispatchers.IO) {
        val client = OkHttpClient()
        val req = Request.Builder().url(url).build()
        val response = client.newCall(req).execute()
        val sink: BufferedSink = Okio.buffer(Okio.sink(file))
        response.body()?.let {
            while (count != -1L) {
                count = it.source().read(sink.buffer(), 2048)
                if (count == -1L) break
                sink.emit()
                total = total.plus(count)
                withContext(Dispatchers.Default) {
                    emit(total.times(100).div(it.contentLength()))
                }
            }
        }
        sink.close()
    }
}

The calls in question (execute(), sink(), read(), emit() and close()) are the ones getting the warning. Is anything wrong with the code, or should the warning not appear?
These method calls can throw an IOException and are called inside a suspend method. They are flagged as likely blocking calls, which they are; the subtleties of Dispatchers.IO are missed by the compiler warnings.
Your best bet is generally to either switch to the async mode using enqueue(), or put this behind a library function that hides these warnings. A library like https://github.com/gildor/kotlin-coroutines-okhttp can also be helpful in bridging between blocking code in OkHttp and coroutines.
I'm trying to create my first app in Swift which involves making multiple requests to a website. These requests are each done using the block
var task = NSURLSession.sharedSession().dataTaskWithRequest(request, completionHandler: {data, response, error -> Void in ... }
task.resume()
From what I understand this block uses a thread different to the main thread.
My question is, what is the best way to design code that relies on the values in that block? For instance, the ideal design (however not possible due to the fact that the thread executing these blocks is not the main thread) is
func prepareEmails() {
    var names = getNames()
    var emails = getEmails()
    ...
    sendEmails()
}

func getNames() -> NSArray {
    var names = nil
    ....
    var task = NSURLSession.sharedSession().dataTaskWithRequest(request, completionHandler: { data, response, error -> Void in
        names = ...
    })
    task.resume()
    return names
}

func getEmails() -> NSArray {
    var emails = nil
    ....
    var task = NSURLSession.sharedSession().dataTaskWithRequest(request, completionHandler: { data, response, error -> Void in
        emails = ...
    })
    task.resume()
    return emails
}
However in the above design, most likely getNames() and getEmails() will return nil, as the the task will not have updated emails/name by the time it returns.
The alternative design (which I currently implement) is by effectively removing the 'prepareEmails' function and doing everything sequentially in the task functions
func prepareEmails() {
    getNames()
}

func getNames() {
    ...
    var task = NSURLSession.sharedSession().dataTaskWithRequest(request, completionHandler: { data, response, error -> Void in
        getEmails(names)
    })
    task.resume()
}

func getEmails(names: NSArray) {
    ...
    var task = NSURLSession.sharedSession().dataTaskWithRequest(request, completionHandler: { data, response, error -> Void in
        sendEmails(emails, names)
    })
    task.resume()
}
Is there a more effective design than the latter? This is my first experience with concurrency, so any advice would be greatly appreciated.
The typical pattern when calling an asynchronous method that has a completionHandler parameter is to use the completionHandler closure pattern, yourself. So the methods don't return anything, but rather call a closure with the returned information as a parameter:
func getNames(completionHandler: (NSArray!) -> ()) {
    ....
    let task = NSURLSession.sharedSession().dataTaskWithRequest(request) { data, response, error -> Void in
        let names = ...
        completionHandler(names)
    }
    task.resume()
}

func getEmails(completionHandler: (NSArray!) -> ()) {
    ....
    let task = NSURLSession.sharedSession().dataTaskWithRequest(request) { data, response, error -> Void in
        let emails = ...
        completionHandler(emails)
    }
    task.resume()
}
Then, if you need to perform these sequentially, as suggested by your code sample (i.e. if the retrieval of emails was dependent upon the names returned by getNames), you could do something like:
func prepareEmails() {
    getNames() { names in
        getEmails() { emails in
            sendEmails(names, emails) // I'm assuming the names and emails are the input to this method
        }
    }
}
Or, if they can run concurrently, then you should do so, as it will be faster. The trick is how to make a third task dependent upon two other asynchronous tasks. The two traditional alternatives include
Wrapping each of these asynchronous tasks in its own asynchronous NSOperation, and then create a third task dependent upon those other two operations. This is probably beyond the scope of the question, but you can refer to the Operation Queue section of the Concurrency Programming Guide or see the Asynchronous vs Synchronous Operations and Subclassing Notes sections of the NSOperation Class Reference.
Use dispatch groups, entering the group before each request, leaving the group within the completion handler of each request, and then adding a dispatch group notification block (called when all of the group "enter" calls are matched by their corresponding "leave" calls):
func prepareEmails() {
    let group = dispatch_group_create()

    var emails: NSArray!
    var names: NSArray!

    dispatch_group_enter(group)
    getNames() { results in
        names = results
        dispatch_group_leave(group)
    }

    dispatch_group_enter(group)
    getEmails() { results in
        emails = results
        dispatch_group_leave(group)
    }

    dispatch_group_notify(group, dispatch_get_main_queue()) {
        if names != nil && emails != nil {
            self.sendEmails(names, emails)
        } else {
            // one or both of those requests failed; tell the user
        }
    }
}
Frankly, if there's any way to retrieve both the emails and names in a single network request, that's going to be far more efficient. But if you're stuck with two separate requests, you could do something like the above.
Note, I wouldn't generally use NSArray in my Swift code, but rather use an array of String objects (e.g. [String]). Furthermore, I'd put in error handling where I return the nature of the error if either of these fail. But hopefully this illustrates the concepts involved in (a) writing your own methods with completionHandler blocks; and (b) invoking a third bit of code dependent upon the completion of two other asynchronous tasks.
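For completeness: under Swift 5.5+ (see the concurrency overview earlier on this page), the same "two concurrent requests, then a dependent step" shape can be written with async let. The async getNames/getEmails below are simple stand-ins, not the URLSession-based versions above:

```swift
// Stand-ins for async versions of getNames()/getEmails().
func getNames() async throws -> [String] { ["Ann", "Bob"] }
func getEmails() async throws -> [String] { ["ann@example.com", "bob@example.com"] }

func prepareEmails() async throws -> Int {
    // Both child tasks start immediately and run concurrently.
    async let names = getNames()
    async let emails = getEmails()
    // Suspend here until both finish; if either throws, the error propagates.
    let (n, e) = try await (names, emails)
    return min(n.count, e.count)     // how many emails we can send
}
```

If either request fails, the structured-concurrency runtime cancels the sibling task automatically, which the dispatch-group version has to handle by hand.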
The answers above (particularly Rob's DispatchQueue based answer) describe the concurrency concepts necessary to run two tasks in parallel and then respond to the result. The answers lack error handling for clarity because traditionally, correct solutions to concurrency problems are quite verbose.
Not so with HoneyBee.
HoneyBee.start()
    .setErrorHandler(handleErrorFunc)
    .branch {
        $0.chain(getNames)
        +
        $0.chain(getEmails)
    }
    .chain(sendEmails)
This code snippet manages all of the concurrency, routes all errors to handleErrorFunc and looks like the concurrent pattern that is desired.
Asynchronous programming is a must for responsive user interfaces when applications have to communicate over unpredictable networks (e.g. smartphone applications). The user interface must remain responsive while waiting for results to come back from servers somewhere on the internet.
In most languages, the application programmer has to implement their own state machines (maybe using closures) to respond to asynchronous callbacks and/or coordinate multiple threads using locks.
Both of these are very error-prone and not for the faint-hearted!
(C# introduced the async keyword to help with this; only time (at least 5 years) will tell if it is a good solution.)
Does Swift have any built in support to assist the writing of asynchronous code?
While it isn't a built-in language feature, it may be interesting to note that it's possible to implement C# style async/await for Swift, and that because of the special syntax afforded to the last closure argument of a function call, it even looks like it might be part of the language.
If anyone is interested, you can get code for this on Bitbucket. Here's a quick taster of what's possible:
let task = async { () -> () in
    let fetch = async { (t: Task<NSData>) -> NSData in
        let req = NSURLRequest(URL: NSURL.URLWithString("http://www.google.com"))
        let queue = NSOperationQueue.mainQueue()
        var data: NSData!
        NSURLConnection.sendAsynchronousRequest(req,
            queue: queue,
            completionHandler: { (r: NSURLResponse!, d: NSData!, error: NSError!) -> Void in
                data = d
                Async.wake(t)
            })
        Async.suspend()
        return data!
    }

    let data = await(fetch)
    let str = NSString(bytes: data.bytes, length: data.length,
                       encoding: NSUTF8StringEncoding)
    println(str)
}
Also, if you want something like #synchronized, try this:
func synchronized(obj: AnyObject, blk: () -> ()) {
    objc_sync_enter(obj)
    blk()
    objc_sync_exit(obj)
}

var str = "A string we can synchronise on"
synchronized(str) {
    println("The string is locked here")
}
Swift's approach to asynchronous programming is the same as Objective-C's: use Grand Central Dispatch. You can pass closures to the GCD dispatch_ functions, just as in Objective-C. However, for aesthetic reasons, you can also pass your closure as a trailing closure, after the closing parenthesis:
dispatch_async(dispatch_get_main_queue()) {
    println("async hello world")
}