I'm trying to figure out how to call this AVFoundation function in Swift. I've spent a ton of time fiddling with declarations and syntax, and got this far. The compiler is mostly happy, but I'm left with one last quandary.
public func captureOutput(
    captureOutput: AVCaptureOutput!,
    didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
    fromConnection connection: AVCaptureConnection!
) {
    let samplesInBuffer = CMSampleBufferGetNumSamples(sampleBuffer)

    var audioBufferList: AudioBufferList
    var buffer: Unmanaged<CMBlockBuffer>? = nil

    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer,
        nil,
        &audioBufferList,
        UInt(sizeof(audioBufferList.dynamicType)),
        nil,
        nil,
        UInt32(kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment),
        &buffer
    )

    // do stuff
}
The compiler complains about the 3rd and 4th arguments:
Address of variable 'audioBufferList' taken before it is initialized
and
Variable 'audioBufferList' used before being initialized
So what am I supposed to do here?
I'm working off of this StackOverflow answer, but it's Objective-C. I'm trying to translate it into Swift, but I run into this problem.
Or is there possibly a better approach? I need to read the data from the buffer, one sample at a time, so I'm basically trying to get an array of the samples that I can iterate over.
Disclaimer: I have just tried to translate the code from Reading audio samples via AVAssetReader to Swift, and verified that it compiles. I have not tested whether it really works.
// Needs to be initialized somehow, even if we take only the address
var audioBufferList = AudioBufferList(mNumberBuffers: 1,
    mBuffers: AudioBuffer(mNumberChannels: 0, mDataByteSize: 0, mData: nil))
var buffer: Unmanaged<CMBlockBuffer>? = nil

CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
    sampleBuffer,
    nil,
    &audioBufferList,
    UInt(sizeof(audioBufferList.dynamicType)),
    nil,
    nil,
    UInt32(kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment),
    &buffer
)

// Ensure that the buffer is released automatically.
let buf = buffer!.takeRetainedValue()

// Create UnsafeBufferPointer from the variable length array starting at audioBufferList.mBuffers
let audioBuffers = UnsafeBufferPointer<AudioBuffer>(start: &audioBufferList.mBuffers,
    count: Int(audioBufferList.mNumberBuffers))

for audioBuffer in audioBuffers {
    // Create UnsafeBufferPointer<Int16> from the buffer data pointer
    var samples = UnsafeMutableBufferPointer<Int16>(start: UnsafeMutablePointer(audioBuffer.mData),
        count: Int(audioBuffer.mDataByteSize)/sizeof(Int16))

    for sample in samples {
        // ....
    }
}
Swift 3 solution:
func loopAmplitudes(audioFileUrl: URL) {
    let asset = AVAsset(url: audioFileUrl)
    let reader = try! AVAssetReader(asset: asset)
    let track = asset.tracks(withMediaType: AVMediaTypeAudio)[0]
    let settings = [
        AVFormatIDKey: kAudioFormatLinearPCM
    ]
    let readerOutput = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
    reader.add(readerOutput)
    reader.startReading()

    while let sampleBuffer = readerOutput.copyNextSampleBuffer() {
        var audioBufferList = AudioBufferList(mNumberBuffers: 1, mBuffers: AudioBuffer(mNumberChannels: 0, mDataByteSize: 0, mData: nil))
        var blockBuffer: CMBlockBuffer?

        CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
            sampleBuffer,
            nil,
            &audioBufferList,
            MemoryLayout<AudioBufferList>.size,
            nil,
            nil,
            kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
            &blockBuffer
        );

        let buffers = UnsafeBufferPointer<AudioBuffer>(start: &audioBufferList.mBuffers, count: Int(audioBufferList.mNumberBuffers))

        for audioBuffer in buffers {
            let samplesCount = Int(audioBuffer.mDataByteSize) / MemoryLayout<Int16>.size
            // Read from the current buffer's mData, not always from audioBufferList.mBuffers
            let samplesPointer = audioBuffer.mData!.bindMemory(to: Int16.self, capacity: samplesCount)
            let samples = UnsafeMutableBufferPointer<Int16>(start: samplesPointer, count: samplesCount)

            for sample in samples {
                // do something with your sample (which is an Int16 amplitude value)
            }
        }
    }
}
The answers posted here make assumptions about the size of the necessary AudioBufferList -- which may have allowed them to work in their particular circumstances, but didn't work for me when receiving audio from an AVCaptureSession. (Apple's own sample code didn't work either.)
The documentation on CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer is not obvious, but it turns out you can ask the function how big the AudioBufferList should be first, and then call it a second time with an AudioBufferList allocated to the size it wants.
Below is a C++ example (sorry, I don't know Swift) that shows a more general solution that worked for me.
// ask the function how big the audio buffer list should be for this
// sample buffer ref
OSStatus err = noErr;
size_t requiredABLSize = 0;
err = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer,
                                                              &requiredABLSize,
                                                              NULL,
                                                              0,
                                                              kCFAllocatorSystemDefault,
                                                              kCFAllocatorSystemDefault,
                                                              kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
                                                              NULL);

// allocate an audio buffer list of the required size
AudioBufferList* audioBufferList = (AudioBufferList*) malloc(requiredABLSize);

// ensure that blockBuffer is NULL in case the function fails
CMBlockBufferRef blockBuffer = NULL;

// now let the function fill in the ABL for you
err = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer,
                                                              NULL,
                                                              audioBufferList,
                                                              requiredABLSize,
                                                              kCFAllocatorSystemDefault,
                                                              kCFAllocatorSystemDefault,
                                                              kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
                                                              &blockBuffer);

// if we succeeded...
if (err == noErr) {
    // la la la... read your samples...
}

// release the allocated block buffer
if (blockBuffer != NULL) {
    CFRelease(blockBuffer);
    blockBuffer = NULL;
}

// release the allocated ABL
if (audioBufferList != NULL) {
    free(audioBufferList);
    audioBufferList = NULL;
}
I'll leave it up to the Swift experts to offer an implementation in that language.
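For what it's worth, here is an untested Swift sketch of the same two-call pattern, using the labelled Swift overlay of CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer that appears in the Swift 4.2 answer further down; the helper name copyAudioBufferList and the manual allocation strategy are my own, so treat it as a sketch rather than a drop-in implementation:
import AVFoundation

// Sketch: ask for the required AudioBufferList size, then call again to fill it.
func copyAudioBufferList(from sampleBuffer: CMSampleBuffer) -> (UnsafeMutablePointer<AudioBufferList>, CMBlockBuffer)? {
    // First call: only ask how many bytes the AudioBufferList needs.
    var ablSize = 0
    var status = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer,
        bufferListSizeNeededOut: &ablSize,
        bufferListOut: nil,
        bufferListSize: 0,
        blockBufferAllocator: kCFAllocatorSystemDefault,
        blockBufferMemoryAllocator: kCFAllocatorSystemDefault,
        flags: kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
        blockBufferOut: nil)
    guard status == noErr else { return nil }

    // Second call: allocate exactly that many bytes and let the function fill them in.
    let rawABL = UnsafeMutableRawPointer.allocate(byteCount: ablSize,
                                                  alignment: MemoryLayout<AudioBufferList>.alignment)
    let abl = rawABL.bindMemory(to: AudioBufferList.self, capacity: 1)
    var blockBuffer: CMBlockBuffer?
    status = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer,
        bufferListSizeNeededOut: nil,
        bufferListOut: abl,
        bufferListSize: ablSize,
        blockBufferAllocator: kCFAllocatorSystemDefault,
        blockBufferMemoryAllocator: kCFAllocatorSystemDefault,
        flags: kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
        blockBufferOut: &blockBuffer)
    guard status == noErr, let block = blockBuffer else {
        rawABL.deallocate()
        return nil
    }
    // The caller must eventually call deallocate() on the returned pointer;
    // the CMBlockBuffer itself is managed by ARC.
    return (abl, block)
}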
Martin's answer works and does exactly what I asked in the question. However, after posting the question and spending more time with the problem (and before seeing Martin's answer), I came up with this:
public func captureOutput(
    captureOutput: AVCaptureOutput!,
    didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
    fromConnection connection: AVCaptureConnection!
) {
    let samplesInBuffer = CMSampleBufferGetNumSamples(sampleBuffer)
    self.currentZ = Double(samplesInBuffer)

    let buffer: CMBlockBufferRef = CMSampleBufferGetDataBuffer(sampleBuffer)

    var lengthAtOffset: size_t = 0
    var totalLength: size_t = 0
    var data: UnsafeMutablePointer<Int8> = nil

    if( CMBlockBufferGetDataPointer( buffer, 0, &lengthAtOffset, &totalLength, &data ) != noErr ) {
        println("some sort of error happened")
    } else {
        for i in stride(from: 0, to: totalLength, by: 2) {
            // do stuff
        }
    }
}
This is a slightly different approach, and probably still has room for improvement, but the main point is that at least on an iPad Mini (and probably on other devices), each time this method is called we get 1,024 samples. But those samples arrive in an array of 2,048 Int8 values. Every other one is the left/right byte of a pair that needs to be combined into an Int16, turning the 2,048 half-samples into 1,024 whole samples.
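For illustration, here is one hedged way to do that pairing inside the loop body above, assuming little-endian 16-bit PCM; data and totalLength are the variables from the snippet, and this is untested, current-Swift style code:
// Combine each little-endian byte pair into one Int16 sample.
for i in stride(from: 0, to: totalLength, by: 2) {
    let low  = UInt16(UInt8(bitPattern: data[i]))
    let high = UInt16(UInt8(bitPattern: data[i + 1]))
    let sample = Int16(bitPattern: low | (high << 8))
    // use `sample` here
}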
It works for me. Try it:
let musicUrl: NSURL = mediaItemCollection.items[0].valueForProperty(MPMediaItemPropertyAssetURL) as! NSURL
let asset: AVURLAsset = AVURLAsset(URL: musicUrl, options: nil)
let assetOutput = AVAssetReaderTrackOutput(track: asset.tracks[0] as! AVAssetTrack, outputSettings: nil)

var error: NSError?
let assetReader: AVAssetReader = AVAssetReader(asset: asset, error: &error)

if error != nil {
    print("Error asset Reader: \(error?.localizedDescription)")
}

assetReader.addOutput(assetOutput)
assetReader.startReading()

let sampleBuffer: CMSampleBufferRef = assetOutput.copyNextSampleBuffer()

var audioBufferList = AudioBufferList(mNumberBuffers: 1, mBuffers: AudioBuffer(mNumberChannels: 0, mDataByteSize: 0, mData: nil))
var blockBuffer: Unmanaged<CMBlockBuffer>? = nil

CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
    sampleBuffer,
    nil,
    &audioBufferList,
    sizeof(audioBufferList.dynamicType), // instead of UInt(sizeof(audioBufferList.dynamicType))
    nil,
    nil,
    UInt32(kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment),
    &blockBuffer
)
I do this (Swift 4.2):
let n = CMSampleBufferGetNumSamples(audioBuffer)
let format = CMSampleBufferGetFormatDescription(audioBuffer)!
let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(format)!.pointee

let nChannels = Int(asbd.mChannelsPerFrame) // probably 2
let bufferListSize = AudioBufferList.sizeInBytes(maximumBuffers: nChannels)
let abl = AudioBufferList.allocate(maximumBuffers: nChannels)
for i in 0..<nChannels {
    abl[i] = AudioBuffer(mNumberChannels: 0, mDataByteSize: 0, mData: nil)
}

var block: CMBlockBuffer?
let status = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(audioBuffer, bufferListSizeNeededOut: nil, bufferListOut: abl.unsafeMutablePointer, bufferListSize: bufferListSize, blockBufferAllocator: nil, blockBufferMemoryAllocator: nil, flags: 0, blockBufferOut: &block)
assert(noErr == status)

// use the AudioBufferList here (abl.unsafePointer), e.g. with ExtAudioFileWrite or what have you
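To make that last comment concrete, here is a hedged sketch (not part of the original answer) of pulling Int16 samples out of the filled list, assuming the ASBD really describes 16-bit integer PCM:
// Iterate the filled AudioBufferList and read Int16 samples from each buffer.
for audioBuffer in abl {
    guard let mData = audioBuffer.mData else { continue }
    let sampleCount = Int(audioBuffer.mDataByteSize) / MemoryLayout<Int16>.size
    let samples = mData.bindMemory(to: Int16.self, capacity: sampleCount)
    for i in 0..<sampleCount {
        _ = samples[i] // each element is one Int16 amplitude value
    }
}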
Related
I am working on a project which involves Rust and Java. I need to be able to use JNI from the Rust side, without the Java side invoking it (because that is not my code). So far, I have been able to ensure my DLL is injected (it opens a small window on a keypress, which I have been using for debugging).
A shortened example of the relevant code is the following:
use jni::sys::{JNI_GetCreatedJavaVMs, JNIInvokeInterface_};
use std::ptr::null_mut;

let jvm_ptr = null_mut() as *mut *mut *const JNIInvokeInterface_;
let count = null_mut();
// nothing has crashed up to this point
JNI_GetCreatedJavaVMs(jvm_ptr, 1, count); // crashes here - https://docs.rs/jni/latest/jni/sys/fn.JNI_GetCreatedJavaVMs.html
My question is this: is it possible to/how do I get a JNI environment in this situation?
With the help of the comments, I got that crash to stop happening. The trick was to pre-allocate an array.
let jvm_ptr = Vec::with_capacity(1).as_mut_ptr();
let count = null_mut();
JNI_GetCreatedJavaVMs(jvm_ptr, 1, count);
You can't pass a null pointer as the vmBuf parameter and then tell it that vmBuf points to an array of length 1 via bufLen. Translating the C++ code linked above, I would do something like
let mut count: jint = 0;
let check = JNI_GetCreatedJavaVMs(null_mut(), 0, &mut count);
assert!(check == JNI_OK);
let mut vms = vec![null_mut(); count as usize];
let check = JNI_GetCreatedJavaVMs(vms.as_mut_ptr(), vms.len() as i32, &mut count);
assert!(check == JNI_OK);
assert!(vms.len() == count as usize);
though that's probably a bit overkill since there can only be one VM. Still, checking the count is probably a good idea.
I have been using this for many years. It could be polished....
try {
    // how many JVMs are there?
    JNI_GetCreatedJavaVMs(NULL, 0, &number_of_JVMs);
}
catch (exception e)
{
    int x = 0;
}

if (number_of_JVMs > 0)
{
    JavaVM** buffer = new JavaVM*[number_of_JVMs];
    JNI_GetCreatedJavaVMs(buffer, number_of_JVMs, &number_of_JVMs);
    // 2. get the data
    jvm_handle = buffer[0];
    delete[] buffer;
    return jvm_handle != 0;
}
This Vulkan tutorial discusses swapchain recreation:
You could also decide to [recreate the swapchain] if the swap chain is suboptimal, but I've chosen to proceed anyway in that case because we've already acquired an image.
My question is: how would one recreate the swapchain and not proceed in this case of VK_SUBOPTIMAL_KHR?
To see what I mean, let's look at the tutorial's render function:
void drawFrame() {
    vkWaitForFences(device, 1, &inFlightFences[currentFrame], VK_TRUE, UINT64_MAX);

    uint32_t imageIndex;
    VkResult result = vkAcquireNextImageKHR(device, swapChain, UINT64_MAX, imageAvailableSemaphores[currentFrame], VK_NULL_HANDLE, &imageIndex);

    if (result == VK_ERROR_OUT_OF_DATE_KHR) {
        recreateSwapChain();
        return;
        /* else if (result == VK_SUBOPTIMAL_KHR) { createSwapchain(); ??? } */
    } else if (result != VK_SUCCESS && result != VK_SUBOPTIMAL_KHR) {
        throw std::runtime_error("failed to acquire swap chain image!");
    }

    if (imagesInFlight[imageIndex] != VK_NULL_HANDLE) {
        vkWaitForFences(device, 1, &imagesInFlight[imageIndex], VK_TRUE, UINT64_MAX);
    }
    imagesInFlight[imageIndex] = inFlightFences[currentFrame];

    VkSubmitInfo submitInfo{};
    submitInfo.sType = VK_STRUCTURE_TYPE_SUBMIT_INFO;

    VkSemaphore waitSemaphores[] = {imageAvailableSemaphores[currentFrame]};
    VkPipelineStageFlags waitStages[] = {VK_PIPELINE_STAGE_COLOR_ATTACHMENT_OUTPUT_BIT};
    submitInfo.waitSemaphoreCount = 1;
    submitInfo.pWaitSemaphores = waitSemaphores;
    submitInfo.pWaitDstStageMask = waitStages;

    submitInfo.commandBufferCount = 1;
    submitInfo.pCommandBuffers = &commandBuffers[imageIndex];

    VkSemaphore signalSemaphores[] = {renderFinishedSemaphores[currentFrame]};
    submitInfo.signalSemaphoreCount = 1;
    submitInfo.pSignalSemaphores = signalSemaphores;

    vkResetFences(device, 1, &inFlightFences[currentFrame]);

    if (vkQueueSubmit(graphicsQueue, 1, &submitInfo, inFlightFences[currentFrame]) != VK_SUCCESS) {
        throw std::runtime_error("failed to submit draw command buffer!");
    }

    VkPresentInfoKHR presentInfo{};
    presentInfo.sType = VK_STRUCTURE_TYPE_PRESENT_INFO_KHR;

    presentInfo.waitSemaphoreCount = 1;
    presentInfo.pWaitSemaphores = signalSemaphores;

    VkSwapchainKHR swapChains[] = {swapChain};
    presentInfo.swapchainCount = 1;
    presentInfo.pSwapchains = swapChains;
    presentInfo.pImageIndices = &imageIndex;

    result = vkQueuePresentKHR(presentQueue, &presentInfo);

    if (result == VK_ERROR_OUT_OF_DATE_KHR || result == VK_SUBOPTIMAL_KHR || framebufferResized) {
        framebufferResized = false;
        recreateSwapChain();
    } else if (result != VK_SUCCESS) {
        throw std::runtime_error("failed to present swap chain image!");
    }

    currentFrame = (currentFrame + 1) % MAX_FRAMES_IN_FLIGHT;
}
The trouble is as follows:
1. vkAcquireNextImageKHR succeeds, signaling the semaphore and returning a valid, suboptimal image
2. We recreate the swapchain
3. We can't present the image from 1 with the swapchain from 2 due to VUID-VkPresentInfoKHR-pImageIndices-01430. We need to call vkAcquireNextImageKHR again to get a new image.
4. When we call vkAcquireNextImageKHR again, the semaphore is in the signaled state, which is not allowed (VUID-vkAcquireNextImageKHR-semaphore-01286); we need to 'unsignal' it.
Is the best solution here to destroy and recreate the semaphore?
Regarding point 3: you can use the old images (and swapchain) if you properly use the oldSwapchain parameter when creating the new swapchain, which is what I assume the tutorial suggests.
Anyway, what I do is paranoidly sanitize that toxic semaphore like this:
// cleanup dangerous semaphore with signal pending from vkAcquireNextImageKHR (tie it to a specific queue)
// https://github.com/KhronosGroup/Vulkan-Docs/issues/1059
void cleanupUnsafeSemaphore( VkQueue queue, VkSemaphore semaphore ){
    const VkPipelineStageFlags psw = VK_PIPELINE_STAGE_BOTTOM_OF_PIPE_BIT;

    VkSubmitInfo submit_info = {};
    submit_info.sType = VK_STRUCTURE_TYPE_SUBMIT_INFO;
    submit_info.waitSemaphoreCount = 1;
    submit_info.pWaitSemaphores = &semaphore;
    submit_info.pWaitDstStageMask = &psw;

    vkQueueSubmit( queue, 1, &submit_info, VK_NULL_HANDLE );
}
After that, the semaphore can be properly caught with a fence or vkQueueWaitIdle, and then destroyed or reused.
I just destroy them, because the new semaphore count might differ, and I don't really consider swapchain recreation a hotspot (and also I just use vkDeviceWaitIdle in that case).
I found the following issue while debugging memory, and I couldn't understand it:
error evaluating expression “(CAListenerProxy::DeviceAggregateNotification *)0x7cee3d60”: error: use of undeclared identifier 'CAListenerProxy'
error: expected expression
I used two notification centers - one for sending the object to another view, and the other, in another view, for sending the dictionary which contains the objects after deleting one from it.
My code is:
// delegate method for the downloading videos only
extension WebViewController {
    func urlSession(_ session: URLSession, downloadTask: URLSessionDownloadTask, didFinishDownloadingTo location: URL) {
        // here I need to get the data of the movie I downloaded, so I can save it in the documents directory
        if let fetchDataFromDownloadFile = try? Data(contentsOf: location) {
            // generate the file name randomly
            let createFileName = UUID().uuidString
            // create an object to save the file
            let operationDocumentDirectory = OperationDocumentDirectory()
            operationDocumentDirectory.saveMovie(movieName: createFileName, data: fetchDataFromDownloadFile)
        } // end of the if let for fetching the data from the downloaded file

        // fetch the downloaded video object and set it to nil to free the memory
        if let fetchURL = downloadTask.originalRequest?.url {
            var fetchObject = operationObject?.dictionaryOfDownloadVideo?.removeValue(forKey: fetchURL)
            // stop the download task when the download finishes
            if fetchObject?.videoURL == downloadTask.originalRequest?.url {
                downloadTask.cancel()
                fetchObject = nil
            }
            // update the badge after the movie finishes downloading
            DispatchQueue.main.async { [weak self] in
                if let mySelf = self {
                    // clear the badge if there are no objects left
                    if operationObject.dictionaryOfDownloadVideo?.count == 0 {
                        mySelf.tabBarController?.viewControllers?[1].tabBarItem.badgeValue = nil
                    } else {
                        // otherwise update the badge with the remaining count
                        mySelf.tabBarController?.viewControllers?[1].tabBarItem.badgeValue = "\(operationObject.dictionaryOfDownloadVideo!.count)"
                    }
                } // end of the if for mySelf
            } // end of DispatchQueue.main
            // update the data in the table view
            NotificationCenter.default.post(name: NOTIFICATION_UPDATE_TABLEVIEW, object: nil)
        } // end of the fetch url
    }
}
// and this code is in another view, for updating the table view while a movie is currently downloading
extension MovieDownloadingViewController {
    // data source
    func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = tableView.dequeueReusableCell(withIdentifier: "Cell") as? DownloadingTableViewCell
        if let cell = cell {
            cell.movieObject = arrayOfObjects?[indexPath.row]
            cell.movieDeleteButton.tag = indexPath.row
            cell.movieDeleteButton.addTarget(self, action: #selector(self.deleteCurrentDownloadingMovie(sender:)), for: .touchUpInside)
        }
        return cell!
    }

    func deleteCurrentDownloadingMovie(sender: UIButton) {
        displayAlertDeleteMovie(arrayOfObject: arrayOfObjects!, index: sender.tag)
    }

    func displayAlertDeleteMovie(arrayOfObject: [DownloadVideo], index: Int) {
        let alertController = UIAlertController(title: "Delete Movie", message: "Do You Want Delete Movie \(arrayOfObject[index].videoName)", preferredStyle: .actionSheet)
        let alertDelete = UIAlertAction(title: "Delete", style: .default) { [weak self] (alertAction: UIAlertAction) in
            var fetchObjectMovie = self?.arrayOfObjects?.remove(at: index)
            // post the notification to update the number of elements in the dictionary and array
            NotificationCenter.default.post(name: NOTIFICATION_UPDATE_NUMBER_OF_ARRAY_DICT, object: fetchObjectMovie?.videoURL)
            if fetchObjectMovie != nil {
                fetchObjectMovie = nil
            }
            // update table view
            // self?.tableView.reloadData()
            // update the badge in the tab bar controller
            if operationObject.dictionaryOfDownloadVideo?.count == 0 {
                self?.tabBarController?.viewControllers?[1].tabBarItem.badgeValue = nil
            } else {
                self?.tabBarController?.viewControllers?[1].tabBarItem.badgeValue = "\(operationObject.dictionaryOfDownloadVideo!.count)"
            }
        }
        let alertCancel = UIAlertAction(title: "Cancel", style: .cancel) { [weak self] (alertAction: UIAlertAction) in
            self?.dismiss(animated: true, completion: {})
        }
        alertController.addAction(alertDelete)
        alertController.addAction(alertCancel)
        present(alertController, animated: true, completion: nil)
    }
}
Any help would be appreciated. Thanks a lot.
The code is below and it is part of a thread. pFileChange->m_hDirectory is of type HANDLE and pFileChange->m_eventFileChange is of type CEvent. CreateFile and ReadDirectoryChangesW return with a success. I am not able to figure out why I am getting an invalid handle status; please help!
UINT CFileChange::FileMontiorThread(LPVOID pArgs)
{
    CFileChange* pFileChange = NULL;
    pFileChange = (CFileChange*)pArgs;

    CString str = pFileChange->m_strDirectory;
    LPSTR strDirectory;
    strDirectory = str.GetBuffer(str.GetLength());
    PathRemoveFileSpec(strDirectory);

    DWORD dwBytes = 0;
    vector<BYTE> m_Buffer;
    BOOL m_bChildren;
    OVERLAPPED m_Overlapped;
    HANDLE arrHandles[2] = { pFileChange->m_hDirectory, pFileChange->m_eventFileChange };

    ::ZeroMemory(&m_Overlapped, sizeof(OVERLAPPED));
    DWORD dwBufferSize = 16384;
    m_Buffer.resize(dwBufferSize);
    m_bChildren = false;

    pFileChange->m_hDirectory = ::CreateFile(
        (LPCSTR)(LPCTSTR)strDirectory,
        FILE_LIST_DIRECTORY,
        FILE_SHARE_READ
        | FILE_SHARE_WRITE
        | FILE_SHARE_DELETE,
        NULL,
        OPEN_EXISTING,
        FILE_FLAG_BACKUP_SEMANTICS
        | FILE_FLAG_OVERLAPPED,
        NULL);

    if (pFileChange->m_hDirectory == INVALID_HANDLE_VALUE)
    {
        return false;
    }

    BOOL success = ::ReadDirectoryChangesW(
        pFileChange->m_hDirectory,  // handle to directory
        &m_Buffer[0],               // read results buffer
        m_Buffer.size(),            // length of buffer
        m_bChildren,                // monitoring option
        FILE_NOTIFY_CHANGE_LAST_WRITE | FILE_NOTIFY_CHANGE_CREATION | FILE_NOTIFY_CHANGE_FILE_NAME, // filter conditions
        &dwBytes,                   // bytes returned
        &m_Overlapped,              // overlapped buffer
        NULL);                      // no completion routine

    DWORD dwWaitStatus;
    while (!pFileChange->m_bKillThread)
    {
        dwWaitStatus = WaitForMultipleObjects(2, arrHandles, FALSE, INFINITE);
        switch (dwWaitStatus)
        {
        case WAIT_FAILED:
        {
            ULONG rc = 0;
            rc = ::GetLastError();
            LPVOID lpMsgBuf = NULL;
            ::FormatMessage(FORMAT_MESSAGE_ALLOCATE_BUFFER | FORMAT_MESSAGE_FROM_SYSTEM | FORMAT_MESSAGE_IGNORE_INSERTS,
                NULL,
                rc,
                MAKELANGID(LANG_NEUTRAL, SUBLANG_DEFAULT), // Default language
                (LPTSTR)&lpMsgBuf,
                0,
                NULL);
            CString strErrMsg;
            strErrMsg.Format(_T("%s, %s, Reason:%s"), "", "", (LPTSTR)lpMsgBuf);
            break;
        }
        }
    }
    return 0;
}
Note that, as the documentation specifies, you can't wait on just any type of handle.
Waiting on a directory handle isn't going to do what you think it should. Read this related question and its answers for more information and background reading.
As it seems you're trying to create a folder monitor, perhaps read this blog post for the correct way to use ReadDirectoryChangesW.
The answer was provided by Hans Passant in a comment:
You copy the handles into arrHandles too soon, before they are created. That cannot work of course.
Well, I am trying to create a simple tool to read a specific offset address from a file that's within the project.
I can read it fine and get the bytes; the problem is that I want to convert the result into a string, but I just can't.
My output is this: <00000100 88000d00 02140dbb 05c3a282>, but I want it as a String.
I found some examples of doing it using an extension for NSData, but they still didn't work.
So, could anyone help?
Here's my code:
class ViewController: UIViewController {

    let filemgr = NSFileManager.defaultManager()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.

        let pathToFile = NSBundle.mainBundle().pathForResource("control", ofType: "bin")
        let databuffer = filemgr.contentsAtPath(pathToFile!)

        let file: NSFileHandle? = NSFileHandle(forReadingAtPath: pathToFile!)
        if file == nil {
            println("File open failed")
        } else {
            file?.seekToFileOffset(197584)
            let byte = file?.readDataOfLength(16)
            println(byte!)
            file?.closeFile()
        }
    }
}
As long as you know the encoding, you can create a string from an NSData object like this:
let str = NSString(data: data, encoding: NSUTF8StringEncoding)
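For reference, with Data on newer Swift versions the equivalent is the failable String initializer:
let str = String(data: data, encoding: .utf8) // nil if the bytes are not valid UTF-8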
By the way, you might want to try using if let to unwrap your optionals rather than force-unwrapping, to account for failure possibilities:
let filemgr = NSFileManager.defaultManager()
if let pathToFile = NSBundle.mainBundle().pathForResource("control", ofType: "bin"),
   databuffer = filemgr.contentsAtPath(pathToFile),
   file = NSFileHandle(forReadingAtPath: pathToFile)
{
    file.seekToFileOffset(197584)
    let bytes = file.readDataOfLength(16)
    let str = NSString(data: bytes, encoding: NSUTF8StringEncoding)
    println(str)
    file.closeFile()
}
else {
    println("File open failed")
}
The correct answer, following @Martin R's suggestion, is at this link: https://codereview.stackexchange.com/a/86613/35991
Here's the code:
extension NSData {
    func hexString() -> String {
        // "Array" of all bytes:
        let bytes = UnsafeBufferPointer<UInt8>(start: UnsafePointer(self.bytes), count: self.length)
        // Array of hex strings, one for each byte:
        let hexBytes = map(bytes) { String(format: "%02hhx", $0) }
        // Concatenate all hex strings:
        return "".join(hexBytes)
    }
}
And I used it like this:
let token = byte.hexString()
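If you are on a newer Swift, a Data-based equivalent of that extension might look like this (untested sketch, same idea as the NSData version above):
extension Data {
    func hexString() -> String {
        // One two-digit hex string per byte, concatenated.
        return map { String(format: "%02hhx", $0) }.joined()
    }
}

// Usage, assuming `byte` is a Data value:
// let token = byte.hexString()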