I am trying to write a VST3 plugin using the Steinberg VST3 SDK that uses the FFTW library to perform a Fast Fourier Transform on the incoming audio signal. I have followed all the steps for including the FFTW library in my project, and the linker resolves it correctly.
Whenever I use one of the functions that the library provides, such as fftw_malloc for example, moduleinfotool.exe fails to generate the moduleinfo.json file, exiting with code 1 and failing the build with a very undescriptive error message.
Here is a part of the process function where I tried using the FFTW functions:
tresult PLUGIN_API kw_SquarifyProcessor::process (Vst::ProcessData& data)
{
    if (data.numInputs == 0 || data.numOutputs == 0)
    {
        return kResultOk;
    }

    // This is the call that, when included, makes the build fail.
    // Using any other function provided by the FFTW library also breaks the build.
    fftw_complex* in = (fftw_complex*) fftw_malloc (sizeof (fftw_complex) * 1024);
    fftw_free (in); // release the buffer so this test case does not leak

    return kResultOk;
}
I have no idea what to do right now, and I am amazed that there are virtually zero resources about the VST3 SDK (apart from the documentation, which does not seem to cover cryptic errors like these). If anyone could point me to some resources, or to a guide on how to perform FFTs in the VST3 SDK, that would be much appreciated as well!
Main context
We're actually trying to get a multi-threaded version of the Ghostscript x64 DLL, to make use of it through Ghostscript.NET. This component is supposed to "allow running multiple Ghostscript instances simultaneously within a single process", but, as we have checked in our project, it only works until concurrent requests are made to the application. The same behavior can be replicated launching the same method using Tasks. The error raised in both cases, whenever a call to the process is made while the previous one is still executing, is:
An error occured when call to 'gsapi_new_instance' is made: -100
Even though it does not seem to be related to .NET directly, I will post a sample of our C# method code, just for context.
// Define switches...
string[] switchesArray = switches.ToArray();

using (GhostscriptProcessor procesador = new GhostscriptProcessor())
{
    try
    {
        procesador.StartProcessing(switchesArray, null);
        byte[] destinationFile = System.IO.File.ReadAllBytes(destinationPath);
        return destinationFile;
    }
    catch (Exception)
    {
        // Rethrow without resetting the stack trace.
        throw;
    }
    finally
    {
        System.IO.File.Delete(sourceFile);
    }
}
THREADSAFE solution
Starting our investigation, we found this answer by KenS on this post, indicating that the Ghostscript DLL must be built with the GS_THREADSAFE compiler definition.
To clarify: as we use Ghostscript 9.52 x64 to generate our PDFs, we need this x64 DLL compiled for the Release configuration. After trying to compile the Ghostscript sources on a Windows 10 x64 machine, using Visual Studio Community 2017 and Visual Studio Community 2019, we finally managed to build and generate all items (only with VS Community 2019) without the GS_THREADSAFE parameter, just to confirm that the compilation itself is fine, and we checked that the DLLs and executables work. For this process we took into account everything we found in the official Ghostscript documentation.
As we have no other guide for including this GS_THREADSAFE parameter, we followed the instructions given in this solution, adding XCFLAGS="-DGS_THREADSAFE=1" to the nmake build commands and using this command line for the Rebuild all option:
cd .. && nmake -f psi\msvc32.mak WIN64= SBR=1 DEVSTUDIO= XCFLAGS=-DGS_THREADSAFE=1 && nmake -f psi\msvc32.mak WIN64= DEVSTUDIO= XCFLAGS=-DGS_THREADSAFE=1 bsc
This approach raises an error during the build:
Error LNK2019: unresolved external symbol errprintf_nomem referenced in function gs_log_error, file mkromfs.obj
As it seems, the file mkromfs.c references a function called errprintf_nomem, which can't be resolved when GS_THREADSAFE is set.
Questions
1 - Is there any public release of Ghostscript that includes x64 DLLs compiled to be THREADSAFE?
And, if not (that's what I'm guessing...)
2 - Is it possible to get this DLL to be THREADSAFE without changing the source code?
3 - Could anyone please provide a step-by-step guide or walkthrough for building an x64 Ghostscript DLL with GS_THREADSAFE using Visual Studio (or any other possible alternative) on Windows 10 x64?
4 - A few posts mention people who have achieved multithreading using Ghostscript.NET. I assume those examples are all using a GS_THREADSAFE DLL... Is there any other workaround we have missed?
Thanks a lot in advance.
To summarize all these questions, and as a guide for future developers running into this same trouble, these are the answers we've found so far:
1 - As @KenS mentions in his reply: "No, the Ghostscript developers don't actually build thread-safe versions of the binaries."
2 - At this very moment, clearly not, as has been reported in this open bug.
3 - As it seems to be a matter of commercial licensing support, we won't comment on this point any further.
4 - Thanks again to @HABJAN. I absolutely take back what I stated in my question, as it is possible to have Ghostscript.NET working in multi-threading scenarios. Below is the solution we applied, in case it is useful for someone.
Based on HABJAN's example, what we did to achieve this was to create a custom class to capture the Ghostscript logging:
protected class ConsoleStdIO : Ghostscript.NET.GhostscriptStdIO
{
    public ConsoleStdIO(bool handleStdIn, bool handleStdOut, bool handleStdErr)
        : base(handleStdIn, handleStdOut, handleStdErr)
    {
    }

    public override void StdIn(out string input, int count)
    {
        char[] userInput = new char[count];
        Console.In.ReadBlock(userInput, 0, count);
        input = new string(userInput);
    }

    public override void StdOut(string output)
    {
        // log standard output here
    }

    public override void StdError(string error)
    {
        // log error output here
    }
}
For our previous method, we simply include a call to this class, which avoids the errors when multiple tasks are executed at the same time:
// Define switches...
string[] switchesArray = switches.ToArray();

using (GhostscriptProcessor procesador = new GhostscriptProcessor())
{
    try
    {
        // Pass our custom stdio handler so concurrent instances don't collide.
        procesador.StartProcessing(switchesArray, new ConsoleStdIO(true, true, true));
        byte[] destinationFile = System.IO.File.ReadAllBytes(destinationPath);
        return destinationFile;
    }
    catch (Exception)
    {
        // Rethrow without resetting the stack trace.
        throw;
    }
    finally
    {
        System.IO.File.Delete(sourceFile);
    }
}
Well, it seems to me that you are asking here for technical support.
You clearly want to use Ghostscript in a commercial undertaking, indeed one might reasonably say you want an enterprise version of Ghostscript. Presumably you don't want to alter the source in order to permit you to use an open source license, because you don't want to pay for a commercial license.
With that in mind the answers to your questions are:
1 - No, the Ghostscript developers don't actually build thread-safe versions of the binaries.
2 - Currently, no. That's probably an oversight.
3 - That would be a technical support question; there's no guarantee of technical support for free users, and it's one of the few areas of leverage dual-license vendors have to persuade people to take up a commercial license. So I hope you will understand that I'm not going to provide that.
4 - As far as I can see, no.
I have to use a reporting tool (IBM Cognos Analytics) that provides the ability to insert custom scripts into the HTML reports it creates. So we are supposed to be able to reference JS libraries like jQuery and jQuery UI, work with them, and so on.
This functionality of the reporting tool is official, and it is stated that they use RequireJS to handle this.
The model of code we need to respect in this case (from the IBM "API") is the following:
define(['https://our_site/someFolder/someResource.js'], function() {
    /*
     A) - Here, I would expect that any code written here is executed
          only when "someResource.js" has been loaded
    */
    function monModule() { }

    monModule.prototype.setData = function() {
        /*
         B) - some code here
        */
    };

    monModule.prototype.draw = function(o) {
        /*
         C) - some code here
        */
    };

    return monModule;
});
My questions are the following:
1) My current understanding of the behavior within a "define" block is that the JS code will be executed only once the resources listed in the "define" call are loaded/available. Is that correct?
In the example above, the parts marked A) or C) would be "evaluated"/executed only when the resource "someResource.js" has loaded. Do I understand correctly?
2) The RequireJS library used by IBM Cognos Analytics shows the following header:
/** vim: et:ts=4:sw=4:sts=4
* @license RequireJS 2.1.14 Copyright (c) 2010-2014, The Dojo Foundation All Rights Reserved.
* Available via the MIT or new BSD license.
* see: http://github.com/jrburke/requirejs for details
*/
Can I assume, then, that the RequireJS library used by IBM Cognos Analytics is the "common"/"standard" one and not a derived one, and therefore that all functionality and behavior of RequireJS should be available in IBM Cognos Analytics?
I'm asking all this because we are facing issues where some pieces of JS code seem to be executed before the resource is loaded; we get errors like '"$" doesn't exist', etc.
Any help / advice is welcome.
Thanks!
1) The callback of define() is only executed once, to define your module. So in your example, only A) will be executed; B) and C) are the bodies of methods on monModule's prototype, and they run only when those methods are later called.
Here are the steps RequireJS performs under the hood when you execute define() (see the sketch after this list):
It loads the module's dependencies asynchronously.
When the dependencies are loaded, it executes the callback, injecting the loaded dependencies as arguments. The callback should return a value, which becomes the defined module.
It stores the returned value internally, so when another module requires the just-defined module, that value is injected as an argument.
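To make these steps concrete, here is a minimal sketch with hypothetical module names:
// logger.js -- this factory runs exactly once, no matter how many
// other modules list 'logger' as a dependency.
define([], function () {
    console.log('logger factory executed');
    return {
        log: function (msg) { console.log('[app] ' + msg); }
    };
});

// main.js -- the callback runs only after 'logger' has loaded;
// RequireJS injects the cached return value as the argument.
define(['logger'], function (logger) {
    logger.log('main module started');
});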
2) You can calculate the md5 checksum of that file and validate it against the official RequireJS release.
About the errors you get: maybe you need to set up a shim to make sure that jQuery is loaded before the modules that need it?
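For scripts that are not AMD modules, a shim declares their dependencies explicitly so RequireJS loads them in order. A minimal sketch, with hypothetical paths that you would adjust to your environment:
require.config({
    paths: {
        // hypothetical locations on your server
        'jquery': 'https://our_site/someFolder/jquery.min',
        'jquery-ui': 'https://our_site/someFolder/jquery-ui.min'
    },
    shim: {
        // jquery-ui is not an AMD module, so state that it needs jQuery first
        'jquery-ui': { deps: ['jquery'] }
    }
});

define(['jquery', 'jquery-ui'], function ($) {
    // $ is guaranteed to be defined here
});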
Does the Node interpreter look for core modules (let's say "fs") within the node binary? If yes, are these modules packaged as JS files? Are the core modules referenced in our code converted to C/C++ code first and then executed? For example, I see a method in _tls_common.js (https://github.com/nodejs/node/blob/master/lib/_tls_common.js) called "loadPKCS12", and the only place I see this method defined is in node_crypto.cc (https://github.com/nodejs/node/blob/master/src/node_crypto.cc). So how does Node link a method in JavaScript to one defined in a C/C++ file?
Here is the extract from the _tls_common.js file that makes use of the "loadPKCS12" method:
const buf = toBuf(options.pfx);
const passphrase = options.passphrase;
if (passphrase) {
    c.context.loadPKCS12(buf, toBuf(passphrase));
} else {
    c.context.loadPKCS12(buf);
}
There are two different (but seemingly related) questions asked here. The first one is: "How do the core modules work?". The second one is: "How does NodeJS let C++ code get referenced and executed from JavaScript?". Let's take them one by one.
How do the core modules work?
The core modules are packaged with the NodeJS binary. And, while they are packaged with the binary, they are not converted to C++ code before packaging. The internal modules are loaded into memory during the bootstrap of the node process. When a program executes, let's say, require('fs'), the require function simply returns the already loaded module from the cache. The actual loading of the internal module obviously happens in C++ code.
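A quick way to observe this caching from user code, on any recent Node version:
// Repeated require() calls for a core module return the same
// already-loaded object from the cache.
const fs1 = require('fs');
const fs2 = require('fs');
console.log(fs1 === fs2); // true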
How does NodeJS let c++ code get referenced in JS?
This ability comes partly from the V8 engine, which exposes the ability to create and manage JS constructs from C++, and partly from NodeJS / libuv, which provide a wrapper on top of V8 to supply the execution environment. The documentation about such native modules can be accessed here. As the documentation states, these C++ modules can be used in a JS file by requiring them, like any other ordinary JS module.
Your example of a C++ function used in JS (loadPKCS12), however, is a more special case: internal C++ functionality of NodeJS. loadPKCS12 is called on an object of type SecureContext, imported from the crypto C++ module. If you follow the link to the SecureContext import in _tls_common.js above, you will see that crypto is not loaded using require(); instead, a special (global) method, internalBinding, is used to obtain the reference. At the last line of the node_crypto.cc file, an initializer for the internal module crypto is registered. Following the chain of initialization, node::crypto::Initialize calls node::crypto::SecureContext::Initialize, which creates a function template, assigns the appropriate prototype methods, and exports it on the target. Eventually these exported functionalities from the C++ world are imported and used in the JS world via internalBinding.
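internalBinding itself is reserved for core, but userland code can use the same underlying mechanism through a regular native addon. A minimal sketch following the standard addon pattern from the Node documentation (the module and function names are illustrative):
// addon.cc
#include <node.h>

namespace demo {

// Exposed to JS as addon.hello(); returns the string "world".
void Hello(const v8::FunctionCallbackInfo<v8::Value>& args) {
  v8::Isolate* isolate = args.GetIsolate();
  args.GetReturnValue().Set(
      v8::String::NewFromUtf8(isolate, "world",
                              v8::NewStringType::kNormal).ToLocalChecked());
}

// Runs once when the addon is loaded; attaches functions to exports.
void Initialize(v8::Local<v8::Object> exports) {
  NODE_SET_METHOD(exports, "hello", Hello);
}

NODE_MODULE(NODE_GYP_MODULE_NAME, Initialize)

}  // namespace demo
After building with node-gyp, it is consumed like any JS module:
const addon = require('./build/Release/addon');
console.log(addon.hello()); // prints 'world'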
I've been working with NodeJS 0.11.x distros for some time now, mainly because I believe that generators and the yield statement bring big advances in terms of asynchronous manageability (see coffy-script and suspend).
That said, there's a serious setback when running bleeding-edge, unstable NodeJS installs: when doing npm install xy-module, gyp will fail (always? sometimes?) when trying to compile any C components.
Is there a general reason this must be so? Is there any trick / patch / configuration I can apply to remedy the situation? If a given module compiles on NodeJS 0.10.x but fails on 0.11.x, should I expect it to compile on 0.12.x as soon as that becomes available?
Update: I cross-posted the issue to the NodeJS mailing list, and Ben Noordhuis was kind enough to share some details. Quoting his message:
The two main changes are as follows:
Persistent<T> no longer derives from Handle<T>. To recreate the Handle from a Persistent, call Local<T>::New(isolate, persistent). You can obtain the isolate with Isolate::GetCurrent() (but note that Isolate::GetCurrent() will probably go away in newer versions of V8.)
The prototype of C++ callbacks and accessors has changed. Before,
your function looked like this:
Handle<Value> MyCallback(const Arguments& args) {
    HandleScope handle_scope;
    /* Do useful work, then: */
    return handle_scope.Close(Integer::New(42));
    /* Or: */
    return handle_scope.Close(String::New("hello"));
    /* Or: */
    return Null();
}
In v0.11 and v0.12 that becomes:
void MyCallback(const FunctionCallbackInfo<Value>& args) {
    Isolate* isolate = args.GetIsolate();
    HandleScope handle_scope(isolate);
    /* Do useful work, then: */
    args.GetReturnValue().Set(42);
    /* Or: */
    args.GetReturnValue().Set(String::NewFromUtf8(isolate, "hello"));
    /* Or: */
    args.GetReturnValue().SetNull();
}
There have been more changes but these two impact every native add-on.
Answered in detail in NodeUp #52: http://nodeup.com/fiftytwo
Summary: major changes in the v8 API, some minor changes in Node, and the changes are still ongoing. But there are two projects that are designed to help with the problem, NAN (github/rvagg/nan) and shim / node-addon-layer (github/tjfontaine/node-addon-layer).
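For comparison, the callback from the quoted message written against NAN looks roughly like this; a sketch using current NAN macros, so the exact spelling may differ for the NAN version you pin:
#include <nan.h>

// NAN_METHOD expands to the correct callback signature for the running
// Node version, hiding the 0.10.x / 0.11.x API differences.
NAN_METHOD(MyCallback) {
  info.GetReturnValue().Set(42);
}

// NAN_MODULE_INIT likewise abstracts the module registration signature.
NAN_MODULE_INIT(Init) {
  Nan::SetMethod(target, "myCallback", MyCallback);
}

NODE_MODULE(myaddon, Init)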
I'm trying to port a new version of the Isis2 library from .NET on Windows to Mono/Linux. This new code uses MemoryMappedFile objects, and I suddenly am running into issues with the Mono.Posix.Helper library. I believe that my issues would vanish if I could successfully compile and run the following test program:
using System;
using System.IO.MemoryMappedFiles;

namespace foobar
{
    class Program
    {
        static int CAPACITY = 100000;

        static void Main(string[] args)
        {
            // Dispose both the mapping and the accessor when done.
            using (MemoryMappedFile mmf = MemoryMappedFile.CreateNew("test", CAPACITY))
            using (MemoryMappedViewAccessor mva = mmf.CreateViewAccessor())
            {
                for (int n = 0; n < CAPACITY; n++)
                {
                    byte b = (byte)(n & 0xFF);
                    mva.Write<byte>(n, ref b);
                }
            }
        }
    }
}
... at present, when I try to compile this on Mono I get a bewildering set of linker errors: it seems unable to find libMonoPosixHelper.so, even though my LD_LIBRARY_PATH includes the directory containing that file, and if I manage to get past that stage, I get "System.NotImplementedException: The requested feature is not implemented." at runtime. Yet I've looked at the Mono implementation of the CreateNew method; it seems fully implemented, and the same is true for the CreateViewAccessor method. Thus I have a sense that something is going badly wrong when linking to the Mono libraries.
Does anyone have experience with MemoryMappedFile objects under Mono? I see quite a few questions about this kind of issue here and on other sites, but all seem to be old threads...
OK, I figured at least part of this out by inspecting the Mono code implementing this API. In fact, they implemented CreateNew in a way that departs pretty drastically from the .NET API, causing these methods to behave very differently from what you would expect.
For CreateNew, they actually require that the file name you specify be the name of an existing Linux file of a size at least as large as the capacity you specify; they also do some other checks, for access permissions (of course), for exclusive access (which is at odds with sharing...), and to make sure the capacity you requested is > 0. So if you had the file previously open, or someone else does, this will fail -- in contrast to .NET, where you explicitly use memory-mapped files for sharing.
In contrast, CreateOrOpen appears to be "more or less" correctly implemented; switching to this version seems to solve the problem. To get the effect of CreateNew, do a Delete first, wrapping it in a try/catch to catch the IOException thrown if the file doesn't exist. Then use File.WriteAllBytes to create a file with your desired content. Then call CreateOrOpen. Now this sounds dumb, but it works. Obviously you can't guarantee atomicity this way (three operations rather than one), but at least you get the desired functionality.
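Putting the three steps together, a sketch of that workaround (the helper name is mine, and the file-path handling follows Mono's file-backed behavior described above):
using System.IO;
using System.IO.MemoryMappedFiles;

static MemoryMappedFile CreateNewOnMono(string path, int capacity)
{
    // Step 1: delete any stale backing file, catching the IOException
    // thrown when it does not exist.
    try { File.Delete(path); }
    catch (IOException) { }

    // Step 2: pre-create the backing file with the desired size/content.
    File.WriteAllBytes(path, new byte[capacity]);

    // Step 3: CreateOrOpen is the variant that behaves correctly on Mono.
    return MemoryMappedFile.CreateOrOpen(path, capacity);
}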
I can live with these restrictions as things work out, but they may surprise others, and they are totally different from the .NET API definition for MemoryMappedFile.
As for my linking issues, as far as I can tell there is a situation in which Mono doesn't correctly use the LD_LIBRARY_PATH you specify and hence can't find the .so or .dll file you used. I'll post more on this if I can precisely pin down the circumstances; for now, I've worked around the issue by statically linking the library.