Subsonic - Migration from 2.2 to 3.0

Where can I find examples of code migration from SubSonic 2.2 to 3.0?
For example, we have SubSonic 2.2 code:
public static void SavePageModules(PageModuleCollection modCol)
{
    modCol.SaveAll();
}
How do we write this method for 3.0?
Are collections deprecated in 3.0?
Best regards.

There is no equivalent for SaveAll in SubSonic 3. I suggest you check the batch query feature: http://subsonicproject.com/docs/BatchQuery Migration implies rewriting code, since SubSonic 3 has been built on a different schema and targets another platform: the .NET Framework 3.5.
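A straightforward port is to loop over the records and save them one at a time. This is only a minimal sketch, assuming SubSonic 3 ActiveRecord-generated classes (which expose an instance Save() method) rather than the 2.2 collection types:

// Minimal sketch: assumes an ActiveRecord-generated PageModule class with an
// instance Save() method; there is no SaveAll() in SubSonic 3.
public static void SavePageModules(IEnumerable<PageModule> modules)
{
    foreach (var module in modules)
    {
        module.Save(); // each record is inserted/updated individually
    }
}

For large collections, the BatchQuery feature linked above can cut down the number of round trips.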

Related

How can I create User Defined Functions in Cassandra with Custom Java Class?

I couldn't find this anywhere online. How can I create a custom user-defined function in Cassandra?
For example:
CREATE OR REPLACE FUNCTION customfunc(custommap map<text, int>)
CALLED ON NULL INPUT
RETURNS map<int,bigint>
LANGUAGE java AS 'return MyClass.mymethod(custommap);';
where "MyClass" is a class that I can register on the classpath?
I have the same issue too. Custom classes in UDFs are supported in Cassandra 2.2.14, but not in Cassandra 3.11.4.
Going through the source code, Cassandra 3.11.4 sets up the UDF class loader with no parent class loader so that it has full control over which classes/resources UDFs can use. In org.apache.cassandra.cql3.functions.UDFunction.java, a whitelist and a blacklist are used to control which classes/packages can be accessed.
For your issue, you should add the fully qualified name of MyClass to the whitelist and rebuild Cassandra.
1. First, build the Java project that contains your class. Remember that you have to add a package name to your class.
Example:
package exp;

import java.lang.Math;
import java.util.*;

public class MyClass
{
    public static Map<Integer, Long> mymethod(Map<String, Integer> data) {
        Map<Integer, Long> map = new HashMap<>();
        map.put(1, 10L);
        map.put(2, 20L);
        map.put(3, 30L);
        return map;
    }
}
After compiling and building, I have the jar test.jar.
2. Copy the jar file to the $CASSANDRA_HOME/lib directory of every Cassandra node.
3. Restart all Cassandra nodes.
4. Create your custom function
Example:
CREATE OR REPLACE FUNCTION customfunc(custommap map<text, int>)
CALLED ON NULL INPUT
RETURNS map<int,bigint>
LANGUAGE java
AS 'return exp.MyClass.mymethod(custommap);';
Now you can use the function:
cassandra#cqlsh:test> SELECT * FROM test_fun ;
id | data
----+------------------
1 | {'a': 1, 'b': 2}
(1 rows)
cassandra#cqlsh:test> SELECT customfunc(data) FROM test_fun ;
test.customfunc(data)
-----------------------
{1: 10, 2: 20, 3: 30}
(1 rows)
Just adding my 2 cents to this thread, as I tried building an external class method to support something similar. After trying for hours with the DataStax Sandbox 5.1, I could not get this to work: it couldn't seem to find my class and kept raising type errors.
My guess is that external JAR-based code for UDFs is not supported (see http://koff.io/posts/hll-in-cassandra/ and https://issues.apache.org/jira/browse/CASSANDRA-9892). Support for "TRUSTED" JARs is in the planning stages for Cassandra 4. It might work in versions before 3.0, but I'm using the latest version from DataStax.
To work around this issue, I had to fall back to using a JavaScript version instead (I was trying to convert a JSON string into a Map object).
While I realize Java UDFs perform better, the code I was testing was using Java's Nashorn JavaScript support anyway, so using JavaScript might not be such a bad thing. It also ends up as a simpler one-liner UDF.
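For reference, the JavaScript fallback looks roughly like the sketch below. The function name and types are illustrative (not from the original post), and it assumes scripted UDFs are enabled in cassandra.yaml (enable_user_defined_functions and enable_scripted_user_defined_functions set to true):

-- Illustrative sketch only: converts a JSON text value into a map<text, text>.
CREATE OR REPLACE FUNCTION json_to_map(json_text text)
CALLED ON NULL INPUT
RETURNS map<text, text>
LANGUAGE javascript
AS '
  var result = new java.util.HashMap();
  var parsed = JSON.parse(json_text);
  for (var key in parsed) {
    result.put(java.lang.String.valueOf(key), java.lang.String.valueOf(parsed[key]));
  }
  result;
';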

ObjectQuery extensions from managed C++/CLI

I'm trying to move a project over to using Entity Framework but, to make it more fun, the project is in C++/CLI.
I've got a query:
ObjectQuery<myData::Facility^>^ facQ = myContext->FacilitySet;
and I want to do this:
int n = facQ.Count()
But I can't, because C++ doesn't recognise extension methods using C# syntax, and facQ->Count() doesn't work either.
Using C# extension methods from managed C++/CLI shows the answer for user-defined extensions, but in this case the extension is part of the .NET Framework: http://msdn.microsoft.com/en-us/library/bb349034%28v=vs.90%29.aspx.
Any ideas?
(I'm using Visual Studio 2008 and .NET 3.5.)
System::Data::Objects::ObjectQuery implements IEnumerable<T>. The Count() method you see in C# comes from the System::Linq::Enumerable class, so you can call it as an ordinary static method:
using namespace System::Linq;
int n = Enumerable::Count(facQ);
Also see this answer, which shows a couple of examples of calling other extension methods in that class.
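For instance, other Enumerable methods can be invoked the same way, as ordinary static calls (an illustrative sketch, not from the linked answer; the generic type arguments are inferred from facQ):

using namespace System::Linq;
using namespace System::Collections::Generic;

// Any Enumerable extension method becomes a plain static call in C++/CLI.
List<myData::Facility^>^ all = Enumerable::ToList(facQ);      // materialize the query results
myData::Facility^ first = Enumerable::FirstOrDefault(facQ);   // nullptr if the set is empty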

What's the proper way to handle multiple connections to multiple Excel files in ADO.NET 4.0?

I have a piece of code, written using VS 2005, that works fine on computers running .NET 2.0 but hard crashes on computers running .NET 4.0.
The section of the code that's causing the problem is a call to the DataAdapter's Fill() method. The code looks as follows:
private void button_Click(object sender, EventArgs e)
{
    DataTable dt1 = new DataTable();
    string connectionString = ... //connects to excelfile1.xls
    string selectCommand = "SELECT * FROM [Sheet1$]";
    using (OleDbDataAdapter adapter = new OleDbDataAdapter(selectCommand, connectionString))
    {
        adapter.SelectCommand.Connection.Open();
        adapter.Fill(dt1);
    }

    DataTable dt2 = new DataTable();
    connectionString = ... //connects to excelfile2.xls
    using (OleDbDataAdapter adapter = new OleDbDataAdapter(selectCommand, connectionString))
    {
        adapter.SelectCommand.Connection.Open();
        adapter.Fill(dt2);
    }
}
Several things happen if I make slight modifications to the code:
If I remove the two calls to OleDbConnection.Open(), the code works just fine with .NET 2.0 but hard crashes with .NET 4.0.
If I remove only the second call to OleDbConnection.Open(), the code works fine with .NET 2.0 and .NET 4.0. Alas, I need to retrieve data from two separate Excel files and fill two separate DataTables each time the event fires.
If I use both calls to OleDbConnection.Open(), as shown in the code above, the code works fine with .NET 2.0 and .NET 4.0, BUT hard crashes with .NET 4.0 the second or third time the user clicks the button and the procedure runs.
My guess is that .NET 4.0 manages connections differently than .NET 2.0 and I'm missing some very important step.
Can someone please tell how I should write the above code in such a way that it will work fine under .NET 2.0 and .NET 4.0?
The problem was "Application Verifier".

Is it possible to use ASP.NET Dynamic Data and SubSonic 3?

Is it possible to use ASP.NET Dynamic Data with SubSonic 3 in place of LINQ to SQL classes or the Entity Framework? MetaModel.RegisterContext() throws an exception if you use the context class that SubSonic generates. I thought I remembered coming across a SubSonic/Dynamic Data example back before SubSonic 3 was released, but I can't find it now. Has anyone been able to get this to work?
I just got SubSonic 3.0.0.4 ActiveRecord working last night in Visual Studio 2010 with my SQLite database after a little bit of work, and I've tried to document the steps taken here for your benefit.
Start by adding a New Item -> WCF Data Service to the project you're using to host your web app/web services, then modify it along the lines of my PinsDataService.svc.cs below:
public class PinsDataService : DataService<PINS.Lib.dbPINSDB>
{
    // This method is called only once to initialize service-wide policies.
    public static void InitializeService(DataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("*", EntitySetRights.All);
        config.UseVerboseErrors = true;
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
    }
}
At this point your Dynamic Data service would probably be working if you matched all the database naming conventions perfectly, but I didn't have that kind of luck. In my ActiveRecord.tt template I had to prepend the following two lines before the public partial class declarations:
[DataServiceKey("<#=tbl.PrimaryKey #>")]
[IgnoreProperties("Columns")]
public partial class <#=tbl.ClassName#>: IActiveRecord {
I then added references to System.Data and System.Data.Services.Client, followed by using statements for System.Data.Services and System.Data.Services.Common at the top of the ActiveRecord.tt template.
The next step was to use the IUpdatable partial class implementation from this blog post http://blogs.msdn.com/aconrad/archive/2008/12/05/developing-an-astoria-data-provider-for-subsonic.aspx and change public partial class dbPINSDB : IUpdatable to match my SubSonic DatabaseName declared in Settings.ttinclude.
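For orientation, the shape of that partial class is roughly the skeleton below; the members simply mirror System.Data.Services.IUpdatable, and the real method bodies come from the blog post linked above:

using System;
using System.Data.Services;
using System.Linq;

// Skeleton only: shows the members IUpdatable requires on dbPINSDB.
public partial class dbPINSDB : IUpdatable
{
    public object CreateResource(string containerName, string fullTypeName) { throw new NotImplementedException(); }
    public object GetResource(IQueryable query, string fullTypeName) { throw new NotImplementedException(); }
    public object ResetResource(object resource) { throw new NotImplementedException(); }
    public void SetValue(object targetResource, string propertyName, object propertyValue) { throw new NotImplementedException(); }
    public object GetValue(object targetResource, string propertyName) { throw new NotImplementedException(); }
    public void SetReference(object targetResource, string propertyName, object propertyValue) { throw new NotImplementedException(); }
    public void AddReferenceToCollection(object targetResource, string propertyName, object resourceToBeAdded) { throw new NotImplementedException(); }
    public void RemoveReferenceFromCollection(object targetResource, string propertyName, object resourceToBeRemoved) { throw new NotImplementedException(); }
    public void DeleteResource(object targetResource) { throw new NotImplementedException(); }
    public void SaveChanges() { throw new NotImplementedException(); }
    public object ResolveResource(object resource) { throw new NotImplementedException(); }
    public void ClearChanges() { throw new NotImplementedException(); }
}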
Then, to consume the data in a separate client app/library, I started by adding a 'Service Reference' named PinsDataService pointing at PinsDataService.svc from my client app and went to town:
PinsDataService.dbPINSDB PinsDb =
new PinsDataService.dbPINSDB(new Uri("http://localhost:1918/PinsDataService.svc/"));
PinsDataService.Alarm activeAlarm =
PinsDb.Alarms.Where(i => i.ID == myAA.Alarm_ID).Take(1).ElementAt(0);
Note how I'm doing a Where query that returns only one object, but I threw in Take(1) and then ElementAt(0) because I kept getting errors when I tried to use SingleOrDefault() or First().
Hope this helps. Also, I'm already aware that dbPINSDB is a really bad name for my SubSonic database ;)

Problem with RunMigrations in SimpleRepository Example - Subsonic 3

I downloaded SubSonic 3 today and tried out the examples. I am having a problem with the SimpleRepository example and wondered if anyone else has had it. In the HomeController there is a definition as follows:
public HomeController() {
    _repo = new SimpleRepository("Blog");
}
I wanted to enable migrations, so I changed it to this:
public HomeController() {
    _repo = new SimpleRepository("Blog", SimpleRepositoryOptions.RunMigrations);
}
However, when this runs it causes an error stating "String or binary data would be truncated."
If it makes a difference, the version of VS is 2008 (with the GDR applied).
This is still an issue in the latest 3.0.0.1 and .2 downloads.
You get this error message if the migration you are trying to run would edit/truncate data in your database.
Do you have SQL Profiler available? That way you can see the SQL statement. If you don't have SQL Profiler available, you will need to download the source and use the debugger to see the actual SQL statement it is trying to execute.
Way, way, way late to this party, but you probably need to add the [SubSonicLongString] attribute to the columns that need to hold more than the default 255 characters for a plain string.
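On a SimpleRepository POCO that looks something like the sketch below (the class and property names are made up; in the SubSonic 3 source the attribute lives in SubSonic.SqlGeneration.Schema):

using SubSonic.SqlGeneration.Schema;

// Illustrative POCO: without the attribute, Body is created as a short
// nvarchar column, and saving longer text triggers the
// "String or binary data would be truncated" error described above.
public class Post
{
    public int ID { get; set; }
    public string Title { get; set; }

    [SubSonicLongString]  // generates a large text column (nvarchar(max)) instead
    public string Body { get; set; }
}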
