I have noticed that the custom properties of a web part I developed return to their default values when I reboot my machine.
Is that normal behavior? Are the properties only kept while the server is up, or is there some parameter I am missing?
Thank you.
EDIT: code:
using System;
using System.ComponentModel;
using System.Web.UI.WebControls.WebParts;
using System.Xml.Serialization;

namespace TestWebpart
{
    [ToolboxItemAttribute(false)]
    [XmlRoot(Namespace = "TestWebpart")]
    public class GraphWebpart : Microsoft.SharePoint.WebPartPages.WebPart
    {
        // Visual Studio might automatically update this path when you change the Visual Web Part project item.
        private const string _ascxPath = @"~/_CONTROLTEMPLATES/Test_Graph/TestWebpart/GraphWebpartUserControl.ascx";

        protected override void CreateChildControls()
        {
            ReloadElements();
        }

        protected void ReloadElements()
        {
            Controls.Clear();
            GraphWebpartUserControl control = (GraphWebpartUserControl)Page.LoadControl(_ascxPath);
            control.xmlDataUrl = XMLFileUrl;
            Controls.Add(control);
        }

        private static string _xmlFileUrl;

        [WebBrowsable(true),
         Personalizable(PersonalizationScope.Shared),
         DefaultValue(""),
         Description("xml"),
         DisplayName("xml"),
         WebDisplayName("xml")]
        public string XMLFileUrl
        {
            get { return _xmlFileUrl; }
            set
            {
                _xmlFileUrl = value;
                ReloadElements();
            }
        }
    }
}
EDIT2:
Removing static from the field throws the following exception:
Web Part Error: An error occurred while setting the value of this property: TestWebpart:XMLFileUrl - Exception has been thrown by the target of an invocation.
[WebPartPageUserException: An error occurred while setting the value of this property: Blue_Graph.GraphWebpart.GraphWebpart:XMLFileUrl - Exception has been thrown by the target of an invocation.]
at Microsoft.SharePoint.WebPartPages.BinaryWebPartDeserializer.ApplyPropertyState(Control control)
at Microsoft.SharePoint.WebPartPages.BinaryWebPartDeserializer.Deserialize()
at Microsoft.SharePoint.WebPartPages.SPWebPartManager.CreateWebPartsFromRowSetData(Boolean onlyInitializeClosedWebParts)
First of all you should not have
private static string _xmlFileUrl;
it should be
private string _xmlFileUrl;
A static variable will be lost on an IISRESET, won't work in a farm, and has the potential to cause all sorts of thread-safety issues in a multi-threaded environment (like a web server), so only use statics if they are really needed.
When SharePoint loads a web part (or after you click Save/Apply in the toolpart), it uses reflection to find your properties (the [Browsable...] attributes) and then serialization to load/save the value of the property to the database. One of these two is failing.
I would suspect that it is some problem with the attributes - try this set and work backwards until it stops working ;)
[Browsable(true),
Category("Miscellaneous"),
DefaultValue(defaultText),
WebPartStorage(Storage.Personal),
FriendlyName("Text"),
Description("Text Property")]
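For comparison, a minimal sketch of the property using an instance backing field and the WebBrowsable/Personalizable attributes from the question; the setter only stores the value, since rebuilding child controls from inside a property setter can fail while the part is still being deserialized. Treat this as a sketch, not a verified fix:

private string _xmlFileUrl;

[WebBrowsable(true),
 Personalizable(PersonalizationScope.Shared),
 WebDisplayName("xml"),
 Description("xml")]
public string XMLFileUrl
{
    get { return _xmlFileUrl; }
    // Store the value only; CreateChildControls picks it up when the part renders.
    set { _xmlFileUrl = value; }
}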
In C# .NET 4.0 I am attempting to automate WordPerfect.
To do this I add a reference in my project to the wpwin14.tlb file that lives in the WordPerfect program folder.
That has the effect of creating the COM interfaces within my project.
Next I should be able to write code that instantiates a WordPerfect.PerfectScript object that I can use to automate WordPerfect.
However, when I try to instantiate the WordPerfect.PerfectScript object, C# throws the error:
"Unable to cast COM object of type 'System.__ComObject' to interface type 'WordPerfect.PerfectScript'. This operation failed because the QueryInterface call on the COM component for the interface with IID '{C0E20006-0004-1000-0001-C0E1C0E1C0E1}' failed due to the following error: The RPC server is unavailable. (Exception from HRESULT: 0x800706BA)."
The thing to zero in on in that message (I do believe) is that the RPC server is unavailable.
I have tried this with WordPerfect running in the background and without. I have also gone into Services, made sure the RPC services were all running, and restarted everything.
Is it possible that I am getting blocked by a firewall? That is my only faint guess.
I just wrap it as an OLE call and clean up my COM object with FinalReleaseComObject.
Here's a simple wrapper class I've been using to open Wp docs and convert them to pdf. It cleans up nicely in our automated process:
using System;
using System.Runtime.InteropServices;
using WordPerfect;

public class WpInterop : IDisposable
{
    private bool _disposed;
    private PerfectScript _perfectScript;

    public PerfectScript PerfectScript
    {
        get
        {
            if (_perfectScript == null)
            {
                // Create the COM object late-bound from its ProgID on first use.
                Type psType = Type.GetTypeFromProgID("WordPerfect.PerfectScript");
                _perfectScript = Activator.CreateInstance(psType) as PerfectScript;
            }
            return _perfectScript;
        }
    }

    protected void Dispose(bool disposing)
    {
        if (disposing && _perfectScript != null)
        {
            Marshal.FinalReleaseComObject(_perfectScript);
            _perfectScript = null;
        }
        _disposed = true;
    }

    public void Dispose()
    {
        if (_disposed == false)
        {
            GC.SuppressFinalize(this);
            Dispose(true);
        }
    }
}
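For completeness, a minimal usage sketch of the wrapper above; the WPActivate/KeyType calls mirror the console example further down:

using (var wp = new WpInterop())
{
    // The wrapper creates the PerfectScript instance lazily on first access
    // and releases the COM object when the using block ends.
    wp.PerfectScript.WPActivate();
    wp.PerfectScript.KeyType("Hello WP World!");
}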
Make sure your version of WordPerfect has all of the service packs and hot fixes installed. This step has fixed many random-sounding issues for me over the years. Looks like you are using X4, which is no longer supported by Corel, which means that the updates are no longer on its web site. You should be running version 14.0.0.756 (SP2 plus 2 hotfixes).
I just uninstalled WPX4 and re-installed it without the service pack updates. Running this code gave exactly the same error as the OP:
using System.Runtime.InteropServices;
using WordPerfect;

namespace WP14TLB
{
    class Program
    {
        static void Main(string[] args)
        {
            PerfectScript ps = new PerfectScript();
            ps.WPActivate();
            ps.KeyType("Hello WP World!");
            Marshal.ReleaseComObject(ps);
            ps = null;
        }
    }
}
Installing the service packs "magically" fixed the problem.
BTW, for future reference, you can also try the WPUniverse forums. There are quite a few WP experts who regularly answer difficult questions.
There is also a link to the X4 updates here:
I am not able to define a [BeforeFeature]/[AfterFeature] hook for my feature file. The application under test is a standalone WPF desktop application.
If I use [BeforeScenario]/[AfterScenario] everything works fine, the application starts without any problem, the designed steps are performed correctly and the app is closed.
Once I use the same steps with [BeforeFeature]/[AfterFeature] tags the application starts and the test fails with:
The following error occurred when this process was started: Object reference not set to an instance of an object.
Here is an example:
[Binding]
public class Setup
{
    [BeforeScenario("setup_scenario")]
    public static void BeforeAppScenario()
    {
        UILoader.General.StartApplication();
    }

    [AfterScenario("setup_scenario")]
    public static void AfterAppScenario()
    {
        UILoader.General.CloseApplication();
    }

    [BeforeFeature("setup_feature")]
    public static void BeforeAppFeature()
    {
        UILoader.General.StartApplication();
    }

    [AfterFeature("setup_feature")]
    public static void AfterAppFeature()
    {
        UILoader.General.CloseApplication();
    }
}
StartApplication/CloseApplication were recorded and auto-generated with Coded UI Test Builder:
public void StartApplication()
{
    // Launch '%ProgramFiles%\...
    ApplicationUnderTest Application = ApplicationUnderTest.Launch(this.StartApplicationParams.ExePath, this.StartApplicationParams.AlternateExePath);
}

public class StartApplicationParams
{
    public string ExePath = "C:\\Program Files...";
    public string AlternateExePath = "%ProgramFiles%\\...";
}
Noteworthy: I'm quite new to SpecFlow.
I can't figure out why my test fails with [BeforeFeature] but works fine with [BeforeScenario].
It would be great if somebody could help me with this issue. Thanks!
I ran into a similar problem recently. Not sure if this can still help you, but it may be of use for people who stumble upon this question.
For [BeforeFeature]/[AfterFeature] to work, the feature itself needs to be tagged; tagging just specific scenarios will not work.
Your feature files should start like this:
@setup_feature
Feature: Name Of Your Feature

@setup_scenario
Scenario: ...
I'm new to WCF RIA Services, and have been working with LightSwitch for 4 or so months now.
I created a generic screen to be used for editing lookup tables all over my LightSwitch application, mostly to learn how to create a generic screen that can be used with different entity sets on a dynamic basis.
The screen is pretty simple:
Opened with arguments similar to this:
Application.ShowLookupTypesList("StatusTypes", "StatusTypeId"); which correspond to the entity set and key property for the lookup table in the database.
Here's my WCF RIA service code:
using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using System.Data;
using System.Data.Objects.DataClasses;
using System.Diagnostics;
using System.Linq;
using System.Reflection;
using System.ServiceModel.DomainServices.EntityFramework;
using System.ServiceModel.DomainServices.Server;

namespace WCF_RIA_Project
{
    public class LookupType
    {
        [Key]
        public int TypeId { get; set; }

        public string Name { get; set; }
    }

    public static class EntityInfo
    {
        public static Type Type;
        public static PropertyInfo Key;
        public static PropertyInfo Set;
    }

    public class WCF_RIA_Service : LinqToEntitiesDomainService<WCSEntities>
    {
        public IQueryable<LookupType> GetLookupTypesByEntitySet(string EntitySetName, string KeyName)
        {
            EntityInfo.Set = ObjectContext.GetType().GetProperty(EntitySetName);
            EntityInfo.Type = EntityInfo.Set.PropertyType.GetGenericArguments().First();
            EntityInfo.Key = EntityInfo.Type.GetProperty(KeyName);
            return GetTypes();
        }

        [Query(IsDefault = true)]
        public IQueryable<LookupType> GetTypes()
        {
            var set = (IEnumerable<EntityObject>)EntityInfo.Set.GetValue(ObjectContext, null);
            var types = from e in set
                        select new LookupType
                        {
                            TypeId = (int)EntityInfo.Key.GetValue(e, null),
                            Name = (string)EntityInfo.Type.GetProperty("Name").GetValue(e, null)
                        };
            return types.AsQueryable();
        }

        public void InsertLookupType(LookupType lookupType)
        {
            dynamic e = Activator.CreateInstance(EntityInfo.Type);
            EntityInfo.Key.SetValue(e, lookupType.TypeId, null);
            e.Name = lookupType.Name;
            dynamic set = EntityInfo.Set.GetValue(ObjectContext, null);
            set.AddObject(e);
        }

        public void UpdateLookupType(LookupType currentLookupType)
        {
            var set = (IEnumerable<EntityObject>)EntityInfo.Set.GetValue(ObjectContext, null);
            dynamic modified = set.FirstOrDefault(t => (int)EntityInfo.Key.GetValue(t, null) == currentLookupType.TypeId);
            modified.Name = currentLookupType.Name;
        }

        public void DeleteLookupType(LookupType lookupType)
        {
            var set = (IEnumerable<EntityObject>)EntityInfo.Set.GetValue(ObjectContext, null);
            var e = set.FirstOrDefault(t => (int)EntityInfo.Key.GetValue(t, null) == lookupType.TypeId);
            Debug.Assert(e.EntityState != EntityState.Detached, "Entity was in a detached state.");
            ObjectContext.ObjectStateManager.ChangeObjectState(e, EntityState.Deleted);
        }
    }
}
When I add an item to the list from the running screen, save it, then edit it and resave, I receive a data conflict: "Another user has deleted this record."
I can work around this by reloading the query after save, but it's awkward.
If I remove an item, save, then re-add an item with the same name and save, I get "Unable to save data. The context is already tracking a different entity with the same resource Uri."
Both of these problems only affect my generic screen using WCF RIA Services. When I build a ListDetail screen for a specific database entity there are no problems. It seems I'm missing some logic, any ideas?
I've learned that this is the wrong approach to using LightSwitch.
There are several behind-the-scenes things this generic screen won't fully emulate and may not be do-able without quite a bit of work. The errors I've received are just one example. LightSwitch's built-in conflict resolution will also fail.
LS's RAD design means just creating a bunch of similar screens is the way to go, with some shared methods. If the actual layout needs to be changed across many identical screens at once, you can always find & replace in the .lsml files, if you're careful and make backups first. Note that modifying these files directly isn't supported.
I got that error recently. In my case I create a unique ID in my WCF RIA service, but in my screen code-behind I must explicitly set a unique ID when I create the object that is later passed to the WCF RIA Service insert method (that value is then overwritten by the unique counter ID in the table of the underlying database).
See the sample code for this project:
http://lightswitchhelpwebsite.com/Blog/tabid/61/EntryId/157/A-Visual-Studio-LightSwitch-Picture-File-Manager.aspx
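A rough sketch of the idea, using the LookupType class from the question (the helper below is hypothetical and not part of the linked sample); the temporary value only has to be unique on the client until the database identity column assigns the real one:

// Hypothetical helper: give each new lookup entry a temporary, locally unique TypeId
// before it is passed to the RIA service insert method. Negative values avoid
// colliding with IDs already loaded on the screen; the database overwrites the
// value on save.
private static int _nextTempId;

private static LookupType NewLookupType(string name)
{
    return new LookupType
    {
        TypeId = System.Threading.Interlocked.Decrement(ref _nextTempId),
        Name = name
    };
}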
I wrote about this topic in another question.
However, I've since refactored my code to get rid of configuration access, thus allowing the specs to pass. Or so I thought. They run fine from within Visual Studio using TestDriven.Net. However, when I run them during rake using the mspec.exe tool, they still fail with a serialization exception. So I've created a completely self-contained example that does basically nothing except setup fake security credentials on the thread. This test passes just fine in TD.Net, but blows up in mspec.exe. Does anybody have any suggestions?
Update: I've discovered a work-around. After researching the issue, it seems the cause is that the assembly containing my principal object is not in the same folder as mspec.exe. When mspec creates a new AppDomain to run my specs, that new AppDomain has to load the assembly with the principal object in order to deserialize it. That assembly is not in the same folder as the mspec EXE, so it fails. If I copy my assembly into the same folder as mspec, it works fine.
What I still don't understand is why ReSharper and TD.Net can run the test just fine? Do they not use mspec.exe to actually run the tests?
using System;
using System.Security.Principal;
using System.Threading;
using Machine.Specifications;

namespace MSpecTest
{
    [Subject(typeof(MyViewModel))]
    public class When_security_credentials_are_faked
    {
        static MyViewModel SUT;

        Establish context = SetupFakeSecurityCredentials;

        Because of = () =>
            SUT = new MyViewModel();

        It should_be_initialized = () =>
            SUT.Initialized.ShouldBeTrue();

        static void SetupFakeSecurityCredentials()
        {
            Thread.CurrentPrincipal = CreatePrincipal(CreateIdentity());
        }

        static MyIdentity CreateIdentity()
        {
            return new MyIdentity(Environment.UserName, "None", true);
        }

        static MyPrincipal CreatePrincipal(MyIdentity identity)
        {
            return new MyPrincipal(identity);
        }
    }

    public class MyViewModel
    {
        public MyViewModel()
        {
            Initialized = true;
        }

        public bool Initialized { get; set; }
    }

    [Serializable]
    public class MyPrincipal : IPrincipal
    {
        private readonly MyIdentity _identity;

        public MyPrincipal(MyIdentity identity)
        {
            _identity = identity;
        }

        public bool IsInRole(string role)
        {
            return true;
        }

        public IIdentity Identity
        {
            get { return _identity; }
        }
    }

    [Serializable]
    public class MyIdentity : IIdentity
    {
        private readonly string _name;
        private readonly string _authenticationType;
        private readonly bool _isAuthenticated;

        public MyIdentity(string name, string authenticationType, bool isAuthenticated)
        {
            _name = name;
            _isAuthenticated = isAuthenticated;
            _authenticationType = authenticationType;
        }

        public string Name
        {
            get { return _name; }
        }

        public string AuthenticationType
        {
            get { return _authenticationType; }
        }

        public bool IsAuthenticated
        {
            get { return _isAuthenticated; }
        }
    }
}
Dan, thank you for providing a reproduction.
First off, the console runner works differently than the TestDriven.NET and ReSharper runners. Basically, the console runner has to perform a lot more setup work in that it creates a new AppDomain (plus configuration) for every assembly that is run. This is required to load the .dll.config file for your spec assembly.
Per spec assembly, two AppDomains are created:
1. The first AppDomain (Console) is created implicitly when mspec.exe is executed.
2. A second AppDomain (Spec) is created by mspec.exe for the assembly containing the specs.
Both AppDomains communicate with each other through .NET Remoting: For example, when a spec is executed in the Spec AppDomain, it notifies the Console AppDomain of that fact. When Console receives the notification it acts accordingly by writing the spec information to the console.
This communiciation between Spec and Console is realized transparently through .NET Remoting. One property of .NET Remoting is that some properties of the calling AppDomain (Spec) are automatically included when sending notifications to the target AppDomain (Console). Thread.CurrentPrincipal is such a property. You can read more about that here: http://sontek.vox.com/library/post/re-iprincipal-iidentity-ihttpmodule-serializable.html
The context you provide will run in the Spec AppDomain. You set Thread.CurrentPrincipal in the Establish. After the Because has run, a notification will be issued to the Console AppDomain. The notification will include your custom MyPrincipal, which the receiving Console AppDomain tries to deserialize. It cannot do that, since it doesn't know about your spec assembly (it is not included in its private bin path).
This is why you had to put your spec assembly in the same folder as mspec.exe.
There are two possible workarounds:
Derive MyPrincipal and MyIdentity from MarshalByRefObject so that they can take part in cross-AppDomain communication through a proxy instead of being serialized (a sketch of this follows the example below)
Set Thread.CurrentPrincipal transiently in the Because
Because of = () =>
{
    var previousPrincipal = Thread.CurrentPrincipal;
    try
    {
        Thread.CurrentPrincipal = new MyPrincipal(...);
        SUT = new MyViewModel();
    }
    finally
    {
        Thread.CurrentPrincipal = previousPrincipal;
    }
};
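For the first workaround, a minimal sketch of the idea (same usings as the repro above; only the base class changes, and [Serializable] is no longer what carries the objects across the AppDomain boundary):

// Sketch of workaround 1: deriving from MarshalByRefObject means the Console AppDomain
// talks to these objects through a remoting proxy instead of deserializing them, so it
// no longer needs the spec assembly in its private bin path.
public class MyPrincipal : MarshalByRefObject, IPrincipal
{
    private readonly MyIdentity _identity;

    public MyPrincipal(MyIdentity identity)
    {
        _identity = identity;
    }

    public bool IsInRole(string role)
    {
        return true;
    }

    public IIdentity Identity
    {
        get { return _identity; }
    }
}

// MyIdentity gets the same treatment: MarshalByRefObject instead of [Serializable].
public class MyIdentity : MarshalByRefObject, IIdentity
{
    public MyIdentity(string name, string authenticationType, bool isAuthenticated)
    {
        Name = name;
        AuthenticationType = authenticationType;
        IsAuthenticated = isAuthenticated;
    }

    public string Name { get; private set; }
    public string AuthenticationType { get; private set; }
    public bool IsAuthenticated { get; private set; }
}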
ReSharper, for example, handles all the communication work for us. MSpec's ReSharper Runner can hook into the existing infrastructure (that, AFAIK, does not use .NET Remoting).
I have generated a C# SharePoint Sequential Workflow project using the very handy STSDEV tool (it got me around the requirement to have access to a 32-bit SharePoint installation which is required for other tools such as VSeWSS 1.3).
I've added a simple 'modify the title' action to test my basic setup:
public sealed partial class CopyWorkflow : SharePointSequentialWorkflowActivity
{
    public CopyWorkflow()
    {
        InitializeComponent();
        workflowProperties = new SPWorkflowActivationProperties();
    }

    public SPWorkflowActivationProperties workflowProperties;

    private void onWorkflowActivated1_Invoked_1(object sender, ExternalDataEventArgs e)
    {
        workflowProperties.Item["Title"] = workflowProperties.Item["Title"].ToString() + ": Processed by Workflow";
        workflowProperties.Item.Update();
    }
}
However, after installing my workflow via WSP into an installation of WSS 3.0, activating the feature, and configuring the workflow to start whenever a new item is created in a particular list, my breakpoint in onWorkflowActivated1_Invoked_1 is hit, but workflowProperties.Item is always null instead of an SPListItem representing the item that was just added.
What do I need to do to get the Item to be filled when this callback is called?
Update: I've noticed that the thread executing the workflow is running anonymously rather than as the logged-in user or the system user, and therefore won't have access to the list data. Furthermore, the SharePoint log file shows the following exceptions:
Unexpected System.ArgumentNullException: Value cannot be null. Parameter name: uriString at System.Uri..ctor(String uriString) at Microsoft.SharePoint.SPSite..ctor(String requestUrl) at Microsoft.SharePoint.Workflow.SPWorkflowActivationProperties.<get_Site>b__0() at Microsoft.SharePoint.SPSecurity.CodeToRunElevatedWrapper(Object state) at Microsoft.SharePoint.SPSecurity.<>c__DisplayClass4.<RunWithElevatedPrivileges>b__2() at Microsoft.SharePoint.Utilities.SecurityContext.RunAsProcess(CodeToRunElevated secureCode) at Microsoft.SharePoint.SPSecurity.RunWithElevatedPrivileges(WaitCallback secureCode, Object param) at Microsoft.SharePoint.SPSecurity.RunWithElevatedPrivileges(CodeToRunElevated secureCode) at Microsoft.SharePoint.Workflow.SPWorkflowActivationProperties....
and
Unexpected ...get_Site() at Microsoft.SharePoint.Workflow.SPWorkflowActivationProperties.get_Web() at Microsoft.SharePoint.Workflow.SPWorkflowActivationProperties.get_Item() at BechtelWorkflow.CopyWorkflow.onWorkflowActivated1_Invoked_1(Object sender, ExternalDataEventArgs e) at System.Workflow.ComponentModel.Activity.RaiseGenericEvent[T](DependencyProperty dependencyEvent, Object sender, T e) at System.Workflow.Activities.HandleExternalEventActivity.RaiseEvent(Object[] args) at System.Workflow.Activities.HandleExternalEventActivity.Execute(ActivityExecutionContext executionContext) at System.Workflow.ComponentModel.ActivityExecutor'1.Execute(T activity, ActivityExecutionContext executionContext) at System.Workflow.ComponentModel.ActivityExecutor'1.Execute(Activity activi...
Have you bound the WorkflowActivationProperties in the workflow designer?
WorkflowActivationProperties http://img718.imageshack.us/img718/9703/ss20100305091353.png
This issue occurs if the InitialStateName in the workflow's designer properties is not set to the initial state, or points to another state unexpectedly. Once the workflow starts in the state where the workflowProperties are bound (as in the image above), things start working as required.
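For reference, a sketch of what the designer-generated wiring typically looks like in the workflow's .designer.cs once the activation properties are bound to onWorkflowActivated1 (names follow the code above; treat this as an illustration of the binding, not exact generated output):

using System.Workflow.ComponentModel;
using System.Workflow.Runtime;
using Microsoft.SharePoint.WorkflowActions;

public sealed partial class CopyWorkflow
{
    private OnWorkflowActivated onWorkflowActivated1;

    private void InitializeComponent()
    {
        this.CanModifyActivities = true;

        // Bind the activity's WorkflowProperties to the public workflowProperties field;
        // without this binding, workflowProperties.Item stays null in the Invoked handler.
        ActivityBind propertiesBind = new ActivityBind();
        propertiesBind.Name = "CopyWorkflow";
        propertiesBind.Path = "workflowProperties";

        this.onWorkflowActivated1 = new OnWorkflowActivated();
        this.onWorkflowActivated1.CorrelationToken = new CorrelationToken("workflowToken");
        this.onWorkflowActivated1.CorrelationToken.OwnerActivityName = "CopyWorkflow";
        this.onWorkflowActivated1.Invoked += this.onWorkflowActivated1_Invoked_1;
        this.onWorkflowActivated1.SetBinding(OnWorkflowActivated.WorkflowPropertiesProperty, propertiesBind);

        this.Activities.Add(this.onWorkflowActivated1);
        this.Name = "CopyWorkflow";
        this.CanModifyActivities = false;
    }
}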