Log4net Appender to Azure Document Db

I am trying to write a custom appender for log4net by referring to the article http://korrozia.blogspot.in/2016/11/how-to-store-log4net-messages-to-azure.html. Documents are getting created in Azure Document DB, but the log message is not appearing in the document. Below is my custom appender code:
public class DocumentDbAppender: AppenderSkeleton
{
private DocumentClient client;
public string DatabaseId
{
get;
set;
}
public string CollectionId
{
get;
set;
}
public string EndpointUrl
{
get;
set;
}
public string AuthKey
{
get;
set;
}
protected override void Append(LoggingEvent loggingEvent)
{
try
{
using (client = new DocumentClient(new Uri(EndpointUrl), AuthKey))
{
var database = RetrieveOrCreateDatabaseAsync(DatabaseId).Result;
var collection = RetrieveOrCreateCollectionAsync(database.SelfLink,
CollectionId).Result;
var document = CreateDocumentAsync(client, collection.SelfLink, loggingEvent).Result;
}
}
catch (DocumentClientException de)
{
Exception baseException = de.GetBaseException();
Debug.Print("Status code {0} error occurred: {1}, Message: {2}", de.StatusCode,
de.Message, baseException.Message);
}
catch (Exception e)
{
Exception baseException = e.GetBaseException();
Debug.Print("Error: {0}, Message: {1}", e.Message, baseException.Message);
}
}
private async Task<Database> RetrieveOrCreateDatabaseAsync(string id)
{
// Try to retrieve the database (Microsoft.Azure.Documents.Database) whose Id is equal to databaseId
var database = client.CreateDatabaseQuery().Where(db => db.Id == DatabaseId).
AsEnumerable().FirstOrDefault();
// If the previous call didn't return a Database, it is necessary to create it
if (database == null)
{
database = await client.CreateDatabaseAsync(new Database { Id = DatabaseId }).ConfigureAwait(false);
Debug.Print("Created Database: id - {0} and selfLink - {1}", database.Id, database.SelfLink);
}
return database;
}
private async Task<DocumentCollection> RetrieveOrCreateCollectionAsync(string databaseSelfLink,
string id)
{
// Try to retrieve the collection (Microsoft.Azure.Documents.DocumentCollection) whose
// Id is equal to collectionId
var collection = client.CreateDocumentCollectionQuery(databaseSelfLink).
Where(c => c.Id == id).ToArray().FirstOrDefault();
// If the previous call didn't return a Collection, it is necessary to create it
if (collection == null)
{
collection = await client.CreateDocumentCollectionAsync(databaseSelfLink,
new DocumentCollection { Id = id }).ConfigureAwait(false);
}
return collection;
}
private async Task<Document> CreateDocumentAsync(DocumentClient client,
string collectionSelfLink, LoggingEvent loggingEvent)
{
var doc = await client.CreateDocumentAsync(collectionSelfLink,
loggingEvent).ConfigureAwait(false);
return doc;
}
}
My Web.config looks like this:
<log4net>
<!--This is for local testing. Comment out this appender when moved to azure cloud-->
<!--<appender name="RollingLogFileAppender" type="log4net.Appender.RollingFileAppender">
<file value=".\\Logs\\WebApi.log"/>
<appendToFile value="true"/>
<rollingStyle value="Size"/>
<maxSizeRollBackups value="10"/>
<maximumFileSize value="10MB"/>
<staticLogFileName value="true"/>
<layout type="log4net.Layout.PatternLayout">
<conversionPattern value="%-5p %d %5rms %-22.22c{1} %-18.25M - %m%n"/>
</layout>
</appender>-->
<appender name="DocumentDbAppender"
type="MyApp.Infra.Logger.DocumentDbAppender, MyApp.Infra.Logger">
<DatabaseId>Diagnostics</DatabaseId>
<CollectionId>Logs</CollectionId>
<EndpointUrl>https://mydb.documents.azure.com:443/</EndpointUrl>
<AuthKey>UB0w==</AuthKey>
<layout type="log4net.Layout.PatternLayout">
<conversionPattern value="%-5p %d %5rms %-22.22c{1} %-18.25M - %m%n"/>
<!--<conversionPattern value="%m"/>-->
</layout>
</appender>
I am creating the logger instance as below:
log4net.Util.LogLog.InternalDebugging = true;
log4net.Config.XmlConfigurator.Configure();
log4netLogger = log4net.LogManager.GetLogger(
System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
If I simply write an info message, it looks like this in Document DB:
{
"$type": "log4net.Core.LoggingEvent, log4net",
"LoggerName": MyApp.Infra.Logger.Log4NetLogger",
"Level": {
"$type": "log4net.Core.Level, log4net",
"Name": "INFO",
"Value": 40000,
"DisplayName": "INFO"
},
"Message": null,
"ThreadName": null,
"TimeStamp": "2017-03-20T15:57:17.7133358+05:30",
"LocationInfo": null,
"UserName": null,
"ExceptionString": null,
"Properties": null,
"Domain": null,
"Identity": null,
"id": "23071629-d896-4812-9a87-89871d969780"
}
Please note that Message is null. I am not sure why this is happening. If I use a RollingFileAppender, I am able to get the message without any issue.

From the link that you provided, we can see that the appender overrides the Append method and creates a document for the LoggingEvent object by calling CreateDocumentAsync inside Append. Please check whether the Exception object ex is null when you write the info message.
LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType).Info("info mes", ex);
Besides, please set a breakpoint inside the Append(LoggingEvent loggingEvent) method and watch the properties of loggingEvent before you call CreateDocumentAsync to create the document.
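If loggingEvent.Message really is null at that point, one likely explanation is that log4net only copies the message (and the other volatile fields such as ThreadName and ExceptionString) into the serialized form of a LoggingEvent after the event's volatile data has been fixed. Below is a minimal sketch of what could be tried in the appender, assuming the DocumentDB client simply serializes whatever object it is given; the logEntry property names are only illustrative.
protected override void Append(LoggingEvent loggingEvent)
{
    // Copy the volatile data (message, thread name, exception string, ...)
    // into the event before handing it to the DocumentDB client for
    // serialization; otherwise those fields come out as null.
    loggingEvent.Fix = FixFlags.All;

    // Alternatively, store a small document of your own instead of the raw
    // LoggingEvent (property names below are illustrative only):
    var logEntry = new
    {
        Level = loggingEvent.Level.DisplayName,
        Logger = loggingEvent.LoggerName,
        TimeStamp = loggingEvent.TimeStamp,
        Message = RenderLoggingEvent(loggingEvent) // rendered with the configured layout
    };

    // ... then create the document from loggingEvent (or logEntry) as before ...
}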

Azure Diagnostics wrt Custom Logs and honoring scheduledTransferPeriod

I have implemented my own TraceListener similar to http://blogs.technet.com/b/meamcs/archive/2013/05/23/diagnostics-of-cloud-services-custom-trace-listener.aspx .
One thing I noticed is that the logs show up immediately in my Azure Table Storage. I wonder if this is expected with custom trace listeners, or whether it is because I am in a development environment.
My diagnostics.wadcfg:
<?xml version="1.0" encoding="utf-8"?>
<DiagnosticMonitorConfiguration configurationChangePollInterval="PT1M" overallQuotaInMB="4096" xmlns="http://schemas.microsoft.com/ServiceHosting/2010/10/DiagnosticsConfiguration">
<DiagnosticInfrastructureLogs scheduledTransferLogLevelFilter="Information" />
<Directories scheduledTransferPeriod="PT1M">
<IISLogs container="wad-iis-logfiles" />
<CrashDumps container="wad-crash-dumps" />
</Directories>
<Logs bufferQuotaInMB="0" scheduledTransferPeriod="PT30M" scheduledTransferLogLevelFilter="Information" />
</DiagnosticMonitorConfiguration>
I have changed my approach a bit. Now I am defining the listener in the web.config of my web role. I notice that when I set autoflush to true in the web.config, everything works, but scheduledTransferPeriod is not honored because the flush method pushes to table storage immediately. I would like scheduledTransferPeriod to trigger the flush, or to trigger a flush after a certain number of log entries (when the buffer is full). Then I can also flush on server shutdown. Is there any method or event on the custom TraceListener where I can listen to the scheduledTransferPeriod?
<system.diagnostics>
<!--http://msdn.microsoft.com/en-us/library/sk36c28t(v=vs.110).aspx
By default autoflush is false.
By default useGlobalLock is true. While we try to be threadsafe, we keep this default for now. Later if we would like to increase performance we can remove this. see http://msdn.microsoft.com/en-us/library/system.diagnostics.trace.usegloballock(v=vs.110).aspx -->
<trace>
<listeners>
<add name="TableTraceListener"
type="Pos.Services.Implementation.TableTraceListener, Pos.Services.Implementation"
/>
<remove name="Default" />
</listeners>
</trace>
</system.diagnostics>
I have modified the custom trace listener to the following:
namespace Pos.Services.Implementation
{
class TableTraceListener : TraceListener
{
#region Fields
//connection string for azure storage
readonly string _connectionString;
//Custom sql storage table for logs.
//TODO put in config
readonly string _diagnosticsTable;
[ThreadStatic]
static StringBuilder _messageBuffer;
readonly object _initializationSection = new object();
bool _isInitialized;
CloudTableClient _tableStorage;
readonly object _traceLogAccess = new object();
readonly List<LogEntry> _traceLog = new List<LogEntry>();
#endregion
#region Constructors
public TableTraceListener() : base("TableTraceListener")
{
_connectionString = RoleEnvironment.GetConfigurationSettingValue("DiagConnection");
_diagnosticsTable = RoleEnvironment.GetConfigurationSettingValue("DiagTableName");
}
#endregion
#region Methods
/// <summary>
/// Flushes the entries to the storage table
/// </summary>
public override void Flush()
{
if (!_isInitialized)
{
lock (_initializationSection)
{
if (!_isInitialized)
{
Initialize();
}
}
}
var context = _tableStorage.GetTableServiceContext();
context.MergeOption = MergeOption.AppendOnly;
lock (_traceLogAccess)
{
_traceLog.ForEach(entry => context.AddObject(_diagnosticsTable, entry));
_traceLog.Clear();
}
if (context.Entities.Count > 0)
{
context.BeginSaveChangesWithRetries(SaveChangesOptions.None, (ar) => context.EndSaveChangesWithRetries(ar), null);
}
}
/// <summary>
/// Creates the storage table object. This class does not need to be locked because the caller is locked.
/// </summary>
private void Initialize()
{
var account = CloudStorageAccount.Parse(_connectionString);
_tableStorage = account.CreateCloudTableClient();
_tableStorage.GetTableReference(_diagnosticsTable).CreateIfNotExists();
_isInitialized = true;
}
public override bool IsThreadSafe
{
get
{
return true;
}
}
#region Trace and Write Methods
/// <summary>
/// Writes the message to a string buffer
/// </summary>
/// <param name="message">the Message</param>
public override void Write(string message)
{
if (_messageBuffer == null)
_messageBuffer = new StringBuilder();
_messageBuffer.Append(message);
}
/// <summary>
/// Writes the message with a line breaker to a string buffer
/// </summary>
/// <param name="message"></param>
public override void WriteLine(string message)
{
if (_messageBuffer == null)
_messageBuffer = new StringBuilder();
_messageBuffer.AppendLine(message);
}
/// <summary>
/// Appends the trace information and message
/// </summary>
/// <param name="eventCache">the Event Cache</param>
/// <param name="source">the Source</param>
/// <param name="eventType">the Event Type</param>
/// <param name="id">the Id</param>
/// <param name="message">the Message</param>
public override void TraceEvent(TraceEventCache eventCache, string source, TraceEventType eventType, int id, string message)
{
base.TraceEvent(eventCache, source, eventType, id, message);
AppendEntry(id, eventType, eventCache);
}
/// <summary>
/// Adds the trace information to a collection of LogEntry objects
/// </summary>
/// <param name="id">the Id</param>
/// <param name="eventType">the Event Type</param>
/// <param name="eventCache">the EventCache</param>
private void AppendEntry(int id, TraceEventType eventType, TraceEventCache eventCache)
{
if (_messageBuffer == null)
_messageBuffer = new StringBuilder();
var message = _messageBuffer.ToString();
_messageBuffer.Length = 0;
if (message.EndsWith(Environment.NewLine))
message = message.Substring(0, message.Length - Environment.NewLine.Length);
if (message.Length == 0)
return;
var entry = new LogEntry()
{
PartitionKey = string.Format("{0:D10}", eventCache.Timestamp >> 30),
RowKey = string.Format("{0:D19}", eventCache.Timestamp),
EventTickCount = eventCache.Timestamp,
Level = (int)eventType,
EventId = id,
Pid = eventCache.ProcessId,
Tid = eventCache.ThreadId,
Message = message
};
lock (_traceLogAccess)
_traceLog.Add(entry);
}
#endregion
#endregion
}
}
I looked at the source code in the blog post you referred to. If you notice, the code for the Details method calls Trace.Flush(), which essentially writes the trace log data collected so far to table storage. In other words, the custom trace listener is not picking up the scheduled transfer period from the diagnostics.wadcfg file at all.
At this point, I do not think there is a solution for leveraging the scheduledTransferPeriod with custom logs. I ended up living with the immediate transfer, as I wanted my own table schema. At some point I may write my own transfer interval.
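For what it is worth, a rough sketch of what "write my own transfer interval" could look like: reuse the listener from the question and flush it on a timer. The DiagFlushPeriodSeconds setting name below is made up for illustration.
class TimedTableTraceListener : TableTraceListener
{
    private readonly System.Threading.Timer _flushTimer;

    public TimedTableTraceListener()
    {
        // Hypothetical cloud service setting holding the transfer period in seconds.
        var seconds = double.Parse(
            RoleEnvironment.GetConfigurationSettingValue("DiagFlushPeriodSeconds"));
        var period = TimeSpan.FromSeconds(seconds);

        // Push whatever has been buffered to table storage on a fixed schedule
        // instead of waiting for an explicit Trace.Flush() call.
        _flushTimer = new System.Threading.Timer(_ => Flush(), null, period, period);
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            _flushTimer.Dispose();
            Flush(); // final flush on shutdown
        }
        base.Dispose(disposing);
    }
}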

NetDataContractSerializer produces invalid XML

My NetDataContractSerializer seems to be confused: The end of the XML appears twice:
<?xml version="1.0" encoding="utf-8"?>
<Project xmlns:i="http://www.w3.org/2001/XMLSchema-instance" z:Id="1"
[...]
<d2p1:anyType i:nil="true" />
</d2p1:_items>
<d2p1:_size>2</d2p1:_size>
<d2p1:_version>2</d2p1:_version>
</d2p1:items>
</ProjectParts>
<ProjectPath z:Id="31">D:\t10\</ProjectPath>
</Project>ze>
<d2p1:_version>3</d2p1:_version>
</d2p1:items>
<d2p1:_monitor xmlns:d7p1="http://schemas.datacontract.org/2004/07/System.Collections.ObjectModel" z:Id="33">
<d7p1:_busyCount>0</d7p1:_busyCount>
</d2p1:_monitor>
</Elements>
<Project z:Ref="1" i:nil="true" xmlns="http://schemas.datacontract.org/2004/07/Modules.WorkspaceManager.Types" />
</d2p1:anyType>
<d2p1:anyType i:nil="true" />
<d2p1:anyType i:nil="true" />
</d2p1:_items>
<d2p1:_size>2</d2p1:_size>
<d2p1:_version>2</d2p1:_version>
</d2p1:items>
</ProjectParts>
<ProjectPath z:Id="34">D:\t10\</ProjectPath>
</Project>
As you can see, there is some serious stammering going on. It happens occasionally and I can't reproduce the error. Any ideas? Could it be caused by the file being opened in VS while it's being written?
I serialize my object like this:
private void SerializeToFile(object objectToSerialize)
{
Stream stream = null;
try
{
stream = File.Open(_fileName, FileMode.OpenOrCreate, FileAccess.Write);
using (var writer = XmlWriter.Create(stream, new XmlWriterSettings { Indent = true }))
{
NetDataContractSerializer serializer = new NetDataContractSerializer();
serializer.WriteObject(writer, objectToSerialize);
}
}
finally
{
if (stream != null) stream.Close();
}
}
And the class serialized looks like this:
[DataContract(IsReference = true)]
public class Project : IProject
{
[DataMember] public string ProjectPath { get; set; }
[DataMember] public string ProjectName { get; set; }
[DataMember] public Collection<IProjectPart> ProjectParts { get; set; }
public T GetPart<T>() where T : IProjectPart
{
return ProjectParts.OfType<T>().First();
}
public void RegisterPart<T>(T part) where T : IProjectPart
{
if (ProjectParts.Any(p => p.GetType().IsInstanceOfType(part))) throw new InvalidOperationException("Part already registered.");
ProjectParts.Add(part);
part.Project = this;
}
public void Load()
{
foreach (var projectPart in ProjectParts)
{
projectPart.Load();
}
}
public void Unload()
{
foreach (var projectPart in ProjectParts)
{
projectPart.Unload();
}
}
public void Save()
{
foreach (var projectPart in ProjectParts)
{
projectPart.Save();
}
}
public Project()
{
ProjectParts = new Collection<IProjectPart>();
}
}
Thank you!
The issue is simple: when you serialize your object over and over, the IProjectPart collection can have a different size each time. The File.Open method does not clear the file of its previous content, so assume the following steps:
i) serialize the object with two IProjectPart instances - let's say it takes 10 lines of the XML file
ii) serialize the object again with one IProjectPart instance in the collection - this time it takes 8 lines of the XML file
iii) lines 9 and 10 are still filled with the old XML data, since they are not cleared between serialization attempts - so there is some duplicated, trash-looking XML data.
Try it for yourself; you will see exactly how those multiple tags are generated.
NOTE: The 8 and 10 lines are approximate values for my implementation.
NOTE 2: I suggest using a using statement for the stream inside the serialization method (as for all IDisposable objects):
private void SerializeToFile(object objectToSerialize)
{
using(var stream = File.Open(_fileName, FileMode.OpenOrCreate, FileAccess.Write))
{
using (var writer = XmlWriter.Create(stream, new XmlWriterSettings { Indent = true }))
{
NetDataContractSerializer serializer = new NetDataContractSerializer();
serializer.WriteObject(writer, objectToSerialize);
}
}
}
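Note that the using statement on its own does not remove the stale tail, because FileMode.OpenOrCreate keeps whatever the file already contained. One way to avoid the leftover bytes (a small sketch, not the only option) is to let the open call truncate the file by using FileMode.Create:
private void SerializeToFile(object objectToSerialize)
{
    // FileMode.Create creates the file or truncates an existing one, so no
    // bytes from a previous, longer serialization can survive.
    using (var stream = File.Open(_fileName, FileMode.Create, FileAccess.Write))
    using (var writer = XmlWriter.Create(stream, new XmlWriterSettings { Indent = true }))
    {
        var serializer = new NetDataContractSerializer();
        serializer.WriteObject(writer, objectToSerialize);
    }
}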

The given key was not present in the dictionary Plugin CRM 2011 online

Could anyone tell me what I am doing wrong? I've been trying for over a week now.
The error and my code are below.
Unexpected exception from plug-in (Execute):
Microsoft.Crm.Sdk.Samples.ProjectTotalAmount:
System.Collections.Generic.KeyNotFoundException: The given key was not
present in the dictionary.
namespace Microsoft.Crm.Sdk.Samples
{
public class ProjectTotalAmount : IPlugin
{
public void Execute(IServiceProvider serviceProvider)
{
Microsoft.Xrm.Sdk.IPluginExecutionContext context = (Microsoft.Xrm.Sdk.IPluginExecutionContext) serviceProvider.GetService(typeof(Microsoft.Xrm.Sdk.IPluginExecutionContext));
if (context.InputParameters.Contains("Target") &&
context.InputParameters["Target"] is Entity)
{
IOrganizationServiceFactory serviceFactory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
IOrganizationService service = serviceFactory.CreateOrganizationService(context.UserId);
//create a service context
var ServiceContext = new OrganizationServiceContext(service);
//ITracingService tracingService = localContext.TracingService;
Entity entity = (Entity)context.InputParameters["Target"];
if (entity.LogicalName == "new_project")
{
Guid projectGUID = ((EntityReference)entity["new_project"]).Id;
Entity a = service.Retrieve("new_project", ((EntityReference)entity["new_project"]).Id, new ColumnSet(true));
decimal totalAmount = 0;
try
{
//fetchxml to get the sum total of estimatedvalue
string new_amount_sum = string.Format(@"
<fetch distinct='false' mapping='logical' aggregate='true'>
<entity name='new_projectitem'>
<attribute name='new_amount' alias='new_amount' aggregate='sum' />
<filter type='and'>
<condition attribute='new_projectid' operator='eq' value='{0}' uiname='' uitype='' />
</filter>
</entity>
</fetch>", a.Id);
EntityCollection new_amount_sum_result = service.RetrieveMultiple(new FetchExpression(new_amount_sum));
foreach (var c in new_amount_sum_result.Entities)
{
totalAmount = ((Money)((AliasedValue)c["new_amount_sum"]).Value).Value;
}
//updating the field on the account
Entity acc = new Entity("new_project");
acc.Id = a.Id;
acc.Attributes.Add("new_amount", new Money(totalAmount));
service.Update(acc);
}
catch (FaultException ex)
{
throw new InvalidPluginExecutionException("An error occurred in the plug-in.", ex);
}
}
}
}
}
}
The settings for the plugin:
Post-validation
Synchronous execution mode
Server deployment
A few pointers to help you before we start looking at your code...
This error usually means your code is referring to an attribute that does not exist (or does not have a value).
You haven't said on which message your plugin is registered. This may affect the available parameters at runtime.
You've commented out your tracingService variable, but this can help you at least see how far your code has come. Reinstate it and add a few lines such as these to track your progress prior to the failure. This information will be written to the error log that is offered to you in the client-side exception dialog:
tracingService.Trace("Project Id is {0}", projectGUID);
and
tracingService.Trace("Number of returned records: {0}", new_amount_sum_result.Entities.Count);
The following line seems entirely redundant, since you are only using the Id from a and this already exists as entity.Id:
Entity a = service.Retrieve("new_project", ((EntityReference)entity["new_project"]).Id, new ColumnSet(true));
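To put the first and last pointers into code, here is a sketch only, based on the attribute names visible in the question (note that the fetch XML declares the aggregate alias as new_amount):
// Sketch: guard attribute access so a missing key cannot throw KeyNotFoundException.
if (entity.LogicalName == "new_project")
{
    // The record being created/updated is the project itself, so its id is
    // entity.Id; there is normally no "new_project" lookup on the record.
    Guid projectId = entity.Id;

    // The aggregate alias declared in the fetch XML is 'new_amount',
    // so read that alias rather than 'new_amount_sum'.
    foreach (var c in new_amount_sum_result.Entities)
    {
        var aliased = c.GetAttributeValue<AliasedValue>("new_amount");
        if (aliased != null && aliased.Value is Money)
        {
            totalAmount = ((Money)aliased.Value).Value;
        }
    }
}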

Log assembly version that contains the code that logs with log4net?

Is there a way to add the assembly version of the assembly that is emitting the logging event?
I don't think there is anything built-in that will do this, so you will probably need to create your own custom PatternConverter and PatternLayout (which is very easy).
The biggest problem is speed, since this will require log4net to generate the caller information and (in their own words):
WARNING Generating the caller class information is slow. Thus, its use should be avoided unless execution speed is not an issue.
If speed is not an issue, then something like this should work.
public class AssemblyVersionPatternConverter : log4net.Util.PatternConverter
{
protected override void Convert(TextWriter writer, object state)
{
var le = state as log4net.Core.LoggingEvent;
if (le != null)
writer.Write(GetAssemblyVersion(le.LocationInformation.ClassName));
}
}
public class AssemblyVersionPatternLayout : log4net.Layout.PatternLayout
{
public AssemblyVersionPatternLayout()
{
AddConverter( new ConverterInfo
{
Name = "AssemblyVersion",
Type = typeof(AssemblyVersionPatternConverter)
});
}
}
Apart from writing the GetAssemblyVersion method, that is all it takes to implement your own custom PatternConverter and PatternLayout.
You then have to change your log4net configuration file to tell it to use your custom classes.
E.g. if you have something like
<layout type="log4net.Layout.PatternLayout">
<conversionPattern value="%5level %message%newline" />
</layout>
you change it to something like
<layout type="MyNameSpace.AssemblyVersionPatternLayout, MyDllname">
<conversionPattern value="%5level %AssemblyVersion %message%newline" />
</layout>
The GetAssemblyVersion method could be something like the following:
private string GetAssemblyVersion(string className)
{
Type type = Type.GetType(className);
System.Reflection.Assembly assembly ;
if (type != null)
assembly = type.Assembly ;
else
assembly = AppDomain.CurrentDomain.GetAssemblies()
.FirstOrDefault(a => a.GetType(className) != null);
if (assembly == null)
return String.Empty;
return assembly.FullName.Split(',')[1].Replace("Version=", "");
}
Note, it is not recommended to use assembly.GetName().Version as this can throw a SecurityException if the account has very limited privileges.
Check this out: Log Event Context http://www.beefycode.com/post/Log4Net-Tutorial-pt-6-Log-Event-Context.aspx
At startup, set the ThreadContext properties you need:
class Program
{
    private static log4net.ILog Log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);

    static void Main(string[] args)
    {
        log4net.Config.XmlConfigurator.Configure();
        log4net.ThreadContext.Properties["myContext"] = "Logging from Main";
        // GetType() cannot be called from a static method; use typeof(Program) instead.
        log4net.ThreadContext.Properties["AssemblyVersion"] = typeof(Program).Assembly.GetName().Version;
        Log.Info("this is an info message");
        Console.ReadLine();
    }
}
Use these properties in your appender's conversionPattern:
<appender name="ConsoleAppender" type="log4net.Appender.ConsoleAppender">
<layout type="log4net.Layout.PatternLayout">
<conversionPattern value="%logger (%property{myContext}) (%property{AssemblyVersion}) [%level]- %message%newline" />
</layout>
</appender>

sharepoint custom workflow activity is doing nothing, not even log messages

I am trying to create a custom workflow activity which gives me the list of recipients that I am storing in a list. But once I deploy and start the workflow, nothing happens; not even log messages appear. So I tried to debug the code, but the breakpoints are not hit and I am getting the error "The breakpoint will not currently be hit. No symbols have been loaded for this document." Can anyone please help me deal with this issue?
Below are the steps I have followed in creating this activity:
1. Created a workflow activity library (please find my code file attached).
2. Added the .dll to the GAC.
3. Updated the web.config and WSS.actions files.
4. Now I see the action in the designer, so I have created a workflow using the designer.
5. Started the workflow manually on an item.
Here nothing is happening, not even an error. Please let me know if you need any further information.
Please find the code below.
using System;
using System.ComponentModel;
using System.ComponentModel.Design;
using System.Collections;
using System.Drawing;
using System.Linq;
using System.Workflow.ComponentModel.Compiler;
using System.Workflow.ComponentModel.Serialization;
using System.Workflow.ComponentModel;
using System.Workflow.ComponentModel.Design;
using System.Workflow.Runtime;
using System.Workflow.Activities;
using System.Workflow.Activities.Rules;
using Microsoft.SharePoint;
using System.Diagnostics;
using Microsoft.SharePoint.Workflow;
using Microsoft.SharePoint.WorkflowActions;
namespace CustomWorkflowActivityLibrary
{
public partial class CustomWorkflowActivity: SequenceActivity
{
SPList _list;
private EventLog _log;
SPFieldUserValueCollection objUserFieldValueCol;
string semailsettingKeyword1;
string semailsettingKeyword2;
public CustomWorkflowActivity()
{
InitializeComponent();
}
public static DependencyProperty __ContextProperty = DependencyProperty.Register("__Context", typeof(WorkflowContext), typeof(CustomWorkflowActivity));
[DescriptionAttribute("__Context")]
[BrowsableAttribute(true)]
[DesignerSerializationVisibilityAttribute(DesignerSerializationVisibility.Visible)]
public WorkflowContext __Context
{
get { return ((WorkflowContext)(base.GetValue(CustomWorkflowActivity.__ContextProperty))); }
set { base.SetValue(CustomWorkflowActivity.__ContextProperty, value); }
}
public static DependencyProperty ListIdProperty = DependencyProperty.Register("ListId", typeof(string), typeof(CustomWorkflowActivity));
[DescriptionAttribute("ListId")]
[BrowsableAttribute(true)]
[DesignerSerializationVisibilityAttribute(DesignerSerializationVisibility.Visible)]
public string ListId
{
get { return ((string)(base.GetValue(CustomWorkflowActivity.ListIdProperty))); }
set { base.SetValue(CustomWorkflowActivity.ListIdProperty, value); }
}
public static DependencyProperty ListItemProperty = DependencyProperty.Register("ListItem", typeof(int), typeof(CustomWorkflowActivity));
[DescriptionAttribute("ListItem")]
[BrowsableAttribute(true)]
[DesignerSerializationVisibilityAttribute(DesignerSerializationVisibility.Visible)]
public int ListItem
{
get { return ((int)(base.GetValue(CustomWorkflowActivity.ListItemProperty))); }
set { base.SetValue(CustomWorkflowActivity.ListItemProperty, value); }
}
private void codeActivity1_ExecuteCode(object sender, EventArgs e)
{
}
protected override ActivityExecutionStatus Execute(ActivityExecutionContext executionContext)
{
_log = new EventLog("Add Description");
_log.Source = "Share Point Workflows";
try
{
//Execute method as a elevated method
SPSecurity.CodeToRunElevated elevatedExecuteMethod = new SPSecurity.CodeToRunElevated(ExecuteMethod);
SPSecurity.RunWithElevatedPrivileges(elevatedExecuteMethod);
}
catch (Exception ex)
{
_log.WriteEntry("Error" + ex.Message.ToString(), EventLogEntryType.Error);
}
return ActivityExecutionStatus.Closed;
}
private void ExecuteMethod()
{
try
{
//retrieveing the Site object
SPSite _site = new SPSite(__Context.Site.Url);
//retrieveing the Web object
SPWeb _web = (SPWeb)(__Context.Web);
//retrieveing the list object
_list = _web.Lists[new Guid(this.ListId)];
//retrieveing the list item object
SPListItem _listItem = _list.GetItemById(this.ListItem);
_site.AllowUnsafeUpdates = true;
_web.AllowUnsafeUpdates = true;
string semailsubject = _listItem["E-Mail Subject"].ToString();
string semailfrom = _listItem["emailfrom"].ToString();
_log = new EventLog("get vendor info");
_log.WriteEntry("semailsubject");
_log.WriteEntry("semailfrom");
/* _listItem.Update();
_list.Update();
_site.AllowUnsafeUpdates = false;
_web.AllowUnsafeUpdates = false;*/
using (SPSite mysite = new SPSite("http://dlglobaltest.dl.com/Admin/IT/Application%20Development%20Group/LibraryEmailDistribution"))
{
using (SPWeb myweb = mysite.OpenWeb())
{
SPList settingsList = myweb.Lists["EmailDistributionSettings"];
SPQuery oQuery = new SPQuery();
oQuery.Query = "<Where><Eq><FieldRef Name='Sender' /><Value Type='Text'>" + semailfrom + "</Value></Eq></Where>";
SPListItemCollection ColListItems = settingsList.GetItems(oQuery);
foreach (SPListItem oListItem in ColListItems)
{
semailsettingKeyword1 = oListItem["Keyword1"].ToString();
semailsettingKeyword2 = oListItem["Keyword2"].ToString();
//SPFieldUserValue objUserFieldValue = new SPFieldUserValue(myweb, oListItem["Recipients"].ToString());
if ((semailsubject.Contains(semailsettingKeyword1)) || (semailsubject.Contains(semailsettingKeyword2)))
{
objUserFieldValueCol = new SPFieldUserValueCollection(myweb, oListItem["Recipients"].ToString());
_log = new EventLog(objUserFieldValueCol.ToString());
}
}
}
}
}
catch (Exception ex)
{ }
}
}
}
Web.Config:
<authorizedType Assembly="CustomWorkflowActivityLibrary, Version=1.0.0.0, Culture=neutral, PublicKeyToken=a95e146fc1062337" Namespace="CustomWorkflowActivityLibrary" TypeName="*" Authorized="True" />
WSS.Actions:
<Action Name="Get Recipients"
ClassName="CustomWorkflowActivityLibrary.CustomWorkflowActivity"
Assembly="CustomWorkflowActivityLibrary, Version=1.0.0.0, Culture=neutral, PublicKeyToken=a95e146fc1062337"
AppliesTo="all" Category="Custom">
<RuleDesigner Sentence="Get Recipients for %1 ">
<FieldBind Field="ListId,ListItem" Text="this list" Id="1" DesignerType="ChooseListItem" />
</RuleDesigner>
<Parameters>
<Parameter Name="__Context" Type="Microsoft.SharePoint.WorkflowActions.WorkflowContext" Direction="In" />
<Parameter Name="ListId" Type="System.String, mscorlib" Direction="In" />
<Parameter Name="ListItem" Type="System.Int32, mscorlib" Direction="In" />
</Parameters>
</Action>
Thanks,
I am not sure if this will help but you could try changing the following line in your code from this:
SPSecurity.RunWithElevatedPrivileges(elevatedExecuteMethod);
To this:
SPSecurity.RunWithElevatedPrivileges(delegate(){
ExecuteMethod();
});
Another shot-in-the-dark reply:
Try changing the class you're inheriting ( http://msdn.microsoft.com/en-us/library/ms173149(v=VS.80).aspx ) from SequenceActivity to Activity (SequenceActivity inherits from CompositeActivity, which itself inherits from Activity. See: http://msdn.microsoft.com/en-us/library/system.workflow.activities.sequenceactivity(v=VS.90).aspx )
If that doesn't work, try removing your constructor entirely. You should be able to use the base (Sequence)Activity constructor (since you're inheriting the class, not implementing it)
Hope that helps...
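Putting those two suggestions together, the activity declaration would look roughly like this (a sketch of the replies above, not a guaranteed fix; everything else in the class stays as in the question):
public partial class CustomWorkflowActivity : Activity // was SequenceActivity
{
    // No explicit constructor - the base Activity constructor is used.

    protected override ActivityExecutionStatus Execute(ActivityExecutionContext executionContext)
    {
        // Elevated call written as an anonymous delegate, per the first reply.
        // The event log setup and try/catch from the question are omitted for brevity.
        SPSecurity.RunWithElevatedPrivileges(delegate()
        {
            ExecuteMethod();
        });
        return ActivityExecutionStatus.Closed;
    }
}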
