How to test a failed Moq - c#-4.0

I have used a happy-path test to create a method, and now I am writing a null test for it.
I need to change the assert in the test method, but I have no clue how to go about it. I did some searching, but I only seem to find happy-path tests or tests against values returned from the main method. Is there a way to use Moq to verify that a call was not executed, or is the only way to have the method return a variable (a Boolean in this case)?
The method:
public void Upload(Data.RPADataEntity RPADataEntity)
{
    if (RPADataEntity != null)
    {
        // Give the RPA entity its first status and then insert it into the database.
        RPADataEntity.RPAStatusID = Convert.ToInt32(Enum.RPAStatusEnum.RPAStatus.FileInputDataUploaded);
        _IRPADataLayer.InsertRpaData(RPADataEntity);
    }
}
The test method:
[TestMethod]
public void TestUploadRPAEntityNull()
{
    // Arrange
    var fileinputtest = new FileInput();
    RPADataEntity RPADataEntity = null;

    // Act
    fileinputtest.Upload(RPADataEntity);

    // Assert
    _mockRepository.Verify(x => x.InsertRpaData(RPADataEntity));
}

This should do it:
_mockRepository.Verify(x => x.InsertRpaData(RPADataEntity), Times.Never());
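
For reference, a complete null-path test could look roughly like the sketch below. It assumes FileInput receives its data layer (here called IRPADataLayer, which is not shown in the question) through its constructor:

[TestMethod]
public void TestUploadRPAEntityNull()
{
    // Arrange - inject the mocked data layer (hypothetical constructor)
    var mockRepository = new Mock<IRPADataLayer>();
    var fileinputtest = new FileInput(mockRepository.Object);

    // Act
    fileinputtest.Upload(null);

    // Assert - the insert must never happen for a null entity
    mockRepository.Verify(x => x.InsertRpaData(It.IsAny<Data.RPADataEntity>()), Times.Never());
}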


What's the best way to override the project allocation process

Regarding the 'Run Allocations by Projects' process - I have a customization where I'd like to add conditions to the Allocation process so that it doesn't execute unless those conditions are met. I've added a checkbox user field to the 'Allocation Rules' screen (PM207500), and I'd like that field to be used (in combination with other criteria) to determine whether or not to actually execute the Allocation for that PMTask row.
I've overridden the PMAllocator.Execute method as follows, but it doesn't seem to be working properly. Here is the code I've come up with in an extension of the PMAllocator graph:
[PXOverride]
public virtual void Execute(List<PMTask> tasks)
{
    Base.PreselectAccountGroups();
    if (Base.PreSelectTasksTransactions(tasks))
    {
        foreach (PMTask task in tasks)
        {
            //Get the allocation id for the task...
            var pmalloc = (PMAllocation)PXSelect<PMAllocation,
                Where<PMAllocation.allocationID, Equal<Required<PMAllocation.allocationID>>>>.Select(Base, task.AllocationID);
            //if (pmalloc == null) return;

            //Get the cache extension / user field...
            var pmallocext = PXCache<PMAllocation>.GetExtension<PMAllocationExt>(pmalloc);
            if (pmallocext.UsrRunAfterProjectCompletion == true)
            {
                //Get the project...
                var pmproj = (PMProject)PXSelect<PMProject,
                    Where<PMProject.contractID, Equal<Required<PMProject.contractID>>>>.Select(Base, task.ProjectID);
                if (pmproj.ExpireDate > DateTime.Today || pmproj.ExpireDate == null)
                {
                    //do nothing..
                }
                else
                {
                    Base.Execute(task, false);
                }
            }
            else
            {
                Base.Execute(task, false);
            }
        }
    }
}
But I'm not sure if this is the correct way to do it. It seems to be adding an extra allocation line, and I'm not even sure whether the base Execute method still runs when I don't explicitly call it here.
Can someone point out the best way of accomplishing this?
When you use just [PXOverride], the base Execute(...) method will be called before yours.
To replace the base method, you should specify an additional parameter - a delegate. In your case it can look like this:
public delegate void ExecuteDelegate(List<PMTask> tasks);

[PXOverride]
public virtual void Execute(List<PMTask> tasks, ExecuteDelegate BaseExecute)
{
    // ... your code
}
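
For illustration only, a sketch of how the override from the question could then filter tasks and run the base allocation through the delegate. ShouldAllocate is a hypothetical helper standing in for the checkbox / expire-date checks shown in the question:

[PXOverride]
public virtual void Execute(List<PMTask> tasks, ExecuteDelegate BaseExecute)
{
    // Collect only the tasks that pass the custom conditions.
    var tasksToAllocate = new List<PMTask>();
    foreach (PMTask task in tasks)
    {
        if (ShouldAllocate(task)) // hypothetical helper with the PMAllocationExt / PMProject checks
            tasksToAllocate.Add(task);
    }

    // The base allocation now runs only when, and for what, you choose.
    if (tasksToAllocate.Count > 0)
        BaseExecute(tasksToAllocate);
}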

Get Result of System.Threading.Tasks.Task (ApplicationUser MVC)

I just want to get my ApplicationUser in MVC. I'm trying this code:
public async System.Threading.Tasks.Task<ApplicationUser> GetApplicationUser()
{
    return await _userManager.GetUserAsync(HttpContext.User);
}

var user = GetApplicationUser();
var user2 = user.Result;
However, when I try to access user.Result I get an exception:
"One or more errors occurred. (Object reference not set to an instance of an object.)"
"Object reference not set to an instance of an object."
I'm not sure how to access the result.
Make sure your controller action is an async Task, like so:
public async Task<IActionResult> Test()
{
    ApplicationUser user = await GetCurrentUserAsync();
    if (user != null)
    {
        // do more stuff here
    }
    // do more stuff here
    return View();
}

private Task<ApplicationUser> GetCurrentUserAsync()
{
    return _userManager.GetUserAsync(HttpContext.User);
}
From the code you posted (without any exception details), I would guess that the problem is that _userManager is null.
However, you'll run into another problem if you fix that. Specifically, you should be using await instead of Result; the latter can cause deadlocks.
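For illustration, a minimal sketch of the awaited version, assuming a standard MVC controller action and the GetApplicationUser helper from the question:

public async Task<IActionResult> Index()
{
    // await the task rather than blocking with .Result or .Wait()
    ApplicationUser user = await GetApplicationUser();
    // ... use user here ...
    return View(user);
}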
I had to call the .Wait() function manually, and then I could access the .Result successfully. In addition, my HttpContext was null, so I had to use it inside of Index(), like
public IActionResult Index()
{
    DoSomethingWith(HttpContext);
    return View();
}
and
System.Threading.Tasks.Task<ApplicationUser> user = GetApplicationUser(context);
user.Wait();
var user2 = user.Result;

Groovy addBatch/executeBatch with autoGenerated keys

Has anyone retrieved the auto-generated keys for a database insert while using Groovy SQL's withBatch method? I have the following code
def Sql target = ... // database connection
target.withBatch { ps ->
    insertableStuff.each { ps.addBatch(it) }
    ps.executeBatch()
    def results = ps.getGeneratedKeys() // what do I do with this?
}
We're using DB2, and I've successfully tested the getGeneratedKeys method with a single statement/result set, but once I wrap the process in a batch, I'm not sure what objects I'm dealing with anymore.
According to IBM, it is possible to get the results back, but their example uses standard JDBC objects, not the Groovy ones. Any ideas?
I took the Groovy SQL stuff out of the picture to see if I could get something working. I wanted to make sure that DB2 for z/OS actually supported the function, and I was able to get the generated values. I used IBM's example, but I had to add some extra code to handle the casting that the IBM example relies on.
Sql target = ... // get database connection
def preparedStatement = target.connection.prepareStatement(statement, ['ISN'] as String[])
ResultSet[] resultSets = ((DB2PreparedStatement) (preparedStatement.getDelegate().getDelegate())).getDBGeneratedKeys()
resultSets.each { ResultSet results ->
    while (results.next()) {
        println results.getInt(1)
    }
}
So... that's a little clunky, but it's functional. Unfortunately, by controlling the statement myself, I lost all of the parameter mapping that Groovy normally does for me.
I was looking through the groovy Sql source code and can see where they are explicitly telling the database connection not to handle parameters, so I'm thinking I'll add a new method to Sql.metaClass that can pass in a list of the auto-generated column names or something to make this more palatable.
I also want to see if there's a way to get the getGeneratedKeys method working so that I don't have to do all of that casting. At the very least, a utility method to safely handle the casting for me.
try {
    withinBatch = true;
    PreparedStatement statement = (PreparedStatement) getAbstractStatement(new CreatePreparedStatementCommand(0), connection, sql);
    configure(statement);
    psWrapper = new BatchingPreparedStatementWrapper(statement, indexPropList, batchSize, LOG, this);
    closure.call(psWrapper);
    return psWrapper.executeBatch();
} catch (SQLException e) {
The CreatePreparedStatementCommand(0) prevents the creation of a statement that could return the auto-generated keys.
Just to make sure I wasn't crazy, I re-tried the getGeneratedKeys method again with a statement that I know works, and I got no results (see below). I had to recursively spin through the delegates to find the IBM class. So... it's not my favorite code and it's pretty brittle, but it's functional. Now I just need to see if I can still use the withBatch method somehow; I'll obviously need to override some things.
println 'print using getGeneratedKeys'
def results = preparedStatement.getGeneratedKeys()
while (results.next()) {
    println SqlGroovyMethods.toRowResult(results)
}

println 'print using delegate processing'
println getGeneratedKeys(preparedStatement)

private List getGeneratedKeys(PreparedStatement statement) {
    switch (statement) {
        case DelegatingStatement:
            return getGeneratedKeys(DelegatingStatement.cast(statement).getDelegate())
        case DB2PreparedStatement:
            ResultSet[] resultSets = DB2PreparedStatement.cast(statement).getDBGeneratedKeys()
            List keys = []
            resultSets.each { ResultSet results ->
                while (results.next()) {
                    keys << SqlGroovyMethods.toRowResult(results)
                }
            }
            return keys
        default:
            return [SqlGroovyMethods.toRowResult(statement.getGeneratedKeys())]
    }
}
---- Console Output ----
print using getGeneratedKeys
print using delegate processing
[[KEY:7391], [KEY:7392]]
Okay, got it working. I had to hack my way into the Groovy Sql class, and there are some things I just couldn't do because the methods in the Groovy class were private, so this implementation doesn't support cached statements, the isWithinBatch method won't operate correctly inside the closure, and there's no access to the number of rows that were updated.
It'd be nice to see some variation of this in the base Groovy code, perhaps with an extension point where you plug in your own handler (since you wouldn't want the IBM-specific stuff in the base Groovy code), but at least I have a workable solution now.
public class SqlWithGeneratedKeys extends Sql {

    public SqlWithGeneratedKeys(Sql parent) {
        super(parent);
    }

    public List<GroovyRowResult> withBatch(String pSql, String[] keys, Closure closure) throws SQLException {
        return this.withBatch(0, pSql, keys, closure);
    }

    public List<GroovyRowResult> withBatch(int batchSize, String pSql, String[] keys, Closure closure) throws SQLException {
        final Connection connection = this.createConnection();
        List<Tuple> indexPropList = null;
        final SqlWithParams preCheck = this.buildSqlWithIndexedProps(pSql);
        BatchingPreparedStatementWrapper psWrapper = null;
        String sql = pSql;
        if (preCheck != null) {
            indexPropList = new ArrayList<Tuple>();
            for (final Object next : preCheck.getParams()) {
                indexPropList.add((Tuple) next);
            }
            sql = preCheck.getSql();
        }
        PreparedStatement statement = null;
        try {
            statement = connection.prepareStatement(sql, keys);
            this.configure(statement);
            psWrapper = new BatchingPreparedStatementWrapper(statement, indexPropList, batchSize, LOG, this);
            closure.call(psWrapper);
            psWrapper.executeBatch();
            return this.getGeneratedKeys(statement);
        } catch (final SQLException e) {
            LOG.warning("Error during batch execution of '" + sql + "' with message: " + e.getMessage());
            throw e;
        } finally {
            BaseDBServices.closeDBElements(connection, statement, null);
        }
    }

    protected List<GroovyRowResult> getGeneratedKeys(Statement statement) throws SQLException {
        if (statement instanceof DelegatingStatement) {
            return this.getGeneratedKeys(DelegatingStatement.class.cast(statement).getDelegate());
        } else if (statement instanceof DB2PreparedStatement) {
            final ResultSet[] resultSets = DB2PreparedStatement.class.cast(statement).getDBGeneratedKeys();
            final List<GroovyRowResult> keys = new ArrayList<GroovyRowResult>();
            for (final ResultSet results : resultSets) {
                while (results.next()) {
                    keys.add(SqlGroovyMethods.toRowResult(results));
                }
            }
            return keys;
        }
        return Arrays.asList(SqlGroovyMethods.toRowResult(statement.getGeneratedKeys()));
    }
}
Calling it is nice and clean.
println new SqlWithGeneratedKeys(target).withBatch(statement, ['ISN'] as String[]) { ps ->
    rows.each {
        ps.addBatch(it)
    }
}

first steps with FakeItEasy and problems with Action type

I have the following (here simplified) code which I want to test with FakeItEasy.
public class ActionExecutor : IActionExecutor
{
    public void TransactionalExecutionOf(Action action)
    {
        try
        {
            // ...
            action();
            // ...
        }
        catch
        {
            // ...
            Rollback();
        }
    }

    public void Commit()
    { }

    public void Rollback()
    { }
}

public class Service : IService
{
    private readonly IRepository _repository;
    private readonly IActionExecutor _actionExecutor;

    // ctor for CI

    public void ServiceMethod(string name)
    {
        _actionExecutor.TransactionalExecutionOf(() =>
        {
            var item = _repository.FindByName(ItemSpecs.FindByNameSpec(name));
            if (item == null) throw new ServiceException("Item not found");
            item.DoSomething();
            _actionExecutor.Commit();
        });
    }
}
I want to test that the ServiceException is thrown, so I set up my test like this:
var repo = A.Fake<IRepository>();
A.CallTo(() => repo.FindByName(A<ISpec<Item>>.Ignored))
    .Returns(null);

var executor = A.Fake<IActionExecutor>();
executor.Configure()
    .CallsTo(x => x.Rollback()).DoesNothing();
executor.Configure()
    .CallsTo(x => x.Commit()).DoesNothing();
executor.Configure()
    .CallsTo(x => x.TransactionalExecutionOf(A<Action>.Ignored))
    .CallsBaseMethod();
With the following code
var service = new Service(executor, repo);
service.ServiceMethod("notExists")
    .Throws(new ServiceException());
I get the following message
The current proxy generator can not intercept the specified method
for the following reason:
- Sealed methods can not be intercepted.
If I call the method directly on the service like
var service = new Service(executor, repo);
service.ServiceMethod("NotExists");
I get this message
This is a DynamicProxy2 error: The interceptor attempted to 'Proceed'
for method 'Void TransactionalExecutionOf(System.Action)' which has no
target. When calling method without target there is no implementation
to 'proceed' to and it is the responsibility of the interceptor to
mimic the implementation (set return value, out arguments etc)
Now I am a bit confused and don't know what to do next.
The problem comes from the way you create the fake and what you later expect it to do:
var executor = A.Fake<IActionExecutor>();
// ...
executor.Configure()
    .CallsTo(x => x.TransactionalExecutionOf(A<Action>.Ignored))
    .CallsBaseMethod();
What base method? FakeItEasy has no idea what the base class is, hence the DynamicProxy2 exception in your second case. You can create a partial mock this way:
var executor = A.Fake<ActionExecutor>();
Note that we're now basing the fake on the actual implementation, not the interface anymore.
This however introduces a new set of problems, as the methods on ActionExecutor are not virtual and therefore the interceptor cannot hook into them - well, intercept them. To make your current setup work, you'll have to change your ActionExecutor and make (all) its methods virtual, as sketched below.
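For illustration, that change would look roughly like this (bodies elided):

public class ActionExecutor : IActionExecutor
{
    // virtual members can be intercepted by the proxy behind a partial fake
    public virtual void TransactionalExecutionOf(Action action) { /* ... */ }
    public virtual void Commit() { }
    public virtual void Rollback() { }
}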
However, you may (or even should) want to avoid modifying existing code (which sometimes might not even be an option). You could then set up your IActionExecutor fake like this:
var executor = A.Fake<IActionExecutor>();
A.CallTo(() => executor.TransactionalExecutionOf(A<Action>.Ignored))
    .Invokes(f => new ActionExecutor()
        .TransactionalExecutionOf((Action)f.Arguments.First())
    );
This will allow you to work on the faked object, with the exception of the call to TransactionalExecutionOf, which will be redirected to the actual implementation.
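
As a small follow-up sketch, you can still verify the interaction on the fake after exercising the service (MustHaveHappened is FakeItEasy's verification call; the service wiring follows the question):

var service = new Service(executor, repo);
service.ServiceMethod("notExists");

// the service must have routed its work through the executor
A.CallTo(() => executor.TransactionalExecutionOf(A<Action>.Ignored))
    .MustHaveHappened();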

Playframework Excel file generation

I've installed the excel module in order to generate reports from data recorded by my application in the database.
It works fine: I can create a report simply by clicking a link on my main page and rendering into an Excel template.
But I'd rather generate the Excel file periodically (using a job) and save it into a shared folder, without any human action (so not by clicking a link).
It's like I want to trigger the associated controller to render into my template automatically.
Does anyone have any tips for me?
So the problem is you can't pass some parameters into the job, or...?
Using something like this just doesn't work?
@On("0 45 4-23 ? * MON-FRI")
public class ExcelJob extends Job {
    public void doJob() {
        // generate excel
    }
}
I wrote my own Excel generator using JExcel, and I use it for scheduled generation without a problem. It also doesn't require a template, because the report structure is derived from annotations. This is roughly 20 lines of code - you may want to try it for yourself.
This is really rough and lacks good user feedback, but gives you the idea...
Excel generator - not Play-specific in any way
public class ExcelGenerator
{
    public void generateReport(Function successCallback,
                               Function failureCallback)
    {
        try
        {
            byte[] report = null; // generate your report somehow
            successCallback.execute(report);
        }
        catch (Exception e)
        {
            failureCallback.execute(e.getMessage());
        }
    }
}
A function interface for callbacks (very basic)
public interface Function
{
    public void execute(Object... args);
}
Your Play controller
public class MyController extends Controller
{
    public static void index()
    {
        render();
    }

    public static void createReport()
    {
        Function failureCallback = new Function()
        {
            public void execute(Object... args)
            {
                flash.error((String) args[0]);
                index();
            }
        };

        Function successCallback = new Function()
        {
            public void execute(Object... args)
            {
                renderBinary((byte[]) args[0]);
            }
        };

        ExcelGenerator excelGenerator = new ExcelGenerator();
        excelGenerator.generateReport(successCallback, failureCallback);
    }
}
Finally, re-use the ExcelGenerator from your job
public class MyJob extends Job
{
    public void doJob()
    {
        Function failureCallback = new Function()
        {
            public void execute(Object... args)
            {
                Logger.error((String) args[0]);
            }
        };

        Function successCallback = new Function()
        {
            public void execute(Object... args)
            {
                byte[] report = (byte[]) args[0];
                // write report to disk
            }
        };

        ExcelGenerator excelGenerator = new ExcelGenerator();
        excelGenerator.generateReport(successCallback, failureCallback);
    }
}
You'll still need to write your own report generator, or refactor the existing excel module to provide what you need.
So if you want to run and manage several jobs, you can do something like this:
List<F.Promise> promises = new ArrayList<F.Promise>();
for (int i = 0; i < 10; i++) {
    SendingMessageJob sendingMessageJob = new SendingMessageJob();
    promises.add(sendingMessageJob.now());
}

boolean allDone = false;
while (!allDone) {
    allDone = true;
    for (F.Promise promise : promises) {
        if (!promise.isDone()) {
            allDone = false;
            break;
        }
    }
}
// when we arrive here, all jobs have finished their work
You can check the Play documentation, specifically the section on jobs, where you'll see examples on how to create automatically triggered methods. This should solve your issue.
EDIT (update on comment):
You can manually trigger a job, do this:
new MyExcelGeneratorJob().doJob();
The thing is, Play is stateless, so the job should use data from the database. Instead of trying to pass parameters from your request into the job (that won't work), try to store that data in a staging area in the database that the job then loads and processes to generate the Excel file.
