Why is Parallel.Invoke not working in this case? - c#-4.0

I have an array of files like this:
string[] unZippedFiles;
The idea is that I want to parse these files in parallel. As they are parsed, a record gets placed on a ConcurrentBag. As records are placed, I want to kick off the update function.
Here is what I am doing in my Main():
foreach (var file in unZippedFiles)
{
    Parallel.Invoke
    (
        () => ImportFiles(file),
        () => UpdateTest()
    );
}
This is what the Update code looks like:
static void UpdateTest()
{
    Console.WriteLine("Updating/Inserting merchant information.");
    while (!merchCollection.IsEmpty || producingRecords)
    {
        merchant x;
        if (merchCollection.TryTake(out x))
        {
            UPDATE_MERCHANT(x.m_id, x.mInfo, x.month, x.year);
        }
    }
}
This is what the import code looks like. It's pretty much a giant string parser.
static void ImportFiles(string fileName)
{
    System.IO.StreamReader SR = new System.IO.StreamReader(fileName);
    long COUNTER = 0;
    StringBuilder contents = new StringBuilder();
    string M_ID = "";
    string BOF_DELIMITER = "%%MS_SKEY_0000_000_PDF:";
    string EOF_DELIMITER = "%%EOF";
    try
    {
        record_count = 0;
        producingRecords = true;
        for (COUNTER = 0; COUNTER <= SR.BaseStream.Length - 1; COUNTER++)
        {
            if (SR.EndOfStream)
            {
                break;
            }
            contents.AppendLine(Strings.Trim(SR.ReadLine()));
            contents.AppendLine(System.Environment.NewLine);
            //contents += Strings.Trim(SR.ReadLine());
            //contents += Strings.Chr(10);
            if (contents.ToString().IndexOf(EOF_DELIMITER) > -1)
            {
                if (contents.ToString().StartsWith(BOF_DELIMITER) & contents.ToString().IndexOf(EOF_DELIMITER) > -1)
                {
                    string data = contents.ToString();
                    M_ID = data.Substring(data.IndexOf("_M") + 2, data.Substring(data.IndexOf("_M") + 2).IndexOf("_"));
                    Console.WriteLine("Merchant: " + M_ID);
                    merchant newmerch;
                    newmerch.m_id = M_ID;
                    newmerch.mInfo = data.Substring(0, (data.IndexOf(EOF_DELIMITER) + 5));
                    newmerch.month = DateTime.Now.AddMonths(-1).Month;
                    newmerch.year = DateTime.Now.AddMonths(-1).Year;
                    //Update(newmerch);
                    merchCollection.Add(newmerch);
                }
                contents.Clear();
                //GC.Collect();
            }
        }
        SR.Close();
        // UpdateTest();
    }
    catch (Exception ex)
    {
        producingRecords = false;
    }
    finally
    {
        producingRecords = false;
    }
}
The problem I am having is that Update runs once and then the ImportFiles function just takes over and does not yield to the update function. Any ideas on what I am doing wrong would be of great help.

Here's my stab at fixing your thread synchronisation. The root problem is a race: if UpdateTest checks the collection before ImportFiles has set producingRecords = true, the while loop sees an empty collection and a false flag and exits immediately, which is why your update only runs once. A BlockingCollection removes the need for the flag entirely. Note that I haven't changed any of the code from the functional standpoint (with the exception of taking out the catch - swallowing exceptions is generally a bad idea; they need to be propagated).
Forgive me if something doesn't compile - I'm writing this based on incomplete snippets.
Main
foreach (var file in unZippedFiles)
{
    using (var merchCollection = new BlockingCollection<merchant>())
    {
        Parallel.Invoke
        (
            () => ImportFiles(file, merchCollection),
            () => UpdateTest(merchCollection)
        );
    }
}
Update
private void UpdateTest(BlockingCollection<merchant> merchCollection)
{
    Console.WriteLine("Updating/Inserting merchant information.");
    foreach (merchant x in merchCollection.GetConsumingEnumerable())
    {
        UPDATE_MERCHANT(x.m_id, x.mInfo, x.month, x.year);
    }
}
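GetConsumingEnumerable() blocks until an item is available and ends on its own once the producer calls CompleteAdding(), so the producingRecords flag and the busy-wait loop disappear entirely.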
Import
Don't forget to pass in merchCollection as a parameter - it should not be static.
private void ImportFiles(string fileName, BlockingCollection<merchant> merchCollection)
{
    System.IO.StreamReader SR = new System.IO.StreamReader(fileName);
    long COUNTER = 0;
    StringBuilder contents = new StringBuilder();
    string M_ID = "";
    string BOF_DELIMITER = "%%MS_SKEY_0000_000_PDF:";
    string EOF_DELIMITER = "%%EOF";
    try
    {
        record_count = 0;
        for (COUNTER = 0; COUNTER <= SR.BaseStream.Length - 1; COUNTER++)
        {
            if (SR.EndOfStream)
            {
                break;
            }
            contents.AppendLine(Strings.Trim(SR.ReadLine()));
            contents.AppendLine(System.Environment.NewLine);
            //contents += Strings.Trim(SR.ReadLine());
            //contents += Strings.Chr(10);
            if (contents.ToString().IndexOf(EOF_DELIMITER) > -1)
            {
                if (contents.ToString().StartsWith(BOF_DELIMITER) & contents.ToString().IndexOf(EOF_DELIMITER) > -1)
                {
                    string data = contents.ToString();
                    M_ID = data.Substring(data.IndexOf("_M") + 2, data.Substring(data.IndexOf("_M") + 2).IndexOf("_"));
                    Console.WriteLine("Merchant: " + M_ID);
                    merchant newmerch;
                    newmerch.m_id = M_ID;
                    newmerch.mInfo = data.Substring(0, (data.IndexOf(EOF_DELIMITER) + 5));
                    newmerch.month = DateTime.Now.AddMonths(-1).Month;
                    newmerch.year = DateTime.Now.AddMonths(-1).Year;
                    //Update(newmerch);
                    merchCollection.Add(newmerch);
                }
                contents.Clear();
                //GC.Collect();
            }
        }
        SR.Close();
        // UpdateTest();
    }
    finally
    {
        merchCollection.CompleteAdding();
    }
}
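For completeness, the merchant type referenced throughout would need to carry the four fields the parser assigns - a sketch inferred from the code above:
struct merchant
{
    public string m_id;
    public string mInfo;
    public int month;
    public int year;
}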

Related

REVIT Transfer floor sketch to Void Extrusion in Family

I'm struggling with some Revit code to copy the profile of a floor and use it as the sketch profile for a void extrusion in a family.
Here is the full SharpDevelop code. It half works in my custom project template; when I try to use it in an out-of-the-box project generated from the Revit default template, it gives the error "a managed exception was thrown by revit or by one of its external applications".
In my template it cannot properly split the curves into a secondary array. It says the array elements are being changed, but when the loop runs again element i is back to its original content. The TaskDialog clearly says the elements have changed, until the loop iterates again.
Full code: to work, it requires a generic family with the name "Void - Custom" to be in the project. The "if found" section near the bottom, in the last page and a half of code, is where the for loop is not behaving as expected.
/*
* Created by SharpDevelop.
* User: arautio
* Date: 4/30/2019
* Time: 11:10 AM
*
* To change this template use Tools | Options | Coding | Edit Standard Headers.
*/
using System;
using Autodesk.Revit.UI;
using Autodesk.Revit.DB;
using Autodesk.Revit.DB.Architecture;
using Autodesk.Revit.DB.Structure;
using Autodesk.Revit.UI.Selection;
using System.Collections.Generic;
using System.Linq;
using Autodesk.Revit.ApplicationServices;
using Autodesk.Revit.Attributes;
using System.Text;
using System.IO;
using System.Diagnostics;
namespace ARC
{
[Autodesk.Revit.Attributes.Transaction(Autodesk.Revit.Attributes.TransactionMode.Manual)]
[Autodesk.Revit.DB.Macros.AddInId("3411F411-6FC1-4A4D-9CFD-37ABB2028A15")]
public partial class ThisApplication
{
private void Module_Startup(object sender, EventArgs e)
{
}
private void Module_Shutdown(object sender, EventArgs e)
{
}
#region Revit Macros generated code
private void InternalStartup()
{
this.Startup += new System.EventHandler(Module_Startup);
this.Shutdown += new System.EventHandler(Module_Shutdown);
}
#endregion
public void FloorGrating()
{
StringBuilder sb = new StringBuilder();
Dictionary<Floor, List<ModelCurve>> dict_SketchLines = new Dictionary<Floor, List<ModelCurve>>();
UIDocument uidoc = this.ActiveUIDocument;
Document document = uidoc.Document;
View activev = document.ActiveView;
ElementId levelId = null;
levelId = activev.LevelId;
Element levelem = document.GetElement( levelId );
Level lev = document.ActiveView.GenLevel;
Reference refsel = uidoc.Selection.PickObject(ObjectType.Element, "Select Floor to Add Grating To");
Element elem = document.GetElement(refsel.ElementId);
Floor f = elem as Floor;
List<ElementId> _deleted = null;
using (Transaction t = new Transaction(document, "temp"))
{
t.Start();
document.Regenerate();
_deleted = document.Delete(elem.Id).ToList();
t.RollBack();
}
bool SketchLinesFound = false;
List<ModelCurve> _sketchCurves = new List<ModelCurve>();
foreach (var id in _deleted)
{
ModelCurve mc = document.GetElement(id) as ModelCurve;
if (mc != null)
{
_sketchCurves.Add(mc);
SketchLinesFound = true;
}
else
{
if (SketchLinesFound) break;
}
}
dict_SketchLines.Add(f, _sketchCurves);
foreach (Floor key in dict_SketchLines.Keys)
{
List<ModelCurve> _curves = dict_SketchLines[key];
sb.AppendLine(string.Format("floor {0} has sketchlines:", key.Id));
foreach (ModelCurve mc in _curves)
{
sb.AppendLine(string.Format("{0} <{1}>", mc.GetType(), mc.Id));
sb.AppendLine(string.Format("<{0}>", mc.GeometryCurve.IsBound.ToString()));
if (mc.GetType().ToString() == "Autodesk.Revit.DB.ModelArc" && mc.GeometryCurve.IsBound == false)
{
TaskDialog.Show("Revit", "Circle Found");
}
try
{
sb.AppendLine(string.Format("<{0} -- {1}>", mc.GeometryCurve.GetEndPoint(0), mc.GeometryCurve.GetEndPoint(1)));
}
catch
{
}
}
sb.AppendLine();
}
//TaskDialog.Show("debug", sb.ToString());
Document docfamily;
Family fam;
string ftitle = document.Title;
string fpath = document.PathName;
int ftitlelen = ftitle.Length + 4;
int fpathlen = fpath.Length;
int finpathlen = fpathlen - ftitlelen;
string sfinpath = fpath.Substring(0,finpathlen);
string famname = "GratingVoid";
string fext = ".rfa";
int counter = 1;
while (counter < 100)
{
famname = ("GratingVoid" + counter as String);
Family family = FindElementByName(document,typeof(Family),famname)as Family;
if( null == family )
{
sfinpath = (sfinpath + famname + fext);
counter = 1000;
}
counter += 1;
}
FilteredElementCollector collector0 = new FilteredElementCollector(document);
ICollection<Element> collection0 = collector0.WhereElementIsNotElementType().ToElements();
List<FamilySymbol> fsym0 = new FilteredElementCollector(document).OfClass(typeof(FamilySymbol)).Cast<FamilySymbol>().ToList();
FamilySymbol famsymb0 = null;
foreach (FamilySymbol symb in fsym0)
{
if (symb.Name == "Void - Custom")
{
famsymb0 = symb as FamilySymbol;
}
}
fam = famsymb0.Family;
docfamily = document.EditFamily(fam);
try
{
docfamily.SaveAs(sfinpath);
}
catch
{
TaskDialog.Show("Revit", "Could Not Save Void Family");
}
using (Transaction trans = new Transaction(docfamily))
{
trans.Start("family");
bool circleflag = false;
ElementId delid = null;
FilteredElementCollector collector = new FilteredElementCollector( docfamily );
foreach(Element element in collector.OfClass(typeof(GenericForm)))
{
delid = element.Id;
}
docfamily.Delete(delid);
CurveArray loccurva = new CurveArray();
foreach (Floor key in dict_SketchLines.Keys)
{
List<ModelCurve> _curves = dict_SketchLines[key];
foreach (ModelCurve mc in _curves)
{
if (mc.GetType().ToString() == "Autodesk.Revit.DB.ModelArc" && mc.GeometryCurve.IsBound == false)
{
circleflag = true;
}
LocationCurve lcurve = mc.Location as LocationCurve;
Curve c = lcurve.Curve as Curve;
loccurva.Append(c);
}
}
try
{
if (circleflag == true && loccurva.Size == 2)
{
Curve tempc;
if (loccurva.get_Item(0).GetType().ToString() == "Autodesk.Revit.DB.Arc")
{
tempc = loccurva.get_Item(0);
}
else
{
tempc = loccurva.get_Item(1);
}
loccurva.Clear();
loccurva.Append(tempc);
}
CurveArrArray newcurarr = new CurveArrArray();
newcurarr.Append(loccurva);
SortCurvesContiguousArray(newcurarr);
TaskDialog.Show("Revit CurveArray Array Size" , newcurarr.Size.ToString());
foreach (CurveArray ca in newcurarr)
{
TaskDialog.Show("Revit CurveArray within Array Size" , ca.Size.ToString());
}
// Below is edited for error control - leaving out the secondary loops for now
CurveArrArray switcharr = new CurveArrArray();
//switcharr.Append(newcurarr.get_Item(1));
switcharr.Append(newcurarr.get_Item(0));
//SortCurvesContiguousArray(loccurva);
//CurveArrArray newcurarr = new CurveArrArray();
//newcurarr.Append(loccurva);
double end = 1;
SketchPlane sketch = FindElementByName( docfamily,typeof( SketchPlane ), "Ref. Level" ) as SketchPlane;
docfamily.FamilyCreate.NewExtrusion(false, switcharr, sketch, end);
}
catch
{
TaskDialog.Show("Revit", "Could Not Write to Curve Array or Create Extrusion");
}
trans.Commit();
}
docfamily.Save();
docfamily.LoadFamily(document, new CustomFamilyLoadOption());
docfamily.Close();
File.Delete(sfinpath);
Family familynew = FindElementByName(document,typeof(Family),famname)as Family;
if( null == familynew )
{
TaskDialog.Show("Revit", "Family Does Not Exist");
}
FilteredElementCollector collector1 = new FilteredElementCollector(document);
ICollection<Element> collection = collector1.WhereElementIsNotElementType().ToElements();
List<FamilySymbol> fsym = new FilteredElementCollector(document).OfClass(typeof(FamilySymbol)).Cast<FamilySymbol>().ToList();
FamilySymbol famsymb = null;
foreach (FamilySymbol symb in fsym)
{
if (symb.Name == famname)
{
famsymb = symb as FamilySymbol;
}
}
using (Transaction trans = new Transaction(document))
{
trans.Start("PlaceVoid");
if( ! famsymb.IsActive )
{
famsymb.Activate();
}
XYZ p = new XYZ(0,0,0);
FamilyInstance gratingvoid = document.Create.NewFamilyInstance( p, famsymb, lev, lev, StructuralType.NonStructural );
document.Regenerate();
trans.Commit();
}
}
//--------------------------------------------------------------------------------------------------------------------------------------------------------
static public Element FindElementByName(Document doc,Type targetType,string targetName)
{
return new FilteredElementCollector( doc ).OfClass( targetType ).FirstOrDefault<Element>(e => e.Name.Equals( targetName ) );
}
//--------------------------------------------------------------------------------------------------------------------------------------------------------
public class CustomFamilyLoadOption : IFamilyLoadOptions
{
public bool OnFamilyFound(bool familyInUse, out bool overwriteParameterValues)
{
overwriteParameterValues = true;
return true;
}
public bool OnSharedFamilyFound(Family sharedFamily,bool familyInUse,out FamilySource source, out bool overwriteParameterValues)
{
source = FamilySource.Family;
overwriteParameterValues = true;
return true;
}
}
//--------------------------------------------------------------------------------------------------------------------------------------------------------
const double _inch = 1.0 / 12.0;
const double _sixteenth = _inch / 16.0;
static Curve CreateReversedCurve(Curve orig )
{
//if( !IsSupported( orig ) )
//{
// throw new NotImplementedException("CreateReversedCurve for type " + orig.GetType().Name );
//}
if( orig is Line )
{
//return creapp.NewLineBound(orig.GetEndPoint( 1 ), orig.GetEndPoint( 0 ) );
return Line.CreateBound(orig.GetEndPoint( 1 ), orig.GetEndPoint( 0 ) );
}
else if( orig is Arc )
{
// return creapp.NewArc( orig.GetEndPoint( 1 ), orig.GetEndPoint( 0 ), orig.Evaluate( 0.5, true ) );
return Arc.Create( orig.GetEndPoint( 1 ), orig.GetEndPoint( 0 ), orig.Evaluate( 0.5, true ) );
}
else
{
throw new Exception(
"CreateReversedCurve - Unreachable" );
}
}
public static void SortCurvesContiguousArray(CurveArrArray curvesarr)
{
double _precision1 = 1.0 / 12.0 / 16.0; // around 0.00520833
double _precision2 = 0.001; // limit for CurveLoop.Create(...)
int cn = curvesarr.Size;
int ci = 0;
while (ci < cn)
{
CurveArray curves = curvesarr.get_Item(ci);
ci +=1;
// account for multiple curve loops with secondary array
CurveArray loop1 = new CurveArray();
CurveArray loop2 = new CurveArray();
int n = curves.Size;
int split = 1;
// Walk through each curve (after the first)
// to match up the curves in order
for (int i = 0; i < n; ++i)
{
TaskDialog.Show("Revit I Loop Run", i.ToString());
Curve curve = curves.get_Item(i);
if (curve.GetType().ToString() == "Autodesk.Revit.DB.Arc" && curve.IsBound == false)
{
break;
}
XYZ beginPoint = curve.GetEndPoint(0);
XYZ endPoint = curve.GetEndPoint(1);
XYZ p,q;
// Find curve with start point = end point
bool found = (i + 1 >= n);
for (int j = i + 1; j < n; ++j)
{
p = curves.get_Item(j).GetEndPoint(0);
q = curves.get_Item(j).GetEndPoint(1);
// If there is a match end->start,
// this is the next curve
if (p.DistanceTo(endPoint) < _precision1)
{
if (p.DistanceTo(endPoint) > _precision2)
{
XYZ intermediate = new XYZ((endPoint.X + p.X) / 2.0, (endPoint.Y + p.Y) / 2.0, (endPoint.Z + p.Z) / 2.0);
curves.set_Item(i, Line.CreateBound(beginPoint, intermediate));
curves.set_Item(j, Line.CreateBound(intermediate, q));
}
if (i + 1 != j)
{
Curve tmp = curves.get_Item(i + 1);
curves.set_Item(i + 1, curves.get_Item(j));
curves.set_Item(j, tmp);
}
found = true;
break;
}
// If there is a match end->end,
// reverse the next curve
if (q.DistanceTo(endPoint) < _precision1)
{
if (q.DistanceTo(endPoint) > _precision2)
{
XYZ intermediate = new XYZ((endPoint.X + q.X) / 2.0, (endPoint.Y + q.Y) / 2.0, (endPoint.Z + q.Z) / 2.0);
curves.set_Item(i, Line.CreateBound(beginPoint, intermediate));
curves.set_Item(j, Line.CreateBound(p, intermediate));
}
if (i + 1 == j)
{
curves.set_Item(i + 1, CreateReversedCurve(curves.get_Item(j)));
}
else
{
Curve tmp = curves.get_Item(i + 1);
curves.set_Item(i + 1, CreateReversedCurve(curves.get_Item(j)));
curves.set_Item(j, tmp);
}
found = true;
break;
}
}
if (!found)
{
// if not found, must be part of a new loop - move it to the back and keep going and add to second array
TaskDialog.Show("Revit No Match Found for item", i.ToString());
TaskDialog.Show("Revit", "Moveing it to back of list");
Curve tmp1 = curves.get_Item(i);
TaskDialog.Show("Revit tmp1 Current i item endpt", tmp1.GetEndPoint(0).ToString());
loop2.Append(tmp1);
Curve tmp2 = curves.get_Item(n - split);
TaskDialog.Show("Revit tmp2 Back of list item endpt", tmp2.GetEndPoint(0).ToString());
// set current item to rear
curves.set_Item(i, tmp2);
// set rear item to current
curves.set_Item(n - split, tmp1);
TaskDialog.Show("Revit new item i endpt", curves.get_Item(i).GetEndPoint(0).ToString());
TaskDialog.Show("Revit moved item endpt", curves.get_Item(n - split).GetEndPoint(0).ToString());
// error testing - try to append in a different manner and check values
//curves.set_Item(i, Line.CreateBound(curves.get_Item(i).GetEndPoint(0), curves.get_Item(i).GetEndPoint(1)));
//curves.set_Item(n - split, Line.CreateBound(curves.get_Item(n - split).GetEndPoint(0), curves.get_Item(n - split).GetEndPoint(1)));
//Curve ncurve = Line.CreateBound(curves.get_Item(n - split).GetEndPoint(0), curves.get_Item(n - split).GetEndPoint(1));
//TaskDialog.Show("Revit Appended to Loop2 Endpoint", ncurve.GetEndPoint(0).ToString());
//loop2.Append(ncurve);
//set the split off counter so items not fitting in first loop can be split to new array.
split += 1;
//reset the counter back so item moved from rear can be checked in next run of for loop
i -= 2;
}
//set counter to end for loop when all items that do not fit in first loop are processed
if (i >= n - (split + 1))
{
TaskDialog.Show("Revit", "End Of Looping");
TaskDialog.Show("Revit - The Split Number", split.ToString());
i = n;
}
}
int counter = 0;
// recreate array with only items from first loop found
while (counter <= (n - split))
{
loop1.Append(curves.get_Item(counter));
counter += 1;
}
TaskDialog.Show("Revit loop1 Size", loop1.Size.ToString());
curvesarr.Clear();
curvesarr.Append(loop1);
if (loop2.Size > 0)
{
string stringinfo = "";
// run the loop detection on a second array that was split from the first
TaskDialog.Show("Revit loop2 Size", loop2.Size.ToString());
CurveArrArray tmpcurvesarr = new CurveArrArray();
tmpcurvesarr.Append(loop2);
SortCurvesContiguousArray(tmpcurvesarr);
loop2.Clear();
loop2 = tmpcurvesarr.get_Item(0);
curvesarr.Append(loop2);
foreach (Curve ccc in loop2)
{
stringinfo = (stringinfo + " " + ccc.GetEndPoint(0).ToString() + " - " + ccc.GetEndPoint(1).ToString());
}
TaskDialog.Show("Revit", stringinfo);
}
}
}
}
}
Thanks for any and all help.
Shane

Using Epplus to import data from an Excel file to SQL Server database table

I've tried implementing this (https://www.paragon-inc.com/resources/blogs-posts/easy_excel_interaction_pt6) on an ASP.NET MVC 5 application.
//SEE CODE BELOW
[HttpPost]
public ActionResult Upload(HttpPostedFileBase file)
{
var regPIN = DB.AspNetUsers.Where(i => i.Id == user.Id).Select(i => i.registrationPIN).FirstOrDefault();
if (file != null && file.ContentLength > 0)
{
var extension = Path.GetExtension(file.FileName);
var excelFile = Path.Combine(Server.MapPath("~/App_Data/BulkImports"),regPIN + extension);
if (System.IO.File.Exists(excelFile))
{
System.IO.File.Delete(excelFile);
}
else if (file.ContentType == "application/vnd.ms-excel" || file.ContentType == "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet")
{
file.SaveAs(excelFile);//WORKS FINE
//BEGINING OF IMPORT
FileInfo eFile = new FileInfo(excelFile);
using (var excelPackage = new ExcelPackage(eFile))
{
if (!eFile.Name.EndsWith("xlsx"))//Return ModelState.AddModelError()
{ ModelState.AddModelError("", "Incompartible Excel Document. Please use MSExcel 2007 and Above!"); }
else
{
var worksheet = excelPackage.Workbook.Worksheets[1];
if (worksheet == null) { ModelState.AddModelError("", "Wrong Excel Format!"); }// return ImportResults.WrongFormat;
else
{
var lastRow = worksheet.Dimension.End.Row;
while (lastRow >= 1)
{
var range = worksheet.Cells[lastRow, 1, lastRow, 3];
if (range.Any(c => c.Value != null))
{ break; }
lastRow--;
}
using (var db = new BlackBox_FinaleEntities())// var db = new BlackBox_FinaleEntities())
{
for (var row = 2; row <= lastRow; row++)
{
var newPerson = new personalDetails
{
identificationType = worksheet.Cells[row, 1].Value.ToString(),
idNumber = worksheet.Cells[row, 2].Value.ToString(),
idSerial = worksheet.Cells[row, 3].Value.ToString(),
fullName = worksheet.Cells[row, 4].Value.ToString(),
dob = DateTime.Parse(worksheet.Cells[row, 5].Value.ToString()),
gender = worksheet.Cells[row, 6].Value.ToString()
};
DB.personalDetails.Add(newPerson);
try { db.SaveChanges(); }
catch (Exception) { }
}
}
}
}
}//END OF IMPORT
ViewBag.Message = "Your file was successfully uploaded.";
return RedirectToAction("Index");
}
ViewBag.Message = "Error: Your file was not uploaded. Ensure you upload an excel workbook file.";
return View();
}
else
{
ViewBag.Message = "Error: Your file was not uploaded. Ensure you upload an excel workbook file.";
return View();
}
}
See Picture Error
Any help would be greatly appreciated mates.
You can do it like this:
public bool readXLS(string FilePath)
{
FileInfo existingFile = new FileInfo(FilePath);
using (ExcelPackage package = new ExcelPackage(existingFile))
{
//get the first worksheet in the workbook
ExcelWorksheet worksheet = package.Workbook.Worksheets[1];
int colCount = worksheet.Dimension.End.Column; //get Column Count
int rowCount = worksheet.Dimension.End.Row; //get row count
string queryString = "INSERT INTO tableName VALUES"; //Here I am using a "blind insert" (no column list). You can and should specify the column names; a blind insert is strongly not recommended.
string eachVal = "";
bool status = true;
for (int row = 1; row <= rowCount; row++)
{
queryString += "(";
for (int col = 1; col <= colCount; col++)
{
eachVal = worksheet.Cells[row, col].Value.ToString().Trim();
queryString += "'" + eachVal + "',";
}
queryString = queryString.Remove(queryString.Length - 1, 1); //removing last comma (,) from the string
if (row % 1000 == 0) //Execute after every 1000 rows; SQL Server allows at most 1000 row value expressions in a single INSERT ... VALUES statement.
{
queryString += ")";
status = this.runQuery(queryString); //executing query
if (status == false)
return status;
queryString = "INSERT INTO tableName VALUES";
}
else
{
queryString += "),";
}
}
if (queryString.EndsWith(",")) //only if rows remain after the last full batch
{
queryString = queryString.Remove(queryString.Length - 1, 1); //removing last comma (,) from the string
status = this.runQuery(queryString); //executing the final partial batch
}
return status;
}
}
Details: http://sforsuresh.in/read-data-excel-sheet-insert-database-table-c/
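For larger imports, building INSERT strings by hand is fragile (values containing quotes break it, and it is open to SQL injection). Here is a sketch of the same read using SqlBulkCopy instead - connectionString and tableName are placeholders, and the usual System.Data, System.Data.SqlClient, System.IO and OfficeOpenXml namespaces are assumed:
public bool ImportExcelWithBulkCopy(string filePath, string connectionString)
{
    var table = new DataTable();
    using (var package = new ExcelPackage(new FileInfo(filePath)))
    {
        ExcelWorksheet worksheet = package.Workbook.Worksheets[1];
        int colCount = worksheet.Dimension.End.Column;
        int rowCount = worksheet.Dimension.End.Row;
        // One string column per worksheet column; adjust names/types to match the target table.
        for (int col = 1; col <= colCount; col++)
            table.Columns.Add("Col" + col, typeof(string));
        for (int row = 1; row <= rowCount; row++)
        {
            DataRow dataRow = table.NewRow();
            for (int col = 1; col <= colCount; col++)
            {
                object value = worksheet.Cells[row, col].Value;
                dataRow[col - 1] = value == null ? "" : value.ToString().Trim();
            }
            table.Rows.Add(dataRow);
        }
    }
    using (var connection = new SqlConnection(connectionString))
    using (var bulkCopy = new SqlBulkCopy(connection) { DestinationTableName = "tableName" })
    {
        connection.Open();
        bulkCopy.WriteToServer(table); // streams all rows in one call; no SQL text is built
        return true;
    }
}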

Command Timeout, longer to get response back

I am executing a large query, so my app was throwing a timeout error. Some threads suggested adding a command timeout, but after adding those lines it takes longer to get a response back. Any idea why, or what am I missing in my code?
public int CreateRecord(string theCommand, DataSet theInputData)
{
int functionReturnValue = 0;
int retVal = 0;
SqlParameter objSqlParameter = default(SqlParameter);
DataSet dsParameter = new DataSet();
int i = 0;
try
{
//Set the command text (stored procedure name or SQL statement).
mobj_SqlCommand.CommandTimeout = 120;
mobj_SqlCommand.CommandText = theCommand;
mobj_SqlCommand.CommandType = CommandType.StoredProcedure;
for (i = 0; i <= (theInputData.Tables.Count - 1); i++)
{
if (theInputData.Tables[i].Rows.Count > 0)
{
dsParameter.Tables.Add(theInputData.Tables[i].Copy());
}
}
objSqlParameter = new SqlParameter("@theXmlData", SqlDbType.Text);
objSqlParameter.Direction = ParameterDirection.Input;
objSqlParameter.Value = "<?xml version=\"1.0\" encoding=\"iso-8859-1\"?>" + dsParameter.GetXml();
//Attach to the parameter to mobj_SqlCommand.
mobj_SqlCommand.Parameters.Add(objSqlParameter);
//Finally, execute the command.
retVal = (int)mobj_SqlCommand.ExecuteScalar();
//Detach the parameters from mobj_SqlCommand, so it can be used again.
mobj_SqlCommand.Parameters.Clear();
functionReturnValue = retVal;
}
catch (Exception ex)
{
throw new System.Exception(ex.Message);
}
finally
{
//Clean up the objects created in this object.
if (mobj_SqlConnection.State == ConnectionState.Open)
{
mobj_SqlConnection.Close();
mobj_SqlConnection.Dispose();
mobj_SqlConnection = null;
}
if ((mobj_SqlCommand != null))
{
mobj_SqlCommand.Dispose();
mobj_SqlCommand = null;
}
if ((mobj_SqlDataAdapter != null))
{
mobj_SqlDataAdapter.Dispose();
mobj_SqlDataAdapter = null;
}
if ((dsParameter != null))
{
dsParameter.Dispose();
dsParameter = null;
}
objSqlParameter = null;
}
return functionReturnValue;
}

How to create sourcemaps for concatenated files

I want to concatenate a bunch of different files of a single type into one large file: for example, many JavaScript files into one large file, many CSS files down to one, etc. I want to create a sourcemap of the files pre-concatenation, but I do not know where to start. I am working in Node, but I am also open to solutions in other environments.
I know there are tools that can do this, but they seem to work on a language-by-language basis (uglifyjs, cssmin or whatever it's called these days), and I want a tool that is not language-specific.
Also, I would like to define how the files are bound together. For example, in JavaScript I want to give each file its own closure with an IIFE, such as:
(function () {
    // File
}());
I can also think of other wrappers I would like to implement for different files.
Here are my options as I see them right now; however, I don't know which is best or how to start any of them.
1. Find a module that does this (I'm working in a Node.js environment)
2. Create an algorithm with Mozilla's source-map module. For that I also see a couple of options:
   2.1. Only map each line to its new line location
   2.2. Map every single character to its new location
   2.3. Map every word to its new location (this option seems way out of scope)
3. Don't even worry about source maps
What do you guys think about these options? I've already tried options 2.1 and 2.2, but the solution seemed way too complicated for a concatenation algorithm and it did not perform perfectly in Google Chrome's developer tools.
I implemented code without any dependencies like this:
export interface SourceMap {
version: number; // always 3
file?: string;
sourceRoot?: string;
sources: string[];
sourcesContent?: string[];
names?: string[];
mappings: string | Buffer;
}
const emptySourceMap: SourceMap = { version: 3, sources: [], mappings: new Buffer(0) }
var charToInteger = new Buffer(256);
var integerToChar = new Buffer(64);
charToInteger.fill(255);
'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/='.split('').forEach((char, i) => {
charToInteger[char.charCodeAt(0)] = i;
integerToChar[i] = char.charCodeAt(0);
});
class DynamicBuffer {
buffer: Buffer;
size: number;
constructor() {
this.buffer = new Buffer(512);
this.size = 0;
}
ensureCapacity(capacity: number) {
if (this.buffer.length >= capacity)
return;
let oldBuffer = this.buffer;
this.buffer = new Buffer(Math.max(oldBuffer.length * 2, capacity));
oldBuffer.copy(this.buffer);
}
addByte(b: number) {
this.ensureCapacity(this.size + 1);
this.buffer[this.size++] = b;
}
addVLQ(num: number) {
var clamped: number;
if (num < 0) {
num = (-num << 1) | 1;
} else {
num <<= 1;
}
do {
clamped = num & 31;
num >>= 5;
if (num > 0) {
clamped |= 32;
}
this.addByte(integerToChar[clamped]);
} while (num > 0);
}
addString(s: string) {
let l = Buffer.byteLength(s);
this.ensureCapacity(this.size + l);
this.buffer.write(s, this.size);
this.size += l;
}
addBuffer(b: Buffer) {
this.ensureCapacity(this.size + b.length);
b.copy(this.buffer, this.size);
this.size += b.length;
}
toBuffer(): Buffer {
return this.buffer.slice(0, this.size);
}
}
function countNL(b: Buffer): number {
let res = 0;
for (let i = 0; i < b.length; i++) {
if (b[i] === 10) res++;
}
return res;
}
export class SourceMapBuilder {
outputBuffer: DynamicBuffer;
sources: string[];
mappings: DynamicBuffer;
lastSourceIndex = 0;
lastSourceLine = 0;
lastSourceCol = 0;
constructor() {
this.outputBuffer = new DynamicBuffer();
this.mappings = new DynamicBuffer();
this.sources = [];
}
addLine(text: string) {
this.outputBuffer.addString(text);
this.outputBuffer.addByte(10);
this.mappings.addByte(59); // ;
}
addSource(content: Buffer, sourceMap?: SourceMap) {
if (sourceMap == null) sourceMap = emptySourceMap;
this.outputBuffer.addBuffer(content);
let sourceLines = countNL(content);
if (content.length > 0 && content[content.length - 1] !== 10) {
sourceLines++;
this.outputBuffer.addByte(10);
}
let sourceRemap = [];
sourceMap.sources.forEach((v) => {
let pos = this.sources.indexOf(v);
if (pos < 0) {
pos = this.sources.length;
this.sources.push(v);
}
sourceRemap.push(pos);
});
let lastOutputCol = 0;
let inputMappings = (typeof sourceMap.mappings === "string") ? new Buffer(<string>sourceMap.mappings) : <Buffer>sourceMap.mappings;
let outputLine = 0;
let ip = 0;
let inOutputCol = 0;
let inSourceIndex = 0;
let inSourceLine = 0;
let inSourceCol = 0;
let shift = 0;
let value = 0;
let valpos = 0;
const commit = () => {
if (valpos === 0) return;
this.mappings.addVLQ(inOutputCol - lastOutputCol);
lastOutputCol = inOutputCol;
if (valpos === 1) {
valpos = 0;
return;
}
let outSourceIndex = sourceRemap[inSourceIndex];
this.mappings.addVLQ(outSourceIndex - this.lastSourceIndex);
this.lastSourceIndex = outSourceIndex;
this.mappings.addVLQ(inSourceLine - this.lastSourceLine);
this.lastSourceLine = inSourceLine;
this.mappings.addVLQ(inSourceCol - this.lastSourceCol);
this.lastSourceCol = inSourceCol;
valpos = 0;
}
while (ip < inputMappings.length) {
let b = inputMappings[ip++];
if (b === 59) { // ;
commit();
this.mappings.addByte(59);
inOutputCol = 0;
lastOutputCol = 0;
outputLine++;
} else if (b === 44) { // ,
commit();
this.mappings.addByte(44);
} else {
b = charToInteger[b];
if (b === 255) throw new Error("Invalid sourceMap");
value += (b & 31) << shift;
if (b & 32) {
shift += 5;
} else {
let shouldNegate = value & 1;
value >>= 1;
if (shouldNegate) value = -value;
switch (valpos) {
case 0: inOutputCol += value; break;
case 1: inSourceIndex += value; break;
case 2: inSourceLine += value; break;
case 3: inSourceCol += value; break;
}
valpos++;
value = shift = 0;
}
}
}
commit();
while (outputLine < sourceLines) {
this.mappings.addByte(59);
outputLine++;
}
}
toContent(): Buffer {
return this.outputBuffer.toBuffer();
}
toSourceMap(sourceRoot?: string): Buffer {
return new Buffer(JSON.stringify({ version: 3, sourceRoot, sources: this.sources, mappings: this.mappings.toBuffer().toString() }));
}
}
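To use it: create one SourceMapBuilder per bundle, call addSource(content, itsSourceMap) for each input file in concatenation order (addLine is for generated glue such as the IIFE wrappers, which should map to nothing), then write toContent() out as the combined file and toSourceMap() next to it as the .map file.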
I, at first, implemented "index map" from that spec, only to find out that it is not supported by any browser.
Another project that could be useful to look at is magic-string.

Why does a large file sometimes get corrupted while uploading it to an Azure blob?

I am uploading a large file to Azure Storage in 4 MB chunks. I have used the following code for the last year, but for the past month, whenever I upload a file it sometimes gets corrupted and sometimes uploads fine.
Can anyone suggest what I need to change in the code?
//Uploads a file from the file system to a blob. Parallel implementation.
public void ParallelUploadFile(CloudBlockBlob blob1, string fileName1, BlobRequestOptions options1, int rowId, int maxBlockSize = 4 * 1024 * 1024)
{
blob = blob1;
fileName = fileName1;
options = options1;
file = new FileInfo(fileName);
var fileStream = new FileStream(fileName, FileMode.Open, FileAccess.Read,FileShare.ReadWrite);
long fileSize = file.Length;
//Get the filesize
long fileSizeInMb = file.Length/1024/1024;
// let's figure out how big the file is here
long leftToRead = fileSize;
long startPosition = 0;
// have 1 block for every maxBlockSize bytes plus 1 for the remainder
var blockCount =
((int) Math.Floor((double) (fileSize/maxBlockSize))) + 1;
blockIds = new List<string>();
// populate the control array...
for (int j = 0; j < blockCount; j++)
{
var toRead = (int) (maxBlockSize < leftToRead
? maxBlockSize
: leftToRead);
var blockId = Convert.ToBase64String(
Encoding.ASCII.GetBytes(
string.Format("BlockId{0}", j.ToString("0000000"))));
transferDetails.Add(new BlockTransferDetail()
{
StartPosition = startPosition,
BytesToRead = toRead,
BlockId = blockId
});
if (toRead > 0)
{
blockIds.Add(blockId);
}
// increment the starting position
startPosition += toRead;
leftToRead -= toRead;
}
//*******
//PUT THE NO OF THREAD LOGIC HERE
//*******
int runFrom = 0;
int runTo = 0;
int uploadParametersCount = 0;
TotalUpload = Convert.ToInt64(fileSizeInMb);
for (int count = 0; count < transferDetails.Count; )
{
//Create uploading file parameters
uploadParametersesList.Add(new UploadParameters()
{
FileName = file.FullName,
BlockSize = 3900000,
//BlockSize = 4194304,
LoopFrom = runFrom + runTo,
IsPutBlockList = false,
UploadedBytes = 0,
Fs = fileStream,
RowIndex = rowId,
FileSize = Convert.ToInt64(fileSizeInMb)
});
//Logic to create correct threads
if (transferDetails.Count < 50)
{
runTo = transferDetails.Count;
uploadParametersesList[uploadParametersCount].LoopTo += runTo;
count += transferDetails.Count;
}
else
{
var tmp = transferDetails.Count - runTo;
if (tmp > 50 && tmp < 100)
{
runTo += tmp;
count += tmp;
uploadParametersesList[uploadParametersCount].LoopTo += runTo;
}
else
{
runTo += 50;
count += 50;
uploadParametersesList[uploadParametersCount].LoopTo += runTo;
}
}
//Add to Global Const
GlobalConst.UploadedParameters.Add(uploadParametersesList[uploadParametersCount]);
//Start the thread
int parametersCount = uploadParametersCount;
var thread = new Thread(() => ThRunThis(uploadParametersesList[parametersCount]))
{Priority = ThreadPriority.Highest};
thread.Start();
uploadParametersCount++;
//Start a timer here to put all blocks on azure blob
aTimer.Elapsed += OnTimedEvent;
aTimer.Interval = 5000;
aTimer.Start();
}
}
//Timer callback
private void OnTimedEvent(object source, ElapsedEventArgs e)
{
if (uploadParametersesList.Count(o => o.IsPutBlockList) == uploadParametersesList.Count)
{
aTimer.Elapsed -= OnTimedEvent;
aTimer.Stop();
//Finally commit it
try
{
uploadParametersesList.ForEach(x => x.Status = "Uploaded");
blob.PutBlockList(blockIds);
IsCompleted = true;
}
catch (Exception exception)
{
Console.WriteLine(exception.Message);
}
}
}
//Main thread
private void ThRunThis(UploadParameters uploadParameters)
{
try
{
for (int j = uploadParameters.LoopFrom; j < uploadParameters.LoopTo; j++)
{
br = new BinaryReader(uploadParameters.Fs);
var bytes = new byte[transferDetails[j].BytesToRead];
//move the file system reader to the proper position
uploadParameters.Fs.Seek(transferDetails[j].StartPosition, SeekOrigin.Begin);
br.Read(bytes, 0, transferDetails[j].BytesToRead);
if (bytes.Length > 0)
{
//calculate the block-level hash
MD5 md5 = new MD5CryptoServiceProvider();
byte[] blockHash = md5.ComputeHash(bytes);
string convertedHash = Convert.ToBase64String(blockHash, 0, 16);
blob.PutBlock(transferDetails[j].BlockId, new MemoryStream(bytes), convertedHash, options);
//Update Uploaded Bytes
uploadParameters.UploadedBytes += transferDetails[j].BytesToRead;
TotalUploadedBytes += transferDetails[j].BytesToRead;
Console.WriteLine(Thread.CurrentThread.Name);
//Try to free the memory
try
{
GC.Collect();
}
catch (Exception exception)
{
Console.WriteLine(exception.Message);
}
}
}
//Is Completed
uploadParameters.IsPutBlockList = true;
}
catch (Exception exception)
{
Console.WriteLine(Thread.CurrentThread.Name);
uploadParameters.Exception = exception.Message;
Console.WriteLine(exception.Message);
}
}
It's been a long time since I touched large blob uploads with threads, but it looks like your block list is getting out of sequence across threads.
Why don't you get the block list from the cloud once all blocks have been uploaded, and then use that list for PutBlockList? That would make sure you commit them in the correct sequence.
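Sketched out against the older Microsoft.WindowsAzure.Storage client that the code above appears to use (System.Linq and System.Text assumed; the IDs are base64-decoded before sorting, since base64 text does not sort in the same order as the strings it encodes):
// Once every worker has finished, ask the service which blocks it is
// holding (uncommitted), restore the original BlockId0000000... order,
// and commit that list instead of the locally built one.
var uncommitted = blob.DownloadBlockList(BlockListingFilter.Uncommitted)
    .Select(b => b.Name)
    .ToList();
var ordered = uncommitted
    .OrderBy(id => Encoding.ASCII.GetString(Convert.FromBase64String(id)), StringComparer.Ordinal)
    .ToList();
blob.PutBlockList(ordered);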
