I need to pause audio while recording, so I'm using mp4parser to merge audio files in .3GPP format. If an audio file is shorter than 1 second, it gets merged, but if it is longer than 1 second, it doesn't get merged. Here is my code:
private void mergeFiles(File filePath, ArrayList<String> audioFiles) throws IOException {
int count = audioFiles.size();
Movie[] inMovies = new Movie[count];
for (int i = 0; i < count; i++) {
Movie movie = MovieCreator.build(new FileDataSourceImpl(audioFiles.get(i)));
inMovies[i] = movie;
}
List<Track> movieTracks = new LinkedList<Track>();
for (Movie m : inMovies) {
for (Track t : m.getTracks()) {
movieTracks.add(t);
}
}
Movie finalMovie = new Movie();
if (movieTracks.size() > 0)
finalMovie.addTrack(new AppendTrack(movieTracks.toArray(new Track[movieTracks.size()])));
Container container = new DefaultMp4Builder().build(finalMovie);
File anotherFile = new File(getFinalFileName());
final FileOutputStream fos = new FileOutputStream(anotherFile);
final WritableByteChannel bb = Channels.newChannel(fos);
container.writeContainer(bb);
fos.close();
for (int i = 0; i < count; i++) {
File arrayFile = new File(mAudioFiles.get(i));
if (arrayFile.exists())
arrayFile.delete();
}
}
Is anyone facing a similar situation?
I have an ASP.NET Core app with a model. The aim is to allow the user to upload an Excel file and then save the file to the model/table. I have the method below:
[HttpPost]
[ValidateAntiForgeryToken]
public async Task<IActionResult> Upload(IFormFile file)
{
string webRootPath = _hostEnvironment.WebRootPath;
var uploads = Path.Combine(webRootPath, "Upload");
var files = HttpContext.Request.Form.Files;
var extension = Path.GetExtension(files[0].FileName);
using (var filesStream = new FileStream(Path.Combine(uploads, file.FileName), FileMode.Create))
{
files[0].CopyTo(filesStream);
}
var list = new List<User>();
using (var stream = new MemoryStream())
{
await file.CopyToAsync(stream);
using (var package = new ExcelPackage(stream))
{
ExcelWorksheet worksheet = package.Workbook.Worksheets[0];
var rowcount = worksheet.Dimension.Rows;
for (int row = 2; row <= rowcount; row++)
{
list.Add(new User
{
Name = worksheet.Cells[row, 1]?.Value?.ToString().Trim(),
Address1 = worksheet.Cells[row, 2]?.Value?.ToString().Trim(),
PostCode = worksheet.Cells[row, 3]?.Value?.ToString().Trim(),
Mobile = worksheet.Cells[row, 4]?.Value?.ToString().Trim(),
});
}
}
}
foreach (var user in list)
{
await _db.User.AddAsync(user);
}
await _db.SaveChangesAsync();
return View();
}
This code works fine for processing an Excel file uploaded by a user, but the problem I'm having is that when the file is large, say above 3 MB, it takes well over 8 minutes to upload.
Any idea how to speed this up, please? Thanks.
There are two things you can do to increase speed.
1) Instead of reading the Excel file with the ExcelWorksheet class, use a library called ExcelDataReader, which can read around 600k records in under a minute.
Sample code:
Model:
class Person
{
    public int id;
    public string name;
}
// The Excel file has both columns from the model, so we can read it with the code below.
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using ExcelDataReader;

// Needed so ExcelDataReader can handle legacy code-page encodings
System.Text.Encoding.RegisterProvider(System.Text.CodePagesEncodingProvider.Instance);

var fileName = "./Person.xlsx";
var timer = new Stopwatch();
timer.Start();
int counter = 0;
List<Person> persons = new List<Person>();
using (var stream = File.Open(fileName, FileMode.Open, FileAccess.Read))
{
    using (var reader = ExcelReaderFactory.CreateReader(stream))
    {
        reader.Read(); // skip the header row
        while (reader.Read()) // each data row of the file
        {
            var person = new Person
            {
                id = Convert.ToInt32(reader.GetValue(0)),
                name = reader.GetValue(1)?.ToString()
            };
            persons.Add(person);
            counter++;
        }
        timer.Stop();
        double duration = timer.ElapsedMilliseconds / 1000.0;
        // Print duration and persons.Count to check performance
    }
}
https://github.com/ExcelDataReader/ExcelDataReader
2) Once you have read and stored the data in a list, you can put that data into a DataTable and insert it into the database using the Oracle.ManagedDataAccess.Client NuGet package (array binding) instead of EF Core. This method is fast. Please go through the link below for doing this with an Oracle database.
https://www.c-sharpcorner.com/article/two-ways-to-insert-bulk-data-into-oracle-database-using-c-sharp/
var db_timer = new Stopwatch();
db_timer.Start();
DataTable dt = new DataTable();
dt.Columns.Add("id");
dt.Columns.Add("name");
for (int i = 0; i < counter; i++)
{
DataRow dr = dt.NewRow();
dr["id"] = persons[i].id;
dr["name"] = persons[i].name;
dt.Rows.Add(dr);
}
using (var connection = new OracleConnection(oracleConString))
{
connection.Open();
int[] ids = new int[dt.Rows.Count];
string[] names = new string[dt.Rows.Count];
for (int j = 0; j < dt.Rows.Count; j++)
{
ids[j] = Convert.ToInt32(dt.Rows[j]["id"]);
names[j] = Convert.ToString(dt.Rows[j]["name"]);
}
OracleParameter id = new OracleParameter();
id.OracleDbType = OracleDbType.Int32;
id.Value = ids;
OracleParameter name = new OracleParameter();
name.OracleDbType = OracleDbType.Varchar2;
name.Value = names;
OracleCommand cmd = connection.CreateCommand();
cmd.CommandText = "INSERT INTO TEST(id,name) VALUES (:1,:2)";
cmd.ArrayBindCount = ids.Length;
cmd.Parameters.Add(id);
cmd.Parameters.Add(name);
cmd.ExecuteNonQuery();
}
This is just sample code; you can use the timer to check how much time it takes to execute.
I have some problems with splitting a file and I don't know what to do now. I have to split the movie and then merge the split parts back into one, and I have to use FileStream. If you can help me, I will be really happy :)
using System;
using System.Collections.Generic;
using System.IO;
class Program
{
static void Main()
{
string source = "../../movie.avi";
string destination;
int n = int.Parse(Console.ReadLine());
for (int i = 0; i < n; i++)
{
destination = "Part-" + i +".avi";
Slice(source, destination, n);
}
List<int> files = new List<int>();
//Assemble(, destination);
}
static void Slice(string sourceFile, string destinationDirectory, int parts)
{
using (var source = new FileStream(sourceFile, FileMode.Open))
{
for (int i = 0; i < parts; i++)
{
using (var destination = new FileStream(destinationDirectory, FileMode.CreateNew))
{
double fileLength = source.Length;
byte[] buffer = new byte[4096];
while (true)
{
int readBytes = source.Read(buffer, 0, buffer.Length);
if (readBytes == 0)
{
break;
}
destination.Write(buffer, 0, readBytes);
}
}
}
}
}
static void Assemble(List<string> files, string destinationDirectory)
{
}
}
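For reference, here is a minimal sketch of one way Slice and Assemble could be written with FileStream. It assumes Slice is called once (rather than once per part, as in the Main above), that the parts are written as Part-0.avi, Part-1.avi, and so on, and that Assemble receives that list of part paths; those names are illustrative, not from the original code.
static void Slice(string sourceFile, string destinationDirectory, int parts)
{
    using (var source = new FileStream(sourceFile, FileMode.Open))
    {
        // Give each part an (almost) equal share of the source file
        long partLength = (long)Math.Ceiling((double)source.Length / parts);
        byte[] buffer = new byte[4096];
        for (int i = 0; i < parts; i++)
        {
            string partPath = Path.Combine(destinationDirectory, "Part-" + i + ".avi");
            using (var destination = new FileStream(partPath, FileMode.Create))
            {
                long writtenToPart = 0;
                while (writtenToPart < partLength)
                {
                    int toRead = (int)Math.Min(buffer.Length, partLength - writtenToPart);
                    int readBytes = source.Read(buffer, 0, toRead);
                    if (readBytes == 0)
                    {
                        break; // end of the source file
                    }
                    destination.Write(buffer, 0, readBytes);
                    writtenToPart += readBytes;
                }
            }
        }
    }
}
static void Assemble(List<string> files, string destinationFile)
{
    // Concatenate the parts, in list order, into a single output file
    using (var destination = new FileStream(destinationFile, FileMode.Create))
    {
        byte[] buffer = new byte[4096];
        foreach (string file in files)
        {
            using (var source = new FileStream(file, FileMode.Open))
            {
                int readBytes;
                while ((readBytes = source.Read(buffer, 0, buffer.Length)) > 0)
                {
                    destination.Write(buffer, 0, readBytes);
                }
            }
        }
    }
}
Main would then call Slice once, build a List<string> of the part file names in order, and pass it to Assemble together with the output file name.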
I'm working on a project which is about taking attendance of a class through the class video. I'm training the data while the program is running, and it is taking a lot of time to train. Is there any way I can save the trained data and use it directly in the program? Below is my code:
public static void main(String[] args) throws MalformedURLException, IOException, VideoCaptureException
{
FKEFaceDetector faceDetector = new FKEFaceDetector(new HaarCascadeDetector(40));
EigenFaceRecogniser<KEDetectedFace, Person> faceRecogniser = EigenFaceRecogniser.create(20, new RotateScaleAligner(), 1, DoubleFVComparison.CORRELATION, 0.9f);
final FaceRecognitionEngine<KEDetectedFace, Person> faceEngine = FaceRecognitionEngine.create(faceDetector, faceRecogniser);
Video<MBFImage> video;
//video = new VideoCapture(320, 100);
video = new XuggleVideo(new URL("file:///home/kamal/Videos/Samplevideo1.mp4"));
Person[] dataset = new Person[12];
dataset[0] = new Person("a");
dataset[1] = new Person("b");
dataset[2] = new Person("c");
dataset[3] = new Person("d");
dataset[4] = new Person("e");
dataset[5] = new Person("f");
dataset[6] = new Person("g");
dataset[7] = new Person("h");
dataset[8] = new Person("i");
dataset[9] = new Person("j");
dataset[10] = new Person("k");
dataset[11] = new Person("l");
int dcount;
for(int i = 0; i < 12; i++)
{
dcount = 0;
for(int j = 1; j <= 20 && dcount == 0; j++)
{
MBFImage mbfImage = ImageUtilities.readMBF(new URL("file:///home/kamal/Pictures/"+i+"/"+j+".png"));
FImage fimg = mbfImage.flatten();
List<KEDetectedFace> faces = faceEngine.getDetector().detectFaces(fimg);
if(faces.size() > 0)
{
faceEngine.train(faces.get(0), dataset[i]);
dcount++;
}
}
}
VideoDisplay<MBFImage> vd = VideoDisplay.createVideoDisplay(video);
vd.addVideoListener(new VideoDisplayListener<MBFImage>() {
public void afterUpdate(VideoDisplay<MBFImage> display) {
}
public void beforeUpdate(MBFImage frame)
{
FImage image = frame.flatten();
List<KEDetectedFace> faces = faceEngine.getDetector().detectFaces(image);
for(DetectedFace face : faces) {
frame.drawShape(face.getBounds(), RGBColour.RED);
try {
List<IndependentPair<KEDetectedFace, ScoredAnnotation<Person>>> rfaces = faceEngine.recogniseBest(face.getFacePatch());
ScoredAnnotation<Person> score = rfaces.get(0).getSecondObject();
if (score != null)
{
System.out.println("Mr. "+score.annotation+" is Present.");
}
else
{
System.out.println("Recognizing");
}
} catch (Exception e) {
}
}
}
});
}
Yes, just use the static methods in the org.openimaj.io.IOUtils class to write the faceEngine to disk once it's trained and read it back in again.
I am uploading a large file to Azure storage in 4 MB chunks. I have used the following code for the last year, but over the last month, whenever I upload a file it sometimes gets corrupted and sometimes uploads fine.
Can anyone suggest what I need to change in the code?
//Uploads a file from the file system to a blob. Parallel implementation.
public void ParallelUploadFile(CloudBlockBlob blob1, string fileName1, BlobRequestOptions options1, int rowId, int maxBlockSize = 4 * 1024 * 1024)
{
blob = blob1;
fileName = fileName1;
options = options1;
file = new FileInfo(fileName);
var fileStream = new FileStream(fileName, FileMode.Open, FileAccess.Read,FileShare.ReadWrite);
long fileSize = file.Length;
//Get the filesize
long fileSizeInMb = file.Length/1024/1024;
// let's figure out how big the file is here
long leftToRead = fileSize;
long startPosition = 0;
// have 1 block for every maxBlockSize bytes plus 1 for the remainder
var blockCount =
((int) Math.Floor((double) (fileSize/maxBlockSize))) + 1;
blockIds = new List<string>();
// populate the control array...
for (int j = 0; j < blockCount; j++)
{
var toRead = (int) (maxBlockSize < leftToRead
? maxBlockSize
: leftToRead);
var blockId = Convert.ToBase64String(
Encoding.ASCII.GetBytes(
string.Format("BlockId{0}", j.ToString("0000000"))));
transferDetails.Add(new BlockTransferDetail()
{
StartPosition = startPosition,
BytesToRead = toRead,
BlockId = blockId
});
if (toRead > 0)
{
blockIds.Add(blockId);
}
// increment the starting position
startPosition += toRead;
leftToRead -= toRead;
}
//*******
//PUT THE NO OF THREAD LOGIC HERE
//*******
int runFrom = 0;
int runTo = 0;
int uploadParametersCount = 0;
TotalUpload = Convert.ToInt64(fileSizeInMb);
for (int count = 0; count < transferDetails.Count; )
{
//Create uploading file parameters
uploadParametersesList.Add(new UploadParameters()
{
FileName = file.FullName,
BlockSize = 3900000,
//BlockSize = 4194304,
LoopFrom = runFrom + runTo,
IsPutBlockList = false,
UploadedBytes = 0,
Fs = fileStream,
RowIndex = rowId,
FileSize = Convert.ToInt64(fileSizeInMb)
});
//Logic to create correct threads
if (transferDetails.Count < 50)
{
runTo = transferDetails.Count;
uploadParametersesList[uploadParametersCount].LoopTo += runTo;
count += transferDetails.Count;
}
else
{
var tmp = transferDetails.Count - runTo;
if (tmp > 50 && tmp < 100)
{
runTo += tmp;
count += tmp;
uploadParametersesList[uploadParametersCount].LoopTo += runTo;
}
else
{
runTo += 50;
count += 50;
uploadParametersesList[uploadParametersCount].LoopTo += runTo;
}
}
//Add to Global Const
GlobalConst.UploadedParameters.Add(uploadParametersesList[uploadParametersCount]);
//Start the thread
int parametersCount = uploadParametersCount;
var thread = new Thread(() => ThRunThis(uploadParametersesList[parametersCount]))
{Priority = ThreadPriority.Highest};
thread.Start();
uploadParametersCount++;
//Start a timer here to put all blocks on azure blob
aTimer.Elapsed += OnTimedEvent;
aTimer.Interval = 5000;
aTimer.Start();
}
}
//Timer callback
private void OnTimedEvent(object source, ElapsedEventArgs e)
{
if (uploadParametersesList.Count(o => o.IsPutBlockList) == uploadParametersesList.Count)
{
aTimer.Elapsed -= OnTimedEvent;
aTimer.Stop();
//Finally commit it
try
{
uploadParametersesList.ForEach(x => x.Status = "Uploaded");
blob.PutBlockList(blockIds);
IsCompleted = true;
}
catch (Exception exception)
{
Console.WriteLine(exception.Message);
}
}
}
//Main thread
private void ThRunThis(UploadParameters uploadParameters)
{
try
{
for (int j = uploadParameters.LoopFrom; j < uploadParameters.LoopTo; j++)
{
br = new BinaryReader(uploadParameters.Fs);
var bytes = new byte[transferDetails[j].BytesToRead];
//move the file system reader to the proper position
uploadParameters.Fs.Seek(transferDetails[j].StartPosition, SeekOrigin.Begin);
br.Read(bytes, 0, transferDetails[j].BytesToRead);
if (bytes.Length > 0)
{
//calculate the block-level hash
MD5 md5 = new MD5CryptoServiceProvider();
byte[] blockHash = md5.ComputeHash(bytes);
string convertedHash = Convert.ToBase64String(blockHash, 0, 16);
blob.PutBlock(transferDetails[j].BlockId, new MemoryStream(bytes), convertedHash, options);
//Update Uploaded Bytes
uploadParameters.UploadedBytes += transferDetails[j].BytesToRead;
TotalUploadedBytes += transferDetails[j].BytesToRead;
Console.WriteLine(Thread.CurrentThread.Name);
//Try to free the memory
try
{
GC.Collect();
}
catch (Exception exception)
{
Console.WriteLine(exception.Message);
}
}
}
//Is Completed
uploadParameters.IsPutBlockList = true;
}
catch (Exception exception)
{
Console.WriteLine(Thread.CurrentThread.Name);
uploadParameters.Exception = exception.Message;
Console.WriteLine(exception.Message);
}
}
It's been a long time since I touched large blob uploads with threads, but it looks like your block list is getting out of sequence across threads.
Why don't you get the block list from the cloud once all blocks have been uploaded, and then use that list for PutBlockList? That would make sure you commit them in the correct sequence.
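As a rough sketch of that idea with the classic Microsoft.WindowsAzure.Storage client (which the code above appears to use), you could download the uncommitted block list from the service, sort it back into order, and commit it. The helper name CommitBlocks and the decode-and-sort step are assumptions; the sort only works because the block IDs encode a zero-padded index.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.WindowsAzure.Storage.Blob;

// Hypothetical helper: call this once every block has been uploaded with PutBlock.
private void CommitBlocks(CloudBlockBlob blob, BlobRequestOptions options)
{
    // Ask the service which uncommitted blocks it actually received
    IEnumerable<ListBlockItem> uploadedBlocks =
        blob.DownloadBlockList(BlockListingFilter.Uncommitted, null, options);

    // Decode the base64 names ("BlockId0000000", "BlockId0000001", ...) and sort them
    // so the blocks are committed in their original sequence
    List<string> orderedIds = uploadedBlocks
        .Select(b => b.Name)
        .OrderBy(name => Encoding.ASCII.GetString(Convert.FromBase64String(name)), StringComparer.Ordinal)
        .ToList();

    blob.PutBlockList(orderedIds, null, options);
}
This replaces keeping the blockIds list in shared memory across threads, which is where the ordering currently seems to go wrong.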
I'm running a multi-threaded image compression process. The original file is a 1280x960 high-resolution PNG, about 1800 KB, and I need to compress it to a JPEG under 70 KB. When I process a few vehicles, the process runs fine; when I process over 20 vehicles, I start to get an out-of-memory error. Here is the code.
private static ImageCodecInfo GetEncoderInfo(String mimeType)
{
int j;
ImageCodecInfo[] encoders;
encoders = ImageCodecInfo.GetImageEncoders();
for (j = 0; j < encoders.Length; ++j)
{
if (encoders[j].MimeType == mimeType)
return encoders[j];
}
return null;
}
public static void SaveAsJpg(string inFilePath = null, string outputFileName = null, long compression = 70, long quality = 70)
{
System.Drawing.Image orgimage = System.Drawing.Image.FromFile(inFilePath);
var imgIn = new Bitmap(orgimage);
var imgOut = new Bitmap(imgIn.Width, imgIn.Height);
Graphics g = Graphics.FromImage(imgOut);
g.Clear(Color.White);
g.DrawImage(imgIn, 0, 0, imgIn.Width, imgIn.Height);
EncoderParameters encoding = new EncoderParameters(2);
encoding.Param[0] = new EncoderParameter(System.Drawing.Imaging.Encoder.Compression, compression);
encoding.Param[1] = new EncoderParameter(System.Drawing.Imaging.Encoder.Quality, quality);
ImageCodecInfo myImageCodecInfo = GetEncoderInfo("image/jpeg");
imgOut.Save(outputFileName, myImageCodecInfo, encoding);
}
Thanks in advance for any suggestion.
I think you need to dispose of the graphics objects before leaving the method:
g.Dispose();
orgimage.Dispose();
imgIn.Dispose();
imgOut.Dispose();
See also: Disposing System.Drawing objects.
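As a minimal sketch (keeping the original parameter names and encoder settings, and assuming nothing else in the method changes), SaveAsJpg could wrap each object in a using block so the GDI+ handles are released even if an exception is thrown:
public static void SaveAsJpg(string inFilePath = null, string outputFileName = null, long compression = 70, long quality = 70)
{
    // Every System.Drawing object below owns unmanaged GDI+ resources,
    // so each one is wrapped in a using block.
    using (var orgimage = System.Drawing.Image.FromFile(inFilePath))
    using (var imgIn = new Bitmap(orgimage))
    using (var imgOut = new Bitmap(imgIn.Width, imgIn.Height))
    using (var g = Graphics.FromImage(imgOut))
    using (var encoding = new EncoderParameters(2))
    {
        g.Clear(Color.White);
        g.DrawImage(imgIn, 0, 0, imgIn.Width, imgIn.Height);
        encoding.Param[0] = new EncoderParameter(System.Drawing.Imaging.Encoder.Compression, compression);
        encoding.Param[1] = new EncoderParameter(System.Drawing.Imaging.Encoder.Quality, quality);
        ImageCodecInfo myImageCodecInfo = GetEncoderInfo("image/jpeg");
        imgOut.Save(outputFileName, myImageCodecInfo, encoding);
    }
}
If this still runs out of memory under heavy parallelism, limiting the number of images processed at once is the usual next step, since each in-flight image holds two full-size bitmaps.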