I'm trying to loop through a fileList in order to perform a delete query. First I fetch data from the "files" collection in the database where the attribute "postnumber" equals the user input. The result is saved into `fileList: Files[]`. Then I loop through this fileList in order to perform the delete query, but it keeps saying:
"ERROR TypeError: undefined is not iterable (cannot read property
Symbol(Symbol.iterator))". Here is my code:
forum-admin-list.component.ts
import { FileService } from 'src/app/shared/file.service';
import { Files } from 'src/app/shared/files.model';
export class ForumAdminListComponent {
fileList:Files[];
onDelete(pNo:string){
this.fservice.getPost(pNo).subscribe(actionArray => {
this.fileList = actionArray.map(item => {
return {
id: item.payload.doc.id,
...item.payload.doc.data()
} as Files;
})
});
for(let i of this.fileList){
this.storage.storage.refFromURL(i.path).delete();
this.firestore.doc("files/"+i.id).delete();
}
}
}
files.model.ts
export class Files {
id:string;
pNo:string;
downloadURL:string;
path:string;
}
file.service.ts
export class FileService {
formData: Files;
constructor(private firestore: AngularFirestore) { }
getPost(userRef){
return this.firestore.collection('files',ref=>ref.where('pNo','==',userRef)).snapshotChanges();
}
}
You're looping through fileList outside the subscribe(), so the loop doesn't wait for the Observable to emit; fileList is still undefined when the for...of runs, which is exactly what the error says. Move the loop inside your subscribe().
onDelete(pNo: string) {
  this.fservice.getPost(pNo).subscribe(actionArray => {
    this.fileList = actionArray.map(item => {
      return {
        id: item.payload.doc.id,
        ...item.payload.doc.data()
      } as Files;
    });
    // loop inside the subscription, after fileList has been populated
    for (let i of this.fileList) {
      this.storage.storage.refFromURL(i.path).delete();
      this.firestore.doc("files/" + i.id).delete();
    }
  });
}
Also, note the typing: each mapped item is cast as a single Files object, while fileList as a whole should be typed Files[].
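If you don't need to keep the array on the component at all, a minimal alternative sketch (using the same injected fservice, storage and firestore as above) keeps the mapped result local to the subscription, so it cannot be touched before the data arrives:
onDelete(pNo: string) {
  this.fservice.getPost(pNo).subscribe(actionArray => {
    const files = actionArray.map(item => ({
      id: item.payload.doc.id,
      ...item.payload.doc.data()
    } as Files));
    // the deletes run only after the query has emitted
    files.forEach(f => {
      this.storage.storage.refFromURL(f.path).delete();
      this.firestore.doc(`files/${f.id}`).delete();
    });
  });
}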
I'm using `casl` with a `Nest.js` project and I have a problem.
I want to deny users the ability to read collections unless they are public.
My entity class looks like this:
class Collection {
isPrivate?: boolean
}
My casl factory rule looks like this:
allow(Action.READ, Collection, { isPublic: true })
So I want to allow reading a collection ONLY if it is public.
Currently, when I try ability.can(Action.READ, Collection) (i.e., checking against the class, as for a list) it returns true. I want it to return false because it's not public.
Here is the complete code so you can understand the context.
import { Injectable } from '@nestjs/common';
import {
AbilityBuilder,
createMongoAbility,
ExtractSubjectType,
InferSubjects,
MongoAbility,
} from '@casl/ability';
import { JwtPayload, Role } from '../../authentication/types';
import { Action } from '../constants/action';
import { Collection } from '../../collection/entities/collection.entity';
import { User } from '../../user/entities/user.entity';
import { Media } from '../../media/entities/media.entity';
type Subjects =
| InferSubjects<
typeof Collection | typeof User | typeof Media | Collection | User | Media
>
| 'all';
export type AppAbility = MongoAbility<[Action, Subjects]>;
@Injectable()
export class AuthorizationAbilityFactory {
createForUser(user?: JwtPayload) {
const {
can: allow,
cannot: deny,
build,
} = new AbilityBuilder<AppAbility>(createMongoAbility);
allow(Action.READ, Collection, { isPublic: true }).because(
'Public collections can be read',
);
allow(Action.READ, Media);
if (user?.role !== Role.USER) {
allow(Action.CREATE, User);
}
return build({
// Read https://casl.js.org/v5/en/guide/subject-type-detection#use-classes-as-subject-types for details
detectSubjectType: (item) =>
item.constructor as ExtractSubjectType<Subjects>,
});
}
}
And the failing test
const ability = authorizationAbilityFactory.createForUser(); // <- anonymous user
expect(ability.can(Action.READ, Collection)).toBe(false); // <- this fails
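For context, here is a minimal sketch of the distinction that matters here (based on documented CASL behaviour, not on code from the question): when a rule carries conditions such as { isPublic: true }, a check against the subject type alone cannot evaluate those conditions; only a check against an instance can.
const ability = authorizationAbilityFactory.createForUser();
// Type-level check: conditions are ignored, so any READ rule on Collection makes this true
ability.can(Action.READ, Collection);
// Instance-level check: the { isPublic: true } condition is evaluated against the object
const collection = new Collection();
ability.can(Action.READ, collection); // true only if collection.isPublic === true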
I'm using generator-jhipster and I want to create blueprints for entity-client. After writing the entity files, a postWriting function calls addEntityToMenu in generator-jhipster/generators/client/needle-api/needle-client-react.js to add the newly generated entity to the file menu/entities.tsx.
I need to override this function so it writes a different entityEntry than the original one.
But I can't find the template for it. What should I do?
I found that I can write these functions on my own. Here is some example code if you need it:
function generateFileModel(aFile, needleTag, ...content) {
return {
file: aFile,
needle: needleTag,
splicable: content,
};
}
function addBlockContentToFile(rewriteFileModel, generator) {
try {
return jhipsterUtils.rewriteFile(rewriteFileModel, generator);
} catch (e) {
console.error(e);
return null;
}
}
function addToMenu() {
if (this.skipClient) return;
if (!this.embedded) {
this.addEntityToModule();
const entityMenuPath = `${this.CLIENT_MAIN_SRC_DIR}app/shared/layout/menus/entities.tsx`;
const entityEntry =
// prettier-ignore
this.stripMargin(`|<Menu.Item key="${this.entityStateName}" icon={<FileOutlined />}>
| <Link to="/${this.entityStateName}">
| ${this.enableTranslation ? `<Translate contentKey="global.menu.entities.${this.entityTranslationKeyMenu}" />` : `${_.startCase(this.entityStateName)}`}
| </Link>
| </Menu.Item>`);
const rewriteFileModel = generateFileModel(entityMenuPath, 'jhipster-needle-add-entity-to-menu', entityEntry);
addBlockContentToFile(rewriteFileModel, this);
}
}
function replaceTranslations() {
if (this.clientFramework === VUE && !this.enableTranslation) {
if (!this.readOnly) {
utils.vueReplaceTranslation(this, [
`app/entities/${this.entityFolderName}/${this.entityFileName}.vue`,
`app/entities/${this.entityFolderName}/${this.entityFileName}-update.vue`,
`app/entities/${this.entityFolderName}/${this.entityFileName}-details.vue`,
]);
} else {
utils.vueReplaceTranslation(this, [
`app/entities/${this.entityFolderName}/${this.entityFileName}.vue`,
`app/entities/${this.entityFolderName}/${this.entityFileName}-details.vue`,
]);
}
}
}
I have a website where I can upload a .xlsx file which contains some rows of information for my database. I read the documentation for laravel-excel, but it looks like the progress bar only works if you use the console method, which I don't.
I currently just use a plain HTML upload form, no AJAX yet.
To create this progress bar I would need to convert the form to AJAX, which is no hassle; that I can do.
But how would I create the progress bar while uploading the file and iterating through each row in the Excel file?
This is the controller and method where the upload gets done:
/**
* Import companies
*
* @param Import $request
* @return \Illuminate\Routing\Redirector|\Illuminate\Http\RedirectResponse
*/
public function postImport(Import $request)
{
# Import using Import class
Excel::import(new CompaniesImport, $request->file('file'));
return redirect(route('dashboard.companies.index.get'))->with('success', 'Import successful!');
}
And this is the import file:
public function model(array $row)
{
# Don't create or validate on empty rows
# Bad workaround
# TODO: better solution
if (!array_filter($row)) {
return null;
}
# Create company
$company = new Company;
$company->crn = $row['crn'];
$company->name = $row['name'];
$company->email = $row['email'];
$company->phone = $row['phone'];
$company->website = (!empty($row['website'])) ? Helper::addScheme($row['website']) : '';
$company->save();
# Only create an address if at least one address field is present
if (!empty($row['country']) || !empty($row['state']) || !empty($row['postal']) || !empty($row['address']) || !empty($row['zip'])) {
# Create address
$address = new CompanyAddress;
$address->company_id = $company->id;
$address->country = $row['country'];
$address->state = $row['state'];
$address->postal = $row['postal'];
$address->address = $row['address'];
$address->zip = $row['zip'];
$address->save();
# Attach
$company->addresses()->save($address);
}
return $company;
}
I know this is not much at this point. I just need some help figuring out how I would create this progress bar, because I'm pretty stuck.
My thought is to create an AJAX upload form, but from there I don't know how to proceed.
Just an idea, but you could use the Laravel session to store the total_row_count and processed_row_count during the import execution. Then, you could create a separate AJAX call on a setInterval() to poll those session values (e.g., once per second). This would allow you to calculate your progress as processed_row_count / total_row_count, and output to a visual progress bar. – matticustard
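In browser terms, that suggestion boils down to something like the sketch below (the /import-status endpoint, the field names and the #import-progress element are placeholders for illustration; a full implementation follows):
// Hypothetical polling loop: assumes a JSON endpoint /import-status that returns
// { processed_row_count, total_row_count } maintained by the import job.
const timer = window.setInterval(async () => {
  const response = await fetch('/import-status');
  const { processed_row_count, total_row_count } = await response.json();
  const percent = total_row_count
    ? Math.round((processed_row_count / total_row_count) * 100)
    : 0;
  (document.querySelector('#import-progress') as HTMLProgressElement).value = percent;
  if (percent >= 100) {
    window.clearInterval(timer);
  }
}, 1000);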
Putting @matticustard's comment into practice. Below is just a sample of how things could be implemented; there may be areas to improve.
1. Routes
The import route initializes the Excel import.
The import-status route will be used to get the latest import status.
Route::post('import', [ProductController::class, 'import']);
Route::get('import-status', [ProductController::class, 'status']);
2. Controller
The import action validates the uploaded file and passes an $id to the ProductsImport class. Because the import is queued and runs in the background, it has no access to the current session, so we use the cache instead. It would be a good idea to generate a more randomized $id if concurrent imports need to be processed; for now a unix timestamp keeps it simple.
Note that you currently cannot queue xls imports: PhpSpreadsheet's Xls reader contains some non-UTF-8 characters, which makes it impossible to queue ("XLS imports could not be queued").
public function import()
{
request()->validate([
'file' => ['required', 'mimes:xlsx'],
]);
$id = now()->unix();
session([ 'import' => $id ]);
Excel::queueImport(new ProductsImport($id), request()->file('file')->store('temp'));
return redirect()->back();
}
Get the latest import status from the cache, using the $id stored in the session.
public function status()
{
$id = session('import');
return response([
'started' => filled(cache("start_date_$id")),
'finished' => filled(cache("end_date_$id")),
'current_row' => (int) cache("current_row_$id"),
'total_rows' => (int) cache("total_rows_$id"),
]);
}
3. Import class
Using the WithEvents concern, the BeforeImport event stores the total number of rows of the Excel file in the cache. In onRow we store the row currently being processed in the cache. And the AfterImport event clears all the data.
<?php
namespace App\Imports;
use App\Models\Product;
use Maatwebsite\Excel\Row;
use Maatwebsite\Excel\Concerns\OnEachRow;
use Maatwebsite\Excel\Events\AfterImport;
use Maatwebsite\Excel\Events\BeforeImport;
use Maatwebsite\Excel\Concerns\WithEvents;
use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\WithStartRow;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithMultipleSheets;
class ProductsImport implements OnEachRow, WithEvents, WithChunkReading, ShouldQueue
{
public $id;
public function __construct(int $id)
{
$this->id = $id;
}
public function chunkSize(): int
{
return 100;
}
public function registerEvents(): array
{
return [
BeforeImport::class => function (BeforeImport $event) {
$totalRows = $event->getReader()->getTotalRows();
if (filled($totalRows)) {
cache()->forever("total_rows_{$this->id}", array_values($totalRows)[0]);
cache()->forever("start_date_{$this->id}", now()->unix());
}
},
AfterImport::class => function (AfterImport $event) {
cache(["end_date_{$this->id}" => now()], now()->addMinute());
cache()->forget("total_rows_{$this->id}");
cache()->forget("start_date_{$this->id}");
cache()->forget("current_row_{$this->id}");
},
];
}
public function onRow(Row $row)
{
$rowIndex = $row->getIndex();
$row = array_map('trim', $row->toArray());
cache()->forever("current_row_{$this->id}", $rowIndex);
// sleep(0.2);
Product::create([ ... ]);
}
}
4. Front end
On the front-end side, this is just a sample of how things could be handled. Here I used Vue.js, ant-design-vue and lodash.
After uploading a file, the handleChange method is called.
On a successful upload, trackProgress is called for the first time.
trackProgress is a recursive function, calling itself when each request completes.
With lodash's _.debounce we can prevent it from being called too often.
<template>
<a-modal
title="Upload excel"
v-model="visible"
cancel-text="Close"
ok-text="Confirm"
:closable="false"
:maskClosable="false"
destroyOnClose
>
<a-upload-dragger
name="file"
:multiple="false"
:showUploadList="false"
:action="`/import`"
@change="handleChange"
>
<p class="ant-upload-drag-icon">
<a-icon type="inbox" />
</p>
<p class="ant-upload-text">Click to upload</p>
</a-upload-dragger>
<a-progress class="mt-5" :percent="progress" :show-info="false" />
<div class="text-right mt-1">{{ this.current_row }} / {{ this.total_rows }}</div>
<template slot="footer">
<a-button @click="close">Close</a-button>
</template>
</a-modal>
</template>
<script>
export default {
data() {
this.trackProgress = _.debounce(this.trackProgress, 1000);
return {
visible: true,
current_row: 0,
total_rows: 0,
progress: 0,
};
},
methods: {
handleChange(info) {
const status = info.file.status;
if (status === "done") {
this.trackProgress();
} else if (status === "error") {
this.$message.error(_.get(info, 'file.response.errors.file.0', `${info.file.name} file upload failed.`));
}
},
async trackProgress() {
const { data } = await axios.get('/import-status');
if (data.finished) {
this.current_row = this.total_rows
this.progress = 100
return;
};
this.total_rows = data.total_rows;
this.current_row = data.current_row;
this.progress = Math.ceil(data.current_row / data.total_rows * 100);
this.trackProgress();
},
close() {
if (this.progress > 0 && this.progress < 100) {
if (confirm('Do you want to close')) {
this.$emit('close')
window.location.reload()
}
} else {
this.$emit('close')
window.location.reload()
}
}
},
};
</script>
Hi, I'm very new to Angular.
I have a URL like http://localhost:3000/prizes/brand1 (or brand2, brand3, etc.).
I want to reload my grid items and filter them based on that brand.
{ path: 'prizes/:brand_name', component: PrizeComponent }
This is the method that I use to get the items:
getPrizes() {
return this._http.get('/prizes')
.map(data => data.json()).toPromise()
}
Question
How do I re-call the get method based on the new route, and how do I get the brand name and pass it as a parameter to the get method?
Regards
If you use a data service to fetch the data:
import { Injectable } from '@angular/core';
import { ActivatedRoute } from '@angular/router';
import { Http } from '@angular/http';
import 'rxjs/add/operator/map';
@Injectable()
export class DataService {
    private brandName: string = '';
    constructor(private aroute: ActivatedRoute, private _http: Http) {
        this.aroute.url.subscribe((data) => {
            console.log('params ', data); // check for the brand here
            this.brandName = data[data.length - 1].path;
        });
    }
    getPrizes() {
        // you can access this.brandName value here
        return this._http.get('/prizes')
            .map(data => data.json()).toPromise();
    }
}
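For completeness, a minimal sketch of the component-side version of the same idea (assuming PrizeComponent injects ActivatedRoute, and that getPrizes is changed to accept the brand as an argument; the names here are illustrative):
export class PrizeComponent implements OnInit {
  prizes: any[] = [];
  constructor(private route: ActivatedRoute, private dataService: DataService) {}
  ngOnInit() {
    // route.params re-emits on every navigation to prizes/:brand_name,
    // so the grid reloads whenever the brand changes
    this.route.params.subscribe(params => {
      const brand = params['brand_name'];
      this.dataService.getPrizes(brand).then(prizes => this.prizes = prizes);
    });
  }
}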
I have the following readable stream in typescript:
import {Readable} from "stream";
enum InputState {
NOT_READABLE,
READABLE,
ENDED
}
export class Aggregator extends Readable {
private inputs: Array<NodeJS.ReadableStream>;
private states: Array<InputState>;
private records: Array<any>;
constructor(options, inputs: Array<NodeJS.ReadableStream>) {
// force object mode
options.objectMode = true;
super(options);
this.inputs = inputs;
// set initial state
this.states = this.inputs.map(() => InputState.NOT_READABLE);
this.records = this.inputs.map(() => null);
// register event handlers for input streams
this.inputs.forEach((input, i) => {
input.on("readable", () => {
console.log("input", i, "readable event fired");
this.states[i] = InputState.READABLE;
if (this._readable) { this.emit("_readable"); }
});
input.on("end", () => {
console.log("input", i, "end event fired");
this.states[i] = InputState.ENDED;
// if (this._end) { this.push(null); return; }
if (this._readable) { this.emit("_readable"); }
});
});
}
get _readable () {
return this.states.every(
state => state === InputState.READABLE ||
state === InputState.ENDED);
}
get _end () {
return this.states.every(state => state === InputState.ENDED);
}
_aggregate () {
console.log("calling _aggregate");
let timestamp = Infinity,
indexes = [];
console.log("initial record state", JSON.stringify(this.records));
this.records.forEach((record, i) => {
// try to read missing records
if (!this.records[i] && this.states[i] !== InputState.ENDED) {
this.records[i] = this.inputs[i].read();
if (!this.records[i]) {
this.states[i] = InputState.NOT_READABLE;
return;
}
}
// update timestamp if a better one is found
if (this.records[i] && timestamp > this.records[i].t) {
timestamp = this.records[i].t;
// clean the indexes array
indexes.length = 0;
}
// include the record index if has the required timestamp
if (this.records[i] && this.records[i].t === timestamp) {
indexes.push(i);
}
});
console.log("final record state", JSON.stringify(this.records), indexes, timestamp);
// end prematurely if after trying to read inputs the aggregator is
// not ready
if (!this._readable) {
console.log("end prematurely trying to read inputs", this.states);
this.push(null);
return;
}
// end prematurely if all inputs are ended and there is no remaining
// record values
if (this._end && indexes.length === 0) {
console.log("end on empty indexes", this.states);
this.push(null);
return;
}
// create the aggregated record
let record = {
t: timestamp,
v: this.records.map(
(r, i) => indexes.indexOf(i) !== -1 ? r.v : null
)
};
console.log("aggregated record", JSON.stringify(record));
if (this.push(record)) {
console.log("record pushed downstream");
// remove records already aggregated and pushed
indexes.forEach(i => { this.records[i] = null; });
this.records.forEach((record, i) => {
// try to read missing records
if (!this.records[i] && this.states[i] !== InputState.ENDED) {
this.records[i] = this.inputs[i].read();
if (!this.records[i]) {
this.states[i] = InputState.NOT_READABLE;
}
}
});
} else {
console.log("record failed to push downstream");
}
}
_read () {
console.log("calling _read", this._readable);
if (this._readable) { this._aggregate(); }
else {
this.once("_readable", this._aggregate.bind(this));
}
}
}
It is designed to aggregate multiple input streams in object mode; in the end it aggregates multiple time-series data streams into a single one. The problem I'm facing is that when I test the feature I repeatedly see the message "record failed to push downstream", immediately followed by "calling _read true", with just the 3 messages related to the aggregation algorithm in between. So the Readable stream machinery keeps calling _read, and every time the push() call fails. Any idea why this is happening? Do you know of a library that implements this kind of algorithm, or a better way to implement this feature?
I will answer my own question.
The problem was that I was misunderstanding the meaning of the this.push() return value. I thought a false return value meant that the current push operation had failed, but it really means that the internal buffer has reached its limit and you should stop pushing for now; the data you just pushed is still buffered.
A simple fix to the code shown above is to replace this:
if (this.push(record)) {
console.log("record pushed downstream");
// remove records already aggregated and pushed
indexes.forEach(i => { this.records[i] = null; });
this.records.forEach((record, i) => {
// try to read missing records
if (!this.records[i] && this.states[i] !== InputState.ENDED) {
this.records[i] = this.inputs[i].read();
if (!this.records[i]) {
this.states[i] = InputState.NOT_READABLE;
}
}
});
} else {
console.log("record failed to push downstream");
}
With this:
this.push(record);
console.log("record pushed downstream");
// remove records already aggregated and pushed
indexes.forEach(i => { this.records[i] = null; });
this.records.forEach((record, i) => {
// try to read missing records
if (!this.records[i] && this.states[i] !== InputState.ENDED) {
this.records[i] = this.inputs[i].read();
if (!this.records[i]) {
this.states[i] = InputState.NOT_READABLE;
}
}
});
You can see that the only difference is that we no longer condition the follow-up operations on the return value of the this.push() call. Given that the current implementation calls this.push() only once per _read() call, this simple change solves the issue.
It means feeding is faster than consuming. The official approach is to enlarge the stream's highWaterMark (default: 16384 bytes (16KB), or 16 objects in objectMode). As long as the internal buffer is big enough, push() will keep returning true. It does not have to be a single push() per _read(); you may push as much as the highWaterMark indicates within a single _read().
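For reference, a minimal sketch of raising highWaterMark on an object-mode Readable (the value 64 is illustrative, not taken from the code above); the same option can be passed through the Aggregator constructor, since it forwards its options to super():
import { Readable } from "stream";
const source = new Readable({
  objectMode: true,
  highWaterMark: 64, // buffer up to 64 objects before push() starts returning false
  read() {
    // produce and push records here; stop when push() returns false
    // and resume when the stream machinery calls read() again
  }
});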