MongoDB value not incrementing after being incremented once - node.js

I have been working on a project for a URL shortener, where I am trying to increment the views every time someone hits that API or makes a GET request. I don't know what is going wrong, but the views only increment once.
Here's the code I wrote to increment the views every time a get request is made.
Get - link
Incrementing in MongoDB - link
I do not know what is going wrong; I have already searched through lots of forums, Stack Overflow threads, articles, and previously asked questions.
Note:
The default value of views is 0; here is the value in the POST request.
The distinct users and generated by fields are dummy values as of now.
Mongoose Schema Link - link
I am creating this project with Node.js, Express.js, MongoDB (database), Mongoose (ODM), and Pug.js (view engine).
Project Link : link
Here's the file tree
├── LICENSE
├── README.md
├── client
│   ├── assets
│   │   ├── bg.svg
│   │   ├── favicon
│   │   │   └── favicon.ico
│   │   └── fonts
│   │       ├── Apercu\ Medium.woff
│   │       ├── Apercu\ Mono.woff
│   │       └── Apercu_Regular.woff
│   ├── css
│   │   └── style.css
│   ├── index.html
│   └── js
│       └── script.js
├── index.js
├── models
│   ├── admin_model.js
│   └── urlshorten.js
├── package-lock.json
├── package.json
├── routes
│   ├── admin.js
│   ├── auth.js
│   ├── custom.js
│   ├── stats.js
│   └── urlShorten.js
├── static
│   ├── css
│   │   └── style.css
│   ├── favicon.ico
│   ├── fonts
│   │   └── Inter-Regular.woff
│   └── urlshort.gif
└── views
    ├── index.pug
    └── script.js

In simple terms:
1. It is because of the 301 status code you are using for the redirect.
2. A 301 status code represents a permanent redirect.
3. The browser caches the original URL in memory.
4. The next time you request the short link from your browser, it takes the original URL from its cache and sends you there, without going through the server.
5. Since the request never reaches the server, the server never gets a chance to increment the count.
Solution:
Change the 301 (permanent) status code to a 307 (temporary) status code.
FILE: ShortLink/routes/urlShortner.js
Change the below lines
Line 37: res.writeHead(301, {
Line 38: Location: url.inputUrl
Line 39: });
to
Line 37: res.writeHead(307, {
Line 38: Location: url.inputUrl
Line 39: });
I have also created a pull request to your GitHub repo, which you can verify.
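For context, here is a minimal sketch of what such a redirect route could look like with the counter incremented atomically (the model, field, and route names here are assumptions based on the question, not the repo's actual code; res.redirect(307, ...) is equivalent to the writeHead call shown above):

// routes/urlShorten.js - hypothetical sketch, not the repo's actual code
const express = require('express');
const router = express.Router();
const UrlShorten = require('../models/urlshorten'); // assumed model name

router.get('/:code', async (req, res) => {
  // Increment views and fetch the document in a single atomic operation.
  const url = await UrlShorten.findOneAndUpdate(
    { shortCode: req.params.code },   // assumed lookup field
    { $inc: { views: 1 } },
    { new: true }
  );
  if (!url) return res.status(404).send('Short link not found');

  // 307 keeps the redirect temporary, so the browser asks the server every time.
  res.redirect(307, url.inputUrl);
});

module.exports = router;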

Related

My step definitions are not being picked up in Cypress

My current project tree for cypress looks something like this:
├── cypress
│ ├── OtherProjectFolder
│ │ ├── frontend
│ │ │ └── TestUI.feature
│ ├── pages_objects
│ │ ├── mainPage.js
│ └── step_definitions
│ │ └── Testui.js
│ ├── e2e
│ │ ├── backend
│ │ │ └── TestBackend.feature
│ ├── pages_objects
│ │ ├── backendPage.js
│ └── step_definitions
│ │ └── TestBackend.js
Essentially I want to define all my step definitions in a different directory, and all my page objects in a different directory, because I have many projects to automate.
Here is what my current cucumber preprocessor config looks like in package.json:
"cypress-cucumber-preprocessor": {
"nonGlobalStepDefinitions": false,
"step_definitions": "cypress/e2e"
}
If I change the stepDefinitions path to "cypress/OtherProjectFolder", it then does not pick up the steps in e2e. If I just use "cypress", I get the error shown in the attached screenshot. I'm wondering if there is a way to make stepDefinitions global?
try:
"cypress-cucumber-preprocessor": {
"nonGlobalStepDefinitions": false,
"stepDefinitions": "cypress/your_folder_name/*.js"
}
You can of course add more paths after that.
Good to look at:
https://github.com/badeball/cypress-cucumber-preprocessor/blob/master/docs/step-definitions.md
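For example, with the newer @badeball/cypress-cucumber-preprocessor that the link above points to, stepDefinitions also accepts an array of glob patterns, so a config roughly like this (the exact paths are guesses based on your tree) should pick up both locations:

"cypress-cucumber-preprocessor": {
  "stepDefinitions": [
    "cypress/e2e/**/step_definitions/*.js",
    "cypress/OtherProjectFolder/**/step_definitions/*.js"
  ]
}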

Deployment error with Docusaurus when trying to deploy via now to Vercel (Zeit)

I am facing a big problem while deploying to Vercel via now.
I am able to successfully execute yarn run build and yarn run start, but when I try to deploy via now, I get this error:
Error: You have 'doc' in your headerLinks, but no 'docs' folder exists one level up from 'website' folder. Did you run `docusaurus-init` or `npm run examples`? If so, make sure you rename 'docs-examples-from-docusaurus' to 'docs'.
2020-08-24T18:48:36.566Z at forEach (/vercel/58999d95/node_modules/docusaurus/lib/core/nav/HeaderNav.js:253:15)
2020-08-24T18:48:36.566Z at Array.forEach (<anonymous>)
2020-08-24T18:48:36.566Z at HeaderNav.renderResponsiveNav (/vercel/58999d95/node_modules/docusaurus/lib/core/nav/HeaderNav.js:248:17)
2020-08-24T18:48:36.566Z at HeaderNav.render (/vercel/58999d95/node_modules/docusaurus/lib/core/nav/HeaderNav.js:325:19)
2020-08-24T18:48:36.566Z at processChild (/vercel/58999d95/node_modules/react-dom/cjs/react-dom-server.node.development.js:3134:18)
2020-08-24T18:48:36.567Z at resolve (/vercel/58999d95/node_modules/react-dom/cjs/react-dom-server.node.development.js:2960:5)
2020-08-24T18:48:36.567Z at ReactDOMServerRenderer.render (/vercel/58999d95/node_modules/react-dom/cjs/react-dom-server.node.development.js:3435:22)
2020-08-24T18:48:36.567Z at ReactDOMServerRenderer.read (/vercel/58999d95/node_modules/react-dom/cjs/react-dom-server.node.development.js:3373:29)
2020-08-24T18:48:36.567Z at renderToStaticMarkup (/vercel/58999d95/node_modules/react-dom/cjs/react-dom-server.node.development.js:4004:27)
2020-08-24T18:48:36.567Z at renderToStaticMarkupWithDoctype (/vercel/58999d95/node_modules/docusaurus/lib/server/renderUtils.js:16:48)
2020-08-24T18:48:36.607Z error Command failed with exit code 1.
Here is my file structure.
example
├── Dockerfile
├── docker-compose.yml
├── docs
│   ├── doc1.md
│   ├── doc2.md
│   ├── doc3.md
│   ├── exampledoc4.md
│   └── exampledoc5.md
└── website
    ├── README.md
    ├── blog
    │   ├── 2016-03-11-blog-post.md
    │   ├── 2017-04-10-blog-post-two.md
    │   ├── 2017-09-25-testing-rss.md
    │   ├── 2017-09-26-adding-rss.md
    │   └── 2017-10-24-new-version-1.0.0.md
    ├── core
    │   └── Footer.js
    ├── package.json
    ├── pages
    │   └── en
    │       ├── help.js
    │       ├── index.js
    │       └── users.js
    ├── sidebars.json
    ├── siteConfig.js
    ├── static
    │   ├── css
    │   │   └── custom.css
    │   └── img
    │       ├── favicon.ico
    │       ├── oss_logo.png
    │       ├── undraw_code_review.svg
    │       ├── undraw_monitor.svg
    │       ├── undraw_note_list.svg
    │       ├── undraw_online.svg
    │       ├── undraw_open_source.svg
    │       ├── undraw_operating_system.svg
    │       ├── undraw_react.svg
    │       ├── undraw_tweetstorm.svg
    │       └── undraw_youtube_tutorial.svg
    └── yarn.lock
Done in 0.57s.
This was actually generated via npx docusaurus-init, but I am still unable to deploy.
Any help would be highly appreciated :)
Fixed it by running npm install and then yarn run build.

Proper way to structure a program formed by 3 sub-programs [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 2 years ago.
I'm building a system formed by 3 programs; let's call them A, B and C. Right now I think that my file structure is a disaster.
Inside my root folder I have some shortcuts to .bat files that run the programs, and a folder called programs_data. Inside that folder I have 4 separate folders, one for each program plus a common folder.
The issue is that I need each program, and the sub-scripts in those programs, to be able to import from the common folder, and I need programs B and C to be able to call Program A's API.
As it is right now, I have a mess of appends to sys.path in the sub-files to import functions from upper levels.
What's the correct way to structure something like this?
Current structure:
root
├── Configuration.lnk
├── Documentation.lnk
├── Program A.lnk
├── Program B.lnk
├── Program C.lnk
├── programs_data
│   ├── Program A
│   │   ├── Program A API.py
│   │   ├── Program A.bat
│   │   ├── Program A.py
│   │   ├── src
│   │   │   ├── server.py
│   │   │   ├── test_functions.py
│   │   │   └── validation.py
│   │   └── targets
│   │       ├── sql_querys
│   │       │   ├── query1.sql
│   │       │   ├── query2.sql
│   │       │   └── queryn.sql
│   │       ├── target1.py
│   │       ├── target2.py
│   │       ├── target3.py
│   │       └── targetn.py
│   ├── Program B
│   │   ├── Program B.bat
│   │   ├── Program B.py
│   │   ├── classifiers
│   │   │   ├── classifier1.py
│   │   │   ├── classifier2.py
│   │   │   └── classifiern.py
│   │   ├── events.log
│   │   ├── o365_token.txt
│   │   └── src
│   │       ├── batchECImport.py
│   │       ├── classifier.py
│   │       └── logger.py
│   ├── Program C
│   │   ├── Program C.bat
│   │   ├── Program C.py
│   │   ├── Reports
│   │   │   ├── report 1
│   │   │   │   └── report.py
│   │   │   └── report 2
│   │   │       └── report.py
│   │   ├── o365_token.txt
│   │   ├── schedule.xlsx
│   │   └── src
│   │       └── report.py
│   └── common
│       ├── APIMailboxManager.py
│       ├── Documentation
│       │   └── Documentation.pdf
│       ├── FlexibleProcess.py
│       ├── config.py
│       ├── misc.py
│       ├── print_functions.py
│       ├── production_support_passwords.py
│       ├── reports_log.db
│       └── reports_log.py
└── schedule spreadsheet.lnk
Thanks!
You could use __init__.py files to configure the import path of your modules in each directory. Documentation here.
In these files, you should only add the relative path to the common folder.
I don't think there is another way to have a common folder if you want to avoid duplicated code...
I would suggest adding your root folder to the PYTHONPATH environment variable (https://www.tutorialspoint.com/What-is-PYTHONPATH-environment-variable-in-Python) so that you don't have to append to sys.path.
That way you can easily write code like this:
from root.programs_data.src import server
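As a rough sketch of how that can look for the shared code specifically (assuming PYTHONPATH is pointed at the programs_data folder, e.g. set in the .bat launchers, and Python 3.3+ so common is picked up as a namespace package):

# Works from any of Program A/B/C once PYTHONPATH contains <root>\programs_data;
# no sys.path appends needed, and no __init__.py required on Python 3.3+.
from common import config, misc
from common import print_functions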

Looping through every script in a folder as a parallel job

I need to run the scripts in these folders as parallel jobs.
Here is what my folder structure looks like:
.
├── quo1374
├── quo2147
├── quo1407
......
├── quo1342
│   ├── dist
│   │   └── v0.1.0-alpha
│   │       └── unix
│   │           └── mjolnir
│   ├── examples
│   │   ├── values-local-134217728-m4.2xlarge.yaml
│   │   │
│   ├── remote_script.sh
│   └── run
│       ├── quo1342-134217728-m4.2xlarge
│       │   ├── quo1342-134217728-m4.2xlarge
│       │   └── quo1342-134217728-m4.2xlarge.sh
│       ├── quo1342-134217728-m4.xlarge
│       │   ├── quo1342-134217728-m4.xlarge
│       │   └── quo1342-134217728-m4.xlarge.sh
│       ├── quo1342-134217728-m5.12xlarge
│       │   ├── quo1342-134217728-m5.12xlarge
│       │   └── quo1342-134217728-m5.12xlarge.sh
│       ├── quo1342-134217728-m5.16xlarge
│       │   ├── quo1342-134217728-m5.16xlarge
│       │   └── quo1342-134217728-m5.16xlarge.sh
│       ├── quo1342-134217728-m5.24xlarge
│       │   ├── quo1342-134217728-m5.24xlarge
│       │   └── quo1342-134217728-m5.24xlarge.sh
│       ├── quo1342-134217728-m5.4xlarge
│       │   ├── quo1342-134217728-m5.4xlarge
│       │   └── quo1342-134217728-m5.4xlarge.sh
│       ├── quo1342-134217728-m5.8xlarge
│       │   ├── quo1342-134217728-m5.8xlarge
│       │   └── quo1342-134217728-m5.8xlarge.sh
│       ├── quo1342-134217728-m5.metal
│       │   ├── quo1342-134217728-m5.metal
│       │   └── quo1342-134217728-m5.metal.sh
│       ├── quo1342-134217728-t2.2xlarge
│       │   ├── quo1342-134217728-t2.2xlarge
│       │   └── quo1342-134217728-t2.2xlarge.sh
│       ├── quo1342-134217728-t3a.2xlarge
│       │   ├── quo1342-134217728-t3a.2xlarge
│       │   └── quo1342-134217728-t3a.2xlarge.sh
│       └── quo1342-134217728-t3a.xlarge
│           ├── quo1342-134217728-t3a.xlarge
│           └── quo1342-134217728-t3a.xlarge.sh
For example, the script quo1342-134217728-m4.2xlarge.sh runs one job. This is a subset of the jobs I would like to run. I am trying to come up with a script that will take the contents of run/quo134*/quo1342-134217728*.sh and run each as a separate job, i.e. when activated, it would loop through each of the scripts in the folder, and each folder's whole run would be put in the background with &. The reasoning behind this is that I have about 12 separate folders that look like this, and I would love to run them in parallel. It is, however, important that the scripts within each folder are run sequentially.
Here is an attempt at what I am trying to do. Although it does not work, I hope it adds clarity to my question.
for f in *
do cd $f/run
for f in *.sh
bash "$f" -H &
cd ..
done
done
I would appreciate any pointers on this.
Update
The answer from dash-o helped, but led to another issue. The bash scripts use relative paths, e.g. quo1342-134217728-t3a.xlarge.sh contains references like
../../dist/v0.1.0-alpha/unix/mjolnir
When I use your script it runs, but it appears that the execution does not respect the relative paths in the script, i.e.
ssh: Could not resolve hostname : Name or service not known + ../../dist/v0.1.0-alpha/unix/mjolnir destroy ../../examples/values-local-549755813888-t3a.xlarge.yaml
Is there a way to run the scripts that doesn't break this?
You can implement this with a helper function:
function run_folder {
    local dir=$1 f
    cd "$dir/run" || return
    # sequential execution
    for f in */*.sh ; do
        # Run each script from its own folder, so relative paths like
        # ../../dist/v0.1.0-alpha/unix/mjolnir inside the script still resolve.
        ( cd "${f%/*}" && bash "${f##*/}" -H )
    done
}

# parallel execution
for j in * ; do
    run_folder "$j" &
done
wait
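If only the subset from the question is wanted (run/quo134*/quo1342-134217728*.sh), the same structure works with a narrower glob; a sketch, assuming the 134217728 part of the name is what selects the subset:

#!/usr/bin/env bash
# Parallel across the quo* folders, sequential within each folder,
# restricted to the *-134217728-* jobs (adjust the pattern as needed).
run_subset() {
    local dir=$1 f
    cd "$dir/run" || return
    for f in */*-134217728-*.sh ; do
        ( cd "${f%/*}" && bash "${f##*/}" -H )
    done
}

for j in quo* ; do
    run_subset "$j" &
done
wait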

Rebase Relative Assets in Gulp

I'm using gulp-pretty-url to keep page URLs clean in a project, such that a file src/about.html will be output as app/about/index.html. Some files end up in deeper structures though, like src/series-460.html becoming app/series/460/index.html. Of course, each of the HTML files references JavaScript and CSS files, and they need to use relative paths.
How can I rebase relative asset paths in files as they're being run through a gulp task? Here's the file structure I'm working with:
project
├── app
│   ├── about
│   │   └── index.html
│   ├── assets
│   │   ├── css
│   │   │   └── styles.css
│   │   └── js
│   │       └── scripts.js
│   └── index.html
└── src
    ├── assets
    │   ├── css
    │   │   └── styles.css
    │   └── js
    │       └── scripts.js
    ├── about.html
    └── index.html
