Django: Download Excel after User Clicks a Button

In my Django project, I need to send an in-memory Excel file to the client side for download. The download should start after the user clicks a button.
Here is my project structure:
C:.
│   db.sqlite3
│   manage.py
│   Pipfile
│   Pipfile.lock
│   requirements.txt
│
├───app
│   │   admin.py
│   │   apps.py
│   │   forms.py
│   │   models.py
│   │   tests.py
│   │   urls.py
│   │   views.py
│   │   __init__.py
│   │
│   ├───migrations
│   │       __init__.py
│   │
│   └───templates
│       └───app
│               home.html
│
└───project
        settings.py
        urls.py
        wsgi.py
        __init__.py
My app/forms.py:
from django import forms

class HomeForm(forms.Form):
    OPTIONS = (
        ('name', 'name'),
        ('city', 'city'),
    )
    columns = forms.MultipleChoiceField(widget=forms.CheckboxSelectMultiple,
                                        choices=OPTIONS)
My app/views.py:
from django.views.generic import TemplateView
from django.shortcuts import render
from app.forms import HomeForm

class HomeView(TemplateView):
    template = 'app/home.html'

    def get(self, request):
        form = HomeForm()
        return render(request, self.template, {'form': form})
This is my app/urls.py:
from django.urls import path
from . import views
urlpatterns = [
    path('', views.HomeView.as_view(), name="home"),
]
My project/urls.py:
from django.contrib import admin
from django.urls import path, include
urlpatterns = [
    path('', include('app.urls')),
    path('admin/', admin.site.urls),
]
My app/home.html:
<!DOCTYPE html>
<html>
<head>
    <title>Generate Data</title>
</head>
<body>
    <form method="get">
        {% csrf_token %}
        {{ form.as_p }}
        <input type="submit" value="Generate Excel">
    </form>
</body>
</html>
(Screenshot: the rendered page with the checkboxes.)
I'm new to Django. How can I get the selected columns (name, city) as a list, build a dictionary like {'name': ['Bob', 'Tom'], 'city': ['San Francisco', 'Atlanta']}, and pass that dictionary to the following function, which creates the Excel data in memory:
import pandas as pd
from io import BytesIO as IO
from django.http import HttpResponse
import xlsxwriter

def write_to_excel():
    df_output = pd.DataFrame({'name': ['Bob', 'Tom'], 'city': ['San Francisco', 'Atlanta']})
    # my "Excel" file, which is an in-memory output file (buffer)
    # for the new workbook
    excel_file = IO()
    xlwriter = pd.ExcelWriter(excel_file, engine='xlsxwriter')
    df_output.to_excel(xlwriter, sheet_name='sheetname')
    # close() saves the workbook into the buffer; a separate save() call is redundant
    xlwriter.close()
    # important step: rewind the buffer, or read() will return nothing and you'll
    # get an error message when you try to open your zero-length file in Excel
    excel_file.seek(0)
    # set the MIME type so that the browser knows what to do with the file
    response = HttpResponse(excel_file.read(), content_type='application/vnd.openxmlformats-officedocument.spreadsheetml.sheet')
    # set the file name in the Content-Disposition header
    response['Content-Disposition'] = 'attachment; filename=myfile.xlsx'
    return response
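The "rewind the buffer" comment is worth seeing in isolation: a BytesIO buffer reads from wherever its cursor currently sits, which is the end of the data right after writing. A minimal standard-library demonstration:

```python
from io import BytesIO

buf = BytesIO()
buf.write(b"fake xlsx bytes")

# The cursor now sits at the end of what was just written, so an immediate
# read() returns nothing -- exactly the "zero length file" symptom.
print(buf.read())      # b''

# Rewind to the start, then the full payload comes back.
buf.seek(0)
print(buf.read())      # b'fake xlsx bytes'
```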
When a user clicks the Generate Excel button, the Excel file with the data should be downloaded. I've given all of this code because I think it's necessary for someone to help me.
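One way this could be wired up (a sketch, untested against this exact project): bind the form to request.GET, read form.cleaned_data['columns'] as a list, and filter a data dictionary down to the chosen columns before handing it to write_to_excel(). The DATA dict and select_columns() helper below are hypothetical stand-ins for the real data source; the Django wiring is shown in comments so the core logic stays self-contained.

```python
# Hypothetical stand-in for the real data source (a query, a CSV, ...).
DATA = {'name': ['Bob', 'Tom'], 'city': ['San Francisco', 'Atlanta']}

def select_columns(data, columns):
    """Build the dict write_to_excel() needs, keeping only ticked columns."""
    return {col: values for col, values in data.items() if col in columns}

# In the Django view, the wiring would look roughly like this:
#
#     def get(self, request):
#         form = HomeForm(request.GET or None)
#         if form.is_valid():                        # the button was clicked
#             columns = form.cleaned_data['columns'] # e.g. ['name', 'city']
#             return write_to_excel(select_columns(DATA, columns))
#         return render(request, self.template, {'form': form})
#
# with write_to_excel() changed to accept the dict as a parameter:
#
#     def write_to_excel(data):
#         df_output = pd.DataFrame(data)
#         ...
```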

Related

How to import all express router files from multiple directories in nodejs?

I'm building a REST API with versioning support. Here is my directory structure.
.
├── src
│   ├── api
│   │   ├── v1
│   │   │   ├── modules ─ ...
│   │   │   └── routers
│   │   │       ├── auth.router.js
│   │   │       └── posts.router.js
│   │   └── v2
│   │       ├── modules ─ ...
│   │       └── routers ─ ...
└── app.js
I want the router files imported to app.js. I've looked for the solution for hours but all I found is how to import each file manually through app.use(). This is doable but as the version numbers and router files keep increasing, this can lead to redundant work. I need a way to import these files with the least manual lines of code possible.
Express has no built-in way to do this directly; people generally manage module imports manually in Node.js, since it doesn't take much work. For the version numbers, you could define a version setting or constant somewhere and import depending on that value.
For instance:
// routes.js
const apiVersion = "v2";

module.exports = {
    authRouter: require(`./${apiVersion}/routers/auth.router`),
};
If this is not ideal, one hacky way to manage it is to grab all of the route files with the fs module and import them automatically. I came up with something like this:
// router.js
const fs = require("fs");
const path = require("path");
const { Router } = require("express");

const router = Router();
const apiVersion = "v2";

// grab all the route files from the version's routers directory
const routesDir = path.join(__dirname, apiVersion, "routers");
fs.readdirSync(routesDir)
    .filter((file) => file.endsWith(".router.js"))
    // use require to grab each router and mount it
    .forEach((file) => router.use(require(path.join(routesDir, file))));

module.exports = router;
// app.js
const router = require("./path/to/router");
// ...boilerplate
app.use(router);

Application Factory Pattern: AttributeError: 'tuple' object has no attribute 'shell_context_processor' [closed]

Closed. This question is not reproducible or was caused by typos. It is not currently accepting answers.
This question was caused by a typo or a problem that can no longer be reproduced. While similar questions may be on-topic here, this one was resolved in a way less likely to help future readers.
Closed 2 years ago.
I am trying to refactor my Flask app to the application factory pattern, but I got stuck on the following error:
Traceback (most recent call last):
File "/usr/local/bin/flask", line 8, in <module>
sys.exit(main())
File "/usr/local/lib/python3.7/site-packages/flask/cli.py", line 967, in main
cli.main(args=sys.argv[1:], prog_name="python -m flask" if as_module else None)
File "/usr/local/lib/python3.7/site-packages/flask/cli.py", line 586, in main
return super(FlaskGroup, self).main(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/usr/local/lib/python3.7/site-packages/click/core.py", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/usr/local/lib/python3.7/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/usr/local/lib/python3.7/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/click/decorators.py", line 73, in new_func
return ctx.invoke(f, obj, *args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/flask/cli.py", line 848, in run_command
app = DispatchingApp(info.load_app, use_eager_loading=eager_loading)
File "/usr/local/lib/python3.7/site-packages/flask/cli.py", line 305, in __init__
self._load_unlocked()
File "/usr/local/lib/python3.7/site-packages/flask/cli.py", line 330, in _load_unlocked
self._app = rv = self.loader()
File "/usr/local/lib/python3.7/site-packages/flask/cli.py", line 388, in load_app
app = locate_app(self, import_name, name)
File "/usr/local/lib/python3.7/site-packages/flask/cli.py", line 240, in locate_app
__import__(module_name)
File "/home/chrdina/python/werda/werda.py", line 7, in <module>
@app.shell_context_processor
AttributeError: 'tuple' object has no attribute 'shell_context_processor'
I tried to find out where the AttributeError happens, but I'm totally stuck. I don't understand where the mentioned tuple is located in my code. I tried deleting the shell_context_processor part, but then the app doesn't seem to know it's an app at all and throws: Error: Failed to find Flask application or factory in module "werda". Use "FLASK_APP=werda:name" to specify one.
If I then export FLASK_APP again (it's already set in my .flaskenv), nothing changes.
I define the shell_context_processor in my werda.py (which is my app) in the top level of my directory.
werda.py:
from app import create_app, db
from app.models import User, Role, Employee, Language, Employee_Languages, Attendance, AttendanceArchive

app = create_app()

@app.shell_context_processor
def make_shell_context():
    return {'db': db, 'User': User, 'Role': Role, 'Employee': Employee, 'Language': Language, 'Employee_Languages': Employee_Languages, 'Attendance': Attendance, 'AttendanceArchive': AttendanceArchive}
__init__.py:
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from flask_migrate import Migrate
from flask_admin import Admin
from flask_bootstrap import Bootstrap
from flask_security import Security, SQLAlchemyUserDatastore
from flask_mail import Mail
import logging
from logging.handlers import SMTPHandler, RotatingFileHandler
import os
from config import Config

db = SQLAlchemy()
migrate = Migrate()
bootstrap = Bootstrap()

from app.models import User, Role
user_datastore = SQLAlchemyUserDatastore(db, User, Role)
security = Security()
mail = Mail()
admin = Admin()

def create_app(config_class=Config):
    app = Flask(__name__)
    app.config.from_object(Config)
    db.init_app(app)
    migrate.init_app(app, db)

    from app.models import User, Role
    user_datastore = SQLAlchemyUserDatastore(db, User, Role)
    security.init_app(app, user_datastore)
    bootstrap.init_app(app)
    mail.init_app(app)
    app.config['FLASK_ADMIN_SWATCH'] = 'cerulean'

    from app.errors import bp as errors_bp
    app.register_blueprint(errors_bp)
    from app.main import bp as main_bp
    app.register_blueprint(main_bp)

    admin = Admin(app, name='werda', template_mode='bootstrap3')

    # send errors via mail:
    if not app.debug:
        if app.config['MAIL_SERVER']:
            auth = None
            if app.config['MAIL_USERNAME'] or app.config['MAIL_PASSWORD']:
                auth = (app.config['MAIL_USERNAME'], app.config['MAIL_PASSWORD'])
            secure = None
            if app.config['MAIL_USE_TLS']:
                secure = ()
            mail_handler = SMTPHandler(
                mailhost=(app.config['MAIL_SERVER'], app.config['MAIL_PORT']),
                fromaddr='no-reply@' + app.config['MAIL_SERVER'],
                toaddrs=app.config['ADMINS'], subject='werda Failure',
                credentials=auth, secure=secure)
            mail_handler.setLevel(logging.ERROR)
            app.logger.addHandler(mail_handler)

        # save errors to log file:
        if not os.path.exists('logs'):
            os.mkdir('logs')
        file_handler = RotatingFileHandler('logs/werda.log', maxBytes=10240,
                                           backupCount=10)
        file_handler.setFormatter(logging.Formatter(
            '%(asctime)s %(levelname)s: %(message)s [in %(pathname)s:%(lineno)d]'))
        file_handler.setLevel(logging.INFO)
        app.logger.addHandler(file_handler)
        app.logger.setLevel(logging.INFO)
        app.logger.info('werda startup')

    return app, admin

from app import models
Directory Structure
.
├── app
│   ├── cli.py
│   ├── errors
│   │   ├── handlers.py
│   │   ├── __init__.py
│   │   └── __pycache__
│   ├── __init__.py
│   ├── __init__.pyc
│   ├── main
│   │   ├── forms.py
│   │   ├── __init__.py
│   │   ├── __pycache__
│   │   └── routes.py
│   ├── models.py
│   └── templates
│       ├── addnextweek.html
│       ├── base.html
│       ├── errors
│       │   ├── 403.html
│       │   ├── 404.html
│       │   └── 500.html
│       ├── index.html
│       ├── movetoarchive.html
│       ├── nextweek.html
│       ├── now.html
│       ├── russ.html
│       ├── security
│       │   ├── login_user.html
│       │   └── register_user.html
│       ├── thisweek.html
│       ├── thisweek_working.html
│       └── today.html
├── app.db
├── config.py
├── libc6_2.31-0ubuntu8+lp1871129~1_amd64.deb
├── README.md
├── requirements.txt
├── werda.py
Consider using the init_app() pattern in your factory function. You should also change app.config.from_object(Config) to app.config.from_object(config_class), so the parameter your factory accepts is actually used.
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from flask_migrate import Migrate
from flask_admin import Admin
from flask_bootstrap import Bootstrap
from flask_security import Security, SQLAlchemyUserDatastore
from flask_mail import Mail
import logging
from logging.handlers import SMTPHandler, RotatingFileHandler
import os
from config import Config
from app.models import User, Role  # <----------- repositioned

db = SQLAlchemy()
migrate = Migrate()
bootstrap = Bootstrap()
user_datastore = SQLAlchemyUserDatastore(db, User, Role)  # <------------- modified: built once at import time
security = Security()
mail = Mail()
admin = Admin()

def create_app(config_class=Config):
    app = Flask(__name__)
    app.config.from_object(config_class)  # <------ modified: use the parameter
    db.init_app(app)
    migrate.init_app(app, db)
    security.init_app(app, user_datastore)  # <------- modified: reuse the module-level datastore
    bootstrap.init_app(app)
    mail.init_app(app)
    app.config['FLASK_ADMIN_SWATCH'] = 'cerulean'

    from app.errors import bp as errors_bp
    app.register_blueprint(errors_bp)
    from app.main import bp as main_bp
    app.register_blueprint(main_bp)

    admin = Admin(app, name='werda', template_mode='bootstrap3')

    # send errors via mail:
    if not app.debug:
        if app.config['MAIL_SERVER']:
            auth = None
            if app.config['MAIL_USERNAME'] or app.config['MAIL_PASSWORD']:
                auth = (app.config['MAIL_USERNAME'], app.config['MAIL_PASSWORD'])
            secure = None
            if app.config['MAIL_USE_TLS']:
                secure = ()
            mail_handler = SMTPHandler(
                mailhost=(app.config['MAIL_SERVER'], app.config['MAIL_PORT']),
                fromaddr='no-reply@' + app.config['MAIL_SERVER'],
                toaddrs=app.config['ADMINS'], subject='werda Failure',
                credentials=auth, secure=secure)
            mail_handler.setLevel(logging.ERROR)
            app.logger.addHandler(mail_handler)

        # save errors to log file:
        if not os.path.exists('logs'):
            os.mkdir('logs')
        file_handler = RotatingFileHandler('logs/werda.log', maxBytes=10240,
                                           backupCount=10)
        file_handler.setFormatter(logging.Formatter(
            '%(asctime)s %(levelname)s: %(message)s [in %(pathname)s:%(lineno)d]'))
        file_handler.setLevel(logging.INFO)
        app.logger.addHandler(file_handler)
        app.logger.setLevel(logging.INFO)
        app.logger.info('werda startup')

    return app, admin

from app import models
I got it!
I returned both app and admin from create_app(), which caused the problem. That is also where the mysterious tuple was situated: returning two values packs them into a tuple.
If create_app() returns only app, my app starts as it should. Now to work out how to run Flask-Admin correctly.
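The tuple in the traceback is exactly what `return app, admin` produces: returning two comma-separated values from a function packs them into one tuple. A stripped-down illustration, using a stand-in App class rather than Flask:

```python
class App:
    def shell_context_processor(self, func):
        # Stand-in for Flask's decorator-registration method.
        return func

def create_app_wrong():
    admin = "admin-placeholder"
    return App(), admin        # two values -> Python packs them into a tuple

def create_app_right():
    return App()

app = create_app_wrong()
print(type(app).__name__)      # tuple

try:
    app.shell_context_processor
except AttributeError as exc:
    print(exc)                 # 'tuple' object has no attribute 'shell_context_processor'

app = create_app_right()
print(type(app).__name__)      # App -- the decorator lookup now succeeds
```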
Corrected code for now:
__init__.py:
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from flask_migrate import Migrate
from flask_admin import Admin
from flask_bootstrap import Bootstrap
from flask_security import Security, SQLAlchemyUserDatastore
from flask_mail import Mail
import logging
from logging.handlers import SMTPHandler, RotatingFileHandler
import os
from config import Config

db = SQLAlchemy()
migrate = Migrate()
bootstrap = Bootstrap()

from app.models import User, Role
user_datastore = SQLAlchemyUserDatastore(db, User, Role)
security = Security()
mail = Mail()
admin = Admin()

def create_app(config_class=Config):
    app = Flask(__name__)
    app.config.from_object(Config)
    db.init_app(app)
    migrate.init_app(app, db)

    from app.models import User, Role
    user_datastore = SQLAlchemyUserDatastore(db, User, Role)
    security.init_app(app, user_datastore)
    bootstrap.init_app(app)
    mail.init_app(app)
    app.config['FLASK_ADMIN_SWATCH'] = 'cerulean'

    from app.errors import bp as errors_bp
    app.register_blueprint(errors_bp)
    from app.main import bp as main_bp
    app.register_blueprint(main_bp)

    admin = Admin(app, name='werda', template_mode='bootstrap3')

    # send errors via mail:
    if not app.debug:
        if app.config['MAIL_SERVER']:
            auth = None
            if app.config['MAIL_USERNAME'] or app.config['MAIL_PASSWORD']:
                auth = (app.config['MAIL_USERNAME'], app.config['MAIL_PASSWORD'])
            secure = None
            if app.config['MAIL_USE_TLS']:
                secure = ()
            mail_handler = SMTPHandler(
                mailhost=(app.config['MAIL_SERVER'], app.config['MAIL_PORT']),
                fromaddr='no-reply@' + app.config['MAIL_SERVER'],
                toaddrs=app.config['ADMINS'], subject='werda Failure',
                credentials=auth, secure=secure)
            mail_handler.setLevel(logging.ERROR)
            app.logger.addHandler(mail_handler)

        # save errors to log file:
        if not os.path.exists('logs'):
            os.mkdir('logs')
        file_handler = RotatingFileHandler('logs/werda.log', maxBytes=10240,
                                           backupCount=10)
        file_handler.setFormatter(logging.Formatter(
            '%(asctime)s %(levelname)s: %(message)s [in %(pathname)s:%(lineno)d]'))
        file_handler.setLevel(logging.INFO)
        app.logger.addHandler(file_handler)
        app.logger.setLevel(logging.INFO)
        app.logger.info('werda startup')

    return app  # deleted admin here

from app import models

The package import path is different for dynamic codegen and static codegen

Here is the structure for src directory of my project:
.
├── config.ts
├── protos
│   ├── index.proto
│   ├── index.ts
│   ├── share
│   │   ├── topic.proto
│   │   ├── topic_pb.d.ts
│   │   ├── user.proto
│   │   └── user_pb.d.ts
│   ├── topic
│   │   ├── service.proto
│   │   ├── service_grpc_pb.d.ts
│   │   ├── service_pb.d.ts
│   │   ├── topic.integration.test.ts
│   │   ├── topic.proto
│   │   ├── topicServiceImpl.ts
│   │   ├── topicServiceImplDynamic.ts
│   │   └── topic_pb.d.ts
│   └── user
│       ├── service.proto
│       ├── service_grpc_pb.d.ts
│       ├── service_pb.d.ts
│       ├── user.proto
│       ├── userServiceImpl.ts
│       └── user_pb.d.ts
└── server.ts
share/user.proto:
syntax = "proto3";
package share;

message UserBase {
  string loginname = 1;
  string avatar_url = 2;
}
topic/topic.proto:
syntax = "proto3";
package topic;
import "share/user.proto";
enum Tab {
  share = 0;
  ask = 1;
  good = 2;
  job = 3;
}

message Topic {
  string id = 1;
  string author_id = 2;
  Tab tab = 3;
  string title = 4;
  string content = 5;
  share.UserBase author = 6;
  bool good = 7;
  bool top = 8;
  int32 reply_count = 9;
  int32 visit_count = 10;
  string create_at = 11;
  string last_reply_at = 12;
}
As you can see, I import the share package and use the UserBase message type inside the Topic message type. When I try to start the server, I get this error:
no such Type or Enum 'share.UserBase' in Type .topic.Topic
But when I change the package import path to a relative path, import "../share/user.proto";, it works fine and the server logs: Server is listening on http://localhost:3000.
Above is the usage of dynamic codegen.
Now, I switch to using static codegen, here is the shell script for generating the codes:
protoc \
--plugin=protoc-gen-ts=./node_modules/.bin/protoc-gen-ts \
--ts_out=./src/protos \
-I ./src/protos \
./src/protos/**/*.proto
It seems the protocol buffer compiler doesn't support relative paths; I get this error:
../share/user.proto: Backslashes, consecutive slashes, ".", or ".." are not allowed in the virtual path
So I changed the package import path back to import "share/user.proto";. The code generates correctly, but when I try to start my server, I get the same error:
no such Type or Enum 'share.UserBase' in Type .topic.Topic
It's weird.
Package versions:
"grpc-tools": "^1.6.6",
"grpc_tools_node_protoc_ts": "^4.1.3",
protoc --version
libprotoc 3.10.0
UPDATE:
repo: https://github.com/mrdulin/nodejs-grpc/tree/master/src
Your dynamic codegen is failing because you are not specifying the paths to search for imported .proto files. You can do this using the includeDirs option when calling protoLoader.loadSync, which works in a very similar way to the -I option you pass to protoc. In this case, you are loading the proto files from the src/protos directory, so it should be sufficient to pass the option includeDirs: [__dirname]. Then the import paths in your .proto files should be relative to that directory, just like when you use protoc.
You are probably seeing the same error when you try to use the static code generation because it is actually the dynamic codegen error; you don't appear to be removing the dynamic codegen code when trying to use the statically generated code.
However, the main problem you will face with the statically generated code is that you are only generating the TypeScript type definition files. You also need to generate JavaScript files to actually run it. The official Node gRPC plugin for protoc is distributed in the grpc-tools package. It comes with a binary called grpc_tools_node_protoc, which should be used in place of protoc and automatically includes the plugin. You will still need to pass a --js_out flag to generate that code.

AWS SAM Nested Application in Python with Dynamorm

I am using AWS SAM to build a Serverless application. I followed the instruction to build a nested application.
My application structure is basically the following:
.
├── MAKEFILE
├── README.md
├── __init__.py
├── apps
│   ├── __init__.py
│   └── account
│       ├── __init__.py
│       ├── endpoints.py
│       ├── models.py
│       ├── requirements.txt
│       └── template.yaml
├── samconfig.toml
└── template.yaml
The requirements.txt in the folder apps/account/ lists the following Python packages: boto3, marshmallow, and dynamorm.
sam build and sam deploy work fine and the Lambda functions are deployed correctly. However, I receive an error when calling the Lambda function. The logs show the following error: Unable to import module 'endpoints': No module named 'dynamorm'.
Here are excerpts from my code:
endpoints.py
import json
import boto3
from models import Account

print('Loading function')

def account_info(event, context):
    apiKey = event["requestContext"]["identity"]["apiKeyId"]
    account_info = Account.get(id=apiKey)
    return {
        "statusCode": 200,
        "body": json.dumps(account_info)
    }
models.py
import datetime
from dynamorm import DynaModel, GlobalIndex, ProjectAll
from marshmallow import Schema, fields, validate, validates, ValidationError

class Account(DynaModel):
    # Define our DynamoDB properties
    class Table:
        name = 'XXXXXXXXXX'
        hash_key = 'id'
        read = 10
        write = 5

    class Schema:
        id = fields.String(required=True)
        name = fields.String()
        email = fields.String()
        phonenumber = fields.String()
        status = fields.String()
I am not sure what I am missing. Are there additional instructions for building a nested app in SAM?
Thank you so much for the help!
According to https://github.com/awslabs/aws-sam-cli/issues/1213, this feature is not supported yet.
In my case, I ran sam build on every nested stack and changed the parent YAML template as follows (using the template.yaml generated by the sam build command); then it works. But it's just a workaround, not a nice way.
XXX_APP:
  Type: AWS::Serverless::Application
  Properties:
    Location: nest_application/.aws-sam/build/template.yaml

How to get data from BigQuery in React.js app?

What I want to do and the problem
I want to access a table in BigQuery, but I get an error like:
TypeError: fs.createReadStream is not a function
at GoogleAuth.getClient (googleauth.js:497)
at GoogleAuth.authorizeRequest (googleauth.js:530)
at BigQuery.makeAuthenticatedRequest (util.js:374)
at BigQuery.request_ (service.js:129)
at BigQuery.request (service.js:140)
at BigQuery.createJob (bigquery.js:942)
at BigQuery.wrapper (index.js:42)
at BigQuery.createQueryJob (bigquery.js:862)
at BigQuery.wrapper (index.js:42)
at BigQuery.query (bigquery.js:1264)
at index.js:69
at new Promise (<anonymous>)
at BigQuery.wrapper (index.js:54)
at Signup.handleClick (Signup.js:52)
at HTMLUnknownElement.callCallback (react-dom.development.js:336)
at Object.invokeGuardedCallbackDev (react-dom.development.js:385)
at invokeGuardedCallback (react-dom.development.js:440)
at invokeGuardedCallbackAndCatchFirstError (react-dom.development.js:454)
at executeDispatch (react-dom.development.js:584)
at executeDispatchesInOrder (react-dom.development.js:609)
at executeDispatchesAndRelease (react-dom.development.js:713)
at executeDispatchesAndReleaseTopLevel (react-dom.development.js:722)
at forEachAccumulated (react-dom.development.js:694)
at runEventsInBatch (react-dom.development.js:739)
at runExtractedPluginEventsInBatch (react-dom.development.js:880)
at handleTopLevel (react-dom.development.js:5803)
at batchedEventUpdates$1 (react-dom.development.js:24401)
at batchedEventUpdates (react-dom.development.js:1415)
at dispatchEventForPluginEventSystem (react-dom.development.js:5894)
at attemptToDispatchEvent (react-dom.development.js:6010)
at dispatchEvent (react-dom.development.js:5914)
at unstable_runWithPriority (scheduler.development.js:697)
at runWithPriority$2 (react-dom.development.js:12149)
at discreteUpdates$1 (react-dom.development.js:24417)
at discreteUpdates (react-dom.development.js:1438)
at dispatchDiscreteEvent (react-dom.development.js:5881)
My Codes
First, I want to test with a simple version of the code. There is a button, and clicking it runs the handleClick function. Everything needed to access BigQuery is in this function for now.
import React from 'react';
import Avatar from '@material-ui/core/Avatar';
import Button from '@material-ui/core/Button';
import CssBaseline from '@material-ui/core/CssBaseline';
import PersonIcon from '@material-ui/icons/Person';
import Typography from '@material-ui/core/Typography';
import Container from '@material-ui/core/Container';

class Signup extends React.Component {
  constructor(props) {
    super(props);
    this.handleClick = this.handleClick.bind(this);
  }

  handleClick(event) {
    const { BigQuery } = require('@google-cloud/bigquery');
    const bigquery = new BigQuery({
      projectId: '(PROJECT_ID)',
      keyFilename: '../../credentials/(credential file name).json',
    });
    const query = `
      SELECT *
      FROM \`(PROJECT_ID).paper_list.user_auth_info\`;
    `;
    bigquery.query(query)
      .then(data => {
        const rows = data[0];
        rows.forEach(row => alert("Hello"));
      })
      .catch(err => console.log(err));
  }

  render() {
    const { classes } = this.props;
    return (
      <Container
        component="main"
        maxWidth="xs"
        className={classes.outer}
      >
        <CssBaseline />
        <div className={classes.paper}>
          <Avatar className={classes.avatar}>
            <PersonIcon />
          </Avatar>
          <Typography component="h1" variant="h5">
            Sign Up
          </Typography>
          <form className={classes.form} noValidate>
            <Button
              fullWidth
              variant="contained"
              color="primary"
              className={classes.submit}
              onClick={this.handleClick}
            >
              Sign Up
            </Button>
          </form>
        </div>
      </Container>
    );
  }
}
The directory structure is as follows:
.
├── Dockerfile
├── README.md
├── app.yaml
├── credentials # <- credential file is in this directory
├── node_modules
├── package-lock.json
├── package.json
├── public
└── src
├── App.css
├── App.js
├── App.test.js
├── Main.js
├── components
│   ├── AddFile.js
│   ├── Footer.js
│   ├── Header.js
│   ├── Inner.js
│   ├── Login.js
│   ├── PaperFolder.js
│   ├── PaperInfo.js
│   ├── PaperLabels.js
│   ├── SearchResultTable.js
│   ├── SideBar.js
│   ├── Signup.js # <- the code above
│   ├── TopLinks.js
│   └── ToppageMain.js
├── images
│   └── maarten-van-den-heuvel-8EzNkvLQosk-unsplash.jpg
├── index.css
├── index.js
├── serviceWorker.js
How to fix this error?
Thanks to the comment by Blundering Philosopher, I understood the problem.
BigQuery should be accessed from the server side, but React.js runs in the browser (front end), so the credentials and client library don't belong there — that was the problem. I wrote an API server in Go and resolved it.
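For anyone who wants to stay in Python rather than Go, the same fix applies: put the BigQuery client behind a small server-side endpoint and have the React app fetch JSON from it. This is only a standard-library sketch; fetch_rows() is a stub standing in for the real google-cloud-bigquery call (outlined in its comment), and the row shape is made up for illustration.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def fetch_rows():
    # Stub for the real server-side query. With the google-cloud-bigquery
    # package it would look roughly like:
    #     from google.cloud import bigquery
    #     client = bigquery.Client()  # credentials stay on the server
    #     return [dict(row) for row in client.query(sql).result()]
    return [{"user": "demo"}]

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The browser-side React code calls fetch("http://localhost:8000/")
        # instead of importing @google-cloud/bigquery directly.
        body = json.dumps(fetch_rows()).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To run the server: HTTPServer(("localhost", 8000), Handler).serve_forever()
```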
