Python3 & SQLite3 JSON API calls - python-3.x

I'm starting my Python journey with a particular project in mind.
The title explains what I'm trying to do: make JSON API calls with Python 3.6 and SQLite3. I'm working on a Mac.
My question is whether this setup is possible, or whether I should use MySQL, PostgreSQL, or MongoDB instead.
If it is possible, will I need any third-party software to make it run?
Sorry if this is off topic; I'm new to SO and my research via Google hasn't turned up an answer so far.
Thank you in advance for any help you can provide.

Python 3.6 and SQLite both work on a Mac; whether your JSON API calls will work depends on what service you are trying to call (unless you are writing a server that handles such calls yourself, in which case you are fine).
Any further recommendations are either a) off topic for SO or b) dependent on what you want to do with these technologies.
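That said, both json and sqlite3 ship with the standard library, so no third-party software is needed for the basics. A minimal sketch of the general pattern, where the endpoint URL and the id/name fields are placeholders for whatever API you end up calling:

    import json
    import sqlite3
    import urllib.request

    API_URL = "https://api.example.com/items"  # placeholder endpoint

    # Fetch and decode the JSON response
    with urllib.request.urlopen(API_URL) as response:
        items = json.loads(response.read().decode("utf-8"))

    # Store the results in a local SQLite database file
    conn = sqlite3.connect("items.db")
    conn.execute("CREATE TABLE IF NOT EXISTS items (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany(
        "INSERT OR REPLACE INTO items (id, name) VALUES (?, ?)",
        [(item["id"], item["name"]) for item in items],
    )
    conn.commit()
    conn.close()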

Related

How do I use Node.js to read server file within a Django application?

I have a Django application that I use to visualize data. The data itself is in an Amazon S3 bucket, and the plotting is done with bokeh. One of the apps, however, uses d3 instead. While I managed to configure the bokeh apps to work properly, I don't know how to do the same for the d3 app. It has to read the data either directly from S3 or locally (if I download it within the Django views before rendering the web page) and plot it, but whatever I try I get a 404 Not Found error.
I'm learning as I go, so I don't know what I'm doing wrong. Going through SO, I found this question, which gives an example of downloading a file from S3 using Node.js, but I am already running a Django server, so I don't know whether that approach fits. I should also mention that the files to be read are quite large (several megabytes). So, to summarize, my question is:
Is using Node.js my only solution here, and can it be done without having both Node.js and Django running at the same time? I'm worried that this might be too complex for me to set up. Or, better yet, what would be a recommended solution in my case? I am almost done with the whole project but, unfortunately, I've gotten stuck pretty badly here.
Thank you to anyone willing to help or offer advice.
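One common way to avoid Node.js entirely is to let Django itself hand the S3 data to the browser, so the d3 code fetches it from the same server. The sketch below assumes boto3 is installed and AWS credentials are configured; the bucket name and view name are placeholders:

    # views.py - minimal sketch of a Django view that proxies an S3 object
    import boto3
    from django.http import Http404, HttpResponse

    BUCKET = "my-data-bucket"  # placeholder bucket name


    def s3_data(request, key):
        s3 = boto3.client("s3")
        try:
            obj = s3.get_object(Bucket=BUCKET, Key=key)
        except s3.exceptions.NoSuchKey:
            raise Http404("Data file not found")
        # d3 can now fetch the file from this URL instead of S3 directly
        return HttpResponse(obj["Body"].read(), content_type="application/json")

Wired up with something like path("data/<path:key>", views.s3_data) in urls.py, the d3 app can call d3.json("/data/myfile.json") against the same origin, so no Node.js process is required. For multi-megabyte files, Django's StreamingHttpResponse is worth considering instead of reading the whole body at once.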

Best way to extract data from a REST API in the AWS ecosystem using Python

I'm looking for advice on how to properly extract data from a REST API using the AWS ecosystem. I'm currently using Python and cron to accomplish this, but I want to implement it in AWS the "accepted way". I have researched Glue and Lambda, and it looks pretty easy using Lambda. Glue appears to need third-party software like the Progress DataDirect Autonomous REST Connector for JDBC to accomplish this smoothly, but I would prefer to use only native AWS tools. What is the best way to go about doing this? Thanks in advance, Ed.
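If Lambda already looks easy, the whole job can indeed stay on native AWS tools: a small function that calls the API and writes the response to S3, triggered on a schedule by an EventBridge (CloudWatch Events) rule instead of cron. A minimal sketch, with the API URL and bucket name as placeholders:

    # lambda_function.py - sketch of the Lambda route; a scheduled
    # EventBridge rule replaces the cron job.
    import json
    import urllib.request
    from datetime import datetime, timezone

    import boto3

    API_URL = "https://api.example.com/records"   # placeholder endpoint
    BUCKET = "my-extraction-bucket"               # placeholder bucket


    def lambda_handler(event, context):
        # Call the REST API and keep the raw JSON payload
        with urllib.request.urlopen(API_URL) as response:
            payload = response.read()

        # Write one timestamped object per run into S3
        key = "extracts/{}.json".format(
            datetime.now(timezone.utc).strftime("%Y-%m-%dT%H%M%S")
        )
        boto3.client("s3").put_object(Bucket=BUCKET, Key=key, Body=payload)
        return {"statusCode": 200, "body": json.dumps({"written": key})}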

Can anyone suggest software for an automated database system

Can anyone help with my problem? I'm trying to build a system that collects data, stores it in a database, and creates a web page that shows the data I've gathered. The system is already done and the database is okay. I want to get my system's data from the database onto the web page I've created, automatically. I need your suggestions on what software I could use to make this possible. I'm using a Raspberry Pi 3 B+.
You can use Python, Flask, and a MySQL library to develop such a system.
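A minimal sketch of that idea, assuming Flask and PyMySQL are installed on the Pi; the credentials, database, table, and column names below are placeholders:

    # app.py - read rows from MySQL and show them on a simple web page
    import pymysql
    from flask import Flask, render_template_string

    app = Flask(__name__)

    PAGE = (
        "<h1>Readings</h1><ul>"
        "{% for row in rows %}<li>{{ row[0] }}: {{ row[1] }}</li>{% endfor %}"
        "</ul>"
    )


    @app.route("/")
    def index():
        conn = pymysql.connect(host="localhost", user="pi",
                               password="secret", database="sensors")
        try:
            with conn.cursor() as cur:
                cur.execute("SELECT recorded_at, value FROM readings "
                            "ORDER BY recorded_at DESC LIMIT 50")
                rows = cur.fetchall()
        finally:
            conn.close()
        return render_template_string(PAGE, rows=rows)


    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8000)

Running this on the Pi and pointing a browser at port 8000 refreshes the page from the database on every request, so no manual migration step is needed.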

Exasol and ESRI's ArcGIS - anyone managed to link them up?

I'm looking to combine the speed of EXASolution with the mapping capabilities of ArcGIS.
EXASolution is an extremely fast database. It has spatial support, but I'd like to be able to render spatial features on a map, whether via some kind of API from Esri or via a third-party mapping engine using WMS/WFS etc.
Has anyone had any joy with these products?
Cheers
You will likely have some joy with EXASolution's JDBC driver - EXASolution's Geospatial libraries are built on OpenGIS using the libGEOS libraries, so everything you can do with Postgres should be possible on EXASolution.
I did an introductory Geospatial-on-EXASOL video a while back which may be of interest: https://www.youtube.com/watch?v=f6Erp1WWLHw
I would say that your question would get a better response in EXASOL's community section where EXASOL customers and techies can answer specific EXASOL questions. Go to exasol.com/community for more details.
Good luck - and do let me know how you get on
Graham Mossman
Solution Engineer
EXASOL A.G.
I just finished a short knowledge base article which shows you how to connect to ESRI's ArcGIS from within an EXASolution database:
https://www.exasol.com/support/browse/SOL-211
The approach is different from what Graham suggested, as it uses Esri's REST API in combination with Python scripts called from SQL. So, the database connects directly and in parallel to the REST API service, not involving the client at all when it comes to data enrichment.
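Roughly speaking, the calls those scripts make are plain HTTP requests against Esri's REST endpoints. As an illustration only (this uses Esri's public World Geocoding Service via the third-party requests library, not the exact script from the article):

    # Illustrative call against an Esri REST endpoint, not the SOL-211 script
    import requests

    GEOCODE_URL = ("https://geocode.arcgis.com/arcgis/rest/services/"
                   "World/GeocodeServer/findAddressCandidates")


    def geocode(address):
        params = {"SingleLine": address, "f": "json", "maxLocations": 1}
        candidates = requests.get(GEOCODE_URL, params=params,
                                  timeout=30).json().get("candidates", [])
        if not candidates:
            return None
        location = candidates[0]["location"]
        return location["x"], location["y"]  # longitude, latitude


    print(geocode("Nuremberg, Germany"))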
Hope that helps,
Franz

Pywin32 on Google App Engine?

I am considering ways to read/modify large Excel spreadsheets, with formula support, in Python on Google App Engine. I am fairly unfamiliar with how COM works, but I was wondering if anyone has successfully used pywin32 on GAE, whether there are inherent problems with doing so, or if it's just a bad idea in general.
It seems like the only possible solution for Python (xlrd has no formula support), but if it doesn't work, I will resort to learning Java and trying the JExcel API.
Any insight would be appreciated!
Google's servers are not running Windows, so no, there's no way whatsoever to use any Win32 APIs.
If you have to use GAE, you can still do the Windows-specific processing on a separate Windows machine. You can use Pull Queues to lease tasks from GAE, process them there, and then add the results to Push Queues that will store the data back in GAE.
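A rough sketch of that pattern: the Windows box leases work, does the Excel part through pywin32, and reports the result back. The /lease and /result URLs below are hypothetical handlers on your GAE app (the real Task Queue pull API could be used instead):

    # worker.py - runs on the Windows machine, not on GAE
    import time

    import requests
    import win32com.client  # pywin32, Windows only

    APP_URL = "https://my-app.appspot.com"  # placeholder GAE app


    def evaluate_cell(path, sheet_name, cell):
        # Open the workbook in Excel via COM and read a calculated cell value
        excel = win32com.client.Dispatch("Excel.Application")
        excel.Visible = False
        try:
            workbook = excel.Workbooks.Open(path)
            value = workbook.Worksheets(sheet_name).Range(cell).Value
            workbook.Close(SaveChanges=False)
            return value
        finally:
            excel.Quit()


    while True:
        task = requests.get(APP_URL + "/lease").json()  # hypothetical handler
        if task:
            result = evaluate_cell(task["path"], task["sheet"], task["cell"])
            requests.post(APP_URL + "/result",
                          json={"id": task["id"], "value": result})
        time.sleep(30)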
