Building an API around a Databricks Notebook

I'm very new to the Databricks community platform. I recently developed an ML model in Databricks and would like to productionize it behind a Swagger (OpenAPI) API. I have tried it in bits and pieces but can't figure it out. Can someone please give me a flow of how to proceed with this? Thanks!
Edit: I would really appreciate a low-code or no-code type solution to this since I'm a data scientist and not really a programmer. Thanks!
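Not a full answer, but a minimal low-code sketch of one common route: log the model with MLflow, enable Databricks Model Serving on it from the UI (no code needed for those steps), and then any Swagger/OpenAPI layer just wraps a plain REST call like the one below. The workspace URL, endpoint name, token, and feature names here are all placeholders, not values from the question.

```python
# A minimal sketch, assuming the model is already served via Databricks
# Model Serving. All identifiers below are placeholders.
import requests

WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"  # placeholder
ENDPOINT = "my-model-endpoint"      # hypothetical serving endpoint name
TOKEN = "<personal-access-token>"   # generated in User Settings

response = requests.post(
    f"{WORKSPACE_URL}/serving-endpoints/{ENDPOINT}/invocations",
    headers={"Authorization": f"Bearer {TOKEN}"},
    # One record per dict; feature names are illustrative only.
    json={"dataframe_records": [{"feature_1": 1.0, "feature_2": 3.5}]},
)
print(response.json())  # model predictions
```

Since the endpoint is plain HTTPS plus JSON, a Swagger/OpenAPI definition can simply describe this one POST route.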

Related

Automate Data Import to GA4

I am trying to automate a refunds report to Google Analytics 4. I can't find good documentation on importing this data using the Analytics Management API. I came across https://www.npmjs.com/package/@google-analytics/data, which seems good for pulling reports from GA, but I couldn't find a way to do data import.
I am writing a nodejs script and was hoping someone has encountered this scenario before and could share how they accomplished it. Any help or point in the right direction will be really appreciated.
The alternative to the UA Analytics Management API is the Google Analytics Admin API for GA4.
To my knowledge it doesn't support data import at this time. The API is still under development, so support may come in the future, but there is no way to know.
I would suggest looking at the Measurement Protocol for GA4; you may be able to use that. A rough sketch follows.
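For flavour, here is a hedged sketch of what a Measurement Protocol call looks like (shown in Python for brevity; the request shape is identical from Node.js). The measurement ID and API secret are placeholders created in the GA4 admin UI (Admin → Data Streams → Measurement Protocol API secrets), and the `refund` event payload is illustrative; it is not the same mechanism as GA4's native data import.

```python
# Hedged sketch: send a refund event via the GA4 Measurement Protocol.
# MEASUREMENT_ID and API_SECRET are placeholders from the GA4 admin UI.
import requests

MEASUREMENT_ID = "G-XXXXXXX"   # placeholder
API_SECRET = "<api-secret>"    # placeholder

payload = {
    "client_id": "555.1234567890",  # any stable pseudo-anonymous client id
    "events": [{
        "name": "refund",  # GA4 recommended event name for refunds
        "params": {"currency": "USD", "value": 19.99, "transaction_id": "T-1001"},
    }],
}

resp = requests.post(
    "https://www.google-analytics.com/mp/collect",
    params={"measurement_id": MEASUREMENT_ID, "api_secret": API_SECRET},
    json=payload,
)
print(resp.status_code)  # a 2xx status means the hit was accepted
```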

Basic question on downloading data for a Kubeflow pipeline

I'm a newbie on Kubeflow, just started exploring. I've set up a microk8s cluster and Charmed Kubeflow. I have executed a few examples trying to understand the different components. Now I'm trying to set up a pipeline from scratch for a classification problem. The problem I'm facing is handling the download of data.
Could anyone please point me to an example where data (preferably images) is downloaded from an external source?
All the examples that I can find are based on small datasets from sklearn, MNIST, etc. I'm looking instead for an example using real-world (or close to it) data, for example:
https://storage.googleapis.com/mledu-datasets/cats_and_dogs_filtered.zip
Thanks in advance for any direction.
I tried exploring multiple Kubeflow examples, blogs, etc. to find an example that uses real data rather than a toy dataset, but I couldn't find one.
I've found some Jupyter notebook examples that use !wget to download in the notebook kernel, but I couldn't find how that can be converted into a Kubeflow op step. I presumed func_to_container_op wouldn't work for such a scenario. As a next step I'm going to try using specs.AppDef from torchx to download. As I'm a total newbie, I wanted to make sure I'm headed in the right direction.
I was able to download using wget for direct links, and I was also able to configure k8s secrets and patch the ServiceAccount with an imagePullSecret to get the downloads done from newly created containers.
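For anyone landing here: the func_to_container_op route mentioned above does work for plain downloads. A minimal sketch, assuming the KFP v1 SDK; the base image and output name are arbitrary choices, and the URL is the cats-and-dogs archive from the question.

```python
# Minimal sketch, assuming the KFP v1 SDK (kfp < 2.0).
from kfp import dsl
from kfp.components import func_to_container_op, OutputPath

def download_data(url: str, archive: OutputPath(str)):
    """Fetch a dataset archive from an external URL into pipeline storage."""
    import urllib.request  # imports must live inside the function body
    urllib.request.urlretrieve(url, archive)

# Wrap the plain Python function as a containerized pipeline component.
download_op = func_to_container_op(download_data, base_image="python:3.9")

@dsl.pipeline(name="cats-and-dogs-demo")
def pipeline(
    url: str = "https://storage.googleapis.com/mledu-datasets/cats_and_dogs_filtered.zip",
):
    step = download_op(url=url)
    # Downstream steps would consume step.outputs["archive"].
```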

How to migrate Google App Engine from Python2.7 and DataStore to Python3

My website was built using Google App Engine, Datastore, and Python 2.7. It's no longer working ("This site can't be reached"). I need to migrate to Python 3, but I cannot identify which migration guide is best suited for me. Can anyone point me to the correct set? I would like to get it running as quickly as possible (I only have one hour a day to work on it; I have an unrelated full-time job).
Migration guide
Google provides a step-by-step migration guide specifically for App Engine, which you should follow.
Additionally, you will find lots of useful links there where you can read about the differences between Python 2 and Python 3 and the various migration tools available. Depending on your application, those tools might even be able to do the migration (more or less) automatically for you.
Please note: this is the migration guide for the App Engine standard environment. If you don't know which one you're using, you're most likely on the standard environment. While some steps differ for the flexible environment, migrating the code base as described in the guide is always required.
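To make the scope concrete, here is a hedged sketch of one change the guide covers, assuming your app uses the legacy bundled App Engine NDB client for Datastore: it is replaced by the standalone google-cloud-ndb package, and every Datastore call moves inside an explicit client context. The model and function names below are illustrative.

```python
# Hedged sketch of one typical migration step: swapping the Python 2.7
# bundled NDB client for the standalone google-cloud-ndb package.
# Old (Python 2.7): from google.appengine.ext import ndb
from google.cloud import ndb  # pip install google-cloud-ndb

class Account(ndb.Model):  # illustrative model
    email = ndb.StringProperty()

client = ndb.Client()

def list_accounts():
    # Under google-cloud-ndb, every Datastore call must run inside
    # an explicit client context.
    with client.context():
        return Account.query().fetch(10)
```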
Video: Python 2 to 3: Migration Patterns & Motivators (Cloud Next '19)
There is also a recording of a talk by the Google Cloud team on migrating from Python 2 to 3 on YouTube.
Still having issues?
Migrating from Python 2 to 3 is a well-known problem and there is tons of information available on the internet. Most likely the problems you face have already been solved by someone, so a Google search for a specific problem will likely give you a working solution.

Exasol and ESRI's ArcGIS - anyone managed to link them up?

I'm looking to combine the speed of Exasolution with the mapping capabilities of ArcGIS.
Exasolution is an extremely fast database. It has spatial support, but I'd like to be able to render spatial features inside a map. It could be via some kind of API from Esri, or maybe a third-party mapping engine using WMS/WFS etc.
Anyone had any joy with these products?
Cheers
You will likely have some joy with EXASolution's JDBC driver. EXASolution's geospatial libraries are built on OpenGIS using the libGEOS libraries, so everything you can do with Postgres should be possible on EXASolution.
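As a taste of what that looks like from a client, here is a minimal sketch assuming the pyexasol Python client (the JDBC driver behaves the same way from Java): Exasol accepts WKT literals for its OpenGIS-style ST_* functions, much like PostGIS. The DSN and credentials are placeholders.

```python
# Minimal sketch, assuming the pyexasol client; connection details
# are placeholders, not values from the thread.
import pyexasol  # pip install pyexasol

conn = pyexasol.connect(dsn="exasol-host:8563", user="sys", password="<password>")

# Distance between two WKT points via Exasol's OpenGIS-style functions.
rows = conn.execute("SELECT ST_DISTANCE('POINT (1 2)', 'POINT (5 5)')").fetchall()
print(rows)  # e.g. [(5.0,)]
```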
I did an introductory Geospatial-on-EXASOL video a while back which may be of interest: https://www.youtube.com/watch?v=f6Erp1WWLHw
I would say that your question would get a better response in EXASOL's community section, where EXASOL customers and techies can answer specific EXASOL questions. Go to exasol.com/community for more details.
Good luck - and do let me know how you get on
Graham Mossman
Solution Engineer
EXASOL A.G.
I just finished a short knowledge base article which shows you how to connect to Esri's ArcGIS from within an EXASolution database:
https://www.exasol.com/support/browse/SOL-211
The approach is different from what Graham suggested: it uses Esri's REST API in combination with Python scripts called from SQL. So the database connects directly, and in parallel, to the REST API service, without involving the client at all when it comes to data enrichment.
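For a flavour of what such a script does, here is an illustrative plain-Python sketch (not the SOL-211 code itself) calling Esri's public World Geocoding REST endpoint; the address and parameters are just examples.

```python
# Illustrative sketch only: the kind of REST call a Python-from-SQL
# script would make against Esri's World Geocoding Service.
import requests

def geocode(single_line_address):
    resp = requests.get(
        "https://geocode.arcgis.com/arcgis/rest/services/World/"
        "GeocodeServer/findAddressCandidates",
        params={"f": "json", "singleLine": single_line_address, "maxLocations": 1},
    )
    candidates = resp.json().get("candidates", [])
    # Return (x, y) of the best candidate, or None if nothing matched.
    if candidates:
        loc = candidates[0]["location"]
        return (loc["x"], loc["y"])
    return None

print(geocode("Nuremberg, Germany"))
```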
Hope that helps,
Franz

Apache ODE, BPEL, Invoke RESTful API

Apache ODE's documentation seems to say this is supported, i.e. invoking/orchestrating RESTful APIs.
There are no example sources available on their site, and even after trying hard on Google I couldn't find anything useful.
Can someone help me find a direction?
I'm using the latest Apache ODE distribution with the Eclipse BPEL Designer.
We have a large set of RESTful APIs that provides the entire core interface to our business processes. BPEL seems to be a good orchestration/workflow programming solution, but without RESTful API support out of the box, I'm almost giving up on it.
I must be missing something here. Please suggest.
This sample is compliant with the WS-BPEL 2.0 standard. We have tested it only on WSO2 BPS, but you'll be able to run it on ODE with minimal changes to the process: https://svn.wso2.org/repos/wso2/carbon/platform/branches/4.0.0/products/bps/3.0.0/modules/samples/product/src/main/resources/bpel/2.0/TestRESTProcess
