Creating dashboards in Tableau - Excel

I need to combine my company's monthly energy consumption from various sources. I do the calculations in the Excel sheets I receive every month. How do I combine all the sheets into a dashboard, and how do I update the dashboard automatically every month once the Excel files are updated?
Which is the best form of Tableau to use (Public, Desktop, or Server)?
What exactly is the difference among the three?
Are Excel sheets a good data source in Tableau?

You are asking a lot of questions which should probably be raised separately, but I will try to answer some of them anyway since they all relate to the same use case.
1. How do I combine and make a dashboard
Since Tableau 9.3 you can use a union. This combines all your Excel files into a single data source. Your data sources do, however, need to have the same structure, meaning the sheets containing the information should have the same columns.
You can do this dynamically and automatically using a wildcard search, so that Tableau picks up every file that, for example, is located in the same folder.
More information on wildcard unions can be found in the Tableau documentation.
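If you ever want to do the same consolidation outside Tableau (for example, to hand one pre-combined file to the union), a minimal sketch in Python with pandas could look like this; the folder path is hypothetical and, just like the wildcard union, it assumes every monthly file shares the same columns:

import glob
import pandas as pd

# Assumption: every monthly workbook sits in one folder and has the
# same columns on its first sheet (mirrors the wildcard-union requirement).
files = glob.glob(r"C:\energy_reports\*.xlsx")   # hypothetical folder

frames = [pd.read_excel(f) for f in files]       # read each monthly file
combined = pd.concat(frames, ignore_index=True)  # stack them into one table

# Write a single consolidated file that Tableau (or anything else) can use.
combined.to_excel(r"C:\energy_reports\combined.xlsx", index=False)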
From the moment you have at least one file as a data source you can start creating a dashboard.
2. Which is the best form of Tableau
I don't think you truly understand the difference between the Tableau applications you mention.
You will need Tableau Desktop to actually create a dashboard.
If you want to be able to share this dashboard through the web you will need Tableau Server, Tableau Public, or Tableau Online. Everything published on Tableau Public is publicly available, so if your data is considered restricted or sensitive, or should not be shared outside your company, you should not consider that option.
Tableau Server, on the other hand, is server software you can install on a host inside your own network; it lets you publish your dashboards and sheets so that people with a Tableau Server license can access them through a web interface.
Then there is Tableau Online, which offers almost the same, except that Tableau takes care of the hosting. It is the SaaS solution for making your dashboards available online.
Lastly there is Tableau Reader which is a free desktop application that is able to open your Tableau workbooks, but cannot modify them and has limited access to external data sources.
3. Is Excel a good data source
This really depends on your use case and is probably opinion based. Given the union feature and the ability to bring in and update data automatically, I think Excel files can be a useful source. What you need to consider is where the Excel files are stored, how you will connect to them, and how many users will need to access them. If other users can easily modify the Excel files and introduce errors, that is another downside of using them as a source.
When you publish your dashboard on, for example, Tableau Server and you want it to update there automatically as well, the Excel file needs to be accessible from the server too and should not be packaged inside the workbook. If none of the above is an issue for you, then Excel is a fine choice for now.

Related

Background loading Excel using Task Scheduler to load data and upload to Salesforce

I have no knowledge of computer programming and I need a bit of help.
I'm using automate.io (a drag-and-drop integration tool) to take a new row in Excel and insert it into Salesforce. That bit works OK.
What I worry about is my Excel document: it is connected to a SQL Server and auto-refreshes every minute. The problem is that the Excel document has to be open at all times for this auto-refresh to take place.
To combat this I used Task Scheduler to open the document at 7 am, even when no one is logged in.
My question is,
will this work and is it reliable?
Will it work?
Only your testing can answer that.
Watch out for silent failures, e.g. a new record in the database not being picked up or refreshed, and therefore not pushed to Salesforce in a timely manner.
Is it reliable?
Here are some ways to achieve what you want, in approximate descending order of reliability:
Get a third party to integrate the two databases directly (Salesforce and your SQL Server), with updates triggered by any change in the data in your SQL Server. There is a whole sub-industry of Salesforce integration businesses and individuals who would consider taking this on in return for money.
Get a standalone script (not Excel) running on a server near your database to monitor your DB for changes, and push new records to Salesforce via a direct API call (a rough sketch of this follows the list).
Get a standalone script (not Excel) running on a server near your database to monitor your DB for changes, and push new records to text files (not Excel) which are subsequently loaded into Salesforce.
Get Excel to refresh from your DB regularly via a data link (i.e. what you outlined), but have it push new records to text files (not Excel) which are subsequently loaded into Salesforce.
Get Excel to refresh from your DB regularly via a data link (i.e. what you outlined), and have it push new records to Salesforce via third-party software as a substitute for actual integration.
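As an illustration of the second option, here is a minimal sketch of such a standalone polling script. It assumes a pyodbc connection to your SQL Server, the third-party simple-salesforce library, and a source table with an incrementing Id column; every name below is hypothetical:

import pyodbc
from simple_salesforce import Salesforce  # third-party library, assumed available

# Hypothetical connection details
conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                      "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;")
sf = Salesforce(username="user@example.com", password="...", security_token="...")

last_seen_id = 0  # in practice, persist this between runs (file, table, etc.)

cursor = conn.cursor()
cursor.execute("SELECT Id, Name, Email FROM NewContacts WHERE Id > ?", last_seen_id)

for row in cursor.fetchall():
    # Push each new record straight to Salesforce via its REST API
    sf.Contact.create({"LastName": row.Name, "Email": row.Email})
    last_seen_id = max(last_seen_id, row.Id)

Run on a schedule or as a service, this removes Excel, the data link, and automate.io from the dependency chain entirely.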
You will notice your proposed solution is at or near the end. The list may inspire you to determine what tweaks you might make to the process to move up the list a little ways, without changing the approach completely (unless you want to and can justify it).
Any tweak that removes a link in the dependency chain helps reliability; shorter chains are generally more reliable. Right now your chain sounds something like: Database > server?/internet?/network? > Excel data link > Excel file > Task Scheduler > internet > automate.io > API > Force.com > Salesforce.
Transaction volume, mission criticality, and other subjective criteria will help guide you as to what is most appropriate in your situation.

Automating Data Entry into VMS with E-Term32

I've been asked to figure out a way to do this, so please fill me in on whether this is even possible or if it shouldn't be done.
The goal is to automate data entry into VMS (we use E-Term32 for connecting to VMS). Things that have been discussed for this purpose: Excel spreadsheets, Dynamic Data Exchange, the macro tools available in E-Term32 (Emulation Command Language - ECL), OLE Automation, etc.
The envisioned process would go like:
Receive Excel file (or other data format like a text file)
Connect to VMS
Run Macro
Macro navigates the menu system and uses data from Excel file to enter data
I know there are "better" or easier ways to do this, like building an application to enter the data, but my supervisors are concerned about circumventing the business logic built into the "Blue Screen" menus/applications. They are also sticklers about building new applications for stuff like this anyway.
How is the data stored on OpenVMS? May we assume native RMS (indexed) files, or is it some database (RDB, Oracle, Adabas, ...)?
Whichever it is, it is sure to be perfectly possible to write directly into the datastore through some ODBC or JDBC method, freeware or commercial (ConnX, Attunity, ...). Just google: OpenVMS ODBC.
Once you find a method for direct data access, you should indeed be concerned about the business logic: field formatting, value ranges, foreign keys, and so on.
Thus access can only be granted to (software managed by) trusted team players.
But that can be perfectly manageable and you may find the new method can be made more robust than those green-screen apps.
If direct data access is not negotiable, then there are still many options.
Screen-scrapers have been built before; you should not attempt to write one from scratch.
Check out commercial terminal-centric modernization tools like: http://www.rocketsoftware.com/products/rocket-legasuite-gui
presentation: http://de.openvms.org/Spring2009/03-Dutch_Police_FINAL.pdf
(I am not associated with the fine folks at Rocket; it is just one example I am aware of. There are surely more (commercial) options.)
Now about those business rules. How cleanly are they implemented?
Strict form/function separation? Probably not, otherwise you would not be asking.
There are several RPC and gSOAP methods available, free and for a fee, that allow one to call OpenVMS service routines, passing in external data. That's probably the best path forward.
The company I work for, Attunity, sells such 'legacy adapter' tools in addition to ODBC/JDBC data access to files directly.
For example, using Attunity Connect software you can connect a row in a table to the call of a subroutine. The basic plan would be to just use an SQL INSERT statement on Linux or Windows against an ODBC datasource which is connected to an OpenVMS target. The Connect software will then call an OpenVMS subroutine in a shareable library to process the row, using or at least re-using the existing business logic for validation, normalization or de-normalization as the case might be. And yes, they can also expose a SELECT or MODIFY for lookups that are more complex than can be described in SQL.
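To illustrate, the client side of that approach can be as small as a single parameterised INSERT over ODBC. The DSN, table, and column names below are hypothetical and assume an Attunity Connect datasource has already been configured against the OpenVMS target:

import pyodbc

# Hypothetical DSN that routes through Attunity Connect to the OpenVMS system
conn = pyodbc.connect("DSN=VMS_ORDERS;UID=loaduser;PWD=secret")
cursor = conn.cursor()

# The mapped OpenVMS subroutine applies the existing business-logic checks
# (formatting, ranges, keys) before the record is actually stored.
cursor.execute("INSERT INTO orders (order_no, customer, amount) VALUES (?, ?, ?)",
               ("A1001", "ACME", 125.50))
conn.commit()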
Everybody happy! You can use modern tools, they can use the old code and checks.
And of course another time-honored method is to just have an FTP drop point for data to be entered. Some OpenVMS job scans an agreed-upon directory for 'action' files and runs an OpenVMS program (COBOL, BASIC, ...) to process the data in a fashion similar to the terminal UI app, re-using as much of the existing terminal code and logic as possible.
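The client side of such a drop point needs nothing beyond the Python standard library. A minimal sketch, where the host, credentials, directory, and file names are all placeholders and the OpenVMS job that scans the directory is assumed to already exist:

from ftplib import FTP

# Hypothetical drop point agreed upon with the OpenVMS side
with FTP("vmshost.example.com") as ftp:
    ftp.login(user="dropuser", passwd="secret")
    ftp.cwd("ACTION_FILES")                        # the agreed-upon directory
    with open("orders_20240101.csv", "rb") as fh:
        ftp.storbinary("STOR ORDERS_20240101.CSV", fh)
# A scheduled OpenVMS job later picks the file up and runs the existing
# entry program against it, re-using the terminal app's validation logic.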
Good luck!
Hein

Excel 2007 pivot on SSAS

I am new to this.
I built a pivot report in Excel 2007 on SSAS. It connects to a cube on my local PC. Now I want to send this pivot report to other people so they can view it and do some analysis by themselves (expanding year-month-day, etc.).
When my colleague tried, he couldn't expand the fields.
How can I achieve this?
Thank you,
Nian
Your colleague needs to be able to access the cube in order to refresh it, which means the cube should be on a shared machine (like a server). I would recommend putting the cube on a server, setting up a read-only database login, and setting up the Excel file to use that account. You may be able to make your local machine accessible instead, but I don't have experience with this and I would advise against it anyhow (your users wouldn't be able to refresh the cube whenever your computer is off the network).
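For reference, once the cube is on a shared server the workbook's connection (Data > Connections > Properties in Excel 2007) would point at it with something along these lines; the server and catalog names are hypothetical, and note that SSAS normally authenticates with Windows accounts rather than a separate username/password:

Provider=MSOLAP;Integrated Security=SSPI;Data Source=ssas-server.corp.local;Initial Catalog=EnergyAnalysis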
Also, even if you send them the file with data cached from the cube, only so much data gets cached. When they expand items, Excel only skips requesting data from the cube (on your machine/server) if that particular data is already cached; the same applies when they create filters.

Automate the export of Facebook Insights data

I'm looking for a way of programmatically exporting Facebook insights data for my pages, in a way that I can automate it. Specifically, I'd like to create a scheduled task that runs daily, and that can save a CSV or Excel file of a page's insights data using a Facebook API. I would then have an ETL job that puts that data into a database.
I checked out the OData service for Excel, which appears to be broken. Does anyone know of a way to programmatically automate the export of insights data for Facebook pages?
It's possible and not too complicated once you know how to access the insights.
Here is how I proceed:
Log the user in with the offline_access and read_insights permissions.
read_insights allows me to access the insights for all the pages and applications the user is admin of.
offline_access gives me a permanent token that I can use to update the insights without having to wait for the user to login.
Retrieve the list of pages and applications the user is admin of, and store those in database.
When I want to get the insights for a page or application, I don't query FQL, I query the Graph API: first I calculate how many queries to graph.facebook.com/[object_id]/insights are necessary, according to the date range chosen. Then I generate a query to use with the Batch API (http://developers.facebook.com/docs/reference/api/batch/). That allows me to get all the data for all the available insights, for all the days in the date range, in only one request (see the sketch after these steps).
I parse the rather huge JSON object obtained (which weighs a few MB, be aware of that) and store everything in the database.
Now that you have all the insights parsed and stored in the database, you're just a few SQL queries away from manipulating the data the way you want, like displaying charts or exporting to CSV or Excel format.
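For the curious, here is a rough sketch of that batch call in Python. The page ID, metric names, dates, and token are placeholders, and Facebook has renamed permissions and metrics several times since this answer was written (offline_access no longer exists), so check the current Graph API documentation before relying on any of these names:

import json
import requests

ACCESS_TOKEN = "..."    # a long-lived token with insights access (placeholder)
PAGE_ID = "1234567890"  # hypothetical page ID

# One batch request bundling several insights queries for the chosen date range
batch = [
    {"method": "GET",
     "relative_url": f"{PAGE_ID}/insights/page_impressions?since=2024-01-01&until=2024-01-08"},
    {"method": "GET",
     "relative_url": f"{PAGE_ID}/insights/page_engaged_users?since=2024-01-01&until=2024-01-08"},
]

resp = requests.post("https://graph.facebook.com",
                     data={"access_token": ACCESS_TOKEN, "batch": json.dumps(batch)})

for part in resp.json():
    payload = json.loads(part["body"])  # each batch item carries its own JSON body
    # ...store payload["data"] in your database here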
I have the code already made (and published as a temporarily free tool on www.social-insights.net), so exporting to Excel would be quite fast and easy.
Let me know if I can help you with that.
It can be done before the week-end.
You would need to write something that uses the Insights part of the Facebook Graph API. I haven't seen something already written for this.
Check out http://megalytic.com. This is a service that exports FB Insights (along with Google Analytics, Twitter, and some others) to Excel.
A new tool is available: the Analytics Edge add-ins now have a Facebook connector that makes downloads a snap.
http://www.analyticsedge.com/facebook-connector/
There are a number of ways that you could do this. I would suggest your choice depends on two factors:
What is your level of coding skill?
How much data are you looking to move?
I can't answer 1 for you, but in your case you aren't moving that much data (in relative terms). I will still share three options of many.
HARD CODE IT
This would require a script that accesses Facebook's Graph API, plus a computer/server to run that request automatically. I primarily use AWS and would suggest that you could launch an EC2 instance and have it scheduled to run your script at set times. I haven't used AWS Data Pipeline, but I do know it is designed so that you can have it run a script automatically as well, supposedly with a little less server know-how.
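On the scheduling side, this can be as simple as a single cron entry on the instance (the path and time below are placeholders for whatever script you end up writing):

# Run the hypothetical export script every day at 02:00
0 2 * * * /usr/bin/python3 /home/ubuntu/export_fb_insights.py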
USE THIRD PARTY ADD-ON
There are a lot of people who have similar data needs, which has led to a number of easy-to-use tools. I use Supermetrics Free to run occasional audits and make sure that our tools are running properly. Supermetrics is fast and has a really easy interface for accessing Facebook's APIs and several others. I believe that you can also schedule refreshes and updates with it.
USE THIRD PARTY FULL-SERVICE ETL
There are also several services and freelancers that can set this up for you with little to no work on your part, depending on where you want the data to land. Stitch is a service I have worked with for FB ads data. There might be better services, but it has fulfilled our needs so far.
MY SUGGESTION
You would probably be best served by using a third-party add-on like Supermetrics. It's fast and easy to use. The other methods might be more worth looking into if you had a lot more data to move, or needed it to be refreshed more often than daily.

Concurrent Excel Workbook connections over the internet?

Essentially I have an Excel file that is going to need to be worked on concurrently for a prolonged period of time. In the past I would simply 'Share the Workbook', which would allow users on the network to view/change the file at the same time as other users. In this particular instance, though, everyone is disconnected from a central network and there is no mechanism available to place them on the same network. Does anyone know of a service out there that will allow all parties to edit this document in a central location concurrently?
A MOSS box came to mind but that seems like overkill for a single document. Thanks.
You could use SharePoint Foundation 2010 (free if you have Windows Server 2008 - so maybe a better fit than paying for MOSS) and import the Excel file as a list, and then sync the list with Excel.
Or, Google Docs has a spreadsheet program; I think it allows you to import Excel spreadsheets.
