Restricting Access in Power BI

“Let your data be free, man!”… Power BI is truly enabling a data culture in organizations across the world. It allows every business analyst and data-savvy end user to get access to the data they need and to share it easily with others. This gold mine of capability, however, presents challenges for the security administrators who are heroically trying to protect Power BI users from themselves… maybe “heroic” is a bit of an overstatement, but they really are trying to protect YOU from doing something that could, in some cases, even get YOU fired!


Power BI’s default settings are intended for openness of your data… So, if you are tasked with being a Power BI administrator, or are simply a general security administrator for an organization that needs to protect sensitive data, below are the 5 things you will want to consider doing immediately when starting your Power BI journey.

There is a lot of good documentation available on these topics beyond what I have posted below, but this is a good starting point.

#1 – Publish to Web

[Screenshot: the Publish to web option]

When the first user from your email domain (let’s assume contoso.com) logs into http://powerbi.com, they have the ability to publish any content to the web to be publicly/anonymously available so that it can be easily accessed from a blog or news site. This is a really cool feature for individuals or small businesses, but for companies trying to protect PII or PHI data, you need to turn this feature off unless you want each individual to carry the responsibility of understanding and protecting the content they are sharing.

There are several global settings/switches available in Power BI from the Power BI Admin Portal. You can access the Admin Portal by clicking the gear icon in the upper right of Power BI after you are logged in.

[Screenshot: the Admin Portal option under the gear icon]

NOTE: this is only available to global administrators of your Office 365 or Power BI PRO deployment. If you do not see this option after logging into powerbi.com, then you are not a global administrator or you have not purchased any PRO licenses for your deployment of Power BI. Please see the bottom of this blog post for how to enable non-global administrators to become Power BI administrators.

Once you are in the Admin Portal, there are several switches in the “Tenant Settings” section. Setting the “Publish to Web” switch to “No” will ensure sensitive data cannot be unintentionally leaked to anonymous users on the web.

[Screenshot: Tenant Settings switches]

#2 – External Sharing

Many ISVs and client services organizations want to easily share content with their partners. External sharing allows someone within the contoso.com domain to share data with someone from the northwinds.com domain. If you do not plan to entrust each powerbi.com user with understanding what data they can and cannot share, then you will want to turn off the “Allow sharing content to external users” switch that is also shown in the tenant settings screenshot above.

#3 – Viral Power BI Sign Up

It pains me to even write this, but some organizations that have sensitive data such as PII or PHI may not want to take the risk that an “un-authorized” or “un-vetted” user could put content in Power BI. Power BI tries to enable users to be more self-sufficient: it will allow any user within a domain (such as contoso.com) to log in and automatically be granted a FREE Power BI license. To ensure ONLY users that have been vetted to use Power BI can upload content and use the tool, a global admin will want to run the following PowerShell command to disable the viral sign-up option for Power BI:

Set-MsolCompanySettings -AllowAdHocSubscriptions $false
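
For context, this cmdlet comes from the MSOnline (Azure Active Directory V1) PowerShell module, so it must be run in a session where that module is installed and a global administrator has signed in. A minimal sketch (the surrounding lines are my addition, not from the original post):

    # Assumes the MSOnline module is installed: Install-Module MSOnline
    Connect-MsolService                                       # sign in as a global administrator
    Set-MsolCompanySettings -AllowAdHocSubscriptions $false   # block viral/ad-hoc sign-ups
    (Get-MsolCompanyInformation).AllowAdHocSubscriptions      # should now return False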

For a full description on this feature, please refer to the following article:

https://powerbi.microsoft.com/en-us/blog/ability-to-disable-free-sign-ups-for-the-free-power-bi/

#4 – Conditional Access

Power BI is software as a service (SaaS), meaning it is an application fully hosted in the Azure cloud. By default, you can log in to the service from anywhere in the world. For some organizations, this may cause uneasiness. With Power BI (and other Office 365 apps) you can manage where people can log in from, and also ensure that people are challenged with multi-factor authentication, such as receiving a phone call or entering a code, prior to being allowed access to Power BI. An organization could completely lock down access from any location other than its own network, preventing anyone from logging into Power BI at home or abroad. Note that this kind of heavy-handed approach could prevent users from using the mobile app to get access to their data.

For more information on conditional access, please refer to the Power BI Blog post on it here

#5 – Auditing

You may want to turn auditing on so that you can track activity. This is actually done through the Office 365 portal; the Power BI admin portal simply redirects you there.

[Screenshot: audit settings]

Documentation describing this feature can be found here


I feel the above 5 items are the essential things to consider when starting to allow users into Power BI. Below are a few more items to consider that are geared more towards data governance than security.

Tenant Settings

Two of the tenant settings were called out above. There are several other settings that you may choose to turn on/off depending on your organization.

“Publish content packs to the entire organization” – this is turned on by default. It allows any person in the organization to effectively share a dashboard or report with the entire organization. You may choose to turn this off so people have to be very deliberate about which groups/individuals content is shared with. If you keep an Active Directory group set up for the entire organization, then sharing with everyone is still possible, but more deliberate.

“Export Data” – turned on by default, this allows users to export data from visuals in Power BI to a CSV file that opens in Excel. Most of the time, users are already doing this from other reporting tools within the organization, but it can be turned off.

There is also a “Data Classification” setting that allows you to classify dashboards as “High Business Impact” or “Low Business Impact”, or with a custom classification of your choosing such as “Confidential” or “PHI”.

[Screenshot: Data Classification setting]

When an administrator has this turned on, users that publish dashboards have the option to specify the classification, and it will show up in the upper “bread crumb trail”.

[Screenshot: classification tag shown in the dashboard breadcrumb]

Users should be trained from the beginning to look for these tags so they can be careful about how they share their content within the organization.


Making Power BI Administrators that are not Global Office 365 Administrators

If your organization already has Office 365 and you are not a global administrator, you will not have access to all the features I have pointed out above. With the below PowerShell command, however, your global administrator can grant you access to the Power BI Admin Portal to manage the tenant switches.

Add-MsolRoleMember -RoleMemberEmailAddress "tim@contoso.com" -RoleName "Power BI Service Administrator"

This command also requires the Azure Active Directory (MSOnline) PowerShell module.
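
To confirm the assignment afterwards, something like the following should work (a hedged sketch using the same module, not shown in the original post):

    $role = Get-MsolRole -RoleName "Power BI Service Administrator"
    Get-MsolRoleMember -RoleObjectId $role.ObjectId   # lists the current Power BI admins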


Conclusion

I am an advocate for an open data culture that allows users to take responsibility for the content they publish; however, I also know that not every organization is ready to place that responsibility on all of its users. For organizations with highly sensitive data, the above controls can reduce the security burden of moving to a self-service environment like Power BI.


Map Visual – How to deal with large datasets


The Power BI bubble map is very useful when plotting geographic points rather than shapes or areas. I’d say that in 90% of cases, the out-of-the-box functionality is fine for representing your data. Occasionally, however, you will run into scenarios where you must plot more than a few thousand data points, in which case you will exceed the upper bound and get the little information icon shown below.

[Screenshots: map visual with the data-limit information icon]

What is frustrating about this when it comes to maps is that it does not take a representative sample of the data and instead will just “cut off” the data points. Other visualizations will show an information icon with this message instead:

[Screenshot: information message shown by other visuals]

There are 3 techniques that I have used to deal with this issue.

  • Creating a representative sample via DAX
  • Using R script with ggmap
  • Using a location hierarchy

All of these have pros and cons.

Getting a Representative Sample via DAX

The best way to illustrate the native map behavior and how we solve for it is by using an embedded Power BI report. I am going to use some anonymized data from a logistics company that represents cell phone signal strength across one of their routes. The route in this example starts in the Chicago, IL area (United States). See the first tab in the report below.

Note: Use the “full screen” button in the lower right to expand

In the 4 visuals above, notice that the 1st and 3rd plot only a tiny portion of the latitude/longitude values. You can confirm this by hovering over the highest point: its latitude is much less than the max latitude shown in the multi-row card visual in the upper right. You can also just “zoom out” on these and see that no points are plotted anywhere near Chicago, which is where this route starts.

I included the 2nd visual because I found it interesting that when I used a value in the SIZE field that could be greater than 1, it seemed to plot more evenly across the entire span of the latitude; however, I still would not consider this “representative”.

The 4th visual is simply a scatter plot of the latitude/longitude. Note that when hovering over the “i” icon in the upper left, it states that it is a representative sample. This is what we want to achieve in the bubble map, which does not do this natively. In this scenario, where we have a continuous latitude/longitude plot, this is a good solution.


Now navigate to the second page of the report. Here you will find that the map visual always gives a representative sample regardless of how many dates you select in the slicer. I have solved the real business problem here by adding the cellphone signal strength as the legend across the route.

Compare the min and max latitudes in the card visual with the highest and lowest plotted points on the map (hover over them to check). You should find them consistent, with the points in between filled in nicely.

“If there is an easier way to do this than what I am about to show, please share with me.”

I worked through several iterations before arriving at the conclusion that RAND() (or RANDBETWEEN(), actually) was the magic I needed. For those of you who don’t know, these are the random number generator functions. And for those of you who do know, you are probably questioning my knowledge a little bit at this point 🙂 but this was truly the only way I got this to work without making too many assumptions about the data.

Let’s get into the DAX.

It is important to know how you plan to slice the data in your map visual. In my case, I want to be able to select a date or range of dates to review the cellphone signal strength and how it may vary over time. We can simply get the count of rows in the table with a formula like this:
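
(The original formula was embedded as an image; a minimal reconstruction, assuming a hypothetical source table named Signals, would be:)

    Total Signals (WRONG) = COUNTROWS ( Signals )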

I have appended “WRONG” to the measure name because we run into an issue of CONTEXT on the map, and we have to do a context transition (as described by Alberto Ferrari here). In the correct formula below, we need to include ALL rows of the table except for the rows we are filtering out with the date slicer.
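
(Again a reconstruction, assuming the same hypothetical Signals table with a Date column:)

    Total Signals = CALCULATE (
        COUNTROWS ( Signals ),
        ALLEXCEPT ( Signals, Signals[Date] )
    )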

Notice on the second page of my embedded report above, I have listed a couple of the wrong measures in my card visual. Apply a date filter and then select a single point on the map. Notice how “Total Signals (WRONG)” changes to 1. This is because you just applied filter context to the measure when you selected that single point on the map. We have to use this measure to get a representative sample, but it has to ignore the map’s filter context, which is why in the RIGHT measure above we have done the context transition to get ALL of the rows except those being filtered by the date.

Now we need to apply a sample rate to the selected rows. The sample rate will be dynamic, depending on how many rows are selected. We start to exceed the bounds of the map visual at around 3,000 points, so I play it safe below and use 2500 as my denominator. This is because the later calculation will not exactly match this, and we may end up with more than 2500 rows:
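
(Reconstructed sketch of the sample rate measure:)

    Sample Rate = CEILING ( [Total Signals] / 2500, 1 )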

The CEILING function rounds the decimal number up to the nearest integer, as I have specified “1” as the second argument.

On page 2 in the report above, you can see how the sample rate changes as the total signals increase with multiple dates selected in the slicer.

As the next building block in the representative sample formula, we pull out the random number generator to give a random number between 1 and the dynamic sample rate calculated above:
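
(Sketch:)

    Random Number = RANDBETWEEN ( 1, [Sample Rate] )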

We will use the number generated from this measure and compare it to our Sample Rate. When they are equal we will plot the point.


Before you call me crazy and question my judgment… yes, I know that RAND or RANDBETWEEN does not guarantee a consistent sampling of data. If my sample rate is 100, it may take 150 or even 200 tries before the measure above equals the sample rate. But it may also take only 5 tries. I look at the use of RAND() as a “better than the alternative” approach, as it gives me the opportunity to get a representative sample of my data versus getting a cut-off and unusable dataset.

In a prior attempt, I used the MOD function and some other DAX acrobatics to try to ensure a consistent representation of the data, but in each attempt I was defeated by the filter context issue of the lat/long on the map. This is where I would humbly welcome feedback from any DAX ninjas if there is a better way.

The reason the RANDBETWEEN() function works is that it is re-evaluated for every single point the map tries to plot.

Below is the measure for the representative sample. Use this in the SIZE field.
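
(Reconstructed sketch, again assuming the hypothetical Signals table:)

    Representative Sample = IF ( [Random Number] = [Sample Rate], COUNTROWS ( Signals ) )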

As it is possible I could have multiple rows being counted for the same lat/long position, I add some additional DAX to ensure I only get a size of 1 or BLANK():
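
(Sketch of the capped version:)

    Representative Sample =
        IF ( [Random Number] = [Sample Rate] && COUNTROWS ( Signals ) >= 1, 1, BLANK () )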

The result below is a nice smooth representative sample without too much DAX needing to be written.

[Screenshot: map showing a smooth representative sample]

Pros for using this approach:

  • Easy… low difficulty level
  • Native Bing map means it is zoom-able… in my use case, it is important to be able to zoom in on the areas that have bad signal coverage
  • Great for big datasets… I believe all the DAX used above is available when doing direct query

Cons

  • Using RAND does not ensure a consistent sampling
  • May not be well suited for geographically dispersed data (see the hierarchical approach below)

Using R script with ggmap

Another way to approach the sampling problem is with an R visual. We can create an R Visual that uses ggmap to plot the data points for our route.

I drag in the LATITUDE, LONGITUDE, Max_Signal_Rate, and Total Signals fields (to avoid pre-sampling the data, reset the Total Signals measure to just use the COUNTROWS function).

Here is the R script that then needs to be put in the R Script Editor:
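
(The original script was embedded; below is a minimal reconstruction assuming the field names above appear in the R visual’s dataset. ggmap’s map-download behavior has changed over the years, so treat this as a sketch.)

    # Power BI exposes the visual's fields as a data frame named "dataset"
    library(ggplot2)
    library(ggmap)

    # Fetch a base map roughly centered on the route
    center <- c(lon = mean(dataset$LONGITUDE), lat = mean(dataset$LATITUDE))
    baseMap <- get_map(location = center, zoom = 7)

    # Plot each point, colored by signal strength (default color scale)
    ggmap(baseMap) +
      geom_point(data = dataset,
                 aes(x = LONGITUDE, y = LATITUDE, colour = Max_Signal_Rate),
                 size = 1)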

To use the above code, you will need to be sure you have installed R on your machine and that you have also installed the ggmap and ggplot2 packages.

The above code produces the plot shown below and can still be filtered with a date slicer.

[Screenshot: ggmap plot of the route]

As I am still learning R, I didn’t take the time to assign a specific coloring scheme for the plotted points, so the above just took the default.

There are a couple of obvious CONS to this approach:

  1. You have to know R. It is not nearly as easy as just using native Power BI visuals and a little bit of DAX
  2. The map is not zoom-able. It is completely static, which does not let me solve my business problem of zooming in on the spots where my signal strength is low. For my use case this does not work; in another use case, the zoom may not be an issue.

A couple of PROS for this approach:

  1. I can plot more data points with ggmap. There appears to still be a data limit, but it is larger than the out-of-box map; I have not yet determined what that limit is. I believe it has more to do with Power BI R visuals than with the R ggmap functionality.
  2. With the ability to “code your own visual”, there is much more control and opportunity to create a custom experience. Instead of using the geom_point function, we could use geom_tile to see a more “blocky” heat-map style plot as shown in this tutorial, or even geom_path, which would allow you to do more of a point A to point B type of map if you didn’t need to plot information along the way as shown in my example.

Using a Location Hierarchy

This to me seems like an obvious approach when dealing with location data that has a natural hierarchy such as Country->State/Province or Country->City.

So instead of having something like this

[Screenshot: map plotting all points at once]

by using a hierarchy in the “Location” field, you can right-click and “drill down” on a parent geography to see the details:

[Screenshots: drilling down through the location hierarchy]

This is a very clean way to keep your data points under the threshold of around 3,000 without having to write any DAX or R. The biggest problem, however, is that it is fully dependent on the location hierarchy providing the necessary split to keep the lowest level of detail from going over the threshold.


These are 3 different ways of dealing with too much data being plotted on your maps. Again, for the representative sample pattern I showed, I am interested in any thoughts on better approaches. Leave them in the comments or ping me on Twitter at @angryanalytics.

90 Day Power BI Implementation Plan

Power BI Turns One Year Old Today!
Before I start with my normal ramblings, let’s take a quick moment to say “Happy Birthday” to Power BI… it officially became generally available on July 24th, 2015. After one year, I can say from experience that the momentum has been huge and this product has not disappointed. Below is a little happy birthday tribute video.

Power BI Implementation Plan

Every organization should have an implementation plan when deciding to move from departmental use of Power BI to a full enterprise deployment. I have put together a 90-day implementation outline below that emphasizes quickly getting a pilot set of users on Power BI, so that you can obtain real feedback early, before you expose everyone in your organization to issues that could have been avoided by learning from a smaller deployment.

I have not included every important detail below; this is meant to be an outline that can be tweaked to meet your organization’s needs. Again, I would like to reference the Power BI Governance Whitepaper here, as it articulates at a detailed level things that are not included below.

In the below plan I have listed the use of an “enterprise data mart” as optional. I am a strong advocate for early adoption of SSAS Tabular models, and I believe that from day 1 they should be in consideration for the data “you know now” should be approached as an IT-managed solution rather than a business-managed (or self-service) solution. However, there are other approaches, and depending on your underlying data strategy, these data marts might not be necessary.


First 30 days

  • Identify group of users for Power BI Pilot
  • Hold initial security meeting with security stakeholders to identify any foreseen issues – Previous blog post on content to consider
  • Initial short term policies are defined for High Business Impact (HBI)/Low Business Impact (LBI) data and Power BI usage
  • Training material is verified and initial pilot group is trained on Power BI at end of 30 day period
  • Active Directory Groups structure considered for future Row Level Security (RLS) requirements
  • Optional: Initial Tabular model for enterprise data is defined and shared with the pilot user group as the first data mart external to the self-service capabilities
  • High Level Design started to support long term BI strategy
    • This would be the underlying data strategy to consider various data storage options such as Hadoop, a large data warehouse, or simply relying on targeted data marts (such as SSAS Tabular models) to be used once self-service models mature into enterprise wide models

30-60 days

  • Power BI Pilot for the targeted users after training has taken place
    • This would be mostly self-service with some ability to use the initial enterprise data mart if one was implemented
  • Security requirements defined for long term Power BI implementation
    • Any gaps are addressed with additional policies or procedures
  • Detail design started to support long term BI strategy
  • Initial feedback from Power BI pilot group is collected and considered for Power BI implementation plan
  • Power BI implementation plan defined to support rest of organization
  • Define success criteria for the overall enterprise roll out.
    • This may be the number of active users, number of reports, or number of content packs

60-90 days

  • Optional: Refined enterprise data mart completed with Row Level Security considerations
  • Active Directory structure to support organizational rollout of Power BI completed
  • Training begins for rest of organization on Power BI
  • Implementation of long-term data strategy underway, with a phased approach for replacing “self-service” models with enterprise models as needed
  • Optional: High-level requirements for Power BI Embedded defined if external user access is needed (such as customers or partners)

Beyond first 90 days, things to consider

  • Majority of “knowledge workers” within the organization are taking advantage of Power BI to help run the business
  • Optional: Power BI Embedded development/pilot underway
  • Power BI continuous learning efforts through user groups / lunch and learns are established and underway
  • Evaluate (and continuously re-evaluate and update) the success criteria defined to see if they are being met


Dear Mr. CISO

“Dear Mr. CISO,

 As our Chief Information Security Officer, I know you are exhausted trying to keep us safe from security breaches, and you are ultimately responsible for any that may occur in our organization. I have started working with Power BI Desktop to get a more modern and effective view of the data that is running our business. My manager told me, though, that we can’t upload any of our data to the cloud. I asked her why, and she told me it was because we haven’t gone through the necessary steps to ensure our data would be safe, and that it is too much of a risk. When I pushed for more information, she said it is ultimately your team’s decision, and when you approve it, then we can use products like Power BI that have components that live in the cloud.”


In so many organizations that are starting to see the benefits of using a Software as a Service (SaaS) solution like Power BI, this is the elephant in the room. Individual departments start to use the FREE version of Power BI, and at some point the organization wants to limit its use until a security review has been completed by the security team. This blog post will discuss 5 main security areas your CISO will want to know about:

  1. Compliance
  2. Authentication
  3. Authorization
  4. Data Security
  5. Auditability

Power BI is a self-service solution, intended to enable business users to “do more” with their data. In the data security section of this post I will talk about how data can remain on premises and be accessed only via a private pipe through Power BI, but resist the urge to restrict all data from being uploaded to the cloud: it will greatly reduce your ability to truly enable a data culture in your organization. Create policies that define which data is High Business Impact (HBI) and which data is Low Business Impact (LBI). To start with, you may choose to keep your HBI data on premises only while allowing your LBI data to be published to the cloud. Over time, as your comfort with a cloud-based solution grows, those policies can be modified.

Prior to discussing these 5 areas of security, I want to reference two white papers that have been extremely valuable to me. The Power BI Security White Paper is publicly available and describes in detail how Power BI as an application is architected and how data is stored in the cloud. There is also a great governance white paper published by Melissa Coates (@SQLChick) and Javier Guillén (@javiguillen) that will help you understand approaches to deploying Power BI.

Compliance

As of the time of this writing in July 2016, the compliance certifications below are held/adhered to by Power BI. An updated list should be available in the Microsoft Trust Center.

[Screenshot: Power BI compliance certifications]

Most security conversations should start here. Hundreds of pages of security requirements are documented in these compliance standards (I will leave it to you to bing or google them)… A security conversation should not start from ground zero; it should start from the coverage of the above compliances and ask, “beyond these requirements, what are your concerns?” If Power BI meets these security requirements, why rehash them? Start with the “above and beyond” and reduce a week-long security review effort to a couple of hours.

One compliance worth pointing out is ISO 27018. This is the standard for Personally Identifiable Information (PII) in the public cloud, often the data most scrutinized for its security in organizations that hold it. Full name, birth date, SSN… no one wants to see this data leaked in a security breach. Power BI attests to holding this standard to keep this data safe.

Lastly on this topic: Microsoft strives to hold as many of the compliances that organizations frequently request as possible, so check the Trust Center link above every few months for other compliances that may be important to you if they are not yet there.

Authentication

The front door… This is arguably where 80% or more of security breaches happen: weak username/password security.

Have you heard of phishing attacks? Someone sends you an email that looks like a legitimate request, with your company’s logo, to reset or verify your credentials by clicking on a link. You go to the bogus page (which also has your company’s logo on it), enter your username/password, and it thanks you for your verification… it happens all the time… and sadly, it works far too often…

Multi-Factor Authentication (or MFA) to the rescue…

Power BI can be set up to ensure that any time someone tries to log in from outside the organization’s network, a second (or even third) method of identity verification is required before the user is allowed access. Often this means the Power BI user enters their credentials and is then required to accept a phone call (or use an app) to verify their identity prior to being allowed into the application.

So, for that little phishing attack above: when the hacker tries to use the credentials that were naively shared with him, the real user would get a call on their cellphone asking them to verify their identity to access the system, with a warning to contact their IT administrator if they didn’t just try to access it. It could even require a PIN known only to the real user, in case the hacker knew the person and took their phone.

This single capability of multi-factor authentication can be nearly as powerful an authentication method as the badge you swipe every day to get into the building where you work.

Authorization

Once a user has validated who they are and successfully logged in to Power BI, it becomes a question of what data they are “authorized” to see.


I previously did a blog post on how to distribute content in Power BI, which covered how to give the appropriate people access to the appropriate content. However, if you want to distribute a single report to multiple users but only allow each user to see the data they have access to, this can be achieved through the Row Level Security (RLS) feature that has already been well documented.
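
As a purely illustrative aside (not from the original post): an RLS role is essentially a DAX filter expression on a table, evaluated per user at query time. A minimal sketch with hypothetical column names:

    // Table filter on a hypothetical Sales table: each user sees only
    // the rows tagged with their own login
    [SalesRepEmail] = USERNAME()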

Again, it is important to remember that Power BI is a self-service solution that gives the content author control over how data is shared. Additional features will be added over time to restrict some capabilities and provide further governance for enterprise solutions, but don’t get too caught up in what “is not restricted” currently, as people generally do not go out of their way to find additional work.

Data Security

So now your data is outside the four walls of your data center… it is in the cloud… is it safe?

This is where I refer you to the security white paper for in-depth detail on how Power BI secures data. First of all, Power BI as an application follows general 3-tier architecture principles to ensure that the web front end communicates with the back end through services. The “Data Storage and Movement” section of the white paper explains that nearly all data is encrypted at rest in Power BI, and the remainder that is not is planned to be encrypted in Q3 of 2016.

If you are not quite ready to let your HBI data be uploaded to the cloud, Power BI does offer the capability to keep your data on premises so that Power BI basically becomes a web front end ONLY, just as your own data center’s DMZ would be. This is a PRO feature via the Enterprise Gateway and requires building reports in DirectQuery mode.

[Diagram: DirectQuery through the Enterprise Gateway]

Another feature available to Power BI is ExpressRoute. This sends Power BI traffic over an MPLS line through your network carrier that is completely private. So instead of the data movement occurring over HTTPS on the public internet, it can go through this private tunnel, the same way your data center likely already communicates with its disaster recovery location hundreds of miles away. Think of it as an extension of your own data center. For more information about ExpressRoute and Power BI, please see the link below:
https://powerbi.microsoft.com/en-us/documentation/powerbi-admin-power-bi-expressroute/

With these data security controls in place and the option to keep specific datasets on premises if necessary, you need to ask the question: “what is it that you don’t trust?” These are the same controls you would put in place for your most stringent security requirements for on-premises applications.

An area where Microsoft likely goes beyond what is done in your internal data center is the use of “Red Teaming”. This blog and white paper discuss the technique in detail; it is basically a group of security experts who try to penetrate the Microsoft cloud environment so that threat detection and response measures can be constantly improved.

Auditability

<<UPDATE as of 8/26/2016, auditing feature is now available, see this video blog on this feature>>

Your CISO will want to know how user activity can be tracked/monitored in the Power BI environment even though it is managed by Microsoft and external to your data center.

It is likely that there are mission-critical internal systems used in your organization that do not have the level of auditability your CISO will want from a software as a service (SaaS) application. That may be understandable, as it is the “fear of the unknown”: your organization does not control the environment in which Power BI is hosted.

Activities such as monitoring user logins to the system can be achieved through the O365 logging activities discussed here.

At the time of this writing, there are not additional activities that can be logged for Power BI usage, such as report views and dataset usage; however, this looks to be coming very soon. See the below link on the ideas site; it has been placed in the “Started” status, which means it should be delivered in the coming months.
https://ideas.powerbi.com/forums/265200-power-bi-ideas/suggestions/13279371-better-auditing-of-user-activity-datasets-report

It is easy to forget, with all the capabilities Power BI has to offer, that it has been generally available for less than a year (July 24th is the anniversary date). So it is somewhat understandable that these deeper administrative functions, such as audit logging, are just now being delivered. It will be up to your security team whether to allow the use of Power BI given all of the security features I have mentioned above, or to wait until the additional auditing capability is in place. IMHO this is really one of the last steps to making Power BI a solution your CISO should feel very comfortable with.


API Strategies with Power BI

API (Application Programming Interface). You have surely heard the term from your enterprise architects telling you that all data will be accessed through “the API” and you will no longer access data from the source. They will build all the interfaces for you, and you won’t need that source connectivity anymore.


Enterprise architects are really smart (I used to be one), ha. For all of the wisdom we share as architects, the confession is that making data accessible to our business analysts was never our top priority… All of the design documentation we wrote started with “end user experience” and had a lot of content around “3-tier architecture”. Reporting and business intelligence usually ended up as the last few paragraphs, after a few dozen sections on how the application would work and interface with all the back-end systems.

I am writing this to give the perspective that, more often than not, data connectivity is not at the forefront of most application designs. I do believe that is starting to change as data becomes a financial currency for most organizations, but nonetheless, these “pesky” APIs still stand between you and the data you need to access.

So, you have been given a nice long URL that starts with “https://” to access the data you need. Now what?

In this article, I will show how to connect to your APIs through Power BI in a few different ways. I will start with the OData protocol, then dive deep into APIs secured by the popular OAuth protocol, and end with how awesome Power BI is at traversing the nested JSON documents that get returned.

Connecting to OData Services

The API strategy best suited to connecting business intelligence tools is the OData protocol. When your organization’s APIs are presented as an OData feed, they follow a standard that Power BI can understand. OData is built on the more modern REST style of API, in contrast to the more traditional and still widely used SOAP standard.

There is a public OData feed for the Northwind sample data set here:
http://services.odata.org/northwind/northwind.svc/

To connect to OData feeds, simply go to “Get Data” -> “OData” and enter the URL.

[Screenshot: the OData option in Get Data]

As this is a standard format for APIs, the benefit is that you are presented with a list of tables from the service, just as you would be if you were connecting directly to a database.

[Screenshot: OData table navigator]


Most APIs will have additional request parameters that can be added to get more specific data. The below URL connects to the same service as above but returns only a single record:
http://services.odata.org/Northwind/Northwind.svc/Orders(10643)

[Screenshot: the single OData record rendered as a table]
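
As a further illustration (my addition; these are standard OData query options rather than anything specific to this service), you can also filter and limit results directly in the URL:

    http://services.odata.org/Northwind/Northwind.svc/Orders?$top=10&$filter=ShipCountry eq 'Germany'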

You have just successfully connected to your first API. From this point forward, OData behaves the same way as other data sources, and you can start shaping data in the “Edit Queries” window.

If you can influence your organization to adopt this standard for APIs, your life consuming them will be much easier. However, the purpose of this blog site is to help you use technologies such as Power BI in an enterprise scenario, and it is most likely that your APIs are not OData compliant…. So let’s get on with the FUN stuff.

Non-OData APIs

After you ask your architect if the API you are trying to connect to is OData compliant and she tells you NO, the next alternative is the Web data source.

[Screenshot: the Web data source]

When connecting to a web URL for your internal (or external) API, it will likely be secured and have a prefix of “https” instead of “http”. In the OData example above, I did not connect to a secured service, but had I, the below dialog would have been presented.

[Screenshot: authentication options for a web data source]

There are 5 different authentication methods that are quite easy to connect with. Internal APIs will likely use Basic or Windows credentials. Someone will have to provide you a user/password combination for connecting to the service or, if Windows credentials are used, verify that your own user/password combination has been granted access.

If you are connecting with one of these five authentication types, it is pretty straightforward, and you can skip ahead to the end of this post where I show how to traverse the JSON documents that will likely be returned.

Connecting to OAuth Services

Many organizations are moving towards the OAuth authentication protocol. Notice in the screenshot above that this protocol is not natively supported in Power BI. It carries an extra layer of complexity, as it requires two services: one service is called with an API key and an API secret and returns a token, which is then used with all subsequent requests to the resource URL.

We will get into the weeds rather quickly in the rest of this blog post, so for the half of you who will stop reading at this point, please first go to the Power BI Ideas site and VOTE for the OAuth protocol to be added as a first-class authentication provider. Here is the link: https://ideas.powerbi.com/forums/265200-power-bi-ideas/suggestions/13195278-oauth

Also, the content below is available in a Power BI Template file here. Read on, but feel free to download the template to save a lot of the grunt work.

Setting up a Twitter App for our OAuth API Example

First of all, Twitter connectivity via Power Query has already been blogged about a few times. The most useful post I found was by Chris Koester here. The script I use later in this article is nearly identical to what I learned from his post. I will talk through the necessary script, but also focus on how you can traverse the JSON data returned by most modern APIs you will encounter.

Twitter uses OAuth2 for authentication. This will require you to set up a free developer account if you have not already.

Go to https://dev.twitter.com and sign in with your Twitter account. You will then need to create a Twitter application, which will give us the necessary API Key and API Secret to authenticate with the Twitter Search API.

In the new Twitter UI, the applications section is a little hard to find. At the bottom of https://dev.twitter.com are some links; on the right-hand side you will see “Manage Your Apps”.

[Screenshot: “Manage Your Apps” link on dev.twitter.com]

You will see a “Create New App” button that will take you through a dialog to set up your application.

Callback URL issue with OAuth based APIs

In the application creation dialog, you will notice a Callback URL entry that we will leave blank for this example.

[Screenshot: Callback URL field in the application dialog]

In a traditional OAuth flow that has a UI associated with it, this is the URL the flow returns user interaction to after the user agrees to allow the application to act on their behalf. This blog is not meant to go into that detail, but you can review this link for more information.

Since with OAuth we have to get an authentication token from the first service we call, there is an issue with some APIs: they only pass this token back to the callback URL. Power BI cannot read a response returned to a callback URL, so if the API you are using requires that “authentication flow”, it can’t be used with Power BI (well, that is not entirely true, but to keep this blog from being 50 pages long, we will say it can’t). For instance, the LinkedIn API requires this flow, so it is not something we can easily connect to from Power BI.

As long as the token is returned in the response from the first authentication service call (as the Twitter API does), we will be able to integrate it with Power BI.

Connecting to the Twitter Search API

Once you have entered the data to create an application, you will land on your “App details” screen. Navigate to the “Keys and Access Tokens” tab and copy the API Key and Secret so they can be used later.

Note: These should be treated like passwords and not made public

[Screenshot: Keys and Access Tokens tab]

Back in Power BI, go to the Web data source and type in an arbitrary web URL, as we will be replacing all of this with a script. I used http://twitter.com and was presented with the navigator dialog to select a table from that page. Click “Edit” to open the Query Editor.

[Screenshot: navigator dialog for the arbitrary URL]

This is a good time to take advantage of a new feature added in the April Power BI Desktop update: Parameters. For deeper insight into this feature, check out the Power BI blog entry.

[Screenshot: Manage Parameters]

Create 4 new parameters as follows:
API KEY: <<your API Key>>
API Secret: <<your API Secret>>
Token URL: https://api.twitter.com/oauth2/token
Search URL: https://api.twitter.com/1.1/search/tweets.json?q=Analytics&count=100

Notice the Search URL has a couple of query string parameters. In my example we will be searching on the word “Analytics”.

Once the parameters are entered, you will see them in the queries list with their values, along with the arbitrary table you pulled in from the Web data source.

[Screenshot: parameters in the Query Editor]


Advanced Editor

Now that we have our parameters in place, navigate back to the table query and click on “Advanced Editor”.

[Screenshot: Advanced Editor button]

Once in the advanced editor, replace the script with the following:
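
(The original script was shown as an embedded block; below is a reconstruction of the pattern from Chris Koester’s post using the four parameters created above. Treat it as a sketch, as Twitter’s endpoints and requirements may have changed since.)

    let
        // Base64-encode "key:secret" for the Basic authorization header
        AuthKey = "Basic " & Binary.ToText(
            Text.ToBinary(Uri.EscapeDataString(#"API KEY") & ":" & Uri.EscapeDataString(#"API Secret")),
            BinaryEncoding.Base64),

        // First call: POST to the token service to obtain a bearer token
        TokenResponse = Json.Document(Web.Contents(#"Token URL",
            [Headers = [Authorization = AuthKey,
                        #"Content-Type" = "application/x-www-form-urlencoded;charset=UTF-8"],
             Content = Text.ToBinary("grant_type=client_credentials")])),
        Bearer = "Bearer " & TokenResponse[access_token],

        // Second call: GET the search results using the bearer token
        Source = Json.Document(Web.Contents(#"Search URL",
            [Headers = [Authorization = Bearer]]))
    in
        Source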

In the above code, there are two calls to the Web.Contents function. The first calls the token service to get a token, which is then used in the second service call to actually get results.

OAuth is a preferred authentication method because the token returned from the first service is only valid for a finite period of time. If you were to manually produce this token and then just call the search service with it, it would no longer be valid the next time you tried to refresh the data.

That is why the script above is so powerful: it handles the token generation so that you have a repeatable, automated process. Kudos to @cjkoester for putting this script together.

A couple of other things to note in the script: our parameters are referenced with a “#” sign prefix, and not all OAuth services will follow the same signature as above. This can still be used as an example, and ideally your development team can tell you what changes you need to make for the pattern to work.

I renamed the query “TwitterSearch”. The result should now look similar to this:

[Screenshot: TwitterSearch query result]


Working with the JSON response

JSON (JavaScript Object Notation) is the most popular format for transmitting data in modern APIs. XML is also still widely used, and if your APIs return XML, many of these same steps will apply.

I am really impressed with the JSON parser built into the Query Editor of Power BI. For instance, the JSON returned by the Twitter API we just connected to looks like this:

[Screenshot: raw JSON response viewed in Fiddler]

I can count 6 levels of nesting in the small portion of the JSON response shown above (via Fiddler). The Power BI Query Editor handles this with no issues.

We will create two separate queries from this response, one for the statuses and the other for the hashtags.

Taking the TwitterSearch result, the two top-level entities are “statuses” and “search_metadata”. Click the “List” value for “statuses”, and this will expand the record set for “statuses”.

[Screenshot: the statuses list with the To Table option]

Notice the “To Table” option at the top. By changing this list to a table, we will be able to expand the records into all their values. Click “OK” in the To Table dialog and you will see the magic expand button appear in the column header.

[Screenshot: the expand button in the column header]

Click the button, then uncheck “Use original column name as prefix” to avoid “Column1” being prefixed to every column created.

[Screenshot: column expansion dialog]

Click OK.

Congratulations! You now have the last 100 statuses for the search term “Analytics” that we used in the Search URL parameter above: https://api.twitter.com/1.1/search/tweets.json?q=Analytics&count=100

Let’s use this same result to create an additional query for hashtags. Simply right-click on the “TwitterSearch” query and select “Duplicate”.

[Screenshot: Duplicate option on the TwitterSearch query]

Rename the query “Hashtags”.

Find the “Entities” column and click the magic expand button again.

[Screenshot: expanding the Entities column]

In the column expansion dialog, simply click “OK” again to see all the results of the nested JSON document.

[Screenshot: expanding the hashtags column]

Click the magic expand button again on “hashtags”.

This will bring in the null values and the “List” links will become “Record” links.

Finally click the magic expand button one last time to be greeted by the hashtag records we wanted.

[Screenshot: the expanded hashtag records]

Right-click “text.1” and rename it “hashtag”.

Now we can use the “Choose Columns” dialog to select just a few of the columns, reducing our query to the information relevant to the hashtags.

[Screenshot: Choose Columns dialog]

From here you can “Close and Apply” the Query Editor window, and you will have two queries: one for the statuses of your Twitter search and the other for the hashtags, which you can cross-filter against.

A simple viz may look like this:

[Screenshot: a simple report built from the two queries]

Power BI Template

Another new feature in the April update of Power BI Desktop is the Power BI Template file. I have saved all of the above work in a template file, created by clicking the Export option from the File menu, and I am sharing it here.

[Screenshot: exporting a Power BI Template]

This retains all of the work completed above but does not retain any of the data. When you open the template file, it will prompt you for the parameters discussed above.

Here is the template file link again: Angry Analytics Twitter API Example.

Conclusion

In closing, this article was intended to show you multiple ways to connect to APIs, as it is likely your organization has set standards around data connectivity that no longer allow connecting directly to the source.

Be an advocate for a Data Culture. Putting data in the hands of more people as quickly as possible will result in the ability to achieve more. Challenge your development teams to make data accessible. Data is currency!

Power BI Content Workflow

I work with many enterprise-level organizations in the Midwestern United States. There is a lot of excitement around Power BI; however, I often run into challenges regarding features expected of a business intelligence solution that can meet the needs of an enterprise, such as governance, administration, and lifecycle management.


In business intelligence, there is a pendulum that swings between self-service and governance, and it wreaks havoc on most organizations of reputable size. I can confidently say there is little doubt that, from an end-user feature set, Power BI competes well with any solution on the market. With a FREE and a PRO offering at only $9.99 a month per user, the price point is hard to beat. But what about total cost of ownership (TCO)? What about enterprise deployments of Power BI? I intend to use this blog to help answer these questions regarding Power BI, SSAS Tabular models, and an overall agile approach to business intelligence within the enterprise.

Let’s start by discussing lifecycle management and taking reports through development, testing, and production, similar to how you would deploy the codebase for an application.


First of all, we need to ensure we are working with the PRO version of Power BI. Yes, if you want enterprise-level features, you are going to have to pay for them. Two PRO features are used below: Power BI Group Workspaces and Content Packs.

Group Workspaces in Power BI allow multiple users to “co-author” reports and share ownership of the Power BI content. You can easily create a group workspace from the left-hand navigation bar.

[Screenshot: creating a group workspace]

Once a group has been created, you can assign users either “Edit” or “View Only” access. This will be an important distinction between the users who will be “co-authoring” and those who will be testing and doing quality assurance.

[Screenshot: adding users to the group with Edit or View Only access]


Once a group has been established, with a clear distinction between the content developers and the testers, the below diagram can be followed to achieve a content management workflow.

[Diagram: Power BI content management workflow]

This diagram is also provided in pdf form below:
Power BI Content Workflow

In this workflow, OneDrive for Business (ODFB) is used as a collaboration space for PBIX files. Alternatively, this could also be a source control environment or a local file share. A benefit of O365 deployments for Power BI is that when a group is created, an ODFB directory is also established for sharing.

The group workspace is used for collaboration between content authors and testers. Once development and testing iterations are complete, a Content Pack should be produced to distribute the production version of the dashboard, reports, and datasets to the rest of the organization. Content Packs are created by going to “Get Data” -> “My Organization” -> “Create Content Pack”.

[Screenshot: Create Content Pack]

IMPORTANT: It is critical in an enterprise scenario that personal workspaces are not used to publish content packs. This creates a dependency on a single author having full control over content, and scenarios will consistently pop up where that author is not available when updates are needed. Enterprise deployment scenarios can’t be achieved when content is tied to a personal workspace. Establish a group workspace structure that makes sense for your organization, as pointed out in the prerequisite text on the workflow diagram.

Making Revisions

As revisions occur to the content in the group workspace, a notification appears in the upper right reminding you that there is a content pack associated with the dashboard or report being modified, and that an update needs to be applied to the content pack to publish the changes.

[Screenshot: pending-changes notification]

I have talked with some organizations that originally considered bypassing content packs and just using group workspaces, with “View Only” users receiving the content. This would not allow for iterating on changes before they are published to end users.

Once the iterations have been completed and the next revision is ready to publish, anyone with “Edit” privileges in the group can go to Settings (the gear icon in the upper right) and “View Content Packs”.

[Screenshot: View Content Packs]

A warning icon is displayed next to the content pack name, indicating there are pending changes. Once you click “Edit” you will have the option to “Update” the content pack, and the changes will be distributed to those who have access to it.

[Screenshot: the Update button for the content pack]

Connection Strings and Separation of Duties (SoD)

The above workflow still leaves a couple of desired features without answers.

With group workspaces, we have separated the ability to develop and test report changes from the ability to publish them to the rest of the organization, but how do we transition from a DEV/TEST data source to a PROD data source? This has to be done within the Power BI Desktop file (as of the time this blog post was written). If a new calculation or set of tables is being tested, the development team and the testers will have to collaborate on when to change the connection strings from DEV/TEST to PROD sources and republish the PBIX file.

In my experience, on the reporting and analytics side of things, production connectivity is often required by the “power users” in each department who are doing the quality assurance, as they can only tell whether new report features pass the “sniff test” when connected to data they know. DEV/TEST environments often grow stale and do not hold as much value on the reporting/analytics side as they do for application development. Connection string management will be enhanced as Power BI matures, but even then, limited value will be achieved without production data to verify changes against.

Lastly, when we look at the fact that a group workspace only has “Edit” and “View” roles and lacks a “Publish” role, it may be of concern that a content author can push changes to content packs without separation of duties (which may require someone else to actually publish the change). If this is critical to your business, separation of duties can be achieved by adding another group workspace that only publishers have access to, but I would suggest this is often overkill. Because reporting cannot “alter” any underlying data, it does not need controls as strict as those protecting a transactional application from malicious code. And remember, a significant portion of any enterprise organization is still running its business from Excel spreadsheets produced and managed by the “power users” in each department. They simply extract data from source systems and create Excel reports that often go all the way to the CFO or CEO. So, coming back to the initial comment in this blog regarding the pendulum of self-service vs. governance, remember that these concerns probably already exist in your organization today, with much less visibility than they would have under the proposed workflow above.


Feel free to leave any comments on this article or suggestions for future articles that touch on the use of Microsoft BI solutions such as Power BI for enterprise deployment.