Dear Mr. CISO

“Dear Mr. CISO,

As our Chief Information Security Officer, I know you are exhausted trying to keep us safe from security breaches, and you are ultimately responsible for any that may occur in our organization. I have started working with Power BI Desktop to get a more modern and effective view of the data that is running our business. My manager told me, though, that we can’t upload any of our data to the cloud. I asked her why, and she just told me that we haven’t gone through the necessary steps to ensure our data would be safe and that it is too much of a risk. When I pushed for more information, she said it is ultimately your team’s decision, and when you approve it we can use products like Power BI that have components that live in the cloud.”

[Image: data security]

In so many organizations that are starting to see the benefits of using a Software as a Service (SaaS) solution like Power BI, this is the elephant in the room. Individual departments start to use the FREE version of Power BI. At some point the organization wants to limit the use until a security review has been completed by the security team. This blog post will discuss 5 main security areas your CISO will want to know about:

  1. Compliance
  2. Authentication
  3. Authorization
  4. Data Security
  5. Auditability

Power BI is a self-service solution. It is intended to enable business users to “do more” with their data. In the data security section of this post I will talk about how data can remain on premises and be accessed only via a private pipe through Power BI, but resist the urge to restrict all data from being uploaded to the cloud, as doing so will greatly reduce your ability to truly enable a data culture in your organization. Instead, create policies that define which data is High Business Impact (HBI) and which data is Low Business Impact (LBI). To start with, you may choose to keep your HBI data on premises only while allowing your LBI data to be published to the cloud. Over time, as your comfort with a cloud-based solution grows, those policies can be modified.

Prior to discussing these 5 areas of security, I want to reference two white papers that have been extremely valuable to me. The Power BI Security White Paper is publicly available and describes in detail how Power BI as an application is architected and how data is stored in the cloud. There is also a great governance white paper published by Melissa Coates (@SQLChick) and Javier Guillén (@javiguillen) that will help you understand approaches to deploying Power BI.

Compliance

As of the time of this writing in July 2016, the compliance certifications below are held by Power BI. An updated list should be available in the Microsoft Trust Center.

[Image: Power BI compliance certifications]

Most security conversations should start here. Hundreds of pages of security requirements are documented in these compliance standards (I will leave it to you to bing or google them). A security conversation should not start from ground zero; it should start from the coverage of these certifications and ask “beyond these requirements, what are your concerns?” If Power BI meets these security requirements, why rehash them? Start with the “above and beyond” and reduce a week-long security review effort to a couple of hours.

One certification worth pointing out is ISO 27018, the standard for Personally Identifiable Information (PII) in the public cloud. This is often the data most scrutinized for its security in organizations that hold it: full name, birth date, SSN… no one wants to see this data leaked in a security breach. Power BI attests to holding this standard to keep this data safe.

Lastly on this topic, Microsoft strives to hold as many of the certifications as organizations frequently request, so check the Trust Center link above every few months for other certifications that may be important to you if they are not yet listed.

Authentication

The front door… This is where arguably 80% or more of security breaches happen: weak username/password security.

Have you heard of phishing attacks? Someone sends you an email that looks like a legitimate request, complete with your company’s logo, asking you to reset or verify your credentials by clicking on a link. You go to the bogus page (which also has your company’s logo on it), you enter your username/password, and it thanks you for your verification… It happens all the time… and sadly, it works far too often.

Multi-Factor Authentication (MFA) to the rescue…
[Image: multi-factor authentication animation]

Power BI can be set up to ensure that any time someone tries to log in from outside the organization’s network, a second (or even third) method of identity verification is required before the user is granted access. Often this means the Power BI user enters their credentials and is then required to accept a phone call (or use an app) to verify their identity before being allowed into the application.

Remember that little phishing attack above? When the hacker tries to use the credentials that were naively shared with him, the real user gets a call on their cellphone asking them to verify their identity, with a warning to contact their IT administrator if they did not just try to access the system. It could even require them to enter a PIN known only to them, in case the hacker knows the person and has taken their phone.

This single capability of multi-factor authentication is in many ways just as powerful an authentication method as the badge you swipe every day to get into the building where you work.

Authorization

Once a user has validated who they are and has successfully logged in to Power BI, it becomes a question of what data they are “authorized” to see.

[Image: authorization]

I previously did a blog post on how to distribute content in Power BI, which covered how to give the appropriate people access to the appropriate content. However, if you want to distribute a single report to multiple users but allow each user to see only the data they have access to, this can be achieved through the Row-Level Security (RLS) feature, which has already been well documented.

Again, it is important to remember that Power BI is a self-service solution that gives the content author control over how data is shared. Additional features will be added over time to restrict some capabilities and to provide further governance for enterprise solutions, but don’t get too caught up in what “is not restricted” currently, as people generally do not go out of their way to find additional work.

Data Security

So now your data is outside the four walls of your data center… it is in the cloud… is it safe?

This is where I refer to the security white paper for in-depth detail on how Power BI secures data. First of all, Power BI as an application follows general three-tier architecture principles to ensure that the web front end communicates through services to the back end. The “Data Storage and Movement” section of the white paper explains that nearly all data is encrypted at rest in Power BI, and the remainder is planned to be encrypted in Q3 of 2016.

If you are not quite ready to let your HBI data be uploaded to the cloud, Power BI does offer the capability to keep your data on premises, so that Power BI effectively becomes a web front end only, much as your own data center’s DMZ would be. This is a PRO feature, enabled via the Enterprise Gateway, and requires building reports in DirectQuery mode.

[Image: DirectQuery]

Another feature of Power BI is the use of ExpressRoute. This sends Power BI traffic over a completely private MPLS line through your network carrier. Instead of the data movement occurring over HTTPS across the public internet, it goes through this private tunnel, the same way your data center likely already communicates with its disaster recovery location hundreds of miles away. Think of it as an extension of your own data center. For more information about ExpressRoute and Power BI, please see the link below:
https://powerbi.microsoft.com/en-us/documentation/powerbi-admin-power-bi-expressroute/

With the data security controls in place and the option to keep specific datasets on premises if necessary, you need to ask the question: “what is it that you don’t trust?” These are the same controls you would put in place for your most stringent security requirements for on-premises applications.

An area where Microsoft likely goes beyond what is being done in your internal data center is the use of “Red Teaming”. This blog and white paper discuss the technique in detail, but it is basically a group of security experts who continually try to penetrate the Microsoft cloud environment so that the threat detection and response measures can be constantly improved.

Auditability

<<UPDATE as of 8/26/2016: the auditing feature is now available; see this video blog on the feature>>

Your CISO will want to know how user activity can be tracked/monitored in the Power BI environment even though it is managed by Microsoft and external to your data center.

It is likely that there are mission-critical internal systems in your organization that do not have the level of auditability your CISO will want from a software as a service (SaaS) application. That may be understandable: it is the “fear of the unknown”, as your organization does not control the environment in which Power BI is hosted.

Activities such as user logins to the system can be monitored through the O365 logging activities discussed here.

At the time of this writing, additional activities such as report views and dataset usage cannot yet be logged for Power BI; however, this looks to be coming very soon. See the link below on the Ideas site; the request has been placed in the “Started” status, which means it should be delivered in the coming months.
https://ideas.powerbi.com/forums/265200-power-bi-ideas/suggestions/13279371-better-auditing-of-user-activity-datasets-report

It is easy to forget, with all the capabilities that Power BI has to offer, that it has been generally available for less than a year (July 24th is the anniversary date). So it is somewhat understandable that these deeper administrative functions such as audit logging are just now being delivered. It will be up to your security team whether to allow the use of Power BI given all of the security features I have mentioned above, or to wait until the additional auditing capability is in place. IMHO this is really one of the last steps to making Power BI a solution that your CISO should feel very comfortable with.

 

 

API Strategies with Power BI

API (Application Programming Interface). You have surely heard the term from your enterprise architects telling you that all data will be accessed through “the API” and that you will no longer access data from the source. They will build all the interfaces for you, and you won’t need that source connectivity anymore.

[Image: Web API tag cloud]

Enterprise architects are really smart (I used to be one), ha. For all the wisdom we share as architects, the confession is that making data accessible to our business analysts was never our top priority… All of the design documentation we wrote started with “end user experience” and had a lot of content around “3-tier architecture”. Reporting and business intelligence usually ended up as the last few paragraphs, after a few dozen sections on how the application would work and interface with all the back-end systems.

I am writing this to give the perspective that, more often than not, data connectivity is not at the forefront of application designs. I do believe that is starting to change as data becomes a financial currency to most organizations, but nonetheless, these “pesky” APIs still stand between you and the data you need to access.

So, you have been given a nice long URL that starts with “https://” to access the data you need. Now what?

In this article, I will show how to connect to your APIs through Power BI in a few different ways. I will start with the OData protocol, then dive deep into APIs secured by the popular OAuth protocol, and end with how impressively Power BI traverses the nested JSON documents that get returned.

Connecting to OData Services

The API strategy best suited to business intelligence tools is the OData protocol. When your organization’s APIs are presented as an OData feed, they follow a standard that Power BI can understand. OData is a REST-based standard, in contrast to the more traditional and still widely used SOAP standard.

There is a public OData feed for the Northwind sample data set here:
http://services.odata.org/northwind/northwind.svc/

To connect to an OData feed, simply go to “Get Data” -> “OData Feed” and enter the URL.

[Image: Get Data OData feed]

As this is a standard format for APIs, the benefit is that you are presented with a table list from the service just like you would be if you were connecting directly to a database.

[Image: OData table navigator]
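Behind the scenes, the navigator selection generates M along the lines of the sketch below (assuming the Orders table is the one chosen):

let
    // Connect to the public Northwind OData feed
    Source = OData.Feed("http://services.odata.org/northwind/northwind.svc/"),
    // Pick the Orders table from the navigator list
    Orders = Source{[Name = "Orders", Signature = "table"]}[Data]
in
    Orders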

 

Most APIs will have additional request parameters that can be added to get more specific data. The URL below connects to the same service as above but returns only a single record:
http://services.odata.org/Northwind/Northwind.svc/Orders(10643)

[Image: OData response as a table]

You have just successfully connected to your first API. From this point forward, OData behaves the same way as other data sources, and you can start shaping data in the “Edit Queries” window.

If you can influence your organization to adopt this standard for APIs, your life consuming them will be much easier. However, the purpose of this blog site is to help you use technologies such as Power BI in an enterprise scenario, and it is most likely that your APIs are not OData compliant… So let’s get on with the FUN stuff.

Non-OData APIs

After you ask your architect whether the API you are trying to connect to is OData compliant and she tells you NO, the next alternative is to use the Web data source.

[Image: connect to web]

When connecting to a web URL for your internal (or external) API, it is likely going to be secured and will have a prefix of “https” instead of “http”. In our OData example above, I did not connect to a secured service, but had I, the dialog below would have been presented.

[Image: web connection authentication dialog]

There are 5 different authentication methods that are quite easy to connect with. It is possible that internal APIs use Basic or Windows credentials. Someone will have to provide you a username/password combination for connecting to the service, or, if Windows credentials are used, verify that your own account has been granted access.

If you are connecting with one of these five authentication types, it is pretty straightforward, and you can skip ahead to the end of this post where I show how to traverse the JSON documents that will likely be returned as a result.
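For these simpler cases no custom script is needed; the query can reduce to a single Web.Contents call, with Power BI prompting for and storing the credential. A minimal sketch, using a hypothetical endpoint:

let
    // Power BI prompts for the Basic or Windows credential on first connect
    // and stores it with the data source (the URL here is hypothetical)
    Source = Json.Document(Web.Contents("https://api.contoso.com/v1/orders"))
in
    Source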

Connecting to OAuth Services

Many organizations are moving towards the OAuth authentication protocol. Notice in the screenshot above that this protocol is not natively supported in Power BI. It has an extra layer of complexity because it requires two services: the first is called with an API key and an API secret and returns a token, which is then used with all subsequent requests to the resource URL.

We will get into the weeds rather quickly in the rest of this blog post, so before the half of you who will stop reading at this point do so, please go to the Power BI Ideas site and VOTE for the OAuth protocol to be added as a first-class authentication provider. Here is the link: https://ideas.powerbi.com/forums/265200-power-bi-ideas/suggestions/13195278-oauth

Also, the content below is available in a Power BI Template file here. Read on, but feel free to download the template to save a lot of the grunt work.

Setting up a Twitter App for our OAuth API Example

First of all, Twitter connectivity via Power Query has already been blogged about a few times. The most useful post I found was by Chris Koester here. The script I am using later in this article is nearly identical to what I learned from his post. I will talk through the necessary script, but also focus on how you can traverse the JSON data returned by most modern APIs you will encounter.

Twitter uses OAuth2 for authentication. This will require you to set up a free developer account if you have not already.

Go to https://dev.twitter.com and sign in with your Twitter account. You will then need to create a Twitter application, which will give us the necessary API Key and API Secret to authenticate with the Twitter Search API.

In the new Twitter UI, the applications section is a little hard to find. At the bottom of https://dev.twitter.com are some links; on the right-hand side you will see “Manage Your Apps”.

[Image: Twitter UI]

You will see a “Create New App” button that will take you through a dialog to set up your application.

Callback URL issue with OAuth-based APIs

In the application creation dialog, you will notice a Callback URL entry, which we will leave blank for this example.

[Image: Callback URL field]

In a traditional OAuth flow that has a UI associated with it, this is the URL the flow returns user interaction to after the user agrees to allow the application to act on their behalf. This blog is not meant to go into that detail, but you can review this link for more information.

With OAuth, we have to get an authentication token from the first service we call, and there is an issue with some APIs: they only pass this token back to the Callback URL. Power BI cannot read a response returned to a callback URL, so if the API you are using requires that “authentication flow”, it can’t be used with Power BI (well, that is not entirely true, but to prevent this blog from being 50 pages long, we will say it can’t). For instance, the LinkedIn API requires this flow, so it is not something we can easily connect to from Power BI.

As long as the token is returned in the response from the first authentication service call (as it is with the Twitter API), we will be able to integrate it with Power BI.

Connecting to the Twitter Search API

Once you have entered the data to create an application, you will land on your “App details” screen. Navigate to the “Keys and Access Tokens” tab and copy the API Key and Secret so they can be used later.

Note: These should be treated like passwords and not made public.

[Image: Twitter access tokens]

Back in Power BI, go back to the Web data source and type in an arbitrary web URL, as we will be replacing all of this with a script. I used http://twitter.com and was presented with the navigator dialog to select a table from that page. Click “Edit” to open the Query Editor.

[Image: connecting to an arbitrary table]

This is a good time to take advantage of the Parameters feature added in the April Power BI Desktop update. For deeper insight on this feature, check out the Power BI blog entry.

[Image: Manage Parameters]

Create 4 new parameters as follows:
API KEY: <<your API Key>>
API Secret: <<your API Secret>>
Token URL: https://api.twitter.com/oauth2/token
Search URL: https://api.twitter.com/1.1/search/tweets.json?q=Analytics&count=100

Notice the Search URL has a couple of query string parameters; in my example we will be searching on the word “Analytics”.

Once the parameters are entered, you will see them in the queries list with their values, along with the arbitrary table you pulled in from the Web data source.

[Image: parameters in the Query Editor]

 

Advanced Editor

Now that we have our parameters in place, navigate back to the table query and click “Advanced Editor”.

[Image: Advanced Editor]

Once in the advanced editor, replace the script with the following:
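Here is a minimal sketch of that script, following Chris Koester’s pattern (set the Web data source’s authentication to Anonymous, since the real credentials travel in the request headers):

let
    // Base64-encode "key:secret" for the token request
    AuthKey = "Basic " & Binary.ToText(Text.ToBinary(#"API KEY" & ":" & #"API Secret"), BinaryEncoding.Base64),

    // First call: POST to the token service to obtain a bearer token
    TokenResponse = Json.Document(Web.Contents(#"Token URL",
        [
            Headers = [
                #"Content-Type" = "application/x-www-form-urlencoded;charset=UTF-8",
                Authorization = AuthKey
            ],
            Content = Text.ToBinary("grant_type=client_credentials")
        ])),
    AccessToken = TokenResponse[access_token],

    // Second call: use the bearer token to call the search service
    Source = Json.Document(Web.Contents(#"Search URL",
        [Headers = [Authorization = "Bearer " & AccessToken]]))
in
    Source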

In the above code, there are two calls to the Web.Contents function. The first calls the token service to get a token, which is then used in the second service call to actually retrieve results.

OAuth is a preferred authentication method because the token returned from the first service is only valid for a finite period of time. If you were to manually produce this token and then just call the search service with it, it would no longer be valid the next time you tried to refresh the data.

That is why the script above is so powerful: it handles the token generation so that you have a repeatable, automated process. Kudos to @cjkoester for putting this script together.

A couple of other things to note in the script are the use of our parameters, which are prefixed with the “#” sign, and the fact that not all OAuth services will follow the exact signature above; use this as an example, and ideally your development team can tell you what changes you need to make for the pattern to work.

I renamed the query to “TwitterSearch”. The result should now look similar to this:

[Image: TwitterSearch query in the Query Editor]

 

Working with the JSON response

JSON (JavaScript Object Notation) is the most popular format for transmitting data in modern APIs. XML is also still widely used, and if your APIs return XML, many of these same steps will likely apply.

I am really impressed with the JSON parser that is built into the Power BI Query Editor. For instance, the JSON returned by the Twitter API we just connected to looks like this:

[Image: JSON response via Fiddler]

I can count 6 levels of nesting in the small portion of the JSON response I am showing above (via Fiddler). The Power BI Query Editor handles this with no issues.

We will create two separate queries from this response, one for the statuses and the other for the hashtags.
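Abridged, and with illustrative values in place of real tweets, the shape we care about looks like this: each status carries an entities record, which in turn holds a list of hashtag records.

{
  "statuses": [
    {
      "created_at": "Sun Jul 10 20:01:00 +0000 2016",
      "text": "An example tweet about Analytics",
      "entities": {
        "hashtags": [
          { "text": "Analytics", "indices": [0, 10] }
        ]
      }
    }
  ],
  "search_metadata": {
    "query": "Analytics",
    "count": 100
  }
}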

Taking the TwitterSearch result, the two top-level entities are “statuses” and “search_metadata”. Click the “List” value for “statuses”, and this will expand the record set for “statuses”.

[Image: Query Editor To Table option]

Notice the “To Table” option at the top; by changing this list to a table, we will be able to expand each Record to all its values. Click “Ok” in the To Table dialog and you will see the magic expand button appear in the column header.

[Image: Query Editor expand button]

Click the button, then uncheck “Use original column name as prefix” to avoid “Column1” showing up as a prefix to every column created.

[Image: Query Editor column selection]

Click Ok.

Congratulations! You now have the last 100 statuses for the search term “Analytics” that we used in the search URL string we entered as the parameter above: https://api.twitter.com/1.1/search/tweets.json?q=Analytics&count=100
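For reference, the clicks above generate M roughly like the sketch below, appended inside the TwitterSearch query before its “in” clause (the expanded field names are assumptions; pick whichever columns you see):

    // "statuses" is a list of records, one per tweet
    statuses = Source[statuses],
    // "To Table" wraps the list in a single-column table
    #"Converted to Table" = Table.FromList(statuses, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    // the magic expand button turns each record's fields into columns
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1",
        {"created_at", "text", "user", "entities"})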

Let’s use this same result to create an additional query for hashtags. Simply right-click the “TwitterSearch” query and select “Duplicate”.

[Image: Query Editor Duplicate]

Rename the query “Hashtags”.

Find the “Entities” column and click the magic expand button again.

[Image: Query Editor expand entities]

On the column expansion dialog simply click “Ok” again to see all the results of the nested JSON document.

[Image: Query Editor expand hashtags]

Click the magic expand button again on “hashtags”.

This will bring in the null values and the “List” links will become “Record” links.

Finally click the magic expand button one last time to be greeted by the hashtag records we wanted.

[Image: Query Editor hashtags expanded]

Right click “text.1” and rename it “hashtag”.

Now we can use the “Choose Columns” dialog to select only a few of the columns, reducing our query to just the information important to the hashtags.

[Image: Choose Columns dialog]

From here you can “Close and Apply” the Query Editor window, and you will have two queries: one for the statuses of your Twitter search and another for the hashtags that you can cross-filter against.

A simple viz may look like this:

[Image: sample visualization]

Power BI Template

Another new feature in the April update of Power BI is the Power BI Template file. I have saved all of the above work in a template file, created by clicking the Export option from the File menu, and I am sharing it.

[Image: template export screen]

The template retains all of the work completed above but none of the data, so when you open the template file, it will prompt you for the parameters discussed above.

Here is the template file link again: Angry Analytics Twitter API Example

Conclusion

In closing, this article was intended to show you multiple ways to connect to APIs, as it is likely your organization has set standards around data connectivity that no longer allow connecting directly to the source.

Be an advocate for a Data Culture. Putting data in the hands of more people as quickly as possible will result in the ability to achieve more. Challenge your development teams to make data accessible. Data is currency!

Power BI Content Workflow

I work with many enterprise-level organizations in the Midwestern United States. There is a lot of excitement around Power BI; however, I often run into challenges regarding features expected of a business intelligence solution that can meet the needs of an enterprise, such as governance, administration, and lifecycle management.

[Image: pendulum]

In business intelligence, there is a pendulum that swings between self-service and governance, and it wreaks havoc on most organizations of considerable size. I can confidently say there is little doubt that, on end-user features, Power BI competes well with any solution on the market. With a FREE offering and a PRO offering at only $9.99 a month per user, the price point is hard to beat. But what about total cost of ownership (TCO)? What about enterprise deployments of Power BI? I intend to use this blog to help answer these questions regarding Power BI, SSAS Tabular models, and an overall agile approach to business intelligence within the enterprise.

Let’s start with discussing lifecycle management and taking reports through development, testing, and production similar to how you would deploy a codebase for an application.

[Image: Power BI cheapskate]

First of all, we need to ensure that we are working with the PRO version of Power BI. Yes, if you want enterprise-level features, you are going to have to pay for them. Two PRO features are used below: Power BI Group Workspaces and Content Packs.

Group Workspaces in Power BI allow multiple users to “co-author” reports and share ownership of the Power BI content. You can easily create a group workspace from the left-hand navigation bar.

[Image: creating a Power BI group]

Once a group has been created, you can assign users either “Edit” or “View Only” access. This is an important distinction between the users who will be “co-authoring” content and those who will be testing and doing quality assurance.

[Image: adding a user to a group]

 

Once a group has been established with a distinction between the content developers and the testers, the diagram below can be followed to achieve a content management workflow.

[Image: Power BI content management workflow]

This diagram is also provided in PDF form below:
Power BI Content Workflow

In this workflow, OneDrive for Business (ODFB) is used as a collaboration space for PBIX files. Alternatively, this could also be a source control environment or a local file share. A benefit of O365 deployments for Power BI is that when a group is created, an ODFB directory is also established for sharing.

The group workspace is used for collaboration between content authors and testers. Once development and testing iterations are complete, a Content Pack should be produced to distribute the production version of the dashboard, reports, and datasets to the rest of the organization. Content Packs are created by going to “Get Data” -> “My Organization” -> “Create Content Pack”.

[Image: creating a content pack]

IMPORTANT: It is critical in an enterprise scenario that personal workspaces are not used to publish content packs. Doing so creates a dependency on a single author having full control over content, and scenarios will consistently pop up in which that author is not available when updates are needed. Enterprise deployment scenarios can’t be achieved when content is tied to a personal workspace. Establish a group workspace structure that makes sense for your organization, as pointed out in the Pre-Requisite text on the workflow diagram.

Making Revisions

As revisions occur to the content in the group workspace, a notification appears in the upper right reminding you that there is a content pack associated with the dashboard or report being modified, and that an update needs to be applied to the content pack to publish the changes.

[Image: content pack change notification]

I have talked with some organizations that originally considered bypassing content packs and just using group workspaces, with “View Only” users receiving the content. This would not allow for iterating on changes before they are published to the end users.

Once the iterations have been completed and the next revision is ready to publish, anyone with “Edit” privileges in the group can go to Settings (the gear icon in the upper right) and “View Content Packs”.

[Image: View Content Packs]

A warning icon is displayed next to the content pack name, indicating there are pending changes. Once you click “Edit”, you will have the option to “Update” the content pack, and the changes will be distributed to those that have access to it.

[Image: content pack Update button]

Connection Strings and Separation of Duties (SoD)

The above workflow still leaves a couple of desired features without answers.

With group workspaces, we have separated the ability to develop and test report changes from publishing them to the rest of the organization, but how do we transition from a DEV/TEST data source to a PROD data source? This has to be achieved within the Power BI Desktop file (as of the time this blog post was written). If a new calculation or set of tables is being tested, the development team and the testers will have to collaborate on when to change the connection strings from DEV/TEST to PROD sources and republish the PBIX file.
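One way to soften this today is to drive the connection through a query parameter, so that only a single value changes between environments. A minimal sketch, assuming a SQL Server source with hypothetical server, database, and table names:

let
    // Server is a text parameter, e.g. "testsql01" or "prodsql01" (hypothetical names)
    Source = Sql.Database(#"Server", "SalesDW"),
    // Navigate to a fact table in the chosen environment
    Orders = Source{[Schema = "dbo", Item = "FactOrders"]}[Data]
in
    Orders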

In my experience, on the reporting and analytics side of things, production connectivity is often required by the “power users” in each department who are doing the quality assurance, as they can only tell whether new report features pass the “sniff test” when connected to data they know. DEV/TEST environments often grow stale and don’t hold as much value on the reporting/analytics side as they do for application development. Connection string management will be enhanced in Power BI as the product matures, but even then, limited value will be achieved without production data to verify changes.

Lastly, a group workspace has only an “Edit” and a “View” role and lacks a “Publish” role, so it may be of concern that a content author can push changes to content packs without separation of duty (which may require someone else to actually publish the change). If this is critical to your business, separation of duty can be achieved by adding another group workspace that only publishers have access to, but I would suggest this is often overkill. Because reporting cannot “alter” any underlying data, it does not often need controls as strict as those protecting a transactional application from malicious code being deployed.

And remember, a significant portion of any enterprise organization is still running its business from Excel spreadsheets produced and managed by the “power users” in each department. They simply extract data from source systems and create Excel reports that often go all the way to the CFO or CEO of the organization. So, coming back to the initial comment in this blog regarding the pendulum of self-service vs governance, remember that these concerns probably already exist in your organization today, with much less visibility than they would have when using the proposed workflow above.

 

Feel free to leave any comments on this article or suggestions for future articles that touch on the use of Microsoft BI solutions such as Power BI for enterprise deployment.