Power BI Connect Data Part2
artifacts to the corresponding data sources by using the connection details and
credentials you provided. Make sure you only share connections (and their
credentials) that you're authorized to share.
Every user is limited to a maximum of 1,000 data source connections per cloud
tenant. If you reach this limit, you can manually remove existing data sources from
the admin center or, alternatively, use the following PowerShell script to find and
bulk-delete the data sources that exceed that limit.
PowerShell

# This script assumes you've already signed in with Connect-PowerBIServiceAccount,
# set $environment to your cloud, and populated $datasources with the response
# from the data sources REST API (retrieval step not shown here).
switch ($environment) {
    "Public"    { $baseURL = "https://api.powerbi.com/v2.0/myorg/me/"; Break }
    "USGov"     { $baseURL = "https://api.powerbigov.us/v2.0/myorg/me/"; Break }
    "China"     { $baseURL = "https://api.powerbi.cn/v2.0/myorg/me/"; Break }
    "USGovHigh" { $baseURL = "https://api.high.powerbigov.us/v2.0/myorg/me/"; Break }
    "USGovMil"  { $baseURL = "https://api.mil.powerbigov.us/v2.0/myorg/me/"; Break }
}

foreach ($datasource in $datasources.value)
{
    if ($datasource.gatewayType -eq "TenantCloud")
    {
        "cloud datasource found with id = {0}, name = {1}" -f $datasource.id, $datasource.datasourceName
        $gatewayId = $datasource.clusterId
        $datasourceId = $datasource.id
        $deleteDatasourceURL = $baseURL + "gatewayClusters/$gatewayId/datasources/$datasourceId"
        Invoke-PowerBIRestMethod -Url $deleteDatasourceURL -Method DELETE
    }
}
If you're an ISV or any other Power BI Embedded app owner with many customers,
use service principal profiles for multi-tenancy apps in Power BI Embedded. If
you're not an ISV, you might reach this limit because a new data source is created
for every CSV or Excel file you upload. Using the upload file box in Power BI
Desktop to select multiple Excel files creates a separate data source connection
for each file. To ensure that only a single data source is created, we recommend
that you instead select the folder containing those Excel files.
You can't mix an Excel on-premises data source with an existing Analysis Services
DirectQuery data source; you can only include an Excel on-premises data source in
your report if it's in a separate query. In such situations, you can map the Excel
data source to a gateway, and leave the Analysis Services DirectQuery cloud data
source as-is.
Power BI Dataflow Gen1 and Fabric Dataflow Gen2 don't support sharable cloud
connections. Other versions, like Power Apps dataflows, do support sharable cloud
connections.
Related content
For more information about creating shareable cloud connections:
You can do all sorts of things with the Power BI service and Power BI Desktop. For more
information on their capabilities, check out the following resources:
Create and share cloud data sources in
the Power BI service
Article • 05/03/2024
With Power BI, you can create, share, and manage cloud connections for semantic
models and paginated reports, datamarts, and dataflows, as well as Power Query Online
experiences in Get data, all within the Power BI service user experience.
This article shows you how to create a shareable cloud connection, and then how to
share that connection with others. Creating and sharing shareable cloud
connections has many advantages, as described in advantages of shareable cloud
connections.
In the window that appears, select New connection, and then in the pane that
opens, select Cloud.
Enter a name for the new connection, select the appropriate connection type from the
drop-down list, and provide the connection details for your data source. Once you've
filled in the information, select Create.
Note
When a .PBIX file with a cloud data source is published from Power BI Desktop, a
cloud connection is created automatically.
The Manage users window appears, where you can search users by name or by their
email address, and then grant them the permission level you want them to have. You
must at least grant User permission to allow users to connect their artifacts to the
connection's data source.
Once you've found the user and assigned permission, select Share at the bottom of the
Manage users window to apply your selections.
Open the settings for the semantic model to which you want the shareable connection
to apply, and expand the Gateway and cloud connections section. You'll notice that the
connection is mapped to a Personal Cloud Connection by default.
From the Maps to drop-down, select the name of the shareable connection you created
and want to use, then select Apply.
That's it, you've now assigned your shareable cloud connection to the semantic model.
If you haven't yet created a shareable cloud connection when you're using this screen,
you can select the Create a connection option from the drop-down to be taken to the
Manage connections and gateways experience. All the connection details from the
data source for which you selected Create a connection are prepopulated in the
Create new cloud connection form.
If a tenant admin enables granular access control for all connection types, then granular
access control is enforced for the entire organization. Workspace admins and artifact
owners can't overrule granular access control enabled at the tenant level.
If granular access control isn't enforced at the tenant level, workspace admins can
enforce granular access control for their workspaces. And if workspace admins don’t
enforce granular access control, then artifact owners can decide whether to enforce
granular access control for each of their artifacts independently.
By default, granular access control is disabled at all three levels, enabling individual
artifact owners to enforce granular access control for each data connection type
selectively. However, it's likely more efficient to enable granular access control on a
workspace-by-workspace basis.
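As a rough illustration of the precedence just described (a hypothetical helper, not a Power BI API), enforcement at a higher level always wins, and lower levels only matter when every level above them leaves the setting disabled:

```python
# Hypothetical sketch of the granular access control rules described above:
# a tenant-level setting can't be overruled, a workspace setting applies when
# the tenant leaves it disabled, and artifact owners decide only when neither
# level above them enforces it.
def granular_access_enforced(tenant_enabled: bool,
                             workspace_enabled: bool,
                             artifact_enabled: bool) -> bool:
    """Return True if granular access control applies to an artifact."""
    if tenant_enabled:
        return True        # tenant-level enforcement can't be overruled
    if workspace_enabled:
        return True        # workspace admins may enforce it for their workspace
    return artifact_enabled  # otherwise each artifact owner decides
```

By default all three levels are disabled, so nothing is enforced until some level turns it on.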
Related content
For important information about shareable cloud connections, including limitations and
considerations, read the following article:
Data sources in Power BI Desktop
Article • 11/12/2024
With Power BI Desktop, you can connect to data from many different sources. For a full
list of available data sources, see Power BI data sources.
To see available data sources, in the Home group of the Power BI Desktop ribbon, select
the Get data button label or down arrow to open the Common data sources list. If the
data source you want isn't listed under Common data sources, select More to open the
Get Data dialog box.
Or, open the Get Data dialog box directly by selecting the Get data icon itself.
This article provides an overview of the available data sources in Power BI Desktop and
explains how to connect to them. It also describes how to export or use data sources as
PBIDS files to make it easier to build new reports from the same data.
Note
The Power BI team is continually expanding the data sources available to Power BI
Desktop and the Power BI service. As such, you'll often see early versions of
work-in-progress data sources marked as Beta or Preview. Any data source marked as
Beta or Preview has limited support and functionality, and it shouldn't be used in
production environments. Additionally, any data source marked as Beta or Preview
for Power BI Desktop may not be available for use in the Power BI service or other
Microsoft services until the data source becomes generally available (GA).
Data sources
The Get Data dialog box organizes data types in the following categories:
All
File
Database
Microsoft Fabric
Power Platform
Azure
Online Services
Other
The All category includes all data connection types from all categories.
The File category provides the following data connections:
Excel Workbook
Text/CSV
XML
JSON
Folder
PDF
Parquet
SharePoint folder
Note

Some database connectors require that you enable them by selecting File >
Options and settings > Options, then selecting Preview features and enabling the
connector. If you don't see some of the connectors mentioned previously and want
to use them, check your Preview features settings. Also note that any data source
marked as Beta or Preview has limited support and functionality, and shouldn't be
used in production environments.
Microsoft Fabric
The Microsoft Fabric category provides the following data connections:
Dynamics 365 Customer Insights (Beta)
Databricks
Digital Construction Works Insights
Emigo Data Source
Entersoft Business Suite (Beta)
eWay-CRM
FactSet Analytics
Palantir Foundry
Hexagon PPM Smart® API
Industrial App Store
Planview OKR (beta)
Planview ProjectPlace
Quickbase
SoftOne BI (Beta)
Planview IdeaPlace
TeamDesk (beta)
Webtrends Analytics (Beta)
Witivio (Beta)
Zoho Creator
Automation Anywhere
CData Connect Cloud
Dynamics 365 Customer Insights (beta)
Databricks
Funnel
Intune Data Warehouse (Beta)
LEAP (Beta)
LinkedIn Learning
Product Insights (Beta)
Profisee
Samsara (Beta)
Supermetrics (beta)
Viva Insights
Zendesk (Beta)
BuildingConnected & TradeTapp (beta)
Smartsheet (Beta)
Web
SharePoint list
OData Feed
Active Directory
Microsoft Exchange
Hadoop File (HDFS)
Spark
Hive LLAP
R script
Python script
ODBC
OLE DB
Acterys : Model Automation & Planning (Beta)
Amazon OpenSearch Service (Beta)
Anaplan
Solver
Bloomberg Data and Analytics
Celonis EMS
Cherwell (Beta)
CloudBluePSA (Beta)
Cognite Data Fusion
EQuIS
FactSet RMS (Beta)
inwink (Beta)
Kognitwin
MicroStrategy for Power BI
OneStream (Beta)
OpenSearch Project (Beta)
Paxata
QubolePresto (Beta)
Roamler (Beta)
SIS-CC SDMX (Beta)
Shortcuts Business Insights (Beta)
Starburst Enterprise
SumTotal
SurveyMonkey
Tenforce (Smart)List
Usercube (Beta)
Vena
Vessel Insight
Wrike (Beta)
Zucchetti HR Infinity (Beta)
BitSight Security Ratings
BQE CORE
Wolters Kluwer CCH Tagetik
Delta Sharing
Eduframe (Beta)
FHIR
Google Sheets
InformationGrid
Jamf Pro (Beta)
SingleStore Direct Query Connector
Siteimprove
SolarWinds Service Desk
Microsoft Teams Personal Analytics (Beta)
Windsor (beta)
Blank Query
Note
At this time, it's not possible to connect to custom data sources secured using
Microsoft Entra ID.
Template apps
You can find template apps for your organization by selecting the Template Apps link
near the bottom of the Get data window.
Available Template Apps may vary based on your organization.
2. A connection window appears. Enter the URL or resource connection information,
and then select OK. The following screenshot shows a URL entered in the From
Web connection dialog box.
4. Select the tables and other data that you want to load. To load the data, select the
Load button at the bottom of the Navigator pane. To transform or edit the query
in Power Query Editor before loading the data, select the Transform Data button.
Connecting to data sources in Power BI Desktop is that easy. Try connecting to data
from our growing list of data sources, and check back often. We continue to add to this
list all the time.
You can create a PBIDS file to streamline the Get Data experience for new or beginner
report creators in your organization. If you create the PBIDS file from existing reports,
it's easier for beginning report authors to build new reports from the same data.
When an author opens a PBIDS file, Power BI Desktop prompts the user for credentials
to authenticate and connect to the data source that the file specifies. The Navigator
dialog box appears, and the user must select the tables from that data source to load
into the model. Users might also need to select the database and connection mode if
none was specified in the PBIDS file.
From that point forward, the user can begin building visualizations or select Recent
Sources to load a new set of tables into the model.
Currently, PBIDS files only support a single data source in one file. Specifying more than
one data source results in an error.
1. To create the PBIDS file, select File > Options and settings > Data source settings.
2. In the dialog that appears, select the data source you want to export as a PBIDS
file, and then select Export PBIDS.
3. In the Save As dialog box, give the file a name, and select Save. Power BI Desktop
generates the PBIDS file, which you can rename and save in your directory, and
share with others.
You can also open the file in a text editor, and modify the file further, including
specifying the mode of connection in the file itself. The following image shows a PBIDS
file open in a text editor.
If you prefer to manually create your PBIDS files in a text editor, you must specify the
required inputs for a single connection and save the file with the .pbids extension.
Optionally, you can also specify the connection mode as either DirectQuery or Import. If
mode is missing or null in the file, the user who opens the file in Power BI Desktop is
prompted to choose a mode.
Important
Some data sources will generate an error if columns are encrypted in the data
source. For example, if two or more columns in an Azure SQL Database are
encrypted during an Import action, an error will be returned. For more information,
see SQL Database.
The PBIDS file doesn't include authentication information, table information, or
schema information.
The following code snippets show several common examples for PBIDS files, but they
aren't complete or comprehensive. For other data sources, you can refer to the Data
Source Reference (DSR) format for protocol and address information.
If you're editing or manually creating the connection files, these examples are for
convenience only, aren't meant to be comprehensive, and don't include all supported
connectors in DSR format.
Azure AS
JSON
{
    "version": "0.1",
    "connections": [
        {
            "details": {
                "protocol": "analysis-services",
                "address": {
                    "server": "server-here"
                }
            }
        }
    ]
}
Folder
JSON
{
    "version": "0.1",
    "connections": [
        {
            "details": {
                "protocol": "folder",
                "address": {
                    "path": "folder-path-here"
                }
            }
        }
    ]
}
OData
JSON
{
    "version": "0.1",
    "connections": [
        {
            "details": {
                "protocol": "odata",
                "address": {
                    "url": "URL-here"
                }
            }
        }
    ]
}
SAP BW
JSON
{
    "version": "0.1",
    "connections": [
        {
            "details": {
                "protocol": "sap-bw-olap",
                "address": {
                    "server": "server-name-here",
                    "systemNumber": "system-number-here",
                    "clientId": "client-id-here"
                }
            }
        }
    ]
}
SAP HANA
JSON
{
    "version": "0.1",
    "connections": [
        {
            "details": {
                "protocol": "sap-hana-sql",
                "address": {
                    "server": "server-name-here:port-here"
                }
            }
        }
    ]
}
SharePoint list
The URL must point to the SharePoint site itself, not to a list within the site. Users get a
navigator that allows them to select one or more lists from that site, each of which
becomes a table in the model.
JSON
{
    "version": "0.1",
    "connections": [
        {
            "details": {
                "protocol": "sharepoint-list",
                "address": {
                    "url": "URL-here"
                }
            }
        }
    ]
}
SQL Server
JSON
{
    "version": "0.1",
    "connections": [
        {
            "details": {
                "protocol": "tds",
                "address": {
                    "server": "server-name-here",
                    "database": "db-name-here (optional)"
                }
            },
            "options": {},
            "mode": "DirectQuery"
        }
    ]
}
Text file
JSON
{
    "version": "0.1",
    "connections": [
        {
            "details": {
                "protocol": "file",
                "address": {
                    "path": "path-here"
                }
            }
        }
    ]
}
Web
JSON
{
    "version": "0.1",
    "connections": [
        {
            "details": {
                "protocol": "http",
                "address": {
                    "url": "URL-here"
                }
            }
        }
    ]
}
Dataflow
JSON
{
    "version": "0.1",
    "connections": [
        {
            "details": {
                "protocol": "powerbi-dataflows",
                "address": {
                    "workspace": "workspace id (Guid)",
                    "dataflow": "optional dataflow id (Guid)",
                    "entity": "optional entity name"
                }
            }
        }
    ]
}
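Because a PBIDS file is plain JSON, you can also generate one from a script. The following Python sketch is for illustration only (the file name, server, and database values are placeholders, and the helper isn't part of any Power BI tooling); it writes a minimal single-connection file shaped like the snippets above:

```python
# Illustrative sketch: build a minimal PBIDS document matching the examples
# above. File name and address values are placeholders.
import json

def write_pbids(path, protocol, address, mode=None):
    """Write a single-connection PBIDS file and return the document."""
    connection = {"details": {"protocol": protocol, "address": address}}
    if mode is not None:
        connection["mode"] = mode  # optionally "DirectQuery" or "Import"
    doc = {"version": "0.1", "connections": [connection]}  # one data source only
    with open(path, "w", encoding="utf-8") as f:
        json.dump(doc, f, indent=4)
    return doc

# Example: a SQL Server connection in DirectQuery mode.
pbids = write_pbids(
    "sales.pbids",  # hypothetical file name
    protocol="tds",
    address={"server": "server-name-here", "database": "db-name-here"},
    mode="DirectQuery",
)
```

Because PBIDS supports only a single data source per file, the sketch always emits exactly one entry in connections.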
Related content
You can do all sorts of things with Power BI Desktop. For more information on its
capabilities, check out the following resources:
Dynamic M query parameters in Power BI Desktop
This article describes how to create and work with dynamic M query parameters in
Power BI Desktop. With dynamic M query parameters, model authors can configure the
filter or slicer values that report viewers can use for an M query parameter. Dynamic M
query parameters give model authors more control over the filter selections to
incorporate into DirectQuery source queries.
Model authors understand the intended semantics of their filters, and often know how
to write efficient queries against their data source. With dynamic M query parameters,
model authors can ensure that filter selections incorporate into source queries at the
right point to achieve the intended results with optimum performance. Dynamic M
query parameters can be especially useful for query performance optimization.
Watch Sujata explain and use dynamic M query parameters in the following video, and
then try them out yourself.
Note
This video might use earlier versions of Power BI Desktop or the Power BI service.
https://www.microsoft.com/en-us/videoplayer/embed/RE4QLsb?postJsllMsg=true
Prerequisites
To work through these procedures, you must have a valid M query that uses one or
more DirectQuery tables.
Add parameters
1. In Power BI Desktop, select Home > Transform data > Transform data to open the
Power Query Editor.
2. In the Power Query Editor, select New Parameters under Manage Parameters in
the ribbon.
3. In the Manage Parameters window, fill out the information about the parameter.
For more information, see Creating a parameter.
5. When you're done adding parameters, select OK.
For a Date/Time parameter, you generate the possible inputs to dynamically set the
date for the parameter.
2. Create a table for the values of the StartTime parameter, for example:
3. Create a second table for the values of the EndTime parameter, for example:
Note
Use a column name that's not in an actual table. If you use the same name as
an actual table column, the selected value applies as a filter in the query.
1. To bind a field, in the Power BI Desktop Model view, select the newly created field,
and in the Properties pane, select Advanced.
Note
The column data type should match the M parameter data type.
2. Select the dropdown under Bind to parameter and select the parameter that you
want to bind to the field:
Since this example is for setting the parameter to a single value, keep Multi-select
set to No, which is the default:
If you set the mapped column to No for Multi-select, you must use a single select
mode in the slicer, or require single select in the filter card.
If your use cases require passing multiple values to a single parameter, set the
control to Yes and make sure your M query is set up to accept multiple values.
Here's an example for RepoNameParameter, which allows multiple values:
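The original M example isn't reproduced here. As an illustration only (in Python rather than M, with a hypothetical helper name), the essential idea is that the query logic must accept either a single value or a list of values:

```python
# Illustration in Python rather than M: when Multi-select is set to Yes, the
# bound parameter may arrive as a single value or as a list, so the query
# logic (here, a hypothetical normalize_parameter helper) must handle both.
def normalize_parameter(value):
    """Return the selected values as a list, whether one or many were passed."""
    if isinstance(value, list):
        return value
    return [value]

# e.g. slicer selections for RepoNameParameter:
single = normalize_parameter("repo-a")
several = normalize_parameter(["repo-a", "repo-b"])
```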
3. Repeat these steps if you have other fields to bind to other parameters.
You can now reference this field in a slicer or as a filter:
Enable Select all
In this example, the Power BI Desktop model has a field called Country, which is a list of
countries/regions bound to an M parameter called countryNameMParameter. This
parameter is enabled for Multi-select, but isn't enabled for Select all. To be able to use
the Select all option in a slicer or filter card, take the following added steps:
To enable Select all for Country:
1. In the Advanced properties for Country, enable the Select all toggle, which
enables the Select all value input. Edit the Select all value or note the default
value.
The Select all value passes to the parameter as a list that contains the value you
defined. Therefore, when you define this value or use the default value, make sure
the value is unique and doesn't exist in the field that's bound to the parameter.
2. Launch the Power Query Editor, select the query, and then select Advanced Editor.
Edit the M query to use the Select all value to refer to the Select all option.
3. In the Advanced Editor, add a Boolean expression that evaluates to true if the
parameter is enabled for Multi-select and contains the Select all value, and
otherwise returns false:
4. Incorporate the result of the Select all Boolean expression into the source query.
The example has a Boolean query parameter in the source query called
includeAllCountries that is set to the result of the Boolean expression from the
previous step. You can use this parameter in a filter clause in the query, such that
false for the Boolean filters the results to the selected country or region names,
and true includes all countries.
5. Once you update your M query to account for the new Select all value, you can
use the Select all function in slicers or filters.
For reference, here's the full query for the preceding example:
Kusto
let
    selectedcountryNames = if Type.Is(Value.Type(countryNameMParameter), List.Type) then
            Text.Combine({"'", Text.Combine(countryNameMParameter, "','"), "'"})
        else
            Text.Combine({"'", countryNameMParameter, "'"}),

    selectAllCountries = if Type.Is(Value.Type(countryNameMParameter), List.Type) then
            List.Contains(countryNameMParameter, "__SelectAll__")
        else
            false,

    ActualQueryWithKustoParameters =
        "Covid19
        | where includeAllCountries or Country in(countryNames)
        | where Timestamp > startTimep and Timestamp < endTimep
        | summarize sum(Confirmed) by Country, bin(Timestamp, 30d)",

    finalQuery = Text.Combine({KustoParametersDeclareQuery, ActualQueryWithKustoParameters}),
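For illustration only, the quoting and Select all checks in the M query above behave like the following Python sketch (the function names are hypothetical; the __SelectAll__ sentinel is the example's Select all value):

```python
# Python mirror of the M logic above, for illustration only. quote_names builds
# the comma-separated single-quoted list used inside the Kusto in() clause, and
# select_all_chosen detects the Select all sentinel value.
SELECT_ALL = "__SelectAll__"  # the unique Select all value defined earlier

def quote_names(param):
    """Single value or list -> a string like 'a','b','c' for use in in()."""
    values = param if isinstance(param, list) else [param]
    return "'" + "','".join(values) + "'"

def select_all_chosen(param):
    """True only when the parameter is a list containing the sentinel."""
    return isinstance(param, list) and SELECT_ALL in param
```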
Kusto
Products
| where Category == [Parameter inserted here] & HasReleased == 'True'
| project ReleaseDate, Name, Category, Region
There are no issues with a friendly user who passes an appropriate value for the
parameter, for example, Games:
However, an attacker might be able to pass a value that modifies the query to get
access to more data, for example, 'Games'//:
Products
| where Category == 'Games'// & HasReleased == 'True'
| project ReleaseDate, Name, Category, Region
In this example, the attacker can get access to information about games that haven't
been released yet by changing part of the query into a comment.
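To make the risk concrete, here's a small Python sketch (a hypothetical build_query helper, not a real connector API) showing how naive string concatenation lets the trailing // turn the rest of the filter line into a comment:

```python
# Hypothetical build_query helper: it interpolates the "parameter" straight
# into the Kusto query text, which is exactly the unsafe pattern described
# above. In Kusto, // starts a comment, so the rest of the line is ignored.
def build_query(category_value: str) -> str:
    return ("Products\n"
            "| where Category == " + category_value + " & HasReleased == 'True'\n"
            "| project ReleaseDate, Name, Category, Region")

friendly = build_query("'Games'")   # the HasReleased filter stays intact
attack = build_query("'Games'//")   # // comments out the HasReleased condition
```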
If a data source supports importing stored procedures, consider storing your query logic
there and invoking it in the M query. Alternatively, if available, use a parameter-passing
mechanism that's built in to the source query language and connectors. For example,
Azure Data Explorer has built-in query parameter capabilities that are designed to
protect against injection attacks.
For example, you can declare the parameter in the source query, or pass the
parameter value as an input to a source query function.
Unsupported parameter types
Any
Duration
True/False
Binary
Unsupported filters
Relative time slicer or filter
Relative date
Hierarchy slicer
Multifield include filter
Exclude filters / Not filters
Cross highlighting
Drill-down filter
Cross-drill filter
Top N filter
Unsupported operations
And
Contains
Less than
Greater than
Starts with
Does not start with
Is not
Does not contain
Is blank
Is not blank
Related content
For more information about Power BI Desktop capabilities, check out the following
resources:
DirectQuery in Power BI
What is Power BI Desktop?
Query overview in Power BI Desktop
Data types in Power BI Desktop
Tutorial: Shape and combine data in Power BI Desktop
Perform common query tasks in Power BI Desktop
Create a Power BI semantic model
directly from Log Analytics
Article • 11/10/2023
You can quickly create a Power BI semantic model directly from a Log Analytics query.
The semantic model will be a full-fledged Power BI semantic model that you can use
to create reports, analyze in Excel, and more.
Creating a semantic model directly from a Log Analytics query is an easy and quick way
to share a semantic model, because if you save it to a shared workspace, everyone with
the sufficient permissions in the workspace can use it. You can also use semantic model
sharing to share it with other users who don’t have a role in the workspace.
This feature creates a semantic model in the Power BI service directly from a Log
Analytics query. If you need to model or transform the data in ways that aren't available
in the service, you can also export the query from Log Analytics, paste it into Power BI
Desktop, and do your advanced modeling there. For more information, see Create
Power BI semantic models and reports from Log Analytics queries.
Prerequisites
You must have a Power BI account to be able to use this functionality.
1. Open and run the Log Analytics query you want to use to create the Power BI
semantic model.
3. Power BI will open and a dialog will ask you to name the semantic model and
choose a workspace to save it in. By default the semantic model will be given the
same name as the query and saved to My workspace. You can choose your own
name and destination workspace. If you're a free user in Power BI, you'll only be
able to save to My workspace.
The dialog also shows the URL of the Log Analytics data source. To prevent
inadvertently exposing sensitive data, make sure that you recognize the data
source and are familiar with the data. Select Review data if you want to check the
Log Analytic query results before allowing export to continue. For more
information about when reviewing the data might be a good idea, see Reviewing
the Log Analytics data.
4. Select Continue. Your semantic model will be created, and you'll be taken to the
details page of the new semantic model. From there you can do all the things you
can do with a regular Power BI semantic model - refresh the data, share the
semantic model, create new reports, and more. See semantic model details for
more information.
Note

For more information about deciding which credentials to choose, see Choosing
which credentials to authenticate with.
To keep the data fresh after you've created the semantic model, either refresh the data
manually or set up scheduled refresh.
Reviewing the data is important if you weren't the one who exported the Log Analytics
data, but rather received a link from someone for creating a semantic model from Log
Analytics. In such a case, you might not be familiar with the data that is being exported,
and hence it's important to review it to make sure that no sensitive data is inadvertently
being exposed.
If you get the following dialog, it means that you've already established a connection to
Log Analytics in the past. The credentials you used at that time may or may not be
different than the credentials of your current sign in. You need to choose whether to
continue using the sign-in details you used the last time you connected (The credentials
I used to connect to Power BI last time), or whether the connection should use your
current sign-in credentials from now on (My current credentials (these may be the same
or different)).
The Power BI view of the Log Analytics data is determined by the permissions of the
account used to establish the Power BI connection to the Log Analytics data source.
If you let Power BI use the sign-in details you used last time for the connection, the data
you'll see in the semantic model you're creating may differ from what you see in Log
Analytics. This is because the data that is shown in the semantic model is what the
account with the credentials you used last time can see in Log Analytics.
If you replace the credentials you used last time with your current sign-in credentials,
the data you see in the semantic model you're creating will be exactly the same as what
you see in Log Analytics. However, since the connection now uses your current login
credentials, views of the data in semantic models you might have created previously
from that Log Analytics query might also change, and this could affect reports and
other downstream items that users might have created based on those semantic
models.
Take the above considerations into account when you make your choice.
If you've never previously connected to Log Analytics from Power BI, Power BI will
automatically use your current credentials to establish the connection, and you won't
see this dialog.
Next steps
Log Analytics integration with Power BI
Semantic model details
Share access to a semantic model
Create a Power BI semantic model
directly from a SharePoint list
Article • 11/10/2023
You can quickly create a Power BI semantic model directly from a SharePoint list. The
semantic model will be a full-fledged Power BI semantic model that you can use to
create reports, analyze in Excel, and more.
Creating a semantic model directly from a SharePoint list is an easy and quick way to
share a semantic model, because if you save it to a shared workspace, everyone with the
sufficient permissions in the workspace can use it. You can also use semantic model
sharing to share it with other users who don’t have a role in the workspace.
To keep the data fresh after you've created the semantic model, either refresh the data
manually or set up scheduled refresh.
This feature creates a semantic model in the Power BI service directly from a SharePoint
list. If you need to model or transform the data in ways that aren't available in the
service, you can also connect to the SharePoint list from Power BI Desktop. For more
information, see Create a report on a SharePoint List in Power BI Desktop.
Prerequisites
You must have a Power BI account to be able to use this functionality.
3. Power BI will open and a dialog will ask you to name the semantic model and
choose a workspace to save it in. By default the semantic model will be given the
same name as the SharePoint list and saved to My workspace. You can choose your
own name and destination workspace. If you're a free user in Power BI, you'll only
be able to save to My workspace.
The dialog also shows the URL of the data source (SharePoint site) and name of
the SharePoint list. To prevent inadvertently exposing sensitive data, make sure
that you recognize the data source and are familiar with the data. Select Review
data if you want to check the SharePoint list before allowing export to continue.
For more information about when reviewing the data might be a good idea, see
Reviewing the SharePoint list data.
4. Select Continue. Your semantic model will be created, and you'll be taken to the
details page of the new semantic model. From there you can do all the things you
can do with a regular Power BI semantic model - refresh the data, share the
semantic model, create new reports, and more. See semantic model details for
more information.
Note
To keep the data fresh after you've created the semantic model, either refresh the data
manually or set up scheduled refresh.
Reviewing the data is important if you weren't the one who exported the SharePoint list,
but rather received a link from someone for creating a semantic model from a
SharePoint list. In such a case, you might not be familiar with the data that is being
exported, and hence it's important to review it to make sure that no sensitive data is
inadvertently being exposed.
Choosing which credentials to authenticate
with
When you export a SharePoint list to Power BI, Power BI connects to the SharePoint site
to get the data from the list. In order to connect, it needs to authenticate with
SharePoint.
If you get the following dialog, it means that you've already established a connection to
the SharePoint site in the past. The credentials you used at that time may or may not be
different than the credentials of your current sign in. You need to choose whether to
continue using the sign-in details you used the last time you connected (The credentials
I used to connect to Power BI last time), or whether the connection should use your
current sign-in credentials from now on (My current credentials (these may be the same
or different)).
The Power BI view of the SharePoint list data is determined by the permissions of the
account used to establish the Power BI connection to the SharePoint data source (that is,
the SharePoint site).
If you let Power BI use the sign-in details you used last time for the connection, the data
you'll see in the semantic model you're creating may differ from what you see in the
SharePoint list. This is because the data that is shown in the semantic model is what the
account with the credentials you used last time can see in the SharePoint list.
If you replace the credentials you used last time with your current sign-in credentials, the data you see in the semantic model you're creating will be exactly the same as what you see in the SharePoint list. However, since the connection now uses your current sign-in credentials, views of the data in semantic models you might have created previously from that SharePoint site might also change, and this could affect reports and other downstream items that users might have created based on those semantic models.
Take the above considerations into account when you make your choice.
If you've never previously connected to the SharePoint site from Power BI, Power BI will
automatically use your current credentials to establish the connection, and you won't
see this dialog.
Next steps
Semantic model details
Share access to a semantic model
Create a report on a SharePoint List in
Power BI Desktop
Article • 12/03/2024
Many teams and organizations use lists in SharePoint Online to store data because it's
easy to set up and easy for users to update. Sometimes a chart is a much easier way for
users to quickly understand the data rather than looking at the list itself. In this tutorial,
you learn how to transform your SharePoint list data into a Power BI report.
Watch this five-minute tutorial video, or scroll down for step-by-step instructions.
Note
This video might use earlier versions of Power BI Desktop or the Power BI service.
https://www.youtube-nocookie.com/embed/OZO3x2NF8Ak
In the Power BI service, you can also create a report quickly from data in a SharePoint
list.
If your purpose is to quickly create a semantic model in the Power BI service, you can do
so directly from the SharePoint list. For more information, see Create a semantic model
from a SharePoint list.
4. Select Connect.
5. Find the address (also known as a URL) of your SharePoint Online site that contains
your list. From a page in SharePoint Online, you can usually get the site address by
selecting Home in the navigation pane, or the icon for the site at the top, then
copying the address from your web browser's address bar.
Note
This video might use earlier versions of Power BI Desktop or the Power BI
service.
https://www.youtube-nocookie.com/embed/OZO3x2NF8Ak?start=48&end=90
6. In Power BI Desktop, paste the address into the Site URL field of the SharePoint
Online Lists dialog box, and then select OK.
7. You might or might not see a SharePoint access screen like the following image. If
you don't see it, skip to step 10. If you do see it, select Microsoft Account on the
left side of the page.
8. Select Sign In, and enter the user name and password you use to sign in to
Microsoft 365.
10. On the left side of the Navigator dialog box, select the checkbox beside the
SharePoint list you want to connect to.
11. Select Load. Power BI loads your list data into a new report.
2. Make sure your list columns with numbers show the Sum, or Sigma, icon in the
Data pane on the right. For any that don't, select the column header in the table
view, select the Structure group in the Column tools tab, then change the Data
type to Decimal Number or Whole Number, depending on the data. If prompted
to confirm your change, select Yes. If your number is a special format, like currency,
you can also choose that by setting the Format in the Formatting group.
Note
This video might use earlier versions of Power BI Desktop or the Power BI
service.
https://www.youtube-nocookie.com/embed/OZO3x2NF8Ak?start=147&end=204
3. On the left side of the Power BI Desktop screen, select the Report icon.
4. Select columns you want to visualize by selecting the checkboxes beside them in
the Data pane on the right.
Note
This video might use earlier versions of Power BI Desktop or the Power BI
service.
https://www.youtube-nocookie.com/embed/OZO3x2NF8Ak?start=215&end=252
6. You can create multiple visualizations in the same report by deselecting the
existing visual, then selecting checkboxes for other columns in the Data pane.
Related content
Create a report quickly from a SharePoint list
Connect to semantic models in the
Power BI service from Power BI Desktop
Article • 12/03/2024
In Power BI Desktop, you can create a data model and publish it to the Power BI service.
Then you and others can establish a live connection to the shared semantic model that's
in the Power BI service, and create many different reports from that common data
model. You can use the Power BI service live connection feature to create multiple
reports in .pbix files from the same semantic model, and save them to different
workspaces.
This article discusses the benefits, best practices, considerations, and limitations of the
Power BI service live connection feature.
One challenge with the popularity of Power BI is the resulting proliferation of reports,
dashboards, and underlying data models. It's easy to create compelling reports in Power
BI Desktop, publish those reports in the Power BI service, and create great dashboards
from those semantic models.
Because report creators often use the same or nearly the same semantic models,
knowing which semantic model a report is based on and the freshness of that semantic
model becomes a challenge. The Power BI service live connection addresses that
challenge by using common semantic models to make it easier and more consistent to
create, share, and expand on reports and dashboards.
If everyone on the team created their own versions of the semantic model and shared
their reports with the team, there would be many reports from different semantic
models in your team's Power BI workspace. It would be hard to tell which report was the
most recent, whether the semantic models were the same, or what the differences were.
With the Power BI service live connection feature, other team members can use the
analyst's published semantic model for their own reports in their own workspaces.
Everyone can use the same solid, vetted, published semantic model to build their own
unique reports.
In Power BI Desktop, the team business analyst creates a report and the semantic model
the report is based on. The analyst then publishes the report to the Power BI service, and
the report shows up in the team's workspace. For more information about workspaces,
see Workspaces in Power BI.
The business analyst can use the Build permission setting to make the report available
for anyone in or out of the workspace to see and use. Team members in and out of the
team workspace can now establish a live connection to the shared data model by using
the Power BI service live connection feature. Team members can create their own unique
reports, from the original semantic model, in their own workspaces.
The following image shows how one Power BI Desktop report and its data model
publish to the Power BI service. Other users connect to the data model by using the
Power BI service live connection, and base their own unique reports in their own
workspaces on the shared semantic model.
1. To publish the report, from Power BI Desktop, select Publish from the Home tab.
If you're not signed in to the Power BI service account, Power BI prompts you to
sign in.
2. Select the workspace destination to publish the report and semantic model to, and
choose Select. Anyone who has Build permission can then access that semantic
model. You can set Build permission in the Power BI service after publishing.
The publishing process begins, and Power BI Desktop shows the progress.
Once complete, Power BI Desktop shows success, and provides links to the report
in the Power BI service and to quick insights about the report.
3. Now that your report with its semantic model is in the Power BI service, you can
promote it, or attest to its quality and reliability. You can also request that the
report be certified by a central authority in your Power BI tenant. For more
information, see Endorse your content.
4. The last step is to set Build permission in the Power BI service for the semantic
model the report is based on. Build permission determines who can see and use
your semantic model. You can set Build permission in the workspace itself, or when
you share an app from the workspace. For more information, see Build permission
for shared semantic models.
1. In Power BI Desktop, on the Home tab, select Get data > Power BI semantic
models.
Or, select Get data, and on the Get Data screen, under All in the left pane, select
Power BI semantic models, and then select Connect.
2. The OneLake Catalog shows the workspaces you're a member of, and all the
shared semantic models you have Build permission for in any workspace.
- Filter the list to My data or semantic models that are Endorsed in your org.
- Search for a specific semantic model or filter by keyword.
- See the semantic model name, owner, workspace, last and next refresh time, and sensitivity.
3. Select a semantic model, and then select Connect to establish a live connection to
the selected semantic model. Power BI Desktop loads the semantic model fields
and their values in real time.
Now you and others can create and share custom reports, all from the same semantic
model. This approach is a great way to have one knowledgeable person create a well-
formed semantic model. Many teammates can use that shared semantic model to create
their own reports.
In live connection mode, you can't modify the data model itself; for example, you can't add new columns or tables. You can only create report-level measures (report measures) and visual calculations.
Only users with Build permission for a semantic model can connect to a published
semantic model by using the Power BI service live connection.
Hidden columns become visible to users with Build permission when they create
live connections to the semantic model in Power BI Desktop.
Free users only see semantic models that are in their My Workspace and in Premium or Fabric-based workspaces.
Because this connection is live, left navigation and modeling are disabled. The
behavior is similar to a SQL Server Analysis Services (SSAS) connection. However,
composite models in Power BI make it possible to combine data from different
sources. For more information, see Use composite models in Power BI Desktop.
Because this connection is live, row-level security (RLS) and similar connection
behaviors are enforced. This behavior is the same as when connected to SSAS.
If the owner modifies the original shared .pbix file, the shared semantic model and
report in the Power BI service are overwritten. Reports based on the semantic
model aren't overwritten, but any changes to the semantic model reflect in the
report.
Members of a workspace can't replace the original shared report. If they try to do so, they get a prompt to rename the file and publish. If a member needs to publish a modified version, they can download a copy of the report by using the A copy of your report and data option, make the necessary changes, and then publish the report.
If you delete the shared semantic model in the Power BI service, reports based on
that semantic model will no longer work properly or display visuals. You can no
longer access that semantic model from Power BI Desktop.
Reports that share a semantic model on the Power BI service don't support
automated deployments that use the Power BI REST API.
Since the Power BI service connection is live, connecting to a semantic model with a shared report in other users' My Workspace isn't supported.
Related content
For more information on DirectQuery and other Power BI data connection features,
check out the following resources:
Using DirectQuery for Power BI semantic models and Azure Analysis Services
(preview)
For more information about Power BI, see the following articles:
Import Excel workbooks into Power BI
Desktop
Article • 02/26/2025
With Power BI Desktop, you can easily import Excel workbooks that contain Power
Query queries and Power Pivot models into Power BI Desktop. Power BI Desktop
automatically creates reports and visualizations based on the Excel workbook. Once
imported, you can continue to improve and refine those reports with Power BI Desktop,
using the existing features and new features released with each Power BI Desktop
monthly update.
Note
To load or import Excel files from shared OneDrive for work or school folders
or from Microsoft 365 group folders, use the URL of the Excel file, and input it
into the Web data source in Power BI Desktop. There are a few steps you need
to follow to properly format the OneDrive for work or school URL; for
information and the correct series of steps, see Use OneDrive for work or
school links in Power BI Desktop.
Power BI Desktop analyzes the workbook and converts it into a Power BI Desktop
file (.pbix). This action is a one-time event. Once created with these steps, the
Power BI Desktop file has no dependence on the original Excel workbook. You can
modify, save, and share it without affecting the original workbook.
After the import finishes, a summary page appears that describes the items that
were converted. The summary page also lists any items that couldn't be imported.
4. Select Close.
Power BI Desktop imports the Excel workbook and loads a report based on the
workbook contents.
After the workbook is imported, you can continue working on the report. You can create
new visualizations, add data, or create new report pages by using any of the features
and capabilities included in Power BI Desktop.
| Excel workbook content | How it's imported into Power BI Desktop |
| --- | --- |
| Power Query queries | All Power Query queries from Excel are converted to queries in Power BI Desktop. If there are query groups defined in the Excel workbook, the same organization replicates in Power BI Desktop. All queries are loaded unless they're set to Only Create Connection in the Import Data Excel dialog box. Customize the load behavior by selecting Properties from the Home tab of Power Query Editor in Power BI Desktop. |
| Power Pivot external data connections | All Power Pivot external data connections convert to queries in Power BI Desktop. |
| Linked tables or current workbook tables | If there's a worksheet table in Excel linked to the data model, or linked to a query (by using From Table or the Excel.CurrentWorkbook() function in M), you'll see the following options: Import the table to the Power BI Desktop file. This table is a one-time snapshot of the data, after which the data is read-only in the table in Power BI Desktop. There's a size limitation of 1 million characters (total, combining all column headers and cells) for tables created using this option. |
| Data model calculated columns, measures, KPIs, data categories, and relationships | These data model objects convert to the equivalent objects in Power BI Desktop. Note there are certain data categories that aren't available in Power BI Desktop, such as Image. In these cases, the data category information resets for the columns in question. |
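The 1-million-character limit on imported linked tables counts every character in the table. As a rough sketch (using hypothetical data, and no formula beyond what the table above states), the count combines all column headers and all cell values:

```python
# Hypothetical two-column table used to illustrate how the limit is counted:
# the limit applies to the total characters across all headers and cells.
headers = ["Name", "Region"]
rows = [["Alex", "West"], ["Bob", "East"]]

total_chars = sum(len(h) for h in headers) + sum(len(cell) for row in rows for cell in row)
print(total_chars)  # → 25 (well under the 1,000,000-character limit)
```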
Are there any limitations to importing a
workbook?
There are a few limitations to importing a workbook into Power BI Desktop:
Create visuals and reports with the
Microsoft Cost Management connector in
Power BI Desktop
Article • 04/30/2025
You can use the Microsoft Cost Management connector for Power BI Desktop to make
powerful, customized visualizations and reports that help you better understand your Azure
spend.
If you have an unsupported agreement, you can use Exports to save the cost data to a share
and then connect to it using Power BI. For more information, see Tutorial - Create and manage
Cost Management exports.
The Microsoft Cost Management connector uses OAuth 2.0 for authentication with Azure and
identifies users who are going to use the connector. Tokens generated in this process are valid
for a specific period. Power BI preserves the token for the next sign-in. OAuth 2.0 is a standard
for the process that goes on behind the scenes to ensure the secure handling of these
permissions. To connect, you must have Enterprise Administrator (read-only) or greater
permission to an EA billing account, or Contributor or greater permission to an MCA billing
account or billing profile.
Note
2. Select Azure from the list of data categories.
4. Select Connect.
5. In the dialog that appears, under Choose Scope, select Manually Input Scope for
Microsoft Customer Agreements, or select Enrollment Number for Enterprise
Agreements.
Connect to a billing account
To connect to a billing account, you need to retrieve your Billing account ID from the Azure
portal:
5. In the Azure Cost Management dialog in Power BI Desktop, under Choose Scope, select
Manually Input Scope.
6. Input the connection string as shown in the following example, replacing
{billingAccountId} with the data copied in the previous step.
/providers/Microsoft.Billing/billingAccounts/{billingAccountId}
Alternatively, for Choose Scope, select Enrollment Number and input the billing account
ID string as copied in the previous step.
Alternatively, if you want to download less than a month's worth of data, you can set Number of months to zero, then specify a date range using Start Date and End Date values that equate to less than 31 days.
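The manually entered scope is simply the billing account's resource path. This short sketch (with a hypothetical account ID) shows how the string is assembled:

```python
# Hypothetical billing account ID; copy your own from the Azure portal.
billing_account_id = "12345678"

# The manually entered scope is the billing account's resource path.
scope = f"/providers/Microsoft.Billing/billingAccounts/{billing_account_id}"
print(scope)  # → /providers/Microsoft.Billing/billingAccounts/12345678
```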
8. When prompted, sign in with your Azure user account and password. You must have
access to the billing account scope to successfully access the billing data.
3. In the menu, select Billing > Billing profiles, then select your billing profile.
4. In the menu, select Settings > Properties.
6. In the Azure Cost Management dialog in Power BI Desktop, under Choose Scope, select
Manually Input Scope.
7. Enter the billing profile resource ID string as shown in the following example, replacing
{billingAccountId} and {billingProfileId} with the data copied in the previous step.
/providers/Microsoft.Billing/billingAccounts/{billingAccountId}/billingProfiles/{billingProfileId}
9. When prompted, sign in with your Azure user account and password. You must have
access to the billing profile to successfully access the billing profile data.
4. In the Azure Cost Management dialog in Power BI Desktop, under Choose Scope, select
Enrollment Number.
5. Under Scope Identifier, paste the billing account ID copied in the previous step.
7. When prompted, sign in with your Azure user account and password. You must use an
Enterprise Administrator account for Enterprise Agreements.
| Table | Account Type | Supported Scopes | Description |
| --- | --- | --- | --- |
| Balance summary | EA only | EA Enrollment | Summary of the balance for the current billing month for Enterprise Agreements. |
| Billing events | MCA only | Billing Profile | Event log of new invoices, credit purchases, etc. Microsoft Customer Agreement only. |
| Budgets | EA, MCA | EA Enrollment, MCA Billing Account, MCA Billing Profile | Budget details to view actual costs or usage against existing budget targets. |
| Credit lots | MCA only | MCA Billing Profile | Azure credit lot purchase details for the provided billing profile. Microsoft Customer Agreement only. |
| Price sheets | EA, MCA | EA Enrollment, MCA Billing Profile | Applicable meter rates for the provided billing profile or EA enrollment. |
| RI charges | EA, MCA | EA Enrollment, MCA Billing Profile | Charges associated with your Reserved Instances (RIs) over the last 24 months. This table is in the process of being deprecated; use RI transactions instead. |
| RI transactions | EA, MCA | EA Enrollment, MCA Billing Profile | List of transactions for reserved instances on the billing account scope. |
| RI usage details | EA, MCA | EA Enrollment, MCA Billing Profile | Consumption details for your existing Reserved Instances over the last month. |
| RI usage summary | EA, MCA | EA Enrollment, MCA Billing Profile | Daily Azure reservation usage percentage. |
| Usage details | EA, MCA | EA Enrollment, MCA Billing Account, MCA Billing Profile | A breakdown of consumed quantities and estimated charges for the given billing profile on EA enrollment. |
| Usage details amortized | EA, MCA | EA Enrollment, MCA Billing Account, MCA Billing Profile | A breakdown of consumed quantities and estimated amortized charges for the given billing profile on EA enrollment. |
You can select a table to see a preview dialog. Select one or more tables by selecting the boxes
beside their names. When you're finished, select Load.
When you select Load, the data is loaded into Power BI Desktop. Once the data you selected is
loaded, the data tables and fields are shown in the Data pane.
Power BI doesn't support data row requests exceeding one million rows. Instead, you can
try using the export feature described in Create and manage Cost Management exports.
The Microsoft Cost Management data connector doesn't work with Office 365 GCC
customer accounts.
Data refresh: Cost and usage data is typically updated and available in the Azure portal
and supporting APIs within 8 to 24 hours, so we suggest you constrain Power BI
scheduled refreshes to once or twice a day.
Data source reuse: If you have multiple reports that are pulling the same data, and don't
need more report-specific data transformations, you should reuse the same data source.
Reusing the same data source reduces the amount of time required to pull the Usage
Details data.
You might receive a 400 Bad Request response from the RI usage details table when you try to refresh the data if you've chosen a date parameter greater than three months. To mitigate the error, take the following steps:
2. In Power Query Editor, select the RI usage details semantic model and select Advanced
Editor.
3. Update the Power Query code as shown in the following paragraphs, which split the calls
into three-month chunks. Make sure you note and retain your enrollment number, or
billing account/billing profile ID.
let
    enrollmentNumber = "<<Enrollment Number>>",
    optionalParameters1 = [startBillingDataWindow = "-9", endBillingDataWindow = "-6"],
    source1 = AzureCostManagement.Tables("Enrollment Number", enrollmentNumber, 5, optionalParameters1),
    riusagedetails1 = source1{[Key="riusagedetails"]}[Data],
    optionalParameters2 = [startBillingDataWindow = "-6", endBillingDataWindow = "-3"],
    source2 = AzureCostManagement.Tables("Enrollment Number", enrollmentNumber, 5, optionalParameters2),
    riusagedetails2 = source2{[Key="riusagedetails"]}[Data],
    riusagedetails = Table.Combine({riusagedetails1, riusagedetails2})
in
    riusagedetails
let
    billingProfileId = "<<Billing Profile Id>>",
    optionalParameters1 = [startBillingDataWindow = "-9", endBillingDataWindow = "-6"],
    source1 = AzureCostManagement.Tables("Billing Profile Id", billingProfileId, 5, optionalParameters1),
    riusagedetails1 = source1{[Key="riusagedetails"]}[Data],
    optionalParameters2 = [startBillingDataWindow = "-6", endBillingDataWindow = "-3"],
    source2 = AzureCostManagement.Tables("Billing Profile Id", billingProfileId, 5, optionalParameters2),
    riusagedetails2 = source2{[Key="riusagedetails"]}[Data],
    riusagedetails = Table.Combine({riusagedetails1, riusagedetails2})
in
    riusagedetails
4. Once you've updated the code as shown in the previous step, select Done, then select Close & Apply.
You might run into a situation where tags aren't working in the usage details, or where the tags column can't be transformed to JSON. This issue stems from the current UCDD API returning the tags column with its start and end brackets trimmed off, which means Power BI can't transform the column because it's returned as a plain string. To mitigate this situation, take the following steps.
3. In the Query Settings pane, under Applied Steps, insert a step that adds a custom column immediately after the Navigation step.
4. From the menu ribbon, select Add Column > Custom Column.
5. Name the column TagsInJson, or whatever you prefer, then enter the following text in the
Custom column formula field:
DAX
6. Completing the previous steps creates a new column of tags in JSON format.
7. You can now transform and expand the column as you need to.
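The underlying idea of the custom-column fix can be sketched outside of Power Query: because the API trims the enclosing braces, the tags string isn't valid JSON until you re-wrap it. This Python sketch (with hypothetical tag values) shows the principle:

```python
import json

# Hypothetical tags value as returned by the API: the enclosing braces
# have been trimmed, so the string by itself isn't valid JSON.
trimmed_tags = '"env": "prod", "owner": "finance"'

# Re-wrapping the string in braces makes it parse as a JSON object again.
tags = json.loads("{" + trimmed_tags + "}")
print(tags["owner"])  # → finance
```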
Authentication issues can occur with Microsoft Entra guest accounts. You might have the appropriate permissions to access the enrollment or billing account but still receive an authentication error.
For guest accounts, use the following settings or options as you're prompted with the
authentication dialog when connecting with the Cost Management Power BI connector:
1. Select Sign-in.
2. Select Use another account (bottom of the dialog).
3. Select Sign-in options (bottom of the dialog box).
4. Select Sign into an organization.
5. For Domain name, provide the Fully Qualified Domain Name (FQDN) of the Microsoft
Entra domain into which you've been added as a guest.
6. Then, for Pick an account, select the user account that you’ve previously authenticated.
Related content
You can connect to many different data sources using Power BI Desktop. For more information,
see the following articles:
Connect to an Oracle database with
Power BI Desktop
Article • 08/15/2024
You can easily connect to Oracle to access and analyze data in Power BI Desktop. This
article describes the initial setup requirements for creating the connection.
OCMT (Oracle Client for Microsoft Tools) is free software. It can be downloaded from the Oracle Client for Microsoft Tools page and is available for 32-bit or 64-bit Power BI Desktop.
You can find step-by-step instructions on how to use OCMT and set up Oracle database
connectivity in Power BI Desktop here.
Related content
DirectQuery in Power BI
What is Power BI?
Data sources for the Power BI service
Oracle Client for Microsoft Tools
Enter data directly into Power BI
Desktop
Article • 02/26/2025
With Power BI Desktop, you can enter data directly and use that data in your reports
and visualizations. For example, you can copy portions of a workbook or web page, then
paste it into Power BI Desktop.
To enter data directly into Power BI Desktop in the form of a new table, select Enter data
from the Home ribbon.
If you want to shape the data you entered or pasted, select Edit to open Power Query
Editor. You can shape and transform the data before bringing it into Power BI Desktop.
Select Load to import the data as it appears.
When you select Load, Power BI Desktop creates a new table from your data, and makes
it available in the Fields pane. In the following image, Power BI Desktop shows your new
table, called Table, and the two fields within that table that were created.
And that’s it. It's that easy to enter data into Power BI Desktop.
You're now ready to use the data in Power BI Desktop. You can create visuals, reports, or
interact with any other data you might want to connect with and import, such as Excel
workbooks, databases, or any other data source.
Note
To update, add, or delete data within items created by Enter Data, changes must be
made in Power BI Desktop, and published. Data updates can't be made directly
from the Power BI service.
Related content
There are all sorts of data you can connect to using Power BI Desktop. For more
information on data sources, check out the following resources:
Connect to webpages from Power BI
Desktop
Article • 07/30/2024
You can connect to a webpage and import its data into Power BI Desktop, to use in your
visuals and in your data models.
In Power BI Desktop, select Get data > Web from the Home ribbon.
A dialog appears, asking for the URL of the webpage from which you want to import
data.
Once you've typed or pasted the URL, select OK.
Power BI Desktop connects to the webpage and then presents the page's available data
in the Navigator window. When you select one of the available data elements, such as
Individual factor scores..., the Navigator window displays a preview of that data on the
right side of the window.
You can choose the Transform Data button, which launches Power Query Editor, where
you can shape and transform the data on that webpage before importing it into Power
BI Desktop. Or you can select the Load button, and import all of the data elements you
selected in the left pane.
When you select Load, Power BI Desktop imports the selected items, and makes them
available in the Data pane, found on the right side of the Report view in Power BI
Desktop.
That's all there is to connecting to a webpage and bringing its data into Power BI
Desktop.
From there, you can drag those fields onto the Report canvas and create all the
visualizations you want. You can also use the data from that webpage just like you
would any other data. You can shape it, you can create relationships between it and
other data sources in your model, and otherwise do what you like to create the Power BI
report you want.
To see connecting to a webpage in more depth and action, take a look at Get started
with Power BI Desktop.
To change this option, select File > Options and settings > Options, then select
Security in the left pane.
Related content
There are all sorts of data you can connect to using Power BI Desktop. For more
information on data sources, check out the following resources:
Run Python scripts in Power BI Desktop
Article • 09/06/2024
You can run Python scripts directly in Power BI Desktop and import the resulting
datasets into a Power BI Desktop data model. From this model, you can create reports
and share them on the Power BI service. This article shows you how to enable Python
scripting and create a Python script that you can run to import data.
Prerequisites
To run Python scripts in Power BI Desktop, you need to install Python on your local
machine. You can download Python from the Python website. The current
Python scripting release supports Unicode characters and spaces in the installation
path.
The Power BI Python integration requires installation of the following two Python
packages. In a console or shell, use the pip command-line tool to install the
packages. The pip tool is packaged with recent Python versions.
Pandas is a software library for data manipulation and analysis. Pandas offers
data structures and operations for manipulating numerical tables and time
series. To import into Power BI, Python data must be in a pandas data frame.
A data frame is a two-dimensional data structure, such as a table with rows and
columns.
Console
pip install pandas
pip install matplotlib
1. In Power BI Desktop, select File > Options and settings > Options > Python
scripting. The Python script options page appears.
2. If necessary, supply or edit your local Python installation path under Detected Python home directories. In the preceding image, the local Python installation path is C:\Users\Python. If you have more than one local Python installation, make sure to select the one that you want to use.
3. Select OK.
Important
Power BI runs scripts directly by using the python.exe executable from the directory
you provide in Settings. Python distributions that require an extra step to prepare
the environment, such as Conda, might fail to run. To avoid these issues, use the
official Python distribution from https://www.python.org. Another possible
solution is to start Power BI Desktop from your custom Python environment
prompt.
Create a Python script
Create a script in your local Python development environment and make sure it runs
successfully. To prepare and run a Python script in Power BI Desktop, there are a few
limitations:
Only pandas data frames import, so make sure the data you want to import to
Power BI is represented in a data frame.
Any Python script that runs longer than 30 minutes times out.
Interactive calls in the Python script, such as waiting for user input, halt the script's
execution.
If you set a working directory within the Python script, you must define a full path
to the working directory rather than a relative path.
Nested tables aren't supported.
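To illustrate the working-directory limitation above, here's a minimal sketch; the directory chosen is hypothetical, and the point is simply that the path must be absolute rather than relative:

```python
import os
import tempfile

# A working directory set inside the script must be a full (absolute) path,
# not a relative path such as "data" or "./data".
workdir = tempfile.gettempdir()  # an absolute path on any OS (hypothetical choice)
os.chdir(workdir)
print(os.path.isabs(workdir))  # → True
```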
Here's a simple example Python script that imports pandas and uses a data frame:
Python
import pandas as pd
data = [['Alex',10],['Bob',12],['Clarke',13]]
df = pd.DataFrame(data,columns=['Name','Age'])
print(df)
Output
     Name  Age
0    Alex   10
1     Bob   12
2  Clarke   13
1. In the Home group of the Power BI Desktop ribbon, select Get data.
2. In the Get Data dialog box, select Other > Python script, and then select Connect.
Power BI uses your latest installed Python version as the Python engine.
3. On the Python script screen, paste your Python script into the Script field, and
select OK.
4. If the script runs successfully, the Navigator window appears, and you can load the
data. Select the df table, and then select Load.
Power BI imports the data, and you can use it to create visualizations and reports. To
refresh the data, select Refresh in the Home group of the Power BI Desktop ribbon.
When you refresh, Power BI runs the Python script again.
Important
If Python isn't installed or identified, a warning appears. You might also get a
warning if you have multiple Python installations on your local machine.
Related content
For more information about Python in Power BI, see:
Use Python in Power Query Editor
Article • 02/13/2023
You can use Python, a programming language widely used by statisticians, data
scientists, and data analysts, in the Power BI Desktop Power Query Editor. This
integration of Python into Power Query Editor lets you perform data cleansing using
Python, and perform advanced data shaping and analytics in datasets, including
completion of missing data, predictions, and clustering, just to name a few. Python is a
powerful language, and can be used in Power Query Editor to prepare your data model
and create reports.
Prerequisites
You'll need to install Python and pandas before you begin.
Install Python - To use Python in Power BI Desktop's Power Query Editor, you need
to install Python on your local machine. You can download and install Python for
free from many locations, including the official Python download page and
Anaconda.
Install pandas - To use Python with the Power Query Editor, you'll also need to
install pandas. Pandas is used to move data between Power BI and the Python
environment.
1. First, load your data into Power BI Desktop. This example loads the
EuStockMarkets_NA.csv file. Select Get data > Text/CSV from the Home ribbon
in Power BI Desktop.
2. Select the file and select Open, and the CSV is displayed in the CSV file dialog.
3. Once the data is loaded, you see it in the Fields pane in Power BI Desktop.
4. Open Power Query Editor by selecting Transform data from the Home tab in
Power BI Desktop.
5. In the Transform tab, select Run Python Script and the Run Python Script editor
appears as shown in the next step. Rows 15 and 20 suffer from missing data, as do
other rows you can't see in the following image. The following steps show how
Python completes those rows for you.
6. For this example, enter the following script code:
Python
import pandas as pd
# bfill() backfills missing values; it's the current equivalent of the
# deprecated fillna(method='backfill')
completedData = dataset.bfill()
dataset["completedValues"] = completedData["SMI missing values"]
Note
You need to have the pandas library installed in your Python environment for
the previous script code to work properly. To install pandas, run the following
command in your Python installation: pip install pandas
When put into the Run Python Script dialog, the code looks like the following
example:
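You can check the backfill behavior outside Power BI on a small hand-made frame. In Power Query, the dataset frame is supplied automatically; here it's recreated with illustrative sample values:

```python
import numpy as np
import pandas as pd

# Stand-in for the 'dataset' frame Power Query passes to the script.
dataset = pd.DataFrame({"SMI missing values": [10.0, np.nan, 12.0, np.nan, 14.0]})

# Backfill: each missing value takes the next valid value below it.
completedData = dataset.bfill()
dataset["completedValues"] = completedData["SMI missing values"]
print(dataset["completedValues"].tolist())  # → [10.0, 12.0, 12.0, 14.0, 14.0]
```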
7. After you select OK, Power Query Editor displays a warning about data privacy.
8. For the Python scripts to work properly in the Power BI service, all data sources
need to be set to public. For more information about privacy settings and their
implications, see Privacy Levels.
Notice a new column in the Fields pane called completedValues. Notice there are a
few missing data elements, such as on row 15 and 18. Take a look at how Python
handles that in the next section.
With just three lines of Python script, Power Query Editor filled in the missing values
with a predictive model.
Once that visual, and any other visuals you might want to create in Power BI
Desktop, are complete, you can save the Power BI Desktop file. Power BI Desktop files
save with the .pbix file name extension. You can then use the data model, including the
Python scripts that are part of it, in the Power BI service.
Note
Want to see a .pbix file with all these steps completed? You're in luck. You can
download the completed Power BI Desktop file used in these examples.
Once you upload the .pbix file to the Power BI service, a couple more steps are
necessary to enable data refresh and visual updates in the service. The visuals need
access to Python to be updated. The other required steps are:
Enable scheduled refresh for the dataset. To enable scheduled refresh for the
workbook that contains your dataset with Python scripts, see Configuring
scheduled refresh, which also includes information about Personal Gateway.
Install the Personal Gateway. You need a Personal Gateway installed on the
machine where the file is located, and where Python is installed. The Power BI
service must access that workbook and re-render any updated visuals. For more
information, see install and configure Personal Gateway.
All Python data source settings must be set to Public, and all other steps in a query
created in Power Query Editor must also be public. To get to data source settings,
in Power BI Desktop select File > Options and settings > Data source settings.
From the Data Source Settings dialog, select the data sources and then select Edit
Permissions... and ensure that the Privacy Level is set to Public.
To enable scheduled refresh of your Python visuals or dataset, you need to enable
Scheduled refresh and have a Personal Gateway installed on the computer that
houses the workbook and the Python installation. For more information on both,
see the previous section in this article, which provides links to learn more about
each.
Nested tables, which are tables of tables, are currently not supported.
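Because nested tables aren't supported, nested structures need to be flattened into plain columns before import. A minimal sketch using pandas' json_normalize (the record layout here is illustrative):

```python
import pandas as pd

# A column of nested dicts can't be imported as-is...
raw = [
    {"name": "a", "scores": {"q1": 1, "q2": 2}},
    {"name": "b", "scores": {"q1": 3, "q2": 4}},
]

# ...but json_normalize flattens the nesting into ordinary columns.
flat = pd.json_normalize(raw)
print(sorted(flat.columns))  # → ['name', 'scores.q1', 'scores.q2']
```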
There are all sorts of things you can do with Python and custom queries, so explore and
shape your data just the way you want it to appear.
Use an external Python IDE with Power
BI
Article • 01/13/2023
With Power BI Desktop, you can use your external Python Integrated Development
Environment (IDE) to create and refine Python scripts, then use those scripts in Power BI.
You can specify which Python IDE to use, and have it launch automatically from within
Power BI Desktop.
Requirements
To use this feature, you need to install a Python IDE on your local computer. Power BI
Desktop doesn't include, deploy, or install the Python engine, so you must separately
install Python on your local computer. You can choose which Python IDE to use, with the
following options:
You can install your favorite Python IDE, many of which are available for free, such
as Visual Studio Code.
You can also install a different Python IDE and have Power BI Desktop launch that
Python IDE by doing one of the following:
You can associate .PY files with the external IDE you want Power BI Desktop to
launch.
You can specify the .exe that Power BI Desktop launches by selecting Other from
the Python script options section of the Options dialog. You can bring up the
Options dialog by going to File > Options and settings > Options.
If you have multiple Python IDEs installed, you can specify which is launched by
selecting it from the Detected Python IDEs drop-down in the Options dialog.
By default, Power BI Desktop launches Visual Studio Code as the external Python IDE if
it's installed on your local computer. If Visual Studio Code isn't installed but
Visual Studio is, that IDE is launched instead. If neither of those Python IDEs is
installed, the application associated with .PY files is launched.
And if no .PY file association exists, it's possible to specify a path to a custom IDE in the
Set a Python home directory section of the Options dialog. You can also launch a
different Python IDE by selecting the Settings gear icon beside the Launch Python IDE
arrow icon, in Power BI Desktop.
2. Add a Python visualization to your canvas. If you haven't enabled script visuals yet,
you're prompted to do so.
3. After script visuals are enabled, a blank Python visual appears that's ready to
display the results of your script. The Python script editor pane also appears.
4. Now you can select the fields you want to use in your Python script. When you
select a field, the Python script editor field automatically creates script code based
on the field or fields you select. You can either create or paste your Python script
directly in the Python script editor pane, or you can leave it empty.
5. You can now launch your Python IDE directly from Power BI Desktop. Select the
Launch Python IDE button, found on the right side of the Python script editor title
bar, as shown in this screenshot.
Note
Power BI Desktop adds the first three lines of the script so it can import your
data from Power BI Desktop once you run the script.
7. Any script you created in the Python script editor pane of Power BI Desktop
appears, starting in line 4, in your Python IDE. At this point, you can create your
Python script in the Python IDE. Once your Python script is complete in your
Python IDE, you need to copy and paste it back into the Python script editor pane
in Power BI Desktop, excluding the first three lines of the script that Power BI
Desktop automatically generated. Don't copy the first three lines of script back into
Power BI Desktop; those lines were only used to import your data to your Python
IDE from Power BI Desktop.
Known limitations
Launching a Python IDE directly from Power BI Desktop has a few limitations:
Automatically exporting your script from your Python IDE into Power BI Desktop
isn't supported.
Next steps
Take a look at the following additional information about Python in Power BI.
Create Power BI visuals with Python
05/30/2025
This tutorial helps you get started creating visuals with Python data in Power BI Desktop. You
use a few of the many available options and capabilities for creating visual reports by using
Python, pandas, and the Matplotlib library.
Prerequisites
Work through Run Python scripts in Power BI Desktop to:
Python
import pandas as pd
df = pd.DataFrame({
'Fname':['Harry','Sally','Paul','Abe','June','Mike','Tom'],
'Age':[21,34,42,18,24,80,22],
'Weight': [180, 130, 200, 140, 176, 142, 210],
'Gender':['M','F','M','M','F','M','M'],
'State':
['Washington','Oregon','California','Washington','Nevada','Texas','Nevada'],
'Children':[4,1,2,3,0,2,0],
'Pets':[3,2,2,5,0,1,5]
})
print(df)
2. In the Enable script visuals dialog box that appears, select Enable.
A placeholder Python visual image appears on the report canvas, and the Python script
editor appears along the bottom of the center pane.
3. Drag the Age, Children, Fname, Gender, Pets, State, and Weight fields to the Values
section where it says Add data fields here.
Based on your selections, the Python script editor generates the following binding code.
The editor creates a dataset dataframe with the fields you add.
The default aggregation is Don't summarize.
Similar to table visuals, fields are grouped and duplicate rows appear only once.
4. With the dataframe automatically generated by the fields you selected, you can write a
Python script that results in plotting to the Python default device. When the script is
complete, select the Run icon from the Python script editor title bar to run the script and
generate the visual.
Tips
Your Python script can use only fields that are added to the Values section. You can add
or remove fields while you work on your Python script. Power BI Desktop automatically
detects field changes. As you select or remove fields from the Values section, supporting
code in the Python script editor is automatically generated or removed.
In some cases, you might not want automatic grouping to occur, or you might want all
rows to appear, including duplicates. In those cases, you can add an index field to your
dataset that causes all rows to be considered unique and prevents grouping.
You can access columns in the dataset by using their names. For example, you can code
dataset["Age"] in your Python script to access the age field.
Power BI Desktop replots the visual when you select Run from the Python script editor
title bar, or whenever a data change occurs due to data refresh, filtering, or highlighting.
When you run a Python script that results in an error, the Python visual isn't plotted, and
an error message appears on the canvas. For error details, select See details in the
message.
To get a larger view of the visualizations, you can minimize the Python script editor.
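The column-access and index-field tips above can be sketched outside Power BI. The dataset frame below is a hand-made stand-in for the one a Python visual receives; in Power BI you'd add the index column in the data model before selecting fields:

```python
import pandas as pd

# Stand-in for the 'dataset' frame a Python visual receives.
dataset = pd.DataFrame({"Name": ["Sally", "Sally", "Tom"], "Age": [34, 34, 22]})

# Columns are accessed by name.
ages = dataset["Age"].tolist()

# An index column makes every row unique, so duplicates aren't collapsed;
# reset_index() shows the idea on a plain frame.
dataset = dataset.reset_index()
print(ages, dataset["index"].tolist())  # → [34, 34, 22] [0, 1, 2]
```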
1. In the Python script editor, under Paste or type your script code here, enter this code:
Python
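A minimal script consistent with this step (import Matplotlib and draw a scatter plot from the dataset fields) might look like the following. The dataset recreated here, and the chosen columns, are assumptions for illustration; in Power BI the frame is generated from the fields you selected:

```python
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the sketch runs anywhere
import matplotlib.pyplot as plt

# Stand-in for the 'dataset' frame Power BI builds from the selected fields.
dataset = pd.DataFrame({"Age": [21, 34, 42], "Weight": [180, 130, 200]})

# Scatter plot of weight against age; plt.show() renders to the default device.
ax = dataset.plot(kind="scatter", x="Age", y="Weight")
plt.show()
```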
Your Python script editor pane should now look like the following image:
The code imports the Matplotlib library, which plots and creates the visual.
2. Select the Run button to generate the following scatter plot in the Python visual.
Create a line plot with multiple columns
Create a line plot for each person that shows their number of children and pets.
1. Under Paste or type your script code here, remove or comment out the previous code,
and enter the following Python code:
Python
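A sketch consistent with this step (one line per person-level column, drawn on shared axes) might look like the following; the sample dataset and columns are assumptions for illustration:

```python
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the sketch runs anywhere
import matplotlib.pyplot as plt

# Stand-in for the 'dataset' frame Power BI builds from the selected fields.
dataset = pd.DataFrame({
    "Fname": ["Harry", "Sally", "Paul"],
    "Children": [4, 1, 2],
    "Pets": [3, 2, 2],
})

# Draw both series on one set of axes, one line per column.
ax = plt.gca()
dataset.plot(kind="line", x="Fname", y="Children", ax=ax)
dataset.plot(kind="line", x="Fname", y="Pets", ax=ax)
plt.show()
```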
2. Select the Run button to generate the following line plot with multiple columns:
Create a bar plot
Create a bar plot for each person's age.
1. Under Paste or type your script code here, remove or comment out the previous code,
and enter the following Python code:
Python
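A sketch consistent with this step (a bar per person, height equal to age) might look like the following; the sample dataset is an assumption for illustration:

```python
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the sketch runs anywhere
import matplotlib.pyplot as plt

# Stand-in for the 'dataset' frame Power BI builds from the selected fields.
dataset = pd.DataFrame({"Fname": ["Harry", "Sally", "Paul"], "Age": [21, 34, 42]})

# One bar per person, labeled by first name.
ax = dataset.plot(kind="bar", x="Fname", y="Age")
plt.show()
```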
Limitations
Python visuals in Power BI Desktop have the following limitations:
The data the Python visual uses for plotting is limited to 150,000 rows. If more than
150,000 rows are selected, only the top 150,000 rows are used, and a message appears on
the image. The input data also has a limit of 250 MB.
If the input dataset of a Python visual has a column that contains a string value longer
than 32,766 characters, that value is truncated.
If a Python visual calculation exceeds five minutes, the execution times out, which results
in an error.
As with other Power BI Desktop visuals, if you select data fields from different tables with
no defined relationship between them, an error occurs.
Python visuals refresh upon data updates, filtering, and highlighting. The image itself isn't
interactive.
Python visuals respond to highlighting elements in other visuals, but you can't select
elements in the Python visual to cross filter other elements.
Only plots to the Python default display device display correctly on the canvas. Avoid
explicitly using a different Python display device.
Python visuals don't support renaming input columns. Columns are referred to by their
original names during script execution.
Security
Python visuals use Python scripts, which could contain code that has security or privacy risks.
When you attempt to view or interact with a Python visual for the first time, you get a security
warning. Enable Python visuals only if you trust the author and source, or after you review and
understand the Python script.
Licensing
Python visuals require a Power BI Pro or Premium Per User (PPU) license to render in reports,
refresh, filter, and cross filter. Users of free Power BI can consume only reports that are shared
with them in Premium workspaces.
Python visuals in the service are supported in Fabric regions. This means that reports
published to workspaces will display the Python chart visual when the workspace has (1) a
Fabric license, (2) a Pro or PPU license, or (3) a premium license and the PBI home tenant
is in a region with Fabric Spark workload availability. Python visuals are supported in
Desktop for all users.
For more information about Power BI Pro licenses and how they differ from free licenses, see
Purchase and assign Power BI Pro user licenses.
Related content
This tutorial barely scratches the surface of the options and capabilities for creating visual
reports using Python, pandas, and the Matplotlib library. For more information, see the
following resources:
Learn which Python packages are
supported in Power BI
05/30/2025
You can use the powerful Python programming language to create visuals in Power BI. Many
Python packages are supported in Power BI and more are being supported all the time.
The following sections provide an alphabetical table of which Python packages are supported
in Power BI.
Power BI usually supports Python packages with free and open-source software licenses
such as GPL-2, GPL-3, MIT+, and so on.
Power BI supports packages published in PyPI. The service doesn't support private or
custom Python packages. Users are encouraged to make their private packages available
on PyPI before requesting the package be available in Power BI.
For Python visuals in Power BI Desktop, you can install any package, including custom
Python packages.
For security and privacy reasons, Python packages that make client-server queries
over the web aren't supported in the service. Networking is blocked for such
attempts.
The approval process for including a new Python package examines its tree of
dependencies. Some dependencies that would need to be installed in the service
can't be supported.
Package Version
asttokens 2.4.1
certifi 2024.8.30
comm 0.2.2
contourpy 1.3.0
cycler 0.12.1
debugpy 1.8.5
decorator 5.1.1
exceptiongroup 1.2.2
executing 2.1.0
fonttools 4.53.1
importlib_metadata 8.4.0
ipykernel 6.29.4
ipython 8.27.0
jedi 0.19.1
joblib 1.4.2
jupyter_client 8.6.2
jupyter_core 5.7.2
kiwisolver 1.4.5
matplotlib 3.8.4
munkres 1.1.4
nest_asyncio 1.6.0
numpy 2.0.0
packaging 24.1
pandas 2.2.2
parso 0.8.4
patsy 0.5.6
pexpect 4.9.0
pickleshare 0.7.5
pillow 10.4.0
pip 24
platformdirs 4.2.2
ply 3.11
prompt_toolkit 3.0.47
psutil 6.0.0
ptyprocess 0.7.0
pure_eval 0.2.3
Pygments 2.18.0
pyparsing 3.1.2
PyQt5 5.15.9
pytz 2024.1
pyzmq 26.2.0
scipy 1.13.1
seaborn 0.13.2
setuptools 70.0.0
sip 6.7.12
six 1.16.0
statsmodels 0.14.2
threadpoolctl 3.5.0
toml 0.10.2
tomli 2.0.1
tornado 6.4.1
traitlets 5.14.3
typing_extensions 4.12.2
tzdata 2024.1
unicodedata2 15.1.0
wcwidth 0.2.13
wheel 0.44.0
xgboost 2.0.3
zipp 3.20.1
Related content
For more information about Python in Power BI, take a look at the following articles:
Run R scripts in Power BI Desktop
Article • 02/26/2025
You can run R scripts directly in Power BI Desktop and import the resulting semantic
models into a Power BI Desktop data model.
Install R
To run R scripts in Power BI Desktop, you need to install R on your local machine. You
can download and install R for free from many locations, including the CRAN
Repository . The current release supports Unicode characters and spaces (empty
characters) in the installation path.
Run R scripts
Using just a few steps in Power BI Desktop, you can run R scripts and create a data
model. With the data model, you can create reports and share them on the Power BI
service. R scripting in Power BI Desktop now supports number formats that contain
decimals (.) and commas (,).
Prepare an R script
To run an R script in Power BI Desktop, create the script in your local R development
environment, and make sure it runs successfully.
To run the script in Power BI Desktop, make sure the script runs successfully in a new
and unmodified workspace. This prerequisite means that all packages and dependencies
must be explicitly loaded and run. You can use source() to run dependent scripts.
When you prepare and run an R script in Power BI Desktop, there are a few limitations:
Because only data frames are imported, remember to represent the data you want
to import to Power BI in a data frame.
Columns typed as Complex and Vector aren't imported, and they're replaced with
error values in the created table.
Values of N/A are translated to NULL values in Power BI Desktop.
If an R script runs longer than 30 minutes, it times out.
Interactive calls in the R script, such as waiting for user input, halt the script's
execution.
When setting the working directory within the R script, you must define a full path
to the working directory, rather than a relative path.
R scripts can't run in the Power BI service.
1. In Power BI Desktop, select Get data, choose Other > R script, and then select
Connect:
2. If R is installed on your local machine, just copy your script into the script window
and select OK. The latest installed version is displayed as your R engine.
3. Select OK to run the R Script. When the script runs successfully, you can then
choose the resulting data frames to add to the Power BI model.
You can control which R installation to use to run your script. To specify your R
installation settings, choose File > Options and settings > Options, then select R
scripting. Under R script options, the Detected R home directories dropdown list shows
your current R installation choices. If the R installation you want isn't listed, pick Other,
and then browse to or enter your preferred R installation folder in Set an R home
directory.
Refresh
You can refresh an R script in Power BI Desktop. When you refresh an R script, Power BI
Desktop runs the R script again in the Power BI Desktop environment.
Related content
Take a look at the following additional information about R in Power BI.
Use R in Power Query Editor
Article • 02/26/2025
Install R
You can download R for free from the CRAN Repository .
Install mice
As a prerequisite, you must install the mice library in your R environment. Without
mice, the sample script code doesn't work properly. The mice package implements a
method to deal with missing data.
install.packages('mice')
2. Load the file into Power BI Desktop. From the Home tab, select Get data >
Text/CSV.
3. Select the EuStockMarkets_NA.csv file, and then choose Open. The CSV data is
displayed in the Text/CSV file dialog.
4. Select Load to load the data from the file. After Power BI Desktop loads the data,
the new table appears in the Fields pane.
5. To open Power Query Editor, from the Home ribbon select Transform data.
6. From the Transform tab, select Run R script. The Run R script editor appears. Rows
15 and 20 have missing data, as do other rows you can't see in the image. The
following steps show how R completes those rows for you.
7. For this example, enter the following script code in the Script box of the Run R
script window.
library(mice)
tempData <- mice(dataset,m=1,maxit=50,meth='pmm',seed=100)
completedData <- complete(tempData,1)
output <- dataset
output$completedValues <- completedData$"SMI missing values"
Note
You might need to overwrite a variable named output to properly create the
new semantic model with the filters applied.
8. Select OK. Power Query Editor displays a warning about data privacy.
9. Inside the warning message, select Continue. In the Privacy levels dialog that
appears, set all data sources to Public for the R scripts to work properly in the
Power BI service.
For more information about privacy settings and their implications, see Power BI
Desktop privacy levels.
When you run the script, you see the following result:
When you select Table next to Output in the table that appears, the table is
presented, as shown in the following image.
Notice the new column in the Fields pane called completedValues. The SMI
missing values column has a few missing data elements. Take a look at how R
handles that in the next section.
With just five lines of R script, Power Query Editor filled in the missing values with a
predictive model.
You can save all completed visuals in one Power BI Desktop .pbix file and use the data
model and its R scripts in the Power BI service.
Note
You can download a .pbix file with all these steps completed.
After you upload the .pbix file to the Power BI service, you need to take other steps to
enable service data refresh and updated visuals:
Enable scheduled refresh for the semantic model: To enable scheduled refresh for
the workbook containing your semantic model with R scripts, see Configuring
scheduled refresh. This article also includes information about on-premises data
gateways.
All R data source settings must be set to Public. All other steps in a Power Query
Editor query must also be public.
To get to the data source settings, in Power BI Desktop, select File > Options and
settings > Data source settings.
In the Data source settings dialog, select one or more data sources, and then
select Edit Permissions. Set the Privacy Level to Public.
Related content
There are all sorts of things you can do with R and custom queries. Explore and shape
your data just the way you want it to appear.
Use an external R IDE with Power BI
Article • 02/26/2025
With Power BI Desktop, you can use your external R IDE (Integrated Development
Environment) to create and refine R scripts, then use those scripts in Power BI.
Launch your external R IDE from Power BI Desktop and have your data automatically
imported and displayed in the R IDE. From there, you can modify the script in that
external R IDE, then paste it back into Power BI Desktop to create Power BI visuals and
reports. Specify which R IDE you would like to use, and have it launch automatically from
within Power BI Desktop.
Requirements
To use this feature, you need to install an R IDE on your local computer. Power BI
Desktop doesn't include, deploy, or install the R engine, so you must separately install R
on your local computer. You can choose which R IDE to use, with the following options:
You can install your favorite R IDE, many of which are available for free, such as the
CRAN Repository .
Power BI Desktop also supports R Studio and Visual Studio 2015 with R Tools for
Visual Studio editors.
You can also install a different R IDE and have Power BI Desktop launch that R IDE
by doing one of the following:
You can associate .R files with the external IDE you want Power BI Desktop to
launch.
You can specify the .exe that Power BI Desktop should launch by selecting
Other from the R Script Options section of the Options dialog. You can bring up
the Options dialog by going to File > Options and settings > Options.
If you have multiple R IDEs installed, you can specify which will be launched by selecting
it from the Detected R IDEs drop-down in the Options dialog.
By default, Power BI Desktop will launch R Studio as the external R IDE if it's installed on
your local computer; if R Studio isn't installed and you have Visual Studio 2015 with R
Tools for Visual Studio, that will be launched instead. If neither of those R IDEs is
installed, the application associated with .R files is launched.
And if no .R file association exists, it's possible to specify a path to a custom IDE in the
Browse to your preferred R IDE section of the Options dialog. You can also launch a
different R IDE by selecting the Settings gear icon beside the Edit script in external IDE
arrow icon, in Power BI Desktop.
Launch an R IDE from Power BI Desktop
To launch an R IDE from Power BI Desktop, take the following steps:
2. When script visuals are enabled, you can select an R visual from the Visualizations
pane, which creates a blank R visual that's ready to display the results of your
script. The R script editor pane also appears.
3. Select some fields from the Fields pane that you want to work with. If you haven't
enabled script visuals yet, you are prompted to do so.
4. Now you can select the fields you want to use in your R script. When you select a
field, the R script editor field automatically creates script code based on the field
or fields you select. You can either create (or paste) your R script directly in the R
script editor pane, or you can leave it empty.
5. You can now launch your R IDE directly from Power BI Desktop. Select the Edit
script in external IDE button, found on the right side of the R script editor title
bar, as shown below.
Note
Power BI Desktop adds the first three lines of the script so it can import your
data from Power BI Desktop once you run the script.
7. Any script you created in the R script editor pane of Power BI Desktop appears
starting in line 4 in your R IDE. At this point, you can create your R script in the R
IDE. Once your R script is complete in your R IDE, you need to copy and paste it
back into the R script editor pane in Power BI Desktop, excluding the first three
lines of the script that Power BI Desktop automatically generated. Don't copy the
first three lines of script back into Power BI Desktop, those lines were only used to
import your data to your R IDE from Power BI Desktop.
Known limitations
Launching an R IDE directly from Power BI Desktop has a few limitations:
Automatically exporting your script from your R IDE into Power BI Desktop isn't
supported.
R Client editor (RGui.exe) isn't supported, because the editor itself doesn't support
opening files.
Related content
Take a look at the following additional information about R in Power BI.
Running R Scripts in Power BI Desktop
Create Power BI visuals using R
Create visuals by using R packages in the
Power BI service
05/30/2025
You can use the powerful R programming language to create visuals in the Power BI service.
The Power BI service supports almost a thousand packages.
The following sections provide an alphabetical table of which R packages are supported in
Power BI, and which aren't. For more information about R in Power BI, see the R visuals article.
Package Version
abc 2.2.1
abc.data 1.1
abind 1.4-5
acepack 1.4.2
actuar 3.3-4
ade4 1.7-22
adegenet 2.1.10
AdMit 2.1.9
AER 1.2-13
agricolae 1.3-7
AlgDesign 1.2.1
alluvial 0.1-2
andrews 1.1.2
anomalize 0.3.0
anytime 0.3.9
aod 1.3.3
apcluster 1.4.13
ape 5.8
aplpack 1.3.5
approximator 1.2-8
arm 1.14-4
arsenal 3.6.3
arules 1.7-8
arulesViz 1.5.3
ash 1.0-15
askpass 1.2.0
assertthat 0.2.1
audio 0.1-11
autocogs 0.1.4
automap 1.1-12
aweek 1.0.3
BACCO 2.1-0
backports 1.5.0
BaM 1.0.3
BAS 1.7.1
base64 2.0.1
base64enc 0.1-3
BayesDA 2012.04-1
BayesFactor 0.9.12-4.7
bayesGARCH 2.1.10
bayesm 3.1-6
bayesmix 0.7-6
bayesplot 1.11.1
bayesQR 2.4
bayesSurv 3.7
bayestestR 0.14.0
BayesTree 0.3-1.5
BayesX 0.3-3
BCBCSF 1.0-1
BDgraph 2.73
beanplot 1.3.1
beepr 2
beeswarm 0.4.0
benford.analysis 0.1.5
BenfordTests 1.2.0
bfp 0.0-48
BH 1.84.0-0
bibtex 0.5.1
biglm 0.9-3
bindr 0.1.1
bindrcpp 0.2.3
binom 1.1-1.1
BiocManager 1.30.25
bit 4.0.5
bit64 4.0.5
bitops 1.0-8
bizdays 1.0.16
blandr 0.6.0
blme 1.0-5
blob 1.2.4
BLR 1.6
BMA 3.18.17
bmp 0.3
BMS 0.3.5
bnlearn 4.8.3
boa 1.1.8-2
BoolNet 2.1.9
Boom 0.9.15
BoomSpikeSlab 1.2.6
boot 1.3-31
bootstrap 2019.6
Boruta 8.0.0
bqtl 1.0-36
BradleyTerry2 1.1-2
brew 1.0-10
brglm 0.7.2
brio 1.1.5
broom 1.0.6
broom.helpers 1.17.0
broom.mixed 0.2.9.5
bslib 0.8.0
bspec 1.6
bspmma 0.1-2
bsts 0.9.10
bupaR 0.5.4
C50 0.1.8
ca 0.71.1
cachem 1.1.0
Cairo 1.6-2
cairoDevice 2.28.2.2
calibrate 1.7.7
calibrator 1.2-8
callr 3.7.6
car 3.1-2
carData 3.0-5
cards 0.2.2
caret 6.0-94
catnet 1.16.1
caTools 1.18.3
cclust 0.6-26
cellranger 1.1.0
ChainLadder 0.2.19
changepoint 2.2.4
checkmate 2.3.2
checkpoint 1.0.2
chk 0.9.2
choroplethrMaps 1.0.1
chron 2.3-61
circlize 0.4.16
Ckmeans.1d.dp 4.3.5
class 7.3-22
classInt 0.4-10
cli 3.6.3
ClickClust 1.1.6
clickstream 1.3.3
clipr 0.8.0
clock 0.7.1
clue 0.3-65
cluster 2.1.6
clv 0.3-2.4
cmprsk 2.2-12
coda 0.19-4.1
codetools 0.2-20
coefplot 1.2.8
coin 1.4-3
collapsibleTree 0.1.8
colorRamps 2.3.4
colorspace 2.1-1
colourpicker 1.3.0
colourvalues 0.3.9
combinat 0.0-8
commonmark 1.9.1
compositions 2.0-8
CompQuadForm 1.4.3
confintr 1.0.2
conflicted 1.2.0
conquer 1.3.3
contfrac 1.1-12
CORElearn 1.57.3
corpcor 1.6.10
corrgram 1.14
corrplot 0.94
covr 3.6.4
cowplot 1.1.3
cplm 0.7-12
cpp11 0.5.0
crayon 1.5.2
credentials 2.0.1
crosstalk 1.2.1
crul 1.5.0
ctv 0.9-5
cubature 2.0.4.6
Cubist 0.4.4
curl 5.2.1
cvar 0.5
CVST 0.2-3
cvTools 0.3.3
d3heatmap 0.6.1.2
d3Network 0.5.2.1
d3r 1.1.0
data.table 1.15.4
data.tree 1.1.0
datasauRus 0.1.8
datawizard 0.12.3
date 1.2-42
DBI 1.2.3
dbplyr 2.5.0
dbscan 1.2-0
dclone 2.3-2
dcurver 0.9.2
ddalpha 1.3.15
deal 1.2-42
debugme 1.2.0
decido 0.3.0
deepnet 0.2.1
deldir 2.0-4
dendextend 1.17.1
DEoptimR 1.1-3
Deriv 4.1.3
desc 1.4.3
descr 1.1.8
deSolve 1.4
devtools 2.4.5
diagram 1.6.5
DiagrammeR 1.0.11
DiagrammeRsvg 0.1
dials 1.3.0
DiceDesign 1.1
dichromat 2.0-0.1
diffobj 0.3.5
digest 0.6.33
dimRed 0.2.6
diptest 0.77-1
distcrete 1.0.3
distributional 0.4.0
DistributionUtils 0.6-1
distrom 1.0.1
dlm 1.1-6
DMwR 0.4.1
doBy 4.6.22
doFuture 1.0.1
doParallel 1.0.17
doSNOW 1.0.20
dotCall64 1.1-1
downlit 0.4.4
downloader 0.4
dplyr 1.1.4
DRR 0.0.4
dse 2020.2-1
DT 0.33
dtplyr 1.3.1
dtt 0.1-2
dtw 1.23-1
dygraphs 1.1.1.6
dynlm 0.3-6
e1071 1.7-14
earth 5.3.3
EbayesThresh 1.4-12
ebdbNet 1.2.8
ecm 7.2.0
edeaR 0.9.4
effects 4.2-2
effectsize 0.8.9
egg 0.4.5
ellipse 0.5.0
ellipsis 0.3.2
elliptic 1.4-0
emmeans 1.10.4
emulator 1.2-24
energy 1.7-12
english 1.2-6
ensembleBMA 5.1.8
entropy 1.3.1
epitools 0.5-10.1
epitrix 0.4.0
estimability 1.5
eulerr 7.0.2
evaluate 0.22
evd 2.3-7
evdbayes 1.1-3
eventdataR 0.3.1
exactRankTests 0.8-35
expint 0.1-8
expm 1.0-0
extraDistr 1.10.0
extrafont 0.19
extrafontdb 1.0
extremevalues 2.3.4
ez 4.4-0
factoextra 1.0.7
FactoMineR 2.11
fansi 1.0.5
faoutlier 0.7.6
farver 2.1.2
fastICA 1.2-5.1
fastmap 1.2.0
fastmatch 1.1-4
fBasics 4041.97
fda 6.1.8
fdrtool 1.2.18
fds 1.8
fGarch 4033.92
fields 16.2
filehash 2.4-6
filelock 1.0.3
FinCal 0.6.3
fitdistrplus 1.2-1
flashClust 1.01-2
flexclust 1.4-2
flexmix 2.3-19
float 0.3-2
FME 1.3.6.3
fmsb 0.7.6
FNN 1.1.4
fontawesome 0.5.2
fontBitstreamVera 0.1.1
fontLiberation 0.1.0
fontquiver 0.2.1
forcats 1.0.0
foreach 1.5.2
forecast 8.23.0
forecastHybrid 5.0.19
foreign 0.8-87
formatR 1.14
formattable 0.2.1
Formula 1.2-5
fpc 2.2-12
fracdiff 1.5-3
fs 1.6.4
fTrading 3042.79
fUnitRoots 4040.81
furrr 0.3.1
futile.logger 1.4.3
futile.options 1.0.1
future 1.34.0
future.apply 1.11.2
gam 1.22-4
gamlr 1.13-8
gamlss 5.4-22
gamlss.data 6.0-6
gamlss.dist 6.1-1
gargle 1.5.2
gbm 2.2.2
gbRd 0.4.12
gbutils 0.5
gclus 1.3.2
gdalUtils 2.0.3.2
gdata 3.0.0
gdtools 0.4.0
gee 4.13-27
genalg 0.2.1
generics 0.1.3
genetics 1.3.8.1.3
GenSA 1.1.14
geojson 0.3.5
geojsonio 0.11.3
geojsonlint 0.4.0
geojsonsf 2.0.3
geometries 0.2.4
geometry 0.4.7
geoR 1.9-4
geosphere 1.5-18
gert 2.1.1
gfonts 0.2.0
GGally 2.2.1
ggalt 0.4.0
gganimate 1.0.9
ggcorrplot 0.1.4.1
ggdendro 0.2.0
ggeffects 1.7.1
ggExtra 0.10.1
ggfittext 0.10.2
ggforce 0.4.2
ggformula 0.12.0
ggfortify 0.4.17
ggfun 0.1.6
gghighlight 0.4.1
ggimage 0.3.3
ggiraph 0.8.10
ggjoy 0.4.1
ggm 2.3
ggmap 4.0.0
ggmcmc 1.5.1.1
ggplot2 3.5.1
ggplot2movies 0.0.1
ggplotify 0.1.2
ggpmisc 0.6.0
ggpp 0.5.8-1
ggpubr 0.6.0
ggQC 0.0.31
ggRandomForests 2.2.1
ggraph 2.2.1
ggrepel 0.9.5
ggridges 0.5.6
ggsci 3.2.0
ggsignif 0.6.4
ggsoccer 0.1.7
ggstance 0.3.7
ggstats 0.6.0
ggtern 3.5.0
ggtext 0.1.2
ggthemes 5.1.0
gh 1.4.1
gistr 0.9.0
git2r 0.33.0
gitcreds 0.1.2
glasso 1.11
glmmTMB 1.1.9
glmnet 4.1-8
GlobalOptions 0.1.2
globals 0.16.3
glue 1.6.2
gmodels 2.19.1
gmp 0.7-5
gnm 1.1-5
goftest 1.2-3
googledrive 2.1.1
googlePolylines 0.8.4
googlesheets4 1.1.1
googleVis 0.7.3
gower 1.0.1
GPArotation 2024.3-1
GPfit 1.0-8
gplots 3.1.3.1
graphlayouts 1.1.1
greybox 2.0.2
grid 4.3.3
gridBase 0.4-7
gridExtra 2.3
gridGraphics 0.5-1
gridSVG 1.7-5
gridtext 0.1.5
grImport 0.9-7
grImport2 0.3-3
grpreg 3.5.0
gsl 2.1-8
gss 2.2-7
gstat 2.1-2
gsubfn 0.7
gtable 0.3.5
gtools 3.9.5
gtrendsR 1.5.1
gWidgets 0.0-54.2
gWidgets2 1.0-9
gWidgets2tcltk 1.0-8
gWidgetsRGtk2 0.0-86.1
gWidgetstcltk 0.0-55.1
haplo.stats 1.9.5.1
hardhat 1.4.0
hash 2.2.6.3
haven 2.5.4
hbsae 1.2
HDInterval 0.2.4
hdrcde 3.4
heatmaply 1.5.0
here 1.0.1
hexbin 1.28.4
hflights 0.1
HH 3.1-52
highcharter 0.9.4
highr 0.11
HistData 0.9-1
Hmisc 5.1-3
hms 1.1.3
hoardr 0.5.4
hrbrthemes 0.8.7
HSAUR 1.3-10
htmlTable 2.4.3
htmltools 0.5.8.1
htmlwidgets 1.6.4
hts 6.0.3
httpcode 0.3.0
httpuv 1.6.15
httr 1.4.7
httr2 1.0.3
huge 1.3.5
hunspell 3.0.4
hydroTSM 0.7-0
hypergeo 1.2-13
IBrokers 0.10-2
ids 1.0.1
ifultools 2.0-26
igraph 2.0.3
imager 1.0.2
imputeTS 3.3
incidence 1.7.5
infer 1.0.7
influenceR 0.1.5
ini 0.3.1
inline 0.3.19
insight 0.20.4
interp 1.1-6
intervals 0.15.5
inum 1.0-5
investr 1.4.2
ipred 0.9-15
IRdisplay 1.1
IRkernel 1.3.2
irlba 2.3.5.1
irr 0.84.1
isoband 0.2.7
ISOcodes 2024.02.12
iterators 1.0.14
janeaustenr 1.0.0
janitor 2.2.0
jmvcore 2.4.7
jomo 2.7-6
jpeg 0.1-10
jqr 1.3.4
jquerylib 0.1.4
jsonify 1.2.2
jsonlite 1.8.7
jsonvalidate 1.3.2
jtools 2.3.0
kableExtra 1.4.0
Kendall 2.2.1
kernlab 0.9-33
KernSmooth 2.23-24
kinship2 1.9.6.1
kknn 1.3.1
klaR 1.7-3
km.ci 0.5-6
KMsurv 0.1-5
knitr 1.48
ks 1.14.2
labeling 0.4.3
labelled 2.13.0
laeken 0.5.3
Lahman 11.0-0
lambda.r 1.2.4
lars 1.3
later 1.3.2
latex2exp 0.9.6
lattice 0.22-6
latticeExtra 0.6-30
lava 1.8.0
lavaan 0.6-18
lazyeval 0.2.2
lda 1.5.2
leafem 0.2.3
leaflet 2.2.2
leaflet.esri 1.0.0
leaflet.extras 2.0.1
leaflet.providers 2.0.0
leafpop 0.1.0
leafsync 0.1.0
leaps 3.2
LearnBayes 2.15.1
lexicon 1.2.1
lgr 0.4.4
lhs 1.2.0
libcoin 1.0-10
LiblineaR 2.10-23
LICORS 0.2.0
lifecycle 1.0.3
likert 1.3.5
limSolve 1.5.7.1
linelist 1.1.4
linprog 0.9-4
listenv 0.9.1
lm.beta 1.7-2
lme4 1.1-35.5
lmm 1.4
lmodel2 1.7-3
lmtest 0.9-40
lobstr 1.1.2
locfit 1.5-9.9
locpol 0.8.0
LogicReg 1.6.6
loo 2.8.0
lpSolve 5.6.20
lsa 0.73.3
lsmeans 2.30-0
lubridate 1.9.3
lwgeom 0.2-14
magic 1.6-1
magick 2.8.4
magrittr 2.0.3
manipulateWidget 0.11.1
MAPA 2.0.7
mapdata 2.3.1
mapdeck 0.3.5
mapproj 1.2.11
maps 3.4.2
maptools 1.1-8
maptree 1.4-8
mapview 2.11.2
marima 2.2
markdown 1.13
MASS 7.3-60.0.1
Matching 4.10-14
MatchIt 4.5.5
matchmaker 0.1.1
mathjaxr 1.6-0
Matrix 1.6-5
matrixcalc 1.0-6
MatrixExtra 0.1.15
MatrixModels 0.5-3
matrixStats 1.4.0
maxLik 1.5-2.1
maxstat 0.7-25
mboost 2.9-11
mclust 6.1.1
mcmc 0.9-8
MCMCglmm 2.36
MCMCpack 1.7-1
mda 0.5-4
memoise 2.0.1
merTools 0.6.2
meta 7.0-0
metadat 1.2-0
metafor 4.6-0
mgcv 1.9-1
mgsub 1.7.3
mi 1.1
mice 3.16.0
microbenchmark 1.5.0
mime 0.12
miniCRAN 0.3.0
miniUI 0.1.1.1
minpack.lm 1.2-4
minqa 1.2.8
mirt 1.41
misc3d 0.9-1
miscTools 0.6-28
mitml 0.4-5
mitools 2.4
mixtools 2.0.0
mlapi 0.1.1
mlbench 2.1-5
mlogitBMA 0.1-7
mnormt 2.1.1
MNP 3.1-5
modeldata 1.4.0
modelenv 0.1.1
ModelMetrics 1.2.2.2
modelr 0.1.11
modeltools 0.2-23
mombf 3.5.4
moments 0.14.1
monomvn 1.9-20
monreg 0.1.4.1
mosaic 1.9.1
mosaicCore 0.9.4.0
mosaicData 0.20.4
msir 1.3.3
msm 1.7.1
multcomp 1.4-26
multcompView 0.1-10
multicool 1.0.1
munsell 0.5.1
mvoutlier 2.1.1
mvtnorm 1.3-1
NADA 1.6-1.1
nanoparquet 0.3.1
NbClust 3.0.1
ncvreg 3.14.3
network 1.18.2
networkD3 0.4
neuralnet 1.44.2
ngram 3.2.3
nlme 3.1-165
nloptr 2.1.1
NLP 0.3-0
nls.multstart 1.3.0
NMF 0.21.0
nnet 7.3-19
nnls 1.5
nortest 1.0-4
numbers 0.8-5
numDeriv 2016.8-1.1
numform 0.7.0
OceanView 1.0.7
openair 2.18-2
openssl 2.2.1
ordinal 2023.12-4.1
osmar 1.1-7
outbreaks 1.9.0
outliers 0.15
packcircles 0.3.6
padr 0.6.2
pan 1.9
pander 0.6.5
parallelly 1.38.0
parameters 0.22.2
parsnip 1.2.1
partitions 1.10-7
party 1.3-17
partykit 1.2-22
patchwork 1.2.0
pbapply 1.7-2
pbdZMQ 0.3-10
pbivnorm 0.6.0
pbkrtest 0.5.3
PCAmixdata 3.1
pcaPP 2.0-5
pdc 1.0.3
pegas 1.3
performance 0.12.3
PerformanceAnalytics 2.0.4
permute 0.9-7
perry 0.3.1
petrinetR 0.3.0
pheatmap 1.0.12
pillar 1.9.0
pixmap 0.4-13
pkgbuild 1.4.4
pkgcache 2.2.2
pkgconfig 2.0.3
pkgdepends 0.7.2
pkgdown 2.1.0
pkgload 1.4.0
pkgmaker 0.32.10
platetools 0.1.7
plogr 0.2.0
plot3D 1.4.1
plot3Drgl 1.0.4
plotly 4.10.4
plotmo 3.6.4
plotrix 3.8-4
pls 2.8-4
plyr 1.8.9
png 0.1-8
polspline 1.1.25
polyclip 1.10-7
polylabelr 0.2.0
polynom 1.4-1
posterior 1.6.0
ppcor 1.1
prabclus 2.3-3
pracma 2.4.4
praise 1.0.0
precrec 0.14.4
prediction 0.3.18
PresenceAbsence 1.1.11
prettyunits 1.2.0
pROC 1.18.5
processmapR 0.5.5
processmonitR 0.1.0
processx 3.8.4
prodlim 2024.06.25
profileModel 0.6.1
profvis 0.3.8
progress 1.2.3
progressr 0.14.0
proj4 1.0-14
promises 1.3.0
prophet 1.0
proto 1.0.0
protolite 2.3.0
proxy 0.4-27
pryr 0.1.6
ps 1.7.7
pscl 1.5.9
psych 2.4.3
purrr 1.0.2
pwr 1.3-0
qap 0.1-2
qcc 2.7
qdapDictionaries 1.0.7
qdapRegex 0.7.8
qdapTools 1.3.7
qgraph 1.9.8
qicharts 0.5.8
qicharts2 0.7.5
quadprog 1.5-8
quanteda 3.3.1
quantmod 0.4.26
quantreg 5.98
questionr 0.7.8
QuickJSR 1.3.1
qvcalc 1.0.3
R.cache 0.16.0
R.matlab 3.7.0
R.methodsS3 1.8.2
R.oo 1.26.0
R.utils 2.12.3
r2d3 0.2.6
R2HTML 2.3.4
R2jags 0.8-5
R2OpenBUGS 3.2-3.2.1
R2WinBUGS 2.1-22.1
R6 2.5.1
ragg 1.3.2
rainbow 3.8
ramps 0.6.18
RandomFields 3.3.14
RandomFieldsUtils 1.2.5
randomForest 4.7-1.1
randomForestSRC 3.3.1
ranger 0.16.0
RApiDatetime 0.0.9
rapidjsonr 1.2.0
rappdirs 0.3.3
raster 3.6-26
rattle 5.5.1
rayimage 0.10.0
rayshader 0.24.10
rayvertex 0.11.4
rbenchmark 1.0.0
rbibutils 2.2.16
Rblpapi 0.3.14
rbokeh 0.5.2
rcmdcheck 1.4.0
RColorBrewer 1.1-3
Rcpp 1.0.13
RcppArmadillo 14.0.0-1
RcppDE 0.1.7
RcppEigen 0.3.4.0.2
RcppExamples 0.1.9
RcppParallel 5.1.9
RcppProgress 0.4.2
RcppRoll 0.3.1
RcppThread 2.1.7
RcppTOML 0.2.2
RCurl 1.98-1.16
Rdpack 2.6.1
readbitmap 0.1.5
readr 2.1.5
readxl 1.4.3
recipes 1.1.0
Redmonder 0.2.0
registry 0.5-1
relaimpo 2.2-7
relimp 1.0-5
rematch 2.0.0
rematch2 2.1.2
remotes 2.5.0
Renext 3.1-4
repr 1.1.6
reprex 2.1.1
reshape 0.8.9
reshape2 1.4.4
reticulate 1.39.0
rex 1.2.1
rFerns 5.0.0
rfm 0.3.0
rgdal 1.6-7
rgeos 0.6-4
rgexf 0.16.3
rgl 1.3.1
RgoogleMaps 1.5.1
RGraphics 3.0-2
RGtk2 2.20.36.3
RhpcBLASctl 0.23-42
RInside 0.2.18
rio 1.2.2
rjags 4-16
rjson 0.2.21
RJSONIO 1.3-1.9
rlang 1.1.4
rlecuyer 0.3-8
rlist 0.4.6.2
rmapshaper 0.5.0
rmarkdown 2.28
Rmisc 1.5.1
Rmpfr 0.9-5
rms 6.8-1
RMySQL 0.10.28
rngtools 1.5.2
robCompositions 2.4.1
robfilter 4.1.5
robustbase 0.99-4
robustHD 0.8.1
ROCR 1.0-11
RODBC 1.3-23
Rook 1.2
rootSolve 1.8.2.4
roxygen2 7.3.2
rpart 4.1.23
rpart.plot 3.1.2
rpivotTable 0.3.0
rprojroot 2.0.4
RPushbullet 0.3.4
rrcov 1.7-6
rsample 1.2.1
rsdmx 0.6-3
RSGHB 1.2.2
RSNNS 0.4-17
Rsolnp 1.16
rsparse 0.5.2
RSpectra 0.16-2
RSQLite 2.3.7
rstan 2.32.6
rstantools 2.4.0
rstatix 0.7.2
rstudioapi 0.16.0
rsvg 2.6.0
RTextTools 1.4.3
Rttf2pt1 1.3.12
RUnit 0.4.33
runjags 2.2.2-4
Runuran 0.38
rvcheck 0.2.1
rversions 2.1.2
rvest 1.0.4
rworldmap 1.3-8
rworldxtra 1.01
s2 1.1.7
SampleSizeMeans 1.2.3
SampleSizeProportions 1.1.3
sandwich 3.1-0
sas7bdat 0.8
sass 0.4.9
satellite 1.0.5
sbgcop 0.98
scales 1.3.0
scatterplot3d 0.3-44
sciplot 1.2-0
segmented 2.1-2
selectr 0.4-2
sem 3.1-16
sentimentr 2.9.0
seqinr 4.2-36
seriation 1.5.6
servr 0.31
sessioninfo 1.2.2
setRNG 2024.2-1
sets 1.0-25
sf 1.0-16
sfd 0.1.0
sfheaders 0.4.4
sfsmisc 1.1-19
sftime 0.2-0
sgeostat 1.0-27
shades 1.4.0
shape 1.4.6.1
shapefiles 0.7.2
shiny 1.9.1
shinyBS 0.61.1
shinycssloaders 1.1.0
shinyjs 2.1.0
shinyTime 1.0.3
showtext 0.9-7
showtextdb 3.0
SimDesign 2.17.1
SIS 0.8-8
SixSigma 0.11.1
sjlabelled 1.2.0
sjmisc 2.8.10
sjPlot 2.8.16
sjstats 0.19.0
skmeans 0.2-17
slam 0.1-53
slider 0.3.1
sm 2.2-6.0
smooth 4.0.2
smoothSurv 2.6
sna 2.7-2
snakecase 0.11.1
snow 0.4-4
SnowballC 0.7.1
snowFT 1.6-1
sodium 1.3.1
sourcetools 0.1.7-1
sp 2.1-4
spacefillr 0.3.3
spacetime 1.3-2
spacyr 1.3.0
spam 2.10-0
SparseM 1.84-2
sparsepp 1.22
spatial 7.3-17
spatstat 3.0-7
spatstat.data 3.1-2
spatstat.explore 3.2-6
spatstat.geom 3.2-9
spatstat.linnet 3.1-4
spatstat.model 3.2-10
spatstat.random 3.2-3
spatstat.sparse 3.1-0
spatstat.univar 3.0-1
spatstat.utils 3.1-0
spBayes 0.4-7
spData 2.3.3
spdep 1.3-5
spikeslab 1.1.6
splancs 2.01-45
splines 4.3.3
spls 2.2-3
splus2R 1.3-5
spTimer 3.3.2
sqldf 0.4-11
SQUAREM 2021.1
sROC 0.1-2
stabledist 0.7-2
stabs 0.6-4
StanHeaders 2.32.10
stars 0.6-6
statmod 1.5.0
statnet.common 4.9.0
stepPlr 0.93
stinepack 1.5
stochvol 3.2.4
stopwords 2.3
stringdist 0.9.12
stringi 1.8.4
stringr 1.5.1
strucchange 1.5-4
styler 1.10.3
sugrrants 0.2.9
sunburstR 2.1.8
SuppDists 1.1-9.8
survey 4.4-2
survival 3.7-0
survminer 0.4.9
survMisc 0.5.6
svglite 2.1.3
svmpath 0.97
svUnit 1.0.6
sweep 0.2.5
sys 3.4.2
sysfonts 0.8.9
systemfit 1.1-30
systemfonts 1.1.0
syuzhet 1.0.7
tau 0.0-25
tcltk 4.3.3
tcltk2 1.2-11
TeachingDemos 2.13
tensor 1.5
tensorA 0.36.2.1
terra 1.7-78
terrainmeshr 0.1.0
testthat 3.2.1.1
texreg 1.39.4
text2vec 0.6.4
textcat 1.0-8
textclean 0.9.3
textir 2.0-5
textmineR 3.0.5
textshape 1.7.5
textshaping 0.4.0
tfplot 2021.6-1
tframe 2015.12-1.1
tgp 2.4-23
TH.data 1.1-2
thief 0.3
threejs 0.3.3
tibble 3.2.1
tibbletime 0.1.8
tidycensus 1.6.5
tidygraph 1.3.0
tidymodels 1.2.0
tidyquant 1.0.9
tidyr 1.3.1
tidyselect 1.2.1
tidytext 0.4.2
tidyverse 2.0.0
tiff 0.1-12
tigris 2.1
timechange 0.3.0
timeDate 4032.109
timelineS 0.1.1
timeSeries 4032.109
timetk 2.9.0
timevis 2.1.0
tinytex 0.52
tm 0.7-14
tmap 3.3-4
tmaptools 3.1-1
TMB 1.9.14
tmvnsim 1.0-2
tokenizers 0.3.0
topicmodels 0.2-16
TraMineR 2.2-10
transformr 0.1.5
tree 1.0-43
treemap 2.4-4
treemapify 2.5.6
trelliscopejs 0.2.6
triebeard 0.4.1
trimcluster 0.1-5
truncnorm 1.0-9
TSA 1.3.1
tseries 0.10-57
tsfeatures 1.1.1
tsibble 1.1.5
tsintermittent 1.1
tsoutliers 0.6-10
TSP 1.2-4
TSstudio 0.1.7
TTR 0.24.4
tune 1.2.1
tweedie 2.3.5
tweenr 2.0.3
twitteR 1.1.9
tzdb 0.4.0
ucminf 1.2.2
udpipe 0.8.11
udunits2 0.13.2.1
units 0.8-5
UpSetR 1.4.0
urca 1.3-4
urlchecker 1.0.1
urltools 1.7.3
useful 1.2.6.1
usethis 3.0.0
UsingR 2.0-7
usmap 0.7.1
usmapdata 0.3.0
utf8 1.2.4
uuid 1.1-1
V8 5.0.0
vars 1.6-1
vcd 1.4-12
vctrs 0.6.4
vdiffr 1.0.7
vegan 2.6-8
VennDiagram 1.7.3
VGAM 1.1-11
VIM 6.2.2
vioplot 0.5.0
viridis 0.6.5
viridisLite 0.4.2
visNetwork 2.1.2
vistime 1.2.4
vroom 1.6.5
waldo 0.5.3
warp 0.2.1
waterfalls 1.0.0
wavethresh 4.7.3
webshot 0.5.5
webutils 1.2.1
WeibullR 1.2.1
weights 1.0.4
whisker 0.4.1
widgetframe 0.3.1
withr 3.0.1
wk 0.9.2
wmtsa 2.0-3
wordcloud 2.6
wordcloud2 0.2.1
workflows 1.1.4
workflowsets 1.1.0
writexl 1.5.0
xesreadR 0.2.3
xfun 0.47
xgboost 2.1.1.1
XML 3.99-0.17
xml2 1.3.6
xopen 1.0.1
xplorerr 0.1.2
xtable 1.8-4
xts 0.14.0
yaml 2.3.10
yardstick 1.3.1
yarrr 0.1.5
YieldCurve 5.1
yulab.utils 0.1.7
zCompositions 1.5.0-4
zeallot 0.1.0
zic 0.9.1
zip 2.3.1
zipfR 0.6-70
zoo 1.8-12
MatrixGenerics 1.14.0
sparseMatrixStats 1.14.0
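When preparing an R visual, it can help to check whether the packages and versions you use locally appear in the supported list above. A minimal sketch, assuming the table is available as plain "name version" lines; the helper name and sample entries here are illustrative, not part of any Power BI API:

```python
def parse_package_versions(table_text: str) -> dict[str, str]:
    """Parse 'name version' lines into a {package: version} lookup."""
    versions = {}
    for line in table_text.splitlines():
        parts = line.split()
        # Keep only two-token lines whose second token contains a digit,
        # which skips stray page numbers and "Package Version" header lines.
        if len(parts) == 2 and any(ch.isdigit() for ch in parts[1]):
            name, version = parts
            versions[name] = version
    return versions

# Sample entries copied from the table above.
supported = parse_package_versions("dplyr 1.1.4\nggplot2 3.5.1\nzoo 1.8-12")
print(supported.get("ggplot2"))  # 3.5.1
```

A `None` result from the lookup means the package isn't in the supported list, so the visual would need to avoid that dependency.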
Publish to web isn't supported for R visuals. Reports with R visuals can still be published
publicly, but any R visuals will not render (charts will appear empty).
R visuals aren't supported with Service Principal Profiles for App Owns Data.
R visuals rendered from the Power BI service are subject to a limit of 30 MB. This limit
applies to the total payload of compressed input data and the R script itself. Always check
R visuals after publishing the report to ensure the report will display as expected.
R visuals with HTML and XML packages will fail to render due to Out Of Memory (OOM)
limits. Migrate the visual with the PbiViz tool. The visual must render to be considered
successfully migrated.
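Because the 30 MB ceiling covers the compressed input data plus the script, you can estimate headroom before publishing. A rough pre-flight check, assuming gzip-like compression; the service's actual compression scheme is an implementation detail, so treat this as an approximation, and the function name is illustrative:

```python
import gzip

MAX_PAYLOAD_BYTES = 30 * 1024 * 1024  # 30 MB service limit (compressed data + script)

def fits_r_visual_limit(data_bytes: bytes, script_bytes: bytes) -> bool:
    """Approximate check: compressed data plus the script must stay under the limit."""
    compressed = gzip.compress(data_bytes)
    return len(compressed) + len(script_bytes) <= MAX_PAYLOAD_BYTES

csv_data = "\n".join(f"{i},{i * 2}" for i in range(10_000)).encode()
r_script = b"library(ggplot2)\nggplot(dataset, aes(x, y)) + geom_point()\n"
print(fits_r_visual_limit(csv_data, r_script))  # True: this small sample fits easily
```

Even when this check passes, verify the published visual in the service, since the real payload is assembled by Power BI.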
Related content
For more information about R in Power BI, take a look at the following articles:
Connect to Snowflake in the Power BI
service
Article • 09/25/2024
Connecting to Snowflake in the Power BI service differs from other connectors in only
one way. Snowflake has a capability for Microsoft Entra ID, an option for SSO (single
sign-on). Parts of the integration require different administrative roles across Snowflake,
Power BI, and Azure. You can choose to enable Microsoft Entra authentication without
using SSO. Basic authentication works similarly to other connectors in the service.
If you're the Snowflake admin, see Power BI SSO to Snowflake in the Snowflake
documentation.
If you're a Power BI admin, go to the Admin portal section to enable SSO.
If you're a Power BI semantic model creator, go to the Configure a semantic model
with Microsoft Entra ID section to enable SSO.
Admin portal
To enable SSO, a Fabric administrator has to turn on the setting in the Power BI Admin
portal. This setting approves sending Microsoft Entra authentication tokens to
Snowflake from within the Power BI service. This setting is set at an organizational level.
Follow these steps to enable SSO:
2. Select Settings from the page header menu, then select Admin portal.
4. Expand Snowflake SSO, toggle the setting to Enabled, then select Apply.
This step is required to consent to sending your Microsoft Entra token to the Snowflake
servers. After you enable the setting, it can take up to an hour for it to take effect.
For more information including steps for using Microsoft Entra ID, SSO, and Snowflake,
see Data gateway support for single sign-on with Microsoft Entra ID .
For information about how you can use the on-premises data gateway, see What is an
on-premises data gateway?
If you aren't using the gateway, you're all set. If you have Snowflake credentials
configured on your on-premises data gateway but you're only using that data source in
your model, turn the gateway off for that semantic model in the Semantic model
settings.
To turn on SSO for a semantic model:
2. Select the appropriate workspace, then choose Settings from the more options
menu that's located next to the semantic model name.
3. Select Data source credentials and sign in. The semantic model can be signed into
Snowflake with Basic or OAuth2 (Microsoft Entra ID) credentials. By using Microsoft
Entra ID, you can enable SSO in the next step.
4. Select the option End users use their own OAuth2 credentials when accessing
this data source via DirectQuery. This setting enables Microsoft Entra SSO, and each
user's Microsoft Entra credentials are sent for single sign-on.
After these steps are done, users should automatically use their Microsoft Entra
authentication to connect to data from that Snowflake semantic model.
If you choose not to enable SSO, then users refreshing the report will use the credentials
of the user who signed in, like most other Power BI reports.
Troubleshooting
If you run into any issues with the integration, see the Snowflake troubleshooting
guide .
Related content
Data sources for the Power BI service
Connect to semantic models in the Power BI service from Power BI desktop
Connect to Snowflake in Power BI Desktop
Connect to SSAS multidimensional
models in Power BI Desktop
Article • 02/26/2025
With Power BI Desktop, you can access SQL Server Analysis Services (SSAS)
multidimensional models, commonly referred to as SSAS MD.
To connect to an SSAS MD database, select Get data, choose Database > SQL Server
Analysis Services database, and then select Connect:
The Power BI service and Power BI Desktop both support SSAS multidimensional models
in live connection mode. You can publish and upload reports that use SSAS
Multidimensional models in live mode to the Power BI service.
Multidimensional object    Tabular object
Cube                       Model
Measure                    Measure
Perspective                Perspective
KPI                        KPI
To help simplify complex models in a multidimensional model, you can define a set of
measures or KPIs in a cube to be located within a display folder. Power BI recognizes
display folders in tabular metadata, and it shows measures and KPIs within the display
folders. KPIs in multidimensional databases support Value, Goal, Status Graphic, and
Trend Graphic.
Dimension attribute type
Multidimensional models also support associating dimension attributes with specific
dimension attribute types. For example, in a Geography dimension, the City, State-
Province, CountryRegion, and Postal Code dimension attributes can have the appropriate
geography types associated with them, and those types are exposed in the tabular
metadata. Power BI recognizes the metadata, enabling you to create map visualizations.
You can recognize these associations by the map icon next to the element in the Field
pane in Power BI.
Power BI can also render images when you provide a field that contains uniform
resource locators (URLs) of the images. You can specify these fields as ImageURL types
in SQL Server Data Tools or in Power BI Desktop, and the type information is then
provided to Power BI in the tabular metadata. Power BI can then retrieve those images
from the URLs and display them in visuals.
Parent-child hierarchies
Multidimensional models support parent-child hierarchies, which are presented as a
hierarchy in the tabular metadata. Each level of the parent-child hierarchy is exposed as
a hidden column in the tabular metadata. The key attribute of the parent-child
dimension isn't exposed in the tabular metadata.
The calculated members of user hierarchies aren't exposed in Power BI. You can still
connect to a cube that contains calculated members on user hierarchies, but you can't
see calculated members that don't meet the constraints listed later in this article.
Security
Multidimensional models support dimension-level and cell-level security by way of
roles. When you connect to a cube with Power BI, you're authenticated and evaluated for
the appropriate permissions. If a user has dimension security applied, that user can't
see the restricted dimension members in Power BI. However, if a user has a cell
security permission that restricts certain cells, that user can't connect to the cube
using Power BI.
Only enterprise and BI editions of SQL Server 2014 support live connections. For
the standard edition of SQL Server, SQL Server 2016 or later is required for live
connections.
Actions and named sets aren't exposed to Power BI. To create visuals and reports,
you can still connect to cubes that also contain actions or named sets.
When Power BI displays metadata for an SSAS model, occasionally you can't
retrieve data from the model. This scenario can occur if you've installed the 32-bit
version of the Microsoft Online Analytical Processing provider, but not the 64-bit
version. Installing the 64-bit version might resolve the issue.
You can't create report level measures when authoring a report that is connected
live to an SSAS multidimensional model. The only measures that are available are
measures defined in the MD model.
Default members
Dimension attributes
Dimension attribute types
Dimension calculated members, which:
must be a single real member when the dimension has more than one attribute;
can't be the key attribute of the dimension unless it's the only attribute; and
can't be a parent-child attribute.
Dimension security
Display folders
Hierarchies
ImageUrls
KPIs
KPI trends
Measures (with or without measure groups)
Measures as variant
Troubleshooting
The following list describes all known issues when connecting to SQL Server Analysis
Services.
Error: Couldn't load model schema. This error usually occurs when the user
connecting to Analysis Services doesn't have access to the database/cube.
Connect to Analysis Services tabular
data in Power BI Desktop
Article • 07/30/2024
With Power BI Desktop, there are two ways you can connect to and get data from your
SQL Server Analysis Services tabular models:
Explore by using a live connection: When you use a live connection, items in your
tabular model or perspective, like tables, columns, and measures, appear in your Power
BI Desktop Data pane list. You can use Power BI Desktop's advanced visualization and
report tools to explore your tabular model in new, highly interactive ways.
When you connect live, no data from the tabular model is imported into Power BI
Desktop. Each time you interact with a visualization, Power BI Desktop queries the
tabular model and calculates the results that you see. You're always looking at the latest
data that is available in the tabular model, either from the last processing time, or from
DirectQuery tables available in the tabular model.
Keep in mind that tabular models are highly secure. Items that appear in Power BI
Desktop depend on your permissions for the tabular model that you're connected to.
When you've created dynamic reports in Power BI Desktop, you can share them by
publishing to your Power BI workspace. When you publish a Power BI Desktop file with a
live connection to a tabular model to your workspace, an on-premises data gateway
must be installed and configured by an administrator. For more information, see On-
premises data gateway.
Select items and import into Power BI Desktop: When you connect with this option,
you can select items like tables, columns, and measures in your tabular model or
perspective and load them into a Power BI Desktop model. Use Power BI Desktop's
Power Query Editor to further shape what you want and its modeling features to further
model the data. Because no live connection between Power BI Desktop and the tabular
model is maintained, you can then explore your Power BI Desktop model offline or
publish to your Power BI workspace.
3. In the SQL Server Analysis Services database window, enter the Server name,
choose a connection mode, and then select OK.
4. This step in the Navigator window depends on the connection mode you selected:
If you chose to select items and get data, select a tabular model or
perspective, and then select a particular table or column to load. To shape
your data before loading, select Transform data to open Power Query Editor.
When you’re ready, select Load to import the data into Power BI Desktop.
Answer: It depends. If you use Power BI Desktop to connect live to a tabular model, but
have no intention to publish to your Power BI workspace, you don't need a gateway. On
the other hand, if you do intend on publishing to your workspace, a data gateway is
necessary to ensure secure communication between the Power BI service and your on-
premises Analysis Services server. Be sure to talk to your Analysis Services server
administrator before installing a data gateway.
If you choose to select items and get data, you import tabular model data directly into
your Power BI Desktop file, so no gateway is necessary.
Question: What's the difference between connecting live to a tabular model from the
Power BI service versus connecting live from Power BI Desktop?
Answer: When you connect live to a tabular model from your workspace in the Power BI
service to an Analysis Services database on-premises in your organization, an on-
premises data gateway is required to secure communications between them. When you
connect live to a tabular model from Power BI Desktop, a gateway isn't required because
the Power BI Desktop and the Analysis Services server you’re connecting to are both
running on-premises in your organization. However, if you publish your Power BI
Desktop file to your Power BI workspace, a gateway is required.
Question: If I created a live connection, can I connect to another data source in the
same Power BI Desktop file?
Answer: No. You can't explore live data and connect to another type of data source in
the same file. If you’ve already imported data or connected to a different data source in
a Power BI Desktop file, you need to create a new file to explore live.
Question: If I created a live connection, can I edit the model or query in Power BI
Desktop?
Answer: You can create report level measures in the Power BI Desktop, but all other
query and modeling features are disabled when exploring live data.
Answer: Yes. Your current Windows credentials are used to connect to the Analysis
Services server. You can't use basic or stored credentials in either the Power BI service or
Power BI Desktop when exploring live.
Question: Are there any features of Analysis Services that change the way Power BI
behaves?
Answer: Yes. Depending on the features your tabular model uses, the experience in
Power BI Desktop might change. Some examples include:
You might see measures in the model grouped together at the top of the Data
pane list rather than in tables alongside columns. Don't worry, you can still use
them as normal; it's just easier to find them this way.
If the tabular model has calculation groups defined, you can use them only with
model measures and not with implicit measures you create by adding numeric
fields to a visual. The model might also have had the DiscourageImplicitMeasures
flag set manually, which has the same effect. For more information, see Calculation
groups.
1. Select Transform data > Data source settings from the Home tab.
2. In the Data source settings window, select the database from the list, then select
the Change Source... button.
3. In the SQL Server Analysis Services database window, enter the new Server name,
and then select OK.
Troubleshooting
The following list describes all known issues when connecting to SQL Server Analysis
Services (SSAS) or Azure Analysis Services:
Error: Couldn't load model schema. This error usually occurs when the user
connecting to Analysis Services doesn't have access to the database/model.
Use DirectQuery in Power BI Desktop
Article • 09/06/2024
When you connect to any data source with Power BI Desktop, you can import a copy of
the data. For some data sources, you can also connect directly to the data source
without importing data by using DirectQuery. This article explains the differences
between Import and DirectQuery connectivity modes and tells you how to connect to
data sources using DirectQuery. It also covers the considerations and limitations of
using DirectQuery, such as performance and security.
To determine whether a data source supports DirectQuery, view the full listing of
available data sources found in the article Connectors in Power Query, which also
applies to Power BI. Select the article that describes the data source you're interested in
from the list of supported connectors, then see the section in that connector's article
titled Capabilities supported. If DirectQuery isn't listed in that section for the data
source's article, DirectQuery isn't supported for that data connector.
Here are the differences between using Import and DirectQuery connectivity modes:
Import: A copy of the data from the selected tables and columns imports into
Power BI Desktop. As you create or interact with visualizations, Power BI Desktop
uses the imported data. To see underlying data changes after the initial import or
the most recent refresh, you must import the full semantic model again to refresh
the data.
DirectQuery: No data imports into Power BI Desktop. For relational sources, you
can select tables and columns to appear in the Power BI Desktop Data pane. For
multidimensional sources like SAP Business Warehouse (SAP BW), the dimensions
and measures of the selected cube appear in the Data pane. As you create or
interact with visualizations, Power BI Desktop queries the underlying data source,
so you're always viewing current data.
With DirectQuery, when you create or interact with a visualization, you must query the
underlying source. The time that's needed to refresh the visualization depends on the
performance of the underlying data source. If the data needed to service the request
was recently requested, Power BI Desktop uses the recent data to reduce the time
required to show the visualization. Selecting Refresh from the Home ribbon refreshes all
visualizations with current data.
Many data modeling and data transformation capabilities are available when using
DirectQuery, although with some performance-based limitations. For more information
about DirectQuery benefits, limitations, and recommendations, see DirectQuery in Power BI.
DirectQuery benefits
Some benefits of using DirectQuery include:
DirectQuery lets you build visualizations over very large semantic models, where it
would be infeasible to import all the data with pre-aggregation.
DirectQuery reports always use current data. Seeing underlying data changes
requires you to refresh the data, and reimporting large semantic models to refresh
data could be infeasible.
To connect to a data source by using DirectQuery:
1. In the Home group of the Power BI Desktop ribbon, select Get data, and then
select a data source that DirectQuery supports, such as SQL Server.
2. In the dialog box for the connection, under Data Connectivity mode, select
DirectQuery.
You can publish DirectQuery reports to the Power BI service, but you need to take extra
steps for the Power BI service to open the reports.
To connect the Power BI service to DirectQuery data sources other than Azure SQL
Database, Azure Synapse Analytics (formerly SQL Data Warehouse), Amazon
Redshift, and Snowflake Data Warehouse, install an on-premises data gateway and
register the data source.
If you used DirectQuery with cloud sources like Azure SQL Database, Azure
Synapse, Amazon Redshift, or Snowflake Data Warehouse, you don't need an on-
premises data gateway. You still must provide credentials for the Power BI service
to open the published report. Without credentials, an error occurs when you try to
open a published report or explore a semantic model created with a DirectQuery
connection.
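The gateway rules above can be sketched as a small helper. This is an illustrative summary of the text, not a Power BI API; the source names and the `has_private_ip` flag are assumptions for the example:

```python
# Cloud sources the text says don't require an on-premises data gateway.
CLOUD_SOURCES_WITHOUT_GATEWAY = {
    "Azure SQL Database",
    "Azure Synapse Analytics",
    "Amazon Redshift",
    "Snowflake Data Warehouse",
}

def needs_on_premises_gateway(source: str, has_private_ip: bool = False) -> bool:
    """Return True if a published DirectQuery report over `source`
    requires an on-premises data gateway in the Power BI service."""
    if source not in CLOUD_SOURCES_WITHOUT_GATEWAY:
        return True  # other DirectQuery sources need a registered gateway
    # An Azure SQL Database behind a private IP address still needs a gateway.
    if source == "Azure SQL Database" and has_private_ip:
        return True
    return False
```

Either way, credentials must still be provided before the Power BI service can open the published report.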
To provide credentials for opening the report and refreshing the data:
1. In the Power BI service, go to the workspace and locate the semantic model that
uses DirectQuery in the workspace content list.
2. Select More options (the three horizontal dots) next to the name of the
semantic model, then choose Settings.
3. Under Data source credentials, provide the credentials to connect to the data
source.
Note
If you used DirectQuery with an Azure SQL Database that has a private IP address,
you need to use an on-premises gateway.
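Credentials can also be set programmatically through the Power BI REST API (Gateways - Update Datasource). The following Python sketch only builds the request body for Basic credentials; the endpoint call, authentication, and encryption are omitted. For data sources bound to an on-premises gateway, the credentials string must be encrypted with the gateway's public key, so treat this as an illustration of the payload shape, not a complete implementation:

```python
import json

def basic_credentials_body(username: str, password: str) -> dict:
    """Build an Update Datasource request body with Basic credentials.

    Illustrative only: "None" as the encryption algorithm applies to
    cloud connections; on-premises gateways require RSA-encrypted
    credentials. Values and privacy level here are placeholders.
    """
    credential_data = {
        "credentialData": [
            {"name": "username", "value": username},
            {"name": "password", "value": password},
        ]
    }
    return {
        "credentialDetails": {
            "credentialType": "Basic",
            # The API expects the credential data as a JSON *string*.
            "credentials": json.dumps(credential_data),
            "encryptedConnection": "Encrypted",
            "encryptionAlgorithm": "None",
            "privacyLevel": "Organizational",
        }
    }
```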
Load on the source database also depends on the number of Power BI users who
consume the published report, especially if the report uses row-level security (RLS). The
refresh of a non-RLS dashboard tile shared by multiple users sends a single query to the
database, but refreshing a dashboard tile that uses RLS requires one query per user. The
increased queries significantly increase load and potentially affect performance.
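The query fan-out described above can be estimated with a quick sketch (the function name and inputs are hypothetical, not part of any Power BI API):

```python
def dashboard_refresh_queries(tiles: int, users: int, uses_rls: bool) -> int:
    """Estimate source queries for a dashboard refresh, per the behavior
    described above: a non-RLS tile shared by many users sends a single
    query, while a tile that uses row-level security sends one query
    per user."""
    per_tile = users if uses_rls else 1
    return tiles * per_tile

# 10 tiles shared with 200 users:
print(dashboard_refresh_queries(10, 200, uses_rls=False))  # 10
print(dashboard_refresh_queries(10, 200, uses_rls=True))   # 2000
```

The 200x difference illustrates why RLS dashboards deserve extra attention when sizing the underlying source.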
1 million-row limit
DirectQuery defines a 1 million-row limit for data returned from cloud data sources,
which are any data sources that aren't on-premises. On-premises sources are limited to
a defined payload of about 4 MB per row (depending on the proprietary compression
algorithm) or 16 MB for the entire visual. Premium capacities can set different maximum
row limits, as described in the blog post Power BI Premium new capacity settings.
Power BI creates queries that are as efficient as possible, but some generated queries
might retrieve too many rows from the underlying data source. For example, this
situation can occur with a simple chart that includes a high cardinality column with the
aggregation option set to No Calculation. The visual must have only columns with a
cardinality below 1 million, or must apply the appropriate filters.
The row limit applies only to the rows returned to Power BI, not to aggregations or
calculations that run at the underlying data source. For example, the query that runs
on the data source can aggregate 10 million rows. As long as the data returned to
Power BI is less than 1 million rows, the query can accurately return the results. If the
data is over 1 million rows, Power BI shows an error, except in Premium capacity with
different admin-set limits. The error states: The resultset of a query to external data
source has exceeded the maximum allowed size of '1000000' rows.
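A minimal sketch of the limit's behavior, including the documented error message. The function and its default limit are illustrative, not part of any Power BI API:

```python
DEFAULT_MAX_ROWS = 1_000_000  # Premium capacities can configure other limits

def check_resultset(rows_returned: int, max_rows: int = DEFAULT_MAX_ROWS) -> int:
    """Mimic the limit described above: only the rows *returned* to
    Power BI count, not the rows the source aggregated to produce them."""
    if rows_returned > max_rows:
        raise RuntimeError(
            "The resultset of a query to external data source has exceeded "
            f"the maximum allowed size of '{max_rows}' rows."
        )
    return rows_returned

# A query may aggregate 10 million source rows; as long as the returned
# result stays under the limit, it succeeds:
check_resultset(50_000)
```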
Security considerations
By default, all users who consume a published report in the Power BI service connect to
the underlying data source by using the credentials entered after publication. This
situation is the same as for imported data. All users see the same data, regardless of any
security rules that the underlying source defines.
If you need per-user security implemented with DirectQuery sources, either use RLS or
configure Kerberos constrained authentication against the source. Kerberos isn't
available for all sources. For more information, see Row-level security (RLS) with Power
BI and Configure Kerberos-based SSO from Power BI service to on-premises data
sources.
If the Power Query Editor query is overly complex, an error occurs. To fix the error,
you must either delete the problematic step in Power Query Editor, or switch to
Import mode. Multidimensional sources like SAP BW can't use the Power Query
Editor.
For table or matrix visualizations, there's a 125-column limit for results that return
more than 500 rows from DirectQuery sources. These results display a scroll bar in
the table or matrix that lets you fetch more data, and in that situation the maximum
number of columns in the table or matrix is 125. If you must include more than 125
columns in a single table or matrix, consider creating measures that use MIN, MAX,
FIRST, or LAST, because measures don't count against this maximum.
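The column-counting rule above can be sketched as follows; the helper and its parameters are hypothetical, and the key point is that MIN/MAX/FIRST/LAST measures are excluded from the count:

```python
COLUMN_LIMIT = 125   # table/matrix limit for DirectQuery results over 500 rows
ROW_THRESHOLD = 500

def table_visual_allowed(plain_columns: int, minmax_measures: int,
                         rows: int) -> bool:
    """Check the table/matrix limit described above.

    `minmax_measures` (measures built with MIN, MAX, FIRST, or LAST)
    intentionally doesn't factor into the check, because such measures
    don't count toward the 125-column maximum; only plain columns do.
    """
    if rows <= ROW_THRESHOLD:
        return True  # the limit only applies to results over 500 rows
    return plain_columns <= COLUMN_LIMIT
```

So a visual with 120 columns plus 20 MIN/MAX measures stays within the limit, while 130 plain columns doesn't.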
You can't change from Import to DirectQuery mode. You can switch from
DirectQuery mode to Import mode if you import all the necessary data. It's not
possible to switch back, mostly because of the feature set that DirectQuery doesn't
support. DirectQuery models over multidimensional sources, like SAP BW, can't be
switched from DirectQuery to Import mode either, because of the different
treatment of external measures.
Calculated tables and calculated columns that reference a DirectQuery table from a
data source with single sign-on (SSO) authentication are supported in the Power BI
service with an assigned shareable cloud connection and/or granular access
control.
Related content
DirectQuery in Power BI
Data sources supported by DirectQuery
DirectQuery and SAP Business Warehouse
DirectQuery and SAP HANA
What is an on-premises data gateway?
Use composite models in Power BI Desktop
Connect to SAP Business Warehouse by
using DirectQuery in Power BI
Article • 02/26/2025
You can connect to SAP Business Warehouse (SAP BW) data sources directly using
DirectQuery. Given the OLAP/multidimensional nature of SAP BW, there are many
important differences between DirectQuery over SAP BW versus relational sources like
SQL Server. These differences are summarized as follows:
In addition, it's important to understand that many features of SAP BW aren't
supported in Power BI, and that because of the nature of the public interface to SAP
BW, there are important cases where the results seen through Power BI don't match
those seen when using an SAP tool. These limitations and behavior differences,
described later in this article, should be carefully reviewed to ensure that the results
seen through Power BI, as returned by the SAP public interface, are interpreted
correctly.
Note
The ability to use DirectQuery over SAP BW was in preview until the March 2018
update to Power BI Desktop. During the preview, feedback and suggested
improvements prompted a change that impacts reports that were created using
that preview version. Now that General Availability (GA) of DirectQuery over SAP
BW has released, you must discard any existing (preview-based) reports using
DirectQuery over SAP BW that were created with the pre-GA version.
In reports created with the pre-GA version of DirectQuery over SAP BW, invoking
Refresh produces errors as a result of attempting to refresh the metadata with any
changes to the underlying SAP BW cube. Re-create those reports from a blank
report by using the GA version of DirectQuery over SAP BW.
Multi-select and include/exclude: The ability to multi-select data points on a
visual is disabled if the points represent values from more than one column. For
example, given a bar chart showing Sales by Country/Region, with Category on the
Legend, it wouldn't be possible to select the point for (USA, Bikes) and (France,
Clothes). Similarly, it wouldn't be possible to select the point for (USA, Bikes) and
exclude it from the visual. Both limitations are imposed to reflect the support
offered by SAP BW.
Local calculations: Local calculations defined in a BEx Query change the numbers as
displayed through tools like BEx Analyzer. However, they aren't reflected in the
numbers returned from SAP through the public MDX interface. For example, when
connecting to a query cube from a BEx query that sets the aggregation to be
Cumulated (a running sum), Power BI gets back the base numbers, ignoring that
setting. An analyst could then apply a running sum calculation locally in Power BI,
but would need to exercise caution in how the numbers are interpreted if this
action isn't done.

Aggregations: In some cases, particularly when dealing with multiple currencies, the
aggregate numbers returned by the SAP public interface don't match the results
shown by SAP tools. For example, totals over different currencies would show as "*"
in BEx Analyzer, but the total would get returned by the SAP public interface
without any indication that such an aggregate number is meaningless. Thus the
number aggregating, say, $, EUR, and AUD would get displayed by Power BI.

Currency formatting: Any currency formatting, for example, $2,300 or 4,000 AUD,
isn't reflected in Power BI.

Units of measure: Units of measure, for example, 230 KG, aren't reflected in
Power BI.

Key versus text (short, medium, long): For an SAP BW characteristic like CostCenter,
the field list shows a single column Cost Center. Using that column displays the
default text. By showing hidden fields, it's also possible to see the unique name
column, which returns the unique name assigned by SAP BW and is the basis of
uniqueness.

Multiple hierarchies of a characteristic: In SAP, a characteristic can have multiple
hierarchies. Then in tools like BEx Analyzer, when a characteristic is included in a
query, the user can select the hierarchy to use. In Power BI, the various hierarchies
can be seen in the field list as different hierarchies on the same dimension.
However, selecting multiple levels from two different hierarchies on the same
dimension results in empty data being returned by SAP.

Treatment of ragged hierarchies:

Scaling factor/reverse sign: In SAP, a key figure can have a scaling factor, for
example, 1000, defined as a formatting option, meaning that all display is scaled by
that factor. It can similarly have a property set that reverses the sign. Use of such a
key figure in Power BI, in a visual or as part of a calculation, results in the unscaled
number being used, and the sign isn't reversed. The underlying scaling factor isn't
available. In Power BI visuals, the scale units shown on the axis (K, M, B) can be
controlled as part of the visual formatting.

Hierarchies where levels: Initially when connecting to SAP BW, the information on
the levels of a hierarchy is retrieved, resulting in a set of fields in the field list. This