semantic-link

Name: semantic-link
Version: 0.9.1
Home page: https://learn.microsoft.com/en-us/fabric/data-science/semantic-link-overview
Summary: Semantic link for Microsoft Fabric
Upload time: 2025-02-26 08:43:20
Author: Microsoft
Requires Python: >=3.10
License: proprietary and confidential
Semantic link is a feature that allows you to establish a connection between [Power BI datasets](https://learn.microsoft.com/en-us/power-bi/connect-data/service-datasets-understand) and [Synapse Data Science in Microsoft Fabric](https://learn.microsoft.com/en-us/fabric/data-science/data-science-overview).

The primary goals of semantic link are to facilitate data connectivity, enable the propagation of semantic information, and seamlessly integrate with established tools used by data scientists, such as notebooks.

Semantic link helps you to preserve domain knowledge about data semantics in a standardized way that can speed up data analysis and reduce errors.

[Package (PyPi)](https://pypi.org/project/semantic-link/) | [API reference documentation](https://learn.microsoft.com/en-us/python/api/semantic-link-sempy/) | [Product documentation](https://learn.microsoft.com/en-us/fabric/data-science/semantic-link-overview) | [Samples](https://github.com/microsoft/fabric-samples/tree/main/docs-samples/data-science/semantic-link-samples)

By downloading, installing, using or accessing this distribution package for semantic link, you agree to the [Terms of Service](https://github.com/microsoft/semantic-link-functions/blob/main/sempy/LICENSE.txt).

This package has been tested with Microsoft Fabric.

# Getting started
## Prerequisites

* A [Microsoft Fabric subscription](https://learn.microsoft.com/en-us/fabric/enterprise/licenses). Or sign up for a free [Microsoft Fabric (Preview) trial](https://learn.microsoft.com/en-us/fabric/get-started/fabric-trial).
* Sign in to [Microsoft Fabric](https://fabric.microsoft.com/).
* Create [a new notebook](https://learn.microsoft.com/en-us/fabric/data-engineering/how-to-use-notebook#create-notebooks) or a new [Spark job definition](https://learn.microsoft.com/en-us/fabric/data-engineering/create-spark-job-definition) to use this package. **Note that semantic link is supported only within Microsoft Fabric.**

## About the semantic link packages
The functionalities for semantic link are split into multiple packages to allow for a modular installation.
If you want to install only a subset of the semantic link functionalities, you can install the individual packages instead of the `semantic-link` meta-package.
This can help resolve dependency conflicts. The following are some of the available packages (an installation sketch follows the list):

* [semantic-link](https://pypi.org/project/semantic-link/) - The meta-package that depends on all the individual semantic link packages and serves as a convenient way to install all the semantic link packages at once.
* [semantic-link-sempy](https://pypi.org/project/semantic-link-sempy/) - The package that contains the core semantic link functionality.
* [semantic-link-functions-holidays](https://pypi.org/project/semantic-link-functions-holidays/) - A package that contains semantic functions for holidays and depends on [holidays](https://pypi.org/project/holidays).
* [semantic-link-functions-geopandas](https://pypi.org/project/semantic-link-functions-geopandas/) - A package that contains semantic functions for geospatial data and depends on [geopandas](https://pypi.org/project/geopandas).
* ...
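
For example, a minimal installation sketch for a notebook cell (the selection of packages shown here is illustrative; install only the subset you need):

```python
# Install just the core semantic link functionality instead of the meta-package.
%pip install semantic-link-sempy

# Optionally add individual semantic-function packages.
%pip install semantic-link-functions-holidays
```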

## Install the `semantic-link` meta package

For Spark 3.4 and above, semantic link is available in the default Fabric runtime, and there's no need to install it. If you're using Spark 3.3 or below, or if you want to update to the most recent version of semantic link, you have two options:
* Install the most recent version of `semantic-link` in your notebook kernel by executing this code in a notebook cell:

  ```python
  %pip install -U semantic-link
  ```

* Alternatively, you can add semantic link to your Fabric environments directly. For more information, see [library management in Fabric environments](https://learn.microsoft.com/fabric/data-engineering/environment-manage-library).

# Key concepts
SemPy offers the following capabilities (a short usage sketch follows the list):

* Connectivity to Power BI
* Connectivity through Power BI Spark native connector
* Data augmentation with Power BI measures
* Semantic propagation for pandas users
* Built-in and custom semantic functions
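
As a rough illustration of these capabilities, here is a minimal sketch using the `sempy.fabric` module inside a Fabric notebook; the dataset, table, measure, and column names are placeholders for your own semantic model:

```python
import sempy.fabric as fabric

# Discover semantic models (Power BI datasets) in the current workspace.
datasets = fabric.list_datasets()

# Read a table into a FabricDataFrame, which carries semantic metadata
# alongside the data ("Sales Model" and "Customers" are placeholder names).
customers = fabric.read_table("Sales Model", "Customers")

# Augment the analysis with a Power BI measure, grouped by a model column.
revenue_by_country = fabric.evaluate_measure(
    "Sales Model",
    measure="Total Revenue",
    groupby_columns=["Customers[Country]"],
)
```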

# Change logs

## 0.9.1
- add `retry_config` parameter to `BaseRestClient`
- add `dataset` parameter to `create_tom_server`
- fix: escaping "/" in workspace url
- fix: get fabric context error
- fix: trace add event schema for ActivityId and RequestId
- fix: missing rows calculation logic on direct lake mode models
- fix: occasional dataset not found error in list_tables
- fix: bpa rules for unescaped object names

## 0.9.0
- add model_memory_analyzer: support displaying VertiPaq statistics about the semantic model
- add run_model_bpa function: support displaying Best Practice Analyzer statistics about the semantic model
- update connect_semantic_model: support using either name or ID for workspace and dataset
- fix evaluate_dax for object columns: cast object columns to strings
- fix TOMWrapper all_unqualified_column_dependencies: correct the input type to be Measure

## 0.8.5
- Fix refresh_tom_cache: addressed the issue when refreshing the workspace with different identifiers (e.g., workspace name and workspace ID)
- Fix TOMWrapper: addressed the lineage compatibility error during TOMWrapper initialization
- Upgrade Microsoft.AnalysisServices.AdomdClient NuGet dependency to 19.87.7

## 0.8.4
- add TOMWrapper for semantic model
- delay sklearn import to reduce startup latency

## 0.8.3
- fix: race conditions in sempy initialization during multithreading

## 0.8.2
- Telemetry Updates, bugfixes

## 0.8.1
- fix inaccuracy in the row count for certain scenarios when using the list_tables function with extended=True.

## 0.8.0
- add list_dataflow_storage_accounts
- fix list_dataflows: no longer returns wrong results
- fix overflowing column metadata resolving warnings in dataset clients
- update fabric.read_table: support setting the import option by onelake_import_method parameter
- update FabricDataFrame.to_lakehouse_table: support setting the export option by method parameter

## 0.7.7
- fix list_partitions: records per segment calculation
- Added resolve_dataset_id and resolve_dataset_name

## 0.7.6
- update evaluate_dax: allow limiting the number of rows
- fix create_notebook: support resolving the default lakehouse from another workspace
- fix api doc: removed broken xrefs
- fix get_artifact_id for high concurrency

## 0.7.5
- FabricRestClient & PowerBIRestClient: support waiting for long-running operations (a usage sketch follows this list)
- FabricRestClient & PowerBIRestClient: support paged responses
- Added resolve_item_id and resolve_item_name
- evaluate_dax: support reading data from semantic models with read-only access
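
As a rough illustration of the REST client interface referenced above (a sketch only; the endpoint shown is an example, so consult the API reference for current signatures and supported endpoints):

```python
from sempy.fabric import FabricRestClient

# Create an authenticated client for the Fabric REST API.
client = FabricRestClient()

# List workspaces via a public Fabric REST endpoint, shown here only to
# illustrate the requests-style interface.
response = client.get("v1/workspaces")
print(response.json())
```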

## 0.7.4
- internal bug fixes

## 0.7.3
- add delta_column_mapping_mode parameter to FabricDataFrame.to_onelake_table
- update find_relationships: swap from/to for relationships to align with PowerBI
- ensure users can execute DAX against semantic models they have access to, even when they don't have access to the workspace
- support jupyter runtime
- update list_columns: add missing workspace parameter
- fix list_partitions: record / segment computation
- fix list_tables_duplicates
- fix list_tables(extended=True)
- fix broken doc links

## 0.7.2
- list_* (additional_xmla_properties): handle property names that might fail for some rows
- fix list_tables

## 0.7.1
- fix list_annotations

## 0.7.0

- add create_tom_server
- add additional_xmla_properties argument to all applicable list_* functions
- add list_annotations
- update list_columns: alternate columns/tables
- update list_relationships: add extended argument
- update list_hierarchies: add extended argument
- update list_partitions: add extended argument
- update list_measures: add additional columns
- fix plot_relationship_metadata: arrows for relationships point in the same direction as PowerBI
- fix list_datasources

## 0.6.0
- add list_datasources
- add list_dataflows
- add list_apps
- add list_gateways
- add list_tables
- add list_calculation_items
- add list_columns
- add list_perspectives
- introduce the "extended" flag to query DMVs with more information (e.g. table size)
- add additional xmla properties
- update capacity id to lower case
- make FabricDataFrame creation more robust
- fix list_translations

# Next steps
View our [Samples](https://github.com/microsoft/fabric-samples/tree/main/docs-samples/data-science/semantic-link-samples).



            
