Hints for when you struggle with deploying the Azure Data Catalog registration tool

In this blog post I want to share my troubleshooting experience with the Azure Data Catalog. First, there were some errors reaching the Azure endpoints because my IT department had blocked some URLs – but the frontend only showed an unknown error.

You can take a look into this log file to get more detailed information:

%appdata%\AzureDataCatalogErrorEventLogProd.log

On another machine the tool had some issues with the login. The error was resolved with the help of Microsoft support: the TLS authentication failed because some updates were missing on the .NET Framework: https://docs.microsoft.com/en-us/mem/configmgr/core/plan-design/security/enable-tls-1-2-client#bkmk_winhttp
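Besides installing the missing updates, the linked article also describes registry switches that make the .NET Framework use the operating system's TLS defaults (including TLS 1.2). A minimal sketch in PowerShell – verify the values against the article for your environment, since changing these keys affects all .NET Framework 4.x applications on the machine:

```
# Tell .NET Framework 4.x to use the OS default TLS versions and strong crypto.
# Run in an elevated PowerShell session; a reboot may be required afterwards.
$keys = @(
    'HKLM:\SOFTWARE\Microsoft\.NETFramework\v4.0.30319',
    'HKLM:\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v4.0.30319'
)
foreach ($key in $keys) {
    Set-ItemProperty -Path $key -Name 'SystemDefaultTlsVersions' -Value 1 -Type DWord
    Set-ItemProperty -Path $key -Name 'SchUseStrongCrypto'       -Value 1 -Type DWord
}
```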

I hope this helps when others struggle with deploying the Azure Data Catalog.

Text Analysis with Power BI in different languages

According to the MSDN article "How to integrate Text Analysis into Power BI", I needed to detect the language of a comment and run a sentiment analysis on it. The reason to parameterize the language key instead of hard-coding it is simple: in Germany I mostly get comments in English, German, Spanish, etc.

So, after I completed the how-to above, I created a new M function named "Language" with this code:

// Returns the two-letter language code (for example, 'en' for English) of the text
(text) => let
    apikey      = "YOUR_API_KEY_HERE",
    endpoint    = "https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics" & "/v2.1/languages",
    jsontext    = Text.FromBinary(Json.FromValue(Text.Start(Text.Trim(text), 5000))),
    jsonbody    = "{ documents: [ { id: ""0"", text: " & jsontext & " } ] }",
    bytesbody   = Text.ToBinary(jsonbody),
    headers     = [#"Ocp-Apim-Subscription-Key" = apikey],
    bytesresp   = Web.Contents(endpoint, [Headers=headers, Content=bytesbody]),
    jsonresp    = Json.Document(bytesresp),
    language    = jsonresp[documents]{0}[detectedLanguages]{0}[iso6391Name]
in  language

The code above is taken 1:1 from the MSDN page; it returns the two-letter language code of the merged Subject and Body text.

This works like the other steps, but it must be the first invoked function. The next step is to get the key phrases, depending on the language column created before:

// Returns key phrases from the text in a comma-separated list
(text, lang as text) => let
    apikey      = "YOUR_API_KEY_HERE",
    endpoint    = "https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics" & "/v2.1/keyPhrases",
    jsontext    = Text.FromBinary(Json.FromValue(Text.Start(Text.Trim(text), 5000))),
    jsonbody    = "{ documents: [ { language: """ & lang & """, id: ""0"", text: " & jsontext & " } ] }",
    bytesbody   = Text.ToBinary(jsonbody),
    headers     = [#"Ocp-Apim-Subscription-Key" = apikey],
    bytesresp   = Web.Contents(endpoint, [Headers=headers, Content=bytesbody]),
    jsonresp    = Json.Document(bytesresp),
    keyphrases  = Text.Lower(Text.Combine(jsonresp[documents]{0}[keyPhrases], ", "))
in  keyphrases

The JSON body no longer contains the hard-coded "en"; it uses the new language parameter from the function header. So you must edit the "Invoke Custom Function" call to pass in our new language column:

Now we have our key phrases depending on the detected language, and we can get the sentiment score, also depending on the language, with this code:

// Returns the sentiment score of the text, from 0.0 (least favorable) to 1.0 (most favorable)
(text, lang as text) => let
    apikey      = "YOUR_API_KEY_HERE",
    endpoint    = "https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics" & "/v2.1/sentiment",
    jsontext    = Text.FromBinary(Json.FromValue(Text.Start(Text.Trim(text), 5000))),
    jsonbody    = "{ documents: [ { language: """ & lang & """, id: ""0"", text: " & jsontext & " } ] }",
    bytesbody   = Text.ToBinary(jsonbody),
    headers     = [#"Ocp-Apim-Subscription-Key" = apikey],
    bytesresp   = Web.Contents(endpoint, [Headers=headers, Content=bytesbody]),
    jsonresp    = Json.Document(bytesresp),
    sentiment   = jsonresp[documents]{0}[score]
in  sentiment

That's it! After another Invoke Custom Function call, we get the sentiment score based on the detected language. The resulting step order in the Power Query Editor:
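For reference, the three Invoke Custom Function steps can be sketched in M like this. The query name Comments, the column Comment, and the function names KeyPhrases and Sentiment are assumptions, so adapt them to whatever names you used:

```
// Illustrative sketch only: query, column, and function names are assumptions.
let
    Source          = Comments,
    // Language detection must come first, since the other calls depend on it.
    AddedLanguage   = Table.AddColumn(Source, "Language",
                          each Language([Comment]), type text),
    AddedKeyPhrases = Table.AddColumn(AddedLanguage, "Key Phrases",
                          each KeyPhrases([Comment], [Language]), type text),
    AddedSentiment  = Table.AddColumn(AddedKeyPhrases, "Sentiment",
                          each Sentiment([Comment], [Language]), type number)
in
    AddedSentiment
```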

Power BI Landscape

I created the following map / landscape after a Power BI coaching, when the customer asked me: "Hey, now we know Power BI and we can create reports. What is your job now?" I didn't have a good answer, so I started to explain the whole universe to him. Afterwards, I created a mind map with everything I know about Power BI and need to discuss with a customer. So, the map is not complete, and every time I find some time, I expand it with new things.

But I want to share this map with you. So, here it is! Have fun, and if you have any comments, please let me know.

Power BI Paginated Reports – 1 of ?

This planned blog series is dedicated to Paginated Reports in Power BI. The first part covers how to get started with this feature.

Paginated Reports are known from Reporting Services for creating pixel-perfect reports, especially for printouts, and now they are a feature of Power BI as well. Before Power BI came on-premises, Reporting Services was the only web reporting technology in the Microsoft BI stack – to be honest, Mobile Reports was only a technology used from SSRS 2016 until Power BI Report Server launched in 2017.

So, Paginated Reports are not a new technology, and you will find a whole bunch of information on the internet about how they can be used and which features are included. Currently, this Power BI feature is in preview and not GA, so many data sources and features are not supported yet; you will find more information at https://docs.microsoft.com/en-us/power-bi/paginated-reports-faq

Okay, let’s start to create our first report:

  1. You need a Power BI tenant with a Pro license.
    You can also create an Office 365 trial tenant:
    https://signup.microsoft.com/Signup?OfferId=02f65c1d-8b54-4d83-8941-fff22869499c&dl=ENTERPRISEPREMIUM_NOPSTNCONF&ali=1
  2. You need a paid Azure tenant on a corporate mail address.
    If you create a trial on Office 365, you can also use the Azure tenant behind it (https://portal.azure.com) and create a trial with USD 200 of free credit.

After completing these prerequisites, you can create an Embedded resource, which handles the workload for the Paginated Reports. The capacity must be at least an A4; this plan corresponds to the P1 resource from the Premium plans.

Click on the name and assign a workspace.

Activate the Paginated Reports Feature.

After committing this page, your workspace gets a diamond after its name; this symbol shows you that the workspace uses Premium capacity.

Now you are able to upload Paginated Reports. To create a report, you need the Power BI Report Builder: https://powerbi.microsoft.com/en-us/blog/power-bi-paginated-report-builder-now-available/

After you have successfully installed the Report Builder, you can create a blank report. Now let's add a data source:

Choose "Enter Data" as the data source driver.

Next, add a new dataset and choose the new data source.

Click on "Query Designer" and enter some data. In this case, enter some years, name the column "Year" and change the data type to "Integer".

After you have entered some data, press OK in the query window dialogue and right-click on the report canvas to add a new table:

Next, drag and drop the Year column from the dataset into the table.

Now you can save the report on your hard drive – I named it "First Paginated Report". Currently there is no option to save the report directly to PowerBI.com. (Please note: in the preview of the report designer, in most cases no data will be displayed – but no worries, on the Power BI tenant it works correctly.)

Next, let's go to the Power BI tenant, choose our workspace with the assigned capacity, and choose Create – Dataset.

Click on Files – Get.

In the next window, choose "Local File" and select our newly created Paginated Report.

Now you will find your newly created report in the "Reports" section.

That's it! You have created and deployed a Paginated Report. Now you can also view it in the mobile app, print it, and export it to other file formats.

After that, please pause your Azure SKU for Power BI Embedded, because while it is paused, it does not cost any money!
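If you prefer scripting over the portal, the Az.PowerBIEmbedded PowerShell module can pause and resume the capacity; the capacity and resource group names below are placeholders for your own:

```
# Pause the Power BI Embedded capacity so it stops billing
Suspend-AzPowerBIEmbeddedCapacity -Name "mycapacity" -ResourceGroupName "myresourcegroup"

# Resume it again when you need the workspace
Resume-AzPowerBIEmbeddedCapacity -Name "mycapacity" -ResourceGroupName "myresourcegroup"
```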

Thanks for reading.

Get PowerShell Data into an SSIS Dataflow

In my last project I needed to capture some data from Active Directory – especially users from local security groups – so I decided to use a PowerShell script. With PowerShell it is very easy to capture all users from the groups and walk through nested groups, because I am only interested in users, not in groups that are nested inside other groups.

Get-ADGroup -Filter { name -eq [groupname] } |
    Get-ADGroupMember -Recursive |
    Where-Object { $_.objectClass -eq "user" } |
    Get-ADUser -Properties * |
    Where-Object { $_.DistinguishedName -notlike '*some comment*' } |
    Select-Object SamAccountName -Unique

The script above selects the unique usernames from a given group and excludes users who have a special comment in their DistinguishedName. The script iterates through any nested group and returns the usernames. Now I need this list not in PowerShell, but in SSIS, to transfer the data into a database.

Next, I created an SSIS data flow with a Script Component, which is used as a data source.

The dataflow …

Next, I created a variable that stores the group name I want to capture, and then I configured the source. I used C# as my preferred programming language:

The variable Gruppenname stores the group name – the screenshots are from a German project :-)

Next, I need the C# script with my PowerShell snippet:

#region Namespaces
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
#endregion

#region Custom Namespaces
using System.Collections.ObjectModel;
using System.Management.Automation;
using System.Management.Automation.Runspaces;
#endregion

[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{
    // Holds the objects returned by the PowerShell pipeline.
    private Collection<PSObject> results;

    public override void PreExecute()
    {
        base.PreExecute();

        // Read the group name from the SSIS variable.
        string groupname = this.Variables.Gruppenname;

        // Build the PowerShell pipeline that resolves all (nested) users of the group.
        //string PowerShellScript = "Get-ADGroupMember -Identity \"" + groupname + "\" -Recursive | Select samAccountName";
        string PowerShellScript = "Get-ADGroup -Filter {(name -eq \"" + groupname + "\") } | Get-ADGroupMember -Recursive | Where { $_.objectClass -eq \"user\" } | Get-ADUser -properties * | Where-Object { $_.DistinguishedName -notlike '*any comment*' } | select SamAccountName -unique";

        // Run the script in a hosted PowerShell runspace and keep the results.
        Runspace runspace = RunspaceFactory.CreateRunspace();
        runspace.Open();
        Pipeline pipeline = runspace.CreatePipeline();
        pipeline.Commands.AddScript(PowerShellScript);
        results = pipeline.Invoke();
        runspace.Close();
    }

    public override void CreateNewOutputRows()
    {
        // Emit one data flow output row per PowerShell result object.
        foreach (PSObject obj in results)
        {
            Ausgabe0Buffer.AddRow();
            Ausgabe0Buffer.samAccountName = obj.ToString();
        }
    }
}

The samAccountName output column stores the username from Active Directory.

That's it – now you have the PowerShell output in an SSIS data flow.