
In today’s fast-paced digital world, delivering data to applications quickly, securely, and flexibly is crucial. That's why developers love GraphQL — it gives them the power to query exactly what they need, nothing more, nothing less. But setting up and maintaining a GraphQL API from scratch can be time-consuming.

Enter Microsoft Fabric: your all-in-one data platform that now makes it incredibly easy to expose your data as a GraphQL API, with minimal setup and zero infrastructure overhead.


Why GraphQL in Microsoft Fabric?

Traditionally, building APIs meant provisioning infrastructure, writing boilerplate code, and managing endpoints. But Fabric changes the game by:

  • Auto-generating GraphQL APIs directly over your Lakehouse tables

  • Providing a no-code or low-code experience for developers and data engineers

  • Offering a single, unified API endpoint to query multiple data sources within your workspace

This means you can go from raw data to a production-ready API in minutes.

Here's how easy it is:
  1. Create a Semantic Model in Microsoft Fabric using your Lakehouse data.

  2. Enable API Access to the model via the GraphQL API endpoint.

  3. Query your data with standard GraphQL syntax — no backend service, no custom resolvers required.

Example GraphQL Query:

query {
   SalesOrders {
     OrderID
     Customer {
       Name
       Email
     }
     TotalAmount
   }
}
The example above shows a query against a GraphQL API in Fabric. You can then copy the API endpoint into your application and start querying the data.
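
To call the endpoint from application code, a GraphQL request is just an HTTP POST with a JSON body. Below is a minimal Python sketch; the endpoint placeholder stands in for the URL copied from the Fabric portal, and the token scope shown is an assumption that may need adjusting to your tenant's configuration.

# Minimal sketch: call a Fabric GraphQL endpoint from Python.
# The endpoint is a placeholder for the URL copied from the Fabric portal, and the
# token scope below is an assumption; adjust both to your environment.
import requests
from azure.identity import InteractiveBrowserCredential

endpoint = "<your-fabric-graphql-endpoint>"
token = InteractiveBrowserCredential().get_token(
    "https://api.fabric.microsoft.com/.default"  # assumed scope
).token

query = """
query {
  SalesOrders {
    OrderID
    Customer { Name Email }
    TotalAmount
  }
}
"""

response = requests.post(
    endpoint,
    json={"query": query},
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
response.raise_for_status()
print(response.json()["data"]["SalesOrders"])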



Power Virtual Agents is now part of Microsoft Copilot Studio ...

What is Microsoft Copilot Studio?

Microsoft Copilot Studio is a powerful, user-friendly environment designed to streamline the development, management, and deployment of intelligent conversational AI agents. Part of Microsoft's broader Copilot ecosystem, Copilot Studio allows users to effortlessly define topics that guide conversations, implement precise orchestration to ensure dialogues flow logically and contextually, and create customizable actions to seamlessly integrate with backend services and APIs. Built to empower both developers and non-technical users, Copilot Studio simplifies the creation of sophisticated, personalized AI-driven experiences, significantly accelerating time-to-value while maintaining flexibility and control.


In this blog post, I'll dive deeper into how we can effectively leverage topics, orchestration, and actions in Microsoft Copilot Studio to build sophisticated and dynamic conversational agents. I'll first explore how clearly defined topics help your agent better understand user intent, then explain how orchestration enables smooth conversational flows by managing context and transitions between different interactions. Finally, I'll show how incorporating custom actions can extend the agent's capabilities, allowing seamless integration with external services and providing richer, more personalized experiences.


What is a topic?

A topic in Microsoft Copilot Studio represents a specific area or scenario of conversation that your AI agent can recognize and respond to effectively. Think of topics as conversational building blocks, each designed to handle particular user intents or questions. For example, you might have topics around booking appointments, answering product FAQs, or troubleshooting common issues. Defining clear and targeted topics helps your agent quickly detect what the user wants, allowing it to deliver focused and accurate responses, resulting in more natural and satisfying interactions.

topics

What is orchestration?

Orchestration in Microsoft Copilot Studio is the process of intelligently managing and guiding conversational flows across different topics and actions to ensure smooth, logical, and context-aware interactions. Think of orchestration as the conversation conductor, seamlessly deciding when and how to transition between different topics, invoking the appropriate actions, and maintaining the context throughout the dialogue. Good orchestration ensures that your AI agent can handle complex user journeys, adapt dynamically to user inputs, and deliver coherent, engaging, and human-like conversational experiences. 


As illustrated in the example above, orchestration decides which topic to navigate to next based on the current context of the conversation.

A visual representation of conversational orchestration in AI agents: a central 'Orchestration' node connects dynamically to multiple conversational topics and actions, with arrows showing smooth transitions between them.

orchestration


What is an action?

An action in Microsoft Copilot Studio is a powerful capability that allows your conversational agent to interact with external systems, APIs, or backend processes to execute tasks or retrieve dynamic information. Actions extend the functionality of your AI agent beyond static responses, enabling it to perform real-world operations like checking inventory, scheduling meetings, processing orders, or pulling up personalized user data. By integrating actions, your conversational experiences become more meaningful, relevant, and capable of addressing users' real-time needs directly within the conversation.

actions

Now that we've explored the concepts of topics, orchestration, and actions in Microsoft Copilot Studio, let's dive in together to see how you can apply them practically and elevate your bot conversations to the next level.


In this demo, I'll showcase a practical use case: building a Copilot agent designed to keep my team updated with the latest Azure updates.


Step One: Create a copilot in Microsoft Copilot Studio

screen1


Step Two: Create a topic

screen2


Step Three: Enable AI Orchestration

 screen3


Step Four: Create an action to integrate with Teams

screen4


Let’s check out the full demo here:

https://youtu.be/b3axMOtt8yk?feature=shared

Once you’ve completed the steps above, you will receive a notification in Teams.
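
For context, here is a rough Python sketch of the kind of work such an action ultimately delegates to: pulling the latest items from an Azure updates RSS feed and posting a short summary to a Teams incoming webhook. Both URLs are placeholders, and this is an illustrative stand-in for the Copilot Studio action and connector used in the demo, not its actual implementation.

# Illustrative only: fetch recent Azure update items from an RSS feed and post a
# short summary to a Teams incoming webhook. Both URLs are placeholders.
import requests
import xml.etree.ElementTree as ET

FEED_URL = "<azure-updates-rss-feed-url>"     # e.g. the public Azure updates RSS feed
WEBHOOK_URL = "<teams-incoming-webhook-url>"  # created on the target Teams channel

feed = requests.get(FEED_URL, timeout=30)
feed.raise_for_status()
items = ET.fromstring(feed.content).findall(".//item")[:5]

lines = [f"- {item.findtext('title')}: {item.findtext('link')}" for item in items]
message = {"text": "**Latest Azure updates**\n\n" + "\n".join(lines)}

requests.post(WEBHOOK_URL, json=message, timeout=30).raise_for_status()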


Conclusion

In conclusion, harnessing the power of topics, orchestration, and actions in Microsoft Copilot Studio allows you to create sophisticated, context-aware, and highly interactive conversational agents. By strategically defining clear conversational paths with topics, smoothly managing the dialogue flow with intelligent orchestration, and integrating practical actions for real-world functionality, you can significantly enhance user experience and boost productivity. Now, equipped with these insights, you're ready to build smarter, more dynamic AI-driven interactions and elevate your agent to the next level. Happy building!


Import data from Azure Blob Storage into Databricks

Data pipelines are essential for modern data solutions, and Azure Data Factory (ADF) provides a robust platform for building them. In this blog, we’ll walk through the process of setting up a pipeline in Azure Data Factory to import data from Azure Blob Storage into Databricks for processing.


Step 1: Prerequisites

Before setting up the pipeline, ensure the following prerequisites are met:

  1. Azure Blob Storage: Your source data should be stored in an Azure Blob Storage container.
  2. Azure Databricks Workspace: A Databricks workspace and cluster should be set up for data processing.
  3. Azure Data Factory Instance: Have an ADF instance provisioned in your Azure subscription.
  4. Linked Services Configuration:
    • Azure Blob Storage Linked Service: This enables ADF to connect to your data source.
    • Azure Databricks Linked Service: This enables ADF to connect to the target Databricks Delta Lake.

Both linked services are critical for establishing connections and configuring data pipelines between Blob Storage and Databricks.


  5. Access Permissions:
    • ADF needs Contributor access to Blob Storage and Databricks.
    • Ensure you have access to generate a Databricks personal access token.

You will also need to configure the Blob Storage access token in Databricks. This ensures the underlying Spark cluster can connect to the source data seamlessly. Without proper configuration, you may encounter errors like the one shown below.

ErrorCode=AzureDatabricksCommandError,Hit an error when running the command in Azure Databricks. Error details: Py4JJavaError: An error occurred while calling o421.load.
: shaded.databricks.org.apache.hadoop.fs.azure.AzureException: shaded.databricks.org.apache.hadoop.fs.azure.AzureException: Container salesdata in account cxitxstorage.blob.core.windows.net not found, and we can't create it using anoynomous credentials, and no credentials found for them in the configuration.
Caused by: shaded.databricks.org.apache.hadoop.fs.azure.AzureException: Container salesdata in account cxitxstorage.blob.core.windows.net not found, and we can't create it using anoynomous credentials, and no credentials found for them in the configuration.
Caused by: shaded.databricks.org.apache.hadoop.fs.azure.AzureException: Container salesdata in account cxitxstorage.blob.core.windows.net not found, and we can't create it using anoynomous credentials, and no credentials found for them in the configuration..



In the cluster's Spark configuration, you will need the following:

spark.hadoop.fs.azure.account.key.<account_name>.blob.core.windows.net {{secrets/<secret-scope-name>/<secret-name>}}

You can use the Databricks CLI to create the scope and secret with the commands below:

  • databricks secrets create-scope <secret-scope-name>
  • databricks secrets put-secret --json '{
        "scope": "<secret-scope-name>",
        "key": "<secret-name>",
        "string_value": "<storage-account-key-value>"
      }'
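
With the scope and secret in place, a notebook on that cluster can read the container directly, either through the cluster Spark configuration above or by setting the key at session level, as in the sketch below. The account and container names are taken from the error message above purely as examples, and the code assumes it runs inside a Databricks notebook, where spark and dbutils are available.

# Runs inside a Databricks notebook (spark and dbutils are provided by the runtime).
# Account, container, and path are illustrative; substitute your own values.
storage_account = "cxitxstorage"   # example account from the error above
container = "salesdata"            # example container from the error above

spark.conf.set(
    f"fs.azure.account.key.{storage_account}.blob.core.windows.net",
    dbutils.secrets.get(scope="<secret-scope-name>", key="<secret-name>"),
)

df = (
    spark.read
    .option("header", "true")
    .csv(f"wasbs://{container}@{storage_account}.blob.core.windows.net/<path-to-files>")
)
display(df)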


Step 2: Create an ADF Pipeline

  1. Add a Copy Data Activity:

    • Add the Copy Data activity into the pipeline.
    • Set the Source to use the Blob Storage dataset and the Sink to use the Databricks Delta Lake dataset.
  2. Add Data Transformation (Optional):

    • Use a Databricks Notebook activity to run transformation scripts (a minimal example notebook is sketched after this list).
    • Link the notebook to your Databricks cluster and specify the notebook path.
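
The transformation notebook itself can stay small. Below is a hypothetical example (the table and column names are assumptions) that reads the data landed by the Copy Data activity, derives a revenue column, and writes a cleaned Delta table.

# Hypothetical transformation notebook invoked by the ADF Databricks Notebook activity.
# Table and column names are assumptions; adapt them to your copied dataset.
from pyspark.sql import functions as F

raw = spark.table("sales_raw")  # table populated by the Copy Data activity

cleaned = (
    raw
    .withColumn("OrderDate", F.to_date("OrderDate"))
    .withColumn("Revenue", F.col("UnitPrice") * F.col("Quantity"))
    .dropDuplicates(["OrderID"])
)

cleaned.write.format("delta").mode("overwrite").saveAsTable("sales_cleaned")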

Step 3: Test and Schedule the Pipeline

  1. Test the Pipeline:

    • Use the Debug feature in ADF to run the pipeline and verify its functionality.
  2. Schedule the Pipeline:

    • Add a time-based or event-based trigger to automate pipeline runs.



Harnessing Data Insights with AI


In the fast-evolving landscape of artificial intelligence, GPT-4 Omni stands at the forefront, promising not just advanced language processing capabilities, but also the potential to revolutionize how businesses derive insights from their data. Imagine a scenario where your finance team or C-level managers can seamlessly interact with your organization's data using natural language, thanks to the integration of GPT-4 Omni into your systems.


The Power of GPT-4 Omni in Data Analysis
GPT-4 Omni, developed by OpenAI, represents a significant leap forward in AI technology. Unlike its predecessors, GPT-4 Omni is designed to handle a broader range of tasks, including complex data analysis and generation of insights. This capability makes it an ideal candidate for businesses looking to democratize data access and empower non-technical users to explore and understand data in real-time.

Addressing Ad-Hoc Requests with Azure OpenAI Chatbot
Imagine a typical scenario: your finance team needs immediate insights into recent sales trends, or a C-level manager requires a quick analysis of profitability drivers. With an Azure OpenAI chatbot powered by GPT-4 Omni, these ad-hoc requests can be addressed swiftly and effectively. The chatbot can interact with users in natural language, understanding nuanced queries and providing meaningful responses based on the data at hand.


Demo Application: Bringing Data Insights to Life
Recently, I developed a demo application to showcase the capabilities of GPT-4 Omni in the realm of data analytics. In this demo, I uploaded a CSV file containing a sample sales dataset, complete with sales dates, products, categories, and revenue figures. The goal was to demonstrate how GPT-4 Omni can transform raw data into actionable insights through simple conversational queries.



How It Works: From Data Upload to Insights

You can watch the video here

  • Data Upload and Integration: The CSV file was uploaded into the demo application, which then processed and integrated the data into a format accessible to GPT-4 Omni.
  • Conversational Queries: Users interacted with the chatbot by asking questions such as:
    • "What are the top-selling products in the sales data?"
    • "Is there any correlation between unit price and quantity sold?"
    • "Are there any seasonal trends in the sales data?"


  • Natural Language Processing: GPT-4 Omni processed these queries, utilizing its advanced natural language understanding capabilities to interpret the intent behind each question.
  • Insight Generation: Based on the data provided, GPT-4 Omni generated insightful responses, presenting trends, correlations, and summaries in a clear and understandable manner.

The Role of Assistants API
The Assistants API plays a pivotal role in enhancing functionality and integration capabilities. It empowers developers to create AI assistants within their applications, enabling these assistants to intelligently respond to user queries using a variety of models, tools, and files. Currently, the Assistants API supports three key functionalities: Code Interpretation, File Search, and Function Calling. For more detailed information, refer to Quickstart - Getting started with Azure OpenAI Assistants (Preview) - Azure OpenAI | Microsoft Learn
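
As a rough sketch of how the demo's flow maps onto that API, the Python below uploads a CSV, creates an assistant with the Code Interpreter tool, and asks one of the questions from the demo. It uses the openai package's Azure client; the endpoint, key, deployment name, and file are placeholders, and the exact API surface may differ depending on your SDK version and api-version.

# Rough sketch of the Assistants API flow with Azure OpenAI (openai Python SDK).
# Endpoint, key, deployment name, and file are placeholders; the API surface may
# vary with SDK and api-version.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    api_key="<your-api-key>",
    api_version="2024-05-01-preview",
)

data_file = client.files.create(file=open("sales.csv", "rb"), purpose="assistants")

assistant = client.beta.assistants.create(
    model="<gpt-4o-deployment-name>",
    instructions="You are a data analyst. Answer questions about the uploaded sales data.",
    tools=[{"type": "code_interpreter"}],
    tool_resources={"code_interpreter": {"file_ids": [data_file.id]}},
)

thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="What are the top-selling products in the sales data?",
)

run = client.beta.threads.runs.create_and_poll(thread_id=thread.id, assistant_id=assistant.id)
if run.status == "completed":
    messages = client.beta.threads.messages.list(thread_id=thread.id)
    print(messages.data[0].content[0].text.value)  # latest assistant reply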


Conclusion
As AI continues to advance, tools like GPT-4 Omni and the Assistants API are reshaping the business landscape, particularly in the realm of data analytics. The ability to leverage AI-driven insights from your own data, through intuitive and conversational interfaces, represents a significant competitive advantage. Whether it's optimizing operations, identifying new market opportunities, or improving financial forecasting, GPT-4 Omni and the Assistants API open doors to a more data-driven and agile business environment.

In conclusion, integrating GPT-4 Omni and leveraging the Assistants API into your data strategy not only enhances operational efficiency but also fosters a culture of data-driven decision-making across your organization. Embrace the future of AI-powered data insights and unlock new possibilities for growth and innovation.


In today's data-driven world, businesses are constantly seeking ways to leverage their data for insights that can drive better decision-making and outcomes. Predictive modelling has emerged as a powerful tool for extracting actionable insights from data, enabling organizations to anticipate trends, forecast outcomes, and make informed decisions. Azure Machine Learning (Azure ML), a cloud-based platform, offers a suite of tools and services designed to simplify the process of building, training, and deploying predictive models. In this blog post, I’ll explore how to harness the capabilities of Azure ML to build a predictive model, focusing on Automated ML, Designer, feature selection, and propensity modelling with two-class classification regression.

Automated ML: Automated ML is a powerful feature of Azure ML that automates the process of building machine learning models. With Automated ML, you can quickly experiment with different algorithms, hyperparameters, and feature transformations to find the best-performing model for your dataset. By leveraging Automated ML, data scientists can save time and resources while still achieving high-quality results. In our predictive modelling journey, we'll start by utilizing Automated ML to explore various model configurations and identify the most promising candidates for further optimization.

Automated Machine learning

By default, the models are ordered by metric score as they complete. For this tutorial, the model that scores the highest based on the chosen AUC_weighted metric is at the top of the list.
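
The same experiment can also be submitted from code. Here is a brief sketch using the Azure ML Python SDK v2 (azure-ai-ml); the workspace details, compute name, training data asset, and label column are placeholders, while AUC_weighted mirrors the metric chosen above.

# Sketch: submit an Automated ML classification job with the Azure ML SDK v2.
# Workspace details, compute name, MLTable path, and label column are placeholders.
from azure.ai.ml import MLClient, Input, automl
from azure.ai.ml.constants import AssetTypes
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace-name>",
)

classification_job = automl.classification(
    compute="<compute-cluster-name>",
    experiment_name="bike-buyer-propensity",
    training_data=Input(type=AssetTypes.MLTABLE, path="<path-to-training-mltable>"),
    target_column_name="<label-column>",
    primary_metric="AUC_weighted",   # matches the metric used in this post
    enable_model_explainability=True,
)
classification_job.set_limits(timeout_minutes=60, max_trials=20)

submitted_job = ml_client.jobs.create_or_update(classification_job)
print(submitted_job.studio_url)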


Navigate through the Details and Metrics tabs to view the selected model's properties, metrics, and performance charts.



Feature Selection: Feature selection plays a crucial role in building predictive models by identifying the most relevant variables that contribute to the model's performance. Azure ML offers several feature selection techniques, ranging from univariate methods to more advanced algorithms. I'll employ these techniques to identify the most informative features in our dataset, reducing dimensionality and improving the interpretability of our model.

The screenshots below display the top four features ranked by their importance, as automatically determined by Automated ML. In this example, the decision to purchase a bike is influenced significantly by factors such as car ownership, age, marital status, and commute distance. These features emerge as key determinants in predicting the outcome, providing valuable insights into the underlying patterns driving consumer behaviour.

(Screenshot: top four features ranked by importance)


Designer: Azure ML Designer is a drag-and-drop interface that allows users to visually create, edit, and execute machine learning pipelines. With Designer, even users without extensive programming experience can easily build sophisticated machine learning workflows. We'll leverage Designer to construct our predictive modelling pipeline, incorporating data pre-processing steps, feature engineering techniques, and model training algorithms. By using Designer, we can streamline the development process and gain valuable insights into our data.


Propensity Modelling with Two-Class Classification Regression: Propensity modelling is a specialized form of predictive modelling that aims to predict the likelihood of a binary outcome, such as whether a customer will purchase a product or churn from a service. In our case, I’ll focus on building a propensity model using two-class classification regression techniques with Designer. By training our model on historical data with known outcomes, we can predict the propensity of future observations to belong to a particular class. This information can then be used to target interventions or marketing campaigns effectively.

The diagram below illustrates the pipeline designed for training the propensity model using two-class logistic regression. This pipeline encapsulates the sequence of steps involved in preparing the data, selecting features, and training the model to predict binary outcomes. With each component carefully orchestrated, the pipeline ensures a systematic and effective approach to building the propensity model, empowering organizations to make informed decisions based on predictive insights.

(Diagram: propensity model training pipeline using two-class logistic regression)
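
For intuition, the same two-class logistic regression propensity model can be sketched outside Designer in a few lines of Python. The column names below follow the bike-purchase example (car ownership, age, marital status, commute distance), but the dataset file and exact names are assumptions.

# Illustrative propensity model using two-class logistic regression (scikit-learn).
# The CSV and column names are assumptions mirroring the bike-purchase example.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.read_csv("bike_buyers.csv")
X = df[["Cars", "Age", "MaritalStatus", "CommuteDistance"]]
y = df["BikeBuyer"]  # 1 = purchased a bike, 0 = did not

preprocess = ColumnTransformer([
    ("numeric", StandardScaler(), ["Cars", "Age"]),
    ("categorical", OneHotEncoder(handle_unknown="ignore"), ["MaritalStatus", "CommuteDistance"]),
])
model = Pipeline([("prep", preprocess), ("clf", LogisticRegression(max_iter=1000))])

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=42)
model.fit(X_train, y_train)

propensities = model.predict_proba(X_test)[:, 1]  # probability of purchase per customer
print("AUC:", roc_auc_score(y_test, propensities))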


The screenshot below presents the evaluation results, highlighting that the dataset is imbalanced, which corroborates with the findings detected by Automated ML. This imbalance in the dataset indicates a discrepancy in the distribution of classes, which could potentially impact the model's performance. Understanding and addressing this imbalance is crucial for ensuring the model's accuracy and reliability in real-world applications.

(Screenshot: evaluation results showing the class imbalance)

The screenshot below shows the data guardrails run by Automated ML when automatic featurization is enabled. These are a sequence of checks over the input data to ensure that high-quality data is used to train the model.

(Screenshot: data guardrails)


Conclusion: In this blog post, we've explored how to build a predictive model with Azure ML, leveraging Automated ML, Designer, feature selection, and propensity modelling techniques. By harnessing the power of Azure ML, organizations can unlock valuable insights from their data and make data-driven decisions with confidence. Whether you're a seasoned data scientist or a novice analyst, Azure ML provides the tools and capabilities you need to succeed in the era of predictive analytics. So why wait? Start building your predictive models with Azure ML today and unlock the full potential of your data.