
What it used to be and what it has become: visualLabs through my own eyes
10 mins read
Apr 9, 2025


I first joined VisualLabs in the summer of 2020 as a junior business analyst. As you can see from the timeline, I was part of the mass junior recruitment: with the three of us, the company grew to eight people at the time.

In the year and a bit I worked here between 2020 and 2021, I was involved in quite a variety of tasks: building and improving Power BI reports, working a lot on a contract management application I built using the Power Platform, and also gaining insight into the beauty of Business Central. The latter also gave rise to some comical memories, such as the painstaking work of recording and subtitling training videos for clients, and how I, then an undergraduate student, was on 'duty' over Christmas because I had no holidays left for the year. But I got a lot of support from my senior colleagues in all of this; they didn't let me get lost in the shuffle.

Three years later, in the summer of 2024, I rejoined VL, but now I work specifically with ERP. One thing that was very nice and new to me was the company timeline: whereas last time I was one of the mass junior hires, I'm now part of the life of the company.

An amazing amount has happened in my time away, and it's great to see these events being shared by my colleagues, creating a stronger sense of belonging.

What has actually changed in these 3 years? I haven't had the chance to go through everything since I rejoined, and there's not enough space to go into it all here, so I'll just give you a few snippets.

Office

The first of these is probably the new office: the move from Zsigmond Square to Montevideo Street was already underway while I was still here as a junior. I couldn't enjoy it then and wasn't part of the actual move-in, but when I returned three years later, I still felt as if I had had a hand in shaping it. By that I mean that the ethos that makes Visuallabs Visuallabs has, I think, changed very little, and the homeliness of the office reflects it.

Specialisation

The company has made huge progress in terms of specialisation and staff numbers while I was away: the team has grown to 35 people, and there are now separate business units for all the tasks I had the opportunity to try on a rotational basis as a junior. These are the CE team, who build business applications for clients; the data team, who deliver data analytics and visualisation solutions; and the ERP team - of which I became part - where we introduce Microsoft's enterprise management solutions (Dynamics 365 Finance and Operations and Business Central) to clients.

What I would perhaps highlight is that even though these specialisations have evolved, they have not brought siloed operation with them. To deliver work in our own area, we have access to the knowledge of other areas, and we help each other across teams to deliver the highest quality service possible. From this perspective, what has changed in three years? I would say nothing: what worked then on a small scale works now on a bigger scale.

Agile operation

We have had a solid vision of how we deliver solutions since I was a junior employee here: the agile methodology. What was then in its infancy is now mature. Even if we are not fully agile, we use the agile elements so well that they support our day-to-day work to a great extent. It helps us communicate internally and with our customers by allowing them to post issues in DevOps that we help them resolve; we write features, user stories and test cases that support needs assessment and implementation. We have daily stand-up meetings with the team in the mornings where we discuss our stumbling blocks, at the end of the week we hold sprint sessions where we plan the next week's sprint, and monthly we have a retro where we pay special attention to giving each other feedback, looking back on the past month.

Team and all the fun

Unfortunately, during my first stint I didn't get to experience much of this live because of Covid, but even then the short conversations at the beginning of a call or at the morning "all-people" DSMs reinforced the sense of belonging to the team and the good atmosphere. Fortunately, we have kept this habit ever since, so no call is ever dull. And once the epidemic subsided, these community events only grew stronger, with regular team-building events, VL retreats, and co-hosted Christmas and Halloween parties. A day at the office is also good fun. Although it varies from day to day, we have little rituals that colour the days and take the focus off work: the daily lunch together in the office, chit-chat while making coffee, a funny comment passed to the next desk, or the monthly office day when we all go in and look back over the past month. In short, you never get bored here. 😊

Coming back to a place where I've worked before is a special experience - especially when so much has changed in the meantime. VisualLabs has retained the supportive community and vibrancy that I grew to love, while reaching new levels of development and professionalism. This journey has been a learning experience not only for the company, but also for me, as the old and new experiences have given me a stronger, more mature perspective. I look forward to being a part of the next chapter and seeing where the company goes in the future!

Recap: Budapest BI Forum
10 mins read
Apr 9, 2025


Hey everyone! Here’s a summary of the Budapest BI Forum 2024, where I had the chance to dive into some intriguing topics and engage in inspiring conversations.

The first day was a full-day Tabular Editor workshop, where we covered the basics and discussed topics such as controlling perspectives, writing macros, and refreshing partitions. The other two days of the conference were packed with learning, and here are my key takeaways from my favorite sessions.

Keynote Speech: BI Trends

The day kicked off with a keynote that explored current and future BI trends.

Bence, the main organizer and host of the event, supported his key points with insights from Gartner research and similar studies. A few highlights that caught my attention:

  • By 2025, data security and data governance are expected to top the list of priorities for executives.
  • The rapid rise of AI introduces scenarios where users export data from dashboards to Excel, feed it into tools like ChatGPT, and generate their own insights. While exciting, this raises concerns about security and "shadow reporting," issues companies have tried to curb for years.

As a contractor and consultant, I find this especially ironic. Large companies often hesitate to share data, even when it's crucial for project development. They implement robust policies like VPNs and restricted searches to prevent leaks, yet at the same time they struggle to monitor and control employee behaviour, such as inadvertently sharing sensitive data.

This evolving dynamic between AI, data security, and governance will definitely be a space to watch closely.

Read more about Gartner’s 2024 BI trends here.

PBIR: Report Development in Code

This technical session introduced the PBIR format, a preview feature that allows Power BI reports to be stored as individual JSON files for each visual and page, instead of a monolithic file.

The feature’s potential for bulk modifications was the most exciting part. The presenter showed how Python scripts could iterate through the JSON files to apply changes (e.g., adding shadows to all KPI cards) across the report.
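For a rough idea of what such a bulk edit can look like (my own sketch, not the presenter's script), a Python pass over the PBIR files might be structured like this; the folder layout and the dropShadow property name are assumptions about the preview format.

import json
from pathlib import Path

# Hypothetical PBIR report folder; the "pages/<page>/visuals/<visual>/visual.json"
# layout reflects the preview format and may change in later versions.
report_dir = Path("MyReport.Report/definition/pages")

for visual_file in report_dir.glob("*/visuals/*/visual.json"):
    visual = json.loads(visual_file.read_text(encoding="utf-8"))

    # Illustrative edit only: the real shadow setting lives in the visual's
    # formatting objects, whose exact schema depends on the preview version.
    if visual.get("visual", {}).get("visualType") == "card":
        visual["visual"]["dropShadow"] = True  # placeholder property name

    visual_file.write_text(json.dumps(visual, indent=2), encoding="utf-8")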

While still in preview and somewhat buggy, it’s a promising direction. I’m also intrigued by the integration possibilities with VS Code and GitHub Copilot, which could simplify automation for non-coders.

However, it seems TMDL language won’t be integrated into PBIR anytime soon—a bit disappointing, but I’m optimistic this will eventually happen.

TMDL Enhancements in Power BI & VS Code

One of the most exciting parts of the forum was exploring updates to TMDL (Tabular Model Definition Language), designed to make Power BI model development more efficient.

TMDL View in Power BI

This might be the feature I’m most excited about! The ability to edit your semantic model as code directly inside Power BI is a massive leap forward. Combining drag-and-drop, Copilot, and coding will make development smarter and faster.

Immediate Code Updates in Power BI (Planned for Next Year)

A handy feature to look forward to is real-time synchronization between modified TMDL code and Power BI. Changes to the model will reflect instantly in Power BI without reopening the file, saving tons of time during development.

VS Code TMDL Extension

The TMDL extension in VS Code offers:

  • Formatting: Automatically organizes TMDL syntax.
  • IntelliSense and Autocomplete: Speeds up coding with intelligent suggestions.
  • Expand/Collapse Functionality: Makes navigating larger TMDL files easier.

Get the extension here.

 

Copilot Integration in VS Code

Copilot lets you generate measures, calculations, and scripts with AI assistance. For example, as you type "Profit," Copilot suggests a complete formula based on the context. It’s a productivity boost I can’t wait to leverage more!

Online Editing with VSCode.dev

You can now edit repositories directly in your browser using the vscode.dev prefix for your repository URL. It’s perfect for quick edits without setting up a local environment.

These updates are poised to make model development faster, smarter, and more collaborative for teams using GitHub and VS Code.

Lunch Break with Insights from Microsoft

Lunch turned into one of the highlights of the day when Tamás Polner, a key figure at Microsoft, joined our table. Tamás shared some fascinating insights about the current direction of Microsoft’s data ecosystem and upcoming trends:

  • Fabric focus: Microsoft is heavily prioritizing Fabric over tools like ADF and Synapse, which are expected to receive basically no new feature updates as development resources shift toward Fabric. While this has been an industry assumption for a while, it was great to have this firsthand confirmation. The message is clear: Fabric is the future of Microsoft’s data ecosystem.
  • Data security: Reflecting on the keynote’s emphasis on data security, Tamás explained that this aligns with what he’s seeing at Microsoft. The number of developers in the security team is increasing significantly, and this trend doesn’t seem to be slowing down.
  • Optimized compute consumption: We also discussed CU (Compute Unit) optimization in Fabric. Tamás reaffirmed something I’d heard in Fabric training sessions: notebooks are far more powerful and efficient than UI-powered features like Dataflow Gen2. They use significantly less compute capacity, making them the better choice for many workflows.
  • DP-600 exam: Tamás mentioned that the DP-600 exam has become one of the most successful certifications in Microsoft’s history, with a record-high number of certifications achieved in a short time.
  • Copilot and AI: Copilot is a major focus for Microsoft, but its rollout faces challenges due to the high resource intensity of AI models. Tamás noted that, like other companies deploying built-in AI solutions, Microsoft needs to continue investing heavily in CAPEX for computing power to make these solutions broadly accessible.

 

This conversation provided valuable context and insight into Microsoft’s strategic priorities and was a great opportunity to discuss industry trends and technical strategies in detail.

 

Storytelling with Power BI

This session revisited a topic close to my heart: how to create Power BI reports that truly connect with their audiences. The presenter broke it down into three key phases:

  1. Research: Start by understanding the report’s purpose. Who will use the report? What decisions should it support? Can the goal be summarized in one clear, concise sentence?
  2. Create: Develop the report based on your research. Ensure that the visuals, design, and structure align with the user’s needs and the intended outcomes.
  3. Deliver: It’s not just about handing over the report and documentation, then walking away. True success lies in monitoring how the report is used and gathering user feedback. This feedback often reveals both strengths and weaknesses you didn’t anticipate, providing opportunities to refine and enhance the report further.

While much of this was a confirmation of what I already practice, it underscored an essential point: The discovery phase and follow-ups are just as critical as the actual development process.

It also reinforced for me that educating clients about the value of these stages is crucial. When clients understand that investing time and resources into proper research and post-delivery follow-ups leads to better reports and happier users, they’re much more likely to embrace these processes.

 

Final Thoughts

The day was packed with insights, but what truly stood out was the seamless blend of technical innovation and strategic foresight. Whether it was exploring new options like TMDL and PBIR, or gaining a deeper understanding of the big-picture trends shaping the future of BI, the forum offered something valuable for everyone.

Of course, the lunch chat with Tamás was a treasure trove of insider knowledge—easily one of the event’s highlights for me. Another personal highlight was a heartfelt conversation with Valerie and Elena, who encouraged me to take the next step in my professional journey: becoming a conference speaker.

If any of these topics piqued your interest or you’d like me to dive deeper into specific sessions, just let me know—I’d be happy to share more!

Create efficient and customized Release Notes with Bravo Notes
10 mins read
Apr 9, 2025


For our customers, it is important that when we deliver a new version of their existing IT system, we also provide a release note on the content and functionality of the released package. At Visuallabs, we constantly strive to meet our customers' needs to the maximum, all while simplifying our own workflows and increasing our administrative efficiency. We are supported in this by the Bravo Notes available in DevOps. Using this plug-in, we produce a unique yet standardized Release Note with each new development package delivery. This allows us to meet our customers' requirements in a fast and standardized way.

What is needed to do this?

By following a few simple principles in our delivery processes, the documentation we already produce provides a good basis for generating standard version documents in a few steps for our releases or bug fixes.

How do we document?

  • We strictly adhere to the conventions for using the various purpose-specific fields available on a given DevOps work item and fill them in in a way that suits the document to be generated.
  • User Story descriptions are prepared in a standard format. This allows us to provide standard quality for our customers and to build in automated document generation.
  • Tickets are grouped by delivery unit. This helps when responding to multiple business challenges from the customer at the same time. Documentation of delivered enhancements and system changes can then be categorised in one document.

Using Bravo Notes

Bravo Notes provides technical assistance to help you meet these requirements with the right customisation. The main functions we use:

  • Compiling content: there are several options to choose from when selecting items from DevOps. We use Query most often among the options shown in the screenshot below, because the multiple filtering criteria allow us to select relevant elements more efficiently, thus making the documentation more precise.
  • Template: In Bravo Notes, we have created various templates to organise the release content into a proper structure.

Main units of the template developed:

  • In the case where several delivery units or business processes are involved for a system release, the relevant descriptions are grouped together in the document.
  • A further organizing principle in the template is that new developments are shown in a feature-by-feature breakdown, and solutions to bugs are also shown in a separate unit. This makes it clear which supported feature a given release item refers to, whether it is a new development or a bug fix.
  • Use parameters: parameters based on business processes allow you to customise the generation of documents. During generation, you can change the title, date, release date and add comments to the document. You can also specify the applications and resources involved, for example, which business area or environment is affected.
  • Display of document units and headings based on a set of rules: the template displays only the relevant headings and document parts; e.g. if a given delivery unit contained no bug fixes, its heading is not displayed either.
  • Fields used in the template: as defined above, we provide easy-to-read descriptions for the released developments. The consistent documentation of the DevOps tickets used in the design or development process allows this to be done quickly and in a standardized way. The content of the fields defined in the template about the tickets is automatically included when the document is generated.
  • Export: After generation and verification, we export the document to PDF format.

In summary: it is important for our customers to receive detailed, business-relevant documentation on the new versions of the systems they use, while we also simplify our own workflows. The Bravo Notes module integrated into DevOps supports us in achieving both goals. With this plug-in, we create customized yet standardized Release Notes with each new development package delivery. This allows us to meet our customers' requirements in a fast and standardised way, providing them with the necessary information and transparency on system changes and enhancements.

There's a new way to turn business ideas into app frameworks
April 23, 2025
5 min read

Imagine describing an app you need in your own words and getting a basic app framework in minutes. With Plan Designer in Power Apps, that’s already becoming possible.

What is the Plan Designer?

Plan Designer is a new Copilot experience within Power Apps. It allows users to describe their app in natural language and receive a structured starting point.  

This is part of Microsoft’s broader push to bring generative AI into everyday business tools. While it doesn't yet deliver complete or production-ready applications, it offers a strong foundation that helps teams move faster, validate ideas earlier, and collaborate more effectively with dev teams when it’s time to build.

Important to know: It’s still in preview

Plan Designer is currently available as a public preview feature. That means it’s not production-ready yet, and it’s not recommended for complex or business-critical use cases.

It’s a promising direction, and there are many more improvements in the pipeline. But for now, think of it as a way to jumpstart your ideas, not as a full replacement for expert-built solutions. Let’s see how:  

From idea to app structure, without coding

Some of the best ideas for internal apps come from the people who work closest to the process.  

You’ve likely experienced it yourself: you know exactly what your team needs, whether it’s a simple PTO planning tool or a way to track field tasks. You understand the workflow, the challenges, and the users. But when it comes to turning that insight into a working app, you’re not sure how to get started.

That’s been the reality for many business users.

Historically, PowerApps has been aimed at non-developers, people in HR, customer service, field operations, and sales. These users know their business inside and out but often lack the technical or systems thinking skills to design a well-structured, scalable app. As a result, many apps were either overly simple or hard to maintain and improve.

That’s where Plan Designer comes in.

It offers a more guided way to get started. Instead of starting from scratch, you describe what you need in natural language, for example, “I need a tool to assign jobs to field technicians.” You can even upload visuals, like a screenshot of an old tool or a process diagram.  


Based on your input, Copilot generates a structured draft of your app.  

What you get is a smart skeleton, with suggested tables, screens, user roles, and basic logic. It proposes a data model and automation ideas using Power Automate, all based on your prompts. You can then review, adjust, or approve what Copilot gives you before it builds out the logic.

It won’t give you a finished app, but it gives you a strong starting point, one that reflects your intent and helps you think through how your app should be structured. That’s a big step forward for anyone who understands the business problem but not the development process.

What can you currently do with Plan Designer?

To access the Plan Designer, you need a preview environment with early feature access enabled. Once set up, you can start designing solutions directly from the Power Apps homepage by toggling on the new experience.

It’s still the early days, so it’s important to set the right expectations. As of April 2025, Plan Designer has the following capabilities:  

Natural language input

Based on natural language input, the Plan Designer will generate a solution tailored to your needs. This includes creating user roles, user stories, and data schemas.

Solution generation

The tool can create basic end-to-end solutions, including:

  • Dataverse tables
  • Canvas apps
  • Model-driven apps

Iterative development

You can refine your plans by providing feedback during the design process to make sure that the generated solution aligns with your specific needs.

Collaboration and documentation

The generated plan serves as both a blueprint for development and documentation for future reference to help teams align on business goals and technical execution.

Integration with Power Platform tools

While still in preview, the tool integrates with other Power Platform components like Dataverse and Power Apps. However, some features (e.g., Power Pages support and advanced data modeling) are not yet available.

Limitations in the preview

The tool currently does not support generating full Power Automate flows or using common data model tables like accounts or contacts. Features like analytics integration, Azure DevOps compatibility, and document uploads (e.g., RFPs) are not yet implemented.

The feature set is evolving rapidly, with updates rolling out every few days. One recent improvement: Copilot now explains which AI agents are working on different tasks, for example, the requirement agent, data agent, or solution agent.


To sum up, Plan Designer helps you get the core pieces in place in just a few minutes. It’s especially useful for:

  • Prototyping apps without waiting for a developer
  • Practicing prompt-writing to refine app design
  • Getting a better understanding of how systems and logic fit together

It’s great for playing around, testing out concepts, and learning how to approach app development with systems thinking. Let’s see how this might change in the coming years.  

How you’ll use Plan Designer in the near future

Let’s say there’s a process in your team that’s manual, slow, or inconsistent, and you know exactly how it should work. Maybe it’s tracking field work, collecting customer data, or planning PTO.  

You have the knowledge to solve it. What you don’t always have is the time, tools, or technical background to build the solution yourself.

That’s the direction Plan Designer is moving in. It will help you translate your ideas into something concrete: a data model, screens, and suggested relationships. It will give you a head start, so you won’t have to start from scratch.

Here’s what that might look like in practice:

  • You’re a field manager who needs to track technician assignments and jobs.

You describe your idea to Copilot, and it creates basic tables like “Jobs” and “Technicians,” with suggested relationships between them. The logic and visuals still need work, but you now have a structure to build on.

Looking for inspiration to improve efficiency in Field Service? Check out our use cases here.

  • You’re in sales and want to explore upsell recommendations for client visits.

Copilot sets up a rough draft with placeholders for customer info and past purchases. It doesn’t connect to CRM data yet, but it helps you map out the concept before looping in technical teams.

  • You’re on a support team and want to build a customer intake form.

You describe the form and basic routing needs, and Copilot creates a simple layout with suggested fields and logic. You’ll need to tweak it, but it’s a much faster way to get started.

While these examples are simple, they give you an idea of where things are heading. Plan Designer isn't here to replace software engineers but to allow business teams to move faster and speak the same language as their dev teams.

Turning your starting point into a real solution

At VisualLabs, we follow every development in the Microsoft ecosystem closely and we’re excited about Plan Designer’s progress. It’s already a powerful tool for creating skeleton apps, exploring ideas, and learning how data models and logic come together.

But when you need more than just a starting point, when performance, integration, scalability, and usability matter, our team is here to help. We bring the expertise to take your idea and turn it into a reliable, well-designed app that fits your organisation’s needs.

AI is changing how we build apps, but human insight still makes the difference.  

Interested in what use cases our customers are prioritising? Check out our case studies here.

Handling REST API Data in ADF: A Guide to Pagination and JSON Aggregation
April 9, 2025
8 min read

When working with data from REST APIs, it's common to encounter limitations on how much data can be retrieved in a single call. Recently, I faced a challenge where the API limited responses to 1000 rows per call and lacked the usual pagination mechanism, such as a "next page URL" parameter in the response. This absence makes it difficult for developers to automate the data retrieval process, as there's no clear way to determine when all the data has been retrieved.

In scenarios like data backup, migration, or reporting, this limitation can become an obstacle. For instance, using a real-life scenario as an example: if your company manages its HR-related processes in SAP SuccessFactors, you’ll encounter this same challenge. Without a native connection between SuccessFactors and Power BI, one of the viable options for pulling data for reporting or building a data warehouse (DWH) is through REST API calls. While Power BI offers a quick solution via Power Query, it’s not always the best tool—particularly when dealing with larger datasets or when corporate policies limit your options. This is where Azure Data Factory (ADF) becomes a more appropriate choice. However, ADF presents its own challenges, such as handling API responses larger than 4MB or managing more than 5000 rows per lookup activity.

This article will walk you through overcoming these limitations when working with JSON files and API responses in ADF. By the end, you'll learn how to append multiple API responses into a single result and handle API calls with unknown pagination parameters.

While alternative solutions like Python or Power BI/Fabric Dataflow Gen2 may offer quicker implementations, there are cases where ADF is necessary due to restrictions or specific use cases. This guide is tailored for ADF developers or anyone interested in experimenting with new approaches.

If you're familiar with working with APIs, you're likely aware that limitations on the number of rows per call are a common practice. Typically, APIs will let users know if there’s more data to retrieve by including a "next page URL" or similar parameter at the end of the response. This indicates that additional API calls are necessary to retrieve the full dataset.

However, in the scenario we faced, this part of the URL was missing, leaving no clear indication of whether more data remained in the system. Without this pagination rule, it's challenging to determine from a single successful API call whether more API requests are required to retrieve the complete dataset. This makes automating the data retrieval process more complicated, as you have to implement a method to check whether further calls are necessary.

This is the first issue we’ll solve using ADF.
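Before building this in ADF, it may help to see the same skip-until-empty logic as a minimal Python sketch; the URL, the $skip/$top parameters and the d.results structure are placeholders, modelled loosely on the SuccessFactors-style response used later in the Power Query example.

import requests

BASE_URL = "https://example.com/odata/Records"  # hypothetical endpoint
PAGE_SIZE = 1000

all_rows = []
skip = 0

while True:
    # Request the next slice of rows; the parameter names are illustrative.
    response = requests.get(BASE_URL, params={"$skip": skip, "$top": PAGE_SIZE})
    response.raise_for_status()
    rows = response.json().get("d", {}).get("results", [])

    # There is no "next page URL" in the response, so an empty page is the only stop signal.
    if not rows:
        break

    all_rows.extend(rows)
    skip += PAGE_SIZE

print(f"Retrieved {len(all_rows)} rows in total")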

The second issue involves handling API responses in JSON format and merging different JSON files into a single result.

If the requirement is to store all the data in one single JSON file, there are several approaches you can take:

  1. Create one single JSON file from all the responses and store or process it later.
  2. Generate multiple JSON files, one for each API call, and then flatten them into one file at a later stage in the ADF pipeline.

Alternatively: Write the data directly to a SQL table, if the final destination is a database.

In this article, we’ll focus on Option 1—creating one single JSON file from multiple responses. While this may not be the ideal solution, it presents a unique challenge when working with JSON arrays in ADF.

While the solution itself is not overly complicated, there are a few important points where developers should proceed with caution.

First, be aware of the limitations ADF imposes on each Lookup Activity or Web Activity: the output size is restricted to 4 MB or 5000 rows. If your response size slightly exceeds this limit, you can adjust the settings, lowering the number of rows per call from 1000 to, say, 800. However, keep in mind that this adjustment could significantly increase the overall runtime, especially if you're dealing with many columns of data. In such cases, consider an alternative approach, such as using the Copy Activity to write the data directly into a SQL database, or generating multiple JSON files and merging them into one later.

Another critical point is the use of loops. The solution involves two loops, so it’s essential to carefully handle scenarios that could result in endless loops. Proper checks and conditions must be implemented to avoid such issues and ensure smooth execution.

Implementation - ADF

Here is the logic of the entire pipeline:

To manage API pagination and build a single valid JSON file in ADF, you will need to define several variables, as shown in the image above.

Variables setup:

  1. Skip X Rows (Integer): stores the number of rows to skip in each REST API call, which is the dynamic part of the URL.
  2. Skip X Rows - Temporary (Integer): needed because ADF doesn’t support self-referencing for variables. You can’t directly update Skip X Rows using itself, so this temporary variable helps track progress.
  3. REST API Response is empty? (Boolean): indicates whether the last API response was empty (i.e., no more data), which triggers the loop to stop.
  4. API Response Array (Array): stores each individual API response during the loop, so you can gather all responses one by one before processing them.
  5. All API Response Array (Array) [Optional]: can be used to store all responses combined after the loop finishes.
  6. Current JSON (String): stores one individual API response in JSON format as a string.
  7. Interim Combined (String): stores the concatenated JSON responses as you append them together in the loop.
  8. Combined JSON (String): holds the final complete JSON result after all responses have been processed and combined.

Step-by-Step Execution

1) Initialize variables:

  • Set Skip X Rows to 0. This represents the starting point for the API.
  • Set Skip X Rows - Temporary to 0. This is a temporary counter to help update the primary skip variable.
  • Set REST API Response is empty? to false. This Boolean will control when to stop the loop.

2) Add an UNTIL activity: set it up with the condition @equals(variables('REST API Response is empty?'), true) so the loop keeps running as long as there is data to retrieve.

3) Inside the loop:

a) Lookup Activity (initial API call):

  • Perform a Lookup Activity calling the REST API, but limit the returned data to a single column (e.g., just the ID, which should never be empty if the row exists). This keeps the response light and lets you check whether more data exists.

b) IF Condition (check response):

  • If the response is empty, set REST API Response is empty? to true to end the loop.
  • If it is not empty, proceed to the next step.

c) Full API call:

  • Perform the full REST API call to retrieve the desired data.
  • Append the response to the API Response Array variable.

d) Update variables:

  • Increase Skip X Rows - Temporary by the number of rows retrieved (e.g., 1000).
  • Set Skip X Rows to the value of Skip X Rows - Temporary to update the dynamic part of the API URL.

4) Handle failure scenarios:

  • Optionally, but highly recommended: add a fail condition or a timeout check that breaks the loop if there is a problem with the API response (e.g., a 404 error).

After gathering all the API responses, you'll have a list containing multiple JSON arrays. You’ll need to remove the unnecessary brackets, commas and other JSON elements. To do that, you’ll need a ForEach loop that iterates over all the JSON arrays in the array variable and modifies them accordingly.

The steps followed inside the ForEach loop:

  1. Save the currently iterated JSON into a variable as a string (string format is needed so you can manipulate the response as text).
  2. Modify the JSON: to flatten the JSON files into a single file, you need to remove the leading “[“ or trailing “]” character so that concatenating the pieces results in a valid file.
  3. Save the result to another variable that stores the final JSON: due to the lack of self-referencing in ADF, you need to update the combined JSON variable every time a new JSON piece is added (see the sketch below).
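Conceptually, the string manipulation inside the loop does the following (shown here in Python rather than ADF expressions; the sample responses are made up):

import json

# Each element is one raw API response: a JSON array serialized as text
# (in the pipeline this is the "API Response Array" variable).
api_response_array = [
    '[{"id": 1}, {"id": 2}]',
    '[{"id": 3}, {"id": 4}]',
]

combined = ""
for current_json in api_response_array:
    # Strip the enclosing brackets so the pieces can be chained together.
    inner = current_json.strip()[1:-1]
    # Append to the running result, adding a comma between pieces.
    combined = inner if not combined else combined + "," + inner

# Re-wrap once at the end to get a single valid JSON array.
combined_json = "[" + combined + "]"
print(json.loads(combined_json))  # parses as one array of four records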

And that’s it! You have successfully addressed both the pagination and JSON file handling challenges using ADF.

Implementation – Power BI

Compared to this, the solution in Power Query is much more straightforward. You need one function in which you can control the number of rows to skip, and which calls the API 1000 rows at a time. And you need another query that acts as a while loop, calling the function until the API returns an empty response. Once it’s done, you can combine the list of tables into one table; by expanding it, you’ll end up with the complete dataset. Here is the code of the function:

let
    Source = (rows_to_skip as number) =>
        let
            Base_URL = "https://Your_API_URL",
            Relative_URL = "The relative URL part of your API call",
            Source = Json.Document(
                Web.Contents(
                    Base_URL,
                    [RelativePath = Relative_URL & Number.ToText(rows_to_skip)]
                )
            ),
            // Additionally, you can convert the response directly to a table with this function
            Convert_to_Table = Table.FromRecords({Source})
        in
            Convert_to_Table
in
    Source




Here is the query which will invoke the function and act as a while loop.  
let
    Source =
        // Create a list of tables
        List.Generate(
            // Call the function with the input parameter set to 0 for the first call.
            () => [Result = try Function_by_1000(0) otherwise null, Page = 0],
            // Check whether the first row (referenced by the "{0}" part) is empty. For this particular API,
            // it checks the "results" list inside the "d" element of the response.
            each not List.IsEmpty([Result]{0}[d.results]),
            // Call the function again, increasing its input parameter by 1000 (the maximum rows per API call).
            each [Result = try Function_by_1000([Page] + 1000) otherwise null, Page = [Page] + 1000],
            each [Result]
        )
in
    Source

How to Trigger Power BI Refresh from Azure Data Factory
April 9, 2025
22 min read

In this article, I will show you how to connect Azure Data Factory (ADF) with Power BI to automate report refreshes. This integration keeps your Power BI dashboards up-to-date automatically, saving time and ensuring your data is always current for better decision-making.

Idea & Concept

The main challenge I’ve faced in Power BI is that I need to refresh my data hourly, but the unpredictability of the data refresh schedule can lead to inconsistencies in my reports. If the refresh process for a data source takes longer than expected, a new hourly refresh might start before the previous one finishes. This could cause Power BI to pull some data from the new refresh while the old refresh is still in progress, resulting in a mix of data from different cycles, which causes inconsistencies in the report or can even break it by creating invalid relationships between tables.

Most organizations rely on ETL (Extract, Transform, Load) or ELT (Extract, Load, Transform) processes to load and process data within their data warehouse systems, ensuring that data is structured and accessible for analysis. In our case, we leverage Azure Data Factory (ADF) as our standard ETL tool. ADF is a powerful and versatile service that offers a wide range of built-in connectors as linked services, enabling integration with various data sources and destinations. Additionally, ADF has the capability to trigger HTTP requests, expanding its utility beyond simple data transfers.

This feature is advantageous when working with Power BI, especially given that Power BI’s online service offers numerous built-in APIs. These APIs can be leveraged to automate tasks such as dataset refreshes. By integrating ADF with Power BI through these APIs, we can synchronize ETL processes with Power BI report refreshes, ensuring that the most up-to-date data is available for analysis as soon as it’s loaded.

For instance, if there is a requirement to refresh datasets hourly, ADF can be configured to initiate the refresh process automatically after each data load operation. This not only optimizes the workflow but also guarantees that reports are consistently updated with the latest data, enhancing the accuracy and timeliness of insights. In the following example, we will demonstrate how to configure ADF to load data on a daily schedule and subsequently trigger a Power BI report refresh (in the case study we use a Power BI Dataflow which supplies data to a Power BI Semantic Model), ensuring that your reports are always up to date with the latest data.

In terms of pricing, you pay for Azure Key Vault and Azure Data Factory; the costs depend on your configuration and usage. You can find their pricing here: Azure Data Factory, Azure Key Vault.

In this solution, we will rely on Microsoft’s trusted network for connectivity and will not configure a virtual network. However, we recommend using a virtual network across the entire solution to enhance security, provide better control over network traffic, and protect your resources more effectively.
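To illustrate what the pipeline’s Web activity will ultimately send, here is a minimal Python sketch of the client-credentials call against the Power BI REST API; all IDs are placeholders, and in the actual pipeline the client secret is fetched from Azure Key Vault rather than hard-coded.

import requests

TENANT_ID = "<tenant-id>"            # placeholder values
CLIENT_ID = "<app-client-id>"
CLIENT_SECRET = "<from-key-vault>"   # in ADF this comes from Azure Key Vault at runtime
GROUP_ID = "<workspace-id>"
DATASET_ID = "<dataset-id>"

# 1) Acquire a token for the Power BI REST API using the service principal.
token_response = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "https://analysis.windows.net/powerbi/api/.default",
    },
)
access_token = token_response.json()["access_token"]

# 2) Trigger the dataset refresh; 202 Accepted means the refresh has been queued.
refresh_response = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes",
    headers={"Authorization": f"Bearer {access_token}"},
    json={"notifyOption": "NoNotification"},
)
print(refresh_response.status_code)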

Prerequisites

To implement the integration between Azure Data Factory (ADF) and Power BI, several configurations are required. These steps will ensure secure access, manage permissions, and enable the necessary APIs to support the workflow.

  1. App Registration in Azure Entra ID (formerly known as Azure Active Directory or AAD)

This registered application will serve as the identity for the process, enabling secure communication between ADF and Power BI.

  • Generate a client secret for the App Registration. This secret will be used for authentication when making API calls to Power BI services.
  • Assign the appropriate API permissions to the App Registration. Specifically, grant permissions to access Power BI online services, allowing the application to manage dataset refreshes.
  2. Security Group in Azure Entra ID

This group will be used to control and restrict access to the Fabric API, ensuring that only authorized users and applications can interact with Power BI resources.

  • Add the App Registration to this security group. This step is crucial for enforcing security policies and limiting API access.
  3. Azure Key Vault

An Azure Key Vault is required to securely store sensitive information, such as the client secret created in the App Registration process.

  • Upload the client secret to the Azure Key Vault. This ensures that the client secret is securely stored and can be retrieved by ADF during the execution of ETL processes.
  4. Enable Fabric APIs in Power BI Admin Portal

Finally, Fabric APIs must be enabled in the Power BI Admin Portal. This is a crucial step as it allows the registered application to interact with Power BI, including the ability to trigger dataset refreshes.

  • Fabric Administrator permission is required to enable these APIs in the Power BI Admin Portal. This involves granting the necessary permissions and configuring the environment to support API interactions.

Each of these steps requires precise configuration within Azure. The Fabric Administrator plays a vital role, especially in enabling the Fabric APIs within Power BI Admin Portal and ensuring that all settings align with organizational security policies. Throughout the setup, we will specify and configure these services in Azure.

Implement the solution

The first step in integrating Azure Data Factory (ADF) with Power BI is to create an App Registration in Azure Entra ID. This registered application will serve as the identity through which your ADF process can securely interact with Power BI.

Setting up the Environment

Step 1: Create an App Registration

  1. Navigate to Azure Entra ID (Azure Active Directory):
  • Sign in to the Azure portal.
  • In the left-hand navigation pane, select “Azure Entra ID” (formerly Azure Active Directory).
  2. Create a New App Registration:
  • In the Azure Entra ID blade, select “App registrations” from the menu.
  • Click on the “New registration” button to start the process.
  3. Specify the App Registration Details:
  • Name: Enter a name for the application. This name can be changed later, so you can choose something descriptive like “ADF-PowerBI-Integration.”
  • Supported Account Types: For this scenario, select “Accounts in this organizational directory only (Single tenant).” This option restricts access to users within your organization’s Azure AD tenant.
  • Redirect URI (optional): You can leave this blank for now, as it’s not required for the initial setup.
  4. Register the Application:
  • Once you have filled in the necessary details, click on the “Register” button to create the App Registration.
  5. Default Permissions:
  • By default, the App Registration will have some basic permissions. Note that in Azure, every user has the ability to register an application, but the required API permissions and other advanced configurations will need to be set up afterward.

Step 2: Create a Client Secret for Your App Registration

  1. Navigate to the “Certificates & Secrets” Section:
  • In your App Registration, locate the “Manage” section on the left-hand menu.
  • Click on “Certificates & secrets” under the “Manage” heading.
  2. Select the “Client Secrets” Tab:
  • In the main panel, you will see two tabs: “Certificates” and “Client secrets.”
  • Ensure that the “Client secrets” tab is selected.
  3. Create a New Client Secret:
  • Click on the “+ New client secret” button to create a new client secret. This will be the secret key that your application will use to authenticate itself.
  4. Configure the Client Secret:
  • A panel will open, prompting you to provide a description and an expiration period for the client secret.
  • Description: Provide a name or description for the client secret to help you identify it later (e.g., “ADF Integration Key”).
  • Expires: Choose the expiration duration for the client secret. Options typically include 6 months, 12 months, or 24 months. Select the duration that suits your security policies.
  • Once configured, click on the “Add” button.
  5. Save the Client Secret Value:
  • After creating the client secret, it will be displayed only once in the “Value” column. Make sure to copy this value and store it securely, as you will not be able to retrieve it again once you leave the page.
  • You will use this client secret value later when configuring your Azure Data Factory to authenticate with Power BI.

With the client secret created, the next steps will involve storing it securely in Azure Key Vault and assigning the necessary API permissions for Power BI.

Step 3: Assign Delegated Permissions for Power BI Dataset and Dataflow Refresh

Delegated permissions allow your application to act on behalf of a signed-in user, meaning the app will perform actions based on the user’s privileges. This is useful when the application needs to perform tasks within the context of a user’s session.

3.1. Navigate to API Permissions

  1. After configuring your client secret, go back to the App Registration overview page in the Azure portal.
  2. In the left-hand menu under the “Manage” section, select API permissions.

3.2. Add Power BI API Permissions

  1. On the API permissions page, click the + Add a permission button.
  2. In the “Request API permissions” pane, select Power BI Service.

3.3. Choose Delegated Permissions

  1. In the Power BI Service section, choose Delegated Permissions.
  2. From the list of available permissions, select the following:
    • Dataset.ReadWrite.All: This permission allows the signed-in user to read and write all datasets, including the ability to refresh datasets.
    • Dataflow.ReadWrite.All: This permission allows the signed-in user to read and write all dataflows, including the ability to refresh dataflows.
  3. After selecting these permissions, click the Add permissions button.

Scope of Permissions: The app will only be able to act within the scope of the permissions granted to the user. If a user does not have permissions to refresh certain datasets or dataflows, the app won’t be able to perform those actions.

It’s important to note that the ReadWrite.All permission is only effective in workspaces where the security group has been added as a member. Attempting to access datasets in workspaces outside of this scope will result in an error, such as: {"error":{"code":"ItemNotFound","message":"Dataset XY is not found! Please verify the datasetId is correct and that the user has sufficient permissions."}}

Step 4: Create a Security Group and Add the App Registration as a Member

To further secure your application and manage access to Power BI resources, you can create a Security Group in Azure Entra ID (formerly Azure Active Directory) and add the App Registration as a member. This step helps you control and restrict access to the Power BI APIs and resources more effectively.

4.1. Create a Security Group in Azure Entra ID

  1. Navigate to Azure Entra ID:
    • In the Azure portal, go to Azure Entra ID from the left-hand menu.
  2. Create a New Security Group:
    • In the Azure Entra ID blade, select Groups from the menu on the left.
    • Click on the + New group button to create a new security group.
  3. Configure the Security Group:
    • Group Type: Select Security as the group type.
    • Group Name: Enter a name for the security group, such as “PowerBI-ADF-Access”.
    • Group Description: Optionally, provide a description for the group, such as “Security group for ADF to access Power BI”.
    • Membership Type: Leave this as “Assigned” to manually add members.
  4. Create the Group:
    • Once all the details are filled in, click on the Create button to create the security group.

4.2. Add the App Registration as a Member

  1. Locate the Newly Created Security Group:
    • After the group is created, it will appear in the list of groups. Click on the group name to open its settings.
  2. Add Members to the Security Group:
    • In the group settings, go to the Members tab.
    • Click on + Add members to open the member selection pane.
  3. Select the App Registration:
    • In the “Add members” pane, search for the App Registration name you created earlier (e.g., “ADF-PowerBI-Integration”).
    • Select the App Registration from the search results and click Select.
  4. Confirm Membership:
    • The selected App Registration should now appear in the members list of the security group.
    • Confirm that the App Registration has been added as a member.

Step 5: Create an Azure Key Vault and Store the Client Secret

To securely store sensitive information like the client secret generated during your App Registration, you should use Azure Key Vault. This ensures that your secrets are stored in a secure, centralized location and can be accessed safely by authorized applications.

5.1. Create an Azure Key Vault

  1. Navigate to the Azure Portal:
    • Sign in to the Azure portal and use the search bar to find “Key Vaults.”
    • Click on “Key Vaults” in the search results.
  2. Create a New Key Vault:
    • In the Key Vaults pane, click on the + Create button.
    • You will be guided through the process to set up your new Key Vault.
  3. Configure the Key Vault:
    • Subscription: Select the subscription in which you want to create the Key Vault.
    • Resource Group: Choose an existing resource group or create a new one.
    • Key Vault Name: Provide a globally unique name for your Key Vault (e.g., “kv-powerbiappreg-prod”).
    • Region: Select the region where you want the Key Vault to be hosted.
    • Pricing Tier: Choose the standard pricing tier unless you require premium features.
  4. Review and Create:
    • After filling in all the required fields, click on the Review + create button.
    • Review your configurations, and if everything is correct, click on Create to deploy your Key Vault.

5.2. Store the Client Secret in Azure Key Vault

  1. Navigate to the Newly Created Key Vault:
    • Once the Key Vault is deployed, navigate to it from the Key Vaults section in the Azure portal.
  2. Add a New Secret:
    • In the Key Vault’s left-hand menu, select Secrets under the “Settings” section.
    • Click on the + Generate/Import button to create a new secret.
  3. Configure the Secret:
    • Upload Options: Select “Manual” to manually enter the secret value.
    • Name: Enter a name for the secret (e.g., “ADF-PowerBI-ClientSecret”).
    • Value: Paste the client secret value you copied when you created it during the App Registration setup.
    • Content Type: Optionally, you can enter a content type like “Client Secret.”
    • Activation and Expiration: You can optionally set activation and expiration dates for the secret.
  4. Create the Secret:
    • After configuring all the necessary fields, click on the Create button to store the client secret in Azure Key Vault.

5.3. Accessing the Secret in Your Application

  1. Grant Access to the Key Vault:
  • You will need to grant your Azure Data Factory (or any other application that will use the secret) access to this Key Vault. This is typically done by assigning appropriate roles or setting up access policies for the service principal (App Registration) or the ADF managed identity.
  • If you use RBAC (Role Based Access Control), assign the Key Vault Secrets User role to the Azure Data Factory managed identity.
  • If you prefer Access Policies, grant List and Get secret permissions to the Azure Data Factory.
  2. Use the Secret in Azure Data Factory:
  • In Azure Data Factory, when setting up linked services or activities that require the client secret, use the Key Vault integration to fetch the secret. This will ensure that the secret is retrieved securely at runtime.
  3. Networking settings:
  • To enhance the security of your Azure Key Vault, it’s important to configure the networking settings to restrict access. This ensures that only specific networks or trusted services can access the secrets stored in the Key Vault.
  • In the Firewalls and virtual networks section, choose the option Allow public access from specific virtual networks and IP. This setting restricts access to the Key Vault from only the specified virtual networks or IP addresses.
  • In the Firewall section, you have the option to allow trusted Microsoft services to bypass the firewall. This is typically recommended as it ensures that essential Azure services, like Azure Data Factory, can access the Key Vault even if network restrictions are in place.
  • You can also specify individual IP addresses or CIDR ranges that are allowed to access the Key Vault. Click on + Add your client IP address to add the current IP or manually enter an IP address or range. You need to add your client IP address to create or manage a secret.  

Step 6: Enable the Fabric API for a Specific Security Group in Power BI Admin Portal and Grant Access to a Specific Workspace

To allow your application to interact with Power BI using the Fabric API, you need to enable this API for a specific security group in the Power BI Admin Portal and grant the necessary access to the desired Power BI workspace.

6.1. Enable the Fabric API for the Security Group

  1. Access the Power BI Admin Portal:
  • Sign in to the Power BI service with an account that has Power BI administrator privileges.
  • Navigate to the Power BI Admin Portal by selecting the settings gear icon in the upper right corner and choosing Admin portal.
  2. Navigate to Tenant Settings:
  • In the Power BI Admin Portal, find and select Tenant settings from the left-hand menu.
  3. Enable the Fabric API:
  • Scroll down to the Developer settings section.
  • Locate the Power BI service settings for APIs or Fabric API settings (the exact name might vary).
  • Expand the section and ensure that the Fabric API is enabled.
  • Under Apply to, select Specific security groups.
  • In the Select security groups box, add the security group you created earlier (e.g., “PowerBI-ADF-Access”).  
  • As mentioned earlier, it’s important to specify a security group to grant access exclusively to the application, rather than to the entire organization.
  • Click Apply to save your changes.

6.2. Grant Access to a Specific Power BI Workspace

  1. Navigate to the Power BI Service:
  • In the Power BI service, go to the Workspaces section to locate the workspace where the app will need access.
  2. Assign Workspace Roles:
  • Select the specific workspace and go to the Settings of that workspace.
  • Navigate to the Permissions tab within the workspace settings.
  • Click on Add members or Add users.
  • Search for and add the security group you configured earlier (e.g., “PowerBI-ADF-Access”).
  • Assign an appropriate role to the group:
  • Member: Allows the group to edit content within the workspace, including refreshing datasets and dataflows. We recommend this option.
  • Contributor: Allows the group to publish reports and refresh datasets.
  • Admin: Grants full control over the workspace.
  3. Save Changes:
  • After assigning the appropriate role, click Save to apply the permissions to the workspace.

Creating the Azure Data Factory Pipeline

With all the foundational elements in place, we're now ready to create the pipeline that will orchestrate our data processes and automate key tasks. Specifically, this pipeline will trigger the refresh of Power BI datasets and dataflows by leveraging the secure connections and permissions we've configured. Keep in mind that the dataflow must be refreshed before the dataset, since the dataflow is the dataset's predecessor.

Step 1: Creating a Linked Service in Azure Data Factory for the Azure Key Vault

Creating a linked service in Azure Data Factory (ADF) allows you to connect ADF to various data sources, including Azure Key Vault, databases, storage accounts, and more. Below is a step-by-step guide for creating a linked service to Azure Key Vault.

  1. Open Azure Data Factory
  • In the Azure portal, navigate to your Azure Data Factory instance.
  • Open the Data Factory UI by clicking on Author & Monitor.
  2. Create a New Linked Service
  • In the ADF UI, navigate to the Manage tab on the left-hand menu.
  • Under the Connections section, select Linked services.
  • Click on the + New button to create a new linked service.
  3. Choose the Linked Service Type
  • In the New Linked Service pane, search for and select Azure Key Vault from the list of available services.
  • Click Continue.
  4. Configure the Linked Service

You will now see the configuration pane for the Azure Key Vault linked service.

  1. Name:
  • Enter a name for your linked service (e.g., AzureKeyVault1).
  2. Description:
  • Optionally, provide a description to identify the purpose of this linked service.
  3. Azure Key Vault Selection Method:
  • From Azure subscription: Choose this option if your Key Vault is within your current subscription.
  • Enter manually: Select this option if you need to manually enter the Key Vault details (not used in this case).
  4. Azure Subscription:
  • Select your Azure subscription from the dropdown list.
  5. Azure Key Vault Name:
  • Choose the Azure Key Vault you created earlier from the dropdown list.
  6. Authentication Method:
  • System-assigned managed identity: This is the recommended method for authentication, as it securely uses the managed identity associated with the Data Factory.
  • The Managed identity name and Managed identity object ID will be automatically populated.

 

  5. Test the Connection
  • Before creating the linked service, it’s a good practice to test the connection.
  • Select Test connection to ensure that ADF can successfully connect to the Azure Key Vault.
  6. Create the Linked Service
  • Once the connection test is successful, click on the Create button to finalize and create the linked service.

Step 2: Set Up a Web Activity for Refreshing a Power BI Dataflow in Azure Data Factory

To automate the refresh of a Power BI dataflow using Azure Data Factory, you can use a Web Activity. The following guide will walk you through setting up the Web Activity to trigger a Power BI dataflow refresh.

  1. Access the Pipeline Editor
    1. Navigate to Azure Data Factory:
      • Open your Azure Data Factory instance in the Azure portal.
      • Go to Author > Pipelines > + New pipeline to create a new pipeline or select an existing pipeline.
    2. Add a Web Activity:
      • In the Activities pane, expand the General section.
      • Drag and drop the Web activity onto the pipeline canvas.
  2. Configure the Web Activity

Here’s how you can configure the Web Activity to refresh your Power BI dataflow:

1. Set the Basic Properties:

  • Name: Set the name of the activity to something descriptive, such as Refresh Power BI Dataflow.

2. Settings Tab:
  • URL:
  • Enter the URL to trigger the dataflow refresh.
  • Use the expression format to ensure the URL is dynamic if needed: https://api.powerbi.com/v1.0/myorg/groups/{GroupID}/dataflows/{DataflowID}/refreshes.
  • Replace {GroupID} with your Workspace ID and {DataflowID} with your Dataflow ID. (The Group ID corresponds to the Workspace ID, which you can find in the URL of the Power BI Service when you open your specific workspace.)
    Group ID example: https://app.powerbi.com/groups/89d2f27a-a923-44c2-8f1f-8fdebd865c9e/list?experience=power-bi
  • Method:
  • Set the method to POST since you are sending a request to initiate a refresh.
  • Body:
  • Set the body to the following JSON so that no notification is sent on failure, since Power BI's built-in alerts will already notify us. From Azure Data Factory's point of view, triggering the dataflow/dataset refresh counts as a successful attempt.

{
  "notifyOption": "NoNotification"
}

3. Authentication:

  • Authentication Method: Select Service principal.
  • Authentication Reference Method:
  • Choose Inline.
  • Tenant: Enter your tenant ID
  • Service Principal ID: Enter your service principal (client) ID. This is your App registration client ID.
  • Service Principal Credential Type: Select Service principal key.
  • Service Principal Key:
  • Reference the secret stored in Azure Key Vault by specifying the linked service and the secret name. Use the latest version option so that a renewed secret is picked up automatically.
  • Resource: Set the resource to https://analysis.windows.net/powerbi/api.

4. Advanced Settings (Optional):

  • Retry: Set the number of retries to 0 if you do not want the activity to retry automatically on failure.
  • Retry Interval: Leave as default or set a custom retry interval.
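
For reference, here is a hedged Python sketch of the call this Web Activity makes: it acquires a token for the service principal with the msal package and POSTs to the dataflow refresh endpoint. All IDs are placeholders, and in practice the client secret would be read from Key Vault (for example with the snippet shown earlier) rather than an environment variable.

# Sketch of the REST call behind the Web Activity (assumes msal and requests).
# Tenant, client, workspace and dataflow IDs below are placeholders.
import os
import msal
import requests

TENANT_ID = "<your-tenant-id>"
CLIENT_ID = "<your-app-registration-client-id>"
CLIENT_SECRET = os.environ["POWERBI_CLIENT_SECRET"]  # ideally fetched from Key Vault
GROUP_ID = "<your-workspace-id>"
DATAFLOW_ID = "<your-dataflow-id>"

# Acquire a token for the Power BI resource as the service principal.
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(
    scopes=["https://analysis.windows.net/powerbi/api/.default"]
)

# Trigger the dataflow refresh; the body mirrors the Web Activity settings.
response = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/dataflows/{DATAFLOW_ID}/refreshes",
    headers={"Authorization": f"Bearer {token['access_token']}"},
    json={"notifyOption": "NoNotification"},
)
response.raise_for_status()  # a 2xx status means the refresh request was accepted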

Step 3: Set Up a Web Activity for Refreshing a Power BI Dataset in Azure Data Factory

  1. Add a new Web Activity

Here’s how you can configure the Web Activity to refresh your Power BI dataset:

A) Set the Basic Properties:

Name: Set the name of the activity to something more descriptive, such as Refresh Power BI Dataset.

B) Settings Tab:

  • URL:
  • Enter the URL to trigger the dataset refresh.
  • Use the expression format to ensure the URL is dynamic if needed: https://api.powerbi.com/v1.0/myorg/groups/{GroupID}/datasets/{DatasetID}/refreshes.
  • Replace {GroupID} with your Workspace ID and {DatasetID} with your Dataset ID. (The Group ID corresponds to the Workspace ID, which you can find in the URL of the Power BI Service when you open your specific workspace.)
  • Method:
  • Set the method to POST since you are sending a request to initiate a refresh.
  • Body:
  • Set the body to the following JSON to avoid receiving notifications in case of failure:

{
  "notifyOption": "NoNotification"
}

C) Authentication:

  • Authentication Method: Select Service principal.
  • Authentication Reference Method:
  • Choose Inline.
  • Tenant: Enter your tenant ID.
  • Service Principal ID: Enter your service principal (client) ID. This is your App registration client ID.
  • Service Principal Credential Type: Select Service principal key.
  • Service Principal Key:
  • Reference the key stored in Azure Key Vault by specifying the linked service and the secret name. Use the latest secret option to dynamically handle keys if they are renewed.
  • Resource: Set the resource to https://analysis.windows.net/powerbi/api.

D) Advanced Settings (Optional):
  • Retry: Set the number of retries to 0 if you do not want the activity to retry automatically on failure.
  • Retry Interval: Leave as default or set a custom retry interval.

  2. Add Dependencies (Optional)
  • Dependencies:
  • Example: This dataset refresh activity might depend on the success of two previous Web Activities (Refresh Power BI Dataflow SQL and Refresh Power BI Dataflow BC). Ensure you configure these dependencies accordingly.

  3. Save and Trigger the Pipeline

Now that we have configured the pipeline, save it and run a debug to ensure that the Power BI dataset refreshes as expected. If it refreshes successfully, publish the changes in the Data Factory.
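
If you also want to confirm outside of Azure Data Factory that a refresh actually completed, you can query the dataset's refresh history through the REST API. The sketch below is a hedged example that reuses an access token acquired as in the earlier msal snippet; the IDs are placeholders.

# Sketch: list the latest refresh attempts of a dataset (assumes requests).
# GROUP_ID and DATASET_ID are placeholders; ACCESS_TOKEN comes from the msal
# flow shown in the earlier example.
import requests

GROUP_ID = "<your-workspace-id>"
DATASET_ID = "<your-dataset-id>"
ACCESS_TOKEN = "<token acquired via msal as shown earlier>"

resp = requests.get(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"$top": 5},  # only the most recent refresh attempts
)
resp.raise_for_status()

for refresh in resp.json()["value"]:
    # status is typically "Completed", "Failed" or "Unknown" (still running)
    print(refresh["startTime"], refresh["status"])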

After publishing, set up an automatic trigger for your main pipeline if there isn't one already. You can do this by opening your pipeline, clicking Add Trigger, and then New/Edit. This opens a pane on the right side, which you can configure to your needs.

We recommend always creating a main pipeline from which you can execute all your sub-pipelines at once using Execute Pipeline activities. This lets you control your whole solution through a single, centralized pipeline, gives it a clear execution sequence, and makes it easier to monitor. For more information, you can find the documentation here: Execute Pipeline Activity – Azure Data Factory & Azure Synapse | Microsoft Learn

Other Considerations

If you have multiple datasets that need refreshing, consider using the Get Datasets in a Group API first to retrieve the dataset IDs. You can then use the response to trigger each refresh within a single pipeline by leveraging the ForEach activity in Azure Data Factory. Documentation for the API can be found here: Datasets – Get Datasets In Group – REST API (Power BI Power BI REST APIs) | Microsoft Learn, and documentation for the ForEach activity can be found here: ForEach activity – Azure Data Factory & Azure Synapse | Microsoft Learn.
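
As a rough illustration of that pattern, the Python sketch below lists the datasets in a workspace and requests a refresh for each one, which is essentially what a Lookup plus ForEach/Web activity combination does inside the pipeline. It reuses a token acquired as in the earlier msal example; the workspace ID is a placeholder.

# Sketch of the "list datasets, then refresh each" pattern (assumes requests).
# ACCESS_TOKEN comes from the msal flow shown earlier; GROUP_ID is a placeholder.
import requests

GROUP_ID = "<your-workspace-id>"
ACCESS_TOKEN = "<token acquired via msal as shown earlier>"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
BASE_URL = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"

# Get Datasets In Group: returns every dataset in the workspace.
datasets = requests.get(f"{BASE_URL}/datasets", headers=HEADERS)
datasets.raise_for_status()

for dataset in datasets.json()["value"]:
    # Request a refresh for each dataset, suppressing e-mail notifications.
    resp = requests.post(
        f"{BASE_URL}/datasets/{dataset['id']}/refreshes",
        headers=HEADERS,
        json={"notifyOption": "NoNotification"},
    )
    resp.raise_for_status()
    print(f"Refresh requested for {dataset['name']} ({dataset['id']})")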

Summary

In this blog post, we’ve walked through the complete process of integrating Azure Data Factory (ADF) with Power BI to automate the refresh of datasets and dataflows, ensuring your reports are always up to date. Here’s a recap of the key steps:

  1. Setup and Preparation: We started by configuring essential components such as App Registrations, security groups, and Azure Key Vault to securely manage access and credentials.
  2. Enabling APIs and Permissions: We enabled the necessary Fabric API in the Power BI Admin Portal for a specific security group, ensuring that only authorized applications can trigger dataset and dataflow refreshes.
  3. Creating Linked Services in ADF: We created linked services in Azure Data Factory to securely connect to Azure Key Vault, allowing us to store and retrieve sensitive credentials like client secrets.
  4. Building the Pipeline: With everything in place, we built the ADF pipeline, adding Web Activities to trigger the refresh of Power BI dataflows and datasets using the Power BI REST API. We also covered how to handle multiple datasets by using the Get Datasets in a Group API and the ForEach activity in ADF.

By following these steps, you can automate the data refresh process in Power BI, ensuring that your business intelligence insights are based on the most current data. This integration not only streamlines your workflow but also enhances the security and manageability of your data processes.

Whether you’re working with a single dataset or multiple dataflows, Azure Data Factory provides a robust platform for orchestrating your data integration and refresh needs.

————————————————————————————————————————————————————————————–

Microsoft Documentation

Power BI REST APIs for embedded analytics and automation – Power BI REST API | Microsoft Learn

Dataflows – Refresh Dataflow – REST API (Power BI Power BI REST APIs) | Microsoft Learn

Datasets – Refresh Dataset – REST API (Power BI Power BI REST APIs) | Microsoft Learn

What is Azure Key Vault? | Microsoft Learn

Authentication and authorization basics – Microsoft Graph | Microsoft Learn

Introduction to Azure Data Factory – Azure Data Factory | Microsoft Learn

Bulk Work Item Management: the Dynamic Duo of Azure DevOps and Microsoft Excel
April 9, 2025
3 min read
Bulk Work Item Management: the Dynamic Duo of Azure DevOps and Microsoft Excel
Read more

When using Azure DevOps, it is often a challenge to modify the status of multiple Work Items in large projects at the same time. This can significantly complicate admin tasks and make the project management process time consuming.

However, Azure DevOps’ bulk editing feature can provide a solution to this problem. It allows you to edit the state of multiple Work Items at once, either tagging them or moving them between iterations.
However, if you need to make more complex bulk changes to Work Items, the Microsoft Excel add-in can be of great help. It supports adding Work Items, updating existing Work Items, adding links or attachments to multiple Work Items, and much more.

By using these, consultants and project teams can respond quickly and efficiently to changing requirements or needs related to current project processes. This can result in improved project efficiency and reduced administrative workload. The ability to manage work items in bulk can result in significant time and resource savings throughout the project lifecycle.

So how can we use this solution?

There are a few prerequisites that are necessary to use this operation:

  • Microsoft Excel 2010 or later must be installed
  • You need to install Azure DevOps Office Integration 2019 (Available for free at https://visualstudio.microsoft.com/downloads/?ref=techielass.com#other-family)
  • You must be a member of the project you want to edit.
  • You must be allowed to view and edit Work Items in the project
  • If you want to add or edit Work Items, you must have stakeholder access

If all the necessary conditions have been met, the following steps can be followed to modify and create work items en masse:

  1. Open Excel
  2. Click on the Team tab in the top menu (this should appear automatically after installing the extension; if it is not visible, you can enable it via Excel's ribbon customization options).

To be able to make bulk changes, we need to connect the Excel plugin to the Azure DevOps environment. This requires the following steps:

  1. Click on New List
  2. This will activate a pop-up window that will ask us to connect to an Azure DevOps Server.
  3. Click on the Servers button
  4. Click on the Add button
  5. Enter the URL of the Azure DevOps server
  6. Then click OK
  7. You will be prompted to authenticate the Azure DevOps server
  8. Then click Close to connect Excel to the DevOps server. The add-in will now retrieve information about the team projects.
  9. Here we will be able to select the project within which we want to make the changes

10. You need to select which Work Items you want to download from the Azure DevOps tables to Excel (the connection always pulls the items from a query created in DevOps, so a query or view must already exist under the project).

11. Now we can start making the necessary changes. When you are done, click on the Publish button under the Team tab. This will send the completed changes to the server where Azure DevOps will do its job.

If you refresh the Azure DevOps view in your browser, you will see the changes you have made.

Bulk editing Azure DevOps items using Excel is a significant time and resource saver for everyone. On larger projects at Visual Labs, this feature is regularly used by our project managers and consultants, and it greatly helps us operate more efficiently.

Monitoring Standardization: using Workbooks for Logic Apps, Azure Functions and Microsoft Flows
April 9, 2025
7 min read
Monitoring Standardization: using Workbooks for Logic Apps, Azure Functions and Microsoft Flows
Read more

Problem Statement

Monitoring of the three platforms mentioned in the title is handled independently, in different locations. Logic Apps can be monitored either from the resource's run history page or through the Logic App Management solution deployed to a Log Analytics workspace. Azure Functions have Application Insights, while the run history of Microsoft Flows is available on the Power Platform. Most of our clients' solutions consist of these resources, which often chain together and call each other to represent business processes and automations. Their centralized supervision was not solved, making error tracking and analysis difficult for employees. Moreover, they had to log into the client's environment to perform these tasks.

Goal

We wanted to get a general overview of the status of the solutions we deliver to our clients, reduce our response time, and address issues proactively before clients reported them. We aimed to track our deployments in real time, providing a more stable system and a more convenient user experience. We wanted to make our monitoring solution available within Visuallabs so that we could carry out monitoring tasks from the tenant that hosts our daily development activities.

Solution

Infrastructure Separation

Our solution is built on the infrastructure of a client used as a test subject, whose structure can be considered a prerequisite. On the Azure side, separate subscriptions were created for each project and environment, while for Dynamics, only separate environments were used. Project-based distinction for Flows is solved based on naming conventions, and since log collection is manual, the target workspace can be freely configured.

Centralized Log Collection

It was obvious to use Azure Monitor with Log Analytics workspaces for log collection. Diagnostic settings were configured for all Azure resources, allowing us to send logs to a Log Analytics workspace dedicated to the specific project and environment. For Microsoft Flows, we forward logs to a custom monitor table created for Flows using the built-in Azure Log Analytics Data Collector connector data-sending step. This table was created to match the built-in structure of the Logic Apps log table, facilitating the later merging of the tables.

Monitoring
Diagnostic settings

Log Analytics workspace

Log tables

Making Logs Accessible in Our Tenant

An important criterion for the solution was that we did not want to move the logs; they would still be stored in the client’s tenant; we only wanted to read/query them. To achieve this, we used Azure Lighthouse, which allows a role to be enforced in a delegated scope. In our case, we set up a Monitoring contributor role for the client’s Azure subscriptions for a security group created in our tenant. This way, we can list, open, and view resources and make queries on Log Analytics workspaces under the role’s scope from our tenant.

Visualization

For visualization, we used Azure Monitor Workbook, which allows data analysis and visual report creation, as well as combining logs, metrics, texts, and embedding parameters. All Log Analytics workspaces we have read access to via Lighthouse can be selected as data sources. Numerous visualizations are available for data representation; we primarily used graphs, specifically honeycomb charts, but these can easily be converted into tables or diagrams.

Combining, Customizing, and Filtering Tables

To process log tables from different resources together, we defined the columns that would be globally interpretable for all resource types and necessary for grouping and filtering. These include:

  • Client/Tenant ID
  • Environment/Subscription ID
  • Resource ID/Resource Name
  • Total number of runs
  • Number of successful runs
  • Number of failed runs

Based on these, we could later determine the client, environment, project, resource, and its numerical success rate, as well as the URLs needed for references. These formed the basis for combining tables from various Log Analytics Workspaces and resources for our visualizations.
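
As a rough illustration, the Python sketch below runs such a query against one of the delegated Log Analytics workspaces using the azure-monitor-query package. The workspace ID, custom table name, and column names are hypothetical stand-ins for the standardized schema described above.

# Sketch: query a delegated Log Analytics workspace (assumes azure-identity
# and azure-monitor-query). The workspace ID, table and column names are
# made-up placeholders mirroring the standardized schema described above.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

WORKSPACE_ID = "<client-project-workspace-id>"

# Hypothetical KQL: summarize runs per resource for the chosen time range.
QUERY = """
FlowRuns_CL
| summarize TotalRuns = count(),
            SuccessfulRuns = countif(Status_s == "Succeeded"),
            FailedRuns = countif(Status_s == "Failed")
  by ResourceName_s
"""

client = LogsQueryClient(DefaultAzureCredential())
result = client.query_workspace(WORKSPACE_ID, QUERY, timespan=timedelta(days=1))

for table in result.tables:
    for row in table.rows:
        print(dict(zip(table.columns, row)))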

Log Analytics

User Interface and Navigation

When designing the user interface, we focused on functionality and design. Our goal was to create a visually clear, well-interpreted, interactive solution suitable for error tracking. Workbooks allow embedding links and parameterizing queries, enabling interactivity and interoperability between different Workbooks. Utilizing this, we defined the following levels/types of pages:

  • Client
  • Project
  • Resources
  • Logic App
  • Azure Function
  • Flow
Customers

Projects

Resources

Resources [Azure Function]

At the Client and Project levels, clicking on their names displays the next subordinate Workbook in either docked or full-window view, passing the appropriate filtering parameters. Time is passed as a global parameter during page navigation, but it can be modified and passed deeply on individual pages. We can filter runs retrospectively by a specific minute, hour, day, or even between two dates.

On the page displaying resources, we provide multiple interactions for users. Clicking on resource names navigates to the resource's summary page on the Azure Portal within the tenant, thanks to Lighthouse, without tenant switching (except for Power Automate Flows). Clicking on the percentage value provides a deeper insight into the resource's run history and errors in docked view. This detailed view is resource type-specific, meaning each of the three resources we segregated has its own Workbook. We always display what percentage of all runs were successful and how many faulty runs occurred, with details of these runs.

Logic App

Beyond general information, faulty runs (status, error cause, run time) are displayed in tabular form if any occurred during the specified time interval. Clicking the INSPECT RUN link redirects the user to the specific run where all successful and failed steps in the process can be viewed. At the bottom, the average run time and the distribution of runs are displayed in diagram form.

Logic App

Logic App [INSPECT RUN]

Logic App [diagrams]

Microsoft Flow

For Flows, the same information as for Logic Apps is displayed. The link also redirects to the specific run, but since it involves leaving Azure, logging in again is required because Dynamics falls outside the scope of Lighthouse.

Microsoft Flow

Azure Function

The structure is the same for Azure Functions, with the addition that the link redirects to another Workbook instead of the specific run’s Function App monitor page. This is necessary because only the last 20 runs can be reviewed on the Portal. For older runs, we need to use Log Analytics, so to facilitate error tracking, the unique logs determined by developers in the code for the faulty run are displayed in chronological order.

Azure Function

Azure Function

Consolidated View

Since organizationally, the same team may be responsible for multiple projects, a comprehensive view was also created where all resources are displayed without type-dependent grouping. This differs from the Workbook of a specific project’s resources in that the honeycombs are ordered by success rate, and the total number of runs is displayed. Clicking on the percentage value brings up the previously described resource type-specific views.

Resources

Usability

This solution can be handy in cases where we want to get a picture of the status of various platform services in a centralized location. This can be realized interactively for all runs, except for Flows, without switching tenants or possibly different user accounts. Notification rules can also be configured based on queries used in Workbooks.

Advantages:

  • The monitoring system and visualization are flexible and customizable.
  • New resources of the same type can be added with a few clicks to already defined resource types (see: configuring diagnostic settings for Logic Apps).

Disadvantages:

  • Custom log tables, visualizations, and navigation between Workbooks require manual configuration.
  • Integrating Flows requires significantly more time investment during development and planning.
  • Combining tables, separating environments and projects can be cumbersome due to different infrastructure schemas.
  • Basic knowledge of KQL (Kusto Query Language) or SQL is necessary for queries.

Experience

The team that implemented the solution for the client provided positive feedback. They use it regularly, significantly easing the daily work of developer colleagues and error tracking. Errors have often been detected and fixed before the client noticed them. It also serves well after the deployment of new developments and modifications. For Logic Apps, diagnostic settings are included in ARM (Azure Resource Manager) templates during development, so runs can be tracked from the moment of deployment in all environments using release pipelines.

The past, present and future of ERP systems
April 9, 2025
6 min read
The past, present and future of ERP systems
Read more

When I first started working at VisualLabs, during the first WSM (weekly standup meeting), where each business unit reports on their current weekly tasks, I noticed how many acronyms we use. As a member of the ERP team, the question arose in my mind: besides the fact that we use them, do we know exactly how they were developed and what they mean?

Everyone is now familiar with the term ERP (Enterprise Resource Planning), but few people know its exact origins and how it evolved. So I thought I'd gather together where it all started and what the main milestones were that helped shape the ERP systems we know today. As we look back in time, we will realise how deeply rooted this technology is in the modern business world.

In this blog, I've collected 7 milestones that helped shape the ERP system we know today.

In today's world, it would be unthinkable for a company not to use some kind of computer system for its various processes. However, even before the advent of the computer, companies had to manage these processes (be it accounting or production planning) in some way. Let's take accounting as an example. Accountants recorded every single financial transaction on paper, by hand, in different books. And they were managed day by day, month by month. It is hard to imagine that companies often had rooms full of (general) ledgers and files, each containing dozens of transactions. And at the heart of it all was the accountants' most treasured possession: the ledger. It is hard to imagine how much work must have been involved in the year-end closing process and how many mistakes must have been made in the process.

ERP

  1. The birth of computers (1950s):

In the 1950s, with the birth of the computer - for which János Neumann laid the theoretical foundations - a new dimension opened up in the way companies operate and transform their processes. Although these computers were used in the 1950s mainly in the military and scientific fields - because of their large size and price - they soon became part of the business world thanks to continuous technological developments. These tools enabled faster processing and analysis of data and helped automate the activities of companies.

ERP

2. Inventory management and control (1960s):

One of the first milestones in the growing uptake of the computer and the realisation of its business potential dates back to the 1960s. It was then that the manufacturing industry recognised the need for a system that would allow them to manage, monitor and control their inventory. The advent of information technology allowed companies to integrate and automate their business processes, and as a result they were able to improve the efficiency and accuracy of their inventory management. This was one of the first steps towards the emergence of ERP systems.

3. Material Requirement Planning (MRP I, 1970s):

The concept of MRP (Material Requirements Planning) first emerged in 1970 and was essentially a software-based approach to planning and managing manufacturing processes. The application of MRP focused primarily on planning and tracking material requirements. This approach allowed companies to predict more accurately the quantities and types of materials they would need in their production processes. MRP enabled companies to manage material procurement and production scheduling more efficiently, reducing losses due to over- or under-ordering. This innovation had a significant impact on the manufacturing industry and fundamentally transformed the way companies plan materials, helping to increase the efficiency and competitiveness of manufacturing companies in the 1970s.

4. Production Resource Planning (MRP II, 1980s):

The 1980s marked a major milestone with the advent of MRP II systems. While MRP focused exclusively on the inventory and materials needed to meet actual or forecast customer demand, MRP II provided greater insight into all other manufacturing resources. By extending production planning beyond materials to labour, machinery and other production resources, it gave companies much greater control over their manufacturing processes.

5. Enterprise resource planning systems (ERP, 1990s)

It was in the 1990s that the first true ERP systems were introduced (the term ERP itself was coined in the 1990s by the research firm Gartner). ERP systems were a significant improvement over MRP II systems, as they focused not only on the integration and automation of manufacturing processes but on business processes as a whole. Examples of such processes include purchasing, sales, finance, human resources and accounting. As a result of this full integration, companies became able to manage their business processes in a single database. This brought many benefits. The unified storage and management of information ensured access to accurate, up-to-date data, which improved decision-making processes and efficiency, and the interconnected business areas helped companies make and implement coherent strategies. As a result, the ERP system became a 'one-stop shop' that brought together and managed all corporate information.

6. Web-based functionalities with the advent of the Internet (ERP II, 2000s):

In the mid-2000s, the role of the Internet in the business world increased and ERP systems adapted to this change. Systems began to incorporate customer relationship management (CRM) and supply chain management (SCM) functionality. With ERP II, the focus shifted to user-friendliness and customisation. Modular systems were developed that allowed businesses to select and implement the components that best fit their operations.

7. Cloud ERPs (2010s):

In the 2010s, the emergence of cloud technology gave a new dimension to ERP systems. Cloud ERP solutions have enabled companies to host and run their ERP systems in the cloud instead of traditional on-premise deployments. This has offered significant benefits, including greater flexibility, lower costs and easier access to critical data. Thanks to cloud ERP systems, companies no longer have to worry about server maintenance or software updates, as these tasks are handled by their service providers. This allows companies to focus on their business goals and processes while ensuring that their systems are always up-to-date and available.

+1 The future of ERP:

And where is the development of ERP systems heading today? With the help of intelligent algorithms and artificial intelligence, systems are increasingly capable of automating and optimising business processes, reducing the need for human intervention. Data will continue to play a key role in the future, as companies are able to make better business decisions by analysing it more effectively. The integration of ERP systems with various IoT tools will enable real-time data exchange and real-time analysis to provide companies with faster and more accurate answers to support different business issues.

ERP

ERP systems also increasingly offer a personalised user experience and extensible integrations with other business applications and technologies. In the future, ERP systems will not only function as a tool, but will provide companies with real business intelligence and competitiveness, helping them to keep pace with the rapidly changing business environment and to stand out from their competitors.

Are you familiar with the world of ERP systems? Visual Labs can help you explore its potential.

Sources:

https://www.geniuserp.com/resources/blog/a-brief-history-of-erps

https://www.fortunetechnologyllc.com/history-of-erp-systems/


https://www.erp-information.com/history-of-erp.html#google_vignette

https://www.techtarget.com/searcherp/tip/MRP-vs-MRP-II-Learn-the-differences

https://www.business-case-analysis.com/account.html

https://www.britannica.com/technology/computer/IBM-develops-FORTRAN

https://business.joellemena.com/business/when-did-computers-start-being-used-in-business-2/


Ready to talk about your use cases?

Request your free audit by filling out this form. Our team will get back to you to discuss how we can support you.

Stay ahead with the latest insights
Subscribe to our newsletter for expert insights, industry updates, and exclusive content delivered straight to your inbox.