Blog

Insights & ideas

Stay ahead with expert articles, industry trends, and actionable insights to help you grow.

What it used to be and what it has become: VisualLabs through my own eyes
10 mins read
Apr 9, 2025

I first joined VisualLabs in the summer of 2020 as a junior business analyst. As you can see from the timeline, I was part of the mass junior recruitment: with the three of us, the company grew to eight people at the time.

In the year and a bit I worked here from 2020 to 2021, I was involved in quite a variety of tasks: building and improving Power BI reports, working a lot on a contract management application I built on the Power Platform, and gaining insight into the beauty of Business Central. The latter also gave rise to some comical memories, such as the painstaking work of recording and subtitling training videos for clients, and how I, then an undergraduate student, ended up on "duty" over Christmas because I had no holidays left for the year. But I got a lot of support from my senior colleagues in all of this; they didn't let me get lost in the shuffle.

Three years later, in the summer of 2024, I rejoined VL, but now I work specifically with ERP. One thing that was new and very welcome to me was the company timeline. Where last time I was one of the mass junior hires, I now feel like a real part of company life.

An amazing amount has happened in my time away, and it's great to see these events being shared by my colleagues, creating a stronger sense of belonging.

What has actually changed in these 3 years? I haven't had the chance to go through everything since I rejoined, and there's not enough space to go into it all here, so I'll just give you a few snippets.

Office

The first of these is probably the new office: the move from Zsigmond Square to Montevideo Street had already happened while I was still here as a junior. I couldn't really enjoy it back then, and I wasn't part of the actual move-in, yet when I returned three years later, I felt as if I had helped shape it. By that I mean that the ethos that makes VisualLabs what it is has, I think, changed very little, and the homeliness of the office reflects it.

Specialisation

The company has made huge progress in terms of specialisation and staff numbers while I was away: the team has grown to 35 people, and there are now separate business units for all the tasks I had the opportunity to try on a rotational basis as a junior. These are the CE team, who build business applications for clients; the data team, who deliver data analytics and visualisation solutions; and the ERP team, of which I became part, where we introduce Microsoft's enterprise management solutions (Dynamics 365 Finance and Operations and Business Central) to clients.

What I would perhaps highlight is that even though these specialisations have evolved, they have not brought siloed operation with them. To deliver work in our own area, we have access to the knowledge of other areas, and we help each other across teams to deliver the highest quality service possible. From this perspective, what has changed in three years? I would say nothing; what worked then on a small scale works now on a bigger scale.

Agile operation

We have had a solid vision of how we deliver solutions since I was a junior employee here: the agile methodology. What was then in its infancy is now mature. Even if we are not fully agile, we use the agile elements so well that they support our day-to-day work to a great extent. They help us communicate internally and with our customers, who can post issues in DevOps that we help them resolve; we write features, user stories and test cases that support needs assessment and implementation. We have daily stand-up meetings with the team in the mornings where we discuss our stumbling blocks, at the end of the week we have sprint sessions where we plan the next week's sprint, and every month we hold a retro where we pay special attention to giving each other feedback, looking back on the past month.

Team and all the fun

Unfortunately, during my first stint I didn't get to experience much of this live because of Covid, but even then the short conversations at the beginning of a call or at the morning "all-people" DSMs reinforced the sense of belonging and the good atmosphere. Fortunately, we have kept this habit ever since, so no call is ever dull. And once the pandemic subsided, these community events only grew stronger, with regular team-building events, VL retreats, and co-hosted Christmas and Halloween parties. A day at the office is also a treat: although it varies from day to day, we have little rituals that colour the days and take the focus off work, such as the daily lunch together, chit-chat while making coffee, a funny comment passed to the next desk, or the monthly office day when we all go in and look back over the past month. In short, you never get bored here. 😊

Coming back to a place where I've worked before is a special experience - especially when so much has changed in the meantime. VisualLabs has retained the supportive community and vibrancy that I grew to love, while reaching new levels of development and professionalism. This journey has been a learning experience not only for the company, but also for me, as the old and new experiences have given me a stronger, more mature perspective. I look forward to being a part of the next chapter and seeing where the company goes in the future!

Recap: Budapest BI Forum
10 mins read
Apr 9, 2025

Hey everyone! Here’s a summary of the Budapest BI Forum 2024, where I had the chance to dive into some intriguing topics and engage in inspiring conversations.

The first day was a full-day Tabular Editor workshop, where we covered the basics and discussed topics such as controlling perspectives, writing macros, and refreshing partitions. The other two days of the conference were packed with learning, and here are my key takeaways from my favorite sessions.

Keynote Speech: BI Trends

The day kicked off with a keynote that explored current and future BI trends.

Bence, the main organizer and host of the event, supported his key points with insights from Gartner research and similar studies. A few highlights that caught my attention:

  • By 2025, data security and data governance are expected to top the list of priorities for executives.
  • The rapid rise of AI introduces scenarios where users export data from dashboards to Excel, feed it into tools like ChatGPT, and generate their own insights. While exciting, this raises concerns about security and "shadow reporting," issues companies have tried to curb for years.

As a contractor and consultant I find this especially ironic. Large companies often hesitate to share data, even when it’s crucial for project development. They implement robust policies like VPNs and restricted searches to prevent leaks. But, at the same time, they struggle to monitor and control employees' behaviors, such as inadvertently sharing sensitive data.

This evolving dynamic between AI, data security, and governance will definitely be a space to watch closely.

Read more about Gartner’s 2024 BI trends here.

PBIR: Report Development in Code

This technical session introduced the PBIR format, a preview feature that allows Power BI reports to be stored as individual JSON files for each visual and page, instead of a monolithic file.

The feature’s potential for bulk modifications was the most exciting part. The presenter showed how Python scripts could iterate through the JSON files to apply changes (e.g., adding shadows to all KPI cards) across the report.
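
To make the idea tangible, here is a minimal, hypothetical Python sketch of such a bulk edit. The folder layout and property names are assumptions based on how the PBIR preview stores each page and visual as its own JSON file; a real script would target the specific formatting objects you want to change (shadows on KPI cards, for instance):

import json
from pathlib import Path

# Hypothetical location of a report saved in the PBIR format
report_root = Path("MyReport.Report/definition/pages")

# Walk every visual.json on every page and apply one bulk change
for visual_file in report_root.glob("*/visuals/*/visual.json"):
    visual = json.loads(visual_file.read_text(encoding="utf-8"))

    # Illustrative change only: stamp a description on each visual.
    # A real bulk edit would modify the visual's formatting objects instead.
    visual.setdefault("description", "reviewed-by-bulk-script")

    visual_file.write_text(json.dumps(visual, indent=2), encoding="utf-8")
    print(f"Updated {visual_file}")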

While still in preview and somewhat buggy, it’s a promising direction. I’m also intrigued by the integration possibilities with VS Code and GitHub Copilot, which could simplify automation for non-coders.

However, it seems TMDL language won’t be integrated into PBIR anytime soon—a bit disappointing, but I’m optimistic this will eventually happen.

TMDL Enhancements in Power BI & VS Code

One of the most exciting parts of the forum was exploring updates to TMDL (Tabular Model Definition Language), designed to make Power BI model development more efficient.

TMDL View in Power BI

This might be the feature I’m most excited about! The ability to edit your semantic model as code directly inside Power BI is a massive leap forward. Combining drag-and-drop, Copilot, and coding will make development smarter and faster.

Immediate Code Updates in Power BI (Planned for Next Year)

A handy feature to look forward to is real-time synchronization between modified TMDL code and Power BI. Changes to the model will reflect instantly in Power BI without reopening the file, saving tons of time during development.

VS Code TMDL Extension

The TMDL extension in VS Code offers:

  • Formatting: Automatically organizes TMDL syntax.
  • IntelliSense and Autocomplete: Speeds up coding with intelligent suggestions.
  • Expand/Collapse Functionality: Makes navigating larger TMDL files easier.

Get the extension here.

 

Copilot Integration in VS Code

Copilot lets you generate measures, calculations, and scripts with AI assistance. For example, as you type "Profit," Copilot suggests a complete formula based on the context. It’s a productivity boost I can’t wait to leverage more!

Online Editing with VSCode.dev

You can now edit repositories directly in your browser using the vscode.dev prefix for your repository URL. It’s perfect for quick edits without setting up a local environment.
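
For example, with placeholder names, a repository hosted at https://github.com/my-org/my-repo can be opened straight in the browser at https://vscode.dev/github/my-org/my-repo.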

These updates are poised to make model development faster, smarter, and more collaborative for teams using GitHub and VS Code.

Lunch Break with Insights from Microsoft

Lunch turned into one of the highlights of the day when Tamás Polner, a key figure at Microsoft, joined our table. Tamás shared some fascinating insights about the current direction of Microsoft’s data ecosystem and upcoming trends:

  • Fabric focus: Microsoft is heavily prioritizing Fabric over tools like ADF and Synapse, which are expected to receive basically no new feature updates as development resources shift toward Fabric. While this has been an industry assumption for a while, it was great to have this firsthand confirmation. The message is clear: Fabric is the future of Microsoft’s data ecosystem.
  • Data security: Reflecting on the keynote’s emphasis on data security, Tamás explained that this aligns with what he’s seeing at Microsoft. The number of developers in the security team is increasing significantly, and this trend doesn’t seem to be slowing down.
  • Optimized compute consumption: We also discussed CU (Compute Unit) optimization in Fabric. Tamás reaffirmed something I’d heard in Fabric training sessions: notebooks are far more powerful and efficient than UI-powered features like Dataflow Gen2. They use significantly less compute capacity, making them the better choice for many workflows (a minimal notebook sketch follows this list).
  • DP-600 exam: Tamás mentioned that the DP-600 exam has become one of the most successful certifications in Microsoft’s history, with a record-high number of certifications achieved in a short time.
  • Copilot and AI: Copilot is a major focus for Microsoft, but its rollout faces challenges due to the high resource intensity of AI models. Tamás noted that, like other companies deploying built-in AI solutions, Microsoft needs to continue investing heavily in CAPEX for computing power to make these solutions broadly accessible.
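
As a rough illustration of the compute point above, here is a minimal PySpark sketch of the kind of transformation you might move out of a UI-driven Dataflow Gen2 and into a Fabric notebook; the lakehouse table and column names are made up for the example:

# Runs inside a Microsoft Fabric notebook, where a `spark` session is already provided
from pyspark.sql import functions as F

# Hypothetical lakehouse tables: "sales_raw" in, "sales_daily" out
sales = spark.read.table("sales_raw")

daily = (
    sales
    .withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("order_date", "country")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)

# Overwrite the aggregated table in the lakehouse
daily.write.mode("overwrite").saveAsTable("sales_daily")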

 

This conversation provided valuable context and insight into Microsoft’s strategic priorities and was a great opportunity to discuss industry trends and technical strategies in detail.

 

Storytelling with Power BI

This session revisited a topic close to my heart: how to create Power BI reports that truly connect with their audiences. The presenter broke it down into three key phases:

  1. Research: Start by understanding the report’s purpose. Who will use the report? What decisions should it support? Can the goal be summarized in one clear, concise sentence?
  2. Create: Develop the report based on your research. Ensure that the visuals, design, and structure align with the user’s needs and the intended outcomes.
  3. Deliver: It’s not just about handing over the report and documentation, then walking away. True success lies in monitoring how the report is used and gathering user feedback. This feedback often reveals both strengths and weaknesses you didn’t anticipate, providing opportunities to refine and enhance the report further.

While much of this was a confirmation of what I already practice, it underscored an essential point: The discovery phase and follow-ups are just as critical as the actual development process.

It also reinforced for me that educating clients about the value of these stages is crucial. When clients understand that investing time and resources into proper research and post-delivery follow-ups leads to better reports and happier users, they’re much more likely to embrace these processes.

 

Final Thoughts

The day was packed with insights, but what truly stood out was the seamless blend of technical innovation and strategic foresight. Whether it was exploring new options like TMDL and PBIR, or gaining a deeper understanding of the big-picture trends shaping the future of BI, the forum offered something valuable for everyone.

Of course, the lunch chat with Tamás was a treasure trove of insider knowledge—easily one of the event’s highlights for me. Another personal highlight was a heartfelt conversation with Valerie and Elena, who encouraged me to take the next step in my professional journey: becoming a conference speaker.

If any of these topics piqued your interest or you’d like me to dive deeper into specific sessions, just let me know—I’d be happy to share more!

Create efficient and customized Release Notes with Bravo Notes
10 mins read
Apr 9, 2025

For our customers, it is important that when we deliver a new version of their existing IT system, we also provide a release note describing the content and functionality of the released package. At Visuallabs, we constantly strive to meet our customers' needs as fully as possible, while simplifying our own workflows and increasing our administrative efficiency. We are supported in this by Bravo Notes, an extension available in Azure DevOps. Using this plug-in, we produce a unique yet standardized Release Note with each new development package delivery. This allows us to meet our customers' requirements in a fast and standardized way.

What is needed to do this?

By following a few simple principles in our delivery processes, the documentation we already produce provides a good basis for generating standard version documents in a few steps for our releases or bug fixes.

How do we document?

  • The conventions for using the various fields available on a given DevOps work item are strictly adhered to, and the fields are filled in in a way that suits the document to be generated.
  • User Story descriptions are prepared in a standard format. This allows us to provide consistent quality for our customers and to build in automated document generation.
  • Tickets are sorted by delivery unit. This helps when responding to multiple business challenges from the customer at the same time: documentation of delivered enhancements and system changes can then be categorised within one document.

Using Bravo Notes

Bravo Notes provides technical assistance to help meet these requirements with the right customisation. The main functions we use:

  • Compiling content: there are several options to choose from when selecting items from DevOps. We use Query most often among the options shown in the screenshot below, because the multiple filtering criteria allow us to select relevant elements more efficiently, thus making the documentation more precise.
  • Template: In Bravo Notes, we have created various templates to organise the news into a proper structure.  

Main units of the template developed:

  • In the case where several delivery units or business processes are involved for a system release, the relevant descriptions are grouped together in the document.
  • A further organizing principle in the template is that new developments are shown in a feature-by-feature breakdown, and solutions to bugs are also shown in a separate unit. This makes it clear which supported feature a given release item refers to, whether it is a new development or a bug fix.
  • Use parameters: parameters based on business processes allow you to customise the generation of documents. During generation, you can change the title, date, release date and add comments to the document. You can also specify the applications and resources involved, for example, which business area or environment is affected.
  • Display of document units and headings based on a set of rules: it is handled in the template to display only the relevant headings and document parts; e.g. if there was no error correction in a given delivery unit, its heading is not displayed either.
  • Fields used in the template: as defined above, we provide easy-to-read descriptions for the released developments. The consistent documentation of the DevOps tickets used in the design or development process allows this to be done quickly and in a standardized way. The content of the fields defined in the template about the tickets is automatically included when the document is generated.
  • Export: after generation and verification, we export the document to PDF format.

In summary: it is important for our customers to receive detailed and business-relevant documentation on the new versions of the systems they use, while we also simplify our own workflows. The Bravo Notes module integrated into DevOps supports us in achieving both goals. With this plug-in, we create customized yet standardized Release Notes with each new development package delivery, meeting our customers' requirements in a fast and standardised way and providing them with the necessary information and transparency on system changes and enhancements.

My Journey with CI/CD in Power BI: A Personal Tale of Transformation Part 3
April 9, 2025
3 min read

In part 3, I’m going to give you a step-by-step description of the implementation process of source control in Power BI. This can be divided into 4 parts:

  1. Modify settings in Power BI Desktop
  2. Download & Install necessary softwares
  3. Set up environments
  4. Use it!

Step 1 - Modify settings in Power BI Desktop: enable the Power BI Project (*.pbip) save option preview feature

  1. Open Power BI Desktop  
  2. Go to Options and settings and select Options

   

  3. Click on Preview features and enable the Power BI Project (*.pbip) save option (+1 optional: I’d recommend also ticking the box next to Store semantic model using TMDL format)
  4. Hit OK

And now we can move to Step 2.

Step 2 - Download & install the necessary software

At VisualLabs we decided to use VS Code, but you can do the basics in PowerShell as well. The reason I prefer VS Code is that you get a visual view of your project (you can track all the branches, merges, etc. at the same time).

  1. Download and install VS Code - https://code.visualstudio.com/download

Feel free to install it with the default settings.

  2. Download and install Git - https://www.git-scm.com/downloads

Feel free to install it with the default settings; the only thing worth changing is the default editor, which you can set to Visual Studio Code.

 

3. Add Git Graph to VS Code – this will allow you to see the historical changes of your repo, as mentioned above.

  1. Open VS Code
  2. Click on the Extensions icon
  3. Type Git Graph
  4. Select it from the list
  5. Click Install

Step 3 – Set up Git and Azure DevOps environments

  1. Set up VS Code as your default Git editor - open a New Terminal in VS Code and type the command below (you may need to restart VS Code or your machine for the commands to work properly):

git config --global core.editor "code --wait"  

  2. Set up your Git identity – type these commands in the terminal:

git config --global user.name "FirstName LastName"
git config --global user.email firstname.lastname@myorganization.com

  3. Create a repo on Azure DevOps. You can follow this MS documentation: https://learn.microsoft.com/en-us/azure/devops/repos/git/create-new-repo?view=azure-devops#create-a-repo-using-the-web-portal

4. Once the repo is there, you’ll see this on your screen and now you can clone it onto your computer  

   

5. Select the Clone in VS Code option

 

6. Select destination folder

My recommendation is to create a separate folder where you can store all your repos from this point. I’d also opt for a cloud location for this repo collector folder – like OneDrive.  

 

7. In VS Code, you can check the current status of your repo  

 

8. The last step is to save your Power BI file as a .pbip into this folder.

   

9. Click on Yes, I trust the authors to move to the next step. You’ll see that VS Code has recognized that there are new files in the folder.

 

  10. Now you can add a commit message, select the changes you want to keep (this is the step called staging changes; feel free to click Select all) and click Commit (this only saves your changes locally).

11. Click Sync changes (now it’s in the cloud – you can check it in the repo created on Azure DevOps)

12. Git Graph will look like this:

   

13. Congrats!

Your source control journey has officially begun! Feel free to create branches, repos, etc., and start co-developing with your colleagues – or simply enjoy that you’ll never have to name a file “MyProject_final_v124_final12.pbix” again.

Unified Monitoring: Using Workbooks for Logic Apps, Azure Functions, and Microsoft Flows
April 9, 2025
7 min read

Problem Statement

Monitoring of the three platforms mentioned in the title is solved independently in different locations. Logic Apps can be monitored either from the resource’s run history page or through the Logic App Management solution deployed to a Log Analytics workspace. Azure Functions have Application Insights, while the run history of Microsoft Flows is available on the Power Platform.

Most of our clients’ solutions consist of these resources, which often chain together and call each other to represent business processes and automations. Their centralized supervision was not solved, which made error tracking and analysis difficult for our colleagues, who moreover had to log into the client’s environment to perform these tasks.

Goal

We wanted to get a general overview of the status of the solutions we deliver to our clients, reduce our response time, and proactively prevent error reports submitted by our clients. We aimed to track our deployments in real-time, providing a more stable system and a more convenient user experience. We wanted to make our monitoring solution available within Visuallabs so that we could carry out monitoring tasks from the tenant that hosts our daily development activities.

Solution

Infrastructure Separation

Our solution is built on the infrastructure of a client used as a test subject, whose structure can be considered a prerequisite. On the Azure side, separate subscriptions were created for each project and environment, while for Dynamics, only separate environments were used. Project-based distinction for Flows is solved based on naming conventions, and since log collection is manual, the target workspace can be freely configured.

Centralized Log Collection

It was obvious to use Azure Monitor with Log Analytics workspaces for log collection. Diagnostic settings were configured for all Azure resources, allowing us to send logs to a Log Analytics workspace dedicated to the specific project and environment. For Microsoft Flows, we forward logs to a custom monitor table created for Flows using the built-in Azure Log Analytics Data Collector connector data-sending step. This table was created to match the built-in structure of the Logic Apps log table, facilitating the later merging of the tables.
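
For anyone curious about the mechanics, the connector wraps the Log Analytics HTTP Data Collector API, which can also be called directly. Below is a minimal, hypothetical Python sketch of sending one Flow run record to a custom table; the workspace ID, shared key, table name, and field names are placeholders, and in practice the fields are shaped to mirror the built-in Logic Apps log columns:

import base64
import hashlib
import hmac
import json
from datetime import datetime, timezone

import requests  # third-party package, assumed to be installed

WORKSPACE_ID = "<log-analytics-workspace-id>"  # placeholder
SHARED_KEY = "<workspace-primary-key>"         # placeholder
LOG_TYPE = "FlowRuns"                          # appears as FlowRuns_CL in Log Analytics

def build_signature(rfc1123_date: str, content_length: int) -> str:
    # String-to-sign format required by the HTTP Data Collector API
    string_to_hash = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{rfc1123_date}\n/api/logs"
    )
    decoded_key = base64.b64decode(SHARED_KEY)
    digest = hmac.new(decoded_key, string_to_hash.encode("utf-8"), hashlib.sha256).digest()
    return f"SharedKey {WORKSPACE_ID}:{base64.b64encode(digest).decode()}"

def post_flow_run(record: dict) -> None:
    body = json.dumps([record])
    rfc1123_date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Authorization": build_signature(rfc1123_date, len(body)),
        "Log-Type": LOG_TYPE,
        "x-ms-date": rfc1123_date,
    }
    url = f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
    requests.post(url, data=body, headers=headers, timeout=30).raise_for_status()

# Illustrative record, loosely mirroring the Logic Apps run log columns
post_flow_run({
    "resource_name": "MyFlow",
    "status": "Failed",
    "start_time": "2025-04-09T10:00:00Z",
})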


Making Logs Accessible in Our Tenant

An important criterion for the solution was that we did not want to move the logs; they would still be stored in the client’s tenant; we only wanted to read/query them. To achieve this, we used Azure Lighthouse, which allows a role to be enforced in a delegated scope. In our case, we set up a Monitoring contributor role for the client’s Azure subscriptions for a security group created in our tenant. This way, we can list, open, and view resources and make queries on Log Analytics workspaces under the role’s scope from our tenant.

Visualization

For visualization, we used Azure Monitor Workbook, which allows data analysis and visual report creation, as well as combining logs, metrics, texts, and embedding parameters. All Log Analytics workspaces we have read access to via Lighthouse can be selected as data sources. Numerous visualizations are available for data representation; we primarily used graphs, specifically honeycomb charts, but these can easily be converted into tables or diagrams.

Combining, Customizing, and Filtering Tables

To process log tables from different resources together, we defined the columns that would be globally interpretable for all resource types and necessary for grouping and filtering.

These include:

  • Client/Tenant ID
  • Environment/Subscription ID
  • Resource ID/Resource Name
  • Total number of runs
  • Number of successful runs
  • Number of failed runs

Based on these, we could later determine the client, environment, project, resource, and its numerical success rate, as well as the URLs needed for references. These formed the basis for combining tables from various Log Analytics Workspaces and resources for our visualizations.


User Interface and Navigation

When designing the user interface, we focused on functionality and design. Our goal was to create a visually clear, well-interpreted, interactive solution suitable for error tracking. Workbooks allow embedding links and parameterizing queries, enabling interactivity and interoperability between different Workbooks. Utilizing this, we defined the following levels/types of pages:

  • Client
  • Project
  • Resources
  • Logic App
  • Azure Function
  • Flow

At the Client and Project levels, clicking on their names displays the next subordinate Workbook in either docked or full-window view, passing the appropriate filtering parameters. Time is passed as a global parameter during page navigation, but it can be modified and passed deeply on individual pages. We can filter runs retrospectively by a specific minute, hour, day, or even between two dates.

On the page displaying resources, we provide multiple interactions for users. Clicking on resource names navigates to the resource’s summary page on the Azure Portal within the tenant, thanks to Lighthouse, without tenant switching (except for Power Automate Flows).

Clicking on the percentage value provides a deeper insight into the resource’s run history and errors in docked view. This detailed view is resource type-specific, meaning each of the three resources we segregated has its own Workbook. We always display what percentage of all runs were successful and how many faulty runs occurred, with details of these runs.

Logic App

Beyond general information, faulty runs (status, error cause, run time) are displayed in tabular form if any occurred during the specified time interval. Clicking the INSPECT RUN link redirects the user to the specific run where all successful and failed steps in the process can be viewed. At the bottom, the average run time and the distribution of runs are displayed in diagram form.


Microsoft Flow

For Flows, the same information as for Logic Apps is displayed. The link also redirects to the specific run, but since it involves leaving Azure, logging in again is required because Dynamics falls outside the scope of Lighthouse.


Azure Function

The structure is the same for Azure Functions, with the addition that the link redirects to another Workbook instead of the specific run’s Function App monitor page. This is necessary because only the last 20 runs can be reviewed on the Portal. For older runs, we need to use Log Analytics, so to facilitate error tracking, the unique logs determined by developers in the code for the faulty run are displayed in chronological order.


Consolidated View

Since organizationally, the same team may be responsible for multiple projects, a comprehensive view was also created where all resources are displayed without type-dependent grouping. This differs from the Workbook of a specific project’s resources in that the honeycombs are ordered by success rate, and the total number of runs is displayed. Clicking on the percentage value brings up the previously described resource type-specific views.


Usability

This solution can be handy in cases where we want to get a picture of the status of various platform services in a centralized location. This can be realized interactively for all runs, except for Flows, without switching tenants or possibly different user accounts. Notification rules can also be configured based on queries used in Workbooks.

Advantages:

  • The monitoring system and visualization are flexible and customizable.
  • New resources of the same type can be added with a few clicks to already defined resource types (see: configuring diagnostic settings for Logic Apps).

Disadvantages:

  • Custom log tables, visualizations, and navigation between Workbooks require manual configuration.
  • Integrating Flows requires significantly more time investment during development and planning.
  • Combining tables, separating environments and projects can be cumbersome due to different infrastructure schemas.
  • Basic knowledge of KQL (Kusto Query Language) or SQL is necessary for queries.

Experience

The team that implemented the solution for the client provided positive feedback. They use it regularly, significantly easing the daily work of developer colleagues and error tracking. Errors have often been detected and fixed before the client noticed them. It also serves well after the deployment of new developments and modifications. For Logic Apps, diagnostic settings are included in ARM (Azure Resource Manager) templates during development, so runs can be tracked from the moment of deployment in all environments using release pipelines.

Hiding Subgrid Buttons Specifically
April 9, 2025
3 min read

Depending on the stage of a sales process, different functions should be available on a form's subgrid. In practice, this means that at the beginning of the process interests can be added to a Lead, but these should not be modifiable later, in the Opportunity phase. This article requires some technical knowledge to understand and apply, so it is recommended for Dynamics 365 CE app makers who are already familiar with the Power Platform world.


Tools Used for the Solution: a dedicated Solution, a uniquely named subgrid, a JavaScript web resource, and the Ribbon Workbench.

The Solution:

1. Solution

Create a Solution that will be loaded into the Ribbon Workbench. Add the entity whose SubGrid you want to modify into this solution. (Important: when adding the existing entity to the Solution, do not import any other elements). The name of the Solution should always be constructed based on the following logic: Ribbon_VL_[entity name] e.g., Ribbon_VL_Product_Interest.


2. Subgrid

Name the SubGrid with a unique, identifiable name. Do not use the automatically generated name, as you will refer to this later.

3. JavaScript

Create the following JavaScript as a .js file (using VS Code), then upload it to the solution containing the Web resources. It is advisable to name the file the same as its content to make it easier to find later.


forProductInterestView: function (selectedControl) {
    "use strict";
    console.log("start.forProductInterestView");

    // Name of the subgrid as set on the form (step 2)
    var excludedPayRun = "subgrid_prodinterest";

    // Internal name of the grid the ribbon button belongs to
    var currentGridName = selectedControl._controlName;
    console.log("forProductInterestView-currentGridName: " + currentGridName);

    // Return false (hide the button) only for the targeted subgrid
    if (currentGridName == excludedPayRun) {
        console.log("end.forProductInterestView.true");
        return false;
    } else {
        console.log("end.forProductInterestView.false");
        return true;
    }
}

4. Ribbon Workbench

Open the Ribbon workbench and add the solution created in step one. Each entity has 3 ribbons: Home, Subgrid, Form. We now need the Subgrid.

Select the button you want to remove by right-clicking on it and pressing "Customise Button." A red checkmark will appear, and it will also be added to the Buttons section below. If it is already checked, it means a command is already associated with it; in that case, you need to add a new command and can skip this step.

Next, add a Command, which can be done by clicking the plus sign in the Commands section. The command should look like this:


Explanation:

  • Library: The webResource you added to the solution (this is where the good naming comes in)
  • Function name: The name given in the JavaScript. (The part before the Function)
  • CRM Parameter: What parameter to pass; in this case, it is the SelectedControl. This Control manages the SubGrids on Forms and all listings. The PrimaryControl manages the form.

Next, add an EnableRule that hides the buttons.

Explanation:

  • Library: The webResource you added to the solution (this is where the good naming comes in)
  • Function name: The name given in the JavaScript. (The part before the Function)
  • CRM Parameter: What parameter to pass; in this case, it is the SelectedControl

Only one step remains before Publishing. For the buttons, specify which Command should be associated with them.

I hope you find this article useful and that it provides a solution idea.

The Past, Present, and Future of ERP Systems
April 9, 2025
5 min read

When I started working at VisualLabs, during the first WSM (weekly standup meeting) where each business division reports on their current weekly tasks, I noticed how many abbreviations we use. As a member of the ERP team, I wondered if we know exactly how these abbreviations came about and what they stand for.

The term ERP (Enterprise Resource Planning) is familiar to everyone today, but few know its exact origins and development path. Therefore, I decided to gather information on where it started and the major milestones that helped shape the ERP systems we know today. Looking back in time, we will realize how deeply this technology is rooted in the modern business world.

In this blog, I have compiled seven milestones that contributed to the development of the ERP system as we know it today.

In today’s world, it would be unimaginable for a company not to use some kind of computer system for its various processes. However, before the advent of computers, companies had to manage these processes (be it accounting or production planning) using some methods. Take accounting, for example. Accountants recorded every financial transaction manually on paper in different books, which they managed daily and monthly. It is hard to imagine that companies often had rooms full of (main) books and files, each containing dozens of transactions. At the center of it all was the accountants’ most precious asset, the general ledger. It is daunting to think about how much work the year-end closing process entailed and how many errors could occur during this process.


  1. Birth of Computers (1950s):

In the 1950s, with the birth of computers – theoretically founded by John von Neumann – a new dimension opened up in the operation of companies and the transformation of their processes. Although these computers were primarily used in the military and scientific fields in the 50s – due to their large size and cost – continuous technological developments soon brought them into the business world. These devices allowed faster data processing and analysis and helped automate business activities.


2. Inventory Management and Control (1960s):

One of the first milestones in recognizing the potential of computers for business opportunities stretches back to the 1960s. The manufacturing industry realized the need for a system that would enable inventory management, monitoring, and control. The emergence of information technology allowed companies to integrate and automate their business processes. As a result, they improved the efficiency and accuracy of inventory management. This was one of the first steps toward developing ERP systems.

3. Material Requirements Planning (MRP I, 1970s)

The concept of MRP (Material Requirements Planning) first appeared in 1970 and fundamentally represented a software-based approach to planning and controlling manufacturing processes. MRP’s application primarily focused on planning and tracking material requirements. This approach allowed companies to predict more accurately the type and amount of materials needed during production processes. With MRP, companies could manage material procurement and production scheduling more effectively, reducing losses from over- or underestimation. This innovation had a significant impact on the manufacturing industry and fundamentally transformed companies’ material planning processes. This approach contributed to increased efficiency and competitiveness of manufacturing companies in the 1970s.

4. Manufacturing Resource Planning (MRP II, 1980s): The 1980s marked a significant milestone with the advent of MRP II systems. While MRP focused solely on the inventories and materials needed based on real or forecasted customer demands, MRP II provided greater insight into all other manufacturing resources. By extending manufacturing planning beyond materials to include labor, machinery, and other production resources, it gave companies much greater control over their manufacturing processes.

5. Enterprise Resource Planning Systems (ERP, 1990s): In the 1990s, the first true ERP systems were introduced. (The term ERP itself was coined by the research firm Gartner in the 1990s.) ERP systems represented a significant advancement compared to MRP II systems as they focused not only on manufacturing but also on the full integration and automation of business processes. Such processes included procurement, sales, finance, human resources, and accounting. With full integration, companies could manage their business processes in a unified database, offering numerous advantages. The unified storage and management of information ensured access to accurate, up-to-date data, improving decision-making and efficiency. The connected business areas helped formulate and implement unified strategies. As a result, the ERP system became a “one-stop solution” that managed all company information.

6. Web-Based Functionalities with the Rise of the Internet (ERP II, 2000s): In the mid-2000s, as the internet’s role grew in the business world, ERP systems also adapted to this change. Systems began incorporating customer relationship management (CRM) and supply chain management (SCM) functionalities. ERP II emphasized user-friendly interfaces and customization. Modular systems were developed, allowing businesses to select and implement the components most relevant to their operations.

7. Cloud-Based ERP (2010s): In the 2010s, the emergence of cloud technology added a new dimension to ERP systems. Cloud-based ERP solutions allowed companies to store and run their ERP systems in the cloud instead of traditional “on-premise” installations. This offered significant advantages, including greater flexibility, lower costs, and easier access to critical data. With cloud-based ERP systems, companies no longer had to worry about server maintenance or software updates, as these tasks were handled by their providers. This allowed companies to focus on their business goals and processes while ensuring their system was always up-to-date and accessible.

+1 The Future of ERP: So where is the development of ERP systems headed today? With intelligent algorithms and artificial intelligence, systems are increasingly able to automate and optimize business processes, reducing the need for human intervention. Data will continue to play a key role in the future, as more efficient analysis of data enables companies to make better business decisions. The integration of ERP systems with various IoT devices allows real-time data exchange, providing companies with quicker and more accurate answers to support different business questions.


ERP systems are also increasingly providing personalized user experiences and offering expandable integrations with other business applications and technologies. In the future, ERP systems will not just function as tools but will provide true business intelligence and competitiveness, helping companies keep pace with the rapidly changing business environment and stand out from their competitors.

Are you exploring the world of ERP systems? Visual Labs can help you uncover the possibilities within.

Sources:

https://www.geniuserp.com/resources/blog/a-brief-history-of-erps
https://www.fortunetechnologyllc.com/history-of-erp-systems/
https://www.erp-information.com/history-of-erp.html
https://www.techtarget.com/searcherp/tip/MRP-vs-MRP-II-Learn-the-differences
https://www.business-case-analysis.com/account.html
https://www.britannica.com/technology/computer/IBM-develops-FORTRAN
https://business.joellemena.com/business/when-did-computers-start-being-used-in-business-2/

Microsoft Power Pages: Quick and Efficient Website Building for Your Business
April 9, 2025
3 min read


Digital presence is becoming increasingly important for every business. To stay competitive, we need to quickly adapt to changing demands and technological advancements. Therefore, it is essential to use tools that allow us to efficiently and quickly create websites that meet our business goals.

Power Pages, previously known as Microsoft Power Apps Portals, is a platform that allows us to build websites quickly and easily while seamlessly integrating them with our existing data sources. Power Pages is an ideal solution for businesses that want to create websites swiftly without lengthy development times.

For example, we used Power Pages for the Construction Monitoring and Data Service System’s portals, which greatly assists clients in recording, tracking, and administering their inquiries.

Power Pages also played a crucial role in an IT audit project, enabling us to efficiently handle data provision from multiple companies.

Collecting and managing data services is one of the most critical phases of audit projects, especially when involving multiple companies. Power Pages proved to be an extremely useful tool in this process, where we had to request data from numerous companies and then collect and manage this information through the platform. The platform allowed the creation of websites in a simple and intuitive manner without requiring complex coding knowledge.

Firstly, with Power Pages, we easily created a user-friendly interface that enabled companies to submit their data services efficiently. The interface featured simple data entry forms, making it easy for companies to understand how to input information into our system.

Secondly, Power Pages allowed for easy management and tracking of data throughout the project. The transparent administration interface helped us keep track of which companies had submitted their data and the status of the data collection process. This allowed us to respond efficiently to any shortcomings or questions from the companies.

Thirdly, Power Pages facilitated the easy integration and analysis of data within the audit project. The collected data could be easily imported into other systems (e.g., Power BI), making it readily usable in the audit process. This enabled us to analyze and evaluate the information submitted by companies more quickly and efficiently.

Overall, Power Pages offers a scalable solution, flexible and customizable, adapting to unique business needs and requirements. This allows businesses to freely shape and expand their websites according to their business goals.

One of our clients had a need for such a website.

The Sales Portal we created for one of our clients is a website that enables external sales partners or distributors to collaborate on sales opportunities and increase sales within the organization. This site provides state-of-the-art, secure authentication and fully customizable design and functionality. Distributors can log in and collaborate on sales opportunities in full sync with internal sales teams, thanks to instant two-way data synchronization. The marketing department can assist in the sales process by updating sales guides and materials available on the homepage, keeping the latest product information up-to-date.

If you also want to create websites quickly and efficiently for your business, it might be worth trying out Power Pages. The Visual Labs team is happy to assist you with this!

Book review: John Willis - Deming's Journey to Profound Knowledge
April 9, 2025
2 min read

Deming's System of Profound Knowledge - IT Revolution

What is the book about?

This book covers the life of W. Edwards Deming, who founded modern managerial statistics, contributed greatly to the WWII production effort in the US, and shaped the post-war recovery of Japanese manufacturing.

This book is not merely a biography; it intertwines Deming's life story with the evolution of management history, providing a comprehensive view of his impact.

As the book was published by IT Revolution and written by the co-author of the DevOps Handbook, it also talks in detail about how agile methodologies and DevOps grew out of these ideas.

What did I find useful?

It was truly insightful to see the lineage of how different managerial waves evolved in the past hundred years, how the different management methods succeeded each other (from Total Quality Management through Lean and then how the foundations permeated into Agile and later to DevOps). The author paints a great picture of the events and the people involved besides Deming.

The first half of the book talks about the evolution of modern manufacturing processes through the life of Deming; I feel this is the part of the book that was fairly novel. This part also flows really well – it could easily be the narration of a Netflix documentary.

The second half of the book turns to software development and mainly to the DevOps 'movement'. This part is definitely insightful and draws on several interesting case studies, especially in the IT security area (e.g. white-hat vs. black-hat hackers).

Who would I recommend it to?

Certainly an interesting read (or listen) for those interested in management history and the ideological background of current software delivery practices.

If you are new to this sort of literature and domain (e.g. manufacturing, lean, software development practices), this may not be an ideal starting point as it talks about concepts fairly briefly assuming that readers are already familiar with them - which is what you would expect from the typical reader (listener) of this book.

Follow-on

For a further deep dive, it would be interesting to read Dr. Deming first-hand: the culmination of his knowledge was compiled into what is called the "System of Profound Knowledge", along with his famous "14 Points for Management":

Dr. Deming's 14 Points for Management - The W. Edwards Deming Institute

The Deming System of Profound Knowledge® (SoPK) - The W. Edwards Deming Institute


Ready to talk about your use cases?

Request your free audit by filling out this form. Our team will get back to you to discuss how we can support you.

Stay ahead with the latest insights
Subscribe to our newsletter for expert insights, industry updates, and exclusive content delivered straight to your inbox.