[{"content":"At Ignite 2025, Microsoft announced that Defender for Cloud is now available in the Microsoft Defender unified security portal (security.microsoft.com).\nThis integration brings a single pane of glass experience across the Microsoft security product line. Security teams who don\u0026rsquo;t necessarily work directly with Azure workload deployments can now view important security metrics concerning:\nasset vulnerabilities attack paths secure scores prioritized security recommendations Prerequisites Before enabling this feature, ensure you have:\nAn active Microsoft Defender for Cloud subscription Appropriate permissions to access preview features in the Microsoft Defender portal Reader access or higher to the Azure subscriptions you want to monitor Enabling the experience You will have noticed that there is a new menu called Cloud Infrastructure in the security portal. When selecting the overview, you\u0026rsquo;re presented with the option to enable the feature.\nTo enable the feature, navigate to System \u0026gt; Settings \u0026gt; Preview Features and enable \u0026lsquo;Microsoft Defender for Cloud\u0026rsquo;.\nOnce you have enabled the feature, you can navigate to the Cloud Infrastructure overview page to view the dashboard metrics from Defender for Cloud.\nDashboard highlights The overview dashboard is quite detailed. Let me highlight some of the most useful features to help you get started with the new menu.\nNew cloud secure score The new cloud secure score model introduces asset risk factors and asset criticality into the calculation. This produces a more accurate environment score by weighting the amount and severity of open recommendations against resource criticality in your organization.\nRecommendations The recommendations you\u0026rsquo;re familiar with from Defender for Cloud are available in the unified portal. The filtering experience has been significantly improved. 
In the Azure portal, drilling down into issues often resets the filters, which has been a major pain point.\nIn the unified portal, the details now open within the same window, allowing your filters to remain in place throughout your investigation.\nCloud inventory This overview is one of my favourite updates as you can easily switch between the technology domains that Defender is monitoring. The below example shows DevOps security, which covers all repositories, organizations, and pipelines connected to Defender for Cloud.\nThe other categories cover: VMs, data, containers, AI, identity, and serverless, with subcategories for monitoring within each technology domain.\nRole-based access control The unified portal introduces granular RBAC roles, allowing you to tailor access based on team responsibilities. This is particularly useful for organizations with specialized security teams.\nFor example:\nA data platform team may only need visibility into database and storage resources A SOC team might require broad visibility across multiple project areas and subscriptions Application teams can focus on their specific workloads without accessing the entire security landscape This granular approach ensures teams see relevant security information without being overwhelmed by data outside their scope.\nImportant considerations While the unified portal experience is impressive, the Azure portal remains essential for certain tasks. Currently, you cannot:\nEnable security policies in subscriptions Connect code repositories Configure CSPM plans Access some of the more technical configuration details The Azure portal is still more feature-rich and provides deeper technical detail for cloud security engineers working on configuration and policy management.\nSummary The Defender for Cloud integration into the Microsoft Defender portal is a valuable addition for central security teams. 
It provides:\nUnified visibility across Microsoft security products Improved filtering and navigation Role-based views tailored to team responsibilities A cleaner dashboard experience for executive visibility Think of this as a complementary view rather than a replacement. Security operations teams will appreciate the consolidated dashboard, while cloud engineers will still rely on the Azure portal for deep configuration work. It\u0026rsquo;s perfect for that big screen in your SOC!\n","permalink":"http://www.forsh.dev/posts/enabling-defender-for-cloud-unified-portal/","summary":"\u003cp\u003eAt Ignite 2025, Microsoft announced that Defender for Cloud is now available in the Microsoft Defender unified security portal (\u003ca href=\"https://security.microsoft.com/\"\u003esecurity.microsoft.com\u003c/a\u003e).\u003c/p\u003e\n\u003cp\u003eThis integration brings a single pane of glass experience across the Microsoft security product line. Security teams who don\u0026rsquo;t necessarily work directly with Azure workload deployments can now view important security metrics concerning:\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003easset vulnerabilities\u003c/li\u003e\n\u003cli\u003eattack paths\u003c/li\u003e\n\u003cli\u003esecure scores\u003c/li\u003e\n\u003cli\u003eprioritized security recommendations\u003c/li\u003e\n\u003c/ul\u003e\n\u003ch2 id=\"prerequisites\"\u003ePrerequisites\u003c/h2\u003e\n\u003cp\u003eBefore enabling this feature, ensure you have:\u003c/p\u003e","title":"Enabling the Defender for Cloud Unified Security Portal"},{"content":"I\u0026rsquo;ve been posting on medium for a while now but I\u0026rsquo;ve always wanted a blog that was my own, simple to maintain, looked clean, and deployed automatically. 
Instead of overthinking every decision, I leaned into the current IT buzzword of the moment, vibe coding: AI prompting with fast iterations, small fixes, and shipping in the flow.
This post is a quick recap of what I did, what went wrong, and what finally worked.
Why Hugo + GitHub Pages I picked Hugo for three reasons:
It is fast and markdown-first. It works great with static hosting. It keeps content and code in one place. For hosting, GitHub Pages was the natural fit since it's free, I'm used to working with it, and I wanted CI/CD from day one.
What I Built I set up the site in my craigforshaw.github.io repository with:
Hugo Extended PaperMod theme GitHub Actions deployment workflow About page Post structure using both single files and page bundles After the initial setup, I added real content and started customizing the home page and navigation.
The Vibe Coding Workflow The flow was basically:
Make a change. Run/preview immediately. Fix the next error fast. Repeat until it feels right. That sounds obvious, but the key was not getting stuck trying to perfect the architecture before publishing anything.
Copilot model I initially started out with the GitHub model but it didn't seem to vibe so well; it made quite a lot of errors or didn't produce what I was looking for. As soon as I switched to Claude Sonnet I was able to make more progress, as that model understood my prompts better and implemented fixes faster.
Issues I Hit (and Fixed) 1) Hugo config parse errors I hit TOML parsing issues from apostrophes/quotes in front matter and config values. The fix was to normalize quotes and keep front matter syntax strict.
2) Preview server confusion At times the local preview looked like it was down, but it was mostly stale terminal state and previous failed runs. Restarting Hugo cleanly and checking the logs solved it.
3) Workflow reliability The initial GitHub Actions workflow had extra moving parts that were not necessary for my setup. 
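For context, the simplified workflow ended up being little more than the sketch below. This is illustrative: the action versions and the peaceiris/actions-hugo setup step are assumptions on my part, so check each action's own documentation before copying it.

```yaml
name: Deploy Hugo site to Pages

on:
  push:
    branches: [main]

permissions:
  contents: read
  pages: write
  id-token: write

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Install Hugo Extended (required by PaperMod and most modern themes)
      - name: Setup Hugo
        uses: peaceiris/actions-hugo@v3
        with:
          hugo-version: 'latest'
          extended: true
      - name: Build
        run: hugo --minify
      # Hand the generated site to the Pages deployment job
      - name: Upload artifact
        uses: actions/upload-pages-artifact@v3
        with:
          path: ./public

  deploy:
    needs: build
    runs-on: ubuntu-latest
    environment:
      name: github-pages
    steps:
      - id: deployment
        uses: actions/deploy-pages@v4
```

The point is fewer steps: checkout, build, upload, deploy, and nothing else to go wrong.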
Simplifying the workflow reduced failure points and made deploys predictable.\n4) Post image management I switched to page bundles for content that has multiple screenshots. Keeping index.md and images together made writing and maintenance much easier.\nContent Migration from Medium I migrated selected Medium posts into the Hugo blog. For posts with lots of images, the page bundle structure made it straightforward:\nOne folder per post index.md for content Screenshots in the same folder Relative image links in markdown This keeps each post self-contained and easy to move/edit later.\nHome Page Layout Decisions I experimented with profile/home-info mode vs a posts-first home page. The big takeaway: choose the layout that matches your goal.\nIf you want personal branding first: profile mode. If you want content discovery first: post list on home. I leaned toward clarity and easy navigation over heavy customization.\nWhat I Like About the Current Setup Writing is fast: just markdown. Deploys are automatic on push. Theme is clean and readable. Content and infrastructure live together in one repo. Most importantly, I can now focus on publishing rather than platform maintenance.\nLessons Learned Ship the first version quickly. Keep the deployment path simple. Prefer conventions over custom complexity. Treat errors as part of the workflow, not blockers. 
Vibe coding worked because it kept momentum high while still producing a solid, maintainable result.
What’s Next Improvements I want to make next:
Better image optimization for large screenshots More consistent post templates Stronger internal linking between related posts Custom domain hardening and final polish If you are setting up your own technical blog, my advice is simple: start with a minimal Hugo setup, automate deployment early, and publish your first real post as quickly as possible.
Done beats perfect.
","permalink":"http://www.forsh.dev/posts/vibe-coding-my-hugo-blog-on-github-pages/","summary":"\u003cp\u003eI\u0026rsquo;ve been posting on medium for a while now but I\u0026rsquo;ve always wanted a blog that was my own, simple to maintain, looked clean, and deployed automatically. Instead of overthinking every decision, I leaned into the current IT buzz word of the moment \u003cstrong\u003evibe coding\u003c/strong\u003e: AI prompting with fast iterations, small fixes, and shipping in the flow.\u003c/p\u003e\n\u003cp\u003eThis post is a quick recap of what I did, what went wrong, and what finally worked.\u003c/p\u003e","title":"Vibe Coding My Hugo Blog on GitHub Pages"},{"content":"Using container app jobs for self-hosted Azure DevOps agents allows for more control over what is running on your DevOps agents. Both VMSS and the newer managed DevOps pools give you the option to run agents on your own virtual network, which is excellent for securing network traffic, but if you also need control over what is running on the agents, then configuring them with Docker in a container app job is a good option. You also have the added security of Defender for Containers integration to ensure you can keep your images secure.
One of the downsides to container apps/jobs as agents until recently has been the requirement to use a PAT token to authenticate with an Azure DevOps agent pool. 
PAT is not ideal: the token can be exposed, can expire, and is generally vulnerable to cyberattacks. If an attacker gets their hands on your PAT token then you're at risk of unauthorized access to your source code, pipelines, and ultimately your cloud infrastructure resources.
Microsoft have also recently announced that they are distancing themselves from PAT in Azure DevOps: Reducing personal access token (PAT) usage across Azure DevOps
We can now remove this risk by configuring a system-assigned managed identity on a container app job to run as an Azure DevOps agent.
So let's get started:
Pre-requisites Install Windows Subsystem for Linux (WSL) and install Docker Desktop or the Docker engine directly in WSL to be able to build your image. Deploy a virtual network and subnet in Azure for the container app environment (if using your own vnet). Deploy an Azure container registry.
Dockerfile First we need to build our dockerfile to use as an Azure DevOps agent. For this I recommend using the dockerfile based on the 'python:3-alpine' image that is documented by Microsoft in this article: Run a self-hosted agent in Docker - Azure Pipelines

```dockerfile
FROM python:3-alpine
ENV TARGETARCH="linux-musl-x64"

# Another option:
# FROM arm64v8/alpine
# ENV TARGETARCH="linux-musl-arm64"

RUN apk update && \
    apk upgrade && \
    apk add bash curl gcc git icu-libs jq musl-dev python3-dev libffi-dev openssl-dev cargo make

# Install Azure CLI
RUN pip install --upgrade pip
RUN pip install azure-cli

WORKDIR /azp/

COPY ./start.sh ./

RUN chmod +x ./start.sh

RUN adduser -D agent
RUN chown agent ./
USER agent
# Another option is to run the agent as root.
# ENV AGENT_ALLOW_RUNASROOT="true"

ENTRYPOINT [ "./start.sh" ]
```

Next we need to configure the start.sh file. 
The start.sh file in the Microsoft documentation uses a service principal to configure the $AZP_TOKEN value, but this requires that you store a client secret somewhere. Since Azure DevOps supports adding managed identities as users, I have configured the file as follows:
Configure $AZP_TOKEN to use managed identity instead of a service principal or PAT token. Configure the $AZP_PLACEHOLDER cleanup to keep the placeholder configuration if '$AZP_PLACEHOLDER = 1' is present, which is required for KEDA scaling. Note: for those of you wondering, the 'APPLICATION_ID' refers to the well-known Azure DevOps app registration in Microsoft's own tenant.

```bash
#!/bin/bash
set -e

if [ -z "$AZP_URL" ]; then
  echo 1>&2 "error: missing AZP_URL environment variable"
  exit 1
fi

IDENTITY_HEADER="$IDENTITY_HEADER"
IDENTITY_ENDPOINT="$IDENTITY_ENDPOINT"
APPLICATION_ID="499b84ac-1321-427f-aa17-267ca6975798"

# Request an access token for Azure DevOps from the managed identity endpoint
response=$(curl -s -X GET -H "X-IDENTITY-HEADER: $IDENTITY_HEADER" "$IDENTITY_ENDPOINT?resource=$APPLICATION_ID&api-version=2019-08-01")
AZP_TOKEN=$(echo "$response" | jq -r '.access_token')

if [ -z "$AZP_TOKEN_FILE" ]; then
  if [ -z "$AZP_TOKEN" ]; then
    echo 1>&2 "error: missing AZP_TOKEN environment variable"
    exit 1
  fi

  AZP_TOKEN_FILE=/azp/.token
  echo -n $AZP_TOKEN > "$AZP_TOKEN_FILE"
fi

unset AZP_TOKEN

if [ -n "$AZP_WORK" ]; then
  mkdir -p "$AZP_WORK"
fi

export AGENT_ALLOW_RUNASROOT="1"

cleanup() {
  if [ -e config.sh ]; then
    if [ -z "$AZP_PLACEHOLDER" ]; then
      print_header "Cleanup. Removing Azure Pipelines agent..."

      # If the agent has some running jobs, the configuration removal process will fail.
      # So, give it some time to finish the job.
      while true; do
        ./config.sh remove --unattended --auth PAT --token $(cat "$AZP_TOKEN_FILE") && break

        echo "Retrying in 30 seconds..."
        sleep 30
      done
    else
      print_header "Cleanup skipped as Agent is marked as a placeholder... this option should be removed if Azure DevOps allows queueing to empty Agent Pools."
    fi
  fi
}

print_header() {
  lightcyan='\033[1;36m'
  nocolor='\033[0m'
  echo -e "${lightcyan}$1${nocolor}"
}

# Let the agent ignore the token env variables
export VSO_AGENT_IGNORE=AZP_TOKEN,AZP_TOKEN_FILE

print_header "1. Determining matching Azure Pipelines agent..."

AZP_AGENT_PACKAGES=$(curl -LsS \
  -u user:$(cat "$AZP_TOKEN_FILE") \
  -H 'Accept:application/json;' \
  "$AZP_URL/_apis/distributedtask/packages/agent?platform=$TARGETARCH&top=1")

AZP_AGENT_PACKAGE_LATEST_URL=$(echo "$AZP_AGENT_PACKAGES" | jq -r '.value[0].downloadUrl')

if [ -z "$AZP_AGENT_PACKAGE_LATEST_URL" -o "$AZP_AGENT_PACKAGE_LATEST_URL" == "null" ]; then
  echo 1>&2 "error: could not determine a matching Azure Pipelines agent"
  echo 1>&2 "check that account '$AZP_URL' is correct and the token is valid for that account"
  exit 1
fi

print_header "2. Downloading and extracting Azure Pipelines agent..."

curl -LsS $AZP_AGENT_PACKAGE_LATEST_URL | tar -xz & wait $!

source ./env.sh

trap 'cleanup; exit 0' EXIT
trap 'cleanup; exit 130' INT
trap 'cleanup; exit 143' TERM

print_header "3. Configuring Azure Pipelines agent..."

./config.sh --unattended \
  --agent "${AZP_AGENT_NAME:-$(hostname)}" \
  --url "$AZP_URL" \
  --auth PAT \
  --token $(cat "$AZP_TOKEN_FILE") \
  --pool "${AZP_POOL:-Default}" \
  --work "${AZP_WORK:-_work}" \
  --replace \
  --acceptTeeEula & wait $!

print_header "4. Running Azure Pipelines agent..."

./run.sh "$@" & wait $!
```

Now you can run docker build and push commands against your container registry directly from WSL, or via an Azure DevOps pipeline task with a service connection that has the 'acrPush' role on the container registry.

```bash
# build docker file with tag
docker build --tag "agent:latest" --file "./dockerfile" .

# tag docker file for use in acr
docker tag agent:latest <name_of_container_registry>.azurecr.io/agent:latest

# push docker file to acr
docker push <name_of_container_registry>.azurecr.io/agent:latest
```

Azure container app job Now we are ready to deploy the container app job that's going to run the Azure DevOps agent.
For this I am using Bicep code to deploy both a container app environment and a container app job. There are a few different options for this deployment: you can deploy the app environment on a Microsoft-hosted network, or in an isolated subnet with either a consumption workload profile or a dedicated workload profile (or both).
In the following example I'm deploying on my own virtual network in a dedicated subnet, using an internal load balancer on the container environment and a dedicated D4 workload profile. 
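Below is a trimmed Bicep sketch of the container app job itself, so you can see where the system-assigned identity, the KEDA scale rule and the organization URL secret fit together. Treat it as an illustration only: the resource names, API version, workload profile name and placeholder values are my assumptions, not a copy-paste deployment.

```bicep
resource agentJob 'Microsoft.App/jobs@2023-05-01' = {
  name: 'ado-agent-job'
  location: resourceGroup().location
  identity: {
    type: 'SystemAssigned' // used for ACR pull, KEDA auth and the Azure DevOps agent itself
  }
  properties: {
    environmentId: '<container_app_environment_resource_id>'
    workloadProfileName: '<dedicated_d4_profile_name>'
    configuration: {
      triggerType: 'Event'
      replicaTimeout: 1800
      secrets: [
        {
          // The organization URL is the trigger parameter KEDA authenticates against
          name: 'organization-url'
          value: 'https://dev.azure.com/<your_organization>'
        }
      ]
      eventTriggerConfig: {
        scale: {
          minExecutions: 0
          maxExecutions: 10
          rules: [
            {
              name: 'azure-pipelines'
              type: 'azure-pipelines'
              metadata: {
                poolName: '<your_agent_pool>'
                targetPipelinesQueueLength: '1'
              }
              auth: [
                {
                  secretRef: 'organization-url'
                  triggerParameter: 'organizationURL'
                }
              ]
            }
          ]
        }
      }
      registries: [
        {
          server: '<name_of_container_registry>.azurecr.io'
          identity: 'system' // acrPull via the managed identity, no admin keys
        }
      ]
    }
    template: {
      containers: [
        {
          name: 'agent'
          image: '<name_of_container_registry>.azurecr.io/agent:latest'
          env: [
            { name: 'AZP_URL', secretRef: 'organization-url' }
            { name: 'AZP_POOL', value: '<your_agent_pool>' }
          ]
        }
      ]
    }
  }
}
```

The azure-pipelines scale rule is what lets KEDA watch your agent pool queue and start a job execution per queued pipeline run.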
Here are some points also worth mentioning in the configuration:
Scale rules are set to use KEDA scaling with the managed identity of the container app job. This requires that the trigger parameter is the Azure DevOps URL of your organization and that it is stored as a secret in the container app config. The managed identity needs to be added as a user in Azure DevOps with Stakeholder access and added to the DevOps agent pool as an administrator. You need to give the managed identity of the container app job 'acrPull' access on the container registry you are pulling the image from. With this you can avoid using the admin user access key on the container registry. You also need to deploy a placeholder agent (as mentioned earlier) which will be present in the Azure DevOps agent pool as an offline resource. This is required so that when you trigger a pipeline the KEDA scaling will start the app job. The placeholder app can be deleted once the agent has been registered as offline, but you will need to add its managed identity to the pool as well to ensure it works before you delete the app. Summary Using this configuration will ensure that you can remove the need for PAT on your Azure DevOps agents and take advantage of the system-assigned managed identities on the container app jobs instead.
","permalink":"http://www.forsh.dev/posts/azure-devops-agents-container-app-jobs/","summary":"\u003cp\u003eUsing container app jobs for self-hosted Azure DevOps agents allows for more control over what is running on your DevOps agents. Both VMSS and the newer managed DevOps pools give you the option to run agents on your own virtual network which is excellent for securing network traffic but if you also need to have control what is running on them then configuring the agents with docker in a container app job is a good option. 
You also have the added security of Defender for containers integration to ensure you can keep your images secure.\u003c/p\u003e","title":"Creating Self-Hosted Azure DevOps Agents with Azure Container App Jobs and Managed Identity"},{"content":"Regulatory compliance Azure has a feature in Microsoft Defender for Cloud called regulatory compliance that allows you to start getting your cloud compliance under control. Central to this feature is the Microsoft Cloud Security Benchmark.
What is the Microsoft Cloud Security Benchmark? The MCSB, for short, is a set of practices that form a track of the Cloud Adoption Framework for Azure from Microsoft.
This has traditionally been a set of best practices and guidelines for cloud deployments, but more recently it has been integrated into the Defender for Cloud portal to provide the bridge from the adoption framework to reporting on resources against best practices.
Defender for Cloud regulatory compliance The portal provides an overview of all of the MCSB controls and the number of controls that have passed or failed. 
The controls cover not only Azure but also Amazon Web Services and Google Cloud Platform, as well as the GitHub and Azure DevOps environments that are integrated into DevOps Security.
Diving into the compliance controls will reveal which resources are non-compliant. For example, the below compliance control for privileged access reveals that there is a requirement for multiple owners on 1 subscription.
Clicking on the control details link will provide some additional info associated with the control, with the relevant compliance status in your subscription, and links to the MCSB control in Microsoft docs for further information.
Remediation Getting to work on remediation involves opening the affected non-compliant resource and following the remediation steps outlined in the resource compliance status.
Governance rules It's also possible to enforce remediation via governance rules to assign owners and due dates for addressing recommendations on specific resources.
This is particularly useful for large Azure environments with multiple subscriptions, where it is unmanageable for the security team to implement fixes. Governance rules give accountability to the resource owners and issue an SLA for remediation.
Governance rules are created in the environment settings of the Defender for Cloud portal.
The governance rules pane allows you to create governance rules based on subscription scope.
Then you can add conditions to alert based on severity or based on specific recommendations in the MCSB controls. 
Setting owners and remediation timeframes enforces governance through tailored deadlines and email notifications.
Summary In summary, Microsoft Defender for Cloud's regulatory compliance feature helps ensure security and compliance against the Microsoft Cloud Security Benchmark (MCSB), which is integrated into the Defender for Cloud portal.
These controls enable tracking and reporting on resources' compliance status, showing which controls have passed or failed. Detailed information about non-compliant resources is accessible, including remediation steps.
Finally, governance rules can be enforced to assign responsibility and set deadlines for addressing compliance issues, which brings the whole security compliance lifecycle together.
","permalink":"http://www.forsh.dev/posts/defender-cloud-regulatory-compliance/","summary":"\u003ch2 id=\"regulatory-compliance\"\u003eRegulatory compliance\u003c/h2\u003e\n\u003cp\u003eAzure has a feature in Microsoft Defender for Cloud called regulatory compliance that allows you to start getting your cloud compliance under control. 
Central to this feature is the \u003ca href=\"https://learn.microsoft.com/en-us/security/benchmark/azure/\"\u003eMicrosoft Cloud Security Benchmark\u003c/a\u003e.\u003c/p\u003e\n\u003ch3 id=\"what-is-the-microsoft-cloud-security-benchmark\"\u003eWhat is the Microsoft Cloud Security Benchmark?\u003c/h3\u003e\n\u003cp\u003eThe MCSB for short, is a set of practices that form a track of the \u003ca href=\"https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/\"\u003eCloud adoption framework for Azure\u003c/a\u003e from Microsoft.\u003c/p\u003e\n\u003cp\u003eThis has been traditionally a set of best practices and guidelines for cloud deployments but more recently it has been integrated into the Defender for Cloud portal to provide that bridge from the adoption framework to reporting on resources against best practices.\u003c/p\u003e","title":"Microsoft Defender for Cloud Regulatory Compliance"},{"content":"In my previous blog DevOps Security with Microsoft Defender for Cloud I introduced the DevOps Security features in Defender for Cloud and how you can link and scan your GitHub code repositories for vulnerabilities before they hit your infrastructure platforms.\nIn this blog I am going to focus on the options for fixing code issues based on the reporting findings from Defender for Cloud.\nFindings As mentioned previously, all of the reporting from your connected repositories appears under findings in the security overview dashboard of DevOps Security.\nTo get some findings to remediate for this blog, I am using two vulnerable by design IaC code repositories developed by Bridgecrew:\nTerragoat\nBicepgoat\nThese repositories are perfect for testing misconfigurations and vulnerabilities in a safe space. 
Be sure to keep them separate from your production environments if you are going to deploy this code!
Findings fall into the following categories:
Code — findings that are based on the CodeQL scanning in GitHub (CodeQL is the code analysis engine developed by GitHub to automate security checks).
IaC — IaC findings from the DevOps Security GitHub Action / Azure DevOps pipeline.
Secrets — Secrets scanned by GitHub advanced security.
Dependencies — Dependabot alerts from GitHub that include security updates and dependency version updates.
Recommendations In the security overview there is an option to view the DevOps environment posture management recommendations. This is a new overview that gives you good insight into the detected vulnerabilities, their risk level and risk factors.
Using this overview we can prioritize the high-risk issues first and drill down further into each vulnerability that has been found. In this case we have some high-priority dependency vulnerabilities in the code that need to be resolved.
Fix code findings and dependencies Fixing code findings and dependencies is done within GitHub, by opening the Security tab of the code repository. Given the number of security vulnerabilities in this case, you can look for specific issues based on the GHSA ID. 
This is the ID that appears in the above severity report and in each dependency issue in GitHub.
To resolve the dependency issues, GitHub automatically creates a series of pull requests that can be merged.
Looking at the top pull request, Dependabot has created a single pull request that contains 38 fixes identified in the Dependabot alerts above.
All that's needed here is to merge the pull request to automatically fix the issues in the code, and this will resolve the alerts in both GitHub and, after a short polling period, Defender for Cloud.
Secrets Secret scanning alerts are presented in the security tab under secret scanning. The secrets are captured by GitHub advanced security and then synchronised to Defender for Cloud. But what is defined here as a secret? It is any token or private key that is used to communicate with an external service. Keep that in mind, as this isn't something that's going to capture an exposed virtual machine password, for example. The secret scanning partner program provides some guidance on what is covered and how it works.
There is also an option to enable push protection for secrets, which will prevent anyone pushing code to your repo that contains a secret.
Fix IaC misconfigurations To fix IaC misconfigurations, run the Microsoft Security DevOps scanner in CI—either the GitHub Microsoft Security DevOps Action or the Azure DevOps task. 
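To make the IaC scenario concrete, here is an illustrative Bicep sketch of the kind of finding the scanner raises: a storage account missing common hardening settings, next to a remediated version. The resource names and API version are my own examples, not code from the repositories above.

```bicep
// Illustrative only: a storage account the scanner would flag.
// No TLS minimum, no HTTPS enforcement, public blob access left open.
resource insecureSa 'Microsoft.Storage/storageAccounts@2023-01-01' = {
  name: 'stinsecure001'
  location: resourceGroup().location
  sku: { name: 'Standard_LRS' }
  kind: 'StorageV2'
  properties: {
    allowBlobPublicAccess: true
  }
}

// Hardened version with the missing security parameters set.
resource secureSa 'Microsoft.Storage/storageAccounts@2023-01-01' = {
  name: 'stsecure001'
  location: resourceGroup().location
  sku: { name: 'Standard_LRS' }
  kind: 'StorageV2'
  properties: {
    minimumTlsVersion: 'TLS1_2'
    supportsHttpsTrafficOnly: true
    allowBlobPublicAccess: false
  }
}
```

In a pull request scan, the fix is typically just adding the hardened properties before the code is merged.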
I covered the setup in my previous blog.
Microsoft Security DevOps Action: https://github.com/marketplace/actions/microsoft-security-devops Microsoft Security DevOps for Azure DevOps: https://marketplace.visualstudio.com/items?itemName=ms-securitydevops.microsoft-security-devops-azdevops There are two main scenarios for running the action: against an existing deployment on your main branch, or against a branch pull request.
The first scenario will add any identified vulnerabilities to the code scanning section of the security tab in GitHub. These can then be remediated in your code directly, to fix the already deployed issues, or you can create a fix in a branch copy and re-run the action as part of a pull request annotation. This fits nicely with the second scenario: the introduction of new code.
This scenario is the method that will prevent vulnerabilities from hitting your environment in the first place. When a developer creates new code in a branch and initiates a pull request, the action will automatically scan the branch and add the issues that need to be fixed directly in the pull request.
To show this further, I have created a video that shows an example of a simple Azure storage account written in Bicep code that has some missing security parameters, and how to use the code scanning tools as part of a pull request to remediate them before they are merged into your code.
Summary In summary, the tools available to remediate code issues from Defender for Cloud in combination with GitHub advanced security provide a wide area of protection. 
You have a full range of support for application code, IaC, secrets and dependencies that can be reported into Defender for Cloud.
Using the pull request integration feature is also a nice way to include this action as a pre-requisite in your code environment, allowing developers to fix coding issues before they are deployed to your Azure environments.
","permalink":"http://www.forsh.dev/posts/investigate-remediate-devops-security-defender-for-cloud/","summary":"\u003cp\u003eIn my previous blog \u003ca href=\"\"\u003eDevOps Security with Microsoft Defender for Cloud\u003c/a\u003e I introduced the DevOps Security features in Defender for Cloud and how you can link and scan your GitHub code repositories for vulnerabilities before they hit your infrastructure platforms.\u003c/p\u003e\n\u003cp\u003eIn this blog I am going to focus on the options for fixing code issues based on the reporting findings from Defender for Cloud.\u003c/p\u003e\n\u003ch2 id=\"findings\"\u003eFindings\u003c/h2\u003e\n\u003cp\u003eAs mentioned previously, all of the reporting from your connected repositories appears under findings in the security overview dashboard of DevOps Security.\u003c/p\u003e","title":"Investigate and Remediate DevOps Security findings in Microsoft Defender for Cloud"},{"content":"Following Microsoft Ignite in Nov 2023, Defender for DevOps has now become DevOps security. In practice this means that a lot of the features which were previously in public preview are now generally available.
But first… what is DevOps security in Defender for Cloud?
DevOps Security This feature of Defender for Cloud provides end-to-end security for code-based deployments from the well-known major source code repositories available in the market. 
These can be selected from the environment settings of the DevOps security blade in Defender for Cloud, with GitLab being the most recent addition to the environment list.
Licensing The licensing model has been incorporated into the Defender CSPM plans, where the functionality varies based on whether you are using the free foundational CSPM or the billable Defender CSPM license.
What does this mean? Starting March 1, 2024, Defender CSPM must be enabled to have premium DevOps security capabilities, which include:
Code-to-cloud contextualization powering security explorer and attack paths Pull request annotations for Infrastructure-as-Code security findings The foundational CSPM license will continue to provide some functions, such as allowing you to connect your repositories and providing GitHub advanced security recommendations in the portal (which is free for public repos), but it will not have any coverage for IaC. Only the CodeQL findings in GitHub are supported.
If you want the full benefit of the product you will need the Defender CSPM license. More detailed info on what is supported on each license tier can be found here:
Support and prerequisites — Microsoft Defender for Cloud
Connector To start, you need to connect your repos to Defender for Cloud via a connector. In order to do this, you need Contributor level access or Security Admin permissions on the Azure subscription that is running Defender for Cloud. For your source code environments, the permissions are as follows:
GitHub — Organisation owner Azure DevOps — Project collection admin GitLab — Group owner The connector now supports more regions than during the public preview. More regions should come online as Microsoft rolls this out across more of their Azure regional datacenters.
For both GitHub and Azure DevOps you can have multiple connectors for the same source code platform at different levels in the hierarchy. 
For example, in GitHub you can create a connector per org or repo, and for Azure DevOps per org, project or repo. For GitLab, only one instance of a GitLab group can currently be onboarded to the Azure tenant in which you are creating the connector, as this feature is in preview.\nPortal The overall look and feel of the portal got a makeover. For example, the security overview shows the DevOps security results from code, secrets, dependencies and IaC findings. Repos can also be structured into a hierarchical view for ease of navigation.\nMicrosoft Security DevOps Microsoft Security DevOps is a command line application that integrates static analysis tools into the development lifecycle. It uses open-source tools to scan the pipeline run or action to check for vulnerabilities in code. It is a key component of Defender CSPM, especially concerning IaC code scanning, as both Terrascan for Terraform and Template Analyzer for Bicep/ARM templates are included here.\nThe sample workflow is available for both Azure DevOps and GitHub Actions. In the following steps I will focus on the GitHub action.\nYou need to copy and paste the sample workflow into your pipeline job and commit it to commence the analysis.\nNote: I\u0026rsquo;ve added permissions that are needed to ensure security events are written to the GitHub security tab. These are not included in the standard template.\nname: MSDO\non:\n  push:\n    branches:\n      - main\njobs:\n  sample:\n    name: Microsoft Security DevOps Analysis\n    # MSDO runs on windows-latest.\n    # ubuntu-latest also supported\n    runs-on: windows-latest\n    permissions:\n      actions: read\n      contents: read\n      security-events: write\n    steps:\n      # Checkout your code repository to scan\n      - uses: actions/checkout@v3\n      # Run analyzers\n      - name: Run Microsoft Security DevOps Analysis\n        uses: microsoft/security-devops-action@latest\n        id: msdo\n        with:\n          # config: string. Optional. A file path to an MSDO configuration file (\u0026#39;*.gdnconfig\u0026#39;).\n          # policy: \u0026#39;GitHub\u0026#39; | \u0026#39;microsoft\u0026#39; | \u0026#39;none\u0026#39;. Optional. The name of a well-known Microsoft policy.\n          # categories: string. Optional. A comma-separated list of analyzer categories to run.\n          # Values: \u0026#39;secrets\u0026#39;, \u0026#39;code\u0026#39;, \u0026#39;artifacts\u0026#39;, \u0026#39;IaC\u0026#39;, \u0026#39;containers\u0026#39;. Example: \u0026#39;IaC,secrets\u0026#39;. Defaults to all.\n          # languages: string. Optional. A comma-separated list of languages to analyze.\n          # Example: \u0026#39;javascript,typescript\u0026#39;. Defaults to all.\n          # tools: string. Optional. A comma-separated list of analyzer tools to run.\n          # Values: \u0026#39;bandit\u0026#39;, \u0026#39;binskim\u0026#39;, \u0026#39;eslint\u0026#39;, \u0026#39;templateanalyzer\u0026#39;, \u0026#39;terrascan\u0026#39;, \u0026#39;trivy\u0026#39;.\n      # Upload alerts to the Security tab\n      - name: Upload alerts to Security tab\n        uses: github/codeql-action/upload-sarif@v2\n        with:\n          sarif_file: ${{ steps.msdo.outputs.sarifFile }}\n      # Upload alerts file as a workflow artifact\n      - name: Upload alerts file as a workflow artifact\n        uses: actions/upload-artifact@v3\n        with:\n          name: alerts\n          path: ${{ steps.msdo.outputs.sarifFile }}\nInvestigate Findings Once you have run the pipeline task, the results should appear quickly in the GitHub security tab of your repo under \u0026lsquo;Code scanning\u0026rsquo;.\nFor the reporting to be visible in Defender it will take some time for the results to be populated under \u0026lsquo;Findings\u0026rsquo;, which includes all code, IaC, secret and dependency findings together.\nYou can then click into the findings to get a complete list of recommendations for remediation.\nYou are now ready to remediate these findings in your code! So how do we do that? 
Well, look out for my next blog post which will cover these steps :-)\nSummary In summary, DevOps Security in Defender for Cloud is available for organisations to help identify and remediate issues in code before they are deployed to your environments. This provides a key security monitoring and remediation capability for code which Microsoft was previously lacking. It will be interesting to follow the development of this feature moving forward.\nOriginally published on Medium on January 24, 2024\n","permalink":"http://www.forsh.dev/posts/devops-security-defender-for-cloud/","summary":"\u003cp\u003eFollowing Microsoft Ignite in Nov 2023, Defender for DevOps has now become DevOps security. In practice this means that a lot of the features which were previously in public preview are now generally available.\u003c/p\u003e\n\u003cp\u003eBut first… what is DevOps security in Defender for Cloud?\u003c/p\u003e\n\u003ch2 id=\"devops-security\"\u003eDevOps Security\u003c/h2\u003e\n\u003cp\u003eThis feature of Defender for Cloud provides end-to-end security for code-based deployments from the well-known major source code repositories available in the market. These can be selected from the environment settings of the DevOps security blade in Defender for Cloud with GitLab being the most recent addition to the environment list.\u003c/p\u003e","title":"DevOps Security with Microsoft Defender for Cloud"},{"content":"The Challenge Ok, so the challenge is end-to-end infrastructure security with IaC using only the Microsoft technology stack. Do I really need any third-party tooling, or does Microsoft have the products to support securing the entire DevSecOps process?\nBicep Let\u0026rsquo;s start with your IaC configuration files. Microsoft launched its own IaC declarative language tool called Bicep on August 31, 2020. It is a domain-specific language (DSL) for infrastructure deployments in Azure.\nState So how does Bicep handle state? 
Bicep is a set of configuration files similar to most other IaC tools, but one of the advantages it has over its main rival is that it is stateless in nature. The state is what is actually deployed, and Bicep runs a differential comparison of the actual configuration against the configuration files being executed. Why is this an advantage from a security perspective? A stateful tool such as Terraform requires a separate state file, which needs careful planning when it comes to security. The state should be stored securely and encrypted, as it contains sensitive information in clear text. This file is critical to infrastructure deployments and is a valuable asset for any threat actor, so a secure storage, authentication and access design is required. Bicep, however, reads directly from Resource Manager and converts its templates automatically into JSON format before deploying resources.\nParameters Looking at the code, what type of secure coding practices does Bicep support then? Well, firstly, marking string or object parameters as secure is a start, as these values are not saved to the deployment history and aren\u0026rsquo;t logged.\n@secure()\nparam demoPassword string\n\n@secure()\nparam demoSecretObject object\nSecrets The most important security coding practice that you need to follow is placing secrets in a password vault. Microsoft has its own solution that supports this: Azure Key Vault. Bicep uses the getSecret function to return a secret from Key Vault once the prerequisites are in place (Key Vault secret with Bicep).\nNote: GitHub and Azure DevOps also support secret storage for actions/pipelines, which we will cover.\nWhat about ARM templates, I hear you ask. Well, you can pass secret parameters to ARM templates for sure! 
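A minimal sketch of that getSecret pattern in Bicep (the vault name kv-demo, the secret name sqlAdminPassword, and the module file sql.bicep are all hypothetical; getSecret can only be used to assign a @secure() module parameter):

```bicep
// main.bicep -- pass a Key Vault secret into a module's @secure() parameter.
// Reference an existing vault; the name and API version are illustrative.
resource kv 'Microsoft.KeyVault/vaults@2023-07-01' existing = {
  name: 'kv-demo'
}

module sql 'sql.bicep' = {
  name: 'sqlDeploy'
  params: {
    // The secret is resolved at deployment time and is not written to
    // deployment history or logs, because the target parameter is @secure().
    adminPassword: kv.getSecret('sqlAdminPassword')
  }
}
```

Note that the referenced vault also needs enabledForTemplateDeployment set to true so that Resource Manager is allowed to read the secret during deployment.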
But in most cases you should be using Bicep, which calls ARM anyway (those JSON files I mentioned earlier), so let\u0026rsquo;s stick with that since it\u0026rsquo;s declarative and in a human-readable format.\nWhat else can we use as a security tool for secure coding practices then? Copilot, of course!\nGitHub Copilot \u0026amp; chat (beta) GitHub Copilot is an AI pair programmer that offers autocomplete-style suggestions as you code, which makes it hugely beneficial from a security perspective. Inline suggestions will not only fix code formatting but also make suggestions around secure code.\nThe recently released GitHub Copilot chat beta now takes it to the next level, so that you can ask Copilot to suggest recommendations for securing code vulnerabilities. Infrastructure as code vulnerabilities can be analysed within chat and recommendations given so that a developer can remediate issues before the code goes to pull request. This tool will only get better over time based on the inputs of potentially millions of developers worldwide, making it an invaluable aid for secure coding.\nSo that\u0026rsquo;s our IaC tool covered, as well as Azure Key Vault for secret storage and some help from Copilot. Where do we put the code?\nVersion Control System Placing the config files in a VCS is a given. Luckily, Microsoft has two options to choose from: GitHub and Azure DevOps.\nGitHub Advanced Security Microsoft acquired GitHub in 2018, and it is now seen as a preferred platform over Azure DevOps, at least for IaC deployments. 
GitHub has a feature called Advanced Security, which is available for Enterprise accounts, with some features also available to public repositories (further info can be found here: GitHub plans).\nA GitHub Advanced Security license includes code scanning, secret scanning and dependency reviews, providing a next level of security features that are critical in today\u0026rsquo;s security landscape, especially for those of us who are new to coding in the infrastructure world. GitHub provides starter workflows with ready-made security features to get started with too!\nOther security features Security policies are available to allow for reporting of security vulnerabilities in code by adding a SECURITY.md file to the root of your repo.\nAdditionally, GitHub has a security advisory database with known vulnerabilities and malware, grouped in two categories: GitHub-reviewed advisories and unreviewed advisories.\nFinally, it supports Entra ID SSO integration for user and role management, and there are security hardening features available to organisations on GitHub (several 2FA options, for example) as well as a host of security settings to control, log and monitor access to repositories, with, of course, its own secret storage for actions.\nAzure DevOps Out of the box, Azure DevOps security features are restricted to permissions, access and security grouping, with some pipeline secret storage available. For features such as code scanning, you would need to use the GitHub Advanced Security integration feature, which has recently become generally available. 
This extends the GitHub Advanced Security features to Azure Repos; however, it is worth noting that there is no plan for this to be made available to the DevOps Server edition.\nAdditionally, a GitHub Advanced Security license is required for this integration to work.\nBoth these version control systems ship with the standard branching, pull request and peer review processes that you find in most VCS systems and that are fundamental to good DevSecOps practices.\nPipelines/actions So you\u0026rsquo;re now ready to deploy code using a pipeline or, as GitHub calls them, a GitHub Action. GitHub has published guidance on a host of security hardening considerations for GitHub Actions. Secrets management is the main consideration, but other important considerations include risk assessment and using OpenID Connect when authenticating to Azure: Configuring OpenID Connect in Azure.\nBefore you run your pipeline or action you should run some Bicep validation checks:\nLinting - Linting validation will check for things like unused parameters, unused variables, interpolation, secure parameters and more. You can tell Bicep to verify your file by manually building the Bicep file through the Bicep CLI: bicep build main.bicep\nValidate - You can use the AzureResourceManagerTemplateDeployment task to submit a Bicep file for preflight validation.\nWhat-if - Previews the changes that will happen. The what-if operation doesn\u0026rsquo;t make any changes to existing resources. Instead, it predicts the changes if the specified Bicep file is deployed. 
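Sketched from the command line, the three checks above might look like this (assuming the Bicep CLI and Azure CLI are installed, and that main.bicep and a resource group named rg-demo, a placeholder, exist):

```shell
# Lint: compile the Bicep file locally; linter warnings are printed to the console
bicep build main.bicep

# Preflight validation: submit the file to Resource Manager without deploying
az deployment group validate --resource-group rg-demo --template-file main.bicep

# What-if: predict the changes a deployment would make, without applying them
az deployment group what-if --resource-group rg-demo --template-file main.bicep
```

The same checks can be added as early pipeline or workflow steps, so a failed lint or validation stops the run before anything is deployed.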
Defender for DevOps So you\u0026rsquo;ve created the code, put it in GitHub or Azure DevOps, used GitHub Copilot to help quality-check before you create a pull request, and validated the deployment, so you\u0026rsquo;re ready to deploy with your pipelines or GitHub Actions.\nWe now need to check the code via an analyser and also ensure that your security team is aware of any security vulnerabilities that might exist in your repository. Luckily, Microsoft has an answer to that too: Defender for DevOps.\nDefender for DevOps provides comprehensive visibility, posture management, and threat protection across multi-cloud environments including Azure. Initial setup involves connecting the service to a GitHub org or repository as well as enabling the GitHub Advanced Security features on the target (which we mentioned earlier).\nConfigure the Microsoft Security GitHub action This action incorporates static analysis tools into a GitHub action to be run on your IaC code repository. It has a host of open-source tools for analysis, but in our case it\u0026rsquo;s Template Analyzer that covers analysis of Bicep configuration files.\nThe steps involve simply incorporating this sample action workflow into your actions as a prerequisite for code deployment. With all the GitHub Advanced Security features enabled, this can report security vulnerabilities to GitHub under the Security tab and auto-create pull requests for remediation. Additionally, this data is reported back to Defender for DevOps for visibility to the security teams for remediation and reporting.\nIt\u0026rsquo;s also worth mentioning that there is an equivalent for Azure DevOps pipelines too. 
Marvellous!\nSummary In summary, Microsoft has matured its IaC and DevSecOps tooling over the past few years, and most recently GitHub Advanced Security, GitHub Copilot and Defender for DevOps are introducing even more advanced capabilities in this space to compete with HashiCorp, Sonar and CyberArk, among others.\nAs more sophisticated supply chain threats emerge, it\u0026rsquo;s important to have full end-to-end security coverage that continuously evolves over time. Microsoft is building its capability in this area, which is exciting to see and will benefit its customers who need to implement DevSecOps with the latest emerging security tools to protect their environments.\n","permalink":"http://www.forsh.dev/posts/securing-iac-microsoft-stack/","summary":"\u003ch2 id=\"the-challenge\"\u003eThe Challenge\u003c/h2\u003e\n\u003cp\u003eOk so the challenge is end-to-end infrastructure security with IaC using only the Microsoft technology stack. Do I really need any third party tooling or does Microsoft have the products to support securing the entire DevSecOps process?\u003c/p\u003e\n\u003ch2 id=\"bicep\"\u003eBicep\u003c/h2\u003e\n\u003cp\u003eLets start with your IaC configuration files. Microsoft launched its own IaC declarative languages tool called \u003ca href=\"https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/overview?tabs=bicep\"\u003eBicep\u003c/a\u003e on August 31, 2020. It is a domain specific language (DSL) for infrastructure deployments in Azure.\u003c/p\u003e","title":"Securing infrastructure as code (IaC) with the Microsoft technology stack"},{"content":"Craig Forshaw Cloud architect \u0026amp; Microsoft MVP who enjoys working with Azure security, DevOps, GitHub Advanced Security, and infrastructure as code. 
Always learning, always sharing.\n💼 Senior Cloud Solutions Architect at Atea 🏆 Microsoft MVP — Cloud Security 🌍 Based in Norway 🧭 Focus areas: Azure, Microsoft Security, DevOps, Infrastructure as Code Links 🐙 GitHub: @craigforshaw 💼 LinkedIn: in/craig4shaw 🔗 Sessionize: craig4shaw83 More If you’d like to collaborate, speak, or have questions—reach out on LinkedIn. You can also find my talks and demos in my GitHub repos.\n","permalink":"http://www.forsh.dev/about/","summary":"\u003ch2 id=\"craig-forshaw\"\u003eCraig Forshaw\u003c/h2\u003e\n\u003cp\u003eCloud architect \u0026amp; Microsoft MVP who enjoys working with Azure security, DevOps, GitHub Advanced Security, and infrastructure as code. Always learning, always sharing.\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003e💼 Senior Cloud Solutions Architect at \u003ca href=\"https://www.atea.no/\"\u003eAtea\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e🏆 Microsoft MVP — Cloud Security\u003c/li\u003e\n\u003cli\u003e🌍 Based in Norway\u003c/li\u003e\n\u003cli\u003e🧭 Focus areas: Azure, Microsoft Security, DevOps, Infrastructure as Code\u003c/li\u003e\n\u003c/ul\u003e\n\u003ch2 id=\"links\"\u003eLinks\u003c/h2\u003e\n\u003cul\u003e\n\u003cli\u003e🐙 GitHub: \u003ca href=\"https://github.com/craigforshaw\"\u003e@craigforshaw\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e💼 LinkedIn: \u003ca href=\"https://www.linkedin.com/in/craig4shaw\"\u003ein/craig4shaw\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e🔗 Sessionize: \u003ca href=\"https://sessionize.com/craig4shaw83\"\u003ecraig4shaw83\u003c/a\u003e\u003c/li\u003e\n\u003c/ul\u003e\n\u003ch2 id=\"more\"\u003eMore\u003c/h2\u003e\n\u003cp\u003eIf you’d like to collaborate, speak, or have questions—reach out on LinkedIn. You can also find my talks and demos in my GitHub repos.\u003c/p\u003e","title":"About"}]