Tag: DevOps

  • Terraform Made Simple, your first working configuration from install to Azure access

    Introduction
    Infrastructure as Code is not optional anymore. Terraform gives you a declarative way to build, modify, and destroy cloud resources cleanly. This tutorial shows exactly how to install Terraform, create your first configuration, and connect it to Azure without affecting your company’s production environment. I used these steps to rebuild my own skills after leaving California and stepping into Utah’s quiet season of learning.


    Step 1
    Install Terraform using Winget

    1. Open PowerShell as admin
    2. Run the installer
      winget install HashiCorp.Terraform --source winget
    3. Restart your PowerShell window
    4. Verify the installation
      terraform -version

    You should see something like
    Terraform v1.14.0


    Step 2
    Create your Terraform workspace

    1. Create a folder
      mkdir C:\terraform\test1
    2. Go inside the folder
      cd C:\terraform\test1
    3. Create a new file
      New-Item main.tf -ItemType File

    Leave the file empty for now. Terraform just needs to see that a configuration file exists.


    Step 3
    Write your first Terraform configuration

    Open main.tf and paste this:

    provider "azurerm" {
      features {}
    }


    This block creates nothing yet; it only declares the AzureRM provider so Terraform knows which platform to talk to.

    The goal is simply to connect Terraform to Azure safely.

    Save the file.
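    Optionally, you can also pin the provider version so future runs stay reproducible. A minimal sketch (the version constraint is only an example; adjust it to the AzureRM release you want to track):

    terraform {
      required_providers {
        azurerm = {
          source  = "hashicorp/azurerm"
          version = "~> 4.0"
        }
      }
    }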


    Step 4
    Initialize Terraform

    Run
    terraform init

    This downloads the AzureRM provider and sets up your working directory.

    You should see
    Terraform has been successfully initialized


    Step 5
    Install the Azure CLI

    Terraform connects to Azure using your Azure CLI login. Install it with:

    winget install Microsoft.AzureCLI

    Verify it
    az --version


    Step 6
    Log into Azure

    Run
    az login

    A browser opens. Select your Azure account.

    Important note
    If the login lands in your company’s production subscription, stop here and do not run terraform apply.
    Running terraform plan is still safe because it does not make changes.


    Step 7
    Check your Azure subscription

    az account show

    This confirms who you are logged in as and which subscription Terraform will use.
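    If the wrong subscription is active, switch it before running Terraform. A small sketch (the subscription name below is just a placeholder):

    az account set --subscription "My-Sandbox-Subscription"
    az account show -o table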


    Step 8
    Run your first Terraform plan

    terraform plan

    This reads your main.tf and checks for any required changes.
    Since your config is empty, the output will say something like:
    No changes. Your infrastructure matches the configuration.


    Step 9
    Useful Azure CLI commands for Cloud Engineers

    Check all resource groups
    az group list -o table

    Check all VMs
    az vm list -o table

    Check storage accounts
    az storage account list -o table

    Check virtual networks
    az network vnet list -o table

    Check VM status
    az vm get-instance-view --name VMNAME --resource-group RGNAME --query "instanceView.statuses[1].displayStatus"

    Check Azure AD users
    az ad user list --filter "accountEnabled eq true" -o table

    Check your role assignments
    az role assignment list --assignee <your UPN> -o table

    These commands demonstrate that you are comfortable with both Terraform and the Azure CLI.


    Step 10
    Can Terraform check Defender?

    Terraform itself does not “check” Defender, but you can manage Defender settings as resources.

    For example:

    azurerm_security_center_contact
    azurerm_security_center_subscription_pricing
    azurerm_security_center_assessment
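    For instance, enabling Defender for Servers at the subscription level looks roughly like this (a minimal sketch; the tier and resource_type values depend on which Defender plan you want):

    resource "azurerm_security_center_subscription_pricing" "defender_servers" {
      tier          = "Standard"
      resource_type = "VirtualMachines"
    }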

    Meaning
    Terraform is for configuration
    Azure CLI is for inspection
    Graph / PowerShell is for deep security reporting

    If you need real Defender reporting, use:

    Connect-MgGraph
    Get-MgSecurityAlert
    Get-MgSecuritySecureScore

    You already know these.
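    A quick sketch of that reporting flow (the scope name and selected properties are assumptions; adjust them to your tenant's permissions):

    Connect-MgGraph -Scopes "SecurityEvents.Read.All"
    Get-MgSecurityAlert -Top 5 | Select-Object Title, Severity, Status
    Get-MgSecuritySecureScore -Top 1 | Select-Object CurrentScore, MaxScore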


    Step 11
    Cleaning up safely

    Since we did not deploy anything, no cleanup is required.

    If you later create real resources, destroy them with
    terraform destroy


    Final thoughts
    Terraform is one of the most powerful tools in cloud engineering. Once you know how to initialize it, authenticate with Azure, and run plans, you are already ahead of many engineers who feel overwhelmed by IaC. Hiring managers will immediately see that you are not just an Exchange guy or a VMware guy. You are becoming a modern DevOps cloud engineer who can manage infrastructure in code.


    © 2012–2025 Jet Mariano. All rights reserved.
    For usage terms, please see the Legal Disclaimer.

  • Terraform for M365 & Azure (With Real Examples)

    Title:

    Terraform for M365 and Azure — Infrastructure-as-Code Made Simple

    Introduction

    Terraform is one of the most powerful tools for managing cloud environments because it lets you declare what you want and Azure builds it. No guessing. No clicking. No forgetting what you changed.

    Even if M365 doesn’t support Terraform natively for all workloads, you can still automate Azure AD, Conditional Access, Groups, SPNs, Networking, Key Vault, and App Registrations through the Microsoft Graph provider.

    I used IaC principles while supporting Church systems — Terraform makes environments repeatable, auditable, and consistent.


    1. Installing Terraform

    choco install terraform
    

    2. Azure Login Block

    provider "azurerm" {
      features {}
    }
    
    provider "azuread" {
    }
    

    3. Creating an Azure Resource Group

    resource "azurerm_resource_group" "rg1" {
      name     = "M365AutomationRG"
      location = "WestUS2"
    }
    

    4. Creating an Azure AD Group

    resource "azuread_group" "security_group" {
      display_name     = "M365-Automation-Admins"
      security_enabled = true
    }
    

    5. Creating an App Registration + Secret

    resource "azuread_application" "app" {
      display_name = "Terraform-Automation-App"
    }
    
    resource "azuread_service_principal" "sp" {
      application_id = azuread_application.app.application_id
    }
    
    resource "azuread_application_password" "sp_secret" {
      application_object_id = azuread_application.app.id
      display_name          = "secret1"
    }
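    To actually consume these credentials in automation, you can surface them as outputs. A minimal sketch (the output names are just examples; the secret is marked sensitive so it stays out of console output):

    output "automation_app_client_id" {
      value = azuread_application.app.application_id
    }

    output "automation_app_secret" {
      value     = azuread_application_password.sp_secret.value
      sensitive = true
    }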
    

    6. Conditional Access via Terraform (Yes, possible!)

    Uses the Microsoft Graph Terraform provider.

    resource "msgraph_conditional_access_policy" "block_non_us" {
      display_name = "Block Non-US IP"
      state        = "enabled"
    
      conditions {
        users {
          include_users = ["all"]
        }
        locations {
          include_locations = ["All"]
          exclude_locations = ["US"]
        }
      }
    
      grant_controls {
        operator         = "OR"
        built_in_controls = ["block"]
      }
    }
    

    7. Create an M365 Group (Unified Group)

    resource "msgraph_group" "m365_group" {
      display_name     = "Engineering Team"
      mail_nickname    = "engineering"
      security_enabled = false
      mail_enabled     = true
      group_type       = ["Unified"]
    }
    

    8. Create Azure Key Vault

    resource "azurerm_key_vault" "kv" {
      name                = "m365-keyvault-prod"
      location            = azurerm_resource_group.rg1.location
      resource_group_name = azurerm_resource_group.rg1.name
      tenant_id           = data.azuread_client_config.current.tenant_id
      sku_name            = "standard"
    }
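    With the vault in place, you can store the app secret created in section 5 instead of pasting it elsewhere. A minimal sketch (the secret name is just an example, and your identity needs an access policy or RBAC role that allows setting secrets):

    resource "azurerm_key_vault_secret" "automation_app_secret" {
      name         = "terraform-automation-app-secret"
      value        = azuread_application_password.sp_secret.value
      key_vault_id = azurerm_key_vault.kv.id
    }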
    

    Conclusion

    Terraform is the “blueprint” of modern cloud administration.
    Clicking creates inconsistencies — IaC creates reliable, repeatable deployments.


    © 2012–2025 Jet Mariano. All rights reserved.
    For usage terms, please see the Legal Disclaimer.

  • 💻 Path to Become a Developer (Ivy Falls)

    From coding late nights to building real solutions — proof that persistence pays off.
    DeveloperJourney #IvyFalls #NoBandAidFix

    Introduction: The Path Is the Practice

    My journey to development and infrastructure followed the same rhythm — discipline by day, learning by night.
    While working full-time at All Electronics Corporation in Van Nuys (1990–1995), I woke at 4 A.M. to catch two LA Metro buses from Western and 3rd Street to my 6:30 A.M. shift, then sometimes worked evenings at the Taco Bell drive-thru in Glendale.

    I wasn’t chasing titles; I was chasing understanding. At All Electronics, I became obsessed with the Integrated Circuit (IC) — the heartbeat of every computer. There was no Internet back then — only library books and endless curiosity. I crashed my own PCs, rebuilt them, and soon began fixing computers for free for anyone who needed help.

    Back then, I used to dream of a day when I wouldn’t have to wait for the bus in the rain just to get home. Years later, those same dreams became reality — not through luck, but through faith, discipline, and persistence. The rides changed — from buses to a BMW, an Audi, and now a Tesla — but what never changed was the purpose: to keep moving forward.

    Those early mornings and late nights opened the door to my first IT role at USC as a PC Specialist, then to GTE (now Verizon), Aerospace, and eventually to my own IT consulting business serving clients large and small across California and beyond.


    Season of Refinement

    While working full-time at USC, I entered what I call my season of refinement.
    By day I supported campus systems and users; by night I was a full-time student at Los Angeles City College (LACC) and a weekend warrior at DeVry University, studying Management in Telecommunications.

    It was during this time that Microsoft introduced the MCSE (Microsoft Certified Systems Engineer) program.
    One of my LACC professors encouraged me to earn it, saying, “Once you have that license, companies will chase you.”
    He was right — that MCSE became my ticket to GTE, my first step into enterprise-scale IT.

    My tenure at GTE was brief because Aerospace came calling with a six-figure offer just before Y2K — an opportunity too great to refuse.
    After Aerospace, I founded my own consulting firm — Ahead InfoTech (AIT) — and entered what I now call my twelve years of plenty.

    One of my earliest clients, USC Perinatal Group, asked me to design and implement a secure LAN/WAN connecting satellite offices across major hospitals including California Hospital Medical Center, Saint Joseph of Burbank and Mission Hills, and Hollywood Presbyterian Hospital.
    We used T1 lines with CSU/DSU units and Fortinet firewalls; I supplied every workstation and server under my own AIT brand.

    Through that success I was referred to additional projects for Tarzana and San Gabriel Perinatal Groups, linked by dedicated frame-relay circuits — early-era networking at its finest.
    Momentum carried me to new partnerships with The Claremont Colleges and the City of West Covina, where I served as Senior Consultant handling forensic and SMTP (email) engineering.

    Word spread further. An attorney client introduced me to an opportunity in American Samoa to help design and build a regional ISP, and later to a contract with Sanyo Philippines.
    During this period Fortinet was still new, and I became one of its early resellers. I preferred building AIT servers and workstations from the ground up rather than reselling mass-produced systems.
    DSL was just emerging, yet most clients relied on dedicated T1 lines — real hands-on networking that demanded patience and precision.

    Those were the twelve years of plenty — projects stretching from Los Angeles hospitals to overseas data links.
    By the time AWS launched in 2006 and Azure in 2010, I was already managing distributed networks and data replication.

    When I returned to Corporate America, my first full-time role was at Payforward, where I led the On-Prem to AWS migration, building multi-region environments across US-East (1a and 1b) and US-West, complete with VPCs, subnets, IAM policies, and full cloud security.
    That’s when I earned my AWS certifications, completing a journey that had begun with cables and consoles and matured in the cloud.

    Education, experience, and certification merged into one lesson:
    Discipline comes first. Validation follows.
    Degrees and credentials were never my starting line — they became the icing on the cake of years of practice, service, and faith.


    My Philosophy: Code Like a Craftsman

    Photography taught me patience. Martial Arts taught me form. IT taught me precision.
    All three share one secret: the art lies in repetition with awareness.

    As Ansel Adams said:

    “When words become unclear, I shall focus with photographs. When images become inadequate, I shall be content with silence.”

    Coding feels the same. When logic becomes unclear, I focus. When code seems inadequate, I find peace in understanding.


    The Developer Path

    1️⃣ Core Web Skills

    HTML | CSS | JavaScript (ES6+) | Git | GitHub
    Learn Free: freeCodeCamp | Traversy Media

    2️⃣ Frontend Framework

    Master React or Next.js.
    Courses: Max Schwarzmüller Udemy | Colt Steele Bootcamp | Jonas Schmedtmann JS Course

    3️⃣ Backend & APIs

    Choose Node.js or Python (Flask / FastAPI).
    Watch: Corey Schafer | Course: Angela Yu 100 Days of Code

    4️⃣ DevOps for Developers

    Learn Docker, GitHub Actions, and Cloud Deployments.
    Watch: TechWorld with Nana

    5️⃣ Labs & Simulators

    No hardware? Use Whizlabs Labs | Replit | Microsoft Sandboxes

    6️⃣ Portfolio

    Build three apps (CRUD, API, SPA) + README + screenshots + a short blog for each.


    Final Reflection

    From library nights in Koreatown to pushing code in the cloud, this path proves that curiosity and consistency still change lives.
    Keep learning, keep building, and remember — every keystroke is one more kick toward mastery.
    This blog will continue to grow as technology changes — come back often and build along with me.


    🪶 Closing Note

    I share this story not to boast but to inspire those still discovering their own path in technology.
    Everything here is told from personal experience and memory; if a date or detail differs from official records, it’s unintentional.
    I’m grateful for mentors like my LACC professor, who once told me to look up a name not yet famous — Bill Gates — and earn my MCSE + I.
    He was right: that single decision opened countless doors.

    I don’t claim to know everything; I simply kept learning, serving, and sharing.
    My living witnesses are my son, my younger brother, and friends who once worked with me and now thrive in IT.
    After all these years, I’m still standing — doing what I love most: helping people through Information Technology.


    ⚖️ Legal Disclaimer

    All events and company names mentioned are described from personal recollection for educational and inspirational purposes only. Any factual inaccuracies are unintentional. Opinions expressed are my own and do not represent any past or current employer.

    © 2012–2025 Jet Mariano. All rights reserved.
    For usage terms, please see the Legal Disclaimer.

  • 🌥️ The Cloud Above Us

    PIMCO (Newport Beach HQ, CA) 🌍 — Global financial services supporting regions in NA, EMEA, APAC.
    Church (Riverton Office Building, UT) ⛪ — Worldwide infrastructure with 200k employees and over 80k missionaries.
    Monster Energy (Corona HQ, CA) ⚡ — Global enterprise IT operations across NA, EMEA, APAC.
    City National Bank (Downtown LA, CA) 🏙️ — U.S. banking systems at scale.

    A journey across scales: national (CNB), global (PIMCO & Monster Energy), and worldwide (The Church).


    Every IT career tells a story, and mine has moved through three different scales of impact:

    Company-Level Foundations → At PayForward, I migrated an entire OnPrem environment into AWS. That meant setting up VPCs, building HA Exchange clusters with load balancers, and proving the power of cloud for a fast-moving startup.

    Regional / Global Scale → At Monster Energy and PIMCO, the work stretched across North America, EMEA, and APAC. The systems never slept. VMware clusters and M365 tenants had to function as one, even though users were scattered across time zones and continents.

    Worldwide Reach → At the Church, the scale expanded beyond regions. Over 200,000 employees and over 80,000 missionaries, connected by systems that had to reach every corner of the globe, demanded both technical precision and spiritual responsibility.

    This journey shows that the “cloud above us” isn’t just AWS, Azure, or GCP — it’s the ability to design, secure, and sustain systems at every possible scale.

    A colleague once told me: “Automate, or eliminate.” In IT, that isn’t just a clever saying — it’s survival. At the scale of hundreds or even thousands of VMs, EC2 instances, or mailboxes, doing things manually is not just unrealistic — it’s risky. What automation can finish in under 10 minutes might take days or weeks by hand, and even then would be prone to errors.

    That’s why Python, PowerShell, Bash, and automation frameworks became part of my daily toolkit. Not to flaunt, but because without automation, no single engineer could handle the demands of environments as large as PIMCO, Monster Energy, or the Church.


    Snippet 1: AWS (My PayForward Days)

    import boto3
    
    # Connect to AWS S3
    s3 = boto3.client('s3')
    
    # List buckets
    buckets = s3.list_buckets()
    print("Your AWS buckets:")
    for bucket in buckets['Buckets']:
        print(f"  {bucket['Name']}")
    

    From racks of servers to a few lines of Python—that’s the power of AWS.

    Snippet 2: PowerShell + Azure (My Church Years, CNB)

    Connect-AzAccount
    Get-AzResourceGroup | Select ResourceGroupName, Location
    

    One line, and you can see every Azure resource group spread across the world. A task that once required data center visits and clipboards is now just a command away.

    Snippet 3: PHP + GCP (Expanding Horizons)

    <?php
    // Load the Composer autoloader so the Google Cloud client library is available
    require __DIR__ . '/vendor/autoload.php';

    use Google\Cloud\Storage\StorageClient;

    $storage = new StorageClient([
        'keyFilePath' => 'my-service-account.json'
    ]);

    $buckets = $storage->buckets();

    foreach ($buckets as $bucket) {
        echo $bucket->name() . PHP_EOL;
    }
    

    Snippet 4: VMware + M365 (Monster Energy, PIMCO, and Beyond)

    # Connect to vCenter and list VMs across data centers
    Connect-VIServer -Server vcenter.global.company.com -User admin -Password pass
    Get-VM | Select Name, PowerState, VMHost, Folder
    
    # Quick check of licensed users in M365 (global tenants)
    Connect-MgGraph -Scopes "User.Read.All"
    Get-MgUser -All -Property DisplayName, UserPrincipalName, UsageLocation |
        Group-Object UsageLocation |
        Select Name, Count
    

    One script, and suddenly you’re seeing footprints of users spread across the globe — NA, EMEA, APAC, or even worldwide. That’s the reality of modern IT infrastructure.


    The “cloud above us” is both a literal technology — AWS, Azure, and GCP that I’ve worked across — and a metaphor. It represents resilience, scalability, and unseen support. Just as automation carries workloads we could never handle by hand, life has storms we cannot carry alone.

    From startups making their first move to the cloud, to global financial institutions, to worldwide organizations with hundreds of thousands of users, the lesson is the same: we are not meant to fight every battle manually.

    We are given tools, teammates, and even unseen strength from above to keep moving forward. The same way a script can manage thousands of servers or accounts without error, trust and preparation help us navigate the storms of life with less fear.

    ☁️ Above every storm, there’s always a cloud carrying potential. And above that cloud, always light waiting to break through.

    Before my cloud journey, I also spent nine years in forensic IT supporting law enforcement — a grounding reminder that technology isn’t only about systems and scale, but about accountability and truth.

    © 2012–2025 Jet Mariano. All rights reserved.
    For usage terms, please see the Legal Disclaimer.
