Getting Started with Microsoft's PowerShell by@jhash


Shekhar Jha

Focus on enterprise security with background in IAM & cloud security

PowerShell continues to be an important part of the Microsoft ecosystem. Most of this material is based on Microsoft documentation, a few blogs, and Google and Stack Overflow searches. Please note that the focus of this article is on building complex packages and tools rather than single-file scripts.


Modules are one of the fundamental building blocks of packaging in PowerShell. Most teams will eventually create a module, either to collect a set of common utility functions or to share a finished component with larger teams through the PowerShell Gallery. Alternatively, modules can be imported locally (see the bootstrap section below) and used as part of scripts.

PowerShell modules should follow a standard structure and configuration to ensure consistent outcomes and reduce potential issues. The following directory structure provides the right place for each component involved.
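Where an illustration is not available, the layout can be sketched in text. The module name below (Sample.Common-Utils) is illustrative, taken from the examples later in this article:

```
Sample.Common-Utils\              # root directory, named to match the module
    Sample.Common-Utils.psd1      # module manifest
    Sample.Common-Utils.psm1      # root module; dot-sources Public and Private
    Public\                       # exported functions, one .ps1 per function
    Private\                      # internal helper functions
    bin\                          # DLLs and other components used by the module
```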


The recommended module directory structure aligns with the standard guidance available for writing modules. The root directory should be named to uniquely identify the module and must match the module name. The PSD1 manifest is a fairly standard file, generated once and then updated to match the specific module. The “RootModule” and “ModuleList” entries in the PSD1 file point to the “<module name>.psm1” file that contains the basic module definition.
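A manifest wired up this way can be generated with New-ModuleManifest and then edited by hand; the sketch below uses the illustrative Sample.Common-Utils name and placeholder author/version values, not anything prescribed by the article:

```powershell
# Create the module root directory (illustrative name) and generate a manifest
# whose RootModule and ModuleList point at the .psm1 file described above.
New-Item -ItemType Directory -Path .\Sample.Common-Utils -Force | Out-Null
New-ModuleManifest -Path .\Sample.Common-Utils\Sample.Common-Utils.psd1 `
    -RootModule 'Sample.Common-Utils.psm1' `
    -ModuleList @('Sample.Common-Utils.psm1') `
    -Author 'Your Team' `
    -ModuleVersion '1.0.0'
```

The generated PSD1 can then be refined (exported functions, required modules, and so on) as the module grows.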

The Public folder contains script files with the functions and classes that should be exported. The Private folder should be used to store helper functions. The bin folder can be used to store DLLs or other components used by the module.

The psm1 file below contains some additions to the reference available here.

#Get public and private function definition files.
$Public  = @( Get-ChildItem -Path $PSScriptRoot\Public\*.ps1 -ErrorAction SilentlyContinue )
$Private = @( Get-ChildItem -Path $PSScriptRoot\Private\*.ps1 -ErrorAction SilentlyContinue )

#Dot source the files
Foreach ($import in @($Public + $Private)) {
    Try {
        . $import.FullName
    } Catch {
        Write-Error -Message "Failed to import function $($import.FullName): $_"
    }
}

Set-Variable -Name SCRIPT_PATH -Value (Split-Path (Resolve-Path $MyInvocation.MyCommand.Path)) -Scope local
Set-Variable -Name FULL_SCRIPT_PATH -Value (Resolve-Path $MyInvocation.MyCommand.Path) -Scope local
Set-Variable -Name CURRENT_PATH -Value ((Get-Location).Path) -Scope local
Set-Variable -Name LOGGER -Value $null -Scope global
# Export functions
Export-ModuleMember -Function <Function name>

Export functions from modules instead of classes where possible, to avoid the reload issues associated with classes.

All functions defined in scripts should contain the basic documentation tags like .SYNOPSIS, .DESCRIPTION, .PARAMETER(s), .OUTPUTS, and .EXAMPLE. The following sample can be used as a guide for creating functions.

<#
  .SYNOPSIS
    This contains all the sample utility functions
  .DESCRIPTION
    Sample utility functions should be used across all sample applications
#>
Function Run-SampleFunction {
  <#
    .SYNOPSIS
      Quick function description
    .DESCRIPTION
      This is a very long description about the function below
    .PARAMETER OneParameter
      Description of parameter including allowed values.
    .PARAMETER TwoParameter
      Another description
    .OUTPUTS
      Exit codes:
      0 : Success
      1 : Unknown error
      2 : More specific error code
    .EXAMPLE
      Run-SampleFunction -OneParameter "some value"
  #>
  [CmdletBinding()]
  param (
    [Parameter(Mandatory = $True, Position = 0)]
    [string] $OneParameter,
    [Parameter(Mandatory = $True, Position = 1)]
    [string] $TwoParameter
  )
  Process {
    # Write-LogDebug "Entering function Run-SampleFunction"
    try {
      # something....
    } catch [System.Exception] {
      $ErrorMessage = $_.Exception.Message
      # Write-LogError "Message: $ErrorMessage"
    }
    # Write-LogDebug "Leaving function Run-SampleFunction"
    return $LASTEXITCODE
  }
}

Note the use of [CmdletBinding()], which provides advanced capabilities like binding the predefined common parameters to the function. In addition, well-defined parameters and outputs can reduce boilerplate validation code. Using the input processing methods (Begin, Process, End) ensures that your functions are prepared for pipelines. Using try…catch and returning an exit code is the recommended approach, but it should be evaluated against internal coding guidelines.
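As a sketch of the Begin/Process/End pattern, the hypothetical function below (not part of the article's module) uppercases names fed through the pipeline:

```powershell
# Minimal pipeline-ready function; ConvertTo-UpperName is an illustrative name.
Function ConvertTo-UpperName {
  [CmdletBinding()]
  [OutputType([string])]
  param (
    [Parameter(Mandatory = $True, ValueFromPipeline = $True)]
    [string] $Name
  )
  Begin   { $count = 0 }                 # runs once, before any pipeline input
  Process { $count++; $Name.ToUpper() }  # runs once per pipeline object
  End     { Write-Verbose "Processed $count item(s)" }  # runs once, at the end
}
```

Calling `"alpha", "beta" | ConvertTo-UpperName` emits ALPHA and BETA, with the Begin and End blocks running exactly once around the stream.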

Common Utility

Most large developments should evaluate the use of, and need for, the following components.


A simple level-based logging mechanism can be created using log4net and/or the Write-Host function, with function signatures like Write-LogDebug ($Text) and Write-LogInfo ($Text), so that logging at different levels can be enabled or disabled through a common mechanism (see the bootstrap section below).
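A minimal sketch of such a mechanism over Write-Host is shown below, assuming a script-scoped LOG_LEVEL variable set during bootstrap; log4net could be plugged in behind the same signatures. All names here are illustrative, not part of the article's module:

```powershell
# Level-based logging sketch; LOG_LEVEL would normally come from the bootstrap.
$script:LogLevels = @{ 'DEBUG' = 0; 'INFO' = 1; 'ERROR' = 2 }
$script:LOG_LEVEL = 'INFO'

Function Write-LogMessage ([string] $Level, [string] $Text) {
    # Emit only when the message level is at or above the configured level
    if ($script:LogLevels[$Level] -ge $script:LogLevels[$script:LOG_LEVEL]) {
        Write-Host "$(Get-Date -Format o) [$Level] $Text"
    }
}
Function Write-LogDebug ([string] $Text) { Write-LogMessage 'DEBUG' $Text }
Function Write-LogInfo  ([string] $Text) { Write-LogMessage 'INFO'  $Text }
Function Write-LogError ([string] $Text) { Write-LogMessage 'ERROR' $Text }
```

With LOG_LEVEL at INFO, Write-LogDebug output is suppressed while Write-LogInfo and Write-LogError are emitted; raising or lowering verbosity is a one-line configuration change.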


Develop a JSON (or equivalent) configuration model that can be loaded from a file using a command equivalent to the following:

$configData = New-Object -TypeName PSCustomObject -ArgumentList (Get-Content -Raw -Path $ConfigFile | ConvertFrom-Json)

It may be helpful to adopt a standard naming convention when designing the configuration file, so that you have a common configuration (config.common) along with component-specific configuration (config.component). In addition, helper functions that return the configuration associated with a key specified in dot-separated format ensure that each component can define its own configuration format and handle it without worrying about the rest of the configuration structure. See the bootstrap section below for how configuration loading can be performed.
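A helper of this kind might look like the sketch below; Get-ConfigValue is an assumed name, not from the article. It walks the PSCustomObject produced by ConvertFrom-Json one dot-separated segment at a time:

```powershell
# Dot-separated config lookup sketch, e.g. Get-ConfigValue $cfg 'config.common.logLevel'
Function Get-ConfigValue {
    param (
        [Parameter(Mandatory = $True)] [PSCustomObject] $Config,
        [Parameter(Mandatory = $True)] [string] $Key
    )
    $node = $Config
    foreach ($part in $Key.Split('.')) {
        if ($null -eq $node) { return $null }  # missing intermediate key
        $node = $node.$part
    }
    return $node
}

# Example: load JSON text and query a nested key
$configData = '{"config":{"common":{"logLevel":"DEBUG"}}}' | ConvertFrom-Json
Get-ConfigValue $configData 'config.common.logLevel'   # returns DEBUG
```

Missing keys fall through to $null rather than throwing, which lets callers supply their own defaults.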


It is important to develop helper security functions at the start of the project and use them consistently in the code, so that security-driven updates can be rolled out in one place over time. Such helper functions should read passwords based on configuration (either by decrypting a value from config or by retrieving it from a vault, switchable through a simple configuration change) and return a PSCredential object for use. Another important helper is a common authentication function for cases where multiple authentication mechanisms are possible (for example, Azure supports ID/password, service principal/password, and certificate authentication); a single function can choose the actual authentication process based on configuration.
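A hedged sketch of the credential helper follows; the function name is an assumption, and the plain-text parameter is only a stand-in for the decrypt-from-config or fetch-from-vault step the article describes:

```powershell
# Credential helper sketch; centralizing this means a later switch to a vault
# changes only this function, not its callers.
Function Get-ConfiguredCredential {
    [OutputType([PSCredential])]
    param (
        [Parameter(Mandatory = $True)] [string] $UserName,
        [Parameter(Mandatory = $True)] [string] $PlainPassword  # replace with decrypt/vault lookup
    )
    $secure = ConvertTo-SecureString $PlainPassword -AsPlainText -Force
    return New-Object System.Management.Automation.PSCredential ($UserName, $secure)
}
```

Callers only ever see a PSCredential, so the storage and retrieval strategy can evolve without touching the rest of the codebase.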


In a large development, multiple scripts across different use cases are typically created. If such a development follows a standard structure, the overall process can be significantly simplified.


At the very least, it is recommended that a standard directory structure is used to package these development components. The root directory should contain a README (or equivalent) with specific details about how the packaging has been done, the various scripts available, and specific examples of how to run them. The bin directory contains the various scripts along with a common bootStrap.ps1, which provides a standard mechanism to handle configuration, load common modules, initialize logging, and handle common parameters (see below for more details). The config directory contains the configuration file with default values, and the data folder contains the templates and other files needed to run the scripts. Additional directories like logs and tmp may be created by the scripts as needed during execution.
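In text form, the package layout described above looks roughly like this (directory names as given in this article, contents illustrative):

```
<Root Directory>\
    README        # packaging details, available scripts, usage examples
    bin\          # scripts plus the common bootStrap.ps1
    config\       # configuration file(s) with default values
    data\         # templates and other files needed by the scripts
    logs\         # created as needed during execution
    tmp\          # created as needed during execution
```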


The bootStrap.ps1 can contain typical operations that cannot be handled in a module due to their dependency on specific parameters. It is recommended that the bootstrap provide a function like the one below that can be called by any script.

Function BootStrap {
    [CmdletBinding()]
    [OutputType([PSCustomObject])]
    Param(
        [string[]] $ModulesToLoad,
        [string] $ConfigFile,
        [string] $ConfigToLoad,
        [string] $LogLevel,
        [string] $LogType
    )

This function should perform the following steps:

  1. Identify the root location of deployment (parent of the <Root Directory> shown above)

  2. Add the root location to $env:PSModulePath (if not already set) to ensure that all the modules that are part of the package can be imported without dependency on external repository like PSGallery.

  3. Import basic modules provided

    If (!(Get-Module Sample.Common-Utils)) { Import-Module Sample.Common-Utils -Force -ErrorAction Stop }

  4. Initialize logger and any other common system needed.

  5. Load all the modules passed as $ModulesToLoad after checking if the module is available. If not, then it tries to install using

    Install-Module $ModuleName -AllowClobber -Force -ErrorAction Stop -Scope CurrentUser

  6. Try to locate the config file, first by resolving the given path as-is, then by looking in the bin directory, and finally in the config directory, using the $SCRIPT_PATH variable (see above).

  7. Load the configuration data and return the same.
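The steps above can be sketched as follows. This is a hedged outline, not the article's actual implementation: it assumes the calling script has set $SCRIPT_PATH as shown elsewhere in this article, and it skips the logger initialization of step 4.

```powershell
# BootStrap sketch: module path setup, module loading, config resolution.
Function BootStrap {
    [CmdletBinding()]
    [OutputType([PSCustomObject])]
    Param([string[]] $ModulesToLoad, [string] $ConfigFile,
          [string] $ConfigToLoad, [string] $LogLevel, [string] $LogType)

    # 1. Root of the deployment: parent of the bin directory ($SCRIPT_PATH set by caller)
    $root = Split-Path -Parent $SCRIPT_PATH

    # 2. Make packaged modules importable without an external repository
    if (($env:PSModulePath -split [IO.Path]::PathSeparator) -notcontains $root) {
        $env:PSModulePath = $root + [IO.Path]::PathSeparator + $env:PSModulePath
    }

    # 3 & 5. Import requested modules; install missing ones for the current user
    foreach ($ModuleName in @($ModulesToLoad)) {
        if (!(Get-Module -ListAvailable $ModuleName)) {
            Install-Module $ModuleName -AllowClobber -Force -ErrorAction Stop -Scope CurrentUser
        }
        Import-Module $ModuleName -Force -ErrorAction Stop
    }

    # 6. Resolve the config file: as given, then under bin, then under config
    $resolved = @($ConfigFile,
                  (Join-Path $SCRIPT_PATH $ConfigFile),
                  (Join-Path (Join-Path $root 'config') $ConfigFile)) |
                Where-Object { $_ -and (Test-Path $_) } | Select-Object -First 1

    # 7. Load the configuration data and return it
    if ($resolved) { return (Get-Content -Raw -Path $resolved | ConvertFrom-Json) }
    return $null
}
```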

All scripts should define a well-structured parameter list and run the bootstrap as follows:

$SCRIPT_PATH = Split-Path (Resolve-Path $MyInvocation.MyCommand.Path)
# This line allows passing all command line parameters to the bootstrap function
$BootstrapParameters = $PSBoundParameters
$loadModuleList = "Sample.Common-Utils", "Sample.Az-Utils"
$BootstrapParameters.Add('ModulesToLoad', $loadModuleList)
$ConfigData = BootStrap @BootstrapParameters
# continued code for script

PowerShell continues to play an important role in automation within the Microsoft ecosystem. Leveraging the recommendations described above can significantly improve the developer experience and reduce overall development time for large projects.

First published here
