How to Learn Git, Markdown and PowerShell by Contributing to the PowerShell-Docs Repository

Intro

The PowerShell-Docs repository is the home of the official PowerShell documentation. It contains reference and conceptual content for various versions and modules of PowerShell. Contributing to this repository is a great way to learn Git, Markdown and PowerShell, as well as to help improve the quality and accuracy of the documentation.

In this blog post, I will show you how you can contribute to the PowerShell-Docs repository by making quality contributions, and why it’s beneficial for your learning and development.

What are quality contributions?

Quality contributions are enhancements or fixes that improve the readability, consistency, style, or accuracy of the documentation: think fixing typos, formatting issues, and style-guide deviations.

Quality contributions are different from content contributions, which involve adding new articles or topics, or making significant changes to existing ones. Content contributions require more discussion and approval from the PowerShell-Docs team before they can be merged.

How to make quality contributions?

Before we get into how to make quality contributions, I’d like to shamelessly plug my own module: PowerShell-Docs-CommunityModule. It will help you pick out work that has not been done yet.

Install & try it, using the following code:

Set-Location $env:USERPROFILE

# Make sure 'username' reflects your actual GitHub username
git clone https://github.com/username/PowerShell-Docs

Install-Module PowerShell-Docs-CommunityModule

Find-MissingAliasNotes -Verbose

To make quality contributions, you need to have a GitHub account and some basic knowledge of Git and Markdown. You also need to install some tools that will help you edit and preview your changes locally. Here are some steps you can follow:

  1. Install Git, Markdown tools, the Docs Authoring Pack (a VS Code extension), and Posh-Git (a PowerShell module).
  2. Fork the PowerShell-Docs repository on GitHub.
  3. Clone your forked repository to your local machine using Git.
  4. Check out the PowerShell Docs Quality Contributions project on GitHub. This project tracks all the open issues and PRs related to quality improvements.
  5. Pick an issue that interests you or create a new one if you find something that needs fixing. You can use PowerShell-Docs-CommunityModule to help you here.
  6. Assign yourself to the issue and start working on it locally using VS Code or your preferred editor. Make sure you create a new branch before editing any files; a dedicated branch keeps your upcoming pull request clean.
  7. Preview your changes, and make sure you’ve updated the ms.date at the top of the document to today’s date (MM/dd/yyyy). This lets other contributors know when the document was last edited. It’s also required for every update, so the repository owners will ask you to do this if you miss it.
  8. Commit your changes using Git and push them to your forked repository on GitHub (see the sketch after this list).
  9. Create a pull request (PR) from your forked repository to the original PowerShell-Docs repository on GitHub.
  10. Wait for feedback from reviewers or maintainers of the PowerShell-Docs team.
  11. Address any comments or suggestions they may have until your PR is approved and merged.
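
If you’re new to the Git side of this, here’s a minimal sketch of the branch/commit/push flow from steps 6-9 (the branch name and commit message are made-up examples):

# Work from your local clone of the fork
Set-Location $env:USERPROFILE\PowerShell-Docs

# Step 6: create a working branch before editing any files
git checkout -b quality-fixes

# ...edit files in VS Code, including updating ms.date (MM/dd/yyyy)...

# Steps 8-9: stage, commit and push the branch to your fork
git add .
git commit -m "Fix typos and update ms.date"
git push --set-upstream origin quality-fixes

GitHub will then offer to open a pull request from the pushed branch against the upstream repository.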

Why make quality contributions?

Making quality contributions has many benefits for both you and the PowerShell community.

For you:

  • You can learn Git by contributing to a very friendly, large code-base project. The owners are more than willing to help you with Git-related questions, and you’ll grow a ton in this area once you start doing PRs.
  • You will write/edit files in Markdown (.md), a very popular markup language.
  • Because you will be proof-reading the markdown documents, you will learn more PowerShell topics straight from the source of the documentation.
  • You can improve your writing skills by following Microsoft’s style guides and best practices for technical documentation.
  • You can get feedback from experts who work on PowerShell, Markdown and Git every day.
  • You can build your reputation as a contributor by having your name appear in commit history.

For the community:

  • You can help improve the clarity, consistency, style or accuracy of the documentation that many people rely on every day.
  • You can help reduce confusion, errors or frustration among users who read the documentation.
  • You can help keep the documentation up-to-date with changes in PowerShell features or functionality.
  • You will be listed in the Community Contributor Hall of Fame.

Conclusion

In this post, I showed you how you can contribute to the PowerShell-Docs repository by making quality contributions, why it’s great for learning Git, Markdown, and PowerShell, and how the PowerShell-Docs-CommunityModule helps you find out what to do first. I hope this blog post inspires you to join us in making the PowerShell documentation better for everyone.

If you have any questions, comments, or suggestions, please feel free to reach out via DM on Twitter or Mastodon, or by email! I hope to see a PR from you, and if this post helped you get one merged, make sure to let me know! 😃

Happy contributing

Using the PowerShell Module Z to Save Time in the Terminal

The module “Z” is a handy tool that can save you lots of time when navigating around your terminal. In this blog-post, we’ll cover how to install it to user scope, how to configure it by jumping around the terminal to different places, and how it saves lots of time in the long run.

Installing Z to your CurrentUser scope

To install Z, open a PowerShell terminal and run the following command:

Install-Module z -Scope CurrentUser

This will install Z for your current user only. If you want to install it for all users on your system, you can use -Scope AllUsers instead.

Configuring Z

Once you have Z installed, you can start configuring it by jumping around your terminal to different places. To do this, simply navigate to a directory that you frequently visit and then run z followed by a space and then part of the directory name. For example:

cd C:\Users\MyUser\Documents\Projects\MyProject
cd \
z MyProject

This adds C:\Users\MyUser\Documents\Projects\MyProject to Z’s list of directories, so z MyProject can jump straight back to it. The more you use z with different directories, the more accurate it will become at predicting where you want to go.

Saving Time with Z

Once you have Z configured with your frequently visited directories, you can start saving time by using z instead of typing out long paths.

It may not seem like much of a time-saver at first glance, but the more you get used to using z instead of cd and tab-completion for navigation, the more time you will save. If you use your terminal daily, I would imagine the time-savings are huge.
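
For example, assuming the MyProject directory from earlier is already in your history, even a partial match works:

# 'proj' is enough for z to rank and match the full path
z proj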

Z’s a bash fork

That’s right. The module is actually forked and ported from a bash script. What makes it super useful together with PowerShell is, you guessed it: PSProviders.

Not only can you navigate around your file system, you can also visit your local registry hive, certificate store, and other PSProvider paths, and save them to your directory history.

cd Cert:\LocalMachine\My\
z 'Cert:\LocalMachine\My'
z C:\
z my

  • Here, you start out by visiting your local certificate store.
  • You save it to your z directory history.
  • You navigate back to the C: root.
  • And jump back to your local cert store using only “z my”.

This works with lots of different PSProviders, so try it out yourself!

To sum it up

By installing, configuring, and using Z, you can save lots of time when navigating around your terminal. Install it to the user scope, configure it by jumping around your terminal to different places, and you’ll be able to quickly navigate to frequently visited directories and other PowerShell provider paths with just a few keystrokes. Keep in mind that the more you use it, the better it gets, both functionally and in the amount of time it saves you (so that you can drink coffee instead 😁).

Happy Coding

Active Directory Delegation done the PowerShell way!

Active Directory Access Granting

The following module is very important for anyone doing Active Directory operations or engineering. Most who have tried granting permissions beyond adding users to groups in Active Directory would probably agree that access delegation can be a daunting task.

Your best bet is to use the Delegation wizard, but where’s the fun (PowerShell fun..) in that? And how well does it scale? Not very well.

The alternatives

So, the code-friendly alternatives we have are the native DSACLS command, and building directory access rules yourself and applying them with Set-Acl.

The first alternative, the native command DSACLS, can be a powerful way of delegating, but is very frustrating to work with (or so I think).

The second alternative is probably your best bet outside of the module DSACL. However, creating the directory access rules and applying them can be a bit cumbersome.
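
To give a feel for that second alternative, here’s a minimal sketch, assuming the ActiveDirectory module is loaded (for the AD: drive) and using made-up OU/group names:

# Build an access rule granting Full Control to a group
$Group = Get-ADGroup -Identity 'SalesAdmins'
$Rule = [System.DirectoryServices.ActiveDirectoryAccessRule]::new(
    $Group.SID,
    [System.DirectoryServices.ActiveDirectoryRights]::GenericAll,
    [System.Security.AccessControl.AccessControlType]::Allow
)

# Read, modify and write back the ACL through the AD: PSDrive
$Acl = Get-Acl -Path 'AD:\OU=Sales,DC=Contoso,DC=COM'
$Acl.AddAccessRule($Rule)
Set-Acl -Path 'AD:\OU=Sales,DC=Contoso,DC=COM' -AclObject $Acl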

Both alternatives are fine, but will require you to do some heavy lifting in terms of learning. I will not go into depth on the alternatives here; it’s just a good heads-up that these exist.

If you’ve learnt the basics of using a cmdlet in PowerShell, luckily for you, access delegation is just a module install away.

Using DSACL

# Install the DSACL Module
Install-Module -Name DSACL -Scope CurrentUser

# Build a splatting table with the desired groups
$ParamSplat = @{
    TargetDN = "CN=SalesUsers,DC=Contoso,DC=COM" # DistinguishedName of object to modify ACL on. Usually an OU.
    DelegateDN = "CN=SalesAdmins,OU=Groups,DC=Contoso,DC=COM" # DistinguishedName of group or user to give permissions to.
    DirectOnGroup = $true
}

Add-DSACLManageGroupMember @ParamSplat

This example creates a splatting table that defines the object whose ACL (access control list) to modify and the user/group to grant the permission to.

$ParamSplat = @{
    TargetDN = "OU=Users,DC=Contoso,DC=COM"
    DelegateDN = "CN=SalesAdmins,OU=Groups,DC=Contoso,DC=COM"
    AccessType = 'Allow'
    ObjectTypeName = 'User'
    NoInheritance = $True
}

Add-DSACLFullControl @ParamSplat

This gives the SalesAdmins group Full Control over ‘User’ object administration in the Users OU. NoInheritance disables inheritance, making the ACE apply only to the given OU (Users).

The module contains some really useful cmdlets for managing AD delegation, to name a few:

  • Set-DSACLOwner
  • Add-DSACLJoinDomain
  • Add-DSACLResetPassword

To list them all:

Get-Command -Module DSACL

I recommend checking out the online documentation for each cmdlet, as well as running them in a lab and verifying the ACLs once executed.

If you’re into AD and PowerShell, it’s a fun and very educational experience to fire up some Hyper-V VMs using a Windows Server Evaluation ISO and a local client with some left-over RAM, and promote some domain controllers to test this module out. An awesome lab module that I recommend for very fast labs is AutomatedLab. If you want a more step-by-step and less abstracted way of setting up a lab, I recommend my own Hyper-V script Labmil.

That’s it. Try it out yourself to discover how easy directory access delegation can be. The amount of possible AD automation is endless once you get the hang of it, combined with the standard ActiveDirectory module.

Happy coding

PowerShell Solution: Install PFX certificate on servers

Problem

As you may have guessed, this post will be about installing certificates using PowerShell.

Every year, as someone working with identity topics, I am tasked with renewing certificates across the environment. Many services rely fully on valid certificates in order to function securely.

One critical service in particular that this scenario will cover is: Active Directory Federation Services, ADFS.

In most cases you will have multiple ADFS servers, meaning that if you’re not automating already, you will need to install the SSL certificate manually (no fun experience on 10+ servers).

There’s more to say about ADFS SSL certificates specifically that this post will not cover; however, an installation like this will be needed in many of those scenarios as well.

Solution

This solution covers how one could do this for ADFS servers, but it carries over to other services that require a valid certificate as well.

To generate a PFX file out of an external certificate, I recommend using the DigiCert Certificate Utility to generate the CSR (Certificate Signing Request) on the primary server. Then simply import the issued certificate using the DigiCert tool, and export the certificate to a .pfx file.

Here’s an example of how to export an already installed certificate as a PFX file:

$PfxPw = (Read-Host -Prompt 'Enter a password' -AsSecureString)

Get-ChildItem -Path cert:\localMachine\my\<thumbprint> | Export-PfxCertificate -FilePath C:\Cert\ssl_cert.pfx -Password $PfxPw

It’s important that the certificate gets imported on the server where the CSR was generated, in order to have a valid public/private keypair.

What we need to start out is:

  1. The primary ADFS server, with the PFX certificate exported
  2. Access to all ADFS servers
  3. An environment with WinRM/PowerShell remoting enabled

With that in place, the following script copies the certificate to each server and installs it:
# Local path to the certificate
$PFXPath = 'C:\Cert\ssl_cert.pfx'

# Credential object, we only use the password property
$Creds = Get-Credential -UserName 'Enter PFX password below' -Message 'Enter PFX password below'

# Path of the remote server we will copy to
$ServerCertPath = "C:\Cert\"

$InternalServers = "SERVER1", "SERVER2", "SERVER3"

foreach ($Server in $InternalServers) {

    # Creates a remote session
    $Session = New-PSSession -ComputerName $Server

    # Makes sure the destination folder exists on the remote server
    Invoke-Command -Session $Session -ScriptBlock {
        New-Item -Path $using:ServerCertPath -ItemType Directory -Force | Out-Null
    }

    # Copies the certificate to the remote session
    Copy-Item -Path $PFXPath -ToSession $Session -Destination $ServerCertPath -Force -Verbose -ErrorAction Stop

    # Imports the pfx certificate remotely, using the password provided
    Invoke-Command -Session $Session -ScriptBlock {

        # The copied file lands inside the folder, so join the folder and the file name
        $RemotePfx = Join-Path -Path $using:ServerCertPath -ChildPath (Split-Path -Path $using:PFXPath -Leaf)
        Import-PfxCertificate -FilePath $RemotePfx -CertStoreLocation Cert:\LocalMachine\My -Password $using:Creds.Password

    }

    # Cleans up the remote session
    Remove-PSSession -Session $Session
}
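
As a quick sanity check, you could verify that the certificate landed in the store on each server. A small sketch, assuming the certificate subject contains your ADFS farm name:

# Lists the imported certificate on every server
Invoke-Command -ComputerName $InternalServers -ScriptBlock {
    Get-ChildItem -Path Cert:\LocalMachine\My |
        Where-Object { $_.Subject -like '*adfs*' } |
        Select-Object Subject, Thumbprint, NotAfter
}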

Small Talk

And just like that, you’ve saved truckloads of time every year using PowerShell.

I highly recommend checking out more cmdlets from the PKI and Microsoft.PowerShell.Security modules. The script above shows how one can tackle a .pfx certificate, but using Import-Certificate you could do similar things with .cer files.
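
For instance, a .cer file (public key only) can be installed in much the same way. A small sketch with a made-up path:

# Imports a public certificate into the machine's trusted root store
Import-Certificate -FilePath 'C:\Cert\ca_root.cer' -CertStoreLocation Cert:\LocalMachine\Root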

Also, one could eliminate the need to enter a password by using something like Microsoft.PowerShell.SecretManagement. That module translates well into a lot of cmdlets in the PKI/security modules.
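
As a sketch, assuming you have a vault registered and a secret named 'PfxPassword' stored in it:

# Get-Secret returns a SecureString by default, which -Password accepts directly
$PfxPw = Get-Secret -Name 'PfxPassword'
Import-PfxCertificate -FilePath 'C:\Cert\ssl_cert.pfx' -CertStoreLocation Cert:\LocalMachine\My -Password $PfxPw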

Stay safe & happy coding!

/Emil

PowerShell Solution: AGPM unable to take control of a GPO

Problem

If you enjoy the principle of least privilege, version control, and making big infrastructural changes in a safe manner, Advanced Group Policy Management, or AGPM, is an amazing tool.

AGPM itself has a few years on its back, and while we sysadmins tend to get easier and easier systems nowadays, legacy systems can mean complexity.

When combined with new sysadmins that have not been introduced to the concept of AGPM, uncontrolled GPOs might become a problem, and the built-in error messages are sadly not the greatest.

(GPMC Error) could not take the ownership of the production GPO. Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED)).

The access denied error is caused by the AGPM service-account not having permission to take control of the GPO (and not having control of a GPO in AGPM really does ruin the point of AGPM). Solving this problem involves giving the service-account the permissions needed; however, it’s a bit of a tricky thing to do.

Solution

As we’ve established, we must add the correct permissions for the service-account to the GPO, easy right? Luckily yes, because we know PowerShell!

To add the permissions, we need to understand how a GPO is stored. A GPO’s data resides in two places: Active Directory (GPC) and Sysvol (GPT).

GPC

Group Policy Container (GPC): luckily the name is easy to remember, because we already understand that AD consists of Organizational Units and… Containers. The GPC is stored in AD, under “CN=<GUID>,CN=Policies,CN=System,DC=x,DC=x”. Since it’s an AD object, it logically has attributes describing the object version and so on.
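
You can peek at the GPC yourself. A sketch, assuming the GroupPolicy and ActiveDirectory modules and a made-up GPO name:

# The Path property of a GPO is the distinguishedName of its GPC object
$GPOObject = Get-GPO -Name 'TheUncontrolledGPO'
Get-ADObject -Identity $GPOObject.Path -Properties versionNumber, gPCFileSysPath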

GPT

Group Policy Template (GPT) is stored in the DCs’ system volume (sysvol), under the ‘Policies’ subfolder.

The GPT stores the majority of the GPO data. It contains a folder structure of files that describe the GPO’s functionality, meaning it stores script files, administrative template-based policies, and various other security settings.
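
The matching GPT folder is just as easy to inspect. Reusing $GPOObject from the sketch above:

# The folder name in sysvol is the GPO's GUID
Get-ChildItem -Path "\\$env:USERDNSDOMAIN\SYSVOL\$env:USERDNSDOMAIN\Policies\{$($GPOObject.Id)}"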

Replication

The GPC uses AD replication, and the GPT uses DFS-R since it’s in sysvol. This is important because we will edit the ACLs of both AD and sysvol in order to solve our issue.

Editing ACL for GPC

Editing the GPC’s ACL requires generating an ActiveDirectoryRights object with the desired access. This can be done in multiple ways: dsacls or Set-Acl, to name a few. In this case, I had heard of an amazing module from Simon Wahlin called DSACL, so I can simply do the following:

$ADRights = "CreateChild", "DeleteChild", "Self", "WriteProperty", "DeleteTree", "Delete", "GenericRead", "WriteDacl", "WriteOwner", "AccessSystemSecurity"
Add-DSACLCustom -TargetDN $GPODN -DelegateDN $DelegateDN -ActiveDirectoryRights $ADRights  -InheritanceType Descendents -AccessControlType Allow

Add-DSACLCustom -TargetDN $GPODN -DelegateDN $DelegateDN -ActiveDirectoryRights $ADRights[0..8] -InheritanceType None -AccessControlType Allow

The ‘TargetDN’ in this case will be the GPC’s distinguishedName, and the ‘DelegateDN’ will be the distinguishedName of our AGPM service-account. We run the cmdlet twice to mimic the way AGPM edits the ACL of a controlled GPO. AccessSystemSecurity was not needed in the second ACE, and therefore I ended up selecting the first nine (0..8) ADRights.
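
For completeness, here’s how the two distinguishedNames above could be resolved (the service-account name is a made-up example):

# The GPC distinguishedName, and the AGPM service-account distinguishedName
$GPODN = (Get-GPO -Name 'TheUncontrolledGPO').Path
$DelegateDN = (Get-ADUser -Identity 'svc-agpm').DistinguishedName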

Editing the ACL for GPT

Since the GPT is in sysvol, we now have the task of editing a filesystem ACL, which is different from a directory service ACL. There are many ways of doing this as well: cacls and Set-Acl work great. I ended up taking the easy way out and used NTFSSecurity, again another killer PowerShell module, with 1.1 million downloads as of writing. And that’s quite understandable, considering this is how one can grant full control on a filesystem path:

Add-NTFSAccess -Path "\\DOMAIN\SYSVOL\DOMAIN.TEST\Policies\{$($GPOObject.Id)}" -AccessRights FullControl -Account 'DOMAIN\Service-Account-AGPM'

Almost ready to solve!

As we have learned, the GPC and GPT are a bit different. Sysvol and AD do replicate, but in different ways. The key take-away is that we most likely need to wait for replication before the AGPM server understands that the rights are in fact in place. This took me around 15 minutes; it could have been avoided had I done the changes on the same server.

Using the AGPM module, we’re now ready to take control of the GPO, since we now have the access to do so.

Get-Gpo -Name "TheUncontrolledGPO" | Add-ControlledGpo -PassThru

In my case, I had more than one uncontrolled GPO, to say the least. Sadly, the AGPM module doesn’t have something like ‘Get-UncontrolledGPO’.

What I ended up doing was to filter out all uncontrolled GPOs myself using Compare-Object:

$ControlledGPOS = Get-ControlledGpo
$UncontrolledGPOS = (Compare-Object $ControlledGPOS.Name (Get-GPO -All).DisplayName).InputObject

foreach ($GPO in $UncontrolledGPOS) {
    Get-Gpo -Name $GPO | Add-ControlledGpo -PassThru
}

You can of course also navigate within GPMC > Change Control > Uncontrolled > select all GPOs, right-click, Control.

Congratulations on having a fully controlled AGPM environment.

Discussion

Understanding where a GPO is stored is a nice way of understanding how GPOs work. The reason for having them stored in separate places most likely goes back to the fact that AD is old, and back in the day, size mattered. Having the GPT files in the AD database (.dit) would simply mean a huge increase in data for AD. Splitting things up and having the DCs take on a bit of storage was probably a good idea back then.

On another note, notice that my code in this solution was quite simple, even though we did some complex tasks. I was actively trying not to re-invent the wheel, and that gets more important the ‘harder’ the task becomes. Using “blackbox” modules, where we only follow the standard PowerShell way of typing out a cmdlet, can be a great way of achieving complex tasks with speed. It’s also important that when a “blackbox” module solves something for you, you go back and try to dig deeper into what it actually did. I find this a good way of learning things in general.

Happy coding

/Emil

PowerShell: C Sharp for PowerShell Modules

My journey into C# so far

I’ve always been somewhat interested in programming, and PowerShell scripting and module making have put fire to that interest. The natural next language for me to learn has always been C#: it’s easy to pick up if you already know PowerShell, and it enables you to create binary PowerShell modules.

Some content I’ve devoured to increase my C# knowledge are:

  • C# Fundamentals course on PluralSight by Scott Allen
  • PowerShell to C# And Back, Book by Deepak Dhami & Prateek Singh

Various YouTube content and talks on building PowerShell Binary Modules using C#:

  • Building Cross Platform PowerShell Modules by Adam Driscoll
  • Writing Compiled PowerShell Cmdlets by Thomas Rayner

The above lists are things I’ve gone through and can honestly recommend.

What’s to come

I plan to further increase my knowledge with the books:

  • The C# Player’s Guide, Book by RB Whitaker
  • C# In Depth, Book by Jon Skeet

As well as writing more modules and other C# related projects.

The process of wrapping a NuGet package into a binary module

  1. Install a modern version of the dotnet cli together with a dotnet sdk suitable for that version

  2. Init an empty git repo for the new code to live in

  3. Navigate to the folder and run dotnet new classlib. This will generate a dotnet class library; once compiled, it produces a DLL file that will be our module

  4. In the csproj file, set the TargetFramework to “netstandard2.0” to make the module compatible with both Windows PowerShell and PowerShell

  5. Remove ImplicitUsings and Nullable Enabled; these are language features we do not need

  6. dotnet add package PowerShellStandard.Library

  7. dotnet add package thenugetpackage

  8. dotnet publish. We have now added the packages needed to start wrapping the NuGet package into a PowerShell module

  9. To follow the official PowerShell repos naming standard, all the cmdlets are to be named: VerbNounCommand.cs
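
For reference, here are the CLI steps condensed into one runnable sequence (a sketch; the project name is an example, and PasswordGenerator stands in for whatever package you’re wrapping):

# Steps 1-8 condensed
git init BinaryPasswordGenerator
Set-Location BinaryPasswordGenerator
dotnet new classlib
# (edit the .csproj here: TargetFramework -> netstandard2.0, remove ImplicitUsings/Nullable)
dotnet add package PowerShellStandard.Library
dotnet add package PasswordGenerator
dotnet publish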

The following source code is commented to help someone with a PowerShell background understand it more easily:

// Usings, similar to Import-Module
using System;
using System.Management.Automation;
using PasswordGenerator;

namespace PasswordGenerator
{
    // VerbsCommon contains a list of approved common verbs, the string Password is the Noun of the cmdlet
    [Cmdlet(VerbsCommon.New,"Password")]
    // Cmdletname : PSCmdlet, similar to Function Name {}
    public class NewPasswordCommand : PSCmdlet
    {
        // [Parameter], default value is 16. If Get > Default, if set > set the value of the param
        private int _pwLengthDefault = 16;
        [Parameter]
        [ValidateRange(4,128)]
        public Int32 Length
        {
            get
            {
                return _pwLengthDefault;
            }
            set
            {
                _pwLengthDefault = value;
            }
        }

        private int _amountDefault = 1;
        [Parameter]
        public Int32 Amount
        {
            get
            {
                return _amountDefault;
            }
            set
            {
                _amountDefault = value;
            }
        }

        // Switch parameters, they turn true if specified
        [Parameter]
        public SwitchParameter IncludeSpecial { get; set; }
        [Parameter]
        public SwitchParameter IncludeNumeric { get; set; }
        [Parameter]
        public SwitchParameter IncludeUppercase { get; set; }
        [Parameter]
        public SwitchParameter IncludeLowercase { get; set; }

        protected override void ProcessRecord()
        {
            // for loop, same concept as in PowerShell
            for (int i = 0; i < Amount; i++)
            {
                if (!IncludeLowercase & !IncludeUppercase & IncludeSpecial & IncludeNumeric)
                {
                    var pwd = new Password(Length).IncludeSpecial().IncludeNumeric();
                    var password = pwd.Next();
                    WriteObject(password); 
                }
                else if (IncludeNumeric & !IncludeSpecial & !IncludeUppercase & !IncludeLowercase)
                {
                    var pwd = new Password(Length).IncludeNumeric();
                    var password = pwd.Next();
                    WriteObject(password);
                }
                else if (IncludeSpecial & !IncludeNumeric & !IncludeUppercase & !IncludeLowercase)
                {
                    var pwd = new Password(Length).IncludeSpecial();
                    var password = pwd.Next();
                    WriteObject(password); 
                }
                else if (!IncludeNumeric & !IncludeSpecial & IncludeUppercase & IncludeLowercase)
                {
                    var pwd = new Password(Length).IncludeLowercase().IncludeUppercase();
                    var password = pwd.Next();
                    WriteObject(password);
                }
                else if (!IncludeNumeric & !IncludeSpecial & !IncludeLowercase & IncludeUppercase)
                {
                    var pwd = new Password(Length).IncludeUppercase();
                    var password = pwd.Next();
                    WriteObject(password);
                }
                else if (!IncludeNumeric & !IncludeSpecial & !IncludeUppercase & IncludeLowercase)
                {
                    var pwd = new Password(Length).IncludeLowercase();
                    var password = pwd.Next();
                    WriteObject(password);
                }
                else if (!IncludeNumeric & IncludeLowercase & IncludeUppercase & IncludeSpecial)
                {
                    var pwd = new Password(Length).IncludeLowercase().IncludeUppercase().IncludeSpecial();
                    var password = pwd.Next();
                    WriteObject(password);
                }
                else if (IncludeNumeric & IncludeLowercase & IncludeUppercase & !IncludeSpecial)
                {
                    var pwd = new Password(Length).IncludeLowercase().IncludeUppercase().IncludeNumeric();
                    var password = pwd.Next();
                    WriteObject(password);
                }
                else if (!IncludeNumeric & !IncludeUppercase & IncludeSpecial & IncludeLowercase)
                {
                    var pwd = new Password(Length).IncludeLowercase().IncludeSpecial();
                    var password = pwd.Next();
                    WriteObject(password);
                }
                else if (!IncludeSpecial & !IncludeUppercase & IncludeLowercase & IncludeNumeric)
                {
                    var pwd = new Password(Length).IncludeLowercase().IncludeNumeric();
                    var password = pwd.Next();
                    WriteObject(password);
                }
                else if (!IncludeLowercase & !IncludeNumeric & IncludeUppercase & IncludeSpecial)
                {
                    var pwd = new Password(Length).IncludeUppercase().IncludeSpecial();
                    var password = pwd.Next();
                    WriteObject(password);
                }
                else if (!IncludeLowercase & !IncludeSpecial & IncludeUppercase & IncludeNumeric)
                {
                    var pwd = new Password(Length).IncludeUppercase().IncludeNumeric();
                    var password = pwd.Next();
                    WriteObject(password); 
                }
                else if (!IncludeUppercase & IncludeLowercase & IncludeNumeric & IncludeSpecial)
                {
                    var pwd = new Password(Length).IncludeLowercase().IncludeSpecial().IncludeNumeric();
                    var password = pwd.Next();
                    WriteObject(password); 
                }
                else if (!IncludeLowercase & IncludeUppercase & IncludeNumeric & IncludeSpecial)
                {
                    var pwd = new Password(Length).IncludeUppercase().IncludeSpecial().IncludeNumeric();
                    var password = pwd.Next();
                    WriteObject(password); 
                }
                else
                {
                    var pwd = new Password(Length).IncludeLowercase().IncludeUppercase().IncludeSpecial().IncludeNumeric();
                    var password = pwd.Next();
                    WriteObject(password); 
                }
            }
            
        }

    }
}

The rest is a matter of figuring out how the package works and what it supports. Try to surface all the functionality of the package in the PowerShell module; obviously, that might require you to make more cmdlets.
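
To try the compiled module out, you can import the DLL straight from the publish output. A sketch; the exact path depends on your project name and target framework:

Import-Module .\bin\Debug\netstandard2.0\publish\BinaryPasswordGenerator.dll
New-Password -Length 32 -IncludeNumeric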

The next blog post will be about how I published my module to the PowerShell Gallery, using Git, GitHub, PSGallery, and PlatyPS.

Stay tuned!

Happy coding

/Emil

PowerShell for Security: Password Gen Part 2

Did it again

On 2021-05-10, I wrote an article on password generators.

The goal of that pass-gen was to have a script in my $Profile that would simply work on both PowerShell 5.1 and PowerShell 7+, and to cover AD complexity rules. It did just that.

However,

This time I’ve taken a whole new bull by the horns. While looking for a NuGet package for password generators, out of curiosity about how a .NET/C# developer would tackle the challenge of coding a password generator, I stumbled upon “PasswordGenerator”.

To my surprise, the package has reached 1.6 million(!!!) downloads. I figured this package must be something special, some sort of holy grail of pass gens. And while I’m no C# expert, I’m always up for a challenge!

So I shamelessly forked the repository and started working on a binary PowerShell cmdlet that would mimic the NuGet package. Seven versions and 29 commits later, “BinaryPasswordGenerator” was born!

It’s fast…


It’s customizable

The cmdlet is highly customizable, just like the NuGet package. This opens up new use-case areas that the former script did not cover:

  • Backend engine for generating passwords in GUI/web scenarios (like a NuGet package)
  • PIN/One Time Pass generations (usually 4-8 digit codes)
  • More user-friendly passwords (example: lowercase + numeric)
  • Supports up to 128 char length passwords
  • It’s wicked fast, meaning it scales better

Examples

# By default, all characters are available for use and the length is 16

# Will return a random password with the default settings

New-Password
# You can set the length. Must be between 4 and 128

# Will return a password which is 32 characters long

New-Password -Length 32
# You can also choose which character sets to include

# Will return a password which only contains lowercase and uppercase characters and is 21 characters long

New-Password -IncludeLowercase -IncludeUppercase -Length 21
# You can build up your requirements by adding parameters, like -IncludeNumeric

# This will return a password which is just numbers and has a default length of 16

New-Password -IncludeNumeric
# As above, here is how to get lower, upper and special characters using this approach

New-Password -IncludeLowercase -IncludeUppercase -IncludeSpecial
# This is the same as the above, but with a length of 128

New-Password -IncludeLowercase -IncludeUppercase -IncludeSpecial -Length 128
# One Time Passwords

# If you want to return a 4-digit number you can use this:

New-Password -IncludeNumeric -Length 4

Using it together with other PowerShell modules:


# Convert to SecureString
$pw = New-Password | ConvertTo-SecureString -AsPlainText -Force

# Set a password in your SecretVault using Secret Store/Management
Set-Secret -Name 'User' -Secret (New-Password -Length 128) -Vault PSVault

Get-Secret User
System.Security.SecureString

Get-Secret User -AsPlainText
u%4EkQlMpVjPnO5VM5tYcnUE!F!D3wvhB8w595LXqIEAny1XC4OVn4\x!1Q79Nlj!QwK!zBVkFUAHVy44iEIO2icVE0meAz3YEWudP9UdKrjbrp8nJ8DECVll2Uq!kt5

Happy coding

/Emil

Solution: Restore Deleted Private key for EFS Encrypted Files


Problem

You have encrypted files using the “File > Properties > Advanced > Encrypt content to secure data” feature in Windows, and have lost your certificates in your personal certificate store.

Solution

A solution to this problem is to restore the private key used for encrypting your file system (EKU: Encrypting File System, 1.3.6.1.4.1.311.10.3.4) that was generated when you encrypted your files.

The key pair for this certificate is stored in your Personal certificate store, and luckily a copy of the public key is stored in your LocalMachine certificate store. This means we can restore the private key and enable decryption, as long as the computer has not been reinstalled or lost.
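
If you want to check what EFS certificates you have before anything goes missing, here’s a small sketch that filters on the EKU mentioned above:

# Lists certificates carrying the Encrypting File System EKU, and whether the private key is present
Get-ChildItem -Path Cert:\CurrentUser\My, Cert:\LocalMachine\AddressBook |
    Where-Object { $_.EnhancedKeyUsageList.ObjectId -contains '1.3.6.1.4.1.311.10.3.4' } |
    Select-Object Subject, Thumbprint, HasPrivateKey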

Step-by-step

  1. This file is currently unencrypted. Let’s encrypt it using the method discussed.

[screenshot: the unencrypted file, “Secret!”]

Rightclick the file > properties > advanced > encrypt content..

[screenshots: encrypt_gui, encrypt_file]

I will select ‘Encrypt the file only’ in this case.

[screenshot: loc_icon]

The lock symbol indicates that the file is successfully encrypted. Under the hood, Windows generated a self-signed certificate with a private/public key pair in my Personal store, and a certificate in LocalMachine\AddressBook containing only the public key.

  2. Verify the certificates

[screenshot: currentuser_certstore_cert]

Let’s verify its private key:

[screenshot: private]

Let’s verify the localmachine/addressbook certificate:

[screenshots: cert_store_public, cert_store_public_private_key]

HasPrivateKey: False tells us this certificate lacks the private key, and is somewhat useless for decrypting our file. We will now move on to the issue at hand.

  3. Delete the current user’s private key to simulate the issue

[screenshot: delete_currentuser_privatekey_cert]

Let’s query the certificate store to verify that the deleted certificate is gone.

[screenshot: after_deleted_privatekey]

We are recursively looking for the certificate from the root of the certificate store, and we only got one hit. This means the private/public key pair has been removed, together with the ability to decrypt our file:

[screenshot: accessdenied_after_removal]

The screenshot displays an attempt to move the file, as well as open it with notepad. Both failed due to a “lack of access”.

  4. Restore the certificate using the public key in the LocalMachine store

[screenshot: restore_private_key]

First, we move into the LocalMachine\AddressBook path in the certificate store and verify that it contains our public-key-based certificate.

We then utilize certutil to restore the private-key part that we lost before:

certutil -repairstore addressbook '<insert thumbprint>'

[screenshot: privatekey_after_restore_success]

Verify that the PrivateKey was in fact restored.

We have now restored the most critical part of the removal, but decryption will still fail, since Windows only queries your Personal store while the decryption process takes place. This means we need to export this certificate, together with its private key, and import it into your Personal store.

  5. Export/import the key pair to the Personal store

[screenshot: export_privateandpublic_key]

Open mmc, add the Certificates snap-in, select Local Computer, and navigate to the ‘Other People’ folder.

Right-click the certificate (this is the same certificate that we displayed in PowerShell after the restore process) > All Tasks > Export…

[screenshot: private_key_selected]

Export the private key > Next

[screenshot: allow_only_current_user]

Use your currently logged on user > Next

[screenshot: savetopath]

Save to path

[screenshot: import_current_user]

Navigate to the path, right-click > Install PFX, Current User > Next > Next > Next..

[screenshot: successful_import]

We have now moved the key pair back to the Personal store, and can decrypt files again.

[screenshot: successful_decrypt_and_write]

We verify this by writing to the file and getting its content.

[screenshot: successful_open_doubleclick]

Double-clicking the file now works as expected

Discussion

We have now simulated an issue that unaware users can be exposed to, and solved it using PowerShell, the Certificate snap-in in mmc, and certutil.

Accidental deletion of the private/public key pair in the Personal store can be quite common, since IT personnel usually perform this, together with a gpupdate, to re-enroll autoenrollment-managed certificates. However, in this case, the certificate will not re-enroll itself, since it’s a self-signed certificate only used locally for encryption/decryption.

This problem can be solved in this way for self-signed certificates, which I would imagine covers most cases.

If your Public Key Infrastructure enrolls “Encrypting File System” certificates to domain users, a PKI admin could in theory export the certificate from the CA (as long as the private key is exportable, that is).

PowerShell Productivity tip: Working with History

One thing I’ve always done while hacking along in the terminal is work with my command-line history. There are quite a few ways to do so, so I thought I’d share some of my favorites.

Different ways of viewing your history

  • To view your current session’s history, PowerShell creates an alias for the cmdlet Get-History -> h. Simple as that.
h
  • To view your over-all history in your current environment, PSReadLine is your friend:
cat (Get-PSReadLineOption).HistorySavePath

Viewing the text file of PSReadLine’s history can be a lifesaver if you work with colleagues who tend to never document their solutions. You can modify the HistorySavePath in the output above to view someone else’s history on a shared server/computer.

cat (Get-PSReadLineOption).historysavepath.Replace("$env:USERNAME","Your-Colleague-That-Did-Not-Docx-It")

History Tend To Repeat Itself

As we all know, we humans always feel a need to re-run history. Without being political about it, here are some examples of how you can do this in PowerShell.

r 25

It’s that simple. In your current session, each command you enter becomes available in Get-History (h). Each history entry has an ID, and the cmdlet behind the alias r (Invoke-History) executes your history based on the ID you provide. In the example above, I’m executing the 25th command entered in my terminal.

But there’s more!

PS C:\Users\Emil\git> #25<tab>
PS C:\Users\Emil\git> Write-Host "Tomorrow is monday!!!" -ForegroundColor (Get-Random "Green","Yellow","Blue")

The following is PowerShell black magic, and it’s really useful. It saves you the trouble of copy-pasting your history.

Simply view your history, memorize the ID, type ‘#’ and the ID, followed by a tab, and you have it printed, ready to be executed in your terminal.

Help!

If you’ve read this far and desire more reading, run the following one-liner:

gcm *-history | % { help $_.name -s }

To be clear:

Get-Command *-history | ForEach-Object { Get-Help $_.Name -ShowWindow }
Get-Help about_history

Hopefully this was a decent history-lesson for someone!

Happy coding

/Emil

PowerShell in 2021: From my IAM Perspective

2021: From my IAM Perspective

First of all, this blog-post inspired me to write something similar, so thanks @MDowst for sharing.

For myself, I’ve been writing quite a few security-focused modules and scripts to help me in my day-to-day life as an IAM engineer. I’ve discovered the PowerShell community calls, and found a whole array of inspirational Twitter users to follow.

2021: Some cool stuff I found

2022: Exciting stuff

Some stuff I’m personally excited about:

  • Finish reading PowerShell Cookbook
  • Finish reading PowerShell In Action
  • Developing more automation with PowerShell 7 and Azure Automation
  • Implementing PowerShell logging / Protected logging
  • Implementing a team oriented PowerShellGet repo
  • Implementing SecretManagement/SecretStore/Az KeyVault on on-prem resources
  • Attend the PowerShell Community Calls
  • Creating more useful modules for my team in general, as well as improving existing code
  • Would love to contribute to the PowerShell repo this year, or publish some useful code to the PSGallery, to give back to the community

Time flies

This year will be my ninth year of using and working with PowerShell; much has changed, but mostly for the better.

I miss when ISE was the golden standard though, but hopefully VSCode will work more smoothly this year. I do appreciate the cross-platform nature of VSCode, and the ability to export and import keybinds.

A competitor to VSCode would be awesome though; I would assume some competition would not hurt the end-user experience. Let’s see what happens.

End with fire!

I’d like to end my blog-post with this old gem from Lee Holmes. Still waiting for someone smart enough to re-write it for the Windows Terminal :)

Have a good one!

Happy coding

/Emil