250 PowerShell Scripts
for IT Professionals
Preface
PowerShell has become an
indispensable tool for IT
professionals, system
administrators, and developers
worldwide. Its ability to
streamline workflows, automate
repetitive tasks, and manage
complex IT infrastructures makes it
a cornerstone of modern IT
management. However, while
PowerShell is a robust and
versatile scripting language, many
professionals struggle to unlock
its full potential. This book, "250
PowerShell Scripts for IT
Professionals", is designed to
bridge that gap.
The goal of this book is to provide
a comprehensive, practical, and
hands-on guide for leveraging
PowerShell to solve real-world IT
challenges. Whether you're a
seasoned IT professional looking to
enhance your scripting capabilities
or a beginner aiming to build
foundational skills, this book
offers something for everyone.
---
Why This Book?
Unlike traditional PowerShell books
that focus solely on syntax and
theory, this book dives straight
into practical, ready-to-use
scripts. Each chapter is organized
around a specific area of IT
management, such as system
administration, file and folder
management, user account
management, Active Directory, and
more. The scripts are designed to
be immediately applicable, helping
you save time, reduce errors, and
improve efficiency in your daily
tasks.
---
What You Will Learn
Through this book, you will:
Gain a deep understanding of
PowerShell’s capabilities and how
to harness them effectively.
Explore practical scripts that
address common IT challenges,
from managing files and users to
configuring networks and
automating administrative tasks.
Learn to customize and extend the
provided scripts to meet your
specific requirements.
Build confidence in your ability
to script complex workflows and
integrate PowerShell into your IT
ecosystem.
---
How This Book is
Structured
The book is divided into seven
chapters, each focusing on a
different area of IT management:
1. Chapter 1: PowerShell Basics -
Introduces foundational concepts
and scripts to get you started
with PowerShell.
2. Chapter 2: File and Folder
Management - Covers scripts to
manage files, folders, and
storage.
3. Chapter 3: User Management -
Provides scripts for handling
user accounts and permissions.
4. Chapter 4: Active Directory
Management - Explores advanced
scripts for managing Active
Directory environments.
5. Chapter 5: System Administration
- Focuses on automating
administrative tasks and managing
IT infrastructure.
6. Chapter 6: Network Administration
- Delves into network monitoring,
configuration, and
troubleshooting.
7. Chapter 7: Advanced Automation -
Highlights sophisticated scripts
for cloud management, DevOps
pipelines, and more.
Each chapter is packed with scripts
that gradually progress in
complexity, allowing you to build
your skills step by step. The
scripts are annotated with detailed
explanations, tips, and best
practices to ensure you understand
not just how they work but also why
they work.
---
Who This Book is For
This book is ideal for:
System Administrators who want to
automate repetitive tasks and
enhance their efficiency.
IT Professionals looking to
manage large-scale environments
with minimal manual intervention.
Developers seeking to integrate
PowerShell into their development
workflows.
Students and Beginners who want a
hands-on approach to learning
PowerShell scripting.
---
How to Use This Book
While the chapters are designed to
be read sequentially, you can also
use this book as a reference guide.
Each script is self-contained,
allowing you to jump directly to
the sections relevant to your
current needs. Additionally, the
scripts are designed to be modular
and customizable, enabling you to
adapt them to various scenarios.
---
Acknowledgments
This book is the result of years of
experience and feedback from the IT
community. I want to thank all the
PowerShell enthusiasts, trainers,
and professionals who contributed
to the evolution of this scripting
language and shared their
expertise. Special thanks to the
countless IT administrators and
learners whose challenges inspired
many of the scripts in this book.
---
Final Thoughts
PowerShell is more than just a
scripting language—it’s a gateway
to automation, efficiency, and
mastery in IT management. With the
scripts and guidance provided in
this book, you’ll be well-equipped
to tackle challenges, streamline
workflows, and transform how you
manage IT systems. Let’s embark on
this journey together and unlock
the full potential of PowerShell.
Happy scripting!
László Bocsó (Microsoft Certified
Trainer)
Table of Contents
Introduction - Overview of PowerShell, Setting up your environment, Basic scripting fundamentals, Understanding cmdlets, modules, and script files

Chapter 1: PowerShell Basics (Scripts 1–20) - Display system information, List installed programs, Find uptime, Disk usage, IP configuration, Show processes, Export processes, Kill processes, Current user, Create/rename/move/delete files, Get folder size, Generate password, Format date, Compare files, Append text, Search text

Chapter 2: File/Folder Management (Scripts 21–50) - Create folders, List files, Search by extension, Batch rename, Zip/unzip, Copy/move/archive files, Monitor changes, Delete empty folders, Set permissions, Add security group, Retrieve creation date, Folder tree, Export metadata, Sort files, Backup/restore, Remove duplicates, Encrypt/decrypt files, Create shortcuts, Monitor access, Split/merge files, Weekly cleanup, Count files, Directory report

Chapter 3: User Management (Scripts 51–90) - List users, Create/modify/enable/disable/delete accounts, Reset passwords, Unlock accounts, Expiration dates, Assign/remove groups, Export/import users, Set profiles, Permissions, Disabled accounts, Last login, Login history, Password reminders, Inactive users, Bulk creation, Activity reports, Admin accounts, Group changes, Clone accounts, Temp credentials, Login attempts, Roaming profiles, Backup/restore profiles, Log login/logout, Lock inactive accounts, Unique usernames, Active sessions, Failed logins, Domain users, Orphaned accounts, Expired passwords, Password policy, Expiry notifications

Chapter 4: Active Directory (Scripts 91–130) - Query users/groups, Find user/group members, Add/remove users to groups, Export/import users, Automate creation, Audit accounts, Replication, Group policy, Domain controllers, OU creation, Export policies, Cleanup reports, Create security groups, Hierarchy, User attributes, Expired passwords, Bulk resets, Disable stale accounts, Authentication logs, Schema, Role reports, Access control, Group cleanup

Chapter 5: System Admin (Scripts 131–180) - Automate updates, Service management, Monitor CPU/memory, Schedule tasks, Event logs, Performance reports, Reboot/shutdown systems, Install/uninstall software, Backups, Restore systems, Hardware issues, Inventory, Manage drivers, Audit/configure firewall, Network tests, Disk cleanup, Disk management (format, encrypt, clone), VHDs, Tasks, Printers, Network adapters, BIOS, Defender, SSL, Remote desktop, BitLocker, Disk health, System policies, Restore points

Chapter 6: Network Admin (Scripts 181–220) - Port scans, Outage detection, Monitor traffic, Network map, IP allocation, DNS/DHCP settings, VPN, Latency, Wi-Fi, Server uptime, Shares, FTP automation, Proxy settings, Export network config, Server logs, VLANs, Rogue devices, MAC filtering, Static IP, Router backups, SSH, Load balancers, Failover, Interface health, QoS, Bandwidth, NTP, Firewall logs, Security scans, DNS issues, Switch configs

Chapter 7: Advanced Automation (Scripts 221–250) - Incident response, Cloud config, Azure VMs, AWS resources, Database backups/maintenance, Email, Slack, Sync, CI/CD, App logs, Compliance, Anomalies, IoT devices, API integration, DevOps, DSC, Containers, Kubernetes, VM snapshots, Hyper-V, Docker, Certificates, Encryption, Key rotation, Git, Security, Compliance, Patch management, End-of-day reports

Conclusion - Summary of the book, Tips for extending scripts
Introduction to
PowerShell for IT
Professionals
Overview of PowerShell
and its importance in IT
PowerShell is a powerful task
automation and configuration
management framework developed by
Microsoft. It consists of a
command-line shell and associated
scripting language built on the
.NET Framework. Since its
introduction in 2006, PowerShell
has become an essential tool for IT
professionals, system
administrators, and developers
working in Windows environments and
beyond.
Key features of PowerShell:
1. Command-line interface (CLI):
PowerShell provides a robust CLI
for executing commands and
scripts.
2. Scripting language: It includes a
full-featured scripting language
for creating complex automation
scripts.
3. Object-oriented: PowerShell works
with .NET objects, allowing for
rich data manipulation and
analysis.
4. Extensible: Users can create
custom cmdlets and modules to
extend PowerShell's
functionality.
5. Cross-platform: With PowerShell
Core, it's now available on
Windows, macOS, and Linux.
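The object-oriented nature of PowerShell is worth a quick illustration: cmdlets emit .NET objects rather than text, so you can inspect members with Get-Member and use properties directly in calculations.

```powershell
# Cmdlet output is .NET objects; Get-Member reveals their type and members
Get-Process | Get-Member -MemberType Property | Select-Object -First 5

# Properties can be accessed directly for further computation
$proc = Get-Process -Id $PID
Write-Output "This PowerShell process uses $([math]::Round($proc.WorkingSet64 / 1MB, 2)) MB"
```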
Importance in IT:
1. Automation: PowerShell enables IT
professionals to automate
repetitive tasks, saving time and
reducing human error.
2. System administration: It
provides powerful tools for
managing Windows systems, Active
Directory, and other Microsoft
technologies.
3. Cloud management: PowerShell is
crucial for managing cloud
services like Azure and Office
365.
4. DevOps: It plays a significant
role in DevOps practices,
facilitating infrastructure as
code and continuous
integration/continuous deployment
(CI/CD) pipelines.
5. Security: PowerShell is essential
for security auditing, threat
hunting, and incident response.
6. Reporting: It offers robust
capabilities for generating
reports and analyzing system
data.
Evolution of PowerShell:
PowerShell 1.0: Introduced in
2006 for Windows XP SP2, Windows
Server 2003 SP1, and Windows
Vista.
PowerShell 2.0: Released in 2009,
adding remoting capabilities and
script modules.
PowerShell 3.0: Launched in 2012
with Windows 8 and Windows Server
2012, introducing workflow
functionality and improved
performance.
PowerShell 4.0: Released in 2013
with Windows 8.1 and Windows
Server 2012 R2, adding Desired
State Configuration (DSC).
PowerShell 5.0: Introduced in
2016 with Windows 10, adding
class support and package
management.
PowerShell Core 6.0: Released in
2018 as open-source and cross-
platform.
PowerShell 7.0+: The latest
versions, combining Windows
PowerShell features with the
cross-platform capabilities of
PowerShell Core.
Setting up your
environment for scripting
To get started with PowerShell
scripting, you'll need to set up
your environment properly. This
section will guide you through the
process of installing PowerShell,
configuring your development
environment, and preparing for
script execution.
Installing PowerShell:
1. Windows:
PowerShell is pre-installed on
modern Windows systems.
To get the latest version,
download it from the Microsoft
Store or the GitHub releases
page.
2. macOS:
Install Homebrew package manager
if not already installed.
Run brew install --cask powershell in
Terminal.
3. Linux:
Follow distribution-specific
instructions from the official
Microsoft documentation.
Generally involves adding
Microsoft's repository and
installing via package manager.
Configuring your development
environment:
1. Integrated Development
Environment (IDE):
Visual Studio Code (VS Code) with
PowerShell extension is highly
recommended.
Install VS Code from the official website.
Install the PowerShell extension
from the VS Code marketplace.
2. PowerShell Integrated Scripting
Environment (ISE):
Built-in Windows tool for
PowerShell scripting (not
available on PowerShell Core).
Launch by typing "PowerShell ISE"
in the Start menu.
3. Text editors:
Notepad++, Sublime Text, or Atom
can be used with appropriate
plugins.
Configuring execution policy:
PowerShell's execution policy
determines which scripts can be run
on your system. To allow script
execution:
1. Open PowerShell as Administrator.
2. Run Get-ExecutionPolicy to check the
current policy.
3. Set a less restrictive policy
with Set-ExecutionPolicy RemoteSigned.
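The steps above combined into a single snippet (RemoteSigned allows locally written scripts to run while requiring downloaded scripts to be signed; the -Scope CurrentUser variant shown here limits the change to your own user and does not require elevation):

```powershell
# Check the current policy
Get-ExecutionPolicy

# Allow local scripts; downloaded scripts must be signed
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser
```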
Setting up profile scripts:
Profile scripts run automatically
when PowerShell starts, allowing
you to customize your environment.
1. Create a profile script:
if (!(Test-Path -Path $PROFILE)) {
    New-Item -ItemType File -Path $PROFILE -Force
}
2. Edit the profile script:
notepad $PROFILE
3. Add customizations, such as:
# Set aliases
Set-Alias -Name np -Value notepad.exe

# Custom prompt
function prompt {
    "PS $($executionContext.SessionState.Path.CurrentLocation)$('>' * ($nestedPromptLevel + 1)) "
}

# Import modules
Import-Module PSReadLine
Setting up version control:
1. Install Git from the official website.
2. Configure Git with your name and email:
git config --global user.name "Your Name"
git config --global user.email "your.email@example.com"
3. Initialize a Git repository for
your PowerShell projects:
mkdir PowerShellScripts
cd PowerShellScripts
git init
Creating a script template:
Create a template for new scripts
to ensure consistency:
<#
.SYNOPSIS
Brief description of the script.
.DESCRIPTION
Detailed description of the script.
.PARAMETER Param1
Description of Param1.
.EXAMPLE
Example usage of the script.
.NOTES
Author: Your Name
Date: $(Get-Date -Format "yyyy-MM-dd")
#>
[CmdletBinding()]
param (
[Parameter(Mandatory=$true)]
[string]$Param1
)
# Script logic goes here
Save this as ScriptTemplate.ps1 and use
it as a starting point for new
scripts.
Basic scripting
fundamentals
Understanding the fundamentals of
PowerShell scripting is crucial for
creating effective and efficient
scripts. This section covers the
basic concepts and structures
you'll use in your PowerShell
scripts.
Variables and data types:
PowerShell uses variables to store
and manipulate data. Variables in
PowerShell are denoted by a $ sign.
# Declaring variables
$name = "John Doe"
$age = 30
$isAdmin = $true
# Strongly typing variables
[string]$stringVar = "Hello"
[int]$intVar = 42
[bool]$boolVar = $false
# Arrays
$fruits = @("apple", "banana", "orange")
# Hashtables
$person = @{
Name = "Alice"
Age = 25
City = "New York"
}
Operators:
PowerShell supports various
operators for comparison,
arithmetic, and logical operations.
# Arithmetic operators
$sum = 5 + 3
$difference = 10 - 7
$product = 4 * 6
$quotient = 15 / 3
# Comparison operators
$isEqual = 5 -eq 5
$isGreater = 10 -gt 5
$isLess = 3 -lt 7
# Logical operators
$andResult = $true -and $false
$orResult = $true -or $false
$notResult = -not $false
Control structures:
Control structures allow you to
control the flow of your script
based on conditions or to repeat
actions.
1. If-Else statements:
$age = 18
if ($age -ge 18) {
Write-Output "You are an adult"
} elseif ($age -ge 13) {
Write-Output "You are a teenager"
} else {
Write-Output "You are a child"
}
2. Switch statements:
$fruit = "apple"
switch ($fruit) {
"apple" { Write-Output "It's an apple" }
"banana" { Write-Output "It's a banana" }
"orange" { Write-Output "It's an orange"
}
default { Write-Output "Unknown fruit" }
}
3. Loops:
ForEach-Object loop:
$numbers = 1..5
$numbers | ForEach-Object {
Write-Output "Number: $_"
}
For loop:
for ($i = 1; $i -le 5; $i++) {
Write-Output "Iteration $i"
}
While loop:
$counter = 0
while ($counter -lt 5) {
Write-Output "Counter: $counter"
$counter++
}
Do-While loop:
$number = 0
do {
    $number = Get-Random -Minimum 1 -Maximum 10
    Write-Output "Generated number: $number"
} while ($number -ne 7)
Functions:
Functions allow you to group code
into reusable blocks. PowerShell
supports both simple and advanced
functions.
1. Simple function:
function Get-Greeting {
param (
[string]$Name
)
return "Hello, $Name!"
}
$greeting = Get-Greeting -Name "Alice"
Write-Output $greeting
2. Advanced function:
function Get-ComputerInfo {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory=$true)]
        [string]$ComputerName
    )
    begin {
        Write-Verbose "Starting to gather information for $ComputerName"
    }
    process {
        try {
            $os = Get-WmiObject -Class Win32_OperatingSystem -ComputerName $ComputerName
            $cpu = Get-WmiObject -Class Win32_Processor -ComputerName $ComputerName
            $memory = Get-WmiObject -Class Win32_PhysicalMemory -ComputerName $ComputerName |
                Measure-Object -Property Capacity -Sum
            $result = [PSCustomObject]@{
                ComputerName  = $ComputerName
                OSVersion     = $os.Version
                CPUName       = $cpu.Name
                TotalMemoryGB = [math]::Round($memory.Sum / 1GB, 2)
            }
            return $result
        }
        catch {
            Write-Error "Failed to retrieve information for ${ComputerName}: $_"
        }
    }
    end {
        Write-Verbose "Finished gathering information for $ComputerName"
    }
}

$info = Get-ComputerInfo -ComputerName "localhost" -Verbose
$info
Error handling:
Proper error handling is crucial
for creating robust scripts.
PowerShell provides several
mechanisms for handling errors.
1. Try-Catch-Finally:
try {
$result = 10 / 0
}
catch [System.DivideByZeroException] {
Write-Error "Division by zero occurred"
}
catch {
Write-Error "An error occurred: $_"
}
finally {
Write-Output "This will always execute"
}
2. ErrorActionPreference:
$ErrorActionPreference = "Stop" # This will
cause all errors to be terminating
try {
Get-Content "C:\NonExistentFile.txt"
}
catch {
Write-Output "File not found: $_"
}
3. LASTEXITCODE:
ping localhost
if ($LASTEXITCODE -eq 0) {
Write-Output "Ping successful"
} else {
Write-Output "Ping failed"
}
Get-Process -Name "NonExistentProcess"
if (-not $?) {
Write-Output "The last command failed"
}
Working with files and
folders:
PowerShell provides cmdlets for
working with files and folders:
# Create a new folder
New-Item -Path "C:\Temp\NewFolder" -ItemType Directory

# Create a new file
New-Item -Path "C:\Temp\NewFile.txt" -ItemType File

# Write content to a file
Set-Content -Path "C:\Temp\NewFile.txt" -Value "Hello, World!"

# Append content to a file
Add-Content -Path "C:\Temp\NewFile.txt" -Value "This is a new line"

# Read content from a file
$content = Get-Content -Path "C:\Temp\NewFile.txt"

# Copy a file
Copy-Item -Path "C:\Temp\NewFile.txt" -Destination "C:\Temp\NewFolder\NewFileCopy.txt"

# Move a file
Move-Item -Path "C:\Temp\NewFile.txt" -Destination "C:\Temp\NewFolder\NewFile.txt"

# Delete a file
Remove-Item -Path "C:\Temp\NewFolder\NewFileCopy.txt"

# Get file properties
Get-ItemProperty -Path "C:\Temp\NewFolder\NewFile.txt"
Working with CSV and XML
files:
PowerShell has built-in cmdlets for
working with CSV and XML files:
1. CSV files:
# Export data to CSV
$data = @(
    [PSCustomObject]@{Name="John";  Age=30; City="New York"},
    [PSCustomObject]@{Name="Alice"; Age=25; City="London"},
    [PSCustomObject]@{Name="Bob";   Age=35; City="Paris"}
)
$data | Export-Csv -Path "C:\Temp\People.csv" -NoTypeInformation

# Import data from CSV
$importedData = Import-Csv -Path "C:\Temp\People.csv"
$importedData
2. XML files:
# Export data to XML
$data = @(
    [PSCustomObject]@{Name="John";  Age=30; City="New York"},
    [PSCustomObject]@{Name="Alice"; Age=25; City="London"},
    [PSCustomObject]@{Name="Bob";   Age=35; City="Paris"}
)
$data | Export-Clixml -Path "C:\Temp\People.xml"

# Import data from XML
$importedData = Import-Clixml -Path "C:\Temp\People.xml"
$importedData
Regular expressions:
PowerShell supports regular
expressions for pattern matching
and text manipulation:
# Match a pattern
$text = "The quick brown fox jumps over the
lazy dog"
$pattern = "quick.*fox"
if ($text -match $pattern) {
Write-Output "Pattern found:
$($Matches[0])"
}
# Replace text
$newText = $text -replace "lazy", "energetic"
Write-Output $newText
# Split text
$words = $text -split "\s+"
$words
# Using the -match operator with capture
groups
$logEntry = "2023-04-15 10:30:45 ERROR: File not found"
if ($logEntry -match "(\d{4}-\d{2}-\d{2})
(\d{2}:\d{2}:\d{2}) (\w+): (.+)") {
$date = $Matches[1]
$time = $Matches[2]
$level = $Matches[3]
$message = $Matches[4]
Write-Output "Date: $date"
Write-Output "Time: $time"
Write-Output "Level: $level"
Write-Output "Message: $message"
}
Understanding cmdlets,
modules, and script files
PowerShell's functionality is built
around cmdlets, modules, and script
files. Understanding these
components is essential for
effective PowerShell scripting and
automation.
Cmdlets:
Cmdlets are lightweight commands in
PowerShell that perform specific
operations. They are typically
implemented in .NET and follow a
verb-noun naming convention.
1. Anatomy of a cmdlet:
Verb: Describes the action (e.g.,
Get, Set, New, Remove)
Noun: Describes the resource
(e.g., Process, Service, Item)
Example: Get-Process, Set-Location, New-
Item
2. Common cmdlets:
# Get running processes
Get-Process
# Get services
Get-Service
# Get content of a file
Get-Content -Path "C:\Temp\example.txt"
# Create a new directory
New-Item -Path "C:\NewFolder" -ItemType
Directory
# Remove a file
Remove-Item -Path "C:\Temp\OldFile.txt"
3. Getting help for cmdlets:
# Get help for a specific cmdlet
Get-Help Get-Process
# Get detailed help with examples
Get-Help Get-Process -Detailed
# Get full help documentation
Get-Help Get-Process -Full
# Get online help
Get-Help Get-Process -Online
4. Using parameters:
# Using named parameters
Get-Process -Name "chrome"
# Using positional parameters
Get-Process chrome
# Using switch parameters
Get-Process -Name "chrome" -FileVersionInfo
5. Pipeline:
Cmdlets can be combined using the
pipeline operator | to create
powerful commands:
Get-Process | Where-Object { $_.CPU -gt 10 }
| Sort-Object CPU -Descending | Select-Object
-First 5
Modules:
Modules are packages that contain
cmdlets, functions, variables, and
other resources. They extend
PowerShell's functionality and
allow for better organization of
code.
1. Built-in modules:
# List available modules
Get-Module -ListAvailable
# Import a module
Import-Module ActiveDirectory
# Get commands in a module
Get-Command -Module ActiveDirectory
2. Installing modules from
PowerShell Gallery:
# Install a module
Install-Module -Name PSReadLine
# Update a module
Update-Module -Name PSReadLine
# Uninstall a module
Uninstall-Module -Name PSReadLine
3. Creating a custom module:
# Create a new module manifest
New-ModuleManifest -Path "C:\MyModule\MyModule.psd1" -RootModule "MyModule.psm1" -Author "Your Name"

# Create the module script file (single-quoted here-string so $Name is not expanded)
@'
function Get-Greeting {
    param([string]$Name)
    return "Hello, $Name!"
}

function Get-Farewell {
    param([string]$Name)
    return "Goodbye, $Name!"
}

Export-ModuleMember -Function Get-Greeting, Get-Farewell
'@ | Set-Content -Path "C:\MyModule\MyModule.psm1"

# Import the custom module
Import-Module "C:\MyModule\MyModule.psd1"

# Use the custom module functions
Get-Greeting -Name "Alice"
Get-Farewell -Name "Bob"
Script files:
Script files are text files
containing PowerShell commands and
code. They allow you to save and
reuse complex sequences of
commands.
1. Creating a script file:
Create a new file with a .ps1
extension
Add your PowerShell code to the
file
Save the file
2. Running a script file:
# Run a script file
.\MyScript.ps1
# Run a script file with parameters
.\MyScript.ps1 -Param1 "Value1" -Param2
"Value2"
3. Script parameters:
# MyScript.ps1
param (
[Parameter(Mandatory=$true)]
[string]$Name,
[int]$Age = 30
)
Write-Output "Hello, $Name! You are $Age
years old."
# Run the script
.\MyScript.ps1 -Name "Alice" -Age 25
4. Dot-sourcing scripts:
Dot-sourcing allows you to run a
script in the current scope, making
its functions and variables
available in your session:
# Dot-source a script
. .\MyFunctions.ps1
# Now you can use functions defined in
MyFunctions.ps1
5. Script blocks:
Script blocks are portions of code
that can be stored in variables and
passed as arguments:
$scriptBlock = {
param($x, $y)
return $x + $y
}
$result = & $scriptBlock 5 3
Write-Output $result # Output: 8
6. Using functions in scripts:
# MyFunctions.ps1
function Get-Square {
param([int]$Number)
return $Number * $Number
}
function Get-Cube {
param([int]$Number)
return $Number * $Number * $Number
}
# MyScript.ps1
. .\MyFunctions.ps1
$number = 5
$square = Get-Square -Number $number
$cube = Get-Cube -Number $number
Write-Output "The square of $number is
$square"
Write-Output "The cube of $number is $cube"
7. Error handling in scripts:
# ErrorHandling.ps1
[CmdletBinding()]
param (
[Parameter(Mandatory=$true)]
[string]$Path
)
try {
$content = Get-Content -Path $Path -
ErrorAction Stop
Write-Output "File contents:"
Write-Output $content
}
catch [System.Management.Automation.ItemNotFoundException] {
Write-Error "File not found: $Path"
}
catch {
Write-Error "An error occurred: $_"
}
finally {
Write-Output "Script execution
completed."
}
# Run the script
.\ErrorHandling.ps1 -Path "C:\Temp\example.txt"
8. Using advanced functions in
scripts:
# AdvancedFunction.ps1
function Get-ComputerInfo {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory=$true, ValueFromPipeline=$true)]
        [string[]]$ComputerName
    )
    begin {
        Write-Verbose "Starting computer information gathering"
    }
    process {
        foreach ($computer in $ComputerName) {
            try {
                $os = Get-WmiObject -Class Win32_OperatingSystem -ComputerName $computer
                $cpu = Get-WmiObject -Class Win32_Processor -ComputerName $computer
                $memory = Get-WmiObject -Class Win32_PhysicalMemory -ComputerName $computer |
                    Measure-Object -Property Capacity -Sum
                [PSCustomObject]@{
                    ComputerName  = $computer
                    OSVersion     = $os.Version
                    CPUName       = $cpu.Name
                    TotalMemoryGB = [math]::Round($memory.Sum / 1GB, 2)
                }
            }
            catch {
                Write-Error "Failed to retrieve information for ${computer}: $_"
            }
        }
    }
    end {
        Write-Verbose "Finished computer information gathering"
    }
}

# Usage
"localhost", "remote-computer" | Get-ComputerInfo -Verbose
9. Creating a script module:
# MyModule.psm1
function Get-Greeting {
param([string]$Name)
return "Hello, $Name!"
}
function Get-Farewell {
param([string]$Name)
return "Goodbye, $Name!"
}
Export-ModuleMember -Function Get-Greeting,
Get-Farewell
# MyScript.ps1
Import-Module .\MyModule.psm1
$name = "Alice"
$greeting = Get-Greeting -Name $name
$farewell = Get-Farewell -Name $name
Write-Output $greeting
Write-Output $farewell
10. Using script signing for security:
# Generate a self-signed certificate
$cert = New-SelfSignedCertificate -Subject "CN=PowerShell Code Signing" -Type CodeSigning -CertStoreLocation Cert:\CurrentUser\My

# Sign a script
Set-AuthenticodeSignature -FilePath .\MyScript.ps1 -Certificate $cert

# Set execution policy to require signed scripts
Set-ExecutionPolicy AllSigned

# Run the signed script
.\MyScript.ps1
By mastering cmdlets, modules, and
script files, you'll be able to
create powerful and reusable
PowerShell solutions for various IT
tasks and automation scenarios.
Remember to always follow best
practices, such as proper error
handling, input validation, and
code documentation, to ensure your
scripts are robust and
maintainable.
Chapter 1: PowerShell
Basics (Scripts 1–20)
1. Display System
Information
This script retrieves and displays
essential system information,
providing a quick overview of the
computer's hardware and software
configuration.
Get-ComputerInfo | Select-Object WindowsProductName, WindowsVersion,
    OsHardwareAbstractionLayer, CsManufacturer, CsModel, CsProcessors,
    CsNumberOfLogicalProcessors, CsNumberOfProcessors, CsTotalPhysicalMemory
This script uses the Get-ComputerInfo
cmdlet to gather system information
and then selects specific
properties to display:
WindowsProductName: The name of
the Windows operating system
WindowsVersion: The version
number of the Windows operating
system
OsHardwareAbstractionLayer: The
Hardware Abstraction Layer (HAL)
version
CsManufacturer: The computer
manufacturer
CsModel: The computer model
CsProcessors: Information about
the processors
CsNumberOfLogicalProcessors: The
number of logical processors
CsNumberOfProcessors: The number
of physical processors
CsTotalPhysicalMemory: The total
amount of physical memory
This information can be useful for
troubleshooting, inventory
management, or simply understanding
the capabilities of the system
you're working with.
2. List All Installed
Programs
This script retrieves and displays
a list of all installed programs on
the system, which can be helpful
for software inventory or
troubleshooting purposes.
Get-WmiObject -Class Win32_Product |
    Select-Object Name, Version, Vendor |
    Sort-Object Name |
    Format-Table -AutoSize
Here's a breakdown of the script:
Get-WmiObject -Class Win32_Product: This
cmdlet retrieves information
about installed software products
using the Windows Management
Instrumentation (WMI) class
Win32_Product.
Select-Object Name, Version, Vendor: This
selects only the Name, Version,
and Vendor properties from the
retrieved information.
Sort-Object Name: This sorts the list
alphabetically by the program
name.
Format-Table -AutoSize: This formats the
output as a table and
automatically adjusts the column
widths.
Note that using Win32_Product can
be slow on some systems and may
trigger a consistency check of
installed programs. An alternative
method is to query the registry:
Get-ItemProperty HKLM:\Software\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall\* |
    Select-Object DisplayName, DisplayVersion, Publisher, InstallDate |
    Where-Object { $_.DisplayName -ne $null } |
    Sort-Object DisplayName |
    Format-Table -AutoSize
This alternative method is faster
and doesn't trigger a consistency
check, but it may not capture all
installed programs, especially
those installed for the current
user only.
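To also pick up per-user installs, the same query can be run against the current-user hive alongside the machine-wide keys (a sketch; all three paths are standard Windows uninstall registry keys):

```powershell
# Query machine-wide (64-bit and 32-bit) and per-user uninstall keys
$paths = @(
    "HKLM:\Software\Microsoft\Windows\CurrentVersion\Uninstall\*",
    "HKLM:\Software\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*",
    "HKCU:\Software\Microsoft\Windows\CurrentVersion\Uninstall\*"
)
Get-ItemProperty -Path $paths -ErrorAction SilentlyContinue |
    Where-Object { $_.DisplayName } |
    Select-Object DisplayName, DisplayVersion, Publisher |
    Sort-Object DisplayName -Unique |
    Format-Table -AutoSize
```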
3. Find System Uptime
This script calculates and displays
the system's uptime, which is
useful for monitoring system
stability and tracking reboot
cycles.
$bootuptime = (Get-CimInstance -ClassName Win32_OperatingSystem).LastBootUpTime
$currenttime = Get-Date
$uptime = $currenttime - $bootuptime
$uptimeFormatted = "{0} days, {1} hours, {2} minutes, {3} seconds" -f $uptime.Days, $uptime.Hours, $uptime.Minutes, $uptime.Seconds
Write-Output "System Uptime: $uptimeFormatted"
Here's how the script works:
1. It retrieves the last boot-up time from the Win32_OperatingSystem CIM class using Get-CimInstance.
2. It gets the current date and
time.
3. It calculates the difference
between the current time and the
last boot-up time.
4. It formats the uptime into a
readable string showing days,
hours, minutes, and seconds.
5. Finally, it displays the
formatted uptime.
This information can be crucial for
identifying systems that haven't
been rebooted in a long time, which
might indicate stability issues or
pending updates that require a
restart.
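On PowerShell 6 and later, the built-in Get-Uptime cmdlet gives the same result in one line (not available in Windows PowerShell 5.1):

```powershell
# Get-Uptime returns the elapsed time since boot as a TimeSpan
Get-Uptime

# Or show the boot time itself
Get-Uptime -Since
```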
4. Get Disk Usage
Statistics
This script provides information
about disk usage, which is
essential for managing storage
resources and identifying potential
space issues.
Get-WmiObject Win32_LogicalDisk -Filter "DriveType=3" |
    Select-Object DeviceID,
        @{Name="Size(GB)";Expression={[math]::Round($_.Size/1GB,2)}},
        @{Name="FreeSpace(GB)";Expression={[math]::Round($_.FreeSpace/1GB,2)}},
        @{Name="UsedSpace(GB)";Expression={[math]::Round(($_.Size - $_.FreeSpace)/1GB,2)}},
        @{Name="PercentFree";Expression={[math]::Round(($_.FreeSpace / $_.Size) * 100,2)}} |
    Format-Table -AutoSize
This script does the following:
1. Uses Get-WmiObject to retrieve
information about logical disks,
filtering for only fixed drives
(DriveType=3).
2. Selects the DeviceID (drive
letter) and calculates the
following:
Total size in GB
Free space in GB
Used space in GB
Percentage of free space
3. Formats the output as a table
with automatically sized columns.
The script uses calculated
properties to convert bytes to
gigabytes and calculate
percentages, rounding the results
to two decimal places for
readability.
This information is crucial for
proactive storage management,
helping to identify drives that are
running low on space before they
cause issues.
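Note that Get-WmiObject is deprecated and absent from PowerShell 7; Get-CimInstance is the recommended replacement. The same query, sketched with the CIM cmdlet:

```powershell
# CIM-based equivalent; works on Windows PowerShell 5.1 and PowerShell 7
Get-CimInstance -ClassName Win32_LogicalDisk -Filter "DriveType=3" |
    Select-Object DeviceID,
        @{Name="Size(GB)";Expression={[math]::Round($_.Size/1GB,2)}},
        @{Name="FreeSpace(GB)";Expression={[math]::Round($_.FreeSpace/1GB,2)}} |
    Format-Table -AutoSize
```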
5. Retrieve IP
Configuration Details
This script retrieves and displays
detailed IP configuration
information for all network
adapters, which is useful for
network troubleshooting and
configuration verification.
Get-NetIPConfiguration |
    Select-Object InterfaceAlias, InterfaceDescription, IPv4Address, IPv6Address, DNSServer |
    Format-Table -AutoSize
Here's what the script does:
1. retrieves the IP
Get-NetIPConfiguration
configuration for all network
interfaces.
2. Select-Object is used to choose
specific properties:
InterfaceAlias: The friendly name
of the network interface
InterfaceDescription: A more
detailed description of the
interface
IPv4Address: The IPv4 address
assigned to the interface
IPv6Address: The IPv6 address
assigned to the interface
DNSServer: The DNS servers
configured for the interface
3. Format-Table -AutoSize formats the output as a table with automatically sized columns.
This script provides a quick
overview of the network
configuration, which can be
invaluable when diagnosing network
issues or verifying network
settings across multiple machines.
6. Show All Running
Processes
This script displays information
about all currently running
processes on the system, which is
useful for monitoring system
activity and troubleshooting
performance issues.
Get-Process |
Select-Object Name, ID, CPU, WorkingSet,
Description |
Sort-Object CPU -Descending |
Format-Table -AutoSize
Here's a breakdown of the script:
1. Get-Process retrieves information about all running processes.
2. Select-Object is used to choose specific properties:
Name: The name of the process
ID: The process ID
CPU: The amount of processor time used by the process
WorkingSet: The amount of physical memory used by the process
Description: A description of the process (if available)
3. Sort-Object CPU -Descending sorts the processes by CPU usage in descending order.
4. Format-Table -AutoSize formats the output as a table with automatically sized columns.
This script provides a snapshot of
current system activity,
highlighting which processes are
consuming the most CPU resources.
It can be particularly useful for
identifying resource-intensive
applications or potential malware
activity.
7. Export Running
Processes to a CSV
This script exports information
about running processes to a CSV
file, which can be useful for
record-keeping, analysis, or
sharing process information with
others.
$csvPath = "$env:USERPROFILE\Desktop\Processes.csv"

Get-Process |
    Select-Object Name, ID, CPU, WorkingSet, Description |
    Sort-Object CPU -Descending |
    Export-Csv -Path $csvPath -NoTypeInformation

Write-Output "Process information exported to $csvPath"
Here's how the script works:
1. It defines the path for the CSV
file, saving it to the user's
desktop.
2. Get-Process retrieves information
about all running processes.
3. Select-Object chooses specific
properties (Name, ID, CPU,
WorkingSet, Description).
4. Sort-Object CPU -Descending sorts the
processes by CPU usage in
descending order.
5. Export-Csv exports the data to the
specified CSV file.
6. The -NoTypeInformation parameter
removes the type information from
the CSV file, making it cleaner
and more compatible with other
applications.
7. Finally, it outputs a message
confirming where the file was
saved.
This script allows you to capture a
snapshot of running processes that
can be easily analyzed in
spreadsheet software or imported
into other tools for further
processing.
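The exported CSV can also be pulled straight back into PowerShell for ad-hoc analysis. A minimal sketch, assuming the $csvPath variable defined above:

```powershell
# Sketch: re-import the snapshot and show the five largest memory consumers.
# Note: Import-Csv returns strings, so WorkingSet is cast for numeric sorting.
Import-Csv -Path $csvPath |
    Sort-Object { [double]$_.WorkingSet } -Descending |
    Select-Object -First 5 Name, ID, WorkingSet
```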
8. Kill a Process by Name
or ID
This script provides a way to
terminate a process either by its
name or ID, which can be useful for
stopping unresponsive applications
or managing system resources.
param (
    [Parameter(Mandatory=$true, ParameterSetName="ByName")]
    [string]$ProcessName,

    [Parameter(Mandatory=$true, ParameterSetName="ById")]
    [int]$ProcessId
)

if ($ProcessName) {
    Get-Process -Name $ProcessName | ForEach-Object {
        $_ | Stop-Process -Force
        Write-Output "Process '$($_.Name)' with ID $($_.Id) has been terminated."
    }
}
elseif ($ProcessId) {
    try {
        $process = Get-Process -Id $ProcessId -ErrorAction Stop
        $process | Stop-Process -Force
        Write-Output "Process '$($process.Name)' with ID $ProcessId has been terminated."
    }
    catch {
        Write-Error "No process found with ID $ProcessId."
    }
}
This script uses parameters to
allow for flexible usage:
You can terminate a process by
name: .\KillProcess.ps1 -ProcessName
"notepad"
Or by ID: .\KillProcess.ps1 -ProcessId 1234
Here's how it works:
1. The script defines two parameter
sets: one for process name and
one for process ID.
2. If a process name is provided, it
uses Get-Process to find all
processes with that name and
terminates them using Stop-Process -Force.
3. If a process ID is provided, it
attempts to find and terminate
the specific process with that
ID.
4. In both cases, it outputs a
message confirming the
termination of the process(es).
5. If no process is found with the
given ID, it outputs an error
message.
This script provides a flexible way
to terminate processes, which can
be particularly useful in scripts
for application management or
system maintenance.
9. Display the Current
Logged-in User
This script retrieves and displays
information about the currently
logged-in user, which can be useful
for auditing, troubleshooting, or
personalizing scripts.
$currentUser = [System.Security.Principal.WindowsIdentity]::GetCurrent()
$userPrincipal = New-Object System.Security.Principal.WindowsPrincipal($currentUser)
$adminRole = [System.Security.Principal.WindowsBuiltInRole]::Administrator

$userInfo = @{
    "Username" = $currentUser.Name
    "SID"      = $currentUser.User.Value
    "Domain"   = $currentUser.Name.Split('\')[0]
    "IsAdmin"  = $userPrincipal.IsInRole($adminRole)
}

$userInfo | Format-Table -AutoSize
Here's what the script does:
1. It gets the current user's identity using [WindowsIdentity]::GetCurrent().
2. It creates a WindowsPrincipal object
to check for role membership.
3. It defines the Administrator
role.
4. It creates a hashtable with user
information:
Username: The full username
SID: The Security Identifier of
the user
Domain: The domain or computer
name
IsAdmin: Whether the user has
administrator privileges
5. Finally, it displays this
information in a formatted table.
This script provides a quick way to
verify the current user context,
which can be crucial when running
scripts that require specific
permissions or when troubleshooting
user-specific issues.
10. Create a Text File
Using a Script
This script demonstrates how to
create a text file and write
content to it using PowerShell,
which is a common task in many
automation scenarios.
param (
[Parameter(Mandatory=$true)]
[string]$FilePath,
[Parameter(Mandatory=$true)]
[string]$Content
)
try {
    # Ensure the directory exists
    $directory = Split-Path -Path $FilePath -Parent
    if (-not (Test-Path -Path $directory)) {
        New-Item -ItemType Directory -Path $directory -Force | Out-Null
    }

    # Create the file and write the content
    Set-Content -Path $FilePath -Value $Content -Force
    Write-Output "File created successfully at: $FilePath"
}
catch {
    Write-Error "An error occurred: $_"
}
This script does the following:
1. It takes two mandatory
parameters:
$FilePath: The full path where the
file should be created
$Content: The text content to write
to the file
2. It checks if the directory exists
and creates it if it doesn't.
3. It uses Set-Content to create the
file and write the content.
4. If successful, it outputs a
confirmation message.
5. If an error occurs, it outputs an
error message.
You can use this script like this:
.\CreateTextFile.ps1 -FilePath "C:\Temp\newfile.txt" -Content "This is the content of the new file."
This script is useful for creating
log files, configuration files, or
any scenario where you need to
programmatically generate text
files.
11. Rename a File
This script provides a way to
rename a file, which is a common
operation in file management and
automation tasks.
param (
[Parameter(Mandatory=$true)]
[string]$OldName,
[Parameter(Mandatory=$true)]
[string]$NewName
)
try {
    # Check if the source file exists
    if (-not (Test-Path -Path $OldName)) {
        throw "The file '$OldName' does not exist."
    }

    # Get the directory of the file
    $directory = Split-Path -Path $OldName -Parent

    # Combine the directory with the new filename
    $newPath = Join-Path -Path $directory -ChildPath $NewName

    # Rename the file
    Rename-Item -Path $OldName -NewName $newPath -Force
    Write-Output "File renamed successfully from '$OldName' to '$newPath'"
}
catch {
    Write-Error "An error occurred: $_"
}
Here's how the script works:
1. It takes two mandatory
parameters:
$OldName: The current name (or
path) of the file
$NewName: The new name for the file
2. It checks if the source file
exists.
3. It determines the directory of
the source file.
4. It constructs the new full path
for the file.
5. It uses Rename-Item to rename the
file.
6. If successful, it outputs a
confirmation message.
7. If an error occurs, it outputs an
error message.
You can use this script like this:
.\RenameFile.ps1 -OldName "C:\Temp\oldname.txt" -NewName "newname.txt"
This script is useful for batch
renaming operations, organizing
files, or as part of larger file
management scripts.
12. Copy a File to
Another Directory
This script demonstrates how to
copy a file from one location to
another, which is a fundamental
operation in file management and
data backup scenarios.
param (
[Parameter(Mandatory=$true)]
[string]$SourcePath,
[Parameter(Mandatory=$true)]
[string]$DestinationPath,
[switch]$Overwrite
)
try {
    # Check if the source file exists
    if (-not (Test-Path -Path $SourcePath)) {
        throw "The source file '$SourcePath' does not exist."
    }

    # Ensure the destination directory exists
    $destDir = Split-Path -Path $DestinationPath -Parent
    if (-not (Test-Path -Path $destDir)) {
        New-Item -ItemType Directory -Path $destDir -Force | Out-Null
    }

    # Copy the file
    Copy-Item -Path $SourcePath -Destination $DestinationPath -Force:$Overwrite
    Write-Output "File copied successfully from '$SourcePath' to '$DestinationPath'"
}
catch {
    Write-Error "An error occurred: $_"
}
Here's what the script does:
1. It takes three parameters:
$SourcePath: The path of the file to
be copied
$DestinationPath:
The path where the
file should be copied to
$Overwrite: An optional switch to
allow overwriting existing files
2. It checks if the source file
exists.
3. It ensures that the destination
directory exists, creating it if
necessary.
4. It uses Copy-Item to copy the file,
using the -Force parameter if
$Overwrite is specified.
5. If successful, it outputs a
confirmation message.
6. If an error occurs, it outputs an
error message.
You can use this script like this:
.\CopyFile.ps1 -SourcePath "C:\Source\report.txt" -DestinationPath "D:\Backup\report.txt" -Overwrite
This script is useful for backup
operations, distributing files
across a system, or as part of
larger file management workflows.
13. Move a File
This script demonstrates how to
move a file from one location to
another, which is useful for file
organization, archiving, or
workflow automation.
param (
[Parameter(Mandatory=$true)]
[string]$SourcePath,
[Parameter(Mandatory=$true)]
[string]$DestinationPath,
[switch]$Force
)
try {
    # Check if the source file exists
    if (-not (Test-Path -Path $SourcePath)) {
        throw "The source file '$SourcePath' does not exist."
    }

    # Ensure the destination directory exists
    $destDir = Split-Path -Path $DestinationPath -Parent
    if (-not (Test-Path -Path $destDir)) {
        New-Item -ItemType Directory -Path $destDir -Force | Out-Null
    }

    # Move the file
    Move-Item -Path $SourcePath -Destination $DestinationPath -Force:$Force
    Write-Output "File moved successfully from '$SourcePath' to '$DestinationPath'"
}
catch {
    Write-Error "An error occurred: $_"
}
Here's how the script works:
1. It takes three parameters:
$SourcePath: The current path of the
file
$DestinationPath:
The path where the
file should be moved to
$Force: An optional switch to force
the move operation, overwriting
existing files
2. It checks if the source file
exists.
3. It ensures that the destination
directory exists, creating it if
necessary.
4. It uses Move-Item to move the file,
using the -Force parameter if
specified.
5. If successful, it outputs a
confirmation message.
6. If an error occurs, it outputs an
error message.
You can use this script like this:
.\MoveFile.ps1 -SourcePath "C:\Temp\OldLocation\file.txt" -DestinationPath "D:\NewLocation\file.txt" -Force
This script is particularly useful
for automating file organization
tasks, such as moving processed
files to an archive folder or
reorganizing files based on certain
criteria.
14. Delete a File
This script provides a way to
delete a file, which is a common
operation in file management and
cleanup tasks.
param (
[Parameter(Mandatory=$true)]
[string]$FilePath,
[switch]$Force
)
try {
    # Check if the file exists
    if (-not (Test-Path -Path $FilePath)) {
        throw "The file '$FilePath' does not exist."
    }

    # Delete the file
    Remove-Item -Path $FilePath -Force:$Force
    Write-Output "File '$FilePath' has been deleted successfully."
}
catch {
    Write-Error "An error occurred: $_"
}
Here's what the script does:
1. It takes two parameters:
$FilePath: The path of the file to be deleted
$Force: An optional switch to force deletion of read-only files
2. It checks if the file exists.
3. It uses Remove-Item to delete the file, using the -Force parameter if specified.
4. If successful, it outputs a confirmation message.
5. If an error occurs, it outputs an error message.
You can use this script like this:
.\DeleteFile.ps1 -FilePath "C:\Temp\temp.txt" -Force
This script is useful for cleanup
operations, removing temporary
files, or as part of larger file
management workflows. The -Force
switch allows for deletion of read-
only files, which can be helpful in
certain scenarios but should be
used cautiously.
15. Get Folder Size
This script calculates and displays
the size of a specified folder,
including all its contents. This is
useful for disk space management
and reporting.
param (
[Parameter(Mandatory=$true)]
[string]$FolderPath
)
function Get-FolderSize {
    param ([string]$Path)
    $size = Get-ChildItem -Path $Path -Recurse -Force -ErrorAction SilentlyContinue |
        Measure-Object -Property Length -Sum
    return $size.Sum
}
try {
    # Check if the folder exists
    if (-not (Test-Path -Path $FolderPath -PathType Container)) {
        throw "The folder '$FolderPath' does not exist."
    }
$size = Get-FolderSize -Path $FolderPath
# Convert bytes to more readable format
if ($size -ge 1TB) {
$sizeString = "{0:N2} TB" -f ($size /
1TB)
}
elseif ($size -ge 1GB) {
$sizeString = "{0:N2} GB" -f ($size /
1GB)
}
elseif ($size -ge 1MB) {
$sizeString = "{0:N2} MB" -f ($size /
1MB)
}
elseif ($size -ge 1KB) {
$sizeString = "{0:N2} KB" -f ($size /
1KB)
}
else {
$sizeString = "{0:N0} Bytes" -f $size
}
    Write-Output "The size of folder '$FolderPath' is $sizeString"
}
catch {
Write-Error "An error occurred: $_"
}
Here's how the script works:
1. It takes a mandatory parameter
$FolderPath for the folder to be
measured.
2. It defines a function Get-FolderSize
that recursively calculates the
total size of all files in the
folder.
3. It checks if the specified folder
exists.
4. It calls Get-FolderSize to calculate
the total size.
5. It converts the size from bytes
to a more readable format (TB,
GB, MB, KB, or Bytes).
6. Finally, it outputs the folder
size in the most appropriate
unit.
You can use this script like this:
.\GetFolderSize.ps1 -FolderPath
"C:\Users\Username\Documents"
This script is particularly useful
for identifying large folders that
may be consuming excessive disk
space, aiding in storage management
tasks.
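When a drive is filling up, it often helps to rank subfolders rather than measure one folder at a time. A sketch of that idea, using the same Get-ChildItem and Measure-Object pattern as the script above (the "C:\Users" path is just an illustrative example):

```powershell
# Sketch: rank immediate subfolders by size to find the biggest space consumers.
Get-ChildItem -Path "C:\Users" -Directory | ForEach-Object {
    $bytes = (Get-ChildItem -Path $_.FullName -Recurse -File -Force -ErrorAction SilentlyContinue |
        Measure-Object -Property Length -Sum).Sum
    [pscustomobject]@{ Folder = $_.Name; SizeMB = [math]::Round($bytes / 1MB, 2) }
} | Sort-Object SizeMB -Descending
```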
16. Generate a Random
Password
This script generates a random
password based on specified
criteria, which can be useful for
creating secure temporary passwords
or for password policy enforcement.
param (
[int]$Length = 12,
[switch]$IncludeSpecialChars,
[switch]$IncludeNumbers,
[switch]$IncludeUppercase,
[switch]$IncludeLowercase
)
function Get-RandomPassword {
    param (
        [int]$Length,
        [char[]]$CharSet
    )
    $random = New-Object System.Random
    $password = 1..$Length | ForEach-Object { $CharSet[$random.Next(0, $CharSet.Length)] }
    return -join $password
}
try {
    $charSet = @()
    if ($IncludeSpecialChars) { $charSet += '!@#$%^&*()_-+=<>?/[]{}|'.ToCharArray() }
    if ($IncludeNumbers)      { $charSet += '0123456789'.ToCharArray() }
    if ($IncludeUppercase)    { $charSet += 'ABCDEFGHIJKLMNOPQRSTUVWXYZ'.ToCharArray() }
    if ($IncludeLowercase)    { $charSet += 'abcdefghijklmnopqrstuvwxyz'.ToCharArray() }

    if ($charSet.Count -eq 0) {
        throw "At least one character type must be selected."
    }

    $password = Get-RandomPassword -Length $Length -CharSet $charSet
    Write-Output "Generated Password: $password"
}
catch {
    Write-Error "An error occurred: $_"
}
Here's what the script does:
1. It takes several parameters:
$Length: The desired length of the
password (default is 12)
$IncludeSpecialChars: Switch to include
special characters
$IncludeNumbers: Switch to include
numbers
$IncludeUppercase: Switch to include
uppercase letters
$IncludeLowercase: Switch to include
lowercase letters
2. It defines a function Get-RandomPassword that generates a random string from a given character set.
3. Based on the switches provided,
it builds a character set for the
password.
4. It checks if at least one
character type is selected.
5. It generates the password using
the Get-RandomPassword function.
6. Finally, it outputs the generated
password.
You can use this script like this:
.\GenerateRandomPassword.ps1 -Length 16 -
IncludeSpecialChars -IncludeNumbers -
IncludeUppercase -IncludeLowercase
This script is useful for
generating secure passwords that
comply with specific complexity
requirements, which is valuable for
system administrators and security
professionals.
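One caveat: System.Random is a pseudo-random generator and is not suitable for passwords that must resist a determined attacker. A cryptographically stronger variant might look like the following sketch, which assumes PowerShell 7 or later (where the RandomNumberGenerator.GetInt32 method is available):

```powershell
# Sketch: same interface as Get-RandomPassword, but each index is drawn
# from the OS cryptographic random number generator instead of System.Random.
function Get-SecureRandomPassword {
    param (
        [int]$Length,
        [char[]]$CharSet
    )
    $password = 1..$Length | ForEach-Object {
        $index = [System.Security.Cryptography.RandomNumberGenerator]::GetInt32(0, $CharSet.Length)
        $CharSet[$index]
    }
    return -join $password
}
```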
17. Display Date and Time
in a Specific Format
This script demonstrates how to
format and display the current date
and time in various ways, which can
be useful for logging, reporting,
or creating standardized
timestamps.
param (
[Parameter(Mandatory=$true)]
[ValidateSet("Short", "Long", "ISO8601",
"RFC1123", "Custom")]
[string]$Format,
[Parameter(Mandatory=$false)]
[string]$CustomFormat
)
$currentDateTime = Get-Date

try {
    switch ($Format) {
        "Short" {
            $formattedDateTime = $currentDateTime.ToString("g")
        }
        "Long" {
            $formattedDateTime = $currentDateTime.ToString("F")
        }
        "ISO8601" {
            $formattedDateTime = $currentDateTime.ToString("yyyy-MM-ddTHH:mm:ss.fffZ")
        }
        "RFC1123" {
            $formattedDateTime = $currentDateTime.ToString("R")
        }
        "Custom" {
            if ([string]::IsNullOrEmpty($CustomFormat)) {
                throw "Custom format string must be provided when using the Custom format."
            }
            $formattedDateTime = $currentDateTime.ToString($CustomFormat)
        }
    }
    Write-Output "Formatted Date and Time: $formattedDateTime"
}
catch {
    Write-Error "An error occurred: $_"
}
Here's how the script works:
1. It takes two parameters:
$Format: A mandatory parameter to
specify the desired format
(Short, Long, ISO8601, RFC1123,
or Custom)
$CustomFormat: An optional parameter
for specifying a custom format
string
2. It gets the current date and time
using Get-Date.
3. Based on the specified format, it
formats the date and time:
Short: A concise, culture-
specific date and time string
Long: A more detailed, culture-
specific date and time string
ISO8601: The ISO 8601 standard
format
RFC1123: The RFC 1123 standard
format
Custom: Uses the provided custom
format string
4. It outputs the formatted date and
time string.
You can use this script like this:
.\FormatDateTime.ps1 -Format "ISO8601"
.\FormatDateTime.ps1 -Format "Custom" -
CustomFormat "yyyy-MM-dd HH:mm:ss"
This script is particularly useful
for generating consistent
timestamps across different systems
or for creating date/time strings
that conform to specific standards
or requirements.
18. Compare Two Files for
Differences
This script compares the contents
of two files and reports any
differences, which is useful for
file verification, change tracking,
or troubleshooting.
param (
[Parameter(Mandatory=$true)]
[string]$File1Path,
[Parameter(Mandatory=$true)]
[string]$File2Path
)
try {
    # Check if both files exist
    if (-not (Test-Path -Path $File1Path)) {
        throw "The file '$File1Path' does not exist."
    }
    if (-not (Test-Path -Path $File2Path)) {
        throw "The file '$File2Path' does not exist."
    }

    # Compare the files
    $differences = Compare-Object -ReferenceObject (Get-Content -Path $File1Path) `
        -DifferenceObject (Get-Content -Path $File2Path)

    if ($differences) {
        Write-Output "Differences found between the files:"
        foreach ($diff in $differences) {
            $indicator = if ($diff.SideIndicator -eq "<=") { "Only in $File1Path" } else { "Only in $File2Path" }
            Write-Output "[$indicator] $($diff.InputObject)"
        }
    }
    else {
        Write-Output "The files are identical."
    }
}
catch {
    Write-Error "An error occurred: $_"
}
Here's what the script does:
1. It takes two mandatory parameters:
$File1Path: The path to the first file
$File2Path: The path to the second file
2. It checks if both files exist.
3. It uses Compare-Object to compare the contents of the two files line by line.
4. If differences are found, it outputs each difference, indicating which file it's from.
5. If no differences are found, it reports that the files are identical.
You can use this script like this:
.\CompareFiles.ps1 -File1Path "C:\Path\To\file1.txt" -File2Path "C:\Path\To\file2.txt"
This script is particularly useful
for:
Verifying file integrity after
transfers or backups
Identifying changes between
different versions of a file
Troubleshooting configuration
files or logs
It provides a quick way to spot
differences between files without
having to manually review them line
by line.
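For large or binary files, a hash comparison is often faster than a line-by-line diff. A sketch using Get-FileHash (available since PowerShell 4.0), reusing the same $File1Path and $File2Path parameters:

```powershell
# Sketch: quick identity check; matching SHA256 hashes mean identical content,
# though a hash comparison cannot report *where* the files differ.
$hash1 = (Get-FileHash -Path $File1Path -Algorithm SHA256).Hash
$hash2 = (Get-FileHash -Path $File2Path -Algorithm SHA256).Hash
if ($hash1 -eq $hash2) {
    Write-Output "The files are identical."
} else {
    Write-Output "The files differ."
}
```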
19. Append Text to an
Existing File
This script demonstrates how to
append text to an existing file,
which is useful for logging,
updating configuration files, or
adding new information to existing
documents.
param (
[Parameter(Mandatory=$true)]
[string]$FilePath,
[Parameter(Mandatory=$true)]
[string]$TextToAppend
)
try {
    # Check if the file exists
    if (-not (Test-Path -Path $FilePath)) {
        throw "The file '$FilePath' does not exist."
    }

    # Append the text to the file
    Add-Content -Path $FilePath -Value $TextToAppend
    Write-Output "Text successfully appended to '$FilePath'"
}
catch {
    Write-Error "An error occurred: $_"
}
Here's how the script works:
1. It takes two mandatory
parameters:
$FilePath: The path to the file
where text will be appended
$TextToAppend: The text to be added
to the file
2. It checks if the specified file
exists.
3. It uses Add-Content to append the
provided text to the end of the
file.
4. If successful, it outputs a
confirmation message.
5. If an error occurs, it outputs an
error message.
You can use this script like this:
.\AppendToFile.ps1 -FilePath "C:\Logs\app.log" -TextToAppend "New log entry: $(Get-Date)"
This script is particularly useful
for:
Adding entries to log files
Updating configuration files with
new settings
Appending new data to existing
data files
It provides a simple way to add
content to files without
overwriting existing data, which is
crucial for many file management
and logging tasks.
20. Search for Specific
Text in a File
This script searches for specific
text within a file and reports the
lines where the text is found. This
is useful for analyzing log files,
finding configuration settings, or
locating specific information in
large text files.
param (
[Parameter(Mandatory=$true)]
[string]$FilePath,
[Parameter(Mandatory=$true)]
[string]$SearchText,
[switch]$CaseSensitive
)
try {
    # Check if the file exists
    if (-not (Test-Path -Path $FilePath)) {
        throw "The file '$FilePath' does not exist."
    }

    # Perform the search ($results avoids clashing with the automatic $matches variable)
    $results = if ($CaseSensitive) {
        Select-String -Path $FilePath -Pattern $SearchText -CaseSensitive
    } else {
        Select-String -Path $FilePath -Pattern $SearchText
    }

    # Output results
    if ($results) {
        Write-Output "Found $($results.Count) matches for '$SearchText' in '$FilePath':"
        foreach ($match in $results) {
            Write-Output "Line $($match.LineNumber): $($match.Line.Trim())"
        }
    } else {
        Write-Output "No matches found for '$SearchText' in '$FilePath'."
    }
}
catch {
    Write-Error "An error occurred: $_"
}
Here's how the script works:
1. It takes three parameters:
$FilePath: The path to the file to be searched
$SearchText: The text to search for within the file
$CaseSensitive: An optional switch to enable case-sensitive searching
2. It checks if the specified file exists.
3. It uses Select-String to search for the specified text in the file. If $CaseSensitive is specified, it performs a case-sensitive search.
4. It outputs the number of matches found and the content of each matching line, along with its line number.
5. If no matches are found, it outputs a message indicating this.
6. If an error occurs, it outputs an error message.
You can use this script like this:
.\SearchInFile.ps1 -FilePath "C:\Logs\app.log" -SearchText "error" -CaseSensitive
This script is particularly useful
for:
Analyzing log files for specific
events or errors
Finding configuration settings in
large configuration files
Locating specific information in
text-based data files
It provides a quick way to search
through files without having to
manually open and scan them, which
can be a time-saving tool for
system administrators and
developers.
These 20 scripts cover a wide range
of basic PowerShell operations that
are commonly used in IT
administration and automation
tasks. They demonstrate key
PowerShell concepts and provide a
foundation for more complex
scripting tasks. As you become more
comfortable with these basic
scripts, you can start combining
and expanding on them to create
more sophisticated automation
solutions tailored to your specific
needs.
Chapter 2: File and
Folder Management
(Scripts 21–50)
21. Create folders with
specific names
$folderNames = @("Projects", "Documents", "Images", "Backups")
$basePath = "C:\Users\YourUsername\Desktop"

foreach ($folderName in $folderNames) {
    $fullPath = Join-Path -Path $basePath -ChildPath $folderName
    if (-not (Test-Path -Path $fullPath)) {
        New-Item -Path $fullPath -ItemType Directory
        Write-Host "Created folder: $fullPath"
    } else {
        Write-Host "Folder already exists: $fullPath"
    }
}
This script creates folders with
specific names in a designated
location. It uses an array to store
the desired folder names and a base
path where the folders should be
created. The script then iterates
through the folder names,
constructs the full path for each
folder, and creates it if it
doesn't already exist.
Key points:
Uses the New-Item cmdlet to create
directories
Checks for existing folders to
avoid errors
Provides feedback on folder
creation or existence
22. List all files in a
directory
param (
[Parameter(Mandatory=$true)]
[string]$DirectoryPath
)
if (Test-Path -Path $DirectoryPath -PathType Container) {
    Get-ChildItem -Path $DirectoryPath -File |
        Select-Object Name, Length, LastWriteTime |
        Format-Table -AutoSize
} else {
    Write-Error "The specified path is not a valid directory."
}
This script lists all files in a
specified directory, displaying
their names, sizes, and last
modified dates. It uses the Get-
ChildItem cmdlet to retrieve file
information and formats the output
as a table for easy reading.
Key points:
Accepts a directory path as a
parameter
Validates the directory path
before processing
Uses Select-Object to choose specific
file properties
Formats the output as a table for
clarity
23. Search for files by
extension
param (
[Parameter(Mandatory=$true)]
[string]$DirectoryPath,
[Parameter(Mandatory=$true)]
[string]$Extension
)
if (Test-Path -Path $DirectoryPath -PathType Container) {
    $files = Get-ChildItem -Path $DirectoryPath -Recurse -File |
        Where-Object { $_.Extension -eq ".$Extension" }

    if ($files) {
        $files | Select-Object FullName, Length, LastWriteTime |
            Format-Table -AutoSize
    } else {
        Write-Host "No files with extension .$Extension found in the specified directory."
    }
} else {
    Write-Error "The specified path is not a valid directory."
}
This script searches for files with
a specific extension in a given
directory and its subdirectories.
It uses the Get-ChildItem cmdlet with
the -Recurse parameter to search
subdirectories, and filters the
results based on the specified file
extension.
Key points:
Accepts directory path and file
extension as parameters
Searches recursively through
subdirectories
Filters results using Where-Object
Displays file paths, sizes, and
last modified dates
24. Batch rename files
param (
[Parameter(Mandatory=$true)]
[string]$DirectoryPath,
[Parameter(Mandatory=$true)]
[string]$OldPattern,
[Parameter(Mandatory=$true)]
[string]$NewPattern
)
if (Test-Path -Path $DirectoryPath -PathType Container) {
    $files = Get-ChildItem -Path $DirectoryPath -File |
        Where-Object { $_.Name -match $OldPattern }

    foreach ($file in $files) {
        $newName = $file.Name -replace $OldPattern, $NewPattern
        Rename-Item -Path $file.FullName -NewName $newName
        Write-Host "Renamed: $($file.Name) to $newName"
    }
} else {
    Write-Error "The specified path is not a valid directory."
}
This script performs batch renaming
of files in a specified directory
based on a pattern. It uses regular
expressions to match and replace
parts of the file names.
Key points:
Accepts directory path, old
pattern, and new pattern as
parameters
Uses Where-Object to filter files
based on the old pattern
Utilizes the -replace operator for
string manipulation
Provides feedback on each renamed
file
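Because an overly broad pattern can rename far more than intended, it is worth previewing the operation first. Rename-Item supports -WhatIf, so a dry run of the same loop could look like this sketch (the "C:\Temp" path and the "draft"/"final" patterns are illustrative):

```powershell
# Sketch: preview the renames without changing anything; remove -WhatIf
# once the reported operations look correct.
Get-ChildItem -Path "C:\Temp" -File |
    Where-Object { $_.Name -match "draft" } |
    ForEach-Object {
        Rename-Item -Path $_.FullName -NewName ($_.Name -replace "draft", "final") -WhatIf
    }
```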
25. Zip a folder
param (
[Parameter(Mandatory=$true)]
[string]$FolderPath,
[Parameter(Mandatory=$true)]
[string]$ZipPath
)
if (Test-Path -Path $FolderPath -PathType Container) {
    try {
        Compress-Archive -Path $FolderPath -DestinationPath $ZipPath -Force
        Write-Host "Successfully zipped folder to: $ZipPath"
    } catch {
        Write-Error "Failed to zip folder: $_"
    }
} else {
    Write-Error "The specified folder path is not valid."
}
This script compresses a specified
folder into a zip file. It uses the
Compress-Archive cmdlet, which is
available in PowerShell 5.0 and
later versions.
Key points:
Accepts source folder path and
destination zip path as
parameters
Uses Compress-Archive for zipping
Includes error handling with try-
catch block
Provides feedback on successful
compression or failure
26. Unzip a file
param (
[Parameter(Mandatory=$true)]
[string]$ZipPath,
[Parameter(Mandatory=$true)]
[string]$ExtractPath
)
if (Test-Path -Path $ZipPath -PathType Leaf) {
    if (-not (Test-Path -Path $ExtractPath)) {
        New-Item -Path $ExtractPath -ItemType Directory
    }
    try {
        Expand-Archive -Path $ZipPath -DestinationPath $ExtractPath -Force
        Write-Host "Successfully unzipped file to: $ExtractPath"
    } catch {
        Write-Error "Failed to unzip file: $_"
    }
} else {
    Write-Error "The specified zip file path is not valid."
}
This script extracts the contents
of a zip file to a specified
directory. It uses the Expand-Archive
cmdlet, which is available in
PowerShell 5.0 and later versions.
Key points:
Accepts zip file path and
extraction path as parameters
Creates the extraction directory
if it doesn't exist
Uses Expand-Archive for unzipping
Includes error handling with try-
catch block
27. Copy files from one
directory to another
param (
[Parameter(Mandatory=$true)]
[string]$SourceDir,
[Parameter(Mandatory=$true)]
[string]$DestinationDir,
[string]$FilePattern = "*"
)
if ((Test-Path -Path $SourceDir -PathType Container) -and (Test-Path -Path $DestinationDir -PathType Container)) {
    try {
        $files = Get-ChildItem -Path $SourceDir -File -Filter $FilePattern
        foreach ($file in $files) {
            Copy-Item -Path $file.FullName -Destination $DestinationDir -Force
            Write-Host "Copied: $($file.Name)"
        }
        Write-Host "File copy operation completed."
    } catch {
        Write-Error "Failed to copy files: $_"
    }
} else {
    Write-Error "Source or destination directory is not valid."
}
This script copies files from one
directory to another, with an
optional file pattern filter. It
uses the Copy-Item cmdlet to perform
the file copy operation.
Key points:
Accepts source directory,
destination directory, and
optional file pattern as
parameters
Uses Get-ChildItem with -Filter for
file selection
Copies files individually,
providing feedback for each copy
operation
Includes error handling for
invalid directories or copy
failures
28. Move large files to a
specific location
param (
[Parameter(Mandatory=$true)]
[string]$SourceDir,
[Parameter(Mandatory=$true)]
[string]$DestinationDir,
[Parameter(Mandatory=$true)]
[int]$SizeThresholdMB
)
if ((Test-Path -Path $SourceDir -PathType Container) -and (Test-Path -Path $DestinationDir -PathType Container)) {
    try {
        $largeFiles = Get-ChildItem -Path $SourceDir -File |
            Where-Object { $_.Length -gt ($SizeThresholdMB * 1MB) }

        foreach ($file in $largeFiles) {
            Move-Item -Path $file.FullName -Destination $DestinationDir -Force
            Write-Host "Moved large file: $($file.Name) (Size: $([math]::Round($file.Length / 1MB, 2)) MB)"
        }
        Write-Host "Large file move operation completed."
    } catch {
        Write-Error "Failed to move large files: $_"
    }
} else {
    Write-Error "Source or destination directory is not valid."
}
This script moves files larger than
a specified size threshold from one
directory to another. It uses the
Move-Item cmdlet to perform the file
move operation.
Key points:
Accepts source directory,
destination directory, and size
threshold (in MB) as parameters
Filters files based on size using
Where-Object
Moves files individually,
providing feedback on each move
operation
Calculates and displays file
sizes in MB
29. Archive old files
param (
[Parameter(Mandatory=$true)]
[string]$SourceDir,
[Parameter(Mandatory=$true)]
[string]$ArchiveDir,
[Parameter(Mandatory=$true)]
[int]$DaysOld
)
if ((Test-Path -Path $SourceDir -PathType Container) -and (Test-Path -Path $ArchiveDir -PathType Container)) {
    $cutoffDate = (Get-Date).AddDays(-$DaysOld)
    try {
        $oldFiles = Get-ChildItem -Path $SourceDir -File |
            Where-Object { $_.LastWriteTime -lt $cutoffDate }
        foreach ($file in $oldFiles) {
            $destinationPath = Join-Path -Path $ArchiveDir -ChildPath $file.Name
            Move-Item -Path $file.FullName -Destination $destinationPath -Force
            Write-Host "Archived: $($file.Name) (Last modified: $($file.LastWriteTime))"
        }
        Write-Host "File archiving operation completed."
    } catch {
        Write-Error "Failed to archive files: $_"
    }
} else {
    Write-Error "Source or archive directory is not valid."
}
This script archives files that are
older than a specified number of
days by moving them to an archive
directory. It uses the LastWriteTime
property to determine file age.
Key points:
Accepts source directory, archive
directory, and age threshold (in
days) as parameters
Calculates a cutoff date based on
the current date and age
threshold
Filters files based on their last
write time
Moves files to the archive
directory, preserving their
original names
30. Monitor folder
changes
param (
[Parameter(Mandatory=$true)]
[string]$FolderPath,
[int]$DurationSeconds = 60
)
if (Test-Path -Path $FolderPath -PathType Container) {
    $watcher = New-Object System.IO.FileSystemWatcher
    $watcher.Path = $FolderPath
    $watcher.IncludeSubdirectories = $true
    $watcher.EnableRaisingEvents = $true
    $action = {
        $name = $Event.SourceEventArgs.Name
        $changeType = $Event.SourceEventArgs.ChangeType
        $timeStamp = $Event.TimeGenerated
        Write-Host "[$timeStamp] File $changeType : $name"
    }
    # Each registration gets an explicit SourceIdentifier so the
    # matching Unregister-Event calls below can find it.
    Register-ObjectEvent $watcher "Created" -SourceIdentifier FileCreated -Action $action
    Register-ObjectEvent $watcher "Changed" -SourceIdentifier FileChanged -Action $action
    Register-ObjectEvent $watcher "Deleted" -SourceIdentifier FileDeleted -Action $action
    Register-ObjectEvent $watcher "Renamed" -SourceIdentifier FileRenamed -Action $action
    Write-Host "Monitoring folder: $FolderPath for $DurationSeconds seconds..."
    Start-Sleep -Seconds $DurationSeconds
    Unregister-Event -SourceIdentifier FileCreated
    Unregister-Event -SourceIdentifier FileChanged
    Unregister-Event -SourceIdentifier FileDeleted
    Unregister-Event -SourceIdentifier FileRenamed
    $watcher.Dispose()
    Write-Host "Folder monitoring completed."
} else {
    Write-Error "The specified folder path is not valid."
}
This script monitors a specified folder for changes (create, modify, delete, rename) and reports them in real time. It uses the System.IO.FileSystemWatcher class to detect changes.
Key points:
Accepts folder path and
monitoring duration as parameters
Sets up a FileSystemWatcher object to
monitor the specified folder
Registers event handlers for
different types of file system
changes
Runs for a specified duration,
then cleans up event
registrations and disposes of the
watcher
31. Delete empty folders
param (
[Parameter(Mandatory=$true)]
[string]$RootPath
)
function Remove-EmptyFolders {
param (
[string]$Path
)
    Get-ChildItem -Path $Path -Directory | ForEach-Object {
        Remove-EmptyFolders -Path $_.FullName
        if (-not (Get-ChildItem -Path $_.FullName)) {
            Remove-Item -Path $_.FullName -Force
            Write-Host "Deleted empty folder: $($_.FullName)"
        }
    }
}
if (Test-Path -Path $RootPath -PathType
Container) {
Remove-EmptyFolders -Path $RootPath
Write-Host "Empty folder deletion
completed."
} else {
Write-Error "The specified root path is
not valid."
}
This script recursively deletes
empty folders within a specified
directory. It uses a recursive
function to traverse the directory
structure and remove folders that
don't contain any files or
subdirectories.
Key points:
Accepts a root path as a
parameter
Uses a recursive function to
check all subdirectories
Deletes folders only if they are
empty
Provides feedback on each deleted
folder
32. Compare files in two
folders
param (
[Parameter(Mandatory=$true)]
[string]$Folder1,
[Parameter(Mandatory=$true)]
[string]$Folder2
)
if ((Test-Path -Path $Folder1 -PathType Container) -and (Test-Path -Path $Folder2 -PathType Container)) {
    $files1 = Get-ChildItem -Path $Folder1 -File | Select-Object Name, Length, LastWriteTime
    $files2 = Get-ChildItem -Path $Folder2 -File | Select-Object Name, Length, LastWriteTime
    $comparison = Compare-Object -ReferenceObject $files1 -DifferenceObject $files2 -Property Name, Length, LastWriteTime
    if ($comparison) {
        Write-Host "Differences found:"
        foreach ($diff in $comparison) {
            $status = if ($diff.SideIndicator -eq "<=") { "Only in $Folder1" } else { "Only in $Folder2" }
            Write-Host "$($diff.Name) - $status"
        }
    } else {
        Write-Host "The folders contain identical files."
    }
} else {
    Write-Error "One or both of the specified folder paths are not valid."
}
This script compares the contents
of two folders, identifying files
that are present in one folder but
not the other, or files with
different sizes or modification
times.
Key points:
Accepts two folder paths as
parameters
Uses Get-ChildItem to retrieve file
information from both folders
Utilizes Compare-Object to identify
differences between the folder
contents
Reports differences, including
which folder contains the unique
or different files
33. Set folder
permissions
param (
[Parameter(Mandatory=$true)]
[string]$FolderPath,
[Parameter(Mandatory=$true)]
[string]$Username,
[Parameter(Mandatory=$true)]
[string]$Permission
)
if (Test-Path -Path $FolderPath -PathType Container) {
    try {
        $acl = Get-Acl -Path $FolderPath
        $accessRule = New-Object System.Security.AccessControl.FileSystemAccessRule($Username, $Permission, "ContainerInherit,ObjectInherit", "None", "Allow")
        $acl.SetAccessRule($accessRule)
        Set-Acl -Path $FolderPath -AclObject $acl
        Write-Host "Permission '$Permission' granted to user '$Username' on folder: $FolderPath"
    } catch {
        Write-Error "Failed to set folder permissions: $_"
    }
} else {
    Write-Error "The specified folder path is not valid."
}
This script sets specific permissions for a user on a given folder. It uses the System.Security.AccessControl.FileSystemAccessRule class to define the access rule and applies it to the folder's Access Control List (ACL).
Key points:
Accepts folder path, username,
and permission level as
parameters
Retrieves the current ACL of the
folder
Creates a new access rule based
on the provided parameters
Applies the new rule to the ACL
and sets it on the folder
34. Add a security group
to a folder
param (
[Parameter(Mandatory=$true)]
[string]$FolderPath,
[Parameter(Mandatory=$true)]
[string]$GroupName,
[Parameter(Mandatory=$true)]
[string]$Permission
)
if (Test-Path -Path $FolderPath -PathType Container) {
    try {
        $acl = Get-Acl -Path $FolderPath
        $accessRule = New-Object System.Security.AccessControl.FileSystemAccessRule($GroupName, $Permission, "ContainerInherit,ObjectInherit", "None", "Allow")
        $acl.AddAccessRule($accessRule)
        Set-Acl -Path $FolderPath -AclObject $acl
        Write-Host "Permission '$Permission' granted to group '$GroupName' on folder: $FolderPath"
    } catch {
        Write-Error "Failed to add security group to folder: $_"
    }
} else {
    Write-Error "The specified folder path is not valid."
}
This script adds a security group
to a folder's permissions. It's
similar to the previous script but
specifically targets security
groups instead of individual users.
Key points:
Accepts folder path, group name,
and permission level as
parameters
Creates a new access rule for the
specified group
Adds the new rule to the existing
ACL instead of replacing it
Applies the updated ACL to the
folder
35. Retrieve file
creation date
param (
[Parameter(Mandatory=$true)]
[string]$FilePath
)
if (Test-Path -Path $FilePath -PathType Leaf) {
    try {
        $file = Get-Item -Path $FilePath
        $creationTime = $file.CreationTime
        $lastWriteTime = $file.LastWriteTime
        $lastAccessTime = $file.LastAccessTime
        Write-Host "File: $($file.Name)"
        Write-Host "Creation Time: $creationTime"
        Write-Host "Last Write Time: $lastWriteTime"
        Write-Host "Last Access Time: $lastAccessTime"
    } catch {
        Write-Error "Failed to retrieve file information: $_"
    }
} else {
    Write-Error "The specified file path is not valid."
}
This script retrieves and displays
the creation date, last write date,
and last access date of a specified
file.
Key points:
Accepts a file path as a
parameter
Uses Get-Item to retrieve file
metadata
Displays creation time, last
write time, and last access time
Includes error handling for
invalid file paths
36. Generate a folder
tree structure
param (
    [Parameter(Mandatory=$true)]
    [string]$RootPath,
    [string]$OutputFile = "FolderTree.txt"
)
function Get-FolderTree {
    param (
        [string]$Path,
        [int]$Level = 0
    )
    $indent = "  " * $Level
    $folderName = Split-Path -Leaf $Path
    "$indent$folderName"
    Get-ChildItem -Path $Path -Directory | ForEach-Object {
        Get-FolderTree -Path $_.FullName -Level ($Level + 1)
    }
}
if (Test-Path -Path $RootPath -PathType Container) {
    try {
        $tree = Get-FolderTree -Path $RootPath
        $tree | Out-File -FilePath $OutputFile
        Write-Host "Folder tree structure has been saved to $OutputFile"
    } catch {
        Write-Error "Failed to generate folder tree: $_"
    }
} else {
    Write-Error "The specified root path is not valid."
}
This script generates a text-based
tree structure of folders starting
from a specified root directory. It
uses a recursive function to
traverse the directory structure
and create an indented
representation of the folder
hierarchy.
Key points:
Accepts root path and optional
output file path as parameters
Uses a recursive function to
build the folder tree
Indents subfolder names to
represent the hierarchy
Saves the generated tree
structure to a text file
37. Export file metadata
to a CSV
param (
    [Parameter(Mandatory=$true)]
    [string]$FolderPath,
    [string]$OutputFile = "FileMetadata.csv"
)
if (Test-Path -Path $FolderPath -PathType Container) {
    try {
        $files = Get-ChildItem -Path $FolderPath -File -Recurse
        $fileData = $files | Select-Object Name, FullName, Length, CreationTime, LastWriteTime, LastAccessTime,
            @{Name="SizeKB"; Expression={[math]::Round($_.Length / 1KB, 2)}}
        $fileData | Export-Csv -Path $OutputFile -NoTypeInformation
        Write-Host "File metadata has been exported to $OutputFile"
    } catch {
        Write-Error "Failed to export file metadata: $_"
    }
} else {
    Write-Error "The specified folder path is not valid."
}
This script exports metadata of all
files in a specified folder
(including subfolders) to a CSV
file. It includes information such
as file name, full path, size, and
various timestamps.
Key points:
Accepts folder path and optional
output file path as parameters
Uses Get-ChildItem with -Recurse to get
all files in the folder and
subfolders
Selects relevant file properties
and calculates size in KB
Exports the data to a CSV file
using Export-Csv
38. Sort files by size
param (
    [Parameter(Mandatory=$true)]
    [string]$FolderPath,
    [int]$TopN = 10
)
if (Test-Path -Path $FolderPath -PathType Container) {
    try {
        $files = Get-ChildItem -Path $FolderPath -File -Recurse
        $sortedFiles = $files | Sort-Object Length -Descending | Select-Object -First $TopN
        $results = $sortedFiles | Select-Object Name,
            @{Name="SizeMB"; Expression={[math]::Round($_.Length / 1MB, 2)}}, FullName
        $results | Format-Table -AutoSize
        Write-Host "Displayed the top $TopN largest files in $FolderPath"
    } catch {
        Write-Error "Failed to sort files by size: $_"
    }
} else {
    Write-Error "The specified folder path is not valid."
}
This script sorts files in a
specified folder (including
subfolders) by size and displays
the top N largest files.
Key points:
Accepts folder path and optional
number of files to display as
parameters
Uses Get-ChildItem with -Recurse to get
all files
Sorts files by size (Length) in
descending order
Calculates file sizes in MB for
better readability
Displays results in a formatted
table
39. Backup specific files
param (
[Parameter(Mandatory=$true)]
[string]$SourceFolder,
[Parameter(Mandatory=$true)]
[string]$BackupFolder,
[string[]]$FilePatterns = @("*.docx",
"*.xlsx", "*.pdf")
)
if ((Test-Path -Path $SourceFolder -PathType Container) -and (Test-Path -Path $BackupFolder -PathType Container)) {
    try {
        foreach ($pattern in $FilePatterns) {
            $files = Get-ChildItem -Path $SourceFolder -Recurse -File -Filter $pattern
            foreach ($file in $files) {
                $destinationPath = Join-Path -Path $BackupFolder -ChildPath $file.Name
                Copy-Item -Path $file.FullName -Destination $destinationPath -Force
                Write-Host "Backed up: $($file.Name)"
            }
        }
        Write-Host "Backup completed successfully."
    } catch {
        Write-Error "Failed to backup files: $_"
    }
} else {
    Write-Error "Source or backup folder path is not valid."
}
This script backs up specific file
types from a source folder to a
backup folder. It allows for
multiple file patterns to be
specified.
Key points:
Accepts source folder, backup
folder, and optional file
patterns as parameters
Uses Get-ChildItem with -Filter to
select files based on patterns
Copies files to the backup
folder, preserving their names
Provides feedback on each backed-
up file
40. Restore files from a
backup
param (
[Parameter(Mandatory=$true)]
[string]$BackupFolder,
[Parameter(Mandatory=$true)]
[string]$RestoreFolder,
[switch]$OverwriteExisting
)
if ((Test-Path -Path $BackupFolder -PathType Container) -and (Test-Path -Path $RestoreFolder -PathType Container)) {
    try {
        $backupFiles = Get-ChildItem -Path $BackupFolder -File
        foreach ($file in $backupFiles) {
            $destinationPath = Join-Path -Path $RestoreFolder -ChildPath $file.Name
            if ((Test-Path -Path $destinationPath) -and (-not $OverwriteExisting)) {
                Write-Host "Skipped (already exists): $($file.Name)"
            } else {
                Copy-Item -Path $file.FullName -Destination $destinationPath -Force
                Write-Host "Restored: $($file.Name)"
            }
        }
        Write-Host "Restore operation completed."
    } catch {
        Write-Error "Failed to restore files: $_"
    }
} else {
    Write-Error "Backup or restore folder path is not valid."
}
This script restores files from a
backup folder to a specified
restore folder. It includes an
option to overwrite existing files.
Key points:
Accepts backup folder, restore
folder, and an overwrite switch
as parameters
Checks for existing files in the
restore folder
Allows skipping or overwriting
existing files based on the
OverwriteExisting switch
Provides feedback on each
restored or skipped file
41. Remove duplicate
files
param (
[Parameter(Mandatory=$true)]
[string]$FolderPath
)
function Get-FileHashValue {
    param (
        [string]$FilePath
    )
    # Calls the built-in Get-FileHash cmdlet. Naming this helper
    # Get-FileHash would shadow the cmdlet and recurse infinitely.
    $hash = Get-FileHash -Path $FilePath -Algorithm MD5
    return $hash.Hash
}
if (Test-Path -Path $FolderPath -PathType Container) {
    try {
        $files = Get-ChildItem -Path $FolderPath -File -Recurse
        $uniqueFiles = @{}
        $duplicatesRemoved = 0
        foreach ($file in $files) {
            $hash = Get-FileHashValue -FilePath $file.FullName
            if ($uniqueFiles.ContainsKey($hash)) {
                Remove-Item -Path $file.FullName -Force
                Write-Host "Removed duplicate: $($file.FullName)"
                $duplicatesRemoved++
            } else {
                $uniqueFiles[$hash] = $file.FullName
            }
        }
        Write-Host "Duplicate removal completed. Removed $duplicatesRemoved files."
    } catch {
        Write-Error "Failed to remove duplicate files: $_"
    }
} else {
    Write-Error "The specified folder path is not valid."
}
This script removes duplicate files
within a specified folder and its
subfolders. It uses file hashing to
identify duplicates, keeping the
first occurrence of each unique
file.
Key points:
Accepts a folder path as a
parameter
Uses MD5 hashing to identify
duplicate files
Keeps track of unique files using
a hashtable
Removes duplicates and provides a
count of removed files
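Because deleting files is destructive, it can be safer to preview duplicates before removing anything. The sketch below (an alternative approach, not part of the original script) groups files by SHA-256 hash and only lists the duplicate sets:

```powershell
param (
    [Parameter(Mandatory=$true)]
    [string]$FolderPath
)
# Group files by content hash; any group with more than one
# member is a set of duplicates. Nothing is deleted here.
Get-ChildItem -Path $FolderPath -File -Recurse |
    Group-Object { (Get-FileHash -Path $_.FullName -Algorithm SHA256).Hash } |
    Where-Object { $_.Count -gt 1 } |
    ForEach-Object {
        Write-Host "Duplicate set ($($_.Count) files):"
        $_.Group | ForEach-Object { Write-Host "  $($_.FullName)" }
    }
```

SHA-256 also sidesteps the collision weaknesses of MD5, at the cost of slightly slower hashing.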
42. Encrypt a file or
folder
param (
    [Parameter(Mandatory=$true)]
    [string]$Path,
    [Parameter(Mandatory=$true)]
    [string]$Password
)
function Encrypt-File {
    param (
        [string]$FilePath,
        [string]$Password
    )
    $bytes = [System.IO.File]::ReadAllBytes($FilePath)
    $encryptedBytes = [System.Security.Cryptography.ProtectedData]::Protect(
        $bytes,
        [System.Text.Encoding]::UTF8.GetBytes($Password),
        [System.Security.Cryptography.DataProtectionScope]::CurrentUser)
    [System.IO.File]::WriteAllBytes("$FilePath.encrypted", $encryptedBytes)
    Remove-Item -Path $FilePath
    Write-Host "Encrypted: $FilePath"
}
if (Test-Path -Path $Path) {
    try {
        Add-Type -AssemblyName System.Security
        if ((Get-Item -Path $Path) -is [System.IO.DirectoryInfo]) {
            Get-ChildItem -Path $Path -File -Recurse | ForEach-Object {
                Encrypt-File -FilePath $_.FullName -Password $Password
            }
            Write-Host "Folder encryption completed: $Path"
        } else {
            Encrypt-File -FilePath $Path -Password $Password
            Write-Host "File encryption completed: $Path"
        }
    } catch {
        Write-Error "Encryption failed: $_"
    }
} else {
    Write-Error "The specified path is not valid."
}
This script encrypts a file or all files within a folder (including subfolders) using the Windows Data Protection API (DPAPI), with the password supplied as additional entropy. Because the scope is CurrentUser, the files can only be decrypted by the same user account on the same machine.
Key points:
Accepts a file/folder path and a password as parameters
Uses System.Security.Cryptography.ProtectedData (DPAPI) for encryption
Handles both single file and folder encryption
Removes the original file after encryption
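The DPAPI calls used above can be illustrated with a minimal in-memory round trip; the entropy bytes (derived here from the password) must match between Protect and Unprotect:

```powershell
Add-Type -AssemblyName System.Security

$entropy = [System.Text.Encoding]::UTF8.GetBytes("MySecret")   # password used as entropy
$plain   = [System.Text.Encoding]::UTF8.GetBytes("hello")

$cipher = [System.Security.Cryptography.ProtectedData]::Protect(
    $plain, $entropy,
    [System.Security.Cryptography.DataProtectionScope]::CurrentUser)

$roundTrip = [System.Security.Cryptography.ProtectedData]::Unprotect(
    $cipher, $entropy,
    [System.Security.Cryptography.DataProtectionScope]::CurrentUser)

[System.Text.Encoding]::UTF8.GetString($roundTrip)
```

Note that DPAPI keys are tied to the Windows user profile, so this is not portable encryption: the .encrypted files cannot be decrypted on another machine or by another account.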
43. Decrypt a file or
folder
param (
[Parameter(Mandatory=$true)]
[string]$Path,
[Parameter(Mandatory=$true)]
[string]$Password
)
function Decrypt-File {
    param (
        [string]$FilePath,
        [string]$Password
    )
    $encryptedBytes = [System.IO.File]::ReadAllBytes($FilePath)
    $decryptedBytes = [System.Security.Cryptography.ProtectedData]::Unprotect(
        $encryptedBytes,
        [System.Text.Encoding]::UTF8.GetBytes($Password),
        [System.Security.Cryptography.DataProtectionScope]::CurrentUser)
    $decryptedFilePath = $FilePath -replace '\.encrypted$', ''
    [System.IO.File]::WriteAllBytes($decryptedFilePath, $decryptedBytes)
    Remove-Item -Path $FilePath
    Write-Host "Decrypted: $FilePath"
}
if (Test-Path -Path $Path) {
    try {
        Add-Type -AssemblyName System.Security
        if ((Get-Item -Path $Path) -is [System.IO.DirectoryInfo]) {
            Get-ChildItem -Path $Path -File -Recurse -Filter "*.encrypted" | ForEach-Object {
                Decrypt-File -FilePath $_.FullName -Password $Password
            }
            Write-Host "Folder decryption completed: $Path"
        } else {
            Decrypt-File -FilePath $Path -Password $Password
            Write-Host "File decryption completed: $Path"
        }
    } catch {
        Write-Error "Decryption failed: $_"
    }
} else {
    Write-Error "The specified path is not valid."
}
This script decrypts a file or all
encrypted files within a folder
(including subfolders) using the
same password-based encryption
method used in the encryption
script.
Key points:
Accepts a file/folder path and a
password as parameters
Uses System.Security.Cryptography.ProtectedData (DPAPI) for decryption
Handles both single file and
folder decryption
Removes the ".encrypted"
extension from decrypted files
Removes the encrypted file after
successful decryption
44. Create shortcuts for
files
param (
[Parameter(Mandatory=$true)]
[string]$SourcePath,
[Parameter(Mandatory=$true)]
[string]$ShortcutPath
)
function Create-Shortcut {
    param (
        [string]$SourceFile,
        [string]$ShortcutFile
    )
    $WshShell = New-Object -ComObject WScript.Shell
    $Shortcut = $WshShell.CreateShortcut($ShortcutFile)
    $Shortcut.TargetPath = $SourceFile
    $Shortcut.Save()
    Write-Host "Created shortcut: $ShortcutFile"
}
if (Test-Path -Path $SourcePath) {
    try {
        if ((Get-Item -Path $SourcePath) -is [System.IO.DirectoryInfo]) {
            if (-not (Test-Path -Path $ShortcutPath)) {
                New-Item -Path $ShortcutPath -ItemType Directory
            }
            Get-ChildItem -Path $SourcePath -File | ForEach-Object {
                $shortcutFile = Join-Path -Path $ShortcutPath -ChildPath "$($_.BaseName).lnk"
                Create-Shortcut -SourceFile $_.FullName -ShortcutFile $shortcutFile
            }
            Write-Host "Shortcuts created for all files in: $SourcePath"
        } else {
            $shortcutFile = "$ShortcutPath.lnk"
            Create-Shortcut -SourceFile $SourcePath -ShortcutFile $shortcutFile
        }
    } catch {
        Write-Error "Failed to create shortcuts: $_"
    }
} else {
    Write-Error "The specified source path is not valid."
}
This script creates shortcuts for
files or all files within a folder.
It can create a shortcut for a
single file or multiple shortcuts
for all files in a directory.
Key points:
Accepts source path (file or
folder) and shortcut path as
parameters
Uses the WScript.Shell COM object to create shortcuts
Handles both single file and
folder scenarios
Creates shortcuts with the ".lnk"
extension
45. Monitor file access
param (
[Parameter(Mandatory=$true)]
[string]$FolderPath,
[int]$DurationSeconds = 300
)
function Start-FileSystemAudit {
    param (
        [string]$Path
    )
    $auditRights = [System.Security.AccessControl.FileSystemRights]::ReadData -bor
        [System.Security.AccessControl.FileSystemRights]::WriteData -bor
        [System.Security.AccessControl.FileSystemRights]::Delete
    # Constructor order: identity, rights, inheritance flags,
    # propagation flags, audit flags.
    $auditRule = New-Object System.Security.AccessControl.FileSystemAuditRule(
        "Everyone",
        $auditRights,
        "ContainerInherit,ObjectInherit",
        "None",
        "Success"
    )
    # Reading/writing SACLs requires administrative rights, and
    # object-access auditing must be enabled in the security policy.
    $acl = Get-Acl -Path $Path -Audit
    $acl.AddAuditRule($auditRule)
    Set-Acl -Path $Path -AclObject $acl
}
if (Test-Path -Path $FolderPath -PathType Container) {
    try {
        Start-FileSystemAudit -Path $FolderPath
        $startTime = Get-Date
        $endTime = $startTime.AddSeconds($DurationSeconds)
        Write-Host "Monitoring file access in $FolderPath for $DurationSeconds seconds..."
        while ((Get-Date) -lt $endTime) {
            Get-WinEvent -FilterHashtable @{
                LogName = 'Security'
                ID = 4663
                StartTime = $startTime
            } | Where-Object {
                $_.Properties[6].Value -like "$FolderPath*"
            } | ForEach-Object {
                $accessType = $_.Properties[4].Value
                $fileName = $_.Properties[6].Value
                $user = $_.Properties[1].Value
                Write-Host "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - $user accessed $fileName ($accessType)"
            }
            Start-Sleep -Seconds 5
        }
        Write-Host "File access monitoring completed."
    } catch {
        Write-Error "Failed to monitor file access: $_"
    }
} else {
    Write-Error "The specified folder path is not valid."
}
This script monitors file access
within a specified folder for a
given duration. It uses Windows
Event Logs to track file access
events.
Key points:
Accepts folder path and
monitoring duration as parameters
Enables file system auditing for
the specified folder
Continuously checks the Security
event log for file access events
Displays real-time information
about file accesses, including
user and access type
46. Split a large file
into smaller chunks
param (
[Parameter(Mandatory=$true)]
[string]$FilePath,
[Parameter(Mandatory=$true)]
[int]$ChunkSizeBytes
)
function Split-File {
    param (
        [string]$Path,
        [int]$ChunkSize
    )
    $file = Get-Item -Path $Path
    $buffer = New-Object byte[] $ChunkSize
    $partNumber = 1
    $reader = [System.IO.File]::OpenRead($file.FullName)
    while ($bytesRead = $reader.Read($buffer, 0, $buffer.Length)) {
        $outputPath = "{0}\{1}.part{2}" -f $file.DirectoryName, $file.Name, $partNumber.ToString("000")
        $writer = [System.IO.File]::Create($outputPath)
        $writer.Write($buffer, 0, $bytesRead)
        $writer.Close()
        Write-Host "Created part $partNumber: $outputPath"
        $partNumber++
    }
    $reader.Close()
}
if (Test-Path -Path $FilePath -PathType Leaf) {
    try {
        Split-File -Path $FilePath -ChunkSize $ChunkSizeBytes
        Write-Host "File splitting completed."
    } catch {
        Write-Error "Failed to split file: $_"
    }
} else {
    Write-Error "The specified file path is not valid."
}
This script splits a large file
into smaller chunks of a specified
size. It's useful for breaking down
large files for easier transfer or
storage.
Key points:
Accepts file path and chunk size
(in bytes) as parameters
Reads the file in chunks and
writes each chunk to a separate
file
Names output files with
sequential part numbers (e.g.,
filename.part001,
filename.part002)
Provides feedback on each created
file part
47. Merge multiple files
into one
param (
[Parameter(Mandatory=$true)]
[string]$FolderPath,
[Parameter(Mandatory=$true)]
[string]$OutputFile,
[string]$FilePattern = "*.part*"
)
function Merge-Files {
    param (
        [string]$SourceFolder,
        [string]$DestinationFile,
        [string]$Pattern
    )
    $files = Get-ChildItem -Path $SourceFolder -Filter $Pattern | Sort-Object Name
    $writer = [System.IO.File]::Create($DestinationFile)
    foreach ($file in $files) {
        $reader = [System.IO.File]::OpenRead($file.FullName)
        $reader.CopyTo($writer)
        $reader.Close()
        Write-Host "Merged: $($file.Name)"
    }
    $writer.Close()
}
if (Test-Path -Path $FolderPath -PathType Container) {
    try {
        Merge-Files -SourceFolder $FolderPath -DestinationFile $OutputFile -Pattern $FilePattern
        Write-Host "File merging completed. Output file: $OutputFile"
    } catch {
        Write-Error "Failed to merge files: $_"
    }
} else {
    Write-Error "The specified folder path is not valid."
}
This script merges multiple files
in a folder into a single output
file. It's particularly useful for
reassembling files that were
previously split into parts.
Key points:
Accepts folder path, output file
path, and optional file pattern
as parameters
Sorts input files by name to
ensure correct order
Reads each input file and writes
its contents to the output file
Provides feedback on each merged
file
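Scripts 46 and 47 are designed to work as a pair. Assuming they are saved as Split-LargeFile.ps1 and Merge-FileParts.ps1 (names chosen here for illustration), a round trip looks like this:

```powershell
# Split a large file into 10 MB parts, then reassemble it.
.\Split-LargeFile.ps1 -FilePath 'C:\Data\backup.iso' -ChunkSizeBytes 10MB

# The parts (backup.iso.part001, backup.iso.part002, ...) are
# written next to the source file, so merge from that folder.
.\Merge-FileParts.ps1 -FolderPath 'C:\Data' -OutputFile 'C:\Data\backup-restored.iso' -FilePattern 'backup.iso.part*'
```

The zero-padded part numbers matter: Merge-Files sorts parts alphabetically by name, and the padding keeps that order identical to the original byte order for up to 999 parts.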
48. Automate weekly
folder cleanup
param (
    [Parameter(Mandatory=$true)]
    [string]$FolderPath,
    [int]$DaysOld = 7
)
function Remove-OldFiles {
    param (
        [string]$Path,
        [int]$Days
    )
    $cutoffDate = (Get-Date).AddDays(-$Days)
    Get-ChildItem -Path $Path -Recurse | Where-Object {
        $_.LastWriteTime -lt $cutoffDate
    } | ForEach-Object {
        Remove-Item -Path $_.FullName -Force
        Write-Host "Removed: $($_.FullName)"
    }
}
if (Test-Path -Path $FolderPath -PathType Container) {
    try {
        Remove-OldFiles -Path $FolderPath -Days $DaysOld
        Write-Host "Folder cleanup completed."
    } catch {
        Write-Error "Failed to perform folder cleanup: $_"
    }
} else {
    Write-Error "The specified folder path is not valid."
}
# To schedule this script to run weekly:
# $action = New-ScheduledTaskAction -Execute "PowerShell.exe" -Argument "-File 'C:\Path\To\CleanupScript.ps1' -FolderPath 'C:\PathToClean' -DaysOld 7"
# $trigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Sunday -At 3am
# Register-ScheduledTask -Action $action -Trigger $trigger -TaskName "WeeklyFolderCleanup" -Description "Removes files older than 7 days every Sunday at 3 AM"
This script automates the process
of cleaning up old files in a
specified folder. It can be
scheduled to run weekly for regular
maintenance.
Key points:
Accepts folder path and age
threshold (in days) as parameters
Removes files and folders older
than the specified number of days
Includes commented-out code to
set up a weekly scheduled task
Provides feedback on each removed
item
49. Count files in a
folder
param (
[Parameter(Mandatory=$true)]
[string]$FolderPath,
[switch]$IncludeSubfolders,
[string]$FilePattern = "*"
)
function Get-FileCount {
    param (
        [string]$Path,
        [bool]$Recursive,
        [string]$Pattern
    )
    $params = @{
        Path = $Path
        File = $true
        Filter = $Pattern
    }
    if ($Recursive) {
        $params.Add("Recurse", $true)
    }
    $files = Get-ChildItem @params
    return $files.Count
}
if (Test-Path -Path $FolderPath -PathType Container) {
    try {
        $count = Get-FileCount -Path $FolderPath -Recursive $IncludeSubfolders -Pattern $FilePattern
        $scopeMessage = if ($IncludeSubfolders) { "including subfolders" } else { "excluding subfolders" }
        $patternMessage = if ($FilePattern -ne "*") { " matching pattern '$FilePattern'" } else { "" }
        # Braces around the variable names keep the trailing colon
        # from being parsed as a variable scope qualifier.
        Write-Host "Total files in '$FolderPath' ${scopeMessage}${patternMessage}: $count"
    } catch {
        Write-Error "Failed to count files: $_"
    }
} else {
    Write-Error "The specified folder path is not valid."
}
This script counts the number of
files in a specified folder, with
options to include subfolders and
filter by file pattern.
Key points:
Accepts folder path, subfolder
inclusion switch, and file
pattern as parameters
Uses Get-ChildItem with customizable
parameters for flexibility
Provides a detailed output
message describing the count
scope and any applied filters
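The script above builds its Get-ChildItem arguments in a hashtable and passes them with @params, a PowerShell technique called splatting. A minimal illustration:

```powershell
# Splatting: hashtable keys become parameter names,
# values become their arguments.
$params = @{
    Path   = 'C:\Windows'
    Filter = '*.exe'
    File   = $true
}
# Equivalent to: Get-ChildItem -Path 'C:\Windows' -Filter '*.exe' -File
Get-ChildItem @params | Select-Object -First 5 Name
```

Splatting makes it easy to add or remove parameters conditionally, which is exactly how the script toggles -Recurse.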
50. Generate a directory
report
param (
    [Parameter(Mandatory=$true)]
    [string]$FolderPath,
    [string]$OutputFile = "DirectoryReport.html"
)
function Get-FolderSize {
    param ([string]$Path)
    $size = (Get-ChildItem -Path $Path -Recurse -File | Measure-Object -Property Length -Sum).Sum
    return [math]::Round($size / 1MB, 2)
}
function Generate-Report {
    param (
        [string]$Path,
        [string]$OutputPath
    )
    $report = @"
<!DOCTYPE html>
<html>
<head>
<title>Directory Report: $Path</title>
<style>
    body { font-family: Arial, sans-serif; }
    table { border-collapse: collapse; width: 100%; }
    th, td { border: 1px solid #ddd; padding: 8px; }
    th { background-color: #f2f2f2; }
</style>
</head>
<body>
<h1>Directory Report: $Path</h1>
<h2>Generated on $(Get-Date -Format "yyyy-MM-dd HH:mm:ss")</h2>
<table>
<tr>
    <th>Name</th>
    <th>Type</th>
    <th>Size (MB)</th>
    <th>Last Modified</th>
</tr>
"@
    Get-ChildItem -Path $Path | ForEach-Object {
        $type = if ($_.PSIsContainer) { "Folder" } else { "File" }
        $size = if ($_.PSIsContainer) { Get-FolderSize -Path $_.FullName } else { [math]::Round($_.Length / 1MB, 2) }
        $report += @"
<tr>
    <td>$($_.Name)</td>
    <td>$type</td>
    <td>$size</td>
    <td>$($_.LastWriteTime)</td>
</tr>
"@
    }
    $report += @"
</table>
</body>
</html>
"@
    $report | Out-File -FilePath $OutputPath
}
if (Test-Path -Path $FolderPath -PathType Container) {
    try {
        Generate-Report -Path $FolderPath -OutputPath $OutputFile
        Write-Host "Directory report generated: $OutputFile"
    } catch {
        Write-Error "Failed to generate directory report: $_"
    }
} else {
    Write-Error "The specified folder path is not valid."
}
This script generates an HTML
report of the contents of a
specified directory, including file
and folder sizes, types, and last
modified dates.
Key points:
Accepts folder path and optional
output file path as parameters
Generates an HTML report with a
table of directory contents
Calculates folder sizes
recursively
Includes styling for better
readability of the HTML output
These scripts provide a
comprehensive set of tools for file
and folder management in
PowerShell, covering a wide range
of common tasks and more advanced
operations. They can be used as-is
or adapted to suit specific needs
in various IT administration and
automation scenarios.
Chapter 3: User
Management (Scripts 51–
90)
51. List all users on a
system
Get-LocalUser | Format-Table Name, Enabled,
LastLogon
This script uses the Get-LocalUser
cmdlet to retrieve all local user
accounts on the system. The output
is formatted as a table showing the
user's name, whether the account is
enabled, and the last logon time.
This is useful for quickly auditing
user accounts on a local machine.
52. Create a new user
account
$Username = "NewUser"
$Password = ConvertTo-SecureString "P@ssw0rd123!" -AsPlainText -Force
New-LocalUser -Name $Username -Password $Password -FullName "New User" -Description "Newly created user account"
This script creates a new local
user account. It sets a username
and password (which is converted to
a secure string), and then uses the
New-LocalUser cmdlet to create the
account. The full name and
description are optional parameters
that provide additional information
about the user.
53. Modify user account
properties
$Username = "ExistingUser"
Set-LocalUser -Name $Username -FullName "Updated Full Name" -Description "Updated user description" -UserMayChangePassword $true
This script modifies the properties
of an existing user account. It
uses the Set-LocalUser cmdlet to
update the full name, description,
and whether the user can change
their own password. You can modify
various other properties as needed.
54. Enable or disable a
user account
$Username = "TargetUser"
$Action = "Enable" # or "Disable"
if ($Action -eq "Enable") {
Enable-LocalUser -Name $Username
} else {
Disable-LocalUser -Name $Username
}
This script enables or disables a
local user account based on the
specified action. It uses the Enable-
LocalUser or Disable-LocalUser cmdlet
accordingly. This is useful for
temporarily restricting access
without deleting the account.
55. Delete a user account
$Username = "UserToDelete"
Remove-LocalUser -Name $Username -Confirm:$false
This script deletes a local user
account. It uses the Remove-LocalUser
cmdlet and suppresses the
confirmation prompt. Be cautious
when using this script, as it
permanently removes the user
account and associated data.
56. Reset a user's
password
$Username = "TargetUser"
$NewPassword = ConvertTo-SecureString "NewP@ssw0rd!" -AsPlainText -Force
Set-LocalUser -Name $Username -Password $NewPassword -PasswordNeverExpires $false
This script resets the password for a local user account. It converts the new password to a secure string and uses the Set-LocalUser cmdlet to update the password. Setting -PasswordNeverExpires to $false keeps the new password subject to the normal password-expiration policy, so the user will eventually be prompted to change it.
57. Unlock a locked-out
user account
$Username = "LockedUser"
Unlock-ADAccount -Identity $Username
This script unlocks a locked-out
Active Directory user account. It
uses the Unlock-ADAccount cmdlet, which
is part of the Active Directory
module. This is useful for quickly
restoring access for users who have
exceeded failed login attempts.
58. Check user account
expiration dates
Get-ADUser -Filter * -Properties Name, AccountExpirationDate |
    Where-Object { $_.AccountExpirationDate -ne $null } |
    Select-Object Name, AccountExpirationDate |
    Sort-Object AccountExpirationDate
This script retrieves all Active
Directory user accounts with
expiration dates set. It filters
the accounts, selects the name and
expiration date properties, and
sorts the results by expiration
date. This is helpful for
identifying accounts that may need
attention or renewal.
59. Assign group
memberships to a user
$Username = "TargetUser"
$GroupNames = @("Group1", "Group2", "Group3")
foreach ($GroupName in $GroupNames) {
    Add-ADGroupMember -Identity $GroupName -Members $Username
}
This script adds a user to multiple
Active Directory groups. It
iterates through an array of group
names and uses the Add-ADGroupMember
cmdlet to add the user to each
group. This is useful for quickly
assigning permissions and access
rights to a user.
60. Remove a user from a
group
$Username = "TargetUser"
$GroupName = "GroupToRemoveFrom"
Remove-ADGroupMember -Identity $GroupName -Members $Username -Confirm:$false
This script removes a user from an
Active Directory group. It uses the
Remove-ADGroupMember cmdlet and
suppresses the confirmation prompt.
This is helpful for revoking
specific access rights or
permissions from a user.
61. Export user details
to a CSV
Get-ADUser -Filter * -Properties Name, EmailAddress, Enabled, LastLogonDate |
    Select-Object Name, EmailAddress, Enabled, LastLogonDate |
    Export-Csv -Path "C:\ADUsers.csv" -NoTypeInformation
This script exports Active
Directory user details to a CSV
file. It retrieves all users,
selects specific properties, and
exports them using the Export-Csv
cmdlet. This is useful for creating
reports or performing bulk
operations on user data.
62. Import users from a
CSV
# CSV column names (Name, SamAccountName, UserPrincipalName, EmailAddress, Password) are assumed
$Users = Import-Csv -Path "C:\NewUsers.csv"
foreach ($User in $Users) {
    $SecurePassword = ConvertTo-SecureString $User.Password -AsPlainText -Force
    New-ADUser -Name $User.Name -SamAccountName $User.SamAccountName -UserPrincipalName $User.UserPrincipalName -EmailAddress $User.EmailAddress -Enabled $true -AccountPassword $SecurePassword
}
This script imports user data from
a CSV file and creates new Active
Directory user accounts. It reads
the CSV, iterates through each user
entry, and uses the New-ADUser cmdlet
to create the accounts. This is
helpful for bulk user creation or
migrating users from another
system.
63. Set user profile
paths
$Username = "TargetUser"
$ProfilePath = "\\Server\Profiles\$Username"
Set-ADUser -Identity $Username -ProfilePath $ProfilePath
This script sets the profile path
for an Active Directory user. It
uses the Set-ADUser cmdlet to update
the profile path attribute. This is
useful for configuring roaming
profiles or redirecting user
profiles to a specific location.
64. Assign permissions to
a user
$Username = "TargetUser"
$FolderPath = "C:\SharedFolder"
$AccessRule = New-Object System.Security.AccessControl.FileSystemAccessRule($Username, "Modify", "ContainerInherit,ObjectInherit", "None", "Allow")
$ACL = Get-Acl $FolderPath
$ACL.AddAccessRule($AccessRule)
Set-Acl $FolderPath $ACL
This script assigns file system
permissions to a user for a
specific folder. It creates a new
access rule, retrieves the current
ACL, adds the new rule, and applies
the updated ACL. This is helpful
for managing file and folder access
rights for users.
65. Find all disabled
accounts
Get-ADUser -Filter {Enabled -eq $false} -Properties Name, LastLogonDate |
    Select-Object Name, LastLogonDate |
    Sort-Object LastLogonDate
This script finds all disabled user
accounts in Active Directory. It
filters for disabled accounts,
retrieves the name and last logon
date, and sorts the results. This
is useful for auditing inactive
accounts or identifying accounts
that may need to be re-enabled or
removed.
66. Get last login
details of a user
$Username = "TargetUser"
Get-ADUser -Identity $Username -Properties LastLogonDate, LastBadPasswordAttempt |
    Select-Object Name, LastLogonDate, LastBadPasswordAttempt
This script retrieves the last
login details for a specific user.
It uses the Get-ADUser cmdlet to
fetch the last logon date and last
bad password attempt. This
information is helpful for
troubleshooting login issues or
monitoring user activity.
67. Audit user login
history
$StartDate = (Get-Date).AddDays(-30)
Get-WinEvent -FilterHashtable @{LogName='Security'; ID=4624; StartTime=$StartDate} |
    Where-Object {$_.Properties[5].Value -notlike '*$'} |
    Select-Object TimeCreated, @{Name='Username';Expression={$_.Properties[5].Value}}
This script audits user login
history for the past 30 days. It
retrieves successful login events
from the Windows Security log,
filters out system accounts, and
displays the timestamp and
username. This is useful for
security auditing and monitoring
user access patterns.
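Note that the numeric Properties indexes (5 for the username, and so on) are positional and can shift between Windows versions. A more robust, if slower, variant reads the fields by name from each event's XML — a sketch of the same 30-day audit:
$StartDate = (Get-Date).AddDays(-30)
Get-WinEvent -FilterHashtable @{LogName='Security'; ID=4624; StartTime=$StartDate} |
    ForEach-Object {
        # Look up event fields by name instead of by position
        $xml = [xml]$_.ToXml()
        $user = ($xml.Event.EventData.Data | Where-Object { $_.Name -eq 'TargetUserName' }).'#text'
        if ($user -notlike '*$') {
            [PSCustomObject]@{ TimeCreated = $_.TimeCreated; Username = $user }
        }
    }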
68. Automate password
expiration reminders
$DaysBeforeExpiry = 7
$Users = Get-ADUser -Filter {Enabled -eq $true -and PasswordNeverExpires -eq $false} -Properties PasswordLastSet, PasswordExpired, EmailAddress
foreach ($User in $Users) {
    $PasswordExpiryDate = $User.PasswordLastSet.AddDays((Get-ADDefaultDomainPasswordPolicy).MaxPasswordAge.Days)
    $DaysUntilExpiry = ($PasswordExpiryDate - (Get-Date)).Days
    if ($DaysUntilExpiry -le $DaysBeforeExpiry -and -not $User.PasswordExpired) {
        $Body = "Your password will expire in $DaysUntilExpiry days. Please change it soon."
        Send-MailMessage -To $User.EmailAddress -From "IT@yourdomain.com" -Subject "Password Expiration Reminder" -Body $Body -SmtpServer "smtp.yourdomain.com"
    }
}
This script automates password
expiration reminders. It identifies
users whose passwords are about to
expire within a specified number of
days and sends them an email
reminder. This helps prevent
account lockouts due to expired
passwords and encourages regular
password changes.
69. Notify inactive users
$InactiveDays = 30
$InactiveDate = (Get-Date).AddDays(-$InactiveDays)
$InactiveUsers = Get-ADUser -Filter {LastLogonDate -lt $InactiveDate -and Enabled -eq $true} -Properties LastLogonDate, EmailAddress
foreach ($User in $InactiveUsers) {
    $Body = "Your account has been inactive for more than $InactiveDays days. Please log in to keep your account active."
    Send-MailMessage -To $User.EmailAddress -From "IT@yourdomain.com" -Subject "Account Inactivity Notice" -Body $Body -SmtpServer "smtp.yourdomain.com"
}
This script identifies and notifies
users who have been inactive for a
specified period. It retrieves
users whose last logon date is
beyond the inactive threshold and
sends them an email notification.
This helps maintain account
security and encourages regular
account usage.
70. Bulk user creation
script
# CSV column names (Name, FirstName, LastName, Username, Email, Description, Department, Password) are assumed
$Users = Import-Csv -Path "C:\NewEmployees.csv"
foreach ($User in $Users) {
    $SecurePassword = ConvertTo-SecureString $User.Password -AsPlainText -Force
    $UPN = $User.Username + "@yourdomain.com"
    New-ADUser -Name $User.Name `
        -GivenName $User.FirstName `
        -Surname $User.LastName `
        -SamAccountName $User.Username `
        -UserPrincipalName $UPN `
        -EmailAddress $User.Email `
        -Description $User.Description `
        -AccountPassword $SecurePassword `
        -Enabled $true `
        -ChangePasswordAtLogon $true
    Add-ADGroupMember -Identity $User.Department -Members $User.Username
}
This script performs bulk user
creation based on data from a CSV
file. It creates Active Directory
user accounts, sets various
attributes, and adds the users to
their respective department groups.
This is extremely useful for
onboarding large numbers of new
employees or setting up accounts
for a new organization.
71. Generate reports on
user activities
$StartDate = (Get-Date).AddDays(-7)
$EndDate = Get-Date
$LoginEvents = Get-WinEvent -FilterHashtable @{
    LogName='Security'
    ID=4624
    StartTime=$StartDate
    EndTime=$EndDate
} | Where-Object {$_.Properties[5].Value -notlike '*$'}
$LoginReport = $LoginEvents | Group-Object {$_.Properties[5].Value} |
    Select-Object @{Name='Username';Expression={$_.Name}}, Count
$LoginReport | Export-Csv -Path "C:\LoginReport.csv" -NoTypeInformation
This script generates a report on
user login activities for the past
week. It retrieves login events
from the Windows Security log,
groups them by username, and
exports the results to a CSV file.
This report is valuable for
understanding user access patterns
and identifying unusual activity.
72. Identify admin
accounts on a system
Get-LocalGroupMember -Group "Administrators" |
    Where-Object {$_.ObjectClass -eq "User"} |
    Select-Object Name, PrincipalSource
Get-ADGroupMember -Identity "Domain Admins" |
    Get-ADUser -Properties Name, Enabled, LastLogonDate |
    Select-Object Name, Enabled, LastLogonDate
This script identifies both local
and domain administrator accounts.
It lists members of the local
Administrators group and retrieves
details of Domain Admins group
members. This information is
crucial for security audits and
ensuring that admin access is
properly controlled.
73. Monitor user group
changes
$Group = "ImportantGroup"
$InitialMembers = Get-ADGroupMember -Identity $Group | Select-Object -ExpandProperty SamAccountName
while ($true) {
    Start-Sleep -Seconds 300 # Check every 5 minutes
    $CurrentMembers = Get-ADGroupMember -Identity $Group | Select-Object -ExpandProperty SamAccountName
    $AddedMembers = Compare-Object -ReferenceObject $InitialMembers -DifferenceObject $CurrentMembers |
        Where-Object {$_.SideIndicator -eq "=>"}
    $RemovedMembers = Compare-Object -ReferenceObject $InitialMembers -DifferenceObject $CurrentMembers |
        Where-Object {$_.SideIndicator -eq "<="}
    if ($AddedMembers -or $RemovedMembers) {
        $Message = "Group membership changes detected for $Group`n"
        $Message += "Added: $($AddedMembers.InputObject -join ', ')`n"
        $Message += "Removed: $($RemovedMembers.InputObject -join ', ')"
        Send-MailMessage -To "admin@yourdomain.com" -From "monitoring@yourdomain.com" -Subject "Group Membership Change Alert" -Body $Message -SmtpServer "smtp.yourdomain.com"
        $InitialMembers = $CurrentMembers
    }
}
This script continuously monitors
changes in the membership of a
specified Active Directory group.
It compares the current group
members with the initial state and
sends an email alert if any changes
are detected. This is useful for
security monitoring and ensuring
that group memberships are not
altered without proper
authorization.
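As an alternative to polling, domain controllers already record membership changes in the Security log (event IDs 4728 and 4729 for members added to or removed from security-enabled global groups), provided account-management auditing is enabled. A sketch of an event-driven check, to be run on a domain controller:
# Review recent "member added/removed" events instead of polling the group
Get-WinEvent -FilterHashtable @{LogName='Security'; ID=4728,4729} -MaxEvents 50 |
    Select-Object TimeCreated, Id, Message
This surfaces who made each change as well, which the polling approach above cannot tell you.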
74. Clone a user account
function Clone-ADUser {
    param(
        [string]$SourceUser,
        [string]$NewUsername,
        [string]$NewFullName,
        [string]$NewEmail
    )
    $SourceUserObj = Get-ADUser -Identity $SourceUser -Properties *
    $SecurePassword = ConvertTo-SecureString "TempP@ss123!" -AsPlainText -Force
    New-ADUser -SamAccountName $NewUsername `
        -UserPrincipalName "$NewUsername@yourdomain.com" `
        -Name $NewFullName `
        -GivenName $NewFullName.Split()[0] `
        -Surname $NewFullName.Split()[-1] `
        -EmailAddress $NewEmail `
        -Enabled $true `
        -AccountPassword $SecurePassword `
        -ChangePasswordAtLogon $true
    $Groups = Get-ADPrincipalGroupMembership -Identity $SourceUser
    foreach ($Group in $Groups) {
        if ($Group.Name -ne "Domain Users") {
            Add-ADGroupMember -Identity $Group -Members $NewUsername
        }
    }
    Set-ADUser -Identity $NewUsername -Description $SourceUserObj.Description -Office $SourceUserObj.Office -Department $SourceUserObj.Department
}
Clone-ADUser -SourceUser "ExistingUser" -NewUsername "NewUser" -NewFullName "New User" -NewEmail "newuser@yourdomain.com"
This script defines a function to
clone an existing Active Directory
user account. It creates a new user
with similar properties and group
memberships as the source user.
This is particularly useful when
onboarding new employees who need
similar access rights as existing
users.
75. Generate temporary
login credentials
function New-TempCredentials {
    param(
        [string]$Username,
        [int]$ValidityDays = 1
    )
    # Build a 12-character random password from upper, lower, and digit ranges
    $TempPassword = -join ((65..90) + (97..122) + (48..57) | Get-Random -Count 12 | ForEach-Object {[char]$_})
    $SecurePassword = ConvertTo-SecureString $TempPassword -AsPlainText -Force
    $ExpiryDate = (Get-Date).AddDays($ValidityDays)
    Set-ADAccountPassword -Identity $Username -NewPassword $SecurePassword -Reset
    Set-ADUser -Identity $Username -ChangePasswordAtLogon $true
    Set-ADAccountExpiration -Identity $Username -DateTime $ExpiryDate
    return @{
        Username = $Username
        Password = $TempPassword
        ExpiryDate = $ExpiryDate
    }
}
$TempCreds = New-TempCredentials -Username "TempUser" -ValidityDays 2
Write-Output "Temporary credentials generated for $($TempCreds.Username)"
Write-Output "Password: $($TempCreds.Password)"
Write-Output "Expiry Date: $($TempCreds.ExpiryDate)"
This script generates temporary
login credentials for a user. It
creates a random password, sets it
for the specified user, forces a
password change at next logon, and
sets an account expiration date.
This is useful for providing
temporary access to contractors or
for emergency access situations.
76. Monitor login
attempts
$LogPath = "C:\Logs\FailedLogins.log"
$FilterXPath = "*[System[(EventID=4625)]]"
$Query = New-Object System.Diagnostics.Eventing.Reader.EventLogQuery("Security", [System.Diagnostics.Eventing.Reader.PathType]::LogName, $FilterXPath)
$Watcher = New-Object System.Diagnostics.Eventing.Reader.EventLogWatcher($Query)
Register-ObjectEvent -InputObject $Watcher -EventName EventRecordWritten -Action {
    $Event = $EventArgs.EventRecord
    $Username = $Event.Properties[5].Value
    $Workstation = $Event.Properties[13].Value
    $IPAddress = $Event.Properties[19].Value
    $LogEntry = "$(Get-Date) - Failed login attempt: User=$Username, Workstation=$Workstation, IP=$IPAddress"
    Add-Content -Path $LogPath -Value $LogEntry
    if ((Get-Content $LogPath | Measure-Object -Line).Lines -gt 1000) {
        Get-Content $LogPath | Select-Object -Skip 1 | Set-Content "$LogPath.tmp"
        Move-Item "$LogPath.tmp" $LogPath -Force
    }
}
$Watcher.Enabled = $true # Start watching for events
# Keep the script running
while ($true) { Start-Sleep -Seconds 60 }
This script monitors and logs
failed login attempts in real-time.
It watches the Windows Security
event log for failed login events,
extracts relevant information, and
logs it to a file. It also
implements a basic log rotation to
prevent the log file from growing
too large. This is crucial for
security monitoring and detecting
potential brute-force attacks.
77. Create roaming
profiles
function Set-RoamingProfile {
    param(
        [string]$Username,
        [string]$ProfilePath
    )
    $User = Get-ADUser -Identity $Username
    if (-not $User) {
        Write-Error "User $Username not found."
        return
    }
    if (-not (Test-Path $ProfilePath)) {
        New-Item -Path $ProfilePath -ItemType Directory -Force
    }
    $ACL = Get-Acl $ProfilePath
    $AccessRule = New-Object System.Security.AccessControl.FileSystemAccessRule($Username, "Modify", "ContainerInherit,ObjectInherit", "None", "Allow")
    $ACL.AddAccessRule($AccessRule)
    Set-Acl $ProfilePath $ACL
    Set-ADUser -Identity $Username -ProfilePath "\\server\profiles\$Username"
}
Set-RoamingProfile -Username "JohnDoe" -ProfilePath "\\server\profiles\JohnDoe"
This script sets up a roaming
profile for an Active Directory
user. It creates the profile
directory if it doesn't exist, sets
appropriate permissions, and
updates the user's AD account with
the profile path. Roaming profiles
allow users to maintain consistent
desktop environments across
different machines in the network.
78. Backup user profile
data
function Backup-UserProfile {
    param(
        [string]$Username,
        [string]$BackupPath
    )
    $UserProfile = "C:\Users\$Username"
    $BackupDestination = Join-Path $BackupPath "$Username-$(Get-Date -Format 'yyyyMMdd')"
    if (-not (Test-Path $UserProfile)) {
        Write-Error "User profile for $Username not found."
        return
    }
    if (-not (Test-Path $BackupPath)) {
        New-Item -Path $BackupPath -ItemType Directory -Force
    }
    $FoldersToBackup = @('Desktop', 'Documents', 'Pictures', 'Downloads')
    foreach ($Folder in $FoldersToBackup) {
        $Source = Join-Path $UserProfile $Folder
        $Destination = Join-Path $BackupDestination $Folder
        if (Test-Path $Source) {
            Copy-Item -Path $Source -Destination $Destination -Recurse -Force
        }
    }
    Write-Output "Backup completed for $Username at $BackupDestination"
}
Backup-UserProfile -Username "JohnDoe" -BackupPath "E:\UserBackups"
This script backs up important
folders from a user's profile to a
specified backup location. It
creates a dated backup folder and
copies selected directories like
Desktop, Documents, Pictures, and
Downloads. This is useful for
preserving user data before major
system changes or as part of a
regular backup routine.
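Copy-Item can struggle with very long paths and in-use files. For large profiles, robocopy (built into Windows) is a common substitute because it retries failed copies, preserves timestamps, and can skip junction points. A minimal sketch for one folder, with illustrative paths:
# /E copies subdirectories, /XJ skips junction points, /R and /W limit retries
robocopy "C:\Users\JohnDoe\Documents" "E:\UserBackups\JohnDoe-20240101\Documents" /E /XJ /R:2 /W:5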
79. Restore user profile
data
function Restore-UserProfile {
    param(
        [string]$Username,
        [string]$BackupPath,
        [string]$RestoreDate
    )
    $UserProfile = "C:\Users\$Username"
    $BackupSource = Join-Path $BackupPath "$Username-$RestoreDate"
    if (-not (Test-Path $BackupSource)) {
        Write-Error "Backup for $Username on $RestoreDate not found."
        return
    }
    $FoldersToRestore = @('Desktop', 'Documents', 'Pictures', 'Downloads')
    foreach ($Folder in $FoldersToRestore) {
        $Source = Join-Path $BackupSource $Folder
        $Destination = Join-Path $UserProfile $Folder
        if (Test-Path $Source) {
            if (Test-Path $Destination) {
                Rename-Item -Path $Destination -NewName "$Folder-Old-$(Get-Date -Format 'yyyyMMddHHmmss')"
            }
            Copy-Item -Path $Source -Destination $Destination -Recurse -Force
        }
    }
    Write-Output "Restore completed for $Username from backup dated $RestoreDate"
}
Restore-UserProfile -Username "JohnDoe" -BackupPath "E:\UserBackups" -RestoreDate "20230515"
This script restores user profile
data from a previous backup. It
allows specifying a particular
backup date, renames existing
folders to prevent data loss, and
then copies the backed-up data to
the user's profile. This is crucial
for recovering user data after
system failures or when migrating
to a new machine.
80. Log user login/logout
events
$LogFile = "C:\Logs\UserSessions.log"
function Write-Log {
    param([string]$Message)
    Add-Content -Path $LogFile -Value "$(Get-Date) - $Message"
}
# Event ID 4624 = successful login, 4634 = logoff
$LoginQuery = New-Object System.Diagnostics.Eventing.Reader.EventLogQuery("Security", [System.Diagnostics.Eventing.Reader.PathType]::LogName, "*[System[(EventID=4624)]]")
$LogoutQuery = New-Object System.Diagnostics.Eventing.Reader.EventLogQuery("Security", [System.Diagnostics.Eventing.Reader.PathType]::LogName, "*[System[(EventID=4634)]]")
$LoginWatcher = New-Object System.Diagnostics.Eventing.Reader.EventLogWatcher($LoginQuery)
$LogoutWatcher = New-Object System.Diagnostics.Eventing.Reader.EventLogWatcher($LogoutQuery)
Register-ObjectEvent -InputObject $LoginWatcher -EventName EventRecordWritten -Action {
    $Event = $EventArgs.EventRecord
    $Username = $Event.Properties[5].Value
    $IPAddress = $Event.Properties[18].Value
    Write-Log "Login: User=$Username, IP=$IPAddress"
}
Register-ObjectEvent -InputObject $LogoutWatcher -EventName EventRecordWritten -Action {
    $Event = $EventArgs.EventRecord
    $Username = $Event.Properties[1].Value
    Write-Log "Logout: User=$Username"
}
$LoginWatcher.Enabled = $true
$LogoutWatcher.Enabled = $true
# Keep the script running
while ($true) { Start-Sleep -Seconds 60 }
This script logs user login and
logout events in real-time. It
watches the Windows Security event
log for successful login and logout
events, extracts relevant
information, and logs it to a file.
This is useful for auditing user
activity and maintaining a record
of system access.
81. Lock inactive user
accounts
$InactiveDays = 90
$InactiveDate = (Get-Date).AddDays(-$InactiveDays)
$InactiveUsers = Get-ADUser -Filter {LastLogonDate -lt $InactiveDate -and Enabled -eq $true} -Properties LastLogonDate, EmailAddress
foreach ($User in $InactiveUsers) {
    Disable-ADAccount -Identity $User
    $Body = "Your account has been disabled due to inactivity for more than $InactiveDays days. Please contact IT support to reactivate your account."
    Send-MailMessage -To $User.EmailAddress -From "IT@yourdomain.com" -Subject "Account Disabled Due to Inactivity" -Body $Body -SmtpServer "smtp.yourdomain.com"
    Write-Output "Disabled account for $($User.SamAccountName) - Last logon: $($User.LastLogonDate)"
}
This script identifies and disables
user accounts that have been
inactive for a specified period. It
retrieves users whose last logon
date is beyond the inactive
threshold, disables their accounts,
and sends them an email
notification. This helps maintain
security by reducing the attack
surface of unused accounts.
82. Generate unique
usernames
function Get-UniqueUsername {
    param(
        [string]$FirstName,
        [string]$LastName
    )
    # Base username: first initial + last name, lowercased
    $BaseUsername = ($FirstName.Substring(0,1) + $LastName).ToLower()
    $Username = $BaseUsername
    $Counter = 1
    while (Get-ADUser -Filter "SamAccountName -eq '$Username'") {
        $Username = $BaseUsername + $Counter
        $Counter++
    }
    return $Username
}
$NewUsername = Get-UniqueUsername -FirstName "John" -LastName "Smith"
Write-Output "Generated unique username: $NewUsername"
This script generates a unique
username based on the user's first
and last name. It creates a base
username using the first initial
and last name, then checks if it
exists in Active Directory. If the
username already exists, it appends
a number and increments until a
unique username is found. This is
useful for automating user account
creation and ensuring unique
identifiers for each user.
83. Display all active
sessions for a user
function Get-UserSessions {
    param([string]$Username)
    $Sessions = query session /server:$env:COMPUTERNAME |
        Where-Object {$_ -match $Username} |
        ForEach-Object {
            $Fields = $_.Trim() -split '\s+'
            [PSCustomObject]@{
                Username    = $Fields[1]
                SessionName = $Fields[2]
                ID          = $Fields[3]
                State       = $Fields[4]
                IdleTime    = $Fields[5]
                LogonTime   = $Fields[6]
            }
        }
    return $Sessions
}
$ActiveSessions = Get-UserSessions -Username "JohnDoe"
$ActiveSessions | Format-Table -AutoSize
This script displays all active sessions for a specified user on a given machine (the /server parameter can point at a remote host). It uses the query session command to retrieve
session information, then parses
and formats the output. This is
useful for troubleshooting login
issues or monitoring user activity
across multiple machines.
84. Audit failed login
attempts
$StartTime = (Get-Date).AddDays(-1)
$EndTime = Get-Date
$FailedLogins = Get-WinEvent -FilterHashtable @{
    LogName = 'Security'
    ID = 4625 # Failed login attempt
    StartTime = $StartTime
    EndTime = $EndTime
} | ForEach-Object {
    [PSCustomObject]@{
        Time = $_.TimeCreated
        Username = $_.Properties[5].Value
        Workstation = $_.Properties[13].Value
        IPAddress = $_.Properties[19].Value
        FailureReason = $_.Properties[8].Value
    }
}
$FailedLogins | Export-Csv -Path "C:\FailedLogins.csv" -NoTypeInformation
$SuspiciousActivity = $FailedLogins | Group-Object Username | Where-Object {$_.Count -ge 5}
if ($SuspiciousActivity) {
    $Body = "The following accounts have had 5 or more failed login attempts in the last 24 hours:`n`n"
    $Body += $SuspiciousActivity | ForEach-Object { "$($_.Name): $($_.Count) attempts`n" }
    Send-MailMessage -To "security@yourdomain.com" -From "monitoring@yourdomain.com" -Subject "Suspicious Login Activity Alert" -Body $Body -SmtpServer "smtp.yourdomain.com"
}
This script audits failed login
attempts over the past 24 hours. It
retrieves failed login events from
the Windows Security log, extracts
relevant information, and exports
it to a CSV file. Additionally, it
identifies accounts with multiple
failed attempts and sends an alert
email if suspicious activity is
detected. This is crucial for
detecting potential brute-force
attacks or compromised accounts.
85. Display all domain
users
function Get-DomainUserReport {
    $Users = Get-ADUser -Filter * -Properties Name, Enabled, LastLogonDate, PasswordLastSet, PasswordNeverExpires, EmailAddress, Department
    $Report = $Users | Select-Object Name, Enabled, LastLogonDate, PasswordLastSet, PasswordNeverExpires, EmailAddress, Department
    $Report | Export-Csv -Path "C:\DomainUsers.csv" -NoTypeInformation
    $TotalUsers = $Report.Count
    $EnabledUsers = ($Report | Where-Object {$_.Enabled -eq $true}).Count
    $DisabledUsers = $TotalUsers - $EnabledUsers
    $NeverLoggedIn = ($Report | Where-Object {$_.LastLogonDate -eq $null}).Count
    $PasswordNeverExpires = ($Report | Where-Object {$_.PasswordNeverExpires -eq $true}).Count
    $Summary = @"
Domain User Report Summary:
---------------------------
Total Users: $TotalUsers
Enabled Users: $EnabledUsers
Disabled Users: $DisabledUsers
Never Logged In: $NeverLoggedIn
Password Never Expires: $PasswordNeverExpires
"@
    Write-Output $Summary
    $Summary | Out-File -FilePath "C:\DomainUserSummary.txt"
    return $Report
}
$DomainUsers = Get-DomainUserReport
$DomainUsers | Format-Table -AutoSize
This script generates a
comprehensive report of all users
in the Active Directory domain. It
retrieves various user properties,
exports them to a CSV file, and
provides a summary of key
statistics. This is useful for
regular audits of the domain user
base, identifying inactive
accounts, and ensuring password
policies are properly enforced.
86. Identify orphaned
accounts
function Get-OrphanedAccounts {
    $InactiveDays = 90
    $InactiveDate = (Get-Date).AddDays(-$InactiveDays)
    $OrphanedAccounts = Get-ADUser -Filter {
        (LastLogonDate -lt $InactiveDate -or LastLogonDate -notlike "*") -and
        (Enabled -eq $true)
    } -Properties LastLogonDate, Manager, Department
    $Report = $OrphanedAccounts | ForEach-Object {
        $Manager = if ($_.Manager) { (Get-ADUser $_.Manager).Name } else { "No Manager" }
        [PSCustomObject]@{
            Username = $_.SamAccountName
            Name = $_.Name
            LastLogon = $_.LastLogonDate
            Department = $_.Department
            Manager = $Manager
        }
    }
    $Report | Export-Csv -Path "C:\OrphanedAccounts.csv" -NoTypeInformation
    Write-Output "Found $($Report.Count) potentially orphaned accounts. Details exported to C:\OrphanedAccounts.csv"
    return $Report
}
$OrphanedAccounts = Get-OrphanedAccounts
$OrphanedAccounts | Format-Table -AutoSize
This script identifies potentially
orphaned user accounts in Active
Directory. It looks for accounts
that haven't logged in for a
specified period (default 90 days)
and are still enabled. It also
includes information about the
user's department and manager,
which can be helpful in determining
if the account is truly orphaned or
just inactive. This is crucial for
maintaining a clean and secure
Active Directory environment.
87. Archive old user
accounts
function Archive-OldUserAccounts {
    param(
        [int]$InactiveDays = 365,
        [string]$ArchiveOU = "OU=ArchivedUsers,DC=yourdomain,DC=com"
    )
    $InactiveDate = (Get-Date).AddDays(-$InactiveDays)
    $OldAccounts = Get-ADUser -Filter {
        (LastLogonDate -lt $InactiveDate -or LastLogonDate -notlike "*") -and
        (Enabled -eq $true)
    } -Properties LastLogonDate, Description
    foreach ($Account in $OldAccounts) {
        $NewDescription = "Archived on $(Get-Date -Format 'yyyy-MM-dd'). Original description: $($Account.Description)"
        Set-ADUser -Identity $Account -Description $NewDescription -Enabled $false
        Move-ADObject -Identity $Account -TargetPath $ArchiveOU
        Write-Output "Archived account: $($Account.SamAccountName)"
    }
    $ArchivedCount = $OldAccounts.Count
    Write-Output "Archived $ArchivedCount user accounts to $ArchiveOU"
}
Archive-OldUserAccounts -InactiveDays 365 -ArchiveOU "OU=ArchivedUsers,DC=yourdomain,DC=com"
This script archives old user
accounts that have been inactive
for a specified period. It moves
these accounts to a designated
Archived Users OU, disables them,
and updates their description to
include the archive date. This
helps in maintaining a clean Active
Directory while preserving old
accounts for potential future
reference or audit purposes.
88. Detect users with
expired passwords
function Get-ExpiredPasswordUsers {
    $ExpiredUsers = Search-ADAccount -PasswordExpired -UsersOnly |
        Get-ADUser -Properties Name, EmailAddress, PasswordLastSet, AccountExpirationDate
    $Report = $ExpiredUsers | ForEach-Object {
        [PSCustomObject]@{
            Username = $_.SamAccountName
            Name = $_.Name
            Email = $_.EmailAddress
            PasswordLastSet = $_.PasswordLastSet
            AccountExpirationDate = $_.AccountExpirationDate
        }
    }
    $Report | Export-Csv -Path "C:\ExpiredPasswordUsers.csv" -NoTypeInformation
    Write-Output "Found $($Report.Count) users with expired passwords. Details exported to C:\ExpiredPasswordUsers.csv"
    # Optionally, send notifications to users
    foreach ($User in $Report) {
        $Body = "Dear $($User.Name),`n`nYour account password has expired. Please contact IT support to reset your password and regain access to your account."
        Send-MailMessage -To $User.Email -From "IT@yourdomain.com" -Subject "Your Account Password Has Expired" -Body $Body -SmtpServer "smtp.yourdomain.com"
    }
    return $Report
}
$ExpiredPasswordUsers = Get-ExpiredPasswordUsers
$ExpiredPasswordUsers | Format-Table -AutoSize
This script detects users with
expired passwords in Active
Directory. It generates a report of
these users, including their
contact information and password
expiration details. Additionally,
it can send email notifications to
affected users, prompting them to
update their passwords. This helps
in maintaining account security and
ensuring users have continuous
access to their accounts.
89. Check password policy
compliance
function Check-PasswordPolicyCompliance {
    $DomainPolicy = Get-ADDefaultDomainPasswordPolicy
    $FineGrainedPolicies = Get-ADFineGrainedPasswordPolicy -Filter *
    $Users = Get-ADUser -Filter * -Properties PasswordLastSet, PasswordNeverExpires, msDS-UserPasswordExpiryTimeComputed
    $Report = @()
    foreach ($User in $Users) {
        $Policy = $DomainPolicy
        $AppliedFineGrainedPolicy = $FineGrainedPolicies | Where-Object { (Get-ADUserResultantPasswordPolicy $User.SamAccountName) -eq $_ }
        if ($AppliedFineGrainedPolicy) {
            $Policy = $AppliedFineGrainedPolicy
        }
        $PasswordAge = (Get-Date) - $User.PasswordLastSet
        $DaysUntilExpiry = (([datetime]::FromFileTime($User."msDS-UserPasswordExpiryTimeComputed")) - (Get-Date)).Days
        $Report += [PSCustomObject]@{
            Username = $User.SamAccountName
            PasswordAge = $PasswordAge.Days
            PasswordNeverExpires = $User.PasswordNeverExpires
            DaysUntilExpiry = $DaysUntilExpiry
            CompliantMaxPasswordAge = ($PasswordAge.Days -le $Policy.MaxPasswordAge.Days)
            CompliantMinPasswordAge = ($PasswordAge.Days -ge $Policy.MinPasswordAge.Days)
            AppliedPolicy = if ($AppliedFineGrainedPolicy) { $AppliedFineGrainedPolicy.Name } else { "Default Domain Policy" }
        }
    }
    $Report | Export-Csv -Path "C:\PasswordPolicyCompliance.csv" -NoTypeInformation
    $NonCompliantUsers = $Report | Where-Object { -not $_.CompliantMaxPasswordAge -or -not $_.CompliantMinPasswordAge }
    Write-Output "Password Policy Compliance Report generated. Total Users: $($Report.Count)"
    Write-Output "Non-compliant Users: $($NonCompliantUsers.Count)"
    Write-Output "Details exported to C:\PasswordPolicyCompliance.csv"
    return $Report
}
$PolicyCompliance = Check-PasswordPolicyCompliance
$PolicyCompliance | Format-Table Username, PasswordAge, DaysUntilExpiry, CompliantMaxPasswordAge, CompliantMinPasswordAge, AppliedPolicy -AutoSize
This script checks password policy
compliance for all users in Active
Directory. It considers both the
default domain password policy and
any fine-grained password policies
that may be applied. The script
generates a report showing each
user's password age, days until
expiry, and whether they comply
with the maximum and minimum
password age requirements. This is
crucial for ensuring that all user
accounts adhere to the
organization's password policies,
enhancing overall security.
90. Notify users of
password expiry
function Notify-PasswordExpiry {
    param(
        [int]$WarningDays = 14
    )
    $Users = Get-ADUser -Filter {Enabled -eq $true -and PasswordNeverExpires -eq $false} -Properties msDS-UserPasswordExpiryTimeComputed, EmailAddress
    $ExpiringUsers = $Users | ForEach-Object {
        $ExpiryDate = [datetime]::FromFileTime($_."msDS-UserPasswordExpiryTimeComputed")
        $DaysUntilExpiry = ($ExpiryDate - (Get-Date)).Days
        if ($DaysUntilExpiry -le $WarningDays -and $DaysUntilExpiry -ge 0) {
            [PSCustomObject]@{
                Username = $_.SamAccountName
                Email = $_.EmailAddress
                ExpiryDate = $ExpiryDate
                DaysUntilExpiry = $DaysUntilExpiry
            }
        }
    }
    foreach ($User in $ExpiringUsers) {
        $Body = @"
Dear $($User.Username),

Your account password will expire in $($User.DaysUntilExpiry) day(s) on $($User.ExpiryDate.ToString("yyyy-MM-dd")).

Please change your password before it expires to avoid any disruption to your account access.

To change your password:
1. Press Ctrl+Alt+Delete
2. Select 'Change a password'
3. Follow the prompts to set a new password

If you need assistance, please contact the IT Help Desk.

Thank you,
IT Department
"@
        Send-MailMessage -To $User.Email -From "IT@yourdomain.com" -Subject "Password Expiry Warning" -Body $Body -SmtpServer "smtp.yourdomain.com"
        Write-Output "Sent password expiry notification to $($User.Username)"
    }
    $NotifiedCount = $ExpiringUsers.Count
    Write-Output "Notified $NotifiedCount users about upcoming password expiration"
}
Notify-PasswordExpiry -WarningDays 14
This script identifies users whose
passwords are about to expire
within a specified number of days
(default is 14) and sends them
email notifications. It retrieves
the password expiry date for each
enabled user, calculates the days
until expiry, and sends a
customized email to those whose
passwords will expire soon. This
proactive approach helps prevent
account lockouts due to expired
passwords and encourages users to
maintain up-to-date credentials.
These scripts provide a
comprehensive set of tools for
managing user accounts, monitoring
security, and maintaining
compliance with password policies
in an Active Directory environment.
They cover various aspects of user
management, from creation and
modification to archiving and
security auditing, helping IT
professionals efficiently manage
and secure their organization's
user accounts.
Chapter 4: Active
Directory Management
(Scripts 91–130)
91. Query all AD users
This script retrieves all Active
Directory users in the current
domain.
Import-Module ActiveDirectory
Get-ADUser -Filter * | Select-Object Name,
SamAccountName, UserPrincipalName, Enabled
This script uses the Get-ADUser
cmdlet with the -Filter * parameter
to retrieve all user objects. The
Select-Object cmdlet is used to
display specific properties of each
user, including their name,
SamAccountName, UserPrincipalName,
and whether the account is enabled.
92. Query all AD groups
This script retrieves all Active
Directory groups in the current
domain.
Import-Module ActiveDirectory
Get-ADGroup -Filter * |
    Select-Object Name, GroupCategory, GroupScope
The Get-ADGroup cmdlet is used with
the -Filter * parameter to retrieve
all group objects. The Select-Object
cmdlet displays the name, category
(Distribution or Security), and
scope (Domain Local, Global, or
Universal) of each group.
93. Find a specific user
in AD
This script searches for a specific
user in Active Directory based on
their name or username.
Import-Module ActiveDirectory
$searchTerm = Read-Host "Enter the user's name or username to search"
Get-ADUser -Filter "Name -like '*$searchTerm*' -or SamAccountName -like '*$searchTerm*'" |
    Select-Object Name, SamAccountName, UserPrincipalName, Enabled
The script prompts the user to
enter a search term, then uses the
Get-ADUser cmdlet with a filter to
search for users whose name or
SamAccountName contains the search
term. The results are displayed
with relevant user properties.
94. Find all members of a
group
This script retrieves all members
of a specified Active Directory
group.
Import-Module ActiveDirectory
$groupName = Read-Host "Enter the group name"
Get-ADGroupMember -Identity $groupName |
    Get-ADUser -Properties Name, SamAccountName, UserPrincipalName, Enabled |
    Select-Object Name, SamAccountName, UserPrincipalName, Enabled
The script prompts for a group
name, then uses Get-ADGroupMember to
retrieve all members of the group.
It then pipes the results to
Get-ADUser to get detailed information
about each user, and finally
displays relevant properties.
95. Add a user to an AD
group
This script adds a specified user
to an Active Directory group.
Import-Module ActiveDirectory
$userName = Read-Host "Enter the username to add"
$groupName = Read-Host "Enter the group name"
try {
    Add-ADGroupMember -Identity $groupName -Members $userName -ErrorAction Stop
    Write-Host "User $userName successfully added to group $groupName" -ForegroundColor Green
} catch {
    Write-Host "Error: $($_.Exception.Message)" -ForegroundColor Red
}
The script prompts for a username
and group name, then uses the
Add-ADGroupMember cmdlet to add the user
to the specified group. Error
handling is included to catch and
display any issues that occur
during the process.
96. Remove a user from an
AD group
This script removes a specified
user from an Active Directory
group.
Import-Module ActiveDirectory
$userName = Read-Host "Enter the username to remove"
$groupName = Read-Host "Enter the group name"
try {
    Remove-ADGroupMember -Identity $groupName -Members $userName -Confirm:$false -ErrorAction Stop
    Write-Host "User $userName successfully removed from group $groupName" -ForegroundColor Green
} catch {
    Write-Host "Error: $($_.Exception.Message)" -ForegroundColor Red
}
Similar to the previous script,
this one prompts for a username and
group name, but uses the
Remove-ADGroupMember cmdlet to remove the
user from the specified group. The
-Confirm:$false parameter is used to
suppress the confirmation prompt.
97. Export AD user
details to a CSV
This script exports Active
Directory user details to a CSV
file.
Import-Module ActiveDirectory
$outputPath = "C:\ADUsers.csv"   # adjust the export path as needed
$properties = @("Name", "SamAccountName", "UserPrincipalName", "Enabled", "EmailAddress", "Department", "Title")
Get-ADUser -Filter * -Properties $properties |
    Select-Object $properties |
    Export-Csv -Path $outputPath -NoTypeInformation
Write-Host "AD user details exported to $outputPath" -ForegroundColor Green
This script uses Get-ADUser to
retrieve all users and their
specified properties. The results
are then exported to a CSV file
using Export-Csv. The -NoTypeInformation
parameter excludes the type
information header from the CSV file.
98. Import users into AD
from a CSV
This script imports users into
Active Directory from a CSV file.
Import-Module ActiveDirectory
$csvPath = Read-Host "Enter the path to the CSV file"
$users = Import-Csv -Path $csvPath
foreach ($user in $users) {
    $securePassword = ConvertTo-SecureString $user.Password -AsPlainText -Force
    try {
        New-ADUser -Name $user.Name `
            -SamAccountName $user.SamAccountName `
            -UserPrincipalName $user.UserPrincipalName `
            -EmailAddress $user.EmailAddress `
            -Enabled $true `
            -AccountPassword $securePassword `
            -ChangePasswordAtLogon $true `
            -ErrorAction Stop
        Write-Host "User $($user.Name) created successfully" -ForegroundColor Green
    } catch {
        Write-Host "Error creating user $($user.Name): $($_.Exception.Message)" -ForegroundColor Red
    }
}
This script prompts for the path to
a CSV file containing user
information. It then reads the CSV
and creates a new AD user for each
entry using the New-ADUser cmdlet.
The script includes error handling
to catch and display any issues
during user creation.
99. Automate AD user
creation
This script automates the creation
of a new Active Directory user with
predefined settings.
Import-Module ActiveDirectory
function New-StandardADUser {
    param(
        [Parameter(Mandatory=$true)]
        [string]$FirstName,
        [Parameter(Mandatory=$true)]
        [string]$LastName,
        [Parameter(Mandatory=$true)]
        [string]$Department,
        [Parameter(Mandatory=$true)]
        [string]$Title
    )
    $username = ($FirstName.Substring(0,1) + $LastName).ToLower()
    $upn = "$username@yourdomain.com"
    $email = "$username@yourdomain.com"
    $ou = "OU=$Department,OU=Users,DC=yourdomain,DC=com"
    $securePassword = ConvertTo-SecureString "ChangeMe123!" -AsPlainText -Force
    try {
        New-ADUser -Name "$FirstName $LastName" `
            -GivenName $FirstName `
            -Surname $LastName `
            -SamAccountName $username `
            -UserPrincipalName $upn `
            -EmailAddress $email `
            -Title $Title `
            -Department $Department `
            -Path $ou `
            -Enabled $true `
            -AccountPassword $securePassword `
            -ChangePasswordAtLogon $true `
            -ErrorAction Stop
        Write-Host "User $username created successfully" -ForegroundColor Green
    } catch {
        Write-Host "Error creating user $($username): $($_.Exception.Message)" -ForegroundColor Red
    }
}

# Example usage
New-StandardADUser -FirstName "John" -LastName "Doe" -Department "IT" -Title "Systems Administrator"
This script defines a function
New-StandardADUser that creates a new AD
user with standardized settings. It
generates a username, email, and
UPN based on the user's name, and
places the user in an OU based on
their department. The function
includes error handling and can be
easily called with the required
parameters.
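One edge case the function above does
not handle is username collisions (for
example, Jane Doe and John Doe both map
to jdoe). A sketch of one way to extend
it — the helper name and the
numeric-suffix convention are
assumptions, not part of the original
script:

```powershell
# Hypothetical helper: appends a numeric suffix until the candidate
# username no longer collides. In production, the $ExistingUsernames
# check would be replaced by a Get-ADUser lookup against the domain.
function Get-UniqueUsername {
    param(
        [Parameter(Mandatory=$true)][string]$FirstName,
        [Parameter(Mandatory=$true)][string]$LastName,
        [string[]]$ExistingUsernames = @()
    )
    $base = ($FirstName.Substring(0,1) + $LastName).ToLower()
    $candidate = $base
    $i = 1
    while ($ExistingUsernames -contains $candidate) {
        $candidate = "$base$i"
        $i++
    }
    return $candidate
}
```

Calling it with an existing "jdoe"
returns "jdoe1", then "jdoe2", and so
on.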
100. Audit AD user
accounts
This script performs an audit of
Active Directory user accounts,
checking for various security and
compliance issues.
Import-Module ActiveDirectory
$inactiveThreshold = (Get-Date).AddDays(-90)
$users = Get-ADUser -Filter * -Properties Name, Enabled, LastLogonDate, PasswordLastSet, PasswordNeverExpires, PasswordExpired
$auditResults = @()
foreach ($user in $users) {
    $issues = @()
    if (-not $user.Enabled) {
        $issues += "Account disabled"
    }
    if ($user.LastLogonDate -lt $inactiveThreshold) {
        $issues += "Inactive (Last logon: $($user.LastLogonDate))"
    }
    if ($user.PasswordNeverExpires) {
        $issues += "Password never expires"
    }
    if ($user.PasswordExpired) {
        $issues += "Password expired"
    }
    if ($user.PasswordLastSet -and $user.PasswordLastSet.AddDays(90) -lt (Get-Date)) {
        $issues += "Password older than 90 days"
    }
    if ($issues.Count -gt 0) {
        $auditResults += [PSCustomObject]@{
            Name           = $user.Name
            SamAccountName = $user.SamAccountName
            Issues         = $issues -join ", "
        }
    }
}
$auditResults | Export-Csv -Path "C:\ADUserAudit.csv" -NoTypeInformation
Write-Host "Audit completed. Results exported to C:\ADUserAudit.csv" -ForegroundColor Green
This script performs an audit of AD
user accounts, checking for issues
such as disabled accounts,
inactivity, password expiration,
and password age. The results are
compiled into a custom object and
exported to a CSV file for further
analysis.
101. Check AD replication
status
This script checks the replication
status between domain controllers
in an Active Directory environment.
Import-Module ActiveDirectory
$dcs = Get-ADDomainController -Filter *
foreach ($dc in $dcs) {
    Write-Host "Checking replication status for $($dc.HostName)..." -ForegroundColor Yellow
    $results = repadmin /showrepl $dc.HostName
    if ($results -match "0 consecutive failure") {
        Write-Host "Replication is healthy for $($dc.HostName)" -ForegroundColor Green
    } else {
        Write-Host "Replication issues detected for $($dc.HostName):" -ForegroundColor Red
        $results | Where-Object { $_ -match "consecutive failure" } | ForEach-Object {
            Write-Host $_ -ForegroundColor Red
        }
    }
    Write-Host ""
}
This script uses the repadmin
command-line tool to check the
replication status for each domain
controller in the environment. It
then parses the results to
determine if there are any
replication failures and displays
the status for each DC.
102. Force AD replication
This script forces replication
between all domain controllers in
an Active Directory environment.
Import-Module ActiveDirectory
$dcs = Get-ADDomainController -Filter *
foreach ($dc in $dcs) {
    Write-Host "Forcing replication for $($dc.HostName)..." -ForegroundColor Yellow
    try {
        $result = repadmin /syncall $dc.HostName /AdeP
        if ($result -match "Sync from") {
            Write-Host "Replication initiated successfully for $($dc.HostName)" -ForegroundColor Green
        } else {
            Write-Host "Unexpected result for $($dc.HostName):" -ForegroundColor Red
            Write-Host $result
        }
    } catch {
        Write-Host "Error forcing replication for $($dc.HostName): $($_.Exception.Message)" -ForegroundColor Red
    }
    Write-Host ""
}
This script uses the repadmin
command-line tool with the /syncall
parameter to force replication for
each domain controller. The /AdeP
switch ensures that all partitions
are synchronized. The script
includes error handling and
provides feedback on the
replication process for each DC.
103. Generate a group
membership report
This script generates a report of
all groups and their members in the
Active Directory environment.
Import-Module ActiveDirectory
$groups = Get-ADGroup -Filter *
$report = @()
foreach ($group in $groups) {
    $members = Get-ADGroupMember -Identity $group.Name |
        Get-ADObject -Properties Name, ObjectClass |
        Select-Object Name, ObjectClass
    foreach ($member in $members) {
        $report += [PSCustomObject]@{
            GroupName  = $group.Name
            MemberName = $member.Name
            MemberType = $member.ObjectClass
        }
    }
}
$report | Export-Csv -Path "C:\GroupMembershipReport.csv" -NoTypeInformation
Write-Host "Group membership report exported to C:\GroupMembershipReport.csv" -ForegroundColor Green
This script retrieves all AD
groups, then iterates through each
group to get its members. It
creates a custom object for each
group-member pair, including the
group name, member name, and member
type (user or group). The results
are exported to a CSV file for easy
analysis.
104. Find AD users with
no group memberships
This script identifies Active
Directory users who are not members
of any groups.
Import-Module ActiveDirectory
$users = Get-ADUser -Filter * -Properties MemberOf
$usersWithNoGroups = @()
foreach ($user in $users) {
    if ($user.MemberOf.Count -eq 0) {
        $usersWithNoGroups += [PSCustomObject]@{
            Name           = $user.Name
            SamAccountName = $user.SamAccountName
            Enabled        = $user.Enabled
        }
    }
}
if ($usersWithNoGroups.Count -gt 0) {
    $usersWithNoGroups | Export-Csv -Path "C:\UsersWithNoGroups.csv" -NoTypeInformation
    Write-Host "Found $($usersWithNoGroups.Count) users with no group memberships. Results exported to C:\UsersWithNoGroups.csv" -ForegroundColor Yellow
} else {
    Write-Host "No users found without group memberships." -ForegroundColor Green
}
This script retrieves all AD users
and their group memberships. It
then identifies users who are not
members of any groups and exports
the results to a CSV file. The
script provides feedback on the
number of users found without group
memberships.
105. Reset an AD user
password
This script allows an administrator
to reset the password for an Active
Directory user account.
Import-Module ActiveDirectory
$username = Read-Host "Enter the username of the account to reset"
$newPassword = Read-Host "Enter the new password" -AsSecureString
try {
    Set-ADAccountPassword -Identity $username -NewPassword $newPassword -Reset -ErrorAction Stop
    Set-ADUser -Identity $username -ChangePasswordAtLogon $true -ErrorAction Stop
    Write-Host "Password reset successful for user $username. User will be required to change password at next logon." -ForegroundColor Green
} catch {
    Write-Host "Error resetting password: $($_.Exception.Message)" -ForegroundColor Red
}
This script prompts for a username
and a new password, then uses
Set-ADAccountPassword to reset the user's
password. It also sets the "Change
password at next logon" flag using
Set-ADUser. The script includes error
handling to catch and display any
issues during the password reset
process.
106. Unlock an AD account
This script unlocks a locked Active
Directory user account.
Import-Module ActiveDirectory
$username = Read-Host "Enter the username of the locked account"
try {
    Unlock-ADAccount -Identity $username -ErrorAction Stop
    Write-Host "Account $username has been successfully unlocked" -ForegroundColor Green
} catch {
    Write-Host "Error unlocking account: $($_.Exception.Message)" -ForegroundColor Red
}
This script prompts for a username
and uses the Unlock-ADAccount cmdlet to
unlock the specified account. It
includes error handling to catch
and display any issues that occur
during the unlocking process.
107. Delete an AD user
account
This script deletes a specified
Active Directory user account.
Import-Module ActiveDirectory
$username = Read-Host "Enter the username of the account to delete"
try {
    $user = Get-ADUser -Identity $username -ErrorAction Stop
    $confirmation = Read-Host "Are you sure you want to delete the user account for $($user.Name)? (Y/N)"
    if ($confirmation -eq "Y") {
        Remove-ADUser -Identity $username -Confirm:$false -ErrorAction Stop
        Write-Host "User account $username has been successfully deleted" -ForegroundColor Green
    } else {
        Write-Host "Deletion cancelled" -ForegroundColor Yellow
    }
} catch {
    Write-Host "Error: $($_.Exception.Message)" -ForegroundColor Red
}
This script prompts for a username,
confirms the user's identity, and
then asks for confirmation before
deleting the account. It uses the
Remove-ADUser cmdlet to delete the
account if confirmed. The script
includes error handling and
confirmation steps to prevent
accidental deletions.
108. Monitor AD changes
This script monitors Active
Directory for changes and logs them
to a file.
Import-Module ActiveDirectory
$logFile = "C:\ADChanges.log"
$events = Get-WinEvent -FilterHashtable @{
    LogName = "Security"
    ID      = 4662, 5136, 5137, 5141
} -MaxEvents 100
foreach ($event in $events) {
    $message = "Time: $($event.TimeCreated), Event ID: $($event.Id)"
    switch ($event.Id) {
        4662 { $message += ", Operation: Object accessed" }
        5136 { $message += ", Operation: Object modified" }
        5137 { $message += ", Operation: Object created" }
        5141 { $message += ", Operation: Object deleted" }
    }
    $message += ", User: $($event.Properties[1].Value)"
    $message | Out-File -Append -FilePath $logFile
}
Write-Host "AD changes logged to $logFile" -ForegroundColor Green
This script uses the Get-WinEvent
cmdlet to retrieve recent security
events related to AD changes. It
then parses these events and logs
them to a file, including
information about the type of
change, time, and user who made the
change.
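The switch statement above maps event
IDs to descriptions; the same mapping
can also live in a hashtable, which is
easier to extend as you audit more
event IDs. A small sketch (the
ID-to-description pairs mirror the
script above; the helper function name
is an assumption):

```powershell
# Event ID to description lookup; unknown IDs fall back to a default.
$eventDescriptions = @{
    4662 = "Object accessed"
    5136 = "Object modified"
    5137 = "Object created"
    5141 = "Object deleted"
}

function Get-ADEventDescription {
    param([Parameter(Mandatory=$true)][int]$EventId)
    if ($eventDescriptions.ContainsKey($EventId)) {
        return $eventDescriptions[$EventId]
    }
    return "Unknown operation"
}
```

Adding a new audited event then means
adding one hashtable entry rather than
another switch branch.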
109. Automate group
policy application
This script automates the
application of group policies to a
specific Organizational Unit (OU).
Import-Module GroupPolicy
$ouPath = "OU=TestOU,DC=yourdomain,DC=com"
$gpoName = "Test GPO"
try {
    # Create a new GPO
    $gpo = New-GPO -Name $gpoName -ErrorAction Stop
    # Link the GPO to the specified OU
    New-GPLink -Name $gpoName -Target $ouPath -ErrorAction Stop
    # Set a sample policy (e.g., enable Windows Firewall)
    Set-GPRegistryValue -Name $gpoName -Key "HKLM\SYSTEM\CurrentControlSet\Services\SharedAccess\Parameters\FirewallPolicy\StandardProfile" -ValueName "EnableFirewall" -Type DWord -Value 1 -ErrorAction Stop
    # Force Group Policy update
    Invoke-GPUpdate -Force -ErrorAction Stop
    Write-Host "Group Policy '$gpoName' created and linked to $ouPath successfully" -ForegroundColor Green
} catch {
    Write-Host "Error: $($_.Exception.Message)" -ForegroundColor Red
}
This script creates a new Group
Policy Object (GPO), links it to a
specified OU, sets a sample policy
(enabling Windows Firewall in this
case), and forces a Group Policy
update. It includes error handling
to catch and display any issues
during the process.
110. List all domain
controllers
This script lists all domain
controllers in the Active Directory
environment.
Import-Module ActiveDirectory
try {
    $dcs = Get-ADDomainController -Filter * |
        Select-Object Name, IPv4Address, Site, OperatingSystem
    $dcs | Format-Table -AutoSize
    $dcs | Export-Csv -Path "C:\DomainControllers.csv" -NoTypeInformation
    Write-Host "Domain controller list exported to C:\DomainControllers.csv" -ForegroundColor Green
} catch {
    Write-Host "Error: $($_.Exception.Message)" -ForegroundColor Red
}
This script uses the
Get-ADDomainController cmdlet to retrieve
information about all domain
controllers in the environment. It
displays the results in a formatted
table and exports them to a CSV
file for further analysis.
111. Check domain
controller health
This script checks the health
status of all domain controllers in
the Active Directory environment.
Import-Module ActiveDirectory
function Test-DCHealth {
    param (
        [string]$DCName
    )
    $results = @()
    # Check if DC is reachable
    $pingResult = Test-Connection -ComputerName $DCName -Count 1 -Quiet
    $results += [PSCustomObject]@{
        Test   = "Ping"
        Result = if ($pingResult) { "Success" } else { "Failure" }
    }
    # Check NTDS service
    $ntdsService = Get-Service -ComputerName $DCName -Name "NTDS" -ErrorAction SilentlyContinue
    $results += [PSCustomObject]@{
        Test   = "NTDS Service"
        Result = if ($ntdsService.Status -eq "Running") { "Running" } else { "Not Running" }
    }
    # Check DNS service
    $dnsService = Get-Service -ComputerName $DCName -Name "DNS" -ErrorAction SilentlyContinue
    $results += [PSCustomObject]@{
        Test   = "DNS Service"
        Result = if ($dnsService.Status -eq "Running") { "Running" } else { "Not Running" }
    }
    # Check disk space
    $diskSpace = Get-WmiObject -ComputerName $DCName -Class Win32_LogicalDisk -Filter "DeviceID='C:'" | Select-Object Size, FreeSpace
    $freeSpacePercentage = ($diskSpace.FreeSpace / $diskSpace.Size) * 100
    $results += [PSCustomObject]@{
        Test   = "Disk Space"
        Result = if ($freeSpacePercentage -gt 10) { "OK ($([math]::Round($freeSpacePercentage, 2))% free)" } else { "Low ($([math]::Round($freeSpacePercentage, 2))% free)" }
    }
    return $results
}
$dcs = Get-ADDomainController -Filter *
foreach ($dc in $dcs) {
    Write-Host "Checking health of $($dc.HostName)..." -ForegroundColor Yellow
    $health = Test-DCHealth -DCName $dc.HostName
    $health | Format-Table -AutoSize
    Write-Host ""
}
This script defines a function
Test-DCHealth that performs various health
checks on a domain controller,
including ping, NTDS and DNS
service status, and disk space. It
then retrieves all domain
controllers and runs the health
check for each one, displaying the
results in a formatted table.
112. Automate OU creation
This script automates the creation
of Organizational Units (OUs) in
Active Directory.
Import-Module ActiveDirectory
function New-CustomOU {
    param (
        [Parameter(Mandatory=$true)]
        [string]$OUName,
        [Parameter(Mandatory=$true)]
        [string]$ParentPath
    )
    try {
        New-ADOrganizationalUnit -Name $OUName -Path $ParentPath -ProtectedFromAccidentalDeletion $true -ErrorAction Stop
        Write-Host "OU '$OUName' created successfully under $ParentPath" -ForegroundColor Green
    } catch {
        Write-Host "Error creating OU '$OUName': $($_.Exception.Message)" -ForegroundColor Red
    }
}

# Example usage
$domainDN = (Get-ADDomain).DistinguishedName
New-CustomOU -OUName "Departments" -ParentPath $domainDN
New-CustomOU -OUName "IT" -ParentPath "OU=Departments,$domainDN"
New-CustomOU -OUName "HR" -ParentPath "OU=Departments,$domainDN"
New-CustomOU -OUName "Finance" -ParentPath "OU=Departments,$domainDN"
This script defines a function
New-CustomOU that creates a new
Organizational Unit with protection
from accidental deletion. It then
demonstrates how to use this
function to create a hierarchy of
OUs, including a top-level
"Departments" OU and sub-OUs for
different departments.
113. Export AD group
policies
This script exports all Group
Policy Objects (GPOs) in the Active
Directory environment.
Import-Module GroupPolicy
$exportPath = "C:\GPOExport"
if (-not (Test-Path $exportPath)) {
    New-Item -ItemType Directory -Path $exportPath | Out-Null
}
try {
    $gpos = Get-GPO -All
    foreach ($gpo in $gpos) {
        $gpoPath = Join-Path $exportPath $gpo.DisplayName
        New-Item -ItemType Directory -Path $gpoPath -Force | Out-Null
        Write-Host "Exporting GPO: $($gpo.DisplayName)" -ForegroundColor Yellow
        Backup-GPO -Guid $gpo.Id -Path $gpoPath -ErrorAction Stop
    }
    Write-Host "All GPOs exported successfully to $exportPath" -ForegroundColor Green
} catch {
    Write-Host "Error exporting GPOs: $($_.Exception.Message)" -ForegroundColor Red
}
This script retrieves all GPOs in
the environment using Get-GPO -All,
then iterates through each GPO and
exports it to a specified directory
using Backup-GPO. Each GPO is
exported to its own subdirectory
named after the GPO's display name.
114. Detect unused AD
accounts
This script identifies Active
Directory user accounts that
haven't been used in a specified
period.
Import-Module ActiveDirectory
$daysInactive = 90
$inactiveDate = (Get-Date).AddDays(-$daysInactive)
$inactiveUsers = Get-ADUser -Filter {
        Enabled -eq $true -and LastLogonTimeStamp -lt $inactiveDate
    } -Properties LastLogonTimeStamp, LastLogonDate, PasswordLastSet |
    Select-Object Name, SamAccountName,
        @{Name="LastLogon"; Expression={[DateTime]::FromFileTime($_.LastLogonTimeStamp)}},
        LastLogonDate, PasswordLastSet
if ($inactiveUsers) {
    $inactiveUsers | Export-Csv -Path "C:\InactiveUsers.csv" -NoTypeInformation
    Write-Host "Found $($inactiveUsers.Count) inactive user accounts. Results exported to C:\InactiveUsers.csv" -ForegroundColor Yellow
} else {
    Write-Host "No inactive user accounts found." -ForegroundColor Green
}
This script uses Get-ADUser with a
filter to find enabled user
accounts that haven't logged on
within the specified number of
days. It calculates the last logon
time using the LastLogonTimeStamp
attribute and exports the results
to a CSV file.
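lastLogonTimestamp is stored as a
Windows FILETIME value — the number of
100-nanosecond intervals since January
1, 1601 (UTC) — which is why the script
wraps it in [DateTime]::FromFileTime.
The conversion in isolation
(FromFileTimeUtc is used here so the
result is timezone-independent):

```powershell
# A FILETIME of 0 corresponds to the FILETIME epoch: 1601-01-01 UTC.
$epoch = [DateTime]::FromFileTimeUtc(0)

# One day is 864,000,000,000 hundred-nanosecond ticks.
$oneDayLater = [DateTime]::FromFileTimeUtc(864000000000)
$daysDiff = ($oneDayLater - $epoch).TotalDays
```

Also worth remembering:
lastLogonTimestamp is replicated only
periodically, so it can lag the true
last logon by up to around two weeks;
for a 90-day inactivity threshold that
imprecision is usually acceptable.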
115. Delete stale
computer accounts in AD
This script identifies and
optionally deletes stale computer
accounts in Active Directory.
Import-Module ActiveDirectory
$daysInactive = 90
$inactiveDate = (Get-Date).AddDays(-$daysInactive)
$staleComputers = Get-ADComputer -Filter {
        LastLogonTimeStamp -lt $inactiveDate
    } -Properties LastLogonTimeStamp, OperatingSystem |
    Select-Object Name, DistinguishedName, OperatingSystem,
        @{Name="LastLogon"; Expression={[DateTime]::FromFileTime($_.LastLogonTimeStamp)}}
if ($staleComputers) {
    Write-Host "Found $($staleComputers.Count) stale computer accounts:" -ForegroundColor Yellow
    $staleComputers | Format-Table -AutoSize
    $confirmation = Read-Host "Do you want to delete these stale computer accounts? (Y/N)"
    if ($confirmation -eq "Y") {
        foreach ($computer in $staleComputers) {
            try {
                Remove-ADComputer -Identity $computer.DistinguishedName -Confirm:$false -ErrorAction Stop
                Write-Host "Deleted: $($computer.Name)" -ForegroundColor Green
            } catch {
                Write-Host "Error deleting $($computer.Name): $($_.Exception.Message)" -ForegroundColor Red
            }
        }
    } else {
        Write-Host "No accounts deleted." -ForegroundColor Yellow
    }
} else {
    Write-Host "No stale computer accounts found." -ForegroundColor Green
}
This script identifies computer
accounts that haven't logged on
within the specified number of days
using the LastLogonTimeStamp attribute.
It displays the list of stale
accounts and prompts for
confirmation before deleting them.
The script includes error handling
for the deletion process.
116. Generate AD cleanup
reports
This script generates comprehensive
cleanup reports for Active
Directory, including inactive
users, stale computers, and empty
groups.
Import-Module ActiveDirectory
$reportPath = "C:\ADCleanupReports"
$daysInactive = 90
$inactiveDate = (Get-Date).AddDays(-$daysInactive)
if (-not (Test-Path $reportPath)) {
    New-Item -ItemType Directory -Path $reportPath | Out-Null
}
# Inactive Users Report
$inactiveUsers = Get-ADUser -Filter {
        Enabled -eq $true -and LastLogonTimeStamp -lt $inactiveDate
    } -Properties LastLogonTimeStamp, LastLogonDate, PasswordLastSet |
    Select-Object Name, SamAccountName,
        @{Name="LastLogon"; Expression={[DateTime]::FromFileTime($_.LastLogonTimeStamp)}},
        LastLogonDate, PasswordLastSet
$inactiveUsers | Export-Csv -Path "$reportPath\InactiveUsers.csv" -NoTypeInformation
# Stale Computers Report
$staleComputers = Get-ADComputer -Filter {
        LastLogonTimeStamp -lt $inactiveDate
    } -Properties LastLogonTimeStamp, OperatingSystem |
    Select-Object Name, DistinguishedName, OperatingSystem,
        @{Name="LastLogon"; Expression={[DateTime]::FromFileTime($_.LastLogonTimeStamp)}}
$staleComputers | Export-Csv -Path "$reportPath\StaleComputers.csv" -NoTypeInformation
# Empty Groups Report
$emptyGroups = Get-ADGroup -Filter * -Properties Members |
    Where-Object { $_.Members.Count -eq 0 } |
    Select-Object Name, GroupCategory, GroupScope
$emptyGroups | Export-Csv -Path "$reportPath\EmptyGroups.csv" -NoTypeInformation
# Summary Report
$summary = @"
AD Cleanup Report Summary
-------------------------
Inactive Users: $($inactiveUsers.Count)
Stale Computers: $($staleComputers.Count)
Empty Groups: $($emptyGroups.Count)
Reports generated on $(Get-Date)
"@
$summary | Out-File -FilePath "$reportPath\Summary.txt"
Write-Host "AD cleanup reports generated in $reportPath" -ForegroundColor Green
This script generates comprehensive
cleanup reports for Active
Directory, including inactive
users, stale computers, and empty
groups. It saves these reports as
CSV files in a specified directory
and creates a summary text file.
The script provides a clear
overview of potential cleanup
targets in the AD environment.
117. Create AD security
groups
This script automates the creation
of security groups in Active
Directory.
Import-Module ActiveDirectory
function New-ADSecurityGroup {
    param (
        [Parameter(Mandatory=$true)]
        [string]$GroupName,
        [Parameter(Mandatory=$true)]
        [string]$GroupScope,
        [Parameter(Mandatory=$true)]
        [string]$OUPath,
        [string]$Description = ""
    )
    try {
        New-ADGroup -Name $GroupName `
            -GroupScope $GroupScope `
            -GroupCategory Security `
            -Path $OUPath `
            -Description $Description `
            -ErrorAction Stop
        Write-Host "Security group '$GroupName' created successfully" -ForegroundColor Green
    } catch {
        Write-Host "Error creating security group '$GroupName': $($_.Exception.Message)" -ForegroundColor Red
    }
}

# Example usage
$ouPath = "OU=SecurityGroups,DC=yourdomain,DC=com"
New-ADSecurityGroup -GroupName "IT_Admins" -GroupScope Global -OUPath $ouPath -Description "IT Administrators"
New-ADSecurityGroup -GroupName "HR_Staff" -GroupScope Global -OUPath $ouPath -Description "Human Resources Staff"
New-ADSecurityGroup -GroupName "Finance_Users" -GroupScope Global -OUPath $ouPath -Description "Finance Department Users"
This script defines a function
New-ADSecurityGroup that creates a new
security group in Active Directory.
It then demonstrates how to use
this function to create several
security groups with different
names, scopes, and descriptions.
118. Assign permissions
to an AD group
This script assigns permissions to
an Active Directory group for a
specific folder.
function Set-FolderPermission {
    param (
        [Parameter(Mandatory=$true)]
        [string]$FolderPath,
        [Parameter(Mandatory=$true)]
        [string]$GroupName,
        [Parameter(Mandatory=$true)]
        [System.Security.AccessControl.FileSystemRights]$Rights,
        [Parameter(Mandatory=$true)]
        [System.Security.AccessControl.AccessControlType]$AccessControl
    )
    try {
        $acl = Get-Acl $FolderPath
        $group = New-Object System.Security.Principal.NTAccount($GroupName)
        $accessRule = New-Object System.Security.AccessControl.FileSystemAccessRule($group, $Rights, $AccessControl)
        $acl.AddAccessRule($accessRule)
        Set-Acl -Path $FolderPath -AclObject $acl
        Write-Host "Permissions assigned successfully for group '$GroupName' on folder '$FolderPath'" -ForegroundColor Green
    } catch {
        Write-Host "Error assigning permissions: $($_.Exception.Message)" -ForegroundColor Red
    }
}

# Example usage
$folderPath = "C:\SharedFolder"
$groupName = "YOURDOMAIN\IT_Admins"
Set-FolderPermission -FolderPath $folderPath `
    -GroupName $groupName `
    -Rights "Modify" `
    -AccessControl "Allow"
This script defines a function
Set-FolderPermission that assigns specified
permissions to an AD group for a
given folder. It uses the
System.Security.AccessControl namespace to
create and apply access rules. The
example demonstrates how to give
"Modify" permissions to the
"IT_Admins" group for a shared
folder.
119. Find all locked AD
accounts
This script identifies all locked
user accounts in Active Directory.
Import-Module ActiveDirectory
$lockedAccounts = Search-ADAccount -LockedOut |
    Select-Object Name, SamAccountName, LastLogonDate
if ($lockedAccounts) {
    $lockedAccounts | Format-Table -AutoSize
    $lockedAccounts | Export-Csv -Path "C:\LockedAccounts.csv" -NoTypeInformation
    Write-Host "Found $($lockedAccounts.Count) locked accounts. Results exported to C:\LockedAccounts.csv" -ForegroundColor Yellow
} else {
    Write-Host "No locked accounts found." -ForegroundColor Green
}
This script uses the Search-ADAccount
cmdlet with the -LockedOut parameter
to find all locked user accounts.
It displays the results in a
formatted table and exports them to
a CSV file for further analysis.
120. Export AD hierarchy
This script exports the Active
Directory Organizational Unit (OU)
hierarchy to a text file.
Import-Module ActiveDirectory
function Get-ADHierarchy {
    param (
        [Parameter(Mandatory=$true)]
        [string]$RootOU,
        [int]$Level = 0
    )
    $ous = Get-ADOrganizationalUnit -Filter * -SearchBase $RootOU -SearchScope OneLevel
    foreach ($ou in $ous) {
        $indent = "  " * $Level
        "$indent$($ou.Name)"
        Get-ADHierarchy -RootOU $ou.DistinguishedName -Level ($Level + 1)
    }
}
$domainDN = (Get-ADDomain).DistinguishedName
$outputFile = "C:\ADHierarchy.txt"
"Active Directory Hierarchy" | Out-File $outputFile
"-------------------------" | Out-File $outputFile -Append
Get-ADHierarchy -RootOU $domainDN | Out-File $outputFile -Append
Write-Host "AD hierarchy exported to $outputFile" -ForegroundColor Green
This script defines a recursive
function Get-ADHierarchy that traverses
the OU structure starting from a
specified root OU. It then calls
this function with the domain's
distinguished name as the root,
creating an indented text
representation of the AD hierarchy.
The result is saved to a text file.
121. Automate OU
structure creation
This script automates the creation
of a predefined Organizational Unit
(OU) structure in Active Directory.
Import-Module ActiveDirectory
function New-OUStructure {
    param (
        [Parameter(Mandatory=$true)]
        [string]$ParentOU,
        [Parameter(Mandatory=$true)]
        [string[]]$OUNames
    )
    foreach ($ouName in $OUNames) {
        try {
            $ouPath = "OU=$ouName,$ParentOU"
            New-ADOrganizationalUnit -Name $ouName -Path $ParentOU -ProtectedFromAccidentalDeletion $true -ErrorAction Stop
            Write-Host "Created OU: $ouPath" -ForegroundColor Green
        } catch {
            Write-Host "Error creating OU '$ouName': $($_.Exception.Message)" -ForegroundColor Red
        }
    }
}
$domainDN = (Get-ADDomain).DistinguishedName
# Define OU structure
$topLevelOUs = @("Departments", "Resources", "Security Groups")
$departmentOUs = @("IT", "HR", "Finance", "Marketing", "Sales")
$resourceOUs = @("Computers", "Users", "Servers")
# Create top-level OUs
New-OUStructure -ParentOU $domainDN -OUNames $topLevelOUs
# Create department OUs
New-OUStructure -ParentOU "OU=Departments,$domainDN" -OUNames $departmentOUs
# Create resource OUs
New-OUStructure -ParentOU "OU=Resources,$domainDN" -OUNames $resourceOUs
Write-Host "OU structure creation completed." -ForegroundColor Green
This script defines a function
New-OUStructure that creates multiple OUs
under a specified parent OU. It
then uses this function to create a
predefined OU structure, including
top-level OUs, department OUs, and
resource OUs. The script includes
error handling for each OU creation
attempt.
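The nested -ParentOU strings follow a
simple rule: each level prepends
"OU=<name>," to its parent's
distinguished name. A small
hypothetical helper (not part of the
original script) that builds such paths
from a list of OU names, innermost
first:

```powershell
# Hypothetical helper: builds a distinguished name from OU names
# (innermost first) and a domain DN.
function Join-OUPath {
    param(
        [Parameter(Mandatory=$true)][string[]]$OUNames,
        [Parameter(Mandatory=$true)][string]$DomainDN
    )
    $ouPart = ($OUNames | ForEach-Object { "OU=$_" }) -join ","
    return "$ouPart,$DomainDN"
}
```

For example, OU names @("IT",
"Departments") with a domain DN of
"DC=yourdomain,DC=com" produce
"OU=IT,OU=Departments,DC=yourdomain,DC=com".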
122. Update AD user
attributes
This script updates specified
attributes for Active Directory
users.
Import-Module ActiveDirectory
function Update-ADUserAttribute {
    param (
        [Parameter(Mandatory=$true)]
        [string]$Username,
        [Parameter(Mandatory=$true)]
        [hashtable]$Attributes
    )
    try {
        $user = Get-ADUser -Identity $Username -ErrorAction Stop
        Set-ADUser -Identity $user -Replace $Attributes -ErrorAction Stop
        Write-Host "Updated attributes for user $Username" -ForegroundColor Green
    } catch {
        Write-Host "Error updating attributes for user $($Username): $($_.Exception.Message)" -ForegroundColor Red
    }
}

# Example usage
$usersToUpdate = @(
    @{
        Username = "jdoe"
        Attributes = @{
            Title      = "Senior Developer"
            Department = "IT"
            Office     = "New York"
        }
    },
    @{
        Username = "jsmith"
        Attributes = @{
            Title      = "HR Manager"
            Department = "Human Resources"
            Office     = "London"
        }
    }
)
foreach ($user in $usersToUpdate) {
    Update-ADUserAttribute -Username $user.Username -Attributes $user.Attributes
}
This script defines a function
Update-ADUserAttribute that updates
specified attributes for a given AD
user. It then demonstrates how to
use this function to update
attributes for multiple users,
allowing for batch updates of user
information.
123. Find AD accounts
with expired passwords
This script identifies Active
Directory user accounts with
expired passwords.
Import-Module ActiveDirectory
$expiredAccounts = Search-ADAccount -PasswordExpired -UsersOnly |
    Get-ADUser -Properties Name, SamAccountName, PasswordLastSet, PasswordExpired |
    Select-Object Name, SamAccountName, PasswordLastSet, PasswordExpired
if ($expiredAccounts) {
    $expiredAccounts | Format-Table -AutoSize
    $expiredAccounts | Export-Csv -Path "C:\ExpiredPasswords.csv" -NoTypeInformation
    Write-Host "Found $($expiredAccounts.Count) accounts with expired passwords. Results exported to C:\ExpiredPasswords.csv" -ForegroundColor Yellow
} else {
    Write-Host "No accounts found with expired passwords." -ForegroundColor Green
}
This script uses the Search-ADAccount cmdlet with the -PasswordExpired parameter to find user accounts whose passwords have expired. It retrieves additional properties for these accounts and displays the results in a formatted table. The script also exports the results to a CSV file for further analysis.
124. Bulk password reset
for AD users
This script performs a bulk
password reset for specified Active
Directory users.
Import-Module ActiveDirectory
function Reset-ADUserPassword {
    param (
        [Parameter(Mandatory=$true)]
        [string]$Username,
        [Parameter(Mandatory=$true)]
        [string]$NewPassword
    )
    try {
        $securePassword = ConvertTo-SecureString $NewPassword -AsPlainText -Force
        Set-ADAccountPassword -Identity $Username -NewPassword $securePassword -Reset -ErrorAction Stop
        Set-ADUser -Identity $Username -ChangePasswordAtLogon $true -ErrorAction Stop
        Write-Host "Password reset successful for user $Username" -ForegroundColor Green
    } catch {
        Write-Host "Error resetting password for user ${Username}: $($_.Exception.Message)" -ForegroundColor Red
    }
}
# Example usage
$usersToReset = @(
    @{Username = "jdoe"; NewPassword = "P@ssw0rd123!"},
    @{Username = "jsmith"; NewPassword = "Str0ngP@ss!"},
    @{Username = "asmith"; NewPassword = "C0mpl3xP@ss!"}
)
foreach ($user in $usersToReset) {
    Reset-ADUserPassword -Username $user.Username -NewPassword $user.NewPassword
}
This script defines a function Reset-
ADUserPassword that resets the password
for a given AD user and forces them
to change it at next logon. It then
demonstrates how to use this
function to perform a bulk password
reset for multiple users. Note that
in a production environment, you
should use a more secure method of
handling passwords, such as reading
them from an encrypted file or
generating random passwords.
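As a sketch of the more secure approach mentioned above, the following generates a random password for each user instead of hard-coding one. The helper name, character set, and length are illustrative choices, not part of the original script; adjust them to your domain's password policy.
# Sketch: random 16-character passwords (hypothetical helper, assumed
# character set that omits easily confused characters like l/1 and O/0)
function New-RandomPassword {
    param ([int]$Length = 16)
    $chars = [char[]]'abcdefghijkmnopqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ23456789!@#$%^&*'
    -join (1..$Length | ForEach-Object { $chars | Get-Random })
}
foreach ($user in $usersToReset) {
    $user.NewPassword = New-RandomPassword
}
A generated password would then need to be communicated to the user out of band, or the account simply left with "change at next logon" set.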
125. Disable stale AD
accounts
This script identifies and disables
Active Directory user accounts that
have been inactive for a specified
period.
Import-Module ActiveDirectory
$daysInactive = 90
$inactiveDate = (Get-Date).AddDays(-$daysInactive)
$staleAccounts = Get-ADUser -Filter {
    Enabled -eq $true -and LastLogonTimeStamp -lt $inactiveDate
} -Properties LastLogonTimeStamp, LastLogonDate |
    Select-Object Name, SamAccountName,
        @{Name="LastLogon"; Expression={[DateTime]::FromFileTime($_.LastLogonTimeStamp)}},
        LastLogonDate
if ($staleAccounts) {
    Write-Host "Found $($staleAccounts.Count) stale accounts:" -ForegroundColor Yellow
    $staleAccounts | Format-Table -AutoSize
    $confirmation = Read-Host "Do you want to disable these stale accounts? (Y/N)"
    if ($confirmation -eq "Y") {
        foreach ($account in $staleAccounts) {
            try {
                Disable-ADAccount -Identity $account.SamAccountName -ErrorAction Stop
                Write-Host "Disabled account: $($account.SamAccountName)" -ForegroundColor Green
            } catch {
                Write-Host "Error disabling account $($account.SamAccountName): $($_.Exception.Message)" -ForegroundColor Red
            }
        }
    } else {
        Write-Host "No accounts disabled." -ForegroundColor Yellow
    }
    $staleAccounts | Export-Csv -Path "C:\StaleAccounts.csv" -NoTypeInformation
    Write-Host "Stale account list exported to C:\StaleAccounts.csv" -ForegroundColor Green
} else {
    Write-Host "No stale accounts found." -ForegroundColor Green
}
This script identifies user
accounts that haven't logged on
within the specified number of days
using the LastLogonTimeStamp attribute.
It displays the list of stale
accounts and prompts for
confirmation before disabling them.
The script includes error handling
for the disabling process and
exports the list of stale accounts
to a CSV file.
126. Monitor AD
authentication logs
This script monitors the Active
Directory authentication logs for
failed login attempts.
$logName = "Security"
$eventID = 4625 # Failed logon attempt
$startTime = (Get-Date).AddMinutes(-5)
$endTime = Get-Date
$events = Get-WinEvent -FilterHashtable @{
    LogName = $logName
    ID = $eventID
    StartTime = $startTime
    EndTime = $endTime
} -ErrorAction SilentlyContinue
if ($events) {
    Write-Host "Found $($events.Count) failed login attempts in the last 5 minutes:" -ForegroundColor Yellow
    $failedLogins = $events | ForEach-Object {
        $event = [xml]$_.ToXml()
        [PSCustomObject]@{
            Time = $_.TimeCreated
            Username = $event.Event.EventData.Data | Where-Object { $_.Name -eq 'TargetUserName' } | Select-Object -ExpandProperty '#text'
            Workstation = $event.Event.EventData.Data | Where-Object { $_.Name -eq 'WorkstationName' } | Select-Object -ExpandProperty '#text'
            IPAddress = $event.Event.EventData.Data | Where-Object { $_.Name -eq 'IpAddress' } | Select-Object -ExpandProperty '#text'
        }
    }
    $failedLogins | Format-Table -AutoSize
    $failedLogins | Export-Csv -Path "C:\FailedLogins.csv" -NoTypeInformation -Append
    Write-Host "Failed login attempts appended to C:\FailedLogins.csv" -ForegroundColor Green
} else {
    Write-Host "No failed login attempts found in the last 5 minutes." -ForegroundColor Green
}
This script uses the Get-WinEvent
cmdlet to retrieve failed login
events from the Security log. It
parses the event data to extract
relevant information such as the
username, workstation, and IP
address of the failed login
attempt. The script displays the
results in a formatted table and
appends them to a CSV file for
ongoing monitoring.
127. Query AD schema
details
This script retrieves and displays
information about the Active
Directory schema.
Import-Module ActiveDirectory
function Get-ADSchemaInfo {
    $schema = Get-ADObject -SearchBase ((Get-ADRootDSE).schemaNamingContext) -SearchScope OneLevel -Filter * -Properties *
    $schemaInfo = $schema | ForEach-Object {
        [PSCustomObject]@{
            Name = $_.Name
            LdapDisplayName = $_.lDAPDisplayName
            ObjectClass = $_.objectClass
            AttributeCount = ($_.mayContain + $_.mustContain).Count
            IsDefunct = $_.isDefunct
        }
    }
    return $schemaInfo
}
$schemaDetails = Get-ADSchemaInfo
$schemaDetails | Format-Table -AutoSize
$schemaDetails | Export-Csv -Path "C:\ADSchemaInfo.csv" -NoTypeInformation
Write-Host "AD schema details exported to C:\ADSchemaInfo.csv" -ForegroundColor Green
# Display some statistics
$totalClasses = $schemaDetails.Count
$defunctClasses = ($schemaDetails | Where-Object { $_.IsDefunct -eq $true }).Count
$attributeCount = ($schemaDetails | Measure-Object -Property AttributeCount -Sum).Sum
Write-Host "AD Schema Statistics:" -ForegroundColor Yellow
Write-Host "Total Classes: $totalClasses" -ForegroundColor Cyan
Write-Host "Defunct Classes: $defunctClasses" -ForegroundColor Cyan
Write-Host "Total Attributes: $attributeCount" -ForegroundColor Cyan
This script defines a function Get-
ADSchemaInfo that retrieves
information about the AD schema,
including class names, LDAP display
names, object classes, attribute
counts, and whether the class is
defunct. It displays the results in
a formatted table, exports them to
a CSV file, and provides some basic
statistics about the schema.
128. Generate AD role
reports
This script generates reports on
Active Directory users with
specific roles or group
memberships.
Import-Module ActiveDirectory
function Get-ADRoleMembers {
    param (
        [Parameter(Mandatory=$true)]
        [string[]]$GroupNames
    )
    $roleMembers = @()
    foreach ($groupName in $GroupNames) {
        try {
            $group = Get-ADGroup -Identity $groupName -ErrorAction Stop
            $members = Get-ADGroupMember -Identity $group -Recursive |
                Get-ADUser -Properties Name, SamAccountName, Title, Department, EmailAddress
            foreach ($member in $members) {
                $roleMembers += [PSCustomObject]@{
                    Role = $group.Name
                    Name = $member.Name
                    SamAccountName = $member.SamAccountName
                    Title = $member.Title
                    Department = $member.Department
                    EmailAddress = $member.EmailAddress
                }
            }
        } catch {
            Write-Host "Error processing group ${groupName}: $($_.Exception.Message)" -ForegroundColor Red
        }
    }
    return $roleMembers
}
# Define the roles (groups) to report on
$roles = @(
    "Domain Admins",
    "Enterprise Admins",
    "Schema Admins",
    "Backup Operators",
    "Account Operators",
    "Server Operators"
)
$roleReport = Get-ADRoleMembers -GroupNames $roles
if ($roleReport) {
    $roleReport | Format-Table -AutoSize
    $roleReport | Export-Csv -Path "C:\ADRoleReport.csv" -NoTypeInformation
    Write-Host "AD role report exported to C:\ADRoleReport.csv" -ForegroundColor Green
    # Generate summary
    $summary = $roleReport | Group-Object Role | Select-Object Name, Count
    Write-Host "Role Membership Summary:" -ForegroundColor Yellow
    $summary | Format-Table -AutoSize
} else {
    Write-Host "No role members found." -ForegroundColor Yellow
}
This script defines a function Get-
ADRoleMembers that retrieves members of
specified AD groups (roles) along
with their details. It then uses
this function to generate a report
for a predefined list of important
AD roles. The script displays the
results, exports them to a CSV
file, and provides a summary of
role memberships.
129. Audit AD access
control
This script audits and reports on
the access control settings for
sensitive Active Directory objects.
Import-Module ActiveDirectory
function Get-ADObjectACL {
    param (
        [Parameter(Mandatory=$true)]
        [string]$ObjectDN
    )
    $acl = Get-Acl -Path "AD:$ObjectDN"
    $aclInfo = $acl.Access | ForEach-Object {
        [PSCustomObject]@{
            ObjectDN = $ObjectDN
            IdentityReference = $_.IdentityReference
            AccessControlType = $_.AccessControlType
            ActiveDirectoryRights = $_.ActiveDirectoryRights
            InheritanceType = $_.InheritanceType
            InheritanceFlags = $_.InheritanceFlags
            PropagationFlags = $_.PropagationFlags
        }
    }
    return $aclInfo
}
# Define sensitive objects to audit
$sensitiveObjects = @(
    (Get-ADDomain).DistinguishedName,
    "CN=AdminSDHolder,CN=System,$((Get-ADDomain).DistinguishedName)",
    (Get-ADObject -Filter {name -eq "Domain Controllers"} -SearchBase (Get-ADDomain).DistinguishedName).DistinguishedName
)
$aclReport = @()
foreach ($object in $sensitiveObjects) {
    $aclReport += Get-ADObjectACL -ObjectDN $object
}
if ($aclReport) {
    $aclReport | Format-Table -AutoSize
    $aclReport | Export-Csv -Path "C:\ADAccessControl.csv" -NoTypeInformation
    Write-Host "AD access control report exported to C:\ADAccessControl.csv" -ForegroundColor Green
    # Generate summary
    $summary = $aclReport | Group-Object ObjectDN, IdentityReference | Select-Object Name, Count
    Write-Host "Access Control Summary:" -ForegroundColor Yellow
    $summary | Format-Table -AutoSize
} else {
    Write-Host "No access control information found." -ForegroundColor Yellow
}
This script defines a function Get-
ADObjectACL that retrieves the access
control list (ACL) for a given AD
object. It then uses this function
to audit the ACLs of sensitive AD
objects, such as the domain root,
AdminSDHolder, and Domain
Controllers OU. The script displays
the results, exports them to a CSV
file, and provides a summary of
access control settings.
130. Automate AD group
cleanup
This script identifies and
optionally removes empty or
obsolete Active Directory groups.
Import-Module ActiveDirectory
function Get-EmptyADGroups {
    Get-ADGroup -Filter * -Properties Members, Description, WhenCreated |
        Where-Object { $_.Members.Count -eq 0 } |
        Select-Object Name, GroupCategory, GroupScope, Description, WhenCreated
}
function Get-ObsoleteADGroups {
    $cutoffDate = (Get-Date).AddDays(-180) # Groups not modified in 180 days
    Get-ADGroup -Filter * -Properties Members, Description, WhenCreated, Modified |
        Where-Object { $_.Modified -lt $cutoffDate -and $_.Members.Count -eq 0 } |
        Select-Object Name, GroupCategory, GroupScope, Description, WhenCreated, Modified
}
$emptyGroups = Get-EmptyADGroups
$obsoleteGroups = Get-ObsoleteADGroups
Write-Host "Empty Groups: $($emptyGroups.Count)" -ForegroundColor Yellow
$emptyGroups | Format-Table -AutoSize
Write-Host "Obsolete Groups: $($obsoleteGroups.Count)" -ForegroundColor Yellow
$obsoleteGroups | Format-Table -AutoSize
$emptyGroups | Export-Csv -Path "C:\EmptyGroups.csv" -NoTypeInformation
$obsoleteGroups | Export-Csv -Path "C:\ObsoleteGroups.csv" -NoTypeInformation
Write-Host "Reports exported to C:\EmptyGroups.csv and C:\ObsoleteGroups.csv" -ForegroundColor Green
$confirmation = Read-Host "Do you want to remove the obsolete groups? (Y/N)"
if ($confirmation -eq "Y") {
    foreach ($group in $obsoleteGroups) {
        try {
            Remove-ADGroup -Identity $group.Name -Confirm:$false -ErrorAction Stop
            Write-Host "Removed group: $($group.Name)" -ForegroundColor Green
        } catch {
            Write-Host "Error removing group $($group.Name): $($_.Exception.Message)" -ForegroundColor Red
        }
    }
} else {
    Write-Host "No groups removed." -ForegroundColor Yellow
}
This script defines two functions:
Get-EmptyADGroups to find groups with
no members, and Get-ObsoleteADGroups to
find empty groups that haven't been
modified in the last 180 days. It
displays the results, exports them
to CSV files, and provides the
option to remove the obsolete
groups. The script includes error
handling for the group removal
process.
These scripts provide a
comprehensive set of tools for
managing and maintaining an Active
Directory environment, covering
various aspects such as user and
group management, security
auditing, and cleanup tasks.
Chapter 5: System
Administration (Scripts
131–180)
131. Automate Windows
Updates
function Invoke-WindowsUpdate {
    $UpdateSession = New-Object -ComObject Microsoft.Update.Session
    $UpdateSearcher = $UpdateSession.CreateUpdateSearcher()
    $SearchResult = $UpdateSearcher.Search("IsInstalled=0")
    if ($SearchResult.Updates.Count -eq 0) {
        Write-Host "No updates available."
    } else {
        $UpdatesToInstall = New-Object -ComObject Microsoft.Update.UpdateColl
        foreach ($Update in $SearchResult.Updates) {
            $UpdatesToInstall.Add($Update) | Out-Null
        }
        $Installer = $UpdateSession.CreateUpdateInstaller()
        $Installer.Updates = $UpdatesToInstall
        $InstallResult = $Installer.Install()
        if ($InstallResult.ResultCode -eq 2) { # 2 = orcSucceeded
            Write-Host "Updates installed successfully."
        } else {
            Write-Host "Update installation failed."
        }
    }
}
Invoke-WindowsUpdate
This script automates the process
of checking for and installing
Windows updates. It uses the
Windows Update API to search for
available updates, creates a
collection of updates to install,
and then installs them. The
function reports whether updates
were installed successfully or if
the installation failed.
132. Check Service Status
function Get-ServiceStatus {
    param (
        [Parameter(Mandatory=$true)]
        [string]$ServiceName
    )
    $service = Get-Service -Name $ServiceName -ErrorAction SilentlyContinue
    if ($service) {
        Write-Host "Service Name: $($service.Name)"
        Write-Host "Display Name: $($service.DisplayName)"
        Write-Host "Status: $($service.Status)"
        Write-Host "Start Type: $($service.StartType)"
    } else {
        Write-Host "Service '$ServiceName' not found."
    }
}
Get-ServiceStatus -ServiceName "Spooler"
This script checks the status of a
specified Windows service. It takes
the service name as a parameter and
retrieves information such as the
service name, display name, current
status, and start type. If the
service doesn't exist, it reports
that the service was not found.
133. Start/Stop/Restart a
Service
function Manage-Service {
    param (
        [Parameter(Mandatory=$true)]
        [string]$ServiceName,
        [Parameter(Mandatory=$true)]
        [ValidateSet("Start", "Stop", "Restart")]
        [string]$Action
    )
    $service = Get-Service -Name $ServiceName -ErrorAction SilentlyContinue
    if ($service) {
        try {
            switch ($Action) {
                "Start" {
                    $service | Start-Service
                    Write-Host "Service '$ServiceName' started successfully."
                }
                "Stop" {
                    $service | Stop-Service
                    Write-Host "Service '$ServiceName' stopped successfully."
                }
                "Restart" {
                    $service | Restart-Service
                    Write-Host "Service '$ServiceName' restarted successfully."
                }
            }
        } catch {
            Write-Host "Failed to $Action service '$ServiceName'. Error: $($_.Exception.Message)"
        }
    } else {
        Write-Host "Service '$ServiceName' not found."
    }
}
Manage-Service -ServiceName "Spooler" -Action "Restart"
This script provides functionality
to start, stop, or restart a
Windows service. It takes the
service name and the desired action
as parameters. The script uses a
switch statement to perform the
appropriate action on the service
and reports the result.
134. Configure Service
Startup Types
function Set-ServiceStartupType {
    param (
        [Parameter(Mandatory=$true)]
        [string]$ServiceName,
        [Parameter(Mandatory=$true)]
        [ValidateSet("Automatic", "Manual", "Disabled")]
        [string]$StartupType
    )
    $service = Get-WmiObject -Class Win32_Service -Filter "Name='$ServiceName'"
    if ($service) {
        try {
            $result = $service.ChangeStartMode($StartupType)
            if ($result.ReturnValue -eq 0) {
                Write-Host "Startup type for service '$ServiceName' set to $StartupType successfully."
            } else {
                Write-Host "Failed to set startup type. Error code: $($result.ReturnValue)"
            }
        } catch {
            Write-Host "An error occurred: $($_.Exception.Message)"
        }
    } else {
        Write-Host "Service '$ServiceName' not found."
    }
}
Set-ServiceStartupType -ServiceName "Spooler" -StartupType "Automatic"
This script allows you to configure
the startup type of a Windows
service. It accepts the service
name and the desired startup type
(Automatic, Manual, or Disabled) as
parameters. The script uses WMI to
change the startup mode of the
specified service and reports the
result.
135. Monitor CPU Usage
function Monitor-CPUUsage {
    param (
        [int]$DurationSeconds = 60,
        [int]$IntervalSeconds = 5
    )
    $endTime = (Get-Date).AddSeconds($DurationSeconds)
    while ((Get-Date) -lt $endTime) {
        $cpu = Get-WmiObject Win32_Processor | Measure-Object -Property LoadPercentage -Average | Select-Object Average
        $cpuUsage = [math]::Round($cpu.Average, 2)
        Write-Host "CPU Usage: $cpuUsage%"
        Start-Sleep -Seconds $IntervalSeconds
    }
}
Monitor-CPUUsage -DurationSeconds 300 -IntervalSeconds 10
This script monitors CPU usage over
a specified duration. It takes two
optional parameters: the total
duration to monitor and the
interval between measurements. The
script uses WMI to query the CPU
load percentage and reports the
average usage at each interval.
136. Monitor Memory Usage
function Monitor-MemoryUsage {
    param (
        [int]$DurationSeconds = 60,
        [int]$IntervalSeconds = 5
    )
    $endTime = (Get-Date).AddSeconds($DurationSeconds)
    while ((Get-Date) -lt $endTime) {
        $os = Get-WmiObject Win32_OperatingSystem
        $totalMemory = [math]::Round($os.TotalVisibleMemorySize / 1MB, 2)
        $freeMemory = [math]::Round($os.FreePhysicalMemory / 1MB, 2)
        $usedMemory = $totalMemory - $freeMemory
        $usedPercentage = [math]::Round(($usedMemory / $totalMemory) * 100, 2)
        Write-Host "Memory Usage: $usedMemory GB / $totalMemory GB ($usedPercentage%)"
        Start-Sleep -Seconds $IntervalSeconds
    }
}
Monitor-MemoryUsage -DurationSeconds 300 -IntervalSeconds 10
This script monitors memory usage
over a specified duration. Like the
CPU monitoring script, it takes
parameters for duration and
interval. It uses WMI to query the
operating system for memory
information, calculates the used
memory and percentage, and reports
these values at each interval.
137. Schedule Tasks Using
PowerShell
function Schedule-PowerShellTask {
    param (
        [Parameter(Mandatory=$true)]
        [string]$TaskName,
        [Parameter(Mandatory=$true)]
        [string]$ScriptPath,
        [Parameter(Mandatory=$true)]
        [string]$Schedule
    )
    $action = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -ExecutionPolicy Bypass -File `"$ScriptPath`""
    $trigger = New-ScheduledTaskTrigger -Daily -At $Schedule
    Register-ScheduledTask -TaskName $TaskName -Action $action -Trigger $trigger -RunLevel Highest -Force
    Write-Host "Scheduled task '$TaskName' created successfully."
}
Schedule-PowerShellTask -TaskName "DailyBackup" -ScriptPath "C:\Scripts\Backup.ps1" -Schedule "3:00AM"
This script creates a scheduled
task to run a PowerShell script. It
takes parameters for the task name,
the path to the script, and the
schedule (when to run the task).
The script creates a new scheduled
task action and trigger, then
registers the task with the Windows
Task Scheduler.
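New-ScheduledTaskTrigger supports several cadences besides a daily run. A brief sketch of the common variants (the times and days shown are illustrative, not from the original script):
# Run once at a specific time
$once = New-ScheduledTaskTrigger -Once -At "3:00AM"
# Run every Monday and Friday at 6:00 PM
$weekly = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Monday, Friday -At "6:00PM"
# Run whenever a user logs on
$logon = New-ScheduledTaskTrigger -AtLogOn
Any of these trigger objects can be passed to Register-ScheduledTask in place of the daily trigger used above.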
138. Export Event Logs
function Export-EventLogs {
    param (
        [Parameter(Mandatory=$true)]
        [string]$LogName,
        [Parameter(Mandatory=$true)]
        [string]$ExportPath,
        [int]$Days = 7
    )
    $startDate = (Get-Date).AddDays(-$Days)
    $endDate = Get-Date
    try {
        Get-WinEvent -FilterHashtable @{ LogName = $LogName; StartTime = $startDate; EndTime = $endDate } |
            Export-Csv -Path $ExportPath -NoTypeInformation
        Write-Host "Event log '$LogName' exported successfully to $ExportPath"
    } catch {
        Write-Host "Failed to export event log. Error: $($_.Exception.Message)"
    }
}
Export-EventLogs -LogName "System" -ExportPath "C:\Logs\SystemEvents.csv" -Days 30
This script exports Windows event
logs to a CSV file. It takes
parameters for the log name, export
file path, and the number of days
of logs to export. The script uses
Get-WinEvent to retrieve the
specified logs and exports them to
a CSV file.
139. Clear Event Logs
function Clear-EventLog {
    param (
        [Parameter(Mandatory=$true)]
        [string]$LogName
    )
    try {
        wevtutil.exe cl $LogName
        if ($LASTEXITCODE -ne 0) { throw "wevtutil exited with code $LASTEXITCODE" }
        Write-Host "Event log '$LogName' cleared successfully."
    } catch {
        Write-Host "Failed to clear event log. Error: $($_.Exception.Message)"
    }
}
Clear-EventLog -LogName "System"
This script clears a specified Windows event log. It uses the wevtutil.exe command-line tool to clear the log. The script takes the log name as a parameter and reports whether the operation was successful.
140. Create Custom Event
Logs
function New-CustomEventLog {
    param (
        [Parameter(Mandatory=$true)]
        [string]$LogName,
        [Parameter(Mandatory=$true)]
        [string]$Source
    )
    try {
        New-EventLog -LogName $LogName -Source $Source -ErrorAction Stop
        Write-Host "Custom event log '$LogName' with source '$Source' created successfully."
    } catch {
        Write-Host "Failed to create custom event log. Error: $($_.Exception.Message)"
    }
}
New-CustomEventLog -LogName "MyApplication" -Source "MyAppSource"
This script creates a custom
Windows event log. It takes
parameters for the log name and the
event source. The script uses the
New-EventLog cmdlet to create the
new log and reports the result.
141. Generate System
Performance Reports
function Generate-PerformanceReport {
    param (
        [Parameter(Mandatory=$true)]
        [string]$ReportPath,
        [int]$DurationSeconds = 60
    )
    $endTime = (Get-Date).AddSeconds($DurationSeconds)
    $report = @()
    while ((Get-Date) -lt $endTime) {
        $cpu = Get-WmiObject Win32_Processor | Measure-Object -Property LoadPercentage -Average | Select-Object Average
        $cpuUsage = [math]::Round($cpu.Average, 2)
        $os = Get-WmiObject Win32_OperatingSystem
        $totalMemory = [math]::Round($os.TotalVisibleMemorySize / 1MB, 2)
        $freeMemory = [math]::Round($os.FreePhysicalMemory / 1MB, 2)
        $usedMemory = $totalMemory - $freeMemory
        $memoryUsage = [math]::Round(($usedMemory / $totalMemory) * 100, 2)
        $report += [PSCustomObject]@{
            Timestamp = Get-Date
            CPUUsage = $cpuUsage
            MemoryUsage = $memoryUsage
        }
        Start-Sleep -Seconds 5
    }
    $report | Export-Csv -Path $ReportPath -NoTypeInformation
    Write-Host "Performance report generated at $ReportPath"
}
Generate-PerformanceReport -ReportPath "C:\Reports\PerformanceReport.csv" -DurationSeconds 300
This script generates a system
performance report over a specified
duration. It collects CPU and
memory usage data at regular
intervals and exports the data to a
CSV file. The script takes
parameters for the report file path
and the duration of monitoring.
142. Reboot a Remote
System
function Reboot-RemoteSystem {
    param (
        [Parameter(Mandatory=$true)]
        [string]$ComputerName,
        [PSCredential]$Credential
    )
    try {
        if ($Credential) {
            Restart-Computer -ComputerName $ComputerName -Credential $Credential -Force
        } else {
            Restart-Computer -ComputerName $ComputerName -Force
        }
        Write-Host "Reboot initiated for $ComputerName"
    } catch {
        Write-Host "Failed to reboot $ComputerName. Error: $($_.Exception.Message)"
    }
}
$cred = Get-Credential
Reboot-RemoteSystem -ComputerName "RemotePC01" -Credential $cred
This script reboots a remote
system. It takes parameters for the
computer name and optional
credentials. The script uses the
Restart-Computer cmdlet to initiate
the reboot and reports the result.
143. Shutdown a Remote
System
function Shutdown-RemoteSystem {
    param (
        [Parameter(Mandatory=$true)]
        [string]$ComputerName,
        [PSCredential]$Credential,
        [switch]$Force
    )
    try {
        $params = @{
            ComputerName = $ComputerName
        }
        if ($Credential) {
            $params.Credential = $Credential
        }
        if ($Force) {
            $params.Force = $true
        }
        Stop-Computer @params
        Write-Host "Shutdown initiated for $ComputerName"
    } catch {
        Write-Host "Failed to shutdown $ComputerName. Error: $($_.Exception.Message)"
    }
}
$cred = Get-Credential
Shutdown-RemoteSystem -ComputerName "RemotePC01" -Credential $cred -Force
This script shuts down a remote
system. It's similar to the reboot
script but uses the Stop-Computer
cmdlet instead. The script includes
a Force parameter to initiate an
immediate shutdown if needed.
144. Install Software
Remotely
function Install-RemoteSoftware {
    param (
        [Parameter(Mandatory=$true)]
        [string]$ComputerName,
        [Parameter(Mandatory=$true)]
        [string]$InstallerPath,
        [string]$Arguments,
        [PSCredential]$Credential
    )
    try {
        $session = New-PSSession -ComputerName $ComputerName -Credential $Credential
        Copy-Item -Path $InstallerPath -Destination "C:\Temp" -ToSession $session
        $installerName = Split-Path $InstallerPath -Leaf
        $remotePath = "C:\Temp\$installerName"
        Invoke-Command -Session $session -ScriptBlock {
            param($path, $arguments)
            Start-Process -FilePath $path -ArgumentList $arguments -Wait
        } -ArgumentList $remotePath, $Arguments
        Remove-PSSession $session
        Write-Host "Software installation completed on $ComputerName"
    } catch {
        Write-Host "Failed to install software on $ComputerName. Error: $($_.Exception.Message)"
    }
}
$cred = Get-Credential
Install-RemoteSoftware -ComputerName "RemotePC01" -InstallerPath "C:\Installers\setup.exe" -Arguments "/silent" -Credential $cred
This script installs software on a remote system. It copies the installer to the remote system, executes it with the provided arguments, and then closes the remoting session. The script uses PowerShell remoting to perform these actions.
145. Uninstall Software
Remotely
function Uninstall-RemoteSoftware {
    param (
        [Parameter(Mandatory=$true)]
        [string]$ComputerName,
        [Parameter(Mandatory=$true)]
        [string]$SoftwareName,
        [PSCredential]$Credential
    )
    try {
        $session = New-PSSession -ComputerName $ComputerName -Credential $Credential
        Invoke-Command -Session $session -ScriptBlock {
            param($name)
            $app = Get-WmiObject -Class Win32_Product | Where-Object { $_.Name -like "*$name*" }
            if ($app) {
                $app.Uninstall()
                Write-Host "Software '$name' uninstalled successfully."
            } else {
                Write-Host "Software '$name' not found."
            }
        } -ArgumentList $SoftwareName
        Remove-PSSession $session
    } catch {
        Write-Host "Failed to uninstall software on $ComputerName. Error: $($_.Exception.Message)"
    }
}
$cred = Get-Credential
Uninstall-RemoteSoftware -ComputerName "RemotePC01" -SoftwareName "Adobe Reader" -Credential $cred
This script uninstalls software
from a remote system. It uses
PowerShell remoting to connect to
the remote system, searches for the
specified software using WMI, and
uninstalls it if found.
146. Automate System
Backups
function Backup-System {
    param (
        [Parameter(Mandatory=$true)]
        [string]$SourcePath,
        [Parameter(Mandatory=$true)]
        [string]$DestinationPath,
        [string[]]$ExcludeFolders
    )
    try {
        $date = Get-Date -Format "yyyyMMdd_HHmmss"
        $backupPath = Join-Path $DestinationPath "Backup_$date"
        # Create the backup folder up front so the Robocopy log file can be opened
        New-Item -ItemType Directory -Path $backupPath -Force | Out-Null
        $robocopyArgs = @(
            $SourcePath,
            $backupPath,
            "/MIR",
            "/R:3",
            "/W:10",
            "/MT:16",
            "/NP",
            "/LOG:$backupPath\backup_log.txt"
        )
        foreach ($folder in $ExcludeFolders) {
            $robocopyArgs += "/XD"
            $robocopyArgs += $folder
        }
        & robocopy.exe $robocopyArgs
        Write-Host "Backup completed successfully. Backup location: $backupPath"
    } catch {
        Write-Host "Backup failed. Error: $($_.Exception.Message)"
    }
}
Backup-System -SourcePath "C:\ImportantData" -DestinationPath "D:\Backups" -ExcludeFolders @("C:\ImportantData\Temp", "C:\ImportantData\Logs")
This script automates system
backups using Robocopy. It takes
parameters for the source and
destination paths, as well as an
optional list of folders to
exclude. The script creates a
timestamped backup folder and uses
Robocopy to mirror the source to
the destination, excluding
specified folders.
147. Restore Systems from
Backups
function Restore-SystemFromBackup {
    param (
        [Parameter(Mandatory=$true)]
        [string]$BackupPath,
        [Parameter(Mandatory=$true)]
        [string]$RestorePath,
        [switch]$Force
    )
    try {
        if (-not (Test-Path $BackupPath)) {
            throw "Backup path does not exist."
        }
        if (Test-Path $RestorePath) {
            if ($Force) {
                Remove-Item -Path $RestorePath -Recurse -Force
            } else {
                throw "Restore path already exists. Use -Force to overwrite."
            }
        }
        # Create the restore folder up front so the Robocopy log file can be opened
        New-Item -ItemType Directory -Path $RestorePath -Force | Out-Null
        $robocopyArgs = @(
            $BackupPath,
            $RestorePath,
            "/E",
            "/R:3",
            "/W:10",
            "/MT:16",
            "/NP",
            "/LOG:$RestorePath\restore_log.txt"
        )
        & robocopy.exe $robocopyArgs
        Write-Host "Restore completed successfully. Restored to: $RestorePath"
    } catch {
        Write-Host "Restore failed. Error: $($_.Exception.Message)"
    }
}
Restore-SystemFromBackup -BackupPath "D:\Backups\Backup_20230515_120000" -RestorePath "C:\RestoredData" -Force
This script restores a system from
a backup using Robocopy. It takes
parameters for the backup source
path and the restoration
destination path. The script
includes a Force switch to
overwrite existing data at the
destination. It uses Robocopy to
copy all files and folders from the
backup to the restore location.
148. Detect Hardware
Failures
function Detect-HardwareFailures {
    $hardwareComponents = @(
        @{Name="Disk"; Class="Win32_DiskDrive"},
        @{Name="CPU"; Class="Win32_Processor"},
        @{Name="Memory"; Class="Win32_PhysicalMemory"},
        @{Name="Network Adapter"; Class="Win32_NetworkAdapter"}
    )
    foreach ($component in $hardwareComponents) {
        Write-Host "Checking $($component.Name) status:"
        $devices = Get-WmiObject -Class $component.Class
        foreach ($device in $devices) {
            $status = switch ($component.Name) {
                "Disk" { $device.Status }
                "CPU" { $device.Status }
                "Memory" { if ($device.Status -eq "OK") { "OK" } else { "Failure" } }
                "Network Adapter" { if ($device.NetEnabled) { "OK" } else { "Disabled or Failure" } }
            }
            Write-Host "  $($device.Name): $status"
        }
        Write-Host ""
    }
}
Detect-HardwareFailures
This script checks the status of
various hardware components (disk,
CPU, memory, and network adapters)
to detect potential failures. It
uses WMI queries to retrieve
information about each component
and reports their status.
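On builds that ship the Storage module (Windows 8 / Server 2012 and later), disk health can also be read directly from the storage subsystem instead of the generic WMI Status field. A brief sketch, offered as an alternative to the script above rather than part of it:
# Sketch: query disk health via the Storage module
# (assumes the Storage module is available on this Windows build)
Get-PhysicalDisk |
    Select-Object FriendlyName, MediaType, HealthStatus, OperationalStatus |
    Format-Table -AutoSize
HealthStatus reflects the drive's own health reporting, so it can flag a failing disk that still reports a generic "OK" through Win32_DiskDrive.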
149. Export Hardware
Inventory Details
function Export-HardwareInventory {
    param (
        [Parameter(Mandatory=$true)]
        [string]$OutputPath
    )
    $inventory = @()
    # System Information
    $system = Get-WmiObject Win32_ComputerSystem
    $os = Get-WmiObject Win32_OperatingSystem
    $inventory += [PSCustomObject]@{
        Component = "System"
        Manufacturer = $system.Manufacturer
        Model = $system.Model
        OS = $os.Caption
        Version = $os.Version
    }
    # CPU Information
    $cpu = Get-WmiObject Win32_Processor
    $inventory += [PSCustomObject]@{
        Component = "CPU"
        Name = $cpu.Name
        Cores = $cpu.NumberOfCores
        Threads = $cpu.NumberOfLogicalProcessors
    }
    # Memory Information
    $memory = Get-WmiObject Win32_PhysicalMemory | Measure-Object -Property Capacity -Sum
    $inventory += [PSCustomObject]@{
        Component = "Memory"
        TotalGB = [math]::Round($memory.Sum / 1GB, 2)
    }
    # Disk Information
    Get-WmiObject Win32_DiskDrive | ForEach-Object {
        $inventory += [PSCustomObject]@{
            Component = "Disk"
            Model = $_.Model
            SizeGB = [math]::Round($_.Size / 1GB, 2)
        }
    }
    # Network Adapters
    Get-WmiObject Win32_NetworkAdapter | Where-Object { $_.PhysicalAdapter } | ForEach-Object {
        $inventory += [PSCustomObject]@{
            Component = "Network"
            Name = $_.Name
            MACAddress = $_.MACAddress
        }
    }
    $inventory | Export-Csv -Path $OutputPath -NoTypeInformation
    Write-Host "Hardware inventory exported to $OutputPath"
}
Export-HardwareInventory -OutputPath "C:\Reports\HardwareInventory.csv"
This script exports detailed
hardware inventory information to a
CSV file. It collects information
about the system, CPU, memory,
disks, and network adapters using
WMI queries. The collected data is
then exported to the specified CSV
file.
150. Manage Device
Drivers
function Manage-DeviceDrivers {
    param (
        [Parameter(Mandatory=$true)]
        [ValidateSet("List", "Update", "Rollback", "Disable", "Enable")]
        [string]$Action,
        [string]$DeviceName
    )
    switch ($Action) {
        "List" {
            Get-WmiObject Win32_PnPSignedDriver | Select-Object DeviceName, DriverVersion, DriverDate | Format-Table -AutoSize
        }
        "Update" {
            if (-not $DeviceName) { throw "DeviceName is required for Update action" }
            $device = Get-WmiObject Win32_PnPSignedDriver | Where-Object { $_.DeviceName -eq $DeviceName }
            if ($device) {
                pnputil.exe /add-driver "$($device.InfName)" /install
                Write-Host "Driver update initiated for $DeviceName"
            } else {
                Write-Host "Device not found: $DeviceName"
            }
        }
        "Rollback" {
            if (-not $DeviceName) { throw "DeviceName is required for Rollback action" }
            $device = Get-WmiObject Win32_PnPSignedDriver | Where-Object { $_.DeviceName -eq $DeviceName }
            if ($device) {
                # pnputil has no dedicated rollback switch; removing the current
                # package from the driver store lets Plug and Play fall back to
                # an earlier driver when the device is next scanned.
                pnputil.exe /delete-driver "$($device.InfName)" /force
                Write-Host "Driver rollback initiated for $DeviceName"
            } else {
                Write-Host "Device not found: $DeviceName"
            }
        }
        "Disable" {
            if (-not $DeviceName) { throw "DeviceName is required for Disable action" }
            $device = Get-PnpDevice | Where-Object { $_.FriendlyName -eq $DeviceName }
            if ($device) {
                Disable-PnpDevice -InstanceId $device.InstanceId -Confirm:$false
                Write-Host "Device disabled: $DeviceName"
            } else {
                Write-Host "Device not found: $DeviceName"
            }
        }
        "Enable" {
            if (-not $DeviceName) { throw "DeviceName is required for Enable action" }
            $device = Get-PnpDevice | Where-Object { $_.FriendlyName -eq $DeviceName }
            if ($device) {
                Enable-PnpDevice -InstanceId $device.InstanceId -Confirm:$false
                Write-Host "Device enabled: $DeviceName"
            } else {
                Write-Host "Device not found: $DeviceName"
            }
        }
    }
}
Manage-DeviceDrivers -Action "List"
Manage-DeviceDrivers -Action "Update" -DeviceName "Intel(R) Wireless-AC 9560 160MHz"
This script provides functionality
to manage device drivers. It can
list all drivers, update a specific
driver, roll back a driver, and
disable or enable devices. The
script uses a combination of WMI
queries and built-in Windows
utilities like pnputil.exe to
perform these actions.
151. Audit Firewall Rules
function Audit-FirewallRules {
    param (
        [Parameter(Mandatory=$true)]
        [string]$OutputPath
    )
    $rules = Get-NetFirewallRule | Where-Object { $_.Enabled -eq $true } |
        Select-Object Name, DisplayName, Direction, Action,
            @{Name='Protocol'; Expression={($_ | Get-NetFirewallPortFilter).Protocol}},
            @{Name='LocalPort'; Expression={($_ | Get-NetFirewallPortFilter).LocalPort}},
            @{Name='RemotePort'; Expression={($_ | Get-NetFirewallPortFilter).RemotePort}},
            @{Name='RemoteAddress'; Expression={($_ | Get-NetFirewallAddressFilter).RemoteAddress}}
    $rules | Export-Csv -Path $OutputPath -NoTypeInformation
    Write-Host "Firewall rules exported to $OutputPath"
    Write-Host "Total active rules: $($rules.Count)"
}
Audit-FirewallRules -OutputPath "C:\Reports\FirewallRules.csv"
This script audits the Windows
Firewall rules and exports them to
a CSV file. It retrieves all
enabled firewall rules, including
details such as name, direction,
action, protocol, ports, and remote
addresses. The script then exports
this information to the specified
CSV file.
152. Configure Firewall
Rules
function Configure-FirewallRule {
    param (
        [Parameter(Mandatory=$true)]
        [string]$RuleName,
        [Parameter(Mandatory=$true)]
        [ValidateSet("Inbound", "Outbound")]
        [string]$Direction,
        [Parameter(Mandatory=$true)]
        [ValidateSet("Allow", "Block")]
        [string]$Action,
        [Parameter(Mandatory=$true)]
        [string]$Program,
        [string]$Protocol = "Any",
        [string]$LocalPort,
        [string]$RemotePort,
        [string]$RemoteAddress = "Any"
    )
    try {
        $params = @{
            DisplayName = $RuleName
            Direction   = $Direction
            Action      = $Action
            Program     = $Program
            Protocol    = $Protocol
        }
        if ($LocalPort) { $params.LocalPort = $LocalPort }
        if ($RemotePort) { $params.RemotePort = $RemotePort }
        if ($RemoteAddress -ne "Any") { $params.RemoteAddress = $RemoteAddress }
        New-NetFirewallRule @params
        Write-Host "Firewall rule '$RuleName' created successfully."
    } catch {
        Write-Host "Failed to create firewall rule. Error: $($_.Exception.Message)"
    }
}
Configure-FirewallRule -RuleName "Allow MyApp" -Direction "Inbound" -Action "Allow" -Program "C:\MyApp\MyApp.exe" -Protocol "TCP" -LocalPort "8080"
This script allows you to configure
new Windows Firewall rules. It
provides parameters for specifying
the rule name, direction, action,
program path, protocol, ports, and
remote address. The script uses the
New-NetFirewallRule cmdlet to
create the new rule with the
specified parameters.
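To verify or undo a rule created this way, the matching Get-NetFirewallRule and Remove-NetFirewallRule calls look like the following sketch (using the rule name from the example above):

```powershell
# Inspect the rule and its port filter after creation
Get-NetFirewallRule -DisplayName "Allow MyApp" | Get-NetFirewallPortFilter

# Remove the rule when it is no longer needed
Remove-NetFirewallRule -DisplayName "Allow MyApp"
```

Removing rules by DisplayName deletes every rule with that name, so keep rule names unique when creating them with this script.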
153. Test Network
Connectivity
function Test-NetworkConnectivity {
    param (
        [Parameter(Mandatory=$true)]
        [string[]]$Targets,
        [int]$PingCount = 4,
        [int]$Timeout = 1000
    )
    $results = @()
    foreach ($target in $Targets) {
        $pingResult = Test-Connection -ComputerName $target -Count $PingCount -Quiet
        if ($pingResult) {
            $tcpTest = Test-NetConnection -ComputerName $target -Port 80 -WarningAction SilentlyContinue
            $results += [PSCustomObject]@{
                Target     = $target
                PingStatus = "Success"
                TCPPort80  = $tcpTest.TcpTestSucceeded
                IPAddress  = $tcpTest.RemoteAddress
            }
        } else {
            $results += [PSCustomObject]@{
                Target     = $target
                PingStatus = "Failed"
                TCPPort80  = "N/A"
                IPAddress  = "N/A"
            }
        }
    }
    $results | Format-Table -AutoSize
}
Test-NetworkConnectivity -Targets @("google.com", "8.8.8.8", "192.168.1.1")
This script tests network
connectivity to specified targets.
It performs a ping test and, if
successful, also checks TCP
connectivity on port 80. The
results are displayed in a table
format.
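The script only probes TCP port 80; extending the check to several ports per host is a small loop over Test-NetConnection. A sketch, with "server01" as a hypothetical host name:

```powershell
# Probe several TCP ports on one host instead of only port 80
# ("server01" is a placeholder host name)
foreach ($port in 80, 443, 3389) {
    $r = Test-NetConnection -ComputerName "server01" -Port $port -WarningAction SilentlyContinue
    Write-Host ("{0}:{1} -> {2}" -f $r.ComputerName, $port, $r.TcpTestSucceeded)
}
```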
154. Resolve DNS Records
function Resolve-DNSRecords {
    param (
        [Parameter(Mandatory=$true)]
        [string[]]$DomainNames,
        [ValidateSet("A", "AAAA", "CNAME", "MX", "NS", "TXT")]
        [string[]]$RecordTypes = @("A", "AAAA", "CNAME", "MX", "NS", "TXT")
    )
    $results = @()
    foreach ($domain in $DomainNames) {
        foreach ($type in $RecordTypes) {
            try {
                $records = Resolve-DnsName -Name $domain -Type $type -ErrorAction Stop
                foreach ($record in $records) {
                    $results += [PSCustomObject]@{
                        Domain     = $domain
                        RecordType = $type
                        Name       = $record.Name
                        Value      = switch ($type) {
                            "A"     { $record.IPAddress }
                            "AAAA"  { $record.IPAddress }
                            "CNAME" { $record.NameHost }
                            "MX"    { "$($record.NameExchange) (Priority: $($record.Preference))" }
                            "NS"    { $record.NameHost }
                            "TXT"   { $record.Strings -join " " }
                            default { "N/A" }
                        }
                    }
                }
            } catch {
                $results += [PSCustomObject]@{
                    Domain     = $domain
                    RecordType = $type
                    Name       = "N/A"
                    Value      = "Failed to resolve"
                }
            }
        }
    }
    $results | Format-Table -AutoSize
}
Resolve-DNSRecords -DomainNames @("example.com", "google.com") -RecordTypes @("A", "MX", "TXT")
This script resolves various types
of DNS records for specified domain
names. It supports A, AAAA, CNAME,
MX, NS, and TXT record types. The
results are displayed in a table
format.
155. Monitor Open Ports
function Monitor-OpenPorts {
    param (
        [int[]]$PortsToMonitor = @(80, 443, 3389, 22),
        [int]$MonitorDurationMinutes = 5,
        [int]$IntervalSeconds = 30
    )
    $endTime = (Get-Date).AddMinutes($MonitorDurationMinutes)
    $results = @{}
    while ((Get-Date) -lt $endTime) {
        $netstat = netstat -ano | Select-String -Pattern "LISTENING"
        foreach ($port in $PortsToMonitor) {
            $isOpen = $netstat | Select-String ":$port "
            if (-not $results.ContainsKey($port)) {
                $results[$port] = @()
            }
            $results[$port] += [PSCustomObject]@{
                Timestamp = Get-Date
                IsOpen    = [bool]$isOpen
            }
        }
        Start-Sleep -Seconds $IntervalSeconds
    }
    foreach ($port in $PortsToMonitor) {
        $openCount = ($results[$port] | Where-Object { $_.IsOpen }).Count
        $totalChecks = $results[$port].Count
        $percentageOpen = [math]::Round(($openCount / $totalChecks) * 100, 2)
        Write-Host "Port $port was open $openCount out of $totalChecks checks ($percentageOpen% of the time)"
    }
}
Monitor-OpenPorts -PortsToMonitor @(80, 443, 3389) -MonitorDurationMinutes 10 -IntervalSeconds 60
This script monitors specified
ports to check if they are open
over a given period. It uses the
netstat command to check port
status at regular intervals and
reports the percentage of time each
port was open.
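Parsing netstat text is fragile; the `:$port ` pattern can also match the remote-address column. On Windows 8/Server 2012 and later, Get-NetTCPConnection returns listening sockets as objects, which makes the check exact. A sketch of the equivalent test:

```powershell
# Exact check: is anything listening on TCP port 443 right now?
$isOpen = [bool](Get-NetTCPConnection -State Listen -LocalPort 443 -ErrorAction SilentlyContinue)
Write-Host "Port 443 open: $isOpen"
```

Substituting this call for the netstat pipeline inside the monitoring loop removes the string-matching ambiguity.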
156. Automate Disk
Cleanup
function Invoke-DiskCleanup {
    param (
        [string]$DriveLetter = "C"
    )
    # Define cleanup categories
    $cleanupCategories = @{
        "Active Setup Temp Folders" = 2
        "Downloaded Program Files" = 3
        "Internet Cache Files" = 4
        "Offline Pages Files" = 5
        "Old ChkDsk Files" = 6
        "Recycle Bin" = 7
        "Setup Log Files" = 8
        "System error memory dump files" = 9
        "System error minidump files" = 10
        "Temporary Files" = 11
        "Temporary Setup Files" = 12
        "Thumbnail Cache" = 13
        "Windows Update Cleanup" = 14
    }
    # Enable all cleanup categories
    foreach ($category in $cleanupCategories.GetEnumerator()) {
        $path = "HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Explorer\VolumeCaches\$($category.Key)"
        Set-ItemProperty -Path $path -Name "StateFlags0001" -Value 2 -Type DWORD
    }
    # Run Disk Cleanup
    Start-Process -FilePath cleanmgr -ArgumentList "/sagerun:1" -Wait
    Write-Host "Disk Cleanup completed for drive $DriveLetter"
}
Invoke-DiskCleanup -DriveLetter "C"
This script automates the Windows
Disk Cleanup utility. It enables
various cleanup categories and then
runs the cleanmgr utility to
perform the cleanup. This can help
free up disk space by removing
unnecessary files.
157. Mount and Unmount
Drives
function Manage-DriveMounts {
    param (
        [Parameter(Mandatory=$true)]
        [ValidateSet("Mount", "Unmount")]
        [string]$Action,
        [Parameter(Mandatory=$true)]
        [string]$Path,
        [string]$DriveLetter
    )
    try {
        switch ($Action) {
            "Mount" {
                if (-not $DriveLetter) {
                    throw "DriveLetter is required for Mount action"
                }
                $mountResult = Mount-DiskImage -ImagePath $Path -PassThru
                # Resolve the mounted image to its disk and first partition,
                # then assign the requested drive letter
                $partition = $mountResult | Get-DiskImage | Get-Disk | Get-Partition | Select-Object -First 1
                $partition | Set-Partition -NewDriveLetter $DriveLetter
                Write-Host "Mounted $Path to drive $DriveLetter"
            }
            "Unmount" {
                Dismount-DiskImage -ImagePath $Path
                Write-Host "Unmounted $Path"
            }
        }
    } catch {
        Write-Host "Failed to $Action drive. Error: $($_.Exception.Message)"
    }
}
Manage-DriveMounts -Action "Mount" -Path "C:\Images\disk.iso" -DriveLetter "Z"
Manage-DriveMounts -Action "Unmount" -Path "C:\Images\disk.iso"
This script provides functionality
to mount and unmount drives,
particularly useful for ISO files
or VHD/VHDX files. It can mount an
image file to a specified drive
letter or unmount a previously
mounted image.
158. Configure Virtual
Memory
function Set-VirtualMemory {
    param (
        [Parameter(Mandatory=$true)]
        [string]$DriveLetter,
        [Parameter(Mandatory=$true)]
        [int]$InitialSizeMB,
        [Parameter(Mandatory=$true)]
        [int]$MaximumSizeMB
    )
    try {
        $computerSystem = Get-WmiObject -Class Win32_ComputerSystem -EnableAllPrivileges
        $computerSystem.AutomaticManagedPagefile = $false
        $computerSystem.Put()
        $pagefile = Get-WmiObject -Class Win32_PageFileSetting -Filter "SettingID='pagefile.sys @ ${DriveLetter}:'"
        if ($pagefile) {
            $pagefile.InitialSize = $InitialSizeMB
            $pagefile.MaximumSize = $MaximumSizeMB
            $pagefile.Put()
        } else {
            $pagefileSetting = ([WmiClass]"Win32_PageFileSetting").CreateInstance()
            $pagefileSetting.Name = "${DriveLetter}:\pagefile.sys"
            $pagefileSetting.InitialSize = $InitialSizeMB
            $pagefileSetting.MaximumSize = $MaximumSizeMB
            $pagefileSetting.Put()
        }
        Write-Host "Virtual memory configured successfully. Reboot required for changes to take effect."
    } catch {
        Write-Host "Failed to configure virtual memory. Error: $($_.Exception.Message)"
    }
}
Set-VirtualMemory -DriveLetter "C" -InitialSizeMB 4096 -MaximumSizeMB 8192
This script configures the virtual
memory (page file) settings for a
specified drive. It allows you to
set both the initial and maximum
size of the page file. Note that
changes to virtual memory settings
typically require a system reboot
to take effect.
159. Format Disks
function Format-NewDisk {
    param (
        [Parameter(Mandatory=$true)]
        [string]$DiskNumber,
        [string]$FileSystem = "NTFS",
        [string]$NewLabel,
        [string]$DriveLetter
    )
    try {
        # Initialize the disk if it's not already initialized
        $disk = Get-Disk -Number $DiskNumber
        if ($disk.PartitionStyle -eq 'RAW') {
            Initialize-Disk -Number $DiskNumber -PartitionStyle GPT
        }
        # Create a new partition using all available space
        $partition = New-Partition -DiskNumber $DiskNumber -UseMaximumSize
        # If a drive letter is specified, assign it
        if ($DriveLetter) {
            $partition | Set-Partition -NewDriveLetter $DriveLetter
        }
        # Format the new partition
        $formatParams = @{
            FileSystem         = $FileSystem
            NewFileSystemLabel = $NewLabel
            Confirm            = $false
        }
        $partition | Format-Volume @formatParams
        Write-Host "Disk $DiskNumber formatted successfully."
        if ($DriveLetter) {
            Write-Host "New volume created on drive $DriveLetter."
        } else {
            Write-Host "New volume created. Drive letter assigned automatically."
        }
    } catch {
        Write-Host "Failed to format disk. Error: $($_.Exception.Message)"
    }
}
Format-NewDisk -DiskNumber 1 -FileSystem "NTFS" -NewLabel "DataDrive" -DriveLetter "E"
This script formats a new disk,
creating a partition that uses all
available space and formatting it
with the specified file system. It
also allows you to set a label for
the new volume and assign a
specific drive letter if desired.
160. Encrypt Disks
function Encrypt-Disk {
    param (
        [Parameter(Mandatory=$true)]
        [string]$DriveLetter,
        [switch]$UseTpm,
        [string]$Password
    )
    try {
        if ($UseTpm) {
            # Check if TPM is present and activated
            $tpm = Get-Tpm
            if (-not $tpm.TpmPresent -or -not $tpm.TpmReady) {
                throw "TPM is not present or not ready."
            }
            Enable-BitLocker -MountPoint $DriveLetter -TpmProtector -UsedSpaceOnly
        } elseif ($Password) {
            $securePassword = ConvertTo-SecureString $Password -AsPlainText -Force
            Enable-BitLocker -MountPoint $DriveLetter -PasswordProtector -Password $securePassword -UsedSpaceOnly
        } else {
            throw "Either -UseTpm or -Password must be specified."
        }
        Write-Host "BitLocker encryption initiated on drive $DriveLetter."
        Write-Host "Please wait for the encryption process to complete."
    } catch {
        Write-Host "Failed to encrypt disk. Error: $($_.Exception.Message)"
    }
}
# Example usage:
# Encrypt-Disk -DriveLetter "E" -UseTpm
# Encrypt-Disk -DriveLetter "E" -Password "YourStrongPassword"
This script initiates BitLocker
encryption on a specified drive. It
supports two methods of protection:
TPM-based (if available) or
password-based. The script checks
for TPM availability when that
option is selected and initiates
the encryption process using the
chosen method.
161. Clone a Disk
function Clone-Disk {
    param (
        [Parameter(Mandatory=$true)]
        [string]$SourceDiskNumber,
        [Parameter(Mandatory=$true)]
        [string]$DestinationDiskNumber
    )
    try {
        # Validate source and destination disks
        $sourceDisk = Get-Disk -Number $SourceDiskNumber
        $destDisk = Get-Disk -Number $DestinationDiskNumber
        if ($sourceDisk.Size -gt $destDisk.Size) {
            throw "Destination disk is smaller than the source disk."
        }
        # Clear the destination disk
        Clear-Disk -Number $DestinationDiskNumber -RemoveData -Confirm:$false
        # Clone the partition layout and data
        Get-Partition -DiskNumber $SourceDiskNumber | ForEach-Object {
            $sourcePartition = $_
            $destPartition = New-Partition -DiskNumber $DestinationDiskNumber -Size $sourcePartition.Size -AssignDriveLetter
            # Copy data
            $sourceVolume = $sourcePartition | Get-Volume
            $destVolume = $destPartition | Get-Volume
            robocopy "$($sourceVolume.DriveLetter):\" "$($destVolume.DriveLetter):\" /E /ZB /COPYALL /R:1 /W:1
            # Set partition type if it's not a basic data partition
            if ($sourcePartition.Type -ne "Basic") {
                $destPartition | Set-Partition -PartitionType $sourcePartition.Type
            }
        }
        Write-Host "Disk cloning completed successfully."
    } catch {
        Write-Host "Failed to clone disk. Error: $($_.Exception.Message)"
    }
}
# Example usage:
# Clone-Disk -SourceDiskNumber 1 -DestinationDiskNumber 2
This script clones one disk to
another. It first validates that
the destination disk is large
enough, then clears the destination
disk. It then recreates the
partition structure of the source
disk on the destination disk and
uses robocopy to copy all data.
This script should be used with
caution as it will erase all data
on the destination disk.
162. Create VHDs using
PowerShell
function New-VirtualHardDisk {
    param (
        [Parameter(Mandatory=$true)]
        [string]$Path,
        [Parameter(Mandatory=$true)]
        [int64]$SizeBytes,
        [ValidateSet("VHD", "VHDX")]
        [string]$Format = "VHDX",
        [ValidateSet("Dynamic", "Fixed")]
        [string]$Type = "Dynamic"
    )
    try {
        $params = @{
            Path      = $Path
            SizeBytes = $SizeBytes
        }
        # New-VHD infers VHD vs. VHDX from the file extension,
        # so make sure $Path matches the requested format
        if ([System.IO.Path]::GetExtension($Path) -ne ".$($Format.ToLower())") {
            throw "The file extension of Path must match the requested format ($Format)."
        }
        if ($Type -eq "Fixed") {
            $params.Add("Fixed", $true)
        }
        New-VHD @params
        Write-Host "Virtual Hard Disk created successfully at $Path"
        # Optionally, mount the VHD
        $mountVHD = Read-Host "Do you want to mount the VHD? (Y/N)"
        if ($mountVHD -eq "Y") {
            $vhd = Mount-VHD -Path $Path -PassThru
            $disk = $vhd | Get-Disk
            $disk | Initialize-Disk -PartitionStyle GPT -PassThru |
                New-Partition -UseMaximumSize -AssignDriveLetter |
                Format-Volume -FileSystem NTFS -NewFileSystemLabel "NewVHD" -Confirm:$false
            Write-Host "VHD mounted and initialized."
        }
    } catch {
        Write-Host "Failed to create VHD. Error: $($_.Exception.Message)"
    }
}
# Example usage:
# New-VirtualHardDisk -Path "C:\VHDs\NewDisk.vhdx" -SizeBytes 10GB
This script creates a new Virtual
Hard Disk (VHD or VHDX) file. It
allows you to specify the path,
size, format (VHD or VHDX), and
type (Dynamic or Fixed). After
creating the VHD, it offers the
option to mount, initialize, and
format the new virtual disk.
163. Manage Scheduled
Tasks
function Manage-ScheduledTask {
    param (
        [Parameter(Mandatory=$true)]
        [ValidateSet("Create", "Delete", "Enable", "Disable", "List")]
        [string]$Action,
        [string]$TaskName,
        [string]$TaskPath = "\",
        [string]$Execute,
        [string]$Argument,
        [string]$Schedule
    )
    try {
        switch ($Action) {
            "Create" {
                if (-not $TaskName -or -not $Execute -or -not $Schedule) {
                    throw "TaskName, Execute, and Schedule are required for Create action."
                }
                $action = New-ScheduledTaskAction -Execute $Execute -Argument $Argument
                $trigger = New-ScheduledTaskTrigger -Once -At $Schedule
                Register-ScheduledTask -TaskName $TaskName -TaskPath $TaskPath -Action $action -Trigger $trigger
                Write-Host "Task '$TaskName' created successfully."
            }
            "Delete" {
                if (-not $TaskName) {
                    throw "TaskName is required for Delete action."
                }
                Unregister-ScheduledTask -TaskName $TaskName -TaskPath $TaskPath -Confirm:$false
                Write-Host "Task '$TaskName' deleted successfully."
            }
            "Enable" {
                if (-not $TaskName) {
                    throw "TaskName is required for Enable action."
                }
                Enable-ScheduledTask -TaskName $TaskName -TaskPath $TaskPath
                Write-Host "Task '$TaskName' enabled successfully."
            }
            "Disable" {
                if (-not $TaskName) {
                    throw "TaskName is required for Disable action."
                }
                Disable-ScheduledTask -TaskName $TaskName -TaskPath $TaskPath
                Write-Host "Task '$TaskName' disabled successfully."
            }
            "List" {
                Get-ScheduledTask -TaskPath $TaskPath | Format-Table -AutoSize
            }
        }
    } catch {
        Write-Host "Failed to manage scheduled task. Error: $($_.Exception.Message)"
    }
}
# Example usage:
# Manage-ScheduledTask -Action Create -TaskName "DailyBackup" -Execute "C:\Scripts\Backup.ps1" -Schedule "3:00AM"
# Manage-ScheduledTask -Action List
# Manage-ScheduledTask -Action Delete -TaskName "DailyBackup"
This script provides comprehensive
management of scheduled tasks. It
allows you to create, delete,
enable, disable, and list scheduled
tasks. When creating a task, you
can specify the task name,
execution path, arguments, and
schedule.
164. Monitor Printer
Status
function Monitor-PrinterStatus {
    param (
        [int]$IntervalSeconds = 60,
        [int]$DurationMinutes = 60
    )
    $endTime = (Get-Date).AddMinutes($DurationMinutes)
    while ((Get-Date) -lt $endTime) {
        $printers = Get-Printer
        foreach ($printer in $printers) {
            $status = $printer.PrinterStatus
            $jobCount = (Get-PrintJob -PrinterName $printer.Name).Count
            Write-Host "$(Get-Date) - Printer: $($printer.Name)"
            Write-Host "  Status: $status"
            Write-Host "  Jobs in queue: $jobCount"
            Write-Host ""
        }
        Write-Host "--------------------"
        Start-Sleep -Seconds $IntervalSeconds
    }
}
# Example usage:
# Monitor-PrinterStatus -IntervalSeconds 300 -DurationMinutes 120
This script monitors the status of
all printers on the system. It
periodically checks each printer's
status and the number of jobs in
its queue. The script runs for a
specified duration, checking at
regular intervals.
165. Add/Remove Printers
Remotely
function Manage-RemotePrinter {
    param (
        [Parameter(Mandatory=$true)]
        [string]$ComputerName,
        [Parameter(Mandatory=$true)]
        [ValidateSet("Add", "Remove")]
        [string]$Action,
        [Parameter(Mandatory=$true)]
        [string]$PrinterName,
        [string]$DriverName,
        [string]$PortName
    )
    try {
        $session = New-PSSession -ComputerName $ComputerName
        switch ($Action) {
            "Add" {
                if (-not $DriverName -or -not $PortName) {
                    throw "DriverName and PortName are required for Add action."
                }
                Invoke-Command -Session $session -ScriptBlock {
                    param($name, $driver, $port)
                    Add-Printer -Name $name -DriverName $driver -PortName $port
                } -ArgumentList $PrinterName, $DriverName, $PortName
                Write-Host "Printer '$PrinterName' added successfully to $ComputerName."
            }
            "Remove" {
                Invoke-Command -Session $session -ScriptBlock {
                    param($name)
                    Remove-Printer -Name $name -Confirm:$false
                } -ArgumentList $PrinterName
                Write-Host "Printer '$PrinterName' removed successfully from $ComputerName."
            }
        }
        Remove-PSSession $session
    } catch {
        Write-Host "Failed to manage remote printer. Error: $($_.Exception.Message)"
    }
}
# Example usage:
# Manage-RemotePrinter -ComputerName "RemotePC01" -Action Add -PrinterName "HP LaserJet" -DriverName "HP LaserJet PCL 6" -PortName "USB001"
# Manage-RemotePrinter -ComputerName "RemotePC01" -Action Remove -PrinterName "HP LaserJet"
This script allows you to add or
remove printers on a remote
computer. It uses PowerShell
remoting to connect to the remote
system and perform the requested
action. When adding a printer, you
need to specify the printer name,
driver name, and port name.
166. Configure Network
Adapters
function Configure-NetworkAdapter {
    param (
        [Parameter(Mandatory=$true)]
        [string]$AdapterName,
        [ValidateSet("DHCP", "Static")]
        [string]$IPType = "DHCP",
        [string]$IPAddress,
        [string]$SubnetMask,
        [string]$Gateway,
        [string[]]$DNSServers
    )
    try {
        $adapter = Get-NetAdapter -Name $AdapterName -ErrorAction Stop
        if ($IPType -eq "DHCP") {
            $adapter | Set-NetIPInterface -Dhcp Enabled
            $adapter | Set-DnsClientServerAddress -ResetServerAddresses
            Write-Host "Network adapter '$AdapterName' set to use DHCP."
        } else {
            if (-not $IPAddress -or -not $SubnetMask) {
                throw "IPAddress and SubnetMask are required for Static IP configuration."
            }
            $adapter | New-NetIPAddress -IPAddress $IPAddress -PrefixLength $SubnetMask -DefaultGateway $Gateway
            if ($DNSServers) {
                $adapter | Set-DnsClientServerAddress -ServerAddresses $DNSServers
            }
            Write-Host "Network adapter '$AdapterName' configured with static IP: $IPAddress"
        }
    } catch {
        Write-Host "Failed to configure network adapter. Error: $($_.Exception.Message)"
    }
}
# Example usage:
# Configure-NetworkAdapter -AdapterName "Ethernet" -IPType Static -IPAddress "192.168.1.100" -SubnetMask "24" -Gateway "192.168.1.1" -DNSServers "8.8.8.8","8.8.4.4"
# Configure-NetworkAdapter -AdapterName "Wi-Fi" -IPType DHCP
This script allows you to configure
network adapters. You can set the
adapter to use DHCP or configure a
static IP address, along with the
subnet prefix length (passed via
the SubnetMask parameter, e.g. 24),
gateway, and DNS servers. The
script uses PowerShell cmdlets to
modify the network adapter
settings.
167. Manage Wireless
Profiles
function Manage-WirelessProfile {
    param (
        [Parameter(Mandatory=$true)]
        [ValidateSet("Add", "Remove", "List")]
        [string]$Action,
        [string]$ProfileName,
        [string]$SSID,
        [string]$Password
    )
    try {
        switch ($Action) {
            "Add" {
                if (-not $ProfileName -or -not $SSID -or -not $Password) {
                    throw "ProfileName, SSID, and Password are required for Add action."
                }
                $xmlProfile = @"
<?xml version="1.0"?>
<WLANProfile xmlns="http://www.microsoft.com/networking/WLAN/profile/v1">
    <name>$ProfileName</name>
    <SSIDConfig>
        <SSID>
            <name>$SSID</name>
        </SSID>
    </SSIDConfig>
    <connectionType>ESS</connectionType>
    <connectionMode>auto</connectionMode>
    <MSM>
        <security>
            <authEncryption>
                <authentication>WPA2PSK</authentication>
                <encryption>AES</encryption>
                <useOneX>false</useOneX>
            </authEncryption>
            <sharedKey>
                <keyType>passPhrase</keyType>
                <protected>false</protected>
                <keyMaterial>$Password</keyMaterial>
            </sharedKey>
        </security>
    </MSM>
</WLANProfile>
"@
                $xmlProfile | Set-Content -Path "$env:TEMP\WiFiProfile.xml"
                netsh wlan add profile filename="$env:TEMP\WiFiProfile.xml" user=all
                Remove-Item -Path "$env:TEMP\WiFiProfile.xml"
                Write-Host "Wireless profile '$ProfileName' added successfully."
            }
            "Remove" {
                if (-not $ProfileName) {
                    throw "ProfileName is required for Remove action."
                }
                netsh wlan delete profile name="$ProfileName"
                Write-Host "Wireless profile '$ProfileName' removed successfully."
            }
            "List" {
                $profiles = netsh wlan show profiles | Select-String "All User Profile" | ForEach-Object { $_ -replace ".*:\s+" }
                $profiles | ForEach-Object { Write-Host $_ }
            }
        }
    } catch {
        Write-Host "Failed to manage wireless profile. Error: $($_.Exception.Message)"
    }
}
# Example usage:
# Manage-WirelessProfile -Action Add -ProfileName "HomeWiFi" -SSID "MyHomeNetwork" -Password "MySecurePassword"
# Manage-WirelessProfile -Action List
# Manage-WirelessProfile -Action Remove -ProfileName "HomeWiFi"
This script manages wireless
network profiles. It allows you to
add new profiles, remove existing
ones, and list all profiles. When
adding a profile, it creates an XML
configuration file for the wireless
network and uses netsh to add it to
the system.
168. Update System BIOS
function Update-SystemBIOS {
    param (
        [Parameter(Mandatory=$true)]
        [string]$BIOSUpdatePath,
        [switch]$ForceRestart
    )
    try {
        # Check if running with administrator privileges
        $currentPrincipal = New-Object Security.Principal.WindowsPrincipal([Security.Principal.WindowsIdentity]::GetCurrent())
        if (-not $currentPrincipal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)) {
            throw "This script must be run as an Administrator."
        }
        # Check if the BIOS update file exists
        if (-not (Test-Path $BIOSUpdatePath)) {
            throw "BIOS update file not found at specified path."
        }
        # Get current BIOS version
        $currentBIOS = Get-WmiObject -Class Win32_BIOS
        Write-Host "Current BIOS Version: $($currentBIOS.SMBIOSBIOSVersion)"
        # Execute BIOS update
        Write-Host "Starting BIOS update..."
        Start-Process -FilePath $BIOSUpdatePath -ArgumentList "/s" -Wait
        # Check if restart is required
        if ($ForceRestart) {
            Write-Host "Restarting system to complete BIOS update..."
            Restart-Computer -Force
        } else {
            Write-Host "BIOS update completed. Please restart your system to apply changes."
        }
    } catch {
        Write-Host "Failed to update BIOS. Error: $($_.Exception.Message)"
    }
}
# Example usage:
# Update-SystemBIOS -BIOSUpdatePath "C:\BIOS\BIOSUpdate.exe" -ForceRestart
This script attempts to update the
system BIOS. It requires the path
to the BIOS update executable and
can optionally force a system
restart after the update. Note that
BIOS updates are vendor-specific,
so this script may need to be
adjusted based on the specific
manufacturer's update process.
169. Monitor System
Updates
function Monitor-SystemUpdates {
    param (
        [int]$CheckIntervalHours = 24,
        [int]$MonitorDurationDays = 30
    )
    $endDate = (Get-Date).AddDays($MonitorDurationDays)
    $updateLog = @()
    while ((Get-Date) -lt $endDate) {
        $session = New-Object -ComObject Microsoft.Update.Session
        $searcher = $session.CreateUpdateSearcher()
        $result = $searcher.Search("IsInstalled=0 and Type='Software'")
        $updateCount = $result.Updates.Count
        $criticalCount = ($result.Updates | Where-Object { $_.MsrcSeverity -eq "Critical" }).Count
        $updateLog += [PSCustomObject]@{
            Date            = Get-Date
            TotalUpdates    = $updateCount
            CriticalUpdates = $criticalCount
        }
        Write-Host "$(Get-Date) - Total Updates: $updateCount, Critical Updates: $criticalCount"
        if ($updateCount -gt 0) {
            Write-Host "Updates available:"
            foreach ($update in $result.Updates) {
                Write-Host " - $($update.Title)"
            }
        }
        # Start-Sleep has no -Hours parameter; convert to seconds
        Start-Sleep -Seconds ($CheckIntervalHours * 3600)
    }
    # Export log to CSV
    $logPath = "C:\Logs\UpdateMonitor_$(Get-Date -Format 'yyyyMMdd').csv"
    $updateLog | Export-Csv -Path $logPath -NoTypeInformation
    Write-Host "Update monitoring completed. Log exported to $logPath"
}
# Example usage:
# Monitor-SystemUpdates -CheckIntervalHours 12 -MonitorDurationDays 7
This script monitors available
system updates over a specified
period. It checks for updates at
regular intervals and logs the
total number of updates and
critical updates. The results are
displayed in the console and
exported to a CSV file at the end
of the monitoring period.
170. Configure Windows
Defender Settings
function Configure-WindowsDefender {
    param (
        [switch]$EnableRealTimeMonitoring,
        [switch]$EnableCloudProtection,
        [switch]$EnableSampleSubmission,
        [int]$QuickScanInterval = 24,
        [string]$ExclusionPath
    )
    try {
        # Enable or disable real-time monitoring
        Set-MpPreference -DisableRealtimeMonitoring (!$EnableRealTimeMonitoring)
        # Enable or disable cloud-delivered protection
        Set-MpPreference -MAPSReporting ($EnableCloudProtection ? 2 : 0)
        # Enable or disable automatic sample submission
        Set-MpPreference -SubmitSamplesConsent ($EnableSampleSubmission ? 1 : 0)
        # Set quick scan interval (in hours)
        Set-MpPreference -QuickScanInterval $QuickScanInterval
        # Add exclusion path if provided
        if ($ExclusionPath) {
            Add-MpPreference -ExclusionPath $ExclusionPath
        }
        Write-Host "Windows Defender settings updated successfully:"
        Write-Host "Real-Time Monitoring: $($EnableRealTimeMonitoring ? 'Enabled' : 'Disabled')"
        Write-Host "Cloud Protection: $($EnableCloudProtection ? 'Enabled' : 'Disabled')"
        Write-Host "Sample Submission: $($EnableSampleSubmission ? 'Enabled' : 'Disabled')"
        Write-Host "Quick Scan Interval: $QuickScanInterval hours"
        if ($ExclusionPath) {
            Write-Host "Exclusion Path Added: $ExclusionPath"
        }
    }
    catch {
        Write-Host "Failed to configure Windows Defender. Error: $($_.Exception.Message)"
    }
}
# Example usage:
# Configure-WindowsDefender -EnableRealTimeMonitoring -EnableCloudProtection -QuickScanInterval 12 -ExclusionPath "C:\ExcludedFolder"
This script configures various
settings for Windows Defender. It
allows you to enable or disable
real-time monitoring, cloud
protection, and automatic sample
submission. You can also set the
quick scan interval and add
exclusion paths.
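Note that the `condition ? a : b` ternary operator used in this script requires PowerShell 7 or later. Under Windows PowerShell 5.1, the same calls can be written with if expressions; a sketch of the cloud-protection setting:

```powershell
# Windows PowerShell 5.1 equivalent of the ternary-based MAPS setting
$maps = if ($EnableCloudProtection) { 2 } else { 0 }
Set-MpPreference -MAPSReporting $maps
```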
171. Manage SSL
Certificates
function Manage-SSLCertificate {
    param (
        [Parameter(Mandatory=$true)]
        [ValidateSet("Import", "Export", "List")]
        [string]$Action,
        [string]$CertPath,
        [string]$CertPassword,
        [string]$CertStore = "LocalMachine\My"
    )
    try {
        switch ($Action) {
            "Import" {
                if (-not $CertPath -or -not $CertPassword) {
                    throw "CertPath and CertPassword are required for Import action."
                }
                $securePassword = ConvertTo-SecureString -String $CertPassword -Force -AsPlainText
                Import-PfxCertificate -FilePath $CertPath -CertStoreLocation "Cert:\$CertStore" -Password $securePassword
                Write-Host "Certificate imported successfully."
            }
            "Export" {
                if (-not $CertPath -or -not $CertPassword) {
                    throw "CertPath and CertPassword are required for Export action."
                }
                $cert = Get-ChildItem -Path "Cert:\$CertStore" | Out-GridView -Title "Select a certificate to export" -OutputMode Single
                if ($cert) {
                    $securePassword = ConvertTo-SecureString -String $CertPassword -Force -AsPlainText
                    Export-PfxCertificate -Cert $cert -FilePath $CertPath -Password $securePassword
                    Write-Host "Certificate exported successfully to $CertPath"
                } else {
                    Write-Host "No certificate selected for export."
                }
            }
            "List" {
                Get-ChildItem -Path "Cert:\$CertStore" | Format-Table Subject, Thumbprint, NotBefore, NotAfter
            }
        }
    }
    catch {
        Write-Host "Failed to manage SSL certificate. Error: $($_.Exception.Message)"
    }
}
# Example usage:
# Manage-SSLCertificate -Action Import -CertPath "C:\Certs\certificate.pfx" -CertPassword "SecurePassword"
# Manage-SSLCertificate -Action Export -CertPath "C:\Certs\exported.pfx" -CertPassword "NewPassword"
# Manage-SSLCertificate -Action List
This script provides functionality
to manage SSL certificates. It can
import certificates from PFX files,
export certificates to PFX files,
and list all certificates in a
specified certificate store.
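A common follow-up to listing certificates is flagging the ones close to expiry. A small sketch over the same default store path:

```powershell
# Certificates in the machine store expiring within the next 30 days
Get-ChildItem -Path "Cert:\LocalMachine\My" |
    Where-Object { $_.NotAfter -lt (Get-Date).AddDays(30) } |
    Format-Table Subject, Thumbprint, NotAfter
```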
172. Automate User
Session Logging
function Enable-UserSessionLogging {
    param (
        [string]$LogPath = "C:\Logs\UserSessions.log"
    )
    try {
        # Ensure the log directory exists
        $logDir = Split-Path $LogPath -Parent
        if (-not (Test-Path $logDir)) {
            New-Item -Path $logDir -ItemType Directory
        }
        # Create or modify the event filter
        $filterXml = @"
<QueryList>
    <Query Id="0" Path="Security">
        <Select Path="Security">*[System[(EventID=4624 or EventID=4634)]]</Select>
    </Query>
</QueryList>
"@
        $filter = New-Object System.Diagnostics.Eventing.Reader.EventLogQuery("Security", [System.Diagnostics.Eventing.Reader.PathType]::LogName, $filterXml)
        $reader = New-Object System.Diagnostics.Eventing.Reader.EventLogReader($filter)
        # Start logging
        Write-Host "Starting user session logging. Press Ctrl+C to stop."
        while ($true) {
            $event = $reader.ReadEvent()
            if ($event) {
                $eventId = $event.Id
                $timeCreated = $event.TimeCreated
                $username = $event.Properties[5].Value
                $logMessage = switch ($eventId) {
                    4624 { "$timeCreated - User Logon: $username" }
                    4634 { "$timeCreated - User Logoff: $username" }
                }
                Add-Content -Path $LogPath -Value $logMessage
                Write-Host $logMessage
            }
        }
    }
    catch {
        Write-Host "Error in session logging: $($_.Exception.Message)"
    }
    finally {
        if ($reader) {
            $reader.Dispose()
        }
        Write-Host "Session logging stopped."
    }
}
# Example usage:
# Enable-UserSessionLogging
This script automates user session
logging by monitoring Windows
security events for user logons
(Event ID 4624) and logoffs (Event
ID 4634). It continuously writes
these events to a log file and
displays them in the console.
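If you only need a retrospective report rather than live tailing, a one-shot Get-WinEvent query against the same event IDs avoids the blocking reader loop:

```powershell
# Last 24 hours of logon (4624) and logoff (4634) events from the Security log
Get-WinEvent -FilterHashtable @{
    LogName   = 'Security'
    Id        = 4624, 4634
    StartTime = (Get-Date).AddDays(-1)
} | Select-Object TimeCreated, Id, @{n='User'; e={$_.Properties[5].Value}}
```

Like the script above, this requires administrator rights to read the Security log.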
173. Configure Remote
Desktop Access
function Configure-RemoteDesktop {
    param (
        [switch]$Enable,
        [switch]$Disable,
        [string[]]$AllowedUsers
    )
    try {
        if ($Enable -and $Disable) {
            throw "Cannot specify both -Enable and -Disable switches."
        }
        if ($Enable) {
            Set-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\Terminal Server' -Name "fDenyTSConnections" -Value 0
            Enable-NetFirewallRule -DisplayGroup "Remote Desktop"
            Write-Host "Remote Desktop access has been enabled."
        }
        elseif ($Disable) {
            Set-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\Terminal Server' -Name "fDenyTSConnections" -Value 1
            Disable-NetFirewallRule -DisplayGroup "Remote Desktop"
            Write-Host "Remote Desktop access has been disabled."
        }
        if ($AllowedUsers) {
            $group = [ADSI]"WinNT://./Remote Desktop Users,group"
            foreach ($user in $AllowedUsers) {
                try {
                    $group.Add("WinNT://$env:COMPUTERNAME/$user,user")
                    Write-Host "User $user added to Remote Desktop Users group."
                }
                catch {
                    Write-Host "Failed to add user $user. Error: $($_.Exception.Message)"
                }
            }
        }
        Write-Host "Remote Desktop configuration completed."
    }
    catch {
        Write-Host "Failed to configure Remote Desktop. Error: $($_.Exception.Message)"
    }
}
# Example usage:
# Configure-RemoteDesktop -Enable -AllowedUsers "User1", "User2"
# Configure-RemoteDesktop -Disable
This script configures Remote
Desktop access on the local
machine. It can enable or disable
Remote Desktop, configure the
Windows Firewall to allow Remote
Desktop connections, and add
specified users to the "Remote
Desktop Users" group.
174. Generate Uptime
Reports
function Generate-UptimeReport {
    param (
        [string]$ComputerName = $env:COMPUTERNAME,
        [int]$DaysToReport = 30,
        [string]$OutputPath = "C:\Reports\UptimeReport.csv"
    )
    try {
        $startDate = (Get-Date).AddDays(-$DaysToReport)
        $events = Get-WinEvent -ComputerName $ComputerName -FilterHashtable @{
            LogName = 'System'
            ID = @(1074, 6005, 6006)
            StartTime = $startDate
        } -ErrorAction Stop
        $uptimeLog = @()
        $lastBootTime = $null
        $totalUptime = [TimeSpan]::Zero
        $totalDowntime = [TimeSpan]::Zero
        foreach ($event in $events | Sort-Object TimeCreated) {
            switch ($event.Id) {
                6005 { # System boot
                    if ($lastBootTime) {
                        $downtime = $event.TimeCreated - $lastBootTime
                        $totalDowntime += $downtime
                        $uptimeLog += [PSCustomObject]@{
                            BootTime = $lastBootTime
                            ShutdownTime = $event.TimeCreated
                            Uptime = "N/A"
                            Downtime = $downtime.ToString()
                        }
                    }
                    $lastBootTime = $event.TimeCreated
                }
                { 1074, 6006 -contains $_ } { # System shutdown
                    if ($lastBootTime) {
                        $uptime = $event.TimeCreated - $lastBootTime
                        $totalUptime += $uptime
                        $uptimeLog += [PSCustomObject]@{
                            BootTime = $lastBootTime
                            ShutdownTime = $event.TimeCreated
                            Uptime = $uptime.ToString()
                            Downtime = "N/A"
                        }
                        $lastBootTime = $null
                    }
                }
            }
        }
        # Add current session if system is still running
        if ($lastBootTime) {
            $currentUptime = (Get-Date) - $lastBootTime
            $totalUptime += $currentUptime
            $uptimeLog += [PSCustomObject]@{
                BootTime = $lastBootTime
                ShutdownTime = "Current Session"
                Uptime = $currentUptime.ToString()
                Downtime = "N/A"
            }
        }
        $uptimeLog | Export-Csv -Path $OutputPath -NoTypeInformation
        $totalTime = $totalUptime + $totalDowntime
        $uptimePercentage = ($totalUptime.TotalSeconds / $totalTime.TotalSeconds) * 100
        Write-Host "Uptime report generated for $ComputerName"
        Write-Host "Total Uptime: $($totalUptime.ToString())"
        Write-Host "Total Downtime: $($totalDowntime.ToString())"
        Write-Host "Uptime Percentage: $($uptimePercentage.ToString('F2'))%"
        Write-Host "Detailed report saved to: $OutputPath"
    }
    catch {
        Write-Host "Failed to generate uptime report. Error: $($_.Exception.Message)"
    }
}
# Example usage:
# Generate-UptimeReport -DaysToReport 60 -OutputPath "C:\Reports\UptimeReport.csv"
This script generates an uptime
report for the specified computer.
It analyzes system events to
determine boot and shutdown times,
calculates total uptime and
downtime, and provides an uptime
percentage. The detailed report is
saved to a CSV file.
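The uptime percentage in the summary is a plain ratio of two TimeSpans; a minimal sketch of the calculation with sample values:

```powershell
# 30-day window: 29.5 days up, 12 hours down
$totalUptime   = New-TimeSpan -Days 29 -Hours 12
$totalDowntime = New-TimeSpan -Hours 12
$totalTime = $totalUptime + $totalDowntime
$uptimePercentage = ($totalUptime.TotalSeconds / $totalTime.TotalSeconds) * 100
$uptimePercentage.ToString("F2")   # 98.33 for the values above
```

Using TotalSeconds (a double) rather than the Seconds component keeps the division meaningful across multi-day spans.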
175. Enable/Disable
BitLocker
function Manage-BitLocker {
param (
[Parameter(Mandatory=$true)]
[string]$DriveLetter,
[Parameter(Mandatory=$true)]
[ValidateSet("Enable", "Disable")]
[string]$Action,
[ValidateSet("TPM", "Password",
"TPMAndPIN")]
[string]$ProtectionMethod = "TPM",
[string]$Password,
[string]$PIN
)
    try {
        $drive = Get-BitLockerVolume -MountPoint $DriveLetter -ErrorAction Stop
        if ($Action -eq "Enable") {
            if ($drive.ProtectionStatus -ne "On") {
                switch ($ProtectionMethod) {
                    "TPM" {
                        Enable-BitLocker -MountPoint $DriveLetter -TpmProtector -UsedSpaceOnly
                    }
                    "Password" {
                        if (-not $Password) {
                            throw "Password is required for Password protection method."
                        }
                        $securePassword = ConvertTo-SecureString $Password -AsPlainText -Force
                        Enable-BitLocker -MountPoint $DriveLetter -PasswordProtector -Password $securePassword -UsedSpaceOnly
                    }
                    "TPMAndPIN" {
                        if (-not $PIN) {
                            throw "PIN is required for TPMAndPIN protection method."
                        }
                        $securePIN = ConvertTo-SecureString $PIN -AsPlainText -Force
                        Enable-BitLocker -MountPoint $DriveLetter -TpmAndPinProtector -Pin $securePIN -UsedSpaceOnly
                    }
                }
                Write-Host "BitLocker enabled on drive $DriveLetter with $ProtectionMethod protection."
            } else {
                Write-Host "BitLocker is already enabled on drive $DriveLetter."
            }
        } elseif ($Action -eq "Disable") {
            if ($drive.ProtectionStatus -eq "On") {
                Disable-BitLocker -MountPoint $DriveLetter
                Write-Host "BitLocker disabled on drive $DriveLetter."
            } else {
                Write-Host "BitLocker is not enabled on drive $DriveLetter."
            }
        }
    }
    catch {
        Write-Host "Failed to manage BitLocker. Error: $($_.Exception.Message)"
    }
}
# Example usage:
# Manage-BitLocker -DriveLetter "C:" -Action Enable -ProtectionMethod TPM
# Manage-BitLocker -DriveLetter "D:" -Action Enable -ProtectionMethod Password -Password "SecurePassword123"
# Manage-BitLocker -DriveLetter "E:" -Action Disable
This script provides functionality
to enable or disable BitLocker
encryption on a specified drive. It
supports different protection
methods including TPM, Password,
and TPM with PIN. The script checks
the current BitLocker status before
attempting to enable or disable it.
176. Export Installed
Updates
function Export-InstalledUpdates {
    param (
        [string]$OutputPath = "C:\Reports\InstalledUpdates.csv",
        [switch]$IncludeHotfixes
    )
    try {
        $updates = Get-WmiObject -Class Win32_QuickFixEngineering |
            Select-Object Description, HotFixID, InstalledOn
        if (-not $IncludeHotfixes) {
            $updates = $updates | Where-Object { $_.HotFixID -notlike "KB*" }
        }
        $session = New-Object -ComObject Microsoft.Update.Session
        $searcher = $session.CreateUpdateSearcher()
        $historyCount = $searcher.GetTotalHistoryCount()
        $history = $searcher.QueryHistory(0, $historyCount) |
            Where-Object { $_.Operation -eq 1 } |
            Select-Object @{Name="KB"; Expression={$_.Title -replace '^.*\((KB\d+)\).*$', '$1'}},
                          Title,
                          @{Name="InstalledOn"; Expression={$_.Date}}
        $allUpdates = $updates + $history | Sort-Object InstalledOn -Descending
        $allUpdates | Export-Csv -Path $OutputPath -NoTypeInformation
        Write-Host "Installed updates exported to $OutputPath"
        Write-Host "Total updates exported: $($allUpdates.Count)"
    }
    catch {
        Write-Host "Failed to export installed updates. Error: $($_.Exception.Message)"
    }
}
# Example usage:
# Export-InstalledUpdates -IncludeHotfixes
This script exports a list of
installed Windows updates to a CSV
file. It combines information from
both the Windows Management
Instrumentation (WMI) and the
Windows Update API to provide a
comprehensive list of updates. The
-IncludeHotfixes switch allows you to
include or exclude hotfixes from
the report.
177. Monitor Disk Health
function Monitor-DiskHealth {
    param (
        [int]$WarningThresholdPercent = 10,
        [string]$OutputPath = "C:\Reports\DiskHealth.html"
    )
    try {
        $disks = Get-PhysicalDisk | Where-Object MediaType -ne "Unspecified"
        $diskHealth = @()
        foreach ($disk in $disks) {
            $smart = Get-StorageReliabilityCounter -PhysicalDisk $disk
            # Wear is reported as a percentage used (mainly meaningful for SSDs)
            $usage = $smart.Wear
            $lifeRemaining = 100 - $smart.Wear
            $status = if ($lifeRemaining -le $WarningThresholdPercent) { "Warning" } else { "OK" }
            $diskHealth += [PSCustomObject]@{
                DiskNumber = $disk.DeviceId
                Model = $disk.Model
                MediaType = $disk.MediaType
                HealthStatus = $disk.HealthStatus
                OperationalStatus = $disk.OperationalStatus
                Size = "$([math]::Round($disk.Size / 1GB, 2)) GB"
                UsageRate = "$([math]::Round($usage, 2))%"
                LifeRemaining = "$lifeRemaining%"
                Status = $status
            }
        }
        $htmlReport = @"
<html>
<head>
<style>
    body { font-family: Arial, sans-serif; }
    table { border-collapse: collapse; width: 100%; }
    th, td { border: 1px solid #ddd; padding: 8px; text-align: left; }
    th { background-color: #f2f2f2; }
    .warning { background-color: #ffcccc; }
</style>
</head>
<body>
<h1>Disk Health Report</h1>
<table>
<tr>
    <th>Disk Number</th>
    <th>Model</th>
    <th>Media Type</th>
    <th>Health Status</th>
    <th>Operational Status</th>
    <th>Size</th>
    <th>Usage Rate</th>
    <th>Life Remaining</th>
    <th>Status</th>
</tr>
$(foreach ($disk in $diskHealth) {
    $class = if ($disk.Status -eq "Warning") { ' class="warning"' } else { '' }
    "<tr$class>"
    "<td>$($disk.DiskNumber)</td>"
    "<td>$($disk.Model)</td>"
    "<td>$($disk.MediaType)</td>"
    "<td>$($disk.HealthStatus)</td>"
    "<td>$($disk.OperationalStatus)</td>"
    "<td>$($disk.Size)</td>"
    "<td>$($disk.UsageRate)</td>"
    "<td>$($disk.LifeRemaining)</td>"
    "<td>$($disk.Status)</td>"
    "</tr>"
})
</table>
</body>
</html>
"@
        $htmlReport | Out-File -FilePath $OutputPath
        Write-Host "Disk health report generated: $OutputPath"
    }
    catch {
        Write-Host "Failed to monitor disk health. Error: $($_.Exception.Message)"
    }
}
# Example usage:
# Monitor-DiskHealth -WarningThresholdPercent 15
This script monitors the health of
physical disks in the system. It
retrieves various health metrics
including wear percentage, usage
rate, and operational status. The
script generates an HTML report
with color-coded warnings for disks
that are nearing the end of their
lifespan.
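The underlying reliability counters can also be inspected directly before trusting a rendered report. A quick sketch (which properties are populated varies by drive, bus type, and driver, so expect blanks on some hardware):

```powershell
# Dump the raw SMART-derived counters for every physical disk
Get-PhysicalDisk | Get-StorageReliabilityCounter |
    Select-Object DeviceId, Temperature, Wear, ReadErrorsTotal, WriteErrorsTotal, PowerOnHours
```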
178. Set Default System
Policies
function Set-DefaultSystemPolicies {
param (
[switch]$EnableFirewall,
[switch]$DisableGuestAccount,
[switch]$EnableAutoUpdates,
[switch]$SetComplexPasswords,
[int]$PasswordHistoryCount = 5
)
    try {
        if ($EnableFirewall) {
            Set-NetFirewallProfile -Profile Domain,Public,Private -Enabled True
            Write-Host "Windows Firewall enabled for all profiles."
        }
        if ($DisableGuestAccount) {
            Disable-LocalUser -Name "Guest"
            Write-Host "Guest account disabled."
        }
        if ($EnableAutoUpdates) {
            $AutoUpdatePath = "HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\WindowsUpdate\Auto Update"
            Set-ItemProperty -Path $AutoUpdatePath -Name AUOptions -Value 4
            Write-Host "Automatic Windows Updates enabled."
        }
        if ($SetComplexPasswords) {
            # Backticks keep `$CHICAGO$ literal inside the expandable here-string
            $securityPolicy = @"
[Unicode]
Unicode=yes
[System Access]
PasswordComplexity = 1
MinimumPasswordLength = 12
PasswordHistorySize = $PasswordHistoryCount
[Version]
signature="`$CHICAGO`$"
Revision=1
"@
            $securityPolicy | Out-File "$env:TEMP\secpol.cfg"
            secedit /configure /db "$env:WINDIR\security\local.sdb" /cfg "$env:TEMP\secpol.cfg" /areas SECURITYPOLICY
            Remove-Item -Path "$env:TEMP\secpol.cfg" -Force
            Write-Host "Complex password policy set."
        }
        Write-Host "Default system policies have been applied."
    }
    catch {
        Write-Host "Failed to set default system policies. Error: $($_.Exception.Message)"
    }
}
# Example usage:
# Set-DefaultSystemPolicies -EnableFirewall -DisableGuestAccount -EnableAutoUpdates -SetComplexPasswords
This script sets various default
system policies to enhance
security. It can enable the Windows
Firewall, disable the guest
account, enable automatic Windows
updates, and set complex password
requirements. The script uses a
combination of PowerShell cmdlets
and the secedit tool to apply these
policies.
179. Manage Local
Security Policies
function Manage-LocalSecurityPolicy {
param (
[Parameter(Mandatory=$true)]
[ValidateSet("View", "Set")]
[string]$Action,
[string]$PolicyName,
[string]$PolicyValue
)
    try {
        $securityPolicies = @{
            "LockoutDuration" = "LockoutDuration"
            "LockoutThreshold" = "LockoutBadCount"
            "ResetLockoutCount" = "ResetLockoutCount"
            "PasswordComplexity" = "PasswordComplexity"
            "PasswordLength" = "MinimumPasswordLength"
            "PasswordAge" = "MaximumPasswordAge"
        }
        if ($Action -eq "View") {
            # Export the current policy once, then read each value from the export
            $tempFile = "$env:TEMP\secpol.cfg"
            secedit /export /cfg $tempFile | Out-Null
            $results = @()
            foreach ($policy in $securityPolicies.Keys) {
                $value = (Get-Content $tempFile | Select-String $securityPolicies[$policy]).ToString().Split('=')[1].Trim()
                $results += [PSCustomObject]@{
                    Policy = $policy
                    Value = $value
                }
            }
            Remove-Item $tempFile -Force
            $results | Format-Table -AutoSize
        }
        elseif ($Action -eq "Set") {
            if (-not $PolicyName -or -not $PolicyValue) {
                throw "PolicyName and PolicyValue are required for Set action."
            }
            if (-not $securityPolicies.ContainsKey($PolicyName)) {
                throw "Invalid PolicyName. Supported policies are: $($securityPolicies.Keys -join ', ')"
            }
            $tempFile = "$env:TEMP\secpol.cfg"
            secedit /export /cfg $tempFile | Out-Null
            $content = Get-Content $tempFile
            $content = $content -replace "$($securityPolicies[$PolicyName]) = .*", "$($securityPolicies[$PolicyName]) = $PolicyValue"
            $content | Set-Content $tempFile
            secedit /configure /db "$env:WINDIR\security\local.sdb" /cfg $tempFile /areas SECURITYPOLICY
            Remove-Item $tempFile -Force
            Write-Host "Policy '$PolicyName' set to '$PolicyValue'"
        }
    }
    catch {
        Write-Host "Failed to manage local security policy. Error: $($_.Exception.Message)"
    }
}
# Example usage:
# Manage-LocalSecurityPolicy -Action View
# Manage-LocalSecurityPolicy -Action Set -PolicyName "PasswordLength" -PolicyValue 14
This script allows viewing and
modifying local security policies.
It supports common policies such as
lockout duration, password
complexity, and minimum password
length. The script uses the secedit
tool to export, modify, and import
security policies.
180. Configure System
Restore Points
function Manage-SystemRestorePoints {
param (
[Parameter(Mandatory=$true)]
[ValidateSet("Enable", "Disable",
"Create", "List", "Restore")]
[string]$Action,
[string]$DriveLetter = "C:",
[string]$Description,
[int]$RestorePointNumber
)
    try {
        switch ($Action) {
            "Enable" {
                Enable-ComputerRestore -Drive $DriveLetter
                Write-Host "System Restore enabled for drive $DriveLetter"
            }
            "Disable" {
                Disable-ComputerRestore -Drive $DriveLetter
                Write-Host "System Restore disabled for drive $DriveLetter"
            }
            "Create" {
                if (-not $Description) {
                    $Description = "Manually created restore point"
                }
                Checkpoint-Computer -Description $Description -RestorePointType "MODIFY_SETTINGS"
                Write-Host "System Restore point created: $Description"
            }
            "List" {
                $restorePoints = Get-ComputerRestorePoint
                if ($restorePoints) {
                    $restorePoints | Format-Table SequenceNumber, CreationTime, Description -AutoSize
                } else {
                    Write-Host "No restore points found."
                }
            }
            "Restore" {
                if (-not $RestorePointNumber) {
                    throw "RestorePointNumber is required for Restore action."
                }
                $restorePoint = Get-ComputerRestorePoint | Where-Object { $_.SequenceNumber -eq $RestorePointNumber }
                if ($restorePoint) {
                    $confirmation = Read-Host "Are you sure you want to restore to this point? (Y/N)"
                    if ($confirmation -eq 'Y') {
                        Restore-Computer -RestorePoint $RestorePointNumber -Confirm:$false
                        Write-Host "System restored to point $RestorePointNumber. Please restart your computer."
                    } else {
                        Write-Host "Restore operation cancelled."
                    }
                } else {
                    Write-Host "Restore point $RestorePointNumber not found."
                }
            }
        }
    }
    catch {
        Write-Host "Failed to manage System Restore points. Error: $($_.Exception.Message)"
    }
}
# Example usage:
# Manage-SystemRestorePoints -Action Enable -DriveLetter "C:"
# Manage-SystemRestorePoints -Action Create -Description "Before software installation"
# Manage-SystemRestorePoints -Action List
# Manage-SystemRestorePoints -Action Restore -RestorePointNumber 5
This script provides comprehensive
management of System Restore
points. It allows enabling or
disabling System Restore for a
specific drive, creating new
restore points, listing existing
restore points, and restoring the
system to a previous point. The
script includes safety measures
such as confirmation before
performing a system restore.
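One caveat worth scripting around: by default, Windows silently skips creating a new restore point if one was created within the previous 24 hours, so repeated Create actions may appear to succeed without adding anything. The throttle can be relaxed through the SystemRestorePointCreationFrequency registry value (in minutes; 0 removes the limit):

```powershell
# Allow Checkpoint-Computer to create restore points without the 24-hour throttle
Set-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\SystemRestore" `
    -Name "SystemRestorePointCreationFrequency" -Value 0
```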
Chapter 6: Network
Administration (Scripts
181-220)
181. Scan Network Ports
function Scan-NetworkPorts {
    param (
        [string]$IPAddress,
        [int[]]$Ports
    )
    foreach ($port in $Ports) {
        $tcpClient = New-Object System.Net.Sockets.TcpClient
        try {
            $tcpClient.Connect($IPAddress, $port)
            Write-Output "Port $port is open on $IPAddress"
        }
        catch {
            Write-Output "Port $port is closed on $IPAddress"
        }
        finally {
            $tcpClient.Close()
        }
    }
}
# Example usage
Scan-NetworkPorts -IPAddress "192.168.1.1" -Ports 80, 443, 22, 3389
This script defines a function Scan-
NetworkPorts that takes an IP address
and an array of port numbers as
parameters. It then attempts to
connect to each specified port on
the given IP address using a
TcpClient object. If the connection
is successful, it reports that the
port is open; otherwise, it reports
that the port is closed. This can
be useful for identifying open
services on a network device or
troubleshooting connectivity
issues.
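Because Connect blocks until the TCP stack gives up, probing a filtered port can stall the scan for many seconds. A common refinement (a sketch, not part of the script above; Test-Port and its parameters are illustrative names) is to race ConnectAsync against a short timeout:

```powershell
# Probe a single port with a 500 ms timeout instead of the default TCP timeout
function Test-Port {
    param ([string]$IPAddress, [int]$Port, [int]$TimeoutMs = 500)
    $tcpClient = New-Object System.Net.Sockets.TcpClient
    try {
        $task = $tcpClient.ConnectAsync($IPAddress, $Port)
        # Wait returns $true only if the connection completed within the timeout;
        # a refused connection surfaces as an exception and falls into the catch
        if ($task.Wait($TimeoutMs) -and $tcpClient.Connected) { "open" } else { "closed/filtered" }
    }
    catch { "closed/filtered" }
    finally { $tcpClient.Close() }
}
```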
182. Detect Network
Outages
function Detect-NetworkOutage {
    param (
        [string[]]$Hosts,
        [int]$Interval = 60
    )
    while ($true) {
        $date = Get-Date
        # $host is a reserved automatic variable, so use a different loop variable
        foreach ($target in $Hosts) {
            if (Test-Connection -ComputerName $target -Count 1 -Quiet) {
                Write-Output "$date - $target is reachable"
            }
            else {
                Write-Output "$date - $target is unreachable"
                # You could add alerting logic here
            }
        }
        Start-Sleep -Seconds $Interval
    }
}
# Example usage
Detect-NetworkOutage -Hosts "8.8.8.8", "192.168.1.1" -Interval 300
This script defines a function
Detect-NetworkOutage that continuously
pings a list of specified hosts at
a given interval. It uses the Test-
Connection cmdlet to check if each
host is reachable. If a host is
unreachable, it logs the outage.
This script can be useful for
monitoring critical network devices
or services and detecting when they
become unavailable.
183. Monitor Network
Traffic
function Monitor-NetworkTraffic {
    param (
        [string]$InterfaceName,
        [int]$Duration = 60
    )
    $start = Get-NetAdapterStatistics -Name $InterfaceName
    Start-Sleep -Seconds $Duration
    $end = Get-NetAdapterStatistics -Name $InterfaceName
    $bytesReceived = $end.ReceivedBytes - $start.ReceivedBytes
    $bytesSent = $end.SentBytes - $start.SentBytes
    $mbpsReceived = ($bytesReceived * 8) / ($Duration * 1000000)
    $mbpsSent = ($bytesSent * 8) / ($Duration * 1000000)
    Write-Output "Network traffic on $InterfaceName over $Duration seconds:"
    Write-Output "Received: $($mbpsReceived.ToString('F2')) Mbps"
    Write-Output "Sent: $($mbpsSent.ToString('F2')) Mbps"
}
# Example usage
Monitor-NetworkTraffic -InterfaceName "Ethernet" -Duration 300
This script defines a function
Monitor-NetworkTraffic that measures the
network traffic on a specified
network interface over a given
duration. It uses the Get-
NetAdapterStatistics cmdlet to retrieve
the bytes sent and received at the
start and end of the monitoring
period. It then calculates the
average throughput in Mbps for both
incoming and outgoing traffic. This
can be useful for identifying
network bottlenecks or unusual
traffic patterns.
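The throughput conversion is bytes to bits to megabits per second; a minimal sketch of the arithmetic with sample numbers:

```powershell
$bytes = 375000000   # bytes transferred during the measurement window
$seconds = 300
# bytes * 8 = bits; divide by (seconds * 1,000,000) for megabits per second
$mbps = ($bytes * 8) / ($seconds * 1000000)
$mbps   # 10 Mbps for the sample values
```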
184. Generate a Network
Map
function Generate-NetworkMap {
    param (
        [string]$Subnet
    )
    $networkDevices = @()
    1..254 | ForEach-Object {
        $ip = "$Subnet.$_"
        if (Test-Connection -ComputerName $ip -Count 1 -Quiet) {
            try {
                $hostname = [System.Net.Dns]::GetHostEntry($ip).HostName
            }
            catch {
                $hostname = "Unknown"
            }
            $networkDevices += [PSCustomObject]@{
                IPAddress = $ip
                Hostname = $hostname
            }
        }
    }
    $networkDevices | Format-Table -AutoSize
}
# Example usage
Generate-NetworkMap -Subnet "192.168.1"
This script defines a function
Generate-NetworkMap that scans a
specified subnet for active
devices. It uses Test-Connection to
check if each IP address in the
subnet is reachable, and then
attempts to resolve the hostname
for each active IP. The result is a
table of IP addresses and
hostnames, providing a basic map of
the network. This can be useful for
discovering devices on a network or
identifying unauthorized devices.
185. Automate IP Address
Allocation
function Allocate-IPAddress {
    param (
        [string]$ScopeID,
        [string]$ClientID,
        [string]$IPAddress
    )
    try {
        Add-DhcpServerv4Reservation -ScopeId $ScopeID -ClientId $ClientID -IPAddress $IPAddress
        Write-Output "Successfully allocated IP address $IPAddress to client $ClientID in scope $ScopeID"
    }
    catch {
        Write-Error "Failed to allocate IP address: $_"
    }
}
# Example usage
Allocate-IPAddress -ScopeID "192.168.1.0" -ClientID "00-11-22-33-44-55" -IPAddress "192.168.1.50"
This script defines a function
Allocate-IPAddress that automates the
process of allocating a specific IP
address to a client in a DHCP
scope. It uses the Add-
DhcpServerv4Reservation cmdlet to create a
DHCP reservation. This can be
useful for ensuring that certain
devices always receive the same IP
address, which is important for
servers, printers, or other devices
that need a consistent address.
186. Configure DNS
Settings
function Configure-DNSSettings {
    param (
        [string]$InterfaceName,
        [string[]]$DNSServers
    )
    try {
        Set-DnsClientServerAddress -InterfaceAlias $InterfaceName -ServerAddresses $DNSServers
        Write-Output "Successfully configured DNS servers for interface $InterfaceName"
    }
    catch {
        Write-Error "Failed to configure DNS settings: $_"
    }
}
# Example usage
Configure-DNSSettings -InterfaceName "Ethernet" -DNSServers "8.8.8.8", "8.8.4.4"
This script defines a function
Configure-DNSSettings that sets the DNS
server addresses for a specified
network interface. It uses the Set-
DnsClientServerAddress cmdlet to apply
the changes. This can be useful for
automating DNS configuration across
multiple machines or ensuring
consistent DNS settings in an
organization.
187. Retrieve DHCP Lease
Details
function Get-DHCPLeaseDetails {
    param (
        [string]$ScopeID
    )
    try {
        $leases = Get-DhcpServerv4Lease -ScopeId $ScopeID
        $leases | Format-Table IPAddress, ClientId, HostName, AddressState, LeaseExpiryTime -AutoSize
    }
    catch {
        Write-Error "Failed to retrieve DHCP lease details: $_"
    }
}
# Example usage
Get-DHCPLeaseDetails -ScopeID "192.168.1.0"
This script defines a function Get-
DHCPLeaseDetails that retrieves and
displays the DHCP lease information
for a specified scope. It uses the
Get-DhcpServerv4Lease cmdlet to get the
lease data and then formats it into
a table for easy reading. This can
be useful for monitoring DHCP
usage, identifying unauthorized
devices, or troubleshooting IP
address conflicts.
188. Manage VPN
Connections
function Manage-VPNConnection {
param (
[string]$ConnectionName,
[string]$Action
)
    switch ($Action) {
        "Connect" {
            rasdial $ConnectionName
        }
        "Disconnect" {
            rasdial $ConnectionName /disconnect
        }
        "Status" {
            # rasdial with no arguments lists the currently active connections
            rasdial
        }
        default {
            Write-Error "Invalid action. Use 'Connect', 'Disconnect', or 'Status'."
        }
    }
}
# Example usage
Manage-VPNConnection -ConnectionName "MyVPN" -Action "Connect"
This script defines a function
Manage-VPNConnection that allows you to
connect, disconnect, or check the
status of a VPN connection. It uses
the rasdial command-line tool to
perform these actions. This can be
useful for automating VPN
connections or integrating VPN
management into other scripts.
189. Test Network Latency
function Test-NetworkLatency {
    param (
        [string]$TargetHost,
        [int]$PingCount = 10
    )
    $results = Test-Connection -ComputerName $TargetHost -Count $PingCount |
        Select-Object Address, ResponseTime
    $averageLatency = ($results | Measure-Object -Property ResponseTime -Average).Average
    $maxLatency = ($results | Measure-Object -Property ResponseTime -Maximum).Maximum
    $minLatency = ($results | Measure-Object -Property ResponseTime -Minimum).Minimum
    Write-Output "Latency to ${TargetHost}:"
    Write-Output "Average: $($averageLatency.ToString('F2')) ms"
    Write-Output "Maximum: $maxLatency ms"
    Write-Output "Minimum: $minLatency ms"
}
# Example usage
Test-NetworkLatency -TargetHost "8.8.8.8" -PingCount 20
This script defines a function Test-
NetworkLatency that measures the
network latency to a specified
host. It uses the Test-Connection
cmdlet to send a series of pings
and then calculates the average,
maximum, and minimum latency. This
can be useful for troubleshooting
network performance issues or
monitoring the quality of network
connections.
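Note that ResponseTime is the property name in Windows PowerShell 5.1; in PowerShell 7 the Test-Connection output exposes the round-trip time as Latency instead, so a cross-version script has to pick whichever property is present. A sketch of that check:

```powershell
$reply = Test-Connection -ComputerName "8.8.8.8" -Count 1
# Windows PowerShell reports ResponseTime; PowerShell 7 reports Latency
$rtt = if ($reply.PSObject.Properties['Latency']) { $reply.Latency } else { $reply.ResponseTime }
```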
190. Automate Wi-Fi
Connections
function Connect-WiFi {
    param (
        [string]$SSID,
        [string]$Password
    )
    $profileXml = @"
<?xml version="1.0"?>
<WLANProfile xmlns="http://www.microsoft.com/networking/WLAN/profile/v1">
    <name>$SSID</name>
    <SSIDConfig>
        <SSID>
            <name>$SSID</name>
        </SSID>
    </SSIDConfig>
    <connectionType>ESS</connectionType>
    <connectionMode>auto</connectionMode>
    <MSM>
        <security>
            <authEncryption>
                <authentication>WPA2PSK</authentication>
                <encryption>AES</encryption>
                <useOneX>false</useOneX>
            </authEncryption>
            <sharedKey>
                <keyType>passPhrase</keyType>
                <protected>false</protected>
                <keyMaterial>$Password</keyMaterial>
            </sharedKey>
        </security>
    </MSM>
</WLANProfile>
"@
    $profilePath = "$env:TEMP\WiFiProfile.xml"
    $profileXml | Out-File -FilePath $profilePath -Encoding ASCII
    netsh wlan add profile filename="$profilePath" user=all
    netsh wlan connect name="$SSID"
    Remove-Item -Path $profilePath
}
# Example usage
Connect-WiFi -SSID "MyWiFiNetwork" -Password "MySecurePassword"
This script defines a function
Connect-WiFi that automates the
process of connecting to a Wi-Fi
network. It creates a temporary Wi-
Fi profile XML file, adds it to the
system using netsh , and then
initiates the connection. This can
be useful for automating Wi-Fi
connections on multiple machines or
quickly connecting to known
networks.
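To verify that the profile landed and the connection succeeded, netsh can report the saved profiles and the current interface state (usage sketch):

```powershell
netsh wlan show profiles                         # list saved Wi-Fi profiles
netsh wlan show interfaces                       # current SSID, signal, and state
netsh wlan delete profile name="MyWiFiNetwork"   # remove a profile when no longer needed
```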
191. Monitor Server
Uptime
function Monitor-ServerUptime {
    param (
        [string[]]$ServerList,
        [int]$ThresholdDays = 30
    )
    foreach ($server in $ServerList) {
        try {
            $os = Get-WmiObject -Class Win32_OperatingSystem -ComputerName $server
            $lastBootTime = $os.ConvertToDateTime($os.LastBootUpTime)
            $uptime = (Get-Date) - $lastBootTime
            if ($uptime.Days -ge $ThresholdDays) {
                Write-Output "$server has been up for $($uptime.Days) days. Consider rebooting."
            }
            else {
                Write-Output "$server uptime: $($uptime.Days) days, $($uptime.Hours) hours, $($uptime.Minutes) minutes"
            }
        }
        catch {
            Write-Error "Failed to get uptime for ${server}: $_"
        }
    }
}
# Example usage
Monitor-ServerUptime -ServerList "Server1", "Server2", "Server3" -ThresholdDays 60
This script defines a function
Monitor-ServerUptime that checks the
uptime of specified servers and
reports if any have exceeded a
threshold number of days. It uses
WMI to retrieve the last boot time
of each server and calculates the
uptime. This can be useful for
identifying servers that may need
rebooting for maintenance or to
apply updates.
192. Audit Network Shares
function Audit-NetworkShares {
    param (
        [string]$ComputerName = "localhost"
    )
    try {
        $shares = Get-WmiObject -Class Win32_Share -ComputerName $ComputerName
        foreach ($share in $shares) {
            # Note: Get-Acl reads the share's local path, so run this on the target computer when auditing remote shares
            $acl = Get-Acl -Path $share.Path
            Write-Output "Share Name: $($share.Name)"
            Write-Output "Path: $($share.Path)"
            Write-Output "Description: $($share.Description)"
            Write-Output "Permissions:"
            foreach ($access in $acl.Access) {
                Write-Output "  $($access.IdentityReference) : $($access.FileSystemRights)"
            }
            Write-Output "--------------------"
        }
    }
    catch {
        Write-Error "Failed to audit network shares on ${ComputerName}: $_"
    }
}
# Example usage
Audit-NetworkShares -ComputerName "FileServer01"
This script defines a function Audit-
NetworkShares that retrieves
information about network shares on
a specified computer and their
associated permissions. It uses WMI
to get the share information and
the Get-Acl cmdlet to retrieve the
access control lists. This can be
useful for security audits,
identifying overly permissive
shares, or documenting network
resource configurations.
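On systems with the SmbShare module, the same audit is available without WMI or ACL parsing; a sketch using the SMB cmdlets:

```powershell
# Share-level permissions for every SMB share on the local machine
Get-SmbShare | ForEach-Object {
    $_ | Get-SmbShareAccess | Select-Object Name, AccountName, AccessControlType, AccessRight
}
```

Share-level access (this output) and NTFS permissions on the underlying folder are separate layers; the effective access is the more restrictive of the two.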
193. Configure SMB
Settings
function Configure-SMBSettings {
    param (
        [bool]$EnableSMB1 = $false,
        [bool]$EnableSMB2 = $true
    )
    try {
        Set-SmbServerConfiguration -EnableSMB1Protocol $EnableSMB1 -Force
        Set-SmbServerConfiguration -EnableSMB2Protocol $EnableSMB2 -Force
        Write-Output "SMB settings configured successfully:"
        Write-Output "SMB1 Enabled: $EnableSMB1"
        Write-Output "SMB2 Enabled: $EnableSMB2"
    }
    catch {
        Write-Error "Failed to configure SMB settings: $_"
    }
}
# Example usage
Configure-SMBSettings -EnableSMB1 $false -EnableSMB2 $true
This script defines a function
Configure-SMBSettings that configures
Server Message Block (SMB) protocol
settings on a Windows server. It
uses the Set-SmbServerConfiguration
cmdlet to enable or disable the
SMB1 and SMB2/3 protocols. This can
be useful for enhancing security by
disabling the older, less secure
SMB1 protocol or ensuring
consistent SMB configurations
across multiple servers.
194. Test FTP Connections
function Test-FTPConnection {
    param (
        [string]$FTPServer,
        [string]$Username,
        [string]$Password,
        [int]$Port = 21
    )
    try {
        $ftpRequest = [System.Net.FtpWebRequest]::Create("ftp://${FTPServer}:$Port/")
        $ftpRequest.Credentials = New-Object System.Net.NetworkCredential($Username, $Password)
        $ftpRequest.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
        $ftpRequest.Timeout = 5000
        $response = $ftpRequest.GetResponse()
        Write-Output "Successfully connected to FTP server $FTPServer on port $Port"
        $response.Close()
    }
    catch {
        Write-Error "Failed to connect to FTP server $FTPServer on port ${Port}: $_"
    }
}
# Example usage
Test-FTPConnection -FTPServer "ftp.example.com" -Username "user" -Password "password"
This script defines a function Test-
FTPConnection that attempts to
establish a connection to an FTP
server using the specified
credentials. It uses the
System.Net.FtpWebRequest class to create
an FTP request and tries to list
the directory contents. This can be
useful for verifying FTP server
accessibility, troubleshooting
connection issues, or as part of a
larger script that interacts with
FTP servers.
195. Download Files via
FTP
function Download-FTPFile {
    param (
        [string]$FTPServer,
        [string]$Username,
        [string]$Password,
        [string]$RemoteFile,
        [string]$LocalFile
    )
    try {
        $ftpRequest = [System.Net.FtpWebRequest]::Create("ftp://$FTPServer/$RemoteFile")
        $ftpRequest.Credentials = New-Object System.Net.NetworkCredential($Username, $Password)
        $ftpRequest.Method = [System.Net.WebRequestMethods+Ftp]::DownloadFile
        $response = $ftpRequest.GetResponse()
        $stream = $response.GetResponseStream()
        $fileStream = [System.IO.File]::Create($LocalFile)
        $stream.CopyTo($fileStream)
        $fileStream.Close()
        $stream.Close()
        $response.Close()
        Write-Output "Successfully downloaded $RemoteFile to $LocalFile"
    }
    catch {
        Write-Error "Failed to download file from FTP server: $_"
    }
}
# Example usage
Download-FTPFile -FTPServer "ftp.example.com" -Username "user" -Password "password" -RemoteFile "report.txt" -LocalFile "C:\Downloads\report.txt"
This script defines a function
Download-FTPFile that downloads a file
from an FTP server to a local
destination. It uses the
System.Net.FtpWebRequest class to create
an FTP request for downloading the
file. This can be useful for
automating file transfers from FTP
servers, backing up remote files,
or integrating FTP downloads into
other scripts.
196. Upload Files via FTP
function Upload-FTPFile {
    param (
        [string]$FTPServer,
        [string]$Username,
        [string]$Password,
        [string]$LocalFile,
        [string]$RemoteFile
    )
    try {
        $ftpRequest = [System.Net.FtpWebRequest]::Create("ftp://$FTPServer/$RemoteFile")
        $ftpRequest.Credentials = New-Object System.Net.NetworkCredential($Username, $Password)
        $ftpRequest.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
        $ftpRequest.UseBinary = $true
        $ftpRequest.KeepAlive = $false
        $content = [System.IO.File]::ReadAllBytes($LocalFile)
        $ftpRequest.ContentLength = $content.Length
        $requestStream = $ftpRequest.GetRequestStream()
        $requestStream.Write($content, 0, $content.Length)
        $requestStream.Close()
        $response = $ftpRequest.GetResponse()
        Write-Output "Successfully uploaded $LocalFile to $RemoteFile on FTP server"
        $response.Close()
    }
    catch {
        Write-Error "Failed to upload file to FTP server: $_"
    }
}
# Example usage
Upload-FTPFile -FTPServer "ftp.example.com" -Username "user" -Password "password" -LocalFile "C:\Reports\report.txt" -RemoteFile "report.txt"
This script defines a function
Upload-FTPFile that uploads a local
file to an FTP server. It uses the
System.Net.FtpWebRequest class to create
an FTP request for uploading the
file. This can be useful for
automating file transfers to FTP
servers, backing up local files to
remote locations, or integrating
FTP uploads into other scripts.
197. Automate DNS Queries
function Resolve-DNSQuery {
    param (
        [string]$HostName,
        [string]$RecordType = "A"
    )
    try {
        $result = Resolve-DnsName -Name $HostName -Type $RecordType -ErrorAction Stop
        foreach ($record in $result) {
            Write-Output "Host: $($record.Name)"
            Write-Output "Record Type: $($record.Type)"
            Write-Output "TTL: $($record.TTL)"
            switch ($record.Type) {
                "A" { Write-Output "IP Address: $($record.IPAddress)" }
                "CNAME" { Write-Output "Alias: $($record.NameHost)" }
                "MX" {
                    Write-Output "Mail Server: $($record.NameExchange)"
                    Write-Output "Preference: $($record.Preference)"
                }
                "TXT" { Write-Output "Text: $($record.Strings)" }
            }
            Write-Output "--------------------"
        }
    }
    catch {
        Write-Error "Failed to resolve DNS query: $_"
    }
}
# Example usage
Resolve-DNSQuery -HostName "example.com" -RecordType "A"
This script defines a function
Resolve-DNSQuery that performs DNS
queries for specified hostnames and
record types. It uses the Resolve-
DnsName cmdlet to perform the DNS
resolution and then formats the
output based on the record type.
This can be useful for
troubleshooting DNS issues,
verifying DNS configurations, or
automating DNS-related tasks.
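As a quick usage sketch, the cmdlet covers the common lookup types directly, so the wrapper is mainly about formatting:

```powershell
Resolve-DnsName -Name "example.com" -Type A     # address records
Resolve-DnsName -Name "example.com" -Type MX    # mail exchangers
Resolve-DnsName -Name "example.com" -Type TXT   # text records (SPF, domain verification)
```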
198. Configure Proxy Settings
function Set-ProxySettings {
    param (
        [string]$ProxyServer,
        [int]$ProxyPort,
        [string[]]$BypassList = @("*.local", "10.0.0.0/16")
    )
    try {
        $regKey = "HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings"
        # Enable proxy
        Set-ItemProperty -Path $regKey -Name ProxyEnable -Value 1
        # Set proxy server and port
        Set-ItemProperty -Path $regKey -Name ProxyServer -Value "$ProxyServer`:$ProxyPort"
        # Set bypass list
        $bypassString = $BypassList -join ";"
        Set-ItemProperty -Path $regKey -Name ProxyOverride -Value $bypassString
        # New processes read these registry values automatically;
        # already-running applications pick them up after a restart
        Write-Output "Proxy settings configured successfully:"
        Write-Output "Proxy Server: $ProxyServer`:$ProxyPort"
        Write-Output "Bypass List: $bypassString"
    }
    catch {
        Write-Error "Failed to configure proxy settings: $_"
    }
}
# Example usage
Set-ProxySettings -ProxyServer "proxy.example.com" -ProxyPort 8080 -BypassList "*.local", "*.example.com", "10.0.0.0/16"
This script defines a function Set-ProxySettings that configures system-wide proxy settings on a Windows machine. It modifies the registry to set the proxy server, port, and bypass list. This can be useful for automating proxy configuration across multiple machines or quickly switching between different proxy settings.
199. Export Network Configuration
function Export-NetworkConfiguration {
    param (
        [string]$OutputFile = "NetworkConfig.txt"
    )
    try {
        $output = @()
        # Get network adapters
        $adapters = Get-NetAdapter | Where-Object { $_.Status -eq "Up" }
        foreach ($adapter in $adapters) {
            $output += "Adapter: $($adapter.Name)"
            $output += "  MAC Address: $($adapter.MacAddress)"
            $output += "  Link Speed: $($adapter.LinkSpeed)"
            # Get IP configuration
            $ipConfig = Get-NetIPConfiguration -InterfaceIndex $adapter.ifIndex
            $output += "  IP Address: $($ipConfig.IPv4Address.IPAddress)"
            $output += "  Prefix Length: $($ipConfig.IPv4Address.PrefixLength)"
            $output += "  Default Gateway: $($ipConfig.IPv4DefaultGateway.NextHop)"
            $output += "  DNS Servers: $($ipConfig.DNSServer.ServerAddresses -join ', ')"
            $output += ""
        }
        # Get routing table
        $output += "Routing Table:"
        $routes = Get-NetRoute -AddressFamily IPv4
        foreach ($route in $routes) {
            $output += "  Destination: $($route.DestinationPrefix) via $($route.NextHop)"
        }
        # Export to file
        $output | Out-File -FilePath $OutputFile
        Write-Output "Network configuration exported to $OutputFile"
    }
    catch {
        Write-Error "Failed to export network configuration: $_"
    }
}
# Example usage
Export-NetworkConfiguration -OutputFile "C:\NetworkConfig.txt"
This script defines a function Export-NetworkConfiguration that gathers various network-related information and exports it to a text file. It includes details about network adapters, IP configurations, and the routing table. This can be useful for documenting network configurations, troubleshooting network issues, or creating a baseline for comparison.
200. Monitor Server Logs
function Monitor-ServerLogs {
    param (
        [string]$LogName = "System",
        [string]$ServerName = "localhost",
        [int]$Hours = 24,
        [string[]]$EventTypes = @("Error", "Warning")
    )
    try {
        $startTime = (Get-Date).AddHours(-$Hours)
        # Get-WinEvent expects numeric levels (Critical = 1, Error = 2, Warning = 3, Information = 4)
        $levelMap = @{ "Critical" = 1; "Error" = 2; "Warning" = 3; "Information" = 4 }
        $levels = $EventTypes | ForEach-Object { $levelMap[$_] }
        $events = Get-WinEvent -ComputerName $ServerName -FilterHashtable @{
            LogName   = $LogName
            StartTime = $startTime
            Level     = $levels
        } -ErrorAction Stop
        foreach ($event in $events) {
            Write-Output "Time: $($event.TimeCreated)"
            Write-Output "Event ID: $($event.Id)"
            Write-Output "Level: $($event.LevelDisplayName)"
            Write-Output "Source: $($event.ProviderName)"
            Write-Output "Message: $($event.Message)"
            Write-Output "--------------------"
        }
        Write-Output "Total events found: $($events.Count)"
    }
    catch {
        Write-Error "Failed to monitor server logs: $_"
    }
}
# Example usage
Monitor-ServerLogs -LogName "Application" -ServerName "Server01" -Hours 48 -EventTypes "Error"
This script defines a function Monitor-ServerLogs that retrieves and displays events from the specified event log on a server. It allows you to filter events by log name, time range, and event types. This can be useful for proactive monitoring of server health, troubleshooting issues, or generating reports on system events.
201. Query SNMP Data
function Query-SNMPData {
    param (
        [string]$TargetIP,
        [string]$Community = "public",
        [string]$OID
    )
    try {
        # Ensure the SNMP module is available
        if (-not (Get-Module -ListAvailable -Name "SNMPv2")) {
            throw "SNMP module not available. Please install it using 'Install-Module SNMPv2'"
        }
        Import-Module SNMPv2
        $result = Invoke-SnmpGet -IP $TargetIP -OID $OID -Community $Community
        Write-Output "SNMP Query Result:"
        Write-Output "OID: $OID"
        Write-Output "Value: $($result.Data)"
        Write-Output "Type: $($result.Type)"
    }
    catch {
        Write-Error "Failed to query SNMP data: $_"
    }
}
# Example usage
Query-SNMPData -TargetIP "192.168.1.1" -Community "public" -OID "1.3.6.1.2.1.1.1.0"
This script defines a function Query-SNMPData that performs an SNMP GET operation to retrieve data from a network device. It uses the SNMPv2 module to perform the query. This can be useful for monitoring network devices, retrieving system information, or automating network management tasks.
202. Configure VLAN Settings
function Set-VLANConfiguration {
    param (
        [string]$InterfaceName,
        [int]$VLANID
    )
    try {
        # Check if the interface exists
        $interface = Get-NetAdapter -Name $InterfaceName -ErrorAction Stop
        # Remove existing VLAN configuration
        $existingVLAN = Get-NetAdapterAdvancedProperty -Name $InterfaceName -RegistryKeyword VlanID -ErrorAction SilentlyContinue
        if ($existingVLAN) {
            Remove-NetAdapterAdvancedProperty -Name $InterfaceName -RegistryKeyword VlanID
        }
        # Set new VLAN ID
        Set-NetAdapterAdvancedProperty -Name $InterfaceName -RegistryKeyword VlanID -RegistryValue $VLANID
        Write-Output "VLAN $VLANID configured successfully on interface $InterfaceName"
    }
    catch {
        Write-Error "Failed to configure VLAN settings: $_"
    }
}
# Example usage
Set-VLANConfiguration -InterfaceName "Ethernet" -VLANID 100
This script defines a function Set-VLANConfiguration that configures VLAN settings on a network interface. It first removes any existing VLAN configuration and then sets the new VLAN ID. This can be useful for automating VLAN configurations across multiple servers or switches, or for quickly changing VLAN settings as part of network management tasks.
203. Detect Rogue Devices on a Network
function Detect-RogueDevices {
    param (
        [string]$Subnet,
        [string]$KnownDevicesFile
    )
    try {
        # Import list of known devices
        $knownDevices = Get-Content $KnownDevicesFile
        # Scan the network
        $activeDevices = 1..254 | ForEach-Object {
            $ip = "$Subnet.$_"
            if (Test-Connection -ComputerName $ip -Count 1 -Quiet) {
                try {
                    $hostname = [System.Net.Dns]::GetHostEntry($ip).HostName
                    $mac = (Get-NetNeighbor -IPAddress $ip).LinkLayerAddress
                    [PSCustomObject]@{
                        IP       = $ip
                        Hostname = $hostname
                        MAC      = $mac
                    }
                }
                catch {
                    [PSCustomObject]@{
                        IP       = $ip
                        Hostname = "Unknown"
                        MAC      = "Unknown"
                    }
                }
            }
        }
        # Identify rogue devices
        $rogueDevices = $activeDevices | Where-Object { $_.MAC -notin $knownDevices }
        if ($rogueDevices) {
            Write-Output "Potential rogue devices detected:"
            $rogueDevices | Format-Table -AutoSize
        }
        else {
            Write-Output "No rogue devices detected."
        }
    }
    catch {
        Write-Error "Failed to detect rogue devices: $_"
    }
}
# Example usage
Detect-RogueDevices -Subnet "192.168.1" -KnownDevicesFile "C:\KnownDevices.txt"
This script defines a function Detect-RogueDevices that scans a network subnet and compares the found devices against a list of known devices. It identifies potential rogue devices by their MAC addresses. This can be useful for network security audits, identifying unauthorized devices on a network, or maintaining an inventory of network devices.
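One practical wrinkle: Get-NetNeighbor reports MAC addresses in dash-separated form (00-1A-2B-3C-4D-5E), while a known-devices file may use colons, so a literal -notin comparison can flag known hardware as rogue. A small normalization helper (the function name is illustrative) avoids that:

```powershell
function ConvertTo-NormalizedMac {
    param ([string]$MacAddress)
    # Strip common separators and uppercase so "00:1a:2b..." and "00-1A-2B..." compare equal
    ($MacAddress -replace '[-:\.]', '').ToUpper()
}

ConvertTo-NormalizedMac "00:1a:2b:3c:4d:5e"   # → 001A2B3C4D5E
```

Normalizing both the scanned MACs and the entries read from the known-devices file before the -notin check makes the comparison format-independent.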
204. Manage MAC Address Filtering
function Manage-MACFiltering {
    param (
        [string]$Action,
        [string]$MACAddress,
        [string]$FilterListPath = "C:\MACFilterList.txt"
    )
    try {
        switch ($Action) {
            "Add" {
                Add-Content -Path $FilterListPath -Value $MACAddress
                Write-Output "Added MAC address $MACAddress to filter list"
            }
            "Remove" {
                $content = Get-Content -Path $FilterListPath
                $newContent = $content | Where-Object { $_ -ne $MACAddress }
                Set-Content -Path $FilterListPath -Value $newContent
                Write-Output "Removed MAC address $MACAddress from filter list"
            }
            "List" {
                $filterList = Get-Content -Path $FilterListPath
                Write-Output "Current MAC Address Filter List:"
                $filterList | ForEach-Object { Write-Output $_ }
            }
            default {
                throw "Invalid action. Use 'Add', 'Remove', or 'List'."
            }
        }
    }
    catch {
        Write-Error "Failed to manage MAC address filtering: $_"
    }
}
# Example usage
Manage-MACFiltering -Action "Add" -MACAddress "00:11:22:33:44:55"
Manage-MACFiltering -Action "List"
This script defines a function Manage-MACFiltering that allows you to manage a list of MAC addresses for filtering purposes. It supports adding, removing, and listing MAC addresses in a text file. This can be useful for maintaining access control lists for network devices, managing allowed devices on a network, or implementing basic network access control.
205. Set Static IP Addresses
function Set-StaticIPAddress {
    param (
        [string]$InterfaceName,
        [string]$IPAddress,
        [int]$PrefixLength,
        [string]$DefaultGateway,
        [string[]]$DNSServers
    )
    try {
        # Remove existing IP configuration
        Remove-NetIPAddress -InterfaceAlias $InterfaceName -Confirm:$false -ErrorAction SilentlyContinue
        Remove-NetRoute -InterfaceAlias $InterfaceName -Confirm:$false -ErrorAction SilentlyContinue
        # Set new IP address and default gateway
        New-NetIPAddress -InterfaceAlias $InterfaceName -IPAddress $IPAddress -PrefixLength $PrefixLength -DefaultGateway $DefaultGateway
        # Set DNS servers
        Set-DnsClientServerAddress -InterfaceAlias $InterfaceName -ServerAddresses $DNSServers
        Write-Output "Static IP configuration set successfully:"
        Write-Output "Interface: $InterfaceName"
        Write-Output "IP Address: $IPAddress/$PrefixLength"
        Write-Output "Default Gateway: $DefaultGateway"
        Write-Output "DNS Servers: $($DNSServers -join ', ')"
    }
    catch {
        Write-Error "Failed to set static IP address: $_"
    }
}
# Example usage
Set-StaticIPAddress -InterfaceName "Ethernet" -IPAddress "192.168.1.100" -PrefixLength 24 -DefaultGateway "192.168.1.1" -DNSServers "8.8.8.8", "8.8.4.4"
This script defines a function Set-StaticIPAddress that configures a static IP address on a network interface. It removes any existing IP configuration and sets the new IP address, prefix length, default gateway, and DNS servers. This can be useful for automating network configuration on servers or workstations, ensuring consistent network settings across multiple machines, or quickly changing network configurations for different environments.
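New-NetIPAddress takes the subnet mask as a prefix length, while older tools and documentation often quote a dotted mask. If you need to translate between the two notations, a minimal sketch (the helper name is illustrative):

```powershell
function ConvertTo-SubnetMask {
    param ([ValidateRange(0, 32)][int]$PrefixLength)
    # Build a 32-bit mask with the top $PrefixLength bits set, then emit it octet by octet
    $bits = ('1' * $PrefixLength).PadRight(32, '0')
    $octets = 0..3 | ForEach-Object { [Convert]::ToInt32($bits.Substring($_ * 8, 8), 2) }
    $octets -join '.'
}

ConvertTo-SubnetMask 24   # → 255.255.255.0
```

So -PrefixLength 24 in the example above corresponds to the familiar mask 255.255.255.0.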
206. Backup Router Configuration
function Backup-RouterConfig {
    param (
        [string]$RouterIP,
        [string]$Username,
        [string]$Password,
        [string]$BackupPath,
        [string]$RouterType = "Cisco"
    )
    try {
        # Ensure Posh-SSH module is available
        if (-not (Get-Module -ListAvailable -Name Posh-SSH)) {
            throw "Posh-SSH module not available. Please install it using 'Install-Module Posh-SSH'"
        }
        # Connect to router
        $securePassword = ConvertTo-SecureString $Password -AsPlainText -Force
        $sshSession = New-SSHSession -ComputerName $RouterIP -Credential (New-Object System.Management.Automation.PSCredential($Username, $securePassword))
        # Execute command based on router type
        switch ($RouterType) {
            "Cisco" {
                $command = "show running-config"
            }
            "Juniper" {
                $command = "show configuration | display set"
            }
            default {
                throw "Unsupported router type"
            }
        }
        $config = Invoke-SSHCommand -SessionId $sshSession.SessionId -Command $command
        # Save configuration to file
        $timestamp = Get-Date -Format "yyyyMMdd_HHmmss"
        $fileName = "RouterConfig_${RouterIP}_${timestamp}.txt"
        $config.Output | Out-File -FilePath (Join-Path $BackupPath $fileName)
        Write-Output "Router configuration backed up successfully to $fileName"
    }
    catch {
        Write-Error "Failed to backup router configuration: $_"
    }
    finally {
        # Close SSH session
        if ($sshSession) {
            Remove-SSHSession -SessionId $sshSession.SessionId | Out-Null
        }
    }
}
# Example usage
Backup-RouterConfig -RouterIP "192.168.1.1" -Username "admin" -Password "password" -BackupPath "C:\RouterBackups" -RouterType "Cisco"
This script defines a function Backup-RouterConfig that connects to a router via SSH and retrieves its configuration. It supports different router types (currently Cisco and Juniper) and saves the configuration to a file with a timestamp. This can be useful for maintaining backups of network device configurations, preparing for network changes, or auditing configuration changes over time.
207. Automate SSH Connections
function Connect-SSH {
    param (
        [string]$Hostname,
        [string]$Username,
        [string]$Password,
        [string[]]$Commands
    )
    try {
        # Ensure Posh-SSH module is available
        if (-not (Get-Module -ListAvailable -Name Posh-SSH)) {
            throw "Posh-SSH module not available. Please install it using 'Install-Module Posh-SSH'"
        }
        # Create credential object
        $securePassword = ConvertTo-SecureString $Password -AsPlainText -Force
        $credential = New-Object System.Management.Automation.PSCredential($Username, $securePassword)
        # Establish SSH session
        $session = New-SSHSession -ComputerName $Hostname -Credential $credential -AcceptKey
        if ($session) {
            Write-Output "Connected to $Hostname"
            # Execute commands
            foreach ($command in $Commands) {
                Write-Output "Executing: $command"
                $result = Invoke-SSHCommand -SessionId $session.SessionId -Command $command
                Write-Output $result.Output
            }
        }
        else {
            throw "Failed to establish SSH connection"
        }
    }
    catch {
        Write-Error "SSH connection failed: $_"
    }
    finally {
        # Close SSH session
        if ($session) {
            Remove-SSHSession -SessionId $session.SessionId | Out-Null
        }
    }
}
# Example usage
Connect-SSH -Hostname "server.example.com" -Username "admin" -Password "password" -Commands "ls -l", "df -h", "uptime"
This script defines a function Connect-SSH that establishes an SSH connection to a remote host and executes a series of commands. It uses the Posh-SSH module to handle the SSH connection. This can be useful for automating tasks on remote servers, performing routine checks, or executing scripts on multiple machines.
208. Monitor Remote Servers
function Monitor-RemoteServers {
    param (
        [string[]]$ServerList,
        [int]$Interval = 300,
        [string]$LogFile = "C:\ServerMonitor.log"
    )
    try {
        while ($true) {
            $timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
            foreach ($server in $ServerList) {
                $status = Test-Connection -ComputerName $server -Count 1 -Quiet
                if ($status) {
                    $cpuLoad = Get-WmiObject -ComputerName $server -Class Win32_Processor | Measure-Object -Property LoadPercentage -Average | Select-Object -ExpandProperty Average
                    $memoryUsage = Get-WmiObject -ComputerName $server -Class Win32_OperatingSystem | Select-Object @{Name="MemoryUsage";Expression={"{0:N2}" -f ((($_.TotalVisibleMemorySize - $_.FreePhysicalMemory) / $_.TotalVisibleMemorySize) * 100)}}
                    $diskSpace = Get-WmiObject -ComputerName $server -Class Win32_LogicalDisk -Filter "DeviceID='C:'" | Select-Object @{Name="FreeSpace";Expression={"{0:N2}" -f ($_.FreeSpace / $_.Size * 100)}}
                    $logEntry = "$timestamp - $server - Online - CPU: $cpuLoad% - Memory: $($memoryUsage.MemoryUsage)% - Disk: $($diskSpace.FreeSpace)% free"
                }
                else {
                    $logEntry = "$timestamp - $server - Offline"
                }
                Add-Content -Path $LogFile -Value $logEntry
                Write-Output $logEntry
            }
            Start-Sleep -Seconds $Interval
        }
    }
    catch {
        Write-Error "Monitoring failed: $_"
    }
}
# Example usage
Monitor-RemoteServers -ServerList "Server1", "Server2", "Server3" -Interval 600 -LogFile "C:\ServerMonitor.log"
This script defines a function Monitor-RemoteServers that continuously monitors a list of remote servers for their status, CPU usage, memory usage, and disk space. It logs the information to a file and outputs it to the console. This can be useful for proactive server monitoring, identifying performance issues, or maintaining an audit trail of server health.
209. Manage Load Balancers
function Manage-LoadBalancer {
    param (
        [string]$LoadBalancerIP,
        [string]$Username,
        [string]$Password,
        [string]$Action,
        [string]$PoolName,
        [string]$ServerIP,
        [int]$ServerPort
    )
    try {
        # This is a simplified example. In a real-world scenario, you'd use the load balancer's API or management interface.
        $securePassword = ConvertTo-SecureString $Password -AsPlainText -Force
        $credential = New-Object System.Management.Automation.PSCredential($Username, $securePassword)
        # Establish a remote session to the load balancer
        $session = New-PSSession -ComputerName $LoadBalancerIP -Credential $credential
        switch ($Action) {
            "AddServer" {
                Invoke-Command -Session $session -ScriptBlock {
                    param($pool, $ip, $port)
                    # Simulate adding a server to the pool
                    Write-Output "Adding server $ip`:$port to pool $pool"
                    # In reality, you'd use the load balancer's specific commands here
                } -ArgumentList $PoolName, $ServerIP, $ServerPort
            }
            "RemoveServer" {
                Invoke-Command -Session $session -ScriptBlock {
                    param($pool, $ip, $port)
                    # Simulate removing a server from the pool
                    Write-Output "Removing server $ip`:$port from pool $pool"
                    # In reality, you'd use the load balancer's specific commands here
                } -ArgumentList $PoolName, $ServerIP, $ServerPort
            }
            "ListServers" {
                Invoke-Command -Session $session -ScriptBlock {
                    param($pool)
                    # Simulate listing servers in the pool
                    Write-Output "Servers in pool ${pool}:"
                    # In reality, you'd use the load balancer's specific commands here
                } -ArgumentList $PoolName
            }
            default {
                throw "Invalid action. Use 'AddServer', 'RemoveServer', or 'ListServers'."
            }
        }
    }
    catch {
        Write-Error "Failed to manage load balancer: $_"
    }
    finally {
        if ($session) {
            Remove-PSSession $session
        }
    }
}
# Example usage
Manage-LoadBalancer -LoadBalancerIP "192.168.1.50" -Username "admin" -Password "password" -Action "AddServer" -PoolName "WebPool" -ServerIP "192.168.1.100" -ServerPort 80
This script defines a function Manage-LoadBalancer that simulates managing a load balancer. It provides actions for adding a server to a pool, removing a server from a pool, and listing servers in a pool. In a real-world scenario, you would replace the simulated commands with actual API calls or management commands specific to your load balancer. This can be useful for automating load balancer configurations, managing server pools, or integrating load balancer management into larger automation scripts.
210. Automate Network Failover
function Automate-NetworkFailover {
    param (
        [string]$PrimaryInterface,
        [string]$SecondaryInterface,
        [string]$MonitoredHost,
        [int]$FailureThreshold = 3,
        [int]$CheckInterval = 30
    )
    try {
        $failureCount = 0
        $activeInterface = $PrimaryInterface
        while ($true) {
            $pingResult = Test-Connection -ComputerName $MonitoredHost -Count 4 -Quiet
            if (-not $pingResult) {
                $failureCount++
                Write-Output "Connection attempt failed. Failure count: $failureCount"
                if ($failureCount -ge $FailureThreshold) {
                    if ($activeInterface -eq $PrimaryInterface) {
                        Write-Output "Switching to secondary interface..."
                        # Disable primary interface
                        Disable-NetAdapter -Name $PrimaryInterface -Confirm:$false
                        # Enable secondary interface
                        Enable-NetAdapter -Name $SecondaryInterface -Confirm:$false
                        $activeInterface = $SecondaryInterface
                    }
                    else {
                        Write-Output "Secondary interface is already active. Attempting to restore primary..."
                        # Try to re-enable primary interface
                        Enable-NetAdapter -Name $PrimaryInterface -Confirm:$false
                        if (Test-Connection -ComputerName $MonitoredHost -Count 4 -Quiet) {
                            Write-Output "Primary interface restored successfully."
                            Disable-NetAdapter -Name $SecondaryInterface -Confirm:$false
                            $activeInterface = $PrimaryInterface
                        }
                        else {
                            Write-Output "Failed to restore primary interface. Keeping secondary active."
                            Disable-NetAdapter -Name $PrimaryInterface -Confirm:$false
                        }
                    }
                    $failureCount = 0
                }
            }
            else {
                Write-Output "Connection to $MonitoredHost successful. Active interface: $activeInterface"
                $failureCount = 0
            }
            Start-Sleep -Seconds $CheckInterval
        }
    }
    catch {
        Write-Error "Network failover automation failed: $_"
    }
}
# Example usage
Automate-NetworkFailover -PrimaryInterface "Ethernet" -SecondaryInterface "Wi-Fi" -MonitoredHost "8.8.8.8" -FailureThreshold 5 -CheckInterval 60
This script defines a function Automate-NetworkFailover that implements a basic network failover mechanism. It monitors connectivity to a specified host and switches between primary and secondary network interfaces if connectivity fails. This can be useful for maintaining network connectivity in environments with multiple network paths, implementing basic redundancy, or automating failover in small network setups.
211. Monitor Network Interface Health
function Monitor-NetworkInterfaceHealth {
    param (
        [string[]]$InterfaceNames,
        [int]$MonitoringInterval = 60,
        [string]$LogFile = "C:\InterfaceHealth.log"
    )
    try {
        while ($true) {
            $timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
            foreach ($interface in $InterfaceNames) {
                $adapterInfo = Get-NetAdapter -Name $interface -ErrorAction SilentlyContinue
                if ($adapterInfo) {
                    $status = $adapterInfo.Status
                    $speed = $adapterInfo.LinkSpeed
                    $errors = (Get-NetAdapterStatistics -Name $interface).ReceivedPacketErrors
                    $logEntry = "$timestamp - Interface: $interface - Status: $status - Speed: $speed - Errors: $errors"
                    Add-Content -Path $LogFile -Value $logEntry
                    Write-Output $logEntry
                    # Check for issues
                    if ($status -ne "Up") {
                        Write-Warning "Interface $interface is not up!"
                    }
                    if ($errors -gt 0) {
                        Write-Warning "Interface $interface has $errors errors!"
                    }
                }
                else {
                    $logEntry = "$timestamp - Interface: $interface - Not found or accessible"
                    Add-Content -Path $LogFile -Value $logEntry
                    Write-Warning $logEntry
                }
            }
            Start-Sleep -Seconds $MonitoringInterval
        }
    }
    catch {
        Write-Error "Network interface monitoring failed: $_"
    }
}
# Example usage
Monitor-NetworkInterfaceHealth -InterfaceNames "Ethernet", "Wi-Fi" -MonitoringInterval 300 -LogFile "C:\InterfaceHealth.log"
This script defines a function Monitor-NetworkInterfaceHealth that continuously monitors the health of specified network interfaces. It checks the status, link speed, and error count of each interface at regular intervals. The information is logged to a file and output to the console, with warnings for interfaces that are down or have errors. This can be useful for proactive network monitoring, identifying interface issues early, or maintaining a log of network interface performance over time.
212. Set QoS Rules
function Set-QoSRule {
    param (
        [string]$RuleName,
        [string]$AppName,
        [int]$ThrottleRate, # in Mbps
        [ValidateSet("In", "Out")]
        [string]$Direction = "Out"
    )
    try {
        # Check if the rule already exists
        $existingRule = Get-NetQosPolicy -Name $RuleName -ErrorAction SilentlyContinue
        if ($existingRule) {
            # Update existing rule
            Set-NetQosPolicy -Name $RuleName -ThrottleRateActionBitsPerSecond ($ThrottleRate * 1000000)
            Write-Output "Updated QoS rule: $RuleName"
        }
        else {
            # Create new rule
            New-NetQosPolicy -Name $RuleName -AppPathNameMatchCondition $AppName -ThrottleRateActionBitsPerSecond ($ThrottleRate * 1000000)
            Write-Output "Created new QoS rule: $RuleName"
        }
        # NetQosPolicy throttling applies to outbound traffic only
        if ($Direction -eq "In") {
            Write-Warning "NetQosPolicy throttles outbound traffic; inbound limiting is not supported."
        }
        Write-Output "QoS rule $RuleName set for $Direction direction"
    }
    catch {
        Write-Error "Failed to set QoS rule: $_"
    }
}
# Example usage
Set-QoSRule -RuleName "LimitVideoStreaming" -AppName "C:\Program Files\VideoApp\VideoApp.exe" -ThrottleRate 5 -Direction "Out"
This script defines a function Set-QoSRule that creates or updates Quality of Service (QoS) rules for network traffic. It allows you to set bandwidth limits for specific applications; note that Windows NetQosPolicy throttling applies to outbound traffic, so the function warns when an inbound rule is requested. This can be useful for managing network bandwidth usage, prioritizing critical applications, or limiting the impact of bandwidth-intensive applications on network performance.
213. Generate Bandwidth Usage Reports
function Generate-BandwidthReport {
    param (
        [string]$InterfaceName,
        [int]$DurationHours = 24,
        [string]$ReportPath = "C:\BandwidthReport.csv"
    )
    try {
        $startTime = Get-Date
        # Take an initial snapshot of the interface counters
        $data = Get-NetAdapterStatistics -Name $InterfaceName
        $initialReceived = $data.ReceivedBytes
        $initialSent = $data.SentBytes
        $reportData = @()
        for ($i = 1; $i -le $DurationHours; $i++) {
            Start-Sleep -Seconds 3600 # Wait for an hour
            $newData = Get-NetAdapterStatistics -Name $InterfaceName
            $timestamp = $startTime.AddHours($i)
            $receivedDelta = $newData.ReceivedBytes - $initialReceived
            $sentDelta = $newData.SentBytes - $initialSent
            $reportData += [PSCustomObject]@{
                Timestamp  = $timestamp
                ReceivedMB = [math]::Round($receivedDelta / 1MB, 2)
                SentMB     = [math]::Round($sentDelta / 1MB, 2)
            }
            $initialReceived = $newData.ReceivedBytes
            $initialSent = $newData.SentBytes
        }
        $reportData | Export-Csv -Path $ReportPath -NoTypeInformation
        Write-Output "Bandwidth report generated: $ReportPath"
    }
    catch {
        Write-Error "Failed to generate bandwidth report: $_"
    }
}
# Example usage
Generate-BandwidthReport -InterfaceName "Ethernet" -DurationHours 48 -ReportPath "C:\BandwidthReport.csv"
This script defines a function Generate-BandwidthReport that monitors network bandwidth usage over a specified period and generates a report. It collects data on bytes received and sent at hourly intervals and saves the results to a CSV file. This can be useful for tracking bandwidth consumption trends, identifying peak usage times, or auditing network usage for billing or capacity planning purposes.
214. Manage NTP Configurations
function Manage-NTPConfiguration {
    param (
        [string[]]$NTPServers,
        [switch]$EnableNTPClient,
        [switch]$DisableNTPClient
    )
    try {
        if ($EnableNTPClient) {
            # Enable Windows Time service
            Set-Service -Name W32Time -StartupType Automatic
            Start-Service -Name W32Time
            Write-Output "Windows Time service enabled and started."
        }
        if ($DisableNTPClient) {
            # Disable Windows Time service
            Stop-Service -Name W32Time
            Set-Service -Name W32Time -StartupType Disabled
            Write-Output "Windows Time service stopped and disabled."
        }
        if ($NTPServers) {
            # Set NTP servers (w32tm expects a space-separated, quoted peer list)
            $peerList = $NTPServers -join " "
            w32tm /config /syncfromflags:manual /manualpeerlist:"$peerList" /update
            Write-Output "NTP servers configured: $($NTPServers -join ", ")"
            # Restart the Windows Time service to apply changes
            Restart-Service -Name W32Time
            Write-Output "Windows Time service restarted."
        }
        # Display current NTP configuration
        $currentConfig = w32tm /query /configuration
        Write-Output "Current NTP Configuration:"
        Write-Output $currentConfig
        # Force time synchronization
        w32tm /resync /force
        Write-Output "Time synchronization forced."
    }
    catch {
        Write-Error "Failed to manage NTP configuration: $_"
    }
}
# Example usage
Manage-NTPConfiguration -NTPServers "time.windows.com", "pool.ntp.org" -EnableNTPClient
This script defines a function Manage-NTPConfiguration that allows you to configure Network Time Protocol (NTP) settings on a Windows system. It can enable or disable the Windows Time service, set NTP servers, and force time synchronization. This can be useful for ensuring accurate time synchronization across network devices, configuring time servers for a group of machines, or troubleshooting time-related issues in a network environment.
215. Audit Firewall Access Logs
function Audit-FirewallLogs {
    param (
        [string]$LogPath = "C:\Windows\System32\LogFiles\Firewall\pfirewall.log",
        [int]$DaysToAudit = 7,
        [string]$OutputPath = "C:\FirewallAudit.csv"
    )
    try {
        $startDate = (Get-Date).AddDays(-$DaysToAudit)
        $logEntries = Get-Content $LogPath | Where-Object { $_ -notmatch '^#' } | ForEach-Object {
            $fields = $_ -split ' '
            [PSCustomObject]@{
                Date            = [DateTime]::ParseExact($fields[0], 'yyyy-MM-dd', $null)
                Time            = $fields[1]
                Action          = $fields[2]
                Protocol        = $fields[3]
                SourceIP        = $fields[4]
                DestinationIP   = $fields[5]
                SourcePort      = $fields[6]
                DestinationPort = $fields[7]
            }
        } | Where-Object { $_.Date -ge $startDate }
        $summary = $logEntries | Group-Object Action, SourceIP, DestinationIP, DestinationPort |
            Select-Object @{Name='Action';Expression={$_.Group[0].Action}},
                          @{Name='SourceIP';Expression={$_.Group[0].SourceIP}},
                          @{Name='DestinationIP';Expression={$_.Group[0].DestinationIP}},
                          @{Name='DestinationPort';Expression={$_.Group[0].DestinationPort}},
                          @{Name='Count';Expression={$_.Count}}
        $summary | Export-Csv -Path $OutputPath -NoTypeInformation
        Write-Output "Firewall log audit complete. Results saved to $OutputPath"
        Write-Output "Summary:"
        $summary | Format-Table -AutoSize
    }
    catch {
        Write-Error "Failed to audit firewall logs: $_"
    }
}
# Example usage
Audit-FirewallLogs -DaysToAudit 30 -OutputPath "C:\FirewallAudit.csv"
This script defines a function Audit-FirewallLogs that analyzes Windows Firewall logs for a specified period. It summarizes the log entries, grouping them by action, source IP, destination IP, and destination port. The results are exported to a CSV file and a summary is displayed in the console. This can be useful for identifying unusual network activity, auditing firewall effectiveness, or investigating security incidents.
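The parsing logic above assumes the default pfirewall.log field order. To sanity-check it against a single line, the same split can be isolated into a helper (the function name and the sample line are illustrative):

```powershell
function ConvertFrom-FirewallLogLine {
    param ([string]$Line)
    # pfirewall.log is space-delimited: date time action protocol src-ip dst-ip src-port dst-port ...
    $f = $Line -split ' '
    [PSCustomObject]@{
        Date            = $f[0]
        Time            = $f[1]
        Action          = $f[2]
        Protocol        = $f[3]
        SourceIP        = $f[4]
        DestinationIP   = $f[5]
        SourcePort      = $f[6]
        DestinationPort = $f[7]
    }
}

$entry = ConvertFrom-FirewallLogLine "2024-05-01 10:15:00 DROP TCP 192.0.2.10 192.0.2.20 51515 445"
$entry.Action   # → DROP
```

If your log's header (the lines starting with #Fields:) lists a different order, adjust the indexes to match before running the full audit.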
216. Automate Security Scans
function Automate-SecurityScan {
    param (
        [string[]]$TargetIPs,
        [string]$OutputPath = "C:\SecurityScanResults",
        [switch]$PerformPortScan,
        [switch]$CheckVulnerabilities,
        [switch]$AuditServices
    )
    try {
        # Ensure output directory exists
        if (-not (Test-Path $OutputPath)) {
            New-Item -Path $OutputPath -ItemType Directory | Out-Null
        }
        foreach ($ip in $TargetIPs) {
            Write-Output "Scanning $ip..."
            if ($PerformPortScan) {
                $portScanResult = 1..1024 | ForEach-Object {
                    $tcp = New-Object System.Net.Sockets.TcpClient
                    $connection = $tcp.BeginConnect($ip, $_, $null, $null)
                    $wait = $connection.AsyncWaitHandle.WaitOne(100, $false)
                    if ($wait) {
                        try {
                            $tcp.EndConnect($connection) | Out-Null
                            [PSCustomObject]@{Port = $_; Status = 'Open'}
                        }
                        catch {
                            [PSCustomObject]@{Port = $_; Status = 'Closed'}
                        }
                    }
                    else {
                        [PSCustomObject]@{Port = $_; Status = 'Closed'}
                    }
                    $tcp.Close()
                } | Where-Object { $_.Status -eq 'Open' }
                $portScanResult | Export-Csv -Path "$OutputPath\${ip}_PortScan.csv" -NoTypeInformation
                Write-Output "Port scan results saved to $OutputPath\${ip}_PortScan.csv"
            }
            if ($CheckVulnerabilities) {
                # This is a placeholder for vulnerability checking
                # In a real scenario, you would integrate with a vulnerability scanner or use specific checks
                Write-Output "Vulnerability check not implemented in this example"
            }
            if ($AuditServices) {
                $services = Get-WmiObject -Class Win32_Service -ComputerName $ip |
                    Select-Object DisplayName, State, StartMode
                $services | Export-Csv -Path "$OutputPath\${ip}_Services.csv" -NoTypeInformation
                Write-Output "Service audit results saved to $OutputPath\${ip}_Services.csv"
            }
        }
        Write-Output "Security scan complete. Results saved in $OutputPath"
    }
    catch {
        Write-Error "Security scan failed: $_"
    }
}
# Example usage
Automate-SecurityScan -TargetIPs "192.168.1.10", "192.168.1.20" -PerformPortScan -AuditServices
This script defines a function Automate-SecurityScan that performs basic security scans on specified target IP addresses. It includes options for port scanning, vulnerability checking (placeholder), and auditing services. The results are saved to CSV files in the specified output directory. This can be useful for regular security audits, identifying open ports and running services on network devices, or as part of a larger security assessment process.
217. Detect Packet Drops
function Detect-PacketDrops {
param (
[string]$TargetHost,
[int]$PingCount = 100,
[int]$Interval = 1000
)
try {
Write-Output "Testing connection to
$TargetHost..."
$results = Test-Connection -
ComputerName $TargetHost -Count $PingCount -
Delay $Interval -ErrorAction Stop
$sent = $[Link]
$received = ($results | Where-Object
{$_.StatusCode -eq 0}).Count
$lost = $sent - $received
$packetLossPercentage = ($lost /
$sent) * 100
$minRtt = ($results | Measure-Object
-Property ResponseTime -Minimum).Minimum
$maxRtt = ($results | Measure-Object
-Property ResponseTime -Maximum).Maximum
$avgRtt = ($results | Measure-Object
-Property ResponseTime -Average).Average
Write-Output "Ping statistics for
$TargetHost:"
Write-Output " Packets: Sent =
$sent, Received = $received, Lost = $lost
($packetLossPercentage% loss)"
Write-Output "Approximate round trip
times in milli-seconds:"
Write-Output " Minimum =
${minRtt}ms, Maximum = ${maxRtt}ms, Average =
$($[Link]('F2'))ms"
if ($packetLossPercentage -gt 5) {
Write-Warning "Significant packet
loss detected! Consider investigating network
issues."
}
}
catch {
Write-Error "Failed to detect packet
drops: $_"
}
}
# Example usage
Detect-PacketDrops -TargetHost
"[Link]" -PingCount 200 -Interval 500
This script defines a function
Detect-PacketDrops that performs a
series of pings to a specified
target host and analyzes the
results to detect packet drops. It
calculates packet loss percentage
and round-trip times, providing a
summary of the connection quality.
This can be useful for
troubleshooting network
connectivity issues, monitoring
link quality, or identifying
potential network problems before
they become severe.
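Before committing to a long ping series against a host that may be completely offline, a quick reachability pre-check saves time. The wrapper below is a hypothetical helper (not part of the script above) that uses the -Quiet switch of Test-Connection, which returns a simple $true/$false:

```powershell
# Quick pre-check: two quiet pings before running the full packet-loss test.
# Hypothetical helper; adjust counts to taste.
function Test-HostReachable {
    param (
        [string]$TargetHost
    )
    # -Quiet returns $true/$false instead of per-ping result objects
    if (Test-Connection -ComputerName $TargetHost -Count 2 -Quiet) {
        Write-Output "$TargetHost is reachable; proceeding with full test."
        Detect-PacketDrops -TargetHost $TargetHost -PingCount 100
    }
    else {
        Write-Warning "$TargetHost did not answer the pre-check; skipping."
    }
}
```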
218. Troubleshoot DNS
Issues
function Troubleshoot-DNSIssues {
param (
[string]$DomainName,
        [string[]]$DNSServers = @("8.8.8.8", "1.1.1.1")
)
try {
Write-Output "Troubleshooting DNS
issues for $DomainName..."
# Check local DNS resolution
Write-Output "Local DNS resolution:"
$localResolution = Resolve-DnsName -
Name $DomainName -ErrorAction
SilentlyContinue
if ($localResolution) {
$localResolution | Format-Table
Name, IPAddress
} else {
Write-Warning "Failed to resolve
$DomainName using local DNS"
}
# Check resolution using specified
DNS servers
foreach ($dnsServer in $DNSServers) {
            Write-Output "Resolving using DNS server ${dnsServer}:"
$resolution = Resolve-DnsName -
Name $DomainName -Server $dnsServer -
ErrorAction SilentlyContinue
if ($resolution) {
$resolution | Format-Table
Name, IPAddress
} else {
Write-Warning "Failed to
resolve $DomainName using DNS server
$dnsServer"
}
}
# Perform NS lookup
        Write-Output "Name Server (NS) records for ${DomainName}:"
$nsRecords = Resolve-DnsName -Name
$DomainName -Type NS -ErrorAction
SilentlyContinue
if ($nsRecords) {
$nsRecords | Format-Table Name,
NameHost
} else {
Write-Warning "Failed to retrieve
NS records for $DomainName"
}
# Check for common DNS record types
$recordTypes = @("A", "AAAA", "MX",
"TXT", "CNAME")
foreach ($type in $recordTypes) {
            Write-Output "$type records for ${DomainName}:"
$records = Resolve-DnsName -Name
$DomainName -Type $type -ErrorAction
SilentlyContinue
if ($records) {
                $records | Format-Table Name, Type, @{Name="Data";Expression={$_.IPAddress, $_.NameExchange, $_.Strings, $_.NameHost -ne $null}}
} else {
Write-Output "No $type
records found"
}
}
}
catch {
Write-Error "DNS troubleshooting
failed: $_"
}
}
# Example usage
Troubleshoot-DNSIssues -DomainName "example.com" -DNSServers "8.8.8.8", "1.1.1.1"
This script defines a function
Troubleshoot-DNSIssues that performs
various DNS-related checks for a
specified domain name. It attempts
local DNS resolution, checks
resolution using specified DNS
servers, retrieves NS records, and
looks up common DNS record types.
This can be useful for diagnosing
DNS-related problems, verifying DNS
configurations, or auditing DNS
records for a domain.
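One source of confusing results is a stale entry in the local resolver cache. Flushing the client cache before re-testing rules that out; the sketch below assumes the DnsClient cmdlets available on Windows 8/Server 2012 and later, with example.com as a placeholder name:

```powershell
# Local resolver caching can mask a recently fixed DNS record.
# Flush the client cache, then query DNS directly (-DnsOnly skips
# LLMNR/NetBIOS fallback).
Clear-DnsClientCache
Resolve-DnsName -Name "example.com" -Type A -DnsOnly |
    Format-Table Name, IPAddress
```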
219. Configure VPN
Tunnels
function Configure-VPNTunnel {
param (
[string]$ConnectionName,
[string]$ServerAddress,
[string]$PreSharedKey,
[ValidateSet("L2TP", "PPTP",
"IKEv2")]
[string]$TunnelType = "L2TP",
[string]$Username,
[string]$Password
)
try {
# Remove existing VPN connection if
it exists
Remove-VpnConnection -Name
$ConnectionName -Force -ErrorAction
SilentlyContinue
# Create new VPN connection
switch ($TunnelType) {
"L2TP" {
Add-VpnConnection -Name
$ConnectionName -ServerAddress $ServerAddress
-TunnelType L2tp -L2tpPsk $PreSharedKey -
Force
}
"PPTP" {
Add-VpnConnection -Name
$ConnectionName -ServerAddress $ServerAddress
-TunnelType Pptp -Force
}
"IKEv2" {
Add-VpnConnection -Name
$ConnectionName -ServerAddress $ServerAddress
-TunnelType IKEv2 -Force
}
}
# Set authentication method
Set-VpnConnection -Name
$ConnectionName -AuthenticationMethod
MSChapv2 -UseWinlogonCredential $false -
SplitTunneling $true -Force
# Set username and password if
provided
if ($Username -and $Password) {
            $vpnCredential = New-Object System.Management.Automation.PSCredential($Username, (ConvertTo-SecureString $Password -AsPlainText -Force))
            Set-VpnConnection -Name $ConnectionName -RememberCredential $true
            Add-VpnConnectionRoute -ConnectionName $ConnectionName -DestinationPrefix "0.0.0.0/0" -PassThru
            # Attempt to connect
            rasdial $ConnectionName $vpnCredential.UserName $vpnCredential.GetNetworkCredential().Password
}
Write-Output "VPN tunnel
'$ConnectionName' configured successfully."
Get-VpnConnection -Name
$ConnectionName | Format-List
}
catch {
Write-Error "Failed to configure VPN
tunnel: $_"
}
}
# Example usage
Configure-VPNTunnel -ConnectionName "MyVPN" -ServerAddress "vpn.example.com" -PreSharedKey "MySecretKey" -TunnelType "L2TP" -Username "user@example.com" -Password "MyPassword"
This script defines a function
Configure-VPNTunnel that sets up a VPN
connection on a Windows system. It
supports different VPN types (L2TP,
PPTP, IKEv2) and can optionally
attempt to connect using provided
credentials. This can be useful for
automating VPN setup across
multiple machines, quickly
configuring VPN access for new
users, or managing VPN connections
programmatically.
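Before (re)configuring machines in bulk, it helps to inventory what VPN profiles already exist. A short sketch using the same VpnClient cmdlets ("OldVPN" is a hypothetical profile name):

```powershell
# List existing VPN profiles and their state.
Get-VpnConnection | Format-Table Name, ServerAddress, TunnelType, ConnectionStatus

# Remove a stale profile by name (hypothetical name shown).
Remove-VpnConnection -Name "OldVPN" -Force -ErrorAction SilentlyContinue
```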
220. Automate Network
Switch Configurations
function Configure-NetworkSwitch {
param (
[string]$SwitchIP,
[string]$Username,
[string]$Password,
[string]$ConfigFile
)
try {
# Ensure Posh-SSH module is available
if (-not (Get-Module -ListAvailable -
Name Posh-SSH)) {
throw "Posh-SSH module not
available. Please install it using 'Install-
Module Posh-SSH'"
}
# Read configuration commands from
file
$commands = Get-Content $ConfigFile
# Create credential object
$securePassword = ConvertTo-
SecureString $Password -AsPlainText -Force
        $credential = New-Object System.Management.Automation.PSCredential ($Username, $securePassword)
# Establish SSH session
$session = New-SSHSession -
ComputerName $SwitchIP -Credential
$credential -AcceptKey
if ($session) {
Write-Output "Connected to switch
at $SwitchIP"
# Enter configuration mode
            $result = Invoke-SSHCommand -SessionId $session.SessionId -Command "configure terminal"
            Write-Output $result.Output
            # Execute configuration commands
            foreach ($command in $commands) {
                $result = Invoke-SSHCommand -SessionId $session.SessionId -Command $command
                Write-Output "Executing: $command"
                Write-Output $result.Output
            }
            # Exit configuration mode
            $result = Invoke-SSHCommand -SessionId $session.SessionId -Command "end"
            Write-Output $result.Output
            # Save configuration
            $result = Invoke-SSHCommand -SessionId $session.SessionId -Command "write memory"
            Write-Output $result.Output
Write-Output "Switch
configuration completed successfully."
}
else {
throw "Failed to establish SSH
connection to the switch."
}
}
catch {
Write-Error "Failed to configure
network switch: $_"
}
finally {
# Close SSH session
if ($session) {
            Remove-SSHSession -SessionId $session.SessionId | Out-Null
}
}
}
# Example usage
Configure-NetworkSwitch -SwitchIP "192.168.1.1" -Username "admin" -Password "password" -ConfigFile "C:\Configs\switch_config.txt"
This script defines a function
Configure-NetworkSwitch that automates
the configuration of a network
switch using SSH. It reads
configuration commands from a file
and executes them on the switch.
This can be useful for
standardizing switch
configurations, quickly applying
changes to multiple switches, or
maintaining consistent network
device configurations across an
organization.
These scripts provide a
comprehensive set of tools for
network administration tasks,
ranging from basic monitoring to
advanced configuration and
troubleshooting. They can be
customized and expanded to fit
specific network environments and
requirements.
Chapter 7: Advanced
Automation
221. Automate Incident
Response
function Invoke-IncidentResponse {
param (
[string]$IncidentType,
[string]$AffectedSystem
)
# Log incident details
Write-Log -Message "Incident detected:
$IncidentType on $AffectedSystem"
# Isolate affected system
Disable-NetworkAdapter -ComputerName
$AffectedSystem
# Collect system information
$systemInfo = Get-SystemInfo -
ComputerName $AffectedSystem
# Create incident ticket
New-IncidentTicket -Type $IncidentType -
System $AffectedSystem -Info $systemInfo
# Notify incident response team
Send-IncidentAlert -Type $IncidentType -
System $AffectedSystem
# Start forensic collection
Start-ForensicCollection -ComputerName
$AffectedSystem
}
This script provides a framework
for automating incident response.
It includes steps to log the
incident, isolate the affected
system, collect system information,
create an incident ticket, notify
the response team, and initiate
forensic data collection. Each
function (e.g., Disable-NetworkAdapter ,
Get-SystemInfo ) would need to be
implemented separately based on
your specific environment and
tools.
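As a starting point, a minimal Write-Log helper might look like the sketch below; the log file path is an assumption, and the other helpers (New-IncidentTicket, Send-IncidentAlert, and so on) would wrap whatever ticketing and alerting systems your organization uses:

```powershell
# One possible minimal implementation of the Write-Log helper used above.
# The log file location is an assumption; point it at your own log store.
function Write-Log {
    param (
        [string]$Message,
        [string]$LogFile = "C:\Logs\IncidentResponse.log"
    )
    # Prefix each entry with a sortable timestamp
    $entry = "{0} {1}" -f (Get-Date -Format "yyyy-MM-dd HH:mm:ss"), $Message
    Add-Content -Path $LogFile -Value $entry
}
```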
222. Configure Cloud Resources
via PowerShell
# Azure example
Import-Module Az
Connect-AzAccount
# Create a new resource group
New-AzResourceGroup -Name "MyResourceGroup" -
Location "EastUS"
# Create a storage account
New-AzStorageAccount -ResourceGroupName
"MyResourceGroup" `
-Name "mystorageaccount"
`
-Location "EastUS" `
-SkuName "Standard_LRS"
`
-Kind "StorageV2"
# Create a virtual network
$subnetConfig = New-
AzVirtualNetworkSubnetConfig -Name "MySubnet"
-AddressPrefix "[Link]/24"
New-AzVirtualNetwork -ResourceGroupName
"MyResourceGroup" `
-Location "EastUS" `
-Name "MyVNet" `
-AddressPrefix
"[Link]/16" `
-Subnet $subnetConfig
This script demonstrates how to use
PowerShell to configure cloud
resources in Azure. It creates a
resource group, a storage account,
and a virtual network with a
subnet. You'll need to have the Az
PowerShell module installed and be
authenticated to your Azure
account.
223. Automate Azure Virtual
Machine Creation
# Connect to Azure
Connect-AzAccount
# Set variables
$resourceGroup = "MyResourceGroup"
$location = "EastUS"
$vmName = "MyVM"
$vmSize = "Standard_DS1_v2"
# Create a public IP address
$publicIp = New-AzPublicIpAddress -Name
"${vmName}PublicIP" -ResourceGroupName
$resourceGroup -Location $location -
AllocationMethod Dynamic
# Create a virtual network card and associate
with public IP address
$subnetConfig = New-
AzVirtualNetworkSubnetConfig -Name "MySubnet"
-AddressPrefix "[Link]/24"
$vnet = New-AzVirtualNetwork -
ResourceGroupName $resourceGroup -Location
$location -Name "${vmName}VNET" -
AddressPrefix "[Link]/16" -Subnet
$subnetConfig
$nic = New-AzNetworkInterface -Name "${vmName}NIC" -ResourceGroupName $resourceGroup -Location $location -SubnetId $vnet.Subnets[0].Id -PublicIpAddressId $publicIp.Id
# Define a credential object
$securePassword = ConvertTo-SecureString
'P@ssw0rd123!' -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ("azureuser", $securePassword)
# Create a virtual machine configuration
$vmConfig = New-AzVMConfig -VMName $vmName -
VMSize $vmSize |
Set-AzVMOperatingSystem -Windows -
ComputerName $vmName -Credential $cred |
Set-AzVMSourceImage -PublisherName
"MicrosoftWindowsServer" -Offer
"WindowsServer" -Skus "2019-Datacenter" -
Version "latest" |
    Add-AzVMNetworkInterface -Id $nic.Id
# Create the virtual machine
New-AzVM -ResourceGroupName $resourceGroup -
Location $location -VM $vmConfig
This script automates the creation
of an Azure virtual machine. It
sets up necessary resources like a
public IP address, virtual network,
and network interface before
creating the VM itself. Remember to
replace the placeholder values with
your desired configuration.
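When you are done experimenting, the simplest cleanup is to delete the whole resource group, which removes the VM together with the NIC, public IP, and virtual network created above in a single call:

```powershell
# Tear the lab environment back down: deleting the resource group
# removes the VM and every resource created alongside it.
Remove-AzResourceGroup -Name $resourceGroup -Force
```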
224. Manage AWS Resources
Using PowerShell
# Install AWS Tools for PowerShell if not
already installed
# Install-Module -Name AWS.Tools.Installer -Force
# Install-AWSToolsModule EC2,S3 -CleanUp
# Import the AWS modules
Import-Module AWS.Tools.EC2, AWS.Tools.S3
# Set up AWS credentials (alternatively, use
AWS CLI configuration)
Set-AWSCredential -AccessKey
AKIAIOSFODNN7EXAMPLE -SecretKey
wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY -
StoreAs MyProfile
# Set the default region
Set-DefaultAWSRegion -Region us-west-2
# List EC2 instances
Get-EC2Instance | Select-Object -ExpandProperty Instances |
    Format-Table InstanceId, InstanceType, @{Name="State";Expression={$_.State.Name}}, PublicIpAddress
# Create an S3 bucket
New-S3Bucket -BucketName "my-unique-bucket-
name"
# Upload a file to S3
Write-S3Object -BucketName "my-unique-bucket-name" -File "C:\path\to\file.txt" -Key "file.txt"
# List S3 buckets
Get-S3Bucket
# Delete an S3 bucket (must be empty)
Remove-S3Bucket -BucketName "my-unique-
bucket-name" -Force
This script demonstrates how to use
PowerShell to manage AWS resources.
It includes examples of listing EC2
instances, creating and managing S3
buckets. You'll need to install the
AWS Tools for PowerShell and
configure your AWS credentials
before running these commands.
225. Automate Database Backups
function Backup-SqlDatabase {
param (
[string]$ServerInstance,
[string]$DatabaseName,
[string]$BackupPath
)
$timestamp = Get-Date -Format
"yyyyMMdd_HHmmss"
    $backupFile = Join-Path $BackupPath "$DatabaseName`_$timestamp.bak"
$query = "BACKUP DATABASE [$DatabaseName]
TO DISK = N'$backupFile' WITH NOFORMAT,
NOINIT, NAME = N'$DatabaseName-Full Database
Backup', SKIP, NOREWIND, NOUNLOAD, STATS =
10"
try {
Invoke-Sqlcmd -ServerInstance
$ServerInstance -Database "master" -Query
$query
Write-Host "Backup of $DatabaseName
completed successfully. Backup file:
$backupFile"
}
catch {
Write-Error "Failed to backup
$DatabaseName. Error: $_"
}
}
# Usage example
Backup-SqlDatabase -ServerInstance
"SQLSERVER01" -DatabaseName "MyDatabase" -
BackupPath "C:\Backups"
This script defines a function to
automate SQL Server database
backups. It creates a backup file
with a timestamp in the specified
backup path. The function uses
Invoke-Sqlcmd to execute the backup
command. Make sure you have the
necessary SQL Server modules
installed and permissions to
perform backups.
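After a backup completes, it is worth confirming that the file is actually restorable. The sketch below uses T-SQL's RESTORE VERIFYONLY via the same Invoke-Sqlcmd dependency; the function name is illustrative:

```powershell
# Verify a backup file without restoring it (RESTORE VERIFYONLY checks
# that the backup set is complete and readable).
function Test-SqlBackup {
    param (
        [string]$ServerInstance,
        [string]$BackupFile
    )
    $query = "RESTORE VERIFYONLY FROM DISK = N'$BackupFile'"
    try {
        Invoke-Sqlcmd -ServerInstance $ServerInstance -Database "master" -Query $query
        Write-Host "Backup file verified: $BackupFile"
    }
    catch {
        Write-Error "Backup verification failed for $BackupFile. Error: $_"
    }
}
```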
226. Schedule Database
Maintenance
$maintenanceTasks = @(
@{
Name = "Rebuild Indexes"
Query = "EXEC sp_MSforeachtable
@command1='print ''?''; ALTER INDEX ALL ON ?
REBUILD'"
},
@{
Name = "Update Statistics"
Query = "EXEC sp_updatestats"
},
@{
Name = "Check Database Integrity"
Query = "DBCC CHECKDB WITH
NO_INFOMSGS"
}
)
function Invoke-DatabaseMaintenance {
param (
[string]$ServerInstance,
[string]$DatabaseName
)
    foreach ($task in $maintenanceTasks) {
        Write-Host "Executing $($task.Name) on $DatabaseName..."
        try {
            Invoke-Sqlcmd -ServerInstance $ServerInstance -Database $DatabaseName -Query $task.Query -QueryTimeout 0
            Write-Host "$($task.Name) completed successfully."
        }
        catch {
            Write-Error "Failed to execute $($task.Name). Error: $_"
}
}
}
# Schedule the maintenance task
$action = New-ScheduledTaskAction -Execute "PowerShell.exe" `
-Argument "-NoProfile -ExecutionPolicy
Bypass -File
`"C:\Scripts\DatabaseMaintenance.ps1`""
$trigger = New-ScheduledTaskTrigger -Weekly -
DaysOfWeek Sunday -At 2am
Register-ScheduledTask -TaskName "Weekly
Database Maintenance" -Action $action -
Trigger $trigger -RunLevel Highest -User
"SYSTEM"
# In DatabaseMaintenance.ps1
Invoke-DatabaseMaintenance -ServerInstance
"SQLSERVER01" -DatabaseName "MyDatabase"
This script sets up a scheduled
database maintenance task. It
defines several maintenance
operations (rebuilding indexes,
updating statistics, and checking
database integrity) and creates a
Windows Scheduled Task to run these
operations weekly. Adjust the
server instance, database name, and
schedule as needed for your
environment.
227. Automate Email
Notifications
function Send-EmailNotification {
param (
[string]$SmtpServer,
[int]$Port = 587,
[string]$From,
[string[]]$To,
[string]$Subject,
[string]$Body,
[string]$Username,
[string]$Password
)
$securePassword = ConvertTo-SecureString
$Password -AsPlainText -Force
    $credential = New-Object System.Management.Automation.PSCredential ($Username, $securePassword)
$mailParams = @{
SmtpServer = $SmtpServer
Port = $Port
UseSsl = $true
Credential = $credential
From = $From
To = $To
Subject = $Subject
Body = $Body
BodyAsHtml = $true
}
try {
Send-MailMessage @mailParams
Write-Host "Email sent successfully."
}
catch {
Write-Error "Failed to send email.
Error: $_"
}
}
# Usage example
$emailParams = @{
    SmtpServer = "smtp.example.com"
    From = "sender@example.com"
    To = "recipient1@example.com", "recipient2@example.com"
    Subject = "Automated Notification"
    Body = "<h2>This is an automated notification</h2><p>Something important happened!</p>"
    Username = "sender@example.com"
    Password = "YourPasswordHere"
}
Send-EmailNotification @emailParams
This script defines a function to
send email notifications using
PowerShell. It supports HTML
content and can be easily
integrated into other scripts to
send automated notifications. Make
sure to replace the SMTP server,
email addresses, and credentials
with your own.
228. Create Slack Alerts via
Scripts
function Send-SlackMessage {
param (
[string]$WebhookUrl,
[string]$Channel,
[string]$Username,
[string]$Message,
[string]$IconEmoji
)
$payload = @{
channel = $Channel
username = $Username
text = $Message
icon_emoji = $IconEmoji
} | ConvertTo-Json
$params = @{
Uri = $WebhookUrl
Method = 'Post'
Body = $payload
ContentType = 'application/json'
}
try {
Invoke-RestMethod @params
Write-Host "Slack message sent
successfully."
}
catch {
Write-Error "Failed to send Slack
message. Error: $_"
}
}
# Usage example
$slackParams = @{
    WebhookUrl = "https://hooks.slack.com/services/T00000000/B00000000/XXXXXXXXXXXXXXXXXXXXXXXX"
Channel = "#alerts"
Username = "PowerShell Bot"
Message = "Alert: High CPU usage detected
on Server01"
IconEmoji = ":warning:"
}
Send-SlackMessage @slackParams
This script defines a function to
send Slack messages using
PowerShell. It uses a Slack webhook
URL to post messages to a specific
channel. You'll need to set up an
incoming webhook in your Slack
workspace and replace the webhook
URL in the example with your own.
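For richer formatting, incoming webhooks also accept a Block Kit payload instead of plain text. A minimal sketch (the webhook URL and alert text are placeholders):

```powershell
# Richer Slack message using Block Kit "blocks" instead of plain text.
$WebhookUrl = "https://hooks.slack.com/services/T00000000/B00000000/XXXXXXXXXXXXXXXXXXXXXXXX"
$payload = @{
    blocks = @(
        @{ type = "header";  text = @{ type = "plain_text"; text = "Server Alert" } },
        @{ type = "section"; text = @{ type = "mrkdwn"; text = "*Server01* CPU usage above *90%* for 5 minutes." } }
    )
} | ConvertTo-Json -Depth 5   # -Depth matters for the nested hashtables
Invoke-RestMethod -Uri $WebhookUrl -Method Post -Body $payload -ContentType 'application/json'
```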
229. Automate File
Synchronization
function Sync-Folders {
param (
[string]$SourcePath,
[string]$DestinationPath,
[switch]$Mirror,
[switch]$WhatIf
)
if (-not (Test-Path $SourcePath)) {
Write-Error "Source path does not
exist: $SourcePath"
return
}
if (-not (Test-Path $DestinationPath)) {
New-Item -Path $DestinationPath -
ItemType Directory -Force | Out-Null
}
$robocopyParams = @(
$SourcePath
$DestinationPath
"/E" # Copy subdirectories,
including empty ones
"/Z" # Copy files in restartable
mode
"/XO" # Exclude older files
"/NP" # Don't show progress
percentage in log
"/NDL" # No Directory List - don't
log directory names
"/NJH" # No Job Header
"/NJS" # No Job Summary
)
if ($Mirror) {
$robocopyParams += "/MIR" # Mirror
mode (equivalent to /E plus /PURGE)
}
if ($WhatIf) {
$robocopyParams += "/L" # List only -
don't copy, timestamp or delete any files
}
try {
$result = robocopy @robocopyParams
switch ($LASTEXITCODE) {
0 { Write-Host "No files were
copied. No failure was encountered. No files
were mismatched. The files already exist in
the destination directory; therefore, the
copy operation was skipped." -ForegroundColor
Green }
1 { Write-Host "All files were
copied successfully." -ForegroundColor Green
}
2 { Write-Host "There are some
additional files in the destination directory
that are not present in the source directory.
No files were copied." -ForegroundColor
Yellow }
3 { Write-Host "Some files were
copied. Additional files were present. No
failure was encountered." -ForegroundColor
Yellow }
default { Write-Warning "Robocopy
encountered errors. Exit code: $LASTEXITCODE"
}
}
}
catch {
Write-Error "An error occurred during
file synchronization: $_"
}
}
# Usage example
Sync-Folders -SourcePath "C:\SourceFolder" -
DestinationPath "D:\BackupFolder" -Mirror
This script defines a function to
synchronize folders using robocopy .
It includes options for mirroring
(which can delete files in the
destination that don't exist in the
source) and a "what-if" mode for
testing. The function interprets
robocopy's exit codes to provide
meaningful feedback.
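Robocopy's exit code is actually a bit field, so the switch statement above only covers the most common combinations. A sketch that decodes each flag individually:

```powershell
# Decode robocopy's exit code as the bit field it really is.
function Get-RobocopyResult {
    param ([int]$ExitCode)
    $flags = @{
        1  = "One or more files were copied"
        2  = "Extra files or directories were detected"
        4  = "Mismatched files or directories were detected"
        8  = "Some files or directories could not be copied"
        16 = "Serious error - robocopy did not copy any files"
    }
    foreach ($bit in ($flags.Keys | Sort-Object)) {
        if ($ExitCode -band $bit) { Write-Output $flags[$bit] }
    }
    if ($ExitCode -eq 0) { Write-Output "Nothing copied; source and destination already in sync" }
}
```

Call it immediately after robocopy runs, e.g. `Get-RobocopyResult -ExitCode $LASTEXITCODE`.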
230. Configure CI/CD Pipelines
# This script assumes you're using Azure
DevOps for CI/CD
# Install and import the Azure DevOps module
Install-Module -Name [Link] -Force -
AllowClobber
Import-Module [Link]
# Connect to Azure DevOps
$org = "https://dev.azure.com/YourOrganization"
$project = "YourProject"
$pat = "YourPersonalAccessToken"
$token = [Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($pat)"))
$header = @{authorization = "Basic $token"}
Connect-AzDevOps -Organization $org -
PersonalAccessToken $pat
# Create a new pipeline
$pipelineName = "MyNewPipeline"
$repoName = "MyRepository"
$branchName = "main"
$pipeline = @{
name = $pipelineName
folder = "\\"
configuration = @{
type = "yaml"
        path = "azure-pipelines.yml"
repository = @{
name = $repoName
type = "git"
defaultBranch =
"refs/heads/$branchName"
}
}
}
$newPipeline = New-AzDevOpsPipeline -Project
$project -PipelineObject $pipeline
# Configure pipeline variables
$variables = @(
@{
name = "BUILD_CONFIGURATION"
value = "Release"
},
@{
name = "DEPLOY_ENVIRONMENT"
value = "Production"
}
)
foreach ($var in $variables) {
    New-AzDevOpsPipelineVariable -Project $project -PipelineId $newPipeline.id -Name $var.name -Value $var.value
}
# Add a build task to the pipeline
$buildTask = @{
task = @{
id = "VSBuild@1"
versionSpec = "1.*"
}
inputs = @{
solution = "**/*.sln"
        msbuildArgs = "/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:PackageLocation=""$(build.artifactStagingDirectory)"""
platform = "Any CPU"
configuration =
"$(BUILD_CONFIGURATION)"
}
}
Add-AzDevOpsPipelineTask -Project $project -PipelineId $newPipeline.id -Task $buildTask
Write-Host "CI/CD pipeline '$pipelineName'
has been configured."
This script demonstrates how to
configure a CI/CD pipeline using
Azure DevOps and PowerShell. It
creates a new pipeline, sets up
variables, and adds a build task.
You'll need to replace placeholders
like YourOrganization, YourProject, and
YourPersonalAccessToken with your
actual Azure DevOps details.
231. Monitor Application Logs
function Monitor-ApplicationLogs {
param (
[string]$LogPath,
[string]$FilePattern = "*.log",
[int]$CheckIntervalSeconds = 60,
[string[]]$AlertKeywords,
[string]$SmtpServer,
[string]$FromEmail,
[string[]]$ToEmail
)
$lastCheck = Get-Date
while ($true) {
$currentCheck = Get-Date
        $newLogs = Get-ChildItem -Path $LogPath -Filter $FilePattern |
            Where-Object { $_.LastWriteTime -gt $lastCheck }
        foreach ($log in $newLogs) {
            $content = Get-Content $log.FullName
            foreach ($keyword in $AlertKeywords) {
                $matchedLines = $content | Select-String -Pattern $keyword
                if ($matchedLines) {
                    $subject = "Alert: $keyword found in $($log.Name)"
                    $body = "The following lines in $($log.Name) contain '$keyword':`n`n"
                    $body += $matchedLines | ForEach-Object { $_.Line } | Out-String
                    Send-MailMessage -SmtpServer $SmtpServer -From $FromEmail -To $ToEmail -Subject $subject -Body $body
                }
            }
        }
$lastCheck = $currentCheck
Start-Sleep -Seconds
$CheckIntervalSeconds
}
}
# Usage example
$monitorParams = @{
LogPath = "C:\ApplicationLogs"
FilePattern = "app_*.log"
CheckIntervalSeconds = 300 # Check every
5 minutes
AlertKeywords = @("error", "critical",
"failure")
    SmtpServer = "smtp.example.com"
    FromEmail = "alerts@example.com"
    ToEmail = @("admin1@example.com", "admin2@example.com")
}
Monitor-ApplicationLogs @monitorParams
This script defines a function to
continuously monitor application
log files for specific keywords.
When it finds a match, it sends an
email alert. The script checks for
new log entries at regular
intervals. Adjust the parameters to
fit your specific logging setup and
alert requirements.
232. Generate Compliance
Reports
function Generate-ComplianceReport {
param (
[string]$ReportPath,
[string[]]$Servers,
[hashtable]$ComplianceChecks
)
$results = @()
foreach ($server in $Servers) {
Write-Host "Checking compliance for
server: $server"
$serverResult = [PSCustomObject]@{
ServerName = $server
Checks = @()
}
        foreach ($check in $ComplianceChecks.GetEnumerator()) {
            $checkName = $check.Key
            $checkScript = $check.Value
Write-Host " Running check:
$checkName"
try {
$result = Invoke-Command -
ComputerName $server -ScriptBlock
$checkScript
$status = if ($result) {
"Compliant" } else { "Non-Compliant" }
}
catch {
$status = "Error"
                $result = $_.Exception.Message
            }
            $serverResult.Checks += [PSCustomObject]@{
CheckName = $checkName
Status = $status
Details = $result
}
}
$results += $serverResult
}
# Generate HTML report
$htmlReport = @"
<html>
<head>
<style>
body { font-family: Arial, sans-
serif; }
table { border-collapse:
collapse; width: 100%; }
th, td { border: 1px solid #ddd;
padding: 8px; }
th { background-color: #f2f2f2; }
.Compliant { color: green; }
.Non-Compliant { color: red; }
.Error { color: orange; }
</style>
</head>
<body>
<h1>Compliance Report</h1>
<h2>Generated on $(Get-Date)</h2>
$(foreach ($serverResult in $results)
{
@"
        <h3>Server: $($serverResult.ServerName)</h3>
        <table>
            <tr><th>Check</th><th>Status</th><th>Details</th></tr>
            $(foreach ($check in $serverResult.Checks) {
                "<tr><td>$($check.CheckName)</td><td class='$($check.Status)'>$($check.Status)</td><td>$($check.Details)</td></tr>"
            })
        </table>
"@
})
</body>
</html>
"@
$htmlReport | Out-File -FilePath
$ReportPath
Write-Host "Compliance report generated:
$ReportPath"
}
# Define compliance checks
$complianceChecks = @{
"Windows Updates" = { Get-HotFix | Where-
Object { $_.InstalledOn -gt (Get-
Date).AddDays(-30) } }
"Firewall Enabled" = { (Get-
NetFirewallProfile).Enabled -contains $true }
"Antivirus Status" = { Get-
MpComputerStatus | Select-Object -
ExpandProperty AntivirusEnabled }
"Disk Space" = { Get-WmiObject
Win32_LogicalDisk | Where-Object {
$_.DriveType -eq 3 -and $_.FreeSpace /
$_.Size -lt 0.1 } }
}
# Generate the report
Generate-ComplianceReport -ReportPath
"C:\Reports\ComplianceReport_$(Get-Date -
Format 'yyyyMMdd').html" `
-Servers
@("Server1", "Server2", "Server3") `
-ComplianceChecks
$complianceChecks
This script defines a function to
generate compliance reports for
multiple servers. It runs a series
of compliance checks (defined in
the $complianceChecks hashtable) on
each server and produces an HTML
report with the results. The report
highlights compliant, non-
compliant, and error states with
different colors for easy
identification.
233. Detect Anomalies in
System Logs
function Detect-LogAnomalies {
param (
[string]$LogName = "System",
[int]$HoursToAnalyze = 24,
[int]$ThresholdMultiplier = 2
)
$startTime = (Get-
Date).AddHours(-$HoursToAnalyze)
$logs = Get-WinEvent -FilterHashtable @{
LogName = $LogName
StartTime = $startTime
} -ErrorAction SilentlyContinue
if (-not $logs) {
Write-Warning "No logs found in the
specified time range."
return
}
# Group logs by EventID and calculate
average occurrences
$eventCounts = $logs | Group-Object -
Property Id | ForEach-Object {
[PSCustomObject]@{
EventID = $_.Name
Count = $_.Count
AveragePerHour = $_.Count /
$HoursToAnalyze
}
}
# Calculate overall average
$overallAverage = ($eventCounts |
Measure-Object -Property AveragePerHour -
Average).Average
# Detect anomalies
$anomalies = $eventCounts | Where-Object
{ $_.AveragePerHour -gt ($overallAverage *
$ThresholdMultiplier) }
if ($anomalies) {
Write-Host "Anomalies detected in
$LogName log:" -ForegroundColor Yellow
$anomalies | Format-Table -AutoSize
# Get details of anomalous events
        foreach ($anomaly in $anomalies) {
            $eventDetails = $logs | Where-Object { $_.Id -eq $anomaly.EventID } |
                Select-Object -First 5 TimeCreated, Id, LevelDisplayName, Message
            Write-Host "Details for EventID $($anomaly.EventID):" -ForegroundColor Cyan
$eventDetails | Format-List
}
}
else {
Write-Host "No anomalies detected in
$LogName log." -ForegroundColor Green
}
}
# Usage example
Detect-LogAnomalies -LogName "System" -
HoursToAnalyze 48 -ThresholdMultiplier 3
This script defines a function to
detect anomalies in Windows event
logs. It analyzes the specified log
(default is the System log) for a
given time period and identifies
event types that occur more
frequently than usual. The script
uses a simple threshold-based
approach, flagging events that
occur more than a specified
multiple of the average occurrence
rate.
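The fixed multiplier works, but a standard-deviation threshold adapts better when event volumes vary widely between event IDs. A sketch of that variant, intended to replace the anomaly calculation inside the function (note that Measure-Object's -StandardDeviation switch requires PowerShell 7+):

```powershell
# Variant: flag event IDs whose hourly rate is more than two standard
# deviations above the mean, instead of a fixed multiple of the average.
$stats = $eventCounts | Measure-Object -Property AveragePerHour -Average -StandardDeviation
$threshold = $stats.Average + (2 * $stats.StandardDeviation)
$anomalies = $eventCounts | Where-Object { $_.AveragePerHour -gt $threshold }
```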
234. Configure IoT Devices
function Configure-IoTDevice {
param (
[string]$DeviceIP,
[int]$Port = 22,
[string]$Username,
[string]$Password,
[hashtable]$Configuration
)
# Ensure the Posh-SSH module is installed
if (-not (Get-Module -ListAvailable -Name
Posh-SSH)) {
Install-Module -Name Posh-SSH -Force
-Scope CurrentUser
}
try {
# Create SSH session
        $securePassword = ConvertTo-SecureString $Password -AsPlainText -Force
        $credential = New-Object System.Management.Automation.PSCredential ($Username, $securePassword)
$session = New-SSHSession -
ComputerName $DeviceIP -Port $Port -
Credential $credential -AcceptKey
        if ($session.Connected) {
Write-Host "Connected to
$DeviceIP" -ForegroundColor Green
# Apply configuration
            foreach ($setting in $Configuration.GetEnumerator()) {
                $command = "sudo sed -i 's/^$($setting.Key)=.*/$($setting.Key)=$($setting.Value)/' /etc/iot.conf"
                $result = Invoke-SSHCommand -SessionId $session.SessionId -Command $command
                if ($result.ExitStatus -eq 0) {
                    Write-Host "Updated $($setting.Key) to $($setting.Value)" -ForegroundColor Green
                }
                else {
                    Write-Warning "Failed to update $($setting.Key). Error: $($result.Error)"
                }
            }
            # Restart IoT service to apply changes
            $restartResult = Invoke-SSHCommand -SessionId $session.SessionId -Command "sudo systemctl restart iotservice"
            if ($restartResult.ExitStatus -eq 0) {
                Write-Host "IoT service restarted successfully" -ForegroundColor Green
            }
            else {
                Write-Warning "Failed to restart IoT service. Error: $($restartResult.Error)"
            }
}
else {
Write-Error "Failed to connect to
$DeviceIP"
}
}
catch {
Write-Error "An error occurred: $_"
}
finally {
# Close the SSH session
if ($session) {
            Remove-SSHSession -SessionId $session.SessionId | Out-Null
}
}
}
# Usage example
$deviceConfig = @{
"SENSOR_INTERVAL" = "300"
"DATA_UPLOAD_FREQUENCY" = "3600"
"LOGGING_LEVEL" = "INFO"
    "MQTT_BROKER" = "mqtt.example.com"
"MQTT_PORT" = "1883"
}
Configure-IoTDevice -DeviceIP "192.168.1.100" -Username "admin" -Password "password123" -Configuration $deviceConfig
This script defines a function to
configure IoT devices remotely
using SSH. It assumes the devices
are running a Linux-based operating
system and have a configuration
file at /etc/iot.conf. The script
uses the Posh-SSH module to
establish an SSH connection, update
the configuration file, and restart
the IoT service. Adjust the paths
and commands as necessary for your
specific IoT device setup.
235. Integrate PowerShell with
APIs
function Invoke-ApiRequest {
param (
[string]$BaseUrl,
[string]$Endpoint,
[string]$Method = "GET",
[hashtable]$Headers,
[object]$Body,
[string]$ContentType =
"application/json"
)
$url = "$BaseUrl/$Endpoint"
$params = @{
Method = $Method
Uri = $url
Headers = $Headers
ContentType = $ContentType
}
    if ($Body) {
        $params.Body = if ($ContentType -eq "application/json") { $Body | ConvertTo-Json } else { $Body }
    }
try {
$response = Invoke-RestMethod @params
return $response
}
catch {
Write-Error "API request failed: $_"
return $null
}
}
function Get-WeatherForecast {
param (
[string]$City,
[string]$ApiKey
)
    $baseUrl = "https://api.openweathermap.org/data/2.5"
    $endpoint = "forecast?q=$City&appid=$ApiKey&units=metric"
$response = Invoke-ApiRequest -BaseUrl
$baseUrl -Endpoint $endpoint
    if ($response) {
        $forecast = $response.list | Select-Object -First 5 | ForEach-Object {
            [PSCustomObject]@{
                DateTime = [DateTime]::Parse($_.dt_txt)
                Temperature = $_.main.temp
Description =
$_.weather[0].description
}
}
return $forecast
}
else {
Write-Warning "Failed to retrieve
weather forecast for $City"
return $null
}
}
function Post-ToSlack {
param (
[string]$WebhookUrl,
[string]$Message
)
$body = @{
text = $Message
}
$response = Invoke-ApiRequest -BaseUrl
$WebhookUrl -Method "POST" -Body $body
if ($response -eq "ok") {
Write-Host "Message posted to Slack
successfully" -ForegroundColor Green
}
else {
Write-Warning "Failed to post message
to Slack"
}
}
# Usage examples
$weatherApiKey = "your_openweathermap_api_key"
$slackWebhookUrl = "https://hooks.slack.com/services/XXXXXXXX/YYYYYYYY/ZZZZZZZZZZZZZZZZZZZZZZZZ"
$forecast = Get-WeatherForecast -City
"London" -ApiKey $weatherApiKey
if ($forecast) {
$message = "Weather forecast for
London:`n"
    $forecast | ForEach-Object {
        $message += "$($_.DateTime.ToString('yyyy-MM-dd HH:mm')): $($_.Temperature)°C, $($_.Description)`n"
}
Post-ToSlack -WebhookUrl $slackWebhookUrl
-Message $message
}
This script demonstrates how to
integrate PowerShell with APIs. It
includes a generic Invoke-ApiRequest
function that can be used to make
various API calls, and two specific
functions that use this generic
function:
1. Get-WeatherForecast:
Retrieves weather
forecast data from the
OpenWeatherMap API.
2. Post-ToSlack: Posts a message to a
Slack channel using a webhook
URL.
The script then shows how to use
these functions together to fetch a
weather forecast and post it to
Slack. Remember to replace
your_openweathermap_api_key and the Slack
webhook URL with your actual API
credentials.
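Public APIs commonly fail transiently (rate limits, timeouts), so it is worth wrapping calls like Invoke-ApiRequest in a retry loop with exponential backoff. A sketch of a generic wrapper (the function name is illustrative):

```powershell
# Retry wrapper with exponential backoff around any script block.
function Invoke-WithRetry {
    param (
        [scriptblock]$Action,
        [int]$MaxAttempts = 3,
        [int]$InitialDelaySeconds = 2
    )
    $delay = $InitialDelaySeconds
    for ($attempt = 1; $attempt -le $MaxAttempts; $attempt++) {
        try {
            return & $Action
        }
        catch {
            if ($attempt -eq $MaxAttempts) { throw }   # give up on the last attempt
            Write-Warning "Attempt $attempt failed: $_. Retrying in $delay seconds..."
            Start-Sleep -Seconds $delay
            $delay *= 2   # exponential backoff
        }
    }
}

# Usage: Invoke-WithRetry -Action { Invoke-RestMethod -Uri $url }
```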
236. Automate DevOps Pipelines
# This script demonstrates automating a
DevOps pipeline using Azure DevOps REST API
# Function to invoke Azure DevOps API
function Invoke-AzureDevOpsApi {
param (
[string]$Organization,
[string]$Project,
[string]$ApiVersion = "6.0",
[string]$Resource,
[string]$Method = "GET",
[object]$Body,
[string]$PersonalAccessToken
)
    $url = "https://dev.azure.com/$Organization/$Project/_apis/$Resource`?api-version=$ApiVersion"
    $token = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes(":$($PersonalAccessToken)"))
$headers = @{
Authorization = "Basic $token"
}
$params = @{
Uri = $url
Method = $Method
Headers = $headers
ContentType = "application/json"
}
if ($Body) {
        $params.Body = $Body | ConvertTo-Json -Depth 10
}
try {
$response = Invoke-RestMethod @params
return $response
}
catch {
Write-Error "API request failed: $_"
return $null
}
}
# Function to create a new build definition
function New-BuildDefinition {
param (
[string]$Organization,
[string]$Project,
[string]$RepoName,
[string]$BranchName,
[string]$BuildName,
[string]$PersonalAccessToken
)
$buildDefinition = @{
name = $BuildName
type = "build"
quality = "definition"
queue = @{
name = "Hosted Windows 2019 with
VS2019"
}
process = @{
type = 2
            yamlFilename = "azure-pipelines.yml"
}
repository = @{
id = $RepoName
type = "tfsgit"
name = $RepoName
defaultBranch =
"refs/heads/$BranchName"
            url = "https://dev.azure.com/$Organization/$Project/_git/$RepoName"
}
}
$response = Invoke-AzureDevOpsApi -
Organization $Organization -Project $Project
-Resource "build/definitions" -Method "POST"
-Body $buildDefinition -PersonalAccessToken
$PersonalAccessToken
if ($response) {
        Write-Host "Build definition '$BuildName' created successfully. ID: $($response.id)" -ForegroundColor Green
        return $response.id
}
else {
Write-Warning "Failed to create build
definition"
return $null
}
}
# Function to queue a new build
function Start-Build {
param (
[string]$Organization,
[string]$Project,
[int]$DefinitionId,
[string]$PersonalAccessToken
)
$buildRequest = @{
definition = @{
id = $DefinitionId
}
}
$response = Invoke-AzureDevOpsApi -
Organization $Organization -Project $Project
-Resource "build/builds" -Method "POST" -Body
$buildRequest -PersonalAccessToken
$PersonalAccessToken
if ($response) {
        Write-Host "Build queued successfully. Build ID: $($response.id)" -ForegroundColor Green
        return $response.id
}
else {
Write-Warning "Failed to queue build"
return $null
}
}
# Function to check build status
function Get-BuildStatus {
param (
[string]$Organization,
[string]$Project,
[int]$BuildId,
[string]$PersonalAccessToken
)
$response = Invoke-AzureDevOpsApi -
Organization $Organization -Project $Project
-Resource "build/builds/$BuildId" -
PersonalAccessToken $PersonalAccessToken
if ($response) {
        return $response.status
}
else {
Write-Warning "Failed to get build
status"
return $null
}
}
# Main script execution
$organization = "YourOrganization"
$project = "YourProject"
$repoName = "YourRepository"
$branchName = "main"
$buildName = "AutomatedBuild-$(Get-Date -
Format 'yyyyMMdd-HHmmss')"
$personalAccessToken =
"YourPersonalAccessToken"
# Create a new build definition
$definitionId = New-BuildDefinition -
Organization $organization -Project $project
-RepoName $repoName -BranchName $branchName -
BuildName $buildName -PersonalAccessToken
$personalAccessToken
if ($definitionId) {
# Queue a new build
$buildId = Start-Build -Organization
$organization -Project $project -DefinitionId
$definitionId -PersonalAccessToken
$personalAccessToken
if ($buildId) {
# Check build status
do {
$status = Get-BuildStatus -
Organization $organization -Project $project
-BuildId $buildId -PersonalAccessToken
$personalAccessToken
Write-Host "Current build status:
$status"
Start-Sleep -Seconds 30
} while ($status -eq "inProgress")
Write-Host "Final build status:
$status" -ForegroundColor Cyan
}
}
This script demonstrates how to
automate a DevOps pipeline using
the Azure DevOps REST API. It
includes functions to:
1. Create a new build definition
2. Queue a new build
3. Check the status of a build
The script creates a new build
definition, starts a build, and
then monitors the build status
until it completes. This can be
integrated into larger automation
workflows or scheduled tasks to
trigger builds automatically.
Remember to replace YourOrganization ,
YourProject , YourRepository , and
YourPersonalAccessToken with your actual
Azure DevOps details.
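One caveat with the polling loop above: if a build hangs, the script polls forever. A hedged variant of the same loop adds a deadline; `$timeoutMinutes` is an illustrative addition, not part of the original script:

```powershell
# Sketch: poll build status, but give up after a configurable timeout
$timeoutMinutes = 60
$deadline = (Get-Date).AddMinutes($timeoutMinutes)
do {
    $status = Get-BuildStatus -Organization $organization -Project $project `
        -BuildId $buildId -PersonalAccessToken $personalAccessToken
    Write-Host "Current build status: $status"
    Start-Sleep -Seconds 30
} while ($status -eq "inProgress" -and (Get-Date) -lt $deadline)
if ($status -eq "inProgress") {
    Write-Warning "Build did not finish within $timeoutMinutes minutes."
}
```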
237. Configure PowerShell DSC
# Install required modules
Install-Module -Name
PSDesiredStateConfiguration -Force
Install-Module -Name ComputerManagementDsc -
Force
# Define the configuration
Configuration WebServerConfig {
param (
[string[]]$ComputerName = "localhost"
)
Import-DscResource -ModuleName
PSDesiredStateConfiguration
Import-DscResource -ModuleName
ComputerManagementDsc
Node $ComputerName {
# Install IIS role
WindowsFeature IIS {
Ensure = "Present"
Name = "Web-Server"
}
        # Install ASP.NET 4.5
WindowsFeature ASP {
Ensure = "Present"
Name = "Web-Asp-Net45"
}
# Configure firewall rule
Firewall IISFirewallRule {
            Name = "IIS-WebServerRole-HTTP-In-TCP"
Ensure = "Present"
Enabled = "True"
}
# Set timezone
TimeZone SetTimeZone {
IsSingleInstance = 'Yes'
TimeZone = 'Pacific Standard
Time'
}
# Create a website
Website DefaultSite {
Ensure = "Present"
Name = "Default Web Site"
PhysicalPath =
"C:\inetpub\wwwroot"
State = "Started"
DependsOn = "[WindowsFeature]IIS"
}
# Add a file to the website
File IndexFile {
Ensure = "Present"
Type = "File"
            DestinationPath = "C:\inetpub\wwwroot\index.html"
Contents = "<html><body>
<h1>Hello, DSC!</h1></body></html>"
}
}
}
# Generate the MOF file
WebServerConfig -ComputerName "WebServer01" -
OutputPath "C:\DSC\WebServerConfig"
# Apply the configuration
Start-DscConfiguration -Path
"C:\DSC\WebServerConfig" -Wait -Verbose -
Force
# Test the configuration
Test-DscConfiguration -Path
"C:\DSC\WebServerConfig"
# Get the current configuration status
Get-DscConfigurationStatus
# Function to monitor DSC configuration
function Monitor-DscConfiguration {
param (
[int]$IntervalSeconds = 300,
        [string]$LogPath = "C:\DSC\DscMonitor.log"
)
while ($true) {
$status = Get-DscConfigurationStatus
        $logEntry = "$(Get-Date) - Status: $($status.Status), StartDate: $($status.StartDate)"
        if ($status.ResourcesInDesiredState) {
            $logEntry += ", ResourcesInDesiredState: $($status.ResourcesInDesiredState.Count)"
        }
        if ($status.ResourcesNotInDesiredState) {
            $logEntry += ", ResourcesNotInDesiredState: $($status.ResourcesNotInDesiredState.Count)"
        }
        Add-Content -Path $LogPath -Value $logEntry
        if ($status.Status -eq "Failure") {
Write-Warning "DSC configuration
failed. Check the log for details."
break
}
Start-Sleep -Seconds $IntervalSeconds
}
}
# Start monitoring
Monitor-DscConfiguration -IntervalSeconds 600 -LogPath "C:\DSC\DscMonitor.log"
This script demonstrates how to use
PowerShell Desired State
Configuration (DSC) to configure a
web server. It includes:
1. Installation of required modules
2. Definition of a DSC configuration
for a web server (IIS, ASP.NET,
firewall rule, timezone, website)
3. Generation and application of the
MOF (Managed Object Format) file
4. Testing and getting the status of
the DSC configuration
5. A function to continuously
monitor the DSC configuration
status
The configuration ensures that IIS
is installed, ASP.NET 4.5 is
enabled, a firewall rule is set up,
the timezone is configured, a
default website is created, and an
index file is added to the website.
The monitoring function logs the
configuration status at regular
intervals and alerts if the
configuration fails.
Remember to run this script with
administrative privileges and
adjust the computer name, paths,
and other settings as needed for
your environment.
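Beyond a one-off `Start-DscConfiguration`, the Local Configuration Manager (LCM) on each node can be told to re-check the configuration periodically and auto-correct drift. A minimal meta-configuration sketch — the node name matches the example above, and the interval values are illustrative:

```powershell
# Sketch: configure the LCM to reapply the configuration if drift is detected
[DSCLocalConfigurationManager()]
configuration LcmConfig {
    Node "WebServer01" {
        Settings {
            ConfigurationMode = "ApplyAndAutoCorrect"  # reapply on drift
            ConfigurationModeFrequencyMins = 30        # how often to check
            RefreshMode = "Push"
        }
    }
}
LcmConfig -OutputPath "C:\DSC\LcmConfig"
Set-DscLocalConfigurationManager -Path "C:\DSC\LcmConfig" -Verbose
```

With `ApplyAndAutoCorrect` in place, the monitoring function above becomes a safety net rather than the primary enforcement mechanism.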
238. Manage Containers Using
PowerShell
# Ensure Docker is installed and the Docker
service is running
if (-not (Get-Command docker -ErrorAction
SilentlyContinue)) {
Write-Error "Docker is not installed or
not in PATH. Please install Docker and try
again."
exit
}
# Function to manage Docker containers
function Manage-DockerContainers {
param (
[Parameter(Mandatory=$true)]
[ValidateSet("List", "Start", "Stop",
"Remove", "Inspect", "Logs")]
[string]$Action,
[string]$ContainerName,
[string]$ImageName,
[string]$Port,
[hashtable]$EnvironmentVariables
)
switch ($Action) {
"List" {
docker ps -a
}
"Start" {
if (-not $ImageName) {
Write-Error "ImageName is
required for Start action"
return
}
$envParams = @()
if ($EnvironmentVariables) {
            $EnvironmentVariables.GetEnumerator() | ForEach-Object {
$envParams += "-e",
"$($_.Key)=$($_.Value)"
}
}
$portParam = if ($Port) { @("-p",
"$Port") } else { @() }
docker run -d --name
$ContainerName $portParam $envParams
$ImageName
}
"Stop" {
docker stop $ContainerName
}
"Remove" {
docker rm -f $ContainerName
}
"Inspect" {
docker inspect $ContainerName
}
"Logs" {
docker logs $ContainerName
}
}
}
# Function to build a Docker image
function Build-DockerImage {
param (
[Parameter(Mandatory=$true)]
[string]$DockerfilePath,
[Parameter(Mandatory=$true)]
[string]$ImageName,
[string]$Tag = "latest"
)
docker build -t "${ImageName}:${Tag}" -f
$DockerfilePath (Split-Path $DockerfilePath)
}
# Function to push a Docker image to a
registry
function Push-DockerImage {
param (
[Parameter(Mandatory=$true)]
[string]$ImageName,
[string]$Tag = "latest",
[string]$RegistryUrl
)
if ($RegistryUrl) {
$fullImageName =
"${RegistryUrl}/${ImageName}:${Tag}"
docker tag "${ImageName}:${Tag}"
$fullImageName
}
else {
$fullImageName =
"${ImageName}:${Tag}"
}
docker push $fullImageName
}
# Example usage
# List all containers
Manage-DockerContainers -Action List
# Start a new container
$envVars = @{
"DB_HOST" = "localhost"
"DB_PORT" = "5432"
}
Manage-DockerContainers -Action Start -
ContainerName "my-app" -ImageName "my-app-
image" -Port "8080:80" -EnvironmentVariables
$envVars
# Stop a container
Manage-DockerContainers -Action Stop -
ContainerName "my-app"
# Remove a container
Manage-DockerContainers -Action Remove -
ContainerName "my-app"
# Inspect a container
Manage-DockerContainers -Action Inspect -
ContainerName "my-app"
# View container logs
Manage-DockerContainers -Action Logs -
ContainerName "my-app"
# Build a Docker image
Build-DockerImage -DockerfilePath
"C:\Projects\MyApp\Dockerfile" -ImageName
"my-app-image"
# Push a Docker image to a registry
Push-DockerImage -ImageName "my-app-image" -Tag "v1.0" -RegistryUrl "myregistry.example.com"
This script provides a set of
PowerShell functions to manage
Docker containers and images using
the Docker CLI. It includes:
1. A Manage-DockerContainers function that
can:
List all containers
Start a new container with
specified image, port mappings,
and environment variables
Stop a container
Remove a container
Inspect a container
View container logs
2. A Build-DockerImage function to build
a Docker image from a Dockerfile
3. A Push-DockerImage function to push a
Docker image to a registry
These functions wrap Docker CLI
commands, making it easier to
automate Docker-related tasks in
PowerShell scripts. The script
includes example usage for each
function.
Remember to have Docker installed
and running on your system before
using these functions. Also, ensure
you have the necessary permissions
to manage Docker containers and
images.
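The `docker ps` output above is plain text, which is awkward to filter in PowerShell. The Docker CLI's `--format` flag can emit one JSON object per line, which converts cleanly into objects. A minimal sketch — the function name is illustrative, not part of the original script:

```powershell
# Sketch: turn `docker ps` output into PowerShell objects via the JSON template
function Get-DockerContainerObjects {
    docker ps -a --format '{{json .}}' | ForEach-Object {
        $_ | ConvertFrom-Json
    }
}

# Example: list only running containers by name, image, and status
Get-DockerContainerObjects |
    Where-Object { $_.State -eq "running" } |
    Select-Object Names, Image, Status
```

This keeps the rest of the pipeline idiomatic: `Where-Object`, `Sort-Object`, and `Export-Csv` all work on the parsed objects.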
239. Monitor Kubernetes
Clusters
# Ensure kubectl is installed and configured
if (-not (Get-Command kubectl -ErrorAction
SilentlyContinue)) {
Write-Error "kubectl is not installed or
not in PATH. Please install kubectl and
configure it for your cluster."
exit
}
# Function to get cluster information
function Get-ClusterInfo {
kubectl cluster-info
}
# Function to get node status
function Get-NodeStatus {
kubectl get nodes
}
# Function to get pod status
function Get-PodStatus {
param (
[string]$Namespace = "default"
)
kubectl get pods -n $Namespace
}
# Function to get service status
function Get-ServiceStatus {
param (
[string]$Namespace = "default"
)
kubectl get services -n $Namespace
}
# Function to get deployment status
function Get-DeploymentStatus {
param (
[string]$Namespace = "default"
)
kubectl get deployments -n $Namespace
}
# Function to get logs for a pod
function Get-PodLogs {
param (
[Parameter(Mandatory=$true)]
[string]$PodName,
[string]$Namespace = "default",
[int]$TailLines = 100
)
    kubectl logs $PodName -n $Namespace --tail=$TailLines
}
# Function to describe a resource
function Get-ResourceDescription {
param (
[Parameter(Mandatory=$true)]
[string]$ResourceType,
[Parameter(Mandatory=$true)]
[string]$ResourceName,
[string]$Namespace = "default"
)
kubectl describe $ResourceType
$ResourceName -n $Namespace
}
# Function to monitor cluster health
function Monitor-ClusterHealth {
param (
[int]$IntervalSeconds = 300,
        [string]$LogPath = "C:\K8s\ClusterHealth.log"
)
while ($true) {
        $timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
        $nodeStatus = kubectl get nodes -o json | ConvertFrom-Json
        $podStatus = kubectl get pods --all-namespaces -o json | ConvertFrom-Json
        $healthSummary = @"
Timestamp: $timestamp
Nodes:
  Total: $($nodeStatus.items.Count)
  Ready: $(($nodeStatus.items | Where-Object { $_.status.conditions | Where-Object { $_.type -eq 'Ready' -and $_.status -eq 'True' } } | Measure-Object).Count)
Pods:
  Total: $($podStatus.items.Count)
  Running: $(($podStatus.items | Where-Object { $_.status.phase -eq 'Running' } | Measure-Object).Count)
  Pending: $(($podStatus.items | Where-Object { $_.status.phase -eq 'Pending' } | Measure-Object).Count)
  Failed: $(($podStatus.items | Where-Object { $_.status.phase -eq 'Failed' } | Measure-Object).Count)
"@
Add-Content -Path $LogPath -Value
$healthSummary
# Check for issues
        $notReadyNodes = $nodeStatus.items | Where-Object { $_.status.conditions | Where-Object { $_.type -eq 'Ready' -and $_.status -ne 'True' } }
        $failedPods = $podStatus.items | Where-Object { $_.status.phase -eq 'Failed' }
if ($notReadyNodes -or $failedPods) {
$issues = "Issues detected:`n"
if ($notReadyNodes) {
$issues += "Not Ready
Nodes:`n"
                $issues += $notReadyNodes | ForEach-Object { "  $($_.metadata.name)`n" }
}
if ($failedPods) {
$issues += "Failed Pods:`n"
                $issues += $failedPods | ForEach-Object { "  $($_.metadata.namespace)/$($_.metadata.name)`n" }
}
Add-Content -Path $LogPath -Value
$issues
Write-Warning $issues
}
Start-Sleep -Seconds $IntervalSeconds
}
}
# Example usage
# Get cluster information
Get-ClusterInfo
# Get node status
Get-NodeStatus
# Get pod status in the default namespace
Get-PodStatus
# Get pod status in a specific namespace
Get-PodStatus -Namespace "kube-system"
# Get service status
Get-ServiceStatus
# Get deployment status
Get-DeploymentStatus
# Get logs for a specific pod
Get-PodLogs -PodName "my-pod-xyz123" -
Namespace "default" -TailLines 50
# Describe a specific resource
Get-ResourceDescription -ResourceType "pod" -
ResourceName "my-pod-xyz123" -Namespace
"default"
# Start monitoring cluster health
Monitor-ClusterHealth -IntervalSeconds 600 -LogPath "C:\K8s\ClusterHealth.log"
This script provides a set of
PowerShell functions to monitor and
manage Kubernetes clusters using
the kubectl command-line tool. It
includes:
1. Functions to get various cluster
information:
Cluster info
Node status
Pod status
Service status
Deployment status
2. A function to get logs from a
specific pod
3. A function to describe a specific
Kubernetes resource
4. A Monitor-ClusterHealth function that
continuously monitors the cluster
health, logging the status of
nodes and pods, and alerting on
issues like not ready nodes or
failed pods
These functions wrap kubectl
commands, making it easier to
automate Kubernetes-related tasks
in PowerShell scripts. The script
includes example usage for each
function.
Remember to have kubectl installed
and properly configured to connect
to your Kubernetes cluster before
using these functions. Also, ensure
you have the necessary permissions
to access and manage the cluster
resources.
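The `kubectl ... -o json | ConvertFrom-Json` pattern used in `Monitor-ClusterHealth` generalizes to any resource type. A small helper sketch — the function name is illustrative, and `Out-String` joins kubectl's output lines so `ConvertFrom-Json` also works on Windows PowerShell 5.1:

```powershell
# Sketch: fetch any Kubernetes resource list as PowerShell objects
function Get-KubeResource {
    param (
        [Parameter(Mandatory=$true)]
        [string]$ResourceType,
        [string]$Namespace = "default"
    )
    # kubectl list requests return a JSON object with an .items array
    $json = kubectl get $ResourceType -n $Namespace -o json | Out-String
    ($json | ConvertFrom-Json).items
}

# Example: names of pods with more than 5 container restarts
Get-KubeResource -ResourceType "pods" |
    Where-Object { ($_.status.containerStatuses | Measure-Object -Property restartCount -Sum).Sum -gt 5 } |
    ForEach-Object { $_.metadata.name }
```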
240. Automate VM Snapshots
# Ensure Hyper-V module is available
if (-not (Get-Module -ListAvailable -Name
Hyper-V)) {
Write-Error "Hyper-V module is not
available. Please ensure Hyper-V is installed
and you have the necessary permissions."
exit
}
Import-Module Hyper-V
# Function to create a VM snapshot
function New-VMSnapshot {
param (
[Parameter(Mandatory=$true)]
[string]$VMName,
[string]$SnapshotName =
"Automated-$(Get-Date -Format 'yyyyMMdd-
HHmmss')"
)
try {
$snapshot = Checkpoint-VM -Name
$VMName -SnapshotName $SnapshotName -PassThru
Write-Host "Snapshot '$SnapshotName'
created for VM '$VMName'" -ForegroundColor
Green
return $snapshot
}
catch {
Write-Error "Failed to create
snapshot for VM '$VMName': $_"
}
}
# Function to remove a VM snapshot
function Remove-VMSnapshot {
param (
[Parameter(Mandatory=$true)]
[string]$VMName,
[Parameter(Mandatory=$true)]
[string]$SnapshotName
)
try {
        # Module-qualify the cmdlet so this wrapper does not recurse into itself
        Hyper-V\Remove-VMSnapshot -VMName $VMName -Name $SnapshotName
Write-Host "Snapshot '$SnapshotName'
removed from VM '$VMName'" -ForegroundColor
Green
}
catch {
Write-Error "Failed to remove
snapshot '$SnapshotName' from VM '$VMName':
$_"
}
}
# Function to list VM snapshots
function Get-VMSnapshotList {
param (
[Parameter(Mandatory=$true)]
[string]$VMName
)
try {
$snapshots = Get-VMSnapshot -VMName
$VMName
if ($snapshots) {
Write-Host "Snapshots for VM
'$VMName':" -ForegroundColor Cyan
$snapshots | Format-Table Name,
CreationTime
}
else {
Write-Host "No snapshots found
for VM '$VMName'" -ForegroundColor Yellow
}
}
catch {
Write-Error "Failed to list snapshots
for VM '$VMName': $_"
}
}
# Function to revert VM to a snapshot
function Restore-VMSnapshot {
param (
[Parameter(Mandatory=$true)]
[string]$VMName,
[Parameter(Mandatory=$true)]
[string]$SnapshotName
)
try {
        # Module-qualify the cmdlet so this wrapper does not recurse into itself
        Hyper-V\Restore-VMSnapshot -VMName $VMName -Name $SnapshotName -Confirm:$false
Write-Host "VM '$VMName' reverted to
snapshot '$SnapshotName'" -ForegroundColor
Green
}
catch {
Write-Error "Failed to revert VM
'$VMName' to snapshot '$SnapshotName': $_"
}
}
# Function to automate snapshot creation for
multiple VMs
function New-AutomatedVMSnapshots {
param (
[string[]]$VMNames,
[int]$RetentionDays = 7
)
foreach ($vm in $VMNames) {
# Create new snapshot
$snapshotName = "Auto-$vm-$(Get-Date
-Format 'yyyyMMdd-HHmmss')"
New-VMSnapshot -VMName $vm -
SnapshotName $snapshotName
# Remove old snapshots
$oldSnapshots = Get-VMSnapshot -
VMName $vm | Where-Object { $_.CreationTime -
lt (Get-Date).AddDays(-$RetentionDays) }
foreach ($oldSnapshot in
$oldSnapshots) {
            Remove-VMSnapshot -VMName $vm -SnapshotName $oldSnapshot.Name
}
}
}
# Function to schedule automated snapshots
function Schedule-AutomatedSnapshots {
param (
[string[]]$VMNames,
[int]$RetentionDays = 7,
[string]$ScheduleTime = "03:00"
)
$action = New-ScheduledTaskAction -
Execute "PowerShell.exe" `
-Argument "-NoProfile -
ExecutionPolicy Bypass -File
`"$PSScriptRoot\AutomateVMSnapshots.ps1`" -
VMNames $($VMNames -join ',') -RetentionDays
$RetentionDays"
$trigger = New-ScheduledTaskTrigger -
Daily -At $ScheduleTime
Register-ScheduledTask -TaskName
"Automated VM Snapshots" -Action $action -
Trigger $trigger -RunLevel Highest -User
"SYSTEM"
Write-Host "Automated VM snapshots
scheduled to run daily at $ScheduleTime" -
ForegroundColor Green
}
# Example usage
# Create a snapshot for a single VM
New-VMSnapshot -VMName "WebServer01"
# List snapshots for a VM
Get-VMSnapshotList -VMName "WebServer01"
# Remove a specific snapshot
Remove-VMSnapshot -VMName "WebServer01" -
SnapshotName "Snapshot1"
# Revert a VM to a specific snapshot
Restore-VMSnapshot -VMName "WebServer01" -
SnapshotName "Snapshot2"
# Automate snapshots for multiple VMs
$vmsToSnapshot = @("WebServer01",
"DatabaseServer01", "AppServer01")
New-AutomatedVMSnapshots -VMNames
$vmsToSnapshot -RetentionDays 5
# Schedule automated snapshots
Schedule-AutomatedSnapshots -VMNames
$vmsToSnapshot -RetentionDays 5 -ScheduleTime
"02:00"
This script provides a set of
PowerShell functions to automate VM
snapshot management using Hyper-V.
It includes:
1. Functions for basic snapshot
operations:
Creating a snapshot
Removing a snapshot
Listing snapshots
Reverting to a snapshot
2. A function to automate snapshot
creation for multiple VMs,
including retention policy
enforcement
3. A function to schedule automated
snapshots using Windows Task
Scheduler
These functions use the Hyper-V
PowerShell module to manage VM
snapshots. The script includes
example usage for each function.
To use this script:
1. Ensure you have the necessary
permissions to manage Hyper-V VMs
and create scheduled tasks.
2. Save the script as
AutomateVMSnapshots.ps1 in a known
location.
3. Run the script with
administrative privileges.
4. Use the provided functions to
manage snapshots manually or set
up automated snapshot creation.
Remember to adjust the VM names,
retention periods, and schedule
times according to your specific
requirements and environment.
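Before enabling automated retention, it is worth auditing existing snapshot sprawl across the host. A brief sketch using the same Hyper-V cmdlets — the report path is illustrative:

```powershell
# Sketch: report snapshot age per VM and export to CSV for review
Get-VM | Get-VMSnapshot |
    Select-Object VMName, Name, CreationTime,
        @{ Name = "AgeDays"; Expression = { ((Get-Date) - $_.CreationTime).Days } } |
    Sort-Object AgeDays -Descending |
    Export-Csv -Path "C:\Reports\VMSnapshotReport.csv" -NoTypeInformation
```

Reviewing this report first helps choose a sensible `-RetentionDays` value instead of guessing.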
241. Configure Hyper-V
Settings
[long]$MemoryMaximumBytes,
[bool]$DynamicMemory = $true,
[string]$Notes
)
try {
$vmParams = @{
Name = $VMName
}
        if ($ProcessorCount) { $vmParams.ProcessorCount = $ProcessorCount }
        if ($MemoryStartupBytes) { $vmParams.MemoryStartupBytes = $MemoryStartupBytes }
        if ($DynamicMemory) {
            $vmParams.DynamicMemory = $true
            if ($MemoryMinimumBytes) { $vmParams.MemoryMinimumBytes = $MemoryMinimumBytes }
            if ($MemoryMaximumBytes) { $vmParams.MemoryMaximumBytes = $MemoryMaximumBytes }
        } else {
            $vmParams.StaticMemory = $true
        }
        if ($Notes) { $vmParams.Notes = $Notes }
Set-VM @vmParams
Write-Host "VM '$VMName' configured
successfully." -ForegroundColor Green
}
catch {
Write-Error "Failed to configure VM
'$VMName': $_"
}
}
# Function to configure VM network adapters
function Set-VMNetworkAdapter {
param (
[Parameter(Mandatory=$true)]
[string]$VMName,
[Parameter(Mandatory=$true)]
[string]$SwitchName,
[string]$MACAddress,
[bool]$VMQEnabled = $true,
        [bool]$IPSecOffloadMaximumSecurityAssociation = $true
)
    try {
        # Set-VMNetworkAdapter cannot move an adapter to another switch, so connect it first
        Connect-VMNetworkAdapter -VMName $VMName -SwitchName $SwitchName
        $params = @{
            VMName = $VMName
            # VMQ is controlled via VmqWeight: 0 disables, 100 is the default weight
            VmqWeight = $(if ($VMQEnabled) { 100 } else { 0 })
            # IPsec offload takes a security-association count: 0 disables, 512 is the default
            IPsecOffloadMaximumSecurityAssociation = $(if ($IPSecOffloadMaximumSecurityAssociation) { 512 } else { 0 })
        }
        if ($MACAddress) {
            $params.StaticMacAddress = $MACAddress
        }
        # Module-qualify the cmdlet so this wrapper does not call itself
        Hyper-V\Set-VMNetworkAdapter @params
Write-Host "Network adapter for VM
'$VMName' configured successfully." -
ForegroundColor Green
}
catch {
Write-Error "Failed to configure
network adapter for VM '$VMName': $_"
}
}
# Function to enable nested virtualization
function Enable-NestedVirtualization {
param (
[Parameter(Mandatory=$true)]
[string]$VMName
)
try {
Set-VMProcessor -VMName $VMName -
ExposeVirtualizationExtensions $true
Write-Host "Nested virtualization
enabled for VM '$VMName'." -ForegroundColor
Green
}
catch {
Write-Error "Failed to enable nested
virtualization for VM '$VMName': $_"
}
}
# Function to configure VM integration
services
function Set-VMIntegrationServices {
param (
[Parameter(Mandatory=$true)]
[string]$VMName,
[bool]$VSS = $true,
[bool]$Shutdown = $true,
[bool]$TimeSync = $true,
[bool]$Heartbeat = $true,
[bool]$KeyValuePair = $true
)
try {
$services = @{
"Guest Service Interface" = $true
"Heartbeat" = $Heartbeat
"Key-Value Pair Exchange" =
$KeyValuePair
"Shutdown" = $Shutdown
"Time Synchronization" =
$TimeSync
"VSS" = $VSS
}
        foreach ($service in $services.GetEnumerator()) {
            if ($service.Value) {
                Enable-VMIntegrationService -VMName $VMName -Name $service.Key
            } else {
                Disable-VMIntegrationService -VMName $VMName -Name $service.Key
            }
        }
Write-Host "Integration services
configured for VM '$VMName'." -
ForegroundColor Green
}
catch {
Write-Error "Failed to configure
integration services for VM '$VMName': $_"
}
}
# Example usage
# Configure Hyper-V host settings
Set-HyperVHostConfig -VirtualHardDiskPath "D:\Hyper-V\Virtual Hard Disks" `
    -VirtualMachinePath "D:\Hyper-V\Virtual Machines" `
    -NumaSpanningEnabled 1 `
    -MaximumStorageMigrations 2 `
    -MaximumVirtualMachineMigrations 2
# Create a new virtual switch
New-HyperVSwitch -SwitchName "ExternalSwitch"
-SwitchType "External" -NetAdapterName
"Ethernet"
# Configure VM settings
Set-VMConfig -VMName "WebServer01" `
-ProcessorCount 4 `
-MemoryStartupBytes 4GB `
-MemoryMinimumBytes 2GB `
-MemoryMaximumBytes 8GB `
-DynamicMemory $true `
-Notes "Web Server VM"
# Configure VM network adapter
Set-VMNetworkAdapter -VMName "WebServer01" -
SwitchName "ExternalSwitch" -VMQEnabled $true
# Enable nested virtualization
Enable-NestedVirtualization -VMName
"NestedHyperV"
# Configure VM integration services
Set-VMIntegrationServices -VMName
"WebServer01" -VSS $true -Shutdown $true -
TimeSync $true
This script provides a
comprehensive set of PowerShell
functions to configure various
Hyper-V settings:
1. Set-HyperVHostConfig:
Configures Hyper-
V host settings such as virtual
hard disk and machine paths, NUMA
spanning, and migration settings.
2. New-HyperVSwitch: Creates a new
virtual switch (external,
internal, or private).
3. Set-VMConfig: Configures VM settings
including processor count, memory
allocation, and dynamic memory.
4. Set-VMNetworkAdapter: Configures VM
network adapter settings,
including switch assignment and
advanced features.
5. Enable-NestedVirtualization: Enables
nested virtualization for a VM.
6. Set-VMIntegrationServices: Configures
integration services for a VM.
These functions provide granular
control over Hyper-V and VM
settings, allowing for easy
automation of Hyper-V configuration
tasks. The script includes example
usage for each function.
To use this script:
1. Ensure you have the necessary
permissions to manage Hyper-V.
2. Run the script with
administrative privileges.
3. Use the provided functions to
configure Hyper-V and VM settings
as needed.
Remember to adjust the parameters
according to your specific
requirements and environment.
Always test these configurations in
a non-production environment before
applying them to production
systems.
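After applying these settings, it helps to read them back and confirm they took effect. A brief verification sketch using `Get-VM` against the example VM above:

```powershell
# Sketch: read back the key VM settings configured in the examples
Get-VM -Name "WebServer01" |
    Select-Object Name, ProcessorCount, DynamicMemoryEnabled,
        MemoryStartup, MemoryMinimum, MemoryMaximum, Notes |
    Format-List
```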
242. Manage Docker Containers
# Ensure Docker is installed and running
if (-not (Get-Command docker -ErrorAction
SilentlyContinue)) {
Write-Error "Docker is not installed or
not in PATH. Please install Docker and try
again."
exit
}
# Function to list Docker containers
function Get-DockerContainers {
param (
[switch]$All
)
if ($All) {
docker ps -a
} else {
docker ps
}
}
# Function to start a Docker container
function Start-DockerContainer {
param (
[Parameter(Mandatory=$true)]
[string]$ContainerName
)
docker start $ContainerName
Write-Host "Container '$ContainerName'
started." -ForegroundColor Green
}
# Function to stop a Docker container
function Stop-DockerContainer {
param (
[Parameter(Mandatory=$true)]
[string]$ContainerName
)
docker stop $ContainerName
Write-Host "Container '$ContainerName'
stopped." -ForegroundColor Green
}
# Function to remove a Docker container
function Remove-DockerContainer {
param (
[Parameter(Mandatory=$true)]
[string]$ContainerName,
[switch]$Force
)
if ($Force) {
docker rm -f $ContainerName
} else {
docker rm $ContainerName
}
Write-Host "Container '$ContainerName'
removed." -ForegroundColor Green
}
# Function to create and run a new Docker
container
function New-DockerContainer {
param (
[Parameter(Mandatory=$true)]
[string]$ContainerName,
[Parameter(Mandatory=$true)]
[string]$ImageName,
[string]$PortMapping,
[string]$VolumeMapping,
[hashtable]$EnvironmentVariables,
[switch]$Detach
)
    # Avoid the automatic $args variable; build the argument list explicitly
    $dockerArgs = @("run", "--name", $ContainerName)
    if ($Detach) {
        $dockerArgs += "-d"
    }
    if ($PortMapping) {
        $dockerArgs += "-p", $PortMapping
    }
    if ($VolumeMapping) {
        $dockerArgs += "-v", $VolumeMapping
    }
    if ($EnvironmentVariables) {
        foreach ($env in $EnvironmentVariables.GetEnumerator()) {
            $dockerArgs += "-e", "$($env.Key)=$($env.Value)"
        }
    }
    $dockerArgs += $ImageName
    docker @dockerArgs
Write-Host "Container '$ContainerName'
created and started." -ForegroundColor Green
}
# Function to view Docker container logs
function Get-DockerContainerLogs {
param (
[Parameter(Mandatory=$true)]
[string]$ContainerName,
[int]$Tail = 100
)
docker logs --tail $Tail $ContainerName
}
# Function to execute a command in a running
Docker container
function Invoke-DockerCommand {
param (
[Parameter(Mandatory=$true)]
[string]$ContainerName,
[Parameter(Mandatory=$true)]
[string]$Command
)
    # Run through a shell so multi-word commands are parsed inside the container
    docker exec $ContainerName sh -c $Command
}
# Function to get Docker container details
function Get-DockerContainerDetails {
param (
[Parameter(Mandatory=$true)]
[string]$ContainerName
)
docker inspect $ContainerName
}
# Function to monitor Docker container
resource usage
function Get-DockerContainerStats {
param (
[string]$ContainerName
)
if ($ContainerName) {
        docker stats $ContainerName --no-stream
} else {
docker stats --no-stream
}
}
# Example usage
# List all containers
Get-DockerContainers -All
# Create and run a new container
$envVars = @{
"MYSQL_ROOT_PASSWORD" = "secretpassword"
"MYSQL_DATABASE" = "myapp"
}
New-DockerContainer -ContainerName "mydb" -
ImageName "mysql:latest" -PortMapping
"3306:3306" -EnvironmentVariables $envVars -
Detach
# View container logs
Get-DockerContainerLogs -ContainerName "mydb"
# Execute a command in the container
Invoke-DockerCommand -ContainerName "mydb" -
Command "mysql -uroot -p'secretpassword' -e
'SHOW DATABASES;'"
# Get container details
Get-DockerContainerDetails -ContainerName
"mydb"
# Monitor container resource usage
Get-DockerContainerStats -ContainerName
"mydb"
# Stop the container
Stop-DockerContainer -ContainerName "mydb"
# Remove the container
Remove-DockerContainer -ContainerName "mydb"
-Force
This script provides a
comprehensive set of PowerShell
functions to manage Docker
containers:
1. Get-DockerContainers: Lists Docker
containers (running or all).
2. Start-DockerContainer: Starts a stopped
container.
3. Stop-DockerContainer: Stops a running
container.
4. Remove-DockerContainer: Removes a
container.
5. New-DockerContainer: Creates and runs a
new container with various
options.
6. Get-DockerContainerLogs: Views container
logs.
7. Invoke-DockerCommand: Executes a
command in a running container.
8. Get-DockerContainerDetails: Retrieves
detailed information about a
container.
9. Get-DockerContainerStats: Monitors
container resource usage.
These functions wrap Docker CLI
commands, providing a more
PowerShell-friendly interface for
managing Docker containers. The
script includes example usage for
each function.
To use this script:
1. Ensure Docker is installed and
running on your system.
2. Run the script with appropriate
permissions.
3. Use the provided functions to
manage Docker containers as
needed.
Remember to adjust the parameters
(like container names, image names,
port mappings, etc.) according to
your specific requirements and
Docker setup.
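A container like the MySQL example above needs time to initialize, so commands issued immediately after `New-DockerContainer` can fail. A hedged sketch that polls the container state via `docker inspect` until it is running — the function name and timeout are illustrative:

```powershell
# Sketch: wait until a container reports the "running" state
function Wait-DockerContainerRunning {
    param (
        [Parameter(Mandatory=$true)]
        [string]$ContainerName,
        [int]$TimeoutSeconds = 60
    )
    $deadline = (Get-Date).AddSeconds($TimeoutSeconds)
    while ((Get-Date) -lt $deadline) {
        # .State.Status is "running" once the container is up
        $state = docker inspect --format '{{.State.Status}}' $ContainerName 2>$null
        if ($state -eq "running") { return $true }
        Start-Sleep -Seconds 2
    }
    Write-Warning "Container '$ContainerName' not running after $TimeoutSeconds seconds."
    return $false
}
```

Note that "running" does not guarantee the service inside is ready; for images that define a HEALTHCHECK, polling `{{.State.Health.Status}}` for "healthy" is more reliable.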
243. Automate Certificate
Renewals
# Ensure the PSPKI module is installed
if (-not (Get-Module -ListAvailable -Name
PSPKI)) {
Install-Module -Name PSPKI -Force -Scope
CurrentUser
}
Import-Module PSPKI
# Function to renew a certificate
function Renew-Certificate {
param (
[Parameter(Mandatory=$true)]
[string]$CertificateTemplateName,
[Parameter(Mandatory=$true)]
[string]$SubjectName,
        [string]$CertificateAuthorityName = "YourCAServer\YourCA",
[string]$ExportPath,
[SecureString]$Password
)
try {
# Request a new certificate
$cert = Get-Certificate -Template
$CertificateTemplateName -SubjectName
$SubjectName -CertStoreLocation
Cert:\LocalMachine\My -Url
"ldap:///CN=$CertificateAuthorityName" -
Verbose
if ($cert) {
Write-Host "Certificate renewed
successfully." -ForegroundColor Green
# Export the certificate if
ExportPath is provided
if ($ExportPath) {
if (-not $Password) {
$Password = Read-Host
"Enter password for certificate export" -
AsSecureString
}
                # Get-Certificate returns a request object; export its Certificate property
                $cert.Certificate | Export-PfxCertificate -FilePath $ExportPath -Password $Password
Write-Host "Certificate
exported to $ExportPath" -ForegroundColor
Green
}
return $cert
} else {
Write-Error "Failed to renew
certificate."
}
}
catch {
Write-Error "An error occurred while
renewing the certificate: $_"
}
}
# Function to check certificate expiration
function Get-CertificateExpirationStatus {
param (
[Parameter(Mandatory=$true)]
[string]$CertificateStorePath,
[int]$WarningThresholdDays = 30
)
$certs = Get-ChildItem -Path
$CertificateStorePath
foreach ($cert in $certs) {
        $daysUntilExpiration = ($cert.NotAfter - (Get-Date)).Days
if ($daysUntilExpiration -le 0) {
            Write-Host "Certificate '$($cert.Subject)' has expired." -ForegroundColor Red
}
elseif ($daysUntilExpiration -le
$WarningThresholdDays) {
            Write-Host "Certificate '$($cert.Subject)' will expire in $daysUntilExpiration days." -ForegroundColor Yellow
}
else {
            Write-Host "Certificate '$($cert.Subject)' is valid for $daysUntilExpiration days." -ForegroundColor Green
}
}
}
# Function to automate certificate renewal
process
function Start-CertificateRenewalProcess {
param (
[Parameter(Mandatory=$true)]
[string]$CertificateStorePath,
[int]$RenewalThresholdDays = 30,
[string]$CertificateTemplateName,
[string]$CertificateAuthorityName,
[string]$ExportPath,
[SecureString]$Password
)
$certs = Get-ChildItem -Path
$CertificateStorePath
foreach ($cert in $certs) {
        $daysUntilExpiration = ($cert.NotAfter - (Get-Date)).Days
if ($daysUntilExpiration -le
$RenewalThresholdDays) {
            Write-Host "Renewing certificate: $($cert.Subject)" -ForegroundColor Yellow
            $subjectName = $cert.Subject -replace "CN=", ""
$renewedCert = Renew-Certificate
-CertificateTemplateName
$CertificateTemplateName `
-SubjectName $subjectName `
-CertificateAuthorityName
$CertificateAuthorityName `
-ExportPath $ExportPath `
-Password $Password
if ($renewedCert) {
# Remove the old certificate
Remove-Item -Path
$[Link] -Force
Write-Host "Old certificate
removed." -ForegroundColor Green
}
}
}
}
# Function to schedule a certificate renewal check
function Schedule-CertificateRenewalCheck {
    param (
        [string]$TaskName = "CertificateRenewalCheck",
        [string]$ScriptPath,
        [string]$ScheduleTime = "03:00"
    )
    $action = New-ScheduledTaskAction -Execute "PowerShell.exe" `
        -Argument "-NoProfile -ExecutionPolicy Bypass -File `"$ScriptPath`""
    $trigger = New-ScheduledTaskTrigger -Daily -At $ScheduleTime
    Register-ScheduledTask -TaskName $TaskName -Action $action -Trigger $trigger -RunLevel Highest -User "SYSTEM"
    Write-Host "Certificate renewal check scheduled to run daily at $ScheduleTime" -ForegroundColor Green
}
# Example usage
# Check certificate expiration status
Get-CertificateExpirationStatus -CertificateStorePath "Cert:\LocalMachine\My" -WarningThresholdDays 45
# Renew a specific certificate
$password = ConvertTo-SecureString "CertPassword123!" -AsPlainText -Force
Renew-Certificate -CertificateTemplateName "WebServerCert" `
    -SubjectName "CN=webserver.example.com" `
    -CertificateAuthorityName "ca.example.com\ExampleCA" `
    -ExportPath "C:\Certificates\renewed_cert.pfx" `
    -Password $password
# Start automated renewal process
Start-CertificateRenewalProcess -CertificateStorePath "Cert:\LocalMachine\My" `
    -RenewalThresholdDays 45 `
    -CertificateTemplateName "WebServerCert" `
    -CertificateAuthorityName "ca.example.com\ExampleCA" `
    -ExportPath "C:\Certificates" `
    -Password $password
# Schedule daily certificate renewal check
Schedule-CertificateRenewalCheck -ScriptPath "C:\Scripts\CertificateRenewalCheck.ps1" -ScheduleTime "02:00"
This script provides a set of PowerShell functions to automate certificate renewal processes:
1. Renew-Certificate: Renews a specific certificate using a certificate template and optionally exports it.
2. Get-CertificateExpirationStatus: Checks the expiration status of certificates in a specified store.
3. Start-CertificateRenewalProcess: Automates the renewal process for multiple certificates in a store.
4. Schedule-CertificateRenewalCheck: Sets up a scheduled task to run the certificate renewal check regularly.
These functions use the PSPKI module to interact with certificates and certificate authorities. The script includes example usage for each function.
To use this script:
1. Ensure you have the necessary permissions to manage certificates and interact with the certificate authority.
2. Install the PSPKI module if it's not already available.
3. Adjust the parameters (like certificate template names, CA names, export paths, etc.) according to your specific PKI setup.
4. Run the script with appropriate permissions.
Remember to test these functions in a non-production environment before applying them to production systems. Also, ensure that your certificate renewal processes comply with your organization's security policies and best practices.
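The Schedule-CertificateRenewalCheck function points the scheduled task at a script file that this section never shows. A minimal sketch of what C:\Scripts\CertificateRenewalCheck.ps1 might contain (the dot-sourced path, log location, template name, and CA name are all assumptions to adapt to your environment):

```powershell
# CertificateRenewalCheck.ps1 - hypothetical contents for the scheduled task.
# Assumes the renewal functions above have been saved to CertificateRenewal.ps1.

# Dot-source the function definitions so they are available in this session
. "C:\Scripts\CertificateRenewal.ps1"

# Log each run so unattended executions can be audited
Start-Transcript -Path "C:\Logs\CertRenewal_$(Get-Date -Format yyyyMMdd).log"

try {
    # Renew anything in the machine store that expires within 45 days
    Start-CertificateRenewalProcess -CertificateStorePath "Cert:\LocalMachine\My" `
        -RenewalThresholdDays 45 `
        -CertificateTemplateName "WebServerCert" `
        -CertificateAuthorityName "ca.example.com\ExampleCA"
}
finally {
    Stop-Transcript
}
```

Because the task runs as SYSTEM with no console, the transcript is often the only record of what happened, which is why it wraps the whole run.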
244. Encrypt Sensitive Data for Transmission
# Function to generate a new AES key
function New-AESKey {
    $AES = New-Object System.Security.Cryptography.AesManaged
    $AES.KeySize = 256
    $AES.GenerateKey()
    return [Convert]::ToBase64String($AES.Key)
}
# Function to encrypt data using AES
function Protect-AESData {
    param (
        [Parameter(Mandatory=$true)]
        [string]$Data,
        [Parameter(Mandatory=$true)]
        [string]$Key
    )
    try {
        $AES = New-Object System.Security.Cryptography.AesManaged
        $AES.Key = [Convert]::FromBase64String($Key)
        $AES.GenerateIV()
        $encryptor = $AES.CreateEncryptor()
        $dataBytes = [System.Text.Encoding]::UTF8.GetBytes($Data)
        $encryptedData = $encryptor.TransformFinalBlock($dataBytes, 0, $dataBytes.Length)
        $result = @{
            EncryptedData = [Convert]::ToBase64String($encryptedData)
            IV = [Convert]::ToBase64String($AES.IV)
        }
        return $result
    }
    catch {
        Write-Error "Encryption failed: $_"
    }
    finally {
        if ($AES) { $AES.Dispose() }
    }
}
# Function to decrypt AES encrypted data
function Unprotect-AESData {
    param (
        [Parameter(Mandatory=$true)]
        [string]$EncryptedData,
        [Parameter(Mandatory=$true)]
        [string]$Key,
        [Parameter(Mandatory=$true)]
        [string]$IV
    )
    try {
        $AES = New-Object System.Security.Cryptography.AesManaged
        $AES.Key = [Convert]::FromBase64String($Key)
        $AES.IV = [Convert]::FromBase64String($IV)
        $decryptor = $AES.CreateDecryptor()
        $encryptedBytes = [Convert]::FromBase64String($EncryptedData)
        $decryptedBytes = $decryptor.TransformFinalBlock($encryptedBytes, 0, $encryptedBytes.Length)
        return [System.Text.Encoding]::UTF8.GetString($decryptedBytes)
    }
    catch {
        Write-Error "Decryption failed: $_"
    }
    finally {
        if ($AES) { $AES.Dispose() }
    }
}
# Function to encrypt data using a public key
function Protect-RSAData {
    param (
        [Parameter(Mandatory=$true)]
        [string]$Data,
        [Parameter(Mandatory=$true)]
        [string]$PublicKeyPath
    )
    try {
        $publicKey = Get-Content $PublicKeyPath
        $rsaProvider = New-Object System.Security.Cryptography.RSACryptoServiceProvider
        $rsaProvider.FromXmlString($publicKey)
        $dataBytes = [System.Text.Encoding]::UTF8.GetBytes($Data)
        $encryptedBytes = $rsaProvider.Encrypt($dataBytes, $false)
        return [Convert]::ToBase64String($encryptedBytes)
    }
    catch {
        Write-Error "RSA encryption failed: $_"
    }
    finally {
        if ($rsaProvider) { $rsaProvider.Dispose() }
    }
}
# Function to decrypt data using a private key
function Unprotect-RSAData {
    param (
        [Parameter(Mandatory=$true)]
        [string]$EncryptedData,
        [Parameter(Mandatory=$true)]
        [string]$PrivateKeyPath
    )
    try {
        $privateKey = Get-Content $PrivateKeyPath
        $rsaProvider = New-Object System.Security.Cryptography.RSACryptoServiceProvider
        $rsaProvider.FromXmlString($privateKey)
        $encryptedBytes = [Convert]::FromBase64String($EncryptedData)
        $decryptedBytes = $rsaProvider.Decrypt($encryptedBytes, $false)
        return [System.Text.Encoding]::UTF8.GetString($decryptedBytes)
    }
    catch {
        Write-Error "RSA decryption failed: $_"
    }
    finally {
        if ($rsaProvider) { $rsaProvider.Dispose() }
    }
}
# Function to securely transmit data
function Send-EncryptedData {
    param (
        [Parameter(Mandatory=$true)]
        [string]$Data,
        [Parameter(Mandatory=$true)]
        [string]$RecipientPublicKeyPath,
        [Parameter(Mandatory=$true)]
        [string]$Destination
    )
    # Generate a new AES key for this transmission
    $aesKey = New-AESKey
    # Encrypt the data with AES
    $encryptedData = Protect-AESData -Data $Data -Key $aesKey
    # Encrypt the AES key with the recipient's public key
    $encryptedKey = Protect-RSAData -Data $aesKey -PublicKeyPath $RecipientPublicKeyPath
    # Prepare the package
    $package = @{
        EncryptedData = $encryptedData.EncryptedData
        IV = $encryptedData.IV
        EncryptedKey = $encryptedKey
    }
    # Convert the package to JSON
    $jsonPackage = $package | ConvertTo-Json
    # Send the package (simulated here)
    Write-Host "Sending encrypted data to $Destination"
    # In a real scenario, you would use an appropriate method to send data (e.g., Invoke-RestMethod, Send-MailMessage, etc.)
    # For demonstration, we'll just write it to a file
    $jsonPackage | Out-File -FilePath $Destination
}
# Function to receive and decrypt data
function Receive-EncryptedData {
    param (
        [Parameter(Mandatory=$true)]
        [string]$Source,
        [Parameter(Mandatory=$true)]
        [string]$PrivateKeyPath
    )
    # Read the encrypted package (simulated here)
    $jsonPackage = Get-Content -Path $Source | ConvertFrom-Json
    # Decrypt the AES key
    $aesKey = Unprotect-RSAData -EncryptedData $jsonPackage.EncryptedKey -PrivateKeyPath $PrivateKeyPath
    # Decrypt the data
    $decryptedData = Unprotect-AESData -EncryptedData $jsonPackage.EncryptedData -Key $aesKey -IV $jsonPackage.IV
    return $decryptedData
}
# Example usage
# Generate an RSA key pair (in a real scenario, these would be securely stored)
$rsa = New-Object System.Security.Cryptography.RSACryptoServiceProvider(2048)
$publicKey = $rsa.ToXmlString($false)
$privateKey = $rsa.ToXmlString($true)
$publicKeyPath = "C:\temp\public_key.xml"
$privateKeyPath = "C:\temp\private_key.xml"
$publicKey | Out-File -FilePath $publicKeyPath
$privateKey | Out-File -FilePath $privateKeyPath
# Sensitive data to be transmitted
$sensitiveData = "This is sensitive information that needs to be securely transmitted."
# Encrypt and send the data
Send-EncryptedData -Data $sensitiveData -RecipientPublicKeyPath $publicKeyPath -Destination "C:\temp\encrypted_package.json"
# Receive and decrypt the data
$receivedData = Receive-EncryptedData -Source "C:\temp\encrypted_package.json" -PrivateKeyPath $privateKeyPath
Write-Host "Decrypted data: $receivedData"
# Clean up (remove temporary files)
Remove-Item -Path $publicKeyPath, $privateKeyPath, "C:\temp\encrypted_package.json"
This script provides a set of PowerShell functions to encrypt sensitive data for secure transmission:
1. New-AESKey: Generates a new AES key.
2. Protect-AESData: Encrypts data using AES.
3. Unprotect-AESData: Decrypts AES-encrypted data.
4. Protect-RSAData: Encrypts data using RSA (public key).
5. Unprotect-RSAData: Decrypts RSA-encrypted data (private key).
6. Send-EncryptedData: Simulates sending encrypted data (AES + RSA).
7. Receive-EncryptedData: Simulates receiving and decrypting data.
The script uses a hybrid encryption approach:
AES for encrypting the actual data (faster for large amounts of data).
RSA for encrypting the AES key (allows secure key exchange).
To use this script:
1. Ensure you have the necessary permissions to perform cryptographic operations.
2. In a real-world scenario, replace the file operations in Send-EncryptedData and Receive-EncryptedData with actual data transmission methods (e.g., API calls, email, etc.).
3. Securely manage and store the RSA keys. The example generates temporary keys, but in practice, these should be securely created and stored.
4. Adjust the file paths and other parameters as needed for your environment.
Remember, this is a simplified example for demonstration purposes. In a production environment, you would need to consider additional security measures, error handling, and compliance with relevant security standards and regulations.
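One caveat about the classes used above: AesManaged is marked obsolete in newer versions of .NET. If that warning matters in your environment, an equivalent key generator can be written against the Aes factory method, which works in both Windows PowerShell and PowerShell 7+. A sketch (the function name is an illustration, not part of the original script):

```powershell
# Alternative key generation using the Aes factory method instead of AesManaged.
function New-AESKeyModern {
    $AES = [System.Security.Cryptography.Aes]::Create()
    try {
        $AES.KeySize = 256
        $AES.GenerateKey()
        # Base64-encode the raw key bytes, same as New-AESKey above
        return [Convert]::ToBase64String($AES.Key)
    }
    finally {
        # Release the key material deterministically
        $AES.Dispose()
    }
}
```

The same one-line substitution ([System.Security.Cryptography.Aes]::Create() for New-Object ... AesManaged) applies to Protect-AESData and Unprotect-AESData as well.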
245. Automate Key Rotation Policies
# Ensure the required modules are available
Install-Module -Name AzureRM -Force -AllowClobber
Import-Module AzureRM
# Function to rotate Azure Key Vault secrets
function Rotate-AzureKeyVaultSecret {
    param (
        [Parameter(Mandatory=$true)]
        [string]$KeyVaultName,
        [Parameter(Mandatory=$true)]
        [string]$SecretName,
        [Parameter(Mandatory=$true)]
        [scriptblock]$NewValueGenerator,
        [int]$RetentionDays = 30
    )
    try {
        # Connect to Azure (make sure you're logged in)
        $azureContext = Get-AzureRmContext
        if (-not $azureContext.Account) {
            Connect-AzureRmAccount
        }
        # Get the current secret
        $currentSecret = Get-AzureKeyVaultSecret -VaultName $KeyVaultName -Name $SecretName
        if ($currentSecret) {
            # Generate new secret value
            $newSecretValue = & $NewValueGenerator
            # Create a new version of the secret
            $secretParams = @{
                VaultName = $KeyVaultName
                Name = $SecretName
                SecretValue = ConvertTo-SecureString -String $newSecretValue -AsPlainText -Force
            }
            $newSecret = Set-AzureKeyVaultSecret @secretParams
            Write-Host "Rotated secret '$SecretName' in Key Vault '$KeyVaultName'" -ForegroundColor Green
            # Set expiration on the old version
            $expirationDate = (Get-Date).AddDays($RetentionDays)
            Set-AzureKeyVaultSecretAttribute -VaultName $KeyVaultName -Name $SecretName -Version $currentSecret.Version `
                -Expires $expirationDate
            Write-Host "Set expiration date on old version of secret '$SecretName' to $expirationDate" -ForegroundColor Yellow
            return $newSecret
        }
        else {
            Write-Warning "Secret '$SecretName' not found in Key Vault '$KeyVaultName'"
        }
    }
    catch {
        Write-Error "Failed to rotate secret: $_"
    }
}
# Function to rotate database connection strings
function Rotate-DatabaseConnectionString {
    param (
        [Parameter(Mandatory=$true)]
        [string]$KeyVaultName,
        [Parameter(Mandatory=$true)]
        [string]$SecretName,
        [Parameter(Mandatory=$true)]
        [string]$ServerName,
        [Parameter(Mandatory=$true)]
        [string]$DatabaseName
    )
    $newPasswordGenerator = {
        # Generate a complex password
        $length = 16
        $nonAlphaChars = 5
        Add-Type -AssemblyName System.Web
        [System.Web.Security.Membership]::GeneratePassword($length, $nonAlphaChars)
    }
    $newSecret = Rotate-AzureKeyVaultSecret -KeyVaultName $KeyVaultName -SecretName $SecretName -NewValueGenerator $newPasswordGenerator
    if ($newSecret) {
        $newPassword = $newSecret.SecretValue | ConvertFrom-SecureString -AsPlainText
        # Update the database user password
        # Note: This is a placeholder. You'd need to implement the actual database password update logic here.
        # This might involve connecting to the database and running an ALTER USER command.
        Write-Host "Updating database user password... (simulated)" -ForegroundColor Yellow
        # Construct and return the new connection string
        $newConnectionString = "Server=$ServerName;Database=$DatabaseName;User Id=yourUsername;Password=$newPassword;"
        return $newConnectionString
    }
}
# Function to rotate API keys
function Rotate-ApiKey {
    param (
        [Parameter(Mandatory=$true)]
        [string]$KeyVaultName,
        [Parameter(Mandatory=$true)]
        [string]$SecretName,
        [Parameter(Mandatory=$true)]
        [string]$ApiEndpoint
    )
    $newApiKeyGenerator = {
        # Generate a new API key (simulated here)
        return [Guid]::NewGuid().ToString("N")
    }
    $newSecret = Rotate-AzureKeyVaultSecret -KeyVaultName $KeyVaultName -SecretName $SecretName -NewValueGenerator $newApiKeyGenerator
    if ($newSecret) {
        $newApiKey = $newSecret.SecretValue | ConvertFrom-SecureString -AsPlainText
        # Update the API key at the API endpoint
        # Note: This is a placeholder. You'd need to implement the actual API key update logic here.
        # This might involve making an API call to update the key.
        Write-Host "Updating API key at $ApiEndpoint... (simulated)" -ForegroundColor Yellow
        return $newApiKey
    }
}
# Function to schedule key rotation
function Schedule-KeyRotation {
    param (
        [Parameter(Mandatory=$true)]
        [string]$TaskName,
        [Parameter(Mandatory=$true)]
        [string]$ScriptPath,
        [string]$Frequency = "Weekly",
        [int]$DayOfWeek = 1, # Monday
        [string]$Time = "03:00"
    )
    $action = New-ScheduledTaskAction -Execute "PowerShell.exe" `
        -Argument "-NoProfile -ExecutionPolicy Bypass -File `"$ScriptPath`""
    $trigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek $DayOfWeek -At $Time
    Register-ScheduledTask -TaskName $TaskName -Action $action -Trigger $trigger -RunLevel Highest -User "SYSTEM"
    Write-Host "Scheduled key rotation task '$TaskName' to run $Frequency at $Time" -ForegroundColor Green
}
# Example usage
# Rotate a database connection string
$newConnectionString = Rotate-DatabaseConnectionString -KeyVaultName "MyKeyVault" `
    -SecretName "MyDatabaseConnectionString" `
    -ServerName "myserver.database.windows.net" `
    -DatabaseName "mydb"
Write-Host "New connection string: $newConnectionString"
# Rotate an API key
$newApiKey = Rotate-ApiKey -KeyVaultName "MyKeyVault" `
    -SecretName "MyApiKey" `
    -ApiEndpoint "https://api.example.com/keys"
Write-Host "New API key: $newApiKey"
# Schedule weekly key rotation
Schedule-KeyRotation -TaskName "WeeklyKeyRotation" `
    -ScriptPath "C:\Scripts\RotateKeys.ps1" `
    -Frequency "Weekly" `
    -DayOfWeek 1 `
    -Time "03:00"
This script provides a set of PowerShell functions to automate key rotation policies, particularly focusing on Azure Key Vault secrets. Here's a breakdown of the functions:
1. Rotate-AzureKeyVaultSecret: This is the core function that rotates a secret in Azure Key Vault. It generates a new secret value, creates a new version of the secret, and sets an expiration date on the old version.
2. Rotate-DatabaseConnectionString: This function demonstrates how to rotate a database connection string. It generates a new password, updates the secret in Key Vault, and simulates updating the database user password.
3. Rotate-ApiKey: This function shows how to rotate an API key. It generates a new API key, updates the secret in Key Vault, and simulates updating the API key at an API endpoint.
4. Schedule-KeyRotation: This function sets up a scheduled task to automate the key rotation process on a regular basis.
To use this script:
1. Ensure you have the Azure PowerShell module installed and that you're logged into your Azure account.
2. Customize the NewValueGenerator scriptblocks in each rotation function to generate appropriate new values for your secrets.
3. Implement the actual logic for updating database passwords and API keys in their respective functions. The current script only simulates these actions.
4. Adjust the Key Vault names, secret names, and other parameters to match your Azure environment.
5. Use the Schedule-KeyRotation function to set up automated, regular key rotations.
Remember, this script provides a framework for key rotation. In a production environment, you would need to:
Implement proper error handling and logging.
Ensure secure handling of secrets and credentials.
Test thoroughly in a non-production environment before applying to production systems.
Consider the impact of key rotation on your applications and implement appropriate update mechanisms.
Comply with your organization's security policies and any relevant regulatory requirements.
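As with the certificate example, the scheduled task expects a script file at the path you pass in. A possible shape for C:\Scripts\RotateKeys.ps1 (the dot-sourced path, vault name, and secret names are placeholders to adapt):

```powershell
# RotateKeys.ps1 - hypothetical contents for the weekly rotation task.
# Assumes the rotation functions above are saved to KeyRotation.ps1.
. "C:\Scripts\KeyRotation.ps1"

# Rotate each secret we manage; report failures rather than stopping the run
$secretsToRotate = @("MyDatabaseConnectionString", "MyApiKey")
foreach ($secretName in $secretsToRotate) {
    try {
        Rotate-AzureKeyVaultSecret -KeyVaultName "MyKeyVault" -SecretName $secretName `
            -NewValueGenerator { [Guid]::NewGuid().ToString("N") }
    }
    catch {
        Write-Error "Rotation of '$secretName' failed: $_"
    }
}
```

Since the task runs non-interactively, the Connect-AzureRmAccount fallback inside Rotate-AzureKeyVaultSecret will not be able to prompt; a service principal or managed identity login would be needed for unattended runs.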
246. Integrate with Git Repositories
# Ensure the posh-git module is installed
if (-not (Get-Module -ListAvailable -Name posh-git)) {
    Install-Module posh-git -Scope CurrentUser -Force
}
Import-Module posh-git
# Function to clone a Git repository
function Clone-GitRepository {
    param (
        [Parameter(Mandatory=$true)]
        [string]$RepoUrl,
        [string]$DestinationPath,
        [string]$Branch = "main"
    )
    if (-not $DestinationPath) {
        $DestinationPath = Split-Path -Path $RepoUrl -Leaf
        $DestinationPath = $DestinationPath -replace '\.git$', ''
    }
    try {
        git clone -b $Branch $RepoUrl $DestinationPath
        Write-Host "Repository cloned successfully to $DestinationPath" -ForegroundColor Green
    }
    catch {
        Write-Error "Failed to clone repository: $_"
    }
}
# Function to update a Git repository
function Update-GitRepository {
    param (
        [Parameter(Mandatory=$true)]
        [string]$RepoPath,
        [string]$Remote = "origin",
        [string]$Branch = "main"
    )
    try {
        Push-Location $RepoPath
        # Fetch the latest changes
        git fetch $Remote
        # Check if there are any changes
        $status = git status -uno
        if ($status -match "Your branch is up to date") {
            Write-Host "Repository is already up to date." -ForegroundColor Green
            return
        }
        # Pull the latest changes
        git pull $Remote $Branch
        Write-Host "Repository updated successfully" -ForegroundColor Green
    }
    catch {
        Write-Error "Failed to update repository: $_"
    }
    finally {
        Pop-Location
    }
}
# Function to commit and push changes
function Push-GitChanges {
    param (
        [Parameter(Mandatory=$true)]
        [string]$RepoPath,
        [Parameter(Mandatory=$true)]
        [string]$CommitMessage,
        [string]$Remote = "origin",
        [string]$Branch = "main"
    )
    try {
        Push-Location $RepoPath
        # Check if there are any changes
        $status = git status --porcelain
        if (-not $status) {
            Write-Host "No changes to commit." -ForegroundColor Yellow
            return
        }
        # Add all changes
        git add .
        # Commit changes
        git commit -m $CommitMessage
        # Push changes
        git push $Remote $Branch
        Write-Host "Changes committed and pushed successfully" -ForegroundColor Green
    }
    catch {
        Write-Error "Failed to commit and push changes: $_"
    }
    finally {
        Pop-Location
    }
}
# Function to create a new branch
function New-GitBranch {
    param (
        [Parameter(Mandatory=$true)]
        [string]$RepoPath,
        [Parameter(Mandatory=$true)]
        [string]$BranchName,
        [switch]$Checkout
    )
    try {
        Push-Location $RepoPath
        # Create the new branch
        git branch $BranchName
        if ($Checkout) {
            # Switch to the new branch
            git checkout $BranchName
            Write-Host "Created and switched to new branch '$BranchName'" -ForegroundColor Green
        }
        else {
            Write-Host "Created new branch '$BranchName'" -ForegroundColor Green
        }
    }
    catch {
        Write-Error "Failed to create new branch: $_"
    }
    finally {
        Pop-Location
    }
}
# Function to merge branches
function Merge-GitBranch {
    param (
        [Parameter(Mandatory=$true)]
        [string]$RepoPath,
        [Parameter(Mandatory=$true)]
        [string]$SourceBranch,
        [string]$TargetBranch = "main"
    )
    try {
        Push-Location $RepoPath
        # Ensure we're on the target branch
        git checkout $TargetBranch
        # Merge the source branch
        $mergeResult = git merge $SourceBranch
        if ($mergeResult -match "Already up to date") {
            Write-Host "Branches are already in sync." -ForegroundColor Green
        }
        elseif ($mergeResult -match "Fast-forward") {
            Write-Host "Fast-forward merge completed successfully." -ForegroundColor Green
        }
        else {
            Write-Host "Merge completed. Please review any merge conflicts." -ForegroundColor Yellow
        }
    }
    catch {
        Write-Error "Failed to merge branches: $_"
    }
    finally {
        Pop-Location
    }
}
# Function to get Git repository status
function Get-GitStatus {
    param (
        [Parameter(Mandatory=$true)]
        [string]$RepoPath
    )
    try {
        Push-Location $RepoPath
        $status = git status --porcelain
        $branchName = git rev-parse --abbrev-ref HEAD
        $lastCommit = git log -1 --pretty=format:"%h - %an, %ar : %s"
        $statusInfo = @{
            Branch = $branchName
            LastCommit = $lastCommit
            ChangedFiles = @()
        }
        if ($status) {
            $statusInfo.ChangedFiles = $status -split "`n" | ForEach-Object {
                # Trim first: porcelain lines can start with a space (e.g. " M file")
                $parts = $_.Trim() -split "\s+", 2
                [PSCustomObject]@{
                    Status = $parts[0]
                    File = $parts[1]
                }
            }
        }
        return $statusInfo
    }
    catch {
        Write-Error "Failed to get repository status: $_"
    }
    finally {
        Pop-Location
    }
}
# Example usage
# Clone a repository
Clone-GitRepository -RepoUrl "https://github.com/user/repo.git" -DestinationPath "C:\Projects\MyRepo"
# Update a repository
Update-GitRepository -RepoPath "C:\Projects\MyRepo"
# Commit and push changes
Push-GitChanges -RepoPath "C:\Projects\MyRepo" -CommitMessage "Updated documentation"
# Create a new branch
New-GitBranch -RepoPath "C:\Projects\MyRepo" -BranchName "feature/new-feature" -Checkout
# Merge branches
Merge-GitBranch -RepoPath "C:\Projects\MyRepo" -SourceBranch "feature/new-feature" -TargetBranch "main"
# Get repository status
$repoStatus = Get-GitStatus -RepoPath "C:\Projects\MyRepo"
Write-Host "Current branch: $($repoStatus.Branch)"
Write-Host "Last commit: $($repoStatus.LastCommit)"
if ($repoStatus.ChangedFiles) {
    Write-Host "Changed files:"
    $repoStatus.ChangedFiles | ForEach-Object {
        Write-Host "  $($_.Status) $($_.File)"
    }
}
This script provides a set of PowerShell functions to interact with Git repositories:
1. Clone-GitRepository: Clones a Git repository to a specified location.
2. Update-GitRepository: Updates a local repository by pulling the latest changes.
3. Push-GitChanges: Commits and pushes local changes to a remote repository.
4. New-GitBranch: Creates a new branch in the repository.
5. Merge-GitBranch: Merges one branch into another.
6. Get-GitStatus: Retrieves the current status of a Git repository, including branch name, last commit, and changed files.
To use this script:
1. Ensure you have Git installed and accessible from the command line.
2. Install the posh-git module if it's not already installed (the script will attempt to install it if missing).
3. Adjust the repository paths, URLs, and branch names in the example usage section to match your specific Git repositories.
4. Run the script with appropriate permissions to interact with the file system and Git repositories.
These functions provide a PowerShell-friendly interface to common Git operations, making it easier to automate Git-related tasks in your scripts. You can integrate these functions into larger automation workflows, such as:
Automated code deployment pipelines
Scheduled repository updates and backups
Continuous integration processes
Bulk operations across multiple Git repositories
Remember to handle credentials securely when working with private repositories. You might need to set up SSH keys or use a credential manager for authentication.
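A couple of hedged examples of what that setup can look like (the helper name "manager" assumes a current Git for Windows install; older releases use "manager-core"):

```powershell
# Configure Git Credential Manager so HTTPS remotes don't prompt in scripts
# (assumes Git for Windows, which bundles the credential manager)
git config --global credential.helper manager

# For SSH-based remotes, verify authentication works before running
# automated operations; GitHub's test endpoint replies on stderr
$sshTest = ssh -T git@github.com 2>&1
if ($sshTest -match "successfully authenticated") {
    Write-Host "SSH authentication OK" -ForegroundColor Green
}
else {
    Write-Warning "SSH authentication not configured: $sshTest"
}
```

Either approach keeps secrets out of the script body, which matters once these functions run from scheduled tasks or CI agents.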
Also, be cautious when automating
operations that modify repositories
(like merging branches or pushing
changes). Always ensure you have
proper backups and consider
implementing additional safeguards
or approval processes for critical
operations.
247. Detect Security Vulnerabilities
# Ensure necessary modules are installed
$requiredModules = @("PSWindowsUpdate", "SecurityPolicyDsc", "AuditPolicyDsc")
foreach ($module in $requiredModules) {
    if (-not (Get-Module -ListAvailable -Name $module)) {
        Install-Module -Name $module -Force -Scope CurrentUser
    }
}
# Function to check for Windows updates
function Check-WindowsUpdates {
    try {
        Import-Module PSWindowsUpdate
        $updates = Get-WUList
        if ($updates) {
            Write-Host "Available Windows Updates:" -ForegroundColor Yellow
            $updates | ForEach-Object {
                Write-Host "  - $($_.Title)" -ForegroundColor Cyan
            }
        }
        else {
            Write-Host "No Windows updates available." -ForegroundColor Green
        }
    }
    catch {
        Write-Error "Failed to check for Windows updates: $_"
    }
}
# Function to check firewall status
function Check-FirewallStatus {
    $firewallProfiles = Get-NetFirewallProfile
    foreach ($profile in $firewallProfiles) {
        $status = if ($profile.Enabled) { "Enabled" } else { "Disabled" }
        $color = if ($profile.Enabled) { "Green" } else { "Red" }
        Write-Host "Firewall profile $($profile.Name) is $status" -ForegroundColor $color
    }
}
# Function to check antivirus status
function Check-AntivirusStatus {
    try {
        $avStatus = Get-MpComputerStatus
        if ($avStatus.AntivirusEnabled) {
            Write-Host "Antivirus is enabled" -ForegroundColor Green
            Write-Host "  Last scan: $($avStatus.QuickScanEndTime)" -ForegroundColor Cyan
            Write-Host "  Definitions: $($avStatus.AntivirusSignatureLastUpdated)" -ForegroundColor Cyan
        }
        else {
            Write-Host "Antivirus is disabled" -ForegroundColor Red
        }
    }
    catch {
        Write-Error "Failed to check antivirus status: $_"
    }
}
# Function to check for open ports
function Check-OpenPorts {
    $openPorts = Get-NetTCPConnection | Where-Object { $_.State -eq "Listen" }
    Write-Host "Open ports:" -ForegroundColor Yellow
    $openPorts | ForEach-Object {
        Write-Host "  - Port $($_.LocalPort) ($($_.OwningProcess))" -ForegroundColor Cyan
    }
}
# Function to check password policy
function Check-PasswordPolicy {
    $policy = Get-ADDefaultDomainPasswordPolicy
    Write-Host "Password Policy:" -ForegroundColor Yellow
    Write-Host "  Minimum Length: $($policy.MinPasswordLength)" -ForegroundColor Cyan
    Write-Host "  Complexity Enabled: $($policy.ComplexityEnabled)" -ForegroundColor Cyan
    Write-Host "  Maximum Age: $($policy.MaxPasswordAge)" -ForegroundColor Cyan
}
# Function to check for unauthorized software
function Check-UnauthorizedSoftware {
    param (
        [string[]]$AuthorizedSoftware
    )
    $installedSoftware = Get-WmiObject -Class Win32_Product | Select-Object Name, Vendor
    $unauthorizedSoftware = $installedSoftware | Where-Object { $AuthorizedSoftware -notcontains $_.Name }
    if ($unauthorizedSoftware) {
        Write-Host "Unauthorized software detected:" -ForegroundColor Red
        $unauthorizedSoftware | ForEach-Object {
            Write-Host "  - $($_.Name) ($($_.Vendor))" -ForegroundColor Cyan
        }
    }
    else {
        Write-Host "No unauthorized software detected." -ForegroundColor Green
    }
}
# Function to check user account status
function Check-UserAccountStatus {
    $inactiveThreshold = (Get-Date).AddDays(-90)
    $users = Get-ADUser -Filter * -Properties LastLogonDate, Enabled
    $inactiveUsers = $users | Where-Object { $_.LastLogonDate -lt $inactiveThreshold -and $_.Enabled -eq $true }
    $disabledUsers = $users | Where-Object { $_.Enabled -eq $false }
    if ($inactiveUsers) {
        Write-Host "Inactive user accounts (no login for 90+ days):" -ForegroundColor Yellow
        $inactiveUsers | ForEach-Object {
            Write-Host "  - $($_.Name) (Last logon: $($_.LastLogonDate))" -ForegroundColor Cyan
        }
    }
    if ($disabledUsers) {
        Write-Host "Disabled user accounts:" -ForegroundColor Yellow
        $disabledUsers | ForEach-Object {
            Write-Host "  - $($_.Name)" -ForegroundColor Cyan
        }
    }
}
# Function to check critical services status
function Check-CriticalServices {
    param (
        [string[]]$CriticalServices
    )
    foreach ($service in $CriticalServices) {
        $status = Get-Service -Name $service -ErrorAction SilentlyContinue
        if ($status) {
            $color = if ($status.Status -eq "Running") { "Green" } else { "Red" }
            Write-Host "Service $($status.DisplayName) is $($status.Status)" -ForegroundColor $color
        }
        else {
            Write-Host "Service $service not found" -ForegroundColor Yellow
        }
    }
}
# Function to run a comprehensive security check
function Invoke-SecurityVulnerabilityCheck {
    Write-Host "Starting security vulnerability check..." -ForegroundColor Cyan
    Write-Host "`nChecking Windows Updates:" -ForegroundColor Magenta
    Check-WindowsUpdates
    Write-Host "`nChecking Firewall Status:" -ForegroundColor Magenta
    Check-FirewallStatus
    Write-Host "`nChecking Antivirus Status:" -ForegroundColor Magenta
    Check-AntivirusStatus
    Write-Host "`nChecking Open Ports:" -ForegroundColor Magenta
    Check-OpenPorts
    Write-Host "`nChecking Password Policy:" -ForegroundColor Magenta
    Check-PasswordPolicy
    Write-Host "`nChecking for Unauthorized Software:" -ForegroundColor Magenta
    $authorizedSoftware = @("Microsoft Office", "Adobe Acrobat", "Google Chrome")
    Check-UnauthorizedSoftware -AuthorizedSoftware $authorizedSoftware
    Write-Host "`nChecking User Account Status:" -ForegroundColor Magenta
    Check-UserAccountStatus
    Write-Host "`nChecking Critical Services:" -ForegroundColor Magenta
    $criticalServices = @("Dhcp", "Dnscache", "NTDS", "Netlogon")
    Check-CriticalServices -CriticalServices $criticalServices
    Write-Host "`nSecurity vulnerability check completed." -ForegroundColor Cyan
}
# Run the comprehensive security check
Invoke-SecurityVulnerabilityCheck
This script provides a comprehensive set of functions to detect various security vulnerabilities and perform security checks on a Windows system. Here's a breakdown of the functions:
1. Check-WindowsUpdates: Checks for available Windows updates.
2. Check-FirewallStatus: Verifies the status of Windows Firewall profiles.
3. Check-AntivirusStatus: Checks the status of the antivirus software (Windows Defender).
4. Check-OpenPorts: Lists open ports on the system.
5. Check-PasswordPolicy: Retrieves and displays the current password policy.
6. Check-UnauthorizedSoftware: Identifies installed software that is not on an authorized list.
7. Check-UserAccountStatus: Identifies inactive and disabled user accounts.
8. Check-CriticalServices: Verifies the status of specified critical services.
9. Invoke-SecurityVulnerabilityCheck: Runs all the above checks in sequence.
To use this script:
1. Ensure you have administrative privileges on the system.
2. Run the script in an elevated PowerShell session.
3. The script will attempt to install required modules if they're not already present.
4. Customize the $authorizedSoftware and $criticalServices arrays in the Invoke-SecurityVulnerabilityCheck function to match your organization's requirements.
This script provides a starting point for automated security vulnerability detection. However, keep in mind:
It focuses on Windows-specific checks and may need to be adapted for other operating systems.
Some checks (like password policy) may require Domain Admin privileges in an Active Directory environment.
This is not an exhaustive security audit and should be used in conjunction with other security tools and practices.
Regular updates to the script may be necessary to address new security concerns and changes in Windows systems.
Consider extending the script with additional checks specific to your environment, such as:
Checking for specific registry keys related to security settings
Verifying the presence and status of security patches
Scanning for known vulnerabilities using external databases
Checking for compliance with specific security standards (e.g., NIST, CIS benchmarks)
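As a sketch of the first idea, a registry-based check can follow the same pattern as the functions above. The SMBv1 value shown is one common hardening item; this function is an illustration, not part of the original script:

```powershell
# Hypothetical extension: flag if SMBv1 is still enabled on the server service
function Check-SMB1Registry {
    $regPath = "HKLM:\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters"
    # The SMB1 value may be absent; absent or 0 means the protocol is off
    $value = Get-ItemProperty -Path $regPath -Name "SMB1" -ErrorAction SilentlyContinue
    if ($value -and $value.SMB1 -eq 1) {
        Write-Host "SMBv1 is enabled - consider disabling it" -ForegroundColor Red
    }
    else {
        Write-Host "SMBv1 is disabled or not configured" -ForegroundColor Green
    }
}
```

A call to Check-SMB1Registry could then be added to Invoke-SecurityVulnerabilityCheck alongside the other checks.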
Always test the script in a
controlled environment before
running it on production systems,
and ensure it complies with your
organization's security policies
and practices.
248. Audit Compliance Frameworks
# Ensure necessary modules are installed
$requiredModules = @("AuditPolicyDsc", "SecurityPolicyDsc", "PSWindowsUpdate")
foreach ($module in $requiredModules) {
    if (-not (Get-Module -ListAvailable -Name $module)) {
        Install-Module -Name $module -Force -Scope CurrentUser
    }
}
# Import required modules
Import-Module AuditPolicyDsc
Import-Module SecurityPolicyDsc
Import-Module PSWindowsUpdate
# Function to check password policy compliance
function Test-PasswordPolicy {
    param (
        [int]$MinPasswordLength = 14,
        [int]$PasswordHistoryCount = 24,
        [int]$MaxPasswordAge = 60
    )
    $policy = Get-ADDefaultDomainPasswordPolicy
    $compliance = @{
        MinPasswordLength = $policy.MinPasswordLength -ge $MinPasswordLength
        PasswordHistoryCount = $policy.PasswordHistoryCount -ge $PasswordHistoryCount
        MaxPasswordAge = $policy.MaxPasswordAge.Days -le $MaxPasswordAge
        ComplexityEnabled = $policy.ComplexityEnabled
    }
    return $compliance
}
# Function to check account lockout policy
function Test-AccountLockoutPolicy {
    param (
        [int]$LockoutDuration = 15,
        [int]$LockoutThreshold = 5,
        [int]$LockoutObservationWindow = 15
    )
    $policy = Get-ADDefaultDomainPasswordPolicy
    $compliance = @{
        LockoutDuration = $policy.LockoutDuration.TotalMinutes -ge $LockoutDuration
        LockoutThreshold = $policy.LockoutThreshold -le $LockoutThreshold
        LockoutObservationWindow = $policy.LockoutObservationWindow.TotalMinutes -le $LockoutObservationWindow
    }
    return $compliance
}
# Function to check audit policy
function Test-AuditPolicy {
$desiredState = @{
"Account Logon" = "Success and
Failure"
"Account Management" = "Success and
Failure"
"Detailed Tracking" = "Success and
Failure"
"DS Access" = "Success and Failure"
"Logon/Logoff" = "Success and
Failure"
"Object Access" = "Success and
Failure"
"Policy Change" = "Success and
Failure"
"Privilege Use" = "Success and
Failure"
"System" = "Success and Failure"
}
    $currentPolicy = auditpol /get /category:* /r | ConvertFrom-Csv
    $compliance = @{}
    foreach ($category in $desiredState.Keys) {
        $actualSetting = ($currentPolicy | Where-Object { $_."Subcategory" -eq $category })."Inclusion Setting"
        $compliance[$category] = $actualSetting -eq $desiredState[$category]
    }
return $compliance
}
# Function to check Windows Firewall status
function Test-WindowsFirewall {
    $firewallProfiles = Get-NetFirewallProfile
    $compliance = @{}
    foreach ($profile in $firewallProfiles) {
        $compliance[$profile.Name] = $profile.Enabled
    }
return $compliance
}
# Function to check Windows Update status
function Test-WindowsUpdate {
    $updateStatus = Get-WULastResults
    $lastUpdateDate = $updateStatus.LastSearchSuccessDate
    $compliance = @{
        LastUpdateCheck = ($lastUpdateDate -gt (Get-Date).AddDays(-7))
        PendingUpdates  = (Get-WUList).Count -eq 0
    }
return $compliance
}
# Function to check BitLocker status
function Test-BitLockerStatus {
    $bitlockerVolumes = Get-BitLockerVolume
    $compliance = @{}
    foreach ($volume in $bitlockerVolumes) {
        $compliance[$volume.MountPoint] = $volume.ProtectionStatus -eq "On"
    }
return $compliance
}
# Function to check local administrator
accounts
function Test-LocalAdminAccounts {
    $adminGroup = Get-LocalGroupMember -Group "Administrators"
    $localAdmins = $adminGroup | Where-Object { $_.PrincipalSource -eq "Local" }
    $compliance = @{
        LocalAdminCount = $localAdmins.Count
        CompliantCount  = ($localAdmins.Count -le 2)
    }
return $compliance
}
# Function to perform compliance audit
function Invoke-ComplianceAudit {
$auditResults = @{
PasswordPolicy = Test-PasswordPolicy
AccountLockoutPolicy = Test-
AccountLockoutPolicy
AuditPolicy = Test-AuditPolicy
WindowsFirewall = Test-
WindowsFirewall
WindowsUpdate = Test-WindowsUpdate
BitLocker = Test-BitLockerStatus
LocalAdminAccounts = Test-
LocalAdminAccounts
}
return $auditResults
}
# Function to generate compliance report
function New-ComplianceReport {
param (
[Parameter(Mandatory=$true)]
[hashtable]$AuditResults,
        [string]$ReportPath = "ComplianceReport.html"
)
$htmlReport = @"
<html>
<head>
<style>
body { font-family: Arial, sans-
serif; }
table { border-collapse:
collapse; width: 100%; }
th, td { border: 1px solid #ddd;
padding: 8px; text-align: left; }
th { background-color: #f2f2f2; }
.compliant { color: green; }
.non-compliant { color: red; }
</style>
</head>
<body>
<h1>Compliance Audit Report</h1>
<h2>Generated on $(Get-Date)</h2>
"@
    foreach ($category in $AuditResults.Keys) {
$htmlReport += "<h3>$category</h3>
<table><tr><th>Check</th><th>Status</th>
</tr>"
foreach ($check in
$AuditResults[$category].Keys) {
$status = if
($AuditResults[$category][$check]) {
"Compliant" } else { "Non-Compliant" }
$cssClass = if
($AuditResults[$category][$check]) {
"compliant" } else { "non-compliant" }
$htmlReport += "<tr>
<td>$check</td><td
class='$cssClass'>$status</td></tr>"
}
$htmlReport += "</table>"
}
$htmlReport += "</body></html>"
$htmlReport | Out-File -FilePath
$ReportPath
Write-Host "Compliance report generated:
$ReportPath"
}
# Main execution
$auditResults = Invoke-ComplianceAudit
New-ComplianceReport -AuditResults
$auditResults -ReportPath
"C:\Reports\ComplianceReport_$(Get-Date -
Format 'yyyyMMdd').html"
This script provides a
comprehensive framework for
auditing compliance with various
security standards on a Windows
system. Here's a breakdown of the
functions:
1. Test-PasswordPolicy: Checks password
policy settings against defined
standards.
2. Test-AccountLockoutPolicy: Verifies
account lockout policy settings.
3. Test-AuditPolicy: Checks the system
audit policy settings.
4. Test-WindowsFirewall: Verifies the
status of Windows Firewall
profiles.
5. Test-WindowsUpdate: Checks the Windows
Update status and pending
updates.
6. Test-BitLockerStatus: Verifies
BitLocker encryption status for
volumes.
7. Test-LocalAdminAccounts: Checks the
number of local administrator
accounts.
8. Invoke-ComplianceAudit: Runs all the
compliance checks and collects
the results.
9. New-ComplianceReport: Generates an HTML
report of the compliance audit
results.
To use this script:
1. Ensure you have administrative
privileges on the system.
2. Run the script in an elevated
PowerShell session.
3. The script will attempt to
install required modules if
they're not already present.
4. The compliance audit will run
automatically, and a report will
be generated in the specified
location.
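To run the audit nightly without manual intervention, the script could be registered as a scheduled task. This is a sketch; the script path `C:\Scripts\ComplianceAudit.ps1` is a placeholder for wherever you save the audit script:

```powershell
# Register a daily 2 AM scheduled task that runs the compliance audit
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\ComplianceAudit.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName "NightlyComplianceAudit" `
    -Action $action -Trigger $trigger -RunLevel Highest
```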
This script provides a foundation
for compliance auditing, but keep
in mind:
It covers common compliance areas
but may need to be expanded to
meet specific regulatory
requirements (e.g., HIPAA, PCI
DSS, GDPR).
Some checks may require Domain
Admin privileges in an Active
Directory environment.
The compliance criteria (e.g.,
password length, lockout
threshold) are set as parameters
and should be adjusted to match
your organization's policies or
specific compliance framework
requirements.
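For example, a stricter baseline could be audited simply by passing different thresholds; the values below are illustrative, not taken from any specific framework:

```powershell
# Audit against tighter, organization-specific thresholds
$passwordResult = Test-PasswordPolicy -MinPasswordLength 15 -MaxPasswordAge 45
$lockoutResult  = Test-AccountLockoutPolicy -LockoutThreshold 3 -LockoutDuration 30
```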
To enhance this script for more
comprehensive compliance auditing:
1. Add more specific compliance
checks, such as software inventory
and approved software list
compliance, network configuration
checks (e.g., DNS, DHCP settings),
specific registry key checks for
security settings, and file system
permissions audits.
2. Implement scoring or weighting
for different compliance areas:
function Get-ComplianceScore {
param (
[hashtable]$AuditResults,
[hashtable]$Weights
)
$totalScore = 0
$maxScore = 0
    foreach ($category in $AuditResults.Keys) {
$categoryScore = 0
$categoryMax = 0
foreach ($check in
$AuditResults[$category].Keys) {
$weight = if ($Weights[$category]
[$check]) { $Weights[$category][$check] }
else { 1 }
$categoryScore += if
($AuditResults[$category][$check]) { $weight
} else { 0 }
$categoryMax += $weight
}
$totalScore += $categoryScore
$maxScore += $categoryMax
}
return @{
Score = $totalScore
MaxScore = $maxScore
Percentage =
[math]::Round(($totalScore / $maxScore) *
100, 2)
}
}
3. Add remediation suggestions for
non-compliant items:
function Get-ComplianceRemediation {
param (
[hashtable]$AuditResults
)
$remediation = @{}
    foreach ($category in $AuditResults.Keys) {
$remediation[$category] = @{}
foreach ($check in
$AuditResults[$category].Keys) {
if (-not $AuditResults[$category]
[$check]) {
$remediation[$category]
[$check] = switch ($category) {
"PasswordPolicy" {
"Adjust password policy in Group Policy" }
"AccountLockoutPolicy" {
"Modify account lockout settings in Group
Policy" }
"AuditPolicy" {
"Configure audit policy settings using
[Link] or Group Policy" }
"WindowsFirewall" {
"Enable Windows Firewall for all profiles" }
"WindowsUpdate" { "Check
Windows Update settings and install pending
updates" }
"BitLocker" { "Enable
BitLocker encryption on all volumes" }
"LocalAdminAccounts" {
"Review and reduce the number of local
administrator accounts" }
default { "Review and
address non-compliant item" }
}
}
}
}
return $remediation
}
4. Implement historical tracking of
compliance scores:
function Save-ComplianceHistory {
param (
[hashtable]$ComplianceScore,
        [string]$HistoryFile = "C:\ComplianceHistory.csv"
)
$date = Get-Date -Format "yyyy-MM-dd
HH:mm:ss"
    $newRecord = [PSCustomObject]@{
        Date       = $date
        Score      = $ComplianceScore.Score
        MaxScore   = $ComplianceScore.MaxScore
        Percentage = $ComplianceScore.Percentage
    }
if (Test-Path $HistoryFile) {
$newRecord | Export-Csv -Path
$HistoryFile -Append -NoTypeInformation
} else {
$newRecord | Export-Csv -Path
$HistoryFile -NoTypeInformation
}
}
5. Enhance the report with graphs or
charts (using a library like
PSChart):
# Install-Module PSChart
Import-Module PSChart
function Add-ComplianceChart {
param (
[string]$HistoryFile,
[string]$ChartPath
)
$history = Import-Csv $HistoryFile
$chartData = $history | Select-Object -
Last 10
New-Chart -Title "Compliance Score
History" -Width 800 -Height 400 |
Add-Series -XField Date -YField
Percentage -SeriesType Line -Name "Compliance
%" |
Export-Chart -Path $ChartPath
}
6. Implement automated remediation
for certain non-compliant items
(use with caution):
function Invoke-ComplianceRemediation {
param (
[hashtable]$AuditResults
)
    if (-not $AuditResults.WindowsFirewall.Domain) {
        Set-NetFirewallProfile -Profile Domain -Enabled True
    }
    if (-not $AuditResults.WindowsUpdate.PendingUpdates) {
        Install-WindowsUpdate -AcceptAll -AutoReboot
    }
# Add more automated remediation steps as
needed
}
Remember to thoroughly test any
additions or modifications to the
script in a controlled environment
before using it in production.
Also, ensure that any automated
remediation steps are approved by
your organization's security team
and comply with your change
management processes.
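Assuming the enhancement functions above are loaded, a scored audit run could be chained together like this; the weight values are illustrative and should reflect which checks your organization treats as critical:

```powershell
# Run the audit, score it with custom weights, record history,
# and collect remediation suggestions for anything non-compliant
$auditResults = Invoke-ComplianceAudit
$weights = @{ PasswordPolicy = @{ MinPasswordLength = 3 } }  # weight critical checks higher
$score = Get-ComplianceScore -AuditResults $auditResults -Weights $weights
Save-ComplianceHistory -ComplianceScore $score
$remediation = Get-ComplianceRemediation -AuditResults $auditResults
```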
249. Automate Patch Management
# Ensure necessary modules are installed
$requiredModules = @("PSWindowsUpdate",
"ActiveDirectory")
foreach ($module in $requiredModules) {
if (-not (Get-Module -ListAvailable -Name
$module)) {
Install-Module -Name $module -Force -
Scope CurrentUser
}
}
Import-Module PSWindowsUpdate
Import-Module ActiveDirectory
# Function to get available updates
function Get-AvailableUpdates {
param (
[string]$ComputerName =
$env:COMPUTERNAME
)
try {
$updates = Get-WUList -ComputerName
$ComputerName
return $updates
}
catch {
Write-Error "Failed to get available
updates for $ComputerName: $_"
return $null
}
}
# Function to install updates
function Install-PendingUpdates {
param (
[string]$ComputerName =
$env:COMPUTERNAME,
[switch]$AutoReboot
)
try {
$installResult = Install-
WindowsUpdate -ComputerName $ComputerName -
AcceptAll -AutoReboot:$AutoReboot -Verbose
return $installResult
}
catch {
Write-Error "Failed to install
updates on $ComputerName: $_"
return $null
}
}
# Function to get update history
function Get-UpdateHistory {
param (
[string]$ComputerName =
$env:COMPUTERNAME,
[int]$NumberOfDays = 30
)
try {
$history = Get-WUHistory -
ComputerName $ComputerName -Last
$NumberOfDays
return $history
}
catch {
Write-Error "Failed to get update
history for $ComputerName: $_"
return $null
}
}
# Function to schedule updates
function Schedule-WindowsUpdates {
param (
[string]$ComputerName =
$env:COMPUTERNAME,
[DateTime]$ScheduleDate
)
try {
$scheduleResult = Add-WUScheduledTask
-ComputerName $ComputerName -TaskName
"Scheduled Windows Update" -TaskStart
$ScheduleDate
return $scheduleResult
}
catch {
Write-Error "Failed to schedule
updates for $ComputerName: $_"
return $null
}
}
# Function to get computers from Active
Directory
function Get-ADComputers {
param (
[string]$OUPath
)
try {
$computers = Get-ADComputer -Filter *
-SearchBase $OUPath -Properties Name,
OperatingSystem
return $computers
}
catch {
Write-Error "Failed to get computers
from AD: $_"
return $null
}
}
# Function to generate patch management
report
function New-PatchManagementReport {
param (
[array]$ComputerList,
        [string]$ReportPath = "PatchManagementReport.html"
)
$reportData = @()
    foreach ($computer in $ComputerList) {
        $computerName = $computer.Name
        $updates = Get-AvailableUpdates -ComputerName $computerName
        $history = Get-UpdateHistory -ComputerName $computerName -NumberOfDays 30
        $reportData += [PSCustomObject]@{
            ComputerName    = $computerName
            OperatingSystem = $computer.OperatingSystem
            PendingUpdates  = $updates.Count
            LastUpdateDate  = ($history | Select-Object -First 1).Date
        }
    }
$htmlReport = @"
<html>
<head>
<style>
body { font-family: Arial, sans-
serif; }
table { border-collapse:
collapse; width: 100%; }
th, td { border: 1px solid #ddd;
padding: 8px; text-align: left; }
th { background-color: #f2f2f2; }
.warning { background-color:
#fff3cd; }
</style>
</head>
<body>
<h1>Patch Management Report</h1>
<h2>Generated on $(Get-Date)</h2>
<table>
<tr>
<th>Computer Name</th>
<th>Operating System</th>
<th>Pending Updates</th>
<th>Last Update Date</th>
</tr>
"@
    foreach ($data in $reportData) {
        $warningClass = if ($data.PendingUpdates -gt 0) { ' class="warning"' } else { '' }
        $htmlReport += @"
        <tr$warningClass>
            <td>$($data.ComputerName)</td>
            <td>$($data.OperatingSystem)</td>
            <td>$($data.PendingUpdates)</td>
            <td>$($data.LastUpdateDate)</td>
        </tr>
"@
    }
$htmlReport += @"
</table>
</body>
</html>
"@
$htmlReport | Out-File -FilePath
$ReportPath
Write-Host "Patch management report
generated: $ReportPath"
}
# Function to perform automated patch
management
function Invoke-AutomatedPatchManagement {
param (
[string]$OUPath,
[DateTime]$ScheduleDate,
[switch]$InstallImmediately,
[switch]$AutoReboot,
        [string]$ReportPath = "PatchManagementReport.html"
)
# Get computers from Active Directory
$computers = Get-ADComputers -OUPath
$OUPath
if (-not $computers) {
Write-Error "No computers found in
the specified OU."
return
}
foreach ($computer in $computers) {
        $computerName = $computer.Name
Write-Host "Processing
$computerName..." -ForegroundColor Cyan
if ($InstallImmediately) {
Write-Host "Installing updates on
$computerName..."
$installResult = Install-
PendingUpdates -ComputerName $computerName -
AutoReboot:$AutoReboot
if ($installResult) {
Write-Host "Updates installed
on $computerName" -ForegroundColor Green
}
}
else {
Write-Host "Scheduling updates
for $computerName..."
$scheduleResult = Schedule-
WindowsUpdates -ComputerName $computerName -
ScheduleDate $ScheduleDate
if ($scheduleResult) {
Write-Host "Updates scheduled
for $computerName" -ForegroundColor Green
}
}
}
# Generate report
New-PatchManagementReport -ComputerList
$computers -ReportPath $ReportPath
}
# Example usage
$ouPath = "OU=Workstations,DC=contoso,DC=com"
$scheduleDate = (Get-Date).AddDays(1).Date.AddHours(22) # Schedule for 10 PM tomorrow
Invoke-AutomatedPatchManagement -OUPath
$ouPath -ScheduleDate $scheduleDate -
ReportPath "C:\Reports\PatchReport_$(Get-Date
-Format 'yyyyMMdd').html"
# To install updates immediately with auto-
reboot:
# Invoke-AutomatedPatchManagement -OUPath
$ouPath -InstallImmediately -AutoReboot
This script provides a
comprehensive framework for
automating patch management in a
Windows environment. Here's a
breakdown of the functions:
1. Get-AvailableUpdates: Retrieves
available updates for a specified
computer.
2. Install-PendingUpdates: Installs pending
updates on a specified computer.
3. Get-UpdateHistory: Retrieves the
update history for a specified
computer.
4. Schedule-WindowsUpdates: Schedules
Windows updates for a specified
computer.
5. Get-ADComputers: Retrieves computers
from Active Directory within a
specified OU.
6. New-PatchManagementReport: Generates an
HTML report of the patch status
for multiple computers.
7. Invoke-AutomatedPatchManagement:
Orchestrates the patch management
process for multiple computers.
To use this script:
1. Ensure you have administrative
privileges on the target systems
and in Active Directory.
2. Run the script in an elevated
PowerShell session.
3. The script will attempt to
install required modules if
they're not already present.
4. Customize the $ouPath and $scheduleDate
variables to match your
environment.
5. Run the script with appropriate
parameters (e.g., to install
immediately or schedule for
later).
This script provides a solid
foundation for automated patch
management, but consider the
following enhancements:
1. Error handling and logging:
Add more robust error handling and
implement logging to track the
patch management process over time.
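A minimal logging helper might look like the sketch below; the log path is a placeholder. Call it before and after each install or schedule attempt so failures can be traced per computer:

```powershell
# Append a timestamped message to a central patch-management log
function Write-PatchLog {
    param (
        [string]$Message,
        [string]$LogPath = "C:\Logs\PatchManagement.log"
    )
    $timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
    "$timestamp - $Message" | Out-File -FilePath $LogPath -Append
}
# Example: Write-PatchLog "Starting update installation on $computerName"
```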
2. Staggered deployment:
Implement a mechanism to deploy
updates in waves to minimize
potential impacts.
function Invoke-StaggeredPatchManagement {
param (
[string]$OUPath,
[int]$BatchSize = 10,
[int]$DelayMinutes = 60
)
$computers = Get-ADComputers -OUPath
$OUPath
    $batches = [Math]::Ceiling($computers.Count / $BatchSize)
for ($i = 0; $i -lt $batches; $i++) {
$batchComputers = $computers |
Select-Object -Skip ($i * $BatchSize) -First
$BatchSize
Invoke-AutomatedPatchManagement -
ComputerList $batchComputers -
InstallImmediately -AutoReboot
if ($i -lt ($batches - 1)) {
Write-Host "Waiting $DelayMinutes
minutes before next batch..."
Start-Sleep -Seconds
($DelayMinutes * 60)
}
}
}
3. Pre and post-update checks:
Implement functions to check system
health before and after applying
updates.
function Test-SystemHealth {
param (
[string]$ComputerName
)
# Implement system health checks (e.g.,
disk space, critical services)
# Return $true if system is healthy,
$false otherwise
}
function Invoke-PreUpdateChecks {
param (
[string]$ComputerName
)
if (Test-SystemHealth -ComputerName
$ComputerName) {
return $true
}
else {
Write-Warning "Pre-update health
check failed for $ComputerName"
return $false
}
}
function Invoke-PostUpdateChecks {
param (
[string]$ComputerName
)
if (Test-SystemHealth -ComputerName
$ComputerName) {
return $true
}
else {
Write-Warning "Post-update health
check failed for $ComputerName"
return $false
}
}
4. Update rollback:
Implement a function to rollback
updates if post-update checks fail.
function Invoke-UpdateRollback {
param (
[string]$ComputerName,
[string]$UpdateKB
)
try {
$result = Invoke-Command -
ComputerName $ComputerName -ScriptBlock {
wusa /uninstall
/kb:$using:UpdateKB /quiet /norestart
}
return $result
}
catch {
Write-Error "Failed to rollback
update KB$UpdateKB on $ComputerName: $_"
return $false
}
}
5. Reporting enhancements:
Add more detailed reporting,
including success/failure rates and
specific update information.
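As one example, per-computer install results could be rolled up into a success/failure summary. This sketch assumes you collect each `$installResult` into an `$installResults` array and that the objects expose a `Result` property, as PSWindowsUpdate output typically does:

```powershell
# Group collected install results into a simple outcome summary
$summary = $installResults |
    Group-Object -Property Result |
    Select-Object @{N='Outcome';E={$_.Name}}, Count
$summary | Format-Table -AutoSize
```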
6. Integration with WSUS:
If your organization uses Windows
Server Update Services (WSUS),
integrate the script with WSUS for
better control over update
approvals and deployments.
Remember to thoroughly test the
script and any enhancements in a
non-production environment before
using it in production. Also,
ensure that your patch management
process complies with your
organization's change management
policies and any relevant
regulatory requirements.
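The pre/post-check and rollback helpers above could be wired into the per-computer loop like this. This is a sketch: the `KB` property and its format on `Get-WUHistory` output may differ in your PSWindowsUpdate version, so verify before relying on the rollback step:

```powershell
# Only patch healthy systems; roll back the last update if post-checks fail
if (Invoke-PreUpdateChecks -ComputerName $computerName) {
    Install-PendingUpdates -ComputerName $computerName
    if (-not (Invoke-PostUpdateChecks -ComputerName $computerName)) {
        $lastKB = (Get-UpdateHistory -ComputerName $computerName |
            Select-Object -First 1).KB -replace 'KB', ''
        Invoke-UpdateRollback -ComputerName $computerName -UpdateKB $lastKB
    }
}
```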
250. Generate End-of-Day IT
Reports
# Ensure necessary modules are installed
$requiredModules = @("ActiveDirectory",
"PSWindowsUpdate", "ImportExcel")
foreach ($module in $requiredModules) {
if (-not (Get-Module -ListAvailable -Name
$module)) {
Install-Module -Name $module -Force -
Scope CurrentUser
}
}
Import-Module ActiveDirectory
Import-Module PSWindowsUpdate
Import-Module ImportExcel
# Function to get system uptime
function Get-SystemUptime {
param (
[string]$ComputerName =
$env:COMPUTERNAME
)
try {
$os = Get-WmiObject -Class
Win32_OperatingSystem -ComputerName
$ComputerName
        $uptime = (Get-Date) - $os.ConvertToDateTime($os.LastBootUpTime)
return $uptime
}
catch {
Write-Error "Failed to get uptime for
$ComputerName: $_"
return $null
}
}
# Function to get disk space information
function Get-DiskSpaceInfo {
param (
[string]$ComputerName =
$env:COMPUTERNAME
)
try {
$disks = Get-WmiObject -Class
Win32_LogicalDisk -ComputerName $ComputerName
-Filter "DriveType = 3"
$diskInfo = $disks | Select-Object
DeviceID,
@{Name="Size(GB)";Expression=
{[math]::Round($_.Size/1GB, 2)}},
@{Name="FreeSpace(GB)";Expression
={[math]::Round($_.FreeSpace/1GB, 2)}},
@{Name="FreePercent";Expression=
{[math]::Round(($_.FreeSpace/$_.Size)*100,
2)}}
return $diskInfo
}
catch {
Write-Error "Failed to get disk space
info for $ComputerName: $_"
return $null
}
}
# Function to get recent event log entries
function Get-RecentEventLogs {
param (
[string]$ComputerName =
$env:COMPUTERNAME,
[string]$LogName = "System",
[int]$Hours = 24
)
try {
$startTime = (Get-
Date).AddHours(-$Hours)
$events = Get-WinEvent -ComputerName
$ComputerName -FilterHashtable @{
LogName = $LogName
StartTime = $startTime
            Level = 1,2,3 # Critical, Error, Warning
} -MaxEvents 100
return $events | Select-Object
TimeCreated, LevelDisplayName, Id, Message
}
catch {
Write-Error "Failed to get recent
event logs for $ComputerName: $_"
return $null
}
}
# Function to get installed software
function Get-InstalledSoftware {
param (
[string]$ComputerName =
$env:COMPUTERNAME
)
try {
$software = Get-WmiObject -Class
Win32_Product -ComputerName $ComputerName |
Select-Object Name, Version,
Vendor
return $software
}
catch {
Write-Error "Failed to get installed
software for $ComputerName: $_"
return $null
}
}
# Function to get active network connections
function Get-ActiveNetworkConnections {
param (
[string]$ComputerName =
$env:COMPUTERNAME
)
try {
$connections = Get-NetTCPConnection -
CimSession $ComputerName |
Where-Object State -eq
'Established' |
Select-Object LocalAddress,
LocalPort, RemoteAddress, RemotePort, State,
OwningProcess
return $connections
}
catch {
Write-Error "Failed to get active
network connections for $ComputerName: $_"
return $null
}
}
# Function to get Windows Update status
function Get-WindowsUpdateStatus {
param (
[string]$ComputerName =
$env:COMPUTERNAME
)
try {
        $updateSession = New-Object -ComObject Microsoft.Update.Session
        $updateSearcher = $updateSession.CreateUpdateSearcher()
        $pendingUpdates = $updateSearcher.Search("IsInstalled=0 and Type='Software'").Updates
        $updateStatus = @{
            PendingUpdatesCount = $pendingUpdates.Count
            LastUpdateDate = (Get-WmiObject -Class Win32_OperatingSystem -ComputerName $ComputerName).LastBootUpTime
        }
return $updateStatus
}
catch {
Write-Error "Failed to get Windows
Update status for $ComputerName: $_"
return $null
}
}
# Function to generate the end-of-day IT
report
function New-EndOfDayITReport {
param (
[string[]]$ComputerNames,
        [string]$ReportPath = "EndOfDayITReport.xlsx"
)
$reportData = @()
foreach ($computer in $ComputerNames) {
Write-Host "Gathering information for
$computer..." -ForegroundColor Cyan
$uptime = Get-SystemUptime -
ComputerName $computer
$diskInfo = Get-DiskSpaceInfo -
ComputerName $computer
$events = Get-RecentEventLogs -
ComputerName $computer
$software = Get-InstalledSoftware -
ComputerName $computer
$connections = Get-
ActiveNetworkConnections -ComputerName
$computer
$updateStatus = Get-
WindowsUpdateStatus -ComputerName $computer
$reportData += [PSCustomObject]@{
ComputerName = $computer
Uptime = $uptime
DiskInfo = $diskInfo
RecentEvents = $events
InstalledSoftware = $software
ActiveConnections = $connections
WindowsUpdateStatus =
$updateStatus
}
}
    # Create Excel workbook
    $excelPackage = New-Object OfficeOpenXml.ExcelPackage
    $workbook = $excelPackage.Workbook
    # Create Summary sheet
    $summarySheet = $workbook.Worksheets.Add("Summary")
    $row = 1
    $summarySheet.Cells["A$row:F$row"].Merge = $true
    $summarySheet.Cells["A$row"].Value = "End-of-Day IT Report - $(Get-Date -Format 'yyyy-MM-dd')"
    $summarySheet.Cells["A$row"].Style.Font.Size = 16
    $summarySheet.Cells["A$row"].Style.Font.Bold = $true
    $row += 2
    $summarySheet.Cells["A$row"].Value = "Computer Name"
    $summarySheet.Cells["B$row"].Value = "Uptime (Days)"
    $summarySheet.Cells["C$row"].Value = "Disk Space (% Free)"
    $summarySheet.Cells["D$row"].Value = "Critical Events"
    $summarySheet.Cells["E$row"].Value = "Pending Updates"
    $summarySheet.Cells["F$row"].Value = "Active Connections"
    foreach ($data in $reportData) {
        $row++
        $summarySheet.Cells["A$row"].Value = $data.ComputerName
        $summarySheet.Cells["B$row"].Value = [math]::Round($data.Uptime.TotalDays, 2)
        $summarySheet.Cells["C$row"].Value = ($data.DiskInfo | Measure-Object -Property FreePercent -Minimum).Minimum
        $summarySheet.Cells["D$row"].Value = ($data.RecentEvents | Where-Object { $_.LevelDisplayName -eq 'Error' }).Count
        $summarySheet.Cells["E$row"].Value = $data.WindowsUpdateStatus.PendingUpdatesCount
        $summarySheet.Cells["F$row"].Value = $data.ActiveConnections.Count
    }
    $summarySheet.Cells.AutoFitColumns()
    # Create detailed sheets for each computer
    foreach ($data in $reportData) {
        $computerSheet = $workbook.Worksheets.Add($data.ComputerName)
        $row = 1
        # Computer Name and Uptime
        $computerSheet.Cells["A$row:B$row"].Merge = $true
        $computerSheet.Cells["A$row"].Value = "Computer: $($data.ComputerName)"
        $computerSheet.Cells["A$row"].Style.Font.Bold = $true
        $row++
        $computerSheet.Cells["A$row"].Value = "Uptime:"
        $computerSheet.Cells["B$row"].Value = "$($data.Uptime.Days) days, $($data.Uptime.Hours) hours, $($data.Uptime.Minutes) minutes"
        $row += 2
        # Disk Space
        $computerSheet.Cells["A$row:D$row"].Merge = $true
        $computerSheet.Cells["A$row"].Value = "Disk Space"
        $computerSheet.Cells["A$row"].Style.Font.Bold = $true
        $row++
        $computerSheet.Cells["A$row"].Value = "Drive"
        $computerSheet.Cells["B$row"].Value = "Size (GB)"
        $computerSheet.Cells["C$row"].Value = "Free Space (GB)"
        $computerSheet.Cells["D$row"].Value = "Free (%)"
        foreach ($disk in $data.DiskInfo) {
            $row++
            $computerSheet.Cells["A$row"].Value = $disk.DeviceID
            $computerSheet.Cells["B$row"].Value = $disk.'Size(GB)'
            $computerSheet.Cells["C$row"].Value = $disk.'FreeSpace(GB)'
            $computerSheet.Cells["D$row"].Value = $disk.FreePercent
        }
        $row += 2
        # Recent Events
        $computerSheet.Cells["A$row:D$row"].Merge = $true
        $computerSheet.Cells["A$row"].Value = "Recent Critical Events"
        $computerSheet.Cells["A$row"].Style.Font.Bold = $true
        $row++
        $computerSheet.Cells["A$row"].Value = "Time"
        $computerSheet.Cells["B$row"].Value = "Level"
        $computerSheet.Cells["C$row"].Value = "Event ID"
        $computerSheet.Cells["D$row"].Value = "Message"
        foreach ($event in ($data.RecentEvents | Where-Object { $_.LevelDisplayName -eq 'Error' } | Select-Object -First 10)) {
            $row++
            $computerSheet.Cells["A$row"].Value = $event.TimeCreated
            $computerSheet.Cells["B$row"].Value = $event.LevelDisplayName
            $computerSheet.Cells["C$row"].Value = $event.Id
            $computerSheet.Cells["D$row"].Value = $event.Message
        }
        $row += 2
        # Windows Update Status
        $computerSheet.Cells["A$row:B$row"].Merge = $true
        $computerSheet.Cells["A$row"].Value = "Windows Update Status"
        $computerSheet.Cells["A$row"].Style.Font.Bold = $true
        $row++
        $computerSheet.Cells["A$row"].Value = "Pending Updates:"
        $computerSheet.Cells["B$row"].Value = $data.WindowsUpdateStatus.PendingUpdatesCount
        $row++
        $computerSheet.Cells["A$row"].Value = "Last Update Date:"
        $computerSheet.Cells["B$row"].Value = $data.WindowsUpdateStatus.LastUpdateDate
        $row += 2
        # Active Network Connections
        $computerSheet.Cells["A$row:E$row"].Merge = $true
        $computerSheet.Cells["A$row"].Value = "Active Network Connections"
        $computerSheet.Cells["A$row"].Style.Font.Bold = $true
        $row++
        $computerSheet.Cells["A$row"].Value = "Local Address"
        $computerSheet.Cells["B$row"].Value = "Local Port"
        $computerSheet.Cells["C$row"].Value = "Remote Address"
        $computerSheet.Cells["D$row"].Value = "Remote Port"
        $computerSheet.Cells["E$row"].Value = "State"
        foreach ($conn in ($data.ActiveConnections | Select-Object -First 10)) {
            $row++
            $computerSheet.Cells["A$row"].Value = $conn.LocalAddress
            $computerSheet.Cells["B$row"].Value = $conn.LocalPort
            $computerSheet.Cells["C$row"].Value = $conn.RemoteAddress
            $computerSheet.Cells["D$row"].Value = $conn.RemotePort
            $computerSheet.Cells["E$row"].Value = $conn.State
        }
        $computerSheet.Cells.AutoFitColumns()
    }
    # Save the Excel file
    $excelPackage.SaveAs($ReportPath)
Write-Host "End-of-Day IT Report
generated: $ReportPath" -ForegroundColor
Green
}
# Example usage
$computers = @("Server01", "Server02",
"Workstation01", "Workstation02")
New-EndOfDayITReport -ComputerNames
$computers -ReportPath
"C:\Reports\EndOfDayITReport_$(Get-Date -
Format 'yyyyMMdd').xlsx"
This script generates a
comprehensive End-of-Day IT Report
in Excel format. Here's a breakdown
of its functions and features:
1. Get-SystemUptime: Retrieves system
uptime for a specified computer.
2. Get-DiskSpaceInfo: Gathers disk space
information for all drives on a
computer.
3. Get-RecentEventLogs: Collects recent
event log entries, focusing on
critical events.
4. Get-InstalledSoftware: Lists installed
software on a computer.
5. Get-ActiveNetworkConnections: Retrieves
information about active network
connections.
6. Get-WindowsUpdateStatus: Checks the
Windows Update status, including
pending updates.
7. New-EndOfDayITReport: The main function
that generates the Excel report,
including a summary sheet with key
information for all computers,
detailed sheets for each computer
with in-depth information, and
formatting and styling of the
Excel workbook for improved
readability.
To use this script:
1. Ensure you have administrative
privileges on the target systems.
2. Run the script in an elevated
PowerShell session.
3. The script will attempt to
install required modules if
they're not already present.
4. Customize the $computers array with
the names of the computers you
want to include in the report.
5. Adjust the report path as needed.
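Rather than hard-coding `$computers`, the list could be pulled from Active Directory. This is a sketch; the OU path mirrors the contoso example used earlier in the book and should be adjusted to your domain:

```powershell
# Build the computer list from an AD OU instead of a static array
$computers = (Get-ADComputer -Filter 'Enabled -eq $true' `
    -SearchBase "OU=Workstations,DC=contoso,DC=com").Name
New-EndOfDayITReport -ComputerNames $computers `
    -ReportPath "C:\Reports\EndOfDayITReport_$(Get-Date -Format 'yyyyMMdd').xlsx"
```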
This script provides a solid
foundation for generating end-of-
day IT reports, but you can enhance
it further:
1. Add error handling and logging:
Implement more robust error
handling and logging to track
issues with data collection.
function Write-Log {
param (
[string]$Message,
        [string]$LogPath = "C:\Logs\ITReport.log"
)
$timestamp = Get-Date -Format "yyyy-MM-dd
HH:mm:ss"
"$timestamp - $Message" | Out-File -
FilePath $LogPath -Append
}
# Use this function throughout the script for
logging
# Example:
# Write-Log "Failed to collect disk space
info for Computer01: Access denied"
2. Include performance metrics:
Add functions to collect and report
on CPU, memory, and network
performance.
function Get-PerformanceMetrics {
param (
[string]$ComputerName =
$env:COMPUTERNAME
)
try {
$cpu = Get-WmiObject -Class
Win32_Processor -ComputerName $ComputerName |
Measure-Object -Property
LoadPercentage -Average |
Select-Object -ExpandProperty
Average
$memory = Get-WmiObject -Class
Win32_OperatingSystem -ComputerName
$ComputerName |
Select-Object
@{Name="MemoryUsage";Expression={"{0:N2}" -f
((($_.TotalVisibleMemorySize -
$_.FreePhysicalMemory) /
$_.TotalVisibleMemorySize) * 100)}}
$network = Get-WmiObject -Class
Win32_PerfFormattedData_Tcpip_NetworkInterfac
e -ComputerName $ComputerName |
Select-Object Name,
BytesTotalPersec, CurrentBandwidth
        return @{
            CPUUsage = $cpu
            MemoryUsage = $memory.MemoryUsage
            NetworkUsage = $network
        }
}
catch {
Write-Error "Failed to get
performance metrics for $ComputerName: $_"
return $null
}
}
# Add this data to the report generation
function
3. Include security-related
information:
Add checks for security-related
items such as failed login
attempts, firewall status, or
antivirus updates.
function Get-SecurityInfo {
param (
[string]$ComputerName =
$env:COMPUTERNAME
)
try {
$failedLogins = Get-WinEvent -
ComputerName $ComputerName -FilterHashtable
@{
LogName = 'Security'
ID = 4625
StartTime = (Get-
Date).AddHours(-24)
} -ErrorAction SilentlyContinue
$firewallStatus = Get-
NetFirewallProfile -CimSession $ComputerName
|
Select-Object Name, Enabled
$antivirusStatus = Get-WmiObject -
Namespace "root\SecurityCenter2" -Class
AntiVirusProduct -ComputerName $ComputerName
|
Select-Object displayName,
productState
        return @{
            FailedLoginAttempts = $failedLogins.Count
            FirewallStatus = $firewallStatus
            AntivirusStatus = $antivirusStatus
        }
}
catch {
Write-Error "Failed to get security
info for $ComputerName: $_"
return $null
}
}
# Include this information in the report
4. Add trend analysis:
Implement a function to compare
current data with previous reports
to identify trends or changes.
function Compare-WithPreviousReport {
param (
[string]$CurrentReportPath,
[string]$PreviousReportPath
)
$current = Import-Excel -Path
$CurrentReportPath -WorksheetName "Summary"
$previous = Import-Excel -Path
$PreviousReportPath -WorksheetName "Summary"
$comparison = Compare-Object -
ReferenceObject $previous -DifferenceObject
$current -Property ComputerName, 'Disk Space
(% Free)', 'Critical Events', 'Pending
Updates' -IncludeEqual
return $comparison | Select-Object
ComputerName,
@{N='Metric';E={$_.Property}},
@{N='Previous';E={$_.ReferenceObject.
($_.Property)}},
@{N='Current';E={$_.DifferenceObject.
($_.Property)}},
@{N='Change';E={$_.DifferenceObject.
($_.Property) - $_.ReferenceObject.
($_.Property)}}
}
# Add this comparison to a new sheet in the
report
5. Implement report distribution:
Add functionality to automatically
email the report to relevant
stakeholders.
function Send-ReportEmail {
param (
[string]$ReportPath,
[string]$EmailTo,
[string]$EmailFrom,
[string]$SmtpServer
)
$subject = "End-of-Day IT Report - $(Get-
Date -Format 'yyyy-MM-dd')"
$body = "Please find attached the End-of-
Day IT Report for $(Get-Date -Format 'yyyy-
MM-dd')."
Send-MailMessage -From $EmailFrom -To
$EmailTo -Subject $subject -Body $body -
Attachments $ReportPath -SmtpServer
$SmtpServer
}
# Use this function after generating the
report
# Send-ReportEmail -ReportPath $ReportPath -EmailTo "it@contoso.com" -EmailFrom "reports@contoso.com" -SmtpServer "smtp.contoso.com"
6. Add data visualization:
Enhance the Excel report with
charts and graphs for key metrics.
# Add this to the report generation function
$chart = $summarySheet.Drawings.AddChart("DiskSpaceChart", [OfficeOpenXml.Drawing.Chart.eChartType]::ColumnClustered)
$chart.SetPosition(1, 0, 6, 0)
$chart.SetSize(600, 300)
$chart.Title.Text = "Disk Space Usage"
$chart.Series.Add("C$row:C$($row+$reportData.Count-1)", "A$row:A$($row+$reportData.Count-1)")
$chart.XAxis.Title.Text = "Computer Name"
$chart.YAxis.Title.Text = "Free Space (%)"
These enhancements will make your
End-of-Day IT Report more
comprehensive, insightful, and
actionable. Remember to test
thoroughly in a non-production
environment before implementing in
your production systems, and ensure
that the script complies with your
organization's security and data
handling policies.