Kansa PowerShell
Kansa PowerShell is a modular incident response framework capable of operating in a
one-to-many style across endpoints in an enterprise. Kansa can be deployed in scenarios such as
incident response, breach hunting, or building an environmental baseline. The simplicity of the
framework allows a CSIRT to collect data quickly and at scale, speeding up remediation and
recovery after a breach.
Kansa takes advantage of Windows Remote Management and PowerShell remoting to collect data
from across an enterprise domain. Kansa is also safe to deploy across multiple endpoints because it
uses PowerShell's default non-delegated Kerberos network logons rather than the vulnerable
CredSSP, and therefore does not expose credentials to harvesting.
The framework contains many stand-alone PowerShell ‘.ps1’ modules that can collect a vast amount
of useful data needed during an initial triage of an incident.
All the modules can be run individually from their respective folders and the output viewed in
PowerShell. Not all modules in Kansa run by default; the set of modules to run is defined in the
'[Link]' file. If a module is commented out with a '#' it will not run; simply uncomment it to allow
Kansa to run that module.
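For illustration, an excerpt from that configuration file might look like the following; the module
paths here are examples and a real file will differ. The first two modules will run, while the third is
commented out and will be skipped:
Net\Get-Netstat.ps1
Config\Get-Hotfix.ps1
# Disk\Get-TempDirListing.ps1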
Installing Kansa
The simplest way to install Kansa on a target workstation is to download the .zip from the Kansa
GitHub Repo.
Once the file is downloaded, extract the .zip file:
PS > Expand-Archive ‘.\[Link]’
Kansa is designed to be run at scale across hundreds of hosts attached to an Active Directory domain.
As Kansa by default runs against hosts registered with an associated Domain Controller, it should run
out of the box. Windows Remote Management and Remote Server Administration Tools must be
enabled for Kansa to be executed. To enable the required tools, use the following command:
PS > Enable-PSRemoting
Next, navigate to the Kansa-master directory and allow the ‘.ps1’ files and modules to be executed:
PS > ls -r *.ps1 | Unblock-File
PS > Set-ExecutionPolicy Unrestricted
If Kansa is to be executed on a single host, or on a host not connected to an Active Directory
domain, there is an additional step to complete: the network profile of the host will need to be
changed from 'Public' to 'Private'. PowerShell remoting cannot be enabled on a 'Public' network,
and the change also stops Kansa from looking for more hosts to collect data from, reducing the
probability of failures.
To view the host's current network profile in PowerShell, use the following command:
PS > Get-NetConnectionProfile
To change the connection profile, edit the correct interface profile with this command:
PS > Set-NetConnectionProfile -InterfaceIndex [0] -NetworkCategory Private
Ensure the InterfaceIndex number is altered to match that of the host's interface.
Running Kansa
After the host environment is correctly configured, Kansa can be executed using a simple one-line
command.
As Kansa uses Active Directory Kerberos to authenticate by default, there is no need to instruct the
script to do anything differently. The main difference in the command below is that a target list has
been provided. This is a text file with one hostname per line, and it is how Kansa can be deployed to
hundreds of hosts across a domain at once.
PS > .\kansa.ps1 -TargetList [Link] -ModulePath .\Modules -Verbose
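For illustration, the target list passed above is a plain text file with one hostname per line; the
hostnames below are hypothetical:
WORKSTATION01
WORKSTATION02
FILESERVER01
Kansa will connect to each listed host over WinRM and run the enabled modules against each of
them.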
As with the setup, running Kansa on a single host, or on a host not connected to an Active Directory
domain, requires slightly different command arguments.
To run Kansa on the localhost, that host needs to be specified as a target; without a target, Kansa
will attempt to run across the entire AD domain. Because Kansa relies on AD Kerberos to
authenticate, and there is no Kerberos when running against the localhost, Kansa needs to be told to
negotiate NTLM authentication, with the username passed as an argument:
PS > .\kansa.ps1 -Pushbin -Target localhost -Credential username -Authentication Negotiate
On successful completion, Kansa will generate an output folder titled
'Output_[timestamp]', which will contain the results of the PowerShell modules in .csv format
organised into named folders, along with an [Link] detailing any modules that errored or failed
during execution.
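As a rough sketch, the output folder might be laid out as follows; the timestamp, module names and
host names here are hypothetical:
Output_20240101120000\
    Netstat\
        HOST01-Netstat.csv
        HOST02-Netstat.csv
    Handle\
        HOST01-Handle.csv
        HOST02-Handle.csv
Each module's results are grouped into a folder named after the module, with one .csv file per target
host, and the error log described above sits at the root of the output folder.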
CLOUD COMPUTING
Cloud computing is a technology paradigm that offers useful services to
consumers, and it has the long-term potential to change the way
information technology is provided and used. The cloud ecosystem consists of
four major entities, each of which plays a vital role in fulfilling the requirements of all
the stakeholders. The role played by each depends on its position in the
market and its business strategy. The most prominent entities in the cloud
ecosystem are:
Cloud Service Provider: makes cloud services available to cater to the needs
of users from different domains by acquiring and managing the
computing resources, both hardware and software, and arranging networked
access for cloud customers.
Cloud Integrator: the facilitator, who identifies, customises and integrates
cloud services in accordance with the customer's
needs. It plays the important role of matchmaking and negotiating the
relationship between the consumer and the producer of the services.
Cloud Carrier: an intermediary that provides connectivity and delivers
cloud services to the end-user's doorstep through different networks and
access devices.
Cloud Customer: the actual user of the services extended by the service provider,
which may be an individual or an organisation that in turn may have its own
end-users, such as employees or other customers.
Cloud computing, being a modern technology, offers numerous advantages. In order to
harness all these benefits, one has to scrupulously investigate as many cloud security
measures as possible. The concerns range from malicious code
penetration to hijacked accounts to full-scale data breaches. Based on literature
searches and analysis efforts, the major cloud-specific vulnerabilities and
threats that one must consider before deciding to migrate to
the cloud are as follows:
1. Data Breaches/Data Loss
2. Denial of Service Attacks/Malware Injection
3. Hijacking Account
4. Inadequate Change Control and Misconfiguration
5. Insecure Interfaces and Poor APIs implementation
6. Insider Threats
7. Insufficient Credentials and Identity/Compromised accounts
8. Weak control plane/Insufficient Due Diligence
9. Shared Vulnerabilities
10. Nefarious use or Abuse of Cloud Services
11. Lack of cloud security strategy/Regulatory violations
12. Limited cloud usage visibility