Blauhaunt

A tool collection for filtering and visualizing logon events. Designed to help answer the "Cotton Eye Joe" question ("Where did you come from, where did you go?") in security incidents and threat hunts.

This tool is designed for experienced DFIR specialists. You may get little to no use out of it without experience in threat hunting.


Get started

Running Blauhaunt is as simple as that:

Open https://cgosec.github.io/Blauhaunt/app/. Since there is no backend, no data will leave your local system. (Third-party libraries are integrated, and I take no responsibility for their communication behavior; the imports are at the top of the index.html file.)

Alternatively, run cmd, bash, or whatever shell you like, then:

 git clone https://github.com/cgosec/Blauhaunt
 cd Blauhaunt/app
 python -m http.server

Now you can navigate to http://localhost:8000/ in your browser and start blau haunting the baddies.

Some random test data is available in the test_data directory to get you started. However, it is just randomly generated and not something to actually investigate.

Upload Data

Click "Upload Data" (surprising, isn't it? :-P)

Upload the JSON export of the Velociraptor artifact or the result(s) of the PowerShell script here. Do not upload the client_info.json or anything else here!

This is optional and only needed to get system tags and OS info. Upload your client_info.json extract here. This is just an export of the Velociraptor clients() function; use this query:

 SELECT * FROM clients()

and export the JSON.

This is optional too. Upload a mapping to resolve IP addresses to their hostnames. You need a file with one column for hostnames and one column for IP addresses. If a system has multiple IP addresses, they can be listed in that one column, separated by an arbitrary symbol, e.g. "/".

Example:

Hostname   IP-Addresses                  MaybeSomeNotNeededStuff
System_A   10.10.10.100                  bonjour
System_B   10.10.10.100 / 10.10.20.100   hello
System_C   10.10.10.100                  hola
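
As a rough illustration of how such a mapping file can be parsed (a Python sketch with made-up column names and function name; Blauhaunt itself does this in JavaScript in the browser):

```python
import csv
import io

def parse_host_ip_mapping(csv_text, host_col, ip_col, ip_delim="/", delimiter=","):
    """Build an IP -> hostname lookup from a CSV export.

    Systems with several addresses list them in one cell,
    separated by ip_delim (e.g. "10.0.0.1 / 10.0.0.2").
    Extra columns are simply ignored.
    """
    ip_to_host = {}
    for row in csv.DictReader(io.StringIO(csv_text), delimiter=delimiter):
        host = row[host_col].strip()
        for ip in row[ip_col].split(ip_delim):
            ip = ip.strip()
            if ip:
                ip_to_host[ip] = host
    return ip_to_host

example = """Hostname,IP-Addresses,MaybeSomeNotNeededStuff
System_A,10.10.10.100,bonjour
System_B,10.10.10.101 / 10.10.20.100,hello
"""
mapping = parse_host_ip_mapping(example, "Hostname", "IP-Addresses")
```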

Once a proper file is selected, specify a delimiter (if none is specified, a comma is expected) and click Load Map.

  1. Choose the name of the column containing the hostname
  2. OPTIONAL: Specify any entries you want to exclude from parsing, e.g. lines with "UNKNOWN" in the hostname/IP mapping
  3. Choose the name of the column containing the IP address
  4. Specify the delimiter for multiple IP addresses within one line

When everything is correct, confirm with the corresponding button.

If everything was processed as intended, you should now see the total number of nodes and edges.

Filtering

Click the menu button to open the sidebar.

The User Filter applies to the users. All usernames are converted to uppercase during parsing. This is important for filtering!

The Src Host Filter specifies the systems from which the user initiated the connection. Filtering for source hosts only works when the graph source is set to System.

The Dst Host Filter specifies the systems that were the target of logons.

Filter for a time span of activities.

The Daily Times filter specifies the time of day you are interested in. This is useful if nightly user logons are uncommon in your environment. It applies regardless of the date: within your time span, only events that occurred during that hourly window end up in the set. (It also works across midnight, as in the example picture.)
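
The across-midnight behavior can be sketched as a simple time-of-day check (illustrative Python, not the app's actual JavaScript code):

```python
from datetime import datetime

def in_daily_window(ts, start_hour, end_hour):
    """True if the event's time of day falls in [start_hour, end_hour).

    Handles windows that wrap past midnight (e.g. 22 -> 6),
    independent of the event's date.
    """
    h = ts.hour
    if start_hour <= end_hour:
        return start_hour <= h < end_hour
    return h >= start_hour or h < end_hour  # window wraps past midnight

events = [
    datetime(2023, 7, 27, 23, 12),
    datetime(2023, 7, 28, 3, 5),
    datetime(2023, 7, 28, 14, 30),
]
# keep only nightly activity between 22:00 and 06:00
night = [e for e in events if in_daily_window(e, 22, 6)]
```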

This filter helps you focus on the top talkers in big data sets. You can specify a threshold for the minimum number of outgoing connections of a source system.
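
The idea behind the threshold can be sketched like this (illustrative Python; the name top_talkers is made up for the example):

```python
from collections import Counter

def top_talkers(edges, min_outgoing):
    """Keep only source systems with at least min_outgoing outgoing edges.

    edges is a list of (source, destination) pairs.
    """
    out_count = Counter(src for src, dst in edges)
    return {src for src, n in out_count.items() if n >= min_outgoing}

edges = [("WS1", "DC1"), ("WS1", "FS1"), ("WS1", "WS2"), ("WS2", "DC1")]
loud = top_talkers(edges, 3)  # only WS1 has 3 or more outgoing connections
```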

By default, events whose source and destination are the same node are not displayed. If you want to display them, activate the option by clicking it.

Filtering for Event IDs is a good way to reduce the data. Choosing all or none makes no difference.

Logon types are only relevant for 4624 and 4625 events. I assume you already know them if you are using this tool.
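
For reference, these are the common Windows logon type codes you will see in 4624/4625 events (summarized from Microsoft's security auditing documentation):

```python
# Windows logon types as logged in 4624/4625 events
LOGON_TYPES = {
    2: "Interactive",                 # logon at the console
    3: "Network",                     # e.g. SMB access to a share
    4: "Batch",                       # scheduled tasks
    5: "Service",                     # service start
    7: "Unlock",                      # workstation unlock
    8: "NetworkCleartext",            # credentials sent in cleartext
    9: "NewCredentials",              # e.g. runas /netonly
    10: "RemoteInteractive (RDP)",    # terminal services / RDP
    11: "CachedInteractive",          # cached domain credentials
}
```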

Filtering for tags is only available when client infos are uploaded. These are the tags specified for the systems in Velociraptor. Choosing all or none makes no difference. Tags apply only to the source system, not to the destination.

Source: System or User

Usually I focus on system -> system activity when trying to identify the initial access. But since there are many situations where you want to focus on user behavior, you can choose what your source should be: System or User.

Render Graph, Timeline or Heatmap

When your filters are set, you need to press Render to display the results.

Graph

Coming soon

Timeline

Coming soon

Heatmap

Coming soon

If you want to change from one view to another, choose the view you need and then click Render. *Be careful with the Timeline! Even a few nodes and edges can still produce a huge timeline!* Checking the Stats first is a good idea before rendering a timeline.

Graph Style

Coming soon

Tag visualization

Coming soon

Exports

Coming soon

Stats

Coming soon

Node information

Coming soon

Edge information

Coming soon

Integration in investigation

I recommend using Blauhaunt with Velociraptor since it is the fastest way to get data from multiple systems. The Blauhaunt import format for event data and client info is the one that can be exported from Velociraptor. The blauhaunt_script.ps1 works well if you prefer working with e.g. KAPE triage data.

Blauhaunt really gets useful when you have multiple systems and need to identify your next pivot system or suspicious users. Blauhaunt alone will not magically lead you to the compromised systems and users, but if you have hundreds of systems to check, it really speeds up your game.

Example workflow

Known compromised system

(e.g. from a Velociraptor hunt) -> check in Blauhaunt which users connected to this system -> suspicious user -> suspicious systems -> further suspicious users -> and so the story goes on. You have a good chance of identifying the systems where deeper forensics will speed up your hunt. If you identify further compromised users on such a system, you can go back to Blauhaunt and repeat the game.

No idea where to start

With its various filters, Blauhaunt gives you statistical and visual ways of identifying unusual connections. You can, for example, check for user activity occurring at night, or simply spot a burst of logons coming from a system where an attacker is enumerating the AD infrastructure.

Lucky shot

If you are really lucky and have a noisy attacker plus solid administration in the network, Blauhaunt can potentially deliver an optical attack map with the timeline of compromised systems along the y-axis in the center.

Architecture

Blauhaunt is designed to run entirely without a backend system. I suggest simply starting a Python HTTP server from a shell in the directory containing index.html, with this command:

 python -m http.server

If you are using Linux, you likely have to type python3 instead of python; but if you are using this tool, you should be technically skilled enough to figure that out yourself ;)

Some day I will create a backend in Django with an API to get real-time data to display for better threat hunting.

Default Layout

The layout of the graph is calculated according to the set filters. The icon size of a node is calculated from its activity within the set filters. The x-axis position of a node is calculated from its outgoing connections: nodes with many outgoing connections sit toward the center of the graph, while nodes with fewer outgoing connections sit at the left and right edges. The y-axis position is calculated from the first quartile of the node's activity times.
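
The y-axis value can be sketched as the first quartile of a node's logon timestamps (an illustrative Python sketch using the statistics module, not the app's actual JavaScript code):

```python
from datetime import datetime, timezone
from statistics import quantiles

def first_quartile_time(logon_times):
    """Return the first quartile of a node's activity timestamps.

    A node whose activity started early gets an early q1, which a
    layout like the one described above maps to a y-axis position.
    """
    epochs = sorted(t.timestamp() for t in logon_times)
    q1 = quantiles(epochs, n=4)[0]  # first of the three quartile cut points
    return datetime.fromtimestamp(q1, tz=timezone.utc)

times = [
    datetime(2023, 7, 27, 21, 10, tzinfo=timezone.utc),
    datetime(2023, 7, 27, 21, 12, tzinfo=timezone.utc),
    datetime(2023, 7, 28, 20, 30, tzinfo=timezone.utc),
    datetime(2023, 7, 29, 9, 0, tzinfo=timezone.utc),
]
q1 = first_quartile_time(times)
```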

To avoid stacking too many nodes on the same spot, nodes are shifted slightly when too many land in the same place.

The other layouts are defaults from the Cytoscape universe and can be chosen as well.

Filters

Description coming soon

Displays

Description coming soon

General Data Schema

There are three types of data; only the event data is mandatory.

Event Data

This is the input schema for the event data that Blauhaunt needs in order to process it:

 {
    "LogonTimes":[
        "2023-07-28T20:30:19Z",
        "2023-07-27T21:12:12Z",
        "2023-07-27T21:10:49Z"
        ],
    "UserName":"Dumdidum",
    "SID":"-",
    "Destination":"Desti-LAPTOP",
    "Description":"using explicit credentials",
    "EventID":4648,
    "LogonType":"-",
    "SourceIP":"-",
    "SourceHostname":"Sourci-LAPTOP",
    "LogonCount":3
    }

To be processed correctly, each dataset starting with { and ending with } must be on its own line.
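
This line-per-object layout is the JSON Lines convention. A minimal loader sketch (illustrative Python; it also uppercases usernames, as mentioned in the Filtering section, while Blauhaunt's real parser is JavaScript):

```python
import json

def load_events(text):
    """Parse event data: one JSON object per physical line (JSON Lines).

    Usernames are normalized to uppercase, mirroring what the
    Filtering section says about parsing.
    """
    events = []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        event = json.loads(line)
        event["UserName"] = event["UserName"].upper()
        events.append(event)
    return events

sample = (
    '{"UserName": "Dumdidum", "EventID": 4648, "Destination": "Desti-LAPTOP", "LogonCount": 3}\n'
    '{"UserName": "admin", "EventID": 4624, "Destination": "DC01", "LogonCount": 1}\n'
)
events = load_events(sample)
```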

Client Info

 {
    "os_info": {
         "hostname": "Desti-LAPTOP",
         "release": "Windows 10"
    },
    "labels": [
         "Touched",
         "C2",
         "CredDumped"
         ]
    }

To be processed correctly, each dataset starting with { and ending with } must be on its own line.

Host IP Mapping

This can be any CSV file. The delimiter can be specified, and the columns for hostname and IP can be chosen.

PowerShell Script

blauhaunt_script.ps1. If you face any issues with the execution policy, the easiest thing to do is to spawn a PowerShell with execution policy bypass like this:

 PowerShell.exe -ExecutionPolicy Bypass

To get information about usage and parameters, use Get-Help:

 Get-Help blauhaunt_script.ps1 -Detailed

Usage

Depending on the data size, StartDate, and EndDate this can take quite some time, so be a little patient.

Velociraptor Artifact

This speeds up collecting the relevant data at scale. I recommend creating a notebook (a template may be provided here soon as well) where all the results are listed. You can simply take the JSON export from this artifact and import it into Blauhaunt.

The client_info import is designed to work directly with the client_info from Velociraptor too. You can simply export the json file and upload it into Blauhaunt.

Usage

Coming soon

Acknowledgements

  • SEC Consult: this work was massively motivated by my work in and with the SEC Defence team
  • Velociraptor is the game changer that makes it possible to collect the data to display at scale (tested with > 8000 systems already!)
  • Cytoscape.js is the library that makes the interactive graph visualization possible
  • LogonTracer inspired the layout and part of the tech stack of this project
  • CyberChef inspired the idea of creating a version of Blauhaunt that runs entirely in the browser without a backend system

(The icon is intentionally shitty - this is how I actually look while hunting... just the look in the face not the big arms though :-P )
